MLPerf is a consortium involving more than 40 leading companies and university researchers, which has released several rounds of results. MLPerf’s goals are:
Accelerate progress in ML via fair and useful measurement
Encourage innovation across state-of-the-art ML systems
Serve both industrial and research communities
Enforce replicability to ensure reliable results
Keep benchmark effort affordable so all can play
The Vision Behind MLPerf
A broad ML benchmark suite for measuring the performance of ML software frameworks, ML hardware accelerators, and ML cloud and edge platforms.
“… since 2012 the amount of compute used in the largest AI training runs has been increasing exponentially with a 3.5-month doubling time (by comparison, Moore’s Law had an 18-month doubling period). Since 2012, this metric has grown by more than 300,000x (an 18-month doubling period would yield only a 12x increase). Improvements in compute have been a key component of AI progress, so as long as this trend continues, it’s worth preparing for the implications of systems far outside today’s capabilities.”
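As a rough sanity check of the arithmetic in the quote above, the figures are mutually consistent: a 300,000x increase at a 3.5-month doubling time implies roughly 64 months of growth, and an 18-month doubling period over that same span yields close to the quoted 12x. A minimal sketch (the function name and numbers-as-stated are illustrative, not from the source):

```python
import math

def growth(months_elapsed: float, doubling_months: float) -> float:
    """Compound growth factor for a fixed doubling period."""
    return 2 ** (months_elapsed / doubling_months)

# Elapsed time implied by 300,000x growth at a 3.5-month doubling time:
months = 3.5 * math.log2(300_000)  # ~63.7 months, i.e. roughly 5.3 years

# The same span at a Moore's-Law-style 18-month doubling period:
moore = growth(months, 18)         # ~11.6x, close to the quoted "only 12x"
```

This also shows why the trend is striking: the gap between the two curves is not a constant factor but widens exponentially with time.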