
Artificial Intelligence: AI Compute Doubling Every 3.5 Months

By Dick Weisinger

Artificial Intelligence models have grown in complexity and sophistication. But the gains have come not so much from software and new algorithms as from the hardware horsepower used to run them.

Compute, or the number of calculations performed, has gone off the charts as models have grown in complexity and achieved astonishing accuracy. OpenAI reports that compute increased by a factor of 300,000 between 2012 and 2019, a pace equivalent to doubling roughly every 3.5 months.
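As a rough sanity check on those figures, the doubling time implied by a growth factor over a given span follows from a simple logarithm. The short Python sketch below works through the arithmetic; the span lengths are assumptions, since the result depends on which model dates anchor the trendline.

import math

def implied_doubling_time(growth_factor: float, span_months: float) -> float:
    # Months per doubling when compute grows by growth_factor over span_months.
    return span_months / math.log2(growth_factor)

# A 300,000x increase over roughly 5.5 years (~66 months, an assumed span)
# implies a doubling time close to the 3.5 months cited in the headline:
print(implied_doubling_time(300_000, 66))  # ~3.6 months

# Stretching the same growth over a full 7 years (2012-2019) lengthens it:
print(implied_doubling_time(300_000, 84))  # ~4.6 months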

John Chong, vice president at Kionix, said that “five years ago, hardware was out. No one wanted to invest in it. It took too long, and everything was about software. Now, the pendulum has swung back. Hardware is in, and it’s not because you’re making money on the hardware. The hardware is the enabler.”

Figure from OpenAI: the compute (petaflop/s-days) used for training well-known AI models.

Tiernan Ray, technology writer at ZDNet, said that “in the past twenty years, big data and fast parallel computation became the norm and propelled machine learning, bringing about deep learning. The next wave of computer hardware and software will probably be about vast amounts of memory and neural networks that are dynamically constructed to take advantage of highly parallel chip architectures. The future looks very interesting.”

