Access and Feeds

Deep Learning: Are GPUs the Only Way?

By Dick Weisinger

Deep Learning has enabled impressive advances in areas like healthcare, finance, machine vision, and self-driving cars. But to be successful, the algorithms need massive amounts of data and compute. Most Deep Learning projects today rely on specialized GPU processors, which are much more expensive than standard commodity CPUs.

Now a startup called Neural Magic has devised a way to run Deep Learning algorithms efficiently on standard CPUs. The company advertises ‘GPU speeds without GPUs’.

Nir Shavit, MIT professor and co-founder of Neural Magic, said that “our vision is to enable data science teams to take advantage of the ubiquitous computing platforms they already own to run deep learning models at GPU speeds — in a flexible and containerized way that only commodity CPUs can deliver… Yes, running on a commodity processor you get the cost savings of running on a CPU, but more importantly, it eliminates all of these huge commercialization problems and essentially this big limitation of the whole field of machine learning of having to work on small models and small data sets because the accelerators are kind of limited. This is the big unlock of Neural Magic.”

It remains to be seen whether the company can live up to the hype. Neural Magic says its approach has already been demonstrated on typically compute-intensive applications like image classification and object detection. The company reports that it can run those workloads as fast as GPUs can, and that its approach offers greater flexibility and can process larger images and video streams.
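Neural Magic hasn't detailed its engine's internals here, but the underlying observation is that neural-network inference boils down to matrix multiplications and simple nonlinearities, operations any commodity CPU can execute. A minimal sketch (the layer sizes and random weights below are made up for illustration, and this is ordinary NumPy, not Neural Magic's optimized runtime):

```python
import numpy as np

# Illustrative only: a tiny fully connected "image classifier" forward pass.
# The point is that inference is just matrix multiplies plus elementwise
# nonlinearities -- all of which run fine on a standard CPU.

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# A 32x32 grayscale "image" flattened to a 1024-element vector.
image = rng.random((1, 1024))

# Two dense layers with made-up weights (an untrained toy model).
W1, b1 = rng.standard_normal((1024, 128)) * 0.01, np.zeros(128)
W2, b2 = rng.standard_normal((128, 10)) * 0.01, np.zeros(10)

hidden = relu(image @ W1 + b1)          # matrix multiply on the CPU
probs = softmax(hidden @ W2 + b2)       # class probabilities over 10 labels

assert probs.shape == (1, 10)
assert np.isclose(probs.sum(), 1.0)
```

The engineering question Neural Magic claims to have solved is making these same operations run at GPU-like throughput on large real-world models, not whether CPUs can execute them at all.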
