
Artificial Intelligence: Impetus for Both Hardware and Software Innovation

By Dick Weisinger

The widespread adoption and success of artificial intelligence are driving hardware and software developments aimed at improving and optimizing how AI workloads are run.

On the hardware side, GPUs have become the preferred processor for implementing AI algorithms. Compared to standard CPUs, GPUs have proven remarkably effective at speeding up machine learning and deep learning workloads. While vendors like Nvidia continue to improve GPU performance and defend their market share, expect GPU market share for AI-specific applications to plateau, or at least to grow more slowly, as other vendors and tech companies compete to develop custom chips optimized specifically for AI processing. VCs are investing hundreds of millions of dollars in startups whose business plans target custom AI chip development.
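The GPU advantage comes largely from highly parallel dense linear algebra, the core operation of deep learning. A minimal sketch of that speedup, assuming PyTorch is installed and a CUDA-capable GPU is available (the matrix size is arbitrary and chosen only for illustration):

import time
import torch

def timed_matmul(device: str, n: int = 4096) -> float:
    """Time one large dense matrix multiply on the given device."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    if device == "cuda":
        torch.cuda.synchronize()   # let setup/transfer work finish first
    start = time.perf_counter()
    _ = a @ b                      # the kind of dense matmul deep learning relies on
    if device == "cuda":
        torch.cuda.synchronize()   # wait for the asynchronous GPU kernel to complete
    return time.perf_counter() - start

print(f"CPU: {timed_matmul('cpu'):.3f} s")
if torch.cuda.is_available():
    print(f"GPU: {timed_matmul('cuda'):.3f} s")

On typical hardware the GPU timing is dramatically lower, which is why GPUs, and increasingly AI-specific chips, dominate this workload.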

Sean Stetson, director of technology advancement at Seegrid, promotes using custom AI chips rather than non-specialized CPU and GPU chips, saying that “in order to make any algorithm work, whether it’s machine learning or image processing or graphics processing, they all have very specific workflows. If you do not have a compute core set up specific to those patterns, you do a lot of wasteful data loads and transfers. It’s when you are moving data around when you are most inefficient, that’s where you incur a lot of signaling and transient power. The efficiency of a processor is measured in energy used per instruction.”

On the software side, the ease with which AI algorithms can be translated into code is driving preference for programming languages like Python and R, and it is also spurring the development of new kinds of languages, particularly in the area of probabilistic programming.

Michael Kozlov and Ashish Kulkarni, executives at WorldQuant, wrote that “probabilistic computing will allow future systems to deal with the uncertainties inherent in natural data, enabling the development of computers capable of understanding, predicting and making decisions.”
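To illustrate the idea, here is a minimal sketch of probabilistic reasoning in plain Python (no probabilistic-programming library required): it infers the unknown bias of a coin from noisy observations and then makes a decision under the resulting uncertainty. The observations and decision question are invented for illustration.

import math

# Hypothetical observed coin flips (1 = heads), purely illustrative data.
observations = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]

# Grid-approximate Bayesian inference with a uniform prior over possible biases.
grid = [i / 100 for i in range(1, 100)]

def likelihood(p: float) -> float:
    """Probability of the observed flips given a coin bias p."""
    return math.prod(p if x == 1 else (1 - p) for x in observations)

unnormalized = [likelihood(p) for p in grid]
total = sum(unnormalized)
posterior = [w / total for w in unnormalized]

# Decision under uncertainty: how probable is it that the coin favors heads?
p_biased = sum(w for p, w in zip(grid, posterior) if p > 0.5)
print(f"P(bias > 0.5 | data) = {p_biased:.2f}")

Probabilistic programming languages generalize exactly this pattern, letting programmers state a model of uncertain data and leaving the inference machinery to the language runtime.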

