Access and Feeds

Artificial Intelligence Chips: Google's TPU 3.0 Offers 8x Performance Boost

By Dick Weisinger

In May, Google announced the third generation of its TPU (Tensor Processing Unit) chip.  The first generation of TPUs was released in 2016, followed shortly afterward by a speed-enhanced TPU 2.0.  The newest TPU 3.0 is eight times faster than the TPU 2.0 and can handle 100 petaflops or more, putting it in the range of supercomputer performance.  But all that computational power makes the chips very hot: they require a special copper-plated cooling system to keep them from overheating.
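The two headline numbers can be sanity-checked with back-of-the-envelope arithmetic. A minimal Python sketch, assuming (my assumption, not the article's) that both figures refer to a full TPU pod and that the speedup scales linearly:

```python
# Rough check of the performance claims above.
# Assumptions (mine, not the article's): the "8x" and "100 petaflops"
# figures both describe a full TPU pod, and scaling is linear.

TPU_V3_POD_PFLOPS = 100   # article: "100 petaflops or more"
SPEEDUP_V3_OVER_V2 = 8    # article: "eight times faster"

# Implied TPU 2.0 pod performance under these assumptions:
tpu_v2_pod_pflops = TPU_V3_POD_PFLOPS / SPEEDUP_V3_OVER_V2
print(f"Implied TPU 2.0 pod: {tpu_v2_pod_pflops:.1f} petaflops")

# The same figure expressed in raw FLOPS (1 petaflop = 1e15 FLOPS):
print(f"TPU 3.0 pod: {TPU_V3_POD_PFLOPS * 1e15:.2e} FLOPS")
```

Under those assumptions, the numbers imply a TPU 2.0 pod of about 12.5 petaflops, which is in the same ballpark as the roughly 11.5 petaflops Google publicly quoted for second-generation pods.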

Google plans to offer TPU processors via a cloud service.  Currently, users can rent a board of TPUs for their projects, and Google expects to start rolling out TPU 3.0 in the cloud soon.

TPUs are optimized to run TensorFlow, Google's machine-learning framework, used primarily for machine learning and artificial intelligence.  With this approach, Google is trying to own the entire stack.  Google recently reorganized a number of its projects under an umbrella it now calls "Google AI."  If Google's platform becomes advanced enough to attract users, it can lock developers in.  Competition in the AI chip business is very active, with Nvidia, Facebook, Amazon, and Microsoft all in the race.

 
