The AI and machine learning world has largely gathered around Google's TensorFlow library. One significant holdout was DeepMind, which performed most of its research on the Torch7 library — until DeepMind announced that it, too, was moving its research to TensorFlow, the library already powering portions of AlphaGo.

According to their blog, in order to optimize TensorFlow even further, Google has developed and engineered a custom ASIC called the TPU, or Tensor Processing Unit.

As we know, most applications can be sped up dramatically by dedicated hardware, but building that hardware has to be worth the effort. Video playback is a familiar example: the dedicated codec hardware in almost every modern device boosts performance enormously, at the cost of flexibility. Google evidently judged the same trade-off worthwhile for machine learning.
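One concrete form this speed-for-flexibility trade-off takes is reduced-precision arithmetic: Google has described its TPUs as being built around 8-bit integer operations rather than 32-bit floats. The sketch below (plain Python, purely illustrative — not Google's implementation) shows the basic idea of quantizing floating-point values to 8-bit integers: the representation shrinks fourfold, at the cost of a small, bounded rounding error.

```python
def quantize(values, scale):
    """Map floats to 8-bit integers (-128..127) using a fixed scale."""
    return [max(-128, min(127, round(v / scale))) for v in values]

def dequantize(ints, scale):
    """Map 8-bit integers back to approximate floats."""
    return [i * scale for i in ints]

# Hypothetical weights of a tiny model; the scale is chosen so the
# largest magnitude lands near the int8 limit of 127.
weights = [0.12, -0.51, 0.98, -1.0, 0.33]
scale = max(abs(w) for w in weights) / 127

q = quantize(weights, scale)
approx = dequantize(q, scale)

# The round trip is close but not exact: each value is off by less than
# one quantization step -- the accuracy cost of the smaller representation.
error = max(abs(a - w) for a, w in zip(approx, weights))
```

The point of the sketch is the same one the video-codec analogy makes: hardware that only has to handle narrow integers can be made far faster and more power-efficient than general-purpose floating-point hardware, as long as the application can tolerate the loss of precision.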

And the evidence is there to see: Google is already using TPUs in production for the sake of speed and efficiency, Street View being one example.

This is what their data center server rack looks like — Google's "brain", as we understand it.

*Information from The Verge.

By Satadru Das