
Google’s new Tensor Processing Unit custom chip

Google announced on Wednesday its new Tensor Processing Unit (TPU) custom chip that, according to the company, will take chip technology “seven years into the future”, thanks to how quickly and efficiently it handles machine-learning workloads.

Alphabet Inc. (Google’s parent company) has been using this chip for almost two years in its data centers. The chip is designed to accelerate artificial-intelligence software and, so far, has no direct competition in the market.

Google’s new Tensor Processing Unit (TPU) custom chip. Photo credit: Venture Beat

Rumors that the company was working on the chip had circulated for several months, and Google confirmed them when it announced the Tensor Processing Unit, better known as the TPU, during this year’s I/O developer conference.

According to the company, the TPU is currently being used in Google’s best-known applications, with 100 teams using the AI chip in applications such as Street View, Inbox Smart Reply and voice search.

Tensor Processing Unit

According to the company, the Tensor Processing Unit is a custom ASIC, an application-specific integrated circuit. This type of circuit is customized for a specific use rather than for general-purpose computing.

Common ASIC building blocks include microprocessors and memory blocks such as ROM and RAM, often combined on a single chip. A typical ASIC design uses from two to nine metal interconnect layers, with the wiring in each layer routed perpendicular to the layer below it.

Google’s ASIC was created specifically for machine learning and is tailored for TensorFlow, the company’s artificial-intelligence software library, which expresses computations as data flow graphs.
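The article does not include any code, but a minimal sketch using the public tensorflow Python package (not Google’s internal tooling) gives a sense of what “expressing a computation as a data flow graph” looks like; the tiny layer, names and values below are invented purely for illustration.

```python
# Minimal, illustrative sketch of a TensorFlow data flow graph.
# Requires the public `tensorflow` package (TF 2.x).
import tensorflow as tf

@tf.function  # traces the Python function into a data flow graph
def tiny_model(x, w, b):
    # One dense layer: matrix multiply, add bias, apply ReLU.
    return tf.nn.relu(tf.matmul(x, w) + b)

x = tf.constant([[1.0, 2.0]])                 # a single 2-element input
w = tf.constant([[0.5, -1.0], [0.25, 2.0]])   # made-up weights
b = tf.constant([0.1, 0.1])                   # made-up bias

print(tiny_model(x, w, b))                    # runs the traced graph

# The traced graph itself can be inspected as a list of operations (nodes).
graph = tiny_model.get_concrete_function(x, w, b).graph
print([op.name for op in graph.get_operations()])
```

It is graphs like this one, rather than arbitrary Python code, that hardware such as the TPU is built to execute efficiently.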

“We’ve been running TPUs inside our data centers for more than a year, and have found them to deliver an order of magnitude better-optimized performance per watt for machine learning,” said Google in a public release.

The newly designed TPU is expected to fast-forward artificial-intelligence hardware roughly seven years into the future in Moore’s law terms. Moore’s law is the observation that the number of transistors in a dense integrated circuit doubles roughly every two years.

Moore’s law is commonly used in the technology industry as a reference point for progress in circuit and chip design.
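As a rough, back-of-the-envelope reading of that claim (not a calculation Google has published), seven years of doubling every two years works out to roughly an elevenfold improvement, which lines up with the “order of magnitude” figure quoted above.

```python
# Back-of-the-envelope sketch of the "seven years ahead" claim under
# Moore's law (a doubling roughly every two years). Illustrative only.
years_ahead = 7
doubling_period_years = 2

improvement = 2 ** (years_ahead / doubling_period_years)
print(f"Roughly {improvement:.1f}x")  # about 11x, i.e. an order of magnitude
```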

The Tensor Processing Unit is a custom ASIC (application-specific integrated circuit). Photo credit: Google Cloud Platform Blog

The TPU trades precision for speed

Since the chip was created for machine learning applications, the TPU is more tolerant of reduced computational precision. Using fewer bits per operation requires fewer transistors per operation, so the chip can complete more operations per second.

As a result, Google can run more operations per second on the TPU and use more sophisticated artificial-intelligence models, which translates into faster results for TPU users.
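The article does not describe the exact number format the TPU uses, but the general technique behind reduced computational precision can be sketched with plain NumPy: values are stored as small integers plus a scale factor instead of 32-bit floats, trading a little accuracy for much cheaper arithmetic. The scheme and numbers below are purely illustrative, not Google’s toolchain.

```python
# Illustrative sketch of reduced-precision (8-bit) storage using NumPy.
import numpy as np

weights = np.array([0.31, -1.20, 0.07, 0.88], dtype=np.float32)

# Map the float range onto signed 8-bit integers with a single scale factor.
scale = np.abs(weights).max() / 127.0
quantized = np.round(weights / scale).astype(np.int8)   # 4 bytes instead of 16

# Dequantize to see how little accuracy is lost for these values.
recovered = quantized.astype(np.float32) * scale
print(quantized)   # e.g. [  33 -127    7   93]
print(recovered)   # close to the original weights
```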

The board fits into a hard disk drive slot in the company’s data center racks and has been working with Google applications such as RankBrain, which improves Google’s search results, and Street View, which improves the accuracy and quality of the company’s maps and navigation.

TPU chips were also used in the matches against Go world champion Lee Sedol, allowing the system to think and respond much faster.

Source: Google Cloud Platform Blog

By Maria Gabriela Méndez