
A new MIT computer chip could allow your smartphone to do complex AI tasks

Yesterday, a team of researchers from MIT introduced a new computer chip optimized for deep learning, an approach to artificial intelligence that is gaining popularity. The chip, dubbed “Eyeriss,” could allow mobile devices to perform tasks like natural language processing and facial recognition without being connected to the internet. It’s the latest attempt to make the complex operations of machine learning more portable. That means our smartphones, wearables, robots, self-driving cars, and other IoT devices could begin performing complex deep learning processes locally, something that until now has been very difficult to do.

Deep learning has traditionally demanded large amounts of computer processing. GPUs, computer chips designed to render the graphics we see on our screens, are a good enough workhorse to handle the task. But GPUs come with a drawback: they suck up a ton of power, which makes them impractical for deep learning on mobile devices. The workaround has been to take raw data collected by devices, upload it over the internet, perform deep learning on powerful GPU servers, then shoot the results back over the internet to the device.
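To make that cloud-offload workflow more concrete, here is a minimal sketch, not taken from the article or the MIT paper, that contrasts the remote-server path with the on-device path a chip like Eyeriss would enable. The endpoint URL, the `local_model` object, and both function names are hypothetical placeholders, assuming a simple HTTP upload for the cloud path.

```python
import urllib.request

# Hypothetical inference endpoint; a real deployment would define its own API.
REMOTE_URL = "https://example.com/classify"


def classify_remote(image_bytes: bytes) -> bytes:
    """Cloud path: raw data leaves the device and results return over the network."""
    request = urllib.request.Request(
        REMOTE_URL,
        data=image_bytes,
        headers={"Content-Type": "application/octet-stream"},
    )
    # Fails outright without a connection, and every call pays the round-trip latency.
    with urllib.request.urlopen(request, timeout=5) as response:
        return response.read()


def classify_local(image_bytes: bytes, local_model) -> str:
    """On-device path: nothing is uploaded, so it works offline and keeps data private."""
    # `local_model` stands in for a network running on a low-power accelerator such as Eyeriss.
    return local_model.predict(image_bytes)
```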

This leads to problems that Eyeriss promises to solve. The first is that if your mobile device can’t find an internet connection, it can’t carry out deep learning tasks. (Siri, for example, needs lots of processing power to understand speech, which is why it won’t work unless it can reach Apple’s servers over the web.) When you do manage to connect, the data that devices upload to remote servers can be personal in nature, which raises privacy issues. There’s also the pesky problem of transmission latency, the time it takes for information to travel from your mobile device to a remote server and back. The researchers claim Eyeriss is designed to be about 10 times as power-efficient as a mobile GPU, which means it could avoid all of these problems without killing your battery.

Eyeriss is the latest in a string of recently announced chips aimed at taking deep learning out of remote servers. Qualcomm just revealed its Snapdragon 820A and 820Am processors at CES 2016, which allow cars to detect multiple lane markings and understand traffic signals using deep learning. Nvidia also flaunted its machine learning wares at CES, demonstrating how its Tegra processor could use deep learning for autonomous driving. And Google recently announced a partnership with chip maker Movidius aimed at improving facial recognition on phones.

The MIT research team, headed by Vivienne Sze of MIT’s department of electrical engineering and computer science, introduced the chip at the International Solid-State Circuits Conference in San Francisco, where they used it to perform an image recognition task. Plans for when the Eyeriss will reach devices have not been announced.
