What’s on-device processing? A Google engineer explains


Each time a new Pixel phone comes out, you might hear that "on-device processing" makes its cool new features possible. Just look at the new Pixel 9 phones: things like Pixel Studio and Call Notes run "on device." And it's not just phones; Nest cameras, Pixel smartwatches and Fitbit devices also use this whole "on-device processing" thing. Given the devices that use it and the features it's powering, it sounds pretty important.

It's safe to assume that the, er, processing, is happening on the, uh…well, the device. But to get a better understanding of what that means, we talked to Trystan Upstill, who has spent nearly 20 years at Google working on engineering teams across Android, Google News and Search.

You were on a team that helped develop some of the exciting features that shipped with our new Pixel devices. Can you tell me a little about what you worked on?

Most recently, I worked within Android, where I led a team that focuses on melding Google's various technology stacks into an amazing experience that's meaningful to the user, and then figuring out how to build and ship it.

Since we're improving technologies and introducing new ones so often, it seems like that would be a never-ending job.

Exactly! In recent years, there's been this explosion in generative AI capabilities. When we first started thinking about running large language models on devices, we thought it was kind of a joke: "Sure, we can do that, but maybe by 2026." But then we began scoping it out, and the technology developed so quickly that we were able to launch features using Gemini Nano, our on-device model, on Pixel 8 Pro in December 2023.

That's what I want to know more about: "on-device processing." Let's break it down and start with what exactly "processing" means.

The main processor, or system-on-a-chip (SoC), in your devices has a variety of what are called processing units, designed specifically to handle the tasks you want to do with that device. That's why you'll see the chip (like the Tensor chip found in Pixels) referred to as a "system-on-a-chip": there's not just one processor, but multiple processing units, memory, interfaces and much more, all together on one piece of silicon.

Let's use Pixel smartphones as an example. The processing units include a Central Processing Unit, or CPU, as the main "engine" of sorts; a Graphics Processing Unit, or GPU, which renders visuals; and now a Tensor Processing Unit, or TPU, specially designed by Google to run AI/ML workloads on a device. These all work together to help your phone get things done, aka processing.

For example, when you take photos, you're often using all parts of your phone's processing power to good effect. The CPU is probably busy running core tasks that control what the phone is doing, the GPU is probably helping render what the lens is seeing and, on a premium Android device like a Pixel, there's also a lot of work happening on the TPU to process what the optical lens sees to make your photos look great.
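As a rough sketch of that division of labor, here's how routing tasks to the right processing unit might look in Python. The task names and the routing table are invented for illustration; this is not Android's actual scheduler.

```python
# Illustrative only: a toy routing table mapping workload types to the
# SoC processing unit best suited to run them.
WORKLOAD_TO_UNIT = {
    "app_logic": "CPU",       # general-purpose control tasks run on the CPU
    "render_frame": "GPU",    # rendering visuals goes to the GPU
    "run_ml_model": "TPU",    # AI/ML workloads go to the TPU
}

def dispatch(task: str) -> str:
    """Return the processing unit a task would be routed to."""
    return WORKLOAD_TO_UNIT.get(task, "CPU")  # fall back to the CPU

# Taking a photo exercises all three units at once:
photo_pipeline = ["app_logic", "render_frame", "run_ml_model"]
print([dispatch(t) for t in photo_pipeline])  # ['CPU', 'GPU', 'TPU']
```

The point of the sketch is that one "job" (taking a photo) fans out into several kinds of work, each handled by the unit built for it.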

Got it. "On-device" processing implies there's off-device. Where is "off-device processing" happening, exactly?

Off-device processing happens in the cloud. Your device connects to the internet and sends your request to servers elsewhere, which perform the task and then send the output back to your phone. So if we wanted to take that process and make it happen on device, we'd take the large machine learning model that powered that task in the cloud and make it smaller and more efficient so it can run on your device's operating system and hardware.
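The two paths can be contrasted in a short Python sketch. The functions and result strings here are purely illustrative stand-ins (real on-device inference would go through an actual model such as Gemini Nano, and the cloud path would involve a real network request):

```python
# Illustrative only: stand-in functions for the two processing paths.

def off_device_process(request: str) -> str:
    # Off-device: the request travels over the network to a cloud server,
    # which runs a large model and sends the output back to the phone.
    uploaded = request                              # simulated network upload
    server_output = f"cloud result for {uploaded!r}"
    return server_output                            # simulated network download

def on_device_process(request: str) -> str:
    # On-device: a smaller, more efficient version of the model runs
    # locally on the phone's own hardware; no network round trip needed.
    return f"local result for {request!r}"

print(off_device_process("summarize this call"))
print(on_device_process("summarize this call"))
```

Functionally the caller gets an answer either way; the difference is where the model runs and whether the request ever leaves the device.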

What hardware makes that possible?

New, more powerful chipsets. For example, with the Pixel 9 Pro, that's happening thanks to our SoC, called Tensor G4. Tensor G4 enables these phones to run models like Gemini Nano; it's able to handle those high-performance computations.

So basically, Tensor is designed specifically to run Google AI, which is also what powers a lot of Pixel's new generative AI capabilities.

Right! And the generative AI features are definitely part of it, but there are plenty of other things on-device processing makes possible, too: rendering video, playing games, HDR photo editing, language translation, most everything you do with your phone. These are all happening on your phone, not being sent up to a server for processing.
