Episode 4

How Android “listens” to you with ultra low power sensors (w/ Kieron Quinn)

Published on: 10th February, 2022

Sign up for the 451 Research and Esper webinar: https://www.esper.io/webinar/digital-transformation-for-dedicated-devices

Ever wondered how Android can just listen and identify a song at any time? Or how your phone can tell you've been in a car crash? Using an obscure Android resource that interacts with ultra low power sensors, Google has created powerful always-on integrations. But why aren't other Android devices taking advantage of this?

Android Bytes is hosted by Mishaal Rahman, Senior Technical Editor, and David Ruddock, Editor in Chief, of Esper.

Esper enables next-gen device management for company-owned and managed tablets, kiosks, smartphones, IoT edge devices, and more.

For more about Esper: https://www.esper.io

Transcript
Mishaal:

This week I wanted to talk a bit about a topic that many people don't know about, because it involves underlying hardware that doesn't run an operating system you're probably familiar with, and it involves APIs that are not generally accessible to third-party developers. But I do have a third-party developer here who has been trying to basically hack his way into accessing those APIs. I'd like to welcome to the show Kieron Quinn, who is a developer with a company in the UK. He's also a well-known third-party app developer on the XDA forums and online who basically tries to hack his way into getting Google's applications working on any device, in any way he can. So welcome to the show, Kieron.

Kieron:

Thank you very much. Like you say, that's pretty much what I intend to do: get as many things working on as many devices as possible. And Tap, Tap, like you say, is no exception to that.

David:

Yeah. And so, for background here, this obscure thing we're talking about is the Android, well, this has a name that I'm going to have to look up because I've just forgotten it: the context hub. If you have some computing background, this is a chip on most smartphones. It usually manifests in the form of a dedicated little part of the DSP, the digital signal processor. And this is branded variously by companies like Qualcomm, MediaTek, Google, and Samsung. It runs a tiny little real-time operating system, very, very lightweight, designed for extremely low power usage. And Android, you know, has had support for this for a while, but usage and adoption of it is kind of limited. And I think that's part of what we'll get into today, but maybe we should really start by explaining what this hub can do.

Mishaal:

Right. So, to step back a bit: you know, your rectangular slab of an Android smartphone is filled to the brim with a plethora of sensors. For example, any modern flagship phone will include sensors like an accelerometer, a gyroscope, an ambient light sensor, a proximity sensor, and so on, and mostly these sensors do exactly what their names say they do. They measure the acceleration in the case of the accelerometer, the magnetic field in the case of a magnetometer, the ambient light in the case of an ambient light sensor. And these basic sensors, you'll find them in pretty much every smartphone. A few low-end smartphones might lack one sensor here or there, but in flagships, you'll find almost all of these sensors. And because there are so many sensors and there's so little room to put them into a smartphone package, what a lot of vendors do is pack these physical sensors together into a single package. So for example, an IMU chip, an inertial measurement unit, combines an accelerometer and a gyroscope into a single chip. On the software side, Android can combine the sensors, not physically, but in software, to create what's called a composite sensor that basically collects data from two or more sensors and fuses them together so that it can detect certain things. So for example, there's the rotation vector composite sensor, which combines data from an accelerometer, a magnetometer, and a gyroscope.
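
As a quick illustration of how an app consumes one of these composite sensors, here's a minimal Kotlin sketch against Android's public SensorManager API; the fusion itself happens below the app layer, and the function name is just for illustration.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Subscribe to the rotation vector composite sensor, which fuses accelerometer,
// gyroscope, and (usually) magnetometer data below the Android framework.
fun listenForRotation(context: Context) {
    val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    val rotationSensor = sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR) ?: return

    val listener = object : SensorEventListener {
        override fun onSensorChanged(event: SensorEvent) {
            // event.values holds the device orientation as a unit quaternion;
            // convert it to a rotation matrix for use in UI or games.
            val rotationMatrix = FloatArray(9)
            SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
        }

        override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* no-op */ }
    }
    sensorManager.registerListener(listener, rotationSensor, SensorManager.SENSOR_DELAY_UI)
}
```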

Mishaal:

Basic sensors themselves, like the accelerometer, can also do multiple things; they don't have to be combined with another sensor to be useful. The accelerometer, for example, can be used for basic step counting, or it can be used for significant motion detection. So, for example, if you were to start running or going on a bike, it can detect that you're doing those activities, or it can also be used to detect just basic walking. So it has a lot of sensitivity there.
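
The significant motion detection Mishaal mentions is exposed to apps as a one-shot "trigger" sensor; a minimal Kotlin sketch, assuming the device exposes TYPE_SIGNIFICANT_MOTION:

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorManager
import android.hardware.TriggerEvent
import android.hardware.TriggerEventListener

// One-shot significant motion detection: the sensor fires once when the user
// starts walking, running, or riding, then automatically disables itself.
fun watchForSignificantMotion(context: Context) {
    val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    val motionSensor = sensorManager.getDefaultSensor(Sensor.TYPE_SIGNIFICANT_MOTION) ?: return

    val listener = object : TriggerEventListener() {
        override fun onTrigger(event: TriggerEvent) {
            // Fired once per request; re-arm the sensor to keep watching.
            sensorManager.requestTriggerSensor(this, motionSensor)
        }
    }
    sensorManager.requestTriggerSensor(listener, motionSensor)
}
```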

Mishaal:

So there's all sorts of sensors you can find on a modern smartphone, but I wanted to know from you guys: are there any sensors that you wish were packed into more smartphones these days? Samsung, for example, used to have a heart rate sensor in their flagship Galaxy S series, but they stopped including them with the Galaxy S10 launch. Do you think more smartphones should have dedicated health sensors like that? Or do you think another approach is more suited?

Kieron:

I like the idea of having a heart rate sensor in the phone, but I do question how much use it would get at any moment, given the plethora of smartwatches that have the same sensor in them, and the many, many health apps that are associated with that; even the really cheap smartwatches nowadays have those sensors in them. In terms of having a dedicated one, as opposed to using the camera system: it is a nice-to-have, but you do have to then consider, where would you put it? So, yeah, on the back of the phone; they used to sometimes be integrated into fingerprint sensors, so that made sense, that's where you'd think it usually was. You can do it with a camera if the phone is set up in such a way. I have used a very similar API with an app at work, that I won't go into, that used the camera together with the flashlight in order to detect the heart rate. One of the problems we had is that various different phones have the flashlight in different positions, and phones these days have so many different cameras on them, in different arrays, that you have to have your finger in a very specific position. So it would have been really nice if we could say, well, it works on the long list of Samsung smartphones that still included them after the S10, but obviously they dropped them, and we ended up just looking down the route of, can we integrate it into watches or anything like that? We never got that far, but that is one of the things we'd have preferred to do. So handing off this idea of doing heart rate sensing to the watch is probably a better way of doing it nowadays, with the watch being the kind of dedicated health system. But other sensors, I don't know, I'll hand that over to you guys.

David:

I think that if I were to choose a sensor to make more common on a phone, and only one phone ever did it, it would be radar: Soli, on the Pixel 4. While it was very limited in what it could do, the possibilities of that system were really intriguing, and I think you could implement that kind of technology across a wide variety of form factors. And that's something we've seen, because Google is trying to genericize Soli to be used in other applications by other businesses. And you'll probably see Soli popping up in things like, what is the Google telepresence project, Starlink? Not Starlink, that's Elon Musk. Starline, that's what it is, Starline. So when I look at something like radar, I think that holds a lot of promise, because it allows your device to see in a totally different way, and also to do so in a way that can recognize motion in a very fine-grained kind of manner. So maybe a phone isn't necessarily the best use case for that, but something that's stationary, where someone's sitting there or standing there, and the radar can determine an intent, basically, potentially using, you know, this low power sensor hub as part of a wake series of commands, basically, where there's a basic presence detection that trips a low power sensor, and that wakes up the radar. So, yeah, I think radar is one that I'm really intrigued by, personally.

Mishaal:

Yeah. And Google only included the Soli radar sensor in a single smartphone series, the Pixel 4. But before the Pixel 4 launch, in their patent applications they actually showed off various videos that showed Soli radars inside a wearable. We haven't seen that use case emerge yet. We have seen them equip their second generation Nest Hub smart displays with Soli radar, and I believe they use that in conjunction with the sleep detection feature. But yes, that would be a fascinating example to see more deployment of.

So, speaking of underutilized sensors: you know, there's a lot of sensors that have been around for a long time, and intelligent use of these sensors goes pretty far back, to the early days of Android. In fact, I'd say it's one of the key ways that one of the biggest smartphone makers, Motorola, used to differentiate its smartphones from the competition. If you recall, way back in the day, the company had a suite of actions called Moto Actions that would let you do things like turning the flashlight on and off with a chopping motion, by basically raising your hand and moving it up and down quickly. They used to call that Chop Chop, and I think I remember seeing a lot of commercials about that too; it was a pretty popular gesture. They had other gestures, like launching the camera by twisting your wrist, or putting your phone face down to silence calls. These features aren't really all that special nowadays because, well, they've basically been copied all over, but they weren't common to find way back then. Personally, I think a lot of these motion-based gestures are kind of silly, because I don't really see myself waving my hand around to do things with my phone. But what do you both think? Did you find any of these gestures particularly useful?

Kieron:

I can't say I've ever used the camera one, either. I much prefer the double tap and triple tap, the more recent equivalent of it, especially with the Snapchat integration. I can say a lot of people use the face down to silence, or face down to turn the screen off, especially if you don't have an ambient display. So, yeah, I can see the use. Personally, I'm not a huge fan of the air gestures either. They look great on paper, and they probably look great in commercials, and to board members. But, yeah, I don't think a lot of people would be using that kind of thing in their daily life, to be honest.

David:

And I think that's right, that they demo really well. And the Chop Chop, Mishaal, that was literally in commercials; the Moto chop, they really marketed that. What's interesting to me is that we never saw this really take off much with wearables, and Kieron, maybe you know a little bit about this, but I always assumed this was a data noise issue, where the wearable is getting so much accelerometer and gyro data that finding an action that can consistently be recognized with accuracy would be a real challenge.

Kieron:

I would imagine that is probably the key issue right there, but wearables also have the unique, or the more unique, property that they have a much smaller battery as well. So even if you do have a low power CPU, you still have to consider the battery aspect of it. And I'm not sure, I don't know if you guys know, whether the old wearable chips actually even had these lower power CPUs in them. That might be an open question. Do you know whether they did or not?

David:

That I don't know. Mishaal, do you have any insight?

Mishaal:

I know that over time, these wearable chips have been adding more and more low power subsystems and components. But I think that is one of the biggest differentiators between the wearable chipsets and the chipsets for bigger devices like smartphones: wearable chipsets don't have enough of the low power, like, machine learning cores, for example, to offload a lot of that processing to, which is why you can see things like super low power, efficient voice processing on smartphones, but not on wearables.

David:

That, you know, is a good point. The power envelope there is so much tighter; you do have to be very judicious with it. And I do wonder if even Wear OS would make something like that really practical for a developer. But that kind of gets into, you know, what can you do with the sensors, especially in the context of a smartphone, where these are really becoming very, very complex systems.

Mishaal:

Yeah. So over the years, you know, we've seen mobile SoCs become incredibly powerful, but we've also seen, behind the scenes, these sensors become incredibly more advanced; the data they're able to process is much more fine-grained. Smartphones themselves are getting bigger, so they can pack more and more sensors inside of them. And as I mentioned before, machine learning is, you know, something that's really taken off in recent years. Because of all of these innovations, we've seen software companies use sensor data in really innovative ways, ways that we never really considered possible many years ago. So I'd like to pose the question to both of you: what, in your view, has been the most impressive feature to come out of sensor applications?

Kieron:

I think car crash detection is quite a big one, but also, just in general use of sensor data, for me it's got to be the Now Playing feature on Pixels. Because, even if it doesn't go through this low power stuff as such, or at least I'm not aware if it does completely, the fact that they have managed to get that to work offline, and with such a big database, is just insane to me. I've looked at all the stuff behind it; it still boggles the mind when you look at it, and it just works: you can barely have a song playing in the distance, and it will recognize it sometimes. So that always listening capability, I think, is the top one.

David:

I think I'd have to agree, aside from car crash detection, which, I mean, we have in the outline here, so I think we're all being drawn to it because it is very cool. But the music detection feature is obviously an evolution of hotword detection, which became table stakes on smartphones in the last six years, I guess, about that, however long that's been around, something like that. So I do think that feature is a great example of the convergence of machine learning and AI with these ultra low power kind of sensor hubs, to take what is really a very data-less signal, right? You're taking one piece of input, and you're then feeding that into this massive dataset, and coming out with, you know, something actionable for the user. So in that sense, I do think the music detection feature is more impressive than car crash detection, because you are doing something with so little context; that's what's impressive about it. The car crash detection feature, which, Mishaal, you can explain a little bit how that works, really is impressive because it's able to fuse data from multiple sources into an answer.

Mishaal:

Yeah, car crash detection, like David said, fuses data from multiple sensors, including a gyroscope, an accelerometer, and a microphone. And basically, because it's such a life-saving feature, it needs to be processing data continuously from all these sensors, because, you know, it can't miss a single beat. It can't miss a car crash event; that would be catastrophic, potentially life-devastating, for the user. And, not as detrimental to the user, but the Now Playing feature that Kieron mentioned, you know, that needs to continuously process microphone data in the background, because it needs to pick up on audio cues and match the fingerprint against a database that's stored on-device. And because of this continuous processing requirement for the microphone for Now Playing, and the continuous requirement for the gyroscope, accelerometer, and microphone for car crash detection, companies like Google that are making devices with limited batteries, as in a smartphone, have to consider: how do we implement these features without destroying the battery life of the device? And, you know, if you were to keep the main applications processor on a device, such as the Google Tensor chip in the Pixel 6, awake at all times in order to process that data, it would probably destroy the battery life.

So instead, the solution is to not do that: don't wake the applications processor at all times. Instead, use something else, something much more low powered, something called a sensor hub, which is what David alluded to at the beginning of this episode. A sensor hub, which is also called a context hub in other contexts, is a low power processor that exists solely to process data from sensors and then wake the main applications processor whenever something needs to be done. We've seen sensor hubs used in devices going back all the way to the Motorola example; they used a sensor hub that's apparently an Arm-powered microcontroller. Sensor hubs can also exist on the die, in the form of an island that runs its own operating system; you'll find that on Qualcomm chips, which have what's called the SLPI, or Sensor Low Power Island, which is a part of the Hexagon DSP. Google devices have also had sensor hubs since the days of the Nexus 5X and 6P, and of course their latest smartphones also include a sensor hub, which they call the always-on compute.

So, the challenge with implementing a sensor hub is that it's a different kind of platform. The software running on it is very different; it's not Android. Most of the time, as David mentioned earlier, it's a real-time operating system, and different sensor hubs often run different operating systems. You'll have solutions like FreeRTOS or Zephyr OS, and then you'll have proprietary ones, such as the ones many silicon vendors ship. Because of this variability in the software platforms on sensor hubs, Google decided to create a standardized framework called the Context Hub Runtime Environment (CHRE), which is a software environment to execute small, native applications written in C or C++. These native apps, which are called nano apps because they are small and they are native, basically only do three things: they start collecting data, they stop collecting data, and they handle events. And through the CHRE API, Android is able to interface with these nano apps.
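
That Android-side interface is a system API that regular apps can't touch, which is a theme that comes up again later. As a hedged sketch of what the privileged side looks like, based on the android.hardware.location system API classes in AOSP (this needs the ACCESS_CONTEXT_HUB system permission and won't compile against the public SDK):

```kotlin
import android.content.Context
import android.hardware.location.ContextHubManager
import java.util.concurrent.TimeUnit

// Privileged code (system apps holding ACCESS_CONTEXT_HUB) can enumerate the
// device's context hubs and ask each one which nano apps are currently loaded.
fun listNanoApps(context: Context) {
    val manager = context.getSystemService(Context.CONTEXTHUB_SERVICE) as ContextHubManager
    for (hub in manager.contextHubs) {
        // Queries run asynchronously on the hub; block briefly here for simplicity.
        val response = manager.queryNanoApps(hub).waitForResponse(5, TimeUnit.SECONDS)
        response.contents?.forEach { state ->
            println("Nano app id=0x%016x version=%d".format(state.nanoAppId, state.nanoAppVersion))
        }
    }
}
```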

Mishaal:

So that's a whole bunch of context, and I'd like to ask Kieron about his research into nano apps, because he's been looking into these nano apps and context hubs, et cetera, while digging into how the Pixel 6's back tap feature works. So, Kieron, can you tell us a bit about the Pixel 6's back tap feature, otherwise known as Quick Tap, as well as your research into how it makes use of context hubs and nano applications?

Kieron:

Yeah. So, a little bit of background: the feature uses the accelerometer and gyroscope data, and basically just feeds that into machine learning algorithms to figure out whether the user has tapped on the back of the device once, and then whether they've done that again within a certain period of time, to detect whether they've done it as a double tap. That's as far as the feature goes on Pixel devices. Previously, when we'd seen it in Android 11, it was done in a system app, so it was all visible code, except obviously the machine learning algorithm, which is TensorFlow. So that was very high power, and that is what people were previously using; the original version of that is what was draining the battery for a lot of people. Even if you ran it on a device that offloaded some of the TensorFlow processing onto a low power CPU, having the processor online all the time meant you were draining a lot of your battery. But in Android 12, we saw the feature disappear from the app code, yet it still works. So I looked further into it, and discovered that there is now a nano app called Columbus, Columbus being the internal name of the feature at Google, which does all of the processing of the gyroscope and accelerometer data, and also the time-based data, and just emits these events for a double tap.

The way that Tap, Tap interfaces with this is a little bit of a hack. As well as the permissions framework, you have to get around the fact that the nano app is emitting just the double tap event, which is great if you only want double tap. But what if you want triple tap? The way I get around that is that it also emits events, for logging purposes, when a single tap happens. So I then do the triple tap detection in the app itself.
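
That in-app triple tap detection might look something like the following purely hypothetical Kotlin sketch; the window length and names are illustrative, not Tap, Tap's actual code.

```kotlin
// Count single-tap events (as reported by the Columbus nano app) inside a
// sliding time window, and fire a callback when three land close together.
class TripleTapDetector(
    private val windowMs: Long = 500, // assumed window, tune to taste
    private val onTripleTap: () -> Unit,
) {
    private val tapTimes = ArrayDeque<Long>()

    fun onSingleTapEvent(timestampMs: Long = System.currentTimeMillis()) {
        tapTimes.addLast(timestampMs)
        // Drop taps that fell out of the window.
        while (tapTimes.isNotEmpty() && timestampMs - tapTimes.first() > windowMs) {
            tapTimes.removeFirst()
        }
        if (tapTimes.size >= 3) {
            tapTimes.clear()
            onTripleTap()
        }
    }
}
```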

Kieron:

So that does use a little bit more power, but it's still tapping into the low power ability of these nano apps. Unfortunately, and this is something that people have asked me about a lot, they aren't portable, because you can't build them without having the source code, and they are very specific: specific to the device, and in some cases the firmware. So you can't just take one that's been built for a Pixel 6, chuck it on a OnePlus phone, and hope it will work, even if the OnePlus phone had a context hub available, which it doesn't. So no, it can't be ported, but yes, it is a great feature to use if it is available.

There are also other nano apps available. As well as Columbus, there's also one for detecting ambient background stuff, so just noise levels, that sort of thing, which I think feeds into activity recognition on Google Maps and that sort of thing, which is itself an extra nano app. The interesting thing, as Mishaal mentioned, is that they are quite modular, so one app will use another app to do some of its recognition, and they share data across them. So you've got one for activity. There's the car crash one, obviously, which takes in a lot of data and then emits metadata, information on when a car crash happens. But it's extremely specific; it has to pass a lot of checks, because, arguably, a number of false positives would be worse, hopefully not for the person, but worse in terms of PR, than not triggering at all, because that might be recoverable. But if you end up with the feature being disabled because you got into a dispute with the authorities, then that's much, much worse for PR. So they've been very careful with that one.

Interestingly, there are also things like geofencing. If you're not aware of geofencing, what that is, is basically the device has an app on it that says: I want to be notified when the device is in this location. It's used in things like Google Pay, when you go into a store, to recommend you use your cards, and that sort of thing. That needs to be done low power, because you don't want the CPU to be on all the time processing all of the location data, so that's done on a low power CPU as well.
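
The app-facing version of this is the Play Services geofencing API, which hands the fence off to the system so the app itself can stay asleep. A minimal Kotlin sketch, assuming location permissions are already granted (the request ID and broadcast action are hypothetical):

```kotlin
import android.annotation.SuppressLint
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import com.google.android.gms.location.Geofence
import com.google.android.gms.location.GeofencingRequest
import com.google.android.gms.location.LocationServices

// Register a 100 m circular geofence; Play Services evaluates it with
// low-power location sources instead of the app polling GPS itself.
@SuppressLint("MissingPermission") // assumes fine + background location are granted
fun addStoreGeofence(context: Context, lat: Double, lng: Double) {
    val fence = Geofence.Builder()
        .setRequestId("store-entrance") // hypothetical ID
        .setCircularRegion(lat, lng, 100f)
        .setExpirationDuration(Geofence.NEVER_EXPIRE)
        .setTransitionTypes(Geofence.GEOFENCE_TRANSITION_ENTER)
        .build()
    val request = GeofencingRequest.Builder()
        .setInitialTrigger(GeofencingRequest.INITIAL_TRIGGER_ENTER)
        .addGeofence(fence)
        .build()
    val pendingIntent = PendingIntent.getBroadcast(
        context, 0, Intent("com.example.GEOFENCE_EVENT"), // hypothetical action
        PendingIntent.FLAG_UPDATE_CURRENT or PendingIntent.FLAG_MUTABLE
    )
    LocationServices.getGeofencingClient(context).addGeofences(request, pendingIntent)
}
```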

Kieron:

But we've also found that there are strange ones, like calibration stuff for various sensors. So there's obviously a lot of stuff going on in the background that Google is trying to optimize, to stop it from draining the battery, making use of their Tensor CPU, which probably has a lot of processing power for this sort of thing compared to maybe some other CPUs. There are lots of nano apps you don't even know are running behind the scenes, because it is completely transparent to the user. It's a super interesting platform to look at, and it's just a bit of a shame that it isn't opened up to more developers, which we'll be going into in a moment, I'm sure.

David:

Yeah. And I think maybe we could skip ahead here to really why certain device makers are using this and why others aren't, because that seems to be the bigger issue: Google's framework for using these hubs has very low adoption. Kieron, why do you think that is?

Kieron:

It's partially due to what I've just said: you can't use it as a third-party developer. So there is less incentive for OEMs to put it in their devices, because there aren't loads of apps using it, with users going, oh, this doesn't work on a Samsung phone. It's literally the case that if Samsung came along and said, we want to implement the CHRE, chances are they've actually already got their own framework somewhere in the system that's doing the same thing for their apps. So there is little incentive for them to integrate the standard if it's not being used by lots of third-party apps. I'm not sure how it works for their kind of geofencing, like I was just saying, and activity tracking, whether those run on a different framework on Samsung phones, or on other OEMs that have different processors, or whether they simply run on the CPU and use more battery; I don't know. But that would be interesting to find out, because it's probably one of the big reasons why it's not been implemented.

Other than that, it's probably also due to lack of resources. Smaller OEMs may not have the resources, or the ability among their developers, to implement this sort of thing. It's very, very low level when you look at how it interacts with the system; I think, from the presentation they were running, it's C++. So while the platform itself can be integrated with from Java, you need C++ developers to be able to do that as well. It's a slightly different platform than most of Android, and obviously, at the current level, a lot of OEMs may not have the capacity for that sort of thing. I think those two reasons probably combine; on the whole, that's probably why they're not doing it.

Mishaal:

So I'd like to bring up one major downside to the fact that many OEMs apart from Google haven't implemented CHRE. And it's the fact that, as Kieron alluded to earlier, some Google applications make use of the framework: they have nano apps for things like activity recognition or geofencing. And those Google applications are actually found on pretty much every Android device. Google Play Services, for example, implements activity recognition as an API that other apps can subscribe to. So if you're looking to detect, say, when a user is walking or biking or running, you could use the Google Play Services activity recognition API to basically implement that in your application. But because very few devices support or have the nano apps that Google Play Services can run, except for Pixel devices, only Pixel devices will be able to have that activity recognition be incredibly power efficient while running continuously in the background.
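
For reference, the developer-facing side of this is the activity recognition API in Play Services; a minimal Kotlin sketch, assuming the ACTIVITY_RECOGNITION runtime permission is granted (the broadcast action is hypothetical):

```kotlin
import android.annotation.SuppressLint
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import com.google.android.gms.location.ActivityRecognition
import com.google.android.gms.location.ActivityTransition
import com.google.android.gms.location.ActivityTransitionRequest
import com.google.android.gms.location.DetectedActivity

// Ask Play Services to report when the user starts or stops walking; on a
// device with the right nano apps, this can be evaluated on the sensor hub.
@SuppressLint("MissingPermission") // assumes ACTIVITY_RECOGNITION is granted
fun requestWalkingTransitions(context: Context) {
    val transitions = listOf(
        ActivityTransition.Builder()
            .setActivityType(DetectedActivity.WALKING)
            .setActivityTransition(ActivityTransition.ACTIVITY_TRANSITION_ENTER)
            .build(),
        ActivityTransition.Builder()
            .setActivityType(DetectedActivity.WALKING)
            .setActivityTransition(ActivityTransition.ACTIVITY_TRANSITION_EXIT)
            .build(),
    )
    val pendingIntent = PendingIntent.getBroadcast(
        context, 0, Intent("com.example.ACTIVITY_TRANSITION"), // hypothetical action
        PendingIntent.FLAG_UPDATE_CURRENT or PendingIntent.FLAG_MUTABLE
    )
    ActivityRecognition.getClient(context)
        .requestActivityTransitionUpdates(ActivityTransitionRequest(transitions), pendingIntent)
}
```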

Mishaal:

Kieron, what do you think about this situation?

Kieron:

Well, I think it's a shame that it's not been implemented on other devices. It is possible for an application to load a nano app at runtime; that is a possibility, but it has to be able to interact with it, so it needs certain permissions, and it needs to be signed by the system. And also, nano apps have their own security layer, so they need to be signed with certain certificates in order to be able to do certain things. So it's possible that, if hardware support were better, they would be able to implement more of these things properly. But it does just lead back to this whole idea that there's been little incentive for them to do it, at least in terms of what shows to the user, especially if they have their own system to do it. Play Services is a complete black box; we have no idea how most of it works. So there's a decent possibility that in there, somewhere, there is the equivalent of this, processing on, I dunno, the Exynos CPU or Qualcomm CPUs, or similar things on Samsung devices that have their own implementation. So on that level, there may be things that we aren't aware of that are alternatives to this.

David:

Sure, and that makes sense, because when you're talking about, especially, anything that's getting ML models involved too, you're going to have different ML blocks across chipset vendors. You're going to have different implementations across generations of silicon, because it's still evolving pretty rapidly. And so probably a lot of these more sophisticated use cases, like Google's music scanning, require, like you said, Kieron, a very narrowly tailored nano app that is very specific to the use case and is only really going to be useful on the one device being targeted. So having a broadly accessible platform for building on this hub, this context hub, probably, like you said, doesn't have much appeal to the vendors; they're already using the tools they want to use and the frameworks they want to use. But I do think that, still, there's a whole lot of potential here in terms of what the CHRE can do and what it can see. And Mishaal, you have a great list here of kind of the stuff it can gather up, a lot of which I don't think is even being used by anyone, right?

Mishaal:

Yeah. So, looking at the documentation, the CHRE implementation actually supports multiple sensors, including the basic ones like the accelerometer, gyroscope, ambient light sensor, and proximity sensor, and it also has APIs to request location data, scan for Wi-Fi networks, get cell ID information, and process batches of data from a microphone. That last one, I think, is probably the big one; we probably haven't seen many use cases for the audio data processing. Google does have a few audio-related features that seem to make use of this, such as the Now Playing feature, but as Kieron mentioned, it doesn't look like Now Playing actually has a nano app, from what we can tell. But yeah, there's clearly a lot that can be done with this framework, and sensor hubs in general. So I wanted to ask both of you: what do you think would be the next big feature to make use of a sensor hub?

Kieron:

If it was opened out...

David:

Go on. Oh, no, go ahead, Kieron.

Kieron:

I said, if it was opened out to third-party developers, it would be sleep tracking apps, because they process audio data and movement data, especially if the device had a radar sensor on it as well. And, I don't know, I think some of them recommend that you put your phone physically next to you, so they'd probably also use gyroscope and accelerometer data. So something like that would be nice: a nano app that exposed some of that data to an app on the system. That would be a nice feature to have, which you'd probably have to market to users as a health feature. But other than that, it's hard to tell until they come out with it. Because with all of the ideas that have used nano apps so far, the Now Playing and gesture stuff, before those came out it was always, oh, that would be a nice feature, but it needs the internet, or it uses too much power. So until the OEMs come up with these ideas, and you think, oh yeah, that's a really good idea, it's hard to think ahead of time what they are going to be.

David:

I think that the one for me, and this is something that Google experimented with for a long time with Smart Lock, was the pocket detection mode for phones, which I don't think still exists. Mishaal, does that still work? Or was it deprecated a while ago? Basically, it seems like it's gone.

Mishaal:

Yeah, it detects when it's in your pocket. I'm not sure if it's still there; I think Smart Lock removed a lot of the unlocking abilities. Like, you can still have your device stay unlocked if it's connected to your Bluetooth smartwatch, but I don't know about anything else.

David:

Right. So I think that I could see a future for the CHRE being able to do some fusion around, basically, a lower level of trust for personal authentication. So not quite biometrics, but something that helps the phone realize: OK, I'm in your pocket. You know, like, I know how you walk, I know how you sound, I know if you've left a place. And use that to provide some more trust for, like, you know, seeing content without explicitly unlocking, because on Android, we're seeing that facial ID is either too expensive or too form-factor compromising for a lot of the OEMs to adopt; even Google got rid of it, and fingerprint scanners are quite spoofable. So I could see some security stuff going on there. And I imagine Google is probably already doing some of this when deciding whether the device needs to be unlocked manually.

Kieron:

I like the idea of walking detection, because everybody walks slightly differently.

David:

Yeah, I would be really interested to see what the sensor data would look like there, because Google also, no, I remember, this was Qualcomm, I heard years ago, they said that if you have a good enough radar system, you can actually identify somebody's radar signature, because everybody's radar signature is a little bit different. And so they had a proof of concept where they had radar Wi-Fi points deployed throughout a home, and they could tell who was in what room. So that's less of a low power issue, because these are stationary devices, but you could see a mobility case for something like that, potentially.

Mishaal:

Yeah. And speaking of walking detection, actually, the Digital Wellbeing application that's developed by Google has a feature called Heads Up that detects when you're walking and using your phone at the same time, and tries to warn you to just cut it out, because, you know, that's dangerous. And that's an example of a feature that I never would have thought to implement before. I'm sure processing the accelerometer data in a sensor hub to detect footsteps would be very power efficient, and would allow that feature to be continuously monitoring for, you know, heads-up moments.

Kieron:

Yeah. What I was going to say as well is that walking detection could be used for accessibility purposes too. So if somebody who is partially sighted is walking around a space they aren't familiar with, you could use it to vibrate if there is something unexpected in front of them, say. That would take lots of sensor data, but with things like radar and Soli sensing and that sort of thing, it's a possibility, I would imagine. So basically, your phone would become the navigation system for somebody to get around. That might be done in the future.

David:

Yeah. Where we're probably going to conclude here is getting into, OK, well, we've talked about smartphones, we've talked about what it looks like in some consumer implementations we've seen, but in our world, when we're thinking about dedicated devices and enterprise and business and industry, sensors are really commonly in use in all of these cases, especially in places like factory floors. So for example, on a factory floor, you have tons and tons of devices these days, often communicating over Bluetooth Low Energy or Wi-Fi, you know, not necessarily very sophisticated, but there is a lot of wireless activity, and there are a lot of computers crunching a lot of data. So imagine an environment like a factory, if you have people walking around. For example, if you use low power geofencing to detect, hey, you're entering a hardhat-only zone, your watch is going to vibrate and say: hey, are you wearing the right safety equipment right now? Or even go so far as to say: hey, you're entering a restricted zone, you're not authorized to be here. It could also be something like Apple's fall detection on the Apple Watch. So if you have a safety incident at a workplace, a wearable can tell you, hey, you have an employee who probably fell down. That can help you respond much more quickly to an incident, call paramedics, get somebody on site if that's necessary.

So there are a lot of ways you could be using something like this that probably aren't necessarily that sophisticated from a sensor data standpoint, especially in a really controlled setting like a factory floor, an office building, or, what's another location, a hotel or a restaurant, where the context is relatively fixed and most of the computing assets are fixed too. So it could be that you have employees wearing the device; it could be that you have employees using a handheld. One example that we've already seen, and that Apple is probably the most famous for doing on the consumer side, not that they were the first, is using, like, super high frequency millimeter wave to do basically echolocation of things like AirTags or your Apple Watch. So these things are, you know, becoming more and more common; devices are emitting more and more types of signals, and looking for more and more types of signals. So it only tracks that we're going to come up with ways to fuse that data usefully. It's hard for me to see a future in which this doesn't start to get more attention. Maybe not as the CHRE, I can't really speak to the viability of that as a platform or why it appeals specifically, but as an overall concept, I don't see sensor data becoming less important. I see it becoming way more important, in more and more use cases and contexts.

Kieron:

Yeah. Just touching on the idea of location data as well: the fact that, in the last two years, we've had the idea of precise location tracking from a wearable or handheld device using things other than high power GPS, that is going to feed into this as well. We've had a lot of Bluetooth tracking research that's been done for, say, track and trace and that sort of thing. I know from experience, though I won't go into it, that that has fed into little wearable devices that are able to track somebody's location on the factory floor using systems other than GPS or Wi-Fi, in sensitive areas where some things aren't allowed at all, or where you're not allowed your personal phone on you, that sort of thing. So all this data, and all of the new research that's been done in the last few years, may cause a huge increase in this sort of thing in the next few years. So it'll be very interesting to see the direction this goes.

David:

And I guess, before we wrap up, you know, maybe one more thing to think about in that context, of why Android makes sense as the platform for this, is the power efficiency that we were talking about earlier. And that, I think, is why the wearable and mobility use cases, like, these are new for a lot of these businesses and industries, and they're just starting to learn how this stuff can be used to gather data and, you know, meaningfully improve processes, or employee safety, or whatever their goal may be. So you have tons of companies coming over now from Windows machines who are trying to learn this mobility landscape. And Android is kind of, you know, CHRE may not be very well known at all in the enterprise world at this point, but the fact that Android has that extensibility built in at the OEM level, undoubtedly there will be exploration there. Because, I don't think, well, could you build this on iOS? Could you build this on Windows? Almost definitely not. Could you build it on Linux? Well, sure, if you had all the time and money and resources in the world. But Android is the only one, the only platform, that seems primed for this change.

Kieron:

Absolutely agreed on that. The only other prevalent one, like you say, would be some sort of Linux environment. And I think, from research that was done for this, Google have been looking at implementing it on a different system, so it wouldn't be on Android, but perhaps something that doesn't require the power to run the OS itself, which maybe would be useful on a smart display or a small wearable device. But Android is really the way to go for handheld devices that use something like this.

Mishaal:

Yeah. And if you look at Google's work in particular, they're particularly invested in promoting an RTOS called Zephyr. They've been contributing a lot of development effort to it, and they've also recently started to port the CHRE framework onto it, so that enterprises or developers that are building embedded controllers and are seeking an operating system can use that RTOS and implement Google's CHRE framework. And who knows, we might see an uptake of chips with these embedded operating systems interfacing with a high-level operating system like Android. Android is not competing with these RTOSes; it's working in conjunction with them. And I think Android, as both David and Kieron mentioned, is a perfect platform for that.

Kieron:

Yeah. One of the things that was mentioned by the guy giving the presentation, which I'll probably link at some point from this: he mentioned that, because CHRE is a framework, you could in theory have modules that sit on top of it that can be shared between different devices. So you could have a module that you can just import that does all of your location tracking for you, which saves you lots of time compared to implementing it yourself on a Linux-based system. So with something like that, we could really see a huge uptake in this sort of use for lots of small electronics, because it removes lots of the work they'd have to do to program this themselves.

David:

All right. Well, I think this is probably going to be the most exhaustive resource on CHRE and Android sensor hubs online, at least the most exhaustive audio resource. So Kieron, thanks so much for joining us. This is a really esoteric and, honestly, quite fascinating topic, because, you know, the fusion of hardware and accessibility in terms of the operating system is really interesting, and it's an area where Android has been uniquely equipped from the beginning to really capitalize, in ways that legacy platforms or more gated ecosystems like Apple's just don't really have, even when Google does try to make it hard to play with its toys. So, Kieron, where can folks find you and what you're working on, and is there anything you'd like to plug?

Kieron:

On the topic of this, I mean, I've got Tap, Tap out, because it took a good few months to get the CHRE stuff we found in the update to a point that I was happy with. Unfortunately, it looks like it might not work in Android 13 when that comes out, though it will still be accessible with root, at least. But yeah, Tap, Tap is available on my GitHub, which is github.com/KieronQuinn. I'm also on Twitter, @Quinny898, so you're welcome to follow that if you want to follow along.

David:

And Mishaal and I are with Esper. And if you found this show interesting because you're trying to build an Android device, whether from the ground up or using an operating system distributed by an OEM, come talk to us. This is the kind of discussion we'd love to have, because we want to know why you want to use Android and what you're trying to enable with it. It could be anything from a kiosk, to a smartwatch, to a television-size display, really anything where you're trying to do something very specific with Android, probably in either a business or customer context where somebody is interacting with this machine to do a specific sort of thing. Esper is really good at this. We build our own distro of Android that is designed for these use cases, that's really hardened, that's able to handle, you know, a lot of updates, and that's really, really easy to implement overall. And we work on a pretty wide variety of hardware platforms, including x86, if you'd like to talk to us about that. We're at esper.io, and this has been Android Bytes. Thank you for joining us, everyone.


About the Podcast

Android Bytes (powered by Esper)
A weekly show that dives deep into the Android OS
Android Bytes (powered by Esper) is the podcast that dives deep into the engineering and business decisions behind the world’s most popular OS. https://www.esper.io

Android powers over 3 billion devices worldwide and is the platform of choice for over a thousand companies. You’ll find Android on smartphones, tablets, watches, TV, cars, kiosks, and so much more. How does Google architect Android to run on so many form factors, and how do companies fork AOSP to make it run on even more devices? These are the kinds of questions the Android Bytes podcast considers each week.

Join cohosts Mishaal Rahman and David Ruddock, two journalists with extensive knowledge covering the Android OS platform and ecosystem, as they speak to system architects, kernel engineers, app developers, and other distinguished experts in the Android space.

Get in touch with us at Esper.io if you’re looking to use Android for your product — we have the experience you need.

About your hosts

David Ruddock

David is the Editor in Chief of Esper, and cohosts Android Bytes. David spent over 10 years as the Editor in Chief of Android Police, where he reviewed more phones than he'd care to admit, broke dozens of exclusive mobile industry stories (and also, phones), and ran one of the web's most vibrant Android communities.

Mishaal Rahman

Mishaal is the Senior Technical Editor at Esper.io and a cohost of the Android Bytes podcast. At his previous role as Editor-in-Chief at XDA-Developers, Mishaal was at the forefront of Android tech journalism, breaking story after story on new OS features and platform changes.