Episode 20

A snapshot of Android's camera problems

Published on: 5th July, 2022

On this week's episode, we break down how camera APIs work in Android and why third-party camera apps just can't match the features and quality produced by the stock camera. Long story short, it's a mess. What gives? And what's being done about it?

We're joined by Mohit Shetty, a developer behind Secure Camera, the camera app that ships with GrapheneOS and is available to everyone on the Play Store.

  • 01:48 - How does hardware fragmentation make camera app development on Android inherently more challenging than on iOS?
  • 03:52 - Was there anything Google could have done in the early days to make things better?
  • 08:21 - Why don't OEMs bother with making sure third-party camera apps work the same as the stock camera app?
  • 12:27 - What are some features that OEMs can't expose to third-party camera apps through Android's camera API?
  • 17:20 - How does Android's camera architecture work? What is Camera HAL 3?
  • 20:23 - How will Google Requirements Freeze (GRF) affect camera HAL versioning?
  • 24:11 - How do third-party camera apps interface with multiple cameras?
  • 29:28 - What is the Camera2 API?
  • 32:52 - What is CameraX and what can (and can't) it do?

Android Bytes is hosted by Mishaal Rahman, Senior Technical Editor, and David Ruddock, Editor in Chief, of Esper.

Esper enables next-gen device management for company-owned and managed tablets, kiosks, smart phones, IoT edge devices, and more.

For more about Esper: https://www.esper.io

Our music is "19" by HOME and is licensed under CC BY 3.0.

Transcript
David:

Hello, and welcome to Android Bytes, powered by Esper. I'm David Ruddock, and each week I'm joined by my co-host Mishaal Rahman as we dive deep into the world of Android. This week, we're talking about something that I think everyone on Android has had some kind of frustrating experience with: how third-party applications, be they Snapchat, Instagram, or any other application that has to use the camera on your device to capture content, interact with the operating system. How do they interact with your device, and what are the challenges associated with that? We have a guest today who is really experienced in working with the Android camera APIs, and Mishaal, I'll allow you to introduce him.

Mishaal:

Thanks, David. So on today's episode, we've invited Mohit. He's part of the GrapheneOS project. If you recall, we talked about that project on a previous episode of the show, when we invited one of the developers of the actual operating system, but today we're talking about the Android camera APIs. So we wanted to invite an expert who has worked with the Android camera APIs, which is why I invited Mohit, who is part of the development of the camera application on GrapheneOS. So Mohit, can you give us a brief introduction? Tell us what the application you work on at GrapheneOS is.

Mohit:

Hi everyone. So basically, I've been working on the Secure Camera app at GrapheneOS over the last few months. Currently, this application is being used in production by GrapheneOS on all of its devices as a replacement for the default AOSP camera app. Apart from that, you can always find the application on the Play Store, where we support all devices starting from Android 10.

Mishaal:

Yeah, thank you, Mohit. So he's working on a camera application, and of course, in order to develop a camera application that supports multiple devices, you need access to the Android camera APIs. But as David mentioned at the beginning of the show, for those of you who use Snapchat or Instagram, you're probably wondering how they work and why they don't take nearly as good photos as their stock camera app counterparts. It's a really tricky and complicated question to answer. If you look at the iOS side of things, that's not true: Snapchat and Instagram will take pretty much identical photos to the stock iOS camera app. But on Android, you'll get very different results depending on the device.

The issue, of course, stems from the big F word, the one we bring up on pretty much every episode and in every newsletter: fragmentation. But how exactly does fragmentation affect things? There are two aspects: camera hardware fragmentation and camera software fragmentation. First, hardware fragmentation. This is the reason why everything is the way it is and where camera software fragmentation stems from. This one's kind of easy to imagine. Hardware is fragmented because, whereas on iOS you have one manufacturer, Apple, who makes a handful of phones, controls the software stack from top to bottom, and only has a handful of image sensors to consider, on Android you have thousands of different device models from dozens of different brands, each running their own fork of AOSP, with different silicon from MediaTek, Qualcomm, et cetera. Thus they each have their own image signal processor (ISP) implementations, and they also source image sensors from different vendors like Samsung and Sony. So you have all these different combinations of hardware and software to consider in the Android ecosystem. If you're an app developer looking to create a camera app that supports all these different combinations of devices, you're going to have a bad time, because there are so many quirks and differing capabilities across devices that it's a staggering mess to consider.

So Mohit, I wanted to ask you, just taking a step back and looking at the overall picture: do you think it was inevitable that Android camera implementations would become a mess? Was there anything that Google could have done, or maybe mandated, in the early days of Android so we wouldn't have gotten to this point?

Mohit:

So before we speak about how we could have solved this problem back then and dive deeper into this discussion, it's important for us to understand what the problem really is. The messy Android camera implementations often found in apps that directly use the Camera1 or Camera2 API while targeting a wide range of devices are mainly attributed to OEMs or vendors not properly implementing these standard APIs for the hardware the device relies on. This in turn leads to other apps having to work around those unexpected issues from time to time. Trying to solve this may sound as simple (and as bad) as some short debugging and analyzing what could have possibly gone wrong, since having access to device-specific code isn't always feasible, but that isn't actually the case. It's much worse than that. Workarounds that work for a specific device may or may not work for another, or, even worse, could cause existing support for another device to break. This leads to having an entire mapping of device-specific implementations just for a single piece of functionality, even for a simple camera app that's expected to work across a wide range of standard Android devices. Now try imagining this for every possible camera feature, and, more importantly, for a project that was meant to fulfill a different set of requirements but is now busy heavily investing resources into working around device-specific bugs.

All these issues could have largely been avoided if Google or the OHA back then had developed, promoted, and well enforced stricter Compatibility Test Suite and Compatibility Definition Document requirements, that is, CTS and CDD requirements, instead of the current scenario where vendors don't really need to get their devices certified as CTS or CDD compliant before releasing them into the market. The scope of the CTS could have been expanded further, and requirements that are described as recommendations could have been made mandatory by requiring vendors to pass those tests. Apart from that, they could have also designed better camera APIs from the beginning that are not needlessly complex for both app developers and the camera driver or hardware abstraction layer implementations to handle, making the overall process of writing compliant code much easier.

David:

So I guess I have a question here, and it's kind of philosophical. If Google were more strict about this and had a much more rigorous CTS process around camera APIs, compatibility, and features, how much do we think this would reduce the amount of innovation that has happened in handsets? Because vendors have been able to do things that essentially break Android's camera framework: features won't work with third-party apps the right way, but they will work with the vendor's camera application, where they've developed this very special functionality, whatever it is, super zoom or some kind of special portrait mode, and even lighting filters can be stuck behind that wall now. So I guess that's my question: if Google were more strict, would we see fewer camera features on phones?

Mohit:

For that, we could separate those implementations. For example, for the basic functionality that other applications are expected to use, we could have a hardened, separate implementation, and to add those additional features, such as the filters we see on modern phones, those could be separated into a different implementation interfaced on top of the existing basic one. For the initial basic implementations, we could have ensured that all the test cases pass, and, as I said before, we could expand the CTS further so that requirements described as recommendations in the docs are made mandatory by enforcing CTS and CDD passing for vendors who want to release their devices into the market.

Mishaal:

So I have a kind of different take on this question that you asked, David: would innovation be hampered by enforcing camera compatibility with third-party applications? Because of the way things work right now, I think the answer would be yes. There are ways that OEMs could implement feature parity between what their stock camera has access to and what third-party camera apps have access to. Of course, as Mohit mentioned, the APIs themselves are a little frustrating and a little difficult to use, and Google has made strides over the years to improve them, but the APIs are there; OEMs could expose a lot of the functionality they offer in their stock camera app. But why don't they? The question then is, what would imposing a CTS requirement actually do? And I think it would just significantly delay the launch of devices. The way the Android market works right now, if you're an OEM competing in Southeast Asia, you don't have time to focus on making sure your latest mid-range phone's stock camera app and third-party camera apps have access to the same functionality. You've got to get that phone from concept to design to testing to launch within maybe a year or less for some of these mass-market phones. And if you're doing that with a whole range of budget, mid-range, and sometimes high-end phones, you just don't have the time and resources to invest in making sure all of the innovative features you want to market on your phone and your stock applications work the same in third-party camera apps, because there are just so many different considerations to implement. Like, why bother?

Mohit:

And a solution for that could be that Google or the Open Handset Alliance designed a set of standard classes, tied up with certain camera hardware companies, and shipped the code for each of those standard classes in the AOSP source code itself. That would ensure quick delivery of devices to the market, and we could have a fallback class for all the other vendors that wish to have their own well-researched implementations. We could then enforce the CTS and CDD in a much stricter way, so tracking those issues would be much easier than having the mess we're currently in. That's just an out-of-the-box idea; I haven't really worked with hardware, so it's an abstract solution.

Mishaal:

That is a good point that you brought up, though. I think that would be the most effective solution: bypassing the OEMs and working directly with the ISP designers and the image sensor vendors, because OEMs have many different models, but the ISP and image sensor vendors are distributing just a few specific products, and they're the ones writing the drivers. If Google could get them to standardize the way their drivers interact with Linux and Android, that would do wonders for how all of that propagates through the market. But there are some complications to consider. First, is there even enough desire from consumers for OEMs to actually bring feature parity between stock camera apps and third-party camera apps? We have seen some marketing from Samsung and Google, the partnerships with Snapchat and Instagram and whatnot, so there is clearly some potential there, because they wouldn't be marketing these features if they thought people didn't care about them. Second, are they even legally allowed to expose some of the features that they offer in their stock camera app? There are a lot of vendors behind the scenes whose tech is in smartphones that you've never heard of. There are facial recognition vendors, for example, who provide their software implementations to smartphone makers. We don't know the exact terms of their agreements, but maybe those terms say you're only allowed to use this in your own camera app, because they don't want their technology being used by any arbitrary third-party app they don't collect a licensing fee from. So what would happen in that situation? Would the OEM even be allowed to expose, say, their beauty mode that's derived from some third party's tech? Maybe, maybe not. So that could be one of the reasons why these features aren't being exposed to third-party apps across the board.

So Mohit, I kind of wanted to ask you: there are ways for OEMs to expose features to third-party apps through the camera APIs, but there are also a lot of features that can't be exposed to third-party apps simply because the Android camera APIs don't provide a way to expose them. What are some examples of features that you can't implement in Secure Camera because there's just no API for them?

Mohit:

So with the Secure Camera app at GrapheneOS, we have mainly focused on providing the most essential features to users initially, so for the time being we don't have any such feature in mind that cannot be implemented, and we are currently sticking to only using the CameraX library to ensure that we don't end up spending too much time dealing with device-specific quirks. In my experience, there are not really many examples of such cases in general. The camera APIs that Android provides, while being complex, are highly extensible in nature, and hence implementing any valid feature with some additional code or a support library isn't impossible as such. But of course there could be other limitations based on the scope of the project, how practical it is to implement a certain feature in terms of its maintenance, or hardware limitations that could make it impractical to have a certain feature on quite a lot of the devices the project targets. So it mainly depends on how the developers perceive the problem, what the scope of the project is, and how much time and effort they're willing to give to it. And of course, there can be hardware limitations apart from that.

Mishaal:

Right. So can you just give me some examples of features that you can't implement because of API limitations? Like portrait mode, is that something you can implement? Beauty mode, is that something you could implement, et cetera? What are some features you can't implement because of a lack of support in the API?

Mohit:

So one of the straightforward examples of that could be the vendor extensions that the CameraX library provides, which aren't available on most of the devices in the market. These mainly include the portrait mode and the night mode.

David:

Could that include augmented reality features, for example?

Mohit:

Yeah, we could do that, but as far as I know, the Android camera APIs don't support anything related to augmented reality. We could surely use some external library for that, though.

Mishaal:

Yeah, that's a good one, David. The Pixel camera used to have AR Stickers, I think they were called, directly integrated into the camera app, and that just wasn't exposed to third-party apps via an API. Of course, there's a separate ARCore API that's part of Google Play Services, but the actual AR Sticker feature in the Google Camera app wasn't exposed to third-party apps, as far as I'm aware.

So speaking of the camera APIs, we've talked about the hardware and device fragmentation side. Now I wanted to actually talk about the API itself, which I think is probably the most interesting part, the thing people actually want to hear about. Clearly the APIs have evolved over the years. Mohit mentioned we had Camera1 and we have Camera2, and I'm sure people have also heard of Camera HAL3. What do these numbers mean? We'll get to that in a bit, but I wanted to talk briefly about the evolution of the camera APIs and how they work.

It's only recently that the camera APIs have become not a nightmare to use. As Mohit mentioned, the Secure Camera app uses the CameraX API, which is part of the reason why camera app development has become simpler. But in the past, most camera app makers had to use the framework camera APIs, and they were kind of a nightmare to implement, from what I've heard reading developer forums. People speculate the reason Snapchat for many years used to just take a screenshot of the viewfinder, instead of actually using the camera API directly to take a photo, is because they'd rather deal with a low-quality screenshot of the viewfinder than actually implement the API on every device they wanted to support. And remember, this is Snapchat, so that's tens of millions of users on multiple different devices. Then, rather infamously, a few years ago, Moment, who sells camera hardware accessories and also makes an app for iOS called Pro Camera, tried to port their Pro Camera app to Android. After two years they just gave up. They said, we quit, and they gave the reason to 9to5Google: fragmentation. They had a chart showing all the features they supported on Pixels and on Samsung devices, and there's a whole bunch of green, a whole bunch of yellow, a whole bunch of red. It was just so inconsistent across what they had to support, and there are so many different models, that it just wasn't feasible to support without investing significant man-hours.

So why is this such a nightmare? What exactly led Moment to quit? Let's talk about the Android camera architecture. At the low level, you have the drivers for the actual camera or image sensor on the device. Those are written by image sensor vendors like Samsung and Sony and are distributed to OEMs for integration into their builds. Next in the pipeline, all the raw Bayer data from those image sensors is processed by the image signal processor that's part of the SoC inside the device. Of course, that image signal processor is developed by yet another company: Qualcomm, MediaTek, or Samsung. The ISP does processing of its own on the raw Bayer output from the image sensors. That doesn't happen if you're taking a RAW photo, of course, but that's another topic entirely, and it's a feature that's not supported on every device.

So you have the drivers that are closed source and provided by the image sensor vendors, and then you have the ISP architecture implementation that's written by the silicon vendor. Both of those are pretty much black boxes to camera apps: you have no insight into exactly what capabilities they have or their data sheets. You can instead only rely on what capabilities they expose to the framework, which is determined by the hardware abstraction layers (HALs) written by the OEM and the vendors. Apps on the Android side interact with the camera hardware using the Camera2 API, the framework API that talks to the underlying camera service in Android, which then interacts with the camera HALs. The HALs define the standard interface between the higher-level Android framework and the lower-level camera driver, and, as I mentioned, that implementation is what defines what capabilities are exposed to apps.

There are multiple camera HAL interfaces that OEMs have to implement: there's a camera provider HAL and a camera device HAL. But the problem is that OEMs aren't required to implement a recent version of each HAL, nor are they required to implement every capability introduced with each HAL version. So, as Mohit mentioned, Google could update the CDD and the CTS to test for more recent versions of these HALs and check whether OEMs have actually implemented them and defined certain capabilities, but they don't. Right now, in order to pass certification, if you launch a device with Android 10 or later, you only have to implement camera device HAL version 3.2 or later, and HAL 3.2 was actually introduced all the way back with Android 5.0 Lollipop. Even Android 13 still has backward compatibility with HAL 3.2, even though the latest version is 3.8, which added support for a really basic feature: flashlight brightness control. So as you can see, a lot of features are being added along the way, some of them quite basic, but because there's no specific requirement to implement a specific HAL version, it all depends on the OEM and the silicon vendor and what exactly they're willing to implement and expose to third-party camera apps.
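To make that concrete, here is a minimal Kotlin sketch (not code from the episode or from Secure Camera) showing that an app can only see what the HAL chooses to advertise: it reads each camera's supported hardware level and checks whether the RAW capability is reported, assuming an ordinary app with camera access.

```kotlin
import android.content.Context
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata

// Log what each camera HAL chooses to expose to apps. Anything the HAL
// doesn't advertise here simply doesn't exist as far as third-party apps
// are concerned, no matter what the hardware can actually do.
fun dumpExposedCapabilities(context: Context) {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    for (id in manager.cameraIdList) {
        val chars = manager.getCameraCharacteristics(id)

        // Hardware level: LEGACY, LIMITED, FULL, or LEVEL_3.
        val level = chars.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL)

        // Capabilities the HAL advertises, e.g. RAW capture.
        val caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES) ?: intArrayOf()
        val supportsRaw = caps.contains(CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_RAW)

        println("Camera $id: hardwareLevel=$level, rawSupported=$supportsRaw")
    }
}
```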

David:

Mishaal, before we drop into the next part: the reason this exists is basically the Google Requirements Freeze, right? Because Google has been letting the phone vendors lag behind, essentially, by up to what, three years effectively, or more? Or is this a different situation?

Mishaal:

That's actually something that's going to make this situation even worse, because in the past, the hardware abstraction layers could be updated; they were basically required to be updated whenever a vendor like Qualcomm had to update their implementation to support a newer Android version. But with GRF, as you mentioned, say a device launches with Android 11: whatever hardware abstraction layers that device ships with will never be updated until that device reaches the end of its upgrade cycle, say once it's updated to Android 15, because Google guarantees backward compatibility with vendor implementations that are several letter versions behind. So a device can be upgraded to Android 12, Android 13, Android 14 and keep the same hardware abstraction layers and kernel interface it shipped with on Android 11. For example, camera HAL 3.8 introduces support for flashlight brightness control. A device that launched with Android 12 is not going to have 3.8, because that version wasn't even introduced until this release, so a device upgrading to Android 13 probably won't get support for this basic flashlight brightness control feature, because vendors aren't going to go back and update their HALs.

Yeah, it's a mess, because there's no real requirement on what OEMs have to expose and actually implement in their HALs. But of course, Google chugs along: they keep updating the underlying hardware abstraction layer interface, and they keep defining new capabilities in each HAL version. For example, with camera device HAL 3.5, they introduced the ability for OEMs to define a zoom ratio, which provides support for optical camera zoom capabilities to third-party apps. Then, as I mentioned, 3.8 introduced flashlight brightness control, and also in Android 11, with 3.5, they introduced the ability for OEMs to expose bokeh support. So OEMs have to define these specific constants in their hardware abstraction layer to expose optical camera zoom capabilities and bokeh. It's basically up to the goodwill and willingness of the OEM and the vendor they got their HALs from to implement these features.
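As a hedged illustration (my own sketch, not from the episode), here is how an app has to feature-detect the two HAL-dependent capabilities mentioned above, the zoom ratio range from Android 11 and the flashlight (torch) strength control from Android 13; `cameraId` is a placeholder for an ID from `cameraIdList`.

```kotlin
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.os.Build

// Both features below only work if the device's camera HAL actually
// advertises them; on older or incomplete HALs the keys return null.
fun probeHalDependentFeatures(manager: CameraManager, cameraId: String) {
    val chars = manager.getCameraCharacteristics(cameraId)

    // Zoom ratio range (Android 11+ / camera device HAL 3.5+). A lower bound
    // below 1.0 means the HAL exposes "zoom out" beyond 1x to third-party apps.
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
        val zoomRange = chars.get(CameraCharacteristics.CONTROL_ZOOM_RATIO_RANGE)
        println("Zoom ratio range: $zoomRange")
    }

    // Flashlight (torch) strength control (Android 13+ / camera device HAL 3.8).
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.TIRAMISU) {
        val maxStrength =
            chars.get(CameraCharacteristics.FLASH_INFO_STRENGTH_MAXIMUM_LEVEL) ?: 1
        if (maxStrength > 1) {
            // Turn the torch on at roughly half strength (camera must not be in use).
            manager.turnOnTorchWithStrengthLevel(cameraId, maxStrength / 2)
        } else {
            println("HAL only supports a single torch brightness level")
        }
    }
}
```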

Mishaal:

So I wanted to ask you now, Mohit: are you aware of any way your work has been affected by the HAL versions and implementations across different devices? For example, are there features that Android technically supports, but that you can't actually use in Secure Camera because the device doesn't have the right HAL version?

Mohit:

So the Secure Camera app that we're working on at GrapheneOS has, at least for the most part, not directly relied on the hardware abstraction layers or the Camera1 or Camera2 APIs. The main reason we push for such a design is the device-specific issues that we would otherwise have to spend a lot of our time dealing with, which could unexpectedly blow up as more features outside CameraX's supported set accumulate and as the app gets used by more kinds of devices, which is quite closely related to the fragmentation issue we were discussing earlier on this podcast. That valuable time could instead be spent by contributors in other places that need more help and attention in GrapheneOS. However, we did recently make a few exceptions to that rule, by supporting a feature that relies on the Camera2 interop API and by introducing experimental support for ZSL, which isn't supported by a lot of devices for the time being. So, coming back to the question: no, we haven't really faced any issues or trouble dealing with different HAL versions, primarily because we haven't really dealt with them in the code of our camera application.

Mishaal:

Yeah. At least with Project Treble's introduction and the vendor test requirements surrounding HAL releases, you can be assured that at least the main rear-facing and front-facing cameras will be operable on any given device. For example, if you were to take a device that supports Project Treble, such as the Lenovo Tab K10, and flash a generic system image of Android 11, 12, or 13 onto it, very likely you could just open the AOSP camera app and the rear-facing camera and the front-facing camera would work. The reason is that that's something Android actually mandates testing for: part of the vendor test requirements is that at least the main rear-facing and the front-facing cameras have to be operable. But of course, nothing else is guaranteed. You're not guaranteed to have the image processing models and the add-on camera features that the stock camera app has. Most devices, or at least most smartphones these days, have multiple rear cameras, and if you try to use those on a GSI, you probably won't be able to actually access the secondary cameras.

Android actually does provide support for third-party camera apps to use those cameras, but it's through an API that Google introduced in Android 9 called the multi-camera API. What OEMs have to do is define logical camera devices, logical meaning they're not physical. These logical cameras are composed of two or more physical cameras that point in the same direction. So, for example, you can have a main camera and a telephoto camera, and you can create a logical camera composed of the main and the telephoto. The benefit of doing that is that the third-party camera app sees it as one camera, and it can switch between the two basically seamlessly. If you're zooming from 1x to 5x, the underlying camera HAL handles the switching between the lenses; the app itself doesn't have to manually detect, oh, I'm supposed to change lenses here. The logical camera interface defines that change.

The only problem, of course, is that this is yet another thing OEMs don't have to implement. Google tried to make support for the multi-camera implementation mandatory, but they actually reneged because it conflicts with GRF, which is something David mentioned earlier. They wanted to make it mandatory for all Android 12 launch devices to support multiple cameras: if a device ships with two or more rear cameras on the back, the OEM would have to define at least one logical camera for those rear cameras. But because devices can launch with Android 12 while running Android 11 vendor software due to GRF, Google can't mandate that, since it would make those devices incapable of supporting this feature and thus incapable of launching with Android 12.
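Here is a minimal Kotlin sketch (an illustration, not from the episode) of how a third-party app discovers whether the OEM ever defined a logical camera through the Android 9+ multi-camera API; it assumes the app runs on API 28 or later.

```kotlin
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager
import android.hardware.camera2.CameraMetadata

// Find logical cameras and list the physical sensors behind them.
// If the OEM never defined a logical camera, this finds nothing,
// which is exactly the gap discussed above.
fun findLogicalCameras(manager: CameraManager) {
    for (id in manager.cameraIdList) {
        val chars = manager.getCameraCharacteristics(id)
        val caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES) ?: continue
        val isLogical = caps.contains(
            CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_LOGICAL_MULTI_CAMERA
        )
        if (isLogical) {
            // physicalCameraIds is available on API 28+ (Android 9).
            println("Camera $id is logical, backed by: ${chars.physicalCameraIds}")
        }
    }
}
```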

Mishaal:

So, Mohit, I wanted to ask you a bit about your thoughts on multi-camera support on Android. What are your general thoughts on Android's support for exposing multiple cameras and actually using them in apps? Do you think a requirement for OEMs to support the API would actually make a difference?

Mohit:

The multiple cameras that we often find on modern Android phones have, for the most part, had separate IDs assigned to each of them, such that you could choose a camera by its vendor-specified ID. A while ago, Android came up with support for a multi-camera API that can be used to merge multiple cameras into a single logical camera instance with its own ID, which is often used to extend the zoom levels provided by a single logical camera. The API can certainly be used for several other purposes as well; zooming is just one of them. For the Secure Camera application, yes, it does make a difference in the range of zoom levels that we provide to our users. The multi-camera zoom is expected to be implemented by the vendor in our case. For example, we've come across many instances, especially on non-Pixel devices, where the device did have the hardware required to extend the zoom range, such as an ultra-wide lens to support zooming out a bit further, but the device itself didn't support zooming below the default 1x, mainly because the vendor didn't actually implement it for their own reasons. In one case the device actually predated the feature itself, so the vendor could no longer roll out updates since they no longer supported that device. Having a generic solution by maintaining a database of physical camera IDs for all such devices on our end wouldn't really be feasible, and perhaps it isn't even a viable solution for the CameraX team, as there's a possibility that these IDs may change between updates, which could drastically increase the complexity and overall effort required to maintain that entire workaround. Hence, in our view, it might be better to mandate multi-camera zoom in the test suite for all upcoming devices that otherwise have the hardware to support it.
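As a hedged illustration of what Mohit describes (my own sketch, not Secure Camera's code), here is how a CameraX app can only read whatever zoom range the vendor's HAL reports; on devices that never wired sub-1x zoom through the HAL, the minimum ratio simply comes back as 1.0.

```kotlin
import androidx.camera.core.Camera
import androidx.camera.core.CameraSelector
import androidx.camera.core.Preview
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.lifecycle.LifecycleOwner

// After binding, CameraX reports the zoom range the vendor implemented.
// If the HAL never exposed the ultra-wide lens through zoom, minZoomRatio
// will be 1.0 even though the hardware could go wider.
fun logZoomRange(provider: ProcessCameraProvider, owner: LifecycleOwner) {
    val preview = Preview.Builder().build()
    val camera: Camera = provider.bindToLifecycle(
        owner, CameraSelector.DEFAULT_BACK_CAMERA, preview
    )
    val zoomState = camera.cameraInfo.zoomState.value
    println("Zoom range: ${zoomState?.minZoomRatio}x to ${zoomState?.maxZoomRatio}x")

    // Requesting 0.6x fails (the returned future completes exceptionally)
    // if the vendor didn't expose sub-1x zoom.
    camera.cameraControl.setZoomRatio(0.6f)
}
```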

Mishaal:

Okay, so you just mentioned something I actually wanted to talk about next: camera IDs, and how apps are actually supposed to control the camera hardware using the API. I've already mentioned the Camera2 API, which is the framework API that enables enumerating, i.e. listing, the camera devices that are available, or rather that are exposed to Android. That API also lets apps connect to those devices, configure the outputs, send capture requests, and then read the resulting metadata and image data. This API is a little difficult to use, because there's a lot of legwork, a lot of preparation work, that apps have to do before they can actually start making capture requests. First of all, since OEMs have to expose the capabilities of each individual camera through Camera2, apps then have to probe what features are supported by the specific camera device behind a given ID. If you've ever used one of those Camera2 API probing apps, you've probably seen a high-level summary of what your device supports: strings that say LIMITED or LEVEL_3, which basically tell you, okay, this specific camera supports this list of features. It's then up to the actual camera app to decide: because this specific sensor supports this feature, this is what I'll enable inside the app for the user. So whenever you use an app built on the Camera2 API on a specific device, you may notice different features than what are available on other devices used by other users.

So, Mohit, I'm sure you've at least heard of or seen bad reviews or complaints from users who report that a feature or two is missing on their device. They blame you or the GrapheneOS project's Secure Camera developers, when in fact it's because the OEM didn't expose the feature when writing their interface for Camera2. How do you deal with this? What exactly can you do about it?
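To give a sense of the "legwork" described above, here is a compressed Kotlin sketch of the raw Camera2 flow (not Secure Camera's code), assuming the CAMERA permission is already granted and an output Surface (for example from a SurfaceView or ImageReader) already exists.

```kotlin
import android.annotation.SuppressLint
import android.hardware.camera2.CameraCaptureSession
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CameraManager
import android.os.Handler
import android.os.Looper
import android.view.Surface

// Camera2 requires opening the device, waiting for a callback, creating a
// session, waiting again, and only then submitting a CaptureRequest.
@SuppressLint("MissingPermission") // assumes CAMERA permission is already granted
fun captureOnce(manager: CameraManager, cameraId: String, target: Surface) {
    val handler = Handler(Looper.getMainLooper())

    manager.openCamera(cameraId, object : CameraDevice.StateCallback() {
        override fun onOpened(device: CameraDevice) {
            device.createCaptureSession(
                listOf(target),
                object : CameraCaptureSession.StateCallback() {
                    override fun onConfigured(session: CameraCaptureSession) {
                        val request = device
                            .createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE)
                            .apply { addTarget(target) }
                            .build()
                        session.capture(request, null, handler)
                    }

                    override fun onConfigureFailed(session: CameraCaptureSession) {
                        device.close()
                    }
                },
                handler
            )
        }

        override fun onDisconnected(device: CameraDevice) = device.close()
        override fun onError(device: CameraDevice, error: Int) = device.close()
    }, handler)
}
```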

Mohit:

So while most of the features that our app provides are available across all Android devices we support, there are a few features that we only support when the OEM implements and exposes them for other apps to use, just as discussed earlier. We've recently heard quite a few complaints about our app not supporting zoom outside the default range despite the device physically having support for it, on a lot of non-Pixel devices. Apart from that, the major complaint we've received is about support for vendor-specific extensions or modes that the user thinks are missing, when they actually need to be implemented or exposed by the vendor themselves for an app to show those modes on a given device. Based on where the issue is reported, we inform the user about it and explain why the camera app doesn't support the given set of features on their device, and either look for some timeline for it, if one is easily available, or ask the user to inquire about it with their respective vendor. If by any chance it's something the CameraX team needs to fix on their end, or it's related to Pixel devices not implementing something correctly, we report it to the CameraX team and discuss how the issue needs to be resolved or could be further addressed. They're quite friendly and supportive, so reporting issues to the CameraX team has never been a hassle.

Mishaal:

We've actually mentioned CameraX a lot, but we've never explained what it is. For those of you who don't know, CameraX is something Google introduced back at I/O 2019, and the Secure Camera app that Mohit works on actually uses CameraX; it's one of the few apps I'm aware of that proudly states that it uses the CameraX API. I wanted to ask you, Mohit, a very basic question about CameraX, just so our listeners can understand what exactly it does: what benefits does CameraX offer over Camera2, and why does Secure Camera use it?

Mohit:

So essentially, CameraX is a better designed, simpler, higher-level API that is backward compatible across Android versions while also taking advantage of Camera2, and of advanced Camera2 features whenever they're available. Essentially, this ensures that the camera features you develop are high quality and work across a wide range of devices, while allowing quick and steady development of your application thanks to the use-case-based modeling of its classes and the high-level simplicity that the library has to offer. Apart from that, it has a very high-quality, high-performing camera capture stack that's far better than the vast majority of camera applications currently available. As one short example, in the experimental ZSL support they recently added to the library, they included a ring buffer and some pre-processing, mainly to improve the overall performance and hence the quality of the ZSL feature they provide. ZSL stands for zero shutter lag, which can be used as a capture mode to take faster shots in your camera application.

Apart from that, the overall code quality of your camera feature implementations tends to improve a lot while using the CameraX library, in terms of readability, which in turn drastically reduces the overall maintenance cost. The reason is that CameraX works around a lot of device-specific quirks impacting a specific device or a certain range of devices, which makes the code a lot more readable, so making changes or adding any custom implementation on top gets a lot easier. Apart from that, one can always expect good support from the CameraX team for the device-specific issues that users face with the officially supported CameraX features. This in turn has also encouraged us to investigate such issues and bug reports further and report them to the CameraX team, which could potentially help a lot of other applications for years to come. While using the CameraX library, one can always fall back to the Camera2 APIs using the Camera2 interop API, which offers classes that are compatible with the main classes the CameraX library itself offers. However, one must note that the support is limited to an extent, that is, not all features of the Camera2 APIs can be accessed via CameraX.
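Here is a minimal Kotlin sketch (not Secure Camera's actual code) of the use-case-based model Mohit describes: you declare a Preview and an ImageCapture use case and bind them to a lifecycle, and CameraX handles the Camera2 session plumbing underneath. The `previewView` parameter is assumed to be an androidx.camera.view.PreviewView in your layout.

```kotlin
import androidx.camera.core.CameraSelector
import androidx.camera.core.ImageCapture
import androidx.camera.core.Preview
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.camera.view.PreviewView
import androidx.core.content.ContextCompat
import androidx.lifecycle.LifecycleOwner

// Bind a preview and a still-capture use case with CameraX. The library
// opens the camera, creates the capture session, and applies workarounds
// for known device quirks behind the scenes.
fun startCamera(owner: LifecycleOwner, previewView: PreviewView) {
    val context = previewView.context
    val providerFuture = ProcessCameraProvider.getInstance(context)
    providerFuture.addListener({
        val provider = providerFuture.get()

        val preview = Preview.Builder().build().also {
            it.setSurfaceProvider(previewView.surfaceProvider)
        }
        val imageCapture = ImageCapture.Builder()
            .setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
            .build()

        provider.unbindAll()
        provider.bindToLifecycle(
            owner, CameraSelector.DEFAULT_BACK_CAMERA, preview, imageCapture
        )
    }, ContextCompat.getMainExecutor(context))
}
```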

Mishaal:

Thank you, Mohit, for the rundown of CameraX versus Camera2. For those of you who listened to our previous episode on modern Android app development, we talked a bit about Jetpack support libraries and how they simplify app development. Well, CameraX is actually one of the support libraries under Jetpack, and like the other libraries, it simplifies developing across specific Android OS versions. Camera2 is an API that's updated with the OS itself, so of course there are OS-specific API methods, and there are different behaviors depending on the OS version, in the way the code is written and what each method accepts and uses. So it's there, it's updated with each OS version, and it's a little complicated to use. What CameraX does is simplify all of that by wrapping around it and basically letting developers not worry about the underlying implementation in the OS and just use CameraX; under the hood, CameraX passes those calls to Camera2 and simplifies that interface for app developers. Of course, because CameraX is a newer API and it's just wrapping what's available in Camera2, it's not fully interoperable with all the features that are available in Camera2. As Mohit mentioned, that's why CameraX offers the Camera2 interop API, which lets apps use Camera2 APIs when there's no CameraX equivalent, but of course that has some of its own limitations.
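Here is a small Kotlin sketch (my own illustration) of that escape hatch: the Camera2 interop API lets you attach a raw Camera2 capture-request option, here an arbitrary exposure-compensation value chosen only for the example, to a CameraX use case when CameraX itself has no equivalent knob.

```kotlin
import android.hardware.camera2.CaptureRequest
import androidx.camera.camera2.interop.Camera2Interop
import androidx.camera.camera2.interop.ExperimentalCamera2Interop
import androidx.camera.core.Preview

// Camera2 interop: push a raw Camera2 capture-request option through a
// CameraX use case. The interop API is marked experimental, hence the annotation.
@ExperimentalCamera2Interop
fun buildPreviewWithInterop(): Preview {
    val builder = Preview.Builder()

    // Example only: request +2 steps of exposure compensation via Camera2.
    Camera2Interop.Extender(builder)
        .setCaptureRequestOption(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, 2)

    return builder.build()
}
```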

Mishaal:

And then one of the other things we've been talking about a bit, or that Mohit has mentioned several times, are vendor extensions. That's something that was introduced alongside CameraX, actually a little later in CameraX's life cycle. Vendor extensions, for those of you who don't know, are basically libraries that OEMs create to expose specific features to third-party apps. So, for example, an OEM can write a CameraX vendor extension for HDR, face retouch, and night mode, and that would allow apps using the CameraX API to use those OEM-provided effects in their own apps. If an OEM has its own HDR implementation, a third-party app could see that the device has a vendor extension for HDR and then use that in its own camera.

I wanted to ask you, Mohit: the Secure Camera app uses vendor extensions. Can you tell us about the implementation in the Secure Camera app? What vendor extensions do you use, if they're available, and what are some of the challenges with vendor extensions as they are right now?

Mohit:

Secure Camera currently supports all five standard vendor extensions that the CameraX library describes in its documentation. However, the availability of these vendor-specific extensions, such as bokeh, HDR, face retouch, and night mode, depends on whether the vendor or OEM of the device decides to expose them to other apps according to the standards defined in the official CameraX documentation. Currently, from what we've heard from our community of active users, not a lot of devices support CameraX's vendor-specific extensions. All Pixels, the devices we mainly target, at the time of this recording don't support any CameraX extension, although in a recent Pixel feature drop, support for Night Sight was added to the Pixel 6 as a Camera2 extension, which is unfortunately different from the CameraX extensions. Very few flagship Samsung devices support all of the CameraX vendor-specific extensions. However, in one of the sessions at Google I/O 2022, Google said that they'll be launching new extensions through CameraX and Camera2, where they'll provide their own fully software-based extensions for entry-level devices that don't have them yet, starting with the bokeh (portrait) extension mode. We're quite optimistic about it, as this could help us give a more consistent experience in our application across all supported devices, including for our Play Store users. And honestly, if this gets implemented as expected, it might finally be worth the long wait of camera extensions arriving for Pixel devices.
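Here is a small Kotlin sketch (an illustration, not Secure Camera's code) of how an app asks CameraX whether the OEM shipped a given extension; on devices where the vendor never implemented the extension library, `isExtensionAvailable` simply returns false.

```kotlin
import android.content.Context
import androidx.camera.core.CameraSelector
import androidx.camera.core.Preview
import androidx.camera.extensions.ExtensionMode
import androidx.camera.extensions.ExtensionsManager
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.core.content.ContextCompat
import androidx.lifecycle.LifecycleOwner

// Check whether the OEM exposes the NIGHT vendor extension to CameraX apps
// and, if so, bind the camera with an extension-enabled selector.
fun bindWithNightExtensionIfAvailable(
    context: Context,
    owner: LifecycleOwner,
    provider: ProcessCameraProvider,
) {
    val extensionsFuture = ExtensionsManager.getInstanceAsync(context, provider)
    extensionsFuture.addListener({
        val extensions = extensionsFuture.get()
        val base = CameraSelector.DEFAULT_BACK_CAMERA
        val preview = Preview.Builder().build()

        if (extensions.isExtensionAvailable(base, ExtensionMode.NIGHT)) {
            val nightSelector =
                extensions.getExtensionEnabledCameraSelector(base, ExtensionMode.NIGHT)
            provider.unbindAll()
            provider.bindToLifecycle(owner, nightSelector, preview)
        } else {
            // The OEM didn't ship a night extension; fall back to the normal path.
            provider.unbindAll()
            provider.bindToLifecycle(owner, base, preview)
        }
    }, ContextCompat.getMainExecutor(context))
}
```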

Mishaal:

Honestly, when I first heard about vendor extensions in CameraX, I believe back when I was at XDA, our developer author Zachary wrote an article about vendor extensions, and there was a lot of hype around them: maybe this would finally solve the feature parity issue between stock camera apps and third-party camera apps. But the more I've learned, the more it seems like there have been major issues with the implementation. As you brought up, even Google's own Pixels don't support CameraX vendor extensions, which is hugely disappointing, because they're the ones pushing developers to use CameraX, yet they don't even support it properly on their own devices. So that's a huge disappointment. And then with the Pixel feature drop, as you brought up, that enabled Night Sight in Snapchat, but the way they did it is they released a Camera2 vendor extension rather than a vendor extension through CameraX. You'd think they'd be the same thing, right? An extension is an extension. But no: only apps that implement the Camera2 vendor extension API, which was only introduced in Android 12, can use it, whereas CameraX's vendor extension support is available across more Android versions and is what more apps are expected to adopt. But then Google goes ahead and does a Camera2 extension instead of CameraX, which is what they've been pushing on app developers. So it's a really weird situation we're in, where one side of Google is telling developers to use CameraX, but the other side is like, yeah, we'll continue to support Camera2, and here's our flagship Pixel phone with an extension that's only available through the API we're telling you not to use.

But fortunately, at Google I/O there was a pretty significant announcement, as you brought up, that Google itself will start providing some software vendor extensions for low-end devices, starting with a bokeh portrait mode. So all low-end devices will have a vendor extension for portrait mode that apps using CameraX can hook into. It remains to be seen whether they're able to extend that to the other possible extensions, including night mode, and I'm not really sure if Google plans to expose its brilliant Night Sight feature to all the low-end devices out there, so we'll see what happens with that. Yeah.
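For contrast, here is a minimal Kotlin sketch of the Android 12+ Camera2 extensions path described above (an illustration only): an app has to query CameraExtensionCharacteristics per camera to see whether, say, the night extension is offered.

```kotlin
import android.content.Context
import android.hardware.camera2.CameraExtensionCharacteristics
import android.hardware.camera2.CameraManager
import android.os.Build
import androidx.annotation.RequiresApi

// Camera2 extensions (separate from CameraX extensions) exist only on
// Android 12+, and only for cameras whose vendor implemented them.
@RequiresApi(Build.VERSION_CODES.S)
fun camerasWithNightExtension(context: Context): List<String> {
    val manager = context.getSystemService(Context.CAMERA_SERVICE) as CameraManager
    return manager.cameraIdList.filter { id ->
        val extChars = manager.getCameraExtensionCharacteristics(id)
        CameraExtensionCharacteristics.EXTENSION_NIGHT in extChars.supportedExtensions
    }
}
```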

David:

And I think it's a question of the economics of a feature. Bokeh and portrait mode, for example, are basically democratized; even very low-end MediaTek chipsets at this point have support in the camera stack for some kind of portrait mode. So it's no longer one of those features that you would want to gate to a high-end device, and not giving people access to it, especially in a consistent way, is probably worse for the platform than just coming up with a standard implementation that works for everybody. And Google can keep adding value on their Pixel phones by introducing things like iOS's portrait-lighting-style mode, which is much more advanced and uses a lot more algorithms to get the output. "A lot more algorithms", I sound real smart when I say that, but you understand what I mean. So these initial extensions are probably a good sign of what Google thinks is most important to the largest number of camera users, portrait mode obviously being a big one. And I think that over time you will almost certainly see more of these features that used to be gated to high-end devices just start to become standard smartphone features, like having dual cameras; all of this stuff sort of democratizes over time. Right?

Mishaal:

Yeah. And Mohit said he was optimistic about the future of CameraX, and honestly, I can understand why, even though they have been slow to implement some basic features. Version 1.1 of the library is what brought video capture support, and I think that only came out in stable earlier this year. And then one of the features they've been working on in AOSP is support for ZSL, or zero shutter lag, which Mohit mentioned as well. So these are features that you've already had access to with Camera2, but they're not in CameraX yet. Android 13, for example, introduces HDR video capture support in Camera2, but I don't think there's an equivalent API in CameraX. Still, if you're looking to build a very basic app that uses some basic camera capture functionality, it's easier than ever thanks to CameraX and other libraries. For example, if you wanted to do QR code or barcode scanning, Google Play Services has a drop-in solution for that, but there are also open source solutions you could implement with CameraX. The ability for app developers to add camera functionality to their applications is easier than ever, and it's not just the realm of professional camera app developers anymore.
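As a quick illustration of the ZSL support mentioned above (a sketch, assuming a CameraX release that includes it, 1.2 or newer), zero shutter lag is requested as just another ImageCapture capture mode, and CameraX falls back to the regular path on devices that can't support it.

```kotlin
import androidx.camera.core.ImageCapture

// Request zero-shutter-lag still capture. On devices or CameraX versions
// where ZSL isn't supported, CameraX falls back to the regular capture path.
fun buildZslImageCapture(): ImageCapture =
    ImageCapture.Builder()
        .setCaptureMode(ImageCapture.CAPTURE_MODE_ZERO_SHUTTER_LAG)
        .build()
```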

David:

And with that, this is where we do our Esper plug, because if you've been listening this entire episode, you've understandably come to appreciate just how complex the situation is with cameras on Android: how the OS interacts with the camera, and how applications then interact with the operating system, which interacts with the camera. Do you want to build for CameraX? Do you want to just keep building for Camera2? Do you even need direct access to the camera to accomplish what you're doing? It depends. If you find yourself in a situation where you're trying to build a device that needs to do something like Mishaal said, capture a QR code, scan a barcode, or recognize a face, those are obviously very different scenarios, and the camera needs to be able to do different things to accomplish those tasks. So if you're wondering whether the camera on a device is going to be suitable for your work purpose, whatever it is, capturing faces or barcodes or taking pictures of cats (I don't really know what you're doing with the camera), and you're asking, is there an expert out there who can tell me what I can actually do with this camera, is it extensible, is it scalable, can I do this with a bunch of different cameras running Android, come talk to us at Esper. This is the kind of thing we deal with regularly: differences in implementations across vendors, hardware, and software. That is our bread and butter, and understanding how to make that experience consistent for you and your devices is what Esper's all about. So if you want to learn more about Esper, check us out at esper.io.

Mishaal:

Thanks, David. And with that, I wanted to give Mohit a brief chance to tell us where people can follow him as well. So Mohit, tell us where people can follow your work.

Mohit:

So yeah, you could follow me on my official GitHub page, github.com/mhshetty, that's M H S H E T T Y, or you could follow me on LinkedIn, where I'm listed as a student at my engineering college; you can follow me there as well. That's where I'm mainly active.

Mishaal:

And if people want to try out your work, they can go to the Google Play Store and download the Secure Camera app, or they can try installing GrapheneOS on a compatible device, if they have one. Go listen to our previous episode with GrapheneOS developer Gabe if you want to learn more about the project. And thank you for listening.


About the Podcast

Android Bytes (powered by Esper)
A weekly show that dives deep into the Android OS
Android Bytes (powered by Esper) is the podcast that dives deep into the engineering and business decisions behind the world’s most popular OS. https://www.esper.io

Android powers over 3 billion devices worldwide and is the platform of choice for over a thousand companies. You’ll find Android on smartphones, tablets, watches, TV, cars, kiosks, and so much more. How does Google architect Android to run on so many form factors, and how do companies fork AOSP to make it run on even more devices? These are the kinds of questions the Android Bytes podcast considers each week.

Join cohosts Mishaal Rahman and David Ruddock, two journalists with extensive knowledge covering the Android OS platform and ecosystem, as they speak to system architects, kernel engineers, app developers, and other distinguished experts in the Android space.

Get in touch with us at Esper.io if you’re looking to use Android for your product — we have the experience you need.

About your hosts

David Ruddock

David is the Editor in Chief of Esper, and cohosts Android Bytes. David spent over 10 years as the Editor in Chief of Android Police, where he reviewed more phones than he'd care to admit, broke dozens of exclusive mobile industry stories (and also, phones), and ran one of the web's most vibrant Android communities.

Mishaal Rahman

Mishaal is the Senior Technical Editor at Esper.io and a cohost of the Android Bytes podcast. At his previous role as Editor-in-Chief at XDA-Developers, Mishaal was at the forefront of Android tech journalism, breaking story after story on new OS features and platform changes.