Pixel smartphones delivered with secret but inactive remote maintenance (heise.de)
126 points by qwertox on Aug 16, 2024 | 125 comments


This is largely the same stuff Wired pushed.

The folks over at GrapheneOS have a much better analysis of this whole thing: https://grapheneos.social/@GrapheneOS/112967309987371034


I don't understand why they say you need physical access to the device. The Play Store has access to remotely install apps; surely it can enable them. I'm going to try this today.

You can log in to the Play Store on a computer and click "Install" and it'll ask you what device you want it on.

I'm not saying Graphene has this app, I'm just talking about their claim that physical access is required to enable it on non-Graphene pixels.


Apps don't have root permission, and most APIs aren't available until the user has accepted a prompt for the permission. And this specific API is likely only possible through adb, though I'm not informed beyond the cited thread.
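For illustration, toggling a package's state over ADB looks something like this (the package name is the Showcase id discussed elsewhere in this thread; re-enabling a system package this way may still require more privileges than the regular adb shell has):

    # Rough illustration only; enabling a carrier/system package from the
    # shell may be refused without additional privileges.
    adb shell pm list packages -d                               # list disabled packages
    adb shell pm disable-user --user 0 com.customermobile.preload.vzw
    adb shell pm enable com.customermobile.preload.vzw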

Could they change that in a software update? Sure! But they could also just push an update that crypto-locks your device unless you pay them a monthly fee. The technology exists to do so; you're trusting the OTA update provider not to do it. Just like with every other device you're running updates on, whether that's your laptop, TV, fridge, smart mirror, speakers, doorbell, lighting or... whatever else has firm- or software in your household that you choose to update, manually or automatically.


> most APIs aren't available until the user has accepted a prompt for the permission

True in general, but not true for preinstalled apps. System apps are already granted permissions on a fresh install (for example, Google Play Services has basically every permission, but you were never prompted).

Also, what I'm describing isn't an update. At runtime, with no update or reboot required, you can tell Play to install an app on your phone. Google then tells your phone to install it. I bet the mechanism is the same to enable a disabled app. I do know the Play Store can enable disabled apps, I just don't know if it can be done remotely.

Edit: here's proof you can Enable a disabled app: https://storage.googleapis.com/support-forums-api/attachment...

Here's proof you can remotely install apps: https://support.google.com/googleplay/answer/14274288?hl=en

If you put these together, you have this app that can be remotely enabled, contrary to what Graphene is saying.


Neither of these proofs actually proves anything.

The first one has nothing about any disabled apps

The second one explicitly states that you're only able to install on your own device. And even if you doubt that... This still won't help you unless the user also opens the application and accepts the pop-up for scary permissions.


Okay let's say the app can be enabled remotely by someone other than the user of the phone.

What next? Have you looked at the app? What can actually be done with it? Please explain the exact steps an attacker would take next, with evidence.


This thing being there is evidence something somewhere went super wrong and now the entire system cannot be trusted by default.

Ask: was it put there intentionally? If yes, why? If it is there by mistake, and no one at google noticed it there, then how many other (actually properly hidden and actually exploitable) backdoors did they miss in their phone?


The Verizon retail demo mode doesn't become active just because the package is enabled, and regardless, they haven't actually demonstrated enabling any of the Verizon apps on Pixels through the Play Store. Enabling the retail demo app doesn't add any remote attack surface.

Verizon's Android apps are additional attack surface for Verizon Android users on any Android device with proper support for Verizon. The retail demo app has yet to be shown to add any relevant attack surface. Despite that, there's a massive amount of news coverage portraying it as if this was accidentally included (it wasn't) or included for no explicable reason (it used to be used by Verizon for demos in their stores). The other apps in the suite are used as part of providing useful Verizon features because they refuse to do things in a standard way.

GrapheneOS has never included these, so it's missing features on Verizon, including Wi-Fi calling, which work with any normal carrier such as T-Mobile. We've previously analyzed the apps and have repeatedly written about them and our privacy/security concerns. The retail demo app isn't part of what's concerning from our perspective.

iVerify, etc. talk about iOS not including carrier apps, but it has included a lot of similar functionality for carriers. They're portraying it as Google not having access to the code and not knowing what the apps do, which is, at least to us, a strange thing to assume. There are many things wrong with the overall claims. The motivation to promote their product by portraying this as their finding is clear, but they clearly shouldn't get credit for that, and we've demonstrated that in our thread. We can provide further examples beyond the thread and commit we linked. This section talks about the carrier apps and is not new or modified recently:

https://grapheneos.org/features#broad-carrier-support

We have a lot of past threads on Twitter about it. A lot is on our pre-2018 Twitter account which got stolen from the GrapheneOS project.


Enabling the package doesn't mean the app is active. You also haven't actually demonstrated enabling any of the Verizon apps with a non-Verizon SIM this way. You're presenting it as contradicting what we've said, but you haven't, and you're mistaken about what's needed to enable the Verizon retail demo mode. Simply having a Verizon SIM enables the package, but the retail demo functionality remains inactive unless the device is put in demo mode with an under-the-hood setting change.


A response from GrapheneOS:

Enabling the package doesn't turn it into a remote security vulnerability. It's still not active.

The overall Verizon carrier support apps get enabled when you have an active Verizon SIM and disabled when it's not active. Retail demo mode requires additional setup. The other Verizon apps implement production functionality and add attack surface that's exposed, along with giving Verizon a questionable level of control. We've repeatedly talked about it and explained why we've always omitted them, which causes a loss of Verizon features on Verizon's network. That includes talking about it with the CEO of Trail of Bits and others involved in this.

The focus on the retail demo app in particular is strange. The claim that it was discovered by iVerify is strange. iVerify is an EDR app which runs in the app sandbox and can hardly do anything more than scan the static manifest and code for each APK. It doesn't even have anywhere close to the level of access you get from analyzing the firmware and software through the published code rather than from inside an app sandbox. Sure, it can see these apps, but it can't see huge parts of the OS and firmware from there, despite them being publicly available.

It's a very contrived way to promote iVerify and take credit for finding something they weren't the first to discover, something the Android security research community has been talking about and analyzing for many years. People who aren't security researchers have talked about these apps in quite a lot of detail years ago. Here's the Showcase app id, for doing a quick web search:

    com.customermobile.preload.vzw
Using ADB doesn't just require physical access but rather physical access combined with the user's password, even if the device is already unlocked, because enabling developer options requires it. Otherwise, you would need exploits. Regardless of the approach, what purpose does using this app serve? ADB gives far more access. A filesystem write vulnerability gives far more access.

Stock Pixel OS doesn't implement the same level of verified boot as GrapheneOS and it's not relevant to the way they do things. It's theoretically relevant to verified boot on GrapheneOS but the app has NEVER been included. If this app was included in GrapheneOS, it would present a very contrived example of a privilege escalation option for the verified boot threat model. In practice, it wouldn't actually matter because there are better known ways to do it. We don't claim verified boot on GrapheneOS avoids trust in persistent state to the point this would be relevant at all. We are gradually removing trust in persistent state but that's a long road.

Here's one of several places we publicly documented the fact that we don't include these apps from the stock Pixel OS (they aren't part of AOSP) with a basic explanation of why we omit them:

https://grapheneos.org/features#broad-carrier-support

We've always omitted the Showcase retail demo app but we never considered that to be a particularly relevant part of omitting these apps since it requires special setup, unlike the others which actually run and expose attack surface when using a SIM for that carrier. The Verizon apps get activated not only on Verizon itself but also the Verizon MVNOs.

We don't have an issue with talking about the security threat of the Verizon apps activated when using a Verizon SIM. It's an issue for Verizon Android users, not Pixels specifically, and not Pixel users on non-Verizon carriers. The story is framed in a highly inaccurate way where iVerify gets credit for finding a Pixel vulnerability, which they present as being the inclusion of an app that isn't really part of the real issue. They didn't discover these apps, the privileged permissions they're granted, or what they do, including the connections they make and the lack of HTTPS for the demo app. Other people just didn't try to push Verizon's retail demo app as a security vulnerability specifically on Pixels to promote their product.

Talking about this as if Google accidentally included the app or that there was no reason to include it is very strange. They're well aware of why they included the apps as part of the Verizon partnership and what they do. Google appears to have written significant parts of some of them. They know what they are.

There are vulnerabilities discovered in Android on Pixels on a monthly basis, including serious ones. This doesn't even seem to qualify as Low severity. Why is it deserving of all this media coverage pushing it as an incredibly serious issue? Google already publicly removed the app as part of Android 15; you can see that from the Android 15 Beta. That's likely being released in September. The amount of time they'd have taken to fix it if they'd classified it as a valid Low or Moderate severity issue may not have been any quicker. Not really clear what the fuss is about.

It's pretty sad that news sites largely operate as a press release system for well connected companies to promote their products with incredibly dubious claims. It's an incredibly warped way for people to get information about security, and it's harmful too. Non-Pixel Android devices don't even ship Low/Moderate severity patches until the next major yearly release, since they don't ship the monthly/quarterly releases of Android like Pixels do. How is it improving things to push people to less secure Android devices? How is it improving things to blame Pixels for Verizon's Android requirements for all Android devices? The goal is promoting a dubious product claiming it can meaningfully defend devices from within the iOS and Android app sandbox. An antivirus app with a basic DNS filtering/monitoring system and APK scanning is getting portrayed as a lot more than it is in order to get lots and lots of money. They successfully pushed a largely inaccurate set of claims across most major news sites covering tech to promote a product and a surveillance company. Why does tech news work this way, and how are they going to fix it?


> ... but only if the perpetrator has physical access to the device, enters the user's password ...

Clickbait title. If you have physical access and password of course you can do whatever. That is the point of the password.


> If you have physical access and password of course you can do whatever.

Well, you should be able to do whatever given that it's your device. Devices unfortunately also try to protect themselves against their owners nowadays...


Yep. Verified boot is designed to erase your ability to use banking apps, streaming apps, etc. if you touch any of the code running on the system. Because it's "insecure" to change anything from the default. (read: "insecure" to reduce the amount of remote control that big tech has over your phone)

When I buy a phone, I don't want to use something owned by someone else. I want to own it myself.

Mobile Linux is absolute garbage though, even worse than desktop Linux, which is why I can't realistically use either.


Where is the clickbait?

The software is there but inactive.

That's exactly the case.


Did you know linux is delivered with a secret but inactive remote shell for hackers to attack your box?

It can be enabled with:

    socat file:`tty`,raw,echo=0 TCP-L:1022
Or:

    awk 'BEGIN {s = "/inet/tcp/0/0.0.0.0/1022"; while(42) { do{ printf "shell>" |& s; s |& getline c; if(c){ while ((c |& getline) > 0) print $0 |& s; close(c); } } while(c != "exit") close(s); }}' /dev/null
Better uninstall awk or else you're "vulnerable to spyware", from the wording of the article.

The article is clearly playing this up far more than it should be.

The title is very clearly angling at "this is malicious, it's bad", when it sounds totally benign if it's just like the socat thing above, where you need to already be on the other side of the airtight door to use it https://devblogs.microsoft.com/oldnewthing/20221004-00/?p=10...


Or even better... While booting, using either grub or systemd-boot, enter a uefi shell and change the kernel parameters to include 'init=/bin/bash rw'. Boot like that and you'll get a rw root shell.

The point is... a system someone has physical access to is never secure. One thing that helps a little bit is using LUKS full disk encryption.
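For reference, a minimal LUKS setup looks roughly like this (/dev/sdX2 is just a placeholder partition):

    # Minimal LUKS sketch; /dev/sdX2 is a placeholder.
    cryptsetup luksFormat /dev/sdX2        # create the encrypted container
    cryptsetup open /dev/sdX2 cryptroot    # unlock as /dev/mapper/cryptroot
    mkfs.ext4 /dev/mapper/cryptroot        # create a filesystem inside it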


Apple has made significant progress in terms of developing evil-maid-resistant systems.

"No system is secure when you have physical access" is one of those canards that was true ten or so years ago, but was not an iron law even then and has been falsified by recent developments. Kinda like "there's no such thing as unbreakable DRM" in an era when the Xbox DRM is indeed, for all intents and purposes, unbreakable.


> Apple has made significant progress in terms of developing evil-maid-resistant systems.

Definitely, but a couple of the high end forensic data extraction companies have largely kept up with them. It's no longer something which can be done by someone who doesn't have access to expensive commercial exploit tools or government-developed tools. Most of the forensic companies can't keep up anymore but the demand is there for a couple options which do and they're not unsuccessful.

Cellebrite Premium is widely available and used by law enforcement and governments around the world. https://grapheneos.social/@GrapheneOS/112826067364945164 is their leaked documentation on their capabilities from July 2024.

Recent iPhones and Pixels are successfully preventing brute force attacks via Cellebrite Premium for Before First Unlock devices via their secure elements. They aren't successfully preventing the OS from being exploited either Before First Unlock or After First Unlock. Pixels being able to run a more hardened OS is a major advantage in this regard. iOS lockdown mode and USB restricted mode exist, but don't appear to defend against Cellebrite Premium. Lockdown mode mainly reduces browser and Apple service attack surface.


> The point is... a system someone has physical access to is never secure.

this is an article of faith in certain tech circles, but it's not actually true. The entire point of the Xbox security model is defending against an attacker who has unlimited physical access to the console, and it was not breached during the lifetime of the Xbox One, nor does the Xbox Series S/X appear to be any different.

like literally the title of the presentation is "Guarding Against Physical Attacks". And they succeeded, despite an intense amount of effort from the modding community.

https://www.youtube.com/watch?t=1130&v=U7VwtOrwceo&feature=y...


If you do get ADB access, a filesystem write vulnerability, or exploit the device to get code execution, then this app is irrelevant since you already have more access. A real attack vector has not been presented, which is why Google determined that it wasn't a valid security vulnerability. That's their standard operating procedure. That doesn't mean they won't fix a bug or remove attack surface. They removed Showcase from Android 15, which is visible in the Android 15 Beta.

However, you do need more than physical access with a Pixel to enable this and set it up. They have full verified boot with a specific per-device key (which is how key rotation gradually happens) and anti-rollback fuses to prevent downgrade attacks to old vulnerable versions. The OS images are completely verified with anti-rollback via the secure element which has authenticated encryption between it and the main SoC. The data partition for the OS has every block encrypted, although it's not authenticated encryption yet. The firmware is quite locked down and reset attack mitigation for firmware boot modes was added in April based on a vulnerability report from us in January. RAM isn't fully encrypted yet but it's quite difficult to tamper with modern RAM or even dump it without controlling the OS / SoC firmware unless there's debugging functionality left enabled in production. Fully encrypted RAM is the main thing they're missing aside from a more hardened OS.
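As a quick aside, the basic verified boot and bootloader state on an Android device can be read from standard system properties, e.g.:

    # Read the verified boot state over ADB (standard Android properties).
    adb shell getprop ro.boot.verifiedbootstate    # green / yellow / orange
    adb shell getprop ro.boot.vbmeta.device_state  # locked / unlocked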

Cellebrite can successfully exploit up-to-date Android and iOS devices with physical access as part of their Cellebrite Premium product, but it's increasingly not easy for them. They often fall behind with updates, but they consistently catch up again. Leaked July 2024 documentation showing the current capabilities is available here:

https://grapheneos.social/@GrapheneOS/112826067364945164

For GrapheneOS, our aim is defending the device long enough for our auto-reboot timer after locking to activate combined with zero-on-free and firmware reset attack mitigation. They haven't been very successful at exploiting GrapheneOS but did develop exploits for older versions from 2022 and earlier. Physical access is not an entirely lost cause, the goals just need to be well defined. Defending the device for 18 hours since it was locked (our default auto-reboot timer) is our goal. Users can set auto-reboot as low as 10 minutes but then they'd be missing notifications.


Won't work if you're using UKIs. The kernel command line is fixed, signed, and measured into TPM2, and your disk won't unlock.
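For anyone curious, binding the LUKS key to TPM2 measurements with systemd looks roughly like this (/dev/sdX2 is a placeholder; the right PCR set depends on your setup):

    # Sketch: enroll a TPM2-bound key for a LUKS volume (systemd-cryptenroll).
    # PCR 7 covers Secure Boot state; PCR 11 covers the UKI measurements.
    # Binding to raw PCR 11 values means re-enrolling after kernel updates.
    systemd-cryptenroll --tpm2-device=auto --tpm2-pcrs=7+11 /dev/sdX2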


Hardly anyone uses that currently & it has a lot of unsolved issues.


What are some of the unsolved issues? Just curious


When the motherboard fails, everything is gone. There is no way around this.

You either accept that your data is irretrievably tied to your motherboard, or accept that your data can be viewed/modified by someone who can replace your motherboard.


No? All you need to do is install to a standard LUKS partition with a good password. Then your data is secure but not irretrievably tied to your motherboard. Granted, that's not UKI, but it's a functional solution.


Yes, TPM is just one way to unlock your disk. The only reason to use it, in my opinion, is to not have to enter a super long password on every boot. An extra key in case your mobo fails is the way to go. Or in case you can't boot after an update and you didn't sign your recovery OS with the same key.
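Adding that extra key is a single command, e.g.:

    # Add a backup passphrase keyslot so the data survives a mobo/TPM failure.
    # /dev/sdX2 is a placeholder for the LUKS partition.
    cryptsetup luksAddKey /dev/sdX2
    cryptsetup luksDump /dev/sdX2    # confirm the keyslots afterwards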


"Users cannot uninstall Showcase.apk themselves."

Linux is just a kernel. There is no requirement to install/keep gawk. One can use a faster awk that has no TCP networking. Most GNU/Linux distributions do not have socat pre-installed.

Linux, being only a kernel, allows the computer owner to choose what programs they want in a userland. By contrast, this "smartphone" from an advertising company comes with software pre-installed. And not easily removed.

This is why "smartphones" suck as general purpose computers.


I thought we were talking about the title and not the article.

You could claim the article is fear-mongering, but the title is not clickbait.


Linux is an ecosystem that includes Android. Pixel is an Android phone by Google that is supposed to be hardened against local attacks, showing when an unhardened OS is booted, etc.

No one cares that you can run a firewall on an insecure OS configuration, but they care if your shipped appliance does it.


This is beside the point.

The point is that many OSes include tools that you can use to do remote maintenance as long as you have the password and physical access. There's nothing to write home about.

There being inactive software somewhere to do maintenance, usable only if you have physical access and the password, is at best interesting (a curiosity), at worst not newsworthy, and in any case not concerning.

Now, phones sold pre-filled with junk / invasive software all over the place is gross.


Funny how you people keep making it sound as if binaries included openly in normal Linux distributions are as bad as some weird unaudited internal tool by Verizon of all things, additionally hidden from the user. I would have higher expectations of a Pixel tbh


We are not remotely saying this. And there are no "messengers" being shouted at, and there's no hatred. We are saying that tools provided in common Linux distros allow you to set up remote control if you have the password and physical access. That's not the tools being bad; it's just that they are powerful and included out of the box.

As for including an opaque binary, I would expect way better from Linux distros. An opaque binary would be scandalous.

But on Android? You already can't trust Lineage or AOSP because of the proprietary blobs you need on any smartphone for the drivers. Stock Android? Add all the crapware from the manufacturer. Add to this the crap added by the carrier. The phone is already full of inscrutable crap; it's hopeless.

Some deactivated stuff seems like a total non-event in comparison to all this, including the crap you can't even disable, which does who knows what and sends who knows what to who knows who. The whole situation is concerning and scandalous, but not much more so because of that additional, deactivated opaque stuff.


Most of the stuff you refer to is exactly why a company would restrict phones (on its intranet) to ones that are single-vendor and not telco-modified, since those could be used to deliver the evil maid / illegal police wiretap after a (short) arrest. We just discovered that there are not 2 such vendors, but one.


> tools provided in common linux distros

Again... those tools are open source, audited and have many eyes on them

This tool however is shady as heck. Google dropped the ball.


We can't seem to understand each other.

I 100% trust my open source audited rm, but it will definitely remove everything from my system if I call it with parameters "-rf" and "/" with sufficient permission. It is powerful enough, and the whole set of trusty tools I have on my linux distro lets me take control of it remotely.

That tool is shady, I agree, but it is also deactivated. Do you know what it means on Android for an app to be deactivated? It basically means "not installed". It's there in the file system (on the system partition), but it doesn't run. It wouldn't concern me if I had it (though I would prefer it not to be there and for the system partition to be smaller so I could use that space in the user partition); I'm way more concerned by all the crap that actually runs.


> I 100% trust my open source audited rm, but it will definitely remove everything from my system if I call it with parameters "-rf" and "/" with sufficient permission.

You're almost there. Now imagine you could not trust it to do that, and also did not ask for it to be there, and also it was an internal tool for Verizon written by Verizon :)

And it's not like rm, it's more like TeamViewer, and who knows how many bugs it has. If I installed Linux and there was a hidden TeamViewer there, even if it didn't run by default, I would wipe the system just in case, because wtf.

Ask yourself, is it by design? If yes, why? If not, then the responsible person did not notice it there, so ask yourself then what else did they miss?

It just should not be there period, if it is there something somewhere went super wrong.


We are going circles. My comment at https://news.ycombinator.com/item?id=41270161 fully answers this.

> it's more like teamviewer and who knows now many bugs it has

My point is that it's more like TeamViewer's installer, since it's deactivated, which is pretty much equivalent to "not installed" in Android's world.

I feel like you are assuming I'm wrong: I find your "you are almost there" and "ask yourself" phrasings quite annoying. You are just assuming you are right and I'm wrong. We will not convince each other; our respective views seem fully made up here, this discussion will probably not progress any further, and I feel like I have already written down every interesting point I could make on the topic, so I will stop here.

> It just should not be there period

It's not like I even disagree here. It should not be there, for sure. Like all the more concerning crap that has been there since the beginning, which is my core point. If you are pissed off by this new discovery, please do complain loudly about all that crap in our phones, we absolutely need more people doing this. There's definitely not enough awareness around this stuff.

To me, complaining about this new thing is like complaining about a dust particle you just noticed in a house where the housework was never done. But it's good people are beginning to see the dust, I guess...

I'm quite pissed off by the Sony phone I inherited with its impressive amount of crap you can't even all deactivate, and the lack of working lineage rom for it. Some deactivated shit in it would be the least of my concerns compared to this.


> My point is that it's more like TeamViewer's installer, since it's deactivated

Same question: wtf is it doing on this phone? Is it on purpose? Is it a fuckup? Etc.

You say that phones contain junk we did not ask for and so this is not news, but that does not hold. Some phones do contain such junk. The Pixel was assumed to be a good reference Android device. Now it turns out the Pixel does too. It is news.


Okay, now I get it: you are disappointed by the Pixel specifically.

I don't trust Google's proprietary stuff either, so I didn't have that in mind: I assume that any stock Android is going to phone home, and should I receive a Pixel, I would replace the stock Android on it with one of the FOSS ROMs anyway, without Google Play services.

> wtf is it doing on this phone?

Again, nothing, I agree with you on this.


> you are disappointed by the Pixel specifically.

Not just me apparently, the company in the article too:)


> It basically means "not installed"

This seems to be the oddity in the discussion. "Not installed" has a lot of equivalents in other security models, but not many things have an equivalent to being installed as an Android manufacturer package. If a package is re-enabled as a manufacturer one, it bypasses Play checks and Advanced Protection, and may be able to hide itself as a system package.

Letting stalkerware through and avoiding detection via the manufacturer store exception is IMO likely intentional on Google's part, to let Android succeed via manufacturer/telco customizations in countries where laws require malware. An unlocked Pixel was expected to be clean because it wouldn't be set up to be part of one of these deals.


It seems to me like a lot of it is hatred for the messengers.. But I think Palantir is a perfect organization to resentfully report a telco conspiracy to create a law enforcement back door as long as they didn't get a piece of it.


True, there's some irony in that...


No, it is not beside the point. They are not supposed to ship a setup where physical access escalates to permanent spying with no warnings, because they are promising things about devices, not about an ecosystem's overall ability to build any possible configuration.

Many people feel Google and Apple have ulterior motives, but that is an academic argument unless they abandon those motives; at that point they would need to always ship rooted, insecure-boot phones for our ease of use.


"Secret" is the clickbait, as it implies that there's some kind of subterfuge going on. The app isn't intentionally hidden as far as I can tell, so "preinstalled" would be far less sensational.

Any operating system distro is going to come with a bunch of stuff that you're not necessarily going to use; obviously it's less than ideal if distributions are being shipped with old junk that nobody uses, but it's hardly the crime of the century.


It's hidden from the normal user, isn't it?


It's not clickbait. Imagine if your Linux shipped with a TeamViewer-like app hidden somewhere. You would have to ask some serious questions, especially if you are a security-related IT business.


Like... SSH?


> Like... SSH?

Except SSH is a protocol, and this is some shady piece of internal enterprise tool... probs written by one unpaid intern and never security audited once in its lifetime... otherwise yeah no difference at all, nailed it ;)


https://www.theverge.com/2024/8/15/24221151/google-pixel-sho...

  “This was very deleterious of trust, to have third-party, unvetted insecure software on it,” Dane Stuckey, Palantir’s chief information security officer, told The Washington Post. “We have no idea how it got there, so we made the decision to effectively ban Androids internally.”

  “It’s really quite troubling. Pixels are meant to be clean,” Stuckey, of Palantir, told the Post. “There is a bunch of defense stuff built on Pixel phones.”
Pixel phones have AVF/pKVM, which can be used to isolate security-sensitive workloads in a separate VM, https://source.android.com/docs/core/virtualization/architec...


interesting, though i don't think that isolating sensitive stuff in a vm is a reasonable security strategy if we are talking about low-level compromise of the entire architecture, or did you want to rationalize the usage for "defense stuff"?


If sensitive data is isolated in an EL1 pKVM VM, it is protected from a compromise of the host OS, which also runs at EL1, thanks to the hypervisor at EL2.


that's a matter of terminology.

i'd argue that what you describe as "host" is rather a management vm which is allowed to talk directly to the hypervisor, though through this privilege it is most likely able to compromise it and all other guests.

but this doesn't really matter as the attack vector we are talking about already has dma and does not care about any of that.


It's kind of wild that Palantir would ban Android phones when this was software installed by Verizon. If Apple had installed disabled-but-insecure software on iOS, would it even be discoverable?


If the underlying OS has a remote access vulnerability wouldn't that compromise every VM OS running on top?


> wouldn't that compromise every VM OS running on top?

pKVM VMs run "on the side" rather than "on top" of the host OS, so any compromise of the host is isolated from guests, besides DoS.


This vulnerability isn't with the underlying OS though. They just installed a disabled application that has security concerns, but someone has to manually enable it for it to be a problem.


Can confirm it's on my Pixel 8:

    shiba:/ $ pm list packages -a -f
    ...
    package:/product/priv-app/Showcase/Showcase.apk=com.customermobile.preload.vzw
    ...
Gross. I have zero relationship with Verizon and yet this garbage is on my phone.
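If anyone else wants to check, the package's enabled/disabled state can also be read without root, something like:

    # Check whether the package is present but disabled (no root needed).
    adb shell pm list packages -d | grep customermobile
    adb shell dumpsys package com.customermobile.preload.vzw | grep -m1 enabled=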


Verizon is prone to overstepping with Pixels. I couldn't tether my phone with my carrier (from a country that doesn't have Verizon anywhere near it) because they happened to use the same SIM ID (or something, I'm not sure of the details), and when I tried my phone would reach out to Verizon to see if I was allowed to, it didn't know who I was and said no, so I couldn't tether my phone.


Vodafone? They used to own a ~controlling interest in Verizon, and it wouldn’t be at all surprising if they shared that sort of thing, as _ordinarily_ it wouldn’t matter.


Vodafone, but in NL. You might be on to something there.


probably phones outside the US also got polluted with this...


I'm in Italy and my Pixel 8 Pro bought from Amazon has it, so yes, it seems to be in all of them.


I never got to try GrapheneOS because of a single unresolved doubt: does its Camera app benefit from the same image improvement features that the stock camera has?

Especially now that Pixels started shipping with hardware chipsets specifically designed to accelerate certain kinds of image processing, this is a feature that's quite important to me not to lose when changing systems.


With Graphene, you can just run the regular Google Camera app and disable its network permissions.

I can't comment on your question about the graphene camera app.


I also use GrapheneOS and the regular Google Camera app; everything except the AI features works, and the camera quality is on par with what I got while using the stock OS for a day (it was also one of my biggest fears that the camera quality would be worse, which is why I compared them).

I've tried the Graphene Camera app, but to be honest the UX is a bit janky, but I think the image quality is basically the same.

Relevant Link: https://grapheneos.org/usage#camera


You can use most of the AI features if you install the relevant apps. Certain things can't work without privileged access we don't provide, but we do allow Google apps to use non-standard TPU acceleration by default with a toggle for people who don't want it.

> I've tried the Graphene Camera app, but to be honest the UX is a bit janky, but I think the image quality is basically the same.

It has HDR+ and Night mode. It's largely the same image quality. Pixel Camera has a lot more overall features and fancier HDR+ features. The UI in our app has gotten significantly better. Make sure you're using Latency mode to match what Pixel Camera does rather than Quality mode which purposely delays capturing until focus lock.


> Make sure you're using Latency mode to match what Pixel Camera does rather than Quality mode which purposely delays capturing until focus lock.

Aah okay, that's what this toggle means, that's definitely better.

But my biggest gripes with the GrapheneOS camera app are:

1. The Zoom slider: You can't select the different lenses easily (I'd need to set the zoom slider to exactly 5x to get the zoom lens without digital zoom, which is not that easy) and there is no way to quickly reset it to 1x. I also think the position for this slider would be better at the bottom (where most other camera apps put it)

2. The brightness and zoom sliders are rather hard to hit, maybe I have a bit too fat fingers for these? (The brightness slider also does not seem to work in night mode.) I also can't see the zoom/brightness values while sliding, and the Google Camera App allows for smoother/finer control.

Same with the little arrow in the top left corner, it always takes like 3-5 hits to open the menu. Never had this issue in the Google Camera App.

3. I need to use the QR Code mode to scan qr codes. I did bind the Google Camera app to double click the power button so I can easily take pictures and scan QR Codes, without the need to tap anything else. But It's definitely a cool feature that the GrapheneOS camera can scan many more Code Types and also that the GrapheneOS Camera removes exif data by default

4. The mode slider at the bottom doesn't let me scroll further than one mode, this means I'd need to swipe thrice to get to the QR Code mode from the Video mode (or tap twice) because when in QR Code mode I can't see the Video mode and vice versa

5. I currently tested it out for writing this comment and my phone definitely got way warmer than when using the Google Camera App. And it used 5% Battery while being active for 7 minutes. EDIT: very weird it did now vanish from the Battery Usage list in the Settings, and in the App Info it also says it did not use any battery since the last full charge (It did before)

6. I think it does not support the Astrophotography (Tripod) mode for the Night Mode.

These are all not huge issues and most people who just want to quickly take some pictures are probably not really bothered by this. But I use the Phone Camera very often (its even one of the reasons I went for the Pixel 8 Pro instead of the normal 8, to get the extra zoom lens) so these issues made me install the Google Camera App.

Don't get me wrong, I still think it is really cool that GrapheneOS can ship an Open Source camera app which achieves the same Image Quality with added features like the HDR+, QR and Night Mode, which is probably enough for most users, and I wouldn't want to trade GrapheneOS for any other OS (see my comments: https://news.ycombinator.com/item?id=41238691 https://news.ycombinator.com/item?id=41241594 and https://news.ycombinator.com/item?id=39145200)


> 1. The Zoom slider: You can't select the different lenses easily (I'd need to set the zoom slider to exactly 5x to get the zoom lens without digital zoom, which is not that easy) and there is no way to quickly reset it to 1x. I also think the position for this slider would be better at the bottom (where most other camera apps put it)

That's not what the zoom buttons do in Pixel Camera, and it doesn't switch at exactly the telephoto magnification zoom value but rather adjusts when it switches based on the available light, because the telephoto camera can't handle low light as well. We could add a 1x button for going back to 1x, but you can already easily zoom out to the ultrawide, and the telephoto is more complex than people realize. You can see it often doesn't actually switch at the minimum.

> 2. The brightness and zoom sliders are rather hard to hit, maybe I have a bit too fat fingers for these? (The brightness slider also does not seem to work in night mode.) I also can't see the zoom/brightness values while sliding, and the Google Camera App allows for smoother/finer control.

> Same with the little arrow in the top left corner, it always takes like 3-5 hits to open the menu. Never had this issue in the Google Camera App.

You can swipe down for settings and can swipe left/right between modes. The arrow is mostly to imply that you can swipe down. We do plan to change the overall layout and sliders/buttons a bit.

> 3. I need to use the QR Code mode to scan qr codes. I did bind the Google Camera app to double click the power button so I can easily take pictures and scan QR Codes, without the need to tap anything else. But It's definitely a cool feature that the GrapheneOS camera can scan many more Code Types and also that the GrapheneOS Camera removes exif data by default

You can open it via the standard Android QR scan quick setting if you use it a lot.

> 4. The mode slider at the bottom doesn't let me scroll further than one mode, this means I'd need to swipe thrice to get to the QR Code mode from the Video mode (or tap twice) because when in QR Code mode I can't see the Video mode and vice versa

We can consider adjusting how switching modes works.

> 5. I currently tested it out for writing this comment and my phone definitely got way warmer than when using the Google Camera App. And it used 5% Battery while being active for 7 minutes. EDIT: very weird it did now vanish from the Battery Usage list in the Settings, and in the App Info it also says it did not use any battery since the last full charge (It did before)

It shouldn't consume more power than Pixel Camera. Don't know why that would be the case. It does do things a fair bit differently. CameraX is improving which brings improvements to our app without us having to do much and we implemented some things ourselves like the parallelized image saving in the background.

> 6. I think it does not support the Astrophotgraphy (Tripod) mode for the Night Mode.

We'd need them to add this to the extension API.


> That's not what the zoom buttons do in Pixel Camera and it doesn't switch at exactly the telephoto magnification zoom value but rather adjusts when it switches based on the available light because the telephoto camera can't handle low light as well.

Ooh TIL, thank you, I just tried it out, that's definitely a bit confusing. Would it work to just have 2/3 buttons that switch between the cameras? Or is this not possible (because you need to supply a zoom level)?

> You can swipe down for settings and can swipe left/right between modes.

That works definitely better!

> It shouldn't consume more power than Pixel Camera. Don't know why that would be the case.

Will keep an eye out on this, but after quickly testing the few things you mentioned in your comment my phone already got a bit warmer.

> We'd need them to add this to the extension API.

Ah okay, that's a bummer!


> I never got to try GrapheneOS because a single unresolved doubt: does its Camera app benefit from the same image improvement features that the stock camera has?

Our Camera app has hardware-accelerated HDR+ and Night mode on current Pixels. Pixel Camera can be used on GrapheneOS for the full feature set it provides. It has the same hardware acceleration on GrapheneOS unless you toggle it off via our toggle for giving a specific list of Google apps special access to the TPU, image processing accelerator, etc. as Pixels do. Other apps can use these features too but via more limited standard APIs.

The main thing you lose on GrapheneOS is that certain financial apps choose to ban using a non-Google-certified OS which we think is a violation of anti-competition laws/regulations and intend to pursue it as a legal and/or regulatory issue. It should be possible to run all Android apps from the Play Store on any OS maintaining a comparable security model, and Google shouldn't get to have veto power over this as they do now. The certification rules for OEMs forbid implementing some of the privacy/security features we provide so it's clearly an unacceptable system even if it was open to us to get certified. It'd also be ridiculous to have each release delayed by third party certification if that was required.

> Especially now that Pixels started shipping with hardware chipsets specifically designed to accelerate certain kinds of image processing, this is a feature that's quite important to me not to lose when changing systems.

The hardware accelerated image processing and general purpose neural net acceleration via the TPU fully works on GrapheneOS.

Certain fancy AI features can't be used via our optional sandboxed Google Play feature because it's a set of regular sandboxed apps and we didn't provide toggles for granting it privileged access for niche functionality. As a simple example, the wake via hotword feature in Google Assistant can't be used because that's not a capability available to regular apps with a normal permission. We could implement our own wake system or make a toggle for it, it's just not a priority.


Thanks a lot! A text such as this one would have been great to find somewhere relevant when looking for a bit of technical information about whether the image processing pipeline will be the same or not when taking photos or videos. With this message you've probably driven me towards finally investing the time to try GrapheneOS :)


I had the same doubt, but my understanding is that all the image processing takes place before the camera app gets the image feed.

Even with the OpenCamera app (which allows for manual focus control), I'm still getting the same quality of pictures on my GrapheneOS Pixel 7 Pro as with the stock OS and camera app.


The camera app can choose how this works. The GrapheneOS Camera app supports HDR+ and Night mode on Pixels, along with comparable video processing including the HDR+-style merging of data across the low-level frames and EIS. You can use Pixel Camera on GrapheneOS if you want the additional features and the more aggressive HDR+ processing it uses.


I'm on GrapheneOS and running the official Pixel Camera app from Play Store with network disabled. Everything works great and the photos look exactly like on a stock system.


You can download gcam from XDA forums. I even have it on my Samsung Galaxy


The real news here is that Google's internal organizational controls are so weak that some idiot from "sales", or "carrier relations", or whatever you want to call it, can get arbitrarily chosen stupid software preinstalled at the whim of some idiot from a similar function at Verizon.


Another advertisement for https://grapheneos.org ...


is it safe from it?


https://x.com/grapheneos/status/1824163464052396432

> This is one of multiple carrier apps in the stock Pixel OS which we don't include in GrapheneOS. We were aware of it already since we had to go through them and figure out why they exist... GrapheneOS has gone through each of the carrier apps included on Pixel generation to determine their purpose and consequences of including or excluding them. Here it is being excluded from the new adevtool project for ProtonAOSP and GrapheneOS in 2021.


We don't include these carrier apps in GrapheneOS. However, this particular claimed vulnerability isn't a real security issue. The other Verizon apps do add significant attack surface and give them control they shouldn't really have. They're only active on the stock OS when using a Verizon SIM though.

GrapheneOS does massively improve privacy and security, but these apps aren't a problem if you aren't using a Verizon SIM, and even then the retail demo app is not remote attack surface.


Ehm... Does someone think ALL smartphones are not spying devices controlled by many more than their formal human owner?


I don't think smartphones are "spying devices". I mean you'd have to have medically worrying levels of paranoia to think that.

Are smartphones a juicy spying target for governments and corporations? Yeah obviously. They aren't "spying devices" though. Nobody said "guys! let's invent smartphones so we can spy on people".


Does it matter if the chicken came before or after the egg?

You have a big tech company, born out of a small piece of Xerox PARC research, that found a good way to make it anti-user: finally being able to sell very cheap computers that users can only touch with fingers and have essentially no control over, to the point that they have no "administrative account", and some apps even refuse to operate if the user has cracked his/her own system to get root access. Such small devices are essentially just "endpoints" of some remote services, and they are next to useless without those services; they collect a gazillion data points and normally mirror/send them to remote servers. What do you expect? Free 15 GB/user of storage offered without using such data for anything?

You have probably read about cars' collected data being sold to insurers to craft better prices (for the insurer), and you think smartphones are not used in a similar way? You have probably read about Google GPS data used by the police to wrongly accuse some people who were "nearby" crime scenes, etc., and you think all these examples are isolated incidents?


I personally would classify most targeted advertising as "spying"


My next question is: does Verizon know the users' passwords (possibly via some other exploit we don't know about), and did some state agency encourage Verizon to request that software so that access to confiscated devices was a subpoena or FISA court order away?



The smart money assumes "yes" at this point. After Snowden, we are fools if we think otherwise.


Agreed! Snowden taught us a lot and sadly most people don't seem to realize what it means. People say "the government can't get into Signal, it's encrypted and the protocol is well tested" and it's like... yeah, unless they can directly access the keys on your phone, and funny how we keep seeing long-standing vulnerabilities, so actually no normal computer has ever been "secure". Idk, people aren't paranoid enough. But I'm glad you get it!


"spyware contains spyware"

any phone when idle and seemingly "off" is doing all kinds of things and sending all kinds of info that you may not consent to if you were aware.

i dont even trust the wifi off and airplane mode anymore. they are all liars.



Related sources already submitted:

Google Pixel Phones Have Unpatched Flaw in Hidden Android App

https://www.wired.com/story/google-android-pixel-showcase-vu...

(https://news.ycombinator.com/item?id=41256122)

Google sold Android phones with hidden insecure feature, companies find

https://www.washingtonpost.com/technology/2024/08/15/google-...

(https://news.ycombinator.com/item?id=41255631)

iVerify Discovers Android Vulnerability Impacting Millions of Devices

https://iverify.io/press-releases/iverify-discovers-severe-a...

(https://news.ycombinator.com/item?id=41255798)


The app specific to this submission can, I understand, be uninstalled normally (by the user).

Otherwise one could try:

  pm uninstall --user 0 the.unwanted.package
I have to research how effective the above is at removing spyware (it uninstalls only for this user, which is about all you can do with "system" packages).
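For what it's worth, the effect can be inspected and even reversed over adb, which shows how shallow the removal is (package name is just an example):

    # 'Uninstall' a system package for user 0, check it, then restore it.
    adb shell pm uninstall --user 0 the.unwanted.package
    adb shell pm list packages -u | grep the.unwanted.package    # -u includes uninstalled
    adb shell cmd package install-existing the.unwanted.package  # brings it back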


IIRC if it's a system package, that will just disable it, not remove it, and it can be re-enabled fairly easily.


> can be re-enabled fairly easily

Right, but remotely?

If the package is disabled, and not re-enabled by third parties, and this operation can disable spyware... As said, I do not quite know yet how effective this can be in practice.


> Right, but remotely?

It doesn't matter; the attack described in the article already requires physical access.

Regardless, I'm sure Google has hooks that allow them to remotely enable/disable apps on the phone via Play Services, so... yes, remotely.


If it were just the Google Play Services, disabling that would disable remote enabling of applications - it would also make some applications unusable.

One solution to the remote enabling of apps could be a "watchdog" monitor that checks relevant statuses - so you will know if a past user decision got overridden.
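A rough sketch of such a watchdog, run from a trusted computer over adb (the package name and settings keys are the ones from this thread; everything else is just an example):

    #!/bin/sh
    # Naive watchdog sketch: poll the demo-mode settings and the package state
    # so a change made behind the user's back at least gets logged.
    PKG=com.customermobile.preload.vzw
    while true; do
        demo=$(adb shell settings get secure store_demo_mode)
        vzw=$(adb shell settings get secure verizonwireless_store_demo_mode)
        state=$(adb shell dumpsys package "$PKG" | grep -m1 'enabled=')
        echo "$(date) store_demo_mode=$demo vzw_demo_mode=$vzw $state"
        sleep 600   # check every 10 minutes
    done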

Anyway, we need a database profiling Android software with relevance to security and privacy. There must be something around.


> we need a database profiling Android software with relevance to security and privacy.

GrapheneOS maintains a list for Pixel phones:

https://github.com/GrapheneOS/adevtool/commit/9c5ac945f

https://news.ycombinator.com/item?id=41264936


you can safely assume google has full remote access to your android devices [0]

[0] https://www.bbc.com/news/technology-45546276


And when Apple did it with the iPhone "1", they said in a press conference that "we decided that if we did not put the feature in, we would later regret it".

The (BBC) article confirms that some Android implementations allow remote access from Google.

The question remains, how much can `pm uninstall --user 0` limit the unwanted. For example, that remote access to the configurations the article discusses.


Can you provide a source for that quote? I can’t seem to find it.


I am not the person you are replying to, but it is blatantly obvious that IF you have Google Play Services on your phone, they have full access to your device.

They have shown their hand previously by accidentally turning on battery saver mode on everyone's phones.

https://tech.hindustantimes.com/tech/news/google-can-remotel....

They can try to say it is limited to whatever, but I think all the smoke means there is fire. I used to exclusively use Android devices until this incident happened.

At the end of the day, I realize that even if I buy a Google phone, I am still the product. I do not believe it is the same with Apple. Do I trust Apple? No way, but I feel they have more incentive not to treat my data as a product to be sold and not to use my device to spy on me.


So what are you suggesting? Don't disable it?


I didn't say that; not sure how you got that from my comment. Sure, disable it, but don't let that make you believe it's gone and permanently inactive.


So I decided to check if this is really as described.

I downloaded a copy of the app from here, which is the Pixel 8 firmware for June 2024: https://dumps.tadiphone.dev/dumps/google/shiba/-/raw/shiba-u...

The app has two components that are exported:

- broadcast receiver com.customermobile.preload.StartOnBoot, which receives the ACTION_BOOT_COMPLETED broadcast that is sent by the system after the device is booted and first unlocked by the user

- activity com.customermobile.preload.DebugActivity

The onReceive method in StartOnBoot is this:

        public void onReceive(Context context, Intent intent) {
            Log.d(LOG_ID, "on boot called: " + (intent != null ? intent.toString() : "null"));
            boolean demoEnabled = getDemoEnabled(context);
            boolean checkAppPermissions = checkAppPermissions(context);
            Log.d(LOG_ID, "demoEnabled: " + demoEnabled);
            Log.d(LOG_ID, "permissionsCheck: " + checkAppPermissions);
            if (demoEnabled && checkAppPermissions) {
                startDemoStub(context);
            } else {
                if (!demoEnabled || checkAppPermissions) {
                    return;
                }
                startPermissionsCheck(context);
            }
        }
It relies on the getDemoEnabled method returning true to do anything:

        public static boolean getDemoEnabled(Context context) {
            int iStoreDemoMode = 0;
            int iVzwStoreDemoMode = 0;
            String szStoreDemoMode = null;
            String szVzwStoreDemoMode = null;
            
            try {
                iStoreDemoMode = Settings.Secure.getInt(context.getContentResolver(), "store_demo_mode");
            } catch (Exception ignored) {}
            try {
                iVzwStoreDemoMode = Settings.Secure.getInt(context.getContentResolver(), "verizonwireless_store_demo_mode");
            } catch (Exception ignored) {}
            try {
                szStoreDemoMode = Settings.Secure.getString(context.getContentResolver(), "store_demo_mode");
            } catch (Exception ignored) {}
            try {
                szVzwStoreDemoMode = Settings.Secure.getString(context.getContentResolver(), "verizonwireless_store_demo_mode");
            } catch (Exception ignored) {}
            
            return
               (iStoreDemoMode > 0) || 
               (iVzwStoreDemoMode > 0) ||
               "true".equalsIgnoreCase(szStoreDemoMode) ||
               "true".equalsIgnoreCase(szVzwStoreDemoMode)
        }
Which will only happen if system settings "store_demo_mode" or "verizonwireless_store_demo_mode" are set to a positive integer or to "true".

As neither of these are set by default, none of the code behind the startDemoStub method will be executed when the device boots.
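You can verify this yourself over adb; on an out-of-the-box device both settings come back unset:

    # Both settings are unset by default (prints "null" on a stock device).
    adb shell settings get secure store_demo_mode
    adb shell settings get secure verizonwireless_store_demo_mode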

The DebugActivity layout is as follows, and consists of a single button labelled "Simulate StartOnBoot":

        <?xml version="1.0" encoding="utf-8"?>
        <androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android" xmlns:app="http://schemas.android.com/apk/res-auto"
            android:layout_width="match_parent"
            android:layout_height="match_parent">
            <Button
                android:id="@+id/startOnBoot"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:layout_marginLeft="8dp"
                android:layout_marginTop="8dp"
                android:layout_marginRight="8dp"
                android:text="Simulate StartOnBoot"
                app:layout_constraintLeft_toLeftOf="parent"
                app:layout_constraintRight_toRightOf="parent"
                app:layout_constraintTop_toTopOf="@+id/guideline"/>
            <androidx.constraintlayout.widget.Guideline
                android:orientation="horizontal"
                android:id="@+id/guideline"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                app:layout_constraintGuide_percent="0.1"/>
        </androidx.constraintlayout.widget.ConstraintLayout>
All this does if clicked is send an intent which will end up at the onReceive of the StartOnBoot receiver:

        public class DebugActivity extends AppCompatActivity implements View.OnClickListener {
            public void onCreate(Bundle bundle) {
                super.onCreate(bundle);
                setContentView(R.layout.activity_debug);
                findViewById(R.id.startOnBoot).setOnClickListener(this);
            }

            public void onClick(View view) {
                if (view.getId() == R.id.startOnBoot) {
                    sendStartOnBootIntent();
                }
            }

            private void sendStartOnBootIntent() {
                sendBroadcast(new Intent("com.customermobile.preload.StartOnBoot"), "com.customermobile.preload.StartOnBoot.PERMISSION");
            }
        }
We know this because the receiver filters for this action:

        <receiver
            android:name="com.customermobile.preload.StartOnBoot"
            android:enabled="true"
            android:exported="true">
            <intent-filter>
                <category android:name="android.intent.category.DEFAULT"/>
                <action android:name="android.intent.action.BOOT_COMPLETED"/>
            </intent-filter>
            <intent-filter>
                <action android:name="com.customermobile.preload.ManualStart"/>
            </intent-filter>
            <intent-filter>
                <action android:name="com.customermobile.preload.StartOnBoot"/>
            </intent-filter>
        </receiver>
So, nothing to worry about. The app does nothing unless one of the two system settings described above is set.


Why was this installed on all their devices in the first place? I have nothing to do with Verizon; I've never even been to the US. This is something to worry about, because all it takes is someone having access to your unlocked device, for example a border guard demanding to have a look at something. When the user doesn't know this vulnerability exists, it's a problem.

It's a company making remote access software. Who honestly wants this pre-installed on their devices? They can install their malware on their own phones, after having purchased them.


It doesn't do anything though. The "store_demo_mode" and "verizonwireless_store_demo_mode" settings are not set by default, and there doesn't appear to be any way to set these through the UI.

Also, according to other comments, the app is disabled by default. From examining the CarrierSettings app, this does indeed appear to be the case. It seems to require a certain carrier configuration to enable it.

Anyone who wants to enable the functionality of this app would have to go to some effort to do so, including enabling the Developer Settings - which requires the user's PIN, password or pattern - and then connecting to the device over ADB, enabling the app (if not already enabled by the carrier), and changing one of the two settings above.

This really is nothing to worry about. If you disagree, do please feel free to do some analysis of the software yourself so we can discuss this based on the facts.
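If you do want to take a look at your own device, something like the sketch below is a starting point (the class name is mine and hypothetical; the package id is the one discussed elsewhere in this thread). It just asks PackageManager whether the package exists and what its enabled state is; on recent Android versions you'd also need a <queries> entry for that package in your manifest for it to be visible to your app:

        // Hypothetical helper for checking the package state on your own device.
        import android.content.Context;
        import android.content.pm.PackageManager;
        import android.util.Log;

        public class ShowcaseStatusCheck {
            private static final String TAG = "ShowcaseStatusCheck";
            private static final String PKG = "com.customermobile.preload.vzw";

            public static void logStatus(Context context) {
                PackageManager pm = context.getPackageManager();
                try {
                    int state = pm.getApplicationEnabledSetting(PKG);
                    // COMPONENT_ENABLED_STATE_DISABLED* values mean the package is inert;
                    // COMPONENT_ENABLED_STATE_DEFAULT falls back to the manifest setting.
                    Log.d(TAG, PKG + " enabled-state constant: " + state);
                } catch (IllegalArgumentException e) {
                    Log.d(TAG, PKG + " is not installed (or not visible) on this device");
                }
            }
        }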


Search for the app id (com.customermobile.preload.vzw) and you'll see people have been looking at it for years, including figuring out what conditions are needed to activate it even when the package isn't disabled. GrapheneOS has repeatedly publicly talked about these apps, including in threads where the CEO of Trail of Bits (which founded iVerify and was closely involved in this) was active. Incredibly strange marketing stunt which has destroyed their credibility for a lot of people, particularly with them doubling down on it and attacking people debunking it.


The Verizon packages are only enabled when using a Verizon / Verizon MVNO SIM. They're equivalent to uninstalled when not using one. This is a requirement from Verizon for all Android devices to fully use their network features. GrapheneOS has always omitted these apps and therefore is missing certain features like Wi-Fi calling on Verizon which work fine on T-Mobile and other carriers internationally.

Aside from the packages not being enabled without a Verizon SIM, this retail demo app, which iVerify/Palantir claim is a serious vulnerability, has the extra layer of being disabled unless it's manually set up. The access needed to set it up, or especially to enable the package in the first place, is already very invasive.

They're also trying to portray it as if Google unnecessarily or accidentally included this without knowing what it is, but that's really not the case. Verizon was using this for actual demos in their stores and required OEMs to install it to have Verizon sell their phones. Pixels made sure the packages were completely disabled along with the other Verizon apps when not using Verizon, and this one is also disabled even when using them.

Issues in these apps are an overall Verizon Android problem, not Pixel specific, and hardly something which was overlooked on Pixels. They know it's unfortunate that these apps are needed and have gradually gotten rid of the need for most carrier apps. Verizon is the only significant holdout. One carrier in the whole world has this problem. The retail demo app is hardly part of the issue. It turned out Verizon doesn't actually use it anymore, so Google removed it in Android 15, which can be seen from the Android 15 Beta releases that are available.

It appears entirely correct that Google deemed this not even a Low severity vulnerability. Android 15, where the app is gone, will probably be released in September.

Incredibly strange that this got portrayed as a serious issue when it's not really relevant to anyone. Meanwhile, tons of serious vulnerabilities get fixed in both Android and iOS on a regular basis. Non-Pixel Android devices don't ship Low/Moderate severity patches for the most part until the next yearly release since they skip monthly/quarterly releases and only apply High/Critical severity backports. They've got a lot of other weaknesses compared to Pixels too.

iPhones and Pixels with the stock OS are both still getting exploited by Cellebrite Premium with no sign of stopping it for more than a couple months at a time after certain updates:

https://grapheneos.social/@GrapheneOS/112826067364945164

As a bonus, there was a whole inaccurate news cycle about that too recently with the same publications falsely claiming Cellebrite couldn't exploit the latest iOS based on leaked docs from April 2024. They simply fell a bit behind after a release as often happens and had already caught up BEFORE the news cycle. We posted the new July 2024 docs as the news cycle was still propagating. Now they're onto their next fake story, and they'll simply move on to the next and the next without ever correcting or retracting all the false claims. Even the New York Times and Washington Post published this fake Showcase story promoting iVerify/Palantir.


Thank you for sharing all this extra information and context, in this comment and the others.

It's been refreshing to read through the GrapheneOS threads on this and see some actual evidence-based discussion.


We had to go through each of these apps years ago to figure out what we would be missing from excluding them. We knew Showcase was disabled at another layer and we're very aware of the CarrierSettings system for distributing APN and other carrier configuration data since we had to make our own implementation:

https://github.com/GrapheneOS/platform_packages_apps_Carrier...

This is where we mark Showcase as omitted for current era GrapheneOS, but it was never included:

https://github.com/GrapheneOS/adevtool/blob/0957926ce747e2d8...

Here's the part of the main non-MVNO Verizon carrier configuration where it enables these packages:

https://github.com/GrapheneOS/adevtool/blob/0957926ce747e2d8...

It's also in the MVNO configuration. We update these carrier configurations via an adevtool command we made for fetching them from the relevant Google Play API, similar to their CarrierSettings app, and include those in GrapheneOS instead of just using the ones extracted from the latest Pixel stock OS. That means there's less delay than checking it in the stock OS factory images, since it's fully up-to-date as of the last time we ran the tool, which we do at least monthly after each new AOSP / stock Pixel OS release.

The packages being disabled is nearly the same as them being uninstalled and installed on demand, meaning the overall set of apps is only actually a real world attack surface for Verizon / Verizon MVNO users.

On GrapheneOS, since we expand verified boot, having privileged apps like this enabled/installed adds some trusted persistent state we don't want to have, but that's not relevant to the stock OS, which has a narrower goal for verified boot and isn't impacted by this. We do various things to reduce trust in persistent state, but this is still far from something with real world relevance; it would only have theoretical relevance if it were included in GrapheneOS, which it never has been.


Why is it there? Hanlon's razor, incompetence.

What if a border guard has access to your _unlocked phone_ and can re-enable an app?

It would be easier for the border guard to instead "adb install government-spyware-app.apk" than to enable some weird third-party app whose management servers they don't even control.

If your threat model includes an adversary having access to your unlocked device, the least of your problems are apps like this.


It's there on Pixels with the stock OS and other devices with Verizon support because it's one of the apps that are part of the standard suite for fully supporting Verizon's network. Verizon doesn't do things in standard ways, so functionality like Wi-Fi calling is missing without these apps. It's not at all Pixel specific. Verizon included it with these apps because they used it for demos in their stores. They don't use it anymore, so Pixels removed it in Android 15 as their response to this. That's public as part of the Android 15 Beta. No actual security impact was found from this, so they didn't consider it a security issue but removed it anyway.

The other Verizon apps which actually get used are still there, but only active when you have an active Verizon SIM. They're essentially uninstalled (packages fully disabled) when not using a Verizon SIM. It's extra attack surface for Android Verizon users in general. The retail demo app part of it isn't actual attack surface, so this whole news cycle is bogus. The other apps are extra attack surface, but they're useful and needed for proper Verizon support because they refuse to be normal.

ADB shell gives far more access than this app has so it doesn't matter as a physical attack vector for someone with the password.


The Verizon apps are disabled (equivalent to uninstalled) unless you have a Verizon SIM. If you do have a Verizon SIM, this app's functionality is still disabled unless the demo mode is enabled, which implies whoever enabled it already had access largely equivalent to, or greater than, what the app provides.


So what? There are people who need remote support.


This is what:

>According to iVerify, once activated, the application downloads a configuration file via an insecure connection, which can result in system-level code being executed. The configuration file is retrieved from a domain hosted by AWS over unsecured HTTP, which leaves the configuration and the device vulnerable to malicious code, spyware and data wiping.


https://xkcd.com/463/

The "unsecured HTTP" is about as relevant as lactose is for a butterfly.


The app isn't used by Verizon anymore.

How long will they keep the domain they used for that?


Huh? Are you really saying that downloading configuration files over HTTP is fine? (I’m really struggling to find a charitable interpretation)


Of course it is. If you want security, you should secure the files (e.g., signatures, public key, whatever), not the carrier pigeon used to send them.
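As a rough sketch of what I mean (the class name, the detached-signature scheme and the choice of ECDSA here are just illustrative, not anything this particular app does): pin a public key in the app and verify a signature over the config bytes before applying them, regardless of how they were fetched:

        // Hypothetical example of "secure the files, not the transport":
        // verify a detached signature over the downloaded config against a
        // public key pinned in the app before applying it.
        import java.security.KeyFactory;
        import java.security.PublicKey;
        import java.security.Signature;
        import java.security.spec.X509EncodedKeySpec;

        public final class ConfigVerifier {
            private final PublicKey pinnedKey;

            public ConfigVerifier(byte[] pinnedPublicKeyDer) throws Exception {
                // DER-encoded public key baked into the APK at build time.
                this.pinnedKey = KeyFactory.getInstance("EC")
                        .generatePublic(new X509EncodedKeySpec(pinnedPublicKeyDer));
            }

            public boolean isAuthentic(byte[] configBytes, byte[] detachedSignature) throws Exception {
                Signature verifier = Signature.getInstance("SHA256withECDSA");
                verifier.initVerify(pinnedKey);
                verifier.update(configBytes);
                // Apply the config only if this returns true; otherwise discard it,
                // whether it arrived over HTTP or HTTPS.
                return verifier.verify(detachedSignature);
            }
        }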


I think their argument is it shouldn’t download any configuration via any connection.


No it's that HTTP means nothing


Of course it's fine. You sign the files


HTTP means nothing


[flagged]


>Please don't comment on whether someone read an article. "Did you even read the article? It mentions that" can be shortened to "The article mentions that".

https://news.ycombinator.com/newsguidelines.html



