
Cars worked fine without seatbelts too. Just because the world goes on doesn't mean we can't do better.

Taking a step back though, I suspect there are cultural differences in approach here. Growing up in Europe, the idea of a regulation to make everyone safer is perfectly acceptable to me, whereas I get the impression that many folks who grew up in the US would feel differently. That's fine! But we also have to recognise these differences and recognise that the platforms in question here are global platforms with global impact and reach.




OTOH, the controlling way modern software behaves is a US artifact, so the differences are not necessarily as clear-cut as this.

I grew up and live in Europe. I support the general idea of "regulation to make everyone safer" being an acceptable choice. At the same time, I vehemently oppose third-party interests reaching into my computing device and dictating what I can vs. cannot do with it.

But as you say, "global platforms with global impact and reach" - and so I can't set up my phone to conditionally read out text and voice messages aloud, because somewhere on the other side of the world, someone might get scammed into installing malware, therefore let's lock everything down and add remote attestation on top.

Unfortunately, the problem is political, not technological, and this here is but one facet of it. Ultimately, what SaaS does is give away all leverage: as users, it doesn't matter if we fully own the endpoints, or have a user-friendly vendor: any SaaS can ultimately decide not to serve a client that doesn't give the service a user-proof beachhead.


I really don't think that's a cultural difference. I also grew up and live in the EU. What Google wants just does not solve the problem in any way.

And it's also not actual regulation, just new TOS from a company many are basically forced to interact with.


It might not "solve" the problem, but I'd expect it to significantly mitigate it, no?

I've heard much criticism of it being too heavy-handed, but I don't think I understand criticism that it won't improve security. Could you expand on that?


No. You seem to be implicitly arguing that unsigned apps are inherently less trustworthy than Play Store apps. That's a claim that needs to be proven first. And based on the huge amount of documented data exfiltration performed by Google-approved apps, I'm going to say that claim is false.

I'm arguing that a curation process that includes security review is likely to produce a more secure set of software. Admittedly it might be completely ineffective, but I think that's an unreasonable assumption. So some review is more secure than no review. Now I'm not saying it's strictly "better"; you could argue it creates a false sense of security, but it's still more security.

> I'm arguing that a curation process that includes security review is likely to produce a more secure set of software

I actually totally agree! There is no external entity users can rely on to make sure apps they download are legitimate. I read the thread from root to this comment and I don't see it mentioned, so I'm not sure if you know this and are just arguing something else but...

There is actually nothing about testing or verifying apps themselves in the announcement made by Google. It's just about enforcing developer verification in some Google service and "registering the apps".

https://support.google.com/android-developer-console/answer/... https://android-developers.googleblog.com/2025/11/android-de...

EDIT: I checked your profile, and I now see that you actually work at Google, on Android... Is there something I misunderstood about these announcements?

> you could argue it's a false sense of security, but it's still more security

Well, here I don't agree: I would much rather be aware of the dangers than think I'm safe when I'm actually not.




