Hacker News

functionmouse · today at 2:37 PM · 6 replies

That's because this particular sort of cybersecurity is merely theatrics, with the goal of reducing user agency and increasing paranoia and vendor lock-in. The user-facing friction is the goal. There will always be scams and viruses; the only practical outcome is that you have less control over your computer, and Apple/MS/Google have more. See: sideloading, Wayland, UWPs, iOS JIT, Windows XP and 7 still being used for accessibility.


Replies

nerdjon · today at 2:42 PM

I strongly disagree.

I often have apps on my Mac or iPhone that ask for permission to see my camera, microphone, contacts, etc. that I don't want them to see. But I do want other apps to be able to access those things.

Being able to stop those apps from accessing those resources before they do, instead of trying to fix it after the fact, is incredibly valuable.

Sure, some users just accept everything, but that is not an argument against the prompts existing in the first place.
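The ask-before-access model described above can be sketched as a deny-by-default permission broker: the OS mediates each resource request, prompts before first use, and remembers the answer per app. This is purely illustrative pseudocode, not any real macOS/iOS API:

```python
# Illustrative deny-by-default permission broker (not a real OS API).
grants: dict[tuple[str, str], bool] = {}  # (app, resource) -> allowed?

def request_access(app: str, resource: str, user_says_yes: bool) -> bool:
    key = (app, resource)
    if key not in grants:            # first use: prompt *before* any access
        grants[key] = user_says_yes  # the decision is remembered per app
    return grants[key]

# Decisions are per app: one app may use the camera while another is blocked.
assert request_access("video-chat", "camera", user_says_yes=True)
assert not request_access("flashlight", "contacts", user_says_yes=False)
# Once answered, the grant is sticky; the prompt is not shown again.
assert request_access("video-chat", "camera", user_says_yes=False)
```

The key property is that the check happens before the app ever touches the resource, rather than auditing access after the fact.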

perching_aix · today at 6:49 PM

> this particular sort of cyber security is merely theatrics with the goal of reducing user agency

Literally all security features carry the hazard of being used for oppression, and of being ineffective or counter-effective. That's how constraints work.

You need two things for a security feature:

- a segmentation under which a behavior is considered unsafe or insecure (arbitrary and subjective)

- a technical solution that constrains the behavior of <thing> in <usage context> so that the aforementioned is mitigated

So whether something is "a tool of oppression" or "a tool of safety" is a matter of your alignment with that segmentation. And whether it is theater or not is a matter of functional soundness given a threat model. So is its tendency to become counter-effective.

Constraints are just constraints. Whether they're effective and whether you're disadvantaged by them are separate, independent matters. Empirical ones, too.

ryandrake · today at 3:27 PM

We are moving away from the old world, where you could trust the applications running on your computer, to today's world, where you can't. The Unix permission model assumes that apps running as your user should have access to every device and file you, the user, have access to. The threat used to be "other system users trying to access your files and devices," but now the threat is "applications you run trying to access your files and devices." OS vendors have been slow to adapt to this new threat model.

Even today, any rando application I download and run can read and/or write any file on my system that I own and have permission to read and/or write, unless I go out of my way to run it in a chroot, a container, a jail, or whatever. That's just poor security in a world where nearly every commercially developed application is an attacker.
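A minimal sketch of the ambient-authority problem described above: under the classic Unix model, a process runs with its invoking user's full permissions, so a file's `0600` mode protects against *other users*, not against code you run yourself. The temp file below is a stand-in for real user data such as an SSH key:

```python
import os
import tempfile

# Simulate "user data": a private file readable only by its owner (mode 0600).
fd, secret_path = tempfile.mkstemp()
os.write(fd, b"ssh-private-key-material")
os.close(fd)
os.chmod(secret_path, 0o600)

def rando_application() -> bytes:
    # An untrusted app you downloaded runs *as you*, so the 0600 mode
    # is no obstacle: same UID, full access, no prompt, no audit trail.
    with open(secret_path, "rb") as f:
        return f.read()

print(rando_application())  # the "app" reads the private file unimpeded
os.unlink(secret_path)
```

Sandboxing mechanisms like chroots, containers, and jails work by taking away exactly this ambient authority and forcing the process to be granted access explicitly.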

Zak · today at 4:09 PM

I think we're on the same side in principle. The ability for people to interact with the wider world using general purpose computers that they fully control should be sacrosanct, and attempts to interfere with that such as remote attestation, app store exclusivity, and developer verification are evil.

Sandboxing apps by default is not that. The principle of least privilege is good security. If I vibecode some quick and dirty hobby app and share it with the world, it's better if the robot's mistake can't `rm -rf ~/` or give some creep access to your webcam.

The user should be able to override that in any way they see fit of course.

lpcvoid · today at 3:20 PM

>Wayland

I can see the rest, but why did you mix in Wayland, an open-source display protocol?

Kaliboy · today at 2:57 PM

Maybe I don't understand your point, but why is Wayland in your list?