Hacker News

ryandrake · yesterday at 2:44 PM

> What the article describes sounds like what many devs would land on given the browser APIs available.

> To reiterate, at no point am I saying this is good or acceptable. I think there’s a massive privacy problem in the tech industry that needs to be addressed.

These two sentences highlight the underlying problem: Developers without an ethical backbone, or who are powerless to push back on unethical projects. What the article describes should not be "what many devs would land on" naturally. What many devs should land on is "scanning the user's browser in order to try to fingerprint him without consent is wrong and we cannot do it."
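For concreteness, the technique being objected to is roughly this: collapse a handful of environment attributes into one stable identifier, with no cookie and no consent prompt. A minimal, hypothetical sketch (the attribute names are illustrative, not any site's actual implementation):

```python
import hashlib
import json

def fingerprint(attributes: dict) -> str:
    """Hash a set of environment attributes (user agent, screen size,
    timezone, fonts, ...) into one stable identifier."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

# Two visits from the same environment yield the same ID -- which is
# exactly why this works as tracking, and exactly the privacy problem.
fp = fingerprint({
    "ua": "Mozilla/5.0 ...",
    "screen": "2560x1440",
    "tz": "America/New_York",
})
```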

To put it more extremely: if a developer's boss said "We need to build software for a drone that will autonomously fly around and kill infants," the developer's natural reaction should not be: "OK, interesting problem. First we'll need a source of map data, and a vision algorithm that identifies infants..." Yet our industry is full of this "OK, interesting technology!" attitude.

Unfortunately, for every developer who is willing to draw the line on ethical grounds, there's another developer waiting in the recruiting pipeline more than willing to throw away "doing the right thing" if it lands him a six-figure salary.


Replies

haswell · yesterday at 2:54 PM

I completely agree.

Fighting against these kinds of directives was a large factor in my own major burnout and ultimately quitting big tech. I was successful for a while, but it takes a serious toll if you’re an IC constantly fighting against directors and VPs concerned only with solving some perceived business problem, regardless of the technical barriers.

Part of the problem is that these projects often address a legitimate issue that has no “good” solution, and that makes pushing back/saying no very difficult if you don’t have enough standing within the company or aren’t willing to put your career on the line.

I’d be willing to bet good money that this LinkedIn thing was framed as an anti-bot/anti-abuse initiative. And those are real issues.

But too many people fail to consider the broader implications of the requested technical implementation.

turtletontine · yesterday at 6:16 PM

> These two sentences highlight the underlying problem: Developers without an ethical backbone, or who are powerless to push back on unethical projects.

One reason your boss is eager to replace everyone with language models: they won’t have any “ethical backbone” :’)

jt2190 · yesterday at 3:47 PM

> These two sentences highlight the underlying problem: Developers without an ethical backbone, or who are powerless to push back on unethical projects. What the article describes should not be "what many devs would land on" naturally. What many devs should land on is "scanning the user's browser in order to try to fingerprint him without consent is wrong and we cannot do it."

I think using LinkedIn pretty much amounts to agreeing to participate in “fingerprinting” (essentially identifying yourself) to that system. There might be a blurry line somewhere around “I was just visiting a page hosted on LinkedIn.com and was not myself browsing anyone else’s personal information,” but otherwise LinkedIn exists as a social network/credit bureau-type system. I’m not sure how we navigate this need to have our privacy while simultaneously needing to establish our priors to others, which requires sharing information about ourselves. The ethics here are not black and white.

jcgrillo · yesterday at 4:18 PM

You can't actually push back as an IC. Tech companies aren't structured that way, and there's no employment protection of any kind, at least in the US. So the most you can do is protest and resign, or protest and be fired. Either way, it'll cost you your job. I've paid that price and it's steep. There's no viable "grassroots" solution to the problem; it needs to come from regulation. Managers need to serve time in prison, and companies need to face meaningfully damaging fines. That's the only way anything will get done.

mrguyorama · yesterday at 5:01 PM

I integrate these kinds of systems in order to prevent criminals from being able to use our ecommerce platform to utilize stolen credit cards.

That involves integrating with tracking providers to best recognize whether a purchase is being made by a bot, whether it matches "normal" signals for that kind of order, and, importantly, whether the credit card is being used by the tracking identity that normally uses it.
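As a hedged illustration of what such an integration produces (all field names here are invented, not any real provider's API), the signals described above get combined into something like a risk score:

```python
def fraud_risk(order: dict) -> float:
    """Combine hypothetical tracking-network signals into a 0..1 risk score.
    Field names are illustrative only."""
    score = 0.0
    if order.get("bot_suspected"):
        # The tracking provider flagged automation on this session.
        score += 0.5
    if order.get("card_identity") != order.get("session_identity"):
        # The card is being used by an identity that doesn't normally use it.
        score += 0.3
    if order.get("amount", 0) > 3 * order.get("typical_amount", float("inf")):
        # Order is far above this buyer's usual spend.
        score += 0.2
    return min(score, 1.0)
```

The point of the comment stands either way: the score is only meaningful because the provider can resolve a "card identity" and a "session identity" at all, which presupposes web-wide tracking.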

Even the GDPR gives us enormous leeway to do literally this, but it requires participating in tracking networks that have what amounts to a total knowledge of purchases and browsing you do on the internet. That's the only way they work at all. And they work very well.

Is it Ethical?

It is a huge portion of the reason why ecommerce is possible, and significantly reduces credit card fraud, and in our specific case, drastically limits the ability of a criminal to profit off of stolen credit cards.

Are people better off from my work? If you do not visit our platforms, you are not tracked by us specifically, but the providers we work with are tracking you all over the web, and definitely not just on ecommerce.

Should this be allowed?

orochimaaru · yesterday at 6:40 PM

One works for money. And money is important. Ethics isn’t going to pay the mortgage, send the kids to university, and all that other stuff. I’m not going to do things that are obviously illegal. But if I get a requirement that needs to be met, I meet it, and the company’s legal team is responsible for the outcome.

In short, you are not going to solve this problem by blaming developer ethics. You need regulation. To get the right regulation, we need to get rid of PACs and lobbying.
