Unfortunately there's no money in privacy, and a lot of money in either outright selling data or cutting costs to the bare minimum required to avoid legal liability.
My wife and I are expecting our third child, and despite my not doing much googling or research into it (we already know a lot from the first two), the algorithms across the board found out somehow. Even my Instagram "Explore" tab, which I accidentally select every now and then, started getting weirdly filled with pictures of pregnant women.
It is what it is at this point. Also I finally got my last settlement check from Equifax, which paid for Chipotle. Yay!
As new moms tend to change their consumer purchasing habits, they are coveted by advertisers. http://www.nytimes.com/2012/02/19/magazine/shopping-habits.h... Certain cohorts and keywords are very valuable, so even searching a medical condition once or clicking on a hiring ad for an in-demand job can shift your ads in that direction for a long time.
> the algorithms across the board found out somehow.
It's worth keeping in mind that this is basically untrue.
In most of these algorithms, there's no "is_expecting: True" field. There are just some strange vectors of mysterious numbers, which can be more or less similar to other vectors of mysterious numbers.
The algorithms have figured out that certain ads are more likely to be clicked if your user vector exhibits some pattern, and that some actions (keywords, purchases, slowing your scroll speed when you see a particular image) should move your vector in that direction.
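To make that concrete, here's a toy sketch in Python of the vector-similarity idea. This is not any real ad platform's code; the dimensions, update rule, and numbers are all made up for illustration.

    import numpy as np

    # Toy example: users and ads live in the same embedding space. There is no
    # field that says "is_expecting"; the 64 dimensions have no human-readable meaning.
    rng = np.random.default_rng(0)
    user_vec = rng.normal(size=64)
    ad_vecs = rng.normal(size=(1000, 64))   # candidate ad embeddings
    item_vec = rng.normal(size=64)          # embedding of something you searched, bought, or lingered on

    def cosine(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # Each signal nudges the user vector a little toward that item's embedding.
    user_vec = 0.9 * user_vec + 0.1 * item_vec

    # Ads are then ranked by how similar their vectors are to yours.
    scores = np.array([cosine(ad, user_vec) for ad in ad_vecs])
    top_ads = np.argsort(scores)[::-1][:5]

So if pregnancy-related content tends to sit in one region of that space, enough small nudges will land your vector there, without anyone ever storing an explicit "pregnant" flag.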
> Unfortunately there's no money in privacy
But there should be, and there should be punishments for data breaches, or at least compensation for those affected. Then there would be an incentive for corporations to take their users' privacy more seriously.
Your personal data is basically the currency of the digital world. This is why data about you is collected left, right, and center. It's valuable.
When I trust a bank to safely lock away my grandmother's jewelry, I have to pay for it, but in return, if the bank gets broken into and all my possessions get stolen, at least I'll get (some) compensation.
When I give my valuable data to a company, I have already paid them (with the data itself), but I have no recourse whatsoever if they get compromised.
Also on the front page of HN right now is a job posting for Optery (YC W22). Seems like they are growing really fast.
Could be as simple as buying a bunch of scent-free soap / lotion and some specific vitamin supplements. Target was able to detect pregnancy reliably back in 2012 from just its own shopping data.
It's also possible they have your location if you went to the hospital, maybe via Meta "partners" or third-party data brokers.
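To illustrate the shopping-data point, here's a made-up toy version of a purchase-signal score. The products, weights, and threshold are all invented; the actual model described in the NYT piece was far more sophisticated.

    # Hypothetical per-product "pregnancy signal" weights -- all invented for illustration.
    SIGNAL_WEIGHTS = {
        "unscented_lotion": 0.8,
        "unscented_soap": 0.6,
        "prenatal_vitamins": 2.5,
        "cotton_balls_bulk": 0.4,
    }

    def pregnancy_score(purchases):
        # Sum the weights of any signal products in a purchase history.
        return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in purchases)

    history = ["unscented_lotion", "prenatal_vitamins", "milk"]
    if pregnancy_score(history) > 2.0:   # made-up threshold
        print("flag shopper for baby-related promotions")

A handful of individually innocuous purchases is enough to cross a threshold like that, which is why it can feel like the algorithm "found out" from nothing.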
Interestingly, in healthcare there is a correlation between companies that license/sell healthcare data to others (usually they try to do this in a revocable way with very stringent legal terms, but sometimes they just sell it outright if there is enough money involved) and their privacy stance... and it's not what you would think. Often it's these companies that are pushing for more stringent privacy laws and practices. For example, they might claim they cannot share anonymized data with academic researchers because of some virtuous privacy rule, when they are actually the ones making money off of selling patient data.

It's an interesting phenomenon I have observed while working in the industry, and it seems to refute your claim that "there's no money in privacy". Another way to think about it: they want to restrict the overall supply of the commodity they are selling, and they do this by championing privacy rules.