> Free to ingest and make someone's crimes a permanent part of AI datasets, resulting in forever-convictions?
You're conflating two distinct issues: access to information, and making bad decisions based on that information. Blocking access to the information is the wrong way to deal with this problem.
> a teenager shouldn't have their permanent AI profile become "shoplifter" because they committed a crime at 15 that would otherwise have been expunged after a few years.
This would be a perfect example of something that should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
> Blocking access to the information is the wrong way to deal with this problem.
That's an assertion, but what's your reasoning?
> This would be a perfect example of something which should be made open after a delay. If the information is expunged before the delay, there's nothing to make open.
All across the EU, that information would be available to journalists immediately under exemptions for the processing of personal data solely for journalistic purposes, but it would simultaneously be unlawful for any AI company to process for any other purpose (unless it had another legal basis, such as a government contract).
> Blocking access to the information is the wrong way to deal with this problem.
Blocking (or more accurately: restricting) access works pretty well for many other things that we know will be used in harmful ways. Historically, just having to go to a courthouse in person and request to view records was enough to keep most people from abusing the public information they obtained. It's perfectly valid to say that we want information accessible, but not accessible over the internet or in AI datasets. What do you think the "right way" to deal with the problem is, given that we already know "hope that people choose to be better/smarter/more respectful" isn't going to work?