"Now, with LLM-generated content, it’s hard to even build mental models for what might go wrong, because there’s such a long tail of possible errors. An LLM-generated literature review might cite the right people but hallucinate the paper titles. Or the titles and venues might look right, but the authors are wrong."
This is insidious, and if a human did it they'd be fired and/or cancelled on the spot. Yet we continue to rave about how amazing LLMs are!
It's actually a complete reversal of the situation with self-driving car AI. Humans crash cars and hurt people all the time. AI cars are already much safer drivers than humans. Yet we all go nuts when a Waymo runs over a cat while ignoring the fact that humans do that on a daily basis!
Something is really broken in our collective morals and reasoning.
> AI cars are already much safer drivers than humans.
I feel this statement should come with a hefty caveat.
"But look at this statistic" you might retort, but I feel the statistics people pose are weighted heavily in the autonomous service's favor.
The frontrunner in autonomous taxis only runs in very specific cities for very specific reasons.
I avoid using them in a feeble attempt to 'do my part', but I was recently talking to a friend and was surprised to hear that they avoid these autonomous services because the cars drive what would be, to a human driver, very strange routes.
I wondered if these unconventional, often longer routes were taken in order to stick to well-trodden and predictable paths.
"X deaths/injuries per mile" is a useless metric when the autonomous vehicles only drive in specific places and conditions.
To get the true statistic you'd have to filter the human driver statistics to match the autonomous services' data. Things like weather, cities, number of and location of people in the vehicle, and even which streets.
These service providers could do this; they have the data, the compute, and the engineering. But they're disincentivized from doing it as long as everyone keeps parroting their marketing speak for them.
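To be concrete about what that filtering would look like, here's a minimal sketch. Everything in it is hypothetical: the file names, the columns, and the choice of conditions are made up for illustration, and a real analysis would need far richer exposure data than this.

```python
import pandas as pd

# Hypothetical inputs: one log of human-driven miles/crashes and one for AVs,
# each row tagged with the conditions it was driven under.
# Columns (invented for illustration): city, weather, road_type, miles, crashes
human = pd.read_csv("human_driving.csv")
av = pd.read_csv("av_driving.csv")

# The AV's operating envelope: the set of conditions it actually drives in.
conditions = ["city", "weather", "road_type"]
envelope = av[conditions].drop_duplicates()

# Keep only human miles driven under the same conditions the AV operates in.
matched_human = human.merge(envelope, on=conditions)

def crash_rate_per_million_miles(df):
    return df["crashes"].sum() / df["miles"].sum() * 1e6

print("Human (all conditions):   ", crash_rate_per_million_miles(human))
print("Human (matched envelope): ", crash_rate_per_million_miles(matched_human))
print("AV:                       ", crash_rate_per_million_miles(av))
```

The apples-to-apples comparison is the matched-envelope human rate against the AV rate; quoting the all-conditions human rate is exactly the weighting problem described above.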