Hacker News

drtgh · yesterday at 8:58 PM

> AI could repeat this pattern at a larger scale — generating faster results within the existing paradigm, while the structural conditions for disruptive science remain unchanged or worsen.

It will worsen. LLMs discard, lose, and mix data in the statistical "compression" that builds their vector representations. Over time, successive feedback will be like making a JPEG from a JPEG that was itself made from another JPEG: each pass through this lossy "Gaussian" loop degrades the signal further.

Those faster (but worse) results will degrade genuinely valuable data and science at a rate that systematically, statistically drowns out well-done science.
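The lossy-loop claim can be illustrated with a toy simulation (my sketch, not the commenter's; the Gaussian model, sample size, and generation count are arbitrary assumptions): repeatedly fit a Gaussian to samples drawn from the previous generation's fitted Gaussian, i.e. "train" each model on the output of the last one. The fitted variance drifts toward zero, the statistical analogue of re-encoding a JPEG of a JPEG.

```python
# Toy "model collapse" loop: each generation is fit only to samples
# generated by the previous generation's model. Real LLM training is far
# more complex; this only shows how variance decays in a lossy refit loop.
import numpy as np

rng = np.random.default_rng(0)

mean, std = 0.0, 1.0  # generation 0: the "real data" distribution
initial_std = std

for generation in range(500):
    # "Train" on the previous model's output (small sample -> lossy fit).
    samples = rng.normal(mean, std, size=20)
    mean, std = samples.mean(), samples.std()

print(f"fitted std after 500 generations: {std:.3g}")
```

Because the sample standard deviation is a noisy, slightly downward-biased estimate, the log of the fitted spread performs a random walk with negative drift, so diversity is progressively lost even though each individual generation looks plausible.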

IMHO.


Replies

whattheheckheck · today at 6:36 AM

Can you explain further how we can prevent this?