I truly hope the common theme of headlines like "JWST Just Found Something Which Should Not Exist" won't be joined by stuff like "we used AI(tm) to figure out X, Y, Z".
The last thing we need is hallucinations fucking up the more grounded astrophysics. I'm not saying that's what's happening here; I just worry about AI causing us to bark up the wrong tree, and so forth.
> The last thing we need is hallucinations fucking up the more grounded astrophysics.
You're thinking of the wrong ML. Generative models "hallucinate", and it's as much a feature as it is a bug. ML in astrophysics is not generative: it's used for flagging, "binning" data, and, broadly speaking, classification.
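To illustrate the distinction being made here: the classification-style ML described above maps a measurement to a label from a fixed, closed set, so there's nothing for it to "make up". A minimal sketch, with an invented feature (a color index) and hand-set cut values standing in for what a trained model would learn:

```python
def classify_source(color_index):
    """Toy classifier: bin a source into one of a fixed set of classes
    by hand-set thresholds. A trained model would learn the boundaries
    from data, but the output is still a label from a closed set --
    nothing is generated, so there is nothing to hallucinate."""
    if color_index < 0.3:
        return "blue"
    elif color_index < 0.8:
        return "intermediate"
    return "red"

# Each input lands in exactly one bin.
print([classify_source(c) for c in [0.1, 0.5, 1.2]])
# ['blue', 'intermediate', 'red']
```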
Machine learning (AI) is used everywhere in astronomy. That's how they made the black hole image. Don't confuse the broader 60+ year old world of ML with transformers and diffusion models.
It wouldn't pass any scrutiny if they said AI enhanced the picture and found something new.
Eventually you'll give in to the fact that AI is useful, and maybe revolutionary. Until then, keep using swear words and sticking your head in the sand.
Yeah. Thanks for saying this. Please let the real sciences stay real, the ones that have propelled humanity forward with painstakingly detailed analysis, peer review, and whatnot.
Let's keep AI for vibe coding, cat images, memes, etc.
If anything, it's just going to call out a thing in the image that humans can then go look at. Nothing in astronomy is ever "decided" by a single report; it gets looked at and scrutinized, and then committee-style decisions are made about it. So if someone uses ML to scan every image taken by JWST, and for every nine "yeah, we know about that" findings it calls out one genuinely cool thing, that's still quite a lot of new cool things. It'll just do this faster, and potentially much more in-depth, than a human scanning across the images manually.
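The flag-then-scrutinize loop described above can be sketched in a few lines. This is a toy anomaly flagger, not anything a real survey pipeline uses: the single feature (total brightness) and the z-score threshold are invented for illustration, and real pipelines use far richer features and trained models. The point is only that the ML step produces candidates for humans to look at, not conclusions:

```python
import numpy as np

def flag_candidates(images, z_threshold=3.0):
    """Score each image by how far its summary statistic sits from
    the batch average, and flag outliers for a human to examine.
    Toy sketch: one crude feature (total brightness) per image."""
    features = np.array([img.sum() for img in images], dtype=float)
    z = (features - features.mean()) / features.std()
    # Indices of images worth a human's attention.
    return [i for i, score in enumerate(z) if abs(score) > z_threshold]

# Usage: 99 "ordinary" frames plus one anomalously bright one.
rng = np.random.default_rng(0)
frames = [rng.normal(0.0, 1.0, (8, 8)) for _ in range(99)]
frames.append(np.full((8, 8), 5.0))  # the one "cool thing"
print(flag_candidates(frames))  # only the bright frame's index is flagged
```

Whatever comes out of a step like this still goes through the usual scrutiny; the model only decides what is worth looking at first.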