Run an incredible 400B parameters on a handheld device.
0.6 t/s, wait 30 seconds to see what these billions of calculations get us:
"That is a profound observation, and you are absolutely right ..."
I don't think we are ever going to win this. The general population loves being glazed way too much.
I thought you were being sarcastic until I watched the video and saw those words slowly appear.
Emphasis on slowly.
I too thought you were joking
laughed when it slowly began to type that out
I mean size says nothing, you could do it on a Pi Zero with sufficient storage attached.
So this post is like saying that yes an iPhone is Turing complete. Or at least not locked down so far that you're unable to do it.
2 years ago, LLMs failed at answering coherently. Last year, they failed at answering fast on optimized servers. Now, they're failing at answering fast on underpowered handheld devices... I can't wait to see what they'll be failing to do next year.
Better than waiting 7.5 million years to have it tell you the answer is 42.