I wonder if the era of dynamic programming languages is over. Python/JS/Ruby/etc. were good tradeoffs when developer time mattered. But now that most code is written by LLMs, it's as "hard" for the LLM to write Python as it is to write Rust/Go (assuming enough training data on the language ofc; LLMs still can't write Gleam/Janet/CommonLisp/etc.).
Esp. with Go's quick compile time, I can see myself using it more and more even in my one-off scripts that would have used Python/Bash otherwise. Plus, I get a binary that I can port to other systems w/o problem.
Compiled is back?
Nice work detective Simon! I love these “discovery” posts the most because you can’t find this stuff anywhere.
This is either going to save hours… or create very educational outages.
Giving agents Linux has compounding benefits in our experience. They're able to sort through weirdness that normal tooling wouldn't allow. For example, they can read an image, get an error back from the API saying it wasn't the expected format, read the magic bytes to see it was actually a JPEG despite being named .png, and then handle it correctly.
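The magic-byte check itself is trivial; here's a rough Python sketch of what the agent effectively does (the helper name and file path are made up for illustration, not its actual code):

    # Rough sketch of the magic-byte sniffing described above; the helper
    # name and file path are placeholders, not the agent's actual code.
    def sniff_image_type(path):
        with open(path, "rb") as f:
            header = f.read(8)
        if header.startswith(b"\xff\xd8\xff"):       # JPEG magic bytes
            return "jpeg"
        if header.startswith(b"\x89PNG\r\n\x1a\n"):  # PNG magic bytes
            return "png"
        return "unknown"

    print(sniff_image_type("upload.png"))  # can return "jpeg" despite the .png name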
Seems like everyone is trying to get ahead of tool calling moving people "off platform" and creating differentiators around which tools are available "locally" to the models, etc. This also takes the wind out of the sandboxing folks' sails, as it probably won't be long before "local" tool calling can effectively do anything you'd need to do on your local machine.
I wonder when they'll start offering virtual, persistent dev environments...
I wonder how long npm/pip etc. will even make sense.
Dependencies introduce unnecessary LOC and features which, more and more, are just written by LLMs themselves anyway. It is easier to just write the necessary functionality directly. Whether that is more maintainable or not is a bit YMMV at this stage, but I would wager it is improving.
And so it begins - Skynet 3.0.
Regular default ChatGPT can also now run code in Node.js, Ruby, Perl, PHP, Go, Java, Swift, Kotlin, C and C++.
I'm not sure when these new features landed because they're not listed anywhere in the official ChatGPT release notes, but I checked with a free account and they're available there as well.
Congratulations. One insecure buggy code generator connected to an insecure packaging "system", PyPI.
We are eagerly awaiting Claude Launch, which will be connected to ICBM bases. The last thing humanity will hear is a boring 100-page LLM-written mea culpa from Amodei, in which he'll say that he warned about the dangers but it was inevitable.
Has Gemini lost its ability to run JavaScript and Python? I swear it could when it launched, but now it's saying it doesn't have the ability. Annoying regression when Claude and ChatGPT are so good at it.
Thank God, this was extremely annoying
Maybe soon we'll have single-use applications, where ChatGPT writes an app for you on the fly in a cloud sandbox, you interact with it in the browser to fulfill your goal, and afterwards the app is shut down and thrown away.
Not sure if this is still working. I tried getting it to install cowsay and it ran into authentication issues. Does it work for other people?
How much compute do you get in these containers? Could I have it run whisper on an mp3 it downloads?
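For context, the kind of thing I'd want it to run is roughly this, assuming the openai-whisper package installs in the container (model size and file name are just placeholders, not something I've actually tried there):

    # Hypothetical transcription script using the openai-whisper package;
    # the model size and mp3 name are placeholders.
    import whisper

    model = whisper.load_model("tiny")           # smallest model, least compute
    result = model.transcribe("downloaded.mp3")  # transcribe the fetched audio
    print(result["text"])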
Did I miss the boat on chatgpt? Is there something more to it than the web chat interface?
I jumped on the Claude Code bandwagon and I dropped off chatgpt.
I find the chatgpt voice interface to be infuriating; it literally talks in circles and just spews summary garbage whenever I ask it anything remotely specific.
but… will gpt still get confused by the ellipses that its document viewer ui hack adds? probably yes.
As an infosec guy I'm going to go ahead and buy a bigger house