Hacker News

sdesol · 01/20/2025 · 3 replies

I agree, and this thread got me thinking about how I could package WASM in my chat app to execute LLM-generated code. I think a lot can be achieved today with a well-constructed prompt. For example, the prompt can say: if you are asked to perform a task like calculating numbers, write a program in JavaScript that can be compiled to WASM, and wait for the response before continuing.
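
Roughly what that loop could look like, as a sketch rather than working code: callLLM() and compileToWasm() below are placeholders for the app's own chat call and whatever JS-to-WASM toolchain is used, and the `main` export convention is just an assumption for illustration.

    // Sketch only: callLLM() and compileToWasm() are placeholders, not real APIs.
    const SYSTEM_PROMPT =
      "If you are asked to perform a task like calculating numbers, reply only " +
      "with a self-contained JavaScript function named `main` that returns the result.";

    async function answerWithWasm(userQuestion) {
      // 1. Ask the model for code instead of an answer.
      const source = await callLLM(SYSTEM_PROMPT, userQuestion);
      // 2. Compile it to a .wasm module; compileToWasm() stands in for that step.
      const wasmBytes = await compileToWasm(source);
      // 3. Instantiate with an empty import object: the generated code can
      //    compute, but cannot touch the page, filesystem, or network.
      const { instance } = await WebAssembly.instantiate(wasmBytes, {});
      return instance.exports.main();
    }

The empty import object is what makes WASM attractive here: the generated code gets nothing it isn't explicitly handed, so it can compute but not reach out to the page or the network.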


Replies

wat10000 · 01/20/2025

External tool use and general real-world integration seem to be really lacking at the moment. Maybe current models are still too limited, but it seems like they should be able to do much better if they weren't effectively running in a little jar.

Philpax · 01/20/2025

Don't really need WASM for that - have you tried Claude Artifacts?

diggan · 01/20/2025

If only we had a function in JavaScript that could execute JavaScript code directly; then we wouldn't need WASM (assuming it's just you + the assistant locally).
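
That function does exist, of course; a minimal sketch of the point, with `llmResponse` standing in for whatever code the assistant returned:

    // `llmResponse` stands in for code returned by the assistant.
    // new Function() avoids leaking local scope, but it is still arbitrary
    // code execution, so only reasonable when it's just you + the assistant.
    const llmResponse = "[1, 2, 3].reduce((a, b) => a + b, 0)";
    const result = new Function(`"use strict"; return (${llmResponse});`)();
    console.log(result); // 6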
