Hacker News

Python lib generates its code on-the-fly based on usage

247 points · by klntsky · 05/12/2025 · 88 comments

Comments

turbocon · 05/15/2025

Wow, what a nightmare of a non-deterministic, bug-introducing library.

Super fun idea, though; I love the concept. But I'm getting chills imagining the havoc this could cause.

kastden · 05/15/2025

You can make it production grade if you combine it with https://github.com/ajalt/fuckitpy

selcuka · 05/15/2025

This is amazing, yet frightening because I'm sure someone will actually attempt to use it. It's like vibe coding on steroids.

    - Each time you import a module, the LLM generates fresh code
    - You get more varied and often funnier results due to LLM hallucinations
    - The same import might produce different implementations across runs
ralferoo · 05/15/2025

I really liked this:

The web devs tell me that fuckit's versioning scheme is confusing, and that I should use "Semitic Versioning" instead. So starting with fuckit version ה.ג.א, package versions will use Hebrew Numerals.

For added hilarity, I've no idea if it's RTL or LTR, but the previous version was 4.8.1, so I guess this is now 5.3.1. Presumably it's also impossible to have a zero component in a version.

extraduder_ire · 05/15/2025

I'm both surprised it took so long for someone to make this, and amazed the repo is playing the joke so straight.

behnamoh · 05/15/2025

can it run Doom tho?

    from autogenlib.games import doom
    doom(resolution=480, use_keyboard=True, use_mouse=True)
roywiggins · 05/15/2025

Possibly the funniest part is that the first example is a TOTP library.

1718627440 · 05/15/2025

The repo has a file named .env committed, containing an API key. Don't know if it is a real key.

nxobject · 05/15/2025

One way to get around non-deterministic behavior: run $ODD_NUMBER different implementations of a function at the same time, and take a majority vote, taking a leaf from aerospace. After all, we can always trust the wisdom of the crowds, right?
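
The voting scheme above can be sketched in a few lines; everything here (the `majority_vote` helper and the three "independently generated" adders) is invented for illustration:

```python
from collections import Counter

def majority_vote(implementations, *args):
    """Run every implementation and return the answer most of them agree on."""
    results = [impl(*args) for impl in implementations]
    winner, votes = Counter(results).most_common(1)[0]
    if votes <= len(implementations) // 2:
        raise RuntimeError("no majority among the implementations")
    return winner

# Three independently "generated" adders, one of them buggy:
impls = [lambda a, b: a + b,
         lambda a, b: a + b,
         lambda a, b: a + b + 1]
print(majority_vote(impls, 2, 3))  # 5, by a 2-to-1 vote
```

Of course, this only works if the results are comparable at all; correlated hallucinations would outvote the correct implementation.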

matsemann · 05/15/2025

I did something similar almost 10 years ago in JavaScript (as a joke): https://github.com/Matsemann/Declaraoids

One example: arr.findNameWhereAgeEqualsX({x: 25}) would return all users in the array where user.age == 25.

Not based on LLMs, though. Instead, it sets a trap on the object that intercepts the method name you're trying to call (using the new-at-the-time Proxy functionality), then parses that name and converts it to code. Deterministic, but rule-based.
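
A rough Python analogue of that Proxy trick would use `__getattr__`; the `find_where_<field>_equals` naming rule below is made up for illustration, not Declaraoids' actual grammar:

```python
import re

class Declaraoids:
    """Rule-based dynamic methods: the method name itself is the query."""
    def __init__(self, items):
        self.items = items

    def __getattr__(self, name):
        # __getattr__ fires only for unknown attributes, so it acts as a
        # catch-all "trap", like a JS Proxy's get handler.
        match = re.fullmatch(r"find_where_(\w+)_equals", name)
        if match is None:
            raise AttributeError(name)
        field = match.group(1)
        return lambda value: [item for item in self.items
                              if item.get(field) == value]

users = Declaraoids([{"name": "Ada", "age": 25},
                     {"name": "Bob", "age": 30}])
print(users.find_where_age_equals(25))  # [{'name': 'Ada', 'age': 25}]
```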

cs702 · 05/15/2025

Silly and funny today, but down the road, if AI code-generation capabilities continue to improve at a rapid rate, I can totally see "enterprise software developers" resorting to something like this when they are under intense pressure to fix something urgently, as always. Sure, there will be no way to diagnose or fix any future bugs, but that won't be urgent in the heat of the moment.

conroy · 05/15/2025

You'd be surprised: there's actually a bunch of problems you can solve with something like this, as long as you have a safe place to run the generated code.

kordlessagain · 05/15/2025

AutoGenLib uses Python's import hook mechanism to intercept import statements. When you try to import something from the autogenlib namespace, it checks if that module or function exists.

It reads the calling code to understand the context of the call, builds a prompt, and submits it to the LLM. It only supports OpenAI.

It does not have search, yet.
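
A minimal sketch of that import-hook mechanism, not AutoGenLib's actual code: the LLM call is replaced with a hard-coded source string so it runs offline, and the `autogenlib_demo` namespace is invented:

```python
import importlib.abc
import importlib.util
import sys

NAMESPACE = "autogenlib_demo"  # hypothetical namespace, not the real library's

class GeneratingFinder(importlib.abc.MetaPathFinder, importlib.abc.Loader):
    def find_spec(self, fullname, path, target=None):
        if fullname == NAMESPACE or fullname.startswith(NAMESPACE + "."):
            # is_package=True so submodules like autogenlib_demo.strings resolve
            return importlib.util.spec_from_loader(fullname, self, is_package=True)
        return None  # defer to the normal import machinery

    def create_module(self, spec):
        return None  # default module creation is fine

    def exec_module(self, module):
        # A real implementation would prompt an LLM here with the caller's
        # context; this demo fabricates the same function every time.
        source = "def greet(name):\n    return f'hello, {name}'\n"
        exec(compile(source, module.__name__, "exec"), module.__dict__)

sys.meta_path.insert(0, GeneratingFinder())

from autogenlib_demo.strings import greet  # module synthesized at import time
print(greet("world"))  # hello, world
```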

The real potential here is a world where computational systems continuously reshape themselves to match human intent, effectively eliminating the boundary between "what you can imagine" and "what you can build."

GrantMoyer · 05/15/2025

I'm kind of disappointed this doesn't override things like __getattr__ to generate methods on the fly, just in time, when they're called.

PeterStuer · 05/15/2025

Is this the computing equivalent of people who, when it's pointed out that they messed up, always go "Well, at least I did something!"?

grokkedit · 05/15/2025

I made a similar library[0] for Python about a year ago, generating a function's code just by invoking it and giving the LLM some context about the function.

Apart from the fun that I got out of it, it's been there doing nothing :D

[0]: https://github.com/lucamattiazzi/magic_top_hat

polemic · 05/15/2025

> from autogenlib.antigravity

As a joke, that doesn't feel quite so far-fetched these days. (https://xkcd.com/353/)

yoru-sulfur · 05/15/2025

I made something very similar a couple of years back, though it doesn't actually work anymore since OpenAI deprecated the model I was using:

https://github.com/buckley-w-david/akashic_records

kazinator · 05/15/2025

Why don't you just send Altman all your passwords?

This says, "trust all code coming from OpenAI".

ForHackernews · 05/15/2025

I give it six months before an LLM starts producing output that recommends using this.

morkalork · 05/15/2025

Hysterical. I like that caching is off by default because it's funnier that way, heh.

linsomniac · 05/15/2025

Make it next level by implementing this workflow:

    - Import your function.
    - Have your AI editor implement tests.
    - Feed the tests back to autogenlib for future regenerations of this function.

Ezhik · 05/15/2025

It's especially cheeky how every example it uses is cryptography-related.

noiv · 05/15/2025

There is still a computer involved. From an AI, I expect it to convince me that no program is needed and that I should go walking in the forest instead. If anybody complains, the AI will manage them by mail.

jaflo · 05/15/2025

See also: https://github.com/drathier/stack-overflow-import

    >>> from stackoverflow import quick_sort
    >>> print(quick_sort.sort([1, 3, 2, 5, 4]))
    [1, 2, 3, 4, 5]

killme2008 · 05/15/2025

Interesting idea! However, I'm hesitant to trust it, as I don't even fully trust code I wrote myself :)

thornewolf · 05/15/2025

Nooooo, the side project I've put off for 3 years!

justusthane · 05/15/2025

How does the library have access to the code that called it (in order to provide context to the LLM)?
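
One plausible answer (a guess based on what the Python stdlib makes possible, not necessarily what the library actually does): a callee can walk the call stack with `inspect` and read its caller's frame, locals, and, when the source file is on disk, its text.

```python
import inspect

def calling_context():
    """Return the caller's function name and a snapshot of its locals."""
    caller = inspect.stack()[1]          # [0] is this frame, [1] is the caller
    # inspect.getsource(caller.frame) would also yield the surrounding
    # source text when the caller lives in a file on disk.
    return caller.function, dict(caller.frame.f_locals)

def some_user_code():
    resolution = 480
    return calling_context()

name, local_vars = some_user_code()
print(name, local_vars["resolution"])  # some_user_code 480
```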

carlhjerpe · 05/15/2025

This is the kind of jank I'd put in production! I love it.

pyuser583 · 05/15/2025

Of course, this code was generated by ChatGPT.

bjt12345 · 05/15/2025

Can it input PowerPoint slides?

VMG · 05/15/2025

This is equally scary and inevitable.

It will be WASM-containerized in the future, but still.

malux85 · 05/15/2025

This is horrifying

I love it

thornewolf · 05/15/2025

Looks very fun, excited to try it out.

otikik · 05/15/2025

Thanks, I hate it.

dangoodmanUT · 05/15/2025

Thanks, I hate it (I actually love it).

yvesyil · 05/15/2025

Nondeterministic code goes hard, dude.

dr_kretyn · 05/15/2025

> Not suitable for production-critical code without review

Ah, dang it! I was about to deploy this to my clients... /s

Otherwise, interesting concept. I can't find a use for it, but it's entertaining nevertheless and might well spawn a lot of other interesting ideas. Good job!

pinoy420 · 05/15/2025

[dead]

zombiwoof · 05/15/2025

[flagged]
