Hacker News

Show HN: I used AI to recreate a $4000 piece of audio hardware as a plugin

147 points · by johnwheeler · last Saturday at 1:08 AM · 100 comments

Hi Hacker News,

This is definitely out of my comfort zone. I've never programmed DSP before, but I was able to use Claude Code to help me build this with CMajor.

I just wanted to show you guys because I'm super proud of it. It's a 100% faithful recreation based on the schematics, patents, and ROMs that were found online.

So please watch the video and tell me what you think.

https://youtu.be/auOlZXI1VxA

The reason I think this is relevant is that I've been a programmer for 25 years and AI scares the shit out of me.

I'm not a programmer anymore. I'm something else now. I don't know what it is, but it's multi-disciplinary, and it doesn't involve writing code myself, for better or worse!

Thanks!


Comments

franky47 · last Sunday at 1:34 PM

I used to do that exact job 10 years ago (without AI, obviously). I imagine that career is very different now.

There was something exciting about sleuthing out how those old machines worked: we used a black box approach, sending in test samples, recording the output, and comparing against the digital algorithm’s output. Trial and error, slowly building a sense of what sort of filter or harmonics could bend a waveform one way or another.
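
The comparison step was essentially this (a toy Python sketch with a made-up test signal and error metric, not our actual tooling; the second-harmonic "hardware" distortion is invented for illustration):

```python
import numpy as np

def compare_outputs(hw_out, sim_out):
    """RMS error between a hardware recording and the digital
    algorithm's output for the same test signal."""
    n = min(len(hw_out), len(sim_out))
    diff = np.asarray(hw_out[:n]) - np.asarray(sim_out[:n])
    return float(np.sqrt(np.mean(diff ** 2)))

# Toy example: one second of a 440 Hz sine; the "hardware" adds a
# little second-harmonic distortion the model doesn't capture yet.
sr = 48_000
t = np.arange(sr) / sr
model_out = np.sin(2 * np.pi * 440 * t)
hardware_out = model_out + 0.05 * np.sin(2 * np.pi * 880 * t)

print(compare_outputs(hardware_out, model_out))  # small but nonzero
```

You'd then tweak the algorithm (a filter here, a waveshaper there) until that error stopped shrinking.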

I feel like some of this is going to be lost to prompting, the same way hand-tool woodworking has been lost to power tools.

figassis · yesterday at 11:53 AM

I don’t think we were ever supposed to be programmers. A lot of us are scared because we assumed that knowing every detail of a system or language, and being able to conjure a system with code, was the point of the profession. But it was always building things, or engineering, just with different tools. If we get to the point where we can ask AI to 3D-print a spaceship and also build JARVIS into it for navigation, then your job will become something else, like figuring out how to build brain-computer interfaces on our way to becoming cyborgs (or whatever) for FTL journeys. Building interfaces will no longer be something we do; UIs will just be conjured on the fly, contextually, by the AI.

Our challenge will always be to keep track of all the foundational knowledge so we can rebuild it all if it comes crashing down (AI or some other event tries to end us).

You should feel excited about it, and level up to the next thing where you will be needed: building reliable, heterogeneous, self-healing systems, often without a contract between them.

This will mean you can conjure up an entire tax management system, a financial system, or a government management system quickly, and have them all talk to each other so people can just go about their lives.

A dam is built, and you immediately have a system that can operate it and all of its equipment.

This may give manufacturers freedom to innovate without worrying about breaking things. Just install it and let the AI learn it, tell you if it needs to calibrate the new equipment, or adjust the existing system to better integrate it, take better advantage of it, etc.

There is so much to do in that and many other directions (healthcare, for example: why not eat Big Pharma's lunch?) that we should be excited, not afraid. Of course, current AI is nowhere near this, and maybe what enables it will take an entirely different shape, but the fact that we're all putting effort into getting there instead of worrying about Angular vs. React is what I love most.

hebejebelus · last Sunday at 2:01 PM

I was hoping that the video was a walkthrough of your process - do you think you might share that at some point?

> I'm not a programmer anymore. I'm something else now. I don't know what it is but it's multi-disciplinary, and it doesn't involve writing code myself--for better or worse!

Yes, I agree. I think the role of software developer is going to evolve into a much more administrative, managerial role, dealing more with the organisation you're in than with actually typing code. Honestly, I think it was probably always heading in this direction, but this is definitely quite a step change. I wrote about it, a little incoherently, on my blog just this morning: https://redfloatplane.lol/blog/11-2025-the-year-i-didnt-writ...

Blackthorn · last Sunday at 1:48 PM

How can you say it's a 100% faithful recreation if you've never programmed DSP before?

Xmd5a · last Sunday at 3:15 PM

Very nice work. I’m curious: what kinds of projects are you guys currently working on that genuinely push you out of your comfort zone?

I had a small epiphany a couple of weeks ago while thinking about robot skin design: using conductive 3D-printed structures whose electrical properties change under strain, combined with electrical impulses, a handful of electrodes, a machine-learning model to interpret the measurements, and computational design to optimize the printed geometry.

While digging into the literature, I realized that what I was trying to do already has a name: proprioception via electrical impedance tomography. It turns out the field is very active right now.

https://www.cam.ac.uk/stories/robotic-skin

That realization led me to build a Bergström–Boyce nonlinear viscoelastic parallel rheological simulator using Taichi. This is far outside my comfort zone. I’m just a regular programmer with no formal background in physics (apart from some past exposure to Newton-Raphson).

Interestingly, my main contribution hasn’t been the math. It’s been providing basic, common-sense guidance to my LLM. For example, I had to explicitly tell it which parameters were fixed by experimental data and which ones were meant to be inferred. In another case, the agent assumed that all the red curves in the paper I'm working with referred to the same sample, when they actually correspond to different conducting NinjaFlex specimens under strain.

Correcting those kinds of assumptions, rather than fixing equations, was what allowed me to reproduce the results I was seeking. I now have an analytical, physics-grounded model that fits the published data. Mullins effect: modeled. Next up: creep.
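
To make the fixed-vs-inferred distinction concrete, here's a toy Python sketch; the model, names, and numbers are invented for illustration and have nothing to do with the actual Bergström–Boyce equations:

```python
import numpy as np

# Hypothetical model: stress = k * (strain - e0)^2, where e0 is FIXED
# by the experimental setup and only the stiffness k is inferred.
E0_FIXED = 0.02  # known from the experiment, not a fit parameter

def basis(strain):
    return (strain - E0_FIXED) ** 2

rng = np.random.default_rng(0)
strain = np.linspace(0.05, 0.5, 20)
true_k = 3.1
data = true_k * basis(strain) + rng.normal(0.0, 0.001, strain.size)

# Linear least squares for the single free parameter k: with the model
# linear in k, the estimate is just a projection onto the basis.
b = basis(strain)
k_fit = float(b @ data / (b @ b))
print(k_fit)  # close to 3.1
```

If you accidentally let the agent treat E0_FIXED as a second free parameter, the fit can still look good while the physics is wrong, which is exactly the kind of assumption I had to correct by hand.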

We’ll see how far this goes. I’ll probably never produce anything publishable, patentable, or industrial-grade. But I might end up building a very cheap (and hopefully not that inaccurate), printable proprioceptive sensor, with a structure optimized so it can be interpreted by much smaller neural networks than those used in the Cambridge paper.

If that works, the effort will have been worth it.

LatencyKills · last Sunday at 1:13 PM

This is fantastic. I’m currently building a combustion engine simulator doing exactly what you did. In fact, I found a number of research papers, had Claude implement the included algorithms, and then incorporated them into the project.

What I have now is similar to https://youtu.be/nXrEX6j-Mws?si=XdPA48jymWcapQ-8 but I haven’t implemented a cohesive UI yet.

dubeye · last Sunday at 1:41 PM

Awesome. In 2025 I made a few apps for my small business that I had spent hours trawling the web looking for, and I have little coding skill.

Sometimes it feels like I'm living in a different world, reading the scepticism on here about AI.

I'm sure there are enterprise cases where it doesn't make sense, but for your everyday business owner it's amazing what can be done.

Maybe it's a failure of imagination, but I can't imagine a world where this doesn't impact enterprise in short order.

gbraad · last Sunday at 7:46 PM

Isn't that like the Ursa Major Stargate 323 Reverb? Greybox audio released code for this about a year ago: https://github.com/greyboxaudio/SG-323

utopiah · last Sunday at 2:12 PM

I'm not in the domain, though I did dabble with a DAW and tinker with a PGB-1 and its open source firmware, but how far would you say CMajor helped? I feel like picking the right tool (framework, paradigm, etc.) alone can make or break a project.

So, to better understand how special this is (especially since I don't see a link to the code itself), I'd appreciate hearing how one goes from e.g. https://cmajor.dev/docs/GettingStarted#creating-your-first-p... to a working DSP.

jvanderbot · last Sunday at 2:39 PM

On your "Scares the shit out of me" comment.

Use AI like a CNC machinist uses a mill. You're still in the loop, but break the work into manageable "passes" with testing touchpoints. These touchpoints let you understand what's going on. There's nothing wrong with letting AI one-shot something, but it's more fun, and less ennui-inducing, to jump in, look around, and exercise some control here and there. And on larger systems, this is basically required (for now, perhaps).

This is how I do it now: https://jodavaho.io/posts/ai-useage-2025.html

alexjplant · yesterday at 12:23 AM

Nice! Earlier this week I discovered another enterprising engineer working on a digital sim of the Mesa Mark IIC+ preamp using a discrete component modeling approach [1]. Pretty cool stuff coming out in the digital audio production space these days.

[1] https://www.youtube.com/watch?v=GcdyOtO5Id0

newyankee · last Sunday at 7:37 PM

A similar approach might actually help build cheap and decent hearing aids, too.

KellyCriterion · last Sunday at 1:57 PM

Great achievement!

Regarding your new title: you are now some type of "platform operator/manager" of these agents :-))

gus_massa · last Sunday at 1:41 AM

Did you recreate the UI only, or also the internal circuits? Does it produce similar distortion?

drcongo · last Sunday at 1:12 PM

Cmajor, for anyone wondering: https://github.com/cmajor-lang/cmajor

atentaten · last Sunday at 1:12 PM

Nice! Which DAW are you using in the video?


agentifysh · last Sunday at 8:16 PM

We are all glorified QA testers with a software architect title now. Sure, we set the structure of what we want, but the AI does everything else, and most of our time is now spent testing and complaining to the AI.

Pretty soon AI will do the QA portion as well. It will generate any piece of software, even games, for a cool $200/month from a vendor of choice: Microsoft (OpenAI) or Google.

Companies will stop paying for SaaS or complex ERP software; they will just generate their own, which only the AI knows how to maintain, run, and add features to.

It's ironic that software developers are the most enthusiastic about automating their jobs out of existence. No union, no laws that interfere with free market forces.
