Hacker News

bigyabai · yesterday at 10:18 PM

I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.

A lot of my questions went away when I got to this line though:

> He’s also fully engaged in the third leg of the “democratizing chip design” stool: education.

This is a valiant effort. Chip design is a hard world to break into, and many applications that could benefit from ASICs aren't being iterated on or tested because the workflow sucks. It's a lot of work to bring that barrier to entry down, but as a programmer I could see how an LLVM-style intermediate representation layer would help designers get up and running faster (a toy sketch of the idea below).
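To make the analogy concrete, here's a toy sketch invented purely for illustration (the names, the three-gate IR, and the back end are all made up; real projects in this space, like Chisel/FIRRTL or Amaranth, are far richer): a front end builds a small netlist IR, and interchangeable back ends lower it to targets such as Verilog.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Wire:
        name: str

    @dataclass(frozen=True)
    class Gate:
        op: str              # "and", "or", or "not"
        out: Wire
        ins: tuple

    class Netlist:
        """The 'IR': declared ports plus a flat list of gates."""
        def __init__(self):
            self.gates, self.inputs, self.outputs = [], [], []

        def inp(self, name):
            w = Wire(name); self.inputs.append(w); return w

        def out(self, name):
            w = Wire(name); self.outputs.append(w); return w

        def add(self, op, out, *ins):
            self.gates.append(Gate(op, out, ins)); return out

    def to_verilog(nl, module="top"):
        """One interchangeable back end: lower the IR to structural Verilog."""
        ports = ", ".join([f"input {w.name}" for w in nl.inputs] +
                          [f"output {w.name}" for w in nl.outputs])
        internal = {g.out.name for g in nl.gates} - {w.name for w in nl.outputs}
        lines = [f"module {module}({ports});"]
        if internal:
            lines.append("  wire " + ", ".join(sorted(internal)) + ";")
        for g in nl.gates:
            if g.op == "not":
                rhs = "~" + g.ins[0].name
            else:
                joiner = " & " if g.op == "and" else " | "
                rhs = joiner.join(w.name for w in g.ins)
            lines.append(f"  assign {g.out.name} = {rhs};")
        lines.append("endmodule")
        return "\n".join(lines)

    # Front end: describe a half adder once; any back end can consume the IR.
    nl = Netlist()
    a, b = nl.inp("a"), nl.inp("b")
    s, c = nl.out("s"), nl.out("c")
    nl.add("and", c, a, b)                  # carry = a & b
    t1 = nl.add("or", Wire("t1"), a, b)     # t1 = a | b
    t2 = nl.add("not", Wire("t2"), c)       # t2 = ~(a & b)
    nl.add("and", s, t1, t2)                # sum = XOR built from and/or/not
    print(to_verilog(nl))

The point of the split is the same as LLVM's: anyone can write a front end (a friendlier design language) or a back end (a new target) without having to touch the other side.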


Replies

charlie-83 · yesterday at 11:32 PM

Isn't HDL basically the intermediate representation you want? Plus, you can learn it with a simulator or an FPGA dev board, which makes it reasonably accessible.
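For what it's worth, the simulation on-ramp really can start this small. A plain-Python toy (not any real HDL simulator; just a gate-level model checked against its arithmetic spec):

    def half_adder(a, b):
        # gate-level model: sum = a XOR b, carry = a AND b
        return a ^ b, a & b

    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            assert 2 * c + s == a + b   # matches the arithmetic spec
            print(f"a={a} b={b} -> sum={s} carry={c}")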

bsder · yesterday at 11:43 PM

> I'm just as confused as you are, honestly. It feels like we've seen the "ASIC for everything" campaign so many times over, and yet only FPGAs and CUDA typically find adoption in the industry.

That's because we don't need more digital. Digital transistors are effectively free (to a first approximation).

The axes we actually need more of involve analog and RF. Lower power consumption, better RF speed and range, faster PCIe links, etc. all require messy analog and RF design. And those are where the expensive tools are. Those are also the complex tools that require genuine knowledge.
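A rough way to see the tooling gap: digital simulation is boolean evaluation, but analog behavior means numerically solving differential equations, which is what SPICE-class tools do at scale. A minimal sketch in that spirit, with component values made up for illustration:

    # Step response of an RC low-pass: dv_out/dt = (v_in - v_out) / (R*C)
    R, C = 1e3, 1e-9            # 1 kOhm, 1 nF  ->  tau = 1 us
    dt, steps = 1e-8, 500       # 10 ns timestep, 5 us of simulated time
    v_in, v_out = 1.0, 0.0      # 1 V step input charging the capacitor
    for _ in range(steps):
        v_out += dt * (v_in - v_out) / (R * C)   # forward-Euler update
    print(f"v_out after 5 tau: {v_out:.3f} V (analytic: {1 - 2.718281828**-5:.3f} V)")

Real analog tools have to do this across process corners, noise, parasitics, and nonlinear device models, which is where the expense and the expertise come in.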

Now, if your AI could deliver analog and RF, you'd make a gazillion dollars. The fact that everybody knows this and nobody has pulled it off should tell you something.
