
omneity (yesterday at 11:42 PM)

If performant FPGAs were more accessible, we could download models directly into reconfigurable hardware, locally, and unlock innovation in inference hardware optimizations. The highest-grade FPGAs also have HBM and are competitive (on paper) with GPUs. To my understanding this would be a rough hobbyist version of what Cerebras and Groq are doing with their LPUs.

Unlikely this will ever happen, but one can always dream.


Replies

rfv6723 (today at 8:33 AM)

FPGAs for AI only made sense when machine learning had diverse model architectures.

After the Transformer took over AI, FPGAs for AI are effectively dead. Transformer inference is dominated by dense matrix multiplication, so ASICs are the solution.

Modern datacenter GPUs are nearly ASICs now.
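A minimal sketch of why this holds: scaled dot-product attention, the core Transformer operation, is just two dense matrix multiplies wrapped around an elementwise softmax, so hardware built to accelerate matmul covers most of the workload. The names and shapes below are illustrative, not from any particular model.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Scaled dot-product attention: matmul, softmax, matmul.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)   # matrix multiply #1
    return softmax(scores) @ V      # matrix multiply #2

rng = np.random.default_rng(0)
seq, d = 8, 16
Q, K, V = (rng.standard_normal((seq, d)) for _ in range(3))
out = attention(Q, K, V)
print(out.shape)  # (8, 16)
```

The feed-forward layers are likewise plain matmuls, which is why fixed-function matrix engines (as in TPUs or GPU tensor cores) capture almost the whole inference cost.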

15155 (today at 2:18 AM)

> highest grade FPGAs also have HBM memory

The three SKUs across Xilinx and Altera that had HBM are no longer manufactured, because Samsung Aquabolt was discontinued.