Hacker News

tverbeure · today at 6:12 AM

I used to be a huge VHDL proponent, talk about the delta cycle stuff, give VHDL classes at work to new college grads and such. And then I moved to the West Coast and was forced to start using Verilog.

And in the 21 years since, I've never once run into an actual simulation determinism issue.

It’s not bad to have a strict simulation model, but if some very basic coding style rules are followed (which everybody does), it’s just not a problem.
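(For readers who haven't hit this: the determinism issue presumably being referred to is the classic Verilog race between always blocks, and the coding-style rule that everybody follows is to use nonblocking assignments for clocked logic. A minimal sketch; module and signal names are made up:)

```verilog
// Race (a hypothetical minimal example): the LRM does not specify the order
// in which the two always blocks run within a time step, so with a blocking
// assignment `b` may capture either the old or the new value of `a`.
module race_example (input clk, input d, output reg a, output reg b);
  always @(posedge clk) a = d;   // blocking write: races with the reader below
  always @(posedge clk) b = a;   // may see a's old or new value
endmodule

// The conventional fix: nonblocking assignments (<=) for clocked logic.
// All right-hand sides are sampled before any left-hand side updates, so
// the result no longer depends on simulator scheduling order.
module no_race (input clk, input d, output reg a, output reg b);
  always @(posedge clk) a <= d;
  always @(posedge clk) b <= a;  // always sees a's pre-edge value
endmodule
```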

I don’t agree at all with the statement that Verilog fails when things become too complex. The world’s most complex chips are built with it. If chips couldn’t be designed reliably with it, that simply wouldn’t be the case.

Anyway, not really relevant, but this all reminds me of the famous Verilog vs VHDL contest of 1997: https://danluu.com/verilog-vs-vhdl/


Replies

e7h4nz · today at 6:22 AM

On a practical level, you're right; most of my team's work is done in Verilog.

That being said, I still have a preference for the VHDL simulation model. A design that builds correctness directly into the language structure is inherently more elegant than one that relies on coding conventions to constrain behavior.

formerly_proven · today at 8:23 AM

This actually sounds a bit like a C/C++ argument. Roughly: yes, you can easily write incorrect code, but when some basic coding conventions are followed, UAF/double free/buffer overflows/... are just not a problem. After all, some of the world's most complex software is built with C/C++. If you couldn't write software reliably with C/C++, that wouldn't be the case.

I.e. just because teams manage to do something with a tool does not mean the tool didn't impede (or vice versa, enable) the result. It just says that it's possible. A qualitative comparison with other tools cannot be established on that basis.
