Hacker News

Is this meant for hardware people or software people? Because I'm a hardware guy and I don't know Haskell. None of my EE friends know Haskell either. I really don't see a serious hardware engineer using this over VHDL or Verilog, even if it is more beautiful or provably better or whatever.


Strangely enough, alternative HDLs embedded in Haskell seem to be a thing. Besides this, there is also Bluespec and Lava.

http://wiki.bluespec.com/

http://raintown.org/lava/


Tom Hawkins is probably around here somewhere.

http://en.wikipedia.org/wiki/Atom_(programming_language)


Immutability looks too much like hardware.


For such a mission-critical application (hardware design), my mind is blown by how laughably bad the HDLs are compared to modern programming languages. Chunks of the languages aren't synthesizable, and which bits are or aren't isn't consistent across vendors. There are also IEEE standards that nobody really seems to care much about. Additionally, it's not clear from a syntax or type-system standpoint whether the code you're writing is synthesizable; you just have to "know" ahead of time. It's terrible! It makes this whole field really obtuse and hard to enter from a design standpoint.

The field really needs some fresh blood or dare I say "disruption". I'm glad people are trying to make things better.


From the limited amount of VHDL I did in my Digital Design class at my university, I found hardware description similar to functional programming. State is expensive to represent in hardware (you need latches, flip-flops, etc), and so VHDL's concept of signals is similar to constants, and components map well to pure functions. With an intelligent enough compiler, I think this is totally feasible.
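To make the analogy concrete, here's a rough plain-Haskell sketch (not actual CLaSH syntax, just illustrative): combinational components really do look like pure functions from inputs to outputs, with no hidden state.

```haskell
-- A 2-to-1 multiplexer: the "component" is just a pure function.
mux2 :: Bool -> Bool -> Bool -> Bool
mux2 sel a b = if sel then a else b

-- A one-bit full adder returning (sum, carryOut).
-- Wires are function arguments; outputs are return values.
fullAdder :: Bool -> Bool -> Bool -> (Bool, Bool)
fullAdder a b cin = (s, cout)
  where
    s    = (a /= b) /= cin                 -- XOR chain for the sum bit
    cout = (a && b) || (cin && (a /= b))   -- majority-style carry logic
```

A compiler for a language like this can map each function to a netlist of gates precisely because there's no mutation to worry about.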

From a more short-term practical standpoint, no, I don't expect anyone to use this. Hardware engineers are incredibly stubborn when it comes to software. Their work typically involves large time investments with lots of costs and risks. For better or worse, they typically don't ever want to add more risk by using an "untrusted" tool, creating a chicken-and-egg problem.

In the second paragraph, the author admits that there's not yet a good way to represent a recursive algorithm in his language. I think this is more of a proof-of-concept that could eventually become useful.
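The recursion problem is worth illustrating. Here's a hypothetical plain-Haskell ripple-carry adder written with ordinary list recursion; the catch is that the list length (and hence the circuit width) is only known at runtime, whereas synthesizable hardware needs the width pinned down statically (CLaSH's answer is type-level-sized vectors):

```haskell
-- Ripple-carry adder over (a, b) bit pairs, LSB first, with carry-in.
-- Each recursive step is one full-adder stage; the carry "ripples" forward.
rippleCarry :: [(Bool, Bool)] -> Bool -> ([Bool], Bool)
rippleCarry []             cin = ([], cin)
rippleCarry ((a, b) : rest) cin = (s : ss, cout)
  where
    s          = (a /= b) /= cin                 -- sum bit for this stage
    c          = (a && b) || (cin && (a /= b))   -- carry into the next stage
    (ss, cout) = rippleCarry rest c
```

Unrolling this into N adder stages is easy once N is fixed, but general recursion (where the depth depends on the data) has no direct hardware translation, which is presumably what the author is up against.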


If the finance industry (which is similarly large-investment and high-risk) can begin to use functional programming for critical applications, then I can see hardware moving towards it as well. If tools like Clash can be made to emit human-readable VHDL, I can see that as a path forward towards adoption.


> If the finance industry (which is similarly large-investment and high-risk) can begin to use functional programming for critical applications then I can see hardware moving towards it as well

I don't think these things are at all comparable. The finance industry is writing code to do software things. The hardware industry is writing code to build hardware.

It is not just a matter of how much money and risk is involved, it's a matter of whether the language is a good mapping for the things it is describing.


The finance industry is writing code for hardware to do financial things. http://www.pcmag.com/article2/0,2817,2424495,00.asp


Regarding EE conservatism toward software tools: this is mainly because we are highly restricted in which tools we can use by the platforms we have to design for. If you want to use a specific FPGA or a specific ASIC process, you have to use the tools that the FPGA vendor or ASIC foundry officially supports.


I totally agree and sympathize here. To work around this, I've seen a few solutions that compile to VHDL or Verilog, since most platforms support one of those.


That is what UC Berkeley's Chisel (https://chisel.eecs.berkeley.edu/) does, and it seems to be what CLaSH does as well. The issue is the same as in the various compile-to-JS languages. The additional level of indirection can make things difficult to debug (and heaven knows HDL code is hard to debug already).

Also, there is the issue of interfacing with third-party IP blocks or built-in FPGA hardware slices. You either have to stub these out in your high-level description or simulate using the generated HDL.



