Show HN: A RISC-V core in Racket (baierouge.fr)
122 points by _mouvantsillage on May 1, 2021 | 27 comments


Author here.

This post is part of an ongoing experiment to use Racket as a platform for hardware description languages. Describing a RISC-V core in Racket is a step in this direction, but the ultimate goal is neither to use Racket itself as an HDL, nor to define an embedded hardware description DSL in Racket. The long-term goal is to create an HDL that would benefit from Racket's "language-oriented programming" facilities, with the ability to simulate digital hardware, but also to generate standard Verilog or VHDL.
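
To give a flavor of the Verilog-generation side, here is a toy sketch (simplified for this comment, not the project's actual code) of how a Racket-hosted HDL can walk an s-expression netlist and emit Verilog text:

    #lang racket
    ;; Toy emitter: the operator set and names are made up for illustration.
    (define (emit-expr e)
      (match e
        [(? symbol? s) (symbol->string s)]
        [`(not ,a)    (format "(~~~a)" (emit-expr a))]
        [`(and ,a ,b) (format "(~a & ~a)" (emit-expr a) (emit-expr b))]
        [`(or ,a ,b)  (format "(~a | ~a)" (emit-expr a) (emit-expr b))]))

    (define (emit-assign out e)
      (format "assign ~a = ~a;" out (emit-expr e)))

    ;; > (emit-assign 'y '(or (and a b) (not c)))
    ;; "assign y = ((a & b) | (~c));"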


Can you do a write-up on the lower-level algorithms, like place and route, circuit constraint-solving optimization, and the process from the netlist stage to autorouting? The engineering behind VHDL and Verilog systems is way too opaque for most software engineers.


The book "Electronic Design Automation" by Wang, Chang, and Cheng is quite approachable.

The reason they are opaque is partly that the problem is hard, and partly that the developers make money directly off the tools (as compilers did in the past).


Physical-layout CAD software is really interesting, though, and heavily scriptable. I’m pretty sure at least one major offering uses some kind of Lisp as its extension language.


A lot of them were written fully in Lisp back in the day, and might still be underneath it all. I.e., it might be more than just being used as an extension language.


My experience in the lower-level aspects of hardware synthesis is limited but I would love to explore this topic.

I use proprietary synthesis tools for FPGAs everyday but I do not develop them. My situation is similar to that of a software engineer who uses compilers and who could only explain how they work in general terms.


Since there was a post about Spectre on the HN front page today, I wonder if HDLs should perhaps come with more formal verification tools; is that on the roadmap?


Spectre isn't really something that you can catch with formal verification. It sort of isn't a bug; that's why it's such a problem.


Only because most models focus on functionality, not on timing.


No, it's because they focus on speed.

Eliminating Spectre is either very hard or very slow. Of course you could formally verify that Spectre does or doesn't happen, but at what cost?


You changed your argument's flavor like a neutrino, used a false dilemma, and had it swim with a red herring. Don't take this as a harsh judgement but as something to work on.

There are ways to solve this now with type systems, dependent type systems, affine type systems, session types, etc., that allow us to track and prove which pieces of data should be visible (for many definitions of "visible") to whom.
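
As a toy illustration of the idea (my own sketch, nothing like a production verification tool): make secret values opaque, so the only flow from secret to public is one explicit, auditable function:

    #lang racket
    ;; Secret values are opaque; code can compute on them but can
    ;; only extract them through `declassify`, which leaves a trace.
    (struct secret (v))                       ; accessor not exported

    (define (classify v) (secret v))
    (define (secret-map f s) (secret (f (secret-v s))))
    (define (declassify s reason)             ; the single exit point
      (log-info "declassify: ~a" reason)
      (secret-v s))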

We also will not know the cost unless we start designing our systems rigorously. Think of it as an accounting problem and an economics problem; engineering is made up primarily of the combination of economics and failure analysis.

We have the means and methods to solve these problems now; we should use them.


While simulation is a well-established practice among digital hardware developers, I have the impression that formal verification is still largely ignored. I hope this changes as free tools such as SymbiYosys (https://symbiyosys.readthedocs.io) emerge.

While this is still a long-term goal, I am planning to experiment with how formal verification can be used in Racket-based hardware tools.
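
As a small taste of the direction (my own illustrative sketch using Rosette, a solver-aided dialect of Racket; this is not code from the project):

    #lang rosette/safe
    ;; Check that two formulations of a full adder's sum bit agree
    ;; for all inputs.
    (define (bxor x y) (not (equal? x y)))    ; boolean xor

    (define (sum-a a b cin) (bxor (bxor a b) cin))
    (define (sum-b a b cin) (bxor a (bxor b cin)))

    (define-symbolic a b cin boolean?)

    ;; `verify` asks the solver for a counterexample;
    ;; (unsat) means the two descriptions are equivalent.
    (verify (assert (equal? (sum-a a b cin) (sum-b a b cin))))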


This is great! I also like your other posts in the series on tiny-hdl. Thank you for writing this up!


Great post! What do you use to draw your state diagrams?


All diagrams were created with Inkscape.


Fascinating work, thanks for creating it!


Very impressive work.

I've only spent about an hour going through this post and its predecessor. I'm a career software engineer, although I've been working very closely with a team of FPGA developers for the last couple of years.

HDL engineers have a very strong preference to see the types of all nets and registers clearly defined. The pain points are mostly in organizing and stitching together larger projects out of smaller modules. You see this reflected in common project structures, where leaf nodes are in straight VHDL or [System]Verilog but larger structures may be stitched together with TCL or some custom in-house tooling.

HDL replacement technologies have a tendency to throw out both the low-level language and the abstraction language. IMO this is in part because they are being built by software engineers who have a hard time with the low-level parts of the standard HDLs.

But it's not the low-level parts (e.g., process blocks, the type system, and arithmetic expressions) that are busted. It's the organization and abstraction parts (e.g., the module systems) that are busted. I know this sounds counter-intuitive, because I know that, as a Schemer, you're used to building systems where it's more or less the same abstraction "all the way down", using a general model of computation as the base. But the practicing hardware engineers I know don't care. They are fully aware of the message-passing model that backs the HDLs. To them that model is an obstacle to be overcome on the way to building circuits.

Caveat: my HDL colleagues are all from the US aerospace/defense sector. They are firmly on the side of preferring VHDL over [System]Verilog based on the former's strictness and explicit behavior. So my perspective and feedback is inherently biased in that direction.


> They are fully aware of the message-passing model that backs the HDLs. To them that model is an obstacle to be overcome on the way to building circuits.

Can you expand on this? I don't quite understand what you mean by "message-passing model" in the context of HDL. Signals are sort of like messages in the no-shared-memory sense, but there is still plenty of spooky action at a distance when synthesizing.

Unless your level of abstraction is at the interconnect level like AXI, timing is always a huge pain, and small changes in design can have significant downstream impact as signals are rerouted. Sure, you could create a data structure that represents a complex module, but it'd be a monstrosity that accepts dozens if not hundreds of closures to enable tweaks like "delay this half of the bus by n and the other half by 2n", and it would have to be modified any time a significant refactor reorganizes the modules.

The core problem with applying software abstractions to FPGAs is that while HDL describes logic, it compiles nondeterministically (to humans) into a format where physical placement and fit make or break that logic. At that point, it's not really an abstraction as programmers see it, but as, say, cabinet makers see it: there is a variety of well-characterized off-the-shelf parts that they can order, and they have a suite of power tools like grinders and routers to modify a part to fit their specific use case. The "arguments" or inputs to the abstraction have to be discovered incrementally: first grind or cut, then try the part out to see if it fits, then mix some sawdust with glue to fill in some spot, then grind or cut some more, and so on; rinse and repeat until all parts fit together correctly.


The message-passing model might refer to discrete event system implementations of digital logic simulators. That is, you can think of the events as messages between logic gates and wires. See for example the simulator in SICP: https://mitpress.mit.edu/sites/default/files/sicp/full-text/...
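
A minimal Racket sketch in that spirit (illustrative only, much simplified from SICP's agenda-based simulator; here changes propagate immediately with no delays): wires hold a value plus a list of actions, and setting a signal "sends" to every subscribed gate:

    #lang racket
    (struct wire (value actions) #:mutable)

    (define (make-wire) (wire 0 '()))

    (define (set-signal! w v)
      (unless (= (wire-value w) v)
        (set-wire-value! w v)
        (for-each (lambda (act) (act)) (wire-actions w))))

    (define (on-change! w action)
      (set-wire-actions! w (cons action (wire-actions w)))
      (action)) ; run once to establish the initial output

    ;; An AND gate "subscribes" to its inputs and "sends" to its output.
    (define (and-gate a b out)
      (define (react)
        (set-signal! out (if (and (= (wire-value a) 1)
                                  (= (wire-value b) 1))
                             1 0)))
      (on-change! a react)
      (on-change! b react))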


> Can you expand on this? I don't quite understand what you mean by "message passing model" in the context of HDL.

See Section 9 of the SystemVerilog language reference manual, IEEE 1800. Bootleg PDFs of the 2005 revision are readily available.

The formal semantics of the language's execution are defined in terms of message-passing. Any lowering to physical hardware should respect the same semantics even if it isn't actually passing literal messages from one reg to the next.


Now I think I understand what you mean.

For many hardware engineers, I think that the message-passing semantics is a conceptual framework for explaining how simulators work. It is considered a "necessary evil" that must be taken into account when writing HDL code.

When I write VHDL, I don't think in terms of communicating processes. I think in terms of combinational and sequential circuits that I describe with processes. The description is written in a way that makes the synthesizer generate the hardware that I had in mind.

I see a possible explanation for this mindset: since only a subset of VHDL or Verilog is actually synthesizable, we cannot rely only on the language semantics to write code that will map to functional hardware.


I totally agree, and that's what I'm getting at.

Other formal models like the lambda calculus have an advantage in that lambda functions and the type systems to work with them aren't that far removed from the machines that execute them. So the practicing software engineer doesn't have a hard time using that model directly.

As you demonstrated, practicing HDL engineers don't generally find the actor model to be a good mental representation of their designs. Nevertheless, even in the synthesizable subset, the languages are formally specified using message-passing processes. They just also happen to admit non-synthesizable designs as well.


Related: one of the de facto netlist formats (what comes out of a synthesis tool like Yosys or Synplify Pro) is EDIF [1], which represents the design as S-expressions. If you use Xilinx tools, the *.dcp files it generates are just ZIP archives with one of these files inside.

[1] https://en.wikipedia.org/wiki/EDIF


If using "signal of vector": how is an update propagated to a process that is sensitive to only some particular bits? Doesn't any update to that signal (whatever the bit) cause all sensitive processes to be evaluated, even when that is a misfire? How, then, did you get such interesting performance numbers showing that "signal of vector" is better?


In the "signal-of-vectors vs vector-of-signals" comparison, "vector" refers to the Racket data type used for arrays. It should not be confused with "bit vectors", which are represented as integers (assuming that a bit can only be 0 or 1). While it is theoretically feasible, I would rather not use an array to store bits.

I think that your question assumes a model of computation similar to VHDL or Verilog. The techniques I use in Racket are similar to those used in Clash (https://clash-lang.org/). In VHDL or Verilog, signals are updated in reaction to events. In Racket, I implement signals as lazy data streams: values are "pulled" from signals, which can trigger the evaluation of other signals. This is possible because I restrict my models to synchronous circuits with a single clock domain.
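
In simplified form (a sketch for this comment, not the exact code from the post), the idea looks like this:

    #lang racket
    ;; A signal is an infinite lazy stream: a promise of (value . rest).
    (define-syntax-rule (signal-cons v s) (delay (cons v s)))
    (define (signal-first s) (car (force s)))
    (define (signal-rest s)  (cdr (force s)))

    (define (constant v) (signal-cons v (constant v)))

    ;; Lift a combinational function to work sample by sample.
    (define (lift2 f sa sb)
      (signal-cons (f (signal-first sa) (signal-first sb))
                   (lift2 f (signal-rest sa) (signal-rest sb))))

    ;; A register delays its input by one clock tick.
    (define-syntax-rule (register q0 s) (signal-cons q0 s))

    ;; Feedback works because nothing is evaluated until pulled:
    (define counter (register 0 (lift2 + counter (constant 1))))

    (define (signal-take s n)
      (if (zero? n)
          '()
          (cons (signal-first s)
                (signal-take (signal-rest s) (sub1 n)))))

    ;; > (signal-take counter 5)  =>  '(0 1 2 3 4)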

You can read more about it in this post: http://guillaume.baierouge.fr/2021/03/14/simulating-digital-...


Cool!

Is there a similarly simple core to compare with in Bluespec and Clash?


Don't know if these would match the "similarly simple" requirement, but there are certainly RISC-V cores written in either Clash ( https://github.com/adamwalker/clash-riscv and https://github.com/losfair/Violet ) or Bluespec ( https://bluespec.com/category/risc-v/ ).




