FPGAs Need a New Future

20 comments · December 19, 2025

Cadwhisker

This article is a rant about how bad the tools are without going into specifics. "VHDL and Verilog are relics"? Well, so is C, but they all get the job done if you've been shown how to use them properly.

"engineers are stuck using outdated languages inside proprietary IDEs that feel like time capsules from another century." The article misses that Vivado was developed in the 2010s and released around 2013. It's a huge step up from ISE if you know how to drive it properly, and THIS is the main point the original author misses: you need a different mindset when writing hardware, and it's not easy to find training that shows how to do it right.

If you venture into the world of digital logic design without a guide or mentor, then you're going to encounter all the pitfalls and get frustrated.

My daily Vivado experience involves typing "make", then waiting for the result and analysing it from there (if necessary). It takes experience to set up a hardware project like this, but once you get there it's compatible with standard version control, CI tools, regression tests and the other nice things you expect from a modern development environment.

tverbeure

> My daily Vivado experience involves typing "make", …

Exactly my experience with Quartus as well.

One really can’t help but wonder if those who always whine about the IDE/GUI just don’t know any better?

mgilroy

The issue with a software team using an FPGA is that software developers generally aren't very good at doing things in parallel, so they tend to do a poor job of implementing hardware. I previously taught undergraduates VHDL, and the software students consistently struggled with things running in parallel.
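To make the concurrency point concrete, here is a minimal hypothetical Verilog sketch (module and signal names invented). The two always blocks are independent processes: on every rising clock edge they both fire, and the nonblocking assignments sample the values from before the edge, so a and b swap cleanly every cycle, something with no direct analogue in sequential software.

// A minimal sketch of hardware concurrency (hypothetical example).
// Both always blocks run on every rising clock edge; the nonblocking
// assignments (<=) read the values from before the edge, so a and b
// swap each cycle instead of racing each other.
module swap_demo (
    input  wire clk,
    output reg  a,
    output reg  b
);
    initial begin
        a = 1'b0;
        b = 1'b1;
    end

    always @(posedge clk) a <= b;  // sees the pre-edge value of b
    always @(posedge clk) b <= a;  // sees the pre-edge value of a
endmodule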

VHDL and Verilog are used because they are excellent languages to describe hardware. The tools don't really hold anyone back. Lack of training or understanding might.

For many years the consistent issue with FPGA development was that by the time you could get your hands on the latest devices, general-purpose CPUs were good enough. The reality is that if you are going to build a custom piece of hardware, then you are going to have to write the drivers and code yourself. It's achievable; however, it requires more skill than pure software programming.

Again, thanks to low-power and low-cost ARM processors, a class of problems previously handled by FPGAs has been picked up by cheap but fast processors.

The reality is that for major markets custom hardware tends to win, as you can make it smaller, faster and cheaper. In all probability, someone will have built and tested it on an FPGA first.

makestuff

Yeah, I agree it is a lack of understanding of how to use the tools. The main issue I ran into in my undergrad FPGA class as a CS student was not knowing how to use the IDE. We jumped right into trying to get something running on the board instead of taking the time to get everything set up. IMO it would have been way easier if my class had used an IDE as simple as Arduino's, instead of everyone trying to run a virtual machine on their MacBooks to run Quartus Prime.

j-pb

VHDL is ok, Verilog is a sin.

The issue isn't the languages, it's the horrible tooling around them. I'm not going to install a multi-GB proprietary IDE that needs a GUI for everything and doesn't interoperate with any of my existing tools. An IDE that costs money, even though I already bought the hardware. Or requires an NDA. F** that.

I want to be able to do `cargo add risc-v` if I need a small CPU IP, and not sacrifice a goat.

VonTum

Well, really, the language _is_ the difficulty in much of hardware design: both Verilog and VHDL were designed for simulation of hardware, not synthesis of hardware. VHDL may prevent quite a few typing errors that Verilog wouldn't, at the cost of literally 3x larger code. Likewise, both languages have lots of similar-but-not-quite ways of writing things: blocking vs. nonblocking assignments causing incorrect behavior that's incredibly difficult to spot on the waveform, non-exhaustive assignments in always blocks causing latches, maybe-synthesizable for loops, etc. Most of this comes from their paradigm of an event loop: handle all events and the events those events trigger, and so on until all are done, then advance time to the next event. They simulate how the internal state of a chip changes every clock cycle, but they were never meant to actually do the designing of the chip itself.
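To make two of those pitfalls concrete, here is a small hypothetical Verilog sketch (module and signal names invented). The first module uses a blocking assignment inside a clocked block, so the second stage reads the freshly updated value instead of last cycle's; the second module leaves a path unassigned, so synthesis infers a latch.

// Pitfall 1: blocking assignment in a clocked block (hypothetical example).
// Intent: a two-stage shift register. Because "a = d" is blocking,
// "b <= a" reads the freshly updated a, so both stages load d in the
// same cycle and the second register adds no delay.
module blocking_bug (
    input  wire clk,
    input  wire d,
    output reg  a,
    output reg  b
);
    always @(posedge clk) begin
        a = d;    // blocking: a updates immediately
        b <= a;   // reads the new a, not last cycle's value
    end
endmodule

// Pitfall 2: non-exhaustive assignment in a combinational block.
// q is not assigned when en is 0, so the tool has to "remember" the old
// value and infers a transparent latch instead of pure combinational logic.
module latch_bug (
    input  wire       en,
    input  wire [3:0] d,
    output reg  [3:0] q
);
    always @(*) begin
        if (en)
            q = d;
        // missing else branch -> inferred latch
    end
endmodule

Both are legal Verilog; simulators accept them without complaint and synthesis will at best warn, which is exactly why these bugs are hard to spot on a waveform.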

I'm tooting my own horn with this, as I'm building my own language for doing the actual designing. It's called SUS.

Simple things look pretty much like C:

module add : int#(FROM:-8, TO: 8) a, int#(FROM: 2, TO: 20) b, int c { c = a+b }

It automatically compensates for pipelining registers you add, and allows you to use this pipelining information in the type system.

It's a very young language, but I, a few of my colleagues, and some researchers at another university are already using it. Check it out => https://sus-lang.org

tverbeure

Or you could do the right thing, ignore the GUI for 99% of what you’re doing, and treat the FPGA tools as command line tools that are invoked by running “make”…

blackguardx

You can pretty much do everything in Vivado from the command line as long as you know Tcl...

Also, modern Verilog (a.k.a. SystemVerilog) fixes a bunch of the issues you might have had. There isn't much of an advantage to VHDL these days, unless perhaps you are in Europe or work at certain US defense companies.
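As a hypothetical sketch of the kind of fix meant here (names invented): SystemVerilog's dedicated process kinds, always_ff and always_comb, declare intent, so the two classic Verilog pitfalls shown earlier in the thread become things the tools can flag rather than silently accept.

// always_ff declares sequential intent; lint and synthesis tools commonly
// flag blocking assignments or multiple drivers inside it. With nonblocking
// assignments this is a genuine two-stage shift register.
module shift2 (
    input  logic clk,
    input  logic d,
    output logic a,
    output logic b
);
    always_ff @(posedge clk) begin
        a <= d;
        b <= a;
    end
endmodule

// always_comb declares combinational intent, and the default assignment
// covers every path, so no latch can be inferred.
module mux_en (
    input  logic       en,
    input  logic [3:0] d,
    output logic [3:0] q
);
    always_comb begin
        q = '0;          // default value on all paths
        if (en) q = d;
    end
endmodule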

Cadwhisker

# Here's the general flow for Tcl projects. Read UG835 for details.
create_project -in_memory -part ${PART}
set_property target_language VHDL [current_project]
read_vhdl "my_hdl_file.vhd"

# Synthesis and implementation
synth_design -top my_hdl_top_module_name -part ${PART}
opt_design
place_design
route_design

# Reports and output products
check_timing -file my_timing.txt
report_utilization -file my_util.txt
write_checkpoint my_routed_design.dcp
write_bitstream my_bitfile.bit

exmadscientist

The main advantage to VHDL is the style of thinking it enforces. If you write your Verilog or SystemVerilog like it's VHDL, everything works great. If you write your VHDL like it's Verilog, you'll get piles of synthesis errors... and many of them will be real problems.
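One hypothetical reading of that style (names invented): turn off Verilog's permissive defaults and keep everything explicitly declared, sized, and assigned on every path, so the tools complain the way a VHDL compiler would.

// VHDL-style discipline applied to Verilog (hypothetical example).
`default_nettype none           // no implicitly declared 1-bit nets

module counter #(
    parameter integer WIDTH = 8
) (
    input  wire             clk,
    input  wire             rst,
    input  wire             en,
    output reg  [WIDTH-1:0] count
);
    // One clocked process; the output is assigned on every path and
    // widths are spelled out instead of relying on silent truncation.
    always @(posedge clk) begin
        if (rst)
            count <= {WIDTH{1'b0}};
        else if (en)
            count <= count + {{(WIDTH-1){1'b0}}, 1'b1};
        else
            count <= count;   // explicit hold, VHDL-style exhaustiveness
    end
endmodule

`default_nettype wire           // restore the default for other files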

So if you learn VHDL first, you'll be on a solid footing.

exmadscientist

FPGAs need their "Arduino moment". There have been so, so, so many projects where I've wanted just a little bit of moderately complicated glue logic, something pretty easy to dash off in VHDL or whatever. But the damn things require so much support infrastructure: they're complicated to put down on boards, they're complicated to load bitstreams into, they're complicated to build those bitstreams for, and they're complicated to manage the software projects for.

As soon as they reach the point where it's as easy to put down an FPGA as it is an old STM32 or whatever, they'll get a lot more interesting.

timthorn

Sounds like a PLD might suit your use case? Simpler than an FPGA, programmed like an EEPROM, perfect for glue logic.

omneity

If performant FPGAs were more accessible, we'd be able to download models directly into custom silicon, locally, and unlock innovation in inference hardware optimizations. The highest-grade FPGAs also have HBM and are competitive (on paper) with GPUs. To my understanding this would be a rough hobbyist version of what Cerebras and Groq are doing with their LPUs.

It's unlikely this will ever happen, but one can always dream.

rcxdude

FPGA toolchains certainly could do with being pulled out of the gutter, but I don't think that alone will lead to much of a renaissance for them. Almost by definition they're for niches: applications which need something weird in hardware but aren't big enough to demand dedicated silicon, because the flexibility of FPGAs comes at a big cost in die area (read: price), power, and speed.

nospice

To folks who wax lyrical about FPGAs: why do they need a future?

I agree with another commenter: I think there are parallels to "the bitter lesson" here. There's little reason for specialized solutions when increasingly capable general-purpose platforms are getting faster, cheaper, and more energy efficient with every passing month. Another software engineering analogy is that you almost never need to write in assembly because higher-level languages are pretty amazing. Don't get me wrong, when you need assembly, you need assembly. But I'm not wishing for an assembly programming renaissance, because what would be the point of that?

FPGAs were a niche solution when they first came out, and they're arguably even more niche now. Most people don't need to learn about them and we don't need to make them ubiquitous and cheap.

anondawg55

No. FPGAs already have a future. You just don't know about it yet.

jauntywundrkind

Making FPGAs actually available (without encumbering stacks) would be so great. Companies IMO do better when they stop operating from within their moat, and this would be such an amazing use case to lend support to that hypothesis.

Gowin and Efinix, like Lattice, have some very interesting new FPGAs that they've innovated hard on, but which are still only so-so in availability.

Particularly with AI about, having open-source stacks really should be a major door-opening function. There could be an OpenROAD-type moment for FPGAs!

checker659

Cost is also such a big issue.

fleventynine

There are some reasonably affordable models like https://www.lcsc.com/product-detail/C5272996.html that are powerful enough for many tasks.