I'm about to pull a "dropbox" here, but I am aware of many companies that already do this inside their Git infrastructure. It's not that hard to do when you combine verilator, testbenches in software languages, and cloud CI intended for software. This is one of the big advantages of greenfield designs (and FPGA companies): you can set things up for Verilator and/or CocoTB natively, and then you get to use things like Github actions to do continuous verification.
If you can get the commercial simulators and full support for the awful, ass-backwards, industry-standard verification frameworks (eg UVM), there's a great business here, but the trouble is going to be in getting there.
Thanks for the observations. It's true that every company that needs this eventually figures something out.
The difference is (a) our customers don't want to be in the devops business, and for startups especially it's a severe barrier to entry that we can make disappear, and (b) we are going to keep investing in our products (especially collaboration tools and integrations with waveforms, logs, etc) long past the point where a chip company would decide their internal tools are "good enough" (hint: they're generally not).
UVM support is one of the next items on our priority list.
You can see we have 100% test coverage, as reported by Codecov, and our CI runs the test suite on each PR. This is very normal in the software world, and I guess I don't understand why the hardware world would need a specialized provider just to run Verilator for you.
It's not in GitLab's CI infrastructure, but I have continuous integration set up on a private server for https://gitlab.com/specbranch/r5lite and also for my company's proprietary hardware.
Ya and I've seen similarly basic support in small IP houses that support Verilator alongside whichever proprietary suite the house uses.
Is there a need here? Are there IP design houses that are so bad at CI infrastructure that "we run Verilator for you" is a value add?
I don't mean to denigrate the OP, just wondering what the market is. Undergrads build this stuff and let me tell you my undergrads are not a particularly talented group.
When you are designing high-performance IP, you are trying to ensure that your design is mathematically correct and that its inputs and outputs match a complicated 100-page specification. At the same time, you are trying to carve out the minimum set of workable requirements for "version 1", all while fitting into utilization constraints that are ultimately undefined.
Your mindset is already split across all of that. Building a software dashboard to visualize your test results is the last thing on your mind, and you definitely don't want to be building that dashboard for every one of your customers' platforms.
Having somebody (a company) help on this front is really useful.
As a non-website designer, I used to think the same of tools like Netlify, but they seem to be as popular as ever, especially in a collaborative workspace when you need to hand off a project from one team to the next.
The thing is too: GitLab and GitHub CI are still kind of crap unless you put a bunch of work into them (GitLab in particular really doesn't know what it's doing; they're not dumb, but they aren't good enough).
The functionality on offer here is equivalent to about 30 lines of Github Actions YAML to install verilator, run the tests, and upload the coverage information. [1]
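To make the "about 30 lines" claim concrete, here's a rough sketch of such a workflow. The RTL/testbench paths, the test command, and the job layout are all placeholders; the Verilator flags (`--coverage`, `--cc`, `--exe`, `--build`) and the `verilator_coverage --write-info` conversion step are real, as is the `codecov/codecov-action` upload action.

```yaml
# Hypothetical GitHub Actions workflow: file paths and the simulated
# top-level name are illustrative, not from any particular project.
name: verilator-ci
on: [pull_request]

jobs:
  sim:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Install Verilator
        run: sudo apt-get update && sudo apt-get install -y verilator

      - name: Build and run tests with coverage
        run: |
          # Verilate, compile, and link the testbench in one step
          verilator --coverage --cc --exe --build rtl/top.v tb/tb_top.cpp
          ./obj_dir/Vtop
          # Convert Verilator's coverage.dat to lcov-style info for upload
          verilator_coverage --write-info coverage.info coverage.dat

      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v4
        with:
          files: coverage.info
```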
Generating waveforms is free: Verilator already does that if you pass it the appropriate argument, either --trace or --trace-fst. We usually control that with a single CMake option.
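A sketch of what that single CMake option might look like, using the `verilate()` function from Verilator's own CMake support (which accepts a `TRACE_FST` argument); the option name, target, and source paths are placeholders:

```cmake
# Sketch: gate Verilator waveform tracing behind one cache option.
# Target and file names are illustrative, not from a real project.
option(ENABLE_TRACE "Build the Verilated model with FST waveform tracing" OFF)

find_package(verilator REQUIRED)

add_executable(sim tb/tb_top.cpp)

if(ENABLE_TRACE)
  # TRACE_FST passes --trace-fst through to Verilator
  verilate(sim TRACE_FST SOURCES rtl/top.v)
else()
  verilate(sim SOURCES rtl/top.v)
endif()
```

Flipping it on is then just `cmake -DENABLE_TRACE=ON ..` with no other build changes.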
Complex workflows can get nutty, but what's illustrated here is not a complex workflow.