Hacker News

kaspar030 · today at 3:15 PM

> In Figure 12, they simply stop optimizing the code once desired rate is reached.

Yes. The goal was to handle the maximum data rate of the sensor used, and stop there. Time was limited on both ends.

> Just at the end of the project the Rust firmware gets over a third performance boost, most likely from their OS developers.

The ST intern found those boosts all by himself. He compared the exact MCU & peripheral initialization of the C and Rust firmwares, tightened the I2C timings (where STM32Cube ships vendor-tuned & qualified values), and enabled the MCU's instruction cache, which somehow is not enabled by default in Embassy's HAL. We were quite impressed, actually; the last days before the deadline were quite productive, optimization-wise.
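For readers curious what the icache fix amounts to: on Cortex-M7 based STM32s (F7/H7) it is roughly a one-liner against the core peripherals via the `cortex-m` crate. This is a hedged sketch, not the firmware's actual code, and on Cortex-M4 parts (e.g. F4) the equivalent knob is instead the flash ART accelerator's prefetch/cache bits:

```rust
// Sketch only (assuming a Cortex-M7 target): the instruction cache is
// controlled through the core's SCB registers, which the `cortex-m`
// crate exposes safely.
use cortex_m::Peripherals;

fn enable_icache() {
    // take() hands out the core peripherals exactly once
    if let Some(mut cp) = Peripherals::take() {
        // enable_icache() invalidates the I-cache before turning it on
        cp.SCB.enable_icache();
    }
}
```

On code running from flash with wait states, leaving this off can easily cost the tens of percent described above, which is why it's surprising as a HAL default.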


Replies

bArray · today at 3:29 PM

> Yes. The goal was to handle the maximum data rate of the sensor used, and stop there. Time was limited on both ends.

I understand, and I understand that there were limits to what could be done with the resources available. What irks me is the strength of the claim given the lack of evidence to support it.

> The ST intern found those boosts all by himself. He compared the exact MCU & peripheral initialization of the C and Rust firmwares, tightened the I2C timings (where STM32Cube ships vendor-tuned & qualified values), and enabled the MCU's instruction cache, which somehow is not enabled by default in Embassy's HAL. We were quite impressed, actually; the last days before the deadline were quite productive, optimization-wise.

Fair enough, hats off to the intern. This kind of thing is common with MCUs; even on low-end CPUs, weird defaults can be selected. But the involvement and influence of the OS developers remains unclear.

Again, there's just not enough data to make such strong claims. I think the paper could easily make recommendations: it could say that, at least in some cases (as evidenced), Rust can be a reasonable choice, and it could make an argument for further work.