There's no dynamic memory allocation with (100%) SPARK. That's really limiting. You can write "unsafe" code, but that has the same problems as Ada.
SPARK is not used for the whole system, but only for the < 5% of the code that is safety/security-related in a good architecture.
That is true for parsers like LibJS, but a crypto module or even networking code, which is much more safety-critical, can still be written in SPARK.