With all the changes in C++, it always makes me wonder how developers these days reason about what the code is actually going to compile to and perform like. I feel like I have a good enough understanding of C and of older C++, but there's a constant influx of new syntax and new concepts in C++, and for a language that is supposed to be for systems programming, much of that seems so far away from "the machine".
> it always makes me wonder how developers these days reason about what the code is actually going to compile to and perform like
What will it compile to? Check out Compiler explorer [0].
What will it perform like? Use benchmarks [1] [2]. Remember that benchmarks depend heavily on the system you run them on. Benchmarks for generically-compiled software on modern hardware will look very different from benchmarks for the same software hyper-optimized for 10-year-old hardware.
So: if it's not tested, it's not Engineered.
For tests, I strongly prefer GTest [3], for a fairly consistent API across benchmarking, testing, and mocking.
> there's a constant influx of new syntax and new concepts in C++
Yes, but you don't have to use all of it. You can still write older-style C++ if you really want to, though I wouldn't recommend it. I think the "sweet spot" right now is around C++17: it offers considerably improved safety compared to "old" C++ (say, pre-C++11) and is fairly easy to understand if you're coming from "old" C++.
[0]: https://gcc.godbolt.org/
[1]: https://quick-bench.com/
[2]: https://github.com/google/benchmark
[3]: https://github.com/google/googletest