What would change your mind? Genuine question.
The adversarial test is public and runnable in 5 minutes:
git clone https://github.com/Lama999901/metagenesis-core-public
python demos/open_data_demo_01/run_demo.py
If output isn't PASS/PASS on your machine, I want to know.
If the protocol design is flawed, I want to know where specifically.
Known limitations are machine-readable: reports/known_faults.yaml
First of all, I don't want to run anyone's code without a proper explanation, so help me understand this. Let's start with the verifier. The third-party verifier receives a bundle without knowing what its contents are and without access to the tool used to take the measurements; it just runs a single command against that bundle, which presumably contains both the expected results and the actual measurements, and both of those can easily be tampered with. What does that actually prove?