This is a smart approach to API testing - capturing real production patterns is way more valuable than synthetic tests.
One question: how do you handle sensitive data in the captured traces? We've been working on API governance at toran.sh and found that policy enforcement during trace capture can be tricky - especially ensuring PII doesn't leak into test fixtures.
Great work on the trace replay mechanism!
Thanks! Great question. We have a Transforms system that lets you define redaction rules (redact, mask, replace, or drop) using matchers with JSONPath support. Transforms are applied at capture time, so sensitive data never leaves your service boundary.
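To make the idea concrete, here's a minimal sketch of what capture-time redaction can look like. This is a hypothetical illustration, not Tusk's actual Transforms API: it uses simple dotted paths instead of full JSONPath, and the `apply_transforms` function, rule format, and placeholder values are all made up for the example.

```python
def apply_transforms(payload: dict, rules: list[tuple[str, str]]) -> dict:
    """Return a copy of payload with redaction rules applied.

    Each rule is (dotted_path, action); action is one of
    "redact", "mask", "replace", or "drop". Hypothetical sketch only.
    """
    def transform(node, parts, action):
        if not isinstance(node, dict) or not parts or parts[0] not in node:
            return node
        key, rest = parts[0], parts[1:]
        out = dict(node)  # copy so the captured trace is never mutated
        if rest:
            out[key] = transform(out[key], rest, action)
        elif action == "redact":
            out[key] = "[REDACTED]"
        elif action == "mask":
            s = str(out[key])
            out[key] = "*" * max(len(s) - 4, 0) + s[-4:]  # keep last 4 chars
        elif action == "replace":
            out[key] = "user@example.com"  # static placeholder value
        elif action == "drop":
            del out[key]
        return out

    for path, action in rules:
        payload = transform(payload, path.split("."), action)
    return payload

trace = {"user": {"ssn": "123-45-6789", "email": "a@b.com",
                  "card": "4111111111111111"}}
rules = [("user.ssn", "redact"), ("user.card", "mask"), ("user.email", "drop")]
print(apply_transforms(trace, rules))
# {'user': {'ssn': '[REDACTED]', 'card': '************1111'}}
```

The key property is the one mentioned above: the rules run before the trace is persisted, so the fixture on disk only ever contains the sanitized version.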
Full docs here: https://docs.usetusk.ai/api-tests/pii-redaction/basic-concep...
Would love to hear what patterns you've found work well at Toran!