Spending $7.2k just to run, at best, Qwen3.5-35B-A3B doesn't seem worth it at all.
This is certainly not the most effective use of $7k for running local LLMs.
The answer is a 16" M5 Max with 128GB for $5k. You can run much bigger models than on your setup, and it's an awesome portable machine for everything else.