I thought the RTX 6000 Ada was 48GB? If you have 96GB available, that implies a dual-card setup, so you must be relying on tensor parallelism to shard the model weights across the pair.
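If it were a dual-card setup, the sharding would typically look something like this with vLLM's tensor parallelism. This is just a minimal sketch assuming two 48GB cards; the model name is a placeholder, not something from this thread:

    # Minimal sketch: shard one model's weights across two GPUs with
    # vLLM tensor parallelism (hypothetical checkpoint name).
    from vllm import LLM, SamplingParams

    llm = LLM(
        model="some-70b-model",     # placeholder checkpoint
        tensor_parallel_size=2,     # split weights across 2 cards
    )

    params = SamplingParams(max_tokens=64)
    print(llm.generate(["Hello"], params)[0].outputs[0].text)

With a single 96GB card none of that is needed, since the whole model fits in one GPU's memory.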
RTX Pro 6000 - 96GB VRAM - Single card