It might just be our region, but for a long time we couldn't access current frontier models at all — only old GPT-4-level models. Meanwhile, Anthropic rolls out access to every new model on Bedrock within 24 hours of announcement.
Not sure how much Azure OAI has changed, but when I last used it 2-3 years ago, it felt like a farce designed to push you onto provisioned throughput. The throughput quotas were small, the process to request more was bureaucratic, and the Azure SAs were
It was also very clear the OAI and MS teams held each other in contempt (not relevant, but was interesting and grew in the immediate aftermath of the Altman drama).
So why were we using it? OpenAI didn't really have an enterprise go-to-market, Bedrock still relied on Claude 2, and we weren't willing to YOLO on clickthrough agreements.
Once Claude 3 came out, we jumped ship. That sucked too, though I hear it's gotten better since.