Hacker News

manquer · today at 4:34 AM · 0 replies

A full migration is not always required these days.

It is possible to write adapters to API interfaces. Many proprietary APIs become de-facto standards once competitors start shipping compatibility layers out of the box to convince you their product is a drop-in replacement. The S3 API is a good example: every major (and most minor) provider, with the glaring exception of Azure, supports it out of the box now. The PostgreSQL wire protocol is another: many databases speak it these days.
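To make the adapter idea concrete, here is a toy sketch: a provider exposes the S3-style verbs (put_object / get_object) over its own backend, so code written against those verbs works unchanged. The class and its in-memory backend are purely illustrative, not any provider's real SDK.

```python
class S3CompatStore:
    """Minimal in-memory backend speaking S3-style object verbs.
    A real compatibility layer would translate these calls to the
    provider's native storage API instead of a dict."""

    def __init__(self):
        self._buckets = {}

    def put_object(self, Bucket, Key, Body):
        # Mirror the S3 argument names so existing client code drops in.
        self._buckets.setdefault(Bucket, {})[Key] = Body
        return {"ETag": str(hash(Body))}

    def get_object(self, Bucket, Key):
        return {"Body": self._buckets[Bucket][Key]}


# Code written against the S3 verbs runs against it unchanged:
store = S3CompatStore()
store.put_object(Bucket="logs", Key="2024/01/app.log", Body=b"hello")
retrieved = store.get_object(Bucket="logs", Key="2024/01/app.log")["Body"]
print(retrieved)  # b'hello'
```

The point is the interface, not the storage: once enough providers expose the same verbs, the interface itself becomes the standard.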

In the LLM inference world, the OpenAI API spec is becoming that kind of de-facto standard.
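In practice this means switching inference providers is often just pointing the same OpenAI-style request at a different base URL. A minimal stdlib-only sketch (the URLs and model names below are illustrative assumptions, not endorsements of any particular provider):

```python
import json
from urllib.request import Request


def chat_request(base_url, api_key, model, messages):
    """Build an OpenAI-style /chat/completions request for any
    provider that speaks the same wire format."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


msgs = [{"role": "user", "content": "hello"}]
# Only the base URL and model name change between providers;
# the payload shape is identical.
r1 = chat_request("https://api.openai.com/v1", "KEY", "gpt-4o-mini", msgs)
r2 = chat_request("http://localhost:11434/v1", "KEY", "llama3", msgs)
print(r1.full_url)
print(r2.full_url)
```

The same property holds for most hosted providers and local servers that advertise OpenAI compatibility: the client code does not change, only the endpoint.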

There are always caveats of course, and switches rarely go without bumps. It depends on what you are using: stick to the few popular, widely and fully supported features and you are mostly fine, but rely on some niche corner of the API that a provider has not properly implemented and you will hit bugs.

In most cases, bugs at the API-interface level are relatively easy to solve, as they can be replicated and logged as exceptions.

In the LLM world there are few "right" answers for inference outputs, so it is a lot harder to catch and replicate bugs, let alone fix them without breaking something else. You end up retuning all your workflows for the new model.