There are plenty of brilliant people who use Python. However, in every one of these boom cycles with Python I dealt with A LOT of developers with horrific software engineering practices, little understanding of how their applications and dependencies work, and just plain bizarre ideas of how services work. Like the one who shows up with a single 8k-line run.py containing maybe 3 functions, asking to "deploy it as a service" and expecting us to literally launch `python3 run.py` for every request. It takes 5 minutes to run. It assumes there is only one execution at a time per VM because it always writes to /tmp/data.tmp. Then they pose a lot of "you guys don't know what you're doing" questions like "yeah, it takes a minute, but can't you just return a progress bar?" In a REST API? Or "yeah, just run one per machine. Shouldn't you provide isolation?" Then there is the guy who zips up his venv on a Mac or Windows machine and expects it to just run on a Linux server. Or the guy who has no idea what system libs his application needs and is genuinely confused that we're not running a full Ubuntu desktop in a server environment. Or the guy who hands you a 12GB Docker image because "well, I'm using Anaconda."
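For context on why that fixed /tmp path is a problem: two concurrent runs on the same VM silently clobber each other's scratch file. A minimal sketch of the safer pattern, using the standard library's tempfile module (the data.tmp name is just carried over from the example above):

```python
import tempfile

# Anti-pattern: a hard-coded path shared by every process on the machine.
# Two concurrent runs will overwrite each other's data mid-execution:
#
#     with open("/tmp/data.tmp", "w") as f:
#         f.write(results)

# Safer: let the OS hand each process its own unique scratch file,
# cleaned up automatically when the context manager exits.
with tempfile.NamedTemporaryFile(mode="w", suffix=".tmp") as f:
    f.write("intermediate results")
    f.flush()
    print(f"scratch file unique to this process: {f.name}")
```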
Containers have certainly helped a lot with Python deployments these days, even if the Python community was late to adopt them for some reason. Throughout the 2010s, when containers would have provided a much better story, especially for Python where most libraries are just C wrappers and you must pip install on the same target environment, the Python developers I dealt with were all very dismissive of them and just wanted to upload a zip or tarball because "Python is cross platform. It shouldn't matter." So we had to invent all sorts of workarounds to make sure hundreds of random system libs were installed, because who knew what they were using and what pip would need to build their packages. Prebuilt wheels were a lot less common back then too, which made pip installs resource intensive, slow, and flaky because some system lib was missing or had been updated. Even now, Python application Docker images routinely land in the tens of GBs.
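For what it's worth, the image-size problem is largely avoidable today. A hedged sketch of the kind of Dockerfile that keeps a Python service image in the hundreds of MBs rather than double-digit GBs, assuming a requirements.txt whose dependencies ship prebuilt wheels (the app module and port here are made up for illustration):

```dockerfile
# Slim base image instead of a full Anaconda distribution:
# roughly 150MB compressed vs multiple GB.
FROM python:3.12-slim

WORKDIR /app

# Install dependencies in their own layer so Docker's cache
# skips the pip install on code-only changes.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Hypothetical entrypoint: a real service runs behind a WSGI/ASGI
# server, not `python3 run.py` per request.
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "app:app"]
```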