i don't know how to begin or what to say, but i feel like saying something about a few Python development issues.
as hosts evolved from physical/virtual machines to containers, server deployment adapted accordingly. hosts were already de facto containers, in that they exposed service interfaces, but deployment via systemd/upstart was unusual; i know supervisord was also used.
virtual environments became popular because of host-server and server-server dependency conflicts. i avoided this style of virtualization by segregating servers and/or letting some servers govern the dependencies of others.
needless to say, containers and machines are quite different. when a container is dedicated to a single server, the container itself isolates that server's dependencies, so a virtual environment adds no value.
thus, i install Python packages globally unless i'm forced to do otherwise.
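for illustration, a minimal Dockerfile sketch of this style (the image tag, file names, and module path are hypothetical):

    # hypothetical Dockerfile for a server-specific container: packages are
    # installed globally because the container itself provides the isolation
    FROM python:3.12-slim
    COPY requirements.txt /tmp/requirements.txt
    RUN pip install --no-cache-dir -r /tmp/requirements.txt
    COPY . /app
    CMD ["python", "-m", "app.server"]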
The need for speed
Astral builds extremely fast tools to make the Python ecosystem more productive.
(how do we measure ecosystem productivity?)
i like Ruff because using Flake8 and isort together requires slightly more effort. Ruff's speed is a nice feature, but it helps only when i make a stupid mistake, e.g. leave trailing whitespace. normally, the bottleneck is waiting for mypy and the tests themselves, and i often walk away.
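roughly, the one-tool convenience looks like this (the src/ path is a placeholder):

    # two tools, two invocations:
    flake8 src/
    isort --check-only src/

    # one tool, one invocation; "I" selects the isort-compatible rules:
    ruff check --select E,F,W,I src/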
in other words, the need for speed is more imaginary than real.
Dependency management
on the other hand, i dislike uv. if i wanted to build a virtual environment in a container, i would use a specialized tool for that purpose. preferring monolithic tools is weird.
most developers needn't create/update their environments that often. ignoring automated deployments, once per day/week might be sufficient. furthermore, even continuous deployment needn't be extremely fast.
like most applications, my server uses third-party packages. i install/upgrade them with pip because it is the standard tool. (gotta love people who say they like standards while recommending deviation :-)
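concretely, the standard workflow is nothing exotic (requirements.txt is a placeholder):

    pip list --outdated                        # see which packages have new releases
    pip install --upgrade -r requirements.txt  # upgrade within the specified bounds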
some packages should be upgraded immediately, e.g. ones that generate documentation. in my experience, immediate upgrades cause test failures only about once a year. precise dependency specifications are essential, but postponing upgrades is driven more by fear than prudence. who hasn't resolved an issue by upgrading?
as it happens, i have mypy pinned to a version that's four months old because newer versions complain about stdlib code. depending on a package that's going through a rough patch is a classic dependency-management scenario. a package manager should let me specify the pinned version and the latest bad version, so that a new release can trigger an upgrade. pip cannot express this; the limitation is a bigger issue than its performance.
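to illustrate (the version numbers are hypothetical): PEP 440 specifiers are ANDed together, so the closest pip gets is excluding each bad release by hand; "use the pin until something newer than the last bad version appears" is a disjunction with no direct spelling.

    # closest expressible form: allow the pin, exclude each known-bad
    # release individually, accept anything newer
    mypy >=1.8.0, !=1.9.*, !=1.10.*, !=1.11.*

    # desired form: ==1.8.0 OR >1.11 -- not expressible as one specifier set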
dependency conflicts are harder to resolve: one must review release notes to discover a workable package combination. thus, a good package manager would identify the conflict, recommend solutions, and link to the relevant release notes.
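pip does have a modest start on the first step, detecting conflicts among already-installed packages, but it stops there:

    pip check   # reports installed packages with incompatible requirements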
in short, there are package manager issues, but focusing on performance seems misguided.