You don't really need a virtualenv

2023-10-19 09:35:00

When you develop a Python project, you need to install the project's dependencies. For a long time, tutorials and articles have told you to use a virtual environment to isolate the project's dependencies. This way you don't contaminate other projects' environments or the global interpreter, and you avoid possible version conflicts. We usually have to do these things:

$ python3 -m venv venv  # make a virtualenv named `venv`
$ . venv/bin/activate   # activate the virtualenv
(venv) $ pip install -r requirements.txt  # install the dependencies

There are also workflow tools that simplify this process, such as Pipenv and Poetry. They transparently create virtual environments for you and install dependencies into them, and they are widely used. A virtual environment contains a Python interpreter and has the same directory structure as a normal installation, so it can be used as if it were a standalone Python installation.

The problems with virtual environments

Virtualenvs help us isolate project dependencies, but things get tricky when it comes to nested venvs: one installs the virtualenv manager (like Pipenv or Poetry) using a venv-encapsulated Python, then creates more venvs with that tool, all based on the encapsulated Python. One day a new minor release of Python comes out, and one has to check all those venvs and upgrade them if required before the outdated Python version can safely be deleted.

Another scenario is global tools. Many tools are not tied to any specific virtualenv and are supposed to work with all of them; profiling tools and third-party REPLs are examples. We would like to install them in their own isolated environments too, but that doesn't play well with virtualenvs: even if you have activated the virtualenv of the target project, the tool lives in its own virtualenv and can only see the libraries installed there. So we end up installing the tool once per project.

I've been maintaining the Pipenv project as a collaborator for the past two years and became a member of PyPA in early 2020. I keep asking myself whether a virtual environment is really a must-have for Python projects. npm, for example, doesn't need a cloned node binary; it just uses a node_modules directory that is unique to each project.

PEP 582 -- Python local packages directory

The solution has existed for a long time. PEP 582 originated in 2018 and is still a draft proposal at the time of writing, but I found that it is exactly the Python counterpart of node_modules.

Say you have a project with the following structure:

.
├── __pypackages__
│   └── 3.8
│       └── lib
└── my_script.py

As specified in PEP 582, if you run python3.8 /path/to/my_script.py, __pypackages__/3.8/lib will be added to sys.path, and the libraries inside will become importable in my_script.py.
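To make the mechanism concrete, here is a minimal sketch of the path lookup PEP 582 describes. This is not the reference implementation; the function name find_pypackages is purely illustrative:

```python
# Illustrative sketch of PEP 582's path setup: find a __pypackages__/X.Y/lib
# directory next to the script and, if it exists, put it on sys.path.
import os
import sys


def find_pypackages(script_path, version=None):
    """Return the __pypackages__/X.Y/lib directory next to the script, or None."""
    if version is None:
        # Default to the running interpreter's major.minor version, e.g. "3.8".
        version = "{}.{}".format(*sys.version_info[:2])
    base = os.path.dirname(os.path.abspath(script_path))
    lib = os.path.join(base, "__pypackages__", version, "lib")
    return lib if os.path.isdir(lib) else None


lib = find_pypackages("/path/to/my_script.py", version="3.8")
if lib is not None:
    sys.path.insert(0, lib)  # libraries inside become importable
```

Under PEP 582 the interpreter itself would do this automatically; the sketch only shows where the directory is searched for relative to the script.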

Now let's revisit the two problems from the last section and see how they change with PEP 582. For the first problem, the root cause is that a virtual environment is bound to a cloned Python interpreter on which all subsequent library searching is based. This reuses Python's existing mechanisms without any complex changes, but it makes the entire virtual environment unusable once that interpreter becomes stale. With a local packages directory there is no bundled Python interpreter; the library path is simply appended to sys.path, so you can freely move and copy the directory.

For the second, once again, you just run the tool against the project you want to analyze, and the __pypackages__ directory sitting inside the project is loaded automatically. This way you only need to keep one copy of a global tool and it works with multiple projects.
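From the tool's point of view, the idea can be sketched like this: before touching the project, prepend the project's own __pypackages__ directory (rather than the tool's) to sys.path. The function name enable_local_packages is hypothetical, not part of any real tool's API:

```python
# Hypothetical sketch: how a globally installed tool could see the target
# project's dependencies under PEP 582.  It looks for __pypackages__ inside
# the project directory it is invoked on, not inside its own install location.
import os
import sys


def enable_local_packages(project_dir, version="3.8"):
    """Prepend the project's __pypackages__ lib dir to sys.path, if present."""
    lib = os.path.join(project_dir, "__pypackages__", version, "lib")
    if os.path.isdir(lib):
        sys.path.insert(0, lib)
        return lib
    return None


# A profiler invoked as `mytool /path/to/project` could call:
# enable_local_packages("/path/to/project")
# and then import the project's modules with their dependencies visible.
```

The key difference from a virtualenv-installed tool is that nothing here depends on which environment the tool itself was installed into.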

PDM -- A new Python package manager and workflow tool

Starting from the PEP, I made PDM, a new Python package manager and workflow tool that leverages PEP 582 to get rid of virtualenv entirely. It installs dependencies into the local packages directory __pypackages__ and makes Python interpreters aware of it with a very simple setup. It is not only an implementation of PEP 582 but also the only package manager that supports PEP 621, a new pyproject.toml-based metadata format that has recently become a standard; it is foreseeable that pip will gradually support this format as well. Besides, PDM uses the same dependency resolver as pip and has a full-featured plugin system that allows community-contributed plugins to extend its functionality.

In PDM, PEP 582 is not mandatory; you can also stick with virtualenvs. PDM can detect existing venvs, though it does not create new ones.

Another noteworthy feature is its dependency resolution mechanism -- it tries to lock versions that are compatible with the project's requires-python value. Say your project supports Python 2.7 or 3.6 and above, and you want to add pytest as a development dependency. In Pipenv (ver. 2020.11.15) you have to pin pytest = "<5" manually in the Pipfile, and in Poetry (ver. 1.1.4), running poetry add -D pytest gives you:

The current project's Python requirement (>=2.7,<3.0 || >=3.6,<4.0) is not compatible with some of the required packages Python requirement:
    - pytest requires Python >=3.6, so it will not be satisfied for Python >=2.7,<3.0

Yes, it effectively tells you to raise your project's Python requirement. In PDM, however, the lock succeeds:

❯ pdm add -d pytest
Adding packages to dev dependencies: pytest
✔ 
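The idea behind this can be sketched in a few lines: a candidate version is only acceptable if its requires-python covers the project's entire requires-python range. This is a deliberately simplified model (real resolvers handle full PEP 440 specifier sets; here only ">=X.Y" lower bounds are compared), and the function names are illustrative, not PDM's actual API:

```python
# Simplified model of requires-python-aware resolution: a candidate package
# version is compatible only if it supports every minimum Python version
# the project declares.  Only ">=X.Y" specifiers are handled here.


def lower_bound(requires_python):
    """Parse the minimum version out of a '>=X.Y' specifier (simplified)."""
    spec = requires_python.lstrip(">=")
    return tuple(int(part) for part in spec.split("."))


def covers(candidate_rp, project_minimums):
    """True if the candidate supports every minimum Python the project does."""
    bound = lower_bound(candidate_rp)
    return all(minimum >= bound for minimum in project_minimums)


# The project supports Python 2.7 and 3.6+.
project = [(2, 7), (3, 6)]
covers(">=3.5", project)  # False: would break the project's 2.7 users
covers(">=2.7", project)  # True: an older release line is still lockable
```

This is why PDM can fall back to a pytest 4.x release for a project that still supports Python 2.7, instead of refusing to resolve.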
