## Packaging and environments in Python

Today, most Python projects work the same way: you write one or more requirements text files, then run `$ pip install -r path/to/requirements.txt` to install the packages.

But there's more to the story. Developers have multiple Python projects on the same system, so they use a tool called virtualenv to set up an independent `site-packages/` per project. They then "source" an activation script in their shell to switch over to the virtual environment's `site-packages/`.
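The idea is simple enough to sketch: an isolated environment is just a directory with its own `bin/` and `site-packages/`. Here's a demonstration using the stdlib `venv` module, which covers the same ground as virtualenv (`with_pip=False` only keeps the demo fast):

```python
# A sketch of what virtualenv does: create an isolated environment
# directory with its own bin/ and site-packages/.
import venv
from pathlib import Path

venv.create(".venv", with_pip=False)
print(sorted(p.name for p in Path(".venv").iterdir()))
# "Sourcing" bin/activate simply puts .venv/bin first on PATH, so
# `python` and `pip` resolve to the environment's copies.
```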

## Converting from pip to Pipenv

(Technically, Pipenv still uses pip under the hood.)

I have used custom bootstrap scripts that set up virtualenv and the required Python packages, so anyone wanting to contribute on a Unix-like system could get going automatically. TL;DR: I've also dabbled in writing wrappers that genericize virtual environments across platforms and systems, so people contributing to open source projects can get set up and start changing things ASAP.

I've played with Ruby projects and was surprised by how straightforward packaging is there. There's no sourcing environments, just `$ bundle exec`, and the command automatically runs in the context of the project's packages.

## Enter Pipenv

That's the kind of simplicity Pipenv (kennethreitz/pipenv) provides. It automatically sets up a virtual environment and installs packages via `Pipfile`, a successor to requirements text files. Roughly speaking, pip is to `requirements.txt` what Pipenv is to `Pipfile`. As an added bonus, Pipenv handles dependency resolution.

*Update: It also loads and verifies package checksums.*

### Aside: Checksum support in pip 8.0

pip introduced a hash-checking mode in pip 8.0. It may be worth checking out if you're not ready for `Pipfile` and want safer `requirements.txt` files. To get started, add hashes to your requirements file and install with `$ pip install --require-hashes -r requirements.txt`. hashin can be used to add hashes to requirements files, too.

Plain-old pip doesn't do dependency resolution, but Pipenv, with the help of `Pipfile.lock`, does. This slows things down, and with certain packages you pull in, could break if the vendor locked the version. You can also generate `requirements.txt` files via Pipenv. It will even add hash checksums to the packages, which offer extra peace of mind. (I find I prefer staying with the new `Pipfile` format.)
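What hash-checking boils down to: the sha256 digest of the downloaded archive must match the digest pinned in the requirements file. A minimal sketch, where sample bytes stand in for a real `.whl`/`.tar.gz` download:

```python
# The digest pinned in requirements.txt is an ordinary sha256, as
# hashlib computes it.
import hashlib

downloaded = b"pretend this is a package archive"
pinned = "sha256:" + hashlib.sha256(downloaded).hexdigest()  # from requirements.txt

algo, _, expected = pinned.partition(":")
actual = hashlib.new(algo, downloaded).hexdigest()
print(actual == expected)  # install proceeds only when this is True
```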

## What does a Pipfile look like?

Pipfiles are TOML-format. Here's an abbreviated snippet of mine:

```toml
[[source]]
url = "https://pypi.python.org/simple"
verify_ssl = true
name = "pypi"

[dev-packages]
isort = "==4.2.15"
"autopep8" = "==1.3.3"
pytest = "==3.2.3"
"fabric3" = "*"
"11260ed" = { file = "https://github.com/develtech/fabtools/archive/fabric3.zip" }

[packages]
django = "==1.11.6"
django-debug-toolbar = "==1.8"
django-extensions = "==1.9.1"

[requires]
python_version = "3.6"
```

See how it contains the Python version? That comes in handy when Pipenv creates the virtual environment.

As for `Pipfile.lock`, it's JSON. You definitely won't be editing that one manually. But you won't need to touch `Pipfile` by hand either!

## Grab Pipenv

First, if you haven't already, bootstrap pip on your environment. Then:

```console
$ pip install pipenv
```

## Migrating to Pipfile from legacy requirements files

When you run `$ pipenv install` for the first time, a `Pipfile` and `Pipfile.lock` will be created automatically for you.

### With dependency resolution

`setup.py`-based packages (in `--editable` mode):

```console
$ pipenv install -e .
```

A normal `requirements.txt`:

```console
$ pipenv install -r requirements.txt
```

Production requirements go into the default group (kennethreitz/pipenv#335):

```console
$ pipenv install -r requirements-prod.txt
```

For dev/test dependencies, use `--dev`:

```console
$ pipenv install --dev -r requirements-dev.txt
$ pipenv install --dev -r requirements-test.txt
```

After this, you'll have a `Pipfile` and a `Pipfile.lock`.

### Without dependency resolution

Add `--skip-lock`. No `Pipfile.lock` will be created.

`setup.py`-based packages:

```console
$ pipenv install --skip-lock -e .
```

A normal `requirements.txt`:

```console
$ pipenv install --skip-lock -r requirements.txt
```

Production requirements go into the default group:

```console
$ pipenv install --skip-lock -r requirements-prod.txt
```

For dev/test dependencies, use `--dev`:

```console
$ pipenv install --skip-lock --dev -r requirements-dev.txt
$ pipenv install --skip-lock --dev -r requirements-test.txt
```

After this, you'll have a `Pipfile`.

## Install individual packages

You can install packages via `$ pipenv install`, too:

```console
$ pipenv install sphinx
$ pipenv install sphinx==1.6.0
$ pipenv install flake8 --dev
```

See the `--dev`? You can "group" non-production development packages. Also, your `Pipfile` updates after you install, kind of like `$ yarn add` and `$ npm install --save` do.

## Handling version conflicts

If you're facing a version conflict with packages that would normally work with pip, you can add `--skip-lock`. After that, run `$ pipenv graph` to find which package has the stuck dependency. Notify the maintainer of the package and hope for the best.

```console
$ pipenv install sphinx==1.6.0 --skip-lock
$ pipenv graph
```

## Speed up slow installs

Even if you have a `Pipfile.lock`, you can still choose to ignore it entirely when installing. Adding `--skip-lock` gives you performance and behavior similar to plain-old pip:

```console
$ pipenv install django --skip-lock
```

You'll have a `Pipfile`. No `Pipfile.lock` will be created if it doesn't exist, and if it does exist, it won't be updated.

## Run via pipenv run

You can now run your Python files within the context of the virtualenv Pipenv creates, without sourcing yourself into a virtualenv:

```console
$ pipenv run py.test
$ pipenv run ./manage.py runserver
$ pipenv run ./myfile.py
```

Be sure to be inside the project with the `Pipfile`!
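Conceptually, this is roughly what `pipenv run <cmd>` does: prefix `PATH` with the project virtualenv's `bin/` directory, then spawn the command in that environment. A simplified sketch (the venv path below is hypothetical; Pipenv derives the real one from your `Pipfile`'s location):

```python
# Put the virtualenv's bin/ first on PATH so the spawned command
# resolves python, pip, py.test, etc. from inside the venv.
import os

venv_bin = "/home/user/.local/share/virtualenvs/myproject-FfMtJBIY/bin"
env = dict(os.environ)
env["PATH"] = venv_bin + os.pathsep + env.get("PATH", "")
print(env["PATH"].split(os.pathsep)[0])
# subprocess.run(["py.test"], env=env)  # the command now sees the venv first
```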

## Drop into the virtualenv easily

```console
[~/work/develtech] $ pipenv shell
Spawning environment shell (/bin/zsh). Use 'exit' to leave.
source /home/user/.local/share/virtualenvs/myproject-FfMtJBIY/bin/activate
[~/work/myproject] $ source /home/user/.local/share/virtualenvs/myproject-FfMtJBIY/bin/activate
[~/work/myproject myproject-FfMtJBIY] $ # You're in a virtualenv, yay
[~/work/myproject myproject-FfMtJBIY] $ ./manage.py runserver
```

Same with this: be sure to be inside the project with the `Pipfile`!

## Create requirements.txt from Pipenv

If you're still utilizing a `setup.py`, or want to use `requirements.txt` in production for other reasons, Pipenv can produce an output of your requirements with hashes:

```console
$ pipenv lock -r > requirements.txt
```

This will give you something that looks like this:

```
mccabe==0.6.1 --hash=sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42 --hash=sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f
chardet==3.0.4 --hash=sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691 --hash=sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae
pycparser==2.18 --hash=sha256:99a8ca03e29851d96616ad0404b4aad7d9ee16f25c9f9708a11faf2810f7b226
```

You can also pass both `-r` and `-d` for development dependencies:

```console
$ pipenv lock -r -d > dev-requirements.txt
```
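Each emitted line pairs a version pin with one or more acceptable hashes. A minimal parse of that format, with digests truncated for readability:

```python
# Split one hashed requirements line into name, version, and hash list.
line = "mccabe==0.6.1 --hash=sha256:ab8a62 --hash=sha256:dd8d18"
pin, *hash_opts = line.split()
name, _, version = pin.partition("==")
hashes = [opt.removeprefix("--hash=") for opt in hash_opts]
print(name, version, hashes)
# mccabe 0.6.1 ['sha256:ab8a62', 'sha256:dd8d18']
```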

## In production

I currently use Pipenv in production for HSKFlashcards.com and devel.tech.

Installing packages is similar to `$ pip install -r <file>`, except now you install Pipenv on the server (`$ pip install pipenv`), then `$ cd /path/to/project` and run `$ pipenv install`. This creates a virtualenv and installs the packages.

But then you need to find where the packages were installed to. This is an issue, since Pipenv appends a random hash to the name of the virtualenv. To keep the virtualenv location deterministic, set `PIPENV_VENV_IN_PROJECT=true` in `/etc/environment`. This makes sure the virtualenv is created in a `.venv` directory in the project root (where the `Pipfile` is).

You can also programmatically find your Pipenv virtualenv: `$ cd` to your project root and run:

```console
$ pipenv --venv
/home/user/.local/share/virtualenvs/myproject-FfMtJBIY
```

There is also a `$ pipenv install --deploy` command, which is stricter: it aborts if your `Pipfile.lock` is out of date. Personally I'm not using it at the moment, but it's there for you.
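For reference, the setting mentioned above is a single line in `/etc/environment` (a config sketch; you can also export it from a shell profile instead):

```
PIPENV_VENV_IN_PROJECT=true
```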