Since you mentioned you like to follow best practices, I am guessing you are using virtualenv too, correct? Assuming that is the case, and since you are already pinning your packages, there is a tool called pip-tools that you can run against your virtualenv to check for updates.
There is a downside, though, and it is why I mentioned virtualenv. From the tool's documentation:
[the tool] checks PyPI and reports available updates. It uses the list of currently installed packages to check for updates, it does not use any requirements.txt.
If you run it in your virtualenv, you can easily see which packages have updates available for your currently active environment. If you aren't using virtualenv, though, it's probably not a good idea to run it against the system-wide Python, as your other projects may depend on different versions (or may not work well with updated versions, even if they all currently work).
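To make the "currently installed packages" behavior concrete, here is a rough sketch (hypothetical, not pip-review's actual implementation) of how a tool can enumerate the packages installed in the active environment, which is the list the tool then checks against PyPI:

```python
# Enumerate packages installed in the *active* environment.
# This is an illustrative sketch, not pip-review's real code.
from importlib.metadata import distributions

installed = {
    dist.metadata["Name"]: dist.version
    for dist in distributions()
    if dist.metadata["Name"]  # skip entries with broken metadata
}
for name, version in sorted(installed.items()):
    print(f"{name}=={version}")
```

Run inside a virtualenv, this only sees that environment's packages; run against the system Python, it sees everything installed system-wide, which is exactly why the virtualenv caveat matters.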
Per the documentation, usage is simple. The pip-review command shows you what updates are available, but does not install them:
$ pip-review
requests==0.13.4 available (you have 0.13.2)
redis==2.4.13 available (you have 2.4.9)
rq==0.3.2 available (you have 0.3.0)
If you want to install the available updates automatically as well, the tool can handle that too:
$ pip-review --auto
There is also an --interactive switch that you can use to selectively update packages.
Once all of this is done, pip-tools provides a way to update your requirements.txt with the newest versions: pip-dump. Again, this runs against the currently active environment, so it is recommended for use within a virtualenv.
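Conceptually, pip-dump pins whatever is installed in the active environment. A rough approximation of that behavior (a sketch for illustration, not pip-dump itself) looks like this:

```python
# Illustrative approximation of pip-dump's effect: write the active
# environment's installed packages, pinned, in requirements.txt format.
from importlib.metadata import distributions

lines = sorted(
    f"{dist.metadata['Name']}=={dist.version}"
    for dist in distributions()
    if dist.metadata["Name"]  # skip entries with broken metadata
)
with open("requirements.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```

Because it reads the active environment, running this (or the real pip-dump) outside a virtualenv would pin every system-wide package into your project's requirements.txt.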
The tool itself can be installed via:
$ pip install pip-tools
Author's note: I've used this for small Django projects and have been very pleased with it. One caveat, though: if you install pip-tools into your virtual environment, you'll find that it gets added to your requirements.txt file when you run pip-dump. Since my projects are small, I've always just removed that line manually. If you have a build script of some kind, you may want to strip it out automatically before you deploy.
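The stripping step is trivial to script. A minimal sketch of such a build-script helper (the function name and the version numbers are made up for illustration):

```python
# Hypothetical build-script helper: drop the pip-tools pin that
# pip-dump adds to requirements.txt before deploying.
def strip_package(requirements_text, package="pip-tools"):
    """Return requirements text without the pin line for `package`."""
    kept = [
        line for line in requirements_text.splitlines()
        if not line.lower().startswith(package.lower() + "==")
    ]
    return "\n".join(kept) + ("\n" if kept else "")

# Versions below are illustrative, not real release numbers.
reqs = "Django==1.4.1\npip-tools==0.2.1\nrequests==0.13.2\n"
print(strip_package(reqs))
# prints the Django and requests pins; the pip-tools line is gone
```

A case-insensitive prefix match is used because PyPI package names are case-insensitive, so "Pip-Tools==..." would also be dropped.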