Question

Last Friday, I built an RPM spec for my Django project. The RPM creates a virtualenv, downloads the dependencies via pip, and puts everything into the package. Today I found out that BeautifulSoup 3.2 has been released. Luckily, I had my BeautifulSoup version pinned in requirements.txt, so I found out because the build failed.
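For reference, the pin in requirements.txt is just an exact-version line like this (the version number here is illustrative):

    BeautifulSoup==3.0.8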

Now a completely different matter: how do I avoid unintended upgrades in the future? BeautifulSoup has deleted all previous versions from PyPI, so I can't download the version I actually tested against. pip's download cache doesn't help here either, since pip always checks PyPI first.

Can you recommend something to avoid this situation?


Solution

First, this is an unusual situation. I've never seen another package remove all old releases the way BeautifulSoup does. I consider that rather user-hostile behavior, except perhaps in cases of a serious security fix.

That said, if you want a reliable build process using pip, you really need to mirror all the packages you rely on locally. It's not hard to do: use pip's --download option (or your existing pip download cache) to collect the tarballs for every package you depend on, dump them in a web-served directory with directory listings enabled, and point pip at it with --find-links in your requirements file, plus --no-index so pip never consults PyPI.
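A minimal sketch of that setup (the mirror path and host name below are placeholders; on older pip the download step is pip install --download, while newer releases have a dedicated pip download command):

    # Collect the tarballs for every pinned requirement without installing them
    pip install --download=/srv/pypi-mirror/packages -r requirements.txt

Serve that directory over HTTP with listings enabled, then add these two lines at the top of requirements.txt so pip only ever looks at your mirror:

    --no-index
    --find-links=http://pypi.example.internal/packages/

With --no-index set, a package missing from the mirror fails the build loudly instead of pip silently falling back to PyPI.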

OTHER TIPS

The files in question can still be found: just provide the direct URL instead of the package name, for example:

http://www.crummy.com/software/BeautifulSoup/download/3.x/3.0.8.tar.gz
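If standing up a mirror is overkill for a single package, a requirements file also accepts an archive URL directly in place of a name; a minimal sketch using the tarball above:

    # requirements.txt -- install BeautifulSoup from the archived tarball
    http://www.crummy.com/software/BeautifulSoup/download/3.x/3.0.8.tar.gz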

Licensed under: CC-BY-SA with attribution