Question

I have several small Python libraries that I wrote with stuff that I find myself wanting over and over again. I think most programmers have something similar. I want to use these libraries from a variety of different machines, so I've started keeping this stuff in my DropBox. However, I'd like to be able to use my code on machines on which I can't install DropBox or other cloud storage applications, even in portable form. I can just download the files every time one of them changes (DropBox can provide me a URL for each file in my Public folder), which is only a moderate nuisance. But--and I admit this is a long shot--is there a solution out there that will let me tell Python to load a library from my DropBox via HTTP?

BTW, I'd like to add the whole remote folder to my sys.path, but getting a URL for a folder is complicated, so I'm going to try to walk before I run by starting with individual files.


Solution

Yes, it's possible. I think what you want is the combination of two things that have come up here before: downloading a file over HTTP from Python, and importing a module from an arbitrary local file.

So your task basically breaks down into writing a little bit of glue code: download the file from its URL, write it to a local file, and then import that file.
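Here is a minimal sketch of that glue code using only the standard library. The function name, the temporary-file approach, and the URL in the usage comment are my own placeholders, not something taken from your setup:

    import importlib.util
    import sys
    import tempfile
    import urllib.request

    def import_module_from_url(name, url):
        """Download a single .py file over HTTP and import it as `name`."""
        source = urllib.request.urlopen(url).read()

        # Write the downloaded source to a temporary local file.
        with tempfile.NamedTemporaryFile(suffix=".py", delete=False) as f:
            f.write(source)
            path = f.name

        # Import that local file under the requested module name.
        spec = importlib.util.spec_from_file_location(name, path)
        module = importlib.util.module_from_spec(spec)
        sys.modules[name] = module
        spec.loader.exec_module(module)
        return module

    # Hypothetical usage -- substitute the public URL DropBox gives you:
    # mylib = import_module_from_url("mylib", "https://dl.dropboxusercontent.com/s/.../mylib.py")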

So that's how you'd do that.

BUT - please keep in mind that dynamically downloading and executing code has many potential security pitfalls. Will you be doing this over a secure connection? Who else has the ability to manipulate that URL? There are a bunch of security issues inherent in downloading and executing code on the fly. I would ask you to consider going about your solution in a different way, but I'm giving you the answer you're asking for.

As a simple security check, you can keep a list of known-good hashes for your files and refuse to import any file whose hash isn't on that list. This makes it a pain to update your modules, but gives you a little bit of extra safety.
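A sketch of that check, assuming SHA-256 as the hash; the whitelist is a placeholder you would fill in yourself after verifying a copy of each file by hand:

    import hashlib

    # Placeholder whitelist -- paste in the SHA-256 hex digests of copies of
    # your modules that you have verified by hand.
    KNOWN_GOOD_HASHES = set()  # e.g. {"<sha256 hex digest of a trusted mylib.py>"}

    def is_trusted(source_bytes):
        """Return True only if the downloaded source matches a known-good hash."""
        return hashlib.sha256(source_bytes).hexdigest() in KNOWN_GOOD_HASHES

You would call is_trusted() on the downloaded bytes before writing and importing them, and refuse to proceed if it returns False.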

OTHER TIPS

  1. Don't use DropBox as a revision control system
  2. Pick a real solution like Git
  3. Set up access to a Git repository on one of your servers
  4. Clone the repository to your worker machines and check out master
  5. Create a develop branch where you put every change you make
  6. Test the changes and, when you consider any of them stable, merge them to master
  7. On your worker machines, set up a cron job that periodically pulls from the master branch of the repository (and possibly restarts some Python processes, since importing the same module again won't make the interpreter aware of changes - imported modules are cached; see the sketch after this list)
  8. Enjoy your automatically updated workers :)
  9. Don't feel bad - it happens that even experienced software developers run into the XY problem
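For point 7, here is a small sketch of what a long-running worker would have to do after the cron job pulls new code; mylib is just a stand-in name for whatever module lives in your repository:

    import importlib

    import mylib  # hypothetical module that lives in the pulled repository

    # After `git pull` has updated the working copy, a plain `import mylib`
    # in a running process just returns the cached module from sys.modules.
    # importlib.reload() re-executes the module's source and returns the
    # refreshed module object.
    mylib = importlib.reload(mylib)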
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow