Question

Of course, source control tools like Git (Mercurial, SVN, etc.) do a great job at managing source code. But do these tools give the developer any advantage when used to store copies of files such as Photoshop PSDs and Illustrator AI files? Does it make sense to use them with these kinds of files? Would I end up storing less in the repository than the sum of the file sizes of all of these files? Even though the file format is only machine readable, I would expect that for such applications, especially when dealing with vector rather than raster graphics, only a small part of each file changes between saves and much of the rest stays the same.

Thank you for your insight.


Solution 2

Although this is a very opinionated question, I would say no, at least for Git.

  • Git was not created as a storage solution.
  • There's no way to merge image files.
  • Therefore, branches don't make much sense: if the only way to merge two branches is to pick which version of a file is correct, you're better off just replacing the file right away.
  • Git GUI tools are inferior to the console and are not simple. Do you really want to teach your art team the difference between commit and push?
  • When you clone a Git repo, you get the whole history of every file, starting from the initial commit. If you work on binary files long enough, the size gets enormous (a quick way to check this is sketched after this list).
  • Many Git hosting sites, such as GitHub, have limits on individual file sizes.
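For example, here is a minimal sketch of how to see how much data a local clone is carrying, and how to avoid downloading the full history in the first place (my-art-repo and the URL are hypothetical):

    # Report the size of the local object store (see the size / size-pack lines)
    cd my-art-repo
    git count-objects -vH

    # A shallow clone fetches only the latest snapshot instead of the full history
    git clone --depth 1 https://example.com/art/my-art-repo.git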

I think you're much better off with Dropbox.

OTHER TIPS

Git itself can manage any kind of data, provided the files are not too big or too numerous.
See "git with large files" ("large" as in size or number).

Diffing pictures/graphics isn't a feature Git supports natively, but a Git repo hosting service can extend its web GUI to offer such support.

GitHub announced "PSD Viewing & Diffing" in June 2014, which extends their "image viewing and diffing" (Nov. 2011):

Any PSD assets in your repositories will be treated just like images, meaning you can view them inline and use our three image view modes to see what's changed in a commit.

Update March 2022: this is no longer supported.
See "Working with non-code files" for the files for which diff is supported.

(Animated demo from the GitHub announcement: https://cloud.githubusercontent.com/assets/2546/3165594/55f2798a-eb56-11e3-92e7-b79ad791a697.gif)
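For local use, a common Git technique (not tied to any hosting service) is a textconv diff driver that converts binary image metadata to text so that git diff has something readable to compare. A minimal sketch, assuming exiftool is installed and a hypothetical artwork.psd:

    # .gitattributes: route Adobe binaries through a custom diff driver named "image"
    *.psd diff=image
    *.ai  diff=image

    # Tell Git how to turn the binary into text before diffing (runs exiftool on each file)
    git config diff.image.textconv exiftool

    # "git diff" on a changed PSD now shows differences in the metadata exiftool reports
    git diff -- artwork.psd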

Those answering "no" have very good reasons, but it isn't impossible.

I'm successfully using GitHub to manage an open source project consisting of hundreds of Illustrator files and PDFs (and also some code and text, but that's a tiny blip in comparison). The repo comes out at about 8GB. The reason I'm doing something so insane is because the Illustrator files are the core of the product, not merely decorative artwork to go along with it - they are the source of the project - and because I wanted to make sure it would stay open source.

There have been some sticking points, and things to be aware of. I would suggest:

  • Don't try this unless you're pretty familiar with git. Resolving conflicts and branching issues can get really thorny, and you may have to do some pretty arcane stuff to keep the repo happy. Nobody expects you to know every corner of git (I'm not sure a sane person could), but know enough that you can google the rest.

  • Make sure you're comfortable using git on the command line. GUI tools may shield you from complexity, but they also prevent you from fully understanding what's going on under the covers. Once you have that understanding, you're free to use a GUI for 95% of the time.

  • Avoid branching if possible. Binary files don't merge the way code does, so bringing branches together can get messy and laborious.

  • Learn about specific features of Git that can help you manage the size and complexity of the repo: sparse/partial checkouts, tags, git gc, etc. A sketch of a few of these follows this list.

  • Take time to plan in advance. It may be that you would benefit from separating the project into two or more git repos, or from combining it with another service.

  • If you're using a hosting service, make sure you know what limits they impose on the repo. For example, GitHub will complain about files over 100MB. Here are their recommended guidelines for binaries.
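A minimal sketch of the Git features mentioned in that list, using a hypothetical repository URL and directory names:

    # Partial clone plus sparse checkout: only materialise the folders you actually need
    git clone --filter=blob:none --sparse https://example.com/big-art-project.git
    cd big-art-project
    git sparse-checkout set illustrations/covers

    # Tag milestone versions instead of keeping "final_v3" copies of files around
    git tag -a v1.0 -m "Artwork as shipped in the first print run"

    # Repack and prune unreachable objects to keep the local object store compact
    git gc --aggressive --prune=now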

No, I would not recommend using Git, SVN, etc. for version tracking. A surprising number of bytes will change between barely altered versions of Adobe files; see for yourself by doing a diff comparison (a quick sketch follows). This is especially true when options like native file compression are turned on in Illustrator.
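One quick way to do that comparison yourself (before.ai and after.ai are hypothetical saves of the same document with a trivial edit in between; the exact counts depend on your save options):

    # Count how many bytes differ between the two saves
    cmp -l before.ai after.ai | wc -l

    # Or let Git report the change without a repository, using --no-index
    git diff --no-index --stat before.ai after.ai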

By judiciously using layers, links, and saved milestone versions of files, you'll use storage far more efficiently than SVN ever could for native Adobe files.

The one exception I can think of is XML-based files, like pure-vector SVGs.

If you just need simple version management with a simple UI, Subversion works pretty well for managing these files. It has good GUI support (e.g. SmartSVN or TortoiseSVN) with shell integration. It also makes it much easier to selectively check out only the files you need (see the sketch below).
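For example, a selective (sparse) Subversion checkout can be sketched like this (the repository URL and folder names are hypothetical):

    # Check out only the top-level directory, with no contents yet
    svn checkout --depth empty https://svn.example.com/design/trunk design
    cd design

    # Pull in just the folders you actually work on
    svn update --set-depth infinity posters
    svn update --set-depth files logos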

For all of you pointing out that file size is a big problem, Git LFS solves exactly that.

It is easy to install and use, and popular platforms such as GitHub, GitLab or Bitbucket support it without any problem.
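A minimal sketch of moving Photoshop and Illustrator files onto Git LFS in an existing repository (the tracked patterns and paths are just examples):

    # One-time setup per machine
    git lfs install

    # Track Adobe binaries with LFS; this writes the patterns to .gitattributes
    git lfs track "*.psd" "*.ai"

    # Commit the tracking rules and the files; the large blobs now live in LFS storage
    git add .gitattributes
    git add artwork/
    git commit -m "Move Photoshop and Illustrator assets to Git LFS"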

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow