Question

If you're doing a solo project, would you use CI tools to build from a repository? I've used Hudson and CruiseControl in a team environment, where it's essential to build as soon as anyone checks anything in.

I think the value of version control is still obvious, but do I need to build after every commit, seeing as I would just have built on my local machine, and no one else is committing?


Solution

Well, I'm planning to use a continuous integration tool on a project where I'm the only developer. The need comes from two things:

  1. One goal is to make it cross-platform, and an integration tool implicitly gives you a basic check of your app on several platforms as soon as you push your changes to the integration repository (or the central one, or whichever repository is treated as authoritative).
  2. The project is built for the long term, so I will need to work with a team later. I'm not planning to recruit anyone until 2012, so the continuous integration setup can wait for the moment, until point 1 becomes the priority.

Other than cross-platform builds and preparation for teamwork, I don't see the need. The main thing is to have some kind of source control software and several repositories to keep backups. That will make it easier to set up build tools around it when needed.

About the build timing: I'm using Mercurial and setting up an integration repository that is separate from the teamwork repository. I push changes to the teamwork repo until I feel it's time to let the integration system try a build; then I push from the teamwork repo to the integration repo, and that triggers the build. I also add a script that pulls the teamwork repo into the integration repo once a day, as sketched below.
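
A minimal sketch of that flow, with hypothetical repository paths and host names (the daily pull would live in cron on whatever machine hosts the integration repo):

    # day-to-day work goes to the teamwork repo only; no build is triggered
    cd ~/work/myproject              # hypothetical clone of the teamwork repo
    hg commit -m "Some change"
    hg push

    # when it's time to let the integration system try a build
    hg push ssh://build-server//repos/myproject-integration

    # daily safety net, e.g. a cron entry on the build server:
    # 0 6 * * * hg pull -u -R /repos/myproject-integration ssh://repo-server//repos/myproject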

Here I'm assuming that I work on my project almost every day, but that's not always true. You have to pick a build frequency that matches how often you need builds. At a game company I worked for before, we used CruiseControl and it made a full build every hour. We could also force a build whenever we wanted.

For a home project, once a day might already be "often". The main requirement is to make it easy to force a build on demand.

OTHER TIPS

After brief contemplation, I would suggest that it might be even more important for a solo developer than for a team.

At the most basic level, a CI server demonstrates that you can build your application from scratch from committed source; combined with a decent set of tests, it demonstrates that you can build and run from scratch.

Since one of the things I try to do is ensure that my build produces a deployable package, you also know that you can always go and get something to deploy (clean, and from a known state/version).

In fact, these days, File|New Project should probably include creating (or adding to) your repository and setting up your CI build script and deployment, even if that's only zipping up a pile of stuff for xcopy deployment.
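
As a sketch, the very first commit of a new solo project could already include the build script the CI job will call; Mercurial, the file names, and the packaging step here are just placeholders:

    #!/bin/sh
    # build.sh - checked in alongside the very first commit, e.g.:
    #   hg init myproject && cd myproject
    #   hg add build.sh && hg commit -m "New project skeleton plus CI build script"
    set -e                                 # stop on the first failure
    make all test                          # build and run the tests
    mkdir -p dist
    zip -r dist/myproject.zip build/       # package something deployable, even if it's just a zip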


Addendum (2016): nowadays my CI system is also an integral part of my deployment process, so its value has increased and I absolutely won't run any deliverable project without it. Automated push-button deployment takes a lot of stress out of the process, and in some way, shape, or form a build server is integral to that.

When I am the only one committing, I just build and test before actually committing. I usually use a makefile target like:

make sense

That configures, builds, runs all the tests (valgrind-aware), runs lints, and so on. Since I know I'll be the only one pushing, I don't really need the power of something like Hudson.
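
The recipe behind a target like that might chain together something along these lines (the tool names are only examples; substitute whatever fits your project):

    # roughly what a `make sense` target runs; any failure aborts before the commit
    ./configure
    make all
    valgrind --error-exitcode=1 ./run_tests     # unit tests, also failing on memory errors
    cppcheck --error-exitcode=1 src/            # static analysis / lint
    # commit and push only once everything above is green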

Additionally, in an environment where several branches feed a main repository, if everyone follows the rule of always pulling before committing or pushing, a CI server might be a little overkill. A well-publicized rule that the author of whatever broke the last build buys pizza on Friday usually keeps things running very smoothly :)

If you get into a situation where a project is clearly divided into subsystems that have their own leads, you really need to consider something like Hudson. Someone might test locally, lose a race with another subsystem, and end up pushing something toxic.

Also, if you are maintaining a fork of a fast-moving project (for instance, your own set of patches to the Linux kernel), you should really consider using something like Hudson, even though you are 'solo' on that project. This is especially true if you branch or rebase directly from mainline.

I wouldn't say this is just a nice bonus; I'd say it's vital to high-quality software engineering for us solo artists out there. Most of us will let our quality standards slip a bit if we're in a rush or if we think it's easy enough to fix later. If you commit software in that state, you essentially have a worthless codebase stored in your source control.

If followed properly (that is, you don't skip the tests and you make sure it builds every time you commit), CI forces you to hold yourself to a higher quality standard than you would if you just committed anyway.

It is important if you want to reduce the time you wait to find out whether everything is still going well. Although you can get your IDE to compile as soon as you save, it does not automatically run unit tests, so I have my CI server run the unit tests, the test coverage reports, and other quality analysis of my code as soon as I push it.

The only thing I have to do is push my current changes to version control, and then I can go back to coding. While I'm thinking about the code, the CI system is busy churning through the long-winded quality reports, which I look at once in a while when my brain is in a lull.
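
With Jenkins, for example, the push itself can kick the job off: either let the job poll the repository, or fire the remote build trigger from a Mercurial changegroup hook. A sketch of the explicit trigger, with a placeholder host, job name, and token (the job's "Trigger builds remotely" option has to be enabled):

    # run from a changegroup hook (or by hand) after pushing to the watched repo
    curl -fsS "http://build-server:8080/job/myproject/build?token=SECRET"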

I have a separate VMware machine on the same laptop that does the builds for the code I push. To do this I just grab a TurnKey Linux VMware image, install Jenkins using apt-get, and make a few minor configuration changes.
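
The install on that VM is roughly the standard Debian-style Jenkins setup; the package repository URL and key have changed over the years, so treat this as indicative and check jenkins.io for the current instructions:

    # add the Jenkins apt repository (details indicative, not exact) and install
    wget -q -O - https://pkg.jenkins.io/debian-stable/jenkins.io.key | sudo apt-key add -
    echo "deb https://pkg.jenkins.io/debian-stable binary/" | sudo tee /etc/apt/sources.list.d/jenkins.list
    sudo apt-get update
    sudo apt-get install jenkins      # the web UI then listens on port 8080 by default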

Licensed under: CC-BY-SA with attribution