
Friday, December 16, 2011

accessible backed up git repos for transient work-related projects

preface: this is basically a "note to self", but I thought others might find it useful

I tend to create a lot of "projects" at work. A "project" is a directory in my "~/Projects" directory. Who knows what might be in it; probably code, or maybe just documentation. Many times, it's a "project" that I think may be useful to other people, and that I may eventually "publish" to either a public SCM repo farm like GitHub or an internal IBM SCM repo farm. I currently have 317 projects in my "~/Projects" directory.

Before "publishing" it though, I'll usually want to crank on it for a bit, see if I can get the crap to work, and figure out if it's actually useful. So there's a window where my project isn't being "backed up" or available to anyone else, because it's sitting on my laptop's spinning disk of rust. That's not great.

In the past, I've used Subversion and/or Mercurial to help manage this "back up" process for personal, non-work-related projects. What I'd do is set up a repo on my shared server, and then just use it like any other remote SCM.
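
In git terms (I was using Subversion or Mercurial at the time, but the shape is the same), and assuming the server has ssh access and git installed, that setup is roughly:

    # one-time setup of a bare repo on the shared server
    # ({host-name}, {path-to}, and {project-name} are placeholders)
    ssh {host-name} 'git init --bare {path-to}/repos/{project-name}.git'

    # then use it like any other remote from the local project
    git remote add origin {host-name}:{path-to}/repos/{project-name}.git
    git push -u origin master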

Problems:

  • What about work-related stuff? I don't want to publish something that's work-related on a non-IBM site.

  • I made it sound like "set up a repo on my shared server" was easy. It isn't. Depending on the capabilities of your server, it might be impossible.

  • Yeah, we do have various internal-only SCM repo farms in IBM, but those still require some "setup" and "maintenance", just a different flavor than shared hosting, and even that can be too much sometimes. Especially for a project which may only be useful for a couple of days.

Bummer.

a new hope

What I'm doing now is using Git, and storing my Git repos on IBM's internal, backed up, rsync-/smb-/http-accessible cloud file store. Even if you don't have rsync or smb access, there are likely ways you can do something similar, with different tools, on your own server.

Here's how it works:

  • mount the file store on my local machine, create a directory for the new repo in my "git-repos" directory (which is in a part of my cloud file store that is publicly accessible via http), then unmount the file store (there's an end-to-end sketch of this after the list)

  • initialize the project on my local machine with git (e.g., "git init")

  • write a "backup" script, which looks like this:

    #!/bin/sh
    
    #--------------------------
    # make the git repo accessible via:
    #   git clone http://{host-name}/~pmuellr/git-repos/{project-name}/.git
    #--------------------------
    
    # run from the project directory, wherever the script lives
    cd "$(dirname "$0")"
    
    # regenerate the info files git needs for cloning over "dumb" http
    git update-server-info
    
    # copy the working directory (including .git) up to the file store
    rsync -av . {host-name}:{path-to}/git-repos/{project-name}
    
  • Instead of using a "git add/commit/push" workflow, I use a "git add/commit; ./backup" workflow.

  • As the comment in the script notes, the git repo is accessible via http to anyone who has access to that URL. In my case, that's generally anyone in IBM.

  • The "backup" script, as written, also backs up my working directory. Which is a win as far as I'm concerned. Now I can also point folks to web-accessible/-renderable resources in my working directory.

I've found this to be a very useful, very lightweight process to keep my immature projects backed up. I don't doubt there are ways to do similar things with SVN or Mercurial or whatever as well; I just never tried.

I haven't yet set up a similar story for personal projects, using my shared server, but that should be pretty easy; it supports rsync, but not smb. Instead of "mount and create project directory", I can "ssh and create project directory". Even that's probably not needed; rsync can likely create that project directory for me, but I'm an rsync n00b and I kind of like having to make that step explicit.
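
If/when I do set that up, the one-time step would presumably be something like this (host and path names are placeholders again), with the backup script itself unchanged:

    # create the project directory on the shared server over ssh
    ssh {host-name} 'mkdir -p {path-to}/git-repos/{project-name}'

    # the same backup script then works as-is
    ./backup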