Thursday, November 14, 2013

How we use git as an update mechanism

The little problem...



Like many other studios out in the wild, here at Cluster Studio we have a lot of artists, producers and other employees who use penguins, fruits and holes in the walls as OSes. Our job as the development team is to provide applications that make their work easier (or harder, as they sometimes perceive it). So we needed a painless way to distribute software to all the users on all the platforms.


For some time now we have used git to version control our R&D projects and share code between us, but nothing beyond that. Fortunately we are changing this: now we use a workflow like the one described in this Bitbucket article (or at least we try to). A better development workflow deserves a complete post of its own on this blog, and I suspect we will never find a definitive solution for it. So let's focus on today's topic: DISTRIBUTION.

A local solution...


Since all the artists work at desktop workstations, we can mount a shared partition for all of them so they can launch the standalone applications from there (most of the apps are Python scripts). We had been doing that for a long time without any proper version control. So we decided to manage the problem with git repositories, and since, in today's world, everything must be "on the cloud" at least twice to prove it exists, we chose Bitbucket over our old GitHub repositories to host our work.


Why Bitbucket?... because their business model fits a small development team with a lot of private repositories, like us: we only pay for the number of team members, not for the number of private repositories. So we use GitHub for open source initiatives (like those in today's post) and Bitbucket for our private parts (well, not those parts).


Bitbucket also has a nice API and a hook system that lets us automate the pulls to our local production repositories. This way the (lazy) members of the development team don't need to update both repositories manually every time: we just push to Bitbucket and a hook sends a request to our homemade web service, which finds the right path to the repository and executes the pull command (kind of magic). You can fork it here.

A brief explanation follows:

Our weapon of choice for the web service is Flask, mainly because of its good intentions and great flexibility. Following Miguel Grinberg's tutorial, it was easy to develop. We deploy the web service on Apache using WSGI.
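To give an idea of what the service does, here is a minimal sketch of a Flask endpoint that receives the Bitbucket POST hook and updates the matching local checkout. The payload format, route name and path lookup are assumptions for illustration, not the exact csGitHook code:

```python
import json
import subprocess

from flask import Flask, request

app = Flask(__name__)

# Hypothetical mapping from repository slug to local checkout;
# in our case this lookup actually goes through Shotgun (see below).
REPO_PATHS = {
    "cs-tools": "/mnt/production/tools/cs-tools",
}

@app.route("/bitbucket-hook", methods=["POST"])
def bitbucket_hook():
    # Bitbucket's (legacy) POST hook sends a form field named "payload"
    # containing JSON that describes the push -- format assumed here.
    payload = json.loads(request.form["payload"])
    slug = payload["repository"]["slug"]
    repo_path = REPO_PATHS.get(slug)
    if repo_path is None:
        return "unknown repository", 404
    # Update the local production checkout.
    subprocess.check_call(["git", "pull"], cwd=repo_path)
    return "OK"
```

Deployed under Apache, mod_wsgi only needs a small .wsgi file that imports this app object as "application".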

At Cluster Studio we use Shotgun as our database, including for tool tracking, so we get the repository locations from there. Not much to say here; you can change it to use whatever method suits you to get the local path to your repository.
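For illustration, getting the path from Shotgun could look like the sketch below; the entity type, field names and credentials are placeholders, not our actual schema:

```python
import shotgun_api3

# Placeholder credentials -- use your own site, script name and key.
sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgunstudio.com",
    script_name="csgithook",
    api_key="your_api_key",
)

def local_repo_path(slug):
    """Return the local checkout path registered in Shotgun for a repo slug."""
    tool = sg.find_one(
        "CustomEntity01",                    # whichever entity tracks your tools
        [["sg_repo_slug", "is", slug]],      # filter by repository slug
        ["sg_local_path"],                   # field holding the local path
    )
    return tool["sg_local_path"] if tool else None
```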

For those who can't stay...


Some users (usually the producers using laptops, sorry, MacBook Pros) can't stay on our local network all the time, but they still need to use our applications, always in the latest version. So we came up with the idea of giving them a local repository and developing an auto-update system. You can fork it here.

Basically it works by reading a config file to find the path to the local storage, the URL of the repository and the name of the script; it can also include other repositories as dependencies if needed. It executes a pull or clone for every repository and then runs the given script, as sketched below.
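As a rough sketch (the config format and file names here are assumptions, not the actual csAppLauncher layout, and it uses the standard subprocess module to keep it short):

```python
import json
import os
import subprocess

def update_and_run(config_path):
    """Pull (or clone) every configured repository, then launch the tool."""
    with open(config_path) as fh:
        cfg = json.load(fh)  # assumed keys: "storage", "script", "repos"
    storage = os.path.expanduser(cfg["storage"])
    for repo in cfg["repos"]:  # each entry: {"name": ..., "url": ...}
        path = os.path.join(storage, repo["name"])
        if os.path.isdir(os.path.join(path, ".git")):
            subprocess.check_call(["git", "pull"], cwd=path)             # already cloned
        else:
            subprocess.check_call(["git", "clone", repo["url"], path])   # first run
    # Finally run the given script from the local storage.
    subprocess.check_call(["python", os.path.join(storage, cfg["script"])])

if __name__ == "__main__":
    update_and_run(os.path.expanduser("~/cs_launcher.json"))
```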

Since running something from the terminal is usually too complicated for our Mac users, we also have an AppleScript that runs the Python command. Save it as an app and your users will never know the difference.


Finally...

sh deserves a separate mention because it is a great subprocess interface that helps handle the system interaction.
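For example, the pull above shrinks to a couple of lines with sh (repository path assumed):

```python
import sh

# Bake a git command bound to the repository's working directory.
git = sh.git.bake(_cwd="/mnt/production/tools/cs-tools")

try:
    print(git.pull())
except sh.ErrorReturnCode as exc:
    # sh raises ErrorReturnCode when the command exits with a non-zero status.
    print("pull failed:", exc.stderr)
```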

Hope you find it useful, interesting, or at least a reference of what not to do.
We'd love to see how you deal with this in your studio.

The code
csGitHook
csAppLauncher