We are building a Java application, using BitBucket as our private Git repo, and we wanted an online Jenkins to continuously test and build our code on each push. So we decided to give OpenShift a try.
OpenShift offers a free account with three “small gears”, which are like small VMs where you can install whatever you want. It also lets you install some apps with just a few clicks; these are called “cartridges” (like the good old console video games!).
These gears have some limitations: 1 GB of disk space and about 512 MB of RAM. But they work out of the box: you can connect to them over SSH, each app has its own Git repo to update it (the gear is automatically restarted once you push), you get your own domain, and gears can be created or destroyed on demand to optimize their use. Crazy!
The Jenkins solution there uses one gear and is always online. Whenever you ask to execute a job, the job executor uses another gear, which is created or reused at that moment. This new gear is destroyed after 15 minutes of being idle.
If the gear is new, the job executor clones the repo and downloads everything from scratch (source, artifacts, plugins, etc.), builds everything, and the results are collected by the Jenkins gear (I don’t know the exact process… it may be magic).
If the gear already exists and is being reused, the job executor just updates its repo and runs with what it already has, which is obviously faster.
I didn’t find any way to keep the job-executor gear active, which would avoid repeating the whole process over and over again; I didn’t investigate it too much, though. The only real problem seems to be the time it takes to build, because the rest is pretty cool the way it is (be careful here, though, if you plan to run many jobs at once).
Let Jenkins get the code
Once you have your Jenkins up and running, you need to give it access to your repo to clone it.
SSH Credentials to get the BitBucket repo cloned
- Connect to the server using ssh.
- Generate the user ssh credentials with:
ssh-keygen -t rsa
- Save the credentials somewhere in
- Then add it to Jenkins in the “Manage Credentials” configuration page. I saved mine here:
- Finally, go to BitBucket and allow Jenkins to clone the repo by adding the SSH public key under “Configuration / Deployment keys / Add key”.
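The key-generation steps above can be sketched as follows. This is a sketch under assumptions: the keys are stored in $OPENSHIFT_DATA_DIR (the only writable location on the gear, as discussed later in this post), and the fallback to a temp dir is just so the snippet runs outside a gear.

```shell
# Assumption: keys live in $OPENSHIFT_DATA_DIR; fall back to a temp dir
# when run outside an OpenShift gear.
OPENSHIFT_DATA_DIR=${OPENSHIFT_DATA_DIR:-$(mktemp -d)}

# Generate a passphrase-less RSA key pair directly in the writable data dir
ssh-keygen -t rsa -N "" -f "$OPENSHIFT_DATA_DIR/id_rsa"

# This is the public key to paste into BitBucket's Deployment keys page
cat "$OPENSHIFT_DATA_DIR/id_rsa.pub"
```

The empty passphrase matters here: Jenkins clones the repo non-interactively, so it cannot type one in.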
Update (2014-04-02): if you hit SSH issues when Jenkins tries to clone the Git repo, you will probably need to read this:
It states that (after placing the keys in $OPENSHIFT_DATA_DIR and adding the public key to BitBucket) you basically need to create a script in $OPENSHIFT_DATA_DIR, let’s call it wrapper.sh, with the following:
#!/bin/bash
ssh -i $OPENSHIFT_DATA_DIR/id_rsa $1 $2
Then make the script executable (chmod +x $OPENSHIFT_DATA_DIR/wrapper.sh) and run the following:
GIT_SSH=$OPENSHIFT_DATA_DIR/wrapper.sh git clone email@example.com:<git_user>/<git_repo>.git
Thanks to Douglas Carvalho for spotting the issue!
Then tell Maven to build it
There is one big issue with this CI approach (Java code + BitBucket repo + Jenkins in OpenShift):
Even though Jenkins installs really easily and runs out of the box, the Maven integration just SUCKS.
Why? By default Maven uses the ~/.m2 folder to look for settings.xml and to store the artifact repository. But the gear gives the process write access only to $OPENSHIFT_DATA_DIR, not to $HOME, so trying to create a Maven2/Maven3 job is not going to work. The workaround is to:
- create a free-style job
- add a shell script step after the local repo is cloned
- use that script to create a settings.xml in the $OPENSHIFT_DATA_DIR folder that tells Maven to create the repository somewhere in that folder
- run the Maven goals to build your project using the just-created settings.xml
Example of workaround
cd $OPENSHIFT_DATA_DIR
echo -e "<settings>\n <localRepository>$OPENSHIFT_DATA_DIR</localRepository>\n</settings>\n" > settings.xml
cd ../../$WORKSPACE/ok-prototype-backend
mvn clean package -s $OPENSHIFT_DATA_DIR/settings.xml
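For readability, the settings.xml that the echo above generates looks like this (with $OPENSHIFT_DATA_DIR already expanded by the shell into the gear's absolute data path):

```xml
<settings>
 <localRepository>$OPENSHIFT_DATA_DIR</localRepository>
</settings>
```

As an alternative I haven't tried on the gear: Maven's standard maven.repo.local property (e.g. mvn clean package -Dmaven.repo.local=$OPENSHIFT_DATA_DIR/m2repo) relocates the local repository without needing a settings.xml at all.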
Repo Hook to trigger the build
We now need to tell Jenkins to start the job automatically once the repo is updated, so we don’t have to trigger it manually every time we push our code.
To achieve this we use the POST hook in the BitBucket repo, where you basically need to build the Jenkins URL as follows:
- USER: the Jenkins user that will trigger the job
- APITOKEN: a token associated with that user that allows use of the API; you can get it from the user configuration page in Jenkins
- JENKINS_URL: the URL of your Jenkins server
- JOBNAME: the name of the job
- TOKEN: the token associated with the job; you must add it in the job configuration page (enable remote triggers)
The cool thing about this is that you can check whether it works just by using curl from the console.
Also note there is no password in the URL; you have the API token instead, which avoids publishing your username and password.
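Putting the parameters above together, the hook URL can be sketched like this. Every value below is hypothetical, so substitute your own; the URL shape follows Jenkins' remote-trigger endpoint, JENKINS_URL/job/JOBNAME/build?token=TOKEN, authenticated as USER:APITOKEN.

```shell
# All of these values are made up for illustration
USER=builder
APITOKEN=0123456789abcdef             # from the user's configuration page
JENKINS_URL=jenkins-myapp.rhcloud.com # your Jenkins gear's domain
JOBNAME=my-job
TOKEN=buildsecret                     # the job's remote-trigger token

URL="https://$USER:$APITOKEN@$JENKINS_URL/job/$JOBNAME/build?token=$TOKEN"
echo "$URL"

# To test it from the console: curl -X POST "$URL"
```

This same URL is what you paste into the BitBucket POST hook configuration.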
Something else we do to improve security a little is to have a separate user with only read and build permissions, used exclusively for this. That way we don’t have to publish our own user and token, which may have admin rights.
This is based on these documents:
- How to clone a private Git repo from a gear
- Make Jenkins work with a private Git repo (the thread that led to the solution in the previous link, also includes some references to stackoverflow questions)
- Jenkins Remote Access API (submitting jobs)
- Jenkins hook management (check out the comments)
- Hooking BitBucket up with Jenkins (this is VERY useful if you want to try the Jenkins hook)
If you find other workarounds for the Maven issue, or better ways to achieve these integrations, please let me know, OK?