CI/CD pipeline comparison – Jenkins vs. GitHub Actions

I’ve been asked recently whether I would use Jenkins or GitHub Actions to create a CI pipeline for a web app project. This got me thinking a bit, as it’s a comparison I’ve often thought about but never had to make a decision on.

I’m fully aware of the multitude of caveats and “it depends” answers that come with such a vague premise. There are also multiple other hosted services and open source projects to choose from, such as GitLab CI, CircleCI, and Azure DevOps, to name a few, as well as stack-specific tools like Netlify, Vercel, or the various options in the Kubernetes space. It’s unrealistic for me to do a comprehensive comparison, or to try them all.

I just want to note down some thoughts after having some experience using these two systems. YMMV.

Jenkins

If you work in an “enterprise”, or a big team with a variety of different tech stacks, it might be worthwhile to invest the time and effort into maintaining your own Jenkins server, or look into a hosted offering.

It’s a tried and true option, with tons of resources and support, and is infinitely extensible. The newer Declarative Pipeline style has made reading and maintaining Jenkins pipelines more enjoyable, while the ability to embed Groovy script into your pipeline makes it very flexible.
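To make that concrete, here’s a minimal Declarative Pipeline sketch — the stage names and npm commands are placeholders for whatever your web app’s build actually needs:

```groovy
// Minimal Declarative Pipeline sketch -- the npm commands are
// placeholders, not a prescription for your project.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'npm ci'
                sh 'npm run build'
            }
        }
        stage('Test') {
            steps {
                sh 'npm test'
            }
        }
    }
    post {
        always {
            // Arbitrary Groovy can be embedded for custom logic
            script {
                echo "Finished with result: ${currentBuild.currentResult}"
            }
        }
    }
}
```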

Another benefit is a fairly complex permissions and security model, with support for RBAC and secrets management.

GitHub Actions

While Jenkins has been around forever, GitHub Actions is the shiny new toy. Its biggest advantage is perhaps its beautiful UI, integrated directly into where your code lives. It takes away the burden of maintaining a separate system to test and build your code.

This might be a bit controversial, but I am not too fond of the YAML syntax chosen. While it’s very popular in certain circles, I find it limiting when you want to do some highly customized logic, which I’ve found to be pretty common in a build pipeline.

Similar to Jenkins’s extensible plugin model, GitHub Actions also allows you to reuse code through shared actions, which is pretty cool and powerful. That said, incorporating these shared actions can sometimes feel a bit awkward.
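For comparison with the Jenkins side, here’s a minimal workflow sketch — the trigger, Node version, and steps are placeholders for a typical Node web app:

```yaml
# Minimal GitHub Actions workflow sketch -- steps are placeholders.
name: ci
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
        with:
          node-version: 14
      - run: npm ci
      - run: npm test
```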

What to choose

As I started to write down some of these thoughts, I realized that there are just too many factors to consider here. I could write several posts discussing each of these topics in detail. But that might not be too useful for anyone.

My general opinion is, if you’re already using GitHub, then spend a couple of days building out a GitHub Actions workflow. If you can get it to work and do everything you need, consider it a win! I think it will require the least amount of maintenance going forward, and thus offer a higher return on investment.

If you find yourself struggling with it too much, then don’t be afraid to spin up a Jenkins server. There’s a pretty good chance you’ll be able to get it to do exactly what you need.

If you have other thoughts and opinions, feel free to reach out and have a conversation. I spend a good amount of my time building pipelines, and am always looking to learn more.

ievms – Easily bring up IE browsers on any environment with Vagrant and VirtualBox

Thinking about incorporating IE browser testing for your site, but unsure how to proceed given that you’re developing on a macOS or Linux environment? Do you want to bring up a Windows virtual machine with something like VirtualBox, but not sure how to grab a valid Windows license? Luckily, Microsoft is aware of how difficult it is to develop and test for Edge and IE browsers, so they have released free virtual machines that already have these browsers built in.

If you go to the Microsoft developer website above, you have the option of downloading these VM images for different virtualization software (e.g. VirtualBox and VMware). However, these steps still feel pretty manual. I wanted to have a simple set of instructions that I could pass around to folks on my team to do IE testing with. Ideally, it would just be a script that anyone can run from any development environment. Thankfully, Microsoft also made Vagrant images available, so I was able to create a simple wrapper around this.

ievms – a simple way to start IE and Edge VMs with Vagrant and VirtualBox

Vagrant is a developer tool that allows you to create automated and reproducible development environments using virtual machines. It works pretty well with VirtualBox, a free and well-supported virtualizer.

ievms relies on Vagrant, VirtualBox, and the images provided by Microsoft to create a simple workflow for managing IE browsers across different Windows versions.
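Under the hood, each machine is just a Vagrant definition. Here’s a hypothetical Vagrantfile sketch — the box name is illustrative, not the exact identifier Microsoft publishes:

```ruby
# Hypothetical Vagrantfile sketch -- the box name is a placeholder.
Vagrant.configure("2") do |config|
  config.vm.define "ie10-win7" do |machine|
    machine.vm.box = "ie10-win7"
    machine.vm.communicator = "winrm"  # Windows guests use WinRM, not SSH
    machine.vm.provider "virtualbox" do |vb|
      vb.gui = true                    # show the UI so you can drive the browser
    end
  end
end
```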

Bringing up a Windows 7 virtual machine with IE10 installed and ready to use:

:; vagrant up ie10-win7

This will download the image if it’s not installed, add it to vagrant, and then bring up the virtual machine with a graphical UI. Once you’re done testing, you can suspend the machine with vagrant suspend ie10-win7, or remove it completely with vagrant destroy ie10-win7. Even if you removed it, the next time you need to bring it up, the image is already cached locally, so you will not have to wait for the download again.

In addition to IE10 on Windows 7, other browser-platform combinations are supported by default:
– ie6-xp
– ie8-xp
– ie7-vista
– ie8-win7
– ie9-win7
– ie10-win7
– ie11-win7
– ie10-win8
– ie11-win81
– msedge-win10

Bonus: You can also test a local website that’s running on the host development environment. For example, if you have a site running on port 8080 locally, you can reach it from within the virtual machine by going to http://192.168.33.1:8080.

Check out ievms’s README for more instructions on how to get started.

Authentication for Google Cloud Functions with JWT and Auth0

Surprised that there was no built-in authentication mechanism for Google Cloud Functions, I made an attempt to implement a simple one with JWT and Auth0.

With all the hype around serverless, I recently took a stab at creating a cloud function to see how it would go. I went with Google Cloud Functions instead of AWS Lambda, because I had some free signup credits on Google.

I started with this tutorial https://cloud.google.com/functions/docs/tutorials/http, and it seemed pretty straightforward. I created a cloud function with an HTTP trigger in about 30 minutes.

The function I deployed adds an entry to a Cloud Datastore database every time I make a curl request to the function’s endpoint. That was pretty thrilling.

curl -X POST -H "Content-Type: application/json" \
-d '{"foo": "bar"}' \
"https://.cloudfunctions.net/post"

However, it soon dawned on me that this is pretty insecure, as anyone who knows of this endpoint could write to the database. Imagine if I wrote a delete function! I thought surely Google must have built in some sort of authentication scheme for Cloud Functions. But after googling around for a while, it didn’t seem so. I did next what any clueless developer would, and posted a question on StackOverflow.

After a few days, the answers I got back seemed pretty disappointing. Apparently if I had used AWS Lambda, I could leverage API Gateway, which has support for auth. But I am on my own for Google Cloud Functions.

So I decided to implement an authentication check for my cloud function, with the help of Auth0, using a JWT access token passed in as a Bearer token in the Authorization header.

Here’s the implementation in Node, with an explanation after.

const jwksClient = require('jwks-rsa');
const jwt = require('jsonwebtoken');

const client = jwksClient({
  cache: true,
  rateLimit: true,
  jwksRequestsPerMinute: 5,
  jwksUri: "https://.auth0.com/.well-known/jwks.json"
});

function verifyToken(token, cb) {
  let decodedToken;
  try {
    decodedToken = jwt.decode(token, {complete: true});
  } catch (e) {
    console.error(e);
    cb(e);
    return;
  }
  // jwt.decode returns null for a malformed token instead of throwing,
  // so guard against it before reading the header
  if (!decodedToken) {
    cb(new Error('Unable to decode token.'));
    return;
  }
  client.getSigningKey(decodedToken.header.kid, function (err, key) {
    if (err) {
      console.error(err);
      cb(err);
      return;
    }
    const signingKey = key.publicKey || key.rsaPublicKey;
    jwt.verify(token, signingKey, function (err, decoded) {
      if (err) {
        console.error(err);
        cb(err);
        return;
      }
      console.log(decoded);
      cb(null, decoded);
    });
  });
}

function checkAuth (fn) {
  return function (req, res) {
    if (!req.headers || !req.headers.authorization) {
      res.status(401).send('No authorization token found.');
      return;
    }
    // expect authorization header to be
    // Bearer xxx-token-xxx
    const parts = req.headers.authorization.split(' ');
    if (parts.length !== 2) {
      res.status(401).send('Bad credential format.');
      return;
    }
    const scheme = parts[0];
    const credentials = parts[1];

    if (!/^Bearer$/i.test(scheme)) {
      res.status(401).send('Bad credential format.');
      return;
    }
    verifyToken(credentials, function (err) {
      if (err) {
        res.status(401).send('Invalid token');
        return;
      }
      fn(req, res);
    });
  };
}

I use jwks-rsa to retrieve the public part of the key that was used to sign the JWT, and jsonwebtoken to decode and verify the token. Since I use Auth0, jwks-rsa fetches the public keys from my tenant’s JWKS endpoint, using the `kid` from the token header to pick the right one.
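To illustrate that lookup: the header segment of a JWT (where the `kid` lives) is just base64url-encoded JSON, so it can be read without any verification. A minimal sketch in plain Node, no libraries — the token here is hand-built and hypothetical:

```javascript
// Decode the header segment of a JWT without verifying it.
// The `kid` field is what jwks-rsa uses to look up the signing key.
function decodeJwtHeader(token) {
  const headerSegment = token.split('.')[0];
  // base64url -> base64 (Node's Buffer tolerates missing padding)
  const base64 = headerSegment.replace(/-/g, '+').replace(/_/g, '/');
  return JSON.parse(Buffer.from(base64, 'base64').toString('utf8'));
}

// Hand-build a hypothetical header: {"alg":"RS256","typ":"JWT","kid":"abc123"}
const header = Buffer.from(
  JSON.stringify({ alg: 'RS256', typ: 'JWT', kid: 'abc123' })
).toString('base64').replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
const token = header + '.payload.signature';

console.log(decodeJwtHeader(token).kid); // prints "abc123"
```

Note that nothing here proves the token is genuine — that’s what the `jwt.verify` call against the fetched public key is for.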

The checkAuth function can then be used to safeguard the cloud function as:

exports.get = checkAuth(function (req, res) {
  // do things safely here
});

You can see the entire Google Cloud Functions repo at https://github.com/tnguyen14/functions-datastore/

The JWT / access token can be generated in a number of ways. For Auth0, the API doc can be found at https://auth0.com/docs/api/authentication#authorize-client
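For a machine-to-machine caller, one option is Auth0’s client credentials grant against the `/oauth/token` endpoint. A sketch — TENANT, CLIENT_ID, CLIENT_SECRET, and AUDIENCE are placeholders for your own tenant’s values:

```shell
# Hypothetical Auth0 token request (client credentials grant).
# All uppercase values are placeholders.
curl -X POST "https://TENANT.auth0.com/oauth/token" \
  -H "Content-Type: application/json" \
  -d '{
    "grant_type": "client_credentials",
    "client_id": "CLIENT_ID",
    "client_secret": "CLIENT_SECRET",
    "audience": "AUDIENCE"
  }'
```

The response contains an `access_token` field, which is what goes into the Authorization header below.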

Once this is in place, the HTTP trigger cloud function can be invoked with:

curl -X POST -H "Content-Type: application/json" \
-H "Authorization: Bearer access-token" \
-d '{"foo": "bar"}' \
"https://.cloudfunctions.net/get"

Next leg: NYC

This has been one of the hardest and most terrifying decisions for me to make: leave a growing career at Demandware / Salesforce Commerce Cloud and move to New York City to work for Bloomberg.

Now that I’ve been in New York for a few months, I’d like to jot down a few thoughts, so that I can look back on them at some point in the future.

At this juncture, looking back, it is still a toss-up whether this decision has turned out to be the right one. I am going through a lot of challenges, both personally and professionally, that make me constantly question the move. I hope that in a year or so, the outlook on things will improve. I knew that it was a long-term investment that would require some short-term pain. I should try to stay positive and wait for the rewards.