Ignoring maintainability introduces technical debt, which will either need to be addressed later or will lead to increasing costs when implementing new features.

We have recently been looking into how to increase our code quality by:

Setting up tools to immediately alert the developers when code quality drops

Documenting code guidelines and thinking about how we could have avoided maintainability problems in past projects

I am going to briefly outline what we have set up to automatically monitor code quality.

Groundwork

We have opted to base our continuous integration setup on Jenkins, running on a Mac Mini that lives in our studio. As much as I dislike how Jenkins looks and works, it is by far the most stable and suitable tool for this job.

We have Jenkins installed through Homebrew and Ruby installed with rbenv, which provides us with an up-to-date and stable RubyGems environment. With these two package managers we are able to install almost any tool we need, in a way that is much less likely to break with system updates compared with using the OS X provided Ruby.
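Assuming Homebrew is already installed, the bootstrap looks roughly like this (the exact package names and Ruby version are illustrative, not a prescription):

```shell
# Build tools and Jenkins itself, all from Homebrew
$ brew install jenkins xctool gcovr

# rbenv plus a recent Ruby, then the gems we need
$ brew install rbenv ruby-build
$ rbenv install 2.2.0
$ rbenv global 2.2.0
$ gem install fastlane
```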

Unit Testing

We test our iOS projects using Specta and Expecta.

Specta gives us a Behaviour-Driven Development (BDD) style syntax for writing tests which (we think) is more readable than XCTest syntax. It also has a very powerful system of grouping tests together and running blocks of code before or after those tests, which can greatly reduce the amount of duplicated code.
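As a sketch of that grouping, a Specta spec for a hypothetical Banana class might look like this, with a shared beforeEach removing duplicated setup:

```objectivec
#define EXP_SHORTHAND
#import <Specta/Specta.h>
#import <Expecta/Expecta.h>
#import "Banana.h" // hypothetical class under test

SpecBegin(Banana)

describe(@"Banana", ^{
    __block Banana *banana;

    // Runs before every test in this group
    beforeEach(^{
        banana = [Banana new];
    });

    it(@"starts unpeeled", ^{
        expect(banana.isPeeled).to.beFalsy();
    });

    it(@"can be peeled", ^{
        [banana peel];
        expect(banana.isPeeled).to.beTruthy();
    });
});

SpecEnd
```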

Expecta is a matcher framework which we use to create assertions in our tests. The syntax is very powerful, yet at the same time more readable than the built-in XCTAssert macros. For example:

expect(@"foo").to.equal(@"foo");

expect(foo).notTo.equal(1);

expect([bar isBar]).to.equal(YES);

expect(baz).to.equal(3.14159);

We run our tests from Xcode while developing, and with xctool (installed via Homebrew) on Jenkins. xctool is an alternative to xcodebuild which makes it very easy to run a test suite from the command line and generate JUnit-style reports.

$ xctool -workspace Project.xcworkspace -scheme Project -reporter junit:junit-report.xml test

These reports are then published in Jenkins using the JUnit Plugin which provides graphs of the unit test results over time, giving us an insight into how stable our tests are.

Going well so far!

Pull Request Testing

We want our tests to run as soon as possible so we know right away if we have broken something. At ribot we make changes on feature branches and then submit a pull request on GitHub so the code can be reviewed by another developer. As soon as one is opened, we run all the tests to ensure that nothing has been broken.

To manage this, we set up the GitHub Pull Request plugin, which sends a message from GitHub to Jenkins when a new pull request is opened. If any tests fail, the failure is shown on GitHub and we do not merge the pull request until it has been fixed.

Code Coverage

We also generate code coverage reports using the Gcovr tool, also installed with Homebrew. To set up a project you need to change two build settings for the debug configuration of the main target. Set both Generate Test Coverage Files and Instrument Program Flow to Yes.

Make sure you only enable these settings for the Debug configuration

We then need to add OBJROOT=./build to the end of the xctool command when we run the unit tests, and then generate our code coverage reports.
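Combined with the earlier JUnit reporter, the full invocation looks like this:

```shell
$ xctool -workspace Project.xcworkspace -scheme Project \
    -reporter junit:junit-report.xml test OBJROOT=./build
```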

$ gcovr -r . --object-directory build/Project.build/Debug-iphonesimulator/Project.build/Objects-normal/x86_64 --exclude '.*Tests.*' --xml > coverage.xml

The coverage report that Gcovr outputs can then be published by the Cobertura Jenkins plugin which provides a visual way of seeing how code coverage is changing over time.

We now have a way of seeing not only if the tests are passing, but also how thoroughly we are testing our code.

Static Analysis

One of the most powerful sets of tools for keeping code quality high is static analysis. These tools scan your code and generate a report of everywhere it breaks a code style rule. Some examples of these rules are:

Unused variables or parameters

Long variable names, method names or lines

Overriding a method and not calling super on a method that requires it

Overly long or complex methods

And so on…
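For instance, a contrived method like the following would trip a couple of those rules:

```objectivec
- (NSString *)statusLabel {
    NSInteger retryCount = 0; // unused local variable: OCLint flags this

    if (self.isActive) {
        return @"active";
    } else { // redundant else after a return: another common warning
        return @"inactive";
    }
}
```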

We use OCLint, a static analysis tool that works with C, C++ and Objective-C. OCLint provides great integration with xctool through a special json-compilation-database reporter. We first need to add a second reporter to our xctool command and then pass that report to OCLint to perform the static analysis.

$ xctool -workspace Project.xcworkspace -scheme Project -sdk iphonesimulator -reporter json-compilation-database:compile_commands.json clean build

$ oclint-json-compilation-database -e Pods -- -report-type pmd -o oclint-pmd.xml

The report generated is in the PMD format and can then be published in Jenkins by the PMD Plugin. With this plugin you can also set limits to how many of each priority of warning (low, medium and high) there can be before the tests fail. We initially set these limits low so we are alerted as soon as we introduce code that could be improved.

As soon as we set up OCLint, we saw and fixed the worst of the problems we had. Exactly what we wanted to happen!

Automated Deployment

The last piece of the puzzle is less about code quality and more about time saving. Developers will regularly need to send out builds through Crashlytics to designers for design reviews, or to clients at the end-of-sprint demos. Sending out a build of the app usually only takes around ten minutes of a developer’s time, but it requires them to switch tasks and break their flow.

We have recently set up a nightly build system which will automatically send a new version of the app each morning to everyone on the project at ribot.

To do this we are using fastlane, which is an amazing collection of tools to define lanes of actions to perform. We currently have three lanes defined, one for releasing just to ribot developers, one for releasing to everyone at ribot and another for releasing to the client.

before_all do |lane|
  cert
  sigh
end

desc "Deploy a new build to ribot iOS developers over crashlytics"
lane :dev do
  ipa
  crashlytics({ groups: 'ribot-developers' })
end

desc "Deploy a new build to people at ribot over crashlytics"
lane :internal do
  ensure_git_status_clean
  append_build_time
  ipa
  crashlytics({ groups: 'ribot' })
  reset_git_repo
end

desc "Deploy a new build to everyone over crashlytics"
lane :external do
  ensure_git_status_clean
  increment_build_number
  ipa
  crashlytics({ groups: ['ribot', 'client'] })
  commit_version_bump
  add_git_tag
  push_to_git_remote
end

after_all do |lane|
  clean_build_artifacts
end

A lane is run using the fastlane tool (installed through RubyGems).

$ fastlane internal

At the start of all the lanes we automatically make sure we have a valid signing certificate and an up-to-date provisioning profile. All of our configuration lives in a .env file, which allows us to have default settings but override them when we run the fastlane command if needed.
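As an illustration, a project's .env file might hold defaults such as these (the variable names here are hypothetical):

```shell
# Hypothetical .env defaults; override a value on the command line
# when needed, e.g. CRASHLYTICS_GROUPS=client fastlane internal
WORKSPACE=Project.xcworkspace
SCHEME=Project
CRASHLYTICS_GROUPS=ribot-developers
```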

In the future we will also look into automating the app store submission process using the deliver action.
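Such a lane might be as simple as the following sketch (not something we run yet):

```ruby
desc "Submit a new build to the App Store"
lane :release do
  ipa      # build the signed .ipa
  deliver  # upload the build and metadata to iTunes Connect
end
```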

Going Forward

So far we have seen good results on the projects where we have trialled this process, and we expect our code quality to improve now that these tools are in place. The reports will enable us to quantify how our code quality changes over time, and we are looking forward to seeing the impact on our next project.