I’ve just deployed my first virtual server on Digital Ocean* and they offer a one-click installer for GitLab!

They provide hosting from $5 a month and have a pretty great interface for firing up new instances. Their applications tab lets you pick from pre-configured images, ranging from a classic LAMP stack to WordPress, Ghost or, in this case, GitLab.

Check out the blog post they have about setting up GitLab on Digital Ocean. For further instructions, as usual, see my book on GitLab 😉


GitLab Hardware Requirements and Digital Ocean

The hardware specs of the cheapest plan fall below the recommended specs for a GitLab installation, which start at 1024 MB of memory.

512MB is too little memory; GitLab will be very slow and you will need 250MB of swap.
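If you do run on a small droplet, a swap file keeps the machine from failing outright when memory runs out. A minimal sketch, assuming a standard Linux droplet (the path /swapfile and the 256 MB size are examples, not requirements):

```shell
# Create a 256 MB swap file (example size and path).
sudo dd if=/dev/zero of=/swapfile bs=1M count=256
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile

# Persist the swap file across reboots.
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```

Swap on a virtual server is slow, so treat it as a safety net rather than a substitute for the recommended 1024 MB of RAM.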

So in order to run a GitLab installation, we will at least need the 1024MB droplet at $10 a month:

GitLab “Cannot allocate memory”

When your GitLab installation runs out of memory, you will typically get HTTP 500 status codes and find lines like these in the log when trying to add an SSH key:

Completed 500 Internal Server Error in 14.8ms
Errno::ENOMEM (Cannot allocate memory - ssh-keygen):
  lib/gitlab/popen.rb:9:in `popen'
  app/models/key.rb:56:in `block in generate_fingerpint'
  app/models/key.rb:53:in `generate_fingerpint'
  app/controllers/profiles/keys_controller.rb:19:in `create'
  app/controllers/application_controller.rb:54:in `set_current_user_for_thread'

You might also see error messages when running the internal test suite with:

sudo -u git -H bundle exec rake gitlab:check RAILS_ENV=production

Reduce GitLab memory consumption

By running only one Unicorn worker process instead of two, you can reduce memory consumption slightly, but your performance will suffer. You can alter this setting in /home/git/gitlab/config/unicorn.rb:

# Use at least one worker per core if you're on a dedicated server,
# more will usually help for _short_ waits on databases/caches.
worker_processes 2
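A sketch of the reduced setting described above, trading throughput for memory (1 is the lowest useful value):

```ruby
# /home/git/gitlab/config/unicorn.rb
# A single worker saves memory on a small droplet, at the cost of
# serving only one request at a time.
worker_processes 1
```

Remember to restart GitLab after editing the file so the new worker count takes effect.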

* affiliate links