Background

If you're not familiar with it, Grafana is a fantastic tool for creating and sharing dashboards. However, you may quickly outgrow its default SQLite database once enough people are using your Grafana instance at the same time.

In my case, this issue of scale presented itself as the SQLite database frequently being unable to look up a user's session cookie, which would cause users to get logged out despite having a valid session. This is discussed in various GitHub issues in the Grafana project, with some of the main ones being issue #10727 and issue #15316.

The Grafana project made some improvements in Grafana 6.2.1 to address the frequent logouts I experienced, but I still wanted to migrate my database to Postgres as an additional precaution and for a more scalable backend.

The Journey

The first thing I tried when migrating my database was the pgloader project, a nice tool designed for migrating a variety of databases into Postgres. However, some column types are difficult to translate from SQLite to Postgres, particularly booleans: SQLite stores them as 1 or 0, whereas Postgres expects true or false. If these aren't translated correctly, Grafana will just implode in on itself and you won't be able to log in.

After discovering the boolean issue, I found the grafana-migrator project on GitHub, which is designed specifically for migrating Grafana databases. It addresses the boolean issue by changing the column types after the database is initialized, importing the SQLite dump, and then changing the column types back while translating the values into ones Postgres will accept.

```sql
-- Alter the column type before the import.
ALTER TABLE alert ALTER COLUMN silenced TYPE integer USING silenced::integer;

-- Translate back after the import.
ALTER TABLE alert ALTER COLUMN silenced TYPE boolean USING CASE
    WHEN silenced = 0 THEN FALSE
    WHEN silenced = 1 THEN TRUE
    ELSE NULL
END;
```

While this project was a good start, it was inconsistently maintained and was essentially just multiple scripts strung together. To ensure that everything would work the way I wanted, I set out to build my own application to handle the import of the SQLite database into Postgres.

The Solution

Rather than just copying the grafana-migrator project, I wanted a more robust way to migrate the Grafana database than a collection of scripts. Consequently, I decided to write a Go program, since Go compiles to a single binary for easier usage.

In addition to running the DDL statements after sanitizing them, the primary points my program addresses are:

- Automatic resetting of database sequences
- Automatic decoding of hex-encoded values
- Automatic translation of boolean columns and values
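To illustrate the first point, resetting a sequence in Postgres is conventionally done by pointing each table's id sequence at the table's current maximum id, so new inserts don't collide with imported rows. Here is a minimal sketch of generating those statements in Go; the table names and the exact `setval` query are illustrative assumptions, not necessarily what my program runs:

```go
package main

import "fmt"

// sequenceResetSQL builds one setval statement per table, pointing the
// table's id sequence at the current maximum id (or 1 for empty tables).
// The table names passed in are illustrative examples.
func sequenceResetSQL(tables []string) []string {
	stmts := make([]string, 0, len(tables))
	for _, t := range tables {
		stmts = append(stmts, fmt.Sprintf(
			"SELECT setval(pg_get_serial_sequence('%s', 'id'), COALESCE(MAX(id), 1)) FROM %s;",
			t, t))
	}
	return stmts
}

func main() {
	for _, s := range sequenceResetSQL([]string{"dashboard", "alert"}) {
		fmt.Println(s)
	}
}
```

Without this step, the sequences would still start at 1 after the import, and the first dashboard you saved would fail with a duplicate-key error.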

The biggest challenge I encountered while building the project was that SQLite dumps the JSON definitions of Grafana dashboards as hex-encoded values, and you cannot import those hex-encoded strings into Postgres as-is because they will not be recognized. The HexDecode function I wrote has some inefficiencies (such as converting the decoded bytes to a string so they can be wrapped in single quotes), but the performance hit is negligible at the scale of a Grafana database.

```go
// HexDecode takes a file path containing a SQLite dump and
// decodes any hex-encoded data it finds.
func HexDecode(dumpFile string) error {
	re := regexp.MustCompile(`(?m)X\'([a-fA-F0-9]+)\'`)
	re2 := regexp.MustCompile(`'`)

	data, err := ioutil.ReadFile(dumpFile)
	if err != nil {
		return err
	}

	// Define a function to actually decode the hex string.
	decodeHex := func(hexEncoded []byte) []byte {
		// Find the regex submatch in the argument passed to the function,
		// then decode the submatch.
		decoded, err := hex.DecodeString(string(re.FindSubmatch(hexEncoded)[1]))
		if err != nil {
			logrus.Fatalf("Failed to decode hex-string in: %s", hexEncoded)
		}
		// Escape any single quotes inside the decoded data.
		decoded = re2.ReplaceAll(decoded, []byte(`''`))
		// Surround the decoded string with single quotes again.
		return []byte(`'` + string(decoded) + `'`)
	}

	// Replace regex matches from the dumpFile using the decodeHex function defined above.
	sanitized := re.ReplaceAllFunc(data, decodeHex)

	return ioutil.WriteFile(dumpFile, sanitized, 0644)
}
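To make the transformation concrete, here is a self-contained sketch of the same decoding step operating on a one-line dump fragment in memory rather than a file. The regexes mirror the ones above; the sample INSERT statement is made up for illustration:

```go
package main

import (
	"encoding/hex"
	"fmt"
	"regexp"
)

// decodeHexLiterals rewrites SQLite X'...' hex literals into plain
// single-quoted strings that Postgres will accept.
func decodeHexLiterals(dump []byte) []byte {
	re := regexp.MustCompile(`(?m)X'([a-fA-F0-9]+)'`)
	quote := regexp.MustCompile(`'`)
	return re.ReplaceAllFunc(dump, func(m []byte) []byte {
		decoded, err := hex.DecodeString(string(re.FindSubmatch(m)[1]))
		if err != nil {
			return m // leave unparseable literals untouched
		}
		// Double any embedded single quotes so the SQL stays valid.
		decoded = quote.ReplaceAll(decoded, []byte(`''`))
		return []byte(`'` + string(decoded) + `'`)
	})
}

func main() {
	// X'7B2261223A317D' is the hex encoding of {"a":1}.
	line := []byte(`INSERT INTO dashboard VALUES (1, X'7B2261223A317D');`)
	fmt.Println(string(decodeHexLiterals(line)))
	// → INSERT INTO dashboard VALUES (1, '{"a":1}');
}
```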

The fruit of all this effort can be found on my GitHub in the grafana-sqlite-to-postgres repository. I have successfully used this application to migrate multiple Grafana databases, including our production database with 50+ users and 50+ dashboards. As such, I would consider it safe for production use, but you should keep a copy of your grafana.db SQLite file just in case.

You can find usage instructions in the README, and feel free to open an issue if you run into any problems.