Developing software nowadays, more often than not, involves managing multiple integration points and accessing external services, APIs and resources. These integrations require a way to verify the identity and authorization of the other party before granting access.

The same happens in our everyday use of the Internet: passwords or OAuth flows for logging in, tokens for keeping sessions open, and so on.

The common pattern is a secret that only you hold, which proves your identity. This is already a recurring security issue for an individual (people hate passwords), and it is aggravated in an organization, where multiple people must access the secret at different times and places.

Using, storing and managing these secrets is a well-discussed topic. Even so, we keep seeing it done the wrong way (or not at all) more often than is desirable, leading to serious security problems.

This issue, already frequent in the development and deployment of software products and servers, is growing and becoming more complex due to the current trend of adopting microservices architectures, which in practice means an increase in the overall number of servers and integrations.

It is clear that this is a hard problem and a sensitive topic, and it grows with the size of the company (in people or in infrastructure). To make things a little harder, we have to take into account two more variables:

The clients (this applies mostly to companies doing consulting work), who are sometimes the sender or recipient of these secrets. Some of them may not feel comfortable using the tools that protect this information, or may not be aware of the dangers of transmitting secrets without any protection.

The culture of BYOD (Bring Your Own Device), which is more widespread nowadays. This is the case here at Whitesmith, and it makes things more challenging since the company does not control the machines.

What aspects should be taken care of

To correctly address this issue, we should think of the situations where these secrets might be used and where they might be most exposed. Of course, no matter how many tools and policies are in place to armor your data, they are useless if people do not use and follow them. A good example is a password that is shared securely, but which someone then copies to a post-it or a desktop note to avoid the trouble of decrypting it again later.

Simplicity of use and integration with current workflows and tools is important. To start, the main things that must be ensured are:

The secure transmission of those secrets throughout the Internet.

The identification of who exactly will be able to access the contents.

The safe storage of this information, so it is protected in case of hacking, theft or a lost computer, while remaining easily accessible to the person working with it.

A nice feature to have, but hard to implement in most situations, is the capability to audit access to those secrets. However, this means having a centralized store for the secrets, which is a single point of failure (an outage in that service could stop everyone's work for the day).

How we currently handle this

As you might have noticed, this is an important issue for us: we work hard to keep our own and our clients' data safe and secure.

Over time, this led us to develop a set of processes to transmit and store information, so that we can share secrets within the team and store them safely for as long as we need them.

We take advantage of existing tools such as GPG for sending and receiving data over the wire, and VeraCrypt volumes to keep all data securely encrypted on disk while it is needed. For those familiar with gpg, the keys of Whitesmith's team are available on keyservers, and the legitimate ones are signed by [email protected] (key).

What can be done better

Of course, nothing is perfect, and this approach is somewhat hard to scale and maintain with agile workflows while assuring that the defined process is followed. Since improvements are always possible, below are the main things we are currently looking to improve:

Auditing

Avoiding the spread of copies of this information.

Making it easier for clients with no knowledge of gpg to send us encrypted information.

Revocation of access to certain secrets.

In an effort to keep improving and stay up to date, we are studying and testing some tools that might represent a step forward. Since this is a common issue, a good starting point is to learn from other companies and how they have dealt with this problem.

Some of these tools are services provided by well-known companies, others are open source tools more suited to this use case. Below is a small list of examples:

All of them promise to help you get rid of the wild west of spreading secrets around, or of sharing credentials through a common “password manager”, while letting you easily manage all activity. Some of them are cloud services; however, at this moment I personally prefer to have more control over where our secrets are stored, so I lean towards solutions like Vault or Keywhiz.

In future posts, I will explore the pros and cons we found in the two open source alternatives mentioned above.

Overall, the most important takeaway of this post is that these pieces of information should be handled carefully. While at the beginning it might be easy, and convenient, to use common tools in a more direct and relaxed way, once the team starts growing things get trickier and harder to handle.

The inability to control and protect your application/server secrets might lead to serious consequences down the road, so it is important to raise awareness of the issue.