Hashing, Encryption and Random in ASP.NET Core

This post looks at hashing, encryption and random string generation in ASP.NET Core. We examine a few different approaches and explain why some common techniques should be avoided in modern applications.

Generating a random string

It is a very common requirement to generate random strings. Not only do they make excellent primary keys (in many NoSQL data stores at least), they are also commonly used in email validation and password reset procedures. Developers often use a (modified) GUID for this:

Guid.NewGuid().ToString("N")

This returns a string similar to: 84bc1c2db56140b39e35b040e6856457

This is often acceptable but for a more random, more readable and potentially shorter string we can come up with a better alternative:

public class RandomGenerator
{
    private const string AllowableCharacters = "abcdefghijklmnopqrstuvwxyz0123456789";

    public static string GenerateString(int length)
    {
        var bytes = new byte[length];
        using (var random = RandomNumberGenerator.Create())
        {
            random.GetBytes(bytes);
        }

        // Note: because 256 is not a multiple of 36, the modulo introduces a slight
        // bias towards the earlier characters in the alphabet. This is acceptable
        // for most purposes but worth being aware of.
        return new string(bytes.Select(x => AllowableCharacters[x % AllowableCharacters.Length]).ToArray());
    }
}

The key to this method is the use of System.Security.Cryptography.RandomNumberGenerator.Create which returns a cryptographically strong random number generator. This is in direct contrast to the System.Random class which returns the same pseudo-random numbers in the same order given the same seed. The 'known' nature of System.Random can be very useful in some situations but for anything security related, the use of RandomNumberGenerator.Create is preferred.
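The deterministic behaviour of System.Random is easy to demonstrate. A minimal sketch (the class and variable names here are illustrative):

```csharp
using System;
using System.Security.Cryptography;

class RandomComparison
{
    static void Main()
    {
        // Two System.Random instances created with the same seed
        // produce exactly the same sequence of values.
        var a = new Random(1234);
        var b = new Random(1234);
        Console.WriteLine(a.Next() == b.Next()); // True - entirely predictable

        // RandomNumberGenerator exposes no seed at all; its output
        // cannot be reproduced or predicted by an attacker.
        var bytes = new byte[8];
        using (var rng = RandomNumberGenerator.Create())
        {
            rng.GetBytes(bytes);
        }
        Console.WriteLine(BitConverter.ToString(bytes));
    }
}
```

This predictability is exactly why System.Random is fine for simulations and games but dangerous for tokens and keys.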

This example uses a base36 alphabet but you can change it to your requirements. For example, if your end users are expected to type in the string then you might want to remove characters which can be easily confused such as 0/o and 1/i.
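As a sketch of that idea, a hypothetical variant (the class name and alphabet below are my own, not from the original example) might drop 0/o, 1/i and l:

```csharp
using System;
using System.Linq;
using System.Security.Cryptography;

public static class ReadableCodeGenerator
{
    // 0, o, 1, i and l removed so the result is easier to read aloud or type
    private const string UnambiguousCharacters = "abcdefghjkmnpqrstuvwxyz23456789";

    public static string GenerateString(int length)
    {
        var bytes = new byte[length];
        using (var random = RandomNumberGenerator.Create())
        {
            random.GetBytes(bytes);
        }
        return new string(bytes.Select(x => UnambiguousCharacters[x % UnambiguousCharacters.Length]).ToArray());
    }
}
```

The shorter the alphabet, the longer the string needs to be to carry the same amount of entropy, so adjust the length accordingly.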

Hashing a string

Hashing is a common requirement for storing passwords but you need to be very careful when implementing security logic yourself. Unless you have a very good reason, you are far better off using a tried and tested library to handle this for you. If you do have a genuine requirement to hash a string yourself then you need to pay close attention to:

- The algorithm you use
- The correct use of a salt

You often come across hashing examples similar to the following code:

public string CalculateHash(string input)
{
    using (var algorithm = SHA512.Create()) // or MD5, SHA256 etc.
    {
        var hashedBytes = algorithm.ComputeHash(Encoding.UTF8.GetBytes(input));
        return BitConverter.ToString(hashedBytes).Replace("-", "").ToLower();
    }
}

The above code may be perfectly acceptable for checksums (i.e. comparing the calculated hash with a known value to ensure that a payload is complete) but it should not be used for security purposes, even if it is modified to use a salt. Modern GPUs can calculate billions of hashes a second so it is important to choose a hashing algorithm which is specifically designed to be less susceptible to brute force attacks.

For a more secure alternative we must look to another package. Adding a NuGet reference to Microsoft.AspNetCore.Cryptography.KeyDerivation allows us to use PBKDF2 which is far harder to brute force.

This example uses the common (and perfectly secure) technique of storing the salt together with the hashed password:

public string CalculateHash(string input)
{
    var salt = GenerateSalt(16);
    var bytes = KeyDerivation.Pbkdf2(input, salt, KeyDerivationPrf.HMACSHA512, 10000, 16);
    return $"{Convert.ToBase64String(salt)}:{Convert.ToBase64String(bytes)}";
}

Of course the salt should be unique to each entry and not something that is guessable so we need a means to generate a random value. Again RandomNumberGenerator should be used rather than System.Random.

Here we are using a salt of 16 bytes or 128 bits but you can adjust as necessary:

private static byte[] GenerateSalt(int length)
{
    var salt = new byte[length];
    using (var random = RandomNumberGenerator.Create())
    {
        random.GetBytes(salt);
    }
    return salt;
}

When using PBKDF2, the recommendation is to change the number of iterations (the 4th argument in the KeyDerivation.Pbkdf2 call) to as large a figure as you can get away with. More iterations means more security but at the expense of slower generation.
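One way to pick that figure is to time the derivation on your production hardware and raise the count until a single derivation takes roughly as long as you can tolerate per login. A rough sketch of the technique (this uses the framework's Rfc2898DeriveBytes, which defaults to HMACSHA1, purely so the example runs standalone without the KeyDerivation package; the 100ms target is an assumption, not a recommendation):

```csharp
using System;
using System.Diagnostics;
using System.Security.Cryptography;

class IterationCalibration
{
    static void Main()
    {
        var salt = new byte[16];
        using (var rng = RandomNumberGenerator.Create())
        {
            rng.GetBytes(salt);
        }

        // Double the iteration count until a single derivation takes ~100ms.
        var iterations = 1000;
        while (true)
        {
            var timer = Stopwatch.StartNew();
            using (var pbkdf2 = new Rfc2898DeriveBytes("p@ssw0rd", salt, iterations))
            {
                pbkdf2.GetBytes(32);
            }
            timer.Stop();

            if (timer.ElapsedMilliseconds >= 100)
            {
                break;
            }
            iterations *= 2;
        }

        Console.WriteLine($"Use roughly {iterations} iterations on this hardware");
    }
}
```

Remember to re-run this kind of calibration when you move to faster hardware, and store the iteration count alongside the hash so existing passwords can still be verified after you increase it.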

To verify a plain-text string against a stored hash, we can simply hash the plain-text using the same salt and compare the values:

public bool CheckMatch(string hash, string input)
{
    try
    {
        var parts = hash.Split(':');
        var salt = Convert.FromBase64String(parts[0]);
        var bytes = KeyDerivation.Pbkdf2(input, salt, KeyDerivationPrf.HMACSHA512, 10000, 16);
        return parts[1].Equals(Convert.ToBase64String(bytes));
    }
    catch
    {
        return false;
    }
}

PBKDF2 is a big improvement on SHA* algorithms and is generally regarded as an acceptable approach in 2017. Having said that, it is important to be aware that there are several algorithms that are better (i.e. take longer to crack) such as Argon2, bcrypt and scrypt. Unfortunately these are not currently available in Microsoft-authored libraries so if you are looking for the ultimate in protection then you will need to rely on (and trust) a third party package.

Encrypting strings and objects

Encryption can be complicated. It goes without saying that developers should almost always avoid writing their own encryption implementations. .NET Core has much of the encryption capability of the full .NET Framework and for more advanced scenarios you can take advantage of the implementations in System.Security.Cryptography. Much of the time, however, this is unnecessary and we can take advantage of a much simpler API.

In previous versions of .NET, a quick and easy way to encrypt data was to use:

MachineKey.Protect(unprotectedBytes, purpose);

This is not available in .NET Core but the new data protection APIs are just as easy to use:

public class Encryptor
{
    private readonly IDataProtector _protector;

    public Encryptor(IDataProtectionProvider provider)
    {
        _protector = provider.CreateProtector(GetType().FullName);
    }

    public string Encrypt(string plaintext)
    {
        return _protector.Protect(plaintext);
    }

    public string Decrypt(string encryptedText)
    {
        return _protector.Unprotect(encryptedText);
    }
}

Note that the data protection services are registered automatically by ASP.NET Core, so IDataProtectionProvider can be injected without any configuration being necessary.

If you are decrypting data which could have been manipulated, it is a good idea to change our Decrypt method to a TryDecrypt which handles any cryptographic exceptions. We can also extend our example to allow objects as well as strings to be encrypted and decrypted:

public class Encryptor
{
    private readonly IDataProtector _protector;

    public Encryptor(IDataProtectionProvider provider)
    {
        _protector = provider.CreateProtector(GetType().FullName);
    }

    public string Encrypt<T>(T obj)
    {
        var json = JsonConvert.SerializeObject(obj);
        return Encrypt(json);
    }

    public string Encrypt(string plaintext)
    {
        return _protector.Protect(plaintext);
    }

    public bool TryDecrypt<T>(string encryptedText, out T obj)
    {
        if (TryDecrypt(encryptedText, out var json))
        {
            obj = JsonConvert.DeserializeObject<T>(json);
            return true;
        }
        obj = default(T);
        return false;
    }

    public bool TryDecrypt(string encryptedText, out string decryptedText)
    {
        try
        {
            decryptedText = _protector.Unprotect(encryptedText);
            return true;
        }
        catch (CryptographicException)
        {
            decryptedText = null;
            return false;
        }
    }
}

With C# 7 inline out variables, this approach simplifies the calling code:

if (_encryptor.TryDecrypt(encryptedText, out var password))
{
    // use password
}
else
{
    // handle error
}

Out of the box, the data protection APIs use AES-256-CBC for encryption and HMACSHA256 for validation but you can easily change this by configuring it in Startup.cs:

services.AddDataProtection()
    .UseCryptographicAlgorithms(new AuthenticatedEncryptionSettings
    {
        EncryptionAlgorithm = EncryptionAlgorithm.AES_256_GCM,
        ValidationAlgorithm = ValidationAlgorithm.HMACSHA512
    });

Here we change the encryption to AES-256-GCM and the validation to HMACSHA512. The default settings will suit most people so no configuration will be necessary but it is good to know that the library is flexible enough to adapt to your needs. You can even specify custom algorithms. See the docs for more information.

About key management

It probably hasn't escaped your attention that we have not specified any encryption keys anywhere. This is generally a good thing. Much of the time, we do not care about specifics and only want some data to be encrypted. If we do need a higher level of control then we can use the standard System.Security.Cryptography APIs mentioned above.

When using the data protection library, encryption keys are automatically generated and stored and by default have a 90-day lifetime. Again, you can override the defaults in Startup.cs:

services.AddDataProtection()
    .PersistKeysToFileSystem(new DirectoryInfo(@"c:\whatever"))
    .SetDefaultKeyLifetime(TimeSpan.FromDays(30));

Web Farms and Azure

If you have multiple servers then it is essential to use the same encryption keys for every box.

The standard cookie authentication middleware and CSRF protection both use the data protection APIs internally so if you are having problems such as users being logged out after a deployment then this is a likely cause.

If you host on Azure's PaaS infrastructure (App Services) then you will be pleased to know that synchronising keys between instances is already taken care of. Keys are stored in "%HOME%\ASP.NET\DataProtection-Keys" which is part of the storage-backed drive which is automatically synced. One potential gotcha is if you use deployment slots then switching slots will result in problems but this is likely to be addressed soon.

If you manage your web hosts yourself then you will need to handle the synchronisation. A shared drive may be an option but many users will need to find an alternative solution. The ASP.NET team have written extensions for Redis and Azure Blob Storage but you can also write your own.

Tugberk Ugurlu has a nice post showing how to use Redis for centralised key storage.

For blob storage, you need to reference Microsoft.AspNetCore.DataProtection.AzureStorage and add the following to your ConfigureServices method in Startup.cs:

var storageAccount = CloudStorageAccount.DevelopmentStorageAccount;
var client = storageAccount.CreateCloudBlobClient();
var container = client.GetContainerReference("key-container");

container.CreateIfNotExistsAsync().GetAwaiter().GetResult();

services.AddDataProtection().PersistKeysToAzureBlobStorage(container, "keys.xml");

As an alternative to the above, if you are familiar with shared access signatures, you can use:

services.AddDataProtection()
    .PersistKeysToAzureBlobStorage(new Uri("<blob URI including SAS token>"));

If you have another data provider such as SQL Azure, you will need to implement a custom IXmlRepository. You can see this post for an example.
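As a sketch of the shape of such an implementation, the IXmlRepository interface (in Microsoft.AspNetCore.DataProtection.Repositories) has just two members. The class below is a hypothetical outline, not a production implementation; the store accessed in the comments is whatever backing data provider you choose:

public class CustomXmlRepository : IXmlRepository
{
    public IReadOnlyCollection<XElement> GetAllElements()
    {
        // Load every previously stored key XML document from your data store
        // and return each one parsed as an XElement.
        throw new NotImplementedException();
    }

    public void StoreElement(XElement element, string friendlyName)
    {
        // Persist the XML for a newly created key, using friendlyName
        // as a human-readable identifier.
        throw new NotImplementedException();
    }
}

You then register it when configuring data protection so that all instances read and write keys through your shared store.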

Summary

This post looked at three distinct but related tasks: random string generation, hashing and encryption.

We showed a simple technique to improve upon the use of GUIDs for random strings.

We then examined hashing and explained why using SHA512.Create(), SHA256.Create() and MD5.Create() is not recommended and explained how to use PBKDF2 instead.

Finally, we looked at hassle-free encryption using the data protection APIs and discussed how web farms require additional configuration to avoid encryption key problems - not just with the data protection APIs directly but also with built-in functionality such as cookie authentication and CSRF protection which use the data protection APIs internally.

Useful or Interesting?

If you liked the article, I would really appreciate it if you could share it with your Twitter followers.
