Secure Sensitive Data in Five Lines of Code
In the event of a security breach, exposed sensitive data can result in a loss of sales, revenue, or reputation for the company. It is therefore critical that an application correctly handles and stores this sensitive information.
Pros and Cons of Existing Encryption Solutions
- Database-Level Encryption - Protects the data if the physical drive is compromised, but any application with access to the database can still read sensitive data in plaintext.
- Encrypting Data at Rest - Ensures the data in the database is secured but requires additional solutions for encryption key management, key rotation, and key sharing.
- Database-Level Access Restrictions - These controls are all-or-nothing: every application with access to the database can read the sensitive data in full, even when one application needs the original value and another should only see part of it.
We can solve these problems by leveraging a tokenization solution, such as Basis Theory. The platform allows us to securely share the tokens across multiple applications and provide per-application permissions to the underlying data. In addition, tokenization removes the need to share encryption keys or manage encryption key rotation and storage.
To simplify tokenizing sensitive data and storing the resulting tokens, we developed a low-code solution that leverages value converters in Entity Framework, a popular object-relational mapper (ORM), to handle this for us.
Entity Framework Tokenization
Value Converters enable us to convert the state of a value when reading from and writing to the database. Common use cases include serializing or encrypting a value. In this example, we will build a Value Converter that tokenizes a sensitive value before writing it to the database and detokenizes it when reading it back.
Want to start playing with a working solution? Check out the Value Converter and a sample application in our GitHub repo.
1. Create a Value Converter
To get started, we are going to create a simple Value Converter which will automatically convert to and from our tokenized value:
public class TokenizationConverter<TModel> : ValueConverter<TModel, string>
{
    public TokenizationConverter(
        Expression<Func<TModel, string>> convertToProviderExpression,
        Expression<Func<string, TModel>> convertFromProviderExpression,
        ConverterMappingHints mappingHints = null)
        : base(convertToProviderExpression, convertFromProviderExpression, mappingHints)
    {
    }
}
2. Add Attributes
Next, we need a way to identify which properties on our data model we want to tokenize. To do this, we will add an attribute:
[AttributeUsage(AttributeTargets.Property, AllowMultiple = false, Inherited = false)]
public class TokenizedAttribute : Attribute
{
}
3. Create a Tokenization Provider
To create and retrieve our tokens, we will create a class that wraps the Basis Theory .NET SDK:
public class TokenizationProvider
{
    private readonly TokenClient _tokenClient;

    public TokenizationProvider(string apiKey) => _tokenClient = new TokenClient(apiKey);

    public string Tokenize<TModel>(TModel dataToTokenize)
    {
        var token = _tokenClient.Create(new Token
        {
            Type = "token",
            Data = dataToTokenize
        });

        return token.Id.ToString();
    }

    public TModel Detokenize<TModel>(string dataToDetokenize)
    {
        var token = _tokenClient.GetById(dataToDetokenize);

        return (TModel) token.Data;
    }
}
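Used on its own, the provider might look like the following sketch. The API key placeholder and the sample email value are illustrative, and the calls require a valid Basis Theory API key to actually run:

var provider = new TokenizationProvider("<your-api-key>");

// Store the sensitive value with Basis Theory and keep only the token id.
var tokenId = provider.Tokenize("jane@example.com");

// Later, exchange the token id for the original value.
var email = provider.Detokenize<string>(tokenId);

The Value Converter will call these same two methods for us automatically, so application code never has to invoke them directly.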
4. Finally, build an Extension Method
To tie everything together, we are going to build an extension method for our Entity Framework Database Context that automatically applies our ValueConverter to any properties with the Tokenized attribute:
public static ModelBuilder UseTokenization(this ModelBuilder modelBuilder, TokenizationProvider tokenizationProvider)
{
    if (tokenizationProvider == null) return modelBuilder;

    foreach (var entityType in modelBuilder.Model.GetEntityTypes())
    {
        foreach (var property in entityType.GetProperties())
        {
            var attribute = property.PropertyInfo?.GetCustomAttribute<TokenizedAttribute>(false);
            if (attribute == null) continue;

            if (property.ClrType == typeof(string))
            {
                property.SetValueConverter(new TokenizationConverter<string>(
                    m => tokenizationProvider.Tokenize(m),
                    s => tokenizationProvider.Detokenize<string>(s)));
            }

            if (property.ClrType == typeof(int))
            {
                property.SetValueConverter(new TokenizationConverter<int>(
                    m => tokenizationProvider.Tokenize(m),
                    s => tokenizationProvider.Detokenize<int>(s)));
            }
        }
    }

    return modelBuilder;
}
Done. We can now secure sensitive fields in any application using Entity Framework in as little as five lines of code! Let's take a look:
1. Declare a new TokenizationProvider:
var tokenizationProvider = new TokenizationProvider("key_VdG4NFZ5SM7RqjT4odaGj9");
2. Add it to our Database Context and initialize the provider using the extension method we created:
public DatabaseContext(DbContextOptions<DatabaseContext> options, TokenizationProvider tokenizationProvider)
    : base(options)
{
    _tokenizationProvider = tokenizationProvider;
}

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.UseTokenization(_tokenizationProvider);
    base.OnModelCreating(modelBuilder);
}
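In an ASP.NET Core application, registering both pieces with the dependency injection container might look like the sketch below. The configuration key and connection string name are assumptions, not part of the solution above:

// Register the provider as a singleton so every DatabaseContext shares one TokenClient.
// "BasisTheory:ApiKey" is a hypothetical configuration key.
services.AddSingleton(new TokenizationProvider(Configuration["BasisTheory:ApiKey"]));

services.AddDbContext<DatabaseContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("Default")));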
3. Finally, annotate the properties of your data models that you want to tokenize:
[Tokenized]
public string Email { get; set; }
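In context, a full entity might look like this sketch (the Customer class and its properties are hypothetical, shown only to illustrate mixing tokenized and plain columns):

public class Customer
{
    public int Id { get; set; }

    // Stored in the database as a Basis Theory token id;
    // detokenized automatically when the entity is read.
    [Tokenized]
    public string Email { get; set; }

    // Non-sensitive columns pass through unchanged.
    public string DisplayName { get; set; }
}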
That's it! We can now quickly secure any field in our database by adding the Tokenized attribute to additional properties and data models.
Are you interested in trying Basis Theory? Getting started takes a few clicks and is entirely free. Have questions? Join our Slack community or reach out via our contact us form!