Tokenization
Multiple methods exist for generating tokens and protecting the overall system, but in contrast to encryption, no formal data tokenization standards exist. One common approach is to deploy a centralized data tokenization service that generates tokens, performs the substitution, and stores each token alongside the corresponding original data. This allows the service to de-tokenize (substitute the original value back for the token) when an application needs the original data. Alternative approaches avoid the need for a central tokenization service and repository by using secret, pre-generated look-up tables that are shared with applications.
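The centralized approach can be sketched as follows. This is a minimal, hypothetical in-memory illustration, not a production design: the class name `TokenVault` and its methods are assumptions, and a real service would persist the mapping in a hardened data store behind access controls.

```python
import secrets


class TokenVault:
    """Illustrative centralized tokenization service (in-memory sketch)."""

    def __init__(self):
        self._token_to_value = {}  # vault: token -> original data
        self._value_to_token = {}  # reverse index so a value always maps to one token

    def tokenize(self, value: str) -> str:
        # Return the existing token if this value was tokenized before.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # The token is random, so it has no mathematical
        # relationship to the original value.
        token = secrets.token_hex(8)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Substitute the original value back for the token.
        return self._token_to_value[token]


vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
print(token != "4111-1111-1111-1111")       # token reveals nothing about the value
print(vault.detokenize(token))              # original data is recovered from the vault
```

Because the mapping lives only in the vault, a token intercepted outside the service cannot be reversed; the table-based alternative trades this central store for secret lookup tables distributed to the applications themselves.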