Tokenization is a privacy-enhancing technology (PET) that replaces sensitive data with substitute values called tokens. The mapping between each token and its original value is stored in a separate, encrypted token vault that sits outside the production environment. When an application needs the real data, it presents the token to the vault, which maps it back to the actual value. Tokens can take various forms.
For example, a format-preserving token retains the structure of the original data while revealing only the last few digits. Tokenization can also be deterministic, so that every instance of the same original value maps to the same token, which lets systems match and join records without ever exposing the underlying data.
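To make the mechanism concrete, here is a minimal Python sketch of a vault-backed tokenizer for card numbers. The TokenVault class, its methods, and the in-memory dictionaries are illustrative assumptions, not a production design; a real vault is a separate, encrypted, access-controlled service. Revealing only the last four digits follows a common convention for card numbers.

```python
import secrets

class TokenVault:
    """Minimal in-memory sketch of a token vault (hypothetical API).
    A real vault is a separate, encrypted, access-controlled service."""

    def __init__(self) -> None:
        self._token_to_value: dict[str, str] = {}  # token -> original value
        self._value_to_token: dict[str, str] = {}  # original value -> token

    def tokenize(self, card_number: str) -> str:
        # Deterministic tokenization: the same input always yields the
        # same token, so lookups and joins still work on tokenized data.
        if card_number in self._value_to_token:
            return self._value_to_token[card_number]
        # Format-preserving token: random digits replace the sensitive
        # ones, and only the last four digits of the original remain.
        while True:
            prefix = "".join(str(secrets.randbelow(10))
                             for _ in range(len(card_number) - 4))
            token = prefix + card_number[-4:]
            # Retry on the rare collision with an existing token.
            if token not in self._token_to_value and token != card_number:
                break
        self._token_to_value[token] = card_number
        self._value_to_token[card_number] = token
        return token

    def detokenize(self, token: str) -> str:
        # Applications call back into the vault to recover the real value.
        return self._token_to_value[token]

vault = TokenVault()
t1 = vault.tokenize("4111111111111234")
t2 = vault.tokenize("4111111111111234")
assert t1 == t2                        # same value -> same token
assert t1.endswith("1234")             # format preserved, last four visible
assert vault.detokenize(t1) == "4111111111111234"
```

Note that the tokens here carry no mathematical relationship to the original values: unlike encryption, the sensitive data can only be recovered by asking the vault, which is what keeps it out of the production environment.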