Tokenization (Security)
Definition
In data security, tokenization is the process of replacing a sensitive data element with a non-sensitive equivalent, referred to as a "token," that has no extrinsic or exploitable meaning or value. The token is a reference that maps back to the sensitive data only through the tokenization system.
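To make the mapping concrete, here is a minimal sketch of a tokenization system in Python. The `TokenVault` class name and token length are illustrative assumptions; a real system would back the vault with a hardened, access-controlled datastore and auditing rather than an in-memory dictionary.

```python
import secrets

class TokenVault:
    """Minimal in-memory token vault: maps opaque tokens to sensitive values."""

    def __init__(self):
        self._vault = {}  # token -> sensitive value

    def tokenize(self, sensitive_value: str) -> str:
        # The token is random, so it carries no information about the
        # original value and cannot be reversed without the vault.
        token = secrets.token_urlsafe(16)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only systems with access to the vault can recover the original.
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")
print(token)                    # e.g. 'kZ1x...' -- meaningless on its own
print(vault.detokenize(token))  # '4111 1111 1111 1111'
```

Because the token is generated randomly rather than derived from the card number, there is no mathematical relationship to reverse; an attacker who steals only the tokens learns nothing.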
Why It Matters
Tokenization is an effective way to protect sensitive data, especially in payment processing. By replacing a credit card number (the primary account number, or PAN) with a token, a merchant can process payments without ever storing or transmitting the actual card number, significantly reducing both its security risk and its PCI DSS compliance scope.
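Payment tokens are often format-preserving, so downstream systems that expect a 16-digit number keep working; a common convention is to retain the last four digits for receipts. A hedged sketch assuming that convention (the function name is hypothetical, and real schemes add safeguards such as ensuring the token fails the Luhn check so it cannot collide with a live card number):

```python
import secrets

def format_preserving_token(pan: str) -> str:
    """Replace all but the last four digits of a card number (PAN)
    with random digits, so the token still looks like a card number
    but reveals nothing beyond the last four digits."""
    digits = [c for c in pan if c.isdigit()]
    randomized = [str(secrets.randbelow(10)) for _ in digits[:-4]]
    # A production tokenizer would also record the token -> PAN mapping
    # in a secure vault and guarantee the token is not a valid PAN.
    return "".join(randomized + digits[-4:])

print(format_preserving_token("4111111111111111"))  # e.g. '8302957164021111'
```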
Contextual Example
When you use Apple Pay, your actual credit card number is never sent to the merchant. Instead, a device-specific token (the Device Account Number) stands in for the card, and each transaction is authorized with a one-time dynamic security code. If intercepted, the code cannot be replayed and the token cannot be mapped back to the real card number.
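The per-transaction credential can be modeled as a token that is consumed on first redemption. Below is a minimal sketch of that single-use behavior; the class and method names are illustrative, not Apple's actual protocol.

```python
import secrets

class OneTimeTokenService:
    """Issues tokens that can be redeemed exactly once."""

    def __init__(self):
        self._pending = {}  # token -> payload

    def issue(self, payload: str) -> str:
        token = secrets.token_hex(16)
        self._pending[token] = payload
        return token

    def redeem(self, token: str) -> str:
        # pop() removes the token, so a replayed (intercepted) token fails.
        if token not in self._pending:
            raise ValueError("token already used or unknown")
        return self._pending.pop(token)

svc = OneTimeTokenService()
t = svc.issue("authorize $20.00 on card ending 1111")
print(svc.redeem(t))   # succeeds once
# svc.redeem(t)        # raises ValueError: a replayed token is rejected
```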
Common Misunderstandings
- This is different from tokenization in AI and NLP (splitting text into tokens for a model) and from tokenization in blockchain (representing an asset as a digital token).
- Tokenization is not encryption: a token cannot be mathematically reversed into the original value. The actual sensitive data is stored securely in a central "token vault" that holds the mapping.