PCI Vault supports 3 tokenization algorithms. You can specify the algorithm to use when you create a capture endpoint or when you send a PAN to the vault directly.
All of these tokens are based on a number, which we obtain from your data. The number that is used is determined by looking for a field with one of the following names:

- `number`
- `card_number`
- `n`
- `account_number`

If a field with one of these names is found, its value is used to generate the token. If no such field is found, the request fails with an HTTP 400 error.
Please note that all algorithms are sensitive to spaces. This means that a PAN with spaces will generate a different token than the same PAN without spaces.
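For illustration, here is a minimal sketch of a payload the vault could tokenize. The `cardholder` field is hypothetical and only shows that field names other than the four above are not used for token generation:

```python
# Minimal sketch of a JSON payload; "card_number" is one of the four
# recognized field names listed above.
payload = {
    "card_number": "4111111111111111",  # this value is used to generate the token
    "cardholder": "J. Smith",           # hypothetical extra field, not used for the token
}

# Algorithms are space sensitive: this PAN would yield a DIFFERENT token.
payload_with_spaces = {"card_number": "4111 1111 1111 1111"}
```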
The default PCI Vault tokens are the most collision resistant of the 3 algorithms. Full tokenization uses a proprietary, cryptographically secure hashing algorithm to generate a hash, which then serves as the token.
These tokens are 64 characters long and highly collision resistant; full tokenization is the recommended algorithm unless you have specific needs.
To use this tokenization algorithm, specify `algorithm=full` in your request, or send the request without specifying an algorithm; `full` is the default.
Example tokens:
| Number | Token |
|---|---|
| 4111111111111111 | 9bfe9241f7ad0d84230e9837aa99c64bf3c3936478317146d1f7c58a66f40af6 |
| 5425233430109903 | bf2fa482286633fab50369c8ba855bf7bc5b1d29bb5ffc183b371e20b9d6253d |
| 4263982640269299 | 744d174e68756188bb48dd69ff69d1a4f4ed3a2ca1d3d0747b3b8d768f010e0b |
| 374245455400126 | a8366e50f6aafd1cc71360f43060808f2ef9f6f8a850319aff629a0b80ddd7de |
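As a rough sketch, sending a PAN to the vault with the full algorithm might look like the following. The endpoint URL, authentication header, and response shape here are placeholder assumptions for illustration; consult the PCI Vault API reference for the exact request format.

```python
import requests

# Placeholder endpoint and credentials, for illustration only; see the
# PCI Vault API reference for the real URL, authentication, and parameters.
VAULT_URL = "https://api.example-vault.test/v1/vault"

response = requests.post(
    VAULT_URL,
    params={"algorithm": "full"},  # or omit entirely: full is the default
    json={"card_number": "4111111111111111"},
    headers={"Authorization": "Bearer <your-api-key>"},  # hypothetical auth scheme
    timeout=10,
)
response.raise_for_status()
print(response.json())  # assumed to include the 64-character token
```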
Partial tokenization is useful when you want the token to have the same length as the input number. The trade-off is significantly lower collision resistance, so it is recommended to use a unique reference for every token in order to guarantee uniqueness. Partial tokenization works by generating a full token and trimming it to the desired length. The specified digits of the number are then replaced by the trimmed token, so the resulting token is part original number and part token (see the sketch after the example table below).
These tokens are the same length as the input number, but not highly collision resistant.
To use this tokenization algorithm, specify `algorithm=partial_n_pos` in your request, where `n` is a number between 8 and 16 (inclusive) and `pos` is either `start` or `end`.
Example tokens for `partial_8_end`:
| Number | Token |
|---|---|
| 4111111111111111 | 411111119bfe9241 |
| 5425233430109903 | 54252334bf2fa482 |
| 4263982640269299 | 42639826744d174e |
| 374245455400126 | 3742454a8366e50 |
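To make the composition concrete, here is a small sketch that reproduces the `partial_8_end` examples above from the full tokens in the earlier table. It is not the vault's implementation (the underlying hash is proprietary), and the `start` branch is an assumed mirror image of `end`:

```python
def partial_token(number: str, full_token: str, n: int = 8, pos: str = "end") -> str:
    """Compose a partial token: keep part of the original number and
    replace n digits with the first n characters of the full token."""
    chunk = full_token[:n]
    if pos == "end":
        return number[:-n] + chunk  # chunk replaces the trailing n digits
    return chunk + number[n:]       # pos == "start": assumed to replace the leading digits

# Reproduces the partial_8_end rows above, e.g. for the first test number:
full = "9bfe9241f7ad0d84230e9837aa99c64bf3c3936478317146d1f7c58a66f40af6"
assert partial_token("4111111111111111", full) == "411111119bfe9241"
```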
Luhn-compliant tokenization generates a token that is itself a 16-digit credit card number and also passes Luhn's algorithm.
These tokens are always 16 characters long and always pass Luhn's algorithm. They are less collision resistant than full tokens, but collision resistant enough for most practical purposes. To be safe, it is recommended to limit the number of tokens to a few thousand per unique reference.
To use this tokenization algorithm, specify `algorithm=luhn` in your request.
Example tokens:
| Number | Token |
|---|---|
| 4111111111111111 | 5560386204722122 |
| 5425233430109903 | 3113674056387905 |
| 4263982640269299 | 3543337643660602 |
| 374245455400126 | 3935235944928114 |
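For reference, here is a standard Luhn checksum you can run locally to confirm that tokens such as the ones above are Luhn compliant (this is the generic algorithm, not anything PCI Vault specific):

```python
def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes Luhn's checksum."""
    total = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:   # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9   # equivalent to summing the two digits
        total += d
    return total % 10 == 0

# All four example tokens above pass the check.
for token in ("5560386204722122", "3113674056387905",
              "3543337643660602", "3935235944928114"):
    assert luhn_valid(token)
```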