bn:03450085n
Noun Concept
Categories: Cryptography
EN: tokenization, tokenisation
Definitions
Tokenization, when applied to data security, is the process of substituting a sensitive data element with a non-sensitive equivalent, referred to as a token, that has no intrinsic or exploitable meaning or value. (Wikipedia)
The process of substituting a sensitive data element. (Wikipedia Disambiguation)
Concept in data security. (Wikidata)
The act or process of tokenizing. (Wiktionary)
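A minimal, hypothetical sketch of the substitution described above, assuming a simple in-memory vault: a random token stands in for the sensitive value, carries no exploitable meaning on its own, and the original can only be recovered by an authorized lookup against the vault. The names TokenVault, tokenize, and detokenize are illustrative, not part of any particular product or standard.

    # Toy in-memory "token vault" (illustrative only): a sensitive value is
    # replaced by a random token with no intrinsic or exploitable meaning.
    # Real deployments use hardened vaults, access controls, auditing, and
    # often format-preserving tokens; nothing here is production-ready.
    import secrets

    class TokenVault:
        def __init__(self):
            # token -> original value; in practice this mapping lives in a
            # secured, audited data store, not in process memory.
            self._store = {}

        def tokenize(self, sensitive_value):
            # The token is random, so it cannot be reversed without the vault.
            token = "tok_" + secrets.token_hex(16)
            self._store[token] = sensitive_value
            return token

        def detokenize(self, token):
            # Only systems allowed to query the vault recover the original.
            return self._store[token]

    vault = TokenVault()
    token = vault.tokenize("4111 1111 1111 1111")  # pass the token downstream
    original = vault.detokenize(token)             # authorized vault lookup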
Sources: Wikidata, Wiktionary, Wikidata Alias