Talk:Tokenization (data security)

Tokenizing is the operation of replacing one set of symbols with another, typically to make the resulting set of symbols smaller.
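Read literally, that definition could look something like the following minimal sketch, in which each distinct symbol is replaced by a shorter integer token from a lookup table (the function and variable names here are illustrative, not from any source):

    # A minimal sketch of the "replacement" sense described above:
    # each distinct symbol (here, a word) is swapped for a smaller
    # token drawn from a lookup table built on the fly.
    def tokenize_by_replacement(symbols):
        table = {}    # symbol -> token
        tokens = []
        for s in symbols:
            if s not in table:
                table[s] = len(table)   # next small integer token
            tokens.append(table[s])
        return tokens, table

    words = ["the", "cat", "sat", "on", "the", "mat"]
    tokens, table = tokenize_by_replacement(words)
    print(tokens)   # [0, 1, 2, 3, 0, 4] -- smaller than the original strings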

This is not the common usage of the term. In computer science it normally means to split a string into tokens (e.g. keywords and separators), not to replace a list of tokens with smaller ones.
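For concreteness, a minimal sketch of that computer-science sense follows; the token pattern is illustrative and not taken from any particular language:

    # A minimal sketch of lexical tokenization: splitting a string
    # into tokens such as keywords, identifiers, numbers, and
    # single-character separators.
    import re

    def tokenize(source):
        # keywords first, so "if" is not matched as an identifier
        pattern = r"\b(?:if|else|while)\b|[A-Za-z_]\w*|\d+|[{}();=+<>-]"
        return re.findall(pattern, source)

    print(tokenize("if (x > 1) { y = x + 2; }"))
    # ['if', '(', 'x', '>', '1', ')', '{', 'y', '=', 'x', '+', '2', ';', '}']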

I am not familiar with the usage of tokenizing given in the previous version of the article, but I will leave it as an alternative meaning of tokenizing until I can verify whether or not it is correct.

Steve-o 03:47, 15 Apr 2004 (UTC)