Talk:Tokenization (data security)


Tokenizing is the operation of replacing one set of symbols with another, typically to make the resulting set of symbols smaller.

This is not the common usage of the term. In computer science it normally means splitting a string into tokens (e.g. keywords, separators), not replacing a list of tokens with smaller tokens.
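For contrast, a minimal sketch of that lexical-analysis sense in Python, splitting a string into word, number, and punctuation tokens; the regular expression here is illustrative only, not a standard tokenizer:

    import re

    # Match runs of word characters, or any single non-space, non-word character.
    # This pattern is a toy; real lexers distinguish keywords, identifiers, etc.
    TOKEN_RE = re.compile(r"\w+|[^\w\s]")

    def tokenize(text: str) -> list[str]:
        """Split a string into tokens in the lexical-analysis sense."""
        return TOKEN_RE.findall(text)

    print(tokenize("x = y + 42;"))  # ['x', '=', 'y', '+', '42', ';']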

I am not familiar with the usage of tokenizing given in the previous version of the article, but I will leave it in as an alternative meaning of tokenizing until I can verify whether or not it is correct.

Steve-o 03:47, 15 Apr 2004 (UTC)
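For reference, a minimal sketch of the substitution sense quoted at the top of this thread, assuming a simple in-memory lookup table; the function names and vault structure are illustrative only, not any particular product's API:

    import secrets

    # Illustrative in-memory vault; a real system would use a secured store.
    vault: dict[str, str] = {}  # token -> original value

    def tokenize(value: str) -> str:
        """Replace a sensitive value with a random surrogate token."""
        token = secrets.token_hex(8)
        vault[token] = value
        return token

    def detokenize(token: str) -> str:
        """Recover the original value from its token."""
        return vault[token]

    t = tokenize("4111-1111-1111-1111")
    print(t)              # random 16-character hex string
    print(detokenize(t))  # '4111-1111-1111-1111'

The point of the sketch is only the substitution step; in this sense the goal is security, not making the symbols smaller.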

Tokenizing in politics has a different meaning that would be worth adding to this article, or putting into another one.--Lizzard 22:47, 13 September 2006 (UTC)

I expected this page to be about lexical analysis, not security. 66.75.141.47 (talk) 08:09, 30 August 2009 (UTC)

The section on human perception:

  • is remote enough to be situated in a different article
  • needs a cite
  • was left in for the time being, pending suggestions for a more appropriate location