What is tokenization?
Prashanth Reddy, Robert Byrne,
McKinsey & Company,
Mar 06, 2024
This article looks at "what tokenization is, how it works, and why it's become a critical part of emerging blockchain technology" and therefore a core part of web3. In a nutshell, "tokenization is the process of creating a digital representation of a real thing." In other words, it's something we already do all the time - your driver's license, passport, credit card or other sorts of ID are tokenized versions of yourself. What makes tokenization interesting (to my mind) is persistence - how can you make the bond between the token and the real thing unbreakable? That's what distributed ledger technology (aka blockchain) is supposed to provide, but there are still issues. This article also looks at tokenization in large language models, which suggests all sorts of possibilities (imagine, for example, an AI that performs using 'large persistent object models' instead of language models).
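The language-model sense of "tokenization" mentioned above is worth a quick sketch: the model breaks text into subword pieces drawn from a fixed vocabulary. The toy vocabulary and greedy longest-match rule below are illustrative assumptions only, not any real model's tokenizer (production systems use byte-pair encoding or similar):

```python
# Toy sketch of subword tokenization as used by language models.
# VOCAB is an assumed, made-up vocabulary for illustration only.
VOCAB = {"token", "ization", "iz", "ation", "is", "fun",
         "t", "o", "k", "e", "n", "i", "z", "a", "s", "f", "u"}

def tokenize(word: str) -> list:
    """Greedily match the longest vocabulary piece at each position."""
    tokens = []
    i = 0
    while i < len(word):
        for j in range(len(word), i, -1):
            piece = word[i:j]
            if piece in VOCAB:
                tokens.append(piece)
                i = j
                break
        else:
            # Character not covered by the vocabulary: emit it on its own.
            tokens.append(word[i])
            i += 1
    return tokens

print(tokenize("tokenization"))  # ['token', 'ization']
```

The point of the analogy is that each token stands in for something larger, just as a driver's license stands in for you; the open question is how durably the token stays bound to the thing it represents.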