What is tokenization?

Stephen Downes


This article looks at "what tokenization is, how it works, and why it's become a critical part of emerging blockchain technology" and therefore a core part of web3. In a nutshell, "tokenization is the process of creating a digital representation of a real thing." In other words, it's something we already do all the time - your driver's license, passport, credit card or other sorts of ID are tokenized versions of yourself. What makes tokenization interesting (to my mind) is persistence: how can you make the bond between the token and the real thing unbreakable? That's what distributed ledger technology (aka blockchain) is supposed to provide, but there are still issues. The article also looks at tokenization in large language models, which suggests all sorts of possibilities (imagine, for example, an AI that performs using 'large persistent object models' instead of language models).
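To make the two senses of 'tokenization' a little more concrete, here is a minimal sketch of my own (not from the article, and the function names and toy vocabulary are invented for illustration): one function derives a stable digital identifier from a record, roughly the way a ledger-based token is meant to bind to a real-world asset, and the other maps a sentence to integer token IDs the way a language-model tokenizer does.

```python
# Illustrative sketch only; names and the toy vocabulary are assumptions.
import hashlib

# Blockchain-style tokenization: derive a stable digital token from a record.
# In a real system this binding would be recorded on a distributed ledger.
def asset_token(record: str) -> str:
    return hashlib.sha256(record.encode("utf-8")).hexdigest()

# LLM-style tokenization: map text to integer IDs from a (toy) vocabulary.
TOY_VOCAB = {"what": 0, "is": 1, "tokenization": 2, "?": 3, "<unk>": 4}

def text_tokens(text: str) -> list[int]:
    words = text.lower().replace("?", " ?").split()
    return [TOY_VOCAB.get(w, TOY_VOCAB["<unk>"]) for w in words]

if __name__ == "__main__":
    print(asset_token("Driver's licence #12345, Stephen Downes"))
    print(text_tokens("What is tokenization?"))  # -> [0, 1, 2, 3]
```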



Stephen Downes, Casselman, Canada
stephen@downes.ca

Copyright 2024
Last Updated: Nov 21, 2024 06:49 a.m.

Creative Commons License.
