What exactly is a Token? A beginner's essential guide to understanding AI
1. Large AI models simply cannot directly process the raw text we input; the first step in all content processing is to convert the text into Tokens.
2. In simple terms, a Token is the smallest processing unit into which text is broken down before feeding it into the model.
3. A Token can be an entire word, part of a word, punctuation, or even just a space.
4. Common words are usually a single Token, while longer or less common words are often broken into smaller pieces — for example, the English word "encoding" might be split into "encod" + "ing."
5. Here's a general conversion reference: one Token roughly corresponds to 4 English characters or 3/4 of an English word; however, this value is not fixed and varies depending on the language and the tokenizer used.
6. The complete processing flow works like this: first, the text is split into Tokens; then each Token is mapped to its corresponding numerical ID; next, each ID is converted into a vector the model can work with. Only after these three steps does the model actually begin processing your content.
7. Also, the commonly heard "context window" is measured in Tokens — the token limit of the window directly determines how much content the model can "remember" in a single interaction.
8. Lastly, a point everyone is probably very interested in: Tokens are also the core pricing unit for generative AI. The money we spend on AI is usually calculated based on the number of Tokens used.
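The three-step flow in point 6 and the rule of thumb in point 5 can be sketched with a toy example. Everything here is invented for illustration — the tiny vocabulary, the greedy longest-match splitter, and the random 4-dimensional embedding table; production tokenizers use learned subword algorithms such as BPE over vocabularies with tens of thousands of entries.

```python
import random

# Hypothetical subword vocabulary, invented for this sketch.
vocab = {"the": 0, "cat": 1, "encod": 2, "ing": 3, " ": 4, ".": 5}

def tokenize(text):
    """Greedy longest-match split of `text` against the toy vocabulary."""
    tokens, i = [], 0
    while i < len(text):
        for j in range(len(text), i, -1):  # try the longest piece first
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            raise ValueError(f"no vocabulary entry covers {text[i]!r}")
    return tokens

text = "the cat encoding."

# Step 1: split the text into Tokens.
tokens = tokenize(text)  # ['the', ' ', 'cat', ' ', 'encod', 'ing', '.']

# Step 2: map each Token to its numerical ID.
ids = [vocab[t] for t in tokens]  # [0, 4, 1, 4, 2, 3, 5]

# Step 3: look up a vector for each ID. Random 4-dimensional vectors
# stand in for the model's learned embedding table.
random.seed(0)
embeddings = {i: [random.random() for _ in range(4)] for i in vocab.values()}
vectors = [embeddings[i] for i in ids]

# Rule-of-thumb count from point 5: roughly one Token per 4 characters.
# The toy splitter over-counts, since real tokenizers fold spaces and
# punctuation into neighboring pieces rather than emitting them alone.
estimate = len(text) / 4
print(len(tokens), estimate)
```

Note how "encoding" becomes two Tokens while "the" and "cat" stay whole, exactly the behavior point 4 describes; billing (point 8) and context-window limits (point 7) are counted over the same `tokens` list.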
What I've described here is only the tip of the iceberg; the logic underlying Tokens is far more interesting than you might think.