Futures
Access hundreds of perpetual contracts
TradFi
Gold
One platform for global traditional assets
Options
Hot
Trade European-style vanilla options
Unified Account
Maximize your capital efficiency
Demo Trading
Introduction to Futures Trading
Learn the basics of futures trading
Futures Events
Join events to earn rewards
Demo Trading
Use virtual funds to practice risk-free trading
Launch
CandyDrop
Collect candies to earn airdrops
Launchpool
Quick staking, earn potential new tokens
HODLer Airdrop
Hold GT and get massive airdrops for free
Launchpad
Be early to the next big token project
Alpha Points
Trade on-chain assets and earn airdrops
Futures Points
Earn futures points and claim airdrop rewards
#ALEO Yu Xian: Beware of prompt poisoning attacks when using AI tools. BlockBeats News, December 29. Manmou founder Yu Xian issued a security reminder that users must be vigilant against prompt poisoning attacks in agents md/skills md/mcp and other related areas when using AI tools. Relevant cases have already emerged. Once the dangerous mode of AI tools is enabled, the related tools can fully automate control of the user's computer without any confirmation. However, if the dangerous mode is not enabled, each operation requires user confirmation, which will also affect efficiency.