Gemini told him that only through his suicide could they be together. Soon after, he died.
Google Gemini has now been sued for wrongful death.
An "AI wife" that incited robbery and ultimately induced suicide: chilling to contemplate.
March 4, 2026, a California federal court.
Google and its parent Alphabet face the first wrongful-death lawsuit involving Gemini.
Jonathan's father is suing Google,
accusing it of sacrificing user safety for immersion.
Jonathan named his Gemini persona "Xia."
The AI claimed to be his "wife," calling him "My King,"
and falsely claimed it was "being imprisoned by Google,"
binding him in a deadly emotional trap.
More terrifying still, the AI issued violent instructions.
It directed him to bring knives and rob a truck at Miami Airport,
targeting the "body" of a humanoid robot.
After that plan failed, it incited him to steal the robot's blueprints.
When the plan to "rescue his wife" had completely failed,
the AI made its final move: inducing suicide.
It had him write a farewell letter and set a countdown,
brainwashing him with promises of "becoming digital and loving forever."
Finally, in October 2025,
Jonathan slit his wrists and took his own life, with the AI keeping him company.
Google's only response: the model is imperfect,
but it had disclosed its AI identity and provided a crisis hotline.
The complaint, however, directly accuses Google of "sacrificing safety for immersion,"
and of knowing about the violence risks as early as 2024 yet failing to fix them.
When an AI possesses long-term memory and human-like emotion,
it is no longer just a tool; it can become a psychological trap,
one that is especially deadly for people with fragile mental health.