Put on Vision Pro and become an "astronaut": Apple reveals the behind-the-scenes development of the "Jupiter" immersive environment
IT House, February 25 — Apple recently launched the Jupiter immersive environment on visionOS 26, which centers on Jupiter's moon Amalthea. In a February 21 interview with Cool Hunting, the visionOS team revealed for the first time the complex design process that takes these virtual spaces from concept to reality.
Yuri Imoto of the visionOS product marketing team and Apple human interface designer Matt Dessero explained in the interview that the team treated "naturalness" as the foundational principle from the earliest design stages.
Dessero explained that designers repeatedly consider what emotions these environments might evoke in users, such as whether they can bring a sense of calm, enhance focus, or stimulate curiosity about exploring the unknown. Based on these emotional considerations, the team ultimately chose Amalthea, one of Jupiter’s many moons, as the visual focus.
When creating realistic scenes of Earth locations such as Mount Hood and Yosemite, the design team captured on-site imagery in all weather conditions, using LiDAR scanning and high-precision material capture to build a unified 3D geometric mesh. To make each environment "come alive," the team also built custom acoustic meshes to realistically simulate how sound reflects off granite or forest floors.
For outer-space scenes lacking real-world data, the team turned to rigorous scientific modeling. IT House cites a blog post stating that when developing the new "Jupiter" environment, Apple deliberately chose Amalthea, approximately 144 kilometers in diameter, as the focal point, ensuring that Jupiter appears awe-inspiring in the field of view without being overwhelming.
Because no high-definition images of Amalthea exist, Apple enlisted NASA's Jet Propulsion Laboratory (JPL) as a technical partner. Guided by JPL's research, the team accurately recreated Amalthea's unusual terrain of rubble and ice, and added often-overlooked physical details such as Jupiter's faint rings and light scattering beneath the ice layers.
To achieve the most authentic sense of scale and space, the team broke with traditional development norms: rather than building 3D models on conventional flat screens, designers used the Vision Pro headset itself to create the environment from within.
Dessero emphasized that he directed 3D artists on scene composition entirely from inside the headset. This immersive creation method was crucial for spatial layout: even the placement of individual rocks in close-up views had to be repeatedly verified from the wearer's actual perspective.
The scene also features real-time depth interaction: users can manipulate a timeline to watch sunlight shift across icy craters and storms swirl on Jupiter's surface, all rendered in real time.