Android XR: Compose and AI for Immersive Apps
Source: thetechoutlook.com
Google's Android XR platform combines virtual reality (VR), augmented reality (AR), and mixed reality (MR) for future smart headsets and smart glasses. The Samsung Project Moohan headset, expected later in 2025, will be the first Android XR headset and has been spotted on Geekbench.
Following the I/O 2025 event, Google discussed how Android XR’s spatial UI and integrated AI models will allow app developers to create personalized and immersive user experiences.
Jetpack Compose for XR
Jetpack Compose for XR, part of the Jetpack XR SDK, allows developers to build spatial user interfaces. It offers spatial panels, subspaces, spatial rows and columns, spatial elevation, orbiters, and modifiers. Jetpack Compose for XR also supports stereoscopic 3D video formats, enhancing depth perception for Android XR users by delivering a different image to each eye.
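As a rough illustration, the sketch below shows how a spatial panel and an orbiter might be declared with the alpha androidx.xr.compose APIs. The composable name is illustrative, and exact package names, parameters, and signatures may differ between alpha releases of the Jetpack XR SDK.

```kotlin
import androidx.compose.foundation.layout.fillMaxSize
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Orbiter
import androidx.xr.compose.spatial.OrbiterEdge
import androidx.xr.compose.spatial.Subspace
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.width

@Composable
fun SampleSpatialLayout() {
    // Subspace opens a 3D volume inside the app's Compose hierarchy.
    Subspace {
        // SpatialPanel hosts ordinary 2D Compose content on a panel placed in space.
        SpatialPanel(
            modifier = SubspaceModifier
                .width(1280.dp)
                .height(800.dp)
        ) {
            Text(text = "Main content panel", modifier = Modifier.fillMaxSize())

            // An Orbiter anchors supporting UI (navigation, controls) to an edge
            // of the panel, floating slightly apart from the main content.
            Orbiter(position = OrbiterEdge.Top, offset = 16.dp) {
                Text(text = "Orbiting navigation bar")
            }
        }
    }
}
```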
AI Capabilities
AI improves the user experience on Android XR devices with creative content generation, context-aware interactions, and conversational multimodal agents. Developers can integrate Google's AI models, such as Gemini 2.0 and Imagen 3, through Vertex AI and Firebase for tasks like natural language processing, custom AI model training, and image generation.
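A minimal sketch of such an integration, assuming the Vertex AI in Firebase Kotlin SDK, is shown below; the model identifier "gemini-2.0-flash" and the prompt are placeholders, and the SDK surface may change across releases.

```kotlin
import com.google.firebase.Firebase
import com.google.firebase.vertexai.vertexAI

// Ask a Gemini model (via the Vertex AI in Firebase SDK) to describe a scene
// that an XR app could then narrate or display to the user.
suspend fun describeScene(sceneSummary: String): String? {
    // Obtain a generative model handle; "gemini-2.0-flash" is a placeholder
    // model identifier used for illustration.
    val model = Firebase.vertexAI.generativeModel("gemini-2.0-flash")

    // Send a plain-text prompt and return the model's text response.
    val response = model.generateContent(
        "Describe this scene to a headset user in one sentence: $sceneSummary"
    )
    return response.text
}
```

In practice a call like this would run inside a coroutine tied to the app's lifecycle, with the result fed back into the spatial UI.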
The Android XR SDK, emulator, samples, and documentation are already available, enabling developers to build Android XR apps with familiar Android tools and features.