Google’s Android Auto is getting its biggest update in ten years, and it’s all about fitting in — literally. At the company’s I/O 2026 developer conference, the team behind the phone-projection system announced a suite of enhancements that promise to make the in-car experience more flexible, intelligent, and visually cohesive. The centerpiece is a new “full bleed” design that allows Android Auto to fill any screen shape, from curved panoramas to circular displays, ending the era of unused black margins.
“You have the new BMW Neue Klasse that has an irregular trapezoid. I don’t even know the shape. It’s kind of parallelogram-ish,” said Patrick Brady, Google’s Vice President of Android Automotive, during an interview. “I was like, man, I need to go back to geometry classes.” Brady explained that the old approach — placing a standard rectangular interface inside larger displays — no longer works in a world where automakers are experimenting with unconventional screen geometries. The Lucid Air’s curved display and the MINI Cooper’s circular screen are just two examples of how car dashboards have evolved beyond the traditional rectangle.
This “full bleed” approach isn’t just about aesthetics; it’s about functionality. Applications like Google Maps will now stretch to the edges of the screen, ensuring that drivers can see more map detail without scrolling or zooming. Brady emphasized that the new system dynamically adapts the interface to the screen’s aspect ratio and shape, using a combination of scalable layouts and clever cropping. For maps, this means that landmarks, traffic conditions, and turn-by-turn directions will be displayed more prominently, even on oddly shaped panels.
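The article doesn't describe Google's actual layout engine, but the idea of rendering full-bleed while keeping critical UI clear of rounded or cut corners can be sketched with simple geometry. This is a hypothetical illustration — the class and method names are invented, and the single-radius model is a simplification of real display cutout metadata:

```java
// Hypothetical sketch: the map layer draws edge-to-edge ("full bleed"),
// while buttons and text are confined to a "safe" rectangle inset from
// the display's rounded corners. Not Google's actual API.
public class SafeAreaDemo {
    // For a rounded corner of radius r, the curve's 45-degree point
    // determines how far content must be inset to clear it:
    // inset = r - r/sqrt(2) ≈ 0.293 * r.
    static double cornerInset(double cornerRadiusPx) {
        return cornerRadiusPx * (1 - 1 / Math.sqrt(2));
    }

    // Returns {x, y, width, height} of the safe content rectangle
    // for a display of the given size and corner radius.
    static double[] safeRect(double w, double h, double cornerRadiusPx) {
        double inset = cornerInset(cornerRadiusPx);
        return new double[] { inset, inset, w - 2 * inset, h - 2 * inset };
    }

    public static void main(String[] args) {
        // Example: a wide 1920x720 panel with 100 px rounded corners.
        double[] r = safeRect(1920, 720, 100);
        System.out.printf("safe area: x=%.1f y=%.1f w=%.1f h=%.1f%n",
                r[0], r[1], r[2], r[3]);
    }
}
```

Real hardware reports more detailed cutout and curvature data, but the principle is the same: the background fills the physical panel, and interactive elements adapt to whatever rectangle remains safely visible.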
Another headline feature is the ability to stream YouTube videos through Android Auto — but only while the vehicle is parked. This addresses a long-standing request from EV owners who often wait at charging stations for 20 to 40 minutes. The feature supports 4K resolution at 60 frames per second and Dolby Atmos spatial audio, making it a viable entertainment option. The video content is streamed from the user’s phone, not from the car’s embedded system. The car informs the phone when it’s in park, which unlocks video functionality. According to Brady, users have been badgering the company for the ability to watch movies or videos while charging their EV, waiting in parking lots, or sitting outside schools.
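The gating mechanism described above — the car reports its gear state, and the phone unlocks video only while parked — can be sketched as a small state check. The names here are illustrative, not the actual Android Auto interfaces:

```java
// Hypothetical sketch of park-gated playback: the head unit pushes
// gear-state updates to the phone, and video is permitted only while
// the last reported state is PARK. Illustrative names only.
public class VideoGate {
    public enum GearState { PARK, REVERSE, NEUTRAL, DRIVE }

    // Default to the most restrictive assumption until the car reports in.
    private GearState state = GearState.DRIVE;

    // Called whenever the vehicle sends an updated gear state.
    public void onGearStateChanged(GearState newState) {
        state = newState;
    }

    // Video streaming stays locked unless the car is parked.
    public boolean isVideoAllowed() {
        return state == GearState.PARK;
    }
}
```

The key design point is that the car, not the phone, is the source of truth: playback is denied by default and only enabled after an explicit park signal, so a dropped connection fails safe.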
Google is also bringing its Material You design language — now called Material 3 Expressive — to Android Auto. First introduced on Pixel phones in 2021, Material You adapts interface colors and theming based on the user’s wallpaper. That personalization will now extend to the car, so Android Auto’s palette will match the phone’s. Allison Johnson, a colleague who reviewed Material You last year, described it as “full of springy animations, bold fonts, and vibrant color absolutely everywhere.”
Widget support is another major addition. Android users can project their existing phone widgets onto the car’s screen, giving them quick access to sports scores, one-tap contacts, and smart home controls such as garage doors and lights. These widgets are also voice-controllable through Google’s Gemini AI assistant. For example, a driver can ask, “What’s the score of the Lakers game?” or “Turn on the porch lights,” and Gemini will fetch the relevant widget and display the response, reducing the need to fumble with a phone while driving.
The new “Magic Cue” feature takes this a step further. It analyzes incoming text messages and proactively surfaces useful information. If someone texts you asking for an address or a phone number, Gemini can retrieve that data from your phone and suggest a one-tap reply. Brady argues that this will help reduce unsafe phone interactions. “We do driver distraction studies in a simulator,” he said. “We test the heck out of everything.”
Gemini is also being upgraded as an agentic assistant that can operate third-party apps in the background. For instance, drivers can now ask Gemini to place a pickup order through the Starbucks app. The AI doesn’t rely on special API integrations; instead, it can navigate the Starbucks app on the user’s phone, select the desired items, and confirm the order — all while the driver keeps their hands on the wheel. This functionality is also available for DoorDash and other delivery apps, making it easier to order food while on the road.
Immersive Navigation in Google Maps, which was announced earlier this year, is now coming to Android Auto. When a user starts navigation, the map will display refreshed colors, detailed 3D buildings, elevated roadways, realistic terrain, and greenery. This provides a richer, more contextual view of the route, especially useful in dense urban areas or unfamiliar neighborhoods.
Brady believes these updates help narrow the gap between phone projection (Android Auto) and embedded native systems (Android Automotive). Even vehicles that don’t support phone projection — like Rivian, Chevy, and Cadillac EVs — run Google’s built-in software. And many features that debut in Android Auto eventually find their way into embedded systems, and vice versa. This cross-pollination ensures that Google’s ecosystem remains consistent across different vehicle architectures.
However, there are still limitations. Android Auto users cannot control the car’s HVAC system through phone projection, and other vehicle systems — drive modes, driver assist features, and radio settings — also remain inaccessible. But Brady noted that even these distinctions are disappearing. Advanced capabilities like Google Maps communicating directly with an EV’s powertrain to add charging stops or precondition the battery were once only possible through deeply embedded systems. Now, Google has been working with automakers to bring those same capabilities to Android Auto.
“I think these worlds are blending as the phones get more capable, the cars get more capable from a software perspective, and the integration between them [improves],” Brady said.
The updates announced at I/O 2026 reflect a broader trend in automotive technology: the convergence of phone-based and built-in systems. As screens become more diverse and intelligent assistants become more capable, the line between “projected” and “native” experiences will continue to blur. For users, this means fewer compromises when choosing between a head unit upgrade and a new car with built-in software. And for Google, it means a larger share of the automotive infotainment market, regardless of how the car is equipped.
Android Auto originally launched in 2015 as a simple mirroring system for music, maps, and calls. Over the years, it has evolved to support messaging, podcasts, and a variety of third-party apps. The 2026 update is the most significant overhaul to date, touching every aspect of the interface from screen adaptation to AI integration. With the addition of video streaming, widget support, and Gemini-powered automation, Android Auto is positioning itself as a full-fledged in-car operating system — even when the car itself doesn’t run Android.
Source: The Verge