Google Maps has announced a slew of new features coming soon, ranging from on-map weather updates to AR-powered indoor navigation. There’s a lot to cover, and the company says it is on track to deliver “over 100 AI-powered improvements to Google Maps” this year.
First, there’s a new UI for directions. Today, the directions UI uses tabs for each transportation type: one for driving, then mass transit, walking, ride shares, and biking. In the redesign, everything appears in a single flat list, and you can hit the “options” button to set your preferred modes of transportation. You can prioritize routing options for driving, walking, trains, buses, motorcycles, bikes, ride shares, “bike and scooter shares,” and ferries. You can even pick multiple items, so all of your top choices will appear first in the list.
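Google hasn’t said how the new list is ordered under the hood, but the behavior it describes amounts to a simple preference-first sort. Here’s a minimal sketch of that idea; the RouteOption fields and order_routes function are hypothetical illustrations, not anything from the Maps app or API.

```python
# Illustrative sketch only: assumes a flat list of route options where the
# user's preferred modes bubble to the top, fastest first within each group.
from dataclasses import dataclass


@dataclass
class RouteOption:
    mode: str          # e.g. "driving", "train", "bike" (hypothetical field)
    duration_min: int  # estimated travel time in minutes (hypothetical field)


def order_routes(routes, preferred_modes):
    """Show routes using the user's preferred modes first, then by travel time."""
    preferred = set(preferred_modes)
    return sorted(routes, key=lambda r: (r.mode not in preferred, r.duration_min))


options = [
    RouteOption("driving", 25),
    RouteOption("train", 40),
    RouteOption("bike", 55),
]
# Train and bike routes appear before driving, each group sorted by time.
print(order_routes(options, preferred_modes={"train", "bike"}))
```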
Some of the route options will have a little green leaf next to them, which is part of Google Maps’ new focus on promoting cleaner methods of transportation. For driving, Google Maps’ routing screen will soon take fuel efficiency into account, and you’ll start to see a green leaf next to fuel-efficient routes, too. For many trips, the shortest route is also the most fuel-efficient, so not much will change. But Google Maps will factor in things like traffic, starts and stops, and road elevation (a major concern in Google’s backyard of California) to come up with a CO2 rating for each trip. If it finds a route that is more fuel-efficient but longer, it will tell you about it, and if both routes take the same amount of time, fuel efficiency will be used as the tiebreaker for the default route.
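That selection logic is simple enough to sketch. The following is a toy model only, with made-up field names and a made-up CO2 score; Google hasn’t published how it actually weighs traffic, stops, and elevation.

```python
# Hypothetical sketch of "fastest wins, fuel efficiency breaks ties, and a
# greener-but-slower route gets surfaced as an alternative."
from dataclasses import dataclass


@dataclass
class Route:
    duration_min: float  # estimated travel time
    co2_grams: float     # estimated emissions (traffic, stops, elevation, etc.)


def pick_default_route(routes):
    """Fastest route wins; if travel times tie, lower CO2 breaks the tie."""
    return min(routes, key=lambda r: (r.duration_min, r.co2_grams))


def eco_alternative(routes, default):
    """Surface a more fuel-efficient (but possibly longer) option, if one exists."""
    greener = [r for r in routes if r.co2_grams < default.co2_grams]
    return min(greener, key=lambda r: r.co2_grams) if greener else None


routes = [Route(30, 1200), Route(30, 950), Route(34, 800)]
default = pick_default_route(routes)           # 30 min / 950 g: tie broken by CO2
suggestion = eco_alternative(routes, default)  # 34 min / 800 g shown as the eco option
print(default, suggestion)
```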
Google says both features will release sometime this year.
Google Maps takes on the weather
Two new layers are coming to Google Maps that put the service in a bit of competition with your favorite weather app: weather and air quality. Weather is almost always something you should check before traveling, and soon you’ll be able to get that information right in Google Maps. For allergy sufferers, or for people in places where air quality is regularly an issue, having that info a tap away will be handy, too.
Google says, “Data from partners like The Weather Company, AirNow.gov and the Central Pollution Board power these layers that start rolling out on Android and iOS in the coming months. The weather layer will be available globally, and the air quality layer will launch in Australia, India, and the US, with more countries to come.”
For now, the presentation of this information is very limited. The most obvious way to display weather and air quality data would be an overlay showing rain and air quality in a radar-style view, with varying colors denoting intensity. Instead, Google Maps displays this data only as tiny, scattered dots on the map, similar to how points of interest are displayed. That makes it difficult to tell where rain starts and stops, how long it will stick around, whether conditions will improve or worsen over the next few hours, or how bad the weather will be while you’re driving there. A proper overlay built directly into Google Maps would probably cut a big chunk out of the weather app industry, since radar data layered on a map is a core feature of most weather apps, but maybe Google is more receptive to a first-party weather solution now that Apple is investing in the area with its acquisition of Dark Sky.
Indoor AR navigation
Google Maps AR Navigation is moving indoors. The feature rolled out to iPhone and Android devices in select cities in 2019, and it uses ARCore-based 3D sensing and Google’s trove of Street View imagery to determine your heading from whatever the camera is pointed at. Outdoors, the feature ended up being the world’s most complicated replacement for a compass, but phone compasses (especially in Android phones) just aren’t that accurate and are prone to interference, so getting your initial walking direction right with a compass alone can be a challenge. AR Navigation, in addition to the cool 3D visuals overlaid on the camera view, really is a big help there.
It sounds like AR Navigation is going to really shine once it moves indoors. Google demoed the feature in an airport terminal, where it can do things like figure out your location (GPS doesn’t work indoors) and identify which floor you’re on. In the demo, it tells someone where the escalator is and to go down a level to reach their terminal. Google says it wants to roll this technology out to “airports, transit stations, and malls,” where it will be able to “help you find the nearest elevator and escalators, your gate, platform, baggage claim, check-in counters, ticket office, restrooms, ATMs and more.”
Indoor navigation has been something Google has continually tried to get businesses to adopt, pitching solutions like Wi-Fi RTT—Wi-Fi-based positioning—which was built into Android 9. I think the company has realized that any method that requires independent businesses to install and maintain some kind of technical infrastructure isn’t going to work. AR Navigation feels like a more scalable alternative because Google can do all the work itself. It’s powered by nothing but your camera and a bunch of pictures stored on Google Maps—Google calls this VPS, or Visual Positioning System—and it’s basically AI-powered landmark navigation.
VPS data is the same as Street View data, so it can scale the same way Street View scales, by sending out hordes of contractors around the world to photograph everything with special equipment. You could claim that photographing every major indoor public space is too much work, but Google has already proved it can do this with Street View. The company professionally manufactures its own Street View backpacks now, so sending a contractor on a quick march through your local airport, train station, or mall should be enough for VPS data. “Just go photograph the entire world” is entirely within Google’s capabilities.
Google calls the feature “Indoor Live View” and says it “is live now on Android and iOS in a number of malls in Chicago, Long Island, Los Angeles, Newark, San Francisco, San José, and Seattle. It starts rolling out in the coming months in select airports, malls, and transit stations in Tokyo and Zurich, with more cities on the way.”
Now, if we could just get this in an AR version of Google Glass, that would be great.