My Google Map Blog

Archive for May 19th, 2021

Search, explore and shop the world’s information, powered by AI

by Prabhakar Raghavan on May 19, 2021

AI advancements push the boundaries of what Google products can do. Nowhere is this clearer than at the core of our mission to make information more accessible and useful for everyone.

We've spent more than two decades developing not just a better understanding of information on the web, but a better understanding of the world. Because when we understand information, we can make it more helpful — whether you’re a remote student learning a complex new subject, a caregiver looking for trusted information on COVID vaccines or a parent searching for the best route home.

Deeper understanding with MUM

One of the hardest problems for search engines today is helping you with complex tasks — like planning what to do on a family outing. These often require multiple searches to get the information you need. In fact, we find that it takes people eight searches on average to complete complex tasks.

With a new technology called Multitask Unified Model, or MUM, we're able to better understand much more complex questions and needs, so in the future, it will require fewer searches to get things done. Like BERT, MUM is built on a Transformer architecture, but it’s 1,000 times more powerful and can multitask in order to unlock information in new ways. MUM not only understands language, but also generates it. It’s trained across 75 different languages and many different tasks at once, allowing it to develop a more comprehensive understanding of information and world knowledge than previous models. And MUM is multimodal, so it understands information across text and images and in the future, can expand to more modalities like video and audio.

Imagine a question like: “I’ve hiked Mt. Adams and now want to hike Mt. Fuji next fall, what should I do differently to prepare?” This would stump search engines today, but in the future, MUM could understand this complex task and generate a response, pointing to highly relevant results to dive deeper. We’ve already started internal pilots with MUM and are excited about its potential for improving Google products.
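
Google hasn’t published MUM’s internals, but the general shape of a multimodal, multilingual retrieval step can be sketched. The toy example below is an illustration, not MUM: the hash-based “encoders,” the tiny eight-dimensional vectors and the averaging fusion step are stand-ins for what would really be learned Transformer components.

```python
# Toy illustration of a multimodal query representation; NOT Google's MUM.
# The "encoders" below are hash-based stand-ins so the sketch runs with no ML
# dependencies; a real system would use learned Transformer encoders instead.
import hashlib
import math

DIM = 8  # toy embedding size (assumption)

def pseudo_embed(data: bytes) -> list:
    """Deterministic stand-in for a learned encoder: bytes -> unit vector."""
    digest = hashlib.sha256(data).digest()
    vec = [b / 255.0 for b in digest[:DIM]]
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

def encode_text(text: str) -> list:
    return pseudo_embed(text.encode("utf-8"))

def encode_image(image_bytes: bytes) -> list:
    return pseudo_embed(image_bytes)

def fuse(text_vec: list, image_vec: list) -> list:
    """Combine both modalities into one query vector (a simple average here)."""
    return [(t + i) / 2 for t, i in zip(text_vec, image_vec)]

def cosine(a: list, b: list) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# The complex query from the post, plus (hypothetically) a photo of hiking boots.
query_vec = fuse(
    encode_text("I've hiked Mt. Adams and want to hike Mt. Fuji next fall; how do I prepare?"),
    encode_image(b"<jpeg bytes of a boot photo>"),
)

# Rank candidate documents, regardless of their language, against the fused query.
docs = {
    "Mt. Fuji autumn gear checklist": encode_text("Mt. Fuji autumn gear checklist"),
    "Fuji-san aki no junbi (Japanese guide)": encode_text("Fuji-san aki no junbi"),
}
for title, vec in sorted(docs.items(), key=lambda kv: -cosine(query_vec, kv[1])):
    print(f"{cosine(query_vec, vec):.3f}  {title}")
```

In a real system both encoders would map into a shared learned embedding space, which is roughly what would let a single fused query rank relevant results regardless of the language or modality they come in.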

Information comes to life with Lens and AR

People come to Google to learn new things, and visuals can make all the difference. Google Lens lets you search what you see — from your camera, your photos or even your search bar. Today we’re seeing more than 3 billion searches with Lens every month, and an increasingly popular use case is learning. For example, many students might have schoolwork in a language they aren't very familiar with. That’s why we’re updating the Translate filter in Lens so it’s easy to copy, listen to or search translated text, helping students access education content from the web in over 100 languages.

Animated GIF showing Google Lens’s Translate filter applied to homework.

AR is also a powerful tool for visual learning. With the new AR athletes in Search, you can see signature moves from some of your favorite athletes in AR — like Simone Biles’s famous balance beam routine.

Animated GIF showing Simone Biles’s balance beam routine surfaced by the AR athletes in Search feature.

Evaluate information with About This Result 

Helpful information should be credible and reliable, and especially during moments like the pandemic or elections, people turn to Google for trustworthy information. 

Our ranking systems are designed to prioritize high-quality information, but we also help you evaluate the credibility of sources, right in Google Search. Our About This Result feature provides details about a website before you visit it, including its description, when it was first indexed and whether your connection to the site is secure. 

Animated GIF showing the About This Result feature applied to the query "How to invest in ETFs."

This month, we’ll start rolling out About This Result to all English results worldwide, with more languages to come. Later this year, we’ll add even more detail, like how a site describes itself, what other sources are saying about it and related articles to check out. 

Exploring the real world with Maps

Google Maps transformed how people navigate, explore and get things done in the world — and we continue to push the boundaries of what a map can be with industry-first features like AR navigation in Live View at scale. We recently announced we’re on track to launch over 100 AI-powered improvements to Google Maps by the end of the year, and today, we’re introducing a few of the newest ones. Our new routing updates are designed to reduce the likelihood of hard-braking on your drive using machine learning and historical navigation information — which we believe could eliminate over 100 million hard-braking events in routes driven with Google Maps each year.

If you’re looking for things to do, our more tailored map will spotlight relevant places based on time of day and whether or not you’re traveling. Enhancements to Live View and detailed street maps will help you explore and get a deep understanding of an area as quickly as possible. And if you want to see how busy neighborhoods and parts of town are, you’ll be able to do this at a glance as soon as you open Maps.

More ways to shop with Google 

People are shopping across Google more than a billion times per day, and our AI-enhanced Shopping Graph — our deep understanding of products, sellers, brands, reviews, product information and inventory data — powers many features that help you find exactly what you’re looking for.
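
The post describes the Shopping Graph only at a high level, but the kind of structure it names, products linked to sellers, brands, reviews and inventory, can be sketched as a small graph of records. The schema and sample data below are hypothetical illustrations, not Google’s actual Shopping Graph.

```python
# Hypothetical sketch of a product-centric graph linking products to sellers,
# brands, reviews and inventory. The schema and data are invented for
# illustration; this is not Google's Shopping Graph.
from dataclasses import dataclass, field

@dataclass
class Review:
    rating: int  # 1-5 stars
    text: str

@dataclass
class Offer:
    seller: str
    price: float
    in_stock: bool

@dataclass
class Product:
    name: str
    brand: str
    reviews: list = field(default_factory=list)
    offers: list = field(default_factory=list)

    def best_available_offer(self):
        """Cheapest offer that is actually in stock, if any."""
        in_stock = [o for o in self.offers if o.in_stock]
        return min(in_stock, key=lambda o: o.price) if in_stock else None

shoe = Product(
    name="Trail Runner 3",
    brand="ExampleBrand",
    reviews=[Review(5, "Great grip"), Review(4, "Runs small")],
    offers=[Offer("Shop A", 89.99, True), Offer("Shop B", 79.99, False)],
)
best = shoe.best_available_offer()
print(best.seller, best.price)  # Shop A 89.99: cheapest offer currently in stock
```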

Because shopping isn’t always a linear experience, we’re introducing new ways to explore and keep track of products. Now, when you take a screenshot, Google Photos will prompt you to search the photo with Lens, so you can immediately shop for that item if you want. And on Chrome, we’ll help you keep track of shopping carts you’ve begun to fill, so you can easily resume your virtual shopping trip. We're also working with retailers to surface loyalty benefits for customers earlier, to help inform their decisions.

Last year we made it free for merchants to sell their products on Google. Now, we’re introducing a new, simplified process that helps Shopify’s 1.7 million merchants make their products discoverable across Google in just a few clicks.  

Whether we’re understanding the world’s information, or helping you understand it too, we’re dedicated to making our products more useful every day. And with the power of AI, no matter how complex your task, we’ll be able to bring you the highest quality, most relevant results. 

A smoother ride and a more detailed Map thanks to AI

by Russell Dicker on May 19, 2021

AI is a critical part of what makes Google Maps so helpful. With it, we’re able to map roads over 10 times faster than we could five years ago, and we can bring maps filled with useful information to virtually every corner of the world. Today, we’re giving you a behind-the-scenes look at how AI makes two of the features we announced at I/O possible.

Teaching Maps to identify and forecast when people are hitting the brakes

Let’s start with our routing update that helps you avoid situations that cause you to slam on the brakes, such as confusing lane changes or freeway exits. We use AI and navigation information to identify hard-braking events — moments that cause drivers to decelerate sharply and are known indicators of car crash likelihood — and then suggest alternate routes when available. We believe these updates have the potential to eliminate over 100 million hard-braking events in routes driven with Google Maps each year. But how exactly do we find when and where these moments are likely to occur?


That’s where AI comes in. To do this, we train our machine learning models on two sets of data. The first comes from phones using Google Maps. Mobile phone sensors can detect deceleration along a route, but this data is highly prone to false alarms because your phone can move independently of your car. That makes it hard for our systems to tell a phone tossed into the cupholder, or accidentally dropped on the floor, apart from an actual hard-braking moment. To combat this, we also use information from routes driven with Google Maps when it’s projected onto a car’s display, like Android Auto. This is a relatively small subset of data, but it’s highly accurate because Maps is tethered to a stable spot: your car’s display. Training our models on both sets of data makes it possible to distinguish real deceleration moments from false ones, making detection across all trips more accurate.
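
The post doesn’t spell out the model, but the core idea, using the small set of display-tethered trips to teach a model which phone-sensor deceleration spikes are real, can be sketched. The features, sample values, labels and the choice of a plain logistic regression below are all assumptions made for illustration.

```python
# Hypothetical sketch: distinguish real hard-braking events from phone-motion
# false alarms. Not Google's model; features, labels and values are invented.
import math

# Each sample: (peak_deceleration_mps2, phone_jitter, label)
# label 1 = real hard-braking, label 0 = false alarm (phone moved on its own).
TRUSTED = [  # small but high-confidence set from display-tethered trips
    (4.5, 0.1, 1), (5.2, 0.2, 1), (3.9, 0.1, 1),
    (4.8, 2.5, 0), (5.0, 3.1, 0),  # phone dropped or tossed: high jitter
]
PHONE_ONLY = [  # larger, noisier phone-sensor set (weak labels)
    (4.1, 0.3, 1), (4.4, 0.2, 1), (3.8, 2.8, 0), (4.9, 2.2, 0), (5.1, 0.4, 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, epochs=2000, lr=0.05):
    """Plain logistic regression via stochastic gradient descent (no ML libraries)."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for decel, jitter, y in samples:
            p = sigmoid(w[0] * decel + w[1] * jitter + b)
            err = p - y
            w[0] -= lr * err * decel
            w[1] -= lr * err * jitter
            b -= lr * err
    return w, b

# Train on both sets so the trusted data anchors what "real" deceleration looks like.
w, b = train(TRUSTED + PHONE_ONLY)

def is_hard_braking(decel, jitter, threshold=0.5):
    return sigmoid(w[0] * decel + w[1] * jitter + b) >= threshold

print(is_hard_braking(4.7, 0.2))  # phone steady in a mount: likely a real event
print(is_hard_braking(4.7, 3.0))  # phone tumbling in the cupholder: likely noise
```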


Understanding spots along a route that are likely to cause hard-braking is just one part of the equation. We’re also working to identify other contextual factors that lead to hard-braking events, like construction or visibility conditions. For example, if there’s a sudden increase in hard-braking events along a route during a certain time of day when people are likely to be driving toward the glare of the sun, our system could detect those events and offer alternate routes. These details inform future routing so we can suggest safer, smoother routes.
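
As a toy version of that kind of contextual pattern, the sketch below groups hard-braking events by road segment and hour of day and flags hours that spike well above the segment’s average. The event data, the hourly bucketing and the 1.5x cutoff are invented, not Google’s detection logic.

```python
# Hypothetical sketch: flag hours of day when a road segment sees far more
# hard-braking than usual (e.g. westbound at sunset). Data and the 1.5x cutoff
# are invented; this is not Google's detection logic.
from collections import defaultdict

# Observed hard-braking events as (segment_id, hour_of_day) pairs.
events = [
    ("seg_42", 8), ("seg_42", 8), ("seg_42", 12),
    ("seg_42", 18), ("seg_42", 18), ("seg_42", 18),
    ("seg_42", 18), ("seg_42", 18), ("seg_42", 18),
]

counts = defaultdict(int)
for seg, hour in events:
    counts[(seg, hour)] += 1

def risky_hours(segment: str, spike_factor: float = 1.5) -> list:
    """Hours whose event count is well above the segment's hourly average."""
    hourly = {h: c for (s, h), c in counts.items() if s == segment}
    if not hourly:
        return []
    baseline = sum(hourly.values()) / len(hourly)
    return sorted(h for h, c in hourly.items() if c >= spike_factor * baseline)

print(risky_hours("seg_42"))  # [18]: consider offering an alternate route around 6 p.m.
```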

Using AI to go beyond driving

When you’re walking or biking or taking public transit, AI is also there helping you move along safely and easily. Last August we launched detailed street maps which show accurate road widths, along with details about where the sidewalks, crosswalks and pedestrian islands are in an area so people can better understand its layout and how to navigate it. Today, we announced that detailed street maps will expand to 50 more cities by the end of 2021. While this sounds straightforward, a lot is going on under the hood — especially with AI — to make this possible! 

A before-and-after comparison of detailed street maps built from satellite imagery

Imagine that you’re taking a stroll down a typical San Francisco street. As you approach the intersection, you’ll notice that the crosswalk uses a “zebra” pattern — vertical stripes that show you where to walk. But if you were in another city, say London, then parallel dotted lines would define the crosswalks. To account for these differences and accurately display them on the map, our systems need to know what crosswalks look like — not just in one city but across the entire world. It gets even trickier since urban design can change at the country, state, and even city level.

To expand globally and account for local differences, we needed to completely revamp our mapmaking process. Traditionally, we’ve approached mapmaking like baking a cake — one layer at a time. We trained machine learning models to identify and classify features one by one across our index of millions of Street View, satellite and aerial images — starting first with roads, then addresses, buildings and so on. 

But detailed street maps require significantly more granularity and precision than a normal map. To map these dense urban features correctly, we’ve updated our models to identify all objects in a scene at once. This requires a ton of AI smarts. The model has to understand not only what the objects are, but the relationships between them — like where exactly a street ends and a sidewalk begins. With these new full-scene models, we're able to detect and classify broad sets of features at a time without sacrificing accuracy, allowing us to map a single city faster than ever before. 


An image of Google Maps’ single-feature AI models

Single-feature AI model that classifies buildings.

An image of Google Maps’ full-scene AI models

Full-scene AI models that capture multiple categories of objects at once.
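
To make the layered-versus-full-scene distinction concrete, here is a deliberately tiny sketch. The 4x4 “tile,” the feature list and both stand-in “models” are invented; a real system would run learned vision models over Street View, satellite and aerial imagery.

```python
# Toy contrast between "one feature at a time" mapmaking and a single full-scene
# pass. The tile, labels and stand-in "models" are invented for illustration.

# A toy aerial tile: the label a perfect model would assign to each cell.
TILE = [
    ["road",     "road",     "crosswalk", "road"],
    ["sidewalk", "sidewalk", "crosswalk", "sidewalk"],
    ["building", "building", "sidewalk",  "sidewalk"],
    ["building", "building", "sidewalk",  "road"],
]
FEATURES = ["road", "building", "sidewalk", "crosswalk"]

def single_feature_pass(tile, feature):
    """Layered approach: one model per feature, one binary mask per pass."""
    return [[cell == feature for cell in row] for row in tile]

def full_scene_pass(tile):
    """Full-scene approach: one pass labels every cell, so relationships
    (where a sidewalk ends and a crosswalk begins) are resolved together."""
    return [row[:] for row in tile]

# Layered: one pass over the imagery per feature type.
masks = {f: single_feature_pass(TILE, f) for f in FEATURES}
print(len(FEATURES), "passes; crosswalk cells:",
      sum(row.count(True) for row in masks["crosswalk"]))

# Full-scene: a single pass yields every category and their adjacency.
scene = full_scene_pass(TILE)
print("1 pass;", scene[1][1], "meets", scene[1][2], "at row 1")
```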


Once we have a model trained on a particular city, we can then expand it to other cities with similar urban designs. For example, the sidewalks, curbs, and traffic lights look similar in Atlanta and Ho Chi Minh City, despite being over 9,000 miles apart. And the same model works in Madrid as it does in Dallas, something that may be hard to believe at first glance. With our new advanced machine learning techniques combined with our collection of high-definition imagery, we’re on track to bring a level of detail to the map at scale like never before.

AI will continue to play an important role as we build the most helpful map for people around the globe. For more behind-the-scenes looks at the technology that powers Google Maps, check out the rest of our Maps 101 blog series.

More from this Series

Maps 101

Google Maps helps you navigate, explore, and get things done every single day. In this series, we’ll take a look under the hood at how Google Maps uses technology to build helpful products—from using flocks of sheep and laser beams to gather high-definition imagery to predicting traffic jams that haven’t even happened yet.

View more from Maps 101

Get around and explore with 5 new Google Maps updates

by Oren Naim on May 19, 2021

From the very beginning, we built Google Maps to help you connect with the real world. In 2007, we introduced Street View, the first imagery platform to show you panoramic views of streets all over the world — from Tokyo to Tonga. A year later, we let you throw away your printed directions and get real-time navigation directly from your phone. And three years ago, we were the first to launch Live View and bring AR to navigation at scale. Thanks to our deep knowledge about the world and powerful AI advancements, we’ve spent the last 16 years bringing helpful information and experiences just like these to the map. Today at Google I/O, we’re announcing five new updates so you can more easily navigate, explore and get things done.  

Reduce hard-braking with routing updates

Imagine you’re driving to meet a friend. As you approach a busy intersection, the traffic slows suddenly and you have to slam on your brakes. According to research from experts at the Virginia Tech Transportation Institute, these hard-braking moments — incidents along a route that cause a driver to sharply decelerate — can be a leading indicator of car crash likelihood. Soon, Google Maps will reduce your chances of having hard-braking moments along your drive thanks to help from machine learning and navigation information.


Here’s how it works: Every time you get directions in Maps, we calculate multiple route options to your destination based on several factors, like how many lanes a road has and how direct a route is. With this update, we’ll take the fastest routes and identify which one is likely to reduce your chances of encountering a hard-braking moment. We’ll automatically recommend that route if the ETA is the same or the difference is minimal. We believe that these changes have the potential to eliminate 100 million hard-braking events in routes driven with Google Maps each year, so you can rely on Maps to get you from A to B quickly — but also more safely.
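
A minimal sketch of that selection rule: among the computed routes, prefer the one with the fewest predicted hard-braking events whenever its ETA matches the fastest option or is only slightly worse. The route data and the two-minute tolerance below are invented; Google hasn’t published its actual criteria.

```python
# Hypothetical sketch of the route-selection rule described above. Route data and
# the 2-minute ETA tolerance are invented; Google's actual criteria aren't public.
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    eta_minutes: float
    predicted_hard_brakes: float  # expected hard-braking events along the route

def pick_route(routes, eta_tolerance_min=2.0):
    """Prefer the route with the fewest predicted hard-braking events, as long as
    its ETA is within a small tolerance of the fastest option."""
    fastest = min(routes, key=lambda r: r.eta_minutes)
    candidates = [r for r in routes
                  if r.eta_minutes - fastest.eta_minutes <= eta_tolerance_min]
    return min(candidates, key=lambda r: r.predicted_hard_brakes)

routes = [
    Route("via Main St",   eta_minutes=18.0, predicted_hard_brakes=0.9),
    Route("via Highway 7", eta_minutes=17.5, predicted_hard_brakes=2.4),
    Route("via Hill Rd",   eta_minutes=25.0, predicted_hard_brakes=0.2),
]
print(pick_route(routes).name)  # "via Main St": nearly as fast, far fewer hard brakes
```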

Walk this way with enhancements to Live View and detailed street maps

If you’re getting around on foot, we’ve got you covered with augmented reality in Live View. If you’re exploring a new neighborhood, you’ll be able to access Live View instantly — right from the map — and see helpful details about the shops and restaurants around you, like how busy they are, recent reviews and photos. We’ll also display helpful new street signs for complex intersections so you know exactly what road you’re on and which way to go. And if you’re traveling, Live View will tell you where you are in relation to places like your hotel — so you can always find your way back to home base.


Our detailed street maps feature, which launched last August, will expand to 50 more cities by the end of this year — including Berlin, São Paulo, Seattle, and Singapore. With the help of AI and our understanding of cityscapes around the globe, you can see where sidewalks, crosswalks and pedestrian islands are, along with the shape and width of a road to scale. This information can help pedestrians plan the most accommodating route, especially if they’re using a wheelchair or stroller.

A GIF that shows what Google Maps looks like before and after detailed street maps

Detailed street maps are expanding to 50 more cities globally. 

Spot busy areas at a glance

Each day, more than 80 million people turn to live busyness information on Google for specific places to save time waiting in line and stay socially distanced during the pandemic. Now, this is expanding to show the relative busyness of an entire area, like whether a neighborhood or part of town is busier than usual. If it’s Saturday morning and you want to explore your city without crowds bogging you down, open up Maps to instantly see busy hotspots to avoid — like the streets near the local farmers’ market. On the flip side, if you want to check out popular parts of town, use area busyness to scope out lively neighborhoods at a glance to discover interesting things to do.

A GIF of Google Maps that shows that the area near the Spanish Steps in Rome is busier than usual

Use area busyness to quickly identify where crowded areas are in a city. 
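
A rough sketch of how individual places’ live busyness could roll up into an area-level “busier than usual” signal. The places, the typical-busyness baselines and the 1.25x cutoff below are invented for illustration, not Google’s method.

```python
# Hypothetical sketch of rolling individual places' live busyness up into an
# area-level "busier than usual" signal. Places, baselines and the 1.25x cutoff
# are invented for illustration.

# place -> (live_busyness, typical_busyness_for_this_hour), both on a 0-100 scale
farmers_market_block = {
    "farmers market": (92, 60),
    "coffee shop":    (75, 55),
    "bookstore":      (40, 38),
}

def area_busyness(places, busier_cutoff=1.25):
    live = sum(l for l, _ in places.values())
    typical = sum(t for _, t in places.values())
    ratio = live / typical if typical else 1.0
    label = "busier than usual" if ratio >= busier_cutoff else "as busy as usual"
    return ratio, label

ratio, label = area_busyness(farmers_market_block)
print(f"{ratio:.2f}x typical -> {label}")  # 1.35x typical -> busier than usual
```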

A map tailored to you

Having information about the world is useful, but it can quickly become overwhelming if it’s not delivered at just the right time. To help you make sense of it all, we’re tailoring our map to highlight the most relevant places based on time of day and whether or not you’re traveling. If you live in New York and open up Maps at 8 a.m. on a weekday, we’ll prominently feature nearby coffee shops — instead of dinner spots — so you can start your day with a caffeine fix. And if you’re on a weekend getaway, it’ll be easier to spot local landmarks and tourist attractions right on the map. Want more options? Tap on any place to see similar places nearby.

A GIF that shows coffee spots in NYC in the morning, and dinner spots in the evening

See relevant places based on time of day and whether or not you’re traveling. 
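
As a toy version of that tailoring rule, the sketch below chooses which place categories to feature from the local hour and whether the user appears to be traveling. The category lists and time windows are invented, not Google’s actual logic.

```python
# Hypothetical sketch of the tailoring rule described above: which place
# categories to feature given the local time and whether the user is traveling.
# The category lists and time windows are invented for illustration.

def featured_categories(local_hour: int, is_traveling: bool) -> list:
    if is_traveling:
        return ["landmarks", "tourist attractions", "hotels"]
    if 6 <= local_hour < 11:
        return ["coffee shops", "bakeries"]
    if 17 <= local_hour < 22:
        return ["restaurants", "bars"]
    return ["convenience stores", "parks"]

print(featured_categories(local_hour=8, is_traveling=False))  # weekday morning: coffee
print(featured_categories(local_hour=19, is_traveling=True))  # weekend getaway: landmarks
```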

No matter where you’re headed or what your plans are, Google Maps has the information you need along the way and the AI smarts to get you there. All of these features start rolling out globally on Android and iOS in the coming months, with detailed street maps coming to 50 new cities by the end of the year.
