How to make a Google Maps Lead Extractor

In this post I will give a brief overview of how to build a scraper for Google Maps. I will share small pieces of code, but the specifics really depend on how you plan to scrape and what language you will be using. I will mostly discuss theory, and you will need to work out the details yourself once you pick a programming language. With that said, let's get started talking about making a Google Maps lead extractor.

First off, we need to pick a language for our scraper. What you pick is entirely up to you and how you plan to use your scraper. Local Scraper is built using C# because that was the easiest way for me to make a desktop application. Python is super common for scrapers and I use it for many projects! But when it comes to making a UI and packaging for desktop use, it gets complex. Using C# allowed me to make my UI in Visual Studio and saved me a ton of headaches. With Python you will most likely be running your scraper from a console/terminal, and if you choose JavaScript it would be similar. I don't think any one language is better than the others, so just pick what works for you.

The main thing you will need is a browser remote-control system such as Selenium or Puppeteer. These are tools that let us take control of a copy of Google Chrome and send commands to the browser. Since Google Maps only loads in a browser, you have to use a browser to view it. Personally, I chose Selenium because it is what I am most familiar with; it has been around the longest and, because of that, supports the most languages. It was super easy to open NuGet in Visual Studio and grab Selenium for C#. I have also used Selenium with Python, and it's just as easy to install there. Puppeteer is the newer system and is based on NodeJS, but it has been ported to other languages as well; Puppeteer Sharp is the .NET port for C#.

Now that we have that done, we want our Google Maps extractor to open a copy of our browser and browse directly to Google Maps. This step is very straightforward and is covered in all the documentation. After that we want to be able to click on each listing and also go through all pages of results until there are no more pages.
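
As a rough sketch, here is what that first step might look like in C# with Selenium. I am assuming the Selenium.WebDriver NuGet package is installed along with a matching chromedriver, and the search URL is just an example:

using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

// Launch a visible copy of Chrome under Selenium's control.
var driver = new ChromeDriver();

// Browse straight to a Google Maps search. The query here is a placeholder.
driver.Navigate().GoToUrl("https://www.google.com/maps/search/plumbers+in+denver");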

To figure out the next-page button we will want to inspect the element in the Chrome dev tools and find a unique identifier that we can test for. In this case I found that the button's "jsaction" contains "pane.paginationSection.nextPage", so we can test for that using XPath. That looks something like this: .//button[contains(@jsaction,'pane.paginationSection.nextPage')]. So to make our Google Maps extractor click on the next page, we can test for this XPath. If it exists, then we know there is a next page, and we can tell our control system to click on it.

Pretty simple, until we get to the final page of results. This is where they trick the scraper: on the last page this next-page button will still exist in the source code! It will always test true and our scraper will get stuck. The difference is that on the last page the button adds disabled='true' to its code. We can use that to make a new test, which looks something like this: .//button[contains(@jsaction,'pane.paginationSection.nextPage') and @disabled='true']. Our new XPath is now testing for the button AND testing whether it is disabled. If it is disabled, then we know we are on the last page, or there simply is no next page.
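
Here is a hedged C# sketch of that paging check, continuing from the driver we opened above. The XPaths are the ones I found; Google can rename these identifiers at any time:

// XPath for the next-page button, and for the same button in its disabled state.
string nextXPath = ".//button[contains(@jsaction,'pane.paginationSection.nextPage')]";
string lastPageXPath = ".//button[contains(@jsaction,'pane.paginationSection.nextPage') and @disabled='true']";

// There is another page only if the button exists and is not disabled.
bool hasNextPage = driver.FindElements(By.XPath(nextXPath)).Count > 0
    && driver.FindElements(By.XPath(lastPageXPath)).Count == 0;

if (hasNextPage)
{
    driver.FindElement(By.XPath(nextXPath)).Click();
}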

Our Google Maps extractor can now go through all the pages, but how can it click on each listing? Again, we need to find a unique identifier by going over the source code in the inspector. Try to see what each listing has in common that you can click on; check the child elements like the business title, but also check the div that each listing is in. In our case what I found was class='a4gq8e-aVTXAb-haAclf-jRmmHf-hSRGPd'. This gibberish used to be normal text, but Google has for some reason obfuscated it to make it harder to read. Each listing has this class, so we can use that to click on each one. The main issue is that while we have this class, there is nothing else unique between the listings. This means we need to make a way to click on the 2nd and 3rd item ourselves.

What I ended up doing was counting how many times that class appeared in the search results source code. I saved this to a variable so I knew exactly how many listings were on the page, because this changes with each search. It's sometimes 2 listings, sometimes 5 listings; maybe a search has 5 full pages of results but the last page only has 2 listings. Each time you load a new page of results you need to rerun this counting system. Clicked next page? Wait a few seconds and then run the listing counter!
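
In C# with Selenium the count itself is a one-liner. Again, the obfuscated class name is the one I found, and it will almost certainly change over time:

// Count how many listings appear on the current page of results.
int listingCount = driver.FindElements(By.ClassName("a4gq8e-aVTXAb-haAclf-jRmmHf-hSRGPd")).Count;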

Now that we know how many results are on the page, we can make a loop to click on each listing. To do so we are going to make a loop that uses our counter variable; say it's 10, then we need to loop ten times. To click on the items we need to click on the class we found, but at the loop position. Because the browser indexes these elements starting at zero, loop position 1 clicks the second listing, position 2 clicks the third one, and so on. To do this I used Selenium's ability to run JavaScript in the browser to trigger a click. That code looks something like this: document.getElementsByClassName('a4gq8e-aVTXAb-haAclf-jRmmHf-hSRGPd')[counter].click(). You can see that I have added [counter] onto the class lookup so we can tell the lead extractor which listing to click.
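
Here is roughly how that loop looks in C#, using Selenium's JavaScript executor and the listingCount variable from above. The parsing and back-navigation inside the loop are covered next:

// Click each listing in turn. getElementsByClassName is zero-based,
// so counter 0 is the first listing, 1 is the second, and so on.
var js = (IJavaScriptExecutor)driver;
for (int counter = 0; counter < listingCount; counter++)
{
    js.ExecuteScript("document.getElementsByClassName('a4gq8e-aVTXAb-haAclf-jRmmHf-hSRGPd')[" + counter + "].click();");
    // Grab and parse the listing here, then navigate back (see the next snippet).
}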

After we click on a listing we want to grab the source code of the listing page, which we can then parse for the data we want. To get back to the results page we click the back button, or we have the browser run the back command. Then we can do the same for the rest of the listings. In the end we have a Google Maps extractor that will open the browser, go through the pages, and click on each listing one by one. Of course the final product will need to be a bit more complex, but this is the basic framework of what you want the extractor to do.
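
Those two steps, inside the loop from the previous snippet, are only a few lines. A fixed sleep is crude and Selenium's WebDriverWait would be more reliable, but it shows the idea:

// Give the listing page a moment to load, then capture the HTML for parsing.
System.Threading.Thread.Sleep(2000);
string listingHtml = driver.PageSource;
// Parse the business name, phone, address, and so on out of listingHtml here.

// Return to the results so the next loop iteration can click the next listing.
driver.Navigate().Back();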

If your business needs data extracted from Google Maps and you don't want to create your own program from scratch, Local Scraper can help! We have already done the work and built the scraper. Used by thousands of companies for over 8 years, you know you can trust us not to disappear overnight. Our Google Maps extractor gathers as much data as we can and is always kept up to date. If anything changes at Google Maps, and it will, know that we will be on the case to fix the issue and keep the program up and running. Head on over to the sales page if you want to know more; if not, good luck in your scraper building, and I hope this overview has helped your project.