Getting SEMrush API Data Into Google Sheets For Domain Acquisition Insights
Some links on this page are affiliate links that will earn us a commission if you decide to purchase their product. If you purchase via one of these links, be sure to comment below. We love being able to share the tools we love and hearing about their impact.
Enriching existing data sets with publicly available API data is a great way to build more detailed prospecting lists for potential domain acquisitions. There are many online resources for buying domains, but they may not provide the SEO insights you need to accurately analyze a domain's value.
Whatever your preferred source of SEO data, this guide will show you how to make an API call within Google Sheets to analyze domain targets to purchase.
Why would I want to buy a domain at auction?
SEO. If you are starting a new web project, there's no quicker way to accelerate your authority/backlink profile than by finding a relevant existing domain and analyzing its backlinks. By carefully mapping any legacy authoritative backlinks to your new domain after purchase, you can quickly add authority to your site and begin to challenge non-branded organic rankings.
Where are some good places to buy domains?
I’ve spent a lot of time looking at different domain marketplaces and comparing domain prices across them. NameCheap is the one place I end up recommending again and again. It has the lowest prices for any new domain and has a great marketplace too. We’ll use GoDaddy data below because it’s popular and very easy to quickly pull but the same process would apply for other registrars.
Do a Google search if you want more options. You'll find that these services often provide some of the SEO data you need, but perhaps not the level of insight you'd like. For example, exporting GoDaddy auctions data will give you Majestic's TF/CF alongside your domain data as well as the number of referring domains, but it won't give you details of follow/no-follow backlinks or the number of C-Class IPs.
We can get this data quickly with some SEMrush API credits, a Google Sheet, and the IMPORTDATA function.
How do I buy API credits for backlink data?
SEMrush API credits (affiliate link) are available to Business subscribers and can be purchased in blocks (about $54 for a million credits at the time of writing). There are other options available on the market too. DataforSEO offers API credits for their backlinks endpoints that start at $100 per month.
Using Google Sheets, we can take a list of domain sale data and enrich this list with SEMrush’s Backlinks API to get more insights about a domain’s potential SEO value. For this process, we will use an export of GoDaddy’s domains selling soon.
Step 1 – Grab a list of available domains from GoDaddy
Visit GoDaddy and export their latest list of domains expiring in the next 24 hours as a CSV, as shown below:
Step 2 – Import that list to a Google Sheet
Go to File > Import, and import your GoDaddy CSV to Google Sheets:
You'll see that while we get some useful data from GoDaddy, we might want to understand more about the backlink profiles of these domains, especially if we're going to be spending thousands of dollars on one. This is where the SEMrush API comes in, giving us access to richer backlink data so we can make a more astute decision about which domain is the best choice.
Step 3 – Build API query
SEMrush’s backlinks API provides the following example code for running the call we need:
https://api.semrush.com/analytics/v1/?key=YOUR_API_KEY&type=backlinks_overview&target=YOUR_TARGET_DOMAIN&target_type=root_domain&export_columns=ascore,total,domains_num,urls_num,ips_num,ipclassc_num,follows_num,nofollows_num,sponsored_num,ugc_num,texts_num,images_num,forms_num,frames_num
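The API returns plain text, semicolon-separated: one header row followed by one row of metrics for the target domain. Purely for illustration (these numbers are made up), a response looks something like this:

ascore;total;domains_num;urls_num;ips_num;ipclassc_num;follows_num;nofollows_num;sponsored_num;ugc_num;texts_num;images_num;forms_num;frames_num
34;15200;210;340;190;150;12800;2400;0;0;14900;300;0;0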
There are two variable values within this API call we need to amend in conjunction with our GoDaddy data:
- YOUR_API_KEY (you can get this from your SEMrush dashboard)
- YOUR_TARGET_DOMAIN (we will use a dynamic reference in Sheets with the domain name list)
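If you want to sanity-check your key before building anything in Sheets, you can paste the finished URL straight into a browser and you should get a plain-text response back. For example (using the same placeholder key as in the formulas below and example.com as a stand-in target):

https://api.semrush.com/analytics/v1/?key=123456789&type=backlinks_overview&target=example.com&target_type=root_domain&export_columns=ascore,total,domains_num,urls_num,ips_num,ipclassc_num,follows_num,nofollows_num,sponsored_num,ugc_num,texts_num,images_num,forms_num,frames_num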
In order to run the call we will use the IMPORTDATA function in Google Sheets, which takes three arguments (only the first is required):
- URL
- Delimiter
- Locale
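If you haven't used IMPORTDATA before, its general shape is shown below; the CSV URL, delimiter and locale here are just placeholders, and for this guide we'll only need the URL and the delimiter:

=IMPORTDATA("https://example.com/data.csv", ",", "en_US")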
Replace YOUR_API_KEY with your own API key value (and be sure to keep it secure). In the next available column on the first row that contains a domain, type =IMPORTDATA(, then paste your amended URL (in quotation marks) as the first argument:
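Before adding any dynamic references, the formula will look roughly like this, with the whole URL quoted, a placeholder key, and example.com standing in for a real target:

=IMPORTDATA("https://api.semrush.com/analytics/v1/?key=123456789&type=backlinks_overview&target=example.com&target_type=root_domain&export_columns=ascore,total,domains_num,urls_num,ips_num,ipclassc_num,follows_num,nofollows_num,sponsored_num,ugc_num,texts_num,images_num,forms_num,frames_num")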
Now we want to amend the rest of the query to include some dynamic references to our domain names. To do that we’re going to use CONCATENATE to append the strings to our fixed values where the domains reside (in this example in column B):
=IMPORTDATA(CONCATENATE("https://api.semrush.com/analytics/v1/?key=123456789&type=backlinks_overview&target=",B2,"&target_type=root_domain&export_columns=ascore,total,domains_num,urls_num,ips_num,ipclassc_num,follows_num,nofollows_num,sponsored_num,ugc_num,texts_num,images_num,forms_num,frames_num"))
And there you have it. A fully formed API request in Google Sheets. However, if you run the request now it will look something like this:
This is because we’ve not included a delimiter in the function (which should be the semicolon):
=IMPORTDATA(CONCATENATE("https://api.semrush.com/analytics/v1/?key=123456789&type=backlinks_overview&target=",B2,"&target_type=root_domain&export_columns=ascore,total,domains_num,urls_num,ips_num,ipclassc_num,follows_num,nofollows_num,sponsored_num,ugc_num,texts_num,images_num,forms_num,frames_num"),";")
By adding the semicolon (in quotations) to the function we will split out the API output:
That looks better. We don't need to add the third argument (locale) unless you want to use a different language and region locale code when parsing the data.
Step 4 – Refining the query and output
Now, all we need to do is copy down our formula to automate the calls for the following domains on our list. However, there’s a bit of a problem:
First, copy the response headers in your Sheet and paste them as values (CMD + SHIFT + V on a Mac) into the next free column on row 1 so they stay put. At the time of writing, this was the response header:
ascore, total, domains_num, urls_num, ips_num, ipclassc_num, follows_num, nofollows_num, sponsored_num, ugc_num, texts_num, images_num, forms_num, frames_num
Second, because the response headers are returned with every call, the data doesn't fit neatly into the columns adjacent to our domain data. To solve this, we can wrap the call in the QUERY function and use a small SQL-style clause:
=QUERY(IMPORTDATA(CONCATENATE("https://api.semrush.com/analytics/v1/?key=123456789&type=backlinks_overview&target=",B2,"&target_type=root_domain&export_columns=ascore,total,domains_num,urls_num,ips_num,ipclassc_num,follows_num,nofollows_num,sponsored_num,ugc_num,texts_num,images_num,forms_num,frames_num"),";"),"SELECT * OFFSET 1", 0)
Bear in mind when copying the code above that WordPress can convert straight quotation marks into curly ones, which Google Sheets won't accept, so you may need to amend them after pasting.
The additions are the QUERY wrapper and the "SELECT * OFFSET 1" clause, which skips the header row from our response (the final 0 tells QUERY not to treat any of the returned rows as headers):
A nice clean response. Now you can copy this down for as many rows as you wish, and you'll have all of this SEMrush data to help you decide which domain could be the one to bid on.
Step 5 – Choosing a domain
Now that you have the data, you can add a few more calculations and make a decision about which domain you want to purchase:
- Add a new column to calculate follows vs. nofollows as a percentage share, =follows_num/(follows_num+nofollows_num) (see the example formula after this list); a higher percentage means a greater proportion of follow backlinks (you may wish to then filter for domains that have a higher number of follow backlinks than nofollow)
- Add in some conditional formatting to highlight or filter the domains with the highest "ipclassc_num" values; this is a better count than referring domains as it only counts links from the same IP range once
- Sort by "ascore" or "Majestic TF", as you will likely want to see the highest-authority domains at the top
- Always be sure to use the Web Archive to see if the domain you are reviewing was ever littered with spam (lots of ads, a link directory, not in the desired language, offensive material, etc.)
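As a sketch of the follow-ratio calculation from the first bullet, assuming the API output landed so that follows_num is in column J and nofollows_num is in column K (adjust the references to wherever those columns sit in your Sheet), the formula for row 2 would be:

=J2/(J2+K2)

Format the column as a percentage and copy it down alongside the rest of your formulas.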
Additional Considerations
You may choose to filter the domains in your original list so you don’t have to make as many API calls. Credits cost money so you’ll probably not want to be profligate with them.
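For example, you could build a pre-filtered list on a second tab and only run the API formula against that. A rough sketch, assuming your export sits in columns A to F with Majestic TF in column E (adjust the range, column, and threshold to match your own data):

=FILTER(A2:F, E2:E >= 10)

Then point your IMPORTDATA formulas at the filtered tab rather than the full export.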
Summary
Finding a domain that has a relevant and clean backlink profile can save a lot of time and money in building those backlinks in other ways. Thousands of domains are dropped each day so with some research and perseverance it’s very possible to find a gem relevant to your website.
If you have any questions then feel free to comment below.