First things first, I’m extremely happy to announce that I’ll be joining the SEOgadget team within the next couple of weeks. This really is massively exciting and I can’t wait to get started on working with a great bunch of people on some fantastic projects!
For my first post here on the SEOgadget blog I thought I’d write about something I’ve been trying out and getting a little bit addicted to: combining existing data and reports to squeeze more insight from the information you already have. I recently wrote about something similar over on State of Search, but this time I wanted to talk specifically about bringing together exports from Advanced Web Ranking and Screaming Frog. You could do the same thing with exports from similar tools, but these two in particular are absolutely fantastic and have definitely become part of my daily work, so for the purposes of this example I’ll be focusing on them.
It’s also worth pointing out that there are SEO tools that provide similar information, and some of them really are incredible. However, if you’ve already got ranking exports and crawl exports, why not make more use of them? The other benefit is that you can start to play around with things until you get the perfect set of data to suit your specific needs.
What’s the Story?
When you’re taking a look at a site for the first time it can sometimes be hard to know where to start. By looking at where certain sections of the site are ranking, and comparing that to technical data on the pages themselves, you can start to see whether there might be technical issues that explain poorly performing pages, or get an indication of what could be improved. This is by no means an exhaustive site auditing process, but it can definitely help improve your ‘bird’s eye’ view of a new project or help when revisiting a site with a fresh focus.
To help explain what can be done and how to go about it I’ve set up an example report based on a fake URL with fake ranking data. For the keyword selection, I’ve used a set of keywords to represent the head, mid-level and the long tail. If you were working on a new site and wanted a larger set of data, I’d suggest exporting non-branded keywords from Google Analytics.
For this example report, I’ve set it up to display the following information:
- Keyword Category (self assigned)
- Ranking URL
- Level of the Page Within the Site
- Status Code
- Number of Similar Pages
- Instances of Duplicate Source Code
- Number of Links Directed at the Page
- Number of Links on the Page
- Canonical Version of the Page
If you’re familiar with Screaming Frog, you’ll know that its capabilities are huge. Using this export plus other tools and APIs, you could easily extend your report to include information such as:
- Missing Meta Titles
- Missing Meta Descriptions
- Number of Missing Alt Attributes on the Page
- Page Authority
- Social Activity of the Page
- Search Volume
- Bounce Rate
- Conversion Metrics
That’s all pretty useful stuff that you can start comparing against rankings to try to ascertain why a website might be performing the way it is. The true beauty of using Excel to work this way is that you can include or cut whatever you want. If you wanted to extend it out and use other data, simply carry on the process explained below with the example. Taking this one step further, you’d probably want to start playing around with pulling in data via the SEOmoz API and the AdWords API. For now though, we’ll keep it simple and go through an example with a relatively small set of data. Perhaps a ‘Mega Report’ might be a potential post for the future?!
Building the Report
I’ve made a handy little screen demo to show you how I’ve started to put all this together, but because I’m no Steven Spielberg you might want to keep reading for a full list of instructions!
AWR Export – Unfortunately, VLOOKUP only ever returns the first match it finds, so it won’t play nicely with duplicate data. Because we’re going to be using this function quite a bit, you’ll need to de-duplicate both the keywords and URLs. The end result is one keyword per URL, but that’s cool because this report is giving you a bird’s eye view of a site’s positions and helping to compare rankings with on-page issues. Once you’re happy with the data you’ve got, move the keywords into the leftmost column and convert the data to a table (Ctrl + T).
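If you’d rather script the clean-up than do it by hand in Excel, the de-duplication step can be sketched in Python. This is just an illustration of the idea, not part of either tool’s workflow, and the column names (`keyword`, `url`, `position`) are made up for the example:

```python
def dedupe_rankings(rows):
    """Keep one row per keyword and one row per URL, preferring the
    best-ranking entry, mirroring the manual de-duplication of the
    AWR export described above."""
    seen_keywords, seen_urls, result = set(), set(), []
    # Sort by position so the best-ranking entry wins each dedupe.
    for row in sorted(rows, key=lambda r: r["position"]):
        if row["keyword"] in seen_keywords or row["url"] in seen_urls:
            continue
        seen_keywords.add(row["keyword"])
        seen_urls.add(row["url"])
        result.append(row)
    return result
```

The end result matches the Excel version: one keyword per URL, keeping the strongest ranking for each.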
Screaming Frog – The first thing I do with this export is remove any columns that I won’t need for the report. Once you’ve got everything you need, again convert the data into a table, as this will make working with it much easier. Once you’ve done that it’s really quick and easy to start filtering out the URLs you don’t need. If you don’t need to check anything other than the main URLs, simply filter everything else out by selecting text/html in the content type column.
The URLs – You’ll find that AWR and Screaming Frog export URLs slightly differently (e.g. with or without trailing slashes). In order for VLOOKUP to work when matching on a URL, you’ll need to make sure that both sets are formatted identically. A quick bit of ‘find and replace’ will resolve any discrepancies.
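That find-and-replace step is really just string normalisation. A quick Python sketch of the same idea, where the specific rules (lowercase, strip the protocol, strip trailing slashes) are examples — adjust them to whatever discrepancies your own exports actually show:

```python
def normalise_url(url):
    """Normalise a URL so the AWR and Screaming Frog versions of the
    same page compare as equal, like the find-and-replace fix-up."""
    url = url.strip().lower()
    # Strip the protocol so http/https differences don't break matching.
    for prefix in ("https://", "http://"):
        if url.startswith(prefix):
            url = url[len(prefix):]
            break
    # Strip any trailing slash.
    return url.rstrip("/")
```

Run both exports through the same function and VLOOKUP-style exact matching will behave.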
The Main Report
Once your exports are all sorted out you can start to build up your main report with the columns of data you require. The first step is to dump in your keywords, ranking URLs and the positions they are ranking in. To make filtering a bit easier I’ve assigned each keyword a tag of ‘main’, ‘category’ or ‘product’. That allows me to quickly look through and check each level of the site to see if any patterns start to emerge.
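I assign those tags by hand, but if you wanted a rough first pass you could infer a category from each ranking URL’s depth. A Python sketch under an assumed site structure (homepage = ‘main’, one folder deep = ‘category’, deeper = ‘product’ — your site will almost certainly differ):

```python
from urllib.parse import urlparse

def tag_by_depth(url):
    """Rough first-pass keyword/URL categorisation by path depth.
    The main/category/product mapping is an assumption about the
    site's structure; adjust to suit."""
    path = urlparse(url).path
    segments = [s for s in path.strip("/").split("/") if s]
    if not segments:
        return "main"
    return "category" if len(segments) == 1 else "product"
```

You’d still want to eyeball the results, but it saves typing on a large keyword set.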
To start combining the exported data, I’m using a combination of the VLOOKUP and COUNTIF functions. To give you an example of each, I’ll explain how I’ve populated the ‘Level’ column and the ‘Similar Pages’ column. It might take a little bit of playing around, but once you’ve got to grips with these two functions you can simply repeat the process to build out your report with the information you need.
To fill the ‘Level’ column I’m using the VLOOKUP function. This finds the match for the URL in my ranking report and pulls through the corresponding data for the same URL in the Screaming Frog export. The formula I’m using looks something along the lines of `=VLOOKUP(A2,'Screaming Frog'!A:H,8,FALSE)`, with the exact references depending on how your sheets are laid out.
However, it’s highly likely that your worksheet will be set up slightly differently. The easiest way to make sure the formula is correct is to enter =VLOOKUP( into the first cell of the ‘Level’ column and then hit Ctrl + A. Doing that brings up a handy little pop-up to walk you through the arguments:
Lookup_value: highlight the first URL in your main data table.
Table_array: highlight the Screaming Frog export.
Col_index_num: enter the column number within the table that you want to pull through the numbers from. In this case, ‘Level’ is the 8th column in the Screaming Frog export, so simply enter ‘8’.
Range_lookup: enter ‘FALSE’ to get an exact match.
Once completed, hit Enter; if your data is formatted as a table, the formula will automatically populate the entire column. Understanding how many steps a URL is from the homepage can be pretty useful for spotting potential architecture issues, and for seeing whether poorly ranking pages are buried deep within the site.
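If it helps to see what VLOOKUP is actually doing under the hood, here’s a rough Python equivalent of an exact-match lookup. This is purely illustrative — the table rows and column positions are made up, not taken from a real export:

```python
def vlookup(lookup_value, table, col_index):
    """Exact-match VLOOKUP (range_lookup = FALSE): scan down the first
    column of the table for lookup_value and return the cell in the
    col_index-th column (1-based, as in Excel)."""
    for row in table:
        if row[0] == lookup_value:
            return row[col_index - 1]
    return "#N/A"  # what Excel shows when there's no match
```

The `#N/A` case is worth noticing: any URL in your ranking report that isn’t in the crawl export will show up as an error, which is itself useful information.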
To fill the ‘Similar Pages’ column we’re looking for the number of pages that have the ranking keyword or phrase within the meta title. This should give us an idea as to whether duplication or keyword cannibalisation might be the cause of a page not ranking particularly well. For this we’ll be using the COUNTIF function rather than VLOOKUP, with a formula along the lines of `=COUNTIF('Screaming Frog'!D:D,"*"&A2&"*")`.
Simply paste this formula into the first cell in the ‘Similar Pages’ column and edit to make sure it’s selecting the correct values for your worksheet. In the example above, ‘Screaming Frog’ is my array table, ‘D:D’ is the meta title column within that table, and ‘A2’ is the keyword I’m looking to find a number of matches for.
You’ll notice that my keyword cell ‘A2’ has been wrapped with `"*"&A2&"*"`. If you don’t wrap the cell in this way, the COUNTIF function will return the number of meta titles that match the keyword exactly. However, we’re looking for the number of titles that include the phrase anywhere. By wrapping the cell in wildcard asterisks you’re telling the function to return the number of titles that contain the phrase; perfect!
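The wildcard-wrapped COUNTIF boils down to a substring count. A minimal Python sketch of the same check, using made-up titles; note that, like Excel’s text comparisons, it’s case-insensitive:

```python
def countif_contains(titles, phrase):
    """COUNTIF with "*"&phrase&"*" wildcards: count the titles that
    contain the phrase anywhere, not just exact matches.
    Lower-cased both sides, since Excel compares text case-insensitively."""
    phrase = phrase.lower()
    return sum(1 for title in titles if phrase in title.lower())
```

A count above 1 for a keyword is your cue to look at whether several pages are competing for the same phrase.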
Getting Quick Answers
The whole point here is to find some quick answers to ranking problems. When you’re taking a look at a new site, one of the best questions you can ask yourself is: ‘Why is this website ranking the way it does?’ Maybe certain sections of the site are ranking nicely whilst others are struggling; if that’s the case, then comparing all of this information directly against rankings should help you identify some answers.
Depending on what information you want, you can start to filter through all the data in any way you want. I’ve personally found it particularly useful to use the keyword categorisation to look at different sections and levels of a site, or to start filtering down via ranking positions to investigate the pages that are ranking lower than others. Find the problematic rankings, see if any onsite issues match up, and start building a quick list of fixes!
If you want some further reading on using Excel to manage your exported data, then you should definitely check out this great post about using SEO Spider Data in Excel.
I’d also recommend checking out some of these really good posts on using Excel for SEO:
- How to Build an Advanced Keyword Analysis Report in Excel
- Keyword Research – Using Categories to Make Your Process More Actionable
- Categorising Your Links with Excel and Open Site Explorer
- Using VLOOKUP to Match Keyword Volume & Rankings Data