SEO tools are widely available today, and many are even free. While these tools may offer helpful insights, few of them deliver genuinely actionable data points.
Here at Locomotive, we believe in making data-driven decisions. That is why we have developed a variety of tools that provide a deeper look into your data than you will find elsewhere.
Let’s face it, due to stretched resources and budgets, making the case for new work can be an arduous process. Our tools offer actionable data points and can greatly reduce what would be manual work at other agencies. This enables our clients to stretch their budgets further, substantially lowering billable hours without compromising quality. With new data available, you will also have the necessary business cases for any recommended website optimizations.
To evaluate the impact of algorithm updates, redesigns and navigational changes on rankings, we’ve developed the ability to heatmap site clicks and/or position ranking data, giving us a clear view of any large-scale shifts in site performance.
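The heatmapping itself starts with a simple pivot: rows of weekly click or position data are reshaped into a grid with time on one axis and position buckets on the other. The sketch below is illustrative (the bucket labels and rendering step are assumptions, and plotting with a library such as matplotlib is left out); it only builds the matrix a heatmap would be drawn from.

```python
def ranking_heatmap(rows):
    """Pivot (week, position_bucket, clicks) rows into a grid for
    heatmapping: weeks on one axis, SERP position buckets on the
    other. Rendering is left to a charting library; this builds
    the underlying matrix.
    """
    weeks = sorted({w for w, _, _ in rows})
    buckets = sorted({b for _, b, _ in rows})
    grid = [[0] * len(buckets) for _ in weeks]
    for w, b, clicks in rows:
        grid[weeks.index(w)][buckets.index(b)] += clicks
    return weeks, buckets, grid

rows = [
    ("2024-W01", "1-3", 120),
    ("2024-W01", "4-10", 40),
    ("2024-W02", "1-3", 90),
]
weeks, buckets, grid = ranking_heatmap(rows)
```

A sudden dimming of an entire row or column in the rendered grid is exactly the kind of large-scale shift an algorithm update or redesign produces.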
When optimizing, prioritizing and monitoring a site, it’s critical to understand the performance of key service areas and products. Categorizing and organizing all ranking queries for a site requires extensive resources, so we developed a custom clustering algorithm to build semantic keyword categories for a site’s ranking queries. This gives us a view of keyword growth by relevant keyword category. For example, if a site ranks for 1,500 unique queries related to “cloud computing,” our algorithm offers better understanding of these keywords by clustering them under the same semantic category. Performance is then gauged on a category level, instead of having to look at individual keyword rankings. This can be much more informative when building strategies.
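The clustering algorithm itself is proprietary, but the core idea can be sketched with a simple token-matching approach: assign each query to a seed category whose terms it contains. This is a deliberately minimal stand-in; a production system would use embeddings or co-occurrence data rather than exact token matching.

```python
from collections import defaultdict

def cluster_queries(queries, seed_terms):
    """Group ranking queries under seed categories by token overlap.

    Simplified illustration only: a real semantic clustering
    algorithm would handle synonyms and word order, e.g. via
    embeddings, rather than requiring every seed token to appear.
    """
    clusters = defaultdict(list)
    for query in queries:
        tokens = set(query.lower().split())
        for seed in seed_terms:
            if set(seed.lower().split()) <= tokens:
                clusters[seed].append(query)
                break
        else:
            clusters["uncategorized"].append(query)
    return dict(clusters)

queries = [
    "cloud computing services",
    "best cloud computing providers",
    "managed it support",
]
categories = cluster_queries(queries, ["cloud computing", "it support"])
```

With queries rolled up this way, growth can be tracked per category ("cloud computing" up 12% month over month) instead of across 1,500 individual keywords.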
To help clients plan and budget correctly for the upcoming year, we developed a tool using Prophet (open source software from Facebook) that can predict future traffic. This gives clients insight into yearly and weekly seasonality, while providing us with predicted monthly session numbers and the overarching, high-level trends.
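Prophet handles the full decomposition into trend plus yearly and weekly seasonality; installing and fitting it is beyond a short sketch, but the weekly-seasonality idea can be illustrated with a back-of-the-envelope index: each weekday's average sessions relative to the overall daily average. This is not Prophet's method, only a rough stand-in for the kind of signal it surfaces.

```python
from statistics import mean

def weekly_seasonality(sessions_by_day):
    """Seasonality index per weekday: that weekday's average
    sessions divided by the overall daily average. Input is a
    list of (weekday, sessions) pairs, weekday 0 = Monday.
    Prophet fits trend + yearly + weekly components in a proper
    model; this is only an illustrative approximation.
    """
    overall = mean(s for _, s in sessions_by_day)
    indices = {}
    for day in range(7):
        daily = [s for d, s in sessions_by_day if d == day]
        if daily:
            indices[day] = mean(daily) / overall
    return indices
```

An index above 1.0 marks an above-average day; a client whose Mondays index at 1.4 can budget ad spend and content releases accordingly.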
Using Google Analytics, we pull down all the pages your site is ranking for and the clicks they have generated over a long period of time, usually 3 to 4 years. We can then identify the pages that have lost the largest number of clicks over time. This helps us see pages that were once doing well but have now significantly dropped off and should be prioritized for optimization.
This tool shows us the pages on your site that are ranking highly for relevant terms, but users are not clicking on them in search results. This data is then coupled with page-level query data, which provides clues on how to best optimize the title tag and meta description of the page.
Using this process, we can align on-page elements with the user's intent, thereby improving click-through rates. Tools like this help us identify low-hanging-fruit opportunities for your team to focus on.
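The underlying check is straightforward: compare each query's actual click-through rate against a benchmark CTR for its ranking position, and flag pages that fall well short. The benchmark values and the "half of expected" threshold below are illustrative assumptions, not Locomotive's actual parameters.

```python
def ctr_opportunities(rows, expected_ctr, min_impressions=100):
    """Flag (page, query) pairs that rank well but underperform the
    typical click-through rate for their position.

    `rows`: (page, query, position, impressions, clicks) tuples, as
    exported from Google Search Console.
    `expected_ctr`: benchmark CTR keyed by rounded position,
    e.g. {1: 0.28, 2: 0.15} (illustrative values).
    """
    flagged = []
    for page, query, pos, imps, clicks in rows:
        if imps < min_impressions:
            continue  # not enough data to judge CTR reliably
        benchmark = expected_ctr.get(round(pos))
        if benchmark is None:
            continue
        actual = clicks / imps
        if actual < 0.5 * benchmark:  # under half the expected CTR
            flagged.append((page, query, round(actual, 3), benchmark))
    return flagged
```

Each flagged pair then gets a manual look at its title tag and meta description alongside the page-level query data.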
Internal linking is a very time-intensive process that involves multiple steps in order to be done effectively. At Locomotive, we have developed a proprietary algorithm that streamlines this process, reducing manual time by 75%. This, of course, helps stretch our clients' budgets further.
The tool pulls Google Search Console data from the past 30-60 days and applies a score to each page-query pair. We can then estimate which pages would benefit from moving a query up a few positions in the SERPs. The tool also identifies relevant site pages that are good candidates to link back to target pages and reviews n-grams within the site content. We can then manage content editing more efficiently through a manual review process.
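The scoring formula is proprietary, but its shape can be sketched: queries stranded just off page one with high impression volume have the most to gain from additional internal links, while queries already in the top positions (or buried deep) score low. The position window and weighting below are illustrative assumptions only.

```python
def link_opportunity_score(impressions, position):
    """Score how much a (page, query) pair could gain from extra
    internal links. Illustrative heuristic, not Locomotive's actual
    formula: only positions ~4-20 are in play, and opportunity
    scales with impressions, decaying with ranking depth.
    """
    if position < 4 or position > 20:
        return 0.0
    return impressions * (1.0 / position)

pairs = [
    ("/services/cloud", "cloud migration", 1000, 11),
    ("/blog/intro", "what is the cloud", 5000, 2),
    ("/services/backup", "backup services", 800, 6),
]
ranked = sorted(pairs, key=lambda p: link_opportunity_score(p[2], p[3]), reverse=True)
```

The already-ranking page-two query outranks the position-2 query here, which matches the intuition: links should push winnable queries onto page one, not pad pages that already rank.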
Most marketers or business owners could tell you what keywords they want to be ranking for in search. However, in order to truly see results, you also need to understand which keywords will attract the right audience and lead to conversions.
We developed a process that combines data from Google Search Console with data from Google Analytics. This allows us to see which keywords are converting when used in page titles. Once the converting keywords have been identified, the analysis uses natural language processing and a content-generation algorithm to write new page titles.
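The identification step can be sketched simply: tally which title words occur across pages that actually convert. The data shape here (title, conversion count) is an assumed simplification of the joined GSC/GA export, and the real pipeline's NLP title-generation stage is well beyond this illustration.

```python
from collections import Counter

def converting_title_keywords(pages, min_conversions=5):
    """Count title words that appear across converting pages.
    `pages`: (title, conversions) pairs joined from Google Analytics
    landing-page data; Google Search Console supplies the queries.
    Illustrative only - stopword removal and phrase extraction
    would be needed in practice.
    """
    counts = Counter()
    for title, conversions in pages:
        if conversions >= min_conversions:
            counts.update(set(title.lower().split()))
    return counts

keywords = converting_title_keywords([
    ("Cloud Backup Pricing", 10),
    ("Cloud Backup Guide", 8),
    ("About Us", 2),
])
```

Words that recur across converting titles ("cloud", "backup" above) become candidates for new titles on underperforming pages.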
Duplicate content is an issue that many websites face, particularly larger sites. Unfortunately, this can be a difficult problem to tackle, and many traditional formulas and algorithms do not scale effectively to large numbers of pages.
Locomotive uses a unique approach: we crawl pages of content, divide the words within that content into shingles, hash the strings into integers, and use Jaccard similarity to find the page with the closest matching content. This yields a match score, which allows us to prioritize the pages with the biggest duplicate-content issues.
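The shingle-and-Jaccard pipeline described above can be sketched directly; the shingle length (five words) and the CRC32 hash are illustrative choices, not necessarily those used in production.

```python
import zlib

def shingles(text, k=5):
    """Split text into overlapping k-word shingles and hash each
    shingle to an integer, so pages compare as sets of ints rather
    than raw strings."""
    words = text.lower().split()
    return {
        zlib.crc32(" ".join(words[i:i + k]).encode())
        for i in range(max(1, len(words) - k + 1))
    }

def jaccard(a, b):
    """Jaccard similarity: |intersection| / |union| of shingle sets."""
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

def best_match(page_text, candidate_texts, k=5):
    """Return the candidate whose shingle set best matches the page -
    its Jaccard score is the page's match score."""
    target = shingles(page_text, k)
    return max(candidate_texts, key=lambda c: jaccard(target, shingles(c, k)))
```

Because pages are reduced to small sets of integers, comparisons stay cheap even across very large sites, which is exactly where character-level diffing breaks down.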
We believe in the power of quality internal linking and are always looking for ways to improve internal linking for our clients. This is why we developed a process that determines the difference in link equity before and after our optimizations to internal linking. By incorporating page-level data pulled in via the Moz API with data analyzed in Python, we are able to provide valuable PageRank and CheiRank metrics.
Our PageRank Comparison tool lets us see how the PageRank algorithm has shifted. Based on the results of the comparison, we are able to make decisions on where to move and adjust internal links, which can generate increases in PageRank.
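The before/after comparison can be sketched with a classic power-iteration PageRank over the internal link graph; the example graph and the standard 0.85 damping factor are illustrative, and the production tool layers Moz API data on top of this kind of computation.

```python
def pagerank(links, damping=0.85, iters=50):
    """Power-iteration PageRank over an internal link graph.
    `links` maps each page to the list of pages it links to.
    CheiRank is the same computation run on the reversed graph.
    """
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        nxt = {p: (1.0 - damping) / n for p in pages}
        for page in pages:
            outs = links.get(page, [])
            if outs:
                share = damping * rank[page] / len(outs)
                for target in outs:
                    nxt[target] += share
            else:  # dangling page: spread its rank evenly
                for target in pages:
                    nxt[target] += damping * rank[page] / n
        rank = nxt
    return rank

# Hypothetical three-page site, before and after adding one internal link.
before = pagerank({"home": ["a", "b"], "a": ["home"], "b": ["home"]})
after = pagerank({"home": ["a", "b"], "a": ["home", "b"], "b": ["home"]})
delta = {p: after[p] - before[p] for p in before}
```

Here page "b" gains rank after "a" starts linking to it; in the real tool, those per-page deltas across thousands of URLs tell us where link adjustments actually moved equity.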
Connect with us.
Get in touch with us today to talk with our experts and learn how we can best benefit you and your business.