Any good marketing plan is incomplete without an SEO audit. And as part of our Ladder 2017 Marketing Series, we did just that.
There are tons of amazing SEO guides out there, and we’re not trying to build a comprehensive guide here. If you are interested in learning more about SEO, you can’t go wrong with Moz’s SEO Beginner’s Guide or anything on the Backlinko blog. If you want a really in-depth (paid) guide, we highly recommend Annie Cushing’s Site Audit Guide; as a free alternative, check out Annie’s Site Audit Checklist.
For this guide, however, we’re just tackling the basics. These are the simple things we would want to check for a new client to make sure the foundation is there before we started getting serious about organic search as a channel.
As always with our content, we’re not aiming to be comprehensive and hold your hand through exactly what to do. That approach wouldn’t work anyway, as every site is different. Our hope is that by showing you our thought process and how we do things, you’ll be able to adapt it to your specific situation.
Note: there may be important sections here that we don’t cover because we didn’t have anything particularly wrong with them (e.g. mobile usability, manual actions). We omitted anything uninteresting for brevity, as this guide was already pretty large and you wouldn’t have learned anything additional from inclusion. We only audited the main Ladder.io website, as blog.ladder.io was already pretty well optimized from an SEO point of view.
When doing an SEO audit, there are three major areas to cover.
It’s important for Google to be able to read your website and know what’s going on. Often, however, websites are built in a way that gives Google conflicting messages, or actively stops it from finding the information it needs to decide how to rank you. Accessibility issues affect more than just Google; fixing them often improves user experience too.
This is the name of the game in Search. Google is the best search engine because it serves the most relevant results; that’s what keeps users coming back. Making sure your site content is as relevant as possible to what people are searching for is key to ranking and generating traffic from Google.
You can be openly accessible, and perfectly relevant, but still not rank. The reason? Lack of trust. Just like a banker assessing your credit for a loan, or an employer assessing your fit for a role, Google is carefully assessing your website’s link profile. If nobody links to you, or nobody links to the people who link to you, you’re dead in the water with no organic search traffic.
Let’s cover each one in order.
Note: to conduct an audit like this yourself, you’ll first need to set up Google Search Console and connect it to your Google Analytics account. A lot of SEO audits also include keyword research, but we did this as a separate activity in our marketing plan.
If we want to rank organically on Google, we need to make sure they can crawl our site without running into errors. The good news is that Google does a great job of telling you what’s wrong, and how to fix it, in their (free) tool, the Google Search Console (formerly Webmaster Tools).
First, let’s take a look at the dashboard you get when you log in.
On the left, we can see that we don’t have a DNS error, we have good server connectivity, and Google has found our robots.txt.
This is a file that Google (and other search engines/bots) looks to for instructions on which pages to crawl and which to ignore. You can block certain pages to tell Google you don’t want them crawled, which can be advantageous if you don’t want those pages to show up in search engines, or if those pages are causing issues that might lead to a penalty.
The simplest version, which allows all bots to visit all pages, looks like this:
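```txt
# allow-all robots.txt: applies to every bot; an empty Disallow blocks nothing
User-agent: *
Disallow:
```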
You need to get your developer to save it at yourwebsite.com/robots.txt.
If you wanted to disallow every page on a domain (for example, if we didn’t want Google to crawl or rank our client portal), we’d do something like this:
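```txt
# saved as robots.txt on the subdomain itself (e.g. the client portal):
# a bare "/" blocks every page for all bots
User-agent: *
Disallow: /
```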
…and save it in the right place on that subdomain.
If we wanted to disallow just a single page, like our terms and conditions, it’d look like this:
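```txt
# the /terms path is illustrative; use the page's actual URL
User-agent: *
Disallow: /terms
```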
You can also disallow entire folders:
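```txt
# a trailing slash blocks everything under the folder (example path)
User-agent: *
Disallow: /private/
```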
So what do we get when we visit Ladder’s robots.txt?
Oh… a blank page. Looks like something went wrong. Although Google said it had a robots.txt for us, when we go to the robots.txt tester, it’s showing an error.
Maybe we forgot to put the code on there, or perhaps it was taken off by mistake, but we should be adding a robots.txt as a key requirement in our plan.
In the Console you can see when the error occurred; in our case, it looks like this was just never added in the first place.
A robots.txt isn’t necessarily an urgent issue, but it’s good practice to have one. When we’ve created a robots.txt we’ll make sure to come back to the console and test it to make sure we aren’t accidentally blocking URLs we want to be indexed.
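The Search Console tester is the source of truth, but you can also sanity-check a draft locally with Python’s built-in robots.txt parser. The rules below are a hypothetical draft that blocks playbook search-results pages:

```python
from urllib import robotparser

# Sketch: check a robots.txt draft doesn't block URLs we want indexed.
rules = """\
User-agent: *
Disallow: /playbook?query=
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# pages we want indexed stay crawlable
assert rp.can_fetch("*", "https://ladder.io/")
assert rp.can_fetch("*", "https://ladder.io/playbook")
# the search-results pages are blocked
assert not rp.can_fetch("*", "https://ladder.io/playbook?query=tag:zigzag")
```

If any assertion fails, the draft is blocking (or failing to block) something you didn’t intend.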
Note: we saw that ladder.io/robots.txt resolved to a blank page, even though that page didn’t really exist. In fact, any page that doesn’t exist still resolves to a blank page.
This is a big problem from an SEO point of view, not to mention the poor user experience. We should make sure that a request for any page that doesn’t exist redirects to the homepage.
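How this is implemented depends on the server; as a sketch, if the site were served by nginx, a catch-all could look like this:

```nginx
# Sketch only: send any request that would otherwise 404 to the homepage.
server {
    listen 80;
    server_name ladder.io;

    location / {
        # serve the file or directory if it exists, else fall through
        try_files $uri $uri/ @notfound;
    }

    location @notfound {
        return 302 /;   # redirect missing pages to the homepage
    }
}
```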
Looking back at the dashboard, we can see Google is warning us we have some crawl errors; 44 ‘Server Errors’ and 88 ‘Not Found’. Let’s click into Crawl Errors to take a look.
This is a list of pages that Google found, and tried to crawl, but was unsuccessful. This can happen if you had a lot of pages previously indexed from an old version of your site.
It can also occur due to users entering URLs that mistakenly resolve instead of hitting a 404 error (the issue we just spotted when looking for /robots.txt).
We can actually just mark all of these as fixed; chances are most of them won’t come back up, and when we fix the previous issue (resolving all URLs that don’t exist to the homepage) these errors will stop occurring entirely.
Under ‘Not found’ we’re seeing a different issue. In our public tactic database, we seem to have some tactics that no longer exist, that have changed names, or that have appeared with a ‘-1’ at the end of the URL.
If we wanted to go for an elegant solution, we could write code that 301 redirects the outdated URL to the new page, but that might not be feasible within our resource constraints.
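The implementation depends on the stack; as a sketch, if nginx served the site, a single outdated URL could be redirected with a rule like this (the URLs are made up for illustration):

```nginx
# Hypothetical example: permanently redirect the outdated '-1' variant
# of a tactic URL to the current page.
location = /playbook/example-tactic-1 {
    return 301 /playbook/example-tactic;
}
```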
Alternatively, the previous fix we talked about (redirecting all non-existent URLs to the homepage) will be good enough in this case.
While we’re in the crawl section, let’s take a look at crawl stats.
This tells us how often Google is crawling our site, and how many pages it hits. You’ll see this can be really low when you’ve just launched and don’t have many links to your site. However, for a site of our size (~500 pages), this looks really healthy.
These stats become much more important as the size of your site grows to thousands of pages. At some point you might want to exclude certain less important categories to ensure your more important pages get a bigger share of your ‘crawl budget.’
How does Google know which pages to crawl? We help it along, by submitting a Sitemap. A Sitemap is just a special list of all the pages on your site.
There are lots of plugins for creating one if you use a popular blogging platform like WordPress, but because our site is custom, we made our own. You can usually find the sitemap in the same way you find a robots.txt file: for example, ours is at https://ladder.io/sitemap.xml
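A sitemap is just an XML file following the sitemaps.org protocol; a minimal one looks like this (the /playbook URL is illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://ladder.io/</loc>
  </url>
  <url>
    <loc>https://ladder.io/playbook</loc>
  </url>
</urlset>
```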
It’s important to make sure that there are no pages in this sitemap that don’t exist anymore; for example, our ‘success stories’ and ‘case studies’ pages are a relic of the old website and should be removed.
Within Google Search Console, we can see when the sitemap was last submitted/updated.
We can also see how many of the submitted pages were actually indexed, and if there were any errors found at all.
In the index status report, we can see how many pages total we’ve had indexed. Remember Google will often find pages that aren’t in your sitemap (more on that to come shortly).
If we want to check what pages are indexed, we can just Google it using the convention “site:ladder.io”. This returns all the pages Google finds on your site.
You’ll notice it says 1,590 results here, which doesn’t jibe with the 887 number we saw in Search Console. That’s because this number is just an estimate; it includes a lot of duplicate values and can be pretty inaccurate for smaller websites. To find the real number, keep clicking through pages until you get to the final page of results.
270 results actually showing from 887 indexed pages is in the right ballpark. Looking through the results, you might find pages you don’t actually want indexed.
For example, some of our playbook’s search results pages are getting indexed (/playbook?query=tag:zigzag), probably because we link to them via the blog. Typically, search results pages don’t perform well in Google’s index (it doesn’t want the competition!), so it might make sense to de-index these in our robots.txt and save the crawl budget for other pages.
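If we do de-index them, the rule is a one-liner in robots.txt; Googlebot treats Disallow rules as prefix matches, so this covers every query variation:

```txt
User-agent: *
# block playbook search-results pages like /playbook?query=tag:zigzag
Disallow: /playbook?
```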
But what if we don’t have a sitemap? How does Google navigate your site then? Just like a user would: by clicking on links in your navigation and body copy.
It’s important to check this list and ensure that these really are the top pages on your site. Otherwise, you’re sending the wrong signal to Google, and potentially aren’t helping your users find your top pages either.
So how does Google know your site exists in the first place? Through the grace of the other websites that link to you.
We can dig into these links and see where they are linking from. For example here are all the links from Quora questions I’ve answered that go to our homepage.
This report doesn’t tell us much about the value of those links; for example, links from Quora are ‘no-follow’ links by default, and therefore don’t pass much SEO juice. We’ll touch more on this in the ‘Authority’ section.
This is definitely a problem; all of these pages are completely different in purpose, in the keywords we’re targeting, and in the content we’re showcasing, yet we’ve given them all the exact same description. This will seriously hurt the clickthrough rate of our pages showing in the SERP; we’ll cover this more in the Relevancy section.
If we want to do a more in-depth HTML check, we can run pages through the W3C Markup Validation Service.
This combs through the HTML on your site and checks it complies with best practices and that there aren’t any errors.
You can see for our HTML there isn’t anything really major. We should fix the hreflang attribute, as that’s what tells Google what language our site is in, and the duplicate ID can cause problems, particularly if we want to run a conversion test or track clicks on that ID.
You should just go down this list and fix what you can if possible. Most of this stuff won’t be urgent from an SEO perspective, but most of it is easy to fix. If you aren’t sure what something means, you can usually Google the error and find out more information.
While we’re on HTML, it usually makes sense just to look at the source code and see how the page is put together. The first thing to check for is the H1 tag; there should only be one on the page, and it should be descriptive of our business and keyword rich.
In our case, the main hero copy fulfills that role and talks about growing businesses. This could probably be improved from a keyword point of view, but we should only cater to Google when it isn’t at the expense of what works from a user perspective.
The supporting copy underneath the H1 is actually an H5 tag when it should be a P tag.
If we scroll further down, we can see that each section has its own H1 tag; these should be H2s.
…but the sections at least follow the right P tag convention.
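Taken together, these heading fixes would leave the markup looking something like this (the copy is illustrative, not our actual text):

```html
<!-- one H1 per page, supporting copy in <p>, section titles demoted to <h2> -->
<h1>Grow your business without the guesswork</h1>
<p>Supporting copy belongs in a paragraph tag, not an h5.</p>

<section>
  <h2>Section title (previously an h1)</h2>
  <p>Section body copy in p tags, as it already is.</p>
</section>
```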
Another useful thing to check is the images. If their img tags don’t have alt text, they won’t show up in Google Image Search, and Google won’t know what the images are relevant to, so they won’t boost the page’s relevancy.
Unfortunately, it looks like all the images on our homepage are drawn on programmatically rather than loaded as image files; this means they confer no SEO value. We’d potentially gain some SEO benefit from converting these, but it would be at the expense of the visual design, so it probably isn’t worth it in this case.
If we look at the tactic page though, we can clearly see room for improvement.
These are proprietary, branded images that should stand out on the Google Image Search page, but they’ll never rank for these highly targeted, long-tail niche keywords because the alt text is always ‘Tactic Image’.
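The fix is a one-line change per image: descriptive, keyword-rich alt text instead of the generic label (the filename and copy here are made up for illustration):

```html
<!-- before: tells Google nothing -->
<img src="/images/tactic-123.png" alt="Tactic Image">

<!-- after: describes what the tactic actually is -->
<img src="/images/tactic-123.png"
     alt="Zigzag retargeting tactic: split audiences by engagement stage">
```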
Additionally, we can see that the tactic name is in an H2 tag (should be H1).
We can see a title with the tactic name in it, which helps.
And although there’s no explicit meta description tag, it seems to be pulling the right description on the SERP (p.s. – first organic position!).
In this case, Google could be pulling it directly from the page, but we’d do well to make this explicit via the meta description tag.
One other thing we should check for is OG (Open Graph) tags. These help show the right content whenever a user shares a page on social.
In our case, these are really messed up and doing more harm than good. Instead of the tactic description, we’re getting a generic (and out-of-date) Ladder description, and instead of that nicely designed proprietary tactic image, we’re getting a generic Ladder background image.
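To fix this, each tactic page would declare its own OG tags in the head, something like the following (the values are illustrative):

```html
<meta property="og:title" content="Zigzag Retargeting - Ladder Growth Tactic">
<meta property="og:description" content="The tactic's own description, not the generic Ladder one.">
<meta property="og:image" content="https://ladder.io/images/tactics/zigzag.png">
<meta property="og:url" content="https://ladder.io/playbook/zigzag-retargeting">
```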
The last but most important thing we’ll cover is page speed. This has been shown to be a major ranking factor, and it also improves user experience and increases conversion rates. Helpfully, Google provides a free page speed test tool; let’s see how we stack up.
There are a few things to work on here, but nothing too dreadful. We mostly need to get smarter about how we store and deliver assets like images and CSS. You can click ‘Show how to fix’ to get more information.
You can also click a link straight through to Google’s developer docs; a useful resource to send to your product team so they know what to do to fix it.
If you want a second opinion, head over to Pingdom, who offer a slightly different speed test.
You can see they found a lot of the same stuff, but the total load time and ‘faster than’ metrics are really useful context, as is the ability to test from different areas of the world.
You can obviously go a lot further than this. There are amazing technical SEO tools out there like Screaming Frog, which can help you take an accessibility audit to the next level. However, if you’re doing your own SEO audit, chances are you don’t need anything that advanced, and can stop once you’ve gone through the basics in Google’s Search Console.
To do well in organic search, we need to make sure we’re relevant to what users are searching for. There are two ways to approach this: 1) target a set of keywords and try to move the needle on those by creating relevant content, or 2) see what we’re already ranking for and make changes to our content to improve our ranking.
As we’re saving keyword research for later in the plan, we’ll tackle #2 in this section, but all the same methods still apply once you have your target keyword set. The benefit of starting with approach #2 is that we’re not fighting an uphill battle; we already rank for these keywords, in some cases near the first page of results; any improvement will have a disproportionate effect.
Google Search Console, as ever, is uniquely helpful in seeing what keywords lead to clicks to your website. If you head over to the Search Analytics report you can see the overall trend in impressions, clicks, CTR, and avg. position.
Scrolling down, we can see impressions and clicks for the actual queries we showed up for.
As is often the case, the majority of our organic search traffic comes from brand keywords. Nobody really types in a full URL anymore; they just type your business’s name.
It’s important you treat this traffic separately; it’s not really ‘organic search’ traffic; it’s more related to the quality of your product and brand awareness from other marketing campaigns.
Filtering out these queries gives us a much more representative sample of what’s truly driving traffic from our content marketing / SEO initiatives.
So around 60% of our ‘search’ traffic is down to non-brand terms. We can see here we have a couple of generic terms related to what we offer, with ‘growth marketing agency’ leading the pack. In order to see these more clearly, we can often just filter for the homepage.
Looks like we’re doing ok for the keyword ‘growth marketing agency’ which is really the most relevant (though less popular) keyword in the agency set of keywords. We’re already bidding on this for PPC, but this report is a great place to find new terms.
Ultimately the generic terms that drive traffic to your homepage will be relatively limited. A page that’s relevant to every keyword is really relevant for no keyword; to rank on competitive terms you typically need a page per term. We have that with our tactic playbook:
It looks like we are actually getting some early traction on pretty niche long tail terms from our tactic playbook. Our average position is still pretty low, and these terms don’t get much traffic individually, but we’re getting 100 clicks a month from this method and it’ll only grow as we build domain authority and publish more tactics.
As well as by query, we can filter by device to see how we’re doing on mobile vs desktop.
CTR is always much higher on mobile because there are fewer results and a smaller screen. Some keywords might rank better on mobile and therefore skew results, so consider filtering out mobile (or only looking at mobile) when making comparisons between keywords.
We can also look at the same stats by country.
In our case, the US and UK drive most of our organic traffic, which makes sense as that’s where our biggest markets are. We can also see a view of the top pages that get organic search traffic.
Even with brand keywords filtered out, our homepage is still our biggest source of traffic. While the tactic pages are a good long-term play, a way to use an existing asset from our product as free content to attract traffic, it’ll be a while before it has a major impact.
So first, let’s focus on optimizing our homepage for the main keyword we saw we were ranking for earlier: ‘Growth Marketing Agency’.
The ‘Growth Marketing Agency’ keyword has limited volume; only 145 searches in the past month. You might be wondering why we’d bother trying to rank higher for it. Apart from it describing the service we offer pretty precisely, there are a number of reasons;
All Non-Brand Homepage Organic Traffic, Last 28 days
“Growth Marketing Agency” Homepage Organic Traffic, Last 28 days
Average organic search CTR by position from a 2014 Moz study.
From Google’s Keyword Planner; queries related to ‘Growth Marketing Agency’
Interest in ‘Growth Marketing’ is trending upwards over time
For inspiration and context, let’s first see what the competition looks like:
Note: it’s best to use Google’s Ad Preview tool for this, as your results change based on location.
So what patterns do we see? It’s useful to know what the competition is doing because, armed with that knowledge, we can either go with the industry default format or try something novel to stand out from the crowd.
Here’s what our search result looks like:
…not particularly inspiring. Also a little confusing, as we jump from talking about our technology to mentioning ‘services’, without explaining much about what we do.
You should treat the meta title and meta description of your page as an ad: you need to get across your unique value proposition, include some social proof so users can trust you, and handle any objections that might stop them from clicking through.
Let’s fix this by adapting it to the format we saw that was common to our competitors:
Ladder.io – Growth without the guesswork.
Growth marketing agency working with 36 clients in New York and London. Using our proprietary software with 1,000+ proven tactics, we grow your business quickly and efficiently.
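In the page’s HTML head, this copy lives in the title and meta description tags:

```html
<title>Ladder.io - Growth without the guesswork.</title>
<meta name="description" content="Growth marketing agency working with 36 clients in New York and London. Using our proprietary software with 1,000+ proven tactics, we grow your business quickly and efficiently.">
```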
This is a world away from what we had previously. Here’s what’s good about it:
If we already had a good version and we were concerned about the effect of changing it, we could run an A/B test via our Google AdWords campaigns to make sure it works first. Alternatively we could have just taken our copy from what works on AdWords in the first place.
Our AdWords ads in the “Growth Marketing” ad group
Optimizing the meta title and description should improve our CTR. However, it won’t (directly) affect our ranking. For that, we need to optimize the page itself.
I’d be hesitant to change our main headline here, as it’s working well in terms of conversion rate. We wouldn’t want to affect that just for the sake of SEO. The copy underneath is pretty much fair game, however — it won’t have a huge effect on conversion but it’s the second most important text on the page because Google reads from the top down to determine relevance.
For the sake of consistency, we should actually use the same text we had in the meta description. Because the description is keyword rich, it will help convince Google this page is relevant to what the user searched. Consistency also creates trust and reassures the user that they landed in the right place; this should help decrease bounce rate.
Growth marketing agency working with 36 clients in New York and London. Using our proprietary software with 1,000+ proven tactics, we grow your business quickly and efficiently.
Why is bounce rate important? Because Google favors websites that exhibit ‘long clicks’ (i.e. the users it sends you don’t come back), bounce rate can be a real SEO traffic killer. In Google Analytics, we can see this by first filtering for just New Visitors from Organic Search.
Then we navigate over to the landing page report and see how new SEO visitors behave when they first land on the homepage.
Wow; a 25% bounce rate is really low, and probably a big reason why we already rank ok. It’s particularly low for what is essentially a one-page website that is very commercial in nature. The premium design elements we discussed in our UX teardown are certainly a factor acting in our favor here.
If we did want to push this even further, improving page speed (which we covered in the previous section) is one big lever. Specific to the structure of this page, however, is the requirement to click ‘learn more’ to scroll down. This is an issue because many people will try to scroll, fail, assume the page is broken, and bounce.
Speaking of the content further down the page; we sure do have a lot of it.
This is really excellent; we have a lot of industry buzzwords here, many variations of the main keyword we’re trying to rank for, and plenty of actual text. Remember, Google can’t see images; text is all it has to go on. Additionally, because users can scroll down so far, their time on site is likely much higher, decreasing our risk of short clicks.
It’s always great for this text to be natural language; that’s what Google wants after all, but we might want to consider spicing it up a little bit by adding relevant keywords tactically.
To figure out which keywords to use, just scroll to the bottom of the search results page for your top keyword and look at the related searches section.
It looks like location is an important modifier (London, Brisbane, Los Angeles). There are also a couple of keywords like ‘saas’ and ‘digital’ we could consider including that Google deems relevant. For more related keywords, we can look back at the results we got from keyword planner; e.g. growth strategies, marketing software, and lead generation.
To apply this to the page, we should swap some of our mentions of ‘growth marketing’ for one of these related words, like ‘digital’ or ‘strategies’. This also helps us avoid the ‘keyword stuffing’ penalties that come from mentioning the same keyword too many times.
We should also include a section that talks about our locations in New York and London, and list some of the types of services we’re offering, like SaaS and lead generation. We do have a section with a number of buzzwords like this in it, but it confers no SEO benefit because it’s a video.
It’d be ideal if these words were actually in HTML tags and the graphic was drawn on with SVG, but that might be a little over the top for the relatively small benefit. We might instead consider loading a static image first (then replace it with this video when loaded), which has some of the keywords as alt text tags.
Including more static images on the site will help us in other ways. For example, we show up in some 10,000 image searches but very rarely get clicks. This is a sign that the images that show aren’t relevant or of high enough quality, or that we’re showing for the wrong searches because we don’t have keyword-rich alt text.
To take just one example, we’ve appeared in 21 searches for ‘Growth Ladder’, at an average position of 97.8; that’s a long way to keep scrolling just to find a picture of a ‘growth ladder’.
Google Keyword Planner tells us only 10 people search this a month.
But we know that can’t be true if 21 searchers got more than 10 pages deep to see whatever image of ours ranked in the past 28 days. From the Moz study we looked at earlier, we know that less than 1% of searches go past page 3. This could be an overlooked gold mine of traffic.
…and check out the competition; it’s just awful word art style illustrations.
How many more people could we get in front of if we made a premium, custom image of a growth ladder? How many attribution links back to the site would we get from people using it?
Now that we know the process for optimizing a page for SEO, we can do it for other keywords that we want to rank for. Once we’ve done our keyword research we could make pages specifically for each keyword we want to target, with great, relevant content.
One company that did this really effectively is Qualaroo for the keyword Conversion Rate Optimization. This is an incredibly expensive keyword that they rank first and second for.
They rank so highly for such a competitive term because they really took the content to another level. Rather than a simple blog post, this is a fully fledged beginner’s guide course on the topic.
We could do something similar for one or two of our most valuable keywords, then go through this same optimization process we did for our homepage to make sure it ranks.
Finally, let’s look at navigation; we have a relatively thin site with only 4 main pages (one of which, the blog, is a subdomain so doesn’t count).
This means it’ll be hard for us to show site links, which can massively increase clicks if we get into position one. A clearer navigation also gives Google a better idea of what’s important on the site, so it doesn’t accidentally emphasize the wrong pages or keywords.
On our main brand term, we do get some site links, but they’re pretty small (no supporting text) and not the ideal set to show off. Two of these links don’t even exist on the site anymore!
For comparison, take a look at Hubspot’s SEO result; this is much more professional looking, takes up a lot more real estate and helps the user dive right into the right section.
If we made a number of SEO pages (like we discussed in the last section) we could link to these in the navigation, which would hopefully convince Google to use them as site links.
Last but not least, the footer. We have a really simple footer that doesn’t really do much but replicate the menu links and our main call to action.
This is a missed opportunity. We could link to our top blog posts, to SEO pages or to specific tactics in our playbook. We could have our New York and London addresses here for clients trying to find our offices (or prospects doing due diligence). We could answer frequently asked questions or link to our careers page.
For example, check out the footer on VWO’s website:
Almost anything you need to do on the site is mapped out for you, and the content they want you to find is front and center. This is both a user benefit and beneficial for SEO: Google often uses your footer links to crawl your website, and anything you link to here will be deemed important and teach Google what your site is relevant for.
The biggest mistake I see marketers making in SEO is the ‘build it and they will come’ mentality. They spend hours writing quality content, designing premium images, doing in-depth keyword research and optimizing on-page factors. They hit publish and move on to the next post, and the next one. Then they’re surprised when their blog doesn’t get any traffic.
Yes, good content is essential, but it’s not sufficient; you need to get that content in front of the right people. Until Google sees websites it already trusts linking to you, it won’t trust you, and therefore won’t rank you highly in search results. Mediocre content with links beats quality content with no links all the time. Pages ranking #1 in search results have an average of 168% more backlinks than those ranking in position 5, and remember that can mean 6x more traffic.
Backlink building has a bad rap; a whole lot of cowboys in the industry have employed (and continue to employ) pretty spammy techniques to get as many links as possible. Instead, our approach is going to be to find what types of content attract backlinks naturally in our industry. This is a more sustainable approach that isn’t going to get us penalized by Google.
To do this research we’re going to use Moz’s free Open Site Explorer tool. There are more advanced ones out there (including Moz’s paid version) but the principles are largely the same.
Domain authority is a score, on a 100-point scale, developed by Moz to predict how well a website will rank. This score is logarithmic, meaning it’s easier to grow from 10 to 20 than from 60 to 70. Page authority is the same thing, but specific to the individual page. First things first: let’s plug our homepage URL into OSE and see what authority we have.
It can be difficult to benchmark domain authority, but you typically need around a 30 DA before things start getting interesting. The best way to benchmark is versus our competitors. We did a much bigger competitor research piece earlier in the plan, and the marketing universe is pretty enormous…
But for now, let’s just plug in the other websites that rank for the term “Growth Marketing Agency” we optimized for earlier. Once you understand the process, you can do this for any number of competitors/markets you want to understand and compete in.
With our domain authority of 27, we’re doing well against the competitor average of 23. We’re doing much better on homepage authority; 38 vs an average of 30. This is really promising news particularly since we only launched this site at the start of 2016 and started actively content marketing in Q3 2016.
It’s not always the higher-authority sites that rank highest. Catch NYC has a number of high-authority links from AdWeek, but because their page’s content is less relevant to this specific keyword than the others’, they’re losing out in this instance.
Additionally, the rank isn’t the same for you as it is for me; that last search was in New York, but if I search from San Francisco I get this:
Catch NYC doesn’t show up, and is instead replaced by the Growth Marketing Conference. Searching from London I get a different set of competitors entirely:
The point is that you shouldn’t obsess too much over your ranking; just concentrate on doing the right things with regards to accessibility, relevance and authority and you’ll get more traffic.
To put the authority score in more tangible terms, let’s run each through Moz again and take a look at the number of links each website has collected.
You can see pretty easily the correlation between the number of links and Domain Authority. All things being equal, more links is a good thing. You can see that with Growth Agency vs The Growth; just 3 more links from the same number of sites was enough to push them from a DA of 14 to 19, and from 4th place into 2nd.
The number of linking domains is at least as important as the total number of links. If you look at Catch NYC vs Growth Rocks, you can see Catch has fewer links but a higher DA. On average, sites that link to Growth Rocks do so over 20 times, vs only 5 each for Catch. Multiple links from the same source can look spammy and carry less weight than links from multiple different websites.
Link velocity is also important; though it doesn't directly affect rankings, the number of links 'Just Discovered' in the past 60 days gives us an indication that Growth Rocks is getting stronger and will soon pass Catch in domain authority. We've got strong momentum with 3 links, but we need to keep this up if we want to jump to higher positions.
What about domains like Six and Flow, who only have one linking domain; how are they getting a higher DA and outranking The Growth? The answer is link quality.
The quality of your links is extremely important. If someone who has a high Domain Authority links to you, it can be worth hundreds or thousands of links from less trusted sites. P.S. If you’re seeing parallels to a certain Black Mirror episode, you’re not alone.
Let’s take a look at that single domain linking to Six and Flow and find out more about how it had such an outsized effect.
The Prolific North website has a really hefty Domain Authority of 47 and the Page Authority is good too. How is their score so good?
Over 15,000 links from 60 domains; that’s a pretty high ratio, but the volume certainly helps. The big factor in their ranking though is to be found at the bottom of that screenshot; a juicy link from the well trusted 100 Domain Authority BBC website.
Just as a single link from the BBC boosted the profile of Prolific North, a single link from them gave this small, 15-month old agency based in Manchester, UK enough juice to push past more established businesses – that’s the value of link building! Let’s look at the content.
The article is featured in a local business magazine covering the north of England where this agency is based. The founder of the agency is giving his view on the business climate post-Brexit vote. Partially thanks to this one interview, a whopping 45% of their traffic now comes from search.
Seeing how well this worked, we should be trying for similar opportunities to get coverage in local business press and see if it works for us.
If we take a look at the link profile of Catch NYC, we can see a higher level version of this effect; they have numerous links from AdWeek, a massively respected publisher with a Domain Authority of 92! It's actually the page authority that gets passed on, but at 45 this is also very high, and likely the reason for Catch's strong DA.
Now let’s take a look at the links for Growth Rocks; remember they had hundreds of links but still didn’t have as high a Domain Authority as Catch NYC.
At first glance, you might think ‘wow – look at those links from Slideshare with a PA of 49’. However, these actually have very little direct impact on ranking, because they’re ‘no follow’ links. If we go to Slideshare and take a look at the link, we can see it in the code.
Any link with a rel='nofollow' attribute is basically that website telling Google "I'm not vouching for this website". All the major social networks and many sites that allow comments do this because they don't want to get penalized by Google for the spammy links their users post.
These links do help indirectly, however; it's natural for you to have a ton of these just through normal social sharing and setting up a social presence. If you don't have any, or your ratio of nofollow to followed links is low, Google might get suspicious that you're trying to game the system. It's also important to remember that people click on links; maybe you don't get any SEO benefit, but you may be getting referral traffic.
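If you want to check a page's links programmatically rather than eyeballing the source, here's a minimal sketch using Python's built-in HTML parser; the example HTML snippet is made up for illustration:

```python
# Sketch: find links marked rel="nofollow" in a page's HTML,
# the same thing we spotted manually in the Slideshare source.
from html.parser import HTMLParser

class NofollowFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.nofollow_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            # rel can hold multiple space-separated values
            rels = (attrs.get("rel") or "").lower().split()
            if "nofollow" in rels:
                self.nofollow_links.append(attrs.get("href"))

# Illustrative HTML, not taken from a real page
html = ('<a href="https://example.com" rel="nofollow">link</a>'
        '<a href="https://example.org">followed</a>')
finder = NofollowFinder()
finder.feed(html)
print(finder.nofollow_links)  # only the nofollow URL survives
```

In practice you'd feed this the fetched HTML of a page that links to you, and anything it returns is a link Google most likely isn't counting directly.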
One great link they do have is from an infographic showcase site called Visual.ly. This PA 41 link is a real coup, and all it took was repurposing a popular startup growth book into a nice-looking infographic.
Taking a look at the rest of Growth Rock’s links, we can see that very few of them are higher DA than their own site. In fact, a lot of the links are coming from a Greek health and wellness blog; being.gr.
Given that it looks like a lot of Growth Rocks’ core team is Greek, this could potentially be a personal blog of a friend.
While this might seem like a good way to get links, in the beginning, it’s rarely a good idea. If a blog that links to you isn’t related to your core topics and keywords, it’s seen as a negative sign. If Google sees too much of this going on, you could be running the risk of a penalty.
Finally, let’s look at our link profile at Ladder.
The top link here is from a comment one of our staff made on the Drift blog; as you'd expect it's a nofollow link, so no real benefit there. The next few are from Startup Institute, following a piece that we wrote for them on how we hire the majority of our staff from their program. This is a perfect example of the power of guest blogging; 3 links with a PA of 35 from 2 hours writing an article. Not only that, but the post got a lot of love on social media and benefitted both us and the Startup Institute. Let's scroll down further.
It looks like there are a few links from Brunchwork here; one benefit from giving talks at events that nobody thinks about is that you can often pick up backlinks this way. We also have a number of links from XYLO; a design agency that did our logo and branding a while back. This is something we could push for from every partner we work with to get a number of new links.
The 'anchor text' that a website uses to link to you is also important. For example, Startup Institute called us a 'data-driven marketing agency'. This is much more beneficial than just linking to us as 'Ladder.io', because Google will now associate us more strongly with the keywords 'marketing', 'agency' and 'data-driven'.
Whilst beneficial, it's important you don't push this too hard and try to game the system. If a lot of your links have the same anchor text, or too many of them use keyword-rich anchor text, you might be penalized by Google. If you are prompting partners to link to you with anchor text, make sure they use a variety of related keywords to be safe.
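To keep an eye on this yourself, you could tally the anchor text across your known backlinks and check that no single keyword-rich phrase dominates. A quick sketch; the anchor list here is invented for illustration:

```python
# Sketch: check anchor text diversity across backlinks.
# The anchors below are hypothetical examples, not real link data.
from collections import Counter

anchors = [
    "data-driven marketing agency",
    "Ladder.io",
    "Ladder.io",
    "growth marketing agency",
    "data-driven marketing agency",
]
counts = Counter(anchors)
total = len(anchors)
for text, n in counts.most_common():
    print(f"{text}: {n / total:.0%}")
# If one keyword-rich anchor makes up a large share of your profile,
# ask partners to vary the wording.
```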
The homepage isn't the only page on our site that attracts links. We can take a look at the top pages report and see which other pages collect links.
In our case, our playbook got almost as many links as our homepage, with 16 vs 23. Of those links, 9 had a 'ref=producthunt' parameter on the end of the URL. This means that the influencers who linked to us must have done so by copying the link after clicking through from Product Hunt. Publicly releasing a useful tool like this via Product Hunt is a particularly good way to gain links, and we should consider planning another free product release in the future.
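Attribution like this is easy to script; a small sketch that pulls the 'ref' query parameter out of a list of inbound URLs with the standard library (the URLs are illustrative):

```python
# Sketch: extract the 'ref' parameter from inbound link URLs to see
# where the linker originally discovered the page.
from urllib.parse import urlparse, parse_qs

links = [  # hypothetical inbound URLs for illustration
    "https://ladder.io/playbook?ref=producthunt",
    "https://ladder.io/playbook",
    "https://ladder.io/playbook?ref=producthunt",
]
refs = [parse_qs(urlparse(u).query).get("ref", ["(none)"])[0] for u in links]
print(refs)  # ['producthunt', '(none)', 'producthunt']
```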
Despite all the hard work and effort we're putting into our blog (some of our posts are 10,000 words or more), our posts hardly ever attract links.
For example, our top blog post, with over 50,000 visitors, only got 1 link, from Buffer, which doesn't really count as it's just a link shortening service. Our content is driving a lot of traffic on social, and some on organic search, but it's our homepage and the playbook that get links.
This isn’t the case for Growth Rocks, it seems. Their blog picks up a large number of their links.
Just this one blog post about ‘Why your e-commerce store will fail’ has garnered over 196 inbound links. Let’s look into this a little further.
Huh, they’re almost all from ‘Marketing Pilgrim’, from various archive pages. These links will be next to worthless, and maybe even damaging, as that website looks pretty spammy.
A quick check of another of their blog posts, this one about measuring social ROI, has most of its links coming from Bit.ly, a link shortening service.
If we take a look at that URL, there’s a ref parameter on the end (just like the product hunt one we saw) for a service called Quuu.
This is something that we’ve actually tested at Ladder. It fills up your social media queue with relevant blogs and articles to post, for free, so it looks like you have an active social media feed with very relevant content without lifting a finger. They make money by getting paid to promote said content; maybe this is what Growth Rocks are doing here?
So it looks like Growth Rocks really isn’t getting as many links to their content as we first thought. We can’t know for sure if they’re promoting their content through these distribution channels, or if this is just happening as a coincidence, but either way the links can’t be helping them much.
All in all our findings are pretty promising. In the space of half a year we’ve launched and grown a blog to a pretty decent authority level in our niche. We still have a fair bit to optimize from a technical SEO standpoint, but we’re not committing any major accessibility sins.
Our competitors don’t seem to be too daunting either; with a little work, we could get featured in top publications like AdAge. We can easily lean on our partners and suppliers to link to us; for example, BBH (the ad agency that incubated us) has a domain authority of 59 but isn’t yet linking to us. It’ll be no work at all to test distribution of our posts through a service like Quuu.
Aside from all that, we have one big advantage; our product team. Like the playbook, we can push out a number of free tools to attract links. We can build more engaging pages and design more impressive infographics. Over time, coupled with the quality content we’re pushing out and the optimizations we can make, this should give us the edge.
With these changes, we’re talking about doubling or tripling our current SEO traffic in the next few months. That’s a couple hundred extra visits a month, and maybe 5-10 new leads, making the whole exercise worthwhile for us. But ultimately we’ll still be small fry. Maybe in 10 years when we take on HubSpot, I’ll tell you how we did it!
Normally a bigger website might want to do a social media audit as a separate piece. Many companies have entire separate divisions just focusing on social. For us, however, in the B2B marketing space, social is a relatively small part of what we do. Don't get me wrong; it's important. In fact, it's actually one of our top sources of traffic for our blog.
But like our main source of traffic, Direct, it isn’t something we can do much to influence directly. Why is that? Well, social (and direct) is mostly just the online version of ‘word of mouth’. When people like our product, our services or our content, they share it with their friends/colleagues.
Because we add ‘utm_campaign’ to every social campaign we run, we can see that only 1% of our ‘social’ traffic comes from activity we intentionally actioned.
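For context, tagging a link is as simple as appending UTM parameters before you share it; a sketch with made-up campaign values:

```python
# Sketch: build a UTM-tagged URL so Google Analytics can separate
# intentional social activity from organic sharing.
# The post URL and campaign name are hypothetical.
from urllib.parse import urlencode

base = "https://blog.ladder.io/some-post"
utm = {
    "utm_source": "twitter",
    "utm_medium": "social",
    "utm_campaign": "q1-content-push",
}
tagged = f"{base}?{urlencode(utm)}"
print(tagged)
```

Any social visit arriving without one of these tags is, by elimination, organic word of mouth rather than something we actioned.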
Yes, that’s right; your ‘social media manager’ has nothing to do with the vast majority of social traffic you got. Don’t believe me? Check out your Facebook page insights.
Half of these posts don’t get a single click or reach more than 50 people! It used to be that posting on social could drive a lot of engagement, but changes in the platform’s algorithms have changed all that. We still post because it doesn’t take long, but don’t expect a miracle here.
Remember ‘intentionally actioned’ also includes any traffic that comes from people sharing the post using our share buttons.
Sure, a good deal of this traffic comes from aggregation sites like Reddit and Hacker News.
…but we really can’t influence that too much. We post our own articles and participate in discussions on a weekly basis, but doing too much of that can get you banned from the site, so we don’t overdo it.
Where does the rest come from? People sharing the posts themselves, without our intervention.
We get an inordinate amount of shares on our content for such a young blog. It’s hard to tease out why that is based solely on our data. This is because every post is largely following the same few principles we learned from past experience, which are:
Now that we know what's the same about these posts, let's look at what's different, as we might be able to find out what type of content drives the most shares. With the free version of BuzzSumo, you can export to CSV up to 200 times a month.
From here, we can pivot on some of the fields BuzzSumo gives us to find out some useful information. For example, what platforms we get shared on the most.
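If you'd rather do this pivot in code than in a spreadsheet, here's a sketch that sums shares by platform from a BuzzSumo-style CSV; the column names and numbers are illustrative, not BuzzSumo's actual export format:

```python
# Sketch: pivot a share-count export to get totals per platform.
# Data and field names below are invented for illustration.
import csv
import io
from collections import defaultdict

export = """title,twitter_shares,linkedin_shares,facebook_shares
Post A,120,300,80
Post B,60,250,100
"""

totals = defaultdict(int)
for row in csv.DictReader(io.StringIO(export)):
    for field in ("twitter_shares", "linkedin_shares", "facebook_shares"):
        platform = field.split("_")[0]
        totals[platform] += int(row[field])

print(dict(totals))  # {'twitter': 180, 'linkedin': 550, 'facebook': 180}
```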
Over 50% of our shares are on LinkedIn, with another 30% on Twitter. We can look into Google Analytics and see how that correlates to traffic.
Despite accounting for 50% of the shares, LinkedIn only drives 15% of the traffic! Facebook is much more important than it looks from the number of shares; 1,500 visits from 960 shares. This shows you should never just assume a linear relationship between vanity metrics like social shares and actual traffic (or conversions) on your site.
Because BuzzSumo automatically scrapes the author name, we can also see things like which authors get the most shares on average per article they write.
One thing I can see straight away is that guest posts get much less social engagement. We don’t know for sure why yet, and we don’t have much data, but we should keep an eye on it.
The surprising thing here was that Stefan, our Head of Content, gets the same number of shares on his posts as I do. As a co-founder with many more years of marketing experience, I thought there would be no contest, but I was wrong. Any founders/managers struggling to delegate key tasks should take this as a lesson; delegation doesn't have to mean lower quality.
Let's sense-check a few metrics that are easy to tease out, like the length of the URL and title.
Looks like there are no serious differences in URL and title length, so it's not worth analyzing. This is a common occurrence; when the team sticks to a strict editorial style, there often isn't enough variation to produce an interesting analysis. That's why you can't just divine insight from your existing data; to get real answers and move the needle, you need to A/B test.
We also have a ‘number of words’ column, let’s map that out and see if there’s a trend.
Not particularly useful in its current form. Let's break it down into quartiles.
Then we split out the first quartile; 25% of the values fall below this number.
Then we use a nested IF statement to get the second quartile values. We know that if they don’t pass the first IF statement, they’re above the first quartile, so we only need to check if they’re below the next quartile.
We add in the final IF statement (now it’s either Q3 or Q4), and make the cell references to the quartile values absolute, then drag down the formula.
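The same quartile bucketing can be done in a few lines of Python if spreadsheets aren't your thing; the word counts below are invented for illustration:

```python
# Sketch: the nested-IF quartile bucketing above, in Python.
# Word counts are hypothetical, not our real post data.
from statistics import quantiles

word_counts = [800, 1200, 1500, 1900, 2100, 2600, 3400, 7000]
q1, q2, q3 = quantiles(word_counts, n=4)  # 25th/50th/75th percentile cut points

def bucket(n):
    # Mirrors the spreadsheet logic: test against each cut point in order.
    if n <= q1:
        return "Q1"
    elif n <= q2:
        return "Q2"
    elif n <= q3:
        return "Q3"
    return "Q4"

labels = [bucket(n) for n in word_counts]
print(labels)
```

From here you'd pivot shares by the quartile label, just as we did with the spreadsheet.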
And finally after all of that data wrangling, we can get the pivot table we wanted.
It seems like the number of words has a big impact on the number of shares; the 10 posts that were over 2,600 words (7,000 words on average) got a whopping 2.3x more shares than the shorter posts. The sweet spot for us is to either write a 2,000-word post for most topics (half the shares for less than half the words) or go for a massive 7,000-word post whenever possible.
Let’s see if time is a factor. Do we pick up many more shares over time, or do most of the shares come right in the first few days of posting?
First, a simple formula to get the days since posted.
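In code, that formula is just a date subtraction; the dates here are illustrative, and in a live sheet you'd use today's date instead of a fixed one:

```python
# Sketch: days since a post was published.
# Dates are hypothetical; use date.today() in real use.
from datetime import date

published = date(2017, 1, 10)
today = date(2017, 3, 1)  # fixed so the example is reproducible
days_live = (today - published).days
print(days_live)  # 50
```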
Then we just chart that out and see if there are any patterns.
It looks like, excluding a few outliers, there is a slight lift over time, but after the first month, the post has gotten most of the shares it’ll ever get.
Finally, let’s add some information we know into the data to make it more meaningful. As part of our content strategy, we try and publish a mix of ‘process’ oriented posts and ‘tactic’ posts. The process posts talk about how to do some marketing task in a step-by-step fashion, whereas the tactic posts list a number of tactics from our database related to a specific theme or recipe. Let’s go through manually and label these posts.
Now we can easily pivot and see what the performance difference is between article types.
Another surprising result for me. We’re getting the same number of shares on average for the process posts vs the tactics posts when I thought the process posts would be killing it.
Ok by now you get the idea. We could keep going with this process but we won’t get that much useful information out at our stage. For a business that has a lot more data, it’d be largely the same process;
decide on a question => wrangle the data => pivot and/or chart the data => interpret the results
You probably see similarities with the PPC audit we did; it’s always about finding interesting segments and seeing if there is opportunity to improve. Ultimately though, as discussed, the only way to confirm your hypothesis is to A/B Test. Don’t expect to do an audit or analysis and have all the answers; usually, all you’ll have is more questions!