The strongest sites have a tightly knit architecture of interlinked pages, each one boosting the others’ rankings and pulling up the domain’s authority as a whole. But how do these pages come to be so sturdily constructed?
If you are worried that you need to hire some Silicon Valley software engineer to handle your design and architecture problems, you can relax and put your wallet away. There is plenty that you can do on your own to improve the connectivity of your site and improve the ranking of your pages along with it.
Improving the architecture of your site does require some basic HTML know-how, but with the right tools and metrics you can turn your pages into a sticky web that not only lures traffic, but traps more conversions as well. Keeping these tips in mind will clean up your site for the search engines. Ideally, this will also improve your user experience which will promote repeat visitors and conversion.
Impressing the Search Engine
Whether you’re looking to improve your page authority on an individual basis or your domain authority as a whole, you need to look at the challenge through the search engine’s eyes. Basically, all of us are the desperate high school freshman pining for the prettiest girl in school, and Google is the belle of the ball. So improve your site to impress the spiders that are crawling and indexing your pages for ranking.
Many people – even content marketers – don’t know this, but the user’s view of your page is actually different from that of the search engine. You may have a beautiful landing page filled with brilliant content and copy and still wind up with a low ranking because none of it is set up correctly for the spiders to access it. If the spiders can’t get to your content, they aren’t going to be able to record its value, and your ranking is going to suffer.
Here’s an example of the difference between what your page looks like to the viewer, and what it looks like to the search engines.
There are plenty of DIY tools and sites out there to help you analyze the actual search-engine-friendly format of your pages. Moz’s Open Site Explorer provides metrics on the influence and linking profile of your domain and specific pages. It focuses on domain authority and page link metrics, with plenty of filters for fine-tuning your analytics. Screaming Frog is a crawler tool that will help you handle the actual architecture of your pages to improve their crawlability, response codes, and site depth (to name a few).
A strong site architecture starts with content created around keyword intent. A well-interlinked site uses subfolders, with anchor text that targets long-tail variations building off the original keyword each subfolder is targeting. Streamlining your keyword research so it carries through your sub-pages, and developing a long-tail pattern in those pages’ anchor text, is a great way to strengthen the organic linkage between your pages.
Anchor Text and Descriptors
Anchor text needs to be purposeful, but it also has to stay controlled and organic. Remember that if you are writing your anchor text for the sole purpose of driving up your page’s authority, Google may notice, and you can end up with a penalty instead of any viable improvement to your site. Instead of piling on exact-match anchor text that draws negative algorithmic attention, vary the anchor text of your pages to target keyword variations.
Check out the URLs for these different subpages of this well structured site (above).
Varying the anchor text of your pages to target long-tail modifications of your original domain keyword boosts the ranking of each page while also lifting domain authority as a whole.
For example, a site focused on battling cancer would have its original domain keywords targeted at short-tail terms like “cancer treatment.” But the subfolders of its other pages that link to and from that source page would be based on long-tail modifications of “cancer treatment,” such as “diagnoses,” “treatment prices,” “chemotherapy clinics,” etc. This makes navigating the site easy for both spiders and users – impressing both will ensure higher rankings.
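To make that pattern concrete, here’s a minimal sketch of how long-tail variations might map onto subfolder URLs. The domain and keyword list are hypothetical, purely for illustration:

```python
def slugify(keyword):
    """Turn a keyword phrase into a URL-safe slug."""
    return keyword.lower().strip().replace(" ", "-")

def subfolder_url(domain, keyword):
    """Build a subfolder URL that targets a long-tail keyword."""
    return f"https://{domain}/{slugify(keyword)}/"

# Hypothetical long-tail modifications of the short-tail "cancer treatment"
long_tail = ["cancer treatment prices", "chemotherapy clinics", "cancer diagnoses"]
urls = [subfolder_url("example-cancer-site.com", kw) for kw in long_tail]
# e.g. https://example-cancer-site.com/cancer-treatment-prices/
```

Each subfolder then becomes a natural home for anchor text that matches its own long-tail keyword.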
Spiders and Crawl Space
I’ve already mentioned spiders a few times thus far. But just in case it’s still unclear, here’s a graphic to help illustrate my point:
Just kidding. Spiders are actually web crawlers that search engines use to navigate and analyze your site’s infrastructure. These are the little buggers you need to impress. They are one of the reasons why link building and internal linking strategy are both so very important. Link building helps generate awareness and build authority of your best content. It can also then spread the “link authority” to other important pages of your site that are interlinked from that initial page.
Now, the spiders navigate your site, “crawling” and “indexing” all of the different links, content, images, and anchor text that make up the guts of your pages. The crawl space is the set of linkages on, between, to, and from your pages. Cleaning the crawl space up – as in streamlining your internal linking strategy – is how you maximize the ranking of the content you have already developed. The easier it is for the spider to crawl, the easier it is for you to rank.
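For a rough sense of what a spider actually records, here’s a minimal sketch using Python’s built-in html.parser. Real crawlers are far more sophisticated, and the page snippet is made up:

```python
from html.parser import HTMLParser

class LinkSpider(HTMLParser):
    """Record each link's href and anchor text, the way a crawler indexes a page."""
    def __init__(self):
        super().__init__()
        self.links = []     # (href, anchor_text) pairs found on the page
        self._href = None   # href of the <a> tag currently open, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

# Hypothetical page fragment
page = '<p>Read our <a href="/treatment-prices/">cancer treatment prices</a> guide.</p>'
spider = LinkSpider()
spider.feed(page)
# spider.links -> [("/treatment-prices/", "cancer treatment prices")]
```

Notice that the spider sees only the href and the anchor text – which is why keyword-targeted anchor text matters so much.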
Bottom-Up Link Building
Even though link building is a recognized must of any content marketing campaign, most content marketers over-prioritize domain authority (DA) and often abandon the page authority (PA) of their individual pages while trying to bolster it. If you were looking to strengthen the weave of a web, you wouldn’t tighten the central knot over and over again expecting the rest of your web to solidify on its own, would you? Hopefully knot (pun intended).
Building your link profile with targeted anchor text to your individual pages will raise the PA of each of those pages. Surrounding your content pages and subpages deeper in your site’s hierarchy with authoritative links and keyword-targeted anchor text produces a more pervasive upward trend. It builds domain authority by amassing a legion of strong page authorities. The central knot may be what holds your web together, but strengthening each individual strand with specific and deliberate attention is what turns your silk into silver.
Common Problems with Interlinking
There are some common issues that sites run into when optimizing their interlinking. Most of them can be chalked up to sloppy code. The rest come from a failure to trim the fat from your page architecture. A web filled with loose strings leading to nowhere and random clumps of knotted links isn’t going to stand the test of time, let alone be ranked highest among the other webs. My metaphor didn’t fall apart there, did it? Nah – nailed it.
There are three major monkey wrenches that can be tossed into any interlinking project, and they will really cause you some grief – mainly because they are the hardest inefficiencies to identify.
1. Gated Content
For the most part, your content should be readily available to users, at least within the navigation of your site. However, every now and then you’ll want to offer users a piece of content in exchange for information or registration. In most cases, if they have to fill out a form or survey to access the content you are linking to, the spiders are going to pass over it. One way to get around this issue and allow your content to be indexed is to use your gated content as inspiration for blog posts and other pieces of content within your site.
2. Link Flooding
Another big problem sites run into is unnecessary linkage that goes overlooked: links that slow your page down, confuse the spiders, or simply never get found by your users. This is where you need to break out the sweatband and coffee pot, sit yourself down, and start cranking out some due diligence.
External link flooding – which is exactly what it sounds like: a page overloaded with more external links than it can handle – wastes space in your link profile that could be better spent linking to your own pages. If your site has a large link profile, trim unnecessary external links to free up room for your own. If Google is being selective about how many links it follows, make sure it at least sees yours.
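If you want a quick way to gauge how flooded a page is, a sketch like this can classify its links as internal or external. The URLs and the domain here are hypothetical; a real audit would pull hrefs with a proper crawler:

```python
from urllib.parse import urlparse

def external_ratio(hrefs, own_domain):
    """Report what share of a page's links point off-domain."""
    external = 0
    for href in hrefs:
        host = urlparse(href).netloc
        # Relative links ("/pricing/") have no host and count as internal
        if host and host != own_domain:
            external += 1
    return external / len(hrefs)

# Hypothetical link list scraped from one page
links = [
    "/pricing/",
    "https://example.com/blog/",
    "https://other-site.com/a",
    "https://another.org/b",
]
ratio = external_ratio(links, "example.com")  # half the links leave the site
```

There is no magic threshold, but a page where external links crowd out your own internal ones is a candidate for trimming.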
3. “Nofollow” Tags
You should also be checking your links for “nofollow” tags. This just means that the spider won’t follow that link. If the spider can’t crawl that link it won’t find your content and it won’t be indexed, which means Google isn’t seeing the value of your piece.
But you’d be surprised at how often software engineers accidentally write plugins that automatically assign a “nofollow” tag.
A prime example of this is “related posts” plugins in WordPress. Make sure to look at how these links point to your blog content and confirm they are not set to “nofollow.” This is a simple mistake that can cost you if you don’t catch it quickly. If you’re spending all that time and energy developing the content, make sure it gets seen!
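A quick spot check can catch this. Here’s a sketch (again using Python’s built-in html.parser, with made-up links) that flags any anchor carrying rel="nofollow":

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Collect hrefs of links that spiders are told not to follow."""
    def __init__(self):
        super().__init__()
        self.nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold multiple space-separated tokens, e.g. rel="external nofollow"
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollow.append(attrs.get("href"))

# Hypothetical plugin output: one link is silently nofollowed
snippet = ('<a href="/post-1/" rel="nofollow">Related post</a> '
           '<a href="/post-2/">Another post</a>')
audit = NofollowAudit()
audit.feed(snippet)
# audit.nofollow -> ["/post-1/"]
```

Run something like this over your rendered pages (not just your templates), since plugins often inject the attribute at render time.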
Trapping Authority and Audience
The best internal linking strategies are going to be built around targeted keyword search results that build off one another, both connecting back to your original domain keyword and spreading out into the more nuanced long tail modifications.
Remember that everything you link to is linking back to you as well. “When you link with something, you’re linking with everyone they’ve ever linked to as well” – right? So be careful to surround your links only with strong, authoritative pages that rank highly for the keywords you are targeting.
A quality web is going to be made out of many individually strong strands, all tightly knit together with no loose ends or wasted space. If those spiders need a web to crawl on, give them a silk stage to dance on instead. Your web will be trapping everything that touches it, from ranking, to authority, to audience, and conversions.