Don’t forget the basics
Nik says: “My number one tip is: don’t forget the basics. I can’t believe that I’m going back here, but I think it’s so important. When we think about the future, we think about Google, we think about personalisation, and we think about emerging media types coming in and becoming forces to be reckoned with.
As SEOs, we always think: ‘I’m going to create all this wonderful content and I’m going to enrich everything!’ but what I keep seeing are cardinal sins that really, really need to be addressed. I’m talking about crawlability, rendering, indexability, ranking, and conversions: the five fundamentals of SEO.
These things are so important to think about. You need to think: ‘How is the search engine crawling through my site and understanding the structure of my site? How can I present the information in the best way, so that it will render that content, index those pages, rank for things that I want it to rank for, and therefore have conversions that ultimately affect the bottom line?’
That should be your new mentality. It’s so important to go back to these basics. I’ve just started a new job, so I don’t have all of the clients that I’ve been working with for years. I’m entering a new situation and I’m going back to thinking about what is most important - and it’s checking the main fundamentals.”
When you conduct an initial crawl, what are some of the things that you should focus on?
“One of the first things to do is open Search Console, have a look at the Pages report, and see what is indexed and what is not indexed. Google is very good at telling you why pages are not in the index. Has a page been crawled but not indexed? Has it been discovered but not indexed? Are there 404s in there?
Even just going through and checking there first will give you a sense of what else might be going on with your site. A lot of the time I’ll see sitemaps that either have errors or are totally incomplete - or they aren’t grouped under a sitemap index file, so search engines have to request lots of separate endpoints instead of reading everything from one top-down entry point. Get your sitemaps actually working, and think about the internal link structure.
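For illustration, here is a minimal sketch of generating the kind of sitemap index file Nik describes - one entry point that groups the child sitemaps - assuming hypothetical URLs on example.com:

```python
# Minimal sketch: build a sitemap index that groups child sitemaps
# under a single entry point. All URLs are hypothetical examples.
from xml.etree import ElementTree as ET

def build_sitemap_index(sitemap_urls):
    root = ET.Element(
        "sitemapindex",
        {"xmlns": "http://www.sitemaps.org/schemas/sitemap/0.9"},
    )
    for url in sitemap_urls:
        ET.SubElement(ET.SubElement(root, "sitemap"), "loc").text = url
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(root, encoding="unicode"))

print(build_sitemap_index([
    "https://www.example.com/sitemap-products.xml",
    "https://www.example.com/sitemap-categories.xml",
    "https://www.example.com/sitemap-blog.xml",
]))
```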
Internal links are so important because they pass authority from different pages to the things that you want people to find. Search engines understand this, and it’s how the crawl works: it will follow those internal links, and it will find those pages. Don’t cut off the crawl path. If you’ve got paginated pages - or a tonne of different pages that you’re cutting off, noindexing, or redirecting away from their subfolder - that can really hurt the crawl path. It can create orphan pages, because it’s now very difficult for search engines to find them through internal links.
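As a rough illustration of the orphan-page check, this sketch compares the URLs you want indexed (from the sitemap) against the URLs actually reachable through internal links; both input sets are assumed to come from a sitemap parse and a crawl you have already run:

```python
# Sketch: flag orphan pages - URLs in the sitemap that no internal
# link points to. Input sets are assumed to come from your own
# sitemap parse and site crawl; the URLs here are hypothetical.
def find_orphans(sitemap_urls, internally_linked_urls):
    return sitemap_urls - internally_linked_urls

sitemap_urls = {
    "https://example.com/a",
    "https://example.com/b",
    "https://example.com/c",
}
linked_urls = {"https://example.com/a", "https://example.com/c"}

print(find_orphans(sitemap_urls, linked_urls))  # {'https://example.com/b'}
```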
Do you have breadcrumb schema? Breadcrumbs are really useful for bracketing your whole structure and giving some understanding of which page comes after which. Since Google did away with rel=“next” and rel=“prev”, I’ve seen some really interesting Wild West approaches to pagination, like workarounds involving things like robots.txt.
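Breadcrumb schema itself is just a BreadcrumbList in JSON-LD; here is a minimal sketch that generates one (the page trail is a hypothetical example):

```python
# Sketch: generate BreadcrumbList structured data (JSON-LD).
# The page hierarchy below is a hypothetical example.
import json

def breadcrumb_jsonld(trail):
    """trail: list of (name, url) tuples from the homepage down."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "BreadcrumbList",
        "itemListElement": [
            {"@type": "ListItem", "position": i, "name": name, "item": url}
            for i, (name, url) in enumerate(trail, start=1)
        ],
    }, indent=2)

print(breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Jerseys", "https://example.com/jerseys/"),
    ("Custom Sports Jerseys", "https://example.com/jerseys/custom/"),
]))
```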
Go back and make sure that this is tackled first. There is nothing more annoying than focusing on an elaborate content strategy and then finding out that most of those pages were never added to the index.”
What are some of the common issues that you’re seeing with rendering?
“When it comes to rendering, you want to be aware of what is rendered and what is not rendered - what is served client-side and what is served server-side. Is there JavaScript that is influencing the style, the content, or maybe even the internal links? These are the places where things could go wrong. Consider things from Googlebot’s perspective: did any of these resources return an error when requested?
To check this on a page, I’ll right-click, go to Inspect to open DevTools, go to Sources, and then open up the bottom tabs there, such as Network Conditions, or even the Coverage tab. Essentially, I want to see what is in there that could be render-blocking. That is a good place to start. Render-blocking resources are the static files - typically CSS, synchronous JavaScript, and sometimes fonts - that are vital to the actual render of the page. Usually, when you are checking this, you will find all the resources deemed critical, which have to be processed in order to display the page. Google’s renderer, like any browser, treats these files as critical, so it puts everything else on hold until they are handled.
One of the easy things to find and prioritise is these render-blocking resources. Non-render-blocking resources, on the other hand, don’t postpone the rendering of the page: the browser can safely download them in the background after the initial render. Note that not all resources the browser deems render-blocking are truly essential, at least for the first paint. It really depends on the characteristics of the page.
Shortening the Critical Rendering Path and reducing the page load time are good things to really focus on when we think about how a page loads. A lot of people will share little screenshots from Lighthouse and give generic advice, but it is really important that you actually look at these resources and understand: is each one critical, or should it be non-critical? Then make the non-critical ones non-render-blocking, perhaps by deferring how they download.
You may also want to decrease the total number of those resources through bundling (fewer HTTP requests) and reduce their size through minification, so that the page has fewer bytes to load.”
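Outside of DevTools, a rough first pass is to flag the usual render-blocking suspects in the HTML: synchronous `<script src>` tags and stylesheet `<link>` tags. Here is a sketch, assuming the requests and beautifulsoup4 packages are installed; it is a heuristic, not a substitute for the Coverage tab:

```python
# Sketch: list likely render-blocking resources on a page -
# <script src> tags without defer/async, and stylesheet <link>
# tags not scoped to a non-matching media query. Heuristic only.
import requests
from bs4 import BeautifulSoup

def render_blocking_candidates(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    blocking = []
    for script in soup.find_all("script", src=True):
        if not (script.has_attr("defer") or script.has_attr("async")):
            blocking.append(("script", script["src"]))
    for link in soup.find_all("link", rel="stylesheet"):
        if link.get("media", "all") in ("all", "screen"):
            blocking.append(("stylesheet", link.get("href")))
    return blocking

for kind, src in render_blocking_candidates("https://example.com/"):
    print(kind, src)
```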
What are some common reasons why Google may choose not to index something?
“Search Console is, again, a really great way to test this. There will be a page flagged as not being indexed, and one of the common denominators I see is that it hasn’t been added to a sitemap. A sitemap is a wonderful directory that says: ‘These are all the pages that I want to have indexed. This is the directory of how we want to display and show information to the end user.’
A lot of the time, when something isn’t indexed, it hasn’t been added to the sitemap, there aren’t enough internal links pointing to it, or there isn’t breadcrumb schema. Often, there’s nothing to signal that this page is super important to you and that you want it indexed - to compete for branded and non-branded terms. Again, start from those basics.
In cases where you’ve done all the right things (you’ve got it in the sitemap, you’ve got internal links, you’ve got breadcrumbs, you’ve got really good rich content on there, and you’ve optimised it the best that you can) but it’s still not added to the index, remember that Google pays attention to the Link Graph, the Social Graph, and the Knowledge Graph.
If you’re putting out content but you’re not really seeing it get indexed, it could be because there are greater issues. Go back to Search Console - to the Pages report and its indexing lists - and get a sense of how your site is being perceived. Maybe go into the settings, take a look at the crawl stats, and look at how search engines are even seeing you. Do you have a tonne of 404s in there?
I had an old client where I was looking at the crawl and 84% of everything that Google crawled was actually a 404, which was crazy. This was a massive, multi-million-dollar site, but it was only getting a fraction of the attention it should have been. I went in there and saw that it was actually from an old migration that went horribly wrong, but Google was still picking up a tonne of different things. Lo and behold, they still had a staging site, with all of their old and new ideas still indexed. That was something that we only discovered through going back to the basics in Search Console.
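A minimal sketch of the kind of log check that surfaces this - the share of Googlebot requests hitting 404s - assuming a combined-format access log where the status code is the ninth whitespace-separated field:

```python
# Sketch: what share of Googlebot hits are 404s? Assumes a
# combined-format access log (status code is the 9th field);
# "access.log" is a hypothetical path.
from collections import Counter

def googlebot_status_counts(log_path):
    statuses = Counter()
    with open(log_path) as f:
        for line in f:
            if "Googlebot" not in line:
                continue
            parts = line.split()
            if len(parts) > 8 and parts[8].isdigit():
                statuses[parts[8]] += 1
    return statuses

statuses = googlebot_status_counts("access.log")
total = sum(statuses.values())
if total:
    share = statuses["404"] / total
    print(f"404s: {share:.1%} of {total} Googlebot hits")
```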
Look through these things; you will find some gems. It takes time, but follow the thread and get a flavour of how your sites are actually being perceived.”
“Another solution (if you’ve done all the right things and you’re still not getting your pages into the index) is to consider the Social Graph. There’s not a lot of information about it, but if you put out new content and use the company accounts to tweet it out - using that social proof to create conversations around it - then you will create some interest and organic flow to these pages.
We’ve had examples where we’ve not just written up great pieces but also run full social campaigns around them, and those pages get added to the index really fast. The Link Graph is something that Google has used from the very start, and the Social Graph and Knowledge Graph are starting to carve out their place (becoming more optimised and refined for the future and providing even more), so we should pay attention to these things.”
What kind of impact does conversion have on SEO?
“SEO is Search Engine Optimisation, and it is the stream of marketing that influences search visibility - but towards an end goal. When we leave conversion out of the conversation, we’re not making that goal the hero.
Some of the things that I like to do now are gathering survey data, using heatmaps, looking at the way people use and click through the site, and building an understanding of these pages versus their competitor pages. What do people expect when they look at the UX of a page? Is the call to action (the thing people want to do) above the fold? Is that a useful, meaningful thing for people to see?
When you’re testing the key terms that you want to rank for, do they have the right intents? What is being displayed for them? Google is a wonderful teacher: it shows us what it expects to see for a result. For a keyword, I’ll look at the Web tab, the Image tab, and the Shopping tab. If you’re looking at ‘custom sports jerseys’, for example, Google might want to show women’s varieties, men’s varieties, sleeveless, long sleeve, different colours, and all these different variations. That tells you that Google wants to represent all of those variations on a page for that query.
Therefore, you might need to emulate that in your UX. Give people the ability to click through and find things, or coalesce all that content onto one page in a meaningful way. It’s really interesting to test what Google has found to be meaningful versus what is displayed for the client or their competitors. It creates a kind of benchmark.
A/B testing is such a useful tool for confirming these things. Has a change influenced the outcome in a positive way? Is it generating a negative result? Each of these things is an important step towards understanding the inspiration behind a click, and garnering that attention.
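On the statistical side, a common check before calling a winner is a two-proportion z-test on conversions. A sketch using the statsmodels package (the counts below are made up):

```python
# Sketch: two-proportion z-test for an A/B test on conversions.
# Counts are hypothetical; requires the statsmodels package.
from statsmodels.stats.proportion import proportions_ztest

conversions = [132, 171]   # variant A, variant B
visitors = [2400, 2380]

stat, p_value = proportions_ztest(conversions, visitors)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Difference unlikely to be chance at the 5% level.")
else:
    print("Not enough evidence that the variants differ.")
```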
We did this recently with a client. After we fixed a lot of their technical issues, they were actually getting some visibility in countries where they’d previously had issues finding traction. The site for each country had been built as a mirror image of all the others. We started to work through them one by one, to get an understanding of the way people were searching. We ran surveys in these countries and found that there were colloquialisms in how people searched for specific things.
These insights were so valuable because they went beyond keyword research. Once we tested this within the title content, the heading content, and sometimes the body content, the local population had a much better understanding of what each page represented. We saw an uptick of clicks in Search Console, specifically for these key terms. The page had good traction, it was getting indexed, and Google was now testing it for different key terms - we wanted to see whether it would find them or not, and luckily it did.
The next hurdle was where a page might have received a nice increase in impressions and a nice increase in clicks, but nothing was being generated from it. Using a little bit of our own thinking, and testing with heatmap data, we asked: is this page actually getting the appropriate amount of attention in the places that we want it to?
We realised that no one understood the two-tier menu bar, no one was clicking through and selecting things, and the filter tab was completely unused because it was too basic. When testing the scroll depth, we found that people would only go to about 50% and never 100%, but all the main product lines were about 75% of the way down, underneath all kinds of information (boilerplate template content, images, icons, and all this other noise). No one was actually converting on the products that they wanted to show people. Based on all of that information, we knew that we had to redesign the UX.”
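To illustrate the scroll-depth side of that analysis, a small sketch that buckets per-session maximum scroll depths; the event data is hypothetical, standing in for an export from a heatmap or analytics tool:

```python
# Sketch: what share of sessions reach each scroll-depth bucket?
# max_depths holds one max-scroll percentage per session; the
# values are hypothetical stand-ins for real analytics exports.
max_depths = [48, 52, 35, 50, 71, 44, 55, 49, 90, 46]

for threshold in (25, 50, 75, 100):
    reached = sum(1 for d in max_depths if d >= threshold)
    print(f"reached {threshold}%: {reached / len(max_depths):.0%} of sessions")
```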
What shouldn’t SEOs be doing in 2023? What’s seductive in terms of time, but ultimately counterproductive?
“Stop focusing on keywords. Search is changing, and this goes into the idea of Semantic SEO. It’s about topics now. It’s about positioning yourself as either the expert or the qualified, well-researched commentator on something. There is going to be more focus on, and more reward for, being able to really qualify things for people, particularly with the helpful content update coming out.
We’ve had so much collateral and thinking on this since 2017/2018, with Google E-A-T, but it is only going to progress further. Google is trying to personalise search even more. It wants to create whole new ways to understand media; it wants to focus on these attributes and entity types, organise them through social media and different types of interactive media, and structure it all in a way that is completely personalised for the user.
As marketers and business marketing teams, we need to think of ourselves as providers of some form of expertise, with a responsibility to create, curate, and provide more rich information to people. Start from this mentality, and this way of thinking.”
Nik Ranger is a Senior Technical SEO at Dejan Marketing, and you can find her over at dejanmarketing.com.