Jonathan says: “My additional insight is, when planning strategy, to leverage data analysis and what-if scenarios to effectively navigate ambiguity and the unknown.”
What data analysis are you referring to when planning a strategy?
“I think what's important when starting a strategy – or if we look at what constitutes good strategy – is having this alignment with a bigger picture and a clear sense of direction.
What I like to do with any client or strategy that I'm pulling together is to essentially gain a good situational awareness of what's happening. Primary data sources, as I'm sure you're aware, are analytics data such as Google Analytics and Search Console. But you've got other data sources that supplement that, such as Majestic.
Really, it's having the time and the skill to be able to interrogate that data to get a good understanding of the existing performance in relation to the bigger picture. I find that, as SEOs, we do have a tendency to work in silos.
I think it's important to be able to zoom in and zoom out to get that understanding, using the data, of how SEO fits in amongst that bigger picture at a broader channel level, and as a business level as well.”
You mentioned a few sources of data there: GA4, Search Console, and Majestic. Are there any specific reports that you tend to use on a regular basis as a reliable source of data?
“When it comes to creating reports, I tend to think in segments. I try to start with, say, total traffic at the top and then, if we drill into that, you've got your channel data. Then you drill in further and you start looking at landing page data. I try to build segments, and that's a very basic 3-tiered segment there.
I try to put that into, how does everything fit as a bigger picture? If you think of something like a tree map, how does organic fit within that? How can you then group landing pages into page templates or content groupings? I want to understand and get a feel for how everything fits together.
For an e-commerce context, you've got your homepage, you've got your list pages, and you've got your product detail pages. I want to know the relationships at the beginning. I want to know what percentage of traffic is coming into the homepage. I want to know what's going into PLPs and I want to know what's going into PDPs. How does that fit as part of the composition?
Then, once I've got an idea of the composition, I want to start looking at the time series data. How has that trended over time? You've got the classic organic trends. Maybe there's been an update and a bit of a drop or there's been a release and there's been an uptick, but understanding that composition and the time series data is normally my main starting point.
Then I do a similar process with Search Console, but Search Console, of course, has the query data. Then it's more about, how can I segment the queries to get that understanding of what the composition is? You can get page query reports, so I can get the granularity there in terms of: ‘Okay, these are the queries that are landing on my PDPs versus the queries that are landing on my PLPs.’ It's building layers there on top of each other.”
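To make the kind of segmentation Jonathan describes more concrete, here is a minimal sketch in Python with pandas: grouping landing pages into templates (homepage, PLP, PDP), computing the traffic composition, and then applying the same grouping to Search Console page/query rows. The data, URL patterns, and template rules are illustrative assumptions, not Jonathan's actual setup.

```python
import pandas as pd

# Illustrative exports: in practice these would come from GA4 and
# Search Console (e.g. via CSV export or their reporting APIs).
ga4 = pd.DataFrame({
    "landing_page": ["/", "/mens/shoes/", "/mens/shoes/red-trainer-p123",
                     "/womens/boots/", "/womens/boots/leather-boot-p456"],
    "sessions": [12000, 8500, 3200, 6100, 2700],
})

gsc = pd.DataFrame({
    "page": ["/mens/shoes/", "/mens/shoes/red-trainer-p123"],
    "query": ["mens shoes", "red trainer size 9"],
    "clicks": [1400, 220],
})

def page_template(url: str) -> str:
    """Very rough URL-pattern rules to group pages into templates."""
    if url == "/":
        return "homepage"
    if "-p" in url:  # hypothetical URL convention for product detail pages
        return "PDP"
    return "PLP"

# Layered view: total traffic -> page templates -> composition (%)
ga4["template"] = ga4["landing_page"].map(page_template)
composition = (
    ga4.groupby("template")["sessions"].sum()
       .pipe(lambda s: (s / s.sum() * 100).round(1))
       .sort_values(ascending=False)
)
print(composition)  # what share of sessions lands on the homepage vs PLPs vs PDPs

# Same grouping applied to Search Console page/query data:
# which queries are landing on PDPs versus PLPs?
gsc["template"] = gsc["page"].map(page_template)
print(gsc.groupby("template")["clicks"].sum())
```

The same template rules can then be reused for the time series view, by grouping daily sessions per template and plotting the trends side by side.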
I liked your tree map example there and, obviously, you talked about e-commerce as well.
Is there an ideal percentage of traffic that you're looking to land on your product pages versus other pages such as home pages and other category-type pages on your site?
“The next thing I'd be looking at there is actually the depth of stock you have, right? You could have a smaller DTC brand with maybe a handful of products, say fewer than 20, or you could have a merchant with 10,000 or 100,000 products. The strategy or approach might be dependent on the products you have, and the product variants.
Then working back, knowing that number, and analysing what products you have, would give you a good indication of what is a sensible traffic percentage for PDPs versus PLPs. Generally speaking, if I've got a smaller range or inventory, traffic to PDPs is going to be higher versus if I've got a large inventory, where it’s probably going to be a more PLP-centric approach.
It's an understanding of, not just where the traffic's coming from, but what the inventory looks like as well, in that e-com context.”
You said that this data also helps you prepare for what-if scenarios. What are some typical what-if scenarios that you're trying to prepare for?
“The what-if normally comes in when you're looking at time series data, right? Search Console has got 18 months, GA4 typically around 14 months in some of the reports now.
Really, what we want to understand is what levers have been pulled. That goes back to being able to zoom in on organic performance and then what other external levers have been pulled. A good example would be an algorithm update or a migration. There might be levers that have been pulled or levers that you know might be pulled or anticipate being pulled there.
An example could be that you look at the links and you feel that the link profile might not be where you want it to be. It might be vulnerable to an algorithm update, or you feel that your content's not strong enough to perform well and it might be vulnerable to an upcoming core algorithm update because you've seen volatility in the past around these types of updates.
Using what's happened in the past can help you prepare for the future. That's where the what-if scenario comes in: it helps you prepare or anticipate. What would happen if, after a core update like the HCU, we had to rewrite all our blogs? You know, from your crawl data, you have 500 blogs, for example. What would happen if we had to rewrite all 500? Can we get away with rewriting the top X%?
It's understanding those what-if scenarios because planning is generally done at a yearly level. In the context of an agency, or even working in-house, you're probably going to get a budget for a year. There's a list of activities that you want to do over that time period.
Being able to anticipate what might change or what you want to do – creating your scenario planning – helps you figure out what pieces of the jigsaw you need to get in place.”
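As a rough illustration of the “rewrite all 500 blogs, or just the top X%?” scenario, the sketch below ranks blog URLs by clicks and checks how much of the blog's traffic the top 20% of posts would cover. The data is synthetic and the 20% cut-off is an arbitrary example, not a recommendation.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical crawl + Search Console join: 500 blog URLs with their clicks.
blogs = pd.DataFrame({
    "url": [f"/blog/post-{i}" for i in range(500)],
    "clicks": rng.pareto(1.5, 500).round().astype(int) * 10,  # skewed, like real traffic
})

blogs = blogs.sort_values("clicks", ascending=False).reset_index(drop=True)
blogs["cum_share"] = blogs["clicks"].cumsum() / blogs["clicks"].sum()

# What-if: rewrite only the top 20% of posts – how much traffic does that cover?
top_pct = 0.20
n_top = int(len(blogs) * top_pct)
coverage = blogs["cum_share"].iloc[n_top - 1]
print(f"Rewriting the top {n_top} posts covers {coverage:.0%} of blog clicks")
```

Because organic traffic is usually heavily skewed towards a minority of URLs, this kind of back-of-the-envelope scenario often shows that rewriting a fraction of the posts covers most of the traffic at risk.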
You mentioned that your link profile may not be as you hoped it would be. What does a link profile look like that you're not particularly content with?
“In most cases, when you're doing that initial situation analysis, it's just getting a feel for the type of links and citations you have, and the type of places you're getting them from, versus other peers or sites in your sector.
It's worth having a handle on what low-quality links you do have. I wouldn't say that you should go in and disavow straight away, but it's having that awareness of what historical activity has been done in a link-building context.”
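To make that “awareness, not immediate disavow” point concrete, here is a small sketch of how you might triage a backlink export for potentially low-quality links: flagging referring domains where citation-style metrics are high but trust-style metrics are low. Trust Flow and Citation Flow are Majestic's metrics, but the column names, data, and thresholds here are assumptions for illustration only; this is a prompt for a manual look, not a disavow decision.

```python
import pandas as pd

# Hypothetical referring-domain export with Majestic-style Trust Flow /
# Citation Flow columns – a real export would have many more rows and fields.
links = pd.DataFrame({
    "domain": ["bbc.co.uk", "spammy-directory.info", "niche-blog.com", "linkfarm.xyz"],
    "trust_flow": [62, 3, 24, 1],
    "citation_flow": [70, 41, 28, 55],
})

links["tf_cf_ratio"] = links["trust_flow"] / links["citation_flow"].clip(lower=1)

# Very rough triage: plenty of citations but little trust is worth a manual
# look as part of situational awareness – not an automatic disavow.
review = links[(links["citation_flow"] >= 20) & (links["tf_cf_ratio"] < 0.5)]
print(review.sort_values("citation_flow", ascending=False))
```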
‘Disavow’ is not a word that I hear very often nowadays. Is that something that still works as far as you're concerned?
“I haven't done a disavow for a long time.”
You brought it up.
“Yeah, I know, so now I've got to get myself out of this.
As SEOs, we're not good at dealing with paradoxes. The disavow is a great example of a paradox. Google is communicating that you don't need a disavow, and SEOs are saying you don't need a disavow, and there have been tests done on that. Some evidence suggests not to do that.
Then there are other schools of thinking, and something that Mike King has talked about is putting sites into tiers. If you're in a certain tier, Google might treat your links slightly differently.
It's going back to critical thinking and dealing with the paradox that what works in context A might not always work in context B, so I wouldn't necessarily rule it out. As you can tell, I haven't done too many for a long time, but I think it's having that situational awareness to anticipate.
As a minimum, you want to get a feel for whether your links are good enough and how you are going to build better links.”
One of the worst-case scenarios that you mentioned is the possibility of having to rewrite 500 blog articles. How do you actually determine that it's the quality of the content that has been the issue?
“I think there's some good old-fashioned content inventory to start with – actually eyeballing the blogs. There's been a lot of good stuff put out in the community in terms of how to assess that against the newest set of guidelines, particularly for the stuff that's been primarily written for search engines. I think the core principles still apply.
It's going through everything and, again, having a feel for what's sailing close to a certain threshold that you think might be susceptible to a future change. SEOs are kind of operating on a binary, and there's probably more of a continuum.
I’ve got a client that I took on fairly recently that has a load of articles written directly to target PAA (People Also Ask) searches, and the quality is not great, so we're having conversations like, ‘We don't think these are great – what would it take to rewrite them or make them better?’
It's those sorts of questions. It's, again, knowing what lever we could pull if we had to do something, or whether we want to pull that lever proactively.”
Do you mean that it's not great based on readability and delivering specific value to users as opposed to some metric of whether or not that article is bringing in traffic or ranking highly?
“Yes, absolutely. It's more that it's been primarily written for a search engine, and it follows a very particular format that is not necessarily designed to help the user answer their question.
A good example there is ‘What time do Everton kick off on Saturday?’ I'll go on a newspaper site and I have to scroll and scroll through a load of stuff just to get to the answer.
It's using the guidelines as a basis to assess what you consider to be good or bad, or has room for improvement.”
Is it even worthwhile attempting to compete for these types of keyword phrases in the future if Google is perhaps just going to incorporate them as an AI result on the SERP?
“That's another great what-if question, right? What if Google rolls out AI Overviews and we've got a handful of queries that we feel it’s more than likely going to be rolled out for? How do we mitigate that potential traffic loss, or is that traffic that's always been at risk because, essentially, the quality wasn't necessarily there in the first place?
There's a potential school of thought that, if Google is choosing to show an AI overview, perhaps the corpus isn't strong enough anyway, and it's something that they can essentially plagiarise and paraphrase.”
You talk about using this data analysis to help you navigate ambiguity. What ambiguity are you attempting to navigate and why is that important?
“For me, ambiguity is just making decisions without knowing the full picture, right? As marketers, you're never going to get that full picture. Or, in an agency or freelancer context, you're never told everything. You find out there's been a release, or so and so did this, or we briefed someone else to create a load of stuff.
It's being able to make sensible decisions based on the information that you have available to you at that time and knowing what to prioritise and what not to prioritise.”
You've shared what SEOs should be doing in 2024. Now let's talk about what SEOs shouldn't be doing. So, what's something that's a seductive use of time, but ultimately counterproductive? What's something that SEOs shouldn't be doing in 2024?
“I think, as a generalisation, there's a real skill in knowing what not to do. Try to avoid going into the minutiae. It's the ability to zoom in and zoom out of the detail.
SEOs should be avoiding cookie-cutter strategies – giving someone a 12-month plan that masquerades as a strategy but is really just a cookie-cutter plan. I would avoid doing that.
I've gone around the houses there a bit, but I think it's about putting a lot of thought into what not to do. Specifically, a 12-month plan that's just a copy-paste job is probably not a good thing.”
Jonathan Moore is Director at Jonathan Moore Consultancy Limited, and you can find him over at JonathanMoore.digital.