Make sure the signals you are sending are consistent
Kaspar says: “Signal consistency. It might not be as popular as some other strategies, but it will serve you well if you embrace it – and there are very good reasons for that.”
What do you mean by signal consistency?
“SEO is about input. Google, and other large search engines, don’t like or dislike websites. They are quite indifferent. The rankings for relevant queries are the output, but what really counts is the input. You have to make sure that the signals emanating from the website are consistent, especially in very competitive environments such as travel, retail, and affiliates.
Assume that search engines, and Google in particular, can’t trust your canonical signals. They may algorithmically pick and choose the canonicals that they believe are accurate, but those are not necessarily the URLs you would prefer to be crawled, indexed, and ranked.
You have to check those signals and make sure that your technical signals (including canonicals) are consistent, and that they go hand in hand with your content signals and off-page signals, including backlinks.
That input ultimately determines how visible a website ends up being for competitive queries. Signal input is everything, and there are ways to go about ensuring that consistency.”
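Not part of the interview, but as a minimal sketch of what checking canonical signals can look like in practice: the script below fetches a few placeholder URLs, pulls the rel="canonical" from the HTML head and from the HTTP Link header, and flags pages where the two disagree or where the canonical points away from the URL itself.

```python
# A minimal sketch (not from the interview): spot-check canonical consistency
# by comparing the rel="canonical" declared in the HTML <head> with the
# canonical sent in the HTTP Link header. URLs below are placeholders.
import re
import urllib.request
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Remembers the href of the last <link rel="canonical"> seen in the page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

def check_canonical(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        body = resp.read().decode("utf-8", errors="replace")
        link_header = resp.headers.get("Link", "")  # canonical can also live here
    header_match = re.search(r'<([^>]+)>\s*;\s*rel="?canonical"?', link_header)
    header_canonical = header_match.group(1) if header_match else None
    parser = CanonicalParser()
    parser.feed(body)
    html_canonical = parser.canonical
    issues = []
    if html_canonical and header_canonical and html_canonical != header_canonical:
        issues.append("HTML and HTTP header canonicals disagree")
    if html_canonical and html_canonical.rstrip("/") != url.rstrip("/"):
        issues.append(f"canonical points elsewhere: {html_canonical}")
    return issues

if __name__ == "__main__":
    for url in ["https://www.example.com/", "https://www.example.com/paris/"]:
        print(url, "->", check_canonical(url) or "consistent")
```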
Are all search engines essentially looking for the same signals?
“To a large extent, yes. I’m primarily referring to Google and Bing. Bing’s market share is relatively small, but that 5% of potential traffic can still be millions of impressions and clicks.
When you optimize for Google, you are, at the same time, very well positioned to optimize for Bing, because those two search engines tend to embrace and appreciate the same kind of technical signals.
You don’t necessarily need to do anything differently for Bing. A website that’s well optimized for Google tends to be very well optimized for Bing as well.”
Are LLMs part of this as well?
“It had to come up, didn’t it? LLMs are really just that: large language models. In my experience, they can be a great tool. They can be very useful for expediting repetitive processes, particularly for software development.
They can also be a double-edged sword. If LLMs are applied as a silver bullet for any and all SEO challenges, the ultimate result frequently ends up being a rather undesirable decline in visibility. LLMs are only as good as the user, and that user’s deeper understanding of SEO.
I do not believe that LLMs should be a large part of the conversation when we talk about SEO strategy – and certainly not when we talk about priorities for 2025. However, there will of course be people out there who think differently.”
How do you ensure that your SEO signals are consistent?
“The first, and most important, step for anyone who wants their website to be consistently visible is auditing on a regular basis. I keep hearing (from prospective clients and at conferences as well) that people are terrified of the next Google update because their website seems to be right on the edge.
It’s going up and down all the time. Whenever there is a new update, it’s a terrifying experience and they can’t focus on their actual activities and core business goals because their organic search visibility is either in freefall or it’s going up unexpectedly.
That’s a perfect example of where you are not providing signal consistency: every time there is a Google update and the algorithms are being fine-tuned, the website falls over the edge. It’s terrifying and it’s something to be avoided.
You can avoid it through defensive audits. I like to think of aspirational SEO audits but, in a situation where you are already in distress, it is a defensive audit. For any publisher in a more comfortable and stable situation, it would be an aspirational audit.
It’s comparable to bringing your vehicle in for a checkup before embarking on a longer journey. You want to make sure that the fluids are right, your brakes are okay, the lights are okay, etc. You have to think about your commercially viable website, or your online business, as a vehicle. You do not want to get stuck while you’re trying to overtake your competitors, so you have to check those signals.
One very good reason for doing that: even though the website was built with a lot of love, commitment, and resources, any solution that was applied at the time it was built is going to age. The content itself may age. Those legacy signals tend to create inconsistencies, particularly with large websites. For that reason, you need to audit regularly.”
How often should you be auditing?
“There are 2 different, diametrically opposed approaches: people either do it very rarely, or large operations think they need to do it every 3 months. I would not support either of those approaches.
I think you should audit once per year; every 12 months. It should be an annual, set-in-stone cycle in which a holistic SEO audit is conducted. I personally think that a third party does it better, and there are very good reasons for that. A third party doesn’t have the insider knowledge. They look at the website with a very cold perspective, specifically targeting weak points in its armour.
The audit should be holistic, including server log analysis covering the past 12 months (or, preferably, an even longer period of time). It should include on-page and off-page signals. It should obviously include the infrastructure and site architecture, but also the technical signals.
If that’s done once every 12 months, or every 16 months, that’s perfectly fine. If it’s done less frequently than that, you run the risk of falling behind. Simply put, legacy signals will become too prominent, and rankings will suffer. If it’s done much more frequently, it doesn’t leave enough time for implementing improvements. Auditing too frequently isn’t a good use of an SEO budget either. Hence, once every 12 months is a very good rule.”
What are the key on-page and off-page elements that you’re looking for?
“I’ll try to keep it brief, but one area that is particularly close to my heart is server logs – and simply having those. Of course, they need to be recorded in the first place and preserved, rather than being frequently overwritten.
Having server logs allows you to accurately address Googlebot’s and Bingbot’s crawling priorities. This becomes particularly important in fast-moving environments with large websites that have a lot of landing pages, where having up-to-date pages is critically important for ranking in search engine results.
If the relevant landing pages with the relevant products or services are not being crawled frequently enough, users can be confronted with landing pages that do not live up to their expectations.
If a user clicks on a result for ‘travel to Paris’, and the landing page does not live up to their expectations because the travel option described on that page no longer exists, what is that user going to do? They’re going to go back to Google’s search results and look for an alternative or refine their query. Their behaviour will indicate that their expectations weren’t met and that they were disappointed.
That is a particularly poor user signal. In an individual instance, it’s of no real consequence, but on a large scale, it’s really bad. Poor user signals tend to generate poor rankings over time. Hence, you have to make sure that your crawl budget is allocated towards landing pages that live up to users’ expectations.
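As a rough illustration of that idea (not Kaspar’s own tooling), here is a minimal sketch that tallies requests claiming to come from Googlebot per top-level directory of a site, using an access log in the common ‘combined’ format; the log path is a placeholder, and proper bot verification via reverse DNS is left out for brevity.

```python
# A minimal sketch of seeing where Googlebot's crawl budget actually goes:
# tally Googlebot requests per top-level directory from an access log in the
# common "combined" format. The log path is a placeholder, and verifying that
# the requests really come from Googlebot (reverse DNS lookup) is omitted.
import re
from collections import Counter

# Typical "combined" line:
# ip - - [time] "METHOD /path HTTP/1.1" status bytes "referer" "user-agent"
LINE_RE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def crawl_distribution(log_path):
    """Count Googlebot requests per first path segment, e.g. /flights."""
    sections = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as handle:
        for line in handle:
            match = LINE_RE.match(line)
            if not match or "Googlebot" not in match.group("agent"):
                continue
            # Group /flights/paris, /flights/rome, ... under /flights
            segment = "/" + match.group("path").lstrip("/").split("/", 1)[0]
            sections[segment] += 1
    return sections

if __name__ == "__main__":
    for section, hits in crawl_distribution("access.log").most_common(15):
        print(f"{hits:>8}  {section}")
```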
Managing your crawl budget is hugely important, particularly for large sites. We could go on and on and discuss the matter in much more detail, but we wouldn’t have the time to cover that topic adequately.
I do want to mention, however, that it is never about just one signal. You can manage the crawl budget adequately, but there might be legacy backlinks that come back to haunt you: links accumulated over a long period of time by teams long gone, lingering around only to be uncovered at some point, leading to an undesirable outcome such as a Google manual spam action or penalty.
For that reason, a holistic approach is what allows you to really understand what your signals are, whether they are consistent, and whether you can improve them in order to put your best foot forward – to make the website understood by Googlebot and Bingbot so that the rankings are as good as they possibly can be.
Log file analysis isn’t mandatory – the vast majority of websites don’t record log files at all – but it provides a huge advantage. It is a game changer if you can tap into the server log files. As of today, 9 out of 10 operators out there do not save and preserve their server log files.”
Do you have any preferred software for server log file analysis?
“No, and every operation is different. It’s hard to give a general solution that would be applicable across the board.
It is important to make sure that the right fields are being saved and preserved. You want to have the IP addresses, the server responses, and much, much more.
The article I penned about this topic on Search Engine Land, Why Server Logs Matter for SEO, provides a little bit more guidance in terms of what needs to be saved and preserved. However, every solution needs to be tailored to the technical setup of the website and the technical acumen of the team behind it.”
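As a rough, hypothetical illustration of checking that ‘the right fields’ are being captured, the sketch below scans an nginx configuration for log_format directives and reports which commonly useful variables (client IP, timestamp, request line, status code, response size, referrer, user agent) are missing. The variable list is an assumption drawn from the widely used ‘combined’ format rather than the article’s full recommendations, and the config path is a placeholder.

```python
# A hypothetical sketch: check that an nginx log_format captures the fields
# commonly needed for SEO log analysis. The variable list is an assumption
# based on the standard "combined" format, and the config path is a placeholder.
import re

# Fields you generally want preserved: client IP, timestamp, request line,
# status code, response size, referrer, and user agent.
WANTED_VARIABLES = [
    "$remote_addr", "$time_local", "$request", "$status",
    "$body_bytes_sent", "$http_referer", "$http_user_agent",
]

def audit_log_format(nginx_conf_path):
    with open(nginx_conf_path, encoding="utf-8") as handle:
        conf = handle.read()
    # Pull out every log_format directive: its name and quoted template.
    formats = re.findall(r"log_format\s+(\S+)\s+([^;]+);", conf)
    if not formats:
        # With no explicit log_format, nginx uses the built-in "combined" format.
        print("no custom log_format found; the default 'combined' format applies")
        return
    for name, template in formats:
        missing = [v for v in WANTED_VARIABLES if v not in template]
        status = f"missing {missing}" if missing else "all wanted fields present"
        print(f"log_format '{name}': {status}")

if __name__ == "__main__":
    audit_log_format("/etc/nginx/nginx.conf")
```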
If an SEO is struggling for time, what should they stop doing right now so they can spend more time doing what you suggest in 2025?
“Stop waiting for and anticipating the next Google update. It will come. It’s coming. In the course of day-to-day operations, there will be at least one update. Waiting for those to happen, and fretting about the fluctuations they may cause, is a time-consuming exercise and it’s not very productive. At the end of the day, you can’t undo an update, and you can’t stop it from happening.
Instead, do something that you can control, which provides an opportunity to improve. If that is the objective, the two things that need to be abandoned are looking out for the next update and looking at what your competitors are doing. Neither of these is productive.
Instead, focus your resources and time on improving your own signals. Focus on what can be done – and that depends on the budget, the size of the website, the team that is available, and the third-party providers you have access to.
If possible, focus on making sure that your website’s signals are consistent. Crawl your website repeatedly with a number of different crawlers, compare the results, dive into the code, and put your best foot forward. Do an SEO audit in anticipation of improvement, rather than fretting over potential visibility losses.”
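To make that last point concrete, here is a minimal sketch, with placeholder filenames and assumed column names, of comparing two crawler exports: it loads each CSV keyed by URL and reports URLs where the status code or declared canonical disagrees between the two crawls, along with URLs only one crawler discovered.

```python
# A minimal sketch (placeholder filenames, assumed "url"/"status"/"canonical"
# columns) of comparing two crawler exports to surface inconsistent signals.
import csv

def load_crawl(path):
    """Map URL -> (status, canonical) from a crawl export CSV."""
    rows = {}
    with open(path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            rows[row["url"]] = (row.get("status", ""), row.get("canonical", ""))
    return rows

def compare(crawl_a, crawl_b):
    # URLs both crawlers saw, but with conflicting status or canonical.
    for url in sorted(crawl_a.keys() & crawl_b.keys()):
        if crawl_a[url] != crawl_b[url]:
            print(f"{url}\n  crawl A: {crawl_a[url]}\n  crawl B: {crawl_b[url]}")
    only_a = crawl_a.keys() - crawl_b.keys()
    only_b = crawl_b.keys() - crawl_a.keys()
    print(f"{len(only_a)} URLs found only in crawl A, {len(only_b)} only in crawl B")

if __name__ == "__main__":
    compare(load_crawl("crawl_a.csv"), load_crawl("crawl_b.csv"))
```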
Kaspar Szymanski is Senior Director at SearchBrothers, and you can find him over at SearchBrothers.com.