Rubbish In, Rubbish Out
Fili says: "You want to make sure you control what goes into the algorithms of search engines. As webmasters, we are responsible for what goes into the algorithms. We need to make sure that our website is crawlable, indexable, and that search engines understand our content. Now, a lot of the algorithms are like a black box, and we don't have control over how those algorithms are programmed or executed, and what comes out of the search results.
However, as webmasters, we do have control over what goes into those algorithms, so we want to make sure we send the right things into them. If we want to change the output, we need to change the input, and if we're sending in rubbish, we're going to get rubbish out. To improve our rankings, we need to focus on what we send into those algorithms."
What are examples of rubbish that SEOs typically send into the algorithms?
"Common examples include problems with trust signals, like canonicalization that is done incorrectly, or sitemaps that send the wrong signal by containing non-indexable patterns. Another example is indexable pages with barely any content. You're asking Google to index a particular page, but it doesn't have a unique selling proposition or value, so it shouldn't actually rank in the first place.
If you have too many of these, you're basically sending a lot of rubbish into the algorithms. You need to be very careful about what signals you're sending. Can Google actually trust your server codes? Can Google trust the content that you're sending?"
Will this give Google a degree of confidence in what you do, and impact its willingness to rank you for contextual terms?
"Yes. You've heard all the terms being tossed around, such as 'content quality', 'expertise, authoritativeness and trustworthiness', and 'domain authority'. It's true that a lot of these signals matter. It's a way of evaluating the quality of your content and seeing if you stand out enough. Do you have enough authority so the user would indicate to Google that this is a very good website? The overall trend of the signals you're sending into algorithms determines your level of trust and authority, and whether you should rank for certain terms or not."
What are specific aspects of sitemaps that SEOs are getting wrong at the moment?
"It's not just SEOs - it's primarily webmasters. What webmasters often get wrong is that they only generate one sitemap. They have a dynamic website, but they don't generate a new sitemap when they add new content. You end up with content that is indexable and discoverable on your website, but missing from your sitemap.
Another common mistake is having non-indexable patterns in the sitemap, such as redirects, 404s, or deleted content. Another example is people trying to rank in news because there are different rules for news sitemaps."
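The audit implied here can be sketched in a few lines: parse the sitemap and flag entries that return anything other than a clean 200, such as redirects and 404s. This is a minimal illustration, not a tool Fili recommends; the sitemap structure assumed is the standard sitemaps.org `urlset` format, and real audits would also check noindex tags and canonicals.

```python
# Sketch: flag sitemap entries that send "rubbish" signals into the
# algorithms - URLs that redirect, 404, or otherwise aren't indexable.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text: str) -> list[str]:
    """Extract <loc> values from a standard XML sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

def audit(urls: list[str]) -> list[tuple[str, str]]:
    """Return (url, problem) pairs for entries that shouldn't be listed."""
    problems = []
    for url in urls:
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                if resp.status != 200:
                    problems.append((url, f"status {resp.status}"))
                elif resp.url.rstrip("/") != url.rstrip("/"):
                    # urlopen follows redirects, so compare the final URL
                    problems.append((url, f"redirects to {resp.url}"))
        except urllib.error.HTTPError as err:
            problems.append((url, f"status {err.code}"))  # e.g. 404s
    return problems
```

Run `audit(sitemap_urls(...))` against a fetched sitemap and anything it reports is a non-indexable pattern that should be removed from the file.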
How important are sitemaps in the grand scheme of things? If you have a brilliant website for users and content, but you've got a terrible sitemap, is this likely to impact your ranking significantly?
"I'm going to narrow the field to XML sitemaps because HTML sitemaps are a completely different topic. When it comes to XML sitemaps, it doesn't impact your ranking significantly if everything else is brilliant. The idea of a sitemap is to provide a list of all the indexable URLs that you would like to have indexed and be considered as an SEO landing page - something that can be used in the search results to provide value for actual users. However, it's not the list alone that gets you crawled. Google also wants to see if there is any internal linking or canonicalization to it. There are multiple different signals that Google uses to prioritise which URLs it crawls, and the sitemap is just one of them."
What would you describe as poor content nowadays?
"You need to provide content that users are actually interested in and solves their problems. If you're not solving that problem, it's content mostly for SEO purposes. We see this a lot in eCommerce - there's hidden text that's not there for users, it's just there for search engines. This is not a good approach for users. The content isn't attractive to users and it's not carrying much weight either. Poor quality content doesn't necessarily mean lack of content, although that could be one of the reasons as well."
What constitutes important user signals that search engines pick up on?
"Keep in mind that when a user starts to search on an engine like Google, they are Google's user - they're not your user. They're not anywhere near being your user at this point. Google wants to satisfy the need of their users - not your users. The user will type in a query, see a bunch of search results and click on the first one. If the page loads very slowly, the content is very poor, or they run into errors, the user is likely to go back.
Also, you need to manage users' expectations. You need to make sure your meta tags and page titles are optimised and demonstrate the benefit of clicking on this search result versus the other nine. They need to show that this search result is going to give the user the answer they expect when they click on it. What is the unique selling proposition? If they click through and you don't deliver on these things, the user will go back.
Any situation where the user goes back to Google indicates that this search result did not satisfy them - and there can be plenty of reasons why. Basically, the user has just communicated that your search result isn't great. The key is that it's all about trends. I know there are a lot of people who are focused on specific numbers - but those change all the time and are based on a snapshot of what Google has crawled at that given moment. The web is continuously changing: content is removed, edited, and added on a daily basis, across a whole bunch of URLs.
Stop focusing on specific numbers and focus on the overall trends. What are you sending as a trend into the algorithms? This is also very important from an off-page perspective. Not necessarily for PageRank purposes, but for discovery purposes - to prioritise which URL Google should crawl next."
How do you explain algorithms to non-technical marketers?
"It's pretty simple - we're talking about formulas. If you change a number in the sum, then you're going to get a different output. A very basic way to explain algorithms is using the analogy of a navigation system in your car. If you're going to a certain destination and the navigation system is telling you where to go, but you're not paying attention, you will miss your turn. This means it takes longer to get there.
This is what happens with algorithms. They are basically calculating in the backend to see where you should be placed. They do continuous testing, and they're trying to figure out the best result for their users. Once a user converts on our website, then they become our user."
What's one thing an SEO needs to stop doing to spend more time focusing on the quality of the signals they're providing to search engines?
"I'm going to give you two things. Firstly, you need to make sure that you check what you're sending in. You need to do some audits of your website. You can do it yourself, or you can go to third parties - but you do need to check what goes in. There are many tools out there in the industry for you to discover. Don't just check what you are sending in from an on-page perspective - also check from an off-page perspective.
Secondly, you need to stop buying links for the sake of PageRank. That doesn't work, and it's not beneficial in most cases. If you're spending a certain amount of money every month to get 20 links pointing to your website - you're throwing your money in the garbage bin. Instead, you should be spending that budget on getting converting traffic. If you build links, make sure you build links that convert traffic on your website."
You can find Fili over at SEO.services.