Don't forget the basics - but also make sure you monitor JavaScript
Natalie says: "Don't forget the basics. A lot of people focus on what's trendy and what everyone's talking about, but SEOs should make sure that best practice, and everything that's quick and easy, is also remembered."
What would you prioritise as the key basics for 2022?
"Site structure, information architecture, and hierarchy. They're seen as more advanced in the content side of things and are often only thought about early on when creating and structuring a website. SEOs need to consider these things throughout the process, rather than just at the start of something.
You also have to think about things like meta content, which can affect click-through rate. We've seen the changes that Google has made in the last year, such as rewriting page titles in SERPs. Obviously, you need to make sure that you are trying to get users to the site as much as possible. At the end of the day, we're here to get qualified traffic that converts to a website."
What are you looking at in terms of site structure?
"For site structure you're looking at the coding of the site, particularly where it's linked to information architecture. You need to be doing things like creating content hubs, but also making sure that the content is adequately internally linked to as well. For an eCommerce site, make sure that your navigation is well-structured, it appeals to the user, and they are using it properly. Don't just create for creation's sake."
How is information architecture different to site structure?
"Site structure is more about the technical side and the structure of the site. It's things like internal links. For information architecture, you're not just looking at the physical structure of a site, you're also looking at how the information is organised. That may be content hubs, blogs, domains, subdomains, etc. There are many different areas involved in that."
Does hierarchy refer to the number of links from the homepage?
"It does, but there are lots of different types. Alongside information architecture and site structure, looking at the hierarchy of the content on your site is very important. Often, you'll find that you have really informative, helpful pieces of content sitting several layers down on a random blog. Things like that need to be analysed and prioritised, because you don't want useful content sitting further down in a site structure. You want users to be able to find that informative and informational content. By prioritising your hierarchy, that content will rank better."
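The "links from the homepage" idea above can be sketched as a breadth-first search over a site's internal-link graph, where a page's depth is the minimum number of clicks needed to reach it. This is only an illustrative sketch - the site dictionary and the three-click threshold are hypothetical, not anything Natalie prescribes:

```python
from collections import deque

def click_depth(links, homepage):
    """BFS over an internal-link graph: depth = minimum number of
    clicks from the homepage needed to reach each page."""
    depths = {homepage: 0}
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit is the shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: a useful guide buried four clicks deep in a blog.
site = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/page-2"],
    "/blog/page-2": ["/blog/page-3"],
    "/blog/page-3": ["/blog/old-but-useful-guide"],
}
depths = click_depth(site, "/")

# Pages deeper than an arbitrary 3-click threshold are candidates
# for re-prioritising in the hierarchy.
buried = [page for page, depth in depths.items() if depth > 3]
```

Running this flags `/blog/old-but-useful-guide` at depth 4 - exactly the kind of page that would benefit from being linked higher up the structure.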
Is there a maximum number of levels deep that pages can be in a hierarchy?
"I don't think so. It all depends on the site itself; how it's already structured and how you want it to be laid out. If you are migrating, or making changes to your site, make sure that your user journey is supported as much as possible. You need to consider that. You will obviously have different types of users visiting sites at different times, so it's not going to be perfect. Make sure that the journey to that piece of information, or content, is as accessible as possible."
What JavaScript is it important to monitor?
"Tracking scripts and sites built entirely in JavaScript are both relevant. The use of JavaScript is increasing every single year. People are moving to different types of platforms that either use a lot of it or are fully built on JavaScript. People often aren't aware of how that might impact SEO. A lot of SEOs are still on a big learning curve with JavaScript. We know that the market is changing, and you need to make sure that you're keeping on top of that, and understanding what issues may arise. You also need to know who to go and speak to with regards to getting those issues fixed.
You might have a client that wants to re-platform to a site that's fully JavaScript. In that case, you will need to know a bit about JavaScript SEO and how that's going to affect you, and any issues that you need to be aware of initially. It might be that you're getting a new client and pieces of JavaScript in their site are affecting how their content is being crawled, indexed, and ranked. There are lots of reasons why you should look at and monitor JavaScript. Using tools is one step, but there is also education to be done around it. You want to understand why it's used by developers and why clients want to use it, as well as the impact and restrictions it has on you as an SEO."
Could search engines have difficulty crawling JavaScript sites in the same way that they had trouble with flash-based sites in the past?
"It depends. Search engines should know and understand JavaScript as it does take up a large proportion of the web itself. They should be able to crawl that content. They're slowly getting there, but I don't think it's fast enough. As SEOs, and as developers, we need to be aware of the restrictions, and what we need to take into consideration around JavaScript. It does impact many different things - from server-side rendering to whether search engines can crawl internal links or see links in navigation. There are lots of different issues. If you don't have clients or developers on your side with those difficulties, then it presents you with a whole other layer of problems."
Should crawlers like Lumar (formerly Deepcrawl) and Screaming Frog be able to give you the same information about a JavaScript site as a regular HTML site?
"I think so, yes. Certain crawlers, including Sitebulb and Deepcrawl, provide response vs rendered comparisons, which is really helpful. Before a lot of crawlers were able to do that, you had to rely on Chrome plugins - and a lot of SEOs still do. Having crawling tools give us things like response vs render also enables you to present that information to an internal team, a developer, or whoever you're trying to get buy-in from. You can actually show them the issues that you're having."
What tools do you recommend for monitoring JavaScript?
"Personally, I use a number of different tools. Sitebulb is a great tool and Screaming Frog is very useful - it doesn't give you response vs render yet but that should be implemented soon. ContentKing is great at highlighting any issues and changes. If a developer has implemented a fix regarding JavaScript, it will tell you whether that's been completed without having to constantly run crawls."
How often should a site be crawled for JavaScript monitoring?
"It will depend on the team, what developers are doing, and what's being implemented at that time. We all have different workloads and devs will have different views. A lot of SEOs run crawls on a weekly or monthly basis, but it shouldn't be up to you to crawl a site every day. Having something that will alert you to those issues will really help."
What are the most common, and most important, issues to be aware of?
"Canonicals are a big issue. Years ago, I had an issue where a whole site was canonicalized to one folder. If we had a monitoring tool that alerted us to that at the time, we would have been able to jump on it a lot quicker. Issues like that can arise.
It does depend on what's going on internally, what's going on in the developer's queue, and what's going on in the sphere. The type of platform will affect the issues as well because they can change things themselves, especially if they're not custom-built or use a lot of plugins that need to be updated, like Shopify and WordPress."
What's one thing that SEOs should stop doing to focus more time on securing the basics and monitoring JavaScript?
"Stop creating content without strategy. If you're creating content for content's sake, you should spend that time doing something else. A piece of news that is not optimised or SEO-led is not going to get traffic to your site. You could be spending that time elsewhere. The main priority for SEOs is traffic acquisition and user acquisition. You're not going to acquire customers through a piece of news that's not going to be distributed, and is not going to be ranking."
You can find Natalie Arney over at NatalieArney.com.