“You wouldn’t build a house without a solid foundation,” said Niki Mosier, SEO and Content Manager at AgentSync. The same goes for a website: you want to make sure the foundation is solid and that there are no cracks.
Optimizing your site architecture can help search engine spiders find and index your content, allowing them to show those pages to users in search results. It can also help distribute link authority around your site and make it easy for visitors to find what they are looking for.
In her session at Create SMX, Mosier shared the strategies she uses to ensure her sites’ foundations are strong and to identify opportunities for greater search visibility.
Crawl budget analysis
Crawl budget refers to the number of URLs per site that Googlebot (or any other search engine crawler) can and wants to crawl.
“Each website receives a crawl budget, which can vary depending on the size of the site and how often new content is posted to it, so getting a feel for what the crawl budget is for a website can be really beneficial in making informed decisions on what to optimize,” Mosier said.
Performing a crawl budget analysis allows you to have a more complete view of:
- How your website is crawled. “If you identify Googlebot as the client, you can use log file analysis to find out how Googlebot is handling your site’s URLs [and] if it crawls pages with parameters,” she said.
- How fast your site is. While there are many tools that can tell you how quickly your server responds, a log file analysis shows you how long it takes a bot to download a resource from your server.
- Indexing issues. “Accessing the log files can really show us if bots are having trouble downloading a page entirely,” Mosier said.
- How often a URL is crawled. Crawl frequency can be used to determine whether there are URLs that a search engine crawler should be crawling but isn’t, or vice versa.
- Crawl issues. This analysis can also reveal when a crawler encounters 404 errors or redirect chains, for example.
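The kind of log file analysis described above can be sketched with a short script. This is a minimal illustration, not Mosier’s actual workflow: it assumes access logs in the common combined format, and the sample log lines below are hypothetical.

```python
import re
from collections import Counter

# Hypothetical access-log lines in combined log format.
SAMPLE_LOG = """\
66.249.66.1 - - [10/Oct/2021:13:55:36 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/Oct/2021:13:56:01 +0000] "GET /products?color=red HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/Oct/2021:13:57:12 +0000] "GET /old-page HTTP/1.1" 404 340 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [10/Oct/2021:13:58:00 +0000] "GET /products HTTP/1.1" 200 5120 "-" "Mozilla/5.0"
"""

LINE_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .* "(?P<agent>[^"]*)"$'
)

def analyze(log_text):
    """Count Googlebot hits per URL, error responses, and parameterized URLs."""
    hits, errors, parameterized = Counter(), Counter(), set()
    for line in log_text.splitlines():
        m = LINE_RE.search(line)
        if not m or "Googlebot" not in m.group("agent"):
            continue  # only interested in Googlebot's crawl activity here
        url, status = m.group("url"), int(m.group("status"))
        hits[url] += 1
        if status >= 400:
            errors[url] += 1  # crawl issues: 404s and other errors
        if "?" in url:
            parameterized.add(url)  # possible wasted crawl budget
    return hits, errors, parameterized
```

In practice a dedicated log analyzer handles bot verification (reverse DNS) and much larger volumes, but the same counting logic underlies the reports those tools produce.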
“When it comes to performing a crawl budget analysis, there are a few useful tools,” said Mosier, recommending Screaming Frog’s Log File Analyser, Microsoft Excel and Splunk.
Mosier outlined her steps for performing a crawl budget analysis:
- Get your log files; Mosier recommended working with at least a month of data.
- Look at URLs with errors.
- Evaluate which bots crawl which areas of your site.
- Evaluate by day, week and month to establish patterns that can be useful for analysis.
- Check to see if a crawler is crawling URLs with parameters, which may indicate wasted crawl budget.
- Cross-reference crawl data with sitemap data to identify content that is being missed.
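The last step — cross-referencing crawl data with sitemap data — boils down to a set comparison. Here is a minimal sketch under assumed inputs: the sitemap XML and the list of crawled URLs are hypothetical, and in practice you would fetch the live sitemap and extract URLs from your log files.

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap; in practice you would fetch the site's real sitemap.xml.
SITEMAP_XML = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/products</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text for loc in root.findall("sm:url/sm:loc", NS)}

def crawl_gaps(crawled, sitemap):
    """Sitemap URLs bots never requested, and crawled URLs missing from the sitemap."""
    return sitemap - crawled, crawled - sitemap

# Hypothetical set of URLs pulled from the server logs.
crawled = {"https://example.com/", "https://example.com/products",
           "https://example.com/old-page"}

not_crawled, not_in_sitemap = crawl_gaps(crawled, sitemap_urls(SITEMAP_XML))
```

URLs in the first set are content search engines are missing; URLs in the second may be orphaned or deliberately excluded pages worth a closer look.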
“Once you’ve delved into the server logs and got a good idea of what your crawl budget looks like, you can use that data to prioritize your SEO tasks,” she said, adding that SEOs should “prioritize based on the impact fixing different areas of your site will have, the development resources needed to resolve issues, and the time to resolve those issues.”
RELATED: How to optimize the crawl budget of your website
Generate traffic through technical SEO
Finding out how well your site works can help you put in place the right strategies to drive more traffic to it.
“Performing regular site audits is a great way to keep track of what’s going on with our websites,” Mosier recommended. Additionally, Google Search Console can be used to check for Core Web Vitals or schema issues, for example. “Using monitoring tools [such as] Rank Ranger, Semrush and Ahrefs is a great way to stay alerted to any issues that may arise with your website,” she said.
Evaluating search engine results pages (SERPs) can give you an idea of the keyword landscape you are targeting. In addition to showing which search features may be available, the SERP also shows you which sites outrank you. “See what those sites are doing; looking at their source code can tell you what schema they’re using,” Mosier said, adding that you should also look at their pages to see what their headers and user experience look like.
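The source-code inspection Mosier describes can be partly automated. The sketch below — with a hypothetical competitor page as input — pulls the `@type` of each JSON-LD block and the page’s top-level headings; real pages are messier, so a proper HTML parser may be needed.

```python
import json
import re

# Hypothetical competitor page source.
HTML = """<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "FAQPage"}
</script>
</head><body><h1>Buying guide</h1><h2>Top picks</h2></body></html>"""

def schema_types(html):
    """List the @type of every JSON-LD structured-data block in a page."""
    blocks = re.findall(
        r'<script type="application/ld\+json">(.*?)</script>', html, re.S)
    return [json.loads(b).get("@type") for b in blocks]

def headings(html):
    """List (tag, text) pairs for the page's h1/h2 headings."""
    return re.findall(r"<(h[12])>(.*?)</\1>", html)
```

Running this across the top-ranking URLs for a target keyword gives a quick inventory of which schema types and heading patterns the SERP currently rewards.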
Updating your old content can also lead to improved rankings. Mosier recommends paying close attention to your headlines and to content above the fold. Adding schema markup can also allow your content to appear as a rich result, which can increase your visibility on the SERP.
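As an illustration of the kind of markup that can unlock a rich result, here is a minimal schema.org `FAQPage` JSON-LD snippet built in Python. The question and answer are placeholders, not content from the session.

```python
import json

def faq_jsonld(qa_pairs):
    """Build a schema.org FAQPage JSON-LD object from (question, answer) pairs."""
    return {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": q,
                "acceptedAnswer": {"@type": "Answer", "text": a},
            }
            for q, a in qa_pairs
        ],
    }

# Placeholder Q&A content for illustration only.
markup = faq_jsonld([
    ("What is crawl budget?",
     "The number of URLs a search engine can and wants to crawl on a site."),
])
snippet = '<script type="application/ld+json">%s</script>' % json.dumps(markup)
```

The resulting `snippet` would go in the page’s `<head>`; Google’s Rich Results Test can confirm whether the markup is eligible for a rich result.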
“Using tools like Frase or Content Harmony can help you see what other sites that rank for the keywords you want to rank for are using for headlines, what type of FAQ content they have and what content they place above the fold,” she added.
“Paying attention to page speed is definitely an important metric to think about, [but] I think it’s also important to pay attention to the industry average,” Mosier said. “So go see where your competitors’ sites rank in page speed and set that as a benchmark.”
It’s also important to gauge each page’s speed, not just overall site speed: “You want to see what each page on your site is used for and make page-by-page improvements, not just look at the site’s speed as a whole, because pages are what get ranked, not necessarily the whole site,” she said.
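The benchmarking Mosier suggests is simple arithmetic once you have the numbers. The sketch below assumes hypothetical performance scores (e.g. from Lighthouse) for competitor pages and for your own pages, then flags the pages falling below the competitive median.

```python
from statistics import median

# Hypothetical performance scores for competitor pages.
competitor_scores = [72, 80, 65, 78]
benchmark = median(competitor_scores)  # the "industry average" benchmark

# Hypothetical per-page scores for your own site.
our_pages = {"/": 85, "/products": 58, "/blog": 70}

# Pages to prioritize for page-by-page speed work.
below_benchmark = sorted(url for url, score in our_pages.items()
                         if score < benchmark)
```

Scoring page by page, rather than averaging the whole site, surfaces the specific templates that are dragging down rankings.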
Additionally, the way your pages render can affect your user experience as well as what search engine spiders “see.” “Is there a pop-up or a really big header on a particular page that takes up a lot of the space above the fold? That can be a problem,” Mosier said, noting that page speed can also have an impact on how search engines render a page.