The not-so-SEO checklist for 2022

30-second summary:

  • With several Google algorithm updates in 2021, it's easy to fall into a dangerous trap of misconceptions
  • One factor that remains constant is the value Google places on great content
  • Core Web Vitals aren't the be-all and end-all of ranking factors, but a tiebreaker
  • Read this before you create your SEO strategy for 2022!

The year 2021 was a relatively busy one for Google and SEOs across the world. The search engine behemoth is improving itself all the time, but in this past year, we saw a number of pretty significant updates that gave digital marketers cause for paying attention. From rewarding more detailed product reviews to nullifying link spam, Google keeps thinking of ways to improve the user experience on its platform.

Speaking of user experience: the biggest talking point of the year was June’s Page Experience update, which took place over a few months and notably included the Core Web Vitals.

After that happened, tens of thousands of words were published around the web instructing people on how to modify their websites to meet the new standards.

Mobile-friendliness became even more important than before. Less experienced SEOs out there might have started looking to the Core Web Vitals as the new be-all ranking factor for web pages.

With all this new information on our hands since last year, it’s possible that some misconceptions have sprung up around what is good and bad for SEO in 2022.

In this post, I want to bring up and then dispel some of the myths surrounding Google’s bigger and more mainstream 2021 updates.

So, here it is – the not-so-SEO checklist for 2022: three things you shouldn't do.

1. Don’t prioritize Core Web Vitals (CWV) above quality content

It’s no secret that Google’s Core Web Vitals are among the elements you’ll want to optimize your website for in 2022 if you haven’t done so already.

As a quick reminder, the Core Web Vitals sit at the crossroads between SEO and web dev: they are the measurements of your website's largest contentful paint, first input delay, and cumulative layout shift.

Together, they capture how quickly your page's main content loads, how quickly the page responds to a user's first interaction, and how much the layout shifts around while loading. Logic tells us that the slower and less stable those experiences are, the worse your site's user experience will be.


First of all, this isn’t exactly new information. We all know about page speed and how it affects SEO. We also know how vital it is that your Core Web Vitals perform well on mobile, which is where around 60 percent of Google searches come from.

Google takes its Core Web Vitals so seriously as ranking factors that you can now find a CWV report in Google Search Console and get CWV metrics in PageSpeed Insights results (mobile-only until February of 2022, when the metrics roll out for desktop).

Given that, why am I calling it a misconception that Core Web Vitals should be at the top of your SEO-optimization checklist for 2022?

It’s because Google itself has explicitly stated that having a top-shelf page experience does not trump publishing killer content. Content is still king in SEO. Being useful and answering user questions is one of the most crucial ranking factors.

So, it’s a misconception that Google will not rank you well unless your Core Web Vitals are all in solid, healthy places.

However, having it all is the ideal situation. If you have great web content and optimized Core Web Vitals, you'll probably perform better in organic search than a comparable page without strong Core Web Vitals.

In 2022, therefore, work on your Core Web Vitals for sure, but develop a detailed content marketing plan first.

2. Don’t assume your affiliate product-review site is in trouble

Another misconception that might have followed from a 2021 Google update is that affiliate sites, specifically product-review sites, were in some hot water after the Product Reviews update from April.

Google meant for the update to prioritize in-depth and useful product reviews over reviews that are spammy and light on details. In other words, just as in organic search, higher-quality content is going to win here.

If anyone was still making money by running a shady, low-quality affiliate site full of nonsense product reviews spammed out to thousands of people, Google's April 2021 Product Reviews update started to put an end to it.

The search engine now prioritizes long-form, detailed reviews, the kind that generates trust from users. That is the type of affiliate content that stands to benefit from Google's update, while the spammy sites will continue to vanish from top rankings.

Therefore, we can forget about the misconception that good, honest, hard-working affiliate product reviewers would somehow be hurt by the update.

If you've been presenting something relevant and legitimately useful to users, you may even have seen your rankings rise since April 2021.

3. Don’t assume Google will rewrite all your titles

The last misconception I want to address here is the idea that you don't need to put effort into your pages' title tags because Google is going to rewrite them all anyway following its August 2021 title-rewrite initiative.

First, some explanation. Back in August, as many of you will know, SEOs across the industry started noticing their page titles being rewritten in search results, as in, not appearing as they had originally created them.

Google soon owned up to rewriting page titles, but only those it believed were truly sub-par for user experience. In Google’s view, those junky title tags included ones that were stuffed with keywords, overly long, boilerplate across a given website, or just plain missing.

But SEOs noticed that even seemingly well-optimized title tags were being rewritten, and the new titles didn't always come directly from the original title. Sometimes, as Google has been doing since 2012, the search engine would use semantics to rewrite a title to be more descriptive or simply better.

In other cases, Google’s new titles came from H1 text, body text, or backlink anchor text.

Google saw these efforts, and still does, as one great way to improve the user experience of search.

Many SEOs, however, did not see it that way, especially given that Google’s rewrites were sometimes responsible for drops in traffic.

To put it mildly, there was uproar in the SEO community over the change, so much so that Google explained itself a second time just a month later, in September 2021.

In that blog post, Google said that it uses marketers' own title tags 87 percent of the time (up from just 80 percent in August). The other 13 percent were rewrites intended to fix:

  • too-short titles,
  • outdated titles,
  • boilerplate titles,
  • and inaccurate titles.

And now to bring things back to the crux of this: it is a misconception that you’re wasting your time writing title tags after August of 2021.

Google does not actually want to rewrite your title tags. It clearly stated this in its September blog post.

What Google wants is for you to write high-quality page titles on your own, ones that are descriptive, truthful, and useful. Give users what they need, and Google will leave your titles alone.

However, throw a bunch of keywords in there, or use boilerplate titles all over your site, and you can expect Google to do some cleaning up on your behalf. The trouble is, you may not personally like the results.

Title tags matter in SEO, big time. Don’t think that your efforts are futile just because of the 2021 change. Focus on creating title tags that matter for users, and you should be just fine.

Going forward

The three misconceptions I have covered here can be dangerous to fall into in 2022.

Now, are Core Web Vitals, quality affiliate links, and title tags important to Google? You can bet they are. But SEOs also just have to be smart when approaching these matters. Everything Google Search Central does has the user in mind.

Optimize for Core Web Vitals, but still, put quality content creation first.

Run your affiliate marketing site, but ensure the reviews are useful.

And write amazing SEO title tags so that Google won’t want to rewrite them.

Following these guidelines can only help you in the year to come.


Kris Jones is the founder and former CEO of digital marketing and affiliate network Pepperjam, which he sold to eBay Enterprises in 2009. Most recently Kris founded SEO services and software company LSEO.com and has previously invested in numerous successful technology companies. Kris is an experienced public speaker and is the author of one of the best-selling SEO books of all time called, ‘Search-Engine Optimization – Your Visual Blueprint to Effective Internet Marketing’, which has sold nearly 100,000 copies.


https://www.searchenginewatch.com/2022/01/04/the-not-so-seo-checklist-for-2022/




Four ways to improve page speed and master Core Web Vitals

30-second summary:

  • Websites that rank well on Google tend to have a higher Core Web Vitals score
  • There are three Core Web Vitals metrics that make up the majority of a site's overall page speed score
  • Prioritizing user experience in web design and marketing campaigns could give you a competitive edge
  • This comprehensive guide prepares you for the rollout of the new Google Search algorithm update

Google’s latest major update to its search algorithm focuses greatly on the user experience through a new set of ranking factor metrics, called Core Web Vitals. Early results from Core Web Vital audits reveal that the average website performs below these new standards. Searchmetrics’ research revealed that, on average, sites could reduce page load time by nearly one second by removing unused JavaScript.

This provides an amazing opportunity to outperform other websites by boosting your own page rankings.

Here is everything you need to know about Core Web Vitals plus four simple steps to improve your metrics.

Content created in partnership with Searchmetrics.

What are the Core Web Vitals metrics?

Core Web Vitals are an extension of Google's page experience signals, which include mobile-friendliness and HTTPS. The three Core Web Vitals metrics measure loading performance, interactivity, and visual stability, which Google views as providing an accurate depiction of real-world user experience.

  1. Largest Contentful Paint (LCP) measures the loading time of the largest image or text block visible within the user's viewport.
  2. First Input Delay (FID) measures the interactivity on the page by calculating the time from when a user first interacts with the site to the time when the browser responds to that interaction.
  3. Cumulative Layout Shift (CLS) refers to how much the content shifts during page rendering.

How to check your page speed insights

There are many online tools that check your page speed score, including PageSpeed Insights, the Chrome User Experience Report, Lighthouse audits, and Search Console. These tools measure various elements of page speed and display the results using a traffic light system. PageSpeed Insights provides a breakdown of the results and highlights areas of improvement.

What does “good” performance mean in numbers?

To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading. Pages should have an FID of less than 100 milliseconds and maintain a CLS of less than 0.1.
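
If you want to see how your own pages measure up in the field, you can collect these numbers directly in the browser. The sketch below is a minimal example and assumes the open-source web-vitals package (v3-style onLCP/onFID/onCLS callbacks); in practice you would beacon the values to your analytics endpoint rather than log them.

```typescript
// A minimal sketch: report Core Web Vitals from real visits and flag values
// outside the "good" thresholds quoted above. Assumes the `web-vitals`
// npm package (v3-style onLCP/onFID/onCLS callbacks).
import { onCLS, onFID, onLCP, type Metric } from 'web-vitals';

// "Good" thresholds: LCP <= 2500 ms, FID <= 100 ms, CLS <= 0.1
const thresholds: Record<string, number> = { LCP: 2500, FID: 100, CLS: 0.1 };

function report(metric: Metric): void {
  const verdict = metric.value <= thresholds[metric.name] ? 'good' : 'needs work';
  console.log(`${metric.name}: ${metric.value.toFixed(2)} (${verdict})`);
  // Replace the console.log with a fetch/sendBeacon call to your own endpoint.
}

onLCP(report);
onFID(report);
onCLS(report);
```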

Websites like Wikipedia have some of the highest page speed scores thanks to a lightweight approach to web design that uses mainly text and optimized images. Websites that rely heavily on video content and images are slower to load and make for a poorer user experience. Therefore, there is a balance to strike between design and user experience.

See where your site ranks. Visit PageSpeed Insights and enter your URL. Note: the top number is your Lighthouse score, also referred to as the PageSpeed score, measured from zero to 100. While it's a good general benchmark for the performance of your site, it isn't the same as the three Core Web Vitals metrics, which should be viewed separately as an analysis of LCP, FID, and CLS.

How to improve your page speed

Passing means getting a "good" score in all three areas. Small changes can improve load time by as little as one second, which can be enough to shift a page from a "poor" or "needs improvement" LCP score to a "good" one. Reducing load time will make users happier and increase traffic to the site.

Tom Wells, creative marketing expert at Searchmetrics, says,

“Anything that’s not needed on a website shouldn’t be there.”

Put simply, identifying and removing elements that aren't used or don't serve a substantive purpose could improve the site's page speed score.

1. Oversized images

Poorly optimized images are one of the main factors dragging down a site's LCP score, as an image is usually the largest element to load. Ecommerce businesses and others that rely heavily on images may have poorer LCP scores due to pages rendering multiple high-resolution images.

Optimizing these assets by using responsive design or next-gen image formatting such as WebP, JPEG 2000 and JPEG XR can improve the score by cutting down rendering time. Often, images can be condensed to a much smaller size without affecting the quality of the image. Free resources like Squoosh can do this for you.

2. Dynamic content and ads

Loading ads on a web page is one of the main causes of a bad CLS score. This can be down to elements on the page shifting to accommodate dynamic ads, which makes for a poor user experience.

Using a smart implementation method such as allocating size attributes or CSS aspect ratio boxes for all ads, videos, and image elements is one way to reduce content shifting. Some companies may use a plugin or coding at the top of the website to place the ads. However, this could lead to a slower website, impacting the user experience negatively and indirectly affecting rankings.

Also, never insert content above existing content except in response to a specific user interaction; that way, any layout shift that occurs is one the user expects. For example, a form appearing after you click a CTA button is an acceptable exception.
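
As a rough illustration of "allocating size attributes", here is a minimal sketch that reserves an ad slot's footprint before the creative loads. The #ad-slot id and the 300x250 size are assumptions made for the example.

```typescript
// A minimal sketch: reserve space for a dynamically loaded ad so surrounding
// content doesn't shift when the creative arrives. The #ad-slot id and the
// 300x250 creative size are placeholders.
const adSlot = document.getElementById('ad-slot');

if (adSlot) {
  // Reserve the creative's footprint up front; a CSS aspect-ratio box works
  // the same way for responsive video and image embeds.
  adSlot.style.width = '300px';
  adSlot.style.minHeight = '250px';
  // adSlot.style.aspectRatio = '6 / 5'; // alternative for fluid layouts
}
```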

3. Plugin-centric web economy

Plugins can act like "plaster over the cracks" to solve website problems, says Wells. While they create a temporary fix, they can slow down and hinder web performance, as all of their code needs to load before the user is able to fully interact with the webpage.

Using plugins can increase server request counts and JavaScript execution time. All of these factors can lower the site's FID score.

“Often we look for advanced fixes and solutions but sometimes it’s as simple as deleting what’s not needed,” says Wells.

Therefore, removing some plugins, especially unused ones, can improve the reactivity and speed of the website.

4. Too much code

Google advises focusing on the overall website performance.

“It’s critical for responsive and well-scored websites to be as lightweight as possible,” says Wells.

“The more things that a server has to load, the slower that load time is going to be overall.”

While unused CSS and JavaScript may not directly impact the page speed score, they can still impact the site's load times, create code bloat, and negatively affect user experience.
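
One common way to keep non-critical JavaScript from weighing down the initial load is to split it out and fetch it lazily. This is only a sketch; the './live-chat-widget' module and its initChatWidget() export are hypothetical names used for illustration.

```typescript
// A minimal sketch: keep a non-critical script out of the initial bundle by
// loading it with a dynamic import once the page itself has finished loading.
// './live-chat-widget' and initChatWidget() are hypothetical.
window.addEventListener('load', async () => {
  const { initChatWidget } = await import('./live-chat-widget');
  initChatWidget(); // the widget's code is only fetched and parsed at this point
});
```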

When should I start?

Google's rollout of the new algorithm began in mid-June, so it's worth getting a start on reviewing how well your site scores on page speed tests. Websites that rank well tend to have higher Core Web Vitals scores, and this trend is set to continue as Google places greater emphasis on user experience.

Want more Core Web Vital insights? Read Searchmetrics’ Google Core Web Vitals Study April 2021.

https://www.searchenginewatch.com/2021/08/18/four-ways-to-improve-page-speed-and-master-core-web-vitals/




On-page SEO: a handy checklist to tick off in 2021 and beyond

30-second summary:

  • On-page SEO is the process of optimizing your web pages and content for search engine crawlers
  • It involves a lot of moving parts, so it’s easy to forget some elements or tweak them incorrectly
  • This quick checklist will help you keep all the various on-page SEO elements on track

On-page SEO is basically a set of techniques and best practices for your web pages to make them more search-engine friendly and thus boost your rankings.

Now, as you know, keywords are at the heart of nearly everything in on-page SEO. But on-page optimization involves a lot of elements — not just keywords — so it's easy to overlook some of them.

To make it easy for you to ensure all your pages are correctly optimized for the best possible rankings, here’s a handy checklist to tick off.

URLs

Review the URLs of all pages on your site to ensure they’re concise rather than long and complex. Shorter URLs tend to have better click-through rates and are more easily understood by search engine crawlers.

Include your page’s primary keyword in the URL, remove filler (aka stop) words like “the”, “for”, and “to”, and keep it under 60 characters.
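
If you generate URLs programmatically, the same rules are easy to encode. Here is a minimal sketch of the advice above (lowercase, hyphens, a few stop words removed, roughly 60 characters); the stop-word list is illustrative rather than exhaustive.

```typescript
// A minimal slug-building sketch: lowercase, hyphenate, drop common stop
// words, and trim to ~60 characters. The stop-word list is illustrative.
const STOP_WORDS = new Set(['the', 'for', 'to', 'a', 'an', 'of', 'and']);

function toSlug(title: string, maxLength = 60): string {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, '')            // strip punctuation
    .split(/\s+/)
    .filter((word) => word && !STOP_WORDS.has(word))
    .join('-')
    .slice(0, maxLength)
    .replace(/-+$/, '');                     // no dangling hyphen after trimming
}

// toSlug('The Handy Checklist for On-Page SEO in 2021')
// -> 'handy-checklist-on-page-seo-in-2021'
```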

Images

Your website is likely brimming with images, and that's a good thing, as images contribute significantly to improving both user experience and rankings. They make your content easier to consume, more engaging, and more memorable, and, when optimized correctly, they help you drive more traffic to your website.

To optimize your images for on-page SEO, here are a couple of things to ensure:

Image filename and alt text

Google bots can’t “see” images like humans. They need accompanying text to understand what the image is about. So, write a descriptive filename (“blue-running-shoes.jpg” instead of “82596173.jpg”) and alt text (which helps in case the image fails to load for some reason) for every image on your site, including keywords in both.

Alt text also helps make your website more accessible, as screen readers use the alt text to describe images to visually-challenged users. In fact, it’s prudent to test your website’s accessibility to ensure you don’t ever have to cough up big bucks for ADA lawsuit settlements.
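
A quick way to audit a page is to list the images that have no alt attribute at all, for example by running a small snippet like this in the browser console:

```typescript
// A minimal sketch: flag images on the current page that are missing alt text.
const missingAlt = Array.from(
  document.querySelectorAll<HTMLImageElement>('img')
).filter((img) => !img.hasAttribute('alt'));

missingAlt.forEach((img) => console.warn('Missing alt text:', img.currentSrc || img.src));
console.log(`${missingAlt.length} image(s) need descriptive alt text.`);
```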

Image file size

Page speed is a major ranking signal for both desktop and mobile searches, and bulky images slow down your site’s load speed. So make sure to compress all images to reduce their size — ideally under 70 kb.

Titles and meta descriptions

See to it that you've included your main keywords near the front of the title tags of all pages. Keep your title tags to around 60-65 characters, and certainly no longer than 70, otherwise they may get truncated in the SERPs.

Also, the page title should be the only element wrapped in an H1 heading tag. In other words, use just one H1 tag per page, and reserve it for the title.

For meta descriptions, ensure you've written a keyword-rich and inviting description that is relevant to the user's search intent, and keep it under 160 characters for all your pages. If you don't provide one, Google will pick some relevant text from the page and display it as the meta description in the SERP, which isn't ideal for SEO.
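
Those length guidelines are easy to spot-check from the browser console; a minimal sketch:

```typescript
// A minimal sketch: check the current page's title and meta description
// against the rough length guidance above (~60-65 and ~160 characters).
const pageTitle = document.title;
const metaDescription =
  document.querySelector<HTMLMetaElement>('meta[name="description"]')?.content ?? '';

if (pageTitle.length > 65) {
  console.warn(`Title is ${pageTitle.length} chars; it may be truncated in the SERPs.`);
}
if (!metaDescription) {
  console.warn('No meta description found; Google will pick its own snippet.');
} else if (metaDescription.length > 160) {
  console.warn(`Meta description is ${metaDescription.length} chars; keep it under 160.`);
}
```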

Page load speed

Speed is a major ranking factor you just can’t afford to overlook. If your pages take anything over two to three seconds to load, your visitors will bounce to a competitor, and achieving first page rankings will remain a dream.

Thus, verify that:

  • Code is optimized with minified CSS and JS
  • There are no unnecessary redirects
  • You have compressed all images
  • You’ve enabled file compression and browser caching
  • Server response time is optimal

Regularly review your site speed using PageSpeed Insights to find out the exact areas that can be improved.
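
If you'd rather track this regularly than check the UI by hand, PageSpeed Insights also exposes an API. The sketch below pulls the mobile performance score for a URL; the response handling is abbreviated, so treat it as a starting point rather than a complete client.

```typescript
// A minimal sketch: fetch a mobile performance score from the PageSpeed
// Insights API (v5). Only the overall Lighthouse score is read here.
async function pageSpeedScore(url: string): Promise<number> {
  const endpoint =
    'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
    `?url=${encodeURIComponent(url)}&strategy=mobile`;

  const response = await fetch(endpoint);
  const data = await response.json();

  // Lighthouse reports performance as 0-1; scale to the familiar 0-100.
  return Math.round(data.lighthouseResult.categories.performance.score * 100);
}

pageSpeedScore('https://example.com/').then((score) =>
  console.log(`Mobile performance score: ${score}/100`)
);
```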

Links – internal and external

Ensure you have a proper linking strategy that you always follow. Both internal and external links play a role in your on-page SEO.

External links

Citing external sources and having outbound links is crucial for building credibility in the eyes of both Google's crawlers and human visitors. However, make sure you're only linking out to high-quality websites and reliable sources.

Plus, ensure there are no broken ("404 not found") links, as they hurt SEO and user experience. If you have a lot of site pages, it's also wise to create an engaging and easy-to-navigate 404 error page. This will help you retain site visitors and help them find relevant content and actions.
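
Broken links are also easy to catch with a small script before they hurt users or crawlers. A minimal sketch (Node 18+ for the built-in fetch), using placeholder URLs:

```typescript
// A minimal sketch: check a list of outbound link URLs and flag any that
// come back with an error status. Some servers reject HEAD requests, so a
// production checker would fall back to GET.
async function findBrokenLinks(urls: string[]): Promise<string[]> {
  const broken: string[] = [];

  for (const url of urls) {
    try {
      const res = await fetch(url, { method: 'HEAD', redirect: 'follow' });
      if (res.status >= 400) broken.push(`${url} -> ${res.status}`);
    } catch {
      broken.push(`${url} -> unreachable`);
    }
  }
  return broken;
}

findBrokenLinks(['https://example.com/', 'https://example.com/missing-page'])
  .then((broken) => broken.forEach((line) => console.warn('Broken link:', line)));
```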

Internal links

Make sure to strategically interlink pages and content on your website. This helps crawlers to better understand and rank your content for the right keywords.

Internal linking also helps to guide visitors to relevant pages and keep them engaged.

Content

All your blog posts and website copy play a pivotal role in on-page optimization. Besides ensuring your target keywords are sprinkled judiciously and naturally throughout your content's title, URL, subheadings, and paragraphs, here are a couple of things to get right.

Structure and readability

Verify the structure of content on all pages. Make sure you’ve used keyword-optimized headings and subheadings – H1, H2, H3, and so on, to build a logical hierarchy, which improves the readability and crawlability of your content.

Comprehensiveness

Studies suggest that longer, in-depth posts perform better than shorter ones when it comes to Google rankings. So, strive to have a word count of 2,000+ words in every piece of content.

Comprehensive, long-form content will also serve your audience better as it likely answers all their questions about the topic so they don’t have to look for more reading resources.

Over to you

With each new update to its core algorithm, Google is fast shifting its focus toward rewarding websites with the best user experience.

But nailing your on-page optimization, which ties closely with UX, will continue to help you achieve top rankings and stay there. So keep this checklist handy as you work on your SEO in 2021 and beyond.

Gaurav Belani is a senior SEO and content marketing analyst at Growfusely, a content marketing agency specializing in content and data-driven SEO. He can be found on Twitter @belanigaurav.

https://www.searchenginewatch.com/2021/03/11/on-page-seo-a-handy-checklist-to-tick-off-in-2021-and-beyond/




Page speed is a big deal – Why? It impacts SEO rankings

30-second summary:

  • Consumers expect websites and apps to respond quickly. Each second of delay can cost $100K in lost revenue for brands.
  • There are tools to help understand a page’s overall speed. Looking at the relationship between page speed and SEO ranking shows that having a faster performing site/page does impact ranking in a positive way, albeit slightly.
  • As marketers look for additional data points about why they should prioritize page speed, Jason Tabeling highlights the data and how it fits into an overall SEO strategy.

The saying is, "Speed kills." However, when it comes to digital, it's really, "Slow speed kills." According to a study by Google and Deloitte, every 0.1-second improvement in page speed can increase conversion rates by eight percent. That data point creates a pretty powerful business case for improving the performance of your site. Think about it from your own experience browsing the web on mobile and desktop. Have you ever found a page loading so slowly that you moved on to something else and that brand lost the sale? I know I have.

So, we know that speed improves page performance, but how does it impact SEO rankings? I took a look at a small data set to try to get a sense of the impact page speed has on rankings. I took 10 retail and conversion-based keywords and ran the specific webpages through the Google Page Speed Test to look at the correlation between speed and rank. This data set offers a good directional study of the impact. Here is what I found:

Rank and page speed are only slightly correlated

Creating a scatter plot of each website's rank against its page speed shows the distribution. The data does show a relationship between speed and rank, but a weak one: the correlation works out to just 0.08, where a correlation of 1 would be perfectly correlated. So the score isn't very high, but there is a correlation, and when you're looking for an edge and you know the impact speed has on conversion rates, it becomes very meaningful. The way to forecast the impact is to use the trendline and its slope. If you remember your algebra class, you can use this data to create the equation of a line and estimate what your page speed needs to be to achieve a certain rank. Using that equation, every 10 points of additional speed roughly equals a tenth of a point in rank improvement: a speed score of 10 corresponds to a rank of about 5.3, while a speed score of 100 corresponds to a rank of about 4.4. This might not seem like a big deal, but remember that speed dramatically improves conversion rates and has a positive impact on rank as well.

[Image: scatter plot of rank vs. page speed]
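
For anyone who wants to plug in their own numbers, the trendline implied by those figures works out to roughly rank = 5.4 - 0.01 x speed score. A tiny sketch of that relationship, for illustration only:

```typescript
// Approximate trendline implied by the figures above: a speed score of 10
// maps to a rank of ~5.3 and a score of 100 to ~4.4, which works out to
// rank = 5.4 - 0.01 * speedScore. Illustrative only - fit your own data.
function predictedRank(speedScore: number): number {
  return 5.4 - 0.01 * speedScore;
}

console.log(predictedRank(10));  // ~5.3
console.log(predictedRank(100)); // ~4.4
```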

What action should you take?

Here’s an outline of a few steps you can take to optimize your speed.

1. Improve your page speed

There are a lot of opportunities for brands to increase their page speed. Probably the most surprising thing was how many sites had very low speed scores. Look at the data in the histogram below: 56% of page speed scores came back under 20. That is pretty remarkable considering there are some very well-known brands in this study. There are so many opportunities to improve page speed, and Google's tool, among many others, provides step-by-step actions to take. If you haven't run your site through this tool or another one, that's the place to start – https://www.thinkwithgoogle.com/feature/testmysite/

[Image: histogram of page speed scores]

2. Don’t forget that speed is important but it’s just one of many factors that impact SEO rankings

While this article is about speed, it should be clear that SEO has many more key factors that impact rank and overall results. SEO is a holistic strategy that includes content and technical resources. You can't forget about how well your content resonates with your audience, how fresh it is, and how well the search engine spiders can crawl and understand your site. Too many businesses get caught up in re-platforming their ecommerce site and never put a minute of thought into the SEO strategy.





How to fix the top most painful website UX mistakes [examples]

In today’s market of evolving website functionality, UX design has become more important than ever to businesses, globally. Here’s a roundup of top UX mistakes and how to fix them.

We want users to navigate our site freely, without obstacles and without friction. If your site makes finding information a challenge, you can bet your bottom dollar your competition already knows that, and is capitalizing on it.

Both UX and SEO are user-centric, which makes them a formidable pair when you get them right.

But get it wrong and you could see high bounce rates, low conversions, and even a slight hit to your rankings, as Google's John Mueller considers UX a 'soft ranking factor'.

If you’re finding that users aren’t spending time on your site (or converting) there could be issues. We’re going to help you find out what they could be.

Cluttered navigation

Your site’s navigation is the gateway to your content and service pages. It’s where you (really) focus on building a funnel that is both frictionless and easy to understand.

Your navigation should be an architecture of well-structured groups of pages, either commercial or informational (depending on your business).

[Image: example of cluttered navigation]

Some of the most common mistakes we see are:

  • Links without value in the main menu
  • Excessive anchor text
  • Non-responsive (yep, it still happens in 2019)
  • Too many sub-menus
  • Relies on JavaScript (with no fallback)

How to fix your site’s navigation

The golden rule here is to make your navigation accessible, responsive and clutter-free.

Think about your categories, your most valuable pages, where users spend most of their time and, more importantly, who your users are.

Forever 21 does a great job of linking through to the main areas of its site in a structured and visually pleasing way.

[Image: Forever 21's navigation linking through to the main areas of its site]

Let’s say you run an ecommerce store for pets (because everybody loves animals).

You sell products for dogs, cats and rabbits. Here’s how you can structure your navigation…

[Image: example of a well-structured navigation menu]

Aesthetics are everything

People are visual creatures. We like websites to be aesthetically pleasing.

One of the cardinal sins of UX is poor imagery, color choices, and font selection. In fact, I'll extend that to most areas of online marketing, including social media and display ads.

Here’s an example of all three on a single site:

[Image: a site combining poor colors, fonts, and images]

I don’t think I need to go much further into why this could be improved. But, I will go into the how.

Even the big brands get it wrong…

[Image: Boohoo example showing how one poorly sized image affects the UX of the whole page]

Notice how a single, poorly sized image has affected the feel of the whole page?

Fixing poor page structure

Things to consider when it comes to site layout are: margins, padding and alignment.

As you can see in the example above, alignment is a big issue, and even a small amount of misaligned content or imagery can look unprofessional. Web pages work in columns, which give designers a structure for creating landing pages that can influence users' focus and attention.

Combining text and images together is normally where layouts can become difficult to manage as we’ve already seen.

Luckily for you, we’ve put together a number of page layout examples below which make structuring your content and images simple.

  • Image aligned to the right of text:

[Image: layout with the image aligned to the right of the text]

  • Text on top of image:

[Image: layout with text on top of an image]

  • Text box overlaid on carousel image:

[Image: layout with a text box overlaid on a carousel image]

Working with fonts

When you consider fonts, you need to think about how your content will appear at all sizes. From your header hierarchy to your bullet points.

We recommend limiting the number of different fonts you use on your site. It's easy to get carried away giving different sections of your site their own 'feel'.

Once you’ve chosen a single font family or families, consider how you can create contrast between header and body content.

Single font selections with varying font weights can create a very visually pleasing contrast between sections of your page.

A poor font selection can even make a retail giant, such as Amazon, look untrustworthy and unprofessional.

[Image: poor font selection on an Amazon product page]

If you're struggling for ideas on font pairing, you can use Font Pair to put together different font types. It's important to remember that font readability will play a huge part in how users consume content.

Consider:

  • Size
  • Width
  • Paragraph spacing
  • Weight

If your content currently looks like this:

[Image: example of typography gone wrong]

I’m sorry, but you’re doing it wrong.

Even a minor change to fonts can cause the greatest of upsets among users. Just ask Amazon how that went.

Most CMS platforms will come with pre-installed fonts, but if you need more of a selection you can always use Google Fonts.

Finally, consider how your font appears on desktop and mobile.

If your font size is too big, it would take up too much of a mobile screen. If it’s too small, users will struggle to read it. It’s worth testing different sizes to cater for overall legibility.

This is an example of how Zazzle’s homepage appears on an iPhone X – using a font size of 18px.

[Image: Zazzle's homepage on an iPhone X]

Page load times

This shouldn’t be news to you by now.

There are multiple case studies available about how load times can impact conversions. So, I won’t go into that here.

The most common reason for high page load times is images. Images are something most site owners can change with little dev input. It ultimately comes down to saving the right file type, using the right dimensions, and compressing high-resolution images to preserve quality whilst reducing size.

Savings in KB can often make a huge difference.

Take a look at what happens when image optimization is ignored.

[Image: a page loading slowly because of unoptimized images]

For the sake of anonymity, we’ve hidden the brand’s identity.

However, this is the page we were greeted with for over five seconds. It’s a medium-sized ecommerce website that caters to children’s clothing.

By optimizing the images, we found that there were savings of up to 900kb – a significant weight lifted off of a browser.

Consider how this feels for users. First impressions are everything. What's to stop traffic bouncing due to content and images not loading?

Think about that for a second!

How to fix image bloat

Firstly, you need to find if this is an issue.

You can run speed tests using Google’s Lighthouse or GTMetrix to get an understanding of which files are too heavy. It’s simple to find poorly optimized images for individual pages.

For batch analysis, we recommend using a tool like Sitebulb which has an incredibly in-depth section attributed to page speeds.

If your images are already on your site and you don’t really fancy opening Photoshop and resizing them all, you need to run batch compressions to reduce the file size.

There are a range of image compression tools available online, we recommend Compressor.io or TinyPNG.
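
If you prefer to script the batch job, it only takes a few lines. This sketch assumes the sharp package and placeholder folder names, and expects the output folder to already exist.

```typescript
// A minimal batch-compression sketch using the `sharp` package (an assumption;
// any image library will do). Folder names are placeholders and the output
// directory must already exist.
import { readdir } from 'node:fs/promises';
import path from 'node:path';
import sharp from 'sharp';

async function compressFolder(inputDir: string, outputDir: string): Promise<void> {
  const files = await readdir(inputDir);

  for (const file of files.filter((f) => /\.(jpe?g|png)$/i.test(f))) {
    const outFile = path.join(outputDir, file.replace(/\.\w+$/, '.webp'));

    await sharp(path.join(inputDir, file))
      .resize({ width: 1600, withoutEnlargement: true }) // cap the dimensions
      .webp({ quality: 80 })                             // lossy, but visually close
      .toFile(outFile);

    console.log(`Compressed ${file} -> ${path.basename(outFile)}`);
  }
}

compressFolder('./images/original', './images/optimized');
```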

It’s often thought that compressing images means poor quality. However, take a look at the image below and assess the quality difference  for yourselves.

[Image: before/after comparison of a compressed image, size reduced by 37% for a saving of over 1.3MB]

For those of you using WordPress, you can use the Smush.it plugin to compress and resize the images on your site.

New image formats

Google Developers introduced a new file format, WebP, which is considered superior to its PNG and JPG equivalents.

It offers fantastic lossless and lossy compression for images.

Shaving milliseconds off your load time, especially on poor mobile connections, can stop a user from leaving your site. Google has said it actively rewards sites that are seen to make incremental improvements to site speed.

The great news?

More than 70% of browsers support this media format!

You can read more about WebP with Google Developers.
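
Because not every browser supports the format yet, WebP is usually served alongside a fallback. Here is a minimal sketch that builds a picture element with a JPEG fallback; the file names are placeholders.

```typescript
// A minimal sketch: serve WebP with a JPEG fallback for browsers that don't
// support it. File names and alt text are placeholders.
function webpWithFallback(baseName: string, altText: string): HTMLPictureElement {
  const picture = document.createElement('picture');

  const webpSource = document.createElement('source');
  webpSource.type = 'image/webp';
  webpSource.srcset = `${baseName}.webp`;

  const fallback = document.createElement('img');
  fallback.src = `${baseName}.jpg`; // older browsers fall back to JPEG
  fallback.alt = altText;
  fallback.loading = 'lazy';

  picture.append(webpSource, fallback);
  return picture;
}

document.body.append(webpWithFallback('/images/blue-summer-dress', 'Blue summer dress'));
```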

Pop-ups

No UX mistake roundup would be complete without mentioning pop-ups.

They seem to get more aggressive and more disruptive each year. You’ll find it hard to come across a website without them.

Sorry Sumo, but this is one of the worst.

[Image: a scroll-triggered pop-up]

This is a scroll-triggered pop-up, whereby the page waits for me to interact before the pop-up is shown.

There is one major rule that you must abide by: do not disrupt a user’s experience with pop-ups. We know this is a bold statement but… who likes pushy sales?

If you want to help users, do it natively.

Remember, we’re creating a frictionless journey.

How to use pop-ups — the right way

First things first: don't ever use intrusive interstitial pop-ups. They will annoy users, and Google actively penalizes pages that use them.

Both are bad for business.

We recommend using pop-ups in a more subtle manner.

Top banners

[Image: a branded CTA banner at the top of the screen]

A perfect example of a branded CTA at the top of the screen. It’s non-invasive on both desktop and mobile.

Chatbots

[Image: a website chatbot]

Chatbots are a great way to help users find what they’re looking for, without disrupting their experience.

You can incorporate lead generation, discount codes, or just offer general customer advice. Chatbots can help improve operational efficiency (reducing calls into the business) and improve conversion rates.

If a customer is finding it hard to find a particular area of the site, chatbots can remove this friction quickly to help retain users.

Native CTA banners

[Image: native CTA banners within a product listing]

Another smart way to offer discounts to users is to integrate CTAs within product selections or even at category level as a header banner.

We find this to be a great way to preserve UX and still help drive incentivized clicks to sale or discount pages.

It's always important to remember to design banners to match the size and resolution of your product imagery.

Summary

UX is as important to your website as your content. Data shows that UX is still a bit of a mystery to many marketers, but it should be the most important factor in any site design.

Website innovation is encouraged but not at the cost of your users. When you’re considering how to improve your user’s experience, you need to remember how you feel navigating a poorly put together site.

Consider the easy fixes (fonts, images, colors, and navigation) first, before you think about CRO (conversion rate optimization).

Remember, we're in a market driven by user behavior, so try your best to cater to that as much as you can and you'll win!

Ryan Roberts is an SEO Lead at Zazzle Media.





Five steps to deliver better technical SEO services to your clients


Taking control of a site's technical SEO is both an art and a science. Take it from me, a content strategist at heart: technical SEO requires a balance of knowledge, diligence, and grit to become proficient. And for many, undertaking such technical matters can feel both daunting and complicated.

But as code-heavy and cumbersome as technical SEO may seem, grasping its core concepts is well within reach for most search marketers. Yes, it helps to have HTML chops or a developer on hand to implement scripts and such. However, delivering top-tier technical SEO services shouldn't feel as intimidating as it does to many agencies and consultants.

To help dial in the technical side of your SEO services, I've shared five places to start. These steps reflect the 80/20 of technical SEO, and much of what I have adopted from my code-savvy colleagues over the past decade.

1. Verify Google Analytics, Tag Manager, and Search Console; define conversions

If you maintain any ongoing SEO engagements, it’s critical to set up Google Analytics or an equally sufficient web analytics platform. Additionally, establishing Google Tag Manager and Google Search Console will provide you with further technical SEO capabilities and information about a site’s health.

Beyond just verifying a site on these platforms, you'll want to define some KPIs and points of conversion. This may be as simple as tracking organic traffic and form submissions, or as advanced as setting up five different conversion goals, such as form submissions, store purchases, PDF downloads, Facebook follows, and email sign-ups. In short, without any form of conversion tracking in place, you're essentially going in blind.
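
As a concrete illustration, a form-submission conversion can be reported with a few lines of gtag.js (Google Analytics 4). The form id and event parameters below are assumptions for the example; your own setup will differ.

```typescript
// A minimal sketch: send a GA4 conversion event when a lead form is submitted.
// Assumes gtag.js is already loaded on the page; '#contact-form' is a placeholder.
declare function gtag(
  command: 'event',
  eventName: string,
  params?: Record<string, unknown>
): void;

document.getElementById('contact-form')?.addEventListener('submit', () => {
  gtag('event', 'generate_lead', {
    form_id: 'contact-form',
    page_path: window.location.pathname,
  });
});
```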

[Image: setting up goals in Google Analytics]

Determining how you measure a site’s success is essential to delivering quality SEO services. From a technical standpoint, both Google Search Console and Analytics can provide critical insights to help you make ongoing improvements. These include crawl errors, duplicate meta data, toxic links, bounce pages, and drop-offs, to name a few.

2. Implement structured data markup

Implementing structured data markup has become an integral element of technical SEO. With structured data a topic of focus for Google in recent years, more and more search marketers are embracing ways to employ structured data markup, or Schema, for their clients. In turn, many CMS platforms are now equipped with simple plugins and developer capabilities to implement Schema.

In essence, Schema is a unique form of markup that was developed to help webmasters better communicate a site's content to search engines. By tagging certain elements of a page's content with Schema markup (for example, Reviews, Aggregate Rating, Business Location, or Person), you help Google and other search engines better interpret and display that content to users.

[Image: Google's Structured Data Testing Tool]

With this markup in place, your site's search visibility can improve with features like rich snippets, expanded meta descriptions, and other enhanced listings that may offer a competitive advantage. Within Google Search Console, not only can you use a handy validation tool to help assess a site's markup, but the platform will also log any errors it finds regarding structured data.
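
To make this concrete, here is a minimal JSON-LD sketch for a local business with an aggregate rating, injected into the page head. The business details are placeholders; always validate the output with the testing tools mentioned above.

```typescript
// A minimal sketch: inject JSON-LD structured data for a local business with
// an aggregate rating. All business details are placeholders.
const localBusinessSchema = {
  '@context': 'https://schema.org',
  '@type': 'LocalBusiness',
  name: 'Example Coffee Roasters',
  address: {
    '@type': 'PostalAddress',
    streetAddress: '123 Main St',
    addressLocality: 'Atlanta',
    addressRegion: 'GA',
  },
  aggregateRating: {
    '@type': 'AggregateRating',
    ratingValue: '4.8',
    reviewCount: '132',
  },
};

const ldScript = document.createElement('script');
ldScript.type = 'application/ld+json';
ldScript.textContent = JSON.stringify(localBusinessSchema);
document.head.appendChild(ldScript);
```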

3. Regularly assess link toxicity

It should be no secret by now that poor quality links pointing to a site can hinder its ability to rank. Even more so, a site that has blatantly built links manually using keyword-stuffed anchor text is at high risk of being deindexed, or removed from Google entirely.

If you just flashed back 10 years to a time when you built a few (hundred?) sketchy links to your site, then consider assessing the site’s link toxicity. Toxic links coming from spammy sources can really ruin your credibility as a trusted site. As such, it’s important to identify and to disavow any links that may be hindering your rankings.

[Image: checking for toxic links with a backlink audit tool]

[Not only does the Backlink Audit Tool in SEMRush make it easy to pinpoint potentially toxic links, but also to take the necessary measures to have certain links removed or disavowed.]

If there's one SEO variable that's sometimes out of your control, it's backlinks. New, spammy links can arise out of nowhere, making you ponder existential questions about the Internet. Regularly checking in on a site's backlinks is critical diligence in maintaining a healthy site for your SEO clients.

4. Consistently monitor site health, speed, and performance

An industry standard tool to efficiently pinpoint technical bottlenecks for a site is GTmetrix. With this tool, you can discover key insights about a site’s speed, health, and overall performance, along with actionable recommendations on how to improve such issues.

No doubt, site speed has become a noteworthy ranking factor. It reflects Google’s mission to serve search users with the best experience possible. As such, fast-loading sites are rewarded, and slow-loading sites will likely fail to realize their full SEO potential.

[Image: a PageSpeed Insights score]

In addition to GTmetrix, a couple of additional tools that help improve a site's speed and performance are Google PageSpeed Insights and web.dev. Similar to the recommendations offered by GTmetrix and SEMRush, these tools deliver easy-to-digest guidance backed by in-depth analysis across a number of variables.

The page speed improvements suggested by these tools range from compressing images to minimizing redirects and server requests. In other words, some developer experience can be helpful here.

A final core aspect of maintaining optimal site health is keeping crawl errors to a bare minimum. While actually quite simple to monitor, regularly fixing 404 errors and correcting crawl optimization issues can help level up your technical SEO services. These capabilities are available in the Site Audit Tool from SEMRush.

[Image: a technical SEO site audit tool]

[The intuitive breakdown of the Site Audit Tool’s crawl report makes fixing errors a seamless process. Users can easily find broken links, error pages, inadequate titles and meta data, and other specifics to improve site health and performance.]

5. Canonicalize pages and audit robots.txt

If there’s one issue that’s virtually unavoidable, it’s discovering multiple versions of the same page, or duplicate content. As a rather hysterical example, I once came across a site with five iterations of the same “about us” page:

  • https://site.com/about-us/
  • https://www.site.com/about-us/
  • https://www.site.com/about-us
  • https://site.com/about-us
  • http://www.site.com/about-us

To a search engine, the above looks like five separate pages, all with the exact same content. This then causes confusion, or even worse, makes the site appear spammy or shallow with so much duplicate content. The fix for this is canonicalization.
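
The first step is simply deciding on one preferred form and mapping every variant to it, in both your redirects and your canonical tags. A minimal sketch follows; choosing https, "www", and a trailing slash here is an assumption, so use whatever convention your site already follows.

```typescript
// A minimal sketch: normalize the URL variants above to one preferred form
// (https + www + trailing slash is assumed here purely for illustration).
function canonicalUrl(raw: string): string {
  const url = new URL(raw);

  url.protocol = 'https:';
  if (!url.hostname.startsWith('www.')) {
    url.hostname = `www.${url.hostname}`;
  }
  if (!url.pathname.endsWith('/')) {
    url.pathname += '/';
  }
  return url.toString();
}

// Both variants resolve to 'https://www.site.com/about-us/':
['http://www.site.com/about-us', 'https://site.com/about-us/'].forEach((variant) =>
  console.log(canonicalUrl(variant))
);
```

The resulting value is what belongs in each variant's rel="canonical" tag.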

Because canonical tags and duplicate content have been major topics of discussion, most plugins and CMS integrations are equipped with canonicalization capabilities to help keep your SEO dialed-in.

[Image: the Yoast SEO plugin's Canonical URL field]

[In this figure, the highly popular Yoast SEO plugin for WordPress shows its Canonical URL feature, found under the gear icon tab. This simple functionality makes it easy to define the preferred, canonical URL for a given page.]

Similarly, the robots.txt file is a communication tool designed to specify which areas of a website should not be processed or crawled. Here, certain URLs can be disallowed, preventing search engines from crawling them. Because the robots.txt file is often updated over time, directories or content that still matter can end up disallowed by mistake, so it's wise to audit a site's robots.txt file regularly to ensure it aligns with your SEO objectives and to prevent future conflicts from arising.

[Image: reviewing a robots.txt file]

Lastly, keep in mind that not all search engine crawlers treat robots.txt the same way. A disallowed URL won't be crawled by well-behaved bots, but it can still show up in the index if other sites link to it, so use a noindex directive on the page itself when content must stay out of search results entirely. Still, with low-value or duplicate sections disallowed, you can rest easier knowing crawlers aren't spending their time on content that could make your site read as shallow or duplicate when they take its measure.
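
To make the disallow rules concrete, here is a minimal sketch of serving a robots.txt that blocks a couple of low-value paths while keeping assets crawlable. Express is used purely as an example framework, and the paths and sitemap URL are placeholders.

```typescript
// A minimal sketch: serve a robots.txt that disallows low-value paths and
// points crawlers at the sitemap. Framework, paths, and URLs are placeholders.
import express from 'express';

const app = express();

const robotsTxt = [
  'User-agent: *',
  'Disallow: /cart/',
  'Disallow: /internal-search/',
  'Allow: /assets/',
  '',
  'Sitemap: https://www.example.com/sitemap.xml',
].join('\n');

app.get('/robots.txt', (_req, res) => {
  res.type('text/plain').send(robotsTxt);
});

app.listen(3000);
```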

Tyler Tafelsky is a Senior SEO Specialist at Captivate Search Marketing based in Atlanta, Georgia. Having been in the industry since 2009, Tyler offers vast experience in the search marketing profession, including technical SEO, content strategy, and PPC advertising. 





How to master technical SEO: Six areas to attack now


Technical optimization is the core element of SEO. Technically optimized sites appeal both to search engines for being much easier to crawl and index, and to users for providing a great user experience. 

It's quite challenging to cover all the technical aspects of your site because hundreds of issues may need fixing. However, there are some areas that are extremely beneficial if you get them right. In this article, I will cover the ones you need to focus on first, plus actionable tips on how to succeed with them SEO-wise.

1. Indexing and crawlability

The first thing you want to ensure is that search engines can properly crawl and index your website. You can check the number of your site's pages indexed by search engines in Google Search Console, by googling site:domain.com, or with the help of an SEO crawler like WebSite Auditor.

[Image: checking how many of a site's pages are indexed by various search engines]

Source: SEO PowerSuite

In the example above, there's an outrageous indexing gap: the number of pages indexed in Google lags far behind the total number of pages. To avoid indexing gaps and improve the crawlability of your site, pay closer attention to the following issues:

Resources restricted from indexing

Remember, Google can now render all kinds of resources (HTML, CSS, and JavaScript). So if some of them are blocked from indexing, Google won’t see your pages the way they should look and won’t render them properly.

Orphan pages

These are the pages that exist on your site but are not linked to by any other page. It means they are invisible to search engines. Make sure your important pages haven’t become orphans.

Paginated content

Google has recently admitted that it hasn't supported rel=next/rel=prev for quite some time and recommends single-page content instead. You don't need to change anything if you already have paginated content and it makes sense for your site, but it's advisable to make sure pagination pages can stand on their own.

What to do

  • Check your robots.txt file. It should not block important pages on your site.
  • Double-check by crawling your site with a tool that can crawl and render all kinds of resources and find all pages.

2. Crawl budget

Crawl budget can be defined as the number of visits from a search engine bot to a site during a particular period of time. For example, if Googlebot visits your site 2.5K times per month, then 2.5K is your monthly crawl budget for Google. Though it’s not quite clear how Google assigns crawl budget to each site, there are two major theories stating that the key factors are:

  • Number of internal links to a page
  • Number of backlinks

Back in 2016, my team ran an experiment to check the correlation between both internal and external links and crawl stats. We created projects for 11 sites in WebSite Auditor to check the number of internal links. Next, we created projects for the same 11 sites in SEO SpyGlass to check the number of external links pointing to every page.

Then we checked the crawl statistics in the server logs to understand how often Googlebot visits each page. Using this data, we found the correlation between internal links and crawl budget to be very weak (0.154), and the correlation between external links and crawl budget to be very strong (0.978).

However, these results seem to be no longer relevant. We re-ran the same experiment last week and found no correlation between either backlinks or internal links and crawl budget. In other words, backlinks used to play a role in increasing your crawl budget, but that no longer seems to be the case. It means that to amplify your crawl budget, you need to use good old techniques that will make search engine spiders crawl as many pages of your site as possible and find your new content quicker.

What to do

  • Make sure important pages are crawlable. Check your robots.txt, it shouldn’t block any important resources (including CSS and JavaScript).
  • Avoid long redirect chains. The best practice here is no more than two redirects in a row.
  • Fix broken pages. If a search bot stumbles upon a page with a 4XX/5XX status code (a 404 "not found" error, a 500 "internal server" error, or any other similar error), one unit of your crawl budget goes to waste.
  • Clean up your sitemap. To make your content easier to find for crawlers and users, remove 4xx pages, unnecessary redirects, non-canonical, and blocked pages.
  • Disallow pages with no SEO value. Create disallow rules in the robots.txt file for pages like the privacy policy, old promotions, and terms and conditions.
  • Maintain internal linking efficiency. Make your site structure tree-like and shallow so that crawlers could easily access all important pages on your site.
  • Cater to your URL parameters. If you have dynamic URLs leading to the same page, specify their parameters in Google Search Console > Crawl > Search Parameters.

3. Site structure

Intuitive sites feel like a piece of art. However, beyond this feeling, there is a well-thought site structure and navigation that helps users effortlessly find what they want. What’s more, creating an efficient site architecture helps bots access all the important pages on your site. To make your site structure work, focus on two crucial factors:

1. Sitemap

With the help of a sitemap, search engines find your site, read its structure, and discover fresh content.

What to do

If for some reason you don't have a sitemap, it's really necessary to create one and upload it to Google Search Console (a small generation sketch follows the list below). You can check whether it's coded properly with the help of the W3C validator.

Keep your sitemap:

  • Updated – Make changes to it when you add or remove something from the site;
  • Concise – under 50,000 URLs;
  • Clean – Free from errors, redirects, and blocked resources.
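
Here is the small generation sketch mentioned above: it builds a basic XML sitemap from a list of URLs and enforces the 50,000-URL limit. Hook something like it into your build or CMS so the file stays up to date as pages are added and removed.

```typescript
// A minimal sketch: build a simple XML sitemap from a list of URLs, keeping
// it under the 50,000-URL limit mentioned above. URLs are placeholders.
function buildSitemap(urls: { loc: string; lastmod?: string }[]): string {
  if (urls.length > 50_000) {
    throw new Error('Split into multiple sitemaps and reference them from a sitemap index.');
  }

  const entries = urls
    .map(({ loc, lastmod }) =>
      [
        '  <url>',
        `    <loc>${loc}</loc>`,
        lastmod ? `    <lastmod>${lastmod}</lastmod>` : '',
        '  </url>',
      ]
        .filter(Boolean)
        .join('\n')
    )
    .join('\n');

  return (
    '<?xml version="1.0" encoding="UTF-8"?>\n' +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${entries}\n</urlset>\n`
  );
}

console.log(buildSitemap([{ loc: 'https://www.example.com/', lastmod: '2019-06-01' }]));
```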

2. Internal linking structure

Everyone knows about the benefits of external links, but most don't pay much attention to internal links. However, savvy internal linking helps spread link juice among all pages efficiently and gives a traffic boost to pages with less authority. What's more, you can create topic clusters by interlinking related content within your site to show search engines that your site's content has high authority in a particular field.

What to do

The strategies may vary depending on your goals, but these elements are critical for any goal:

Shallow click-depth: John Mueller confirmed that the fewer clicks it takes to get to a page from your homepage, the better. Following this advice, try to keep each page no more than three clicks away from the homepage. If you have a large site, use breadcrumbs or internal site search.

Use of contextual links: When you create content for your site, remember to include links to your pages with related content (articles, product pages, and so on). Such links usually carry more SEO weight than navigational ones (those in headers or footers).

Informational anchor texts: Include keywords in the anchor texts of internal links so that they tell readers what to expect from the linked content. Don't forget to do the same for the alt attributes of image links.

4. Page speed

Speed is a critical factor for the Internet of today. A one-second delay can lead to a grave traffic drop for most businesses. No surprise then that Google is also into speed. Desktop page speed has been a Google ranking factor for quite a while. In July 2018, mobile page speed became a ranking factor as well. Prior to the update, Google launched a new version of its PageSpeed Insights tool where we saw that speed was measured differently. Besides technical optimization and lab data (basically, the way a site loads in ideal conditions), Google started to use field data like loading speed of real users taken from the Chrome User Experience report.

[Image: PageSpeed Insights dashboard]

Source: PageSpeed Insights dashboard

What's the catch? When field data is taken into account, your lightning-fast site may be considered slow if most of your users have a slow Internet connection or old devices. That made me curious about how page speed actually influenced pages' positions in mobile search. So my team and I ran an experiment (before and immediately after the update) to see whether there was any correlation between page speed and pages' positions in mobile search results.

The experiment showed the following:

  • No correlation between a mobile site’s position and the site’s speed (First Contentful Paint and DOM Content Loaded);
  • High correlation between site’s position and its average Page Speed Optimization Score.

It means that, for the time being, it's the level of your site's technical optimization that matters most for your rankings. The good news is that this metric is totally under your control. Google actually provides a list of optimization tips for speeding up your site. The list runs to 22 factors, but you don't have to fix all of them; there are usually five or six that you need to pay attention to.

What to do

While you can read how to optimize for all 22 factors here, let's look at how to deal with the ones that can most severely slow down page rendering:

Landing page redirects: Create a responsive site; choose a redirect type suitable for your needs (permanent 301, temporary 302, JavaScript, or HTTP redirects).

Uncompressed resources: Remove unneeded resources before compression, gzip all compressible resources, and use different compression techniques for different resource types (see the sketch after this list).

Long server response time: Analyze site performance data to detect what slows it down (use tools like WebPage Test, Pingdom, GTmetrix, Chrome Dev Tools).

Absence of caching policy: Introduce a caching policy according to Google recommendations.

Unminified resources: Use minification together with compression.

Heavy images: Serve responsive images and leverage optimization techniques, such as using vector formats, web fonts instead of encoding text in an image, removing metadata, etc.
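As a concrete illustration of the compression and caching items above, here is a minimal sketch for a Node.js site using Express and the compression middleware; paths and cache lifetimes are placeholders, not recommendations.

    // Minimal sketch (Node.js, "express" and "compression" packages assumed):
    // gzip compressible responses and set a long cache lifetime on static assets.
    const express = require('express');
    const compression = require('compression');

    const app = express();

    // Gzip all compressible responses (HTML, CSS, JS, JSON, ...).
    app.use(compression());

    // Cache static assets for 30 days; fingerprint file names so they can
    // safely be cached that long.
    app.use('/static', express.static('public', { maxAge: '30d', immutable: true }));

    app.get('/', (req, res) => {
      res.send('<h1>Hello</h1>'); // placeholder page
    });

    app.listen(3000, () => console.log('listening on http://localhost:3000'));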

5. Mobile-friendliness

As the number of mobile searchers grew exponentially, Google wanted to address the majority of its users and rolled out mobile-first indexing at the beginning of 2018. By the end of 2018, Google was using mobile-first indexing for over half of the pages shown in search results. Mobile-first indexing means that Google now crawls the web from a mobile point of view: a site’s mobile version is used for indexing and ranking, even for search results shown to desktop users. If there’s no mobile version, Googlebot will simply crawl the desktop one. It’s true that neither mobile-friendliness nor responsive design is a prerequisite for a site to be moved to the mobile-first index; whatever version you have, your site will be moved to the index anyway. The trick is that this version, as seen by a mobile user agent, will determine how your site ranks in both mobile and desktop search results.

What to do

If you’ve been thinking about going responsive, now is the best time to do it, according to Google’s Webmaster Trends Analyst John Mueller.

Don’t be afraid of using expandable content on mobile, such as hamburger and accordion menus, tabs, expandable boxes, and more. However, say no to intrusive interstitials.

Test your pages for mobile-friendliness with Google’s Mobile-Friendly Test tool. It evaluates the site according to various usability criteria, like viewport configuration, size of text and buttons, and use of plugins.

Run an audit of your mobile site by using a custom user agent in your SEO crawler to make sure all your important pages can be reached by search engine crawlers and are free from grave errors. Pay attention to titles, H1s, structured data, and so on (a quick check is sketched after this list).

Track your site’s mobile performance in Google Search Console.
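For the audit mentioned above, a small script can fetch key pages with a smartphone-bot-like user agent and confirm that the basics are present. This is only a rough sketch: the user agent string is an approximation of Googlebot’s smartphone UA, the URLs are placeholders, and Node.js 18+ plus the cheerio package are assumed.

    // Quick check: fetch key pages with a smartphone Googlebot-like user agent
    // and make sure the title, H1, and viewport meta tag come back.
    const cheerio = require('cheerio');

    const MOBILE_BOT_UA =
      'Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) ' +
      'AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 ' +
      '(compatible; Googlebot/2.1; +http://www.google.com/bot.html)'; // approximation

    const PAGES = ['https://www.example.com/', 'https://www.example.com/products/']; // placeholders

    async function auditMobile() {
      for (const url of PAGES) {
        const res = await fetch(url, { headers: { 'User-Agent': MOBILE_BOT_UA } });
        const $ = cheerio.load(await res.text());

        const title = $('title').first().text().trim();
        const h1 = $('h1').first().text().trim();
        const hasViewport = $('meta[name="viewport"]').length > 0;

        console.log(url, res.status, { title, h1, hasViewport });
      }
    }

    auditMobile();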

6. Structured data

As Google SERPs become more visually enhanced, users are becoming less prone to clicking. They try to get everything they need right from the search results page without clicking through to any site. And when they do click, they go for a result that caught their attention. Rich results have the advantage of standing out, with image and video carousels, rating stars, and review snippets.

Example of rich snippets in Google SERP

Rich results require structured data implementation. Structured data is coded within the page’s markup and provides information about its content. There are about 600 types of structured data available. While not all of them can make your results rich, implementing structured data improves your chances of getting a rich snippet in Google. What’s more, it helps crawlers understand your content better in terms of categories and subcategories (for instance, book, answer, recipe, or map). Still, there are about 30 different types of rich results that are powered by schema markup. Let’s see how to get them.

What to do

  • Go to schema.org and choose the schemas that suit the content on your site. Assign those schemas to URLs.
  • Create structured data markup. Don’t worry, you do not need developer skills to do that. Use Google’s Structured Data Markup Helper, which will guide you through the process. Then test your markup in the Structured Data Testing Tool or its newer counterpart, the Rich Results Test. Keep in mind that Google supports structured data in three formats: JSON-LD, Microdata, and RDFa, with JSON-LD being the recommended one (a minimal JSON-LD sketch follows this list).
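For reference, this is roughly what a JSON-LD block looks like when built and injected with plain browser JavaScript; all values are placeholders, and in practice the markup is usually emitted directly in the page template rather than injected client-side.

    // Minimal JSON-LD sketch for an article, with placeholder values.
    const articleSchema = {
      '@context': 'https://schema.org',
      '@type': 'Article',
      headline: 'Technical SEO checklist',           // placeholder
      author: { '@type': 'Person', name: 'Jane Doe' }, // placeholder
      datePublished: '2019-05-01',
      image: 'https://www.example.com/cover.jpg',
    };

    // Serialize it into a <script type="application/ld+json"> tag.
    const tag = document.createElement('script');
    tag.type = 'application/ld+json';
    tag.textContent = JSON.stringify(articleSchema);
    document.head.appendChild(tag);

However you generate the markup, run the result through the Rich Results Test before shipping it.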

Don’t expect Google to display your enhanced results right away. It can take a few weeks, so use Fetch as Google in Search Console to get your pages recrawled faster. Bear in mind that Google can decide not to show rich results at all if they do not meet the guidelines.

Don’t be surprised if you get a rich snippet without structured data implementation. John Mueller confirmed that sometimes your content is enough to produce rich results.

Summary

Technical SEO is something you cannot do without if you’d like to see your site rank higher in search results. While there are numerous technical elements that need your attention, the major areas to focus on for maximum ranking payoff are loading speed, crawlability, site structure, and mobile-friendliness.

Aleh is the Founder and CMO at SEO PowerSuite and Awario. He can be found on Twitter.


Seven reasons why your rankings dropped and how to fix them

You know that feeling of triumph when your content finally hits the first page of Google and attracts significant traffic? Unfortunately, nobody is safe from a sudden drop in rankings. The trouble is that the reasons behind it can vary and are often far from obvious.

In this post, you’ll discover what could cause a sudden drop in traffic and how to fix the issue.

The tip of the iceberg

Unfortunately, there’s no one-size-fits-all solution when it comes to SEO. When you face a drop in rankings or traffic, you’re only seeing the tip of the iceberg. So, get ready to check plenty of issues before you identify the problem.

Graph on issues that cause ranking drops

Note: Percentages assigned in the above graph are derived from personal observation.

I’ve illustrated the most common reasons for a plummet. Start by checking these parameters to find out how you can recover your rankings and drive traffic back to your website.

Algorithm tests

First of all, check the SERP. What if it’s not only your website that changed its positions in search results? These sharp shifts may happen when Google tests its algorithms. In this case, you don’t even have to take any further steps, as the rankings will be restored soon.

If you track your rankings with Serpstat, you can analyze your competitors’ positions as well. It’ll help you understand whether the SERP has been changing a lot lately. From the moment you create a new project, the tool starts tracking the history of changes in the top 100 search rankings for the selected keywords. The “Storm” graph illustrates the effect of the changes that have occurred in the search results.

The "Storm" graph that illustrates the factors causing the ranking drop

On this chart, you can see that for the “cakes for dads” keyword the storm score was pretty high on 21 March. Now, let’s look at how the top 10 positions were changing on that date.

Graph showing a phrase-wise rise and drop in the SERP

The graph shows a sharp drop and rise that occurred in most of the positions. In a few days, all the rankings were back to normal again.

This example tells us that whenever you witness a significant drop in your search rankings, you should start by analyzing the whole SERP. If there’s a high storm score, all you need to do is wait a bit.

If you’ve checked your competitors’ positions and didn’t see any movement, here’s the next step for you.

Technical issues

Technical SEO affects how search robots crawl and index your site’s content. Even if your website is technically optimized, trouble can occur every time you add or remove files or pages. So, make sure you’re aware of the technical SEO issues on your site. With Google’s URL Inspection tool, you can check the way search engines see your website.

These are the main factors crucial for your rankings:

1. Server overload

If your server isn’t prepared for traffic surges, it can take your site down at any minute. To fix this problem, you can add a CDN, cache your content, set up a load balancer, or move to cloud hosting.

2. Page speed

The more images, files, and pop-ups you add to your content, the longer your pages take to load. Keep in mind that page speed isn’t only a ranking factor; it also influences user experience. To quickly check the issue, you can go with Google’s PageSpeed Insights. And to speed up your website, you can:

  • Minimize HTTP requests or minify and combine files
  • Use asynchronous loading for CSS and JavaScript files
  • Defer JavaScript loading
  • Minimize time to first byte
  • Reduce server response time
  • Enable browser caching
  • Reduce image sizes
  • Use a CDN (again)
  • Optimize CSS delivery
  • Prioritize above-the-fold content with lazy loading (see the sketch after this list)
  • Reduce the number of plugins you use on your site
  • Reduce redirects and external scripts
  • Monitor mobile page speed
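As an example of the lazy-loading item above, here is a minimal IntersectionObserver sketch; it assumes your below-the-fold images store their real URL in a data-src attribute. Most modern browsers also support the native loading="lazy" attribute, which may be all you need.

    // Lazy-load below-the-fold images so above-the-fold content gets priority.
    // Images are assumed to carry their real URL in a data-src attribute.
    const lazyImages = document.querySelectorAll('img[data-src]');

    const observer = new IntersectionObserver((entries, obs) => {
      for (const entry of entries) {
        if (!entry.isIntersecting) continue;
        const img = entry.target;
        img.src = img.dataset.src;       // swap in the real image near the viewport
        img.removeAttribute('data-src');
        obs.unobserve(img);
      }
    }, { rootMargin: '200px' });         // start loading slightly before the image scrolls in

    lazyImages.forEach((img) => observer.observe(img));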

3. Redirections

It’s the most common cause of lost rankings. When you migrate to a new server or change the structure of your site, never forget to set up 301 redirects. Otherwise, search engines will either fail to index your new pages or even penalize your site for duplicate content.
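If your site runs on Node.js, a redirect map like the sketch below (using Express, with hypothetical paths) is one simple way to keep old URLs pointing at their new homes after a restructure.

    // Minimal sketch: map old URLs to new locations with permanent 301 redirects.
    const express = require('express');
    const app = express();

    const redirects = {
      '/old-blog/seo-tips': '/blog/seo-tips',            // hypothetical paths
      '/old-blog/link-building': '/blog/link-building',
    };

    app.use((req, res, next) => {
      const target = redirects[req.path];
      if (target) {
        return res.redirect(301, target); // permanent redirect preserves most link equity
      }
      next();
    });

    app.get('/blog/:slug', (req, res) => res.send(`Post: ${req.params.slug}`));

    app.listen(3000);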

Detecting site errors can be quite difficult, especially when an error lives on only one page. Inspecting every page by hand would be time-consuming, and costly if you’re running a business. To speed up the process of identifying such errors, you can use SEO and site audit tools such as Serpstat or OnCrawl.

Wrong keywords

Are you using the right keywords? If you didn’t consider user intent when collecting your keywords, that may have caused problems. Even if your site was ranking high for these queries for some time, Google could have changed the way it understands your site’s intent.

I’ll provide two examples to illustrate the issue.

Case one

Take the website of an Oxford summer school, oxford-royale.co.uk. The site contained service pages but no long-form descriptions. Once Google began to rank the website for queries with informational intent, SEOs noticed that traffic dropped. After they added more text to the service pages, they succeeded in fixing the problem.

Case two

This case happened to a flower delivery agency. While the website was ranking for transactional queries, everything was fine. Then Google decided the site better suited informational intent. To restore the site’s rankings, SEOs had to add keywords with strong transactional intent, such as “order” and “buy”.

To collect the keywords that are right for your business goals, you can use KWFinder. With the tool, you can identify relevant keywords that you can easily rank for.

Screenshot of a suitable keywords' list in KWFinder

Outdated content

This section doesn’t require a long introduction. If your content isn’t fresh and up to date anymore, people won’t stay long on your site. Moreover, outdated content doesn’t attract shares and links. All of these can give search engines good reason to lower your positions.

There’s an easy way to fix it. Update your content regularly and promote it so you don’t lose traffic. Trends keep changing, and if you’ve written a comprehensive guide on a specific topic, you don’t want it to become outdated. Instead of creating a new guide every time, update the old one with new data.

Lost links

Everybody knows your link profile is a crucial part of your site’s SEO. Website owners put effort into building quality links to new pieces of content. However, even once you’ve earned a large number of backlinks, you shouldn’t stop monitoring your link profile.

To discover whether your link profile has undergone any changes over the last few weeks, turn to Moz or Majestic. These tools will provide you with data on your lost and newly discovered links for the selected period.

Screenshot of discovered and lost linking domains in Moz

If you find out you’ve lost links from trustworthy sources, try to identify why those links were removed. If they’re broken, you can always fix them. If website owners removed your links by accident (for example, when updating their websites), ask them to restore the links. If they did it intentionally, no one can stop you from building new ones.

Poor user experience

User experience is one more thing crucial for your site’s rankings. If Google started ranking your page high in search results and then noticed it didn’t meet users’ expectations, your rankings could suffer a lot.

Search engines usually rely on metrics such as the click-through rate, time spent on your page, bounce rate, the number of visits, and more. That’s why you should remember the following rules when optimizing your site:

1. Provide relevant metadata

As metadata is used to form snippets, it should contain relevant descriptions of your content. First of all, if your snippets aren’t engaging enough, users won’t click through and land on your site. On the other hand, if your snippets make false promises, the bounce rate will increase.

2. Create an effective content structure

It should be easy for users to extract the necessary information. Most of your visitors pay attention to your content structure when deciding whether they’ll read the post.

Break the text into paragraphs and state the main ideas in subheadings. This will help you engage visitors looking for the answer to a very specific question.

3. Avoid complicated design and pop-ups

Content isn’t the only thing your audience looks at. People may also decide to leave your website because of irritating colors, fonts, or pop-up ads. Keep the design simple and minimize the number of annoying windows.

Competition from other websites

What if none of the steps worked? It might mean that your rankings dropped because your competitors were performing better. Monitor changes in their positions and identify the SERP leaders.

You can analyze your competitors’ strategies with Serpstat or Moz. With these tools, you can discover their backlink sources, keywords they rank for, top content, and more. This step will help you come up with ideas of how you could improve your own strategy.

Never stop tracking

You can’t predict whether your rankings will drop one day. It’s much better to notice a problem before you’ve lost traffic and conversions. So, always keep tracking your positions and be ready to react to any changes quickly.

Inna Yatsyna is a Brand and Community Development Specialist at Serpstat. She can be found on Twitter.





A survival kit for SEO-friendly JavaScript websites


JavaScript-powered websites are here to stay. As JavaScript in its many frameworks becomes an ever more popular resource for modern websites, SEOs must be able to guarantee their technical implementation is search engine-friendly.

In this article, we will focus on how to optimize JS-websites for Google (although Bing also recommends the same solution, dynamic rendering).

The content of this article includes:

1. JavaScript challenges for SEO

2. Client-side and server-side rendering

3. How Google crawls websites

4. How to detect client-side rendered content

5. The solutions: Hybrid rendering and dynamic rendering

1. JavaScript challenges for SEO

React, Vue, Angular, Node, and Polymer. If at least one of these fancy names rings a bell, then most likely you are already dealing with a JavaScript-powered website.

All these JavaScript frameworks provide great flexibility and power to modern websites.

They open up a wide range of possibilities in terms of client-side rendering (like allowing the page to be rendered by the browser instead of the server), page load capabilities, dynamic content, user interaction, and extended functionality.

If we only look at what has an impact on SEO, JavaScript frameworks can do the following for a website:

  • Load content dynamically based on users’ interactions
  • Externalize the loading of visible content (see client-side rendering below)
  • Externalize the loading of meta-content or code (for example, structured data)

Unfortunately, if implemented without a pair of SEO lenses, JavaScript frameworks can pose serious challenges to page performance, ranging from speed deficiencies and render-blocking issues to hindered crawlability of content and links.

There are many aspects that SEOs must look after when auditing a JavaScript-powered web page, which can be summarized as follows:

  1. Is the content visible to Googlebot? Remember the bot doesn’t interact with the page (images, tabs, and more).
  2. Are links crawlable, hence followed? Always use an anchor (<a>) with a reference (href=), even in conjunction with “onclick” events (see the sketch after this list).
  3. Is the rendering fast enough?
  4. How does it affect crawl efficiency and crawl budget?
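To make point two concrete, here is a hypothetical before-and-after in plain JavaScript: the first link is invisible to crawlers, the second is crawlable while still supporting client-side behavior.

    // Not crawlable: no <a href>, so the bot sees nothing to follow.
    const badLink = document.createElement('span');
    badLink.textContent = 'Our products';
    badLink.addEventListener('click', () => {
      window.location.href = '/products';
    });

    // Crawlable: a real anchor with an href, optionally enhanced with JavaScript.
    const goodLink = document.createElement('a');
    goodLink.href = '/products';
    goodLink.textContent = 'Our products';
    goodLink.addEventListener('click', (event) => {
      event.preventDefault();                 // progressive enhancement for users
      history.pushState({}, '', '/products');
      // ...render the products view client-side...
    });

    document.body.append(badLink, goodLink);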

A lot of questions to answer. So where should an SEO start?

Below are key guidelines to the optimization of JS-websites, to enable the usage of these frameworks while keeping the search engine bots happy.

2. Client-side and server-side rendering: The best “frenemies”

Probably the most important pieces of knowledge all SEOs need when coping with JS-powered websites are the concepts of client-side and server-side rendering.

Understanding the differences, benefits, and disadvantages of both is critical to deploying the right SEO strategy and not getting lost when speaking with software engineers (who eventually are the ones in charge of implementing that strategy).

Let’s look at how Googlebot crawls and indexes pages, putting it as a very basic sequential process:

how googlebot crawls and indexes pages

1. The client (web browser) places several requests to the server, in order to download all the necessary information that will eventually display the page. Usually, the very first request concerns the static HTML document.

2. The CSS and JS files, referred to by the HTML document, are then downloaded: these are the styles, scripts and services that contribute to generating the page.

3. The Web Rendering Service (WRS) parses and executes the JavaScript (which can manage all or part of the content, or just a simple functionality).
This JavaScript can be served to the bot in two different ways:

  • Client-side: all the job is basically “outsourced” to the WRS, which is now in charge of loading all the script and necessary libraries to render that content. The advantage for the server is that when a real user requests the page, it saves a lot of resources, as the execution of the scripts happens on the browser side.
  • Server-side: everything is pre-cooked (aka rendered) by the server, and the final result is sent to the bot, ready for crawling and indexing. The disadvantage here is that all the job is carried out internally by the server, and not externalized to the client, which can lead to additional delays in the rendering of further requests.

4. Caffeine (Google’s indexer) indexes the content it finds.

5. New links are discovered within the content for further crawling.

This is the theory, but in the real world, Google doesn’t have infinite resources and has to do some prioritization in the crawling.

3. How Google actually crawls websites

Google is a very smart search engine with very smart crawlers.

However, it usually adopts a reactive approach when it comes to new technologies applied to web development. This means that it is Google and its bots that need to adapt to the new frameworks as they become more and more popular (which is the case with JavaScript).

For this reason, the way Google crawls JS-powered websites is still far from perfect, with blind spots that SEOs and software engineers need to mitigate somehow.

This is in a nutshell how Google actually crawls these sites:

how googlebot crawls a JS rendered site

The above graph was shared by Tom Greenaway at the Google I/O 2018 conference, and what it basically says is: if you have a site that relies heavily on JavaScript, you’d better load the JS content very quickly, otherwise it will not be rendered (hence indexed) during the first wave and will be postponed to a second wave, which no one knows when may occur.

Therefore, your client-side rendered content based on JavaScript will probably be rendered by the bots in the second wave, because during the first wave they will load your server-side content, which should be fast enough. But they don’t want to spend too many resources and take on too many tasks.

In Tom Greenaway’s words:

“The rendering of JavaScript powered websites in Google Search is deferred until Googlebot has resources available to process that content.”

The implications for SEO are huge: your content may not be discovered until one, two, or even five weeks later, and in the meantime only your content-less page would be assessed and ranked by the algorithm.

What an SEO should be most worried about at this point is this simple equation:

No content is found = Content is (probably) hardly indexable

And how would a content-less page rank? Easy to guess for any SEO.

So far so good. The next step is learning if the content is rendered client-side or server-side (without asking software engineers).

4. How to detect client-side rendered content

Option one: The Document Object Model (DOM)

There are several ways to find out, and for this we need to introduce the concept of the DOM.

The Document Object Model defines the structure of an HTML (or an XML) document, and how such documents can be accessed and manipulated.

In SEO and software engineering we usually refer to the DOM as the final HTML document rendered by the browser, as opposed to the original static HTML document that lives in the server.

You can think of the HTML as the trunk of a tree. You can add branches, leaves, flowers, and fruits to it (that is the DOM).

What JavaScript does is manipulate the HTML and create an enriched DOM that adds up functionalities and content.

In practice, you can check the static HTML by pressing “Ctrl+U” on any page you are looking at, and the DOM by “Inspecting” the page once it’s fully loaded.

Most of the time, for modern websites, you will see that the two documents are quite different.
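You can run the same comparison programmatically. The sketch below assumes Node.js 18+ and the puppeteer package; a large difference between the two lengths is only a hint, not proof, that important content is rendered client-side.

    // Compare the static HTML with the rendered DOM to see how much content
    // only appears after JavaScript has run.
    const puppeteer = require('puppeteer');

    async function compareHtmlAndDom(url) {
      // 1. The static HTML, as returned by the server ("view source").
      const staticHtml = await (await fetch(url)).text();

      // 2. The DOM after the browser has executed the JavaScript.
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.goto(url, { waitUntil: 'networkidle0' });
      const renderedDom = await page.content();
      await browser.close();

      console.log('Static HTML length:  ', staticHtml.length);
      console.log('Rendered DOM length: ', renderedDom.length);
    }

    compareHtmlAndDom('https://www.example.com/'); // hypothetical URL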

Option two: JS-free Chrome profile 

Create a new profile in Chrome and disallow JavaScript through the content settings (access them directly at chrome://settings/content).

Any URL you browse with this profile will not load any JS content. Therefore, any blank spot in your page identifies a piece of content that is served client-side.

Option three: Fetch as Google in Google Search Console

Provided that your website is registered in Google Search Console (I can’t think of any good reason why it wouldn’t be), use the “Fetch as Google” tool in the old version of the console. This will return a rendering of how Googlebot sees the page and a rendering of how a normal user sees it. Many differences there?

Option four: Run Chrome version 41 in headless mode (Chromium) 

Google officially stated in early 2018 that they use an older version of Chrome (specifically version 41, which anyone can download from here) in headless mode to render websites. The main implication is that a page that doesn’t render well in that version of Chrome can be subject to some crawling-oriented problems.

Option five: Crawl the page with Screaming Frog using the Googlebot user agent

Do this with the JavaScript rendering option disabled, and check whether the content and meta-content are picked up correctly by the bot.

After all these checks, still ask your software engineers, because you don’t want to leave any loose ends.

5. The solutions: Hybrid rendering and dynamic rendering

Asking a software engineer to roll back a piece of great development work because it hurts SEO can be a difficult task.

It happens frequently that SEOs are not involved in the development process, and they are called in only when the whole infrastructure is in place.

We SEOs should all work on improving our relationship with software engineers and make them aware of the huge implications that any innovation can have on SEO.

Getting involved early is how a problem like content-less pages can be avoided from the get-go. The solution rests on two approaches.

Hybrid rendering

Also known as Isomorphic JavaScript, this approach aims to minimize the need for client-side rendering, and it doesn’t differentiate between bots and real users.

Hybrid rendering suggests the following:

  1. On one hand, all the non-interactive code (including the JavaScript) is executed server-side in order to render static pages. All the content is visible to both crawlers and users when they access the page (a compressed sketch follows this list).
  2. On the other hand, only user-interactive resources are then run by the client (the browser). This benefits the page load speed as less client-side rendering is needed.
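To make the idea more tangible, here is a very compressed sketch of isomorphic rendering using Express with React’s server renderer; the component, route, and bundle names are hypothetical, and a real setup would add hydration, routing, and caching.

    // The same React component is rendered to static HTML on the server and
    // later "hydrated" in the browser by the client bundle for interactivity.
    const express = require('express');
    const React = require('react');
    const { renderToString } = require('react-dom/server');

    // A shared component: content is plain markup, interactivity comes later.
    const ProductPage = ({ name }) =>
      React.createElement('main', null,
        React.createElement('h1', null, name),
        React.createElement('button', { id: 'buy' }, 'Add to cart'));

    const app = express();

    app.get('/product/:name', (req, res) => {
      const html = renderToString(React.createElement(ProductPage, { name: req.params.name }));
      res.send(`<!doctype html>
        <html><body>
          <div id="root">${html}</div>
          <!-- client bundle hydrates the markup and attaches event handlers -->
          <script src="/client-bundle.js"></script>
        </body></html>`);
    });

    app.listen(3000);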

Dynamic rendering

This approach aims to detect requests placed by a bot vs the ones placed by a user and serves the page accordingly.

  1. If the request comes from a user: The server delivers the static HTML and makes use of client-side rendering to build the DOM and render the page.
  2. If the request comes from a bot: The server pre-renders the JavaScript through an internal renderer (such as Puppeteer) and delivers the new static HTML (the DOM, manipulated by the JavaScript) to the bot (sketched below).

how google does dynamic rendering javascript sites
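A bare-bones version of this setup might look like the sketch below, using Express and Puppeteer; the bot detection is deliberately simplified, and a production setup would cache the pre-rendered HTML (or use a dedicated renderer such as Rendertron) rather than launching a browser per request.

    // Minimal dynamic rendering sketch: bots get a pre-rendered DOM, regular
    // users get the normal client-side rendered app.
    const express = require('express');
    const puppeteer = require('puppeteer');

    const BOT_UA = /googlebot|bingbot|baiduspider|yandex/i; // simplified list
    const app = express();

    async function prerender(url) {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();
      await page.goto(url, { waitUntil: 'networkidle0' });
      const html = await page.content(); // the DOM after JavaScript has run
      await browser.close();
      return html;
    }

    app.use(async (req, res, next) => {
      if (!BOT_UA.test(req.headers['user-agent'] || '')) {
        return next(); // real users get the normal SPA
      }
      const html = await prerender(`http://localhost:3000${req.originalUrl}`);
      res.send(html);  // bots get ready-to-index HTML
    });

    // Fallback: serve the client-side rendered app (static shell + JS bundle).
    app.use(express.static('dist'));

    app.listen(3000);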

The best of both worlds

Combining the two solutions can also provide great benefit to both users and bots.

  1. Use hybrid rendering if the request comes from a user
  2. Use server-side rendering if the request comes from a bot

Conclusion

As the use of JavaScript in modern websites grows every day through many light and easy frameworks, asking software engineers to rely solely on HTML just to please search engine bots is neither realistic nor feasible.

However, the SEO issues raised by client-side rendering solutions can be successfully tackled in different ways using hybrid rendering and dynamic rendering.

Knowing the available technology, your website infrastructure, your engineers, and the possible solutions can ensure the success of your SEO strategy, even in complicated environments such as JavaScript-powered websites.

Giorgio Franco is a Senior Technical SEO Specialist at Vistaprint.





Voice search optimization guide: Six steps for 2019


Have you ever tried to search for something online while you were multitasking and couldn’t type the text? It would be quite challenging without the option of voice search.

According to a PwC report, 71% of respondents would rather use their voice assistant to search for something than physically type their queries.

And what’s most important is that the differences between spoken and typed queries may produce different SERP results, which in turn means that your competitors’ voice-search-optimized websites have a better chance of engaging your potential customers or subscribers.

Want your website to rank as high for voice queries as for typed ones? This article will give you the six key steps to take in 2019.


Voice search evolution

Remember when voice search required calling a phone number from your mobile device and saying your search query?

That was what voice search looked like in its infancy (to be more precise, in 2010). And to little surprise, few people actually used it.

Since then, voice search has improved significantly.

In June 2011, Google announced it was starting to roll out voice search on Google.com.

At first, the feature could be accessed only in English. Today, we have a choice of about 60 languages supported by Google Voice Search.

With the Hummingbird update in 2013, the concepts of typed and spoken search changed a lot.

The updated algorithm emphasized natural language processing and was aimed at considering the users’ intent and the context of the query. From then on, search questions structured in sentences have returned more relevant answers.

With voice search, people typically ask a long question as they would normally speak. Hence, the Hummingbird update gave a huge push to voice search optimization.

Why do we need to optimize for voice search?

Experts differ in their opinions about voice search optimization. However, most of them agree that it’s an important part of the SEO process.

We spoke with Jenny Halasz, speaker and consultant on SEO, and Shane Barker, digital marketing strategist, to get their insights on voice search optimization for 2019.

“While voice search is certainly the future of how we will do most searches, there’s not really too much you can do to optimize for it that is different than regular SEO optimization,” Jenny says. “Because Google’s goal will always be to return the best result based on the person, location, and history, it’s hard to guess exactly what the right answer for a query will be.”

Similarly, Shane notes:

“Facilitated by the launch of voice-based digital assistants like Siri and Alexa, voice search now constitutes a significant part of all online searches. And its share is only going to rise to a level that SEO experts can’t deny its importance.

The question is, who will be best prepared when voice search takes up a majority share of all searches? And the answer to that is SEO experts who are devoting their time to it now.

However, there is another side to it. Though voice searches are likely to be a really important part of SEO in the future, it is not the case now. While my advice would still be to start preparing for it, I would advise against allocating a substantial part of your budget to it.”

So, start learning about the nuances of SEO for voice search. That said, don’t go overboard and hire a team for it — at least not yet.

The best place to start? Learn about Google’s Hummingbird and other algorithm updates and how they’ve changed the dynamics of SEO.


Now, let’s turn to how to actually go about voice search optimization.

Typed and spoken searches will output different results, which means that optimizing your site for traditional search doesn’t always look the same as optimizing it for voice search.

The most significant concern about voice search? People using voice search on mobile will get only one top result. This one result is affectionately dubbed “position zero,” and everyone wants it.

By 2020, half of searches will be conducted by voice. 

So by 2020, half of your potential customers won’t see your website even if you’re fourth in the SERP.

Ranking number one — or securing position zero — will be the main goal for every business owner.

So, what are the essential factors for you to consider optimizing for voice search?

Six essential factors to consider in voice search optimization

1. Featured snippets

If you’re not familiar with the name “featured snippets,” you almost certainly recognize what they look like. Featured snippets appear at the top of the SERPs. Google pulls the most relevant content and places it in a box like this: 

featured snippet

Featured snippets matter because up to 30% of 1.4 million tested Google queries contain them.

You can be sure that if the results include a featured snippet, your voice assistant will pull its answer from there.

So if you want to rank in voice search results, you should focus on providing such quality data that Google displays it in the featured snippet.


2. User intent

When people search for your website, do they want to buy something or are they looking for information? 

User intent tells us the reason a person typed their query into a search engine in the first place.

Sometimes intent is obvious and clearly expressed in the query with words such as “buy,” “price,” “how to,” “what is,” etc. But other times, intent hides only in a user’s mind.

Intent may or may not be expressed.

Regardless — thanks to the Hummingbird update — Google digs into the context of the search query. They investigate sites’ content and provide you with whatever answer is deemed most relevant.

For example, I may search for “oscar winners.”

In the most likely scenario, I’m interested in the most recent awards ceremony — not in results from 20 years ago.

Search engines understand this, and take it into account.

Therefore, you should consider user intent when creating content, in order to enhance the relevance of your pages to specific search queries.

So if you want to optimize your page for a featured snippet, your main aim should be understanding user intent and giving your audience an immediate answer.

On this topic, Jenny Halasz says:

“Try to match your customers’ intent with your content, seek to answer questions, and provide details wherever possible. The same steps you take now to optimize for answer boxes are going to help you in voice search too.”


3. Long tail keywords & questions

When searching for information via voice assistant, people behave as if they’re talking to a human.

Most of us won’t use short, choppy keywords. We’ll ask questions and use long phrases.

As Shane Barker puts it:

“Use more conversational keywords and phrases that people use while speaking, not while typing. Essentially, these will be long-tail keywords but phrased in the way people speak.”

By the way, using long tail keywords is good practice not only for voice search optimization but also for traditional SEO.

In fact, key phrases containing more than two words face less competition and give you a better chance of ranking at the top.

As I’ve already mentioned, people tend to use questions for voice search along with long phrases.

For instance, when typing a query, a person will likely use the most relevant keywords and write something like “the best coffee in NYC.”

Voice query, on the other hand, sounds much more natural.

First of all, talking to your voice assistant, you would start with “Hey Siri…,” “OK, Google,” etc. These phrases make you think you’re communicating with your device, not just conducting a keyword-based query. So when looking for the best coffee, you’re most likely to ask a question: “Hey Siri, where can I drink the best coffee?”

To find out what questions your target audience may ask (and not spend too much time doing it), you can use tools such as Answer the Public or Serpstat Search Questions. Disclaimer: I work at Serpstat and thus am obviously more familiar with that tool.

Using our tool, for example, you would simply type the word or a phrase best describing the subject of your content. From there, you would be shown how people usually search for that topic. 

long tail keywords and questions

Make sure you answer searcher questions

Shane Barker notes:

“Answer your customers’ common questions on your website or blog. Use a conversational tone for phrasing these questions, to rank well for voice queries.”

Once you’ve chosen the questions you’re answering in your post, add them to the relevant pages across your site.

You can also create H2 headers using these queries and provide answers in the body text. Answer the questions concisely and make sure the main idea is stated briefly.

After you’ve answered the question directly, you can also cover other related search questions. This helps you rank for as many variations of the query as possible.

And to not lose your position in featured snippets, keep your content fresh and update it regularly.


4. Page speed

Page speed means the time needed for your page to load. This also influences whether or not your page will appear in voice search results.

Picture a person using voice search — they’re likely on the go and/or in a hurry.

So to reach those searchers, page speed optimization really is a high priority.

As a first step, analyze your current website speed with PageSpeed Insights.

This tool tells you whether your site’s current loading time is fast enough and offers suggestions for making it even faster.

Note as well that mobile speed is more important than desktop for voice search optimization. This goes for overall design as well — make sure your website is mobile-friendly since a majority of voice searches happen via mobile devices.

page speed


5. Structured data

Structured data is code added to HTML markup and used by search engines to better understand your site’s content.

Using structured data, you can help search engines crawl and read your content efficiently.

With schema markup, you can better control the way you provide information about your brand, and the way machines interpret it.

Implementing structured data can result in rich snippets, which are known to increase click-through rate, drive traffic, and give you a competitive advantage.

In the image below, the parts highlighted in red show where these rich snippets differ from normal ones:

structured data

Having this data can also help your pages appear in featured snippets and, consequently, in voice search results.

Specifically, you can use structured data markup to provide better information to mobile devices about your website and its content.

So, if you do everything correctly and produce content interpreted by search engines as highly relevant (and if you get a bit of luck), your snippet will become featured:

structured data

Here’s what that looks like in the code:

structured data code example
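The code screenshot isn’t reproduced here, but as an illustrative stand-in, this is roughly the kind of JSON-LD that powers review stars; all values are placeholders, and the snippet simply prints the script tag you would embed in the page template.

    // Hypothetical Product markup with an aggregate rating (placeholder values).
    const productSchema = {
      '@context': 'https://schema.org',
      '@type': 'Product',
      name: 'Acme Coffee Beans',
      aggregateRating: {
        '@type': 'AggregateRating',
        ratingValue: '4.7',
        reviewCount: '213',
      },
    };

    // The JSON-LD script tag to embed in the page.
    const jsonLdTag =
      `<script type="application/ld+json">${JSON.stringify(productSchema)}</script>`;
    console.log(jsonLdTag);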

To find out how you can implement structured data on your site, use the Schema.org vocabulary. It provides a set of schemas that lets SEO experts mark up their websites.

When working on structured data, remember that it’s easy to come across as spammy. Only use markup that is relevant to the content you provide.

And finally, keep updating your markup, as everything tends to be constantly changing. As we all know far too well, websites are never finished. 


6. Local SEO

A BrightLocal report found that 58% of consumers use voice search to find local businesses.

This comes as little surprise, since most people use voice search when they’re walking or driving somewhere. Mostly though, people use voice to discover where they should go.

People who search for “best donuts in LA,” are looking to discover donuts near them.

If you own a donut shop in LA, then, you want to include your city when optimizing keywords.

This also holds true for the states, neighborhoods, and even countries where your business operates.

Most importantly, people conducting voice searches will likely use the phrase “near me”.

Say I decide to eat some donuts. I’m more likely to say “OK Google, donuts cafe near me,” than “donuts cafe in Los Angeles.”

In this case, the search engine will use my location to understand which cafes are closest to where I am at the moment.

To appear in the relevant results for such queries, though, you don’t need to add the actual phrase “near me” to your content.

As Jenny Halasz explains:

“Keep in mind that “near me” queries are simply adding a location intent to a search. It’s not necessary to actually use the words “near me” on your site anywhere. If you want to rank for “pizza near me”, then, by all means, track that keyword’s performance on your ranking tools, but don’t worry about putting “near me” in your actual site code.”

Rather, most search robots use Business Listings information.

To optimize for Business Listings information, you would need to go to your Google My Business page.

On there, make sure you’ve added all the necessary information, such as brand name, address, opening hours, etc.

Shane Barker notes:

“Optimize your Google My Business listing and provide accurate and updated contact information. A lot of voice searches are for local queries and listing your business there will help you rank better for such queries.”


To wrap up

People already use voice search widely — and its popularity will only grow dramatically in the coming years.

Those who already take voice search optimization into account in their SEO will improve their content’s visibility significantly, as voice search increasingly narrows results to only the top pages.

While it may seem daunting now, the future with voice search is clear. These six steps to voice search optimization will help you prepare and stay on top.
