How To Boost Your On-Page SEO


Never in the history of the internet have users so strongly craved reliability in the information they consume – whether that information comes from journalists, authors, politicians, or businesses.

In an era where misinformation runs rampant and once-venerable news institutions are now rigorously questioned, people anxiously seek reassurance – proof, even – that the information they are presented is true and from a reputable source.

And search engines like Google know it.

It’s no longer enough to reverse-engineer Google’s algorithms to ensure top ranking in search results. Google now seeks a complex array of indicators that signal the expertise, authoritativeness, and trustworthiness (E-A-T) of your website and the people who created it.

E-A-T is not a ranking factor but features heavily in Google’s Search Quality Evaluator Guidelines – the instructions Google’s quality raters follow when they evaluate page quality to inform algorithm updates. Those guidelines give us insight into what Google considers a top-quality user experience.

This is particularly important in Your Money or Your Life (YMYL) content, so-called for the serious implications it can have on a searcher’s livelihood. Medical advice, stock recommendations, and mortgages are just a few examples of YMYL topics.


How can you demonstrate E-A-T in your content to satisfy readers and achieve your on-page SEO goals?

The Closest Your Website Will Get To Showing Virtue And Integrity

The trickiest aspect of mastering E-A-T lies in its utter simplicity.

E-A-T is about the value, reliability, and integrity of the content. By integrity, we mean providing users with trustworthy information they need in a way they can use. You can’t fake your website’s E-A-T, and you can’t trick Google into thinking you have it when you don’t.

Maybe it’s this elusiveness – the unshakeable “realness” of content that is trustworthy and produced by experts who are authorities in their fields – that has prompted Google to prioritize E-A-T in the way it has.

Google can’t measure E-A-T directly. Instead, Google engineers have identified indicators of trustworthiness, and the presence or absence of these indicators influences a website’s ranking.

You Can Fly Under The Radar Until You Really Get Competitive

In a business world obsessed with (and a little spoiled by) numbers, formulas, automation, and programmable solutions to any conceivable challenge, the idea of E-A-T can drive any marketer to the depths of frustration.


If you find yourself in this position, clear your mind, then walk yourself through this simple truth:

Your customer needs to trust you, your expertise, and your reputation.

This applies to you no matter what line of business you are in. It also applies just as much to a customer looking for you on the internet as it does to a customer standing in front of you at your place of business.

Now, your next question should be: How do I capture that trust on my website?

That means mastering E-A-T, and that’s what we’ll discuss here.

What Is Google’s E-A-T, Really?

Google wants to reward sites that produce high-quality content, as these are the best answers to relevant queries. The search engine also wants to make sure that sites that publish low-quality content get less visibility.

Now, let’s look at each of these factors that make up the E-A-T principle.

Expertise

Demonstrating your expertise is especially important in certain niches (e.g., legal, financial, medical).

The folks at Google want the content on these sites to be written by subject matter experts (SMEs) – people who possess the necessary knowledge and understanding of the field to talk deeply about a specific topic.

This knowledge can be general or highly specialized.

Google also accepts something it calls “everyday expertise.”

Here’s how the company explains this concept in the Search Quality Evaluator Guidelines:

“Some topics require less formal expertise. Many people write extremely detailed, helpful reviews of products or restaurants. Many people share tips and life experiences on forums, blogs, etc.

These ordinary people may be considered experts in topics where they have life experience.

If it seems as if the person creating the content has the type and amount of life experience to make him or her an ‘expert’ on the topic, we will value this ‘everyday expertise’ and not penalize the person/webpage/website for not having ‘formal’ education or training in the field.”

Putting The Plan In Action

Google wants to provide links to websites that have published helpful content that is useful, comprehensive, relevant, and accurate – and this makes perfect sense.


People are coming to your website to find answers to important questions. So it naturally follows that providing inaccurate, unhelpful, or outdated content would be a recipe for SEO disaster.

Google doesn’t want to send its users to incorrect content or websites that deliberately mislead users.

So, make sure the people who create your content possess subject matter expertise and have the resources on hand to do the necessary research and fact-checking.

Authoritativeness

When Google talks about authority, it’s talking about reputation. The stronger your reputation as a knowledge source is within your circle of industry experts, the greater your authority.

When Google sets its raters on your website, they will scour the internet for signals of your authority in your given subject area. They will look in news articles, reviews, references, and even Wikipedia articles.

The raters want to check your (or your website’s) level of authority on the subject your website covers. The stronger that level of authority, the better your site will rank.


Make It Likable And Linkable

Another element that signals authoritativeness on your webpage is linkability. Links – especially the quality of those links – continue to be a top ranking factor.

This is no secret.

For years, we’ve heard links compared to votes, where the more votes you get, the more authoritative (or popular) you are.

It’s hard to get people to vote for you if they don’t know your name, right? The same applies to “votes” for your website content.

Where expertise is having specific knowledge or skills, authoritativeness is what happens when others (inside and outside of your industry) recognize that expertise.

That recognition can come in the form of links, mentions, shares, reviews, or any other type of citation.

It kind of sounds like authoritativeness is like your online reputation, right?

That’s because, in a way, it is. The best way to build that authoritativeness online is to create that useful content discussed in the last section.


Trustworthiness

You want people to trust in your brand or business and be willing to endorse or buy from you.

As in the “real world,” you have to put in a ton of hard work to earn the trust of internet users and search engines.

One way to increase your trustworthiness is by highlighting the credentials of your content creators and the website. Think awards, testimonials, endorsements, and other trust factors.

People have to feel they can trust all the information they find on your website.

Likewise, Google wants to rank websites and content that it can trust. In 2018, Google made an update referred to by some in the industry as the Medic Update that prioritized reputable, well-researched content. This update signaled to marketers just how much emphasis Google was placing on E-A-T.

Trust also ties into Google’s YMYL concept.

What Is YMYL?

Websites that sell products or provide services or information that can impact users’ happiness, health, financial stability, or safety are categorized by Google as YMYL – which stands for “Your Money or Your Life.”


Google’s John Mueller shared some insight into the importance of E-A-T for YMYL websites in a March 2021 Google Search Central SEO hangout.

“I don’t think there is one simple approach to that. And I think especially when it comes to medical content, I think that’s super important that our algorithms are very picky there with regards to what we show. So I would look at the quality rater guidelines and really think about how your site might be perceived by the quality raters.

The quality raters don’t make the algorithms, but they do give us a lot of insight into what we might do in our algorithms. So I would strongly recommend going through that. And I think it’s especially, when it comes to these kind of sites, it’s less about the tactics and really more about making sure that it really is a legitimate business and that it’s backed up by appropriate trustworthy sources.

So not just high quality content, and doing all of this syndication, all of these things. But really making sure that it’s written by [a] doctor, it’s created by medical professionals who are legitimate in their field.”


The bottom line when it comes to YMYL is simple: Make sure that any content on your website will help, not hurt, the people who consume it.

Make your users feel safe.

Take great care of your users, and Google should take great care of you.

Why Is E-A-T Important For Your SEO?

For as long as I can remember, Google has been telling us to create great content. And great content is what appears at the top of Google’s search results.

In some form, Google considers E-A-T when returning search results, so you should, too.

Be aware that E-A-T applies to all types of sites, even those related to gossip, fashion, humor, forums, and Q&As.

This means E-A-T applies to your site. 

Accordingly, your top priority should be creating content that your target audience wants or needs – content that offers true value. Creating these pages for your website should help it perform better in Google’s search results.


And, yes, this is much easier said than done. First, you must have a clear understanding of what Google means by “high-quality content.”

What Is High-Quality Content?

Whatever content you create must have a purpose. Your content must benefit your clients, customers, users, or readers.

Common Traits Of High-Quality Pages

According to the Search Quality Evaluator Guidelines, high-quality pages are those that have:

  •       High levels of Expertise, Authoritativeness, and Trustworthiness (E-A-T).
  •       A satisfying amount of high-quality main content, including a descriptive or helpful title.
  •       Satisfying website information and information about who is responsible for the website (for shopping pages or those that enable financial transactions, this includes satisfying customer service information).
  •       A positive reputation as a website that is responsible for the main content on the page.
  •       A positive reputation for the creator of the main content, if different from that of the website.

The highest quality pages (including YMYL pages) will have an extremely high level of E-A-T, according to Google’s guidelines.

What Is Low-Quality Content?

Low-quality content is, as you’d expect, the exact opposite of high-quality content.

As Google puts it:

“Websites or pages without some sort of beneficial purpose, including pages that are created with no attempt to help users, or pages that potentially spread hate, cause harm, or misinform or deceive users, should receive the Lowest rating.”


If high-quality content helps your site rank higher, it logically follows that low-quality pages could hurt your Google rankings.

If your content is inaccurate, has no purpose, or includes elements that hurt the user experience, it’s unlikely that Google will feature your website prominently in the search engine results pages (SERPs).

Common Traits Of A Low-Quality Page

Here are the characteristics of a low-quality page, according to Google’s guidelines:

  •       The page has an inadequate level of Expertise, Authoritativeness, and Trustworthiness (E-A-T).
  •       The quality of the main content (MC) is low.
  •       There is an unsatisfying amount of MC for the purpose of the page.
  •       The title of the MC is exaggerated or shocking.
  •       Ads or secondary content (SC) distracts from the MC.
  •       There is an unsatisfying amount of website information or information about the creator of the MC for the purpose of the page (no good reason for anonymity).
  •       The website or the MC creator has a mildly negative reputation based on extensive reputation research.

In short, low E-A-T means bad content. Bad content means bad SEO, and bad SEO means you’re missing out on valuable traffic and conversions due to low rankings.

How To Improve Your Website’s E-A-T

Hopefully, you now understand the E-A-T concept and why it’s important.

So, how can you make sure your website content is high quality and also boost your on-page SEO efforts?


Here are some best practices to follow when creating new content.

1. Identify Your Authors With A Byline And Bio

Think about the last time you landed on a blog where some content was published by “Admin” or some random guy with no last name. Did you trust that site? Was the content amazing?

No and no.

Google’s guidelines advise creating articles with “journalistic professionalism.”

Part of that professionalism means every piece of content you publish should have the writer’s name – their byline – attached to it.

Here’s how Search Engine Journal highlights the bylines of its articles:

Screenshot from SearchEngineJournal.com, December 2021

Identify All Your Content Contributors


Ideally, you should highlight the biographical details of every person who creates content for you – whether that’s blog posts, articles, or question and answer pages.

Is the author of your content a recognized expert in your field? Then you definitely want to highlight that.

You can do so on a separate bio page that also contains the author’s past content or even at the bottom of the article.

Search Engine Journal does both. At the bottom of any SEJ article, you’ll see an author’s box like this:

Author bio. Screenshot from SearchEngineJournal.com, December 2021

Clicking on [Read full bio] leads to my full bio page, with information that establishes who I am and what I do.


What To Include On A Bio Page

Here are some essential elements of a good bio page:

  •       Full name.
  •       Headshot.
  •       Title/position.
  •       A detailed bio.
  •       Contact information (e.g., email form, social media).

Doing all of this makes it easy for users (and Google) to know who created the content and assess their individual E-A-T.
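If your publishing platform supports structured data, you can also reinforce bylines in a machine-readable way with schema.org Article markup that names the author. A minimal sketch – every name and URL below is a placeholder, not a required format:

<!-- Sketch only: replace the placeholder author details with real ones. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How To Boost Your On-Page SEO",
  "author": {
    "@type": "Person",
    "name": "Jane Doe",
    "url": "https://www.example.com/author/jane-doe",
    "jobTitle": "Senior SEO Strategist",
    "sameAs": [
      "https://twitter.com/janedoe",
      "https://www.linkedin.com/in/janedoe"
    ]
  }
}
</script>

The sameAs links point to the author’s profiles elsewhere on the web, which helps connect the byline to the person’s wider reputation.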

2. Make Your Contact Info Easy To Find

When visitors arrive on your landing pages, is it easy to find your contact information?

Can they quickly determine how to get customer support?

Remember, E-A-T evaluates your website as a whole. The easiest solution is to make sure you link to your About Us and Contact Us pages in either your main or footer navigation.

If you don’t have those pages on your website, make them now!
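Alongside visible About Us and Contact Us pages (not instead of them), you can also express contact details in schema.org Organization markup. A minimal sketch; every value below is a placeholder:

<!-- Sketch only: replace placeholder values with your real details. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com",
  "contactPoint": {
    "@type": "ContactPoint",
    "telephone": "+1-555-010-0000",
    "contactType": "customer service",
    "email": "support@example.com"
  }
}
</script>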

3. Remove Or Improve Your Low-Quality Content

As Search Engine Journal’s Executive Editor Danny Goodwin puts it: you have to decide whether to improve or remove your old or outdated content. SEJ embraced this process and doubled its site traffic in just over a year, according to Goodwin.


If you have content that is no longer useful – or is just so terrible that it’s not worth the time investment to update or improve it – then pruning that content is one quick way to improve your E-A-T.

Giving Your Content A Makeover

Removing content should always be your last resort, but if it needs to be done, do it without hesitation. Ideally, you want to identify any content that looks like it has low E-A-T and figure out ways you can reverse that.

Here are some ways you could increase E-A-T:

  •       Have a more authoritative person write the content.
  •       Add quotes from experts, data, sources, or citations.
  •       Make some simple edits to improve the readability, grammar, spelling, and structure.
  •       Add more information to make it more comprehensive.
  •       Write a new and better title.
  •       Add some visual appeal, such as photos, charts, screenshots (and make sure to optimize those images).
  •       Add a video for people who prefer that format vs. text only (this has the added benefit of potentially keeping visitors on your site longer).

The process of elevating content definitely takes longer, but doing so will greatly improve your website’s E-A-T and performance. This process is especially crucial for any YMYL page.

4. Create A Positive Brand Reputation

A positive brand reputation is key to both growing your business and your E-A-T.


One way you can do this is through thought leadership. If you can share insights that your target audience truly finds valuable, this can push them down the path to conversion.

High-quality thought leadership content is good for winning, keeping, and growing a business. This will also help you build authority in your niche and help Google trust you.

Thought leadership is incredibly powerful when done right, so make sure you aren’t underwhelming your audience!


Summary

Is E-A-T a ranking factor?

This is the wrong question to ask, in my opinion. Let’s forget about ranking factors for a moment and think about your audience instead.

If you’re doing everything outlined in Google’s E-A-T guidelines, then you’re creating informative, useful, high-quality content that your audience wants and helping them accomplish a task (e.g., acquiring knowledge, buying a product).

In other words, you’re providing a satisfying user experience.


Anything good for users is good for helping you rank in Google – and driving the traffic and conversions you really want.


Image credits: Paulo Bobita/Search Engine Journal





Surfer SEO Unveils New Semrush Integration


Surfer SEO has a new growth management tool that includes backlink data provided by industry leader Semrush.

Here’s what the integration means for users.

Surfer SEO

Surfer SEO is a machine learning tool used by SEOs to conduct keyword research, create content strategies, and generate AI outlines.

Companies such as Square, ClickUp, and Shopify use the tool, in addition to over 15,000 other brands and agencies.

Semrush

Semrush is an enterprise SEO tool marketers use to audit websites, conduct competitor analysis, and develop content strategies.

IBM, Tesla, and Amazon are just a few of the companies using it, in addition to more than 10 million marketing professionals.

Starting today, Surfer SEO will use Semrush’s backlink data to expand Grow Flow task recommendations.

Grow Flow

Grow Flow is like your friendly SEO AI assistant.

Described on the Surfer website as an “AI growth management platform,” the tool provides a few SEO tasks each week to help you stay on track — and not get lost in the overwhelming amount of information that comes with learning SEO.

For example, it may recommend adding keywords (from Google Search Console) that a website is ranking for but not explicitly addressing in an article.

Screenshot from Grow Flow, June 2022

Or, the tool may recommend where to add internal links.

Internal link recommendations. Screenshot from Grow Flow, June 2022

It can also recommend new content topics that users can open in a content editor at the push of a button.

New content topics in the Content Editor. Screenshot from Grow Flow, June 2022

The integration with Semrush comes into play in the Grow Flow recommendations.

Once you connect Surfer to the freemium Semrush account, you unlock new tasks.

Connecting Semrush to Surfer. Screenshot from Grow Flow, June 2022

Voila – a list of referring domains every week pulled from your competitors!

Automated backlink research. Screenshot from Grow Flow, June 2022

Manually researching a competitor’s backlinks is a considerable drain on internal resources. Thanks to this integration, you’ll discover new opportunities automatically.

An Industry First

Until now, Semrush has never integrated with another SEO software company.

Historically, Semrush has only integrated with Google products like Search Console and Analytics, and social media networks like Facebook and Twitter.

It’s also worked with task management tools like Trello and Monday.com – but never a direct SEO competitor.

Semrush’s “SEO writing assistant” feature is a direct competitor to Surfer’s “SEO Content Editor”: one of its most popular features.

So, why partner and offer this fantastic AI assistant to search marketers – for free?

I asked Tomasz Niezgoda, Surfer SEO’s Marketing Executive and Partner, how this partnership came to be.

“Pretty straightforward,” he said. “Semrush reached out to us with a proposition to integrate Surfer inside their marketplace [and] after some time, we decided to give it a shot and started working on this integration.”

From the beginning, Semrush felt like this would be a successful integration because it was applying its backlink data in a customer-oriented way.

“For Semrush, it’s a very meaningful integration. It allows Surfer SEO users to gain valuable link building insights and knowledge, which is crucial for ranking their content,” said Eugene Levin, President of Semrush.

What This Means

Two major SEO competitors are working together to create a free tool for small business owners and entry-level marketers to develop weekly best practices.

Matt Diggity, CEO and Founder at Diggity Marketing, calls the tool “Simple. Efficient. Automated.”

“You get the list every week and gain immediate insights [into] which referrals your competition is getting. Then, you can start working on your own link-building strategy straight away.”

The two giants hint that combining Semrush’s backlink data with the machine learning power of Surfer is only the beginning.

“We like each other,” said Niezgoda. “Maybe it’s just the first integration that’s coming.”


Featured Image: Screenshot from Grow Flow, June 2022





A Complete Google Search Console Guide For SEO Pros


Google Search Console provides data necessary to monitor website performance in search and improve search rankings, much of which is available exclusively through Search Console.

This makes it indispensable for online businesses and publishers keen to maximize success.

Taking control of your search presence is easier when you use its free tools and reports.

What Is Google Search Console?

Google Search Console is a free web service hosted by Google that provides a way for publishers and search marketing professionals to monitor their overall site health and performance relative to Google search.

It offers an overview of metrics related to search performance and user experience to help publishers improve their sites and generate more traffic.

Search Console also provides a way for Google to communicate when it discovers security issues (like hacking vulnerabilities) and if the search quality team has imposed a manual action penalty.

Important features:

  • Monitor indexing and crawling.
  • Identify and fix errors.
  • Overview of search performance.
  • Request indexing of updated pages.
  • Review internal and external links.

Using Search Console isn’t necessary to rank better, nor is it a ranking factor.

However, its usefulness makes it indispensable for improving search performance and bringing more traffic to a website.

How To Get Started

The first step to using Search Console is to verify site ownership.

Google provides several different ways to accomplish site verification, depending on whether you’re verifying a website, a domain, a Google site, or a Blogger-hosted site.

Domains registered with Google Domains are automatically verified by adding them to Search Console.

The majority of users will verify their sites using one of four methods:

  1. HTML file upload.
  2. Meta tag.
  3. Google Analytics tracking code.
  4. Google Tag Manager.

Some site hosting platforms limit what can be uploaded and require a specific way to verify site owners.

But, that’s becoming less of an issue as many hosted site services have an easy-to-follow verification process, which will be covered below.

How To Verify Site Ownership

There are two standard ways to verify site ownership with a regular website, like a standard WordPress site.

  1. HTML file upload.
  2. Meta tag.

When verifying a site using either of these two methods, you’ll be choosing the URL-prefix properties process.

Let’s stop here and acknowledge that the phrase “URL-prefix properties” means absolutely nothing to anyone but the Googler who came up with that phrase.

Don’t let that make you feel like you’re about to enter a labyrinth blindfolded. Verifying a site with Google is easy.

HTML File Upload Method

Step 1: Go to the Search Console and open the Property Selector dropdown that’s visible in the top left-hand corner on any Search Console page.

Screenshot by author, May 2022

Step 2: In the pop-up labeled Select Property Type, enter the URL of the site then click the Continue button.

Step 2. Screenshot by author, May 2022

Step 3: Select the HTML file upload method and download the HTML file.

Step 4: Upload the HTML file to the root of your website.

Root means https://example.com/. So, if the downloaded file is called verification.html, then the uploaded file should be located at https://example.com/verification.html.

Step 5: Finish the verification process by clicking Verify back in the Search Console.

Verifying a standard website with its own domain on website platforms like Wix and Weebly is similar to the above steps, except that you’ll be adding a meta tag to your site instead of uploading a file.
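For reference, the meta tag method amounts to pasting one tag into the <head> section of your homepage. A sketch with a placeholder token (Search Console generates the real value during verification):

<head>
  <!-- Placeholder token: copy the real tag from Search Console. -->
  <meta name="google-site-verification" content="YOUR-TOKEN-FROM-SEARCH-CONSOLE" />
</head>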

Duda has a simple approach that uses a Search Console App that easily verifies the site and gets its users started.

Troubleshooting With GSC

Ranking in search results depends on Google’s ability to crawl and index webpages.

The Search Console URL Inspection Tool warns of any issues with crawling and indexing before they become major problems and pages start dropping from the search results.

URL Inspection Tool

The URL inspection tool shows whether a URL is indexed and is eligible to be shown in a search result.

For each submitted URL a user can:

  • Request indexing for a recently updated webpage.
  • View how Google discovered the webpage (sitemaps and referring internal pages).
  • View the last crawl date for a URL.
  • Check if Google is using a declared canonical URL or is using another one.
  • Check mobile usability status.
  • Check enhancements like breadcrumbs.

Coverage

The coverage section shows Discovery (how Google discovered the URL), Crawl (shows whether Google successfully crawled the URL and if not, provides a reason why), and Enhancements (provides the status of structured data).

The coverage section can be reached from the left-hand menu:

Coverage report. Screenshot by author, May 2022

Coverage Error Reports

While these reports are labeled as errors, it doesn’t necessarily mean that something is wrong. Sometimes it just means that indexing can be improved.

For example, in the following screenshot, Google is showing a 403 Forbidden server response to nearly 6,000 URLs.

The 403 error response means that the server is telling Googlebot that it is forbidden from crawling these URLs.

Coverage report showing 403 server error responses. Screenshot by author, May 2022

The above errors are happening because Googlebot is blocked from crawling the member pages of a web forum.

Every member of the forum has a member page that has a list of their latest posts and other statistics.

The report provides a list of URLs that are generating the error.

Clicking on one of the listed URLs reveals a menu on the right that provides the option to inspect the affected URL.

There’s also a contextual menu to the right of the URL itself in the form of a magnifying glass icon that also provides the option to Inspect URL.

Inspect URL. Screenshot by author, May 2022

Clicking on Inspect URL reveals how the page was discovered.

It also shows the following data points:

  • Last crawl.
  • Crawled as.
  • Crawl allowed?
  • Page fetch (if failed, provides the server error code).
  • Indexing allowed?

There is also information about the canonical used by Google:

  • User-declared canonical.
  • Google-selected canonical.

For the forum website in the above example, the important diagnostic information is located in the Discovery section.

This section tells us which pages are the ones that are showing links to member profiles to Googlebot.

With this information, the publisher can now code a PHP statement that will make the links to the member pages disappear when a search engine bot comes crawling.
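A minimal sketch of what such a PHP statement might look like, assuming the bots can be identified by their User-Agent string (the link URL and bot list here are illustrative, not the publisher’s actual code):

<?php
// Hide member-profile links from known crawlers so they stop
// requesting pages that answer bots with a 403.
function is_search_bot(): bool {
    $ua = $_SERVER['HTTP_USER_AGENT'] ?? '';
    return (bool) preg_match('/Googlebot|bingbot|DuckDuckBot/i', $ua);
}

if (!is_search_bot()) {
    echo '<a href="/members/profile.php?id=123">View member profile</a>';
}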

Another way to fix the problem is to write a new entry to the robots.txt to stop Google from attempting to crawl these pages.
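For example, if all member pages live under a single directory, a couple of robots.txt lines would do it (the paths are hypothetical; match them to the forum’s actual URL structure):

# Stop crawling of forum member pages (hypothetical paths).
User-agent: *
Disallow: /members/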

By making this 403 error go away, we free up crawling resources for Googlebot to index the rest of the website.

Google Search Console’s coverage report makes it possible to diagnose Googlebot crawling issues and fix them.

Fixing 404 Errors

The coverage report can also alert a publisher to 404 and 500 series error responses, as well as communicate that everything is just fine.

A 404 server response is called an error only because the browser or crawler’s request for the webpage was made in error: the requested page does not exist.

It doesn’t mean that your site is in error.

If another site (or an internal link) links to a page that doesn’t exist, the coverage report will show a 404 response.

Clicking on one of the affected URLs and selecting the Inspect URL tool will reveal what pages (or sitemaps) are referring to the non-existent page.

From there you can decide if the link is broken and needs to be fixed (in the case of an internal link) or redirected to the correct page (in the case of an external link from another website).

Or, it could be that the webpage never existed and whoever is linking to that page made a mistake.

If the page doesn’t exist anymore or it never existed at all, then it’s fine to show a 404 response.
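When a redirect is the right fix, a permanent (301) redirect at the server level is the usual approach. On an Apache server, a one-line sketch in .htaccess might look like this (both paths are illustrative):

# .htaccess: permanently redirect a removed page to its replacement.
Redirect 301 /old-guide.html https://www.example.com/new-guide/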

Taking Advantage Of GSC Features

The Performance Report

The top part of the Search Console Performance Report provides multiple insights on how a site performs in search, including in search features like featured snippets.

There are four search types that can be explored in the Performance Report:

  1. Web.
  2. Image.
  3. Video.
  4. News.

Search Console shows the web search type by default.

Change which search type is displayed by clicking the Search Type button:

Default search type. Screenshot by author, May 2022

A menu pop-up will display allowing you to change which kind of search type to view:

Search Types menu. Screenshot by author, May 2022

A useful feature is the ability to compare the performance of two search types within the graph.

Four metrics are prominently displayed at the top of the Performance Report:

  1. Total Clicks.
  2. Total Impressions.
  3. Average CTR (click-through rate).
  4. Average position.
Top section of the Performance Report. Screenshot by author, May 2022

By default, the Total Clicks and Total Impressions metrics are selected.

By clicking within the tabs dedicated to each metric, one can choose to see those metrics displayed on the bar chart.

Impressions

Impressions are the number of times a website appeared in the search results. As long as a user doesn’t have to click a link to see the URL, it counts as an impression.

Additionally, if a URL is ranked at the bottom of the page and the user doesn’t scroll to that section of the search results, it still counts as an impression.

High impressions are great because they mean Google is showing the site in the search results.

But the impressions metric only becomes meaningful when considered alongside the Clicks and Average Position metrics.

Clicks

The clicks metric shows how often users clicked from the search results to the website. A high number of clicks in addition to a high number of impressions is good.

A low number of clicks and a high number of impressions is less good but not bad. It means that the site may need improvements to gain more traffic.

The clicks metric is more meaningful when considered with the Average CTR and Average Position metrics.

Average CTR

The average CTR is a percentage representing how often users clicked from the search results to the website. For example, 50 clicks on 1,000 impressions works out to a 5% CTR.

A low CTR means that something needs improvement in order to increase visits from the search results.

A higher CTR means the site is performing well.

This metric gains more meaning when considered together with the Average Position metric.

Average Position

Average Position shows where in the search results the website tends to appear.

An average in positions one to 10 is great.

An average position in the twenties (20 – 29) means that the site is appearing on page two or three of the search results. This isn’t too bad. It simply means that the site needs additional work to give it that extra boost into the top 10.

Average positions beyond 30 could (in general) mean that the site may benefit from significant improvements.

Or, it could be that the site ranks for a large number of keyword phrases that rank low and a few very good keywords that rank exceptionally high.

In either case, it may mean taking a closer look at the content. It may be an indication of a content gap on the website, where the content that ranks for certain keywords isn’t strong enough and may need a dedicated page devoted to that keyword phrase to rank better.

All four metrics (Impressions, Clicks, Average CTR, and Average Position), when viewed together, present a meaningful overview of how the website is performing.

The big takeaway about the Performance Report is that it is a starting point for quickly understanding website performance in search.

It’s like a mirror that reflects back how well or poorly the site is doing.

Performance Report Dimensions

Scrolling down to the second part of the Performance page reveals what are called the Dimensions of a website’s performance data.

There are six dimensions:

1. Queries: Shows the top search queries and the number of clicks and impressions associated with each keyword phrase.

2. Pages: Shows the top-performing web pages (plus clicks and impressions).

3. Countries: Top countries (plus clicks and impressions).

4. Devices: Shows the top devices, segmented into mobile, desktop, and tablet.

5. Search Appearance: This shows the different kinds of rich results that the site was displayed in. It also tells if Google displayed the site using Web Light results and video results, plus the associated clicks and impressions data. Web Light results are results that are optimized for very slow devices.

6. Dates: The dates tab organizes the clicks and impressions by date. The clicks and impressions can be sorted in descending or ascending order.

Keywords

Keywords are displayed in the Queries dimension of the Performance Report (as noted above). The Queries report shows the top 1,000 search queries that resulted in traffic.

Of particular interest are the low-performing queries.

Some of those queries generate little traffic simply because they are rare – what’s known as long-tail traffic.

But others come from webpages that could need improvement: perhaps the page needs more internal links, or the keyword phrase deserves a dedicated webpage of its own.

It’s always a good idea to review the low-performing keywords because some of them may be quick wins that, when the issue is addressed, can result in significantly increased traffic.

Links

Search Console offers a list of all links pointing to the website.

However, it’s important to point out that the links report does not represent links that are helping the site rank.

It simply reports all links pointing to the website.

This means that the list includes links that are not helping the site rank. That explains why the report may show links that have a nofollow link attribute on them.

The Links report is accessible from the bottom of the left-hand menu:

Links report. Screenshot by author, May 2022

The Links report has two columns: External Links and Internal Links.

External Links are the links from outside the website that point to the website.

Internal Links are links that originate within the website and link to somewhere else within the website.

The External links column has three reports:

  1. Top linked pages.
  2. Top linking sites.
  3. Top linking text.

The Internal Links report lists the Top Linked Pages.

Each report (top linked pages, top linking sites, etc.) has a link to more results that can be clicked to view and expand the report for each type.

For example, the expanded report for Top Linked Pages shows Top Target pages, which are the pages from the site that are linked to the most.

Clicking a URL will change the report to display all the external domains that link to that one page.

The report shows the domain of the external site but not the exact page that links to the site.

Sitemaps

A sitemap is generally an XML file that is a list of URLs that helps search engines discover the webpages and other forms of content on a website.

Sitemaps are especially helpful for large sites, sites that are difficult to crawl, and sites that frequently add new content.

Crawling and indexing are not guaranteed. Things like page quality, overall site quality, and links can have an impact on whether a site is crawled and pages indexed.

Sitemaps simply make it easy for search engines to discover those pages and that’s all.

Creating a sitemap is easy because most are automatically generated by the CMS, a plugin, or the website platform where the site is hosted.

Some hosted website platforms generate a sitemap for every site hosted on its service and automatically update the sitemap when the website changes.
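If you ever need to build or inspect one by hand, a minimal XML sitemap is just a list of <url> entries. A sketch with placeholder URLs and dates:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2022-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/on-page-seo/</loc>
    <lastmod>2022-04-15</lastmod>
  </url>
</urlset>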

Search Console offers a sitemap report and provides a way for publishers to upload a sitemap.

To access this function, click on the Sitemaps link in the left-side menu.


The sitemap section will report on any errors with the sitemap.

Search Console can be used to remove a sitemap from the reports. However, it’s important to actually remove the sitemap from the website itself; otherwise, Google may remember it and crawl it again.

Once submitted and processed, the Coverage report will populate a sitemap section that will help troubleshoot any problems associated with URLs submitted through the sitemaps.

Search Console Page Experience Report

The page experience report offers data related to the user experience on the website relative to site speed.

Search Console displays information on Core Web Vitals and Mobile Usability.

This is a good starting place for getting an overall summary of site speed performance.

Rich Result Status Reports

Search Console offers feedback on rich results through the Performance Report. It’s one of the six dimensions listed below the graph that’s displayed at the top of the page, listed as Search Appearance.

Selecting the Search Appearance tab reveals clicks and impressions data for the different kinds of rich results shown in the search results.

This report communicates how important rich results traffic is to the website and can help pinpoint the reason for specific website traffic trends.

The Search Appearance report can help diagnose issues related to structured data.

For example, a downturn in rich results traffic could be a signal that Google changed structured data requirements and that the structured data needs to be updated.

It’s a starting point for diagnosing a change in rich results traffic patterns.

Search Console Is Good For SEO

In addition to the above benefits of Search Console, publishers and SEOs can also upload link disavow reports, resolve penalties (manual actions), and security events like site hackings, all of which contribute to a better search presence.

It is a valuable service that every web publisher concerned about search visibility should take advantage of.



Featured Image: bunny pixar/Shutterstock





New Updates To Google Page Experience Scoring Revealed At SEODay


In an online session at SEODay 2022, Google Search Advocate John Mueller spoke about the impact of page experience on search engine rankings and changes to how the search engine scores sites.

One of the changes revealed was that Google now bases desktop search results on a site’s desktop experience – and mobile search results on a site’s mobile experience.

He also discussed the three primary metrics the search engine uses in determining experience scores: Largest Contentful Paint (LCP), First Input Delay (FID), and Cumulative Layout Shift (CLS).

“This is not a tie-breaker,” Mueller said. “It won’t make or break your website in terms of search, but it is a factor that comes into play in regards to ranking between different results.”

Google has also added a new page experience metric: Interaction to Next Paint, or INP.

Google initially announced INP at I/O 2022, and while Mueller was clear that it is not a direct ranking factor, he discussed INP as something that may play a role in the future.

Search Console Insights Provides Easier Way To Track Search Rankings

Mueller spent the first part of his presentation discussing the benefits of Search Console Insights. Using Search Console data, alongside analytics, users can generate custom reports and get a different view of the data.

He specifically mentioned using BigQuery and Data Studio as “a way of connecting different data sources together and creating really fancy reports.”

Google is also working on expanding its Search Console APIs, Mueller said, which will allow users to connect these APIs to code on their sites.

Possible uses Mueller mentioned include monitoring top queries and checking to see if specific URLs are indexed.
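For instance, the existing Search Analytics API can already pull a site’s top queries with one authenticated request. A sketch of the raw HTTP call (the site URL, token, and dates are placeholders):

POST https://www.googleapis.com/webmasters/v3/sites/https%3A%2F%2Fwww.example.com%2F/searchAnalytics/query
Authorization: Bearer YOUR-OAUTH-TOKEN
Content-Type: application/json

{
  "startDate": "2022-05-01",
  "endDate": "2022-05-31",
  "dimensions": ["query"],
  "rowLimit": 10
}

The response lists each query with its clicks, impressions, CTR, and position, which makes it straightforward to feed into custom reports.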

Videos & Images Take On More Prominent Role In Search

At I/O earlier this year, Google previewed a set of video reports coming to Search Console: a response to a growing appetite for this type of media in search results.

“We see that people love videos and authentic images in search results, so we try to show them more,” Mueller said.

In this growing trend, he included web stories, a collection of pages that often have videos. To facilitate their use, Google now offers a WordPress plugin for creating them.

Authentic Product Reviews Factored Into Rankings

Ecommerce has been trending upward, with the global market expected to surpass $5.5 trillion this year. In its algorithm, Google includes what Mueller termed “authentic reviews” to accommodate digital shoppers better.

“People have high expectations of reviews they find online, so we’ve also worked specifically on updates to algorithms with regards to ranking these product reviews,” he said.

Other Updates From Mueller

At SEODay, Mueller said Google has slightly changed its terminology, with the term “title links” now being used to refer only to the title of a search result.

The search engine giant has also added a new robots meta tag, “indexifembedded.”

Users can leverage the meta tag when they embed content on a main page and want to control whether that embedded piece of content gets indexed.
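Per Google’s documentation, indexifembedded only takes effect alongside a noindex rule: it tells Google it may index the content when it appears embedded in another page despite the noindex. A sketch of the tags placed on the embedded resource itself:

<!-- On the page that gets embedded (e.g., via an iframe): -->
<meta name="googlebot" content="noindex" />
<meta name="googlebot" content="indexifembedded" />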

Mueller also said Google’s blog was the best source of information on any SEO-related topic.

“With any kind of bigger update… it’s sometimes really tricky to tell folks what they should be looking at specifically,” he noted. “So we have a fairly comprehensive blog post.”


Featured Image: BestForBest/Shutterstock




