How to get buy-in from the C-suite

30-second summary:

  • SEOs and search marketing managers often struggle to convey value to the board, which hampers funding and support for implementing strategy
  • There are three aspects you need to balance in order to win over the C-suite
  • Kevin Indig, Director of SEO at Shopify, helps you navigate these crucial conversations

Your best ideas aren’t worth a dime without funding. What’s the key to funding? Executive buy-in! To understand how to get buy-in, you need to know your audience: the mighty C-suite.

Executives are busy, stressed, and care about three things and three things only: 1. market share, 2. revenue, 3. talent. They want to know whether the company is capturing more of the market, making more money, and hiring the right people. Mind you, a healthy team and culture are part of talent.

So, whatever you need funding for needs to have a direct line to one of these three factors. Only a few projects can live outside of these and provide enough strategic value to be considered. Everything else gets a friendly head nod and then collects dust in backlog hell. Relevance is important!

But your success will also depend on strong storytelling. Think about it like packaging. A sports car needs a nice chassis, an iPhone needs a classy box, and your presentation needs a captivating narrative.

Designing a narrative

Stories are how we retain information. I’m not going to give you the whole spiel about how humans told stories around campfires and painted the walls of caves. Let’s just say our brains still connect information with stories because they trigger emotions. We imagine ourselves to be part of the narrative. It even activates certain parts of the brain, as if we were really in it!

Storytelling has two key components: a problem and a solution. The problem needs to be big, timely, and relevant. You don’t want to cut the problem definition short but take your time showing what the root issue is, its magnitude, and how it is connected to other problems. This is called issue framing. In the end, your audience should think, “We need to take care of this right now!”

Emphasize the problem with data or a strong line of reasoning. The executives should be able to see the issue in one paragraph or on one slide without too much explanation. This is an important data visualization challenge. Problems often come down to a simple display of something not trending in the right direction, or being too small or too large compared to something else.

Seek to connect the issue to a larger goal of the organization or an existing problem. This is easier to grasp than dealing with a completely new problem. Plus, connecting your problem with another one has a carry-over effect of relevance. Suddenly, your point is top of mind.

The solution to the problem can be a set of prioritized actions or an outcome. Just like the problem, keep the solution simple. “Here are three things we’re going to do about it.” Show the time horizon and resources you need to solve the problem. You should be able to show one to three metrics to measure progress against the solution to give everyone an understanding of success.

This is how data and storytelling play together to lead up to a coherent narrative.

Building trust

Ideally, you gain the executives’ trust over time so you can get to the point more quickly and not have to develop a full pitch every time. Trust comes from keeping commitments. Following through. Keeping your word.

That’s why one of the best things you can do after a successful pitch that leads to funding is to follow up with progress and results. Showing that things turned out the way you said they would demonstrates to executives that they can rely on you.

On the other hand, not following up can stick out negatively and lead to uncomfortable questions during your next pitch. Even if results are not coming in, reaching out and showing you’re on top of it goes a long way.

Emotions matter as much as data

By now, you’ve realized that getting C-suite buy-in depends as much on evoking the right emotions as it does on data.

Be careful about evoking too much fear; it can lead to paralysis and panic. Be careful with too much excitement; it can come across as naive and unserious. Aim for just the right amount.

One factor that helps is timing. Bringing the narrative up at the right moment means executives are primed to listen and be open to understanding. That could be annual or quarterly planning, a pivotal moment for the company, a strategy shift, or personnel changes in the C-suite.

Another factor that helps is having advocates and champions for your pitch. Talk to someone before you pitch and ask them for feedback. When people co-create, they get invested in the outcome.

Kevin Indig is Director of SEO at Shopify. He is also the creator of Growth Memo. You can find Kevin on Twitter at @Kevin_Indig.

11 SEO Tips & Tricks To Improve Search Indexation

The SEO game has so many moving parts that it often seems like, as soon as we’re done optimizing one part of a website, we have to circle back to a part we’ve already worked on.

Once you’re out of the “I’m new here” stage and feel that you have some real SEO experience under your belt, you might start to feel that there are some things you can devote less time to correcting.

Indexability and crawl budgets could be two of those things, but forgetting about them would be a mistake.

I always like to say that a website with indexability issues is a site standing in its own way; that website is inadvertently telling Google not to rank its pages because they don’t load correctly or redirect too many times.

If you think you can’t or shouldn’t be devoting time to the decidedly not-so-glamorous task of fixing your site’s indexability, think again.

Indexability problems can cause your rankings to plummet and your site traffic to dry up quickly.

So, your crawl budget has to be top of mind.

In this post, I’ll present you with 11 tips to consider as you go about improving your website’s indexability.

1. Track Crawl Status With Google Search Console

Errors in your crawl status could be indicative of a deeper issue on your site.

Checking your crawl status every 30 to 60 days is important to identify potential errors that are impacting your site’s overall marketing performance.

It’s literally the first step of SEO; without it, all other efforts are null and void.

In Google Search Console’s sidebar, you’ll be able to check your crawl status under the Index tab.

Errors in Google Search Console (screenshot by author, May 2022)

Now, if you want to remove a certain webpage from search results, you can tell Search Console directly. This is useful if a page is temporarily redirected or returns a 404 error.

A 410 status code will permanently remove a page from the index, so beware of using the nuclear option.

Common Crawl Errors & Solutions

If your website is unfortunate enough to be experiencing a crawl error, it may require an easy solution or be indicative of a much larger technical problem on your site.

The most common crawl errors I see are:

Common crawlability problems in Semrush (screenshot by author, May 2022)

To diagnose some of these errors, you can leverage the URL Inspection tool to see how Google views your site.

Failure to properly fetch and render a page could be indicative of a deeper DNS error that will need to be resolved by your DNS provider.

URL Inspection in Google Search Console (screenshot by author, May 2022)

Resolving a server error requires diagnosing a specific error. The most common errors include:

  • Timeout.
  • Connection refused.
  • Connect failed.
  • Connect timeout.
  • No response.

A server error is usually temporary, although a persistent problem could require you to contact your hosting provider directly.
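Search Console aside, you can triage these yourself by replaying the requests and separating server responses from connection failures. Below is a minimal Python sketch using only the standard library; the URL list and user-agent string are placeholders, not anything from the original article.

```python
import urllib.error
import urllib.request

# Placeholder list: URLs that Search Console reported as crawl errors.
URLS = [
    "https://example.com/",
    "https://example.com/old-page",
]

def check_url(url, timeout=10):
    """Fetch a URL and report an HTTP status, server error, or connection failure."""
    request = urllib.request.Request(url, headers={"User-Agent": "crawl-check/1.0"})
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return f"{url} -> HTTP {response.getcode()}"
    except urllib.error.HTTPError as e:
        return f"{url} -> HTTP {e.code} (the server responded, but with an error)"
    except urllib.error.URLError as e:
        return f"{url} -> connection failure: {e.reason} (timeout, refusal, or DNS)"
    except OSError as e:  # e.g., a timeout while reading the response body
        return f"{url} -> {e}"

for url in URLS:
    print(check_url(url))
```

A URL that fails here repeatedly, but at the connection stage rather than with an HTTP status, is the kind of case worth escalating to your host or DNS provider.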

Robots.txt errors, on the other hand, could be more problematic for your site. If your robots.txt file returns a server error, search engines can’t tell which URLs they are allowed to crawl and may slow or stop crawling your site until the file is reachable again. (A 404 for robots.txt, by contrast, is treated as though no crawl restrictions exist.)

You can reference your XML sitemap from within robots.txt, or skip the protocol altogether and manually noindex pages that could be problematic for your crawl.
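One way to sanity-check the file the way a crawler would is Python’s standard-library robotparser, which fetches robots.txt and answers allow/deny questions. A minimal sketch, with example.com and the tested path as placeholders:

```python
from urllib import robotparser

# Placeholder site; point this at your own robots.txt.
parser = robotparser.RobotFileParser()
parser.set_url("https://example.com/robots.txt")

try:
    parser.read()  # fetches and parses the live file
except OSError as exc:
    # A network or server failure here mirrors the "unreachable robots.txt" problem.
    print(f"robots.txt could not be retrieved: {exc}")
else:
    # Ask whether a specific crawler may fetch a specific URL.
    allowed = parser.can_fetch("Googlebot", "https://example.com/some-page")
    print(f"Googlebot may fetch /some-page: {allowed}")
```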

Resolving these errors quickly will ensure that all of your target pages are crawled and indexed the next time search engines crawl your site.

2. Create Mobile-Friendly Webpages

With the arrival of the mobile-first index, we must also optimize our pages to display mobile-friendly copies on the mobile index.

The good news is that a desktop copy will still be indexed and displayed under the mobile index if a mobile-friendly copy does not exist. The bad news is that your rankings may suffer as a result.

There are many technical tweaks that can instantly make your website more mobile-friendly, including:

  • Implementing responsive web design.
  • Inserting the viewport meta tag in content.
  • Minifying on-page resources (CSS and JS).
  • Tagging pages with the AMP cache.
  • Optimizing and compressing images for faster load times.
  • Reducing the size of on-page UI elements.

Be sure to test your website on a mobile platform and run it through Google PageSpeed Insights. Page speed is an important ranking factor and can affect the speed at which search engines can crawl your site.
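The viewport meta tag from the list above is also easy to verify programmatically. Here is a minimal sketch using only Python’s standard library; the URL is a placeholder for whatever page you want to test:

```python
from html.parser import HTMLParser
import urllib.request

class ViewportChecker(HTMLParser):
    """Flags whether a page declares a viewport meta tag."""
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        # HTMLParser lowercases tag and attribute names; values are kept as-is.
        if tag == "meta" and dict(attrs).get("name", "").lower() == "viewport":
            self.has_viewport = True

# Placeholder URL; swap in the page you want to test.
with urllib.request.urlopen("https://example.com/") as response:
    html = response.read().decode("utf-8", errors="replace")

checker = ViewportChecker()
checker.feed(html)
print("viewport meta tag present:", checker.has_viewport)
```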

3. Update Content Regularly

Search engines will crawl your site more regularly if you produce new content on a regular basis.

This is especially useful for publishers who need new stories published and indexed on a regular basis.

Producing content on a regular basis signals to search engines that your site is constantly improving and publishing new content, and therefore needs to be crawled more often to reach its intended audience.

4. Submit A Sitemap To Each Search Engine

One of the best tips for indexation to this day remains to submit a sitemap to Google Search Console and Bing Webmaster Tools.

You can create an XML version using a sitemap generator or manually create one in Google Search Console by tagging the canonical version of each page that contains duplicate content.
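If you’d rather script the XML version than use a generator, the format is simple enough to produce with Python’s standard library. The sketch below uses placeholder URLs; in practice you’d pull the list from your CMS or a crawl, upload the resulting file to your site root, and submit it in Search Console and Bing Webmaster Tools.

```python
import xml.etree.ElementTree as ET

# Placeholder URLs; pull these from your CMS or a site crawl in practice.
PAGES = [
    "https://example.com/",
    "https://example.com/services/",
    "https://example.com/blog/first-post/",
]

# The sitemaps.org namespace is required for a valid sitemap file.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGES:
    url_element = ET.SubElement(urlset, "url")
    ET.SubElement(url_element, "loc").text = page

# Writes the XML declaration plus the urlset, ready to upload to your site root.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```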

5. Optimize Your Interlinking Scheme

Establishing a consistent information architecture is crucial to ensuring that your website is not only properly indexed, but also properly organized.

Creating main service categories where related webpages can sit can further help search engines properly index webpage content under certain categories when the intent may not be clear.


6. Deep Link To Isolated Webpages

If a webpage on your site or a subdomain is created in isolation, or an error is preventing it from being crawled, you can get it indexed by acquiring a link on an external domain.

This is an especially useful strategy for promoting new pieces of content on your website and getting them indexed more quickly.

Beware of syndicating content to accomplish this, as search engines may ignore syndicated pages, and it could create duplicate content errors if pages are not properly canonicalized.

7. Minify On-Page Resources & Increase Load Times

Forcing search engines to crawl large and unoptimized images will eat up your crawl budget and prevent your site from being indexed as often.

Search engines also have difficulty crawling certain backend elements of your website. For example, Google has historically struggled to crawl JavaScript.

Even certain resources like Flash and CSS can perform poorly on mobile devices and eat up your crawl budget.

In a sense, it’s a lose-lose scenario where page speed and crawl budget are sacrificed for obtrusive on-page elements.

Be sure to optimize your webpage for speed, especially over mobile, by minifying on-page resources, such as CSS. You can also enable caching and compression to help spiders crawl your site faster.

PageSpeed Insights results for Search Engine Journal (screenshot by author, May 2022)
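To confirm compression is actually enabled, you can request the same page with and without gzip and compare what travels over the wire. A minimal standard-library sketch, with a placeholder URL (note that a dynamic page may vary slightly between the two fetches):

```python
import gzip
import urllib.request

URL = "https://example.com/"  # placeholder page to test

def fetch(url, encoding):
    """Fetch a URL while requesting a specific Accept-Encoding."""
    request = urllib.request.Request(url, headers={"Accept-Encoding": encoding})
    with urllib.request.urlopen(request) as response:
        return response.read(), response.headers.get("Content-Encoding", "") or ""

raw, _ = fetch(URL, "identity")              # uncompressed bytes
wire, content_encoding = fetch(URL, "gzip")  # possibly gzipped bytes

if "gzip" in content_encoding:
    saved = 1 - len(wire) / len(raw)
    print(f"uncompressed: {len(raw):,} B, on the wire: {len(wire):,} B ({saved:.0%} saved)")
    # Sanity check: the gzipped body should decompress to roughly the original size.
    print(f"decompresses to: {len(gzip.decompress(wire)):,} B")
else:
    print("Server did not gzip the response; consider enabling compression.")
```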

8. Fix Pages With Noindex Tags

Over the course of your website’s development, it may make sense to implement a noindex tag on pages that may be duplicated or only meant for users who take a certain action.

Regardless, you can identify webpages with noindex tags that are preventing them from being indexed by using a free online tool like Screaming Frog.

The Yoast plugin for WordPress allows you to easily switch a page from index to noindex. You could also do this manually in the backend of pages on your site.
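If you only need to spot-check a handful of URLs rather than run a crawler, the directive can be detected directly, both in the robots meta tag and in the X-Robots-Tag HTTP header, which can also carry noindex. A minimal sketch with placeholder URLs:

```python
from html.parser import HTMLParser
import urllib.request

class NoindexChecker(HTMLParser):
    """Flags pages whose robots meta tag contains a noindex directive."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "meta" and attr.get("name", "").lower() == "robots":
            if "noindex" in (attr.get("content") or "").lower():
                self.noindex = True

# Placeholder URLs; in practice, feed in your full crawl list.
for url in ["https://example.com/", "https://example.com/thank-you/"]:
    with urllib.request.urlopen(url) as response:
        page = response.read().decode("utf-8", errors="replace")
        header = response.headers.get("X-Robots-Tag", "") or ""
    checker = NoindexChecker()
    checker.feed(page)
    if checker.noindex or "noindex" in header.lower():
        print(f"noindex found: {url}")
```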

9. Set A Custom Crawl Rate

In the old version of Google Search Console, you could actually slow or customize the speed of your crawl rate if Google’s spiders were negatively impacting your site.

This also gives your website time to make necessary changes if it is going through a significant redesign or migration.

Crawl rate setting in Google Search Console (screenshot by author, May 2022)

10. Eliminate Duplicate Content

Having massive amounts of duplicate content can significantly slow down your crawl rate and eat up your crawl budget.

You can eliminate these problems by either blocking these pages from being indexed or placing a canonical tag on the page you wish to be indexed.

Along the same lines, it pays to optimize the meta tags of each individual page so search engines don’t mistake similar pages for duplicate content in their crawl.
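One cheap signal for catching such look-alike pages is grouping URLs by their title tag; pages that share a title are prime candidates for consolidation or a canonical. A minimal standard-library sketch with placeholder URLs:

```python
from collections import defaultdict
from html.parser import HTMLParser
import urllib.request

class TitleGrabber(HTMLParser):
    """Collects the text content of the page's <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Placeholder URLs; in practice, feed in your full URL list.
URLS = ["https://example.com/shoes/", "https://example.com/shoes/?sort=price"]

pages_by_title = defaultdict(list)
for url in URLS:
    with urllib.request.urlopen(url) as response:
        grabber = TitleGrabber()
        grabber.feed(response.read().decode("utf-8", errors="replace"))
    pages_by_title[grabber.title.strip()].append(url)

for title, urls in pages_by_title.items():
    if len(urls) > 1:
        print(f"Shared title {title!r}: {urls}")
```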

11. Block Pages You Don’t Want Spiders To Crawl

There may be instances where you want to prevent search engines from crawling a specific page. You can accomplish this by the following methods:

  • Placing a noindex tag.
  • Placing the URL in a robots.txt file.
  • Deleting the page altogether.

This can also help your crawls run more efficiently, instead of forcing search engines to wade through duplicate content.

Conclusion

Whether your website has crawlability problems will largely depend on how well you’ve been keeping up with your own SEO.

If you’re tinkering in the back end all the time, you may have identified these issues before they got out of hand and started affecting your rankings.

If you’re not sure, though, run a quick scan in Google Search Console to see how you’re doing.

The results can really be educational!

Google Says Keywords In Domain Names Are Overrated

Google’s John Mueller said once again that “keywords in domain names are overrated.” He and other Googlers have made this point numerous times over the years. Instead, he advised: “pick something for your business, pick something for the long term.”

If you want an excellent domain, I can license you variations of Rusty[words-go-here]. Like RustySEO, RustyDesign, RustyFood – you know it makes for a good name. ;-P


Forum discussion at Twitter.





Googlebot Crawls & Indexes First 15 MB HTML Content


In an update to Googlebot’s help document, Google quietly announced it will crawl the first 15 MB of a webpage. Anything after this cutoff will not be included in rankings calculations.

Google specifies in the help document:

“Any resources referenced in the HTML such as images, videos, CSS and JavaScript are fetched separately. After the first 15 MB of the file, Googlebot stops crawling and only considers the first 15 MB of the file for indexing. The file size limit is applied on the uncompressed data.”

This left some in the SEO community wondering whether this meant Googlebot would completely disregard text that appeared after images pushed a page past the cutoff in HTML files.

“It’s specific to the HTML file itself, like it’s written,” John Mueller, Google Search Advocate, clarified via Twitter. “Embedded resources/content pulled in with IMG tags is not a part of the HTML file.”

What This Means For SEO

To ensure it is weighted by Googlebot, important content must now be included near the top of webpages. This means code must be structured in a way that puts the SEO-relevant information within the first 15 MB of an HTML or supported text-based file.

It also means images and videos should be compressed and referenced as separate files, not encoded directly into the HTML (for example, as base64 data URIs), whenever possible.

SEO best practices currently recommend keeping HTML pages to 100 KB or less, so many sites will be unaffected by this change. Page size can be checked with a variety of tools, including Google PageSpeed Insights.
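Because Google applies the limit to uncompressed data, a quick check only needs the raw byte count of the HTML document itself. A minimal Python sketch using only the standard library; the URL is a placeholder:

```python
import urllib.request

LIMIT = 15 * 1024 * 1024  # Googlebot's stated 15 MB cutoff

# Placeholder URL; test any page you suspect is unusually heavy.
request = urllib.request.Request(
    "https://example.com/very-long-page",
    headers={"Accept-Encoding": "identity"},  # ask for uncompressed bytes
)
with urllib.request.urlopen(request) as response:
    size = len(response.read())

print(f"HTML size: {size / 1024:.1f} KB ({size / LIMIT:.2%} of the 15 MB limit)")
```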

In theory, it may sound worrisome that you could potentially have content on a page that doesn’t get used for indexing. In practice, however, 15 MB is a considerably large amount of HTML.

As Google states, resources such as images and videos are fetched separately. Based on Google’s wording, it sounds like this 15 MB cutoff applies to HTML only.

It would be difficult to go over that limit with HTML unless you were publishing entire books’ worth of text on a single page.

Should you have pages that exceed 15 MB of HTML, it’s likely you have underlying issues that need to be fixed anyway.


Source: Google Search Central