Categories
Digital Marketing

A Beginner’s Guide to Ranking in Google Maps

Posted by Alex_Ratynski

For local businesses today, there are many ways to market your brand online, but the majority of your potential customers still use Google to find local businesses near them — businesses where they will spend their hard-earned money. In fact, 80% of searches with “local intent” result in a conversion.

This raises the question: “What’s the best way to catch the attention of local searchers on Google?”

The answer: through Google Maps marketing.

What is Google Maps marketing?

Google Maps marketing is the process of optimizing the online presence of your brand in Google Maps, with the goal of increasing your brand’s online visibility.

When you search a query on Google that has local intent, you often see a “local pack”: a map with three highlighted business listings above the organic results.

Google Maps marketing uses a number of strategies and tactics to help your business earn one of those three positions in the local map pack.

Why is marketing important for Google Maps?

The reason every local business should care about ranking in Google Maps is simple: potential brand visibility.

It’s no surprise that Google is by far the most popular search engine. But what about Google Maps specifically?

One study found that nearly 70% of smartphone users say they use Google Maps most frequently. On top of that, out of the 3.5 billion searches that happen on Google each day, more and more are considered to have local intent. According to Google, 83% of U.S. shoppers who visited a store said they used online search before going in.

Thus, any business that is serious about getting found today needs to harness the power of Google Maps marketing. This is why we at Ratynski Digital focus much of our local SEO time on getting our clients to rank both in Google Maps AND organic search results.

Before you can rank in Google Maps, make sure you have first set up and optimized your Google My Business profile.

What is Google My Business?

Google My Business (GMB) is a free platform provided by Google where local businesses can create a profile that is displayed across a variety of Google products.

In order to qualify for a GMB profile you must make in-person contact with your customers during your stated business hours. This may mean that you have a brick-and-mortar location where customers come to see you, or perhaps you travel to see your customers.

A GMB profile can display a variety of information about your business such as:

  • Business name
  • Business description
  • Reviews
  • Phone number
  • Address
  • Website
  • Business category or industry
  • Locations that you serve
  • Business hours
  • Products and services
  • Photos

And much more depending on your industry!

The purpose of creating a Google My Business profile for your brand is to increase your rankings, traffic, and revenue.

How to set up Google My Business

Step 1: Head over to the GMB Page.

  • Click on the blue button that says “Manage now” (be sure you are signed into your Google account).

Step 2: Create the listing and name your business profile.

  • Name your new listing and start adding all of your important business information. 
  • It’s important to note that before you create your GMB profile, you should familiarize yourself with Google’s guidelines. And please, don’t create GMB spam. Not only will creating fake or spammy listings offer a horrible user experience for your potential customers, but it also puts you at risk for penalties and suspensions.

Step 3: Add as much relevant information about your business as possible.

  • Remember all those different types of information I mentioned above? This is when you get to add those to your profile. Take advantage of this free platform and try to include as much relevant information as you can. Keep in mind, you will want to avoid adding GMB categories that are NOT relevant to your business. You should also work to keep all of your Google My Business contact information accurate, and make sure that it matches your website.

Step 4: Verify your profile.

  • If this is a brand new account, you will need to verify the physical address with a postcard that will be sent via mail by Google.
  • If you are claiming a listing that already exists on Google Maps but is not verified, you may be able to verify the profile via email or phone.

Step 5: Pop the champagne — you did it! Easy peasy.

Now that we are all set up, let’s dive into Google Maps SEO.

Top Google Maps ranking factors

It’s important to have a firm understanding of Google Maps ranking factors before you can expect to see high-ranking results. Once you understand how it works, Google Maps marketing becomes as easy as operating your 7-year-old’s Easy-Bake Oven.

Okay, maybe not that easy, but everything will be much clearer. For a deep dive, I recommend checking out Moz’s 2018 local ranking factors study, but I’ll cover the top factors here.

In a nutshell, there are eight ranking factors that contribute to ranking in Google Maps and the local pack:

  1. Google My Business signals
  2. Link signals
  3. Review signals
  4. On-page signals
  5. Citation signals
  6. Behavioral signals
  7. Personalization
  8. Social signals

It’s important to keep in mind that the local algorithm works differently than Google’s organic search algorithm. SEO queen Joy Hawkins does a beautiful job explaining these algorithm differences in-depth in this Whiteboard Friday.

Google’s local algorithm analyzes all of the signals listed above and ranks listings based on the following three areas:

  • Proximity: How close is the business to the searcher?
  • Prominence: How popular or authoritative is the business in the area?
  • Relevance: How closely does the listing match the searcher’s query?
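To make the three-part model concrete, here’s a toy scorer in Python. To be clear, the weights and formula are invented for illustration — Google’s actual local algorithm is not public — but it shows why a close, relevant business can outrank a more prominent one further away.

```python
from dataclasses import dataclass

@dataclass
class Listing:
    name: str
    distance_km: float   # proximity: distance from the searcher
    prominence: float    # 0-1 proxy for reviews, links, citations
    relevance: float     # 0-1 proxy for how well the listing matches the query

def toy_local_score(listing: Listing) -> float:
    """Combine proximity, prominence, and relevance into one score.

    Purely illustrative: the weights are made up and are NOT Google's formula.
    """
    proximity = 1.0 / (1.0 + listing.distance_km)  # closer -> higher
    return 0.4 * proximity + 0.3 * listing.prominence + 0.3 * listing.relevance

listings = [
    Listing("Near but irrelevant", 0.5, 0.2, 0.1),
    Listing("Far but prominent", 8.0, 0.9, 0.9),
    Listing("Close and relevant", 1.0, 0.7, 0.8),
]
for biz in sorted(listings, key=toy_local_score, reverse=True):
    print(f"{biz.name}: {toy_local_score(biz):.3f}")
```

Under these made-up weights, “Close and relevant” wins even though “Far but prominent” has stronger prominence and relevance signals — proximity drags it down.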

Now that you have a handle on how the local algorithm works and its many ranking factors, let’s talk about specific ways to optimize your GMB profile to improve your ranking in Google Maps.

How to optimize for Google Maps

To kick off your optimizations, double-check that ALL of your business information is filled out in full and 100% accurate. This includes adding the many services that you might offer as well as descriptions of those services.

Sherri Bonelli wrote a comprehensive post on optimizing the information on your GMB listing. She did a great job covering that topic, so I am going to focus instead on three more factors that will make the biggest impact in the shortest amount of time:

1. Get more online reviews

Reviews continue to be one of the most important components for ranking in Google Maps, but the benefit of building more reviews is not purely for the purpose of SEO (not by a long shot).

Reviews offer a much better customer experience. They help to build up social proof, manage customer expectations, and they can sell your product or service before you even get in touch with your customer.

With 82% of consumers reading online reviews for local businesses, every business owner needs to understand the importance and power of reviews.

Google understands the customer’s desire to read reviews before they visit a store or trust a brand. They have heavily factored reviews into the local algorithm because of this (reviews from both Google and third parties).

Keep in mind that the “review factor” is not simply a measurement of who has the most reviews. That is certainly a piece of the puzzle, but Google also takes into consideration many other aspects like:

  • Whether a review has text along with the star rating or not.
  • The words chosen to write the review.
  • The overall star rating given to the business.
  • The consistency of reviews.
  • Overall review sentiment.
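How Google actually measures review sentiment isn’t public, but the idea can be sketched with a simple lexicon-based scorer. The word lists below are invented for illustration; real sentiment models are far more sophisticated.

```python
# Toy sentiment lexicons -- illustrative only, not a real model.
POSITIVE = {"great", "friendly", "excellent", "helpful", "recommend"}
NEGATIVE = {"rude", "slow", "terrible", "overpriced", "disappointing"}

def review_sentiment(text: str) -> float:
    """Score a review from -1.0 (negative) to +1.0 (positive)."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

reviews = [
    "Great service, friendly staff. Highly recommend!",
    "Slow and overpriced. Terrible experience.",
]
avg = sum(review_sentiment(r) for r in reviews) / len(reviews)
print(f"Average sentiment: {avg:+.2f}")
```

Even this toy version shows why the words customers choose matter, not just the star count.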

Business owners must regularly train themselves (and their team) to ask their customers for reviews. It’s important to set up systems and processes to make review generation a regular occurrence.

I also recommend setting up a process or purchasing a service that helps with review management. For example, Moz Local offers the ability to monitor the flow of reviews as well as comment and reply to those reviews as they come in (all in one cohesive dashboard). Always reply to your reviews!

Pro Tip: Don’t ask for a review too early. Too many businesses ask for a review for a product or service before their customer has had the opportunity to fully experience it (and actually benefit from it). Only after they have had the chance to solve their problem with your product or service should you ask for a review.

2. Build local links

Links are still one of the largest ranking factors in Google’s algorithm (both in organic ranking and in Google Maps). In fact, building local links is especially important if you want to rank in Google Maps.

It’s true that any link that isn’t marked as nofollow will pass “authority”, which will likely help with rankings. However, local links are especially important because they have a much higher probability of driving actual business.

One of the best ways to start building local links is to utilize your local relationships around town. Think about other businesses that you work closely with, organizations that you support, or even companies that might qualify as a “shoulder niche”.

For the highest success rate, start with businesses that you already have a relationship with or know well. You could offer to write or record a testimonial in exchange for a link, or perhaps you could co-create a piece of content that benefits both of your audiences.

Here’s exactly how to do it:
  1. Create a list of niches that offer services that complement (but don’t compete with) your business.
  2. Consider how you might be able to incorporate these other companies into your content outreach.

For example, a carpet cleaning business may decide to create a really helpful piece of content about cost-effective ways to increase a home’s value in a specific market. They might include advice about landscaping, painting, and of course, carpet cleaning. Before writing the content, they could reach out to a few local painting, landscaping, or home service businesses in the area and ask if those businesses would be willing to collaborate on the content and perhaps add a link to their resource pages.

This process can work even if you don’t yet have a relationship with the business. Here’s a basic outreach template you can use:

Hello [NAME],

My name is [YOUR NAME] from [BUSINESS]. We are actually business neighbors in a way, as we are located not too far from you in [CITY]. I often pass by [THEIR BUSINESS] on my way to [LOCAL LANDMARK/DESTINATION].

I thought it was finally time to reach out and say hello, and let you know that if there’s ever anything you or your team need, please let us know.

Also, I am working on writing an article about [INSERT BLOG TOPIC HERE]. Since our businesses both serve a similar audience and complement each other nicely, I was wondering if you’d like to be featured in the article?

I am going to include a section about [TOPIC ABOUT THEIR INDUSTRY], and would like to use a sentence or two with your advice coming from the [THEIR INDUSTRY]. It might even make a great addition to the resource page on your website. Please let me know if this is something you’d be interested in.

Either way, thanks for your time, and great to meet you!

[YOUR NAME]

Pro Tip: If you are working to build links on a budget, it may help to get approval for the link before you invest the time and resources in content collaborations.

3. Fight off GMB spam in the map

This final optimization is less of an “optimization” and more of a tactic. It’s powerful because, unlike most GMB optimizations, the goal is not to do something better than your competition; it’s to remove the competitors that are trying to cheat their way to higher rankings.

Just how powerful is this approach? Very.

Let’s take a look at this Google Maps SERP as an example:

At first glance, all of these listings seem legitimate. However, after about two minutes of investigating you can quickly discern that a few are fake. One of them doesn’t have a website and links to Nerdwallet, some are using fake reviews, and some are even using fake addresses (one is using the DMV’s address).

Now imagine you are DCAP Insurance (a real company) and you are trying to rank higher in Google Maps. If you successfully remove the top four spam listings, you have now jumped to the #1 position without making any additional optimizations.

Starting to see the logic behind this approach?

Unfortunately, Google Maps still has quite a bit of spam throughout its ecosystem. In fact, out of the top 20 spots in the example above, I was able to find seven fake listings and three more that were extremely questionable. This approach can work whether a listing is using an improper business name, keyword stuffing, or is a fake location entirely.

How to remove or edit Google My Business spam

Create a detailed record of each spam GMB listing you find and the edits it needs. This will help later on if your changes keep getting reverted.

Next, head over to Google Maps, find the listing, and click on “Suggest an Edit”.

Depending on the issue at hand you can either select:

  • “Change name or other details”
  • “Remove this place”

If you’re trying to remove keyword stuffing from a listing’s business name, you simply select “change name or other details” and make the necessary edits.

If you’re dealing with spam of some sort, you will need to select “Remove this place” and then select the exact issue from the drop-down list.

When suggesting an edit doesn’t get the job done

Unfortunately, submitting an edit about spam doesn’t always cut it. When this happens, the best way to handle these spam listings is to use Google’s Business Redressal Complaint Form.

When using the redressal form, you’ll need to provide evidence before the required action takes place. For more information, be sure to check out this helpful resource.

Google Maps SEO checklist

At this point, you likely understand the importance of filling out your Google My Business profile to completion. But that’s not all it takes to rank in Google Maps: ranking requires comprehensive optimization on several levels, and there is rarely just one magic fix.

To help you cover all your bases, I created this Google Maps SEO Checklist that will help you pinpoint specific areas for improvement.

Tracking results and GMB analytics

Tracking your results is crucial in every aspect of SEO and online marketing, and Google My Business is no different. Most of your profile analytics will be found in your Google My Business account.

You can find this information by logging into your account and selecting “insights” on the far left side. Here is an example of what that looks like for Roadside Dental Marketing’s Google My Business account.

From there, you should be able to see things like:

  • Which specific search queries triggered your listing.
  • How often your listing appeared in Google search.
  • How often your listing appeared in Google Maps.
  • What kind of customer actions were taken (e.g. visiting your website, requesting directions, phone calls).
  • Where customers are requesting business information from.
  • Which days of the week get the most calls.
  • How many photos have been viewed, and how that number compares to your competition.

The one thing that GMB analytics does NOT offer is any sort of rank tracking. Thankfully, the brilliant people at Moz are working on Local Market Analytics (beta). LMA not only offers rank tracking on a local level, but it also contains a plethora of competitor information within a target market.

Conclusion

While covering the GMB basics is fine and dandy, comprehensive optimizations coupled with making ongoing improvements is what truly separates the wheat from the chaff. Regularly test different optimizations within your industry and market and closely monitor your results. If you’re ever in doubt, do whatever is in the best interest of your customer. They must always come first.

By investing in Google Maps marketing, you’ll be able to drive local leads to your business on a consistent basis. If you find yourself with any questions, let me know in the comments below or on Twitter and I will happily answer them!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

source https://moz.com/blog/beginner-guide-google-maps-ranking


Content Expansion: From Prompt to Paragraph to Published Page – Whiteboard Friday

Posted by rjonesx.

We’ve all been there. You’re the SEO on point for a project, and you’re also the one tasked with getting great content written well and quickly. And if you don’t have an expert at your disposal, great content can seem out of reach.

It doesn’t have to be. In today’s Whiteboard Friday, Russ Jones arms you with the tools and processes to expand your content from prompt to paragraph to published piece.


Video Transcription

Hey, folks, great to be back here with you on Whiteboard Friday. Today we’re going to be talking about content expansion. It’s a term you probably haven’t heard before because I just made it up. So hopefully, it will be useful in the future for you. But I think you’ll get the gist of exactly what we’re trying to accomplish here.

How do SEOs produce great content when they’re not subject matter experts?

You see, search engine optimizers have this really bizarre responsibility. We are often asked by our clients to produce content about things we have no business writing about. As a search engine optimizer, we know exactly the kinds of things that make content good for Google, but that doesn’t mean we have any domain knowledge about whatever it is our customer does.

Maybe your customer is an artist of some sort or your customer runs a restaurant. You might not know anything about it, but you still might have a deadline to hit in order to get good content that talks in depth about some sort of topic which really isn’t in your wheelhouse. Today I’m going to talk about a couple of tricks that you can use in order to go from a prompt to a couple of paragraphs and then ultimately to a published page, to a good piece of content.

Caveat: If an expert can create the content, they should

Now I want to step back for a second and just make one thing clear. This is not the preferred way to produce content. If you can have an expert produce the content, by all means have the expert produce the content, and then you go to work optimizing that content to make it the best it possibly can be. That’s the way it ought to be done whenever possible.

But we know that’s not the case. The truth is that most small business owners don’t have the time to write lengthy articles about their services and their offerings and what makes them special and the kinds of things that their customers might need. They have a business to run. There’s nothing unethical about taking the time to actually try and write a good piece of content for that customer.

But if you’re going to do it, you really should try and create something that’s of value. Hopefully this is going to help you do exactly that. I call this content expansion because the whole purpose is to start from one small prompt and then to expand it a little and expand it a little and expand it even more until eventually you are at something that’s very thorough and useful and valuable for the customers who are reading that content.

Each one of the individual steps is just sort of like taking a breath and blowing it into a balloon to make it a little bigger. Each step is manageable as we expand that content. 

1. Start with a prompt

First, we have to start with some sort of topic or prompt. In this example, I’ve decided just bike safety off the top of my head. I’m here in Seattle and there are bikes everywhere.

It’s completely different from North Carolina, where I’m from, where you’ve got to get in a car to go anywhere. But with the prompt bike safety, we now have to come up with what are we going to talk about with regard to bike safety. We pretty much know off the top of our heads that helmets matter and signaling and things of that sort. 

Find the questions people are asking

But what are people actually asking? What’s the information they want to know? Well, there are a couple of ways we can get at that, and that’s by looking exactly for those questions that they’re searching. One would be to just type in “bike safety” into Google and look for PAAs or People Also Ask. That’s the SERP feature that you’ll see about halfway down the page, which often has a couple of questions and you can click on it and there will be a little featured snippet or paragraph of text that will help you answer it.

Another would be to use a tool like Moz Keyword Explorer, where you could put in “bike safety” and then just select from one of the drop-downs “are questions” and it would then just show you all the questions people are asking about bike safety. Once you do that, you’ll get back a handful of questions that people are asking about bike safety.

In this case, the three that came up from the PAA for just bike safety were: 

  • Is riding a bike safe? 
  • How can I improve safety?
  • Why is bike safety important? 

What this does is start to get us into a position where now we’re building out some sort of outline of the content that we’re going to be building.
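This step is mechanical enough to script. Below is a small sketch that turns a prompt and its gathered PAA questions into a draft outline, with placeholders for the paraphrased answers and citations that Russ describes collecting note-card style. The function name is my own; the questions are the three from this transcript.

```python
def draft_outline(prompt: str, questions: list[str]) -> str:
    """Turn a topic prompt and its PAA questions into a markdown outline.

    Each question becomes a section with placeholders for the paraphrased
    featured-snippet answer and its citation (the "note card" for that section).
    """
    lines = [f"# {prompt.title()}"]
    for q in questions:
        lines.append(f"\n## {q}")
        lines.append("- Best answer found (paraphrased): TODO")
        lines.append("- Citation: TODO")
    return "\n".join(lines)

questions = [
    "Is riding a bike safe?",
    "How can I improve safety?",
    "Why is bike safety important?",
]
print(draft_outline("bike safety", questions))
```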

Build the outline for your content

We’ve just expanded from a title that said bike safety to now an outline that has a couple of questions that we want to answer. Well, here’s the catch. Bike safety, sure, we’ve got some ideas off the top of our heads about what’s important for bike safety. But the real thing that we’re trying to get at here is authoritative or valuable content.

Well, Google is telling you what that is. When you press the button to show you what the answer is to the question, that’s Google telling you this is the best answer we could find on the internet for that question. What I would recommend you do is you take the time to just copy the answer to that PAA, to that question. Why is bike safety important?

You click the button and it would show you the answer. Then you would write down the citation as well. But if you think about it, this is exactly the way you would write papers in college. If you were writing a paper in college about bike safety, you would go into the library, identify books on safety studies, etc. Then you would go through and then you would probably have note cards pulled out.

You would find a particular page that has an important paragraph. You would write a paraphrase down, and then you would write the citation down. This is the exact same thing. I’m not telling you to copy content. That’s not what we’re going to be doing in the end. But at the same time, it is the way that we take that next step of expanding the content. What we’ve done here is we’ve now gone from a topic to a couple of questions.

Now for each of those questions, we’ve kind of got an idea of what the target answer is. But, of course, the featured snippet isn’t the whole answer. The featured snippet is just the most specific answer to the question, but not the thorough one. It doesn’t cover all the bases. So what are some of the things we can do to expand this even further? 

2. Extract & explain entities

This is where I really like to take advantage of NLP, natural language processing technologies that are going to allow us to expand that content in a way that adds value to the user and, in particular, explains concepts that both you, as the writer in this particular case, and they, as the reader, might not know.

My favorite is a site called dandelion.eu. It’s completely free for a certain number of uses, but if you’re going to be producing a lot of content, I would highly recommend signing up for their API services. What you’re going to do is extract and explain entities.

Imagine you’ve got this featured snippet here and it’s talking about bike safety. It answers the question, “Why is bike safety important?” It says that bicyclists who wear their helmets are 50% less likely to suffer traumatic brain injuries in a wreck or something of that sort. That’s the answer in the featured snippet that’s been given to you. 

Well, perhaps you don’t know what a traumatic brain injury is, and perhaps your readers don’t know what that is and why it’s important to know that one thing protects you so much from the other.

Identify and expand upon terminology for your questions

That’s where entity extraction can be really important. What dandelion.eu is going to do is it’s going to identify that noun phrase. It’s going to identify the phrase “traumatic brain injury,” and then it’s going to give you a description of exactly what that is. Now you can expand that paragraph that you originally pulled from the featured snippet and add into it a citation about exactly what traumatic brain injury is.

This will happen for all the questions. You’ll find different terminology that your reader might not know and then be able to expand upon that terminology. 
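Dandelion’s entity extraction endpoint returns JSON listing the entities it spotted, each with an optional abstract. The request shape below follows their documented pattern, but treat the endpoint and parameters as assumptions and check the current dandelion.eu docs; to keep the sketch runnable without a network call or API token, the live request is commented out and the parsing runs against a canned response in that shape.

```python
def summarize_entities(response: dict) -> dict[str, str]:
    """Map each detected entity (its "spot" in the text) to its description."""
    return {
        ann["spot"]: ann.get("abstract", "")
        for ann in response.get("annotations", [])
    }

# Live call (assumed endpoint/params -- verify against dandelion.eu docs;
# `snippet` and `API_TOKEN` are placeholders you would supply):
# import requests
# response = requests.get(
#     "https://api.dandelion.eu/datatxt/nex/v1",
#     params={"text": snippet, "include": "abstract", "token": API_TOKEN},
# ).json()

# Canned response in the documented shape, for illustration:
canned = {
    "annotations": [
        {
            "spot": "traumatic brain injury",
            "abstract": "A traumatic brain injury (TBI) is damage to the "
                        "brain caused by an external mechanical force.",
        }
    ]
}
entities = summarize_entities(canned)
for term, definition in entities.items():
    print(f"{term}: {definition}")
```

Each extracted term-plus-abstract pair is exactly the raw material for the definitional expansions described above.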

3. Create novel research

Now the one thing that I want to do here in this process is not just take advantage of content other people have written about, but try and do some novel research. As you know, Google Trends is probably my favorite place to do novel research, because if there is any topic in the world, somebody is searching about it and we can learn things about the way people search.



Use Google Trends

For example, in this Google Trends that I did, I can’t remember the exact products that I was looking up, but they were specific bike safety products, like, for example, bike lights, bike mirrors, bike video cameras or bike cameras, etc. In fact, I’m almost positive that the red one had to do with bicycle cameras because they were becoming cheaper and more easily accessible to bicyclists. They’ve become more popular over time. Well, that’s novel research. 

Bring insights, graphs, and talking points from your novel research into your writing

When you’re writing this article here about bike safety, you can include in it far more than just what other people have said. You can say of the variety of ways of improving your bike safety, the use of a bike camera has increased dramatically over time.

4. Pull it all together

All right. So now that you’ve got some of this novel research, including even graphs that you can put into the content, we’ve got to pull this all together. We started with the prompt, and then we moved into some topics or questions to answer. Then we’ve answered those questions, and then we’ve expanded them by giving clarity and definitions to terms that people might not understand and we’ve also added some novel research.

Rewrite for relevancy

So what’s next? The next step is that we need to rewrite for relevancy. This is a really important part of the process. You see, chances are that when you write about a topic you’re not familiar with, you will not use the correct language to describe what’s going on. I think a good example might be if you’re writing about golf, for example, and you don’t know what it means to accidentally hit a golf ball that veers to the right or to the left.

Find relevant words and phrases with nTopic

Which one is a hook and a slice? Now, those of you who play golf I’m sure know right off the top of your head. But you wouldn’t know to use that kind of terminology if you weren’t actually a golfer. Well, if you use a tool like nTopic — it’s at nTopic.org — and you write your content and place it in there and then give bike safety as the keyword you want to optimize for, it will tell you all of the relevant words and phrases you ought to be using in the content.

In doing so, you’ll be able to expand your content even further, not just with further language and definitions that you know, but with the actual language that experts are using right now whenever they’re talking about bike safety or whatever topic it is. 

Examine (and improve) your writing quality with the Hemingway app

The next thing that I would say is that you really should pull things back and take a chance to look at the quality of the writing that you’re producing.

This whole time we’ve been talking mostly about making sure the content is in-depth and thorough and covers a lot of issues and areas and uses the right language. But we haven’t spent any time at all talking about whether it’s actually written well. There’s a fantastic free app out there called the Hemingway app.

If you haven’t heard of it, this is going to make your day. [Editor’s note: It made mine!] Every writer in the world should be using a tool like this. You just drop your content in there, and it’s going to give you all sorts of recommendations, from correcting grammar to using different words, shortening sentences, passive and active voice, making sure that you have the right verb tenses, etc. It’s just incredibly useful for writing quality content. 

Two important things to remember:

Now there are two things at the end that matter, and one is really, really important in my opinion: cite your sources.

1. Cite your sources — even if they’re competitors!

You see, when you’ve done all of this work, you need to let the world know that this work, one, isn’t only created by you but, two, is backed up by research and information provided by other professionals.

There is no shame whatsoever in citing even competitors who have produced good content that has helped you produce the content that you are now putting up. So cite. Put citations directly in. Look, Wikipedia ranks for everything, and every second sentence is cited and links off to another website. It’s insane.

But Google doesn’t really care about the citation in the sense that somebody else has written about this. What you’re really interested in is showing the users that you did your homework. 

2. Take pride in what you’ve accomplished!

Then finally, once you’re all done, you can publish this great piece of content that is thorough and exceptional and uniquely valuable, written well in the language and words that it should use, cited properly, and be proud of the content that you’ve produced at the end of the day, even though you weren’t an expert in the first place.

Hopefully, some of these techniques will help you out in the long run. I look forward to seeing you in the comments and maybe we’ll have some questions that I can give you some other ideas. Thanks again.

Video transcription by Speechpad.com


source https://moz.com/blog/content-expansion


We Need to Talk About Google’s “People Also Ask”: A Finance Case Study

Posted by barryloughran

For a while now, I’ve been disappointed with the People Also Ask (PAAs) feature in Google’s search results. My disappointment is not due to the vast amount of space they take up on the SERPs (that’s another post entirely), but more that the quality is never where I expect it to be.

Google has been running PAAs since April 2015 and they are a pretty big deal. MozCast is currently tracking PAAs (Related Questions) across 90% of all searches, which is more than any other SERP feature.

The quality issue I’m running into is that I still find obscure PAA questions, as well as results and content from other countries.

When I run searches that have a universal answer, such as “can you eat raw chicken?”, the answer is universally correct so there is no issue with the results. But when I run a search that should return local (UK) content, such as “car insurance”, I’m finding a heavy influence from the US — especially around YMYL queries. 

I wanted to find out how much of an issue this actually is, so my team and I analyzed over 1,000 of the most-searched-for keywords in the finance industry, where we would expect UK PAA results.

Before we dig in, my fundamental question going into this research was: “Should a financial query originating in the UK, whose products are governed within UK regulations, return related questions that contain UK content?”

I believe that they should and I hope that by the end of this post, you agree, too.

Our methodology

To conduct our analysis, we followed these steps:

1. Tag keywords by category and sub-category.

2. Remove keywords where you would expect a universal result, e.g. “insurance definition”.

3. Extract PAAs and the respective ranking URLs using STAT.

4. Identify country origin through manual review: are we seeing correct results?
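To illustrate step 4, a first pass at identifying country of origin can be automated by top-level domain before the manual review. A minimal sketch, assuming a hand-maintained ccTLD map (the map and helper name are my own, not part of the original methodology):

```python
# First-pass origin guess for a PAA source URL based on its TLD.
# Anything on a generic TLD (.com, .org) still needs manual review.
from urllib.parse import urlparse

CCTLD_MAP = {
    ".co.uk": "UK", ".org.uk": "UK",
    ".com.au": "AU", ".ie": "IE", ".ca": "CA",
    ".in": "IN", ".za": "ZA", ".sg": "SG", ".es": "ES",
}

def guess_country(url: str) -> str:
    host = urlparse(url).netloc.lower()
    # Check longer suffixes first so ".com.au" wins over shorter clashes
    for suffix in sorted(CCTLD_MAP, key=len, reverse=True):
        if host.endswith(suffix):
            return CCTLD_MAP[suffix]
    return "MANUAL REVIEW"

print(guess_country("https://www.moneysupermarket.co.uk/loans/"))  # UK
print(guess_country("https://www.bankrate.com/credit-cards/"))     # MANUAL REVIEW
```

Generic TLDs are the reason the manual review step exists: a .com domain tells you nothing reliable about the market it serves.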

Our findings

55.1% of the 4,507 available financial PAAs returned non-UK content. US content was served 50.5% of the time, while the remaining 4.6% was made up of sites from India, Australia, Canada, Ireland, South Africa, Spain, and Singapore.

Results by category

Breaking it down by category, we see that personal finance keywords bring back a UK PAA 33.72% of the time, insurance keywords 52.10%, utilities keywords 64.89%, and business keywords 38.76%.

Personal finance

Digging into personal finance, home to the most competitive products in the UK, we found that a significant percentage of PAAs brought back US or Indian content in the results.

Out of the 558 personal finance keywords, 186 keywords didn’t bring back a single UK PAA result, including:

  • financial advisor
  • first credit card
  • best car loans
  • balance transfer cards
  • how to buy a house
  • best payday loans
  • cheap car finance
  • loan calculator

Credit cards

Only 17.41% of credit card PAAs were UK-specific, with the US taking just over four out of every five. That’s huge.

Another surprising find is that 61 out of 104 credit card keywords didn’t bring back a single UK PAA. I find this remarkable given the fact that the credit card queries originated in the UK.

Loans

Only 15.8% of loan searches returned a UK PAA result, with over 75% coming from the US. We also saw highly competitive and heavily scrutinized keywords like “payday loans” generate several non-UK results.

Mortgages

While the UK holds the majority of PAA results for mortgage-related keywords at 53.53%, there are still some major keywords (like “mortgages”) that only bring back a single UK result. If you’re searching for “mortgages” in the UK, then you want to see information about UK mortgages, but instead Google serves up mainly US results.

Insurance

Insurance results weren’t as bad as personal finance. However, there was still a big swing towards the US for some products, such as life insurance.

Out of the 350 insurance keywords tested, there were 64 keywords that didn’t bring back a single UK PAA result, including:

  • pet insurance
  • cheap home insurance
  • life insurance comparison
  • car insurance for teens
  • cheap dog insurance
  • types of car insurance

Car insurance

60.54% of car insurance PAAs were UK-specific, with the US taking 36.97%. Out of the 132 keywords in this sub-category, UK sites were present for 118, which is better than the personal finance sub-categories.

Home insurance

As one of the most competitive spaces in the finance sector, it was really surprising to see that only 56.25% of results for home insurance queries returned a UK PAA. There are nuances to policies across different markets, so this is a frustrating and potentially harmful experience for searchers.

Utilities

Although we see a majority of PAAs in this keyword category return UK results, there are quite a few more specific searches for which you would absolutely be looking for a UK result (e.g. “unlimited data phone contracts”) but that bring back only one UK result.

One interesting find is that this UKPower page has captured 35 PAAs for the 49 keywords it ranks for. That’s an impressive 71.43% — the highest rate we’ve seen across our analysis.

Business

At the time of our analysis, we found that 36.7% of business-related PAAs were from the UK. One of the keywords with the lowest representation in this category was “business loans”, which generated only 6.25% UK results. While the volume of keywords is smaller in this category, there is more potential for harm in serving international content for queries relating to UK businesses.

What pages generate the most PAA results?

To make this post a little more actionable, I aggregated which URLs generated the most PAAs across some of the most competitive financial products in the UK. 

Ironically, four out of the top 10 were US-based (cars.news.com manages to generate 32 PAAs across one of the most competitive industries in UK financial searches: car insurance). A hat tip to ukpower.co.uk, which ranked #1 in our list, generating 35 results in the energy space.

To summarize the above analysis, it’s clear that there is too much dominance from non-UK sites in finance searches. While there are a handful of UK sites doing well, there are UK queries being searched for that are bringing back clearly irrelevant information.

As an industry, we have been pushed to improve quality — whether it’s increasing our relevancy or the expertise of our content — so findings like these show that Google could be doing more themselves.

What does this mean for your SEO strategy?

For the purpose of this research, we only looked at financial terms, so whilst we can’t categorically say this is the same for all industries, if Google is missing this much across financial YMYL terms then it doesn’t look good for other categories.

My advice would be that if you are investing any time optimizing for PAAs, then you should spend your time elsewhere, for now, since the cards in finance niches are stacked against you.

Featured Snippets are still the prime real estate for SEOs and (anecdotally, anyway) don’t seem to suffer from this geo-skew like PAAs do, so go for Featured Snippets instead.

Have you got any thoughts on the quality of PAAs across your SERPs? Let me know in the comments below!


source https://moz.com/blog/google-people-also-ask-finance-case-study


Crawled — Currently Not Indexed: A Coverage Status Guide

Posted by cml63

Google’s Index Coverage report is absolutely fantastic because it gives SEOs clearer insights into Google’s crawling and indexing decisions. Since its roll-out, we use it almost daily at Go Fish Digital to diagnose technical issues at scale for our clients.

Within the report, there are many different “statuses” that provide webmasters with information about how Google is handling their site content. While many of the statuses provide some context around Google’s crawling and indexation decisions, one remains unclear: “Crawled — currently not indexed”.

Since seeing the “Crawled — currently not indexed” status reported, we’ve heard from several site owners inquiring about its meaning. One of the benefits of working at an agency is being able to get in front of a lot of data, and because we’ve seen this message across multiple accounts, we’ve begun to pick up on trends from reported URLs.

Google’s definition

Let’s start with the official definition. According to Google’s official documentation, this status means: “The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling.”

So, essentially what we know is that:

  1. Google is able to access the page
  2. Google took time to crawl the page
  3. After crawling, Google decided not to include it in the index

The key to understanding this status is to think of reasons why Google would “consciously” decide against indexation. We know that Google isn’t having trouble finding the page, but for some reason it feels users wouldn’t benefit from finding it.

This can be quite frustrating, as you might not know why your content isn’t getting indexed. Below I’ll detail some of the most common reasons our team has seen to explain why this mysterious status might be affecting your website.

1. False positives

Priority: Low

Our first step is to always perform a few spot checks of URLs flagged in the “Crawled — currently not indexed” section for indexation. It’s not uncommon to find URLs that are getting reported as excluded but turn out to be in Google’s index after all.

For example, here’s a URL that’s getting flagged in the report for our website: https://ift.tt/2IsfU2O

However, when using a site search operator, we can see that the URL is actually included in Google’s index. You can do this by prepending the text “site:” to the URL.

If you’re seeing URLs reported under this status, I recommend starting by using the site search operator to determine whether the URL is indexed or not. Sometimes, these turn out to be false positives.
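If you have many flagged URLs to spot-check, building the “site:” queries can be scripted. A small sketch (the helper name is my own; it just constructs the search URLs for manual checking):

```python
# Build a Google "site:" spot-check URL for each flagged page.
from urllib.parse import quote_plus

def site_check_query(url: str) -> str:
    return "https://www.google.com/search?q=" + quote_plus("site:" + url)

for u in ["https://example.com/blog/post-a/", "https://example.com/blog/post-b/"]:
    print(site_check_query(u))
```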

Solution: Do nothing! You’re good.

2. RSS feed URLs

Priority: Low

This is one of the most common examples that we see. If your site utilizes an RSS feed, you might be finding URLs appearing in Google’s “Crawled — currently not indexed” report. Many times these URLs will have the “/feed/” string appended to the end. They can appear in the report like this:

Google finds these RSS feed URLs linked from the primary page. They’ll often be linked to using a “rel=alternate” element. WordPress plugins such as Yoast can automatically generate these URLs.

Solution: Do nothing! You’re good.

Google is likely selectively choosing not to index these URLs, and for good reason. If you navigate to an RSS feed URL, you’ll see an XML document like the one below:

While this XML document is useful for RSS feeds, there’s no need for Google to include it in the index. This would provide a very poor experience as the content is not meant for users.

3. Paginated URLs

Priority: Low

Another extremely common reason for the “Crawled — currently not indexed” exclusion is pagination. We will often see a good number of paginated URLs appear in this report. Here we can see some paginated URLs appearing from a very large e-commerce site:

Solution: Do nothing! You’re good.

Google will need to crawl through paginated URLs to get a complete crawl of the site. This is its pathway to content such as deeper category pages or product description pages. However, while Google uses the pagination as a pathway to access the content, it doesn’t necessarily need to index the paginated URLs themselves.

If anything, make sure that you don’t do anything to impede the crawling of these individual paginated URLs. Ensure that all of your pagination contains a self-referential canonical tag and is free of any “nofollow” tags. This pagination acts as an avenue for Google to crawl other key pages on your site, so you’ll definitely want Google to continue crawling it.

4. Expired products

Priority: Medium

When spot-checking individual pages that are listed in the report, a common problem we see across clients is URLs that contain text noting “expired” or “out of stock” products. Especially on e-commerce sites, it appears that Google checks to see the availability of a particular product. If it determines that a product is not available, it proceeds to exclude that product from the index.

This makes sense from a UX perspective as Google might not want to include content in the index that users aren’t able to purchase.

However, if these products are actually available on your site, this could result in a lot of missed SEO opportunity. By excluding the pages from the index, your content isn’t given a chance to rank at all.

In addition, Google doesn’t just check the visible content on the page. There have been instances where we’ve found no indication within the visible content that the product is not available. However, when checking the structured data, we can see that the “availability” property is set to “OutOfStock”.

It appears that Google is taking clues from both the visible content and structured data about a particular product’s availability. Thus, it’s important that you check both the content and schema.

Solution: Check your inventory availability.

If you’re finding products that are actually available getting listed in this report, you’ll want to check all of your products that may be incorrectly listed as unavailable. Perform a crawl of your site and use a custom extraction tool like Screaming Frog’s to scrape data from your product pages.

For instance, if you want to see at scale all of your URLs with schema availability set to “OutOfStock”, you can set the custom extraction “Regex” to match the string: "availability":"http://schema.org/OutOfStock"

Screaming Frog should then automatically scrape all of the URLs with this property:

You can export this list and cross-reference with inventory data using Excel or business intelligence tools. This should quickly allow you to find discrepancies between the structured data on your site and products that are actually available. The same process can be repeated to find instances where your visible content indicates that products are expired.
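As an illustration of that cross-reference (the data and column layout here are hypothetical; in practice they would come from the Screaming Frog export and your inventory system):

```python
# Flag URLs whose schema says "OutOfStock" but whose inventory says in stock.
def find_mismatches(scraped, inventory):
    """scraped: list of (url, schema_availability); inventory: {url: in_stock}."""
    return [url for url, availability in scraped
            if availability == "OutOfStock" and inventory.get(url)]

scraped = [
    ("https://example.com/products/widget", "OutOfStock"),
    ("https://example.com/products/gadget", "InStock"),
]
inventory = {
    "https://example.com/products/widget": True,   # actually available!
    "https://example.com/products/gadget": True,
}
print(find_mismatches(scraped, inventory))  # the widget URL is a discrepancy
```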

5. 301 redirects

Priority: Medium

One interesting example we’ve seen appear under this status is destination URLs of redirected pages. Often, we’ll see that Google is crawling the destination URL but not including it in the index. However, upon looking at the SERP, we find that Google is indexing a redirecting URL. Since the redirecting URL is the one indexed, the destination URL is thrown into the “Crawled — currently not indexed” report.

The issue here is that Google may not be recognizing the redirect yet. As a result, it sees the destination URL as a “duplicate” because it is still indexing the redirecting URL.

Solution: Create a temporary sitemap.xml.

If this is occurring on a large number of URLs, it is worth taking steps to send stronger consolidation signals to Google. This issue could indicate that Google isn’t recognizing your redirects in a timely manner, leading to unconsolidated content signals.

One option might be setting up a “temporary sitemap”. This is a sitemap that you can create to expedite the crawling of these redirected URLs. This is a strategy that John Mueller has previously recommended.

To create one, you will need to reverse-engineer redirects that you have created in the past:

  1. Export all of the URLs from the “Crawled — currently not indexed” report.
  2. Match them up in Excel with redirects that have been previously set up.
  3. Find all of the redirects that have a destination URL in the “Crawled — currently not indexed” bucket.
  4. Create a static sitemap.xml of these URLs with Screaming Frog. 
  5. Upload the sitemap and monitor the “Crawled — currently not indexed” report in Search Console.

The goal here is for Google to crawl the URLs in the temporary sitemap.xml more frequently than it otherwise would have. This will lead to faster consolidation of these redirects.
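If you’d rather script step 4 than build the file in Screaming Frog, a minimal sitemap generator might look like this (the URLs are placeholders; in practice they come from your Excel cross-reference):

```python
# Build a static sitemap.xml from the redirected destination URLs.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap([
    "https://example.com/new-destination-page/",
    "https://example.com/another-destination/",
]))
```

Note the `escape()` call: the Sitemaps protocol requires entity-escaping characters like “&” in URLs.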

6. Thin content

Priority: Medium

Sometimes we see URLs included in this report that are extremely thin on content. These pages may have all of the technical elements set up correctly and may even be properly internally linked to, however, when Google runs into these URLs, there is very little actual content on the page. Below is an example of a product category page where there is very little unique text:

This product listing page was flagged as “Crawled — Currently Not Indexed”. This may be due to very thin content on the page.

This page is likely either too thin for Google to think it’s useful or there is so little content that Google considers it to be a duplicate of another page. The result is Google removing the content from the index.

Here is another example: Google was able to crawl a testimonial component page on the Go Fish Digital site (shown above). While this content is unique to our site, Google probably doesn’t believe that the single sentence testimonial should stand alone as an indexable page.

Once again, Google has made the executive decision to exclude the page from the index due to a lack of quality.

Solution: Add more content or adjust indexation signals.

Next steps will depend on how important it is for you to index these pages.

If you believe that the page should definitely be included in the index, consider adding additional content. This will help Google see the page as providing a better experience to users. 

If indexation is unnecessary for the content you’re finding, the bigger question becomes whether or not you should take the additional steps to strongly signal that this content shouldn’t be indexed. The “Crawled — currently not indexed” report is indicating that the content is eligible to appear in Google’s index, but Google is electing not to include it.

There also could be other low quality pages to which Google is not applying this logic. You can perform a general “site:” search to find indexed content that meets the same criteria as the examples above. If you’re finding that a large number of these pages are appearing in the index, you might want to consider stronger initiatives to ensure these pages are removed from the index such as a “noindex” tag, 404 error, or removing them from your internal linking structure completely.

7. Duplicate content

Priority: High

When evaluating this exclusion across a large number of clients, this is the highest priority we’ve seen. If Google sees your content as duplicate, it may crawl the content but elect not to include it in the index. This is one of the ways that Google avoids SERP duplication. By removing duplicate content from the index, Google ensures that users have a larger variety of unique pages to interact with. Sometimes the report will label these URLs with a “Duplicate” status (“Duplicate, Google chose different canonical than user”). However, this is not always the case.

This is a high-priority issue, especially on a lot of e-commerce sites. Key pages such as product description pages often include the same or similar product descriptions as many other results across the web. If Google recognizes these as too similar to other pages, internally or externally, it might exclude them from the index altogether.

Solution: Add unique elements to the duplicate content.

If you think that this situation applies to your site, here’s how you test for it:

  1. Take a snippet of the potential duplicate text and paste it into Google.
  2. In the SERP URL, append the following string to the end: “&num=100”. This will show you the top 100 results.
  3. Use your browser’s “Find” function to see if your result appears in the top 100 results. If it doesn’t, your result might be getting filtered out of the index.
  4. Go back to the SERP URL and append the following string to the end: “&filter=0”. This should show you Google’s unfiltered result (thanks, Patrick Stox, for the tip).
  5. Use the “Find” function to search for your URL. If you see your page now appearing, this is a good indication that your content is getting filtered out of the index.
  6. Repeat this process for a few URLs with potential duplicate or very similar content you’re seeing in the “Crawled — currently not indexed” report.
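The URL manipulation in steps 2 and 4 above can be sketched as follows (illustrative only; these are documented Google search parameters, but the helper itself is hypothetical):

```python
# Build the two diagnostic SERP URLs from steps 2 and 4.
def serp_variants(serp_url: str):
    top_100 = serp_url + "&num=100"      # step 2: show the top 100 results
    unfiltered = top_100 + "&filter=0"   # step 4: show the unfiltered results
    return top_100, unfiltered

top_100, unfiltered = serp_variants("https://www.google.com/search?q=%22your+snippet%22")
print(top_100)
print(unfiltered)
```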

If you’re consistently seeing your URLs getting filtered out of the index, you’ll need to take steps to make your content more unique.

While there is no one-size-fits-all standard for achieving this, here are some options:

  1. Rewrite the content to be more unique on high-priority pages.
  2. Use dynamic properties to automatically inject unique content onto the page.
  3. Remove large amounts of unnecessary boilerplate content. Pages with more templated text than unique text might be getting read as duplicate.
  4. If your site is dependent on user-generated content, inform contributors that all provided content should be unique. This may help prevent instances where contributors use the same content across multiple pages or domains.
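One rough way to find candidates for option 3 at scale is to measure how similar each page’s extracted text is to the shared template. A heuristic sketch using Python’s difflib (the 0.8 threshold is an assumption to tune for your own site, not a Google-documented cutoff):

```python
# Estimate how much of a page is template text versus unique text.
import difflib

def template_similarity(page_text: str, template_text: str) -> float:
    """Returns 0.0 (fully unique) to 1.0 (identical to the template)."""
    return difflib.SequenceMatcher(None, page_text, template_text).ratio()

template = "Free delivery on all orders. 30-day returns. Call us on 0800 000 000."
page = template + " The Widget 3000 is hand-built."

score = template_similarity(page, template)
if score > 0.8:  # tune this threshold for your own site
    print(f"Mostly templated ({score:.0%}): consider trimming boilerplate")
```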

8. Private-facing content

Priority: High

There are some instances where Google’s crawlers gain access to content that they shouldn’t have access to. If Google is finding dev environments, it could include those URLs in this report. We’ve even seen examples of Google crawling a particular client’s subdomain that is set up for JIRA tickets. This caused an explosive crawl of the site, which focused on URLs that shouldn’t ever be considered for indexation.

The issue here is that Google’s crawl of the site isn’t focused, and it’s spending time crawling (and potentially indexing) URLs that aren’t meant for searchers. This can have massive ramifications for a site’s crawl budget.

Solution: Adjust your crawling and indexing initiatives.

This solution is going to be entirely dependent on the situation and what Google is able to access. Typically, the first thing you want to do is determine how Google is able to discover these private-facing URLs, especially if it’s via your internal linking structure.

Start a crawl from the home page of your primary subdomain and see if any undesirable subdomains are able to be accessed by Screaming Frog through a standard crawl. If so, it’s safe to say that Googlebot might be finding those exact same pathways. You’ll want to remove any internal links to this content to cut Google’s access.

The next step is to check the indexation status of the URLs that should be excluded. Is Google sufficiently keeping all of them out of the index, or were some caught in the index? If Google isn’t indexing a large amount of this content, you might consider adjusting your robots.txt file to block crawling immediately. If not, “noindex” tags, canonicals, and password protected pages are all on the table.
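As a purely illustrative fragment (the subdomain is hypothetical), blocking a dev environment means serving a restrictive robots.txt on that subdomain itself, since robots.txt rules apply per-host:

```
# robots.txt served at https://dev.example.com/robots.txt
# (a robots.txt on the primary domain cannot block a subdomain)
User-agent: *
Disallow: /
```

And remember the caveat above: robots.txt stops crawling but won’t remove pages already caught in the index, which is where “noindex”, canonicals, or password protection come in.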

Case study: duplicate user-generated content

For a real-world example, this is an instance where we diagnosed the issue on a client site. This client is similar to an e-commerce site as a lot of their content is made up of product description pages. However, these product description pages are all user-generated content.

Essentially, third parties are allowed to create listings on this site. However, the third parties were often adding very short descriptions to their pages, resulting in thin content. The issue occurring frequently was that these user-generated product description pages were getting caught in the “Crawled — currently not indexed” report. This resulted in missed SEO opportunity as pages that were capable of generating organic traffic were completely excluded from the index.

When going through the process above, we found that the client’s product description pages were quite thin in terms of unique content. The pages that were getting excluded only appeared to have a paragraph or less of unique text. In addition, the bulk of on-page content was templated text that existed across all of these page types. Since there was very little unique content on the page, the templated content might have caused Google to view these pages as duplicates. The result was that Google excluded these pages from the index, citing the “Crawled — currently not indexed” status.

To solve for these issues, we worked with the client to determine which of the templated content didn’t need to exist on each product description page. We were able to remove the unnecessary templated content from thousands of URLs. This resulted in a significant decrease in “Crawled — currently not indexed” pages as Google began to see each page as more unique.

Conclusion

Hopefully, this helps search marketers better understand the mysterious “Crawled — currently not indexed” status in the Index Coverage report. Of course, there are likely many other reasons that Google would choose to categorize URLs like this, but these are the most common instances we’ve seen with our clients to date.

Overall, the Index Coverage report is one of the most powerful tools in Search Console. I would highly encourage search marketers to get familiar with the data and reports as we routinely find suboptimal crawling and indexing behavior, especially on larger sites. If you’ve seen other examples of URLs in the “Crawled — currently not indexed” report, let me know in the comments!


source https://moz.com/blog/crawled-currently-not-indexed-coverage-status


Defense Against the Dark Arts: Why Negative SEO Matters, Even if Rankings Are Unaffected

Posted by rjonesx.

Negative SEO can hurt your website and your work in search, even when your rankings are unaffected by it. In this week’s Whiteboard Friday, search expert Russ Jones dives into what negative SEO is, what it can affect beyond rankings, and tips on how to fight it.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

All right, folks. Russ Jones here and I am so excited just to have the opportunity to do any kind of presentation with the title “Defense Against the Dark Arts.” I’m not going to pretend like I’m a huge Harry Potter fan, but anyway, this is just going to be fun.

But what I want to talk about today is actually pretty bad. It’s the reality that negative SEO, even if it is completely ineffective at doing its primary goal, which is to knock your website out of the rankings, will still play havoc on your website and the likelihood that you or your customers will be able to make correct decisions in the future and improve your rankings.

Today I’m going to talk about why negative SEO still matters even if your rankings are unaffected, and then I’m going to talk about a couple of techniques that you can use that will help abate some of the negative SEO techniques and also potentially make it so that whoever is attacking you gets hurt a little bit in the process, maybe. Let’s talk a little bit about negative SEO.

What is negative SEO?

The most common form of negative SEO is someone who would go out and purchase tens of thousands of spammy links or hundreds of thousands even, using all sorts of different software, and point them to your site with the hope of what we used to call “Google bowling,” which is to knock you out of the search results the same way you would knock down a pin with a bowling ball.

The hope is that it’s sort of like a false flag campaign, that Google thinks that you went out and got all of those spammy links to try to improve your rankings, and now Google has caught you and so you’re penalized. But in reality, it was someone else who acquired those links. Now to their credit, Google actually has done a pretty good job of ignoring those types of links.

It’s been my experience that, in most cases, negative SEO campaigns don’t really affect rankings the way they’re intended to, and I give a lot of caveats there because I’ve certainly seen them be effective. But in the majority of cases all of those spammy links are just ignored by Google. But that’s not it. That’s not the complete story. 

Problem #1: Corrupt data

You see, the first problem is that if you get 100,000 links pointing to your site, what’s really going on in the background is that there’s this corruption of data that’s important to making decisions about search results. 

Pushes you over data limits in GSC

For example, if you get 100,000 links pointing to your site, it is going to push you over the limit of the number of links that Google Search Console will give back to you in the various reports about links.

Pushes out the good links

This also means that there are probably links that you should know about or care about that don’t show up in the report, simply because Google cuts off at 100,000 total links in the export.

Well, that’s a big deal, because if you’re trying to make decisions about how to improve your rankings and you can’t get to the link data you need because it’s been replaced with hundreds of thousands of spammy links, then you’re not going to be able to make the right decision. 

Increased cost to see all your data

The other big issue here is that there are ways around it.

You can get the data for more than 100,000 links pointing to your site. You’re just going to have to pay for it. You could come to Moz and use our Link Explorer tool for example. But you’ll have to increase the amount of money that you’re spending in order to get access to the accounts that will actually deliver all of that data.

The one big issue sitting behind all of this is that even though we know Google is ignoring most of these links, they don’t label that for us in any kind of useful fashion. Even after we can get access to all of that link data, all of those hundreds of thousands of spammy links, we still can’t be certain which ones matter and which ones don’t.

Problem #2: Copied content

That’s not the only type of negative SEO that there is out there. It’s the most common by far, but there are other types. Another common type is to take the content that you have and distribute it across the web in the way that article syndication used to work. So if you’re fairly new to SEO, one of the old methodologies of improving rankings was to write an article on your site, but then syndicate that article to a number of article websites and these sites would then post your article and that article would link back to you.

Now the reason why these sites would do this is because they would hope that, in some cases, they would outrank your website and in doing so they would get some traffic and maybe earn some AdSense money. But for the most part, that kind of industry has died down because it hasn’t been effective in quite some time. But once again, that’s not the whole picture. 

No attribution

If all of your content is being distributed to all of these other sites, even if it doesn’t affect your rankings, it still means there’s the possibility that somebody is getting access to your quality content without any kind of attribution whatsoever.

If they’ve stripped out all of the links and stripped out all of the names and all of the bylines, then your hard earned work is actually getting taken advantage of, even if Google isn’t really the arbiter anymore of whether or not traffic gets to that article. 

Internal links become syndicated links

Then on the flip side of it, if they don’t remove the attribution, all the various internal links that you had in that article in the first place that point to other pages on your site, those now become syndicated links, which are part of the link schemes that Google has historically gone after.

In the same sort of situation, it’s not really just about the intent behind the type of negative SEO campaign. It’s the impact that it has on your data, because if somebody syndicates an article of yours that has let’s say eight links to other internal pages and they syndicate it to 10,000 websites, well, then you’ve just got 80,000 new what should have been internal links, now external links pointing to your site.

We do know that just a couple of years back, several pretty strong brands got in trouble for syndicating their news content to other news websites. Now, I’m not saying that negative SEO would necessarily trigger that same sort of penalty, but there’s the possibility. Even if it doesn’t, chances are it’s going to sully the waters in terms of your link data.

Problem #3: Nofollowed malware links & hacked content

There are a couple of other miscellaneous types of negative SEO that don’t really get talked about much.

Nofollowed malware links in UGC

For example, if you have any kind of user-generated content on your site (say, you allow comments), even if you nofollow those comments, the links included in them might point to things like malware.

We know that Google will ultimately identify your site as not being safe if it finds these types of links. 
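As a rough sketch (the markup, class name, and URL here are all invented for illustration), a nofollowed link inside a user comment might look like this:

```html
<!-- Hypothetical example of a user comment containing a nofollowed link. -->
<!-- rel="nofollow ugc" keeps the link from passing ranking credit, but if  -->
<!-- the destination hosts malware, the page hosting the comment can still -->
<!-- end up flagged as unsafe.                                             -->
<div class="comment">
  <p>Great post! You should try
    <a href="http://free-seo-tools.example/download" rel="nofollow ugc">this free tool</a>.
  </p>
</div>
```

This is why moderating or periodically auditing UGC links matters even when every link is nofollowed.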

Hacked content

Unfortunately, in some cases, there are ways to make it look like there are links on your site that aren’t really under your control through things like HTML injection. For example, you can actually do this to Google right now.

You can inject HTML onto a page of part of their website that makes it look like they’re linking to someone else. If Google actually crawled itself (which, luckily, they don’t in this case) and found that malware link on the page, the whole domain would likely start to be flagged in the Google search results as potentially unsafe.

Of course, there’s always the issue with hacked content, which is becoming more and more popular. 

Fear, uncertainty, and doubt

All of this really boils down to the concept of FUD: fear, uncertainty, and doubt. You see, it’s not so much about knocking you out of the search engines. It’s about making it so that SEO just isn’t workable anymore.

1. Lose access to critical data

Now it’s been at least a decade since everybody started saying that they used data-driven SEO tactics, data-driven SEO strategies. Well, if your data is corrupted, if you lose access to critical data, you will not be able to make smart decisions. How will you know whether or not the reason your page has lost rankings to another has anything to do with links if you can’t get to the link data that you need because it’s been filled with 100,000 spammy links?

2. Impossible to discern the cause of lost rankings

This leads to number two: it’s impossible to discern the cause of lost rankings. It could be duplicate content. It could be an issue with these hundreds of thousands of links. It could be something completely different. But because the waters have been muddied so much, it becomes very difficult to determine exactly what’s going on, and this of course makes SEO less certain.

3. Makes SEO uncertain

The less certain it becomes, the more other advertising channels become valuable. Paid search becomes more valuable. Social media becomes more valuable. That’s a problem if you’re a search engine optimization agency or a consultant, because you have the real likelihood of losing clients because you can’t make smart decisions for them anymore because their data has been damaged by negative SEO.

It would be really wonderful if Google would actually show us in Google Search Console what links they’re ignoring and then would allow us to export only the ones they care about. But something tells me that that’s probably beyond what Google is willing to share. So do we have any kind of way to fight back? There are a couple.

How do you fight back against negative SEO?

1. Canonical burn pages

Chances are if you’ve seen some of my other Whiteboard Fridays, you’ve heard me talk about canonical burn pages. Real simply, when you have an important page on your site that you intend to rank, you should create another version of it that is identical and that has a canonical link pointing back to the original. Any kind of link building that you do, you should point to that canonical page.
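Concretely, the setup might look like this (the URLs here are hypothetical):

```html
<!-- Hypothetical canonical burn page setup.                                  -->
<!-- https://example.com/widgets is the original page you intend to rank.     -->
<!-- https://example.com/widgets-burn is the identical copy that all of your  -->
<!-- link building points to.                                                 -->

<!-- In the <head> of /widgets-burn (the burn page): -->
<link rel="canonical" href="https://example.com/widgets" />

<!-- In the <head> of /widgets (the original), a self-referencing canonical: -->
<link rel="canonical" href="https://example.com/widgets" />
```

Because the burn page canonicalizes to the original, the link equity you build is consolidated onto the page you want to rank.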

The reason is simple. If somebody does negative SEO, they’re going to have two choices. They’re either going to do it to the page that’s getting linked to, or they’re going to do it to the page that’s getting ranked. Normally, they’ll do it to the one that’s getting ranked. Well, if they do, then you can get rid of that page and just hold on to the canonical burn page because it doesn’t have any of these negative links.

Or if they choose the canonical burn page, you can get rid of that one and just keep your original page. Yes, it means you sacrifice the hard-earned links you acquired in the first place, but that’s better than losing the ability to rank that content at all in the future.

2. Embedded styled attribution

Another opportunity here, which I think is kind of sneaky and fun, is what I call embedded styled attribution.

You can imagine that my content might say “Russ Jones says so and so and so and so.” Well, imagine surrounding “Russ Jones” by H1 tags and then surrounding that by a span tag with a class that makes it so that the H1 tag that’s under it is the normal-sized text.

Well, chances are that if they’re using one of these copied-content techniques, they’re not copying your CSS stylesheet as well. When your article gets published to all of these other sites, your name (or any other phrase you want) appears in giant letters. Now, this isn’t actually going to solve your problem, other than really frustrating the hell out of whoever is trying to screw with you.

But sometimes that’s enough to get them to stop. 
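As a rough sketch (the class name and styling here are invented), the markup described above might look like this:

```html
<!-- Hypothetical embedded styled attribution.                            -->
<!-- On your own site, the .quiet-h1 class shrinks the h1 back down to    -->
<!-- ordinary inline body text, so readers see nothing unusual.           -->
<style>
  .quiet-h1 h1 {
    display: inline;
    font-size: 1em;
    font-weight: normal;
    margin: 0;
  }
</style>

<p>
  <span class="quiet-h1"><h1>Russ Jones</h1></span> says that negative SEO
  is more about corrupting your data than tanking your rankings.
</p>

<!-- A scraper that copies the HTML but not the stylesheet ends up        -->
<!-- publishing "Russ Jones" as a giant, default-styled h1.               -->
```

On the scraper’s site, with no stylesheet to tame it, the attribution renders at full heading size.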

3. Link Lists

The third one, the one that I really recommend is Link Lists. This is a feature inside of Moz’s Link Explorer, which allows you to track the links that are pointing to your site. As you get links, real links, good links, add them to a Link List, and that way you will always have a list of links that you know are good, that you can compare against the list of links that might be sullied by a negative SEO campaign.

By using Link Lists, you can discern, at least to some degree, the difference between what’s actually being ignored by Google and what actually matters. I hope this is helpful. But unfortunately, I’ve got to say, at the end of the day, a sufficiently well-run negative SEO campaign can make the difference in whether or not you use SEO in the future at all.
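As a trivial illustration of that comparison (the domain names here are invented), a vetted list of earned links can be diffed against a fresh backlink export with simple set math:

```python
# Hypothetical sketch: compare a vetted list of earned linking domains
# (e.g. one saved as a Link List in Moz's Link Explorer) against the
# latest set of domains found linking to the site.

known_good = {
    "example-news.com",
    "industry-blog.net",
    "partner-site.org",
}

latest_links = {
    "example-news.com",
    "industry-blog.net",
    "partner-site.org",
    "spammy-casino.xyz",     # never earned: possible negative SEO
    "article-farm-1.info",   # never earned: possible syndication spam
}

# Anything linking now that isn't on the vetted list deserves review.
suspicious = sorted(latest_links - known_good)
print(suspicious)  # ['article-farm-1.info', 'spammy-casino.xyz']
```

In practice you would export both lists from your link tool of choice, but the principle is the same: a maintained baseline lets you isolate what changed.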

It might not knock you out of Google, but it might make it so that other types of marketing are just better choices. So hopefully this has been some help. I’d love to talk with you in the comments about different ways of dealing with negative SEO, like how to track down who is responsible. So just go ahead and fill those comments up with any questions or ideas.

I would love to hear them. Thanks again and I look forward to talking to you in another Whiteboard Friday.

Video transcription by Speechpad.com

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

source https://moz.com/blog/why-negative-seo-matters


Benchmark for Success: What Your Vertical Can Achieve With Content Marketing

Posted by Domenica

You’ve produced a piece of content you thought was going to be a huge success, but the results were underwhelming.

You double and triple checked the content for all the crucial elements: it’s newsworthy, data-driven, emotional, and even a bit controversial, but it failed to “go viral”. Your digital PR team set out to pitch it, but writers didn’t bite.

So, what’s next?

Two questions you might ask yourself are:

  • Do I have unrealistic link expectations for my link-building content?
  • Is my definition of success backed by data-driven evidence?

Fractl has produced thousands of content marketing campaigns across every topic — sports, entertainment, fashion, home improvement, relationships — you name it. We also have several years’ worth of campaign performance data that we use to learn from our successes and mistakes.

In this article, I’m going to explain how businesses and agencies across seven different niches can set realistic expectations for their link-building content based on the performance of 626 content projects Fractl has produced and promoted in the last five years. I’ll also walk through some best practices for ensuring your content reaches its highest potential.

Managing expectations across verticals

You can’t compare apples to oranges. Each beat has its own unique challenges and advantages. Content for each vertical has to be produced with expert-level knowledge of how publishers within each vertical behave.

We selected the following common verticals for analysis:

  • Health and fitness
  • Travel
  • Sex and relationships
  • Finance
  • Technology
  • Sports
  • Food and drink

Across the entire sample of 626 content projects, a project received an average of 23 dofollow links and 88 press mentions in total. Some individual vertical averages didn’t deviate much from these numbers, while other niches did.

Of course, you can’t necessarily expect these numbers when you just start dipping your toes in content marketing or digital PR. It’s a long-term investment, and it usually takes at least six months to a year before you get the results you’re looking for.

A “press mention” refers to any time a publisher wrote about the campaign. A press mention could involve any type of link (dofollow, nofollow, simple text attribution, etc.). We also looked at dofollow links individually, as they provide more value than a nofollow link or a text attribution. We excluded campaigns that went “viral” and performed well above the norm, so as not to skew the averages higher.

Based on averages from these 626 campaigns, are your performance expectations too high or too low?

Vertical-specific content considerations

Of course, there are universal principles that you should apply to all content no matter the vertical. The data needs to be sound. The graphic assets need to be pleasing to the eye and easy to understand. The information needs to be surprising and informative.

But when it comes to vertical-specific content considerations, what should you pay attention to? What tactics or guidelines apply to one niche that you can disregard for other niches? I solicited advice from the senior team at Fractl and asked what they look out for when making content for different verticals. All have several years of experience producing and promoting content across every vertical and niche. Here’s what they said:

Sex and dating

For content relating to sex and relationships, it’s important to err on the side of caution.

“Be careful not to cross the line between ‘sexy’ content and raunchy content,” says Angela Skane, Creative Strategy. “The internet can be an exciting place, but if something is too out-there or too descriptive, publishers are going to be turned off from covering your content.”

Even magazine websites like Cosmopolitan — a publication known for its sex content — have editorial standards to make sure lines aren’t crossed. For example, when pitching a particularly risqué project exploring bedroom habits of men and women, we learned that just because a project is doing well over at Playboy or Maxim doesn’t mean it would resonate with the primarily female audience over at Cosmopolitan.

Be especially aware of anything that could be construed as misogynistic or that could pit women against each other. It’s likely not the message your client will want to promote, anyway.

Finance

Given that money is frequently cited as one of the topics to avoid in polite dinner conversation, there’s no doubt that talking and thinking about money evokes a lot of emotion in people.

“Finance can seem dry at first glance, but mentions of money can evoke strong emotions. Tapping into financial frustrations, regrets, and mistakes makes for highly entertaining and even educational content,” says Corie Colliton, Creative Strategy. “For example, one of my best finance campaigns featured the purchases people felt their partners wasted money on. Another showed the amount people spend on holiday gifts — and the number who were in debt for a full year after the holidays as a result.”

Emotion is one of the drivers of social sharing, so use it to your advantage when producing finance-related content.

We also heard from Chris Lewis, Account Strategy: “Relate to your audience. Readers will often try to use financial content marketing campaigns as a way to benchmark their own financial well-being, so giving people lots of data about potential new norms helps readers relate to your content.”

People want to read content and be able to picture themselves within it. How do they compare to the rest of America, or their state, or their age group? Relatability is key in finance-related content.

Sports

A little healthy competition never hurt anyone, and that’s why Tyler Burchett, Promotions Strategy, thinks you should always utilize fan bases when creating sports content: “Get samples from different fan bases when possible. Writers like to pit fans against each other, and fans take pride in seeing how they rank.”

Food and drink

According to Chris Lewis, don’t forgo design when creating marketing campaigns about food: “Make sure to include good visuals. People eat with their eyes!”

If the topic for which you’re creating content typically has visual appeal, it’s best to take advantage of that to draw people into your content. Have you ever bought a recipe book that didn’t include photos of the food?

Technology

Think tech campaigns are just about tech? Think again. Matt Gillespie, Data Science, says: “Technology campaigns are always culture and human behavior campaigns. Comparing devices, social media usage, or more nuanced topics like privacy and security, can only resonate with a general audience if it ties to more common themes like connection, safety, or shared experience — tech savvy without being overly technical.”

Travel

When creating content for travel, it’s important to make sure there are actionable takeaways in the content. If there aren’t, it can be hard for publishers to justify covering it.

“Travel writers love to extract ‘tips’ from the content they’re provided. If your project provides helpful information to travelers or little-known statistics on flights and amenities, you’re likely to gain a lot of traction in the travel vertical,” says Delaney Kline, Brand Promotions. “Come up with these ideal statistics before creating your project and use them as a template for your work.”

Health and fitness

In the health and wellness world, it can seem like everyone is giving advice. If you’re not a doctor, however, err on the side of caution when speaking about specific topics. Try not to pit any particular standard against another. Be careful around diet culture and mental health topics, specifically.

“Try striking a balance between physical and mental well-being, particularly being careful to not glorify or objectify one standard while demeaning others,” says Matt Gillespie, Data Science. “Emphasize overall wellness as opposed to focusing on a single area. In this vertical, you need to be especially careful with whatever is trending. Do the legwork to understand the research, or lack thereof, behind the big topics of the moment.”

Improving content in any vertical

While you can certainly tailor your content production and promotion to your specific niche, there are also some guidelines you can follow to improve the chances that you’ll get more media coverage for your content overall.

Create content with a headline in mind

When you begin mapping out your content, identify what you want the outcome to look like. Before you even begin, ask yourself: what do you want people to learn from your content? What are the elements of the content you’re producing that journalists will find compelling for their audiences?

For example, we wrote a survey in which we wanted to compare the levels of cooking experience across different generations. We hypothesized that we’d see some discrepancies between boomers and millennials specifically, and given that millennials ruin everything, it was a good time to join the discussion.

As it turns out, only 64% of millennials could correctly identify a butter knife. Publishers jumped at the stats revealing millennials have a tough time in the kitchen. Having a thesis and an idea of what we wanted the project to look like in advance had a tremendous positive impact on our results.

Appeal to the emotionality of people

In past research on the emotions that make content go viral, we learned that negative content may have a better chance of going viral if it is also surprising. Nothing embodies this combination of emotional drivers better than a project we did for a travel client, in which we used germ swabs to determine the dirtiest surfaces on airplanes.

This campaign did so well (and continues to earn links to this day) that it’s actually excluded from our vertical benchmarks analysis as we consider it a viral outlier.

Why did this idea work? Most people travel via plane at least once a year, and everyone wants to avoid getting sick while traveling. So, a data-backed report like this one that also yielded some click-worthy headlines is sure to exceed your outreach goals.

Evergreen content wins (sometimes)

You may have noticed from the analysis above that, of the seven topics we chose to look at, the sports vertical has the lowest average dofollow links and total press mentions of any category.

For seasoned content marketers, this is very understandable. Unlike the other verticals, the sports beat is an ever-changing and fast-paced news cycle that’s hard for content marketers to have a presence in. However, for our sports clients we achieve success by understanding this system and working with it — not trying to be louder than it.

One technique we’ve found that works for sports campaigns (as well as other sectors with fast-paced news cycles such as entertainment or politics) is to come up with content that is both timely and evergreen. By capitalizing on the current interests around major sporting events (timely) and creating an idea that would work on any given day of the year (evergreen) we can produce content that’s the best of both worlds, and that will still have legs once the timeliness wears off.

In a series of campaigns for one sports client, we took a look at the evolution of sports jerseys and chose teams with loyal fan bases such as the New York Yankees, Carolina Panthers, Denver Broncos, and Chicago Bears.

The sports niche has an ongoing, fast-paced news cycle that changes every day, if not every hour. Reporters are busy covering by-the-minute breaking news, games, statistics, rankings, trades, personal player news, and injuries. This makes it one of the most challenging verticals to compete in. By capitalizing on teams of interest throughout the year, we were able to squeeze projects into tight editorial calendars and earn our client some press.

For example, timing couldn’t have been better when we pitched “Evolution of the Football Jersey”. We pitched this campaign to USA Today right before a playoff stretch in which the Steelers and the Redskins played. Time was of the essence: the editor wrote and published the article within 24 hours, and our client enjoyed a lot of good syndication from the powerful publication. In total, the one placement resulted in 15 dofollow links and over 45 press mentions. Not bad for a few transforming GIFs!

Top it off with the best practices in pitching

If you have great content and you have a set of realistic expectations for that content, all that’s left is to distribute it and collect those links and press mentions.

Moz has previously covered some of the best outreach practices for promoting your content to top-tier publishers, but I want to note that when it comes to PR, what you do is just as important as what you don’t do.

In a survey of over 500 journalists in 2019, I asked online editors and writers what their biggest PR pitch pet peeves were. When you conduct content marketing outreach, avoid these top-listed items and you’ll be good to go:



While you might get away with sending one too many follow-ups, most of the offenses on this list are just that — totally offensive to the writer you’re trying to pitch.

Avoid mass email blasts, personalize your pitch, and triple-check that the person you’re contacting is receptive to your content before you hit send.

Conclusion

While there are certainly some characteristics that all great content should have, there are ways to increase the chances your content will be engaging within a specific vertical. Research what your particular audience is interested in, and be sure to measure your results realistically based on how content generally performs in your space.


source https://moz.com/blog/vertical-content-marketing


Heart, Ear, Eye, Mind, Mouth: Local SEO Exercises for Your Least Technical Clients

Posted by MiriamEllis

When was the last time you relaxed with a client?

As a local business consultant, I know that deeper marketing insights can be discovered when you set aside formality and share experiences: a moment, a laugh, a common bond. 

When I’m looking for ways to make life easier for a client, I sometimes reflect on ancient practices like yoga, tai chi, and mindful breathing, which are increasingly understood as beneficial to human health. For a space in time, they reduce the complex world we live in to a simpler one where being, breath, movement, and focus bring the practitioner to a more intuitive state. 

Local marketing agencies can empathize with the complex world their clients inhabit. Local business owners must manage everything from rent and employee benefits to customer service, business reviews, web content, and online listings. When you take on a new client, you expect them to onboard a ton of information about marketing their brand online. Sometimes, the most basic motivations go unaddressed and get lost in assumptions and jargon — instead of decreasing client stress for your least technical clients, you can accidentally increase it. 

Today, I’ll help you create an intuitive space by sharing five simple meditation exercises you can use with your agency’s clients. Instead of talking in acronyms like SEO, CTR, USP, and GMB, let’s relax with clients by relating successful local search marketing practices to experiences people at any level of technical proficiency already understand.

Heart

To show their heart is in the right place, the Vermont Country Store publishes a customer bill of rights.

For a local business owner, there is no more important quality than having their heart in the right place when it comes to their motivation for running a company.

Yes, all of us work to earn money, but it’s the dedication to serving others that is felt by customers in every interaction with them. When customers feel that a business is there for them, it establishes the loyalty and reputation that secure local search marketing success. 

Heart meditation

Close your eyes for a few seconds and think of a time in your life when you most needed help from a business. Maybe you needed a tow truck, a veterinarian, a dentist, or a plumber. You really needed them to understand your plight, deliver the right help, and treat you as an important person who is worthy of respect. Whether you received what you required or not, remember the feeling of need. 

Now, extend that recognition beyond your own heart to the heart of every customer who feels a need for something your client can offer them.

A business owner with their heart in the right place can powerfully contribute to local search marketing by:

  • Running a customer-centric business.
  • Creating customer guarantees that are fair.
  • Creating an employee culture of respect and empowerment that extends to customers.
  • Creating a location that is clean, functional, and pleasant for all.
  • Honestly representing their products, services, location, and reputation.
  • Refraining from practices that negatively impact their customers and reputation.
  • Participating positively in the life of the community they serve.

A good local search marketing agency will help the business owner translate these basics into online content that meets customer needs, local business listings that accurately and richly represent the business, and genuine reviews that serve as a healthy and vital ongoing conversation between the brand and its customers. A trustworthy agency will ensure avoidance of any tactics that pollute the Internet with spam listings, spam reviews, negative attacks on competitors, and negative impacts on the service community. An excellent agency will also assist in finding and promoting community engagement opportunities, helping to win desirable online publicity from offline efforts.

Ear

Keter Salon of Berkeley, Calif. really listens to customers and it shows in its reviews.

Local business success is so linked to the art of listening, I sometimes think Google should replace their teardrop map markers with little ears. In the local SEO world, there are few things sadder than seeing local business profiles filled with disregarded reviews, questions, and negative photos. (Someone cue “The Sound of Silence”.)

From a business perspective, the sound of branded silence is also the sound of customers and profits trickling away. Why does it work this way? Because only 4% of your unhappy customers may actually make the effort to speak up, and if a business owner is not even hearing them, they’ve lost the ability to hear consumer demand. Let’s make sure this doesn’t happen.

Ear meditation

Close your eyes for a few seconds and listen closely to every noise within the range of your hearing. Ask yourself, “Where am I?”

The sound of typing, phone calls, and co-workers chatting might place you in an office. Sliding doors, footsteps on linoleum, and floor staff speaking might mean you’re at your client’s brick-and-mortar location. Maybe it’s birdsong outside and the baby in their crib that tell you you’re working from home today. Listen to every sound that tells you exactly where you are right now.

Now, commit to listening with this level of attention and intention to the signals of customer voices, telling you exactly where a local brand is right now in terms of faults and successes. 

A business owner who keeps their ears open can actively gauge how their business is really doing with its customers by:

  • Having one-on-one conversations with customers.
  • Recording and analyzing phone conversations with customers.
  • Reading reviews on platforms like Google My Business, Yelp, Facebook and sites that are specific to their industry (like Avvo for lawyers or Healthgrades for physicians).
  • Reading the Q&A questions of customers on their Google Business Profile.
  • Reading mentions of their brand on social media platforms like Twitter, Facebook, and Instagram.
  • Reading the responses to surveys they conduct.
  • Reading the emails and form submissions the company receives.

A good local search marketing agency will help their client amass, organize, and analyze all of this sentiment to discover the current reputation of the business. From this information, you and your client can chart a course for improvement. Consider that, in this study, a 1.5 star improvement in online reputation increased consumer activity by 10%-12% and generated 13,000 more leads for the brands included. The first step to a better reputation is simply listening. 

Eye

Moz’s Local Market Analytics (Beta) helps you see your market through customer location emulation.

When your clients choose their business locations, they weigh several factors. They compare how the mantra of “location, location, location” matches their budget, and whether a certain part of town is lacking something their business could provide. They also look at the local competitors to see if the competition would be hard to beat, or if they could do the job better. Success lies in truly seeing the lay of the land.

Local search mirrors the real world. The market on the Internet is made up of the physical locations of your clients’ customers at the time they search for what your client has to offer. 

Eye meditation

You already know most of the businesses on your street, and many of them in your neighborhood. Now, with eyes wide open, start searching Google for the things your listening work has told you customers need. Where appropriate, include attributes you’ve noticed them using like “best tacos near me”, “cheapest gym in North Beach”, or “shipping store downtown.”

See how your client ranks when a person does these types of searches while at their location. Now, walk or drive a few blocks away and try again. Go to the city perimeter and try again. Where are they ranking, and who is outranking them as you move about their geographic market?

A local business keeping its eyes open never makes assumptions about who its true competitors are or how its customers search. Instead, it:

  • Regularly assesses the competition in its market, taking into account the distance from which customers are likely to come for goods and services.
  • Regularly reviews materials assembled in the listening phase to see how customers word their requests and sentiments.
  • Makes use of tools to analyze both markets and keyword searches.

A good local search marketing agency will help with the tools needed for market and search language analysis. These findings can inform everything from what a client names their business, to how they categorize it on their Google My Business listing, to what they write about to draw in customers from all geographic points in their market. Clear vision simultaneously enables you to analyze competitors who are outranking your client and assess why they’re doing so. It can empower your client to report spammers who are outranking them via forbidden tactics. An excellent agency will help their client see their competitive landscape with eyes on the prize.

Mind

When an independent Arizona appliance chain surprised three shoppers with $10,000, it made headlines.

With hearts ready for service, ears set on listening, and eyes determined to see, you and your client have now taken in useful information about their brand and the customers who make up their local market. You know now whether they’re doing a poor, moderate, or exceptional job of fulfilling needs, and are working with them to strategize next steps. But what are those next steps? 

Mind meditation

Sit back comfortably and think of a time a business completely surprised you, or a time when an owner or employee did something so unexpectedly great, it convinced you that you were in good hands. Maybe they comped your meal when it wasn’t prepared properly, or special-ordered an item just for you, or showed you how to do something you’d never thought of before.

Recall that lightbulb moment of delight. Ask yourself how your client’s brand could surprise customers in memorable ways they would love. Create a list of those ideas.

A creative local business gives full play to the awesome imaginative powers of the brain. It gives all staff permission to daydream and brainstorm questions like:

  • What is something unexpected the business could do that would come as a delightful surprise to customers?
  • What is the most impactful thing the business could do that would be felt as a positive force in the lives of its customers?
  • What risks can the business take for the sake of benevolence, social good, beauty, renown, or joy?

A good local search marketing agency will help sort through ideas that could truly differentiate their clients from the competition and bring them closer to making the kinds of impressions that turn local brands into household names. An excellent agency will bring ideas of their own. Study “surprise and delight marketing” as it’s done on the large, corporate scale, and get it going at a local level, like the small coffee roaster in Alexandria, Va. that sells ethical java while raising funds for LGBTQ+ organizations.

Mouth

Put your best stories everywhere, like in this social media example. Moz Local can help with publishing those stories.

“Think before you speak” is an old adage that serves well as a marketing guideline. Another way we might say it is “research before you publish”. With heart, ear, eye, and mind, you and your client have committed, collected, analyzed, and ideated their brand to a point where it’s ready to address the public from a firm foundation.

Mouth meditation

Open your favorite word processor on your computer and type a few bars of the lyrics to your favorite song. Next, type the first three brand slogans that come to your mind. Next, type a memorable line from a movie or book. Finally, type out the words of the nicest compliment or best advice someone ever gave you.

Sit back and look at your screen. Look at how those words have stuck in your mind — you remember them all! The people who wrote and spoke those words have indelibly direct-messaged you. 

How will you message the public in a way that’s unforgettable?

A well-spoken local business masters the art of face-to-face customer conversation. In-store signage and offline media require great words, too, but local search marketing will take spoken skills onto the web, where they’ll be communicated via:

  • Every page of the website 
  • Every article or blog post 
  • Social media content
  • Review responses
  • Answers to questions like Google Business Profile Q&A
  • Business descriptions on local business listings
  • Google posts
  • Featured snippet content
  • Live chat
  • Email
  • Press releases
  • Interviews
  • Images on the website, business listings, and third-party platforms like Google Images and Pinterest
  • Videos on the website, YouTube, and other platforms

A good local search marketing agency will help their client find the best words, images, and videos based on all the research done together. An excellent agency will help a local business move beyond simply being discovered online to being remembered as a household name each time customer needs arise. An agency should help their clients earn links, unstructured citations, and other forms of publicity from those research efforts.

Determine to help your client be the “snap, crackle, pop”, “un-Cola”, “last honest pizza” with everything you publish for their local market, and to build an Internet presence that speaks well of their business 24 hours a day.

Closing pose



One of the most encouraging aspects of running and marketing a local business is that it’s based on things you already have some life experience doing: caring, listening, observing, imagining, and communicating. 

I personally should be better at technical tasks like diagnosing errors in Schema, configuring Google Search Console for local purposes, or troubleshooting bulk GMB uploads. I can work at improving in those areas, but I can also work at growing my heart, ear, eye, mind, and mouth to master serving clients and customers.

Business is technical. Business is transactional. But good business is also deeply human, with real rewards for well-rounded growth.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

source https://moz.com/blog/local-seo-exercises-least-technical


2020 Google Search Survey: How Much Do Users Trust Their Search Results?

Posted by LilyRayNYC

While Google’s mission has always been to surface high-quality content, over the past few years the company has worked especially hard to ensure that its search results are also consistently accurate, credible, and trustworthy.

Reducing false and misleading information has been a top priority for Google since concerns over misinformation surfaced during the 2016 US presidential election. The search giant is investing huge sums of money and brain power into organizing the ever-increasing amounts of content on the web in a way that prioritizes accuracy and credibility.

In a 30-page whitepaper published last year, Google delineates specifically how it fights against bad actors and misinformation across Google Search, News, YouTube, Ads, and other Google products.

In this whitepaper, Google explains how Knowledge Panels — a common organic search feature — are part of its initiative to show “context and diversity of perspectives to form their own views.” With Knowledge Panel results, Google provides answers to queries with content displayed directly in its organic search results (often without including a link to a corresponding organic result), potentially eliminating the need for users to click through to a website to find an answer to their query. While this feature benefits users by answering their questions even more quickly, it brings with it the danger of providing quick answers that might be misleading or incorrect.

Another feature with this issue is Featured Snippets, where Google pulls website content directly into the search results. Google maintains specific policies for Featured Snippets, prohibiting the display of content that is sexually explicit, hateful, violent, dangerous, or in violation of expert consensus on civic, medical, scientific, or historical topics. However, this doesn’t mean the content included in Featured Snippets is always entirely accurate.

According to data pulled by Dr. Pete Meyers, based on a sample set of 10,000 keywords, Google has increased the frequency with which it displays Featured Snippets in search results. At the beginning of 2018, Google displayed Featured Snippets in approximately 12% of search results; in early 2020, that number hovers around 16%.

Google has also rolled out several core algorithm updates in the past two years, with the stated goal of “delivering on [their] mission to present relevant and authoritative content to searchers.” What makes these recent algorithm updates particularly interesting is how much E-A-T (expertise, authoritativeness, and trustworthiness) appears to be playing a role in website performance, particularly for YMYL (your money, your life) websites.

As a result of Google’s dedication to combating misinformation and fake news, we could reasonably expect searchers to agree that Google has improved in its ability to surface credible and trusted content. But does the average searcher actually feel that way? At Path Interactive, we conducted a survey to find out how users feel about the information they encounter in Google’s organic results.

About our survey respondents and methodology

Out of 1,100 respondents, 70% live in the United States, 21% in India, and 5% in Europe. 63% of respondents are between the ages of 18 and 35, and 17% are over the age of 46. All respondent data is self-reported.

For all questions involving specific search results or types of SERP features, respondents were provided with screenshots of those features. For questions about trustworthiness, or the extent of agreement with a statement, respondents answered on a scale of 1-5.

Our findings

Trustworthiness in the medical, political, financial, and legal categories

Given how much fluctuation we’ve seen in the YMYL category of Google with recent algorithm updates, we thought it would be interesting to ask respondents how much they trust the medical, political, financial, and legal information they find on Google.

We started by asking respondents about the extent to which they have made important financial, legal, or medical decisions based on information they found in organic search. The majority (51%) of respondents indicated that they “very frequently” or “often” make important financial decisions based on Google information, while 39% make important legal decisions, and 46% make important medical decisions. Only 10-13% of respondents indicated that they never make these types of important life decisions based on the information they’ve found on Google.

Medical searches

As it relates to medical searches, 72% of users agree or strongly agree that Google has improved at showing accurate medical results over time.

Breaking down these responses by age, a few interesting patterns emerge:

  • The youngest searchers (ages 18-25) are 94% more likely than the oldest searchers (65+) to strongly believe that Google’s medical results have improved over time.
  • 75% of the youngest searchers (ages 18-25) agree or strongly agree that Google has improved in showing accurate medical searches over time, whereas only 54% of the oldest searchers (65+) feel the same way.
  • Searchers ages 46-64 are the most likely to disagree that Google’s medical results are improving over time.
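As an aside, a “more likely” figure like those above is just the percent lift between two groups’ response shares. A minimal sketch of the arithmetic (the shares below are illustrative placeholders, not the survey’s raw numbers):

```python
def pct_more_likely(share_a, share_b):
    """Percent lift of group A over group B, e.g. 0.30 vs 0.20 -> 50 (%)."""
    return (share_a / share_b - 1) * 100

# Illustrative: if 30% of one group strongly agrees vs 20% of another,
# the first group is 50% more likely to strongly agree.
lift = pct_more_likely(0.30, 0.20)
```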

Next, we wanted to know if Google’s emphasis on surfacing medical content from trusted medical publications — such as WebMD and the Mayo Clinic — is resonating with its users. One outcome of recent core algorithm updates is that Google’s algorithms appear to be deprioritizing content that contradicts scientific and medical consensus (consistently described as a negative quality indicator throughout their Search Quality Guidelines).

The majority (66%) of respondents agree that it is very important to them that Google surfaces content from highly trusted medical websites. However, 14% indicated they would rather not see these results, and another 14% indicated they’d rather see more diverse results, such as content from natural medicine websites. These numbers suggest that more than a quarter of respondents may be unsatisfied with Google’s current health initiatives aimed at surfacing medical content from a set of acclaimed partners who support the scientific consensus.

We asked survey respondents about Symptom Cards, in which information related to medical symptoms or specific medical conditions is surfaced directly within the search results.

Examples of Symptom Cards. Source: https://blog.google/products/search/im-feeling-yucky-searching-for-symptoms/

Our question aimed to gather how much searchers felt the content within Symptom Cards can be trusted.

The vast majority (76%) of respondents indicated they trust or strongly trust the content within Symptom Cards.

When looking at the responses by age, younger searchers once again reveal that they are much more likely than older searchers to strongly trust the medical content found within Google. In fact, the youngest bracket of searchers (ages 18-25) are 138% more likely than the oldest searchers (65+) to strongly trust the medical content found in Symptom Cards.

News and political searches

The majority of respondents (61%) agree or strongly agree that Google has improved at showing high-quality, trustworthy news and political content over time. Only 13% disagree or strongly disagree with this statement.

Breaking the same question down by age reveals interesting trends:

  • The majority (67%) of the youngest searchers (ages 18-25) agree that the quality of Google’s news and political content has improved over time, whereas the majority (61%) of the oldest age group (65+) only somewhat agrees or disagrees.
  • The youngest searchers (ages 18-25) are 250% more likely than the oldest searchers to strongly agree that the quality of news and political content on Google is improving over time.

Misinformation

Given Google’s emphasis on combating misinformation in its search results, we also wanted to ask respondents about the extent to which they feel they still encounter dangerous or highly untrustworthy information on Google.

Interestingly, the vast majority of respondents (70%) feel that they have encountered misinformation on Google at least sometimes, although 29% indicate they rarely or never see misinformation in the results.

Segmenting the responses by age groups reveals a clear pattern that the older the searcher, the more likely they are to indicate that they have seen misinformation in Google’s search results. In fact, the oldest searchers (65+) are 138% more likely than the youngest searchers (18-25) to say they’ve encountered misinformation on Google either often or very frequently.

Throughout the responses to all questions related to YMYL topics such as health, politics, and news, a consistent pattern emerged that the youngest searchers appear to have more trust in the content Google displays for these queries, and that older searchers are more skeptical.

This aligns with our findings from a similar survey we conducted last year, which found that younger searchers were more likely to take much of the content displayed directly in the SERP at face value, whereas older searchers were more likely to browse deeper into the organic results to find answers to their queries.

This information is alarming, especially given another question we posed asking about the extent to which searchers believe the information they find on Google influences their political opinions and outlook on the world.

The question revealed some interesting trends related to the oldest searchers: according to the results, the oldest searchers (65+) are 450% more likely than the youngest searchers to strongly disagree that information they find on Google influences their worldview.

However, the oldest searchers are also most likely to agree with this statement; 11% of respondents ages 65+ strongly agree that Google information influences their worldview. On both ends of the spectrum, the oldest searchers appear to hold stronger opinions about the extent to which Google influences their political opinions and outlook than respondents from other age brackets.

Featured Snippets and the Knowledge Graph

We also wanted to understand the extent to which respondents found the content contained within Featured Snippets to be trustworthy, and to segment those responses by age brackets. As with the other scale-based questions, respondents were asked to indicate how much they trusted these features on a scale of 1-5 (Likert scale).

According to the results, the youngest searchers (ages 18-25) are 100% more likely than the oldest searchers (ages 65+) to find the content within Featured Snippets to be very trustworthy. This aligns with a similar discovery we found in our survey from last year: “The youngest searchers (13–18) are 220 percent more likely than the oldest searchers (70–100) to consider their question answered without clicking on the snippet (or any) result.”

For Knowledge Graph results, the results are less conclusive when segmented by age. 95% of respondents across all age groups find the Knowledge Panel results to be at least “trustworthy.”

Conclusion: Young users trust search results more than older users

In general, the majority of survey respondents appear to trust the information they find on Google — both in terms of the results themselves, as well as the content they find within SERP features such as the Knowledge Panel and Featured Snippets. However, there still appears to be a small subset of searchers who are dissatisfied with Google’s results. This subset consists of mostly older searchers who appear to be more skeptical about taking Google’s information at face value, especially for YMYL queries.

Across almost all survey questions, there is a clear pattern that the youngest searchers tend to trust the information they find on Google more so than the older respondents. This aligns with a similar survey we conducted last year, which indicated that younger searchers were more likely to accept the content in Featured Snippets and Knowledge Panels without needing to click on additional results on Google.

It is unclear whether younger searchers trust information from Google more because the information itself has improved, or because they are generally more trusting of information they find online. These results may also be due to older searchers not having grown up with the ability to rely on internet search engines to answer their questions. Either way, the results raise an interesting question about the future of information online: will searchers become less skeptical of online information over time?


source https://moz.com/blog/2020-google-search-survey


The Rules of Link Building – Best of Whiteboard Friday

Posted by BritneyMuller

Are you building links the right way? Or are you still subscribing to outdated practices? Britney Muller clarifies which link building tactics still matter and which are a waste of time (or downright harmful) in one of our very favorite classic episodes of Whiteboard Friday.

The Rules of Link Building

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Happy Friday, Moz fans! Welcome to another edition of Whiteboard Friday. Today we are going over the rules of link building. It’s no secret that links are one of the top three ranking factors in Google and can greatly benefit your website. But there is a little confusion around what’s okay to do as far as links and what’s not. So hopefully, this helps clear some of that up.

The Dos

All right. So what are the dos? What do you want to be doing? First and most importantly is just to…

I. Determine the value of that link. So aside from ranking potential, what kind of value will that link bring to your site? Is it potential traffic? Is it relevancy? Is it authority? Just start to weigh out your options and determine what’s really of value for your site. Our own tool, Moz Link Explorer, can help you evaluate the authority of a prospective linking site.

II. Local listings still do very well. These local business citations are on a bunch of different platforms, and services like Moz Local or Yext can get you up and running a little bit quicker. They tend to show Google that this business is indeed located where it says it is. It has consistent business information — the name, address, phone number, you name it. But something that isn’t really talked about all that often is that some of these local listings never get indexed by Google. If you think about it, Yellowpages.com is probably populating thousands of new listings a day. Why would Google want to index all of those?

So if you’re doing business listings, an age-old thing that local SEOs have been doing for a while is create a page on your site that says where you can find us online. Link to those local listings to help Google get that indexed, and it sort of has this boomerang-like effect on your site. So hope that helps. If that’s confusing, I can clarify down below. Just wanted to include it because I think it’s important.

III. Unlinked brand mentions. One of the easiest ways you can get a link is by figuring out who is mentioning your brand or your company and not linking to it. Let’s say this article publishes about how awesome SEO companies are and they mention Moz, and they don’t link to us. That’s an easy way to reach out and say, “Hey, would you mind adding a link? It would be really helpful.”
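Finding those unlinked mentions can be partly automated. A rough sketch using only the Python standard library; the brand name and domain here are placeholders, and in practice you’d feed in pages found via a mention-monitoring tool:

```python
from html.parser import HTMLParser

class MentionScanner(HTMLParser):
    """Collects visible text and link hrefs from an HTML page."""
    def __init__(self):
        super().__init__()
        self.text_parts = []
        self.link_hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            # Record where every anchor on the page points.
            self.link_hrefs.append(dict(attrs).get("href") or "")

    def handle_data(self, data):
        self.text_parts.append(data)

def unlinked_mention(html, brand, domain):
    """True if the page mentions `brand` but never links to `domain`."""
    scanner = MentionScanner()
    scanner.feed(html)
    text = " ".join(scanner.text_parts).lower()
    if brand.lower() not in text:
        return False  # no mention at all, nothing to reclaim
    return not any(domain in href for href in scanner.link_hrefs)
```

Pages where this returns True are your outreach list: they already talk about you, so asking for the link is a small, reasonable request.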

IV. Reclaiming broken links is also a really great way to get back some of your links with little to no effort. What does this mean? It means another site links to a page on your site that now 404s; they were sending people to a specific page that you’ve since deleted or moved elsewhere. Whatever that might be, you want to make sure that you 301 that broken URL to a live page so that it passes the authority elsewhere. Definitely a great thing to do anyway.
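Under the hood, that 301 is just a lookup from the dead URL to its replacement. A minimal sketch in Python; the paths are hypothetical, and in practice you’d populate the map from your 404 reports and backlink data, then express it in your web server’s own redirect syntax:

```python
# Hypothetical old-to-new path map (not real URLs from this episode).
REDIRECT_MAP = {
    "/old-pricing": "/pricing",
    "/blog/2016-guide": "/blog/2020-guide",
}

def resolve(path):
    """Return (status, location): a 301 to the new path if the old
    one was moved, otherwise a plain 200 for the path as-is."""
    if path in REDIRECT_MAP:
        return 301, REDIRECT_MAP[path]
    return 200, path
```

The point is simply that every inbound link to a deleted or moved page should answer with a 301 to a relevant live page, not a 404.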

V. HARO (Help a Reporter Out). Reporters will notify you of any questions or information they’re seeking for an article via this email service. So not only is it just good general PR, but it’s a great opportunity for you to get a link. I like to think of link building as really good PR anyway. It’s like digital PR. So this just takes it to the next level.

VI. Just be awesome. Be cool. Sponsor awesome things. I guarantee any one of you watching likely has incredible local charities or amazing nonprofits in your space that could use the sponsorship, however big or small that might be. But that also gives you an opportunity to get a link. So something to definitely consider.

VII. Ask/Outreach. There’s nothing wrong with asking. There’s nothing wrong with outreach, especially when done well. I know that link building outreach in general kind of gets a bad rap because the response rate is so painfully low. I think, on average, it’s around 4% to 7%, which is painful. But you can get that higher if you’re a little bit more strategic about it or if you outreach to people you already currently know. There’s a ton of resources available to help you do this better, so definitely check those out. We can link to some of those below.

VIII. COBC (create original badass content). We hear lots of people talk about this. When it comes to link building, it’s like, “Link building is dead. Just create great content and people will naturally link to you. It’s brilliant.” It is brilliant, but I also think that there is something to be said about having a healthy mix. There’s this idea of link building and then link earning. But there’s a really perfect sweet spot in the middle where you really do get the most bang for your buck.

The Don’ts

All right. So what not to do. The don’ts of today’s link building world are…

I. Don’t ask for specific anchor text. All of these things appear so spammy. The late Eric Ward talked about this and was a big advocate for never asking for anchor text. He said websites should be linked to however they see fit. That’s going to look more natural. Google is going to consider it to be more organic, and it will help your site in the long run. So that’s more of a suggestion. These other ones are definitely big no-no’s.

II. Don’t buy or sell links that pass PageRank. You can buy or sell links that have a nofollow attached, which signals that the link is paid for, whether it’s an advertisement or you don’t vouch for it. So definitely look into those and understand how that works.

III. Hidden links. We used to do this back in the day, the ridiculous white link on a white background. They were totally hidden, but crawlers would pick them up. Don’t do that. That’s so old and will not work anymore. Google is getting so much smarter at understanding these things.

IV. Low-quality directory links. Same with low-quality directory links. We remember those where it was just loads and loads of links and text and a random auto insurance link in there. You want to steer clear of those.

V. Site-wide links also look very spammy. Site-wide meaning a footer link or a top-level navigation link that appears on every page; you definitely don’t want to go after those. They can appear really, really spammy. Avoid those.

VI. Comment links with over-optimized anchor link text, specifically, you want to avoid. Again, it’s just like any of these others. It looks spammy. It’s not going to help you long-term. Again, what’s the value of that overall? So avoid that.

VII. Abusing guest posts. You definitely don’t want to do this. You don’t want to guest post purely just for a link. However, I am still a huge advocate, as I know many others out there are, of guest posting and providing value. Whether there be a link or not, I think there is still a ton of value in guest posting. So don’t get rid of that altogether, but definitely don’t target it for potential link building opportunities.

VIII. Automated tools used to create links on all sorts of websites. ScrapeBox is an infamous one that would create the comment links on all sorts of blogs. You don’t want to do that.

IX. Link schemes, private link networks, and private blog networks. This is where you really get into trouble as well. Google will penalize or de-index you altogether. It looks so, so spammy, and you want to avoid this.

X. Link exchange. These are the classic “link to me and I’ll link to you” schemes, where back in the day you would submit a website to a link exchange and they wouldn’t grant you that link until you also linked to them. Super silly. This stuff does not work anymore, but there are tons of opportunities and quick wins for you to gain links naturally and more authoritatively.

So hopefully, this helps clear up some of the confusion. One question I would love to ask all of you is: To disavow or to not disavow? I have heard back-and-forth conversations on either side on this. Does the disavow file still work? Does it not? What are your thoughts? Please let me know down below in the comments.

Thank you so much for tuning in to this edition of Whiteboard Friday. I will see you all soon. Thanks.

Video transcription by Speechpad.com


source https://moz.com/blog/link-building-rules


How Low Can #1 Go? (2020 Edition)

Posted by Dr-Pete

Being #1 on Google isn’t what it used to be. Back in 2013, we analyzed 10,000 searches and found out that the average #1 ranking began at 375 pixels (px) down the page. The worst case scenario, a search for “Disney stock,” pushed #1 all the way down to 976px.

A lot has changed in seven years, including an explosion of rich SERP (Search Engine Results Page) features, like Featured Snippets, local packs, and video carousels. It feels like the plight of #1 is only getting worse. So, we decided to run the numbers again (over the same searches) and see if the data matches our perceptions. Is the #1 listing on Google being pushed even farther down the page?

I try to let the numbers speak for themselves, but before we dig into a lot of stats, here’s one that legitimately shocked me. In 2020, over 1,600 (16.6%) of the searches we analyzed had #1 positions that were worse than the worst-case scenario in 2013. Let’s dig into a few of these …

What’s the worst-case for #1?

Data is great, but sometimes it takes the visuals to really understand what’s going on. Here’s our big “winner” for 2020, a search for “lollipop” — the #1 ranking came in at an incredible 2,938px down. I’ve annotated the #1 position, along with the 1,000px and 2,000px marks …

At 2,938px, the 2020 winner comes in at just over three times 2013’s worst-case scenario. You may have noticed that the line is slightly above the organic link. For the sake of consistency and to be able to replicate the data later, we chose to use the HTML/CSS container position. This hits about halfway between the organic link and the URL breadcrumbs (which recently moved above the link). This is a slightly more conservative measure than our 2013 study.

You may also have noticed that this result contains a large-format video result, which really dominates page-one real estate. In fact, five of our top 10 lowest #1 results in 2020 contained large-format videos. Here’s the top contender without a large-format video, coming in at fourth place overall (a search for “vacuum cleaners”) …

Before the traditional #1 organic position, we have shopping results, a research carousel, a local pack, People Also Ask results, and a top products carousel with a massive vertical footprint. This is a relentlessly commercial result. While only a portion of it is direct advertising, most of the focus of the page above the organic results is on people looking to buy a vacuum.

What about the big picture?

It’s easy — and more than a little entertaining — to cherry-pick the worst-case scenarios, so let’s look at the data across all 10,000 results. In 2013, we only looked at the #1 position, but we’ve expanded our analysis in 2020 to consider all page-one organic positions. Here’s the breakdown …

The only direct comparison to 2013 is the position #1 row, and you can see that every metric increased, some substantially. If you look at the maximum Y-position by rank, you’ll notice that it peaks around #7 and then begins to decrease. This is easier to illustrate in a chart …

To understand this phenomenon, you have to realize that certain SERP features, like Top Stories and video carousels, take the place of a page-one organic result. At the same time, those features tend to be longer (vertically) than a typical organic result. So, a page with 10 traditional organic results will in many cases be shorter than a page with multiple rich SERP features.

What’s the worst-case overall?

Let’s dig into that seven-result page-one bucket and look at the worst-case organic position across all of the SERPs in the study, a #7 organic ranking coming in at 4,487px …

Congratulations, you’re finally done scrolling. This SERP has seven traditional organic positions (including one with FAQ links), plus an incredible seven rich features and a full seven ads (three are below the final result). Note that this page shows the older ad and organic design, which Google is still testing, so the position is measured as just above the link.

How much do ads matter?

Since our 2013 study, Google has removed right-hand column ads on desktop (in early 2016) and increased the maximum number of top ads from three to four. One notable point about ads is that they have prime placement, above both organic results and SERP features. So, how does this impact organic Y-positions? Here’s a breakdown …

Not surprisingly, the mean and median increase as ad count increases: on average, the more ads there are, the lower the #1 organic position sits. So why does the maximum Y-position of #1 decrease with ad count? Because SERP features are tied closely to search intent, and results with more ads tend to be more commercial, which naturally rules out other features.

For example, while 1,270 SERPs on February 12 in our 10,000-SERP data set had four ads on top, and 1,584 had featured snippets, only 16 had both (just 1% of SERPs with featured snippets). Featured snippets naturally reflect informational intent (in other words, they provide answers), whereas the presence of four ads signals strong commercial intent.
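The overlap arithmetic here is easy to verify from the counts quoted above:

```python
# Counts quoted in the study (out of 10,000 SERPs on February 12).
with_four_ads = 1_270
with_snippets = 1_584
with_both = 16

# Share of featured-snippet SERPs that also carry four top ads.
share = with_both / with_snippets * 100  # roughly 1%
print(f"{share:.1f}% of featured-snippet SERPs also had four ads on top")
```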

Here’s the worst-case #1 position for a SERP with four ads on top in our data set …

The college results are a fairly rare feature, and local packs often appear on commercial results (as anyone who wants to buy something is looking for a place to buy it). Even with four ads, though, this result comes in significantly higher than our overall worst-case #1 position. While ads certainly push down organic results, they also tend to preclude other rich SERP features.

What about featured snippets?

In early 2014, a year after our original study, Google launched featured snippets, promoted results that combine organic links with answers extracted from featured pages. For example, Google can tell you that I am both a human who works for Moz and a Dr. Pepper knock-off available at Target …

While featured snippets are technically considered organic, they can impact click-through rates (CTR) and the extracted text naturally pushes down the organic link. On the other hand, Featured Snippets tend to appear above other rich SERP features (except for ads, of course). So, what’s the worst-case scenario for a #1 result inside a featured snippet in our data set?

Ads are still pushing this result down, and the bullet list extracted from the page takes up a fair amount of space, but the absence of other SERP features above the featured snippet puts this in a much better position than our overall worst-case scenario. This is an interesting example, as the “According to mashable.com …” text is linked to Mashable (but not considered the #1 result), but the images are all linked to more Google searches.

Overall in our study, the average Y-position of #1 results with featured snippets was 99px lower/worse (704px) than traditional #1 results (605px), suggesting a net disadvantage in most cases. In some cases, multiple SERP features can appear between the featured snippet and the #2 organic result. Here’s an example where the #1 and #2 result are 1,342px apart …
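As a quick sanity check on the averages above (using the two figures quoted from the study):

```python
# Average Y-position of the #1 organic result, from the study's measurements
mean_with_featured_snippet = 704  # px
mean_traditional = 605            # px

penalty = mean_with_featured_snippet - mean_traditional
print(f"Featured-snippet #1 results sit {penalty}px lower on average")
# -> 99px
```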

In cases like this, it’s a strategic advantage to compete for the featured snippet, as there’s likely a substantial drop-off in clicks from #1 to #2. Featured snippets are going to continue to evolve, and examples like this show how critical it is to understand the entire landscape of your search results.

When is #2 not worth it?

Another interesting case that’s evolved quite a bit since 2013 is brand searches, or as Google is more likely to call them, “dominant intent” searches. Here’s a SERP for the company Mattress Firm …

While the #1 result has solid placement, the #2 result is pushed all the way down to 2,848px. Note that the #1 position has a search box plus six full site-links below it, taking up a massive amount of real estate. Even the brand’s ad has site-links. Below #1 is a local pack, People Also Ask results, Twitter results from the brand’s account, heavily branded image results, and then a product refinement carousel (which leads to more Google searches).

There are only five total, traditional organic results on this page, and they’re made up of the company’s website, the company’s Facebook page, the company’s YouTube channel, a Wikipedia page about the company, and a news article about the company’s 2018 bankruptcy filing.

This isn’t just about vertical position — unless you’re Mattress Firm, trying to compete on this search really doesn’t make much sense. They essentially own page one, and this is a situation we’re seeing more and more frequently for searches with clear dominant intent (i.e. most searchers are looking for a specific entity).

What’s a search marketer to do?

Search is changing, and change can certainly be scary. There’s no question that the SERP of 2020 is very different in some ways than the SERP of 2013, and traditional organic results are just one piece of a much larger picture. Realistically, as search marketers, we have to adapt — either that, or find a new career. I hear alpaca farming is nice.

I think there are three critical things to remember. First, the lion’s share of search traffic still comes from traditional organic results. Second, many rich features are really the evolution of vertical results, like news, videos, and images, that still have an organic component. In other words, these are results that we can potentially create content for and rank in, even if they’re not the ten blue links we traditionally think of as organic search.

Finally, it’s important to realize that many SERP features are driven by searcher intent and we need to target intent more strategically. Take the branded example above — it may be depressing that the #2 organic result is pushed down so far, but ask yourself a simple question. What’s the value of ranking for “mattress firm” if you’re not Mattress Firm? Even if you’re a direct competitor, you’re flying in the face of searchers with a very clear brand intent. Your effort is better spent on product searches, consumer questions, and other searches likely to support your own brand and sales.

If you’re the 11th person in line at the grocery checkout and the line next to you has no people, do you stand around complaining about how person #2, #7, and #9 aren’t as deserving of groceries as you are? No, you change lines. If you’re being pushed too far down the results, maybe it’s time to seek out different results where your goals and searcher goals are better aligned.

Brief notes on methodology

Not to get too deep in the weeds, but a couple of notes on our methodology. These results were based on a fixed set of 10,000 keywords that we track daily as part of the MozCast research project. All of the data in this study is based on page-one, Google.com, US, desktop results. While the keywords in this data set are distributed across a wide range of topics and industries, the set skews toward more competitive “head” terms. All of the data and images in this post were captured on February 12, 2020. Ironically, this blog post is over 26,000 pixels long. If you’re still reading, thank you, and may God have mercy on your soul.


source https://moz.com/blog/how-low-can-number-one-go-2020
