Tuesday, October 31, 2017

Unfiltered: How to Show Up in Local Search Results

Posted by sherrybonelli

If you're having trouble getting your local business' website to show up in the Google local 3-pack or local search results in general, you're not alone. The first page of Google's search results seems to have gotten smaller over the years – the top and bottom of the page are often filled with ads, the local 7-pack was trimmed to a slim 3-pack, and online directories often take up the rest of page one. There is very little room for small local businesses to rank on the first page of Google.

To make matters worse, Google has a local "filter" that can strike a business, causing their listing to drop out of local search results for seemingly no reason – often, literally, overnight. Google's local filter has been around for a while, but it became more noticeable after the Possum algorithm update, which began filtering out even more businesses from local search results.

If you think about it, this filter is not much different than websites ranking organically in search results: In an ideal world, the best sites win the top spots. However, the Google filter can have a significantly negative impact on local businesses that often rely on showing up in local search results to get customers to their doors.

What causes a business to get filtered?

Just like the multitude of factors that go into ranking high organically, there are a variety of factors that go into ranking in the local 3-pack and the Local Finder.

[Image: Google Possum algorithm update]

Here are a few situations that might cause you to get filtered and what you can do if that happens.

Proximity matters

With mobile search becoming more and more popular, Google takes into consideration where the mobile searcher is physically located when they're performing a search. This means that local search results can also depend on where the business is physically located when the search is being done.

A few years ago, if your business wasn't located in the large city in your area, you were at a significant disadvantage. It was difficult to rank when someone searched for "business category + large city" – simply because your business wasn't physically located in the "large city." Things have changed slightly in your favor – which is great for all the businesses who have a physical address in the suburbs.

According to Ben Fisher, Co-Founder of SteadyDemand.com and a Google Top Contributor, "Proximity and Google My Business data play an important role in the Possum filter. Before the Hawk Update, this was exaggerated and now the radius has been greatly reduced." This means there's hope for you to show up in the local search results – even if your business isn't located in a big city.

Google My Business categories

When you're selecting a Google My Business category for your listing, select the most specific category that's appropriate for your business.

However, if you see a competitor is outranking you, find out what category they are using and select the same category for your business (but only if it makes sense). Then look at all the other things they are doing online to increase their organic ranking and emulate and outdo them.

If your category selections don't work, it's possible you've selected too many categories. Too many categories can confuse Google to the point where it's not sure what your company's specialty is. Try deleting some of the less-specific categories and see if that helps you show up.

Your physical address

If you can help it, don't have the same physical address as your competitors. Yes, this means if you're located in an office building (or worse, a "virtual office" or a UPS Store address) and competing companies are also in your building, your listing may not show up in local search results.

When it comes to sharing an address with a competitor, Ben Fisher recommends, "Ensure that you do not have the same primary category as your competitor if you are in the same building. Their listing may have more trust by Google and you would have a higher chance of being filtered."

Also, many people think that simply adding a suite number to your address will differentiate your address enough from a competitor at the same location — it won't. This is one of the biggest myths in local SEO. According to Fisher, "Google doesn't factor in suite numbers."

Additionally, if competing businesses are located physically close to you, that, too, can impact whether you show up in local search results. So if you have a competitor a block or two down from your company, that can lead to one of you being filtered.

Practitioners

If you're a doctor, attorney, accountant or are in some other industry with multiple professionals working in the same office location, Google may filter out some of your practitioners' listings. Why? Google doesn't want one business dominating the first page of Google local search results. This means that all of the practitioners in your company are essentially competing with one another.

To offset this, each practitioner's Google My Business listing should have a different category (if possible) and should be directed to different URLs (either a page about the practitioner or a page about the specialty – they should not all point to the site's home page).

For instance, at a medical practice, one doctor could select the family practice category and another the pediatrician category. Ideally you would want to change those doctors' landing pages to reflect those categories, too:

Doctorsoffice.com/dr-mathew-family-practice
Doctorsoffice.com/dr-smith-pediatrician

Another thing you can do to differentiate the practitioners and help curtail being filtered is to have unique local phone numbers for each of them.

Evaluate what your competitors are doing right

If your listing is getting filtered out, look at the businesses that are being displayed and see what they're doing right on Google Maps, Google+, Google My Business, on-site, off-site, and in any other areas you can think of. If possible, do an SEO site audit on their site to see what they're doing right that perhaps you should do to overtake them in the rankings.

When you're evaluating your competition, make sure you focus on the signals that help sites rank organically. Do they have a better Google+ description? Is their GMB listing completely filled out but yours is missing some information? Do they have more 5-star reviews? Do they have more backlinks? What is their business category? Start doing what they're doing – only better.

In general Google wants to show the best businesses first. Compete toe-to-toe with the competitors that are ranking higher than you with the goal of eventually taking over their highly-coveted spot.

Other factors that can help you show up in local search results

As mentioned earlier, Google considers a variety of data points when it determines which local listings to display in search results and which ones to filter out. Here are a few other signals to pay attention to when optimizing for local search results:

Reviews

If everything else is equal, do you have more 5-star reviews than your competition? If so, you will probably show up in the local search results instead of your competitors. Google is one of the few review sites that encourages businesses to proactively ask customers to leave reviews. Take that as a clue to ask customers to give you great reviews not only on your Google My Business listing but also on third-party review sites like Facebook, Yelp, and others.

Posts

Are you interacting with your visitors by offering something special to those who see your business listing? Engaging with your potential customers by creating a Post lets Google know that you are paying attention and giving its users a special deal. Having more "transactions and interactions" with your potential customers is a good metric and can help you show up in local search results.

Google+

Despite what the critics say, Google+ is not dead. Whenever you make a Facebook or Twitter post, go ahead and post to Google+, too. Write semantic posts that are relevant to your business and relevant to your potential customers. Try to write Google+ posts that are approximately 300 words in length and be sure to keyword optimize the first 100 words of each post. You can often see some minor increases in rankings due to well-optimized Google+ posts, properly optimized Collections, and an engaged audience.

Here's one important thing to keep in mind: Google+ is not the place to post content just to try and rank higher in local search. (That's called spam and that is a no-no.) Ensure that any post you make to Google+ is valuable to your end-users.

Keep your Google My Business listing current

Adding photos, updating your business hours for holidays, utilizing the Q&A or booking features, etc. can help you show up in rankings. However, don't add content just to try and rank higher. (Your Google My Business listing is not the place for spammy content.) Make sure the content you add to your GMB listing is both timely and high-quality. By updating and adding content, Google knows that your information is likely accurate and that your business is engaged. Speaking of which...

Be engaged

Interacting with your customers online is not only beneficial for customer relations, but it can also be a signal to Google that can positively impact your local search ranking results. David Mihm, founder of Tidings, feels that by 2020, the difference-making local ranking factor will be engagement.

[Image: engagement as a local ranking factor (Source: The Difference-Making Local Ranking Factor of 2020)]

According to Mihm, "Engagement is simply a much more accurate signal of the quality of local businesses than the traditional ranking factors of links, directory citations, and even reviews." This means you need to start preparing now: interact with potential customers through GMB's Q&A and booking features, instant messaging, and Google+ posts; respond to Google and third-party reviews; and make sure your website's phone number is "click-to-call" enabled.

Consolidate any duplicate listings

Some business owners go overboard and create multiple Google My Business listings with the thought that more has to be better. This is one instance where having more can actually hurt you. If you discover that for whatever reason your business has more than one GMB listing, it's important that you properly consolidate your listings into one.

Other sources linking to your website

If verified data sources, like the Better Business Bureau, professional organizations and associations, chambers of commerce, online directories, etc., link to your website, that can have an impact on whether or not you show up on Google's radar. Make sure that your business is listed on as many high-quality and authoritative online directories as possible, and ensure that the information about your business, especially your company's Name, Address, and Phone Number (NAP), is consistent and accurate.
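If you manage more than a handful of citations, checking NAP consistency by hand gets tedious. Here's a minimal sketch of how you might normalize and compare listings programmatically (the business name, listing data, and abbreviation rules below are made up purely for illustration):

```python
import re

# Hypothetical NAP records pulled from different directories for the same business.
listings = {
    "google_my_business": ("Bonelli's Bakery", "123 N. Main St., Suite 4", "(555) 010-2345"),
    "yelp":               ("Bonellis Bakery",  "123 North Main Street #4", "(555) 010-9998"),
    "bbb":                ("Bonelli's Bakery", "123 N Main St Ste 4",      "+1 555 010 2345"),
}

def normalize(name, address, phone):
    """Reduce each NAP field to a rough canonical form for comparison."""
    name = re.sub(r"[^a-z0-9 ]", "", name.lower())
    address = address.lower()
    # Collapse a few common abbreviations so trivial differences don't get flagged.
    for long_form, short_form in [("north", "n"), ("street", "st"), ("suite", "ste"), ("#", "ste ")]:
        address = address.replace(long_form, short_form)
    address = re.sub(r"[^a-z0-9 ]", "", address)
    address = re.sub(r"\s+", " ", address).strip()
    phone = re.sub(r"\D", "", phone)[-10:]  # keep the last 10 digits
    return name, address, phone

canonical = {source: normalize(*nap) for source, nap in listings.items()}
reference = canonical["google_my_business"]
for source, nap in canonical.items():
    for field, ref_val, val in zip(("name", "address", "phone"), reference, nap):
        if val != ref_val:
            print(f"{source}: {field} differs -> {val!r} (GMB has {ref_val!r})")
```

Running this against the sample data flags only the mismatched Yelp phone number, which is exactly the kind of quiet inconsistency that erodes trust in your NAP data.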

So there you have it! Hopefully you found some ideas on what to do if your listing is being filtered out of Google's local results.

What are some tips that you have for keeping your business "unfiltered"?


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



from Moz Blog https://moz.com/blog/unfiltered-local-search-results
via IFTTT

Friday, October 27, 2017

How to Use the "Keywords by Site" Data in Tools (Moz, SEMrush, Ahrefs, etc.) to Improve Your Keyword Research and Targeting - Whiteboard Friday

Posted by randfish

One of the most helpful functions of modern-day SEO software is the idea of a "keyword universe," a database of tens of millions of keywords that you can tap into and discover what your site is ranking for. Rankings data like this can be powerful, and having that kind of power at your fingertips can be intimidating. In today's Whiteboard Friday, Rand explains the concept of the "keyword universe" and shares his most useful tips to take advantage of this data in the most popular SEO tools.

How to use keywords by site



Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about the Keywords by Site feature that exists now in Moz's toolset — we just launched it this week — and SEMrush and Ahrefs, who have had it for a little while, and there are some other tools out there that also do it, so places like KeyCompete and SpyFu and others.

In SEO software, there are two types of rankings data:

A) Keywords you've specifically chosen to track over time

Basically, the way you can think of this is, in SEO software, there are two kinds of keyword rankings data. There are keywords that you have specifically selected or your marketing manager or your SEO has specifically selected to track over time. So I've said I want to track X, Y and Z. I want to see how they rank in Google's results, maybe in a particular location or a particular country. I want to see the position, and I want to see the change over time. Great, that's your set that you've constructed and built and chosen.

B) A keyword "universe" that gives wide coverage of tens of millions of keywords

But then there's what's called a keyword universe, an entire universe of keywords that's maintained by a tool provider. So SEMrush has their particular database, their universe of keywords for a bunch of different languages, and Ahrefs has their keyword universe of keywords that each of those two companies have selected. Moz now has its keyword universe, a universe of, I think in our case, about 40 million keywords in English in the US that we track every two weeks, so we'll basically get rankings updates. SEMrush tracks their keywords monthly. I think Ahrefs also does monthly.

Depending on the degree of change, you might care or not care about the various updates. Usually, for keywords you've specifically chosen, it's every week. But in these cases, because it's tens of millions or hundreds of millions of keywords, they're usually tracking them weekly or monthly.

So in this universe of keywords, you might only rank for some of them. It's not ones you've specifically selected. It's ones the tool provider has said, "Hey, this is a broad representation of all the keywords that we could find that have some real search volume that people might be interested in who's ranking in Google, and we're going to track this giant database." So you might see some of these your site ranks for. In this case, seven of these keywords your site ranks for, four of them your competitors rank for, and two of them both you and your competitors rank for.

Remarkable data can be extracted from a "keyword universe"

There's a bunch of cool data, very, very cool data that can be extracted from a keyword universe. Most of these tools that I mentioned do this.

Number of ranking keywords over time

So they'll show you how many keywords a given site ranks for over time. So you can see, oh, Moz.com is growing its presence in the keyword universe, or it's shrinking. Maybe it's ranking for fewer keywords this month than it was last month, which might be a telltale sign of something going wrong or poorly.

Degree of rankings overlap

You can see the degree of overlap between several websites' keyword rankings. So, for example, I can see here that Moz and Search Engine Land overlap here with all these keywords. In fact, in the Keywords by Site tool inside Moz and in SEMrush, you can see what those numbers look like. I think Moz actually visualizes it with a Venn diagram. Here's Distilled.net. They're a smaller website. They have less content. So it's no surprise that they overlap with both. There's some overlap with all three. I could see keywords that all three of them rank for, and I could see ones that only Distilled.net ranks for.

Estimated traffic from organic search

You can also grab estimated traffic. So you would be able to extract out — Moz does not offer this, but SEMrush does — you could see, given a keyword list and ranking positions and an estimated volume and estimated click-through rate, you could say we're going to guess, we're going to estimate that this site gets this much traffic from search. You can see lots of folks doing this and showing, "Hey, it looks like this site is growing its visits from search and this site is not." SISTRIX does this in Europe really nicely, and they have some great blog posts about it.
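If you want to gut-check those third-party estimates, the arithmetic behind them is simple enough to sketch yourself: take each keyword's monthly volume, multiply it by an assumed click-through rate for its ranking position, and sum across every ranking keyword. The CTR curve and keyword data below are made-up placeholders, not the actual model used by Moz, SEMrush, or SISTRIX:

```python
# Assumed average organic CTR by ranking position (illustrative numbers only).
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

# (keyword, monthly search volume, ranking position): hypothetical rankings data.
rankings = [
    ("vegetarian recipes", 40000, 4),
    ("easy tofu recipes", 6500, 1),
    ("lentil soup", 12000, 9),
]

def estimated_monthly_traffic(rankings):
    """Sum volume * assumed CTR for each ranking keyword; positions past 10 get ~0."""
    return sum(volume * CTR_BY_POSITION.get(position, 0.01)
               for _, volume, position in rankings)

print(f"Estimated organic visits/month: {estimated_monthly_traffic(rankings):,.0f}")
```

The point isn't precision; it's that these traffic numbers are estimates built from volume and CTR assumptions, which is why different tools can disagree on the same site.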

Most prominent sites for a given set of keywords

You can also extract out the most prominent sites given a set of keywords. So if you say, "Hey, here are a thousand keywords. Tell me who shows up most in this thousand-keyword set around the world of vegetarian recipes." The tool could extract out, "Okay, here's the small segment. Here's the galaxy of vegetarian recipe keywords in our giant keyword universe, and this is the set of sites that are most prominent in that particular vertical, in that little galaxy."

Recommended applications for SEOs and marketers

So some recommended applications, things that I think every SEO should probably be doing with this data. There are many, many more. I'm sure we can talk about them in the comments.

1. Identify important keywords by seeing what you rank for in the keyword universe

First and foremost, identify keywords that you probably should be tracking, that should be part of your reporting. It will make you look good, and it will also help you keep tabs on important keywords where if you lost rankings for them, you might cost yourself a lot of traffic.

Monthly granularity might not be good enough. You might want to say, "Hey, no, I want to track these keywords every week. I want to get reporting on them. I want to see which page is ranking. I want to see how I rank by geo. So I'm going to include them in my specific rank tracking features." To do that with Moz's Keywords by Site, you'd go to Keyword Explorer, select the root domain instead of a keyword, and plug in your website, which maybe is Indie Hackers, a site that I've been reading a lot of lately and like a lot.

You could see, "Oh, cool. I'm not tracking stock trading bot or ark servers, but those actually get some nice traffic. In this case, I'm ranking number 12. That's real close to page one. If I put in a little more effort on my ark servers page, maybe I could be on page one and I could be getting some of that sweet traffic, 4,000 to 6,000 searches a month. That's really significant." So great way to find additional keywords you should be adding to your tracking.

2. Discover potential keyword targets that your competitors rank for (but you don't)

Second, you can discover some new potential keyword targets when you're doing keyword research based on the queries your competition ranks for that you don't. So, in this case, I might plug in "First Round." First Round Capital has a great content play that they've been doing for many years. Indie Hackers might say, "Gosh, there's a lot of stuff that startups and tech founders are interested in that First Round writes about. Let me see what keywords they're ranking for that I'm not ranking for."

So you plug in those two to Moz's tool or other tools. You could see, "Aha, I'm right. Look at that. They're ranking for about 4,500 more keywords than I am." Then I could go get that full list, and I could sort it by volume and by difficulty. Then I could choose, okay, these keywords all look good, check, check, check. Add them to my list in Keyword Explorer or Excel or Google Docs if you're using those and go to work.
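If you'd rather slice the exported lists in code than in a spreadsheet, the filtering and sorting step might look roughly like this. The file names and column names are assumptions about what a generic ranking-keywords export could contain, not the exact CSV format of any particular tool:

```python
import pandas as pd

# Hypothetical CSV exports of ranking keywords for your site and a competitor,
# each with columns: Keyword, Volume, Difficulty, Position.
mine = pd.read_csv("indiehackers_keywords.csv")
theirs = pd.read_csv("firstround_keywords.csv")

# Keywords the competitor ranks for that we don't: the gap.
gap = theirs[~theirs["Keyword"].isin(mine["Keyword"])]

# Prioritize: highest volume first, easier keywords first, and drop anything
# the competitor itself only ranks for on page 3+ (a weaker opportunity signal).
targets = (gap[gap["Position"] <= 20]
           .sort_values(["Volume", "Difficulty"], ascending=[False, True])
           .head(50))

targets.to_csv("keyword_targets.csv", index=False)
print(targets[["Keyword", "Volume", "Difficulty", "Position"]].head(10))
```

The resulting shortlist is the same thing you'd build by hand in Keyword Explorer or Excel, just reproducible whenever the exports are refreshed.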

3. Explore keyword sets from large, content-focused media sites with similar audiences

Then the third one is to explore keyword sets, and I'm going to urge you to do this. I don't think it's something that many people do, but it really should be: look outside of your little galaxy of yourself and your competitors, direct competitors, to large content players that serve your audience.

So in this case, I might say, "Gosh, I'm Indie Hackers. I'm really competing maybe more directly with First Round. But you know what? HBR, Harvard Business Review, writes about a lot of stuff that my audience reads. I see people on Twitter that are in my audience share it a lot. I see people in our forums discussing it and linking out to their articles. Let me go see what they are doing in the content world."

In fact, when you look at the Venn diagram, which I just did in the Keywords by Site tool, I can see, "Oh my god, look there's almost no overlap, and there's this huge opportunity." So I might take HBR and I might click to see all their keywords and then start looking through and sort, again, probably by volume and maybe with a difficulty filter and say, "Which ones do I think I could create content around? Which ones do they have really old content that they haven't updated since 2010 or 2011?" Those types of content opportunities can be a golden chance for you to find an audience that is likely to be the right types of customers for your business. That's a pretty exciting thing.

So, in addition to these, there's a ton of other uses. I'm sure over the next few months we'll be talking more about them here on Whiteboard Friday and here on the Moz blog. But for now, I would love to hear your uses for tools like SEMrush and the Ahrefs keyword universe feature and Moz's keyword universe feature, which is called Keywords by Site. Hopefully, we'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com





from Moz Blog https://moz.com/blog/improve-keyword-research-targeting
via IFTTT

Wednesday, October 25, 2017

How to Do a Competitor Analysis for SEO

Posted by John.Reinesch

Competitive analysis is a key aspect of the beginning stages of an SEO campaign. Far too often, I see organizations skip this important step and get right into keyword mapping, optimizing content, or link building. But understanding who our competitors are and seeing where they stand can lead to a far more comprehensive understanding of what our goals should be and reveal gaps or blind spots.

By the end of this analysis, you will understand who is winning organic visibility in the industry, what keywords are valuable, and which backlink strategies are working best, all of which can then be utilized to gain and grow your own site’s organic traffic.

Why competitive analysis is important

SEO competitive analysis is critical because it gives data about which tactics are working in the industry we are in and what we will need to do to start improving our keyword rankings. The insights gained from this analysis help us understand which tasks we should prioritize and shape the way we build out our campaigns. By seeing where our competitors are strongest and weakest, we can determine how difficult it will be to outperform them and the amount of resources that it will take to do so.

Identify your competitors

The first step in this process is determining the top four competitors we want to use for this analysis. I like to use a mixture of direct business competitors (typically provided by my clients) and online search competitors, which can differ from whom a business identifies as their main competitors. Usually, this discrepancy is due to local business competitors versus those who are paying for online search ads. While your client may be concerned about the similar business down the street, their actual online competitor may be a business from a neighboring town or another state.

To find search competitors, I simply enter my own domain name into SEMrush, scroll down to the “Organic Competitors” section, and click “View Full Report.”

The main metrics I use to help me choose competitors are common keywords and total traffic. Once I've chosen my competitors for analysis, I open up the Google Sheets Competitor Analysis Template to the “Audit Data” tab and fill in the names and URLs of my competitors in rows 2 and 3.

Use the Google Sheets Competitor Analysis Template

A clear, defined process is critical not only for getting repeated results, but to scale efforts as you start doing this for multiple clients. We created our Competitor Analysis Template so that we can follow a strategic process and focus more on analyzing the results rather than figuring out what to look for anew each time.

In the Google Sheets Template, I've provided you with the data points that we'll be collecting, the tools you'll need to do so, and then bucketed the metrics based on similar themes. The data we're trying to collect relates to SEO metrics like domain authority, how much traffic the competition is getting, which keywords are driving that traffic, and the depth of competitors’ backlink profiles. I have built in a few heatmaps for key metrics to help you visualize who's the strongest at a glance.

This template is meant to serve as a base that you can alter depending on your client’s specific needs and which metrics you feel are the most actionable or relevant.

Backlink gap analysis

A backlink gap analysis aims to tell us which websites are linking to our competitors, but not to us. This is vital data because it allows us to close the gap between our competitors’ backlink profiles and start boosting our own ranking authority by getting links from websites that already link to competitors. Websites that link to multiple competitors (especially when it is more than three competitors) have a much higher success rate for us when we start reaching out to them and creating content for guest posts.

In order to generate this report, you need to head over to the Moz Open Site Explorer tool and input the first competitor’s domain name. Next, click “Linking Domains” on the left side navigation and then click “Request CSV” to get the needed data.

Next, head to the SEO Competitor Analysis Template, select the “Backlink Import - Competitor 1” tab, and paste in the content of the CSV file. It should look like this:

Repeat this process for competitors 2–4 and then for your own website in the corresponding tabs marked in red.

Once you have all your data in the correct import tabs, the “Backlink Gap Analysis” report tab will populate. The result is a highly actionable report that shows where your competitors are getting their backlinks from, which ones they share in common, and which ones you don’t currently have.

It’s also a good practice to hide all of the “Import” tabs marked in red after you paste the data into them, so the final report has a cleaner look. To do this, just right-click on the tabs and select “Hide Sheet,” so the report only shows the tabs marked in blue and green.

For our clients, we typically gain a few backlinks at the beginning of an SEO campaign just from this data alone. It also serves as a long-term guide for link building in the months to come as getting links from high-authority sites takes time and resources. The main benefit is that we have a starting point full of low-hanging fruit from which to base our initial outreach.
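If the spreadsheet gets unwieldy as the number of competitors grows, the same gap logic is easy to reproduce with a short script. This is only a sketch: the file names and the "Root Domain" column are placeholders standing in for whatever your linking-domains exports actually contain.

```python
import pandas as pd

# Placeholder file names for linking-domain exports; assumes a "Root Domain" column.
competitor_files = ["competitor1_links.csv", "competitor2_links.csv",
                    "competitor3_links.csv", "competitor4_links.csv"]
own_file = "my_site_links.csv"

def linking_domains(path):
    return set(pd.read_csv(path)["Root Domain"].str.lower())

ours = linking_domains(own_file)
competitor_domains = [linking_domains(path) for path in competitor_files]

# For every domain that links to at least one competitor, count how many
# competitors it links to, then keep only the ones that don't link to us.
counts = {}
for domains in competitor_domains:
    for domain in domains:
        counts[domain] = counts.get(domain, 0) + 1

gap = sorted(((n, d) for d, n in counts.items() if d not in ours), reverse=True)

# Domains linking to three or more competitors but not to us are the warmest outreach targets.
for n, domain in gap[:25]:
    print(f"{domain} links to {n} competitor(s)")
```

Sorting by how many competitors a domain links to mirrors the point above: sites that already link to most of the field are the most likely to say yes to you, too.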

Keyword gap analysis

Keyword gap analysis is the process of determining which keywords your competitors rank well for that your own website does not. From there, we reverse-engineer why the competition is ranking well and then look at how we can also rank for those keywords. Often, it could be reworking metadata, adjusting site architecture, revamping an existing piece of content, creating a brand-new piece of content specific to a theme of keywords, or building links to your content containing these desirable keywords.

To create this report, a similar process as the backlink gap analysis one is followed; only the data source changes. Go to SEMrush again and input your first competitor’s domain name. Then, click on the “Organic Research” positions report in the left-side navigation menu and click on "Export" on the right.

Once you download the CSV file, paste the content into the “Keyword Import - Competitor 1” tab and then repeat the process for competitors 2–4 and your own website.

The final report will now populate on the “Keyword Gap Analysis” tab marked in green. It should look like the one below:

This data gives us a starting point to build out complex keyword mapping strategy documents that set the tone for our client campaigns. Rather than just starting keyword research by guessing what we think is relevant, we have hundreds of keywords to start with that we know are relevant to the industry. Our keyword research process then aims to dive deeper into these topics to determine the type of content needed to rank well.

This report also helps drive our editorial calendar, since we often find keywords and topics where we need to create new content to compete with our competitors. We take this a step further during our content planning process, analyzing the content the competitors have created that is already ranking well and using that as a base to figure out how we can do it better. We try to take some of the best ideas from all of the competitors ranking well to then make a more complete resource on the topic.

Using key insights from the audit to drive your SEO strategy

It is critically important to not just create this report, but also to start taking action based on the data that you have collected. On the first tab of the spreadsheet template, we write in insights from our analysis and then use those insights to drive our campaign strategy.

Some examples of typical insights from this document would be the average number of referring domains that our competitors have and how that relates to our own backlink profile. If we are ahead of our competitors regarding backlinks, content creation might be the focal point of the campaign. If we are behind our competitors in regard to backlinks, we know that we need to start a link building campaign as soon as possible.

Another insight we gain is which competitors are most aggressive in PPC and which keywords they are bidding on. Often, the keywords that they are bidding on have high commercial intent and would be great keywords to target organically and provide a lift to our conversions.

Start implementing competitive analyses into your workflow

Competitive analyses for SEO are not something that should be overlooked when planning a digital marketing strategy. This process can help you strategically build unique and complex SEO campaigns based on readily available data and the demand of your market. This analysis will instantly put you ahead of competitors who are following cookie-cutter SEO programs and not diving deep into their industry. Start implementing this process as soon as you can and adjust it based on what is important to your own business or client’s business.

Don’t forget to make a copy of the spreadsheet template here:

Get the Competitive Analysis Template





from Moz Blog https://moz.com/blog/competitor-analysis-for-seo
via IFTTT

Tuesday, October 24, 2017

Tangential Content Earns More Links and Social Shares in Boring Industries [New Research]

Posted by kerryjones

Many companies still don’t see the benefit of creating content that isn’t directly about their products or brand. But unless you have a universally interesting brand, you’ll be hard-pressed to attract much of an audience if all you do is publish brand-centric content.

Content marketing is meant to solve this dilemma. By offering genuinely useful content to your target customers, rather than selling to them, you earn their attention and over time gain their trust.

And yet, I find myself explaining the value of non-branded content all too often. I frequently hear grumblings from fellow marketers that clients and bosses refuse to stray from sales-focused content. I see companies publishing what are essentially advertorials and calling it content marketing.

In addition to turning off customers, branded content can be extremely challenging for building links or earning PR mentions. If you’ve ever done outreach for branded content, you’ve probably gotten a lot of pushback from the editors and writers you’ve pitched. Why? Most publishers bristle at content that feels like a brand endorsement pretending not to be a brand endorsement (and expect you to pay big bucks for a sponsored content or native advertising spot).

Fortunately, there’s a type of content that can earn your target customers’ attention, build high-quality links, and increase brand awareness...

Tangential content: The cure for a boring niche

At Fractl, we refer to content on a topic that’s related to (but not directly about) the brand that created it as "tangential content."

Some hypothetical examples of tangential content would be:

  • A pool installation company creating content about summer safety tips and barbeque recipes.
  • A luggage retailer publishing country-specific travel guides.
  • An auto insurance broker offering car maintenance advice.

While there’s a time for branded content further down the sales funnel, tangential content might be right for you if you want to:

  1. Reach a wide audience and gain top-of-funnel awareness. Not a lot of raving fans in your “boring” brand niche? Tangential topics can get you in front of the masses.
  2. Target a greater number of publishers during outreach to increase your link building and PR mention potential. Tangential topics work well for outreach because you can expand your pool of publishers (larger niches vs. a small niche with only a few dedicated sites).
  3. Create more emotional content that resonates with your audience. In an analysis of more than 300 client campaigns, we found the content that received more than 200 media mentions was more likely than low-performing campaigns to have a strong emotional hook. If your brand niche doesn’t naturally tug on the heartstrings, tangential content is one way to create an emotional reaction.
  4. Build a more diverse content library and not be limited to creating content around one topic. If you’ve maxed out on publishing content about your niche, broadening your content repertoire to tangential topics can reinvigorate your content strategy (and your motivation).

Comparison of tangential vs. on-brand content performance

In our experience at Fractl, tangential content has been highly effective for link building campaigns, especially in narrow client niches that lack broad appeal. While we’ve assumed this is true based on our observations, we now have the data to back up our assumption.

We recently categorized 835 Fractl client campaigns as either “tangential” or “on-brand,” then compared the average number of pickups (links and press mentions) and number of social shares for each group. Our hunch was right: The tangential campaigns earned 30% more media mentions and 77% more social shares on average than the brand-focused campaigns.

So what exactly does a tangential campaign look like? Below are some real examples of our client campaigns that illustrate how tangential topics can yield stellar results.

Most Hateful/Most Politically Correct Places

  • Client niche: Apartment listing site
  • Campaign topic: Which states and cities use the most prejudiced/racist language based on geo-tagged Twitter data
  • Results: 67,000+ social shares and 620 media pickups, including features on CNET, Slate, Business Insider, AOL, Yahoo, Mic, The Daily Beast, and Adweek

Why it worked

After a string of on-brand campaigns for this client yielded average results, we knew capitalizing on a hot-button, current issue would attract tons of attention. This topic still ties back into the client’s main objective of helping people find a home since the community and location of that home are important factors in one’s decisions. Check out the full case study of this campaign for more insights into why it was successful.

Most Instagrammed Locations

  • Client niche: Bus fare comparison and booking tool
  • Campaign topic: Points of interest where people post the most Instagram photos in North America
  • Results: 40,000+ social shares and more than 300 pickups, including TIME, NBC News, Business Insider, Today, Yahoo!, AOL, Fast Company, and The Daily Mail

Why it worked

Our client’s niche, bus travel, had a limited audience, so we chose a topic that was of interest to anyone who enjoys traveling, regardless of the mode of transportation they use to get there. By incorporating data from a popular social network and using an idea with a strong geographic focus, we could target a lot of different groups — the campaign appealed to travel enthusiasts, Instagram users, and regional and city news outlets (including TV stations). For more details about our thought process behind this idea, see the campaign case study.

Most Attractive NFL Players and Teams


  • Client niche: Sports apparel retailer
  • Campaign topic: Survey that rates the most attractive NFL players
  • Results: 45,000+ social shares and 247 media pickups, including CBS Sports, USA Today, Fox Sports, and NFL.com

Why it worked

Since diehard fans want to show off that their favorite player is the best, even if it’s just in the looks department, we were confident this lighthearted campaign would pique fan interest. But fans weren’t the only ones hitting the share button — the campaign also grabbed the attention of the featured teams and players, with many sharing on their social media profiles, which helped drive exposure.

On-brand content works best in certain verticals

Tangential content isn’t always necessary for earning top-of-funnel awareness. So, how do you know if your brand-centric topics will garner lots of interest? A few things to consider:

  • Is your brand topic interesting or useful to the general population?
  • Are there multiple publishers that specifically cover your niche? Do these publishers have large readerships?
  • Are you already publishing on-brand content that is achieving your goals/expectations?

We’ve seen several industry verticals perform very well using branded content. When we broke down our campaign data by vertical, we found our top performing on-brand campaign topics were technology, drugs and alcohol, and marketing.

Some examples of our successful on-brand campaign topics include:

  • “Growth of SaaS” for a B2B software comparison website
  • “Influencers on Instagram” for an influencer marketplace
  • “Global Drug Treatment Trends” for an addiction recovery client
  • “The Tech Job Network” for a tech career website

Coming up with tangential content ideas

Once you free yourself from only brainstorming brand-centric ideas, you might find it easy to dream up tangential concepts. If you need a little help, here are a few tips to get you started:

Review your buyer personas.

In order to know which tangential topics to choose, you need to understand your target audience’s interests and where your niche intersects with those interests. The best way to find this information? Buyer personas. If you don’t already have detailed buyer personas built out, Mike King’s epic Moz post from a few years ago remains the bible on personas in my opinion.

Find topics your audience cares about with Facebook Audience Insights.

Using its arsenal of user data, this Facebook ads tool gives you a peek into the interests and lifestyles of your target audience. These insights can supplement and inform your buyer personas. See the incredibly actionable post “How to Create Buyer Personas on a Budget Using Facebook Audience Insights” for more help with leveraging this tool.

Consider how trending news topics are tangential to your brand.

Pay attention to themes that keep popping up in the news and how your brand relates back to these stories (this is how the most racist/bigoted states and cities campaign I mentioned earlier in this post came to be). Also anticipate seasonal or event-based topics that are tangential to your brand. For example, a tire manufacturer may want to create content on protecting your car from flooding and storm damage during hurricane season.

Test tangential concepts on social media.

Not sure if a tangential topic will go over well? Before moving forward with a big content initiative, test it out by sharing content related to the topic on your brand’s social media accounts. Does it get a good reaction? Pro tip: spend a little bit of money promoting these as sponsored posts to ensure they get in front of your followers.

Have you had success creating content outside of your brand niche? I'd love to hear about your tangential content examples and the results you achieved. Please share in the comments!





from Moz Blog https://moz.com/blog/tangential-content
via IFTTT

Monday, October 23, 2017

NEW in Keyword Explorer: See Who Ranks & How Much with Keywords by Site

Posted by randfish

For many years now, Moz's customers and so, so many of my friends and colleagues in the SEO world have had one big feature request from our toolset: "GIVE ME KEYWORDS BY SITE!"

Today, we're answering that long-standing request with that precise data inside Keyword Explorer:

This data is likely familiar to folks who've used tools like SEMRush, KeywordSpy, Spyfu, or others, and we have a few areas we think are stronger than these competitors, and a few known areas of weakness (I'll get to both in a minute). For those who aren't familiar with this type of data, it offers a few big, valuable solutions for marketers and SEOs of all kinds. You can:

  1. Get a picture of how many (and which) keywords your site is currently ranking for, in which positions, even if you haven't been directly rank-tracking.
  2. See which keywords your competitors rank for as well, giving you new potential keyword targets.
  3. Run comparisons to see how many keywords any given set of websites share rankings for, or hold exclusively.
  4. Discover new keyword opportunities at the intersection of your own site's rankings with others, or the intersection of multiple sites in your space.
  5. Order keywords any site ranks for by volume, by ranking position, or by difficulty
  6. Build lists or add to your keyword lists right from the chart showing a site's ranking keywords
  7. Choose to see keywords by root domain (e.g. *.redfin.com including all subdomains), subdomain (e.g. just "www.redfin.com" or just "press.redfin.com"), or URL (e.g. just "https://www.redfin.com/blog/2017/10/migration-patterns-show-more-people-leaving-politically-blue-counties.html")
  8. Export any list of ranking keywords to a CSV, along with the columns of volume, difficulty, and ranking data

Find your keywords by site

My top favorite features in this new release are:

#1 - The clear, useful comparison data between sites or pages

Comparing the volume of a site's ranking keywords is a really powerful way to show how, even when there's a strong site in a space (like Sleepopolis in the mattress reviews world), they are often losing out in the mid-long tail of rankings, possibly because they haven't targeted the quantity of keywords that their competitors have.

This type of crystal-clear interface (powerful enough to be used by experts, but easily understandable to anyone) really impressed me when I saw it. Bravo to Moz's UI folks for nailing it.

#2 - The killer Venn diagram showing keyword overlaps

Aww yeah! I love this interactive Venn diagram of the ranking keywords, and the ability to see the quantity of keywords for each intersection at a glance. I know I'll be including screenshots like this in a lot of the analyses I do for friends, startups, and non-profits I help with SEO.

#3 - The accuracy & recency of the ranking, volume, & difficulty data

As you'll see in the comparison below, Moz's keyword universe is technically smaller than some others. But I love the trustworthiness of the data in this tool. We refresh not only rankings, but keyword volume data multiple times every month (no dig on competitors, but when volume or rankings data is out of date, it's incredibly frustrating, and lessens the tool's value for me). That means I can use and rely on the metrics and the keyword list — when I go to verify manually, the numbers and the rankings match. That's huge.

Caveat: Any rankings that are personalized or geo-biased tend to have some ranking position changes or differences. If you're doing a lot of geographically sensitive rankings research, it's still best to use a rank tracking solution like the one in Moz Pro Campaigns (or, at an enterprise level, a tool like STAT).


How does Moz's keyword universe stack up to the competition? We're certainly the newest player in this particular space, but we have some advantages over the other players (and, to be fair, some drawbacks too). Moz's Russ Jones put together this data to help compare:


Obviously, we've made the decision to be generally smaller, but fresher, than most of our competitors. We do this because:

  • A) We believe the most-trafficked keywords matter more when comparing the overlaps than getting too far into the long tail (this is particularly important because once you get into the longer tail of search demand, an unevenness in keyword representation is nearly unavoidable and can be very misleading)
  • B) Accuracy matters a lot with these types of analyses, and keyword rankings data that's more than 3–4 weeks out of date can create false impressions. It's also very tough to do useful comparisons when some keyword rankings have been recently refreshed and others are weeks or months behind.
  • C) We chose an evolving corpus that uses clickstream-fed data from Jumpshot to cycle in popular keywords and cycle out others that have lost popularity. In this fashion, we feel we can provide the truest, most representational form of the keyword universe being used by US searchers right now.

Over time, we hope to grow our corpus (so long as we can maintain accuracy and freshness, which provide the advantages above), and extend to other geographies as well.

If you're a Moz Pro subscriber and haven't tried out this feature yet, give it a spin. To explore keywords by site, simply enter a root domain, subdomain, or exact page into the universal search bar in Keyword Explorer. Use the drop-down if you need to modify your search (for example, researching a root domain as a keyword).

There's immense value to be had here, and a wealth of powerful, accurate, timely rankings data that can help boost your SEO targeting and competitive research efforts. I'm looking forward to your comments, questions, and feedback!


Need some extra guidance? Sign up for our upcoming webinar on either Thursday, October 26th or Monday, October 30th.





from Moz Blog https://moz.com/blog/keyword-explorer-keywords-by-site
via IFTTT

Friday, October 20, 2017

How Links in Headers, Footers, Content, and Navigation Can Impact SEO - Whiteboard Friday

Posted by randfish

Which link is more valuable: the one in your nav, or the one in the content of your page? Now, how about if one of those in-content links is an image, and one is text? Not all links are created equal, and getting familiar with the details will help you build a stronger linking structure.


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about links in headers and footers, in navigation versus content, and how that can affect both internal and external links and the link equity and link value that they pass to your website or to another website if you're linking out to them.

So I'm going to use Candy Japan here. They recently crossed $1 million in sales. Very proud of Candy Japan. They sell these nice boxes of random assortments of Japanese candy that come to your house. Their website is actually remarkably simplistic. They have some footer links. They have some links in the content, but not a whole lot else. But I'm going to imagine them with a few more links in here just for our purposes.

It turns out that there are a number of interesting items when it comes to internal linking. So, for example, some on-page links matter more and carry more weight than other kinds. If you are smart and use these across your entire site, you can get some incremental or potentially some significant benefits depending on how you do it.

Do some on-page links matter more than others?

So, first off, good to know that...

I. Content links tend to matter more

...just broadly speaking, than navigation links. That shouldn't be too surprising, right? If I have a link down here in the content of the page pointing to my Choco Puffs or my Gummies page, that might actually carry more weight in Google's eyes than if I point to it in my navigation.

Now, this is not universally true, but observably, it seems to be the case. So when something is in the navigation, it's almost always universally in that navigation. When something is in here, it's often only specifically in here. So a little tough to tell cause and effect, but we can definitely see this when we get to external links. I'll talk about that in a sec.

II. Links in footers often get devalued

So if there's a link that you've got in your footer, but you don't have it in your primary navigation, whether that's on the side or the top, or in the content of the page, a link down here may not carry as much weight internally. In fact, sometimes it seems to carry almost no weight whatsoever other than just the indexing.

III. More used links may carry more weight

This is a theory for now. But we've seen some papers on this, and there has been some hypothesizing in the SEO community that essentially Google is watching as people browse the web, and they can get that data and sort of see that, hey, this is a well-trafficked page. It gets a lot of visits from this other page. This navigation actually seems to get used versus this other navigation, which doesn't seem to be used.

There are a lot of ways that Google might interpret that data or might collect it. It could be from the size of it or the CSS qualities. It could be from how it appears on the page visually. But regardless, that also seems to be the case.

IV. Most visible links may get more weight

This does seem to be something that's testable. So if you have very small fonts, very tiny links, they are not nearly as accessible or obvious to visitors. It seems to be the case that they also don't carry as much weight in Google's rankings.

V. On pages with multiple links to the same URL

For example, let's say I've got this products link up here at the top, but I also link to my products down here under Other Candies, etc. It turns out that Google will see both links. They both point to the same page in this case, both pointing to the same page over here, but this page will only inherit the value of the anchor text from the first link on the page, not both of them.

So Other Candies, etc., that anchor text will essentially be treated as though it doesn't exist. Google ignores multiple links to the same URL. This is actually true internal and external. For this reason, if you're going ahead and trying to stuff in links in your internal content to other pages, thinking that you can get better anchor text value, well look, if they're already in your navigation, you're not getting any additional value. Same case if they're up higher in the content. The second link to them is not carrying the anchor text value.

Can link location/type affect external link impact?

Other items to note on the external side of things and where they're placed on pages.

I. In-content links are going to be more valuable than footers or nav links

In general, nav links are going to do better than footers. But in content, this primary content area right in here, that is where you're going to get the most link value if you have the option of where you're going to get an external link from on a page.

II. What if you have links that open in a new tab or in a new window versus links that open in the same tab, same window?

It doesn't seem to matter at all. Google does not appear to carry any different weight from the experiments that we've seen and the ones we've conducted.

III. Text links do seem to perform better, get more weight than image links with alt attributes

They also seem to perform better than JavaScript links and other types of links, but critically important to know this, because many times what you will see is that a website will do something like this. They'll have an image. This image will be a link that will point off to a page, and then below it they'll have some sort of caption with keyword-rich anchors down here, and that will also point off. But Google will treat this first link as though it is the one, and it will be the alt attribute of this image that passes the anchor text, unless this is all one href tag, in which case you do get the benefit of the caption as the anchor. So best practice there.

IV. Multiple links from same page — only the first anchor counts

Well, just like with internal links, only the first anchor is going to count. So if I have two links from Candy Japan pointing to me, it's only the top one that Google sees first in the HTML. So it's not where it's organized in the site as it renders visually, but where it comes up in the HTML of the page as Google is rendering that.
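If you want to audit your own templates for this, one rough approach is to walk the HTML in source order and record only the first link to each URL, falling back to the image's alt text when that first link wraps an image. The sketch below is a simplification for checking your own pages, not a claim about how Google's parser actually works:

```python
from html.parser import HTMLParser

class FirstAnchorAudit(HTMLParser):
    """Record, for each href, the anchor text of the FIRST link in source order.
    If that first link wraps an <img>, fall back to the image's alt text."""
    def __init__(self):
        super().__init__()
        self.first_anchor = {}   # href -> the anchor text that likely gets counted
        self._href = None
        self._text = []
        self._img_alt = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and "href" in attrs:
            self._href, self._text, self._img_alt = attrs["href"], [], None
        elif tag == "img" and self._href and attrs.get("alt"):
            self._img_alt = attrs["alt"]

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            if self._href not in self.first_anchor:   # later links to the same URL are ignored
                text = "".join(self._text).strip()
                self.first_anchor[self._href] = text or self._img_alt or "(no anchor text)"
            self._href = None

html = """
<a href="/products"><img src="choco.jpg" alt="Choco Puffs"></a>
<p><a href="/products">chocolate puff candy from Japan</a></p>
"""
audit = FirstAnchorAudit()
audit.feed(html)
print(audit.first_anchor)  # {'/products': 'Choco Puffs'}; the caption link's anchor text is ignored
```

Run against your own templates, this makes it easy to spot pages where the anchor text you actually care about is buried behind an earlier image link or navigation link to the same URL.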

V. The same link and anchor on many or most or all pages on a website tends to get you into trouble.

Not always, not universally. Sometimes it can be okay. Is Amazon allowed to link to Whole Foods from their footer? Yes, they are. They're part of the same company and group and that kind of thing. But if, for example, Amazon were to go crazy spamming and decided to make it "cheap avocados delivered to your home" and put that in the footer of all their pages and point that to the WholeFoods.com/avocadodelivery page, that would probably get penalized, or it may just be devalued. It might not rank at all, or it might not pass any link equity. So notable that in the cases where you have the option of, "Should I get a link on every page of a website? Well, gosh, that sounds like a good deal. I'd pass all this page rank and all this link equity." No, bad deal.

Instead, far better would be to get a link from a page that's already linked to by all of these pages, like, hey, if we can get a link from the About page or from the Products page or from the homepage, a link on the homepage, those are all great places to get links. I don't want a link on every page in the footer or on every page in a sidebar. That tends to get me in trouble, especially if it is anchor text-rich and clearly keyword targeted and trying to manipulate SEO.

All right, everyone. I look forward to your questions. We'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com





from Moz Blog https://moz.com/blog/links-headers-footers-navigation-impact-seo
via IFTTT

Monday, October 16, 2017

Google Shares Details About the Technology Behind Googlebot

Posted by goralewicz

Crawling and indexing have been hot topics over the last few years. As soon as Google launched Google Panda, people rushed to their server logs and crawling stats and began fixing their index bloat. None of those problems existed in the “SEO = backlinks” era of a few years ago. With this exponential growth of technical SEO, we need to get more and more technical. That said, we still don’t know exactly how Google crawls our websites, and many SEOs still can’t tell the difference between crawling and indexing.

The biggest problem, though, is that when we want to troubleshoot indexing problems, the only tools in our arsenal are Google Search Console and its Fetch and Render tool. Once your website consists of more than just HTML and CSS, there's a lot of guesswork involved in figuring out how your content will be indexed by Google. This approach is risky, expensive, and can fail multiple times. Even when you discover which pieces of your website weren’t indexed properly, it's extremely difficult to get to the bottom of the problem and find the fragments of code responsible.

Fortunately, this is about to change. Recently, Ilya Grigorik from Google tweeted one of the most valuable insights into how crawlers work: Google's Web Rendering Service is based on Chrome 41.

Interestingly, this tweet didn’t get nearly as much attention as I would have expected.

So what does Ilya’s revelation in this tweet mean for SEOs?

Knowing that Chrome 41 is the technology behind the Web Rendering Service is a game-changer. Before this announcement, our only option was to use Fetch and Render in Google Search Console to see our pages as the Web Rendering Service (WRS) sees them. Now we can troubleshoot technical problems that would otherwise have required experimentation and staging environments: all you need to do is download and install Chrome 41 and see how your website loads in that browser. That’s it.

You can check the features and capabilities that Chrome 41 supports by visiting Caniuse.com or Chromestatus.com (Googlebot should support similar features). These two websites make a developer’s life much easier.

Even though we don’t know exactly which version Ilya had in mind, we can find the Chrome version used by the WRS by looking at our server logs. It’s Chrome 41.0.2272.118.

It will be updated sometime in the future

Chrome 41 was released two years ago (in 2015), so it’s far behind the current version of the browser. However, as Ilya Grigorik has indicated, an update is coming.

I was lucky enough to get Ilya Grigorik to read this article before it was published, and he provided a ton of valuable feedback on this topic. He mentioned that they are hoping to have the WRS updated by 2018. Fingers crossed!

Google uses Chrome 41 for rendering. What does that mean?

We now have some interesting information about how Google renders websites. But what does that mean, practically, for site developers and their clients? Does this mean we can now ignore server-side rendering and deploy client-rendered, JavaScript-rich websites?

Not so fast. Here is what Ilya Grigorik had to say in response to this question:

We now know the WRS's capabilities for rendering JavaScript and how to debug them. However, remember that not all crawlers support JavaScript crawling. As of today, JavaScript crawling is only supported by Google and Ask (and Ask is most likely powered by Google). Even if you don’t care about social media crawlers or search engines other than Google, keep in mind that even with Chrome 41, not all JavaScript frameworks can be indexed by Google (read more about JavaScript frameworks crawling and indexing). At least now we can troubleshoot and diagnose these problems more precisely.

Don’t get your hopes up

All that said, there are a few reasons to keep your excitement at bay.

Remember that version 41 of Chrome is over two years old. It may not work very well with modern JavaScript frameworks. To test it yourself, open http://jsseo.expert/polymer/ in Chrome 41, and then open it in an up-to-date browser.

The page in Chrome 41 looks like this:

The content rendered by Polymer is invisible, meaning it wasn’t processed correctly. This is also a perfect example of how to troubleshoot potential indexing issues: the problem you're seeing above can be solved if it's diagnosed properly. Let me quote Ilya:

"If you look at the raised Javascript error under the hood, the test page is throwing an error due to unsupported (in M41) ES6 syntax. You can test this yourself in M41, or use the debug snippet we provided in the blog post to log the error into the DOM to see it."

I believe this is another powerful tool for web developers who want to make their JavaScript websites indexable. We will definitely expand our experiment and work with Ilya’s feedback.

The Fetch and Render tool is the Chrome v. 41 preview

There's another interesting thing about Chrome 41. Google Search Console's Fetch and Render tool is simply a Chrome 41 preview. The right-hand side view ("This is how a visitor to your website would have seen the page") is generated by the Google Search Console bot, which is... Chrome 41.0.2272.118 (see screenshot below).


There's evidence that both Googlebot and the Google Search Console bot render pages using Chrome 41. Still, we don’t know exactly what the differences between them are. One noticeable difference is that the Google Search Console bot doesn’t respect robots.txt. There may be more, but for the time being, we're not able to point them out.

Chrome 41 vs Fetch as Google: A word of caution

Chrome 41 is a great tool for debugging Googlebot. However, sometimes (not often) there's a situation in which Chrome 41 renders a page properly, but the screenshots from Google Fetch and Render suggest that Google can’t handle the page. It could be caused by CSS animations and transitions, Googlebot timeouts, or the usage of features that Googlebot doesn’t support. Let me show you an example.

Chrome 41 preview:

Image blurred for privacy

The above page has quite a lot of content and images, but it looks completely different in Google Search Console.

Google Search Console preview for the same URL:

As you can see, Google Search Console’s preview of this URL is completely different from what you saw in the previous screenshot (Chrome 41). All the content is gone; all we can see is the search bar.

From what we've noticed, Google Search Console renders CSS a little differently than Chrome 41 does. This doesn’t happen often, but as with most tools, we need to double-check whenever possible.

This leads us to a question...

What features are supported by Googlebot and WRS?

According to the Rendering on Google Search guide:

  • Googlebot doesn't support IndexedDB, WebSQL, and WebGL.
  • HTTP cookies and local storage, as well as session storage, are cleared between page loads.
  • All features requiring user permissions (like Notifications API, clipboard, push, device-info) are disabled.
  • Google can’t index 3D and VR content.
  • Googlebot only supports HTTP/1.1 crawling.
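These limitations suggest a defensive approach: never gate primary content on features the WRS disables or clears. Here's a minimal sketch of that idea in ES5 syntax; the element ID and the fallback behavior are my own illustration, not something taken from Google's guide:

    <p id="feature-status"></p>
    <!-- Detect optional features, but render the important content
         regardless of the result; use the flags only to layer on
         enhancements for real visitors. -->
    <script>
      var hasIndexedDB = typeof window.indexedDB !== 'undefined';
      var hasStorage = false;
      try {
        localStorage.setItem('__test', '1');
        localStorage.removeItem('__test');
        hasStorage = true;
      } catch (e) {
        // Storage may be unavailable, and for Googlebot it is cleared
        // between page loads anyway, so never rely on it for the first render.
      }
      document.getElementById('feature-status').textContent =
        'IndexedDB: ' + hasIndexedDB + ', storage: ' + hasStorage;
    </script>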

The last point in that list is really interesting: despite statements from Google over the last two years, Google still crawls only over HTTP/1.1.

No HTTP/2 support (still)

We've mostly been covering how Googlebot uses Chrome, but there's another recent discovery to keep in mind.

There is still no support for HTTP/2 for Googlebot.

Since it's now clear that Googlebot doesn’t support HTTP/2, if your website supports HTTP/2 you still can’t drop your HTTP/1.1 optimization: Googlebot can only crawl over HTTP/1.1.

There were several announcements recently regarding Google’s HTTP/2 support. To read more about it, check out my HTTP/2 experiment here on the Moz Blog.

Via https://developers.google.com/search/docs/guides/r...

Googlebot’s future

Rumor has it that Chrome 59’s headless mode was created for Googlebot, or at least that it was discussed during the design process. It's hard to say if any of this chatter is true, but if it is, it means that to some extent, Googlebot will “see” the website in the same way as regular Internet users.

This would definitely make everything simpler for developers who wouldn’t have to worry about Googlebot’s ability to crawl even the most complex websites.

Chrome 41 vs. Googlebot’s crawling efficiency

Chrome 41 is a powerful tool for debugging JavaScript crawling and indexing. However, it's crucial not to jump on the hype train here and start launching websites that “pass the Chrome 41 test.”

Even if Googlebot can “see” our website, there are many other factors that affect your site’s crawling efficiency. For example, we already have proof that Googlebot can crawl and index JavaScript and many JavaScript frameworks, but that doesn’t mean JavaScript is great for SEO. I’ve gathered significant evidence showing that JavaScript pages aren’t crawled even half as effectively as HTML-based pages.

In summary

Ilya Grigorik’s tweet sheds more light on how Google crawls pages and, thanks to that, we don’t have to build experiments for every feature we're testing — we can use Chrome 41 for debugging instead. This simple step will definitely save a lot of websites from indexing problems, like when Hulu.com’s JavaScript SEO backfired.

It's safe to assume that Chrome 41 will now be a part of every SEO’s toolset.





from Moz Blog https://moz.com/blog/google-shares-details-googlebot
via IFTTT