Monday, February 29, 2016

The Google Analytics Add-On for Sheets: An Intro to an Underutilized Tool

Posted by tian_wang

With today’s blog post I’m sharing everything one needs to know about an underappreciated tool: the Google Analytics add-on for Google Sheets. In this post I’ll be covering the following:

1. What is the Google Analytics add-on?

2. How to install and set up the Google Analytics add-on.

3. How to create a custom report with the Google Analytics add-on.

4. A step-by-step worked example of setting up an automated report.

5. Further considerations and pitfalls to avoid.

Thanks to Moz for having me, and for giving me the chance to write about this simple and powerful tool!

1. What is the Google Analytics add-on and why should I care?

I’m glad I asked. Simply put, the Google Analytics add-on is an extension for Google Sheets that allows you to create custom reports within Sheets. The add-on works by linking up to an existing Analytics account, using Google’s Analytics API and Regular Expressions to filter the data you want to pull, and finally gathering the data into an easy and intuitive format that’s ripe for reporting.

The Google Analytics add-on’s real value-add to a reporting workflow is that it’s extremely flexible, reliable, and a real time-saver. Your reporting will still be constrained by the limitations of Sheets itself (as compared to, say, Excel), but the Sheets framework has served almost every reporting need I’ve come across to date and the same will probably be true for most of you!

In a nutshell, the Add-On allows you to:

  • Pull any data you’d be able to access in your Analytics account (i.e., at analytics.google.com) directly into a spreadsheet
  • Easily compare historical data across time periods
  • Filter and segment your data
  • Automate regular reporting
  • Make tweaks to existing reports to get new data (no more re-inventing wheels!)

If this all sounds like you could use it, read on!

2. Getting started: How to install and set up the Google Analytics add-on

2A. Installing the Google Analytics add-on

  • Go into Google Sheets.
  • On the menu bar, under your workbook’s title, click “Add-ons.”
  • This opens a drop-down menu — click “Get add-ons.”
  • In the following window, type “Google Analytics” into the search bar on the top right and hit enter.

  • The first result is the add-on we want, so go ahead and install it.

  • Refresh your page and confirm the add-on is installed by clicking “Add-ons” again. You should see an option for “Google Analytics.”

That’s all there is to installation!

2B. Setting up the Google Analytics add-on

Now that we have the Google Analytics add-on installed, we need to set it up by linking it to an Analytics account before we can use it.

  • Under the “Add-ons” tab in Sheets, hover over “Google Analytics” to expose a side-bar as shown below.

  • Click “Create New Report.” You’ll see a menu appear on the right side of your screen.

  • In this menu, set the account information to the Analytics account you want to measure.
  • Fill out the metrics and dimensions you want to analyze. You can further customize segmentation within the report itself later, so just choose a simple set for now.
  • Click “Create Report.” The output will be a new sheet, with a report configuration that looks like this:

  • Note: This is NOT your report. This is the setup configuration for you to let the add-on know exactly what information you’d like to see in the report.

Once you’ve arrived at this step, your set-up phase is done!

Next we’ll look at what these parameters mean, and how to customize them to tailor the data you receive.

3. Creating a custom report with the Google Analytics add-on

So now you have all these weird boxes and you’re probably wondering what you need to fill out and what you don’t.

Before we get into that, let’s take a look at what happens if you don’t fill out anything additional, and just run the report from here.

To run a configured report, click back into the “Add-Ons” menu and go to Google Analytics. From there, click “Run Reports.” Make sure you have your configuration sheet open when you do this!

You’ll get a notification that the report was either successfully created, or that something went wrong (this might require some troubleshooting).

Following the example above, your output will look something like this:

This is your actual report. Hooray! So what are we actually seeing? Let’s go back to the “Report Configuration” sheet to find out.

The report configuration:

Type and View ID are defaults that don’t need to be changed. Report Name is what you want your report to be called, and will be the name generated for the report sheet created when you run your reports.

So really, in the report configuration above, all the input we’re seeing is:

  • Last N Days = 7
  • Metrics = ga:users

In other words, this report shows the total number of users the specified view recorded over the last week. Interesting maybe, but not that helpful. Let’s see what happens if we make a few changes.

I’ve changed Last N Days from 7 to 30, and added Date as a Dimension. Running the report again yields the following output:

By increasing the range of data pulled from the last 7 days to the last 30, we get data from a larger set of days. By adding date as a dimension, we can see how much traffic the site registered each day.

This is only scratching the surface of what the Google Analytics add-on can do. Here’s a breakdown of the parameters, and how to use them:

  • Report Name (optional) – The name of your report, which becomes the name of the report sheet generated when you run reports. If you’re running multiple reports and want to exclude one without deleting its configuration, delete the report name and that column will be ignored the next time you run your reports. Example: “January Organic Traffic”
  • Type (optional) – Either “core” or “mcf,” for Google’s Core Reporting API and Multi-Channel Funnels API respectively. “core” is the default and will serve most of your needs. Example: “core” / “mcf”
  • View (Profile) ID (required) – The Analytics view your report will pull data from. You can find your view ID in the Analytics interface, under the Admin tab. Example: ga:12345678
  • Start / End Date (optional) – Used as an alternative to Last N Days (i.e., you must use exactly one of the two); specifies the date range to pull data from. Example: 2/1/2016 – 2/29/2016
  • Last N Days (optional) – Used as an alternative to Start / End Date (i.e., you must use exactly one of the two); pulls data for the last N days, counting backwards from the current date. Example: any integer
  • Metrics (required) – The metrics you want to pull. You can include multiple metrics per report. Documentation on metrics and dimensions can be found in Google’s Metrics & Dimensions Explorer. Example: “ga:sessions”
  • Dimensions (optional) – The dimensions you want your metrics segmented by. You can include multiple dimensions per report. Example: “ga:date”
  • Sort (optional) – Specifies the order your data is returned in, which can be used to organize data before generating a report. Note: you can only sort by metrics/dimensions that are included in your report. Example: “sort=ga:browser,ga:country”
  • Filters (optional) – Filters the data included in your report based on any dimension (not just those included in the report). Example: “ga:country==Japan;ga:sessions>5”
  • Segment (optional) – Applies segments from the main reporting interface. Example: “users::condition::ga:browser==Chrome”
  • Sampling Level (optional) – Controls the sampling level for the data you’re pulling. Analytics samples large data sets by default; “HIGHER_PRECISION” requests more precise, less-sampled results. Example: “HIGHER_PRECISION”
  • Start Index (optional) – Returns results starting from the given index (the default is 1, not 0). Use together with Max Results when you want paginated data (e.g., if you’re pulling 2,000 results and want results 1,001–2,000). Example: any integer
  • Max Results (optional) – The default is 1,000 and can be raised to 10,000. Example: any integer up to 10,000
  • Spreadsheet URL (optional) – Sends your data to another spreadsheet. Example: the URL of the sheet where you want the data sent

By using these parameters in concert, you can arrive at a customized report detailing exactly what you want. The best part is, once you’ve set up a report in your configuration sheet and confirmed the output is what you want, all you have to do to run it again is run your reports in the add-on! This makes regular reporting a breeze, while still bringing all the benefits of Sheets to bear.
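
To make the configuration more concrete, here’s a minimal sketch in Python (purely illustrative, not anything the add-on itself runs) that represents one report-configuration column as a plain dictionary and checks the two rules from the list above that are easiest to get wrong: the required fields, and the “exactly one of Start/End Date or Last N Days” constraint. All of the values are examples, not defaults.

```python
# A hypothetical, illustrative model of one report-configuration column.
# The keys mirror the add-on's configuration sheet; the values are examples.

report_config = {
    "Report Name": "January Organic Traffic",  # leave blank to skip this report on the next run
    "Type": "core",                            # "core" or "mcf"
    "View (Profile) ID": "ga:12345678",        # required
    "Start Date": "1/1/2016",                  # use Start/End Date OR Last N Days, never both
    "End Date": "1/31/2016",
    "Last N Days": None,
    "Metrics": "ga:sessions",                  # required
    "Dimensions": "ga:date",
    "Filters": "ga:medium==organic",
    "Segment": "sessions::condition::ga:browser==Chrome",
    "Sampling Level": "HIGHER_PRECISION",
    "Max Results": 1000,                       # default 1,000, up to 10,000
}

def validate(config):
    """Sanity-check the two rules described in the parameter breakdown above."""
    if not config.get("View (Profile) ID") or not config.get("Metrics"):
        raise ValueError("View (Profile) ID and Metrics are required.")
    has_date_range = bool(config.get("Start Date") and config.get("End Date"))
    has_last_n = config.get("Last N Days") is not None
    if has_date_range == has_last_n:  # both set, or neither set
        raise ValueError("Use exactly one of Start/End Date or Last N Days.")

validate(report_config)  # passes silently for the example above
```

If the add-on complains about a configuration, those two rules are the first things worth checking.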

Some important things to note and consider, when you’re setting up your configuration sheet:

  • You can include multiple report configurations in the same sheet (see below):

In the image above, running the report configuration will produce four separate reports. You should NOT have one configuration sheet per report.

  • Although you can have your reports generated in the same workbook as your configuration sheet, I recommend copying the data into another workbook or using the Spreadsheet URL parameter to do the same thing. Loading multiple reports in one workbook can create performance problems.
  • You can schedule your reporting to run automatically by enabling scheduled reporting within the Google Analytics add-on. Note: this is only helpful if you are using “Last N Days” for your time parameter. If you’re using a fixed date range, your report will just return the same data for that range every time it runs.

The regularity options are hourly, daily, weekly, and monthly.

4. Creating an automated report: A worked example

So now that we’ve installed, set up, and configured a report, next up is the big fish, the dream of anyone who’s had to do regular reporting: automation.

As an SEO, I use the Google Analytics add-on for this exact purpose for many of my clients. I’ll start by assuming you’ve installed and set up the add-on, and are ready to create a custom report configuration.

Step one: Outline a framework

Before we begin creating our report, it’s important we understand what we want to measure and how we want to measure it. For this example, let’s say we want to view organic traffic to a specific set of pages on our site from Chrome browsers and that we want to analyze the traffic month-over-month and year-over-year.

Step two: Understand your framework within the add-on

To get everything we want, we’ll use three separate reports: organic traffic in the past month (January 2016), organic traffic in the month before that (December 2015), and organic traffic in the past month, last year (January 2015). It’s possible to include this all in one report, but I recommend creating one report per date period, as it makes organizing your data and troubleshooting your configuration significantly easier.

Step three: Map your key elements to add-on parameters

Report One parameter breakdown:

Report Name – 1/1/2016

  • Make it easily distinguishable from the other reports we’ll be running

Type – core

  • The GA API default

View (Profile) ID

  • The account we want to pull data from

Start Date – 1/1/2016

  • The beginning date we want to pull data from

End Date – 1/31/2016

  • The cutoff date for the data we want to pull

Metrics – ga:sessions

  • We want to analyze sessions for this report

Dimensions – ga:date

  • Allows us to see traffic the site received each day in the specified range

Filters – ga:medium==organic;ga:landingPagePath=@resources

  • We’ve included two filters, one that specifies only organic traffic and another that specifies sessions that had a landing page with “resources” in the URL (resources is the subdirectory on Distilled’s website that houses our editorial content)
  • Properly filling out filters and segments requires specific syntax, which you can find in Google’s Core Reporting API resources (there’s also a short sketch of how this particular filter string is assembled just after this parameter breakdown)

Segments – sessions::condition::ga:browser==Chrome

  • Specifies that we only want session data from Chrome browsers

Sampling Level – HIGHER_PRECISION

  • Specifies that we want to minimize sampling for this data set
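
A quick note on the filter syntax used above, since it trips people up: “==” means “equals,” “=@” means “contains,” and a semicolon joins conditions with AND (a comma would join them with OR). The short Python sketch below just assembles that string; the helper function is my own illustration, not part of the add-on.

```python
# Illustrative only: build the Core Reporting API filter string used in Report One.
# ";" joins conditions with AND, "," with OR; "==" is equals, "=@" is contains.

def and_filters(*conditions):
    """Combine individual filter conditions with AND (';' in GA filter syntax)."""
    return ";".join(conditions)

organic_only = "ga:medium==organic"                # medium equals "organic"
resources_pages = "ga:landingPagePath=@resources"  # landing page path contains "resources"

print(and_filters(organic_only, resources_pages))
# ga:medium==organic;ga:landingPagePath=@resources
```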

Report One output: Past month’s sessions

Now that we’ve set up our report, it’s time to run it and check the results.

So, in the month of January 2016, the resources section on Distilled’s website saw 10,365 sessions that satisfied the following conditions:

  • organic source/medium
  • landing page containing “resources”
  • Chrome browser

But how do we know this is accurate? It’s impossible to tell at face value, but you can reliably check accuracy of a report by looking at the analogous view in Google Analytics itself.

Confirming Report One data

Since the Google Analytics add-on pulls the same data you see when you log in at analytics.google.com, we can combine separate pieces in the GA interface to achieve the same effect as our report:

Date Range

Organic Source/Medium

Landing Page Path & Browser

The result

Hooray!

Now that we’ve confirmed our framework works, and is showing us what we want, creating our other two reports can be done by simply copying the configuration and making minor adjustments to the parameters.

Since we want a month-over-month comparison and a year-over-year comparison for the exact same data, all we have to do is change the date range for the two reports.

One should detail the month before (December 2015) and the other should detail the same month in the previous year (January 2015). We can run these reports immediately.

The results?

Total Sessions In January 2015 (Reporting Month, Previous Year): 2,608

Total Sessions In December 2015 (Previous Month): 7,765

Total Sessions In January 2016 (Reporting Month): 10,365

We’re up 33% month-over-month and 297% year-over-year. Not bad!
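
As a quick sanity check, those two figures are just the standard percentage-change calculation, (new - old) / old expressed as a percentage, applied to the session totals above:

```python
# Verify the month-over-month and year-over-year growth figures quoted above.

def pct_change(old, new):
    return (new - old) / old * 100

jan_2016, dec_2015, jan_2015 = 10365, 7765, 2608

print(f"Month-over-month: {pct_change(dec_2015, jan_2016):.0f}%")  # ~33%
print(f"Year-over-year:   {pct_change(jan_2015, jan_2016):.0f}%")  # ~297%
```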

Every month, we can update the dates in the configuration. For example, next month we’ll be examining February 2016, compared to January 2016 and February 2015. Constructing a dashboard can be done in Sheets, as well, by creating an additional sheet that references the outputs from your reports!

5. Closing observations and pitfalls to avoid

The Google Analytics add-on probably isn’t the perfect reporting solution that all digital marketers yearn for. When I first discovered the Google Analytics add-on for Google Sheets, I was intimidated by its use of Regular Expressions and thought that you needed to be a syntax savant to make full use of the tool. Since then, I haven’t become any better at Regular Expressions, but I’ve come to realize that the Google Analytics add-on is versatile enough that it can add value to most reporting processes, without the need for deep technical fluency.

I was able to cobble together each of the reports I needed by testing, breaking, and researching different combinations of segments, filters, and frameworks and I encourage you to do the same! You’ll most likely be able to arrive at the exact report you need, given enough time and patience.

One last thing to note: the Google Analytics interface (i.e., what you use when you access your Analytics account online) has built-in safeguards to ensure that the data you see matches the reporting level you’ve chosen. For example, if I click into a session-level report (e.g., landing pages), I’ll see mostly session-level metrics. Similarly, clicking into a page-level report will return page-level metrics. In the Google Analytics add-on, however, this safeguard doesn’t exist, because the add-on is designed for greater versatility. It’s therefore all the more important that you’re thorough in outlining, designing, and building your reporting framework within the add-on. After you’ve configured a custom report and successfully run it, be sure to check your results against the Google Analytics interface!

Abraham Lincoln is often credited with saying, “Give me six hours to chop down a tree and I will spend the first four sharpening the axe.” Good advice in general that also holds true for using the Google Analytics add-on for Google Sheets.

Supplementary resource appendix:

  • RegExr – General Regular Expressions resource.
  • Debuggex – Visual Regular Expressions debugging tool.




from Moz Blog http://ift.tt/1TM94WN
via IFTTT

Friday, February 26, 2016

Overcoming Objections on Your Landing Pages - Whiteboard Friday

Posted by randfish

[Estimated read time: 9 minutes]

How do you take your potential customers' problems and turn them into a conversion success? If you're having trouble with low conversion rates on high-traffic landing pages, don't worry — there's help. In today's Whiteboard Friday, Rand shares a process to turn your landing page objections into improved conversion rates.

Overcoming Objections on Your Landing Pages in Order to Improve Your Conversion Rates Whiteboard

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about overcoming objections on your landing pages in order to improve conversion rates. So this is a process that I have stolen part and parcel from Conversion Rate Experts, a British consulting company that Moz has used a couple of times to help with our campaigns. Karl Blanks and Ben Jesson have just been phenomenal for this stuff.

Look, they're not the only ones who do it. A lot of people in conversion rate optimization use a process similar to this, but it's something I talk about and share so often that I thought, hey, let's bring it to Whiteboard Friday.

Enter a problem...

So a lot of the time marketers have this problem where a lot of people are visiting a page, a landing page where you're trying to sell someone or get someone to take a conversion action, maybe sign up for an email list or join a community or download an app, take a free trial of something, test out a free tool or buy an actual product, like in this case my minimalist noise-canceling headphones.

They are very minimalist indeed thanks to my subpar drawing skills. But when lots of people are visiting this page and very few are converting, you've got a conversion rate optimization problem and challenge, and this process can really help you through it.

So first off, let's start with the question around what's a low conversion rate?

The answer to that is it really depends. It depends on who you are and what you're trying to accomplish. If you're in business to consumer ecommerce, like selling headphones, then you're getting what I'd say is relatively qualified traffic. You're not just blasting traffic to this page that's coming from sources that maybe don't even know what they're getting, but in fact people who clicked here knew that they were looking for headphones. 1.5% to 2%, that's reasonably solid. Below that you probably have an issue. It's likely that you can improve it.

With email signups, if you're trying to get people to convert to an email list, 3% to 5% with B2B. Software as a service, it's a little bit lower, 0.5% to 1%. Those tend to be tougher to get people through. This number might be higher if the B2B product that you're serving and the SaaS product is a free trial or something like that. In fact, a software free trial usually is in the 1.5% to 2% range. A free app install, like if people are getting to an app download page or to an app's homepage or download page, and you're seeing below 4% or 5%, that's probably a problem. Free account signup, if you're talking about people joining a community or maybe connecting a Facebook or a Google account to start a free account on a website, that's maybe in the 2% to 3% range.

But these are variable. Your mileage may vary. But I want to say that if you start from these assumptions and you're looking and you're going, "Wow, we're way under these for our target," yeah, let's try this process.

Collect contact information

So what we do to start, and what Conversion Rate Experts did to start, is they collect contact information for three different groups of people. The first group is people who've heard of your product, your service, your company, but they've never actually tried it. Maybe they haven't even made their way to a landing page to convert yet, but they're in your target demographic. They're the audience you're trying to reach.

The second group is people who have tried out your product or service but decided against it. That could be people who went through the shopping cart but abandoned it, and so you have their email address. It could be people who've signed up for an email newsletter but canceled it, or signed up for an account but never kept using it, or signed up for a free trial but canceled before the period was over. It could be people who have signed up for a mailing list to get a product but then never actually converted.

Then the third one is people who have converted, people who actually use your stuff, like it, have tried it, bought it, etc.

You want to interview them.

You can use three methods, and I recommend some combination of all of these. You can do it over email, over the phone, or in person. When we've done this specifically in-house for Moz, or when Conversion Rate Experts did it for Moz, they did all three. They interviewed some folks over email, some folks they talked to over the phone, some folks they went to, literally, conferences and events and met with them in person and had those interviews, those sit-down interviews.

Then they grouped them into these three groups, and then they asked slightly different questions, variations of questions to each group. So for people who had heard of the product but never actually tried it, they asked questions like: "What have you heard about us or about this product? What would make you want to try it, and what objections do you currently have that's stopping you from doing that?"

For people who sort of walked away, they maybe tried or they didn't get all the way through trying, but they walked away, they didn't end up converting or they didn't stick with it, we could say: "What made you initially interested? What objections did you have, and how did you overcome those? What made you change your mind or decide against this product?" Oftentimes that's a mismatch of expectations versus what was delivered.

Then for the people who loved it, who are loyal customers, who are big fans, you can say: "Well, what got you interested? What objections did you have and how did you overcome them? What has made you stick with us? What makes you love us or this product or this service, this newsletter, this account, this community, and if you did love it, can we share your story?" This is powerful because we can use these later on for testimonials.

Create a landing page

Then C, in this process, we're going to actually create a landing page that takes the answers to these questions, which are essentially objections, reasons people didn't buy, didn't convert or weren't happy when they did, and we're going to turn them into a landing page that offers compelling explanations, compelling reasons, examples, data and testimonials to get people through that process.

So if you hear, for example, "Hey, I didn't buy this because I wasn't sure if the right adapters would be included for my devices," or, "I travel on planes a lot and I didn't know whether the headphones would support the plane use that I want to have," great, terrific. We're going to include what the adapters are right on there, which airlines they're compatible with, all that kind of information. That's going on the page.

If they say, "Hey, I actually couldn't tell how big the headphones were. I know you have dimensions on there, but I couldn't tell how big they were from the photos," okay, let's add some photos of representative sample sizes of things that people are very familiar with, maybe a CD, maybe an iPhone that people are like, "Oh yeah, I know the size of a CD. I know the size of an iPhone. I can compare that against the headphones." So now that's one of the images in there. Great, we've answered the objection.

"I wasn't sure if they had volume control." Great. Let's put that in a photo.

"Is tax and shipping included in the cost? I didn't want to get into a shopping cart situation where I wasn't sure." Perfect. We're going to put in there, "Tax included. Free shipping."

"Is the audio quality good enough for audiophiles and pros because I'm really . . ." well, terrific. Let's find a known audiophile, let's add their testimonial to the page.

We're essentially going one by one through the objections that we hear most frequently here, and then we're turning those into content on the page. That content can be data, it can be reasons, it can be examples, it can be testimonials. It's whatever we needed to be to help get people through that purchase process.

Split test

Then, of course, with every type of conversion rate optimization test and landing page optimization, we want to actually try some variations. So we're going to do a split test of the new page against the old one, and if we see there's stronger conversion rate, we know we've had success.

If we don't, we can go back to the drawing board and potentially broaden our audience here, try and understand how have we not overcome these objections, maybe show this new page to some of these people and see what additional objections they've got, all that kind of stuff.
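
Rand doesn't prescribe a particular way to judge the split test, but if you want a quick gut check beyond eyeballing the two conversion rates, one common option is a two-proportion z-test. The sketch below is my own addition with made-up traffic numbers, using only Python's standard library, and is meant as a rough guide rather than a substitute for a proper testing tool.

```python
# Illustrative split-test check: compare control (A) and variant (B) conversion rates
# with a two-proportion z-test (normal approximation). The counts below are invented.

from math import sqrt, erf

def two_proportion_z(conv_a, visits_a, conv_b, visits_b):
    """Return (rate_a, rate_b, two-sided p-value) for control A vs. variant B."""
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
    return p_a, p_b, p_value

# e.g. old page: 150 conversions from 10,000 visits; new page: 210 from 10,000
rate_a, rate_b, p = two_proportion_z(150, 10_000, 210, 10_000)
print(f"control {rate_a:.2%}, variant {rate_b:.2%}, p = {p:.4f}")
```

If the p-value is small (conventionally below 0.05) and the variant's rate is higher, the new page is probably genuinely converting better rather than just getting lucky.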

This process is really powerful. It helps you uncover the problems and issues that you may not even know exist. In my experience, it's the case that when companies try this, whether it's for products or for services, for landing pages, for new accounts, for apps, whatever it is, they tend to uncover the same small set of answers from these groups over and over again. It's just a matter of getting those four or five questions right and answering them on the landing page in order to significantly improve conversion.

All right, everyone. Look forward to your suggestions, your ideas, your feedback, and we'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com





from Moz Blog http://ift.tt/1Ozw1o5
via IFTTT

Thursday, February 25, 2016

Is Any Press Good Press? Measuring the SEO Impact of PR Wins and Fails

Posted by KelseyLibert

[Estimated read time: 15 minutes]

Is the saying “any press is good press” really true? Whether it happens as part of a carefully orchestrated PR stunt or accidentally, the potential payoffs and drawbacks when a brand dominates the news can be huge.

In our latest collaboration, Fractl and Moz explored how a surge of media coverage impacted seven companies.


By looking at brands that dominated headlines within the last year, we set out to answer the following questions:

  • Does positive press coverage always bring more benefits than negative press coverage?
  • Beyond the initial spikes in traffic and backlinks, what kind of long-term SEO value can be gained from massive media coverage?
  • Do large brands or unknown brands stand to gain more from a frenzy of media attention?
  • Are negative PR stunts worth the risk? Can the potential long-term benefits outweigh the short-term damage to the brand’s reputation?

Methodology

Our goal was to analyze the impact of major media coverage on press mentions, organic traffic, and backlinks, based on seven companies that appeared in the news between February 2015 and February 2016. Here’s how we gathered the data:

  • Press mentions were measured by comparing how often the brand appeared in Google News search results the month before and the month after the PR event occurred.
  • A combination of Moz’s Open Site Explorer, SEMrush, and Ahrefs was used to measure traffic and backlinks. Increases and decreases in traffic and backlinks were determined by calculating the percentage change from the month before the story broke compared to the month after.
  • BuzzSumo was used to measure how often brand names appeared in headlines around the time of the PR event and how many social shares those stories received.

Note: We left out a few metrics for some brands, due to incomplete or unavailable data. For example, backlink percentage growth was not measured for Airbnb or Miss Universe, since these events happened too recently before this study was published for us to provide an accurate count of new backlinks. Additionally, organic traffic and backlink percentage growth were not measured for Peeple, since it launched its site around the same time as its news appearance.


I. How media coverage affects press mentions, organic traffic, and backlinks

We looked at seven brands, both well-known and unknown, which received a mix of positive and negative media attention. Before we dive into our overall findings, let’s examine why these companies made headlines and how press coverage impacted each one. Be sure to check out our more detailed graphics around these PR events, too.

Impact of positive media coverage

During the last year, Roman Originals, Airbnb, and REI were part of feel-good stories in the press.

Roman Originals cashes in on #TheDress

What happened

Were you Team Black and Blue or Team Gold and White? It was the stuff PR teams dream of when this UK-based retail brand inadvertently received a ton of press when a photo of one of its dresses ignited a heated debate over its color.


The story was picked up by major publishers including BuzzFeed, Time, Gawker, and Wired. Some A-list celebrities chimed in with their dress-color opinions on social media as well.

The results

Roman Originals was by far the biggest winner out of the brands we analyzed, seeing a 17.5K% increase in press mentions, nearly a 420% increase in US organic traffic, and 2.3K% increase in new backlinks. By far the greatest benefit was the impact on sales — Roman Originals’ global sales increased by 560% within a day of the story hitting the news.

Beyond the short-term increases, it appears Roman Originals gained significant long-term benefits from the media frenzy. Its site has seen a lift in both UK and US organic traffic since the story broke in February 2015.

In addition to the initial spikes directly after the story broke, RomanOriginals.co.uk saw a solid lift in backlinks over time, too.

Man lists igloo on Airbnb for $200

What happened

After Blizzard Jonas had hit the Northeast, a man built an igloo in Brooklyn and listed it for $200 per night on Airbnb as a joke. Airbnb deleted the listing shortly after it was posted. Media pickups included ABC News, USA Today, Washington Post, Mashable, and The Daily Mail.

The results

Of all the PR events we analyzed, the igloo story was the most recent, having occurred at the end of January. Although we can’t yet gauge the long-term impact this media hit will have on Airbnb, the initial impact appears to be minimal. Since Airbnb is frequently in the news, it’s not very surprising that one PR event doesn’t have a significant effect.

Airbnb’s site only saw a 2% increase in organic traffic, despite an 83% increase in press mentions.

It’s also too soon to measure the story’s impact on new backlinks. However, the chart below shows the backlinks around the time of the story breaking relative to the new backlinks acquired during the rest of the year.

REI opts out of Black Friday

What happened

The retail chain announced it would be closed on Black Friday and created the #OptOutside campaign urging Americans to spend Black Friday outdoors instead of shopping. Major media outlets picked up the story, including CNN, USA Today, CBS News, and Time.


The results

While REI received great publicity by saying “no” to Black Friday, the media coverage appeared to have little impact on organic traffic to REI.com. In fact, traffic decreased by 5% the month after the story broke compared to the previous month.

REI.com did see a 51% increase in new backlinks after the story broke. Additionally, the subdomain created as part of the #OptOutside campaign has received nearly 8,000 backlinks since its launch.

When good press turns bad (and vice versa)

In addition to both positive and negative spins being put on a story, the sentiment around the story can change as more details emerge. Sometimes a positive story turns negative or a bad story turns positive. Such is the case with Gravity Payments and Miss Universe, respectively.

CEO of Gravity Payments announces $70K minimum wage

What happened

The CEO of this credit card-processing company announced he was cutting his salary to provide a minimum staff salary of $70K. It was hard to miss this story, which was covered by nearly every major US media outlet (and some global), and included a handful of TV appearances by the CEO. The brand later received backlash when it was discovered that the CEO, Dan Price, may have increased employee wages in response to a lawsuit from his brother.


The results

Initial spikes after the story broke included a 90% increase in press mentions, 139% increase in organic traffic, and 146% increase in new backlinks. But it didn’t end there for Gravity Payments.

What’s been most incredible about this story is its longevity in the press. Six months after the story broke, publishers were doing follow-up stories about the CEO signing a book deal and how business was booming. In December 2015, Bloomberg wrote a piece revealing that there was more to the story and suggested the wage increase was motivated by a lawsuit.

So far it looks like the benefits from the good press have outweighed any negative stories. In addition to the initial spike, to date GravityPayments.com has seen a 1,888% increase in organic traffic from the month before the story broke (March 2015).

The site has also received a substantial lift in new backlinks since the story broke.

Steve Harvey crowns the wrong Miss Universe winner

What happened

Host Steve Harvey accidentally announced the wrong winner during the 2015 Miss Universe pageant. Some speculated the slip-up was an elaborate PR stunt organized to combat the pageant’s falling ratings.

While there was initial backlash over the mistake, after several public apologies from Harvey, the incident may end up being best remembered for the memes it inspired.

The results

It appears the negative sentiment around this story has not hurt the brand. With a 199% increase in press mentions compared to the previous year’s pageant, this year’s Miss Universe stayed top of mind long after the pageant was over.

After the incident, there was nearly a 123% increase in monthly organic traffic to MissUniverse.com compared to the month following the 2014 Miss Universe pageant. However, organic traffic had steadily increased throughout 2015. For this reason, it’s difficult to give Steve Harvey's flub all the credit for any increases in organic traffic. It’s also too early to measure the long-term impact on traffic.

It’s also difficult to gauge how much of an effect it had on backlinks to MissUniverse.com. Judging from the chart below, so far there has been a minimal impact on new backlinks, but this may change as more articles related to this story are indexed.

For a brand that relies on TV viewership, perhaps the greatest payoff from this incident has yet to come. You can bet the world will tune in when Steve Harvey hosts next year’s Miss Universe pageant (he signed a multi-year hosting contract).

Is there any value to bad publicity?

Crafting controversial stories around a brand can have a huge payoff. After all, the press loves conflict. But too much negative press coverage can lead to a company’s downfall, as is the case with Turing Pharmaceuticals and Peeple.

Turing Pharmaceuticals raises drug price by 5,000%

What happened

You may not recognize the company name, but you’ve most likely heard of its former CEO Martin Shkreli. This pharmaceutical company bought a prescription drug and raised the price by 5,000%. The story made global headlines, including coverage by the New York Times, BBC, NBC News, and NPR, and the CEO had multiple TV interviews.


Shkreli defended the price hike, saying the profits would be funneled back into new treatment research, but his assertions that the pricing was a sound business decision weren’t enough to save face. He later stepped down as Turing’s CEO after being arrested by the FBI on fraud charges.

The results

Like Gravity Payments, the Turing Pharma story has had a long lifespan in the news cycle. After the story broke on September 20, press mentions of Turing Pharmaceuticals increased by 821% over the previous month.

During the month after the story first broke, turingpharma.com saw a 318% increase in organic traffic. Traffic also spiked in December and February, which is when Shkreli’s arrest, resignation as Turing CEO, and congressional hearing were making headlines.

Turingpharma.com also saw a significant increase in backlinks after the story broke. Within a month after the story broke, the site had a 382% increase in new backlinks.

While Turing Pharmaceuticals gained SEO value and brand recognition from the media frenzy, the benefits don’t make up for the negative sentiment toward the brand; the company posted a $14.6 million loss during the third quarter of 2015.

Peeple promotes new app as “Yelp for people”

What happened

A new site announcing a soon-to-be-launched "Yelp for people” app caused a huge social media and press backlash. The creepy nature of the app, which allowed people to review one another like businesses, sparked criticism as well as concerns that it would devolve into a virtual “burn book.”


The Washington Post broke the story, and from there it was picked up by the New York Times, BBC, Wired, and Mashable.

The results

Peeple is an exceptional case since the app’s site launched right before the brand received the flurry of media coverage. Because of that, it’s possible that forthepeeple.com had not been indexed by Google yet at the time of the press coverage. Unlike the other brands we looked at in this study, we don’t have traffic and backlink benchmarks to compare from before press attention. But still, the Peeple story serves as a cautionary tale for brands hoping to attract attention to a new product with negative press.

Peeple received a 343% increase in press mentions during the month after the story broke. But since it was a new site, it’s difficult to accurately gauge how much of an impact media attention had on organic traffic and backlinks. Despite all of the attention, to date, the site only receives an estimated 1,000 visitors per month.

Since the story broke, the site has received around 3,800 backlinks.

An abundance of negative media coverage buried Peeple before its product even launched. By the time the founders backtracked and repositioned Peeple in a more positive light, it was too late to turn the brand’s image around. The app still hasn’t launched.

II. What marketers can learn from these 7 PR wins and fails

A substantial increase in press mentions, rather than volume, can yield significant benefits.

Overall, the stories about large brands (Airbnb, REI, Miss Universe) received more exposure than the unknown brands (Turing Pharmaceuticals, Roman Originals, Peeple, Gravity Payments). The well-known brands were mentioned in 148% more headlines than the unknown brands, and those stories received on average 190% more social shares than stories about the lesser-known brands.

Although stories about smaller brands received less press coverage than large brands, the relatively unknown companies saw a greater impact from being in the news than large brands. Roman Originals, Gravity Payments, and Turing Pharmaceuticals saw the greatest increases in organic traffic and backlinks. Comparatively, a surge of press coverage did not have as dramatic of an impact on the large companies. Of the well-known brands, Miss Universe saw the greatest impact, with a 199% increase in press mentions and 123% increase in site traffic compared to the previous year’s pageant.

Negative stories attracted more coverage and social shares than positive stories.

On average, the brands with negative stories (Miss Universe, Turing Pharma, and Peeple) appeared in 172% more headlines than the brands with positive stories, and those headlines received 176% more social shares.

Have you noticed that the news feels predominantly negative? This is for good reason, since conflict is a pillar of good storytelling. Just as a novel or movie needs conflict, so do news stories.

That being said, there is such a thing as too much conflict. As we saw with Turing Pharmaceuticals and Peeple, company reputations can be irreversibly damaged when the brand itself is the source of conflict.

An element of unexpectedness is a key ingredient for massive press coverage.

There’s an old saying in journalism: "When a dog bites a man, that is not news because it happens so often. But if a man bites a dog, that is news.”

From a CEO paying all employees $70,000 salaries to a major retailer closing on the busiest shopping day of the year to a seasoned TV host announcing the wrong beauty pageant winner, all of the stories we analyzed were surprising in some way.

Surprising stories attract initial attention and then ignite others to share it. This crucial element of newsworthiness also plays a role in making content go viral.

A quick, positive reaction when the brand isn’t controlling the story may help boost the beneficial impact of media coverage.

A carefully orchestrated PR stunt allows a company to plan for the potential press reaction, but what’s a brand to do when it unexpectedly ends up in the news?

While this may sound like a bureaucratic company’s worst nightmare, nimble brands can cash in on the attention with a quick, good-spirited reaction. Roman Originals masterfully news-jacked a story about itself by doing just that.

First, it put out a tweet that settled the debate over the dress’ color and updated its homepage to showcase #TheDress.

Soon after, a white and gold version of the dress was put up for auction, with the proceeds donated to charity. Had Roman Originals spent too much time planning a response, it may have missed out while the story was still relevant in the news cycle.

Key takeaways

While most brands will never achieve this level of media coverage, the instances above teach pertinent lessons about what makes a story catch fire in the media:

  • A PR win for a little-known brand doesn’t necessarily require thousands of press mentions. For this reason, unknown companies stand to benefit more from riskier tactics like PR stunts. On the flipside, it may be more difficult for a large brand to initiate a PR stunt that makes a significant impact.
  • An element of unexpectedness may be a primary driver for what makes a news story go viral. When possible, work an unexpected angle into your PR pitches by focusing on what’s unique, bizarre, or novel about your brand.
  • Plan for the unexpected by having processes in place that empower marketing and PR teams to act fast with a public response to sudden media attention.
  • As we saw in our study, controversial stories are a big hit with journalists, but make sure your brand is the hero, not the villain. Look for opportunities to weave the “bad guys” your company is fighting into your pitches. Your company’s villain could be as obvious as a competitor or more subtle adversaries like the establishment (Uber vs. taxi industry).




from Moz Blog http://ift.tt/1LesOzD
via IFTTT