Advanced Local Citation Audit & Cleanup: Achieving Consistent Data and Higher Rankings

Are you aware that consistent, correct citations are a necessity for any local SEO campaign to succeed? If you’ve read the latest Moz Local Search Ranking Factors survey or you’ve been in the local SEO game for a while, it’s no surprise that your citations are very important to ranking locally in Google. In fact, according to that survey, citations and external location signals are the 3rd most important ranking factor. That’s why it’s important to have your correct NAP listed across the major citation sources.
So what is the NAP format?

NAP stands for Business Name, Address, and Phone Number. Having this information listed on another website such as Yelp, Citysearch, or Yellowpages acts like a positive vote for your local listings. While many people know they need to build citations to help increase their local rankings, surprisingly many overlook duplicate and incorrect listings.

Some people think they only need to update their Google My Business listing, which is incorrect. In fact, according to David Mihm of Moz, “If all you’re doing is updating your Google+ Local page, you’re going to continue to see problems because ‘new’ erroneous data will constantly feed into Google from all of its other sources.” This has been known to create bigger problems down the road. So what is the downside if you have inconsistent citations, duplicate citations, or citations that are just plain wrong?

In a nutshell, you’re missing out on getting credit for that citation, it’s hurting your local rankings in Google, and it’s potentially creating longer-term issues when the incorrect data is scraped. Unfortunately, while there are some resources you can use to simplify this process, it’s not as easy as waving a magic wand or blinking while wearing your Google Glass.

Let’s Start With The Basics: What is an Incorrect NAP?

So what is an incorrect NAP? The long and the short of it is that Google and other search engines want to give you credit for having your business name, address, and phone number listed on other reputable websites. It acts as a vote of confidence for you similar to a link in organic SEO.
You should make sure that your Google My Business listing has your correct NAP, formatted the way you want it displayed across the web. If any of this information differs from what’s listed in Google My Business, you may not be getting credit for it. Additionally, a duplicate listing could be hurting you as well. Just because you didn’t create or publish the incorrect information doesn’t mean it’s not polluting the local ecosystem. There are plenty of ways this can happen, as I discuss a little later in this article. But first, let’s take a look at some examples of correct and incorrect NAP.

How Exact Do These Citations Need to Be?
As you can see from the examples above, I was very clear about the items I changed from the correct example. Although Google has gotten good at detecting minor differences, you should always aim to be the least imperfect. The whole reason we’re fixing these in the first place is to make it easier for Google to associate the proper listings with your business. Minor differences such as Street and St. should not be an issue. However, incorrect, duplicate, or false information is a big no-no.
The long and the short of it is that the Business Name, Address, and Phone Number you want to use should be 100% correct in your Google My Business Dashboard (formerly Google Places, Plus Local, etc.). From here, you can copy this exact format on every source you wish to get a citation from.
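If you want to sanity-check variants in bulk, one rough way to treat “Street” and “St.” as equivalent while still flagging real mismatches is to normalize both strings before comparing them. Below is a minimal sketch of that idea; the abbreviation map and function name are my own assumptions, not part of any citation tool:

```python
import re

# Hypothetical abbreviation map; extend it for your own market.
ABBREVIATIONS = {"street": "st", "suite": "ste", "avenue": "ave", "boulevard": "blvd"}

def normalize_nap(text):
    """Lowercase, strip punctuation, and collapse common abbreviations."""
    text = re.sub(r"[^\w\s]", "", text.lower())
    words = [ABBREVIATIONS.get(w, w) for w in text.split()]
    return " ".join(words)

# "123 Main Street, Suite 4" and "123 Main St Ste 4" normalize identically,
# while a wrong suite number or phone digit still shows up as a mismatch.
print(normalize_nap("123 Main Street, Suite 4") == normalize_nap("123 Main St Ste 4"))  # True
```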
Overall, there are three types of citations we’re trying to fix during this process. These include:

  • Duplicates – Duplicate listings on the same directory
  • Mismatches – Listings for your business that have the wrong Business Name, Physical Address, or Phone Number (or just the 800 number and no local number). This can get especially complicated for doctors and lawyers, as I will discuss later in this article.
  • Incomplete Citations – It’s important that you fill out the profiles to completion once you’ve claimed them. This includes adding photos and filling out every field there is an option for.

How Does This Happen & What Causes These?
Just because you don’t remember creating an incorrect listing doesn’t mean there is no bad data in the local ecosystem. In fact, here are just some of the common reasons you have incorrect NAP across the web:

  • Your business moved physical locations
  • You used tracking phone numbers at one point
  • You hired an SEO to create citations or get listed on online directories
  • The data aggregators have incorrect information
  • You inherited a dirty phone number
  • You changed your local phone number
  • You used an 800 number and not a local number
  • You have different trade name or business name variations
  • Your listing was incorrectly submitted and scraped to other sites
  • Someone in your organization set up the listings without knowledge of NAP consistency (this is pretty common)

While there is a major potential ranking benefit to cleaning up this data, there is another reason it should be on your radar too. If you’re a fan of brand consistency like me, then you want to be the least imperfect and ensure all information about your company is accurate across every medium you control.
The example below shows just how confusing this data can be and the issues that can be caused by incorrect citations on one of these sites. (Graphic from David Mihm’s Local Search Ecosystem).

Before We Start: Here are Some Important Things to Know

If you’re paralyzed just thinking about the hundreds of citations you may need to fix, don’t sweat it.

  • While a good long term-goal would be to clean up a lot of the data, the reality is that your first focus should be on the top citations for your industry and city.
  • Also check out Phil Rozek’s list here and the Top 50 Citation Sources that Whitespark mentions on this page.
  • Focus your efforts on the primary citation sources for your niche and area. Once these are fixed up you can move on. Just spend 15 minutes a day cleaning this up.
  • Keep good records. You will need to follow up with these directories again and again in some cases. Don’t worry – I made a spreadsheet below that you can use for this.
  • Make sure to update the old incorrect citation instead of just adding new ones!

This work can be tedious, but accuracy is essential. Don’t try to use shortcuts.
Read Moz’s case study from David Mihm regarding cleaning up citations.

Let’s Start By Identifying Possible NAP Variations & Recording Them In the Spreadsheet

The first step in the citation cleanup process is to find out exactly what information is actually out there about your business. I put together an awesome spreadsheet you can use here. The first tab has a place to record the duplicate information. I like to color code it for simplicity, as you can see in the example below.

In the example above, I have the correct business information at the top of the spreadsheet in green for easy reference and the incorrect variations in red. You will want to record every variation you find here to make our job a little easier moving forward. But how do you find the incorrect variations for your client or business?

I prefer to start at the source by talking to the business owners and marketing managers. After you have collected their proper NAP info, ask these questions to see if you can get any details:

  • Have you ever moved physical locations?
  • Is this the address you have listed on your legal business paperwork with the State and Federal government?
  • Have you ever used tracking phone numbers?
  • Have you ever hired an SEO company or someone else to manage your online presence?
  • Do you have a list of logins or websites they submitted you to?
  • Do you use any lead generation services? (Sometimes they use tracking phone numbers)
  • Does your business go by any trade or fictitious business names?

Typically, asking these simple questions up front can save a lot of time in the long run. If you don’t get any good info from them, or they just don’t know, there are several ways you can look for this information online to make your life easier.

After You Have Asked the Questions, It’s Time to Do Your Own Investigative Work
While the questions above are helpful, it’s important to dive a little deeper and see what you can find. These are the steps I typically take:

  • Check the secretary of state’s filing for the business. Most have an online search platform where you can see who registered the business. If it has a different Name, Mailing Address, or Phone number go ahead and add these to the spreadsheet. We will want to check these out when searching for duplicates. (BONUS TIP: Search their filed business documents online and see if they had previously filed for a fictitious business name or DBA.)
  • Review the company BBB listing. Check out Phil Rozek’s article on his BBB Hack for finding possible conflicting information. The long and the short of it is that the BBB.org business listings show additional reported phone numbers, business names, and addresses as shown in the example below from his website.
  • Check Google Map Maker. By viewing the classic Google Map Maker, you can see the edit history of a business. This will tell you if a phone number or business name has been changed. To get this data simply pull up the Classic Map Maker, search for a business and then select the history tab. Once you’re on the history click “Show All Changes” in the upper right corner of the listing as shown below:
  • Once you have clicked on this, it will show the entire edit history. Look for edits to the NAP over the time the listing has been live. In the example below, you can see how the business name was actually changed at one point. This is the business name I will want to record in my spreadsheet (the old one).

Once you feel like you have a good handle on this, you can move on to searching for these culprits hiding across the web. Now it’s time to get fixing!

Here is a Quick Way to See What NAP Variations Google Already Associates with Your Business

If your business is recognized by Google and has reviews on other websites, the new Google My Business dashboard tries to condense that information in one place. It provides examples of listings it has already associated with your listing. I recommend checking this to see what differences it recognizes for your business, mainly for reference. If Google detects an inaccurate citation, don’t assume that it will find others. Remember: always aim to be the least imperfect.
To access this, simply follow the steps below:

  1. Log in to your dashboard at www.google.com/mybusiness
  2. Open up one of your locations and scroll down to the reviews section.
  3. Click the blue “Manage Reviews” button.
  4. Scroll down and check under the “Reviews from around the web” heading to see what pops up. You should see listings of other detected reviews here.
  5. Click the “View full review on…” link to view the full review.
  6. Check the NAP for that citation and see how (or if) it varies from your correct NAP. Record the differences, as we can use them later in this guide.

Start With the Data Aggregators Before Your Manual Efforts
Tools are great and help make tedious jobs like this easier. While there are some tools I advocate for this job, the reality is that most of them don’t cover the niche-specific directories and other sites you may be listed on. That being said, there are some great tools you can use to save time and money, and they are recommended in my overall procedure below.

Start with Moz Local. Moz Local provides a Check My Listing score, which will scan your listings just by entering your Business Name and Zip Code. This will give you a score that includes the citations that are Complete, Incomplete, Inconsistent, and Duplicates from the top 15 citation sources and data aggregators. If you’re not starting here, you might be shooting yourself in the foot. Signing up for this service, which is $50 a year, will help fix this data at some of the sources that distribute their data to many other providers across the web. You can also use this service to find other possible NAP variants.
Consider additional tools to see if they will help you. My manual methods are below but if you want to pull other data, you can also check out Brightlocal’s Local SEO Checkup product which will show you NAP variants and the accuracy of major listings. You can also check out Whitespark’s citation finder to start with a list of sites it detects you being listed on. They both offer great citation finding resources which will make this a bit easier. Also, Yext just recently introduced a product for fixing duplicates. While I have not had a chance to review this yet I believe it’s only for their network and it is a paid service.
Once you’re ready, it’s time to move on to the manual side of NAP cleanup using my method below.

The Manual Cleanup Process
When dealing with citation cleanup, efficiency and accuracy are the name of the game. I have developed a process that works best for me when it comes to being productive in fixing citations and removing duplicates. This is what I’m going to explain below in more detail, but basically it boils down to four steps. On the second tab of this spreadsheet that I created for you, you will see several columns. They are identified and explained below:

Website

Put the domain of the citation source. This will help you sort it later for easy tracking.

Business Name

Copy and paste the business name from the citation you want to keep here. If the one you want to keep is wrong, paste it here anyhow. We will correct it later.

Address

Copy and paste the address including suite # from the citation you want to keep here. If the one you want to keep is wrong paste it here anyhow. We will correct it later.

City State Zip

Copy and paste the City, State, and Zip Code from the listing here.

Phone

Copy and paste the Phone Number from the listing here.

Links To

Put the URL that the citation is linking to if applicable.

Issues

Put the main issue here. Mention all issues if possible. If the citation is a Duplicate and has an incorrect name I would put “Duplicate | Incorrect Business Name”

URL Of Live Listing

Copy and paste the URL of the citation source so we can refer to it later if needed.

Duplicate 1

Copy and paste the URL of any duplicates here

Duplicate 2

Copy and paste any duplicates here

Status

I added a status column so you can track and update the state of each fix. Sometimes when you contact a site, they may not be prompt.


Green

If you highlight the row in this color, you have confirmed there are no issues with this citation and no duplicates.

Yellow

There is an issue with this listing – for example, the company name is missing “The” in front, or the suite number isn’t perfect. Basically, this is for minor secondary issues that don’t need fixing, but that you could fix if you wanted to.

Red

If there is a major issue with the NAP such as wrong Name, Address, Phone Number or a Duplicate you can mark it as red. This will help us to prioritize our work later.
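If you prefer to generate the tracking sheet programmatically rather than copy mine, the columns above map onto a simple CSV. Here is a minimal sketch; the column names mirror the spreadsheet described above, while the file name and example row values are placeholders of my own:

```python
import csv

# Columns mirror the tracking spreadsheet described above.
COLUMNS = ["Website", "Business Name", "Address", "City State Zip", "Phone",
           "Links To", "Issues", "URL Of Live Listing", "Duplicate 1",
           "Duplicate 2", "Status"]

with open("citation_audit.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(COLUMNS)
    # Example row: a duplicate listing with a wrong business name,
    # which would be highlighted red in the spreadsheet.
    writer.writerow(["yellowpages.com", "Example Business", "123 Main St Ste 4",
                     "Anytown, CA 92675", "(555) 555-0100",
                     "http://www.example.com",
                     "Duplicate | Incorrect Business Name",
                     "http://www.yellowpages.com/listing1",
                     "http://www.yellowpages.com/listing2", "", "Contacted"])
```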

Below are two screenshots of how the spreadsheet looks when you pull it up.

Once you start finding the citations, you will want to color-code each row after evaluating the citation. This will help you prioritize your work later once you’re ready to start fixing these up.

The Process

  1. Audit Your Citations – Using my spreadsheet and the methods listed below, start by auditing every citation source you find for your business.
  2. Record the Data – Record the NAP information in the spreadsheet provided, and don’t be shy with the details. After you have identified a problem, make sure to color code the row. Red is a very important fix, Yellow is something you can fix but can wait, and Green is good, meaning there are no problems and no duplicates.
  3. Outreach & Fix – Once you have a list of your action items, you can sort the list by RED or priority items. You can then reach out to these sites and record it in the notes with the date.
  4. Follow Up, Record, & Repeat – You can’t just send an email or contact form and call it good. You have to follow up. Don’t change the color of the row until the live listings are fixed. This will allow you to check and re-check until these issues are cleaned up. The reality is that some of these listings will require multiple contacts to get fixed (just like link removals).

Finding Your Incorrect Citations
Finding these citation sources can be a difficult task. However, if you already have a list of primary citations you want to tackle, you’re in a good spot. Remember that focusing your efforts on the primary sources will provide the most ROI.

When you’re searching for citations using these methods, you will want to search for each of the variations you identified to ensure complete accuracy. In other words, don’t just search a directory by the proper business name or phone number. Also search it with the WRONG information you identified, to see if any wrong sources come up.

Method 1: Search Specific Directories & Websites
If you only have one business location, this task gets a bit easier, as there is a search string you can use to narrow down your results. However, if you’re a multi-location business, it may not work as well (depending on how many locations you have). This search string is going to use three commands. The first command, site:, searches only within the website immediately following the colon. If I just wanted to search Yellowpages.com, I could type site:yellowpages.com. Putting information after the site: command will help narrow down your search. Let’s say that I wanted to search only YellowPages.com for my exact company name, but only for listings that DO NOT contain the primary phone number associated with my NAP. In this case I could put in this search:

site:yellowpages.com “The Reeves Law Group” -714-550-6000

  • The site: command tells Google to search only the website (in this case YellowPages.com)
  • The quoted “The Reeves Law Group” tells Google to only return results that include the company name in that exact phrase order
  • The -714-550-6000 tells Google not to include any results that use this phone number. The minus sign allows you to exclude information you don’t want to appear in the results.

If you have a list of citation sources you want to check, such as the Moz Top 10 by city or industry, you can use these search strings to identify duplicates and problems on the primary sites. Most reputable websites also have an internal search function, and it’s important to check this too, as the Google site: command only searches for indexed citations. It’s possible that an incorrect listing may not be indexed yet, but could cause problems in the future.
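If you’re checking the same handful of directories for the same business, it can help to generate these search strings up front and paste them into Google one at a time. Here is a minimal sketch that builds the query described above for a list of directories; the directory list is a placeholder, and the name and phone number are just the example values used in this article:

```python
# Hypothetical helper: build "site:" queries that look for the business name
# on each directory while excluding the correct phone number.
directories = ["yellowpages.com", "yelp.com", "citysearch.com"]
business_name = "The Reeves Law Group"   # exact name, kept in quotes
correct_phone = "714-550-6000"           # exclude listings that already have it

for site in directories:
    query = f'site:{site} "{business_name}" -{correct_phone}'
    print(query)
# Prints, e.g.:
# site:yellowpages.com "The Reeves Law Group" -714-550-6000
```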

Method 2: Searching Google’s Indexed Citations
While you can search specific directories for incorrect citations if you already know the websites you want to check, what if you don’t have that list? Another easy way is to pull the incorrect results directly from Google. To do this, we will use the MozBar and modify our search settings, which allows us to scrape 100 results at a time. Simply follow the example below.

Before using this method, you need to change your search settings in Google. Start by pulling up Google.com and clicking on the gear icon in the upper right-hand corner of a search page. Navigate to Search Settings, check the button “Never Show Instant Results,” and then change the Results Per Page slider to 100. This will allow you to see 100 results at a time.

You will also need the MozBar for this. If you don’t have the extension you can download the Chrome version here and the Firefox version here. Once you have the MozBar installed you will be ready to start scraping these results!

When the MozBar is on and you do a search in Google, you will now see 100 results and can easily export them by clicking the export button in the top-left corner of the MozBar, as shown in the example below. Once you have these results, you can copy and paste them into my trusty spreadsheet for evaluation. Of course, if you’re doing a lot of searches, I recommend conducting the searches first, combining the results, and then removing duplicates in Excel. This will save you a ton of time!
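If you’d rather combine the exports in code than in Excel, a short pandas sketch can concatenate the CSVs and drop duplicate URLs in one pass. The file pattern and column name below are assumptions – adjust them to match what your MozBar export actually produces:

```python
import glob
import pandas as pd

# Hypothetical file layout: each MozBar export saved as search_*.csv.
frames = [pd.read_csv(path) for path in glob.glob("search_*.csv")]
combined = pd.concat(frames, ignore_index=True)

# Keep one row per listing URL; the column name depends on your export,
# so change "URL" to match the actual header.
deduped = combined.drop_duplicates(subset="URL")
deduped.to_csv("combined_results.csv", index=False)
```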

So Which Search Operators Should You Use?
Using the proper search operators and getting a bit creative will save you a substantial amount of time. Don’t think that you’re stuck with the ones I have provided below – get creative and think outside of the box based on your situation. Below are some examples you can use, along with explanations, sorted by category.

Casey’s OCD Pro Tip: Using Google can produce different results depending on how the data is entered on the actual citation site. For example, it’s a good idea to search different phone number variations. Some variants include: 1111111111, 111-111-1111, and (111) 111-1111. Take note that when you do a search with quotes around the keyword (e.g. “Keyword One”), it will search for the words in that order, exactly as they appear. If you want to learn more about creative boolean search terms, check out this resource.
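To make sure you cover the common formats, you can generate the variants from the raw digits rather than typing them by hand. A quick sketch, with the formats chosen to match the examples above (the function name is my own):

```python
def phone_variants(digits):
    """Return common formatting variants of a 10-digit US phone number."""
    assert len(digits) == 10 and digits.isdigit()
    area, prefix, line = digits[:3], digits[3:6], digits[6:]
    return [
        digits,                      # 7145506000
        f"{area}-{prefix}-{line}",   # 714-550-6000
        f"({area}) {prefix}-{line}", # (714) 550-6000
    ]

for variant in phone_variants("7145506000"):
    print(f'"{variant}"')  # quote each variant for an exact-match search
```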

How to Find Listings With Incorrect Phone Numbers

What you should search: 800 Number -Local Number
Example: 800-644-8000 -714-550-6000

This search, when performed in Google, tells it to search for the main company’s 800 number (800-644-8000) but exclude the local phone number of the main office (which is why I used the minus sign before 714-550-6000). If you have one location that uses or used an 800 number at some point, this will be your primary go-to search. If you have multiple locations, though, it will likely just return results from the other locations. You could, of course, add a minus in front of additional offices’ numbers and search this way as well.

What you should search: 800 Number -Local Number +Company Name
Example: 800-644-800 -714-550-6000 +Reeves Law Group

At first glance, this search probably looks like the one above with the simple addition of +Reeves Law Group. However, take a closer look and notice how I took out one “0” from the 800 number. What I am doing here is looking for a possible wrong entry, while also making sure that part of the company name (in this case The Reeves Law Group) is shown in the string.
How To Find Incorrect Business Names That Have The Proper Phone

What you should search: 555-555-5555 -“Company Name”

Searching for the office location’s phone number and excluding the company’s name using the -“Company Name” command will show all results for that phone number that do not mention the proper company name. This is an easy way to find variants of the business name across the internet.
Other Searches You Can Try

What you should search: “Business Name”+“Address”

With quotes, this will search for all instances of the exact business name and exact address you put in. The more specific you get, the narrower the search results will be.

What you should search: “Business Name”+“Zip code”

Doing this will give you another list of options that could include listings without the proper business phone number.

Finding Which Citations are Correct

What you should search: “City Name”+“Zip Code”+“Company Name”+“Phone Number”

Thankfully, you can also use these tricks to see which citations you have that are correct. If you’re scanning for citations this way make sure you also check each of these sites for possible duplicates as you could have one correct listing and one or more bad ones too.

Once you have your list of sources, you can use the MozBar export option outlined above and sort through these on the spreadsheet.

Once You Have Them Documented, You Can Prioritize and Outreach

Once all of these are properly documented comes the painstaking task of fixing them. Some of these websites will allow you to claim listings and directly edit them, which is nice. For others, you will have to hunt for the contact information; if you can’t find it, I recommend checking the site’s WHOIS information to get the domain owner’s details. Most reputable sites, though, will have some way of contacting them.

Usually when you encounter duplicate listings, you will have to contact the website to get them removed. Be patient. Remember that in most of these cases you’re not paying to be listed on their website so their response can take some time. Be sure to document your contact dates in the spreadsheet as well so you can easily follow up.

Here are a few tips for the outreach methods:

  • Make sure all email contacts come from an email address on your website’s domain, such as Webmaster@YourDomain.com. This may help the back-and-forth verification process where possible.
  • Some listings will require you to claim and verify them, and may call the business with an automated system. Be prepared to take a few calls.
  • Always be very clear with your request, but also be concise. They don’t typically spend a lot of time on these requests, so make it as easy as possible by including the links.
  • Make sure to read the website’s FAQs for removing duplicates or updating listings. It will save you a lot of time, and they may already have a process in place for this.
  • If you can’t find the procedure, try the contact form on the website first, then email if you don’t hear back in a reasonable amount of time.
Contacting Websites to Fix Listings via Email

Below is a very quick and easy sample outreach email I use for some of these contacts. This example can be used if you have two listings at YellowPages.com that are on the following URLs:

  1. http://www.YellowPages.com/Listing1
  2. http://www.YellowPages.com/Listing2

Sample Contact Email:
Hello,
I recently discovered that your website has two listings for my business, “Business Name” located at “Address”. I was hoping you could help me delete the duplicate listing.
The correct listing is: 1) http://www.YellowPages.com/Listing1
The listing I need deleted is: 2) http://www.YellowPages.com/Listing2
Could you please notify me once you have had the chance to fix this?
Thanks!
– Business Owner

How This Helps
By sending out clear and concise emails, you may eliminate the back-and-forth and get listings fixed quicker. Over time you may notice that some of these websites don’t reply. The reality is that some of them won’t, or will charge a fee to make the fix. You can decide on a case-by-case basis whether these are important enough to worry about.

Conclusion
I hope you found this guide useful and that it’s something tactical you can put to use right away. Using this method, you will be off to a good start at fixing up your citations. Like everything else in local search, this will take time to clean up and time to process. Let Google find and index these changes naturally over time and watch your local rankings soar. If you have any other tips for citation cleanup, please post them in the comments below. Additionally, if you have any specific questions, please feel free to contact me directly anytime. Just take it one step at a time and you’ll be done before you know it!

Why Mobile Matters – Now

Having built an online business during the dot-com boom and bust, I’ve always been a bit skeptical about the mobile revolution. Every year since the late 90s, we’ve heard that this would be “The Year” for mobile. In the past year, though, my skepticism has been challenged by a wide range of data, and I no longer believe that the mobile web is simply a miniature desktop. This post is an in-depth analysis of why I think online marketers need to start paying attention to mobile now.

Google’s “Mobile First” Shift
It’s no mystery that I follow Google’s actions pretty closely. When Google launched a significant redesign back in March, Jon Wiley – Lead Designer for Google Search – posted this on Google+:

For a long time, we’ve assumed that mobile would naturally follow desktop, and trends like the slow death of WML (Wireless Markup Language) seemed to support that assumption. In the past two years, though, Google has repeatedly designed and launched new features on mobile first, including the most recent ad format and the latest version of Google Maps.
So, it raises the question: what does Google know that the rest of us don’t?

Google’s Greatest Fear
In July of 2013, Google migrated AdWords advertisers to what it calls “enhanced” campaigns. Many in the industry viewed this as a euphemism for preventing advertisers from bidding separately on mobile and tablet vs. desktop. Google had been experiencing long-term CPC losses, and most analysts blamed those losses on advertisers’ unwillingness to pay the same rates for mobile/tablet clicks as they did for desktop.
Google has strongly resisted splitting out mobile vs. desktop performance, going as far as to tell the SEC that “…disclosing or quantifying the impact of only one factor, such as platform mix, could be misleading and confusing to investors.” This has nothing to do with usability or confusion – Google is afraid of mobile and its impact on their $60B bottom line, the vast majority of which depends on advertising. Mobile-first design is about survival, plain and simple.

Google’s Multi-Screen World
Back in 2012, Google released a fascinating study about the multi-screen world. It paints a complex picture of how we use multiple screens to navigate the web, and often perform activities across mobile, tablet, and desktop. Google ended that report with eight conclusions, and this was the final one:

What led them to this conclusion? A couple of data points give a very interesting view of the impact of mobile on search. First, Google reported (see slide #20) that a full 65% of searches begin on mobile phones. Second, they found – which seems obvious in retrospect – that we reach for the “screen” that’s closest (slide #34). So, if you see something on TV, hear about it on XM Radio in the car, or read about it in the doctor’s waiting room, you’re going to reach for your mobile phone.

More Mobile Trends (2014)

Recently, Mary Meeker’s closely-watched annual state of the internet report was released, and it contains a great deal of data about where mobile is headed. Smartphone adoption is climbing and tablet sales are skyrocketing, but I’d like to focus on one graph that sums up the trend pretty well (from slide #9):

Globally, the percentage of page views coming from mobile devices has jumped substantially in the past year, and accounts for almost one-fifth of North American page views. Critics will argue that desktop usage has not substantially decreased, and that’s true, but the problem is this – as mobile gets to be a larger piece of the picture, we’re seeing less and less of that picture by excluding mobile data.
Look at it this way – let’s say we had a sample of 1M page views, and all of them came from desktop visitors. That would give us the pie on the left. Now, let’s say desktop holds steady at 1M page views, but mobile is now 19% of total views. This is what that reality would look like:

If we only look at those 1M page views, then it seems like nothing has changed, but the reality is that the desktop piece of the pie has shrunk. If we ignore mobile in this case, we’re missing out on 234,568 page views, and our picture is incomplete.
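The arithmetic behind that 234,568 figure is straightforward: hold desktop at 1M views and solve for the total that makes mobile 19% of it. A quick worked sketch:

```python
desktop = 1_000_000   # desktop page views, held constant
mobile_share = 0.19   # mobile's share of total page views

# If mobile is 19% of the total, desktop must be the remaining 81%.
total = desktop / (1 - mobile_share)
mobile = total - desktop
print(round(mobile))  # 234568
```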
Why This Matters for Search

So what if someone starts a search on mobile – why should that matter to us as search marketers? The problem is simple: while Google’s desktop search design is being inspired by mobile design, the reality of a small screen means that mobile SERPs can look very different. Just as Google found with ad CTRs, this can lead to very different user behavior.
So, how different are mobile SERPs? I’d like to look at a few notable examples of desktop vs. mobile SERPs, starting from most similar to least similar. For all of these examples, the desktop SERP was captured on a Windows 7 PC using Chrome, at 1280×1024, and the mobile screen was captured on an iPhone 5S using Safari.
Here’s a fairly basic SERP (a search for “plumbers”) with ads and some local features. The desktop version is on the left, and the mobile version is on the right. I apologize for the reduced size, but I felt that a side-by-side version would be the most useful:

The impact of the smaller screen here is readily apparent – even though the desktop SERP shows eight full ads above the fold and the mobile SERP shows only two, the desktop screen still has room for three organic results, a map, and a couple of local pack results. Meanwhile, the one organic result that does pop up on the mobile screen has the advantage of being the only organic element on the “page”.
Unfortunately, we have very little data on relative CTR for either ads or organic results, and Google is tweaking both designs all of the time. I think the core point is that these user experiences, even for a relatively straightforward SERP, are clearly different.
Let’s look at another SERP (“army birthday”) where the major elements are similar, but the screen space creates a different experience. In this case, we get one of the new answer boxes:

An answer box is disruptive on any screen, but on the mobile screen it occupies almost the entire SERP above the fold. Of course, scrolling is easier and more natural on mobile, so I don’t want to pretend this is a true apples-to-apples comparison, but if the answer meets the user’s needs, they’re unlikely to keep looking.
Let’s look at a standard Knowledge Graph box, in this case one for a local entity (“woodfield mall”). Here, while the styles of the Knowledge Graph boxes are similar, the SERPs are radically different:

While the desktop SERP has a rich Knowledge Graph entry, we also see a substantial amount of organic real estate. On the mobile SERP, a condensed Knowledge Graph box dominates. That box also contains mobile-specific features, like click-to-call and directions, which could easily divert the searchers and keep them from scrolling down to organic results.
Finally, let’s consider a SERP where the presentation and structure are completely different between desktop and mobile. This is a search for “pizza” (from the Chicago suburbs, where I’m located), which triggers a local carousel:

Carousels – whether they’re local, Knowledge Graph, or the newer song and episode lists – are a great example of mobile-first design. While the desktop carousel seems out of place in Google’s design history and requires awkward horizontal scrolling, the mobile carousel is built for a finger-swipe interface. What’s more, the horizontal swipe may derail vertical scrolling to some degree. So, again, a single element dominates the mobile SERP in this example.

The Mobile Feature Graph
These differences naturally lead to a follow-up question – do mobile SERPs just look different, or are they fundamentally showing different rankings and features than desktop SERPs? You may be familiar with the MozCast Feature Graph, which tracks the presence of specific SERP features (such as ads, verticals, and Knowledge Graph) across 10K searches. I decided to run the same analysis across mobile results and compare the two.
The table below shows the presence of features across both desktop and mobile SERPs. Data was recorded on June 5th. Both data sets were depersonalized and half of the queries (5K) were localized, to five different cities.

For the most part, SERP features were consistent across the two devices. While it’s very difficult to compare two sets of rankings (even when they differ only by a few hours), the similar number of sitelinks suggests a similar make-up of 10-result vs. 7-result SERPs. A cursory glance at the data suggests that page-1 rankings were not dramatically different.
The big feature difference (which is entirely driven by layout considerations) was in the presence and structure of AdWords blocks. Mobile SERPs only allow top and bottom ad blocks, since there’s no right-hand column. While bottom-of-page ads are the rarest block on desktop SERPs, they’re fairly common on mobile SERPs. The overall presence of ads in any single position was lower on mobile than desktop (at least for this data set). All of this has CTR implications, but we as an industry don’t have adequate data on that subject at present.
The local data is somewhat surprising – I would have predicted a noticeably higher presence of local pack results in mobile SERPs. Google has implied that as many as half of mobile searches have local intent, with desktop trailing substantially. Unfortunately, collecting comparable data required matching the local methodology across both sets of SERPs, so my methodology here is unreliable for determining local intent. This data only suggests that, if local intent is the same, local results will probably appear consistently across desktop and mobile.

The Google Glass Feint
Beyond our current smartphone and tablet world is the next generation of wearable technology, which promises even more constrained displays. Right now, we tend to think of Google Glass when we hear “wearables,” and it’s easy to dismiss Glass as an early-adopter fad. When we dismiss Glass, though, I think we’re missing a much bigger picture. Let’s say our timeline looks something like this, with us in the present and Glass in the future…

In other words, I think it’s fair to say that Glass, whether you love it or hate it, was clearly a future-looking move, one that pushed our comfort zones. It was ahead of what we were ready for, and so Google pulled us ahead…

Let’s say we’re not quite halfway-ready for Glass. Stay with me – there’s a point to my crude line art. What about the wearables that aren’t quite as futuristic, including the wide array of fitness band options and the coming storm of smartwatches? Our perception now looks something like this…

Before Glass, we were just warming up to fitness bands, and smartwatches still sounded a bit too much like science fiction. After Glass, challenged with that more radical view of the future, fitness bands almost seem passé, and smartwatches are looking viable. I’m not sure if any of this was intentional on Google’s part, but I strongly believe that they’ve moved the market and pushed ahead our timeline for adopting wearables.
This isn’t just idle speculation paired with pseudo-scientific visuals (it is that, but it’s not just that) – Samsung sold half a million Galaxy Gear smartwatches in Q1 of 2014. Google has recently announced Android Wear, and the first devices built on it have hit the market. More Android-based devices are likely to explode onto the market in the second half of 2014. Rumors of an Apple smartwatch are probably only months away from becoming reality.
I expect solid smartwatch adoption over the next 3-5 years, and with it a new form of browsing and a new style of SERPs. If the smartphone is our closest device and first stop today, the smartwatch will become the next first stop. Put simply, it’s easier to look at our wrists than reach for our pockets. The natural interplay of smartwatches and smartphones (Android Wear already connects smartwatches to Android-powered phones, as does Google Glass) will make the mobile scene even richer and more complex.

What It Means for You
My goal is to put the data out there as matter-of-factly as possible, but I personally believe that the long-awaited mobile disruption is upon us. Google is designing a SERP that’s not only “mobile first”, but can be broken into fragments (like answer boxes and Google Now “cards”) that can be mixed-and-matched across any device or screen-size. Search volume across non-desktop devices will increase, and mobile in all its forms may become the first stop for the majority of consumer searches.
For now, the most important thing we can do is be aware. I’ve always encouraged browsing your “money” terms – what does your URL really look like on a SERP, and how does the feature set impact it? I’d strongly encourage the same for mobile – open a phone browser and really try to see what the consumer is experiencing. If your business is primarily local or an impulse buy driven by TV and other advertising, the time to consider mobile is already behind you. For the rest of us, the mobile future is unfolding now.

How To Tap Into Social Norms to Build a Strong Brand


In recent years there has been a necessary shift in the way businesses advertise themselves to consumers, thanks to the increasingly common information overload experienced by the average person.
In 1945, just after WWII, the annual total ad spend in the United States was about $2.8 billion (around $36.8 billion after adjusting for inflation). In 2013, it was around $140 billion.
Don’t forget that this is just paid media advertising; it doesn’t include the many types of earned coverage like search, social, email, supermarket displays, direct mail, and so on. Alongside the growth in media spend is a growth in the sheer volume of products available, which is made possible by increasingly sophisticated technologies for sales, inventory, delivery, and so on.
What does this mean? Well, simply that the strategy of ‘just buy some ads and sell the benefits’ isn’t enough anymore: you’ll be lost in the noise. How can a brand retain customers and create loyalty in an atmosphere where everyone else has a better offer? Through tapping into the psychology of social relationships.
Imagine that you are at home for Thanksgiving, and your mother has pulled out all the stops to lovingly craft the most delicious, intricate dinner ever known to man. You and your family have enjoyed a wonderful afternoon of socializing, snacking on leftovers, and watching football, and now it’s time to leave. As you hug your parents goodbye, you take out your wallet. “How much do I owe you for all the love and time you put into this wonderful afternoon?” you ask. “$100 for the food? Here, have $50 more as a thank you for the great hospitality!” How would your mother respond to such an offer? I don’t know about your mother, but my mom would be deeply offended.
New scenario: You’ve gone to a restaurant for Thanksgiving dinner. It’s the most delicious dinner you’ve ever had, the atmosphere is great with the football playing in the background, and best of all, your server is attentive, warm, and maternal. You feel right at home. At the end of the meal, you give her a hug and thank her for the delicious meal before leaving. She calls the cops and has you arrested for a dine-and-dash.
And herein lies the difference between social norms and market norms.

Social norms vs. market norms
The Thanksgiving dinner example is one I’ve borrowed from Dan Ariely’s book, Predictably Irrational: The Hidden Forces That Shape Our Decisions. Ariely discusses two ways in which humans interact: social norms and market norms.

Social norms, as Ariely explains, “are wrapped up in our social nature and our need for community. They are usually warm and fuzzy. Instant paybacks are not required.” Examples would be: helping a friend move house, babysitting your grandchild, having your parents over for dinner. There is an implied reciprocity on some level but it is not instantaneous nor is it expected that the action will be repaid on a financial level. These are the sort of relationships and interactions we expect to have with friends and family.

Market norms, on the other hand, are about the exchange of resources and in particular, money. Examples of this type of interaction would be any type of business transaction where goods or services are exchanged for money: wages, prices, rents, interest, and cost-and-benefit. These are the sort of relationships and interactions we expect to have with businesses.
I’ve drawn you a very rough illustration – it may not be the most aesthetically pleasing visual, but it gets the point across:

Market norms come into play any time money enters the equation – sometimes counter-intuitively! Ariely gives the example of a group of lawyers who were approached by the AARP and asked whether they would provide legal services to needy retirees at a drastically discounted rate of $30/hour. The lawyers said no. From a market norms perspective, the exchange didn’t make sense. Later, the same lawyers were asked whether they would consider donating their time free of charge to needy retirees. The vast majority said yes. The difference is that, when no money changes hands, the exchange shifts from a poor-value market exchange to an altruistic and therefore high-value social exchange. It is a strange psychological quirk that ‘once market norms enter our considerations, the social norms depart.’

Mixed signals: when social and market norms collide
In a book called Positioning: The Battle for Your Mind by Al Ries and Jack Trout (originally published in 1981), the authors describe the 1950s as the ‘product era’ of advertising, when ‘advertising people focused their attention on product features and customer benefits.’ It was all about the unique selling proposition (USP).

In this case, the USP is mildness: “not one single case of throat irritation!” (image source)
However, as the sheer volume of products on the market increased, it became more difficult to sell a product simply by pointing out the benefits. As Ries and Trout put it, ‘Your “better mousetrap” was quickly followed by two more just like it. Both claiming to be better than the first one.’
They describe the next phase of advertising (which hit its peak in the 1960s and 70s and which we can probably all relate to if we watch Mad Men) as the ‘image era’, pioneered by David Ogilvy. In this period, successful campaigns sold the reputation, or ‘image’ of a brand and a product rather than its features. Ries and Trout quote Ogilvy as saying that ‘Every advertisement is a long-term investment in the image of a brand’. Examples include Hathaway shirts and Rolls-Royce.

Rather than the product benefits, this ad focuses on the ‘image’ of the man who smokes Viceroys: “Viceroy has a thinking man’s filter and a smoking man’s taste.” (image source)
But yet again, as more and more brands imitate the strategy of these successful campaigns, the space gets more crowded and the consumer becomes more jaded and these techniques become less effective.
According to Ries and Trout, this brought the world of advertising into the ‘positioning era’ of the 80s, which is where they positioned (hehe) themselves. As they described this, “To succeed in our overcommunicated society, a company must create a position in the prospect’s mind, a position that takes into consideration not only a company’s own strengths and weaknesses, but those of its competitors as well.”

This one’s all about positioning Winston’s in opposition to competitors: as the brand with real taste, as opposed to other brands which ‘promise taste’ but fail to deliver. (image source)
And yet, despite this evolution of advertising strategy over the course of the 20th century, all of these different approaches are ultimately based on market norms. The ‘product era’ sells you features and benefits in exchange for money; the ‘image era’ sells you an image and a lifestyle in exchange for money; and the ‘positioning era’ sells you on why a particular company is the right one to supply your needs in exchange for money.

Social norms and loyalty

When does cheap not win? When it comes to social norms. Social norms are about relationships, community, and loyalty. If your sister is getting married, you don’t do a cost-benefit analysis to decide whether you should go to her wedding or whether the food will be better and the travel cheaper if you go to your next-door neighbor’s BBQ instead. If anything, it’s the opposite: some people take it to such an extreme that they will go into massive debt to attend friends’ weddings and bring lavish gifts. That is certainly not a decision based on monetary considerations.
Therefore, if the average brand wants to get out of the vicious cycle of undercutting competitors in order to gain business, they need to start focusing on relationships and community building instead of ‘SUPER CHEAP BEST LOW LOW PRICES!!®’ and sneaky upsells at the point of sale. This is something my colleague Tim Allen spoke about in a presentation called “Make Me Love Your Brand, Not Just Tolerate It”. It’s what a large number of recent ‘advertising success stories’ are based on, and it’s the whole premise behind many of the more recent trends in marketing: email marketing, personalization, SMS marketing, good social media marketing, and so on.
Some of the most popular brands are the ones able to find the perfect balance between:

  • a friendly, warm relationship with customers and potential customers, which often includes a fun, personal tone of voice (the ‘brand personality’) – in these interactions there is often an offering of something to the customer without an expectation of instant payback, and
  • a strong product which they offer at a good price with good ‘market’ benefits like free returns and so on.
One example of this is John Lewis, who have good customer service policies around returns etc but also offer free perks to their shoppers, like the maternity room where breastfeeding mothers can relax. One of my colleagues mentioned that, as a new mother, his girlfriend always prefers to shop at John Lewis over other competitor stores for that very reason. Now if this is purely a convenience factor for her, and after her child is older she stops shopping at John Lewis in favor of a cheaper option, you could argue that this is less of a social interaction and more market influenced (in some sense it serves as a service differentiator between JL and their customers). However, if after she no longer requires the service, she continues to shop there because she wants to reciprocate their past support of her as a breastfeeding mother, that pushes it more firmly into the realm of the social.
Another thing John Lewis do for their fans is the annual Christmas ad, which (much like the Coca-Cola Santa truck in the UK) has become something people look forward to each year, because it’s a heartwarming little story more than just an ad for a home and garden store. Their 2012 ad was my favorite (and a lot of other people’s too, with over 4.5 million YouTube views).
But usually, anytime a brand does something nice for no immediate monetary benefit, it counts as a ‘social’ interaction – a classic example is Sainsbury’s response to the little girl who wrote to them about ‘tiger bread’.
Some of my other favorite examples of social norm interactions by brands are:
The catch is, you have to be careful and keep the ‘mix’ of social and market norms consistent.
Ariely uses the example of a bank when describing the danger of bringing social norms into a business relationship:
“What happens if a customer’s check bounces? If the relationship is based on market norms, the bank charges a fee, and the customer shakes it off. Business is business. While the fee is annoying, it’s nonetheless acceptable. In a social relationship, however, a hefty late fee–rather than a friendly call from the manager or an automatic fee waiver–is not only a relationship-killer; it’s a stab in the back. Consumers will take personal offense. They’ll leave the bank angry and spend hours complaining to their friends about this awful bank.”
Richard Fergie also summed this issue up nicely in this G+ post about the recent outrage over Facebook manipulating users’ emotions; in this case, the back-stab effect was due to the fact that the implicit agreement between the users and the company about what was being ‘sold’ and therefore ‘valued’ in the exchange changed without warning.

The basic rule of thumb is that whether you choose to emphasize market norms or social norms, you can’t arbitrarily change the rules.

A side note about social media and brands: Act like a normal person
In a time when the average American aged 18-64 spends 2-3 hours a day on social media, it is only logical that we would start to see brands and the advertising industry follow suit. But if this is your only strategy for building relationships and interacting with your customers socially, it’s not good enough. Instead, in this new ‘relationship era’ of advertising (as I’ve just pretentiously dubbed it, in true Ries-and-Trout fashion), the brands who successfully merge market and social norms in their advertising will be the ones able to develop the sort of reciprocal relationships that we see with our friends and family. I wrote a post over on the Distilled blog about what social media marketers can learn from weddings. That was just one example, but the TL;DR is: as a brand, you still need to use social media the way that normal people do. Otherwise you risk becoming a Condescending Corporate Brand on Facebook. On Twitter too.

Social norms and authenticity: Why you actually do need to care
Another way in which brands tap into social norms is through their brand values. My colleague Hannah Smith talked about this in her post on The Future of Marketing. Moz themselves are a great example of a brand with strong values: for them it’s TAGFEE. Hannah also gives the examples of Innocent Drinks (sustainability), Patagonia (environmentalism), and Nike (whose strapline ‘Find Your Greatness’ reflects their brand value of everyone being able to ‘achieve their own defining moment of greatness’).
Havas Media have been doing some interesting work around trying to ‘measure’ brand sentiment with something called the ‘Meaningful Brands Index’ (MBi), based on how much a brand is perceived as making a meaningful difference in people’s lives, both for personal wellbeing and collective wellbeing. Whether or not you like their approach, they have some interesting stats: apparently only 20% of brands worldwide are seen to ‘meaningfully positively impact people’s lives’, but the brands that rank high on the MBi also tend to outperform other brands significantly (by 120%).

Now, there may be a ‘correlation vs. causation’ argument here, and I don’t have space to explore it. But regardless of whether you like the MBi as a metric or not, countless case studies demonstrate that it’s valuable for a brand to have strong brand values.
There are two basic rules of thumb when it comes to choosing brand values:
1) It has to be relevant to what you do. If a bingo site runs an environmentalism campaign, it might seem a bit weird, and it won’t resonate well with your audience. You also need to watch out for accidental irony. For example, McDonald’s and Coca-Cola came in for some flak when they sponsored the Olympics, due to their reputation as purveyors of unhealthy food and drink products.
Nike’s #FindYourGreatness campaign, on the other hand, is a great example of how to tie in your values with your product. Another example is one of our clients at Distilled, SimplyBusiness, a business insurance company whose brand values include being ‘the small business champion’. This has informed their content strategy, leading them to develop in-depth resources for small businesses, and it has served them very well.
2) It can’t be so closely connected to what you do that it comes across as self-serving. For example, NatWest’s NatYes campaign claims to be about enabling people to become homeowners, but ultimately (in no small part thanks to the scary legal compliance small print about foreclosure) the authenticity of the message is undermined.

The most important thing when it comes to brand values: it’s very easy for people to be cynical about brands and whether they ‘care’. Havas did a survey that found that only 32% of people feel that brands communicate honestly about commitments and promises. So choose values that you do feel strongly about, and follow through, even if it means potentially alienating some people. The recent OKCupid vs. Mozilla Firefox episode is an illustration of standing up for brand values (regardless of where you stand on this particular example, it got them a lot of positive publicity).

Key takeaways
So what can we take away from these basic principles of social norms and market norms? If you want to build a brand based on social relationships, here’s 3 things to remember.
1)
Your brand needs to provide something besides just a low price. In order to have a social relationship with your customers, your brand needs a personality, a tone of voice, and you need to do nice things for your customers without the expectation of immediate payback.
2) You need to keep your mix of social and market norms consistent at every stage of the customer lifecycle. Don’t pull the rug out from under your loyal fans by hitting them with surprise costs after they check out, or other tricks. And don’t give new customers significantly better benefits: what you gain in the short term you will lose in the long term through the resentment existing customers will feel at having been fooled. Instead, treat them with transparency and fairness and be responsive to customer service issues.
3) You need brand values that make sense for your brand and that you (personally and as a company) really believe in. Don’t have values that don’t relate to your core business. Don’t have values which are obviously self-serving. Don’t be accidentally ironic like McDonalds.

Have you seen examples of brands building customer relationships based on social norms? Did it work? Do you do this type of relationship-building for your brand?
I’d love to hear your thoughts in the comments.
About bridget.randolph —

Bridget Randolph is a Consultant with online marketing agency Distilled. She is particularly interested in the way that developments in mobile technology and social media affect our online experiences, and how these changes impact the nature of digital marketing.

5 Fashion Hacks for the Modern Male Marketer – Whiteboard Friday

Editor’s note: Happy 4th of July! We’re off observing our Independence Day, so we decided to celebrate with a non-SEO Whiteboard Friday. From the undeniable class of a full Windsor to the (all too common) mistake of letting our underwear become accidental outerwear, today’s modern marketers are prone to some very easily solvable fashion faux pas. On this Independence Day, we take a quick break from discussing the online world and bring you a whiteboard video on the lighter side. Enjoy!

Rand: Howdy gang, and welcome to a very special Whiteboard video on men’s fashion. Well, so it turns out of all things that, in addition to my deep, deep passion around search and social media and content marketing and all things inbound, I’m also particularly passionate about what guys wear. I hope that I can be helpful in upgrading some of the things that we all wear, because as an industry, occasionally I will go to a conference or an event or someone’s office and see things that make me scratch my head and wonder whether women or men, who might be interested in these men, also scratch their head.

Hence, for you today, I have five fashion hacks for the modern male marketer. The first one is don’t let your underwear become accidental outerwear. So I’m obviously wearing underwear. I’m not going to show you, sorry, or maybe a good thing. But I’m also wearing underwear underneath this shirt, which you can see here. This is an undershirt. It’s underwear.

Now, it turns out that not all men are, let’s say, cognizant or fantastic at hiding their underwear. Let me show you what I’m talking about. Ta-da! Look at this brand new undershirt that I am now wearing and which tragically is on display for the whole world to see because I’m wearing a button-up shirt, but I’m not, of course, going to button it all the way to the top. So you’re seeing my underwear. Unintentionally my underwear has become outerwear. I just find this, well, not optimized, and as you know, I love optimization.
Now let me show you. There is an actual exception to this underwear/outerwear rule, and I’ve asked our head of Big Data, Martin York . . . Martin, would you join me here for a second? So Martin has very wisely, wonderfully worn a ring-collar shirt with a button-up. But look, it’s a T-shirt. It is not underwear. His underwear is not on display. His T-shirt is on display. This is a totally acceptable way to wear a ring-collar shirt with a button-up shirt and leave the button undone and show it off. No problem at all from a fashion standpoint.

The problem is when you do what I’m doing here. Don’t be like me. Wear a V-neck. Thanks Martin.

Another fantastic thing about having an open collared shirt and wearing an undershirt, like a V-neck underneath it, is when I get home, I don’t actually need to wash this. I can just put it right back on my hanger in my closet, and then I throw the undershirt in the wash. Undershirts are easy to wash. They’re also very inexpensive. If something happens to the undershirt, no problem. It might get a little sweaty during the day, especially filming so many videos.

My second point here, so a lot of times I talk to guys about shoes and footwear, because I’ll compliment a guy on what he’s wearing. I think it’s really cool. I’m obviously deep into footwear, right? I have my yellow Pumas, and I have all sorts of other shoes, and my footwear is just something I’m very passionate about. I love when I get compliments about my shoes, and so I like getting new ones. But I’ve noticed that many gentlemen have a big challenge around this, which is breaking in those new shoes, and I completely get where you’re coming from.

So I’m wearing today a pair of new shoes. I just recently got these. I think I’ve worn them only once or twice before, and not even the whole day, and they’re still breaking in, like they’re not quite comfortable yet. A lot of guys I talk to say, “Gosh, I hate when I buy new shoes, because I have to break them in. They take time. That’s why I have only my old pair of shoes, or I only wear athletic shoes,” or these kinds of things. That’s sad, because there’s a lot of cool things you can do with footwear.

But there’s a trick for breaking in shoes that makes it way more comfortable. It’s totally stolen from hikers. Let me show you what I’m talking about. So watch.

I’m not just wearing one pair of socks. I’m wearing two pairs of socks. So I’ve got this white sock underneath here, which it’s a fine sock. No one’s going to see it, because it’s shorter than the shoe. But then I’m actually wearing a little slip on sock over it, and the reason is I’m breaking in these shoes. Wearing a second pair of socks over them, yeah it uses up an extra pair of socks, but man it’s way more comfortable, very easy to break in new shoes. I wear this a couple of days, three days in a row, and this shoe will feel like an old worn pair, which is just awesome. Slides right over and slides right in. Now if I do it right, no one’s the wiser.

Number three on my list, it turns out that a lot of men’s shirt makers these days are doing some really cool things with kind of fashion details and hidden fashion details. They don’t have to be completely hidden. So one of my favorite things is when I buy a shirt, I look inside and I see, “Wow that’s really cool,” they’ve got kind of an off color cuff, like the cuff is a different shade, and the inside of the shirt material is a different shade than the outside. I almost want to show that off in some way. But the only time you can do it is when you’re putting on the shirt or taking off the shirt, unless you roll up your sleeves in a very clever way. Let me show you what I’m talking about.

All right. Now depending on how OCD you want to be, you can make it look even better than I have here in this short amount of time with no mirror. But you can see what’s happened is I’ve taken the cuff, rolled it up, so that now the exterior of the cuff is actually inside, and the interior is shown off, at least the top of it is shown off. I love that sort of mismatched interior exposure, showing off the detail of the shirt. It’s just a fun touch.

The other thing I love about this is, unlike traditional ways of rolling up your cuffs where it often falls down, these don’t fall. I’ve worn it on stage like this, this particular shirt in fact, on stage with the cuffs rolled up, and it doesn’t fall off, even if I’m doing my wild hand gesticulating while I’m speaking. So I really appreciate that, and I think it’s super cool that you can try this out.

Number four, details aren’t just details. In fact, details are a lot of times what makes men’s fashion really fun, really enjoyable, really shareable. So I like to do some fun stuff with all kinds of things, my eyeglasses that I wear. I’m wearing contacts right now. Obviously, this ludicrous mustache, which is a whole other story, but playing around with hairstyles.

I actually really enjoy messing around with watches. I’ve got this one from ZIIIRO that I love. You’ve probably seen me wear a couple of other ones on Whiteboard Friday.

I have this belt. It’s actually a kid’s belt. I know that’s weird, but it’s kind of fun. It’s got like these dinosaurs eating toast, like making toast and then eating toast, and then stealing it from each other kind of print on there. A belt is a very hidden thing, like you rarely, rarely see it, especially because so few men tuck in their shirts anymore, which, God forbid, please don’t tuck in your shirts, especially not T-shirts. Just don’t do it. It’s not allowed. I should bring in a gentleman from our engineering team to maybe show that off.
But in any case, other kinds of details can be really cool too. So one of the things that I love and that has been taking off in popularity in men’s fashion is socks and shoe matching. So these are some old sneakers that I’ve got, which you can see have purple flowers on the side there. So I grabbed some purple polka dot socks that kind of match the color patterns in there, a few of the flowers, and it just makes for fun. It’s something that might catch your eye as you’re walking by. Details, details.

I’ll show you another one. On the topic of undershirts and underwear, it is absolutely appropriate to wear this ring-collar undershirt when I am putting on a suit. I plan to have a tie here. I definitely don’t want an undershirt that’s going to be V-neck that will show off like a little kind of weird patch of skin underneath the shirt, especially if I’m going to be outdoors, for example, for a wedding or at work or something like that.

Now here in the U.S. it’s spring, which means formal events are coming up, a lot of weddings, Bat and Bar Mitzvahs, all kinds of stuff. It also means that a lot of gentlemen are about to make the critical mistake that you see me making right now.

Let’s imagine that I’m at a formal event, not at work. I probably wouldn’t wear a three piece suit to work, or everybody here would think I was crazy. But what’s going on? What am I missing? What am I doing wrong?

It’s my necktie. Look at this shabby tie. Can you see? There you go. Look, this knot is called a four-in-hand knot, and a four-in-hand is a very, very simple knot to tie. But it is not a formal necktie knot.

Let me show you how a formal necktie knot looks. I’m going to start with a half Windsor, and I’ll do the full Windsor as well. [Cut to new scene.] Here we have the half Windsor. This is what I call the minimum acceptable bar of necktie-knot formality for a wedding or another formal occasion.

Now let’s take a look at the full Windsor, which looks even nicer in some ways. [Cut.] Here we have the full Windsor. The full Windsor, as you can see, has this lovely sort of balance to it on both sides. It’s very even. It looks like I’m going to some sort of formal British state affair. English state affair? I don’t know. The Queen will be there. She probably won’t. In any case, what I urge you to do gentlemen is if you’re going to a formal event that demands formal wear, please bring with you a formal knot.

Now I’d very much like to thank my colleague Wes. Wes is one of our lead engineers on Moz Local, and Wes has done me the kind favor of committing a horrible fashion crime. He’s tucked in his T-shirt. Wes, would you repair that mistake?

Wes: Sure thing.

Rand: Oh my gosh, looking so much better. Please gentlemen, don’t tuck. Look, it’s fantastic. It doesn’t need to go in there. You don’t need to show that to people. Thank you, Wes, I appreciate it.

Wes: Oh, you’re welcome.

Rand: One of my favorite trends in men’s fashion, by the way, is the return of the bow tie. Now I think it looks ridiculous with 99% of outfits, but with a full suit, a three piece, or a tuxedo, a bow tie can make a great accoutrement, and it’s actually a little more fun to put on and fun to wear. It can make your outfit a little . . . well go better with a mustache anyway.

Don’t cheat by the way. They are challenging to tie, but they’re also a lot of fun. These things with the little clip, I don’t know what this is. This is like the black hat of bow ties. Don’t do it, people.
All right, everyone. I have really enjoyed having a little bit of fun, talking some men’s fashion with you guys. I’m sure there are going to be some great comments, and if you have questions about this stuff, I’m happy to answer them. I’m not an expert. This is not my field. I just like to have fun in here, and I really enjoy giving a hard time to the guys in my life who happen to tuck in their shirts or wear clip-on bow ties or mismatch their socks.

So when you see me at a conference, be sure and say hi and give me a hard time about whatever I’m wearing, because I need it and I deserve it. Thanks everyone. We’ll see you again next time. Take care.

Panda Pummels Press Release Websites: The Road to Recovery

Many of us in the search industry were caught off guard by the release of Panda 4.0. It had become common knowledge that Panda was essentially “baked into” the algorithm, refreshing several times a month, so a pronounced, named update was a surprise. While its impact was harder to isolate because it coincided with other releases, including a payday loans update and a potential manual penalty on eBay, there were notable victims of the Panda 4.0 update, which included major press release sites. Both Search Engine Land and Seer Interactive independently verified a profound traffic loss on major press release sites following the update. We can’t be certain that Google didn’t roll out a handful of simultaneous manual actions, or that these sites weren’t impacted by the payday loans algo update, but Panda remains the best explanation for their traffic losses.
So, what happened? Can we tease out why press release sites were seemingly singled out? Are they really that bad? And why are they particularly susceptible to the Panda algorithm? To answer these questions, we must first address a more fundamental one: what is the Panda algorithm?

Briefly: What is the Panda Algorithm?
The Panda algorithm was a ground-breaking shift in Google’s methodology for addressing certain search quality issues. Using patented machine learning techniques, Google used real, human reviewers to determine the quality of a sample set of websites. We call this sample the “training set”. Examples of the questions they were asked are below:

Would you trust the information presented in this article?
Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
Would you be comfortable giving your credit card information to this site?
Does this article have spelling, stylistic, or factual errors?
Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
Does the article provide original content or information, original reporting, original research, or original analysis?
Does the page provide substantial value when compared to other pages in search results?
How much quality control is done on content?
Does the article describe both sides of a story?
Is the site a recognized authority on its topic?
Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
Was the article edited well, or does it appear sloppy or hastily produced?
For a health related query, would you trust information from this site?
Would you recognize this site as an authoritative source when mentioned by name?
Does this article provide a complete or comprehensive description of the topic?
Does this article contain insightful analysis or interesting information that is beyond obvious?
Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
Does this article have an excessive amount of ads that distract from or interfere with the main content?
Would you expect to see this article in a printed magazine, encyclopedia or book?
Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
Are the pages produced with great care and attention to detail vs. less attention to detail?
Would users complain when they see pages from this site?
Once Google had these answers from real users, they built a list of variables that might potentially predict these answers, and applied their machine learning techniques to build a model for predicting poor performance on these questions. For example, having an HTTPS version of your site might predict a high performance on the “trust with a credit card” question. This model could then be applied across their index as a whole, filtering out sites that would likely perform poorly on the questionnaire. This filter became known as the Panda algorithm.

How do press release sites perform on these questions?
First, Moz has a great tutorial on running your own Panda questionnaire on your own website, which is useful not just for Panda but really any kind of user survey. The graphs and data in my analysis come from PandaRisk.com, though. Full disclosure, Virante, Inc., the company for which I work, owns PandaRisk. The graphs were built by averaging the results from several pages on each press release site, so they represent a sample of pages from each PR distributor.
So, let’s dig in. In the interest of brevity, I have chosen to highlight just four of the major concerns that came from the surveys, question by question.

Q1. Does this site contain insightful analysis?
Google wants to send users to web pages that are uniquely useful, not just unique and not just useful. Unfortunately, press release sites uniformly fail on this front. On average, only 50% of reviewers found that BusinessWire.com content contained insightful analysis. Compare this to Wikipedia, EDU and Government websites which, on average, score 84%, 79% and 94% respectively, and you can see why Google might choose not to favor their content.

But does this have to be the case? Of course not. Press release websites like BusinessWire.com have first-mover status on important industry information. They should be the first to release insightful analysis. Now, press release sites do have to be careful about editorializing the content of their users, but there are clearly improvements that could be made. For example, we know that the use of structured data and visual aids (i.e., graphs and charts) improves performance on this question. BusinessWire could extract stock exchange symbols from press releases and include graphs and data related to the business right in the post. This would separate their content from other press release sites that simply reproduce the content verbatim. There are dozens of other potential improvements that can be added either programmatically or by an editor. So, what exactly would these kinds of changes look like?

In this case, we simply inserted a graph from stock exchange data and included on the right-hand side some data from Freebase on the Securities and Exchange Commission, which could easily be extracted as an entity from the document using, for example, Alchemy API. These modest improvements to the page increased the “insightful analysis” review score by 15%.

Q2. Would you trust this site with your credit card?
This is one of the most difficult ideals to measure up to. E-Commerce sites, in general, perform better automatically, but there are clear distinctions between sites people trust and don’t trust. Press release websites do have an e-commerce component, so one would expect them to fare comparatively well to non-commercial sites. Unfortunately, this is just not the case. PR.com failed this question in what can only be described as epic fashion. 91% of users said they would not trust the site with their credit card details. This isn’t just a Panda issue for PR.com, this is a survival-of-the-business issue. 
Luckily, there are some really clear, straightforward solutions to address this problem.

Extend HTTPS/SSL Sitewide
Not every site needs to have HTTPS enabled, but if you have a 600,000+ page site with e-commerce functionality, let’s just go ahead and assume you do. Users will immediately trust your site more if they see that pretty little lock icon in their browser.

Site Security Solutions
Take advantage of solutions like Comodo Hacker Proof or McAfee SiteAdvisor to verify that your site is safe and secure. Include the badges and link to them so that both users and the bots know that you have a safe site.

Business Reputation Badges
Use at least one trade group or business reputation group (like the Better Business Bureau) or, at minimum, employ some form of schema review markup that makes it clear to your users that at least some person or group of persons out there trusts your site. If you use a trade group membership or the BBB, make sure you link to them so that, once again, it is clear to the bots as well as your users.

Up-to-date Design
This is a clear issue time and time again. In the technology world, old means insecure. The site PR.com looks old-fashioned in every sense of the word, especially in comparison to the other press release websites. It is no wonder that it performs so horribly.
It is worth pointing out here that Google doesn’t need to find markup on your site to come to the conclusion that your site is untrustworthy. Because the Panda algorithm likely takes into account engagement metrics and behaviors (like pogo-sticking), Google can use the behavior of users to predict performance on these questions. So, even if there isn’t a clear path between a change you make on your site and Googlebot’s ability to identify that change, that doesn’t mean the change cannot and will not have an impact on site performance in the search results. The days of thinking about your users and the bots as separate audiences are gone. The bots now measure both your site and your audience. Your impact on users can and will have an impact on search performance.

Q3. Do you consider this site an authority?
This question is particularly difficult for sites that don’t control the content they publish and that cover a wide variety of topics. This places press release websites squarely in the bullseye of the Panda algorithm. How does a website that accepts thousands of press releases on nearly any topic dare claim to be an authority? Well, it generally doesn’t, and the numbers bear that out. 75% of respondents wouldn’t consider PRNewswire an authority.
Notice, though, that Wikipedia performs poorly on this metric as well (at least compared to EDUs and GOVs). So what exactly is going on here? How can a press release site hope to escape from this authority vacuum?

Topically Segment Content
This was one of the very first reactions to Panda. Many of the sites that were hit with Panda 1.0 sub-domained their content into particular topic areas. This seemed to provide some relief but was never a complete or permanent solution. Whether you segment your content into sub-directories or sub-domains, what you are really doing is helping make clear to your users that the specific content they are reading is part of a bigger piece of the pie. It isn’t some random page on your site; it fits in nicely with your website’s stated aims.

Create an Authority
Just because you don’t write the content for your site doesn’t mean you can’t be authoritative. In fact, most major press release websites have some degree of editorial oversight sitting between the author and the website. That editorial layer needs to be bolstered and exposed to the end user, making it obvious that the website does more than simply regurgitate the writing of anyone with a few bucks. 
So, what exactly would this look like? Let’s return to the BusinessWire press release we were looking at earlier. We started with a bland page composed of almost nothing but the press release. We then added a graph and some structured data automagically. Now, we want to add in some editor creds and topic segmentation.

Notice in the new design that we have created the “Securities & Investment Division”, added an editor with the fancy title of “Business Desk Editor”, and included a credentialed by-line. You could even use authorship publisher markup. The page no longer looks like a sparse press release, but like an editorially managed piece of news content in a news division dedicated to this subject matter. Authority done.

Q4. Would you consider bookmarking/sharing this site?
When I look at this question, I am baffled. Seriously, how do you make a site in which you don’t control the content worth bookmarking or sharing? Furthermore, how do you do this with overtly commercial, boring content like press releases? As you could imagine, press release sites fare quite poorly on this. Over 85% of respondents said they weren’t interested at all in bookmarking or sharing content from PRWeb.com. And why should they?
So, how exactly does a press release website encourage users to share? The most common recommendations are already in place on PRWeb. They are quite overt with the usage of social sharing and bookmarking buttons (placed right at the top of the content). Their content is constantly fresh because new press releases come out every day. If these techniques aren’t working, then what will?
The problem with bookmarking and sharing on press release websites is two-fold. First, the content is overtly commercial so users don’t want to share it unless the press release is about something truly interesting. Secondly, the content is ephemeral so users don’t want to return to it. We have to solve both of these problems.
Unfortunately, I think the answer to this question is some tough medicine for press release websites. The solution is multi-faceted. It starts with putting a meta expires tag on press releases. Sorry, but there is no reason for PRWeb to maintain a 2009 press release about a business competition in the search results. In its place, though, should be company and/or categorical pages which thoughtfully index and organize archived content. While LumaDerm may lose their press release from 2009, they would instead have a page on the site dedicated to their press releases, so that the content is still accessible, albeit one click away, and the search engines know to ignore it. With this solution, the pages that end up ranking in the long run for valuable words and phrases are the aggregate pages that truly do offer authoritative information on what is up-and-coming with the business. The page is sticky because it is updated as often as the business releases new information; you still get some of the shares out of new releases, but you don’t risk the problems of PR sprawl and crawl prioritization. Aside from the initial bump of fresh content, there is no good SEO reason to keep old press releases in the index.

So, I don’t own a press release site…

Most of us don’t run sites with thousands of pages of low quality content. But that doesn’t mean we shouldn’t be cognizant of Panda. Of all of Google’s search updates, Panda is the one I respect the most. I respect it because it is an honest attempt to measure quality. It doesn’t ask how you got to your current position in the search results (a classic genetic fallacy problem), it simply asks whether the page and site itself deserve that ranking based on human quality measures (as imperfect as it may be at doing so). Most importantly, even if Google didn’t exist at all, you should aspire to have a website that scores well on all of these metrics. Having a site that performs well on the Panda questions means more than insulation from a particular algorithm update, it means having a site that performs well for your users. That is a site you want to have.

Take a look again at the questionnaire. Does your site honestly meet these standards? Ask someone unbiased. If your site does, then congratulations – you have an amazing site. But if not, it is time to get to work building the site that you were meant to build.
About russvirante —
I am the CTO of Virante, Inc. I am married to Morgan, who is frickin awesome, and I have two daughters Claren and Aven who are also frickin awesome. We live happily in Durham, NC.

Virante, Inc. is a full service Search, Social and Analytics Consulting Company.

Stop Worrying About the New Google Maps; These URL Parameters Are Gold

I suspect I’m not alone in saying: I’ve never been a fan of the New Google Maps.

In the interstitial weeks between that tweet and today, Google has made some noticeable improvements. But the user experience still lags in many ways relative to the classic version (chief among them: speed).
Google’s invested so heavily in this product, though, that there’s no turning back at this point. We as marketers need to come to terms with a product that will drive an increasing number of search results in the future.
Somewhat inspired by this excellent Pete Wailes post from many years ago, I set out last week to explore Google Maps with a fresh set of eyes and an open mind to see what I could discover about how it renders local business results. Below is what I found.

Basic URL structure
New Google Maps uses a novel URL structure (novel for me, anyway) that is not based around the traditional ? and & parameters of Classic Google Maps, but instead uses /’s and something called hashbangs to tell the browser what to render.
The easiest way to describe the structure is to illustrate it:
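In rough terms, the pattern looks like this (pieced together from the example URLs later in this post; the @ segment is the map centroid expressed as latitude, longitude, and zoom level):

https://www.google.com/maps/search/{your+query}/@{lat},{lng},{zoom}z/am=t/data={hashbangs}

Place pages follow the same scheme, with /place/{place+name} in lieu of the /search segment.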

There are also some additional useful hashbang parameters relating to local queries that I’ll describe in further detail below.

Some actual feature improvements
Despite the performance issues, New Google Maps has introduced at least two useful URL modifiers I’ve grown to love.
/am=t

This generates a stack-ranked list of businesses in a given area that Google deems relevant for the keyword you’re searching. It’s basically the equivalent of the list on the lefthand panel in Classic Google Maps but much easier to get to via direct URL. Important: am=t must always be placed after /search and before the hashbang modifiers, or else the results will break.
by:experts

This feature shows you businesses that have been reviewed by Google+ experts (the equivalent of what we’ve long called “power reviewers” or “authority reviewers” on my annual Local Search Ranking Factors survey). To my knowledge it’s the first time Google has publicly revealed who these power users are, opening up the possibility of an interesting future study correlating PlaceRank with the presence, valence, and volume of these reviews. In order to see these power reviewers, it seems like you have to be signed into a Google+ account, but perhaps others have found a way around this requirement.
Combining these two parameters yields incredibly useful results like these, which could form the basis for an influencer-targeting campaign:

Above: a screenshot of the results for https://www.google.com/maps/search/grocery+stores+by:experts/@45.5424364,-122.654422,11z/am=t/

Local pack results and the vacuum left by tbm=plcs
Earlier this week, Steve Morgan noticed that Google crippled the ability to render place-based results from a Google search (ex: google.com/search?q=realtors&tbm=plcs). Many local rank-trackers were based on the results of these queries.
Finding a replacement for this parameter in New Google Maps turns out to be a little more difficult than it would first appear. You’ll note in the summary of URL structure above that each URL comes with a custom-baked centroid. But local pack results on a traditional Google SERP each have their own predefined viewport — i.e. the width, height, and zoom level that most closely captures the location of each listing in the pack, making it difficult to determine the appropriate zoom level.

Above: the primary SERP viewport for ‘realtors’ with location set to Seattle, WA.
Note that if you click that link of “Map for realtors” today, and then add the /am=t parameter to the resulting URL, you tend to get a different order of results than what appears in the pack.
I’m not entirely sure why the order changes. One theory is that Google is now back to blending pack results (using both organic and maps algorithms). Another theory is that the aspect ratio of the viewport in the /am=t window is invariably square, which yields a different set of relevant results than the “widescreen” viewport on the primary SERP.
One thing I have found helps with replicability is to leave the @lat,lng,zoom parameters out of the URL, and let Google automatically generate them for you.
Here are a couple of variations that I encourage you to try:
https://www.google.com/maps/search/realtors/am=t/data=
followed by:
!3m1!4b1!1srealtors!2sSeattle,+WA!3s0x5490102c93e83355:0x102565466944d59a
or
!3m1!4b1!4m5!2m4!3m3!1srealtors!2sSeattle,+WA!3s0x5490102c93e83355:0x102565466944d59a
Take a closer look at those trailing parameters and you’ll see a structure that looks like this:
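Piecing together the examples above, the trailing parameters seem to break down roughly as follows:

!3m1!4b1 – the opaque hashbang combination discussed below
!1srealtors – the search keyword
!2sSeattle,+WA – the location name
!3s0x5490102c93e83355:0x102565466944d59a – the Feature ID of the area’s centroid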

The long string starting with 0x and ending with 9a is the Feature ID of the centroid of the area in which you’re searching (in this case, Seattle). Incidentally, this feature ID is also rendered by Google Mapmaker using a URL similar to http://www.google.com/mapmaker?gw=39&fid={your_fid}.
This is the easy part. You can find this string by typing the URL:
https://www.google.com/maps/place/seattle,+WA
waiting for the browser to refresh, and then copying it from the end of the resulting URL.
The hard part is figuring out which hashbang combo will generate which order of results, and I still haven’t been able to do it. I’m hoping that by publishing this half-complete research, some enterprising Moz reader might be able to complete the puzzle! And there’s also the strong possibility that this theory is completely off base.
In my research thus far, the shorter hashbang combination (!3m1!4b1) seems to yield the closest results to what tbm=plcs used to render, but they aren’t 100% identical.

The longer hashbang combination (!3m1!4b1!4m5!2m4!3m3) actually seems to predictably return the same set of results as a Local search on Google Plus — and note the appearance of the pushpin icon next to the keyword when you add this longer combination:

Who’s #1?
Many of us in the SEO community, even before the advent of (not provided), encouraged marketers and business owners to stop obsessing about individual rankings and start looking at visibility in a broader sense. Desperately scrambling for a #1 ranking on a particular keyword has long been a foolish waste of resources.
Google’s desktop innovations in local search add additional ammunition to this argument. Heat map studies have shown that the first carousel result is far from dominant, and that a compelling Google+ profile photo can perform incredibly well even as far down as the “sixth or seventh” (left-to-right) spot. Ranking #1 in the carousel doesn’t provide quite the same visual benefit as ranking #1 in an organic SERP or 7-pack.

The elimination of the lefthand list pane on New Google Maps makes an even stronger case. It’s literally impossible to rank these businesses visually no matter how hard you stare at the map:

Mobile, mobile, mobile

Paradoxically, though, just as Google is moving away from ranked results on the desktop, my view is that higher rankings matter more than ever in mobile search. And as mobile and wearables continue to gain market share relative to desktop, that trend is likely to increase.
The increasing ubiquity of Knowledge Panels in search results the past couple of years has been far from subtle. Google is now not only attempting to organize the world’s information, but condense each piece of it into a display that will fit on a Google Glass (or Google Watch, or certainly a Google Android phone).
Nowhere is the need to be #1 more dramatic than in the Google Maps app, in which users perform an untold number of searches each month. List view is completely hidden (I didn’t even know it existed until this week) and an average user is just as likely to think the first result is the only one for them as they are to figure out they need to swipe right to view more businesses.
Above: a Google Maps app result for ‘golf courses’, in which the first result has a big-time advantage.
The other issue that mobile results really bring to the fore is that the user is becoming the centroid.
This is true even when searching from the desktop. I performed some searches one morning from a neighborhood coffee shop with wifi, and a few minutes later from my house six blocks away. To my surprise, I got completely different results. From my house, Google is apparently only able to detect that I’m somewhere in “Portland.” But from the coffee shop, it was able to detect my location at a much more granular level (presumably due to the coffee shop’s wifi?), and showed me results specific to my ZIP code, with the centroid placed at the center of that ZIP.  And the zoom setting for both adjusted automatically–the more granular ZIP code targeting defaulted to a zoom level of 15z or 16z, versus 11z to 13z from my home, where Google wasn’t as sure of my location.
Note, too, that I was unable to be exact about the zoom level in the previous paragraph. That’s because the centroid is category-dependent. It likely always has been category-dependent, but that fact is much more noticeable in New Google Maps.
Maps app visibility

Taking both of these into account, in terms of replicating Google Maps app visibility, here is a case where specifying @lat,lng,zoom (with the zoom set to 17z) can be incredibly useful.
As an example, I performed the search below from my iPhone at the hotel I was staying at in Little Italy after a recent SEM SD event, and was able to replicate the results with this URL string on desktop:
http://google.com/maps/search/lawyers/@32.723278,-117.168528,17z/am=t/data=!3m1!4b1

Conclusions and recommendations
While I still feel the user experience of New Google Maps is subpar, as a marketer I found myself developing a very Strangelovian mindset over the past week or so — I have actually learned to stop worrying and love the new Google Maps. There are some incredibly useful new URL parameters that allow for a far more complete picture of local search visibility than the classic Google Maps provided.
With this column, I wanted to at least present a first stab to the Moz community to hopefully build on and experiment with. But this is clearly an area that is ripe for more research, particularly with an eye towards finding a complete replacement for the old tbm=plcs parameter.
As mobile usage continues to skyrocket, identifying the opportunities in your (or your client’s) competitive set using the new Google Maps will only become more important.
About David-Mihm —

David Mihm is one of the world’s leading practitioners of Local search engine marketing. He has created and promoted search-friendly websites for clients of all sizes since the early 2000’s. David co-founded GetListed.org, which he sold to Moz in November 2012. His annual Local Search Ranking Factors project is among the most important studies of Local SEO.

One Content Metric to Rule Them All

Let’s face it: Measuring, analyzing, and reporting the success of content marketing is hard.

Not only that, but we’re all busy. In its latest report on B2B trends, the Content Marketing Institute quantified some of the greatest challenges faced by today’s content marketers, and a whopping 69% of companies cited a lack of time. We spend enough of our time sourcing, editing, and publishing the content, and anyone who has ever managed an editorial calendar knows that fires are constantly in need of dousing. With so little extra time on our hands, the last thing content marketers want to do is sift through a heaping pile of data that looks something like this:

Sometimes we want to dig into granular data. If a post does exceptionally well on Twitter, but just so-so everywhere else, that’s noteworthy. But when we look at individual metrics, it’s far too easy to read into them in all the wrong ways.

Here at Moz, it’s quite easy to think that a post isn’t doing well when it doesn’t have a bunch of thumbs up, or to think that we’ve made a horrible mistake when a post gets several thumbs down. The truth is, though, that we can’t simply equate metrics like thumbs to success. In fact, our most thumbed-down post in the last two years was one in which Carson Ward essentially predicted the recent demise of spammy guest blogging.

We need a solution. We need something that’s easy to track at a glance, but doesn’t lose the forest for the trees. We need a way to quickly sift through the noise and figure out which pieces of content were really successful, and which didn’t go over nearly as well. We need something that looks more like this:

This post walks through how we combined our content metrics for the Moz Blog into a single, easy-to-digest score, and better yet, almost completely automated it.

What it is not

It is not an absolute score. Creating an absolute score, while the math would be equally easy, simply wouldn’t be worthwhile. Companies that are just beginning their content marketing efforts would consistently score in the single digits, and it isn’t fair to compare a multi-million dollar push from a giant corporation to a best effort from a very small company. This metric isn’t meant to compare one organization’s efforts with any other; it’s meant to be used inside of a single organization.

What it is and what it measures

The One Metric is a single score that tells you how successful a piece of content was by comparing it to the average performance of the content that came before it. We made it by combining several other metrics, or “ingredients,” that fall into three equally weighted categories:

Google Analytics
On-page (in-house) metrics
Social metrics

It would never do to simply smash all these metrics together, as the ingredients sit on wildly different scales and the larger numbers would inherently carry more weight. In other words, we cannot simply take the average of 10,000 visits and 200 Facebook likes: moving from 200 to 201 likes is an increase of 0.5%, while moving from 10,000 to 10,001 visits is an increase of just 0.01%, so the same absolute change means very different things for each ingredient. To ensure every one of the ingredients is weighted equally, we compare them to our expectations of them individually.

Let’s take a simple example using only one ingredient. If we wanted to get a sense for how well a particular post did on Twitter, we could obviously look at the number of tweets that link to it. But what does that number actually mean? How successful is a post that earns 100 tweets? 500? 2,000? In order to make sense of it, we use past performance. We take everything we’ve posted over the last two months, and find the average number of tweets each of those posts got. (We chose two months; you can use more or less if that works better for you.) That’s our benchmark—our expectation for how many tweets our future posts will get. Then, if our next post gets more than that expected number, we can safely say that it did well by our own standards. The actual number of tweets doesn’t really matter in this sense—it’s about moving up and to the right, striving to continually improve our work.
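In spreadsheet terms, that single-ingredient comparison is just a division. Assuming, as in the full formula later in this post, that tweet counts live in column F, rows 2 through 47 hold the past posts, and row 48 is the new post, the calculation would be:

=F48/AVERAGE(F2:F47)

A result of 1.5 would mean the new post earned 50% more tweets than the recent average.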

Here’s a more visual representation of how that looks:

Knowing a post did better or worse than expectations is quite valuable, but how much better or worse did it actually do? Did it barely miss the mark, or did it completely tank? It’s time to quantify.

It’s that percentage of the average (92% and 73% in the examples above) that we use to seed our One Metric. For any given ingredient, if we have 200% of the average, we have a post that did twice as well as normal. If we have 50%, we have a post that did half as well.

From there, we do the exact same thing for all the other ingredients we’d like to use, and then combine them:

This gives us a single metric that offers a quick overview of a post’s performance. In the above example, our overall performance came out to 113% of what we’d expect based on our average performance. We can say it outperformed expectations by 13%.

We don’t stop there, though. This percent of the average is quite useful… but we wanted this metric to be useful outside of our own minds. We wanted it to make sense to just about anyone who looked at it, so we needed a different scale. To that end, we took it one step farther and applied that percentage to a logarithmic scale, giving us a single two-digit score much like you see for Domain Authority and Page Authority.

If you’re curious, we used the following equation for our scale (though you should feel free to adjust that equation to create a scale more suitable for your needs):

y = 27*ln(x) + 50

Where y is the One Metric score, and x is the percent of a post’s expected performance it actually received (expressed as a decimal, so 113% becomes 1.13). Essentially, a post that exactly meets expectations receives a score of 50.

For the above example, an overall percentage of expectations that comes out to 113% translates to a One Metric score of about 53, since 27*ln(1.13) + 50 ≈ 53.3.

Of course, you won’t need to calculate the value by hand; that’ll be done automatically in a spreadsheet. Which is actually a great segue…

The whole goal here is to make things easy, so what we’re going for is a spreadsheet where all you have to do is “fill down” for each new piece of content as it’s created. About 10-15 seconds of work for each piece. Unfortunately, I can’t simply give you a ready-to-go template, as I don’t have access to your Google Analytics, and have no clue how your on-page metrics might be set up. 

As a result, this might look a little daunting at first.

Once you get things working, though, all it takes is copying the formulas into new rows for new pieces of content; the metrics will be filled in automatically. It’s well worth the initial effort.

Ready? Start here:

Make a copy of that document so you can make edits (File > Make a Copy), then follow the steps below to adjust that spreadsheet based on your own preferences.

You’ll want to add or remove columns from that sheet to match the ingredients you’ll be using. Do you not have any on-page metrics like thumbs or comments? No problem—just delete them. Do you want to add Pinterest repins as an ingredient? Toss it in there. It’s your metric, so make it a combination of the things that matter to you.
Get some content in there. Since the performance of each new piece of content is based on the performance of what came before it, you need to add the “what came before it.” If you’ve got access to a database for your organization (or know someone who does), that might be easiest. You can also create a new tab in that spreadsheet, then use the =IMPORTFEED function to automatically pull a list of content from your RSS feed.
Populate the first row. You’ll use a variety of functionality within Google Spreadsheets to pull the data you need in from various places on the web, and I go through many of them below. This is the most time-consuming part of setting this up; don’t give up!
Got your data successfully imported for the first row? Fill down. Make sure it’s importing the right data for the rest of your initial content.
Calculate the percentage of expectations. Depending on how many ingredients you’re using, this equation can look mighty intimidating, but that’s really just a product of the spreadsheet smooshing it all onto one line. Here’s a prettier version:
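In plain terms, with one Google Analytics ingredient, two on-page ingredients, and three social ingredients (the mix our spreadsheet uses), it works out to:

percent of expectations = (1/3)*(GA ratio) + (1/3)*((on-page ratio 1 + on-page ratio 2)/2) + (1/3)*((social ratio 1 + social ratio 2 + social ratio 3)/3)

where each ratio is the new post’s value for that ingredient divided by its average across the posts that came before it.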
All this is doing (remember Step 2 above, where we combined the ingredients) is comparing each individual metric to past performance, and then weighting them appropriately.
And, here’s what that looks like in plain text for our metric (yours may vary):

=((1/3)*(E48/(average(E2:E47))))+((1/3)*((F48/(average(F2:F47)))+(G48/(average(G2:G47))))/2)+((1/3)*((H48/(average(H2:H47)))+(I48/(average(I2:I47)))+(J48/(average(J2:J47))))/3)

Note that this equation goes from row 2 through row 47 because we had 46 pieces of content that served to create our “expectation.”

Convert it to the One Metric score. This is a piece of cake. You can certainly use our logarithmic equation (referenced above): y = 27*ln(x) + 50, where x is the percent of expectations you just finished calculating. Or, if you feel comfortable adjusting that to suit your own needs, feel free to do that as well. There’s a quick spreadsheet sketch of this conversion just after this list.
You’re all set! Add more content, fill down, and repeat!
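As one last sketch for that final step, assuming the percent-of-expectations calculation lives in a hypothetical column K:

=27*LN(K48)+50

LN is the natural log function in Google Spreadsheets, so a post at exactly 100% of expectations (K48 = 1) comes out to 27*LN(1) + 50 = 50.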

Here are more detailed instructions for pulling various types of data into the spreadsheet:

Adding new rows with IFTTT

If This Then That (IFTTT) makes it brilliantly easy to have your new posts automatically added to the spreadsheet where you track your One Metric. The one catch is that your posts need to have an RSS feed set up (more on that from FeedBurner). Sign up for a free IFTTT account if you don’t already have one, and then set up a recipe that adds a row to a Google Spreadsheet for every new post in the RSS feed.

When creating that recipe, make sure you include “Entry URL” as one of the fields that’s recorded in the spreadsheet; that’ll be necessary for pulling in the rest of the metrics for each post.

Also, IFTTT shortens URLs by default, which you’ll want to turn off, since the shortened URLs won’t mean anything to the APIs we’re using later. You can find that setting in your account preferences.

Pulling Google Analytics

One of the beautiful things about using a Google Spreadsheet for tracking this metric is the easy integration with Google Analytics. There’s an add-on for Google Spreadsheets that makes pulling in just about any metric a simple process. The only downside is that even after setting things up correctly, you’ll still need to manually refresh the data.

To get started, install the add-on. You’ll want to do so while using an account that has access to your Google Analytics.

Then, create a new report; you’ll find the option under “Add-ons > Google Analytics:”

Select the GA account info that contains the metrics you want to see, and choose the metrics you’d like to track. Put “Page” in the field for “Dimensions;” that’ll allow you to reference the resulting report by URL.

You can change the report’s configuration later on, and if you’d like extra help figuring out how to fiddle with it, check out Google’s documentation.

This will create (at least) two new tabs on your spreadsheet; one for Report Configuration, and one for each of the metrics you included when creating the report. On the Report Configuration tab, you’ll want to be sure you set the date range appropriately (I’d recommend setting the end date fairly far in the future, so you don’t have to go back and change it later). To make things run a bit quicker, I’d also recommend setting a filter for the section(s) of your site you’d like to evaluate. Last but not least, the default value for “Max Results” is 1,000, so if you have more pages than that, I’d change that, as well (the max value is 10,000).

Got it all set up? Run that puppy! Head to Add-ons > Google Analytics > Run Reports. Each time you return to this spreadsheet to update your info, you’ll want to click “Run Reports” again, to get the most up-to-date stats.

There’s one more step. Your data is now in a table on the wrong worksheet, so we need to pull it over using the VLOOKUP formula. Essentially, you’re telling the spreadsheet, “See that URL over there? Find it in the table on that report tab, and tell me what the number is next to it.” If you haven’t used VLOOKUP before, it’s well worth learning. There’s a fantastic explanation over at Search Engine Watch if you could use a primer (or a refresher).
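As a sketch with hypothetical names: if your post URLs sit in column B, and the add-on created a report tab called “Pageviews” with URLs in its first column and counts in its second, the lookup would be:

=VLOOKUP(B2, Pageviews!A:B, 2, FALSE)

The FALSE at the end forces an exact match on the URL.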

Pulling in social metrics with scripts

This is a little trickier, as Google Spreadsheets doesn’t include a way to pull in social metrics, and that info isn’t included in GA. The solution? We create our own functions for the spreadsheet to use.

Relax; it’s not as hard as you’d think. =)

I’ll go over Facebook, Twitter, and Google Plus here, though the process would undoubtedly be similar for any other social network you’d like to measure.

We start in the script editor, which you’ll find under the tools menu:

If you’ve been there before, you’ll see a list of scripts you’ve already made; just click “Create a New Project.” If you’re new to Google Scripts, it’ll plop you into a blank project—you can just dismiss the popup window that tries to get you started.

Google Scripts organizes what you create into “projects,” and each project can contain multiple scripts. You’ll only need one project here—just call it something like “Social Metrics Scripts”—and then create a new script within that project for each of the social networks you’d like to include as an ingredient in your One Metric.

Once you have a blank script ready for each network, go through one by one, and paste the respective code below into the large box in the script editor (make sure to replace the default “myFunction” code).

function fbshares(url) {
// Hit Facebook's legacy REST endpoint for link stats and return the total share count
var jsondata = UrlFetchApp.fetch("http://api.facebook.com/restserver.php?method=links.getStats&format=json&urls=" + url);
var object = Utilities.jsonParse(jsondata.getContentText());
Utilities.sleep(1000); // pause so a sheet full of these calls doesn't hammer the API
return object[0].total_count;
}

function tweets(url) {
// Hit Twitter's URL count endpoint and return the tweet count
var jsondata = UrlFetchApp.fetch("http://urls.api.twitter.com/1/urls/count.json?url=" + url);
var object = Utilities.jsonParse(jsondata.getContentText());
Utilities.sleep(1000); // pause so a sheet full of these calls doesn't hammer the API
return object.count;
}

function plusones(url) {
// POST to Google's RPC endpoint to retrieve the +1 count for a URL
var options = {
"method" : "post",
"contentType" : "application/json",
"payload" : '{"method":"pos.plusones.get","id":"p","params":{"nolog":true,"id":"' + url + '","source":"widget","userId":"@viewer","groupId":"@self"},"jsonrpc":"2.0","key":"p","apiVersion":"v1"}'
};
var response = UrlFetchApp.fetch("https://clients6.google.com/rpc?key=AIzaSyCKSbrvQasunBoV16zDH9R33D88CeLr9gQ", options);
var results = JSON.parse(response.getContentText());
if (results.result != undefined) {
return results.result.metadata.globalCounts.count;
}
return "Error";
}

Make sure you save these scripts—that isn’t automatic like it is with most Google applications. Done? You’ve now got the following functions at your disposal in Google Spreadsheets:

=fbshares(url)
=tweets(url)
=plusones(url)

The (url) in each of those cases is where you’ll point to the URL of the post you’re trying to analyze, which should be pulled in automatically by IFTTT. Voila! Social metrics.

Pulling on-page metrics

You may also have metrics built into your site that you’d like to use. For example, Moz has thumbs up on each post, and we also frequently see great discussions in our comments section, so we use both of those as success metrics for our blog. Those can usually be pulled in through one of the following two methods.

But first, obligatory note: Both of these methods involve scraping a page for information, which is obviously fine if you’re scraping your own site, but it’s against the ToS for many services out there (such as Google’s properties and Twitter), so be careful with how you use these.

=IMPORTXML

While getting it set up correctly can be a little tricky, this is an incredibly handy function, as it allows you to scrape a piece of information from a page using an XPath. As long as your metric is displayed somewhere on the URL for your piece of content, you can use this function to pull it into your spreadsheet.

Here’s how you format the function:
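For example, if the post’s URL is in cell B2 and your thumbs-up count lives in an element matched by the (hypothetical) XPath below, the formula would look something like this:

=IMPORTXML(B2, "//div[@class='thumb-count']")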

If you’d like a full tutorial on XPaths (they’re quite useful), our friends at Distilled put together a really fantastic guide to using them for things just like this. It’s well worth a look. You can skip that for now, if you’d rather, as you can find the XPath for any given element pretty quickly with a tool built into Chrome.

Right-click on the metric you’d like to pull, and click on “Inspect element.”

That’ll pull up the developer tools console at the bottom of the window, and will highlight the line of code that corresponds to what you clicked. Right-click on that line of code, and you’ll have the option to “Copy XPath.” Have at it.

That’ll copy the XPath to your clipboard, which you can then paste into the function in Google Spreadsheets.

Richard Baxter of BuiltVisible created a wonderful guide to the IMPORTXML function a few years ago; it's worth a look if you'd like more info.

Combining =INDEX with =IMPORTHTML

If your ingredient is housed in a <table> or a list (ordered or unordered) on your pages, this method might work just as well.

=IMPORTHTML simply plucks the information from a list or table on a given URL, and =INDEX pulls the value from a cell you specify within that table. Combining them creates a function something like this:
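As a minimal sketch (the URL, table number, and cell position are all hypothetical), this pulls the value sitting in row 2, column 3 of the first table on the page:

=INDEX(IMPORTHTML("http://example.com/blog/my-post", "table", 1), 2, 3)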

Note that without the INDEX function, the IMPORTHTML function will pull in the entire piece of content it's given. So, if you have a 15-line table on your page and you import it using IMPORTHTML, you'll get the entire table in 15 rows of your spreadsheet. INDEX is what restricts it to a single cell in that table. For more on this function, check out this quick tutorial.

Taking it to the next level

I’ve got a few ideas in the works for how to make this metric even better. 

Automatically check for outlier ingredients and flag them

One of the downsides of smooshing all of these ingredients together is missing out on the insights that individual metrics can offer. If one post did fantastically well on Facebook, for example, but ended up with a non-remarkable One Metric score, you might still want to know that it did really well on Facebook.

In the next iteration of the metric, my plan is to have the spreadsheet automatically calculate not only the average performance of past content, but also the standard deviation. Then, whenever a single piece differs by more than a couple of standard deviations (in either direction), that ingredient will get called out as an outlier for further review.
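In spreadsheet terms, that check might look something like this sketch, assuming one ingredient's scores for past posts live in column B and using two standard deviations as the threshold:

=IF(ABS(B2 - AVERAGE(B$2:B$50)) > 2 * STDEV(B$2:B$50), "outlier", "")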

Break out the categories of ingredients

Beyond combining the ingredients into categories to calculate an overall average (as in the graphic above), it might help to monitor those individual categories, too. You might, then, have a spreadsheet that looks something like this:

Make the weight of each category adjustable based on current goals

As it stands, each of those three categories is given equal weight in coming up with our One Metric scores. If we broke the categories out, though, they could be weighted differently to reflect our company’s changing goals. For example, if increased brand awareness was a goal, we could apply a heavier weight to social metrics. If retention became more important, on-page metrics from the existing community could be weighted more heavily. That weighting would adapt the metric to be a truer representation of the content’s performance against current company goals.
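As a sketch, if the three category averages lived in columns C, D, and E, a weighted One Metric score with hypothetical weights (which should sum to 1) could be as simple as:

=0.5 * C2 + 0.3 * D2 + 0.2 * E2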

I hope this comes in as handy for everyone else’s analysis as it has for my own. If you have any questions and/or feedback, or any other interesting ways you think this metric could be used, I’d love to hear from you in the comments!

About Trevor Klein —
Trevor is the content strategist at Moz—a proud member of the content team. He manages the Moz Blog, helps craft and execute content strategy, and wrangles other projects in an effort to align Moz’s content with the company’s business objectives and to provide the most valuable experience possible for the Moz community.

Setting Up 4 Key Customer Loyalty Metrics in Google Analytics

The author’s posts are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

Customer loyalty is one of the strongest assets a business can have, and one that any can aim to improve. However, improvement requires iteration and testing, and iteration and testing require measurement.
Traditionally, customer loyalty has been measured using customer surveys. The Net Promoter Score, for example, is based on the question (on a scale of one to ten), "How likely is it that you would recommend our company/product/service to a friend or colleague?" Regularly monitoring metrics like this with any accuracy is going to get expensive (and/or annoying to customers), and is never going to be hugely meaningful, as advocacy is only one dimension of customer loyalty. Even with a wider range of questions, there's also some risk that you end up tracking what your customers claim about their loyalty rather than their actual loyalty, although you might expect the two to be strongly correlated.

Common mistakes
Google Analytics and other similar platforms collect data that could give you more meaningful metrics for free. However, they don't always make them completely obvious. Before writing this post, I checked to be sure there weren't any very similar ones already published, and I found some fairly dubious recurring recommendations. The most common of these was using the percentage of return visitors as a sole or primary metric for customer loyalty. If the percentage of visitors to your site who are return visitors drops, there are plenty of reasons that could be behind it besides a drop in loyalty: a large number of new visitors from a successful marketing campaign, for example. Similarly, if the absolute number of return visitors rises, this could be as easily caused by an increase in general traffic levels as by an increase in the loyalty of existing customers.
Visitor frequency is another easily misinterpreted metric; infrequent visits do not always indicate a lack of loyalty. If you were a loyal Mercedes customer, and never bought any car that wasn't a new Mercedes, you wouldn't necessarily visit their website on a weekly basis, and someone who did wouldn't necessarily be a more loyal customer than you.

The metrics
Rather than starting with the metrics Google Analytics shows us and deciding what they mean about customer loyalty (or anything else), a better approach is to decide what metrics you want, then work out how you can replicate them in Google Analytics.
To measure the various dimensions of (online) customer loyalty well, I felt the following metrics would make the most sense:

1. Proportion of visitors who want to hear more
2. Proportion of visitors who advocate
3. Proportion of visitors who return
4. Proportion of macro-converters who convert again
Note that a couple of these may not be what they initially seem. If your registration process contains an awkwardly worded checkbox for email signup, for example, it's not a good measure of whether people want to hear more. Secondly, "proportion of visitors who return" is not the same as "proportion of visitors who are return visitors."

1. Proportion of visitors who want to hear more
This is probably the simplest of the above metrics, especially if you're already tracking newsletter signups as a micro-conversion. If you're not, you probably should be, so see Google's guidelines for event tracking using the analytics.js tracking snippet or Google Tag Manager, and set your new event as a goal in Google Analytics.
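With the analytics.js snippet in place, such an event is a one-liner; the category and action names below are just illustrative:

ga('send', 'event', 'Newsletter', 'signup');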
2. Proportion of visitors who advocate
It’s never possible to track every public or private recommendation, but there are two main ways that customer advocacy can be measured in Google Analytics: social referrals and social interactions. Social referrals may be polluted as a customer loyalty metric by existing campaigns, but these can be segmented out if properly tracked, leaving the social acquisition channel measuring only organic referrals.

Social interactions can also be tracked in Google Analytics although, surprisingly, with the exception of Google+, tracking them does require additional code on your site. Again, this is probably worth tracking anyway, so if you aren't already doing so, see Google's guidelines for analytics.js tracking snippets, or this excellent post for Google Tag Manager analytics implementations.
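For reference, a social interaction hit in analytics.js takes the network, the action, and the target URL; the values below are illustrative:

ga('send', 'social', 'Twitter', 'tweet', 'http://example.com/my-post');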
3. Proportion of visitors who return

As mentioned above, this isn't the same as the proportion of visitors who are return visitors. Fortunately, Google Analytics does give us a feature to measure this.

Even though date of first session isn't available as a dimension in reports, it can be used as a criterion for custom segments. This allows us to start building a data set showing how many visitors who made their first visit in a given period have returned since.
There are a couple of caveats. First, we need to pick a sensible time period based on our frequency and recency data. Second, this data obviously takes a while to produce; I can’t tell how many of this month’s new visitors will make further visits at some point in the future.
In Distilled’s case, I chose 3 months as a sensible period within which I would expect the vast majority of loyal customers to visit the site at least once. Unfortunately, due to the 90-day limit on time periods for this segment, this required adding together the totals for two shorter periods. I was then able to compare the number of new visitors in each month with how many of those new visitors showed up again in the subsequent 3 months:

As ever with data analysis, the headline figure doesn't tell the whole story; instead, it's something we should seek to explain. Looking at the above graph, it would be easy to conclude, "Distilled's customer loyalty has bombed recently; they suck." However, the fluctuation in the above graph is mostly due to the enormous amount of organic traffic generated by Hannah's excellent blog post 4 Types of Content Every Site Needs.
Although many new visitors who discovered the Distilled site through this blog post have returned since, the return rate is unsurprisingly lower than for the most business-orientated pages on the site. This isn't a bad thing—it's what you'd expect from top-of-funnel content like blog posts—but it's a good example of why it's worth keeping an eye out for this sort of thing if you want to analyse these metrics. If I wanted to dig a little deeper, I might start by segmenting this data to get a more controlled view of how new visitors are reacting to Distilled's site over time.

4. Proportion of macro-converters who convert again
While a standard Google Analytics implementation does allow you to view how many users have made multiple purchases, it doesn't allow you to see how these fell across their sessions. Similarly, you can see how many users have had two sessions and two goal conversions, but because you can't see whether those conversions fell in different visits, it's entirely possible that some had one accidental visit that bounced, and one visit with two different conversions (note that you cannot perform the same conversion twice in one session).
It would be possible to create custom dimensions for first (and/or second, third, etc.) purchase dates using internal data, but this is a complex and site-specific implementation. Unfortunately, for the time being, I know of no good way of documenting user conversion patterns over multiple sessions using only Google Analytics, despite the fact that it collects all the data required to do this.
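If you did go down that route, the analytics.js side is the easy part; here's a sketch, assuming you've registered a custom dimension at index 1 and your own backend can supply the date:

// Illustrative: attach a first-purchase date from internal data to a hit
ga('set', 'dimension1', '2014-06-15');
ga('send', 'pageview');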
Contribute

These are only my favourite customer loyalty metrics. If you have any of your own that you're already tracking, or any you're unsure how to track, please share them in the comments below.

Google Kills Author Photos in Search Results: What You Should Know

Google gives, and Google takes away.
Even so, it came as a surprise when John Mueller announced Google is dropping authorship photos from most search results.
This one hits particularly hard, as I'm known as the guy who optimized his Google author photo. Along with many other SEOs, I constantly advise webmasters to connect their content writers with Google authorship. Up until now, would-be authors clamored to verify authorship, both for the potential of increased click-through rates and for the greater brand visibility of introducing real people into search results.

Update: As of June 29th, the MozCast feature graph shows traditional authorship snippets dropping to 0% of search results across all data centers. Previously, Google displayed authorship photos in 22% of all searches.

How are author photos changing?
The announcement means author photos in most Google search results are going away. John Mueller indicated the change will roll out globally over the next few days.
Up until now, if you verified your authorship through Google+ and Google chose to display it, you might have seen your authorship information displayed in Google search results. This included both your author photo and your Google+ circle count.
Going forward, Google plans to only display the author’s name in the search snippet, dropping the photo and the circle count.

Google News adds a different twist. In this case, Google's plans show them adding a small author photo next to Google News snippets, in addition to a larger news photo snippet.
At this time, we're not sure how authorship in Google News will display in mobile results.

Why did Google drop author photos?
In his announcement, John Mueller said they were working to clean up the visual design of search results, and also to create a “better mobile experience and a more consistent design across devices.”
This makes sense given the way Google has embraced mobile-first design: those photos take up a lot of real estate on small screens. On the other hand, it also leaves many webmasters scratching their heads, as most seemed to enjoy the author photos, and most of the web is moving toward a more visual experience.
John Mueller indicated that testing shows "click-through behavior" with the new results is about the same, but we don't know exactly what that means. One of the reasons authors liked the photos in search results was the belief that a good photo could result in more clicks (although this was never a certainty). Will the new SERPs result in the same number of clicks for authorship results? For now, it's hard to say.
Critics argue that the one thing that will actually become more visible as a result of this change is Google's ads at the top and sides of the page.

What isn't changing?
Despite this very drastic visual change in Google search results, several things are not changing:
1. Authorship is still here
As Mark Traphagen eloquently pointed out on Google+, the loss of photos does not mean Google authorship itself is going anywhere.

“Google Authorship continues. Qualifying authors will still get a byline on search results, so Google hasn’t abandoned it.”
2. Authors’ names still appear in search results
In the new system, authors still get their name displayed in search results, which presumably still links through to their Google+ profile. Will this be enough to sway searchers into clicking a link? Time will tell.

3. Your rankings don't change
Authorship does not influence rankings for most search results (with exceptions for certain results, like in-depth articles). The photo sometimes led to more clicks for some people, but the new change should not alter the order of results.
4. You must still verify authorship for enhanced snippets
Google isn't changing the guidelines for establishing authorship. This can be accomplished either through email verification or by linking your content to your Google+ profile and adding a link back to your website from your Google+ contributor section.

Tracking your authorship CTR
If you have authorship set up, you can easily track changes to your click-through rate using Google Webmaster Tools. Navigate to Labs > Author Stats to see how many times your author information has appeared in search results, along with the total number of clicks and your average position.

In the example above, search results associated with my authorship receive around 50,000 impressions a day, with an average of 1,831 clicks, for an overall CTR of 3.6%. If you track your CTR immediately before and after the Google authorship change (by adjusting the dates in Webmaster Tools), you may be able to spot any changes caused by the shakeup.

Keep in mind that CTR is heavily influenced by rank (average position); small fluctuations in rank can mean a large difference in the number of clicks each URL receives.

Is Google Authorship still worth it?
For many, scoring photos in search results was the only incentive people had to verify authorship. Whether or not it increased click-through rates, it was an ego boost, and it was great to show clients. With the photos gone, it's likely fewer people will work to get verified.
Even with the photos gone, there is still ample reason to verify authorship, and I highly recommend you continue to do so. Even though a byline is much less visible than a photo, across the hundreds or thousands of search impressions you receive each day, those bylines can make a measurable difference in your traffic, and may improve your online visibility.
Google continues to work on promoting authoritative authors in search results, and authorship is one of the better ways for Google to establish "identity" on the web. Google keeps making statements explaining how important identity in content is, as explained by Matt Cutts both publicly and in this rarely seen interview.

Facing the future
If Google begins to incorporate more “Author Rank” signals into its search algorithm, establishing yourself as a trusted authority now could pay off big down the road. Disappearing author photos today may someday be replaced by actual higher rankings for credible authors, but there are no guarantees. 
At this point, it’s hard to say exactly where the future of authorship lies, especially given the unknown future of Google+ itself.
Personally, I will be sad to see author photos disappear. Let’s hope for something better down the road.
More from across the web: Google Removes Author Photos From Search: Why And What Does It Mean?

8 Ways to Use Email Alerts to Boost SEO – Whiteboard Friday

Link building is nowhere near dead, and some of the best link opportunities can be discovered by setting up email alerts for various things that are published on the web. In today’s Whiteboard Friday, Rand runs through eight specific types of alerts that you can implement today for improved SEO.

Howdy Moz fans, and welcome to another edition of Whiteboard Friday. Today we’re going to chat about email alerts and using them to help with some of your SEO efforts, specifically content identification, competitive intelligence, some keyword research, and, of course, a lot of link building because email alerts are just fantastic for this.

Now here’s what we’ve got going on. There are a number of tools that you can use to do email alerts. Obviously, Google Alerts, very well-known. It’s free. It does have some challenges and some limitations in scope, so you won’t be able to do everything that I’m going to talk about today.

There's Fresh Web Explorer from Moz. Of course, if you're a Moz Pro subscriber, you've probably used Fresh Web Explorer. And Fresh Web Explorer's alerts functionality, in particular, is kind of my favorite Moz feature, period, right now.

We also have some very strong, good competitors in this space—Talkwalker, Mention.net, and Tracker—all of which have many of the features that I’m going to be talking about here. So whatever program you’re using, this stuff can help.

That being said, I am going to be talking in terms of the operators that you would use for Fresh Web Explorer specifically. Google Alerts has some of these operators but not all of them, and so do Talkwalker, Mention, and Tracker. They might not have all of these, or theirs might be slightly different. So make sure you take a look at how the search operators for each of those work before you go engaging in this.

The first operator I'm going to mention is the minus command, which excludes. That's essentially saying: show me this stuff, but don't show me anything that contains this. I think that works in all of them.

Link: works in plenty of them; it shows links to a specific URL. RD: in Fresh Web Explorer shows links to the root domain, and SD: shows links to the subdomain.

Quotes, which match something exactly, work in all of these. TLD: shows only links from a given domain extension. If I want to see only German websites, I can put TLD:DE and see only sites from Germany. Then there's site:, which shows only results from a specific sub or root domain, as opposed to SD: or RD:, which show links to a subdomain or root domain.
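Putting a few of those together, alert queries might look roughly like this (Fresh Web Explorer syntax; the domains and keywords are only examples):

"dog sitting" -rd:rover.com (pages mentioning the phrase that don't link to rover.com)
rd:dogvacay.com tld:co.uk (UK pages linking anywhere on dogvacay.com)
site:humanesociety.org "dog sitting" (mentions of the phrase on one specific site)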

This will all make sense in a second. But what I want to impart is that you can be using these tools, these types of commands to get a ton of intelligence that’s updated daily.

What I love about alerts is that whether you do it weekly or daily, whatever frequency works for you, it's a constant nudge, a constant reminder to us as marketers to be concentrating on something like, oh yeah, I should really be thinking about link building. I should really be thinking about what my competition's writing about. I should really be thinking about what bloggers in this niche think about my keywords and who they're talking about when they mention these keywords, all that kind of stuff.

That nudge phenomenon of just having the repetitive cycle is really important for marketers. I feel like it helps me a tremendous amount when I get my alerts every night just to remember oh, yeah, I should do this. I should take a look at that. It’s right in my email. I take care of it with the rest of my work. Very, very helpful.

#1: Links to my competitors, but not to me

I mean come on. It’s just a gimme. It’s an opportunity for a bunch of things. It shows you what types of keywords and content people are writing about in the field, and it almost always gives you a link opportunity or at least insight into how you might get a link from those types of folks. So I love this.

I’m going to imagine that I’m Rover.com. Rover is a startup here in Seattle. They essentially have a huge network. They’re sort of like Airbnb but for people who do dog sitting and pet sitting. Great little company.

Rover has got some competitors in the field, like DogVacay.com and PetSitters.org and some of these other ones. They might, for example, create an alert that is RD:dogvacay.com. Show me people who link to my competitor’s domain, anywhere on my competitor’s domain, people who link to PetSitters.org minus RD:rover.com. Don’t show me people who also link to me. This will show them a subset of folks who are linking to their competition not linking to them. What a beautiful link building opportunity.

#2: Mentions my brand, but doesn’t link to me

Number two, another gimme and one that I’ve mentioned previously in some link building videos on Whiteboard Friday, places that mention my brand but don’t link to me. A number of these services can help you with this. Unfortunately, tragically, Google Alerts is the only one that can’t. But mentions my brand, doesn’t link to me, this is great.

In this case, because Rover’s brand name is so generic, people might use it for a lot of different things, they’re not always referring to the company Rover. They might use a keyword in here like Rover and any mention of dog sitting minus RD:rover.com. That means someone’s talked about Rover, talked about dog sitting, and they didn’t link to them.

This happens all the time. I have an alert set up for Moz that is "moz minus RD:moz.com," and I actually add minus Morrissey too, because the singer Morrissey is the most common thing people mention alongside Moz. I think I have another one that's like "moz marketing minus RD:moz.com." Literally, every week I have at least some news sites or sites that have mentioned us but haven't linked to us. A comment or a tweet at them almost always gets us the link. This is great. I mean, it's like free link building.

#3: Mentions my keywords, but doesn’t link to me

This is similar to the competitive one but a little broader in scope.

So I might, for example, say “dog sitting or pet sitting minus RD:rover.com.” Show me all the people in the space who are talking about dog sitting. What are they saying?

The nice thing is with Fresh Web Explorer, and I think Talkwalker and Mention both do this, they’re sorted in terms of authority. So you don’t just get a bunch of random jumble. You can actually see the most authoritative sites.

Maybe it is the case that The Next Web is covering pet sitting marketplaces, and they haven’t written about Rover, but they’re mentioning the word “dog sitting.” That’s a great outreach point of view, and it can help uncover new content and new keyword opportunities too.

#4: Show content produced by a competitor or news site on a topic related to me

For example, in the case of Rover.com, they might be a little creative and go, “Man, I really want to see whenever the Humane Society mentions dog sitting, which they do maybe once every two or three months. Let me just get a reminder of that. I don’t want to subscribe to their whole blog and read every post they put out. But I do really care when they talk about my topic.”

So you can set up an alert like dog sitting “site:humanesociety.org.” Perfect. Brilliant. Now I’m getting those content ideas. Potentially there are some outreach opportunities here, link building opportunities, keyword opportunities. Awesome.

#5: Show links coming from a geographic region

Let’s say, hey, I saw PetSitters.org is going international. They just opened up their UK branch. They haven’t actually, but let’s say that they did. I could create an alert like “RD:petsitters.org TLD:.co.uk.” Now it shows me all the people who are linking to PetSitters.org from the U.K. Since I know they just expanded there, I can start to target all those people who are coming out.

#6: Links to me or my site

This is very important for two reasons. One is so you know when new links are coming, where they’re coming from, that kind of stuff, which is cool to see. Sometimes you can forward those on, see what people are saying about you. That’s great.

But my favorite part of this is so I can thank those people, usually via Twitter, or so I can promote it on social media networks. Seriously, if someone’s going to go and say something nice about Rover and link to me, and it’s a third party news source or a blogger or something, I totally want to share that with my audience, because it reminds them of me and is also great promotional content that’s coming from someone else, an authoritative external voice. That’s wonderful. This can also be extremely helpful, by the way, to find testimonials for your business and press mentions that you might want to put on your site or in your conversion funnel.

#7: Find blogs that are writing about topics relevant to my business

This is pretty slick.

It turns out that most of these alerts systems will also look at the URL when they’re considering alerts, meaning that if someone has blog.domain.com, or domain.com/blog/whateverpost, you can search for the word “blog” and then something like “dog sitter.” Optionally, you could add things like site:wordpress.com, site:blogspot.com, so that you are getting more and more alerts that are showing you blogs that write about your topic, your keywords, that kind of stuff. This is pretty slick.

I especially like this one if you have a very broad topic area. I mean, if you're only getting a few results with your keywords anyway, then you can just keep an alert running that shows you everything. But if you have a very broad topic area, and dog sitting is probably one of those, you want to be able to narrow in on the blogs that you really care about or the types of sites that you really care about.

#8: Links to resources/data that I can compete with/offer a better version

I like this as a link building strategy, and I’ll use it on occasion. I don’t do it all the time, but I do care at certain points when we’re doing a campaign.

For example, a link to a resource or a piece of data that’s been collected out there on the Web that I can compete with or offer a better version of. Somebody, for example, is linking to the Wikipedia page on dog sitting or, let’s say, a statistics page from a Chamber of Commerce or something like that, and I have data that’s better, because I’ve done a survey of dog owners and pet sitting, and I’ve collected all this stuff. So I have more recent, and more updated, and more useful data than what Wikipedia has or this other resource.

I can reach out to these folks. I love seeing that. When you see these, they're often really good link targets, targets for outreach. So there's just a lot of opportunity in looking at those specific resources, why people link to them, and who does.

So, with all of this stuff, I hope you'll go set up those alerts, get your daily or weekly nudges, and improve your SEO based on all this stuff.

Thanks, everyone. See you again next week for another edition of Whiteboard Friday.

Take care.