Breaking the Bad Habit of Robotic Emails

I come from a background in customer service, sales, and social media marketing, so needless to say, I naturally love talking to people. But when I joined the world of SEO a little over two years ago, I found myself suddenly forgetting everything I knew about connecting with a human being. So for my first post on the Builtvisible blog, I thought: what better topic than the first thing I had to conquer after joining the team?
This may sound like a simple fix, but it was not an easy thing to do. I had no idea that my method just plain sucked until I was faced with the task of doing really targeted outreach. It was then that I realized my emails just weren't going to cut it. So, after a few harsh words to myself, I set out to figure out how I got so robotic in the first place in order to fix it.
The Problems:
Templates

It all started at the very beginning of my outreach career. I had one goal and 1,000+ contacts. I guess after hitting send over 1,000 times and saying the same thing over and over again, I started to become the template. Anything I wrote beyond that, even if no template was involved, started to sound like it came from one.
I honestly thought it worked. But in reality it was the mirage of line dropping: if you drop a ton of lines, one is bound to bite. But I was sick of catching the guppies!
Blowing Smoke
I learned early on in my sales role that someone can detect insincerity even over the phone. Yet one of the most common mistakes in outreach, and one I was guilty of, was exactly that. I was trying to flatter these people into thinking I was worthy of their time. It's what people in our industry like to call "ego-bait," and it's vastly overused.
If someone can tell through the words you use over the phone that you are faking it, then it must be even more obvious in plain text. Moral of the story: Don’t tell someone that they’ve influenced your whole piece when in reality you just discovered their writings 5 minutes before typing up that email.
I Barely Knew Them

I didn’t know the people on the receiving end of my emails at all and in return they had no idea who I was. Because I didn’t know them, it was really hard to try and write something out of the blue that I thought they might respond to. How is a complete stranger going to feel comfortable responding when I haven’t shown any type of genuine interest?
The Solutions:
I did my research
I started to simply do a little research on who I was contacting. But one thing led to another, and I discovered that going beyond basic research really pays off.
So, I started “social stalking”!
You will probably see this term pop up in a lot of my posts. This is a step I used to skip, but now it is no longer just a task to me; it's more of an at-work hobby of mine.
My goal is to find out as much as possible about the person and social media allows me to do that. Not only do you learn the basics like their name and location, but you now have a window into their social lives. You can see who they are engaging with, when they are conversing the most, and what gets them to respond to others. Those things alone should give you enough insight into how to approach this person in a way that merits a response.
Tip: Most of your stalking is best done on Twitter. Facebook is more personal, whereas Twitter gives even an everyday person the chance to interact with some of the biggest names. Rather than wasting time scrolling through tweet after tweet, I use AllMyTweets.net. Just plug in their Twitter handle and it will generate all of their tweets, including retweets and replies. The goal here is to look at what they are sharing along with who they are talking to. So make sure you filter out the retweets and pay extra attention to the replies.
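If you end up with a long export of tweets, that sorting step is easy to script. This is just a sketch, assuming the input is a plain list of tweet texts; the "RT @" and "@" prefixes are a rough heuristic, not an official classification:

```python
def split_tweets(tweets):
    """Partition tweet texts into replies, retweets, and ordinary shares."""
    replies, retweets, shares = [], [], []
    for text in tweets:
        if text.startswith("RT @"):
            retweets.append(text)   # filter these out
        elif text.startswith("@"):
            replies.append(text)    # pay extra attention to these
        else:
            shares.append(text)     # what they're sharing
    return replies, retweets, shares
```

Feed it the export and the replies surface on their own, which is the part worth studying before you write your email.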

I would write to someone I know
I started to write up emails with the intention of sending them to a friend or family member rather than an editor or writer. Why?
I was no longer using a template! Speaking to a friend comes naturally, so naturally you are going to sound more casual and comfortable with the person on the other end.
There was no need to blow smoke and I started to trim the unwanted junk. I also noticed I was ditching the awkward self-introduction and didn’t have to explain my motives as much.
I know them! Everyone knows it is easier to talk to someone you know than to a complete stranger. And if you have done enough social stalking, then even someone you don't know at all will feel a bit more familiar than before.
You may also find that they give some of the best feedback. Starting out I sent so many emails to family and friends with my request. They would then come back with feedback or questions so I knew what to change or better explain.
I celebrated my success
I’m big on this one and definitely see this as a solution!
A lot of the time, people get stuck in the depths of a bad habit because they aren't celebrating even the smallest of successes. Whether the response is positive or negative, the fact that someone thought you were important enough to respond to means a foot in the door. For us ex-robotic folk, a response is a nice way to be recognized as a human again. No matter how big or small, success is success!
Now, do your research, write up that email, hit send, and fist pump!


Stop Worrying About the New Google Maps; These URL Parameters Are Gold

I suspect I’m not alone in saying: I’ve never been a fan of the New Google Maps.

In the interstitial weeks between that tweet and today, Google has made some noticeable improvements. But the user experience still lags in many ways relative to the classic version (chief among them: speed).
Google’s invested so heavily in this product, though, that there’s no turning back at this point. We as marketers need to come to terms with a product that will drive an increasing number of search results in the future.
Somewhat inspired by this excellent Pete Wailes post from many years ago, I set out last week to explore Google Maps with a fresh set of eyes and an open mind to see what I could discover about how it renders local business results. Below is what I discovered.
Basic URL structure
New Google Maps uses a novel URL structure (novel for me, anyway) that is not based around the traditional ? and & parameters of Classic Google Maps, but instead uses /’s and something called hashbangs to tell the browser what to render.
The easiest way to describe the structure is to illustrate it:

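As a rough sketch of that structure, here is how the path components fit together. The helper and its labels are mine, not official Google parameters; it simply mirrors the /search/query/@lat,lng,zoom shape described here:

```python
def maps_search_url(query, lat=None, lng=None, zoom=None, list_view=False):
    """Assemble a New Google Maps search URL from its path components."""
    parts = ["https://www.google.com/maps/search", query.replace(" ", "+")]
    if None not in (lat, lng, zoom):
        parts.append(f"@{lat},{lng},{zoom}z")  # viewport: centroid + zoom level
    if list_view:
        parts.append("am=t")  # stack-ranked list view
    return "/".join(parts)
```

For example, maps_search_url("grocery stores", 45.5424364, -122.654422, 11, list_view=True) reproduces the /search/.../@lat,lng,zoom/am=t shape used throughout this post.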
There are also some additional useful hashbang parameters relating to local queries that I'll describe in further detail below.
Some actual feature improvements
Despite the performance issues, New Google Maps has introduced at least two useful URL modifiers I’ve grown to love.
/am=t
This generates a stack-ranked list of businesses in a given area that Google deems relevant for the keyword you're searching. It's basically the equivalent of the list on the lefthand panel in Classic Google Maps but much easier to get to via direct URL. Important: am=t must always be placed after /search and before the hashbang modifiers, or else the results will break.
by:experts
This feature shows you businesses that have been reviewed by Google+ experts (the equivalent of what we've long called "power reviewers" or "authority reviewers" on my annual Local Search Ranking Factors survey). To my knowledge it's the first time Google has publicly revealed who these power users are, opening up the possibility of an interesting future study correlating PlaceRank with the presence, valence, and volume of these reviews. In order to see these power reviewers, it seems like you have to be signed into a Google+ account, but perhaps others have found a way around this requirement.
Combining these two parameters yields incredibly useful results like these, which could form the basis for an influencer-targeting campaign:

Above: a screenshot of the results for https://www.google.com/maps/search/grocery+stores+by:experts/@45.5424364,-122.654422,11z/am=t/
Local pack results and the vacuum left by tbm=plcs
Earlier this week, Steve Morgan noticed that Google crippled the ability to render place-based results from a Google search (ex: google.com/search?q=realtors&tbm=plcs). Many local rank-trackers were based on the results of these queries.
Finding a replacement for this parameter in New Google Maps turns out to be a little more difficult than it would first appear. You’ll note in the summary of URL structure above that each URL comes with a custom-baked centroid. But local pack results on a traditional Google SERP each have their own predefined viewport — i.e. the width, height, and zoom level that most closely captures the location of each listing in the pack, making it difficult to determine the appropriate zoom level.

Above: the primary SERP viewport for ‘realtors’ with location set to Seattle, WA.
Note that if you click that link of “Map for realtors” today, and then add the /am=t parameter to the resulting URL, you tend to get a different order of results than what appears in the pack.
I'm not entirely sure why the order changes. One theory is that Google is now back to blending pack results (using both organic and maps algorithms). Another is that the aspect ratio of the viewport in the /am=t window is invariably square, which yields a different set of relevant results than the "widescreen" viewport on the primary SERP.
One thing I have found helps with replicability is to leave the @lat,lng,zoom parameters out of the URL, and let Google automatically generate them for you.
Here are a couple of variations that I encourage you to try:
https://www.google.com/maps/search/realtors/am=t/data=
followed by:
!3m1!4b1!1srealtors!2sSeattle,+WA!3s0x5490102c93e83355:0x102565466944d59a
or
!3m1!4b1!4m5!2m4!3m3!1srealtors!2sSeattle,+WA!3s0x5490102c93e83355:0x102565466944d59a
Take a closer look at those trailing parameters and you’ll see a structure that looks like this:

The long string starting with 0x and ending with 9a is the Feature ID of the centroid of the area in which you’re searching (in this case, Seattle). Incidentally, this feature ID is also rendered by Google Mapmaker using a URL similar to http://www.google.com/mapmaker?gw=39&fid={your_fid}.
This is the easy part. You can find this string by typing the URL:
https://www.google.com/maps/place/seattle,+WA
waiting for the browser to refresh, and then copying it from the end of the resulting URL.
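Tying the pieces together, here is a sketch of assembling those trailing data= strings from a copied Feature ID. The helper and its long_form flag are my own naming; the two hashbang prefixes are simply the combinations quoted above:

```python
SEATTLE_FID = "0x5490102c93e83355:0x102565466944d59a"  # copied from the place URL

def data_param(keyword, place, fid, long_form=False):
    """Build the short or long hashbang combination for a local query."""
    prefix = "!3m1!4b1!4m5!2m4!3m3" if long_form else "!3m1!4b1"
    return f"{prefix}!1s{keyword}!2s{place}!3s{fid}"

# Reconstructs the realtors/Seattle example from this post
url = ("https://www.google.com/maps/search/realtors/am=t/data="
       + data_param("realtors", "Seattle,+WA", SEATTLE_FID))
```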
The hard part is figuring out which hashbang combo will generate which order of results, and I still haven’t been able to do it. I’m hoping that by publishing this half-complete research, some enterprising Moz reader might be able to complete the puzzle! And there’s also the strong possibility that this theory is completely off base.
In my research thus far, the shorter hashbang combination (!3m1!4b1) seems to yield the closest results to what tbm=plcs used to render, but they aren’t 100% identical.

The longer hashbang combination (!3m1!4b1!4m5!2m4!3m3) actually seems to predictably return the same set of results as a Local search on Google Plus — and note the appearance of the pushpin icon next to the keyword when you add this longer combination:

Who’s #1?
Many of us in the SEO community, even before the advent of (not provided), encouraged marketers and business owners to stop obsessing about individual rankings and start looking at visibility in a broader sense. Desperately scrambling for a #1 ranking on a particular keyword has long been a foolish waste of resources.
Google's desktop innovations in local search add additional ammunition to this argument. Heat map studies have shown that the first carousel result is far from dominant, and that a compelling Google+ profile photo can perform incredibly well even as far down as the "sixth or seventh" (left-to-right) spot. Ranking #1 in the carousel doesn't provide quite the same visual benefit as ranking #1 in an organic SERP or 7-pack.

The elimination of the lefthand list pane on New Google Maps makes an even stronger case. It’s literally impossible to rank these businesses visually no matter how hard you stare at the map:

Mobile, mobile, mobile
Paradoxically, though, just as Google is moving away from ranked results on the desktop, my view is that higher rankings matter more than ever in mobile search. And as mobile and wearables continue to gain market share relative to desktop, that trend is likely to increase.
The increasing ubiquity of Knowledge Panels in search results the past couple of years has been far from subtle. Google is now not only attempting to organize the world’s information, but condense each piece of it into a display that will fit on a Google Glass (or Google Watch, or certainly a Google Android phone).
Nowhere is the need to be #1 more dramatic than in the Google Maps app, in which users perform an untold number of searches each month. List view is completely hidden (I didn’t even know it existed until this week) and an average user is just as likely to think the first result is the only one for them as they are to figure out they need to swipe right to view more businesses.
Above: a Google Maps app result for ‘golf courses’, in which the first result has a big-time advantage.
The other issue that mobile results really bring to the fore is that the user is becoming the centroid.
This is true even when searching from the desktop. I performed some searches one morning from a neighborhood coffee shop with wifi, and a few minutes later from my house six blocks away. To my surprise, I got completely different results. From my house, Google is apparently only able to detect that I'm somewhere in "Portland." But from the coffee shop, it was able to detect my location at a much more granular level (presumably due to the coffee shop's wifi?), and showed me results specific to my ZIP code, with the centroid placed at the center of that ZIP. The zoom setting for both adjusted automatically: the more granular ZIP code targeting defaulted to a zoom level of 15z or 16z, versus 11z to 13z from my home, where Google wasn't as sure of my location.
Note, too, that I was unable to be exact about the zoom level in the previous paragraph. That's because the centroid is category-dependent. It likely always has been category-dependent, but that fact is much more noticeable in New Google Maps.
Maps app visibility
Taking both of these into account, in terms of replicating Google Maps app visibility, here is a case where specifying @lat,lng,zoom (with the zoom set to 17z) can be incredibly useful.
As an example, I performed the search below from my iPhone at the hotel I was staying at in Little Italy after a recent SEM SD event, and was able to replicate the results with this URL string on desktop:
http://google.com/maps/search/lawyers/@32.723278,-117.168528,17z/am=t/data=!3m1!4b1

Conclusions and recommendations
While I still feel the user experience of New Google Maps is subpar, as a marketer I found myself developing a very Strangelovian mindset over the past week or so — I have actually learned to stop worrying and love the new Google Maps. There are some incredibly useful new URL parameters that allow for a far more complete picture of local search visibility than the classic Google Maps provided.
With this column, I wanted to at least present a first stab to the Moz community to hopefully build on and experiment with. But this is clearly an area that is ripe for more research, particularly with an eye towards finding a complete replacement for the old tbm=plcs parameter.
As mobile usage continues to skyrocket, identifying the opportunities in your (or your client’s) competitive set using the new Google Maps will only become more important.
About David-Mihm —

David Mihm is one of the world's leading practitioners of local search engine marketing. He has created and promoted search-friendly websites for clients of all sizes since the early 2000s. David co-founded GetListed.org, which he sold to Moz in November 2012. His annual Local Search Ranking Factors project is among the most important studies of local SEO.

The Content Lifecycle

Recently, outside of my daily digital marketing role, I have been expanding my skills in nature photography and iPhoneography, as this is a passion of mine. What I'm learning during this process is that certain creative methods from photography can be applied to your daily content work in digital marketing.
Take the two examples below: they are both flowers, showing the stages of a lifecycle as they grow and adapt over different timeframes. But how exactly does this relate to content development, you might be asking? Here's how…

A Lifecycle of Changes
A lifecycle is a series of changes, and this is exactly what great content should go through. As you can see from the nature lifecycle above, each part of the flower can change at different times of the day; it's the same content (flower), but adapting to change and circumstance.
For example, climates and environments in nature affect the way these flowers change and grow, and in any business environment, different climates can affect or change the way content exists and develops through its lifecycle. The types of climate in content range from:

Building the strength of your content (flower) from the root up is the most important thing, and optimising (nurturing) it is critical to extending its presence in the right direction. You do not want to end up with messy content; you want to find the gem of your content and make its presence known.
Take my flower examples below: the first image looks like messy flowers (messy content), but the second zooms in to reveal a beautiful part of the flower (content). With a little effort you can reveal the true beauty of any content.
  
What you should always be aware of is the importance of the content lifecycle and its management, and how every stage connects in the following way:

Creativity Is The Real King
Last year, I wrote on how Creativity Is The Real King and I still truly believe it is. When it comes to building your content lifecycle you should be thinking about the connections I just mentioned and the creativity on all these levels. Below are some examples of what you should be thinking about at each stage:
Planning Content Structure
1. What type of content will you be creating and who for?
Size does not matter. You can have a range of "Content Pillars", and the small pillars can often produce more value and outweigh the large ones by going viral, while the large pillars may sustain search growth over the long term. It all comes down to the purpose of your content and the route you choose to market it; that is what will decide the winner, large or small.
In order to answer who you are creating it for, you need to conduct some audience profiling. Analyse your target audience and ask: who are you trying to engage with? To conduct audience profiling, you can create a campaign flowchart to show who your content may reach and which audience types you may lose along the way. You could even compile a list of who might influence your work and who their connections are. It is vital to plan in advance who you want to engage with, so you know you have a market for your idea.
Check out the recent write-up by Adam of Builtvisible on “Identifying Your Audience: a Data Driven Approach to Content Planning” for further tips and tricks.
2. How will this content connect with its intended environment?
Resources and guides are long and detailed, and they solve problems. This type of content intends to live in its environment forever as it offers guidance and value to the reader in the short term and long term; something that they can refer back to.
For example, if you happen to be new to digital marketing or SEO, you may not be familiar with the settings and filters in Google Analytics, and it may be frightening to venture into at first. This fear of the unknown may put users off a great tracking tool; however, creating a user-friendly guide can help solve a major problem, and that is exactly what Kaitlin of Builtvisible did with her really useful "Google Analytics Resource Guide".
The structure of this content was not only well planned (which is why it is epic), but it also addressed queries that many users are asking. The guide is not just a good lesson for those looking to create epic content but an asset in itself: a pillar that will stand strong amongst other content types and deliver traffic naturally, as it is linkable content that will blossom thanks to the detailed, actionable, step-by-step tips it provides.
3. Have you thought about the aesthetics?
What makes a user want to favourite your work? Things like bad text, typography, paragraphing, design, and poor imagery can all make a user bounce away quickly if your content is not readable, user-friendly, or engaging, so consider this at the planning stage to avoid it going horribly wrong later in the lifecycle.
4. Where will this content live, on your own site or external site?
When planning where your content will live, think about whether it will be internal (on your own site) or external (on another site), and where it is best suited.
Developing Content
Before developing your content you need to take a step back and have a think about the following:
1 – Keyword Research: you planned your content type, but have you researched the market and search volume for this type of content before developing it? If not, run a keyword research project and evaluate whether this is an idea people will actually spend time searching for.
2 – Cost: you have planned a new content project, but do you have the budget for its development, and have you set enough aside for any edits or re-design it may require?
Now you have these answered you can begin developing and here are a few tips that will help along the way:
Tip 1: Socially assess what has already worked well for you, your clients, or your competitors in the past. You can use the following tools:
– Chrome extensions like the "Sharemetric Extension", which counts how many social shares a page has
– Topsy for searching and analysing the social web
– Buzzsumo for analysing what content performs best (any topic or competitor)
Tip 2: Develop a design strategy for how to market the content through positioning. For example, you may want to cut your content into separate parts and use them as individual graphics, or segment each part of an infographic and sell it as a separate story rather than a whole piece. I like to experiment with colourful Post-it notes to play around with the layout; it keeps things interesting and organised.

Managing Content
Before you can deploy your content you need to manage a number of things. Ask yourself: is this fresh content you are uploading to your site, or is it duplicated elsewhere? If it is duplicated, stop immediately and begin creating and managing unique content, to avoid updates such as Panda that can hit your rankings.
Now, how will you go about managing the outreach of your content? Think about:
– Influencers: Remember the audience profiling you started the lifecycle with? Now take that list of influencers, get ready to make them aware of your content, and manage the relationship. You could open the communication with a soft, friendly email about the purpose of your content. By doing this you are prompting them to either link to or share what you have built, and if the content mentions the influencers themselves, then that is some "ego bait".
– Syndicate: Manage the different forms of your content. For example, you can make a transcript of your video content, or turn your data graphic into a PDF and upload it to SlideShare; this can then be deployed to a wider audience.
– Press Releases: If you have developed data graphics on issues being reported in the news, start offering parts of your content for reporters to use as visuals in their write-ups, sourced back to you or your client.
– Webinars: If your content holds a discussion point, create a webinar to follow up the release and get people talking about your content.
Deploying Content
As you move through the lifecycle to deploying your content, look at it through a strategic lens. Just like photography, you can zoom in: I personally like to get up close and personal with content, examining every angle and analysing it inside and out.
Every angle and data point is potentially a story to sell to a journalist or publisher. Analyse your content from as many angles as possible to really unlock the opportunities to enhance your outreach, and don't forget to experiment with subject lines, email content (images, screenshots), quotes, key data points, and so on.
During this process you are deploying a new method of creativity by reorganising the content structure and are able to develop new ideas to sow and grow in the future.
Preserving Content
Preserving content after deployment is important because you want your work to still be seen. However, a future site migration may affect where your content lives.
Think about where your content has lived and how it will live in the future: are you preserving it in the right way? Will you nurture your content by 301 redirecting it to other relevant content, or will it end up a 404?
One professional tip I can offer, which is also great for building links to your content, is this: look for competitor URLs that return a 404 response but have acquired links, and then produce better content for that page. Inform the publications linking to the dead URL of the 404, and point them in the direction of your new content. There you go: you have a new link.
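The first half of that tip, checking a list of competitor URLs for dead pages, can be sketched like this. The URLs you feed in, and the link data from your backlink tool, are up to you:

```python
from urllib.error import HTTPError
from urllib.request import urlopen

def is_reclaim_candidate(status_code):
    """A dead page (404 or 410) that has earned links is worth rebuilding."""
    return status_code in (404, 410)

def check_url(url):
    """Return the HTTP status code for a URL."""
    try:
        return urlopen(url).status
    except HTTPError as err:
        return err.code
```

From there, cross-reference the dead pages against the linking publications reported by your backlink tool and start the outreach described above.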
Evaluating Content
At the end of the lifecycle, evaluate how your content came across in the whole process. Did each stage run smoothly, or did errors occur? Note the answers down in a spreadsheet for each creative project, and see what you can learn for next time round.
Evaluate the strength of the content by analysing how well it performed, how much social love it gained, and whether it was linked to.
Just remember that you can make your lifecycle grow into whatever you want. Once you have the seed (content idea) all you need to do is plan it, produce it, nurture it, grow it and reproduce it into a new one of a kind form of content.


One Content Metric to Rule Them All

Let's face it: Measuring, analyzing, and reporting the success of content marketing is hard.

Not only that, but we're all busy. In its latest report on B2B trends, the Content Marketing Institute quantified some of the greatest challenges faced by today's content marketers, and a whopping 69% of companies cited a lack of time. We spend enough of our time sourcing, editing, and publishing the content, and anyone who has ever managed an editorial calendar knows that fires are constantly in need of dousing. With so little extra time on our hands, the last thing content marketers want to do is sift through a heaping pile of data that looks something like this:

Sometimes we want to dig into granular data. If a post does exceptionally well on Twitter, but just so-so everywhere else, that's noteworthy. But when we look at individual metrics, it's far too easy to read into them in all the wrong ways.

Here at Moz, it's quite easy to think that a post isn't doing well when it doesn't have a bunch of thumbs up, or to think that we've made a horrible mistake when a post gets several thumbs down. The truth is, though, that we can't simply equate metrics like thumbs to success. In fact, our most thumbed-down post in the last two years was one in which Carson Ward essentially predicted the recent demise of spammy guest blogging.

We need a solution. We need something that’s easy to track at a glance, but doesn’t lose the forest for the trees. We need a way to quickly sift through the noise and figure out which pieces of content were really successful, and which didn’t go over nearly as well. We need something that looks more like this:

This post walks through how we combined our content metrics for the Moz Blog into a single, easy-to-digest score, and better yet, almost completely automated it.

What it is not

It is not an absolute score. Creating an absolute score, while the math would be equally easy, simply wouldn't be worthwhile. Companies that are just beginning their content marketing efforts would consistently score in the single digits, and it isn't fair to compare a multi-million dollar push from a giant corporation to a best effort from a very small company. This metric isn't meant to compare one organization's efforts with any other; it's meant to be used inside of a single organization.

What it is and what it measures

The One Metric is a single score that tells you how successful a piece of content was by comparing it to the average performance of the content that came before it. We made it by combining several other metrics, or “ingredients,” that fall into three equally weighted categories:

Google Analytics
On-page (in-house) metrics
Social metrics

It would never do to simply smash all these metrics together, as the larger numbers would inherently carry more weight. In other words, we cannot simply take the average of 10,000 visits and 200 Facebook likes, as Facebook would be weighted far more heavily—moving from 200 to 201 likes would be an increase of 0.5%, and moving from 10,000 to 10,001 visits would be an increase of 0.01%. To ensure every one of the ingredients is weighted equally, we compare them to our expectations of them individually.

Let's take a simple example using only one ingredient. If we wanted to get a sense for how well a particular post did on Twitter, we could obviously look at the number of tweets that link to it. But what does that number actually mean? How successful is a post that earns 100 tweets? 500? 2,000? In order to make sense of it, we use past performance. We take everything we've posted over the last two months, and find the average number of tweets each of those posts got. (We chose two months; you can use more or less if that works better for you.) That's our benchmark: our expectation for how many tweets our future posts will get. Then, if our next post gets more than that expected number, we can safely say that it did well by our own standards. The actual number of tweets doesn't really matter in this sense; it's about moving up and to the right, striving to continually improve our work.

Here’s a more visual representation of how that looks:

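That benchmark comparison is simple enough to sketch in code. The tweet counts here are invented for illustration:

```python
def percent_of_expectation(value, recent_values):
    """Score one ingredient as a percentage of its trailing average."""
    benchmark = sum(recent_values) / len(recent_values)  # e.g. two months of posts
    return 100 * value / benchmark

# Recent posts averaged 250 tweets; a new post earning 230 scores 92%.
score = percent_of_expectation(230, [200, 300, 250])  # -> 92.0
```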
Knowing a post did better or worse than expectations is quite valuable, but how much better or worse did it actually do? Did it barely miss the mark, or did it completely tank? It's time to quantify.

It's that percentage of the average (92% and 73% in the examples above) that we use to seed our One Metric. For any given ingredient, if we have 200% of the average, we have a post that did twice as well as normal. If we have 50%, we have a post that did half as well.

From there, we do the exact same thing for all the other ingredients we’d like to use, and then combine them:

This gives us a single metric that offers a quick overview of a post’s performance. In the above example, our overall performance came out to 113% of what we’d expect based on our average performance. We can say it outperformed expectations by 13%.
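Because each ingredient has already been normalized to a percent of its own average, combining them really is just a plain average. A sketch, with made-up ingredient scores chosen to land on that same 113%:

```python
def one_metric_percent(ingredient_percents):
    """Combine equally weighted ingredients into one overall percentage."""
    return sum(ingredient_percents) / len(ingredient_percents)

# e.g. traffic at 92% of expectations, thumbs at 73%, tweets at 174%
overall = one_metric_percent([92, 73, 174])  # -> 113.0
```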

We don’t stop there, though. This percent of the average is quite useful… but we wanted this metric to be useful outside of our own minds. We wanted it to make sense to just about anyone who looked at it, so we needed a different scale. To that end, we took it one step farther and applied that percentage to a logarithmic scale, giving us a single two-digit score much like you see for Domain Authority and Page Authority.

If you’re curious, we used the following equation for our scale (though you should feel free to adjust that equation to create a scale more suitable for your needs):

y = 27*ln(x) + 50

Where y is the One Metric score, and x is the percent of a post’s expected performance it actually received, expressed as a ratio. Essentially, a post that exactly meets expectations (x = 1.0) receives a score of 50.

For the above example, an overall percentage of expectations that comes out to 113% translates as follows:
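Here's that conversion as a sketch in code, using the 27*ln(x) + 50 scale with x expressed as a ratio (1.0 means expectations were exactly met):

```javascript
// Convert a percent-of-expectations ratio to a One Metric score
// on a logarithmic scale, where meeting expectations scores 50.
function oneMetricScore(ratio) {
  return 27 * Math.log(ratio) + 50; // Math.log is the natural log
}

oneMetricScore(1.0);  // exactly 50: met expectations
oneMetricScore(1.13); // ≈ 53.3: modestly above expectations
```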

Of course, you won’t need to calculate the value by hand; that’ll be done automatically in a spreadsheet. Which is actually a great segue…

The whole goal here is to make things easy, so what we’re going for is a spreadsheet where all you have to do is “fill down” for each new piece of content as it’s created. About 10-15 seconds of work for each piece. Unfortunately, I can’t simply give you a ready-to-go template, as I don’t have access to your Google Analytics, and have no clue how your on-page metrics might be set up. 

As a result, this might look a little daunting at first.

Once you get things working once, though, all it takes is copying the formulas into new rows for new pieces of content; the metrics will be filled automatically.
It’s well worth the initial effort.

Ready? Start here:

Make a copy of that document so you can make edits (File > Make a Copy), then follow the steps below to adjust that spreadsheet based on your own preferences.

You’ll want to add or remove columns from that sheet to match the ingredients you’ll be using. Do you not have any on-page metrics like thumbs or comments? No problem—just delete them. Do you want to add Pinterest repins as an ingredient? Toss it in there. It’s your metric, so make it a combination of the things that matter to you.
Get some content in there. Since the performance of each new piece of content is based on the performance of what came before it, you need to add the “what came before it.” If you’ve got access to a database for your organization (or know someone who does), that might be easiest. You can also create a new tab in that spreadsheet, then use the =IMPORTFEED function to automatically pull a list of content from your RSS feed.
Populate the first row. You’ll use a variety of functionality within Google Spreadsheets to pull the data you need in from various places on the web, and I go through many of them below. This is the most time-consuming part of setting this up; don’t give up!
Got your data successfully imported for the first row? Fill down. Make sure it’s importing the right data for the rest of your initial content.
Calculate the percentage of expectations. Depending on how many ingredients you’re using, this equation can look mighty intimidating, but that’s really just a product of the spreadsheet smooshing it all onto one line. Here’s a prettier version:
All this is doing (remember Step 2 above, where we combined the ingredients) is comparing each individual metric to past performance, and then weighting them appropriately.
And, here’s what that looks like in plain text for
our metric (yours may vary):
=((1/3)*(E48/(average(E2:E47))))+((1/3)*((F48/(average(F2:F47)))+(G48/(average(G2:G47))))/2)+((1/3)*((H48/(average(H2:H47)))+(I48/(average(I2:I47)))+(J48/(average(J2:J47))))/3)

Note that this equation goes from row 2 through row 47 because we had 46 pieces of content that served to create our “expectation.”

Convert it to the One Metric score. This is a piece of cake. You can certainly use our logarithmic equation (referenced above): y = 27*ln(x) + 50, where x is the percent of expectations you just finished calculating, expressed as a ratio (so a post that exactly meets expectations, with x = 1.0, scores 50). Or, if you feel comfortable adjusting that to suit your own needs, feel free to do that as well.
You’re all set! Add more content, fill down, and repeat!
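One concrete note on step 2 above: the =IMPORTFEED call on that new tab might look something like this (the feed URL here is a placeholder for your own):

=IMPORTFEED("http://www.example.com/feed", "items url", FALSE, 50)

That pulls the URLs of the 50 most recent items from the feed into a column, ready for the rest of the formulas to reference.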

Here are more detailed instructions for pulling various types of data into the spreadsheet:

Adding new rows with IFTTT

If This Then That (IFTTT) makes it brilliantly easy to have your new posts automatically added to the spreadsheet where you track your One Metric. The one catch is that your posts need to have an RSS feed set up (more on that from FeedBurner). Sign up for a free IFTTT account if you don’t already have one, and then set up a recipe that adds a row to a Google Spreadsheet for every new post in the RSS feed.

When creating that recipe, make sure you include “Entry URL” as one of the fields that’s recorded in the spreadsheet; that’ll be necessary for pulling in the rest of the metrics for each post.

Also, IFTTT shortens URLs by default, which you’ll want to turn off, since the shortened URLs won’t mean anything to the APIs we’re using later. You can find that setting in your account preferences.

Pulling Google Analytics

One of the beautiful things about using a Google Spreadsheet for tracking this metric is the easy integration with Google Analytics. There’s an add-on for Google Spreadsheets that makes pulling in just about any metric a simple process. The only downside is that even after setting things up correctly, you’ll still need to manually refresh the data.

To get started, 
install the add-on. You’ll want to do so while using an account that has access to your Google Analytics.

Then, create a new report; you’ll find the option under “Add-ons > Google Analytics:”

Select the GA account info that contains the metrics you want to see, and choose the metrics you’d like to track. Put “Page” in the field for “Dimensions;” that’ll allow you to reference the resulting report by URL.

You can change the report’s configuration later on, and if you’d like extra help figuring out how to fiddle with it, check out
Google’s documentation.

This will create (at least) two new tabs on your spreadsheet; one for Report Configuration, and one for each of the metrics you included when creating the report. On the Report Configuration tab, you’ll want to be sure you set the date range appropriately (I’d recommend setting the end date fairly far in the future, so you don’t have to go back and change it later). To make things run a bit quicker, I’d also recommend setting a filter for the section(s) of your site you’d like to evaluate. Last but not least, the default value for “Max Results” is 1,000, so if you have more pages than that, I’d change that, as well (the max value is 10,000).

Got it all set up? Run that puppy! Head to Add-ons > Google Analytics > Run Reports. Each time you return to this spreadsheet to update your info, you’ll want to click “Run Reports” again, to get the most up-to-date stats.

There’s one more step. Your data is now in a table on the wrong worksheet, so we need to pull it over using the VLOOKUP formula. Essentially, you’re telling the spreadsheet, “See that URL over there? Find it in the table on that report tab, and tell me what the number is next to it.” If you haven’t used VLOOKUP before, it’s well worth learning. There’s a fantastic 
explanation over at Search Engine Watch if you could use a primer (or a refresher).
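As a sketch: assuming your GA report landed on a tab named Pageviews, with URLs in column A and counts in column B, and assuming the post’s URL sits in cell B2 of your main sheet, the lookup might read:

=VLOOKUP(B2, Pageviews!A:B, 2, FALSE)

The FALSE at the end forces an exact match on the URL, which is what you want here.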

Pulling in social metrics with scripts

This is a little trickier, as Google Spreadsheets doesn’t include a way to pull in social metrics, and that info isn’t included in GA. The solution? We create our own functions for the spreadsheet to use.

Relax; it’s not as hard as you’d think. =)

I’ll go over Facebook, Twitter, and Google Plus here, though the process would undoubtedly be similar for any other social network you’d like to measure.

We start in the script editor, which you’ll find under the tools menu:

If you’ve been there before, you’ll see a list of scripts you’ve already made; just click “Create a New Project.” If you’re new to Google Scripts, it’ll plop you into a blank project—you can just dismiss the popup window that tries to get you started.

Google Scripts organizes what you create into “projects,” and each project can contain multiple scripts. You’ll only need one project here—just call it something like “Social Metrics Scripts”—and then create a new script within that project for each of the social networks you’d like to include as an ingredient in your One Metric.

Once you have a blank script ready for each network, go through one by one, and paste the respective code below into the large box in the script editor (make sure to replace the default “myFunction” code).

function fbshares(url) {
  var jsondata = UrlFetchApp.fetch("http://api.facebook.com/restserver.php?method=links.getStats&format=json&urls=" + url);
  var object = Utilities.jsonParse(jsondata.getContentText());
  Utilities.sleep(1000); // brief pause so we don't hammer the API
  return object[0].total_count;
}

function tweets(url) {
  var jsondata = UrlFetchApp.fetch("http://urls.api.twitter.com/1/urls/count.json?url=" + url);
  var object = Utilities.jsonParse(jsondata.getContentText());
  Utilities.sleep(1000);
  return object.count;
}

function plusones(url) {
  var options = {
    "method" : "post",
    "contentType" : "application/json",
    "payload" : '{"method":"pos.plusones.get","id":"p","params":{"nolog":true,"id":"' + url + '","source":"widget","userId":"@viewer","groupId":"@self"},"jsonrpc":"2.0","key":"p","apiVersion":"v1"}'
  };
  var response = UrlFetchApp.fetch("https://clients6.google.com/rpc?key=AIzaSyCKSbrvQasunBoV16zDH9R33D88CeLr9gQ", options);
  var results = JSON.parse(response.getContentText());
  if (results.result != undefined) {
    return results.result.metadata.globalCounts.count;
  }
  return "Error";
}

Make sure you save these scripts—that isn’t automatic like it is with most Google applications. Done? You’ve now got the following functions at your disposal in Google Spreadsheets:

=fbshares(url)
=tweets(url)
=plusones(url)

The (url) in each of those cases is where you’ll point to the URL of the post you’re trying to analyze, which should be pulled in automatically by IFTTT. Voila! Social metrics.

Pulling on-page metrics

You may also have metrics built into your site that you’d like to use. For example, Moz has thumbs up on each post, and we also frequently see great discussions in our comments section, so we use both of those as success metrics for our blog. Those can
usually be pulled in through one of the following two methods.

But first, an obligatory note: both of these methods involve scraping a page for information, which is obviously fine if you’re scraping your own site, but it’s against the ToS for many services out there (such as Google’s properties and Twitter), so be careful with how you use these.

=IMPORTXML

While getting it set up correctly can be a little tricky, this is an incredibly handy function, as it allows you to scrape a piece of information from a page using an XPath. As long as your metric is displayed somewhere on the URL for your piece of content, you can use this function to pull it into your spreadsheet.

Here’s how you format the function:
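For instance, if cell A2 holds the post’s URL and your thumbs count lives in a div with a (hypothetical) class of thumbs-count, the call would look like:

=IMPORTXML(A2, "//div[@class='thumbs-count']")

The first argument is the URL to fetch, and the second is the XPath to the element you want.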

If you’d like a full tutorial on XPaths (they’re quite useful), our friends at Distilled put together a really fantastic guide to using them for things just like this. 
It’s well worth a look. You can skip that for now, if you’d rather, as you can find the XPath for any given element pretty quickly with a tool built into Chrome.

Right-click on the metric you’d like to pull, and click on “Inspect element.”

That’ll pull up the developer tools console at the bottom of the window, and will highlight the line of code that corresponds to what you clicked. Right-click on that line of code, and you’ll have the option to “Copy XPath.” Have at it.

That’ll copy the XPath to your clipboard, which you can then paste into the function in Google Spreadsheets.

Richard Baxter of BuiltVisible created a wonderful 
guide to the IMPORTXML function a few years ago; it’s worth a look if you’d like more info.

Combining =INDEX with =IMPORTHTML

If your ingredient is housed in a <table> or a list (ordered or unordered) on your pages, this method might work just as well.

=IMPORTHTML simply plucks the information from a list or table on a given URL, and =INDEX pulls the value from a cell you specify within that table. Combining them creates a function something like this:
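For example, assuming the post’s URL is in cell A2 and your metric sits in the second row, third column of the first table on the page, the combined call would be:

=INDEX(IMPORTHTML(A2, "table", 1), 2, 3)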

Note that without the INDEX function, the IMPORTHTML function will pull in the
entire piece of content it’s given. So, if you have a 15-line table on your page and you import that using IMPORTHTML, you’ll get the entire table in 15 rows in your spreadsheet. INDEX is what restricts it to a single cell in that table. For more on this function, check out this quick tutorial.

Taking it to the next level

I’ve got a few ideas in the works for how to make this metric even better. 

Automatically check for outlier ingredients and flag them

One of the downsides of smooshing all of these ingredients together is missing out on the insights that individual metrics can offer. If one post did fantastically well on Facebook, for example, but ended up with a non-remarkable One Metric score, you might still want to know that it did really well on Facebook.

In the next iteration of the metric, my plan is to have the spreadsheet automatically calculate not only the average performance of past content, but also the standard deviation. Then, whenever a single piece differs by more than a couple of standard deviations (in either direction), that ingredient will get called out as an outlier for further review.
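That check is straightforward to sketch in a script (this is a rough cut of the idea, not the final implementation):

```javascript
// Flag any value more than k standard deviations from the mean.
// Returns the indices of the outlying pieces of content.
function flagOutliers(values, k) {
  var mean = values.reduce(function (a, b) { return a + b; }, 0) / values.length;
  var variance = values.reduce(function (a, v) {
    return a + (v - mean) * (v - mean);
  }, 0) / values.length;
  var stdDev = Math.sqrt(variance);
  var outliers = [];
  for (var i = 0; i < values.length; i++) {
    if (Math.abs(values[i] - mean) > k * stdDev) {
      outliers.push(i);
    }
  }
  return outliers;
}

// Nine typical posts and one smash hit on a single ingredient:
flagOutliers([10, 10, 10, 10, 10, 10, 10, 10, 10, 100], 2); // → [9]
```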

Break out the categories of ingredients

In the graphic above that combines the ingredients into categories in order to calculate an overall average, it might help to monitor those individual categories, too. You might, then, have a spreadsheet that looked something like this:

Make the weight of each category adjustable based on current goals

As it stands, each of those three categories is given equal weight in coming up with our One Metric scores. If we broke the categories out, though, they could be weighted differently to reflect our company’s changing goals. For example, if increased brand awareness was a goal, we could apply a heavier weight to social metrics. If retention became more important, on-page metrics from the existing community could be weighted more heavily. That weighting would adapt the metric to be a truer representation of the content’s performance against current company goals.

I hope this comes in as handy for everyone else’s analysis as it has for my own. If you have any questions and/or feedback, or any other interesting ways you think this metric could be used, I’d love to hear from you in the comments!

About Trevor Klein
Trevor is the content strategist at Moz—a proud member of the content team. He manages the Moz Blog, helps craft and execute content strategy, and wrangles other projects in an effort to align Moz’s content with the company’s business objectives and to provide the most valuable experience possible for the Moz community.