SEO Strategy For Business – How To Build A Keyword Opportunity Model

by Nick

The Business of SEO

When it comes to business, SEO is a unique monster: unlike most vertical markets, you don’t need products, vendor relationships, distribution channels, or even start-up capital.

SEO without business strategy is like catching lightning in a bottle

But without the right strategy, leveraging search for business can feel a lot like trying to capture lightning in a bottle.

In this post I am going to step through creating a keyword opportunity model to project estimated revenue from SEO.

I am going to do this in two parts:

  1. With a simplified model to help develop the initial concept, and then
  2. With a detailed, assumption-based model as an example

This approach is meant to help you weigh different keyword opportunities and uncover where your SEO priorities should be focused. Make sure you use it in conjunction with SEO competitive analysis, and that you are leveraging on-page optimization for your target URLs.

The Simple Keyword Opportunity Model

The simple model is meant to get you thinking about the implications of cost versus revenue in terms of return on SEO.

For revenue I will be using the following formula, where monthly search volume represents [exact] searches from Google’s keyword tool:

Estimated Monthly Revenue = Monthly Search Volume × Average SERP CTR × Conversion Rate × Conversion Value

Conversion value will vary depending on the goal of your SEO campaign. For example, e-commerce sites could use average order value, while lead generation sites might use their average lead value; in any case, make sure you use an actual dollar amount.

The value of a keyword is subjective and depends on what a conversion is worth to you, so you need to use goal-specific conversion rates to customize this formula to fit your business.

For average SERP click-through I prefer to use an aggregate measure when projecting returns across larger scenarios. For example, a model targeting page 1 of Google would use a 6.1% click-through rate, while a model specifically focused on top-5 rankings would use 12.1%. These are just my starting-point figures.
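To make that concrete, here is a minimal sketch of the revenue calculation in Python. The 6.1% CTR is the page 1 starting point mentioned above; the search volume, conversion rate, and conversion value are made-up placeholders, not figures from the model.

```python
def estimated_monthly_revenue(search_volume, serp_ctr, conversion_rate, conversion_value):
    """Project monthly revenue for one keyword.

    search_volume    -- [exact] monthly searches from Google's keyword tool
    serp_ctr         -- average SERP click-through (e.g. 0.061 for page 1, 0.121 for top 5)
    conversion_rate  -- goal-specific conversion rate
    conversion_value -- average order value or average lead value, in dollars
    """
    visitors = search_volume * serp_ctr
    conversions = visitors * conversion_rate
    return conversions * conversion_value

# Placeholder inputs for illustration only:
print(estimated_monthly_revenue(5400, 0.061, 0.02, 150.0))  # ~$988/month
```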

If you want to estimate average click-through for exact ranking positions, a very helpful case study from Slingshot SEO on SERP CTR produced this lovely graph.

Slingshot dives into the details of their study here on SEOmoz, and very recently Geoff Kenyon posted his thoughts on SERP click-through rates, so you have some starter data.

Slingshot’s 18% figure for position #1 lines up much more closely with my historical SERP performance data than the previously reported data from Optify, which posited a 36.5% CTR for position #1.

Update: The Keyword Opportunity Model is no longer publicly available, but you can still get access by downloading my complete guide to keyword research.

Download Guide to Keyword Research For SEO

Within the spreadsheet there are two sheets, Simple and Advanced, and all values that drive the model are highlighted in yellow.

Use the simple model to get a basic sense of how things work and how different values for different metrics will impact your total monthly revenue, and don’t be afraid to make adjustments and get creative.

Try inputting the lifetime value of a customer to forecast far into the future or bake in a measure for time on site and how certain thresholds may lead to higher variable conversion rates. If your goal is to drive newsletter signups, measure it; think outside the box! Then share what you did.

For cost assumptions I’m going to use average costs associated with developing one URL targeting one primary keyword. In this scenario I’m assuming the website already has a baseline of steady traffic, but not much, say 5,000 visits per month or so. This is important to note because it reduces the need for additional link building or paid advertising promotion for posts to gain enough rank signals to make it to page one.

For this model, the costs associated with content development are:

Research
This represents the average cost to do post-level keyword research and compile a matrix for the URL.

Writing
This represents the actual ideation and writing of all of the content for the URL.

Production
This is representative of the editing and actual production of the content: final proofreading, formatting images, and loading it into the content management system for publication.

Adjusting For Competitiveness

A critical part of any good business model is adjusting for relative competition.

I use a relative measure of keyword difficulty to do this, either average domain authority or competitiveness index, but you could just as easily use keyword difficulty score.

In the simple model I call it ‘Average SERP CI,’ but it doesn’t have to be CI; feel free to use average SERP DA instead and everything will still work. Just make sure you keep your metrics consistent between column G and the competitive weightings in H2 and H3.

About Those Competitive Weightings

Cell H2: Cost Multiplier – The cost multiplier is pretty much exactly what it sounds like; it multiplies your cost based on the competitiveness threshold you set.

This represents how much more expensive it is for you to create content that can compete for a first page ranking based on your website’s relative authority. If your website has a DA of 60, it is going to be easier for you to compete in SERPs where the average is only 50 than in SERPs where it is 70.

Cell H3: CI Threshold – This is the authority or competitiveness threshold where it becomes more expensive for you to compete.

This threshold should be relatively close to your website’s authority score, so if you have a DA of 40, you might want to set this at 50.

PLEASE NOTE: As is the case with this model in general, both of these measures that drive the cost side of this evaluation are extremely subjective. You will need to dial them in over time, so start conservatively and try to glean data from your website so these are as accurate as possible.
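As a rough sketch of how H2 and H3 interact with cost, assuming the multiplier only kicks in once a SERP’s average CI (or DA) meets your threshold: the base cost, the threshold of 50, and the 1.75 multiplier below are placeholder starting points to dial in, not recommendations.

```python
def adjusted_cost(base_cost, avg_serp_ci, ci_threshold=50, cost_multiplier=1.75):
    """Inflate the per-URL cost when the SERP is at or above the
    competitiveness threshold set relative to your own site's authority."""
    if avg_serp_ci >= ci_threshold:
        return base_cost * cost_multiplier
    return base_cost

# Placeholder base cost of $500 per URL:
print(adjusted_cost(500, avg_serp_ci=42))  # 500   (below threshold, no multiplier)
print(adjusted_cost(500, avg_serp_ci=63))  # 875.0 (multiplier applied)
```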

Let’s Build A Test Case

For this example I am going to use the SEO vertical as the market and focus on a set of keywords that are closely related in terms of both semantic relevance and user intent.

Just for fun I’m including tools, services, and consultant queries.

SEO Industry Keywords

You will see that I have loaded all of the above keyword data into the model and plugged in some test conversion data.

I created a column titled (Optional) where you can enter a specific SERP rank between 1 and 5, and the model will adjust the potential visitors based on the CTR values entered in F2 through F6. It is currently configured with the CTR data from the Slingshot SEO case study.
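If you want to mirror that (Optional) rank column outside of the spreadsheet, a sketch like the one below works. The roughly 18% figure for position 1 is the Slingshot number referenced earlier; the values for positions 2 through 5 are illustrative placeholders, so swap in the figures from the study (cells F2 through F6) or your own analytics.

```python
# Position 1 is the ~18% figure cited above; positions 2-5 are illustrative
# placeholders -- replace with the study's numbers (F2:F6) or your own data.
CTR_BY_POSITION = {1: 0.18, 2: 0.10, 3: 0.07, 4: 0.05, 5: 0.03}

def potential_visitors(search_volume, serp_rank=None, aggregate_ctr=0.061):
    """Use a rank-specific CTR when a target rank (1-5) is supplied,
    otherwise fall back to the aggregate page-1 CTR."""
    ctr = CTR_BY_POSITION.get(serp_rank, aggregate_ctr)
    return search_volume * ctr

print(potential_visitors(5400))               # aggregate page-1 estimate: ~329 visits
print(potential_visitors(5400, serp_rank=1))  # rank-1 estimate: ~972 visits
```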

To make this applicable to your website:

  1. Replace the SEO keywords with your keywords and their estimated [exact] monthly search volume
  2. Enter your average costs (or total cost into D6) for creating a new URL

In case you’re not sure what your costs are, here’s an approach to estimate them:

Estimating Your Development Costs

Everyone’s costs are different and some are not straightforward.

Do your best to average out what it costs to produce a piece of content from start to finish, and try not to leave out any piece of the development life-cycle.

For example, a company’s content development process might run from keyword research, through writing, to editing, formatting, and final publication in the CMS.

Try to boil down all of your costs to an hourly rate and then average them across each functional department, so you end up with a representative cost for each piece of your process.

However your process works, try to capture all of the costs. If all of your resources are salary, figure out what their hourly rate is and estimate their average production time for all of their tasks on one piece of content.

Once you have a handle on the costs, input them into the model in D3 through D5, or place your total cost into D6.
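If your resources are salaried, here is a minimal sketch of that boil-down, assuming a 2,080-hour work year. The salaries and hours per piece are hypothetical numbers for illustration, not benchmarks.

```python
HOURS_PER_YEAR = 2080  # 40 hours/week * 52 weeks -- adjust for your reality

def hourly_rate(annual_salary):
    return annual_salary / HOURS_PER_YEAR

# Hypothetical annual salaries and average hours spent per URL, by stage:
stages = {
    "research":   (55000, 2.0),
    "writing":    (60000, 5.0),
    "production": (50000, 1.5),
}

costs = {stage: hourly_rate(salary) * hours for stage, (salary, hours) in stages.items()}
total_cost_per_url = sum(costs.values())  # D6, or split the parts into D3 through D5

print(costs)               # per-stage costs (the D3-D5 equivalents)
print(total_cost_per_url)  # ~$233 with these placeholder numbers
```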

Estimating Your Revenue

Revenue is going to be a bit easier to estimate, especially if you already have analytics or tracking in place and you know exactly how much you make per visitor or per conversion.

The model is set up to estimate how many visitors you can acquire each month and your revenue per conversion; all you need to provide is your:

  • Average conversion rate, and
  • Average conversion value

Input these into cells B4 and B5, respectively.

The Advanced Keyword Opportunity Model

Advanced Keyword Opportunities Are in the Details

Hopefully by now you’ve taken some time to test out the simple model within the spreadsheet. Now it’s time to drill down into more specific costs, adjusting for a wider range of variables based on more specific heuristics.

In the simple model we used a base cost and a multiplier. In the advanced model we will still assume a base cost, but add an average cost per link and discount rates based on some additional competitive factors.

The competitive factors we will use to augment costs in the advanced model are:

  • Domain Authority (DA), and
  • Page Authority (PA)

To take into account the logarithmic nature of both DA and PA, I will be using a discount rate for each of these metrics to compute an estimated net present value that is used as our multiplier.

Baseline Assumptions

First it is important to understand a little bit about the discount rates.

The purpose is to allow the model to be adjusted for volatility and the specific nuances of your website(s).

Logarithmic Curve

The best way to get started is to set a conservative (higher rather than lower) cost multiplier for both DA and PA, and then dial in discount rates to help represent variable ranking factors such as query deserves freshness and other temporal ranking factors, or even brand bias.

Newer, less established websites can outrank their older, higher-authority competitors; the SERPs differ on an almost case-by-case basis.

As your website’s authority grows and your link profile gains velocity and trust, you should be able to adjust the discount rates downward to reflect a lower cost to reach page 1.

I have ranked pages with a DA/PA of 50% less than the SERP average, sometimes with only a few links and within a few short days or weeks. This type of volatility is not predictable, but that’s not the end goal here.

The reason you are making assumptions and dialing in ranking variables is to develop a measure within an order of magnitude, so you can begin to project cost and performance returns from SEO.

Adjusting Metrics to Capture Cost

Now that you have your average cost per URL, you need to figure out the adjustments for the model based on additional external factors and work these into your discount rates.

Average cost per link is a tough one, so my recommendation for the purposes of this model is to use a minimum DA threshold of 30, i.e., the link cost represents a link from a website with DA 30+.

Discount rates are going to take you the most time to develop, whereas you should be able to figure out your threshold and multiplier in a few weeks.
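Until you have your own rates dialed in, here is one possible (and heavily hedged) reading of the advanced cost side: a base cost plus link costs, inflated by a multiplier built from DA and PA gaps that are each discounted at their own rate. The discount rates, link cost, link count, and gaps below are all placeholder assumptions, not values from the spreadsheet.

```python
def advanced_cost(base_cost, link_cost, links_needed,
                  da_gap, pa_gap, da_discount=0.05, pa_discount=0.05):
    """Assumption-heavy sketch: grow the multiplier with the gap between the
    SERP's average DA/PA and your own site, discounting each gap at its own
    rate so you can tune how heavily each metric weighs on cost.
    da_gap / pa_gap are (SERP average minus your site), floored at zero."""
    multiplier = 1.0 + max(da_gap, 0) * da_discount + max(pa_gap, 0) * pa_discount
    return (base_cost + link_cost * links_needed) * multiplier

# Placeholder inputs: $500 base cost, $150 per DA 30+ link, 3 links,
# a SERP averaging 10 DA points and 5 PA points above your site.
print(advanced_cost(500, 150, 3, da_gap=10, pa_gap=5))  # 1662.5
```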

If you would like help getting started, I will be sending out some example discount rates to my subscribers – so if you haven’t already, take a moment and sign up for updates »

Measuring Potential Return – Evaluating Based on MER

MER stands for mean efficiency ratio, and is the metric I use to evaluate keyword opportunities.

It is a simple computation, dividing total revenue by total cost, which gives you an at-a-glance metric for gauging profitability. An MER of 1.0 means you broke even and made 100% of your initial investment back, so anything greater than 1.0 is gravy.

If you’re tracking all of your costs and revenue at the keyword level, this is a great, simple metric for measuring ROI. I have set up some conditional formatting within the spreadsheet so profitable MER cells render green.
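A minimal version of that calculation, with the same break-even interpretation and a simple profitability flag standing in for the green conditional formatting (the revenue and cost figures are placeholders):

```python
def mer(total_revenue, total_cost):
    """Mean efficiency ratio: total revenue divided by total cost; 1.0 is break-even."""
    return total_revenue / total_cost

# Placeholder per-keyword figures (estimated monthly revenue, adjusted cost):
keywords = {
    "seo tools":      (988.20, 875.00),
    "seo consultant": (410.00, 500.00),
}

for kw, (revenue, cost) in keywords.items():
    ratio = mer(revenue, cost)
    flag = "profitable" if ratio > 1.0 else "underwater"
    print(f"{kw}: MER {ratio:.2f} ({flag})")
```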

Adjusting Based on Actual Performance

Your SEO business model is only as good as the data it is built on, so if you want it to be usable, it needs to be accurate. Set aside time to update and refine your model to be as accurate and representative of your business as possible.

Treat it honestly and with respect; if your costs go up, change them. If you find that your carry costs for getting to page 1 are higher, increase your discount rates.

In Conclusion

This is by no means a perfect system or evaluation platform. It is, however, a lesser-discussed element of SEO and keyword evaluation that I would like to see talked about more.

What are your thoughts?

About Nick
Nick is the VP of Digital Strategy at W.L. Snook & Associates, Co-Founder of I'm From The Future, an ecommerce consultancy, and the author of this SEO blog. Follow Nick on Google+.



Comments

Jamie Knop February 12, 2013 at 9:10 am

Good post Nick, I need to update my CTR figures! Will read through the two posts you mentioned detailing CTR.

Do you not think it can be dangerous pitching with potential ROI/income figures? It can push a contract, but if they expect x in return and they do not get x, then they won’t be happy. I think it’s always best to set expectations slightly lower; that way, when you double them, they are over the moon.

Reply

Nick February 12, 2013 at 9:17 am

Hey Jamie – That’s a great point.

It all depends on the website’s history of performance within specific SERP’s. It’s definitely subjective and on a case-by-case basis, but I think as long as you have the numbers dialed in tightly for accuracy (which is the hardest part that takes the longest), you can use a model to help set reasonable expectations.

I’m with you on under-promising and over-delivering (and I’m of the mind that you should do that anyway :) ) but I also think that when building a bigger business, especially one designed to leverage enterprise SEO, you need to have a handle on relative costs and returns going in; it gives you a set of benchmarks and a baseline for cost expectations.

Reply

John-Henry Scherck February 12, 2013 at 9:52 am

At first I thought it was going to be along the lines of “Estimating the Value of a Like” but this is well thought out, rational and grounded. Thanks for sharing – model downloaded, post bookmarked.

Reply

Nick February 12, 2013 at 10:04 am

I’m glad you found it to be useful, that’s very encouraging. It’s pretty scaled back but I think it can provide a foundation for deeper discussion and potentially more proactive projections analysis within organizations.

Always happy to get your thoughts JH.

Reply

Geoff February 12, 2013 at 10:55 am

Hey Nick,

Thanks for publishing this – I really like that it takes cost/effort into account, something that my model didn’t do. How did you choose the Slingshot model for CTR’s?

Downloaded and saved for later.

Reply

Nick February 12, 2013 at 11:01 am

Hey Geoff – my pleasure.

I have been working on abstracting my model to something simple for a while and your post was the final nail in the motivation coffin to get this done, so thank you :)

The Slingshot CTRs lined up much more closely with the model I built for my Japanese content project, their sample size was close to the one I worked with (10,646 vs. mine, which was roughly 20,000 to start), and their methodology of watching page 1 rankings that remained stable for 30 days made it a pretty easy pick for baseline data.

I definitely recommend using actual CTR’s based on historical analytics, but I figured this provided a solid jumping off point.

Reply

Geoff February 12, 2013 at 11:04 am

Cool, thanks dude.

Joe February 13, 2013 at 11:50 am

Sorry if I missed it, but where do you calculate the cost multiplier number again? I understand it’s higher if you’re in a more competitive SERP, but don’t see where the 1.75 comes from. Thanks!

Reply

Nick February 13, 2013 at 4:12 pm

Hey Joe,

Nope, you didn’t miss a thing. The cost multiplier is going to be specific to the amount of additional cost it takes you to rank on page one above and beyond a given competitiveness threshold, so it’s subject to each individual SERP and your specific website.

I use 1.75 as a good starting point but this could just as easily be 1.5, 2, etc.

Reply

Joe February 13, 2013 at 6:22 pm

Ok cool, I made an edit for fun to make that Cost Multiplier number automatic, based on the average difficulty of the SERPs. It kind of automates your number that I assume you were basing off of the average difficulty of the SERPs, since the more difficult the SERP page as a whole, the more difficult it is to get your built out content to rank.

I also took into consideration that as the Average SERP CI approaches 100, the difficulty increases exponentially, so I added some columns in the upper right to add difficulty points to the Cost Multiplier.

So the final Cost Multiplier = (Avg SERP CI / CI Threshold) + CI Adjustment
That formula works if the Avg SERP CI beats the CI Threshold, by the way.

For example: If your Avg SERP CI is 40 and the Threshold is 50, the cost multiplier is 1 since you won’t have a problem in that weak of SERPs

But if your Avg SERP CI is 90 and the Threshold is 50, the cost multiplier comes out to be 2.8, since you’ll need to spend some extra money and time on that content and page to get it in there.

There’s definitely some more refining to do on my equation, but I think it’s a fair addition to an excellent model you created :)

Here’s the doc to take a look

Vikram February 14, 2013 at 3:22 am

But if the domain authority drops, will your site still rank in the SERPs?

and thanks for the insights…

Reply

Nick February 14, 2013 at 8:43 am

Vikram – Probably not. What would you be doing that would cause your DA to drop?

Reply

Vikram February 14, 2013 at 2:31 pm

I have a blog and I am quite unsure what to do. I am totally interested in getting more exposure and thinking of what to do next. Any ideas!!

Chris February 19, 2013 at 1:21 pm

Hi Vikram,

Read your comment and couldn’t help poke my nose into the discussion. ;)

Why don’t you try leveraging existing authority sites, like Youtube, Squidoo, Blogspot and then rank those. Then funnel the traffic through to your blog.

Chris James (out of the box thinker – hates status quo)

Marketing Company USA February 16, 2013 at 4:14 am

This is SEO Business Box with SEO Strategy For Business.
thanks For help with SEO & social media for your business

Reply

Graham Hunter February 18, 2013 at 5:37 pm

It seems like it will be harder and harder to project CTR as G+ and rich snippets become involved. Is there a way that you deal with this?

Reply

Nick February 18, 2013 at 5:44 pm

That’s a really good point. It’s going to take some large population studies once there’s a mildly representative population of SERP’s with top ranked G+ results. I’m going to look into this a bit further… you’ve gotten me very curious about any existing data sets. In close relation to this, have you seen the post by Chris Winfield on impact of G+ for news?

Reply

Aguae Internet marketing Calgary February 26, 2013 at 7:32 am

Great post describing SEO business models. I liked reading it and found it a useful one. Thank you very much for sharing this here with us.

Reply

Henry Smith Social Media Expert February 28, 2013 at 12:27 am

Hey Nick, I have landed in your blog while travelling through the cyber space. Thanks mate for sharing such important topic with the readers. You have depicted the total picture of CTR. It is really important to improve CTR. It will help to increase the chance of generating targeted leads. So if you are working in such a way so that your CTR is improving, by that way you can fulfill your site’s target. Thanks once again mate! Keep it up – bookmarked your website to see much more from you. :)

Reply

Totio Filipov March 8, 2013 at 8:42 am

It’s a cool formula and a great article overall, Nick. I think many people have problems choosing the right keywords for their business.

Reply

Nick March 8, 2013 at 1:25 pm

Oh cool, I’ll check it out. Thanks.

Reply

begonia March 11, 2013 at 6:14 am

Hey Nick… you did a great job, it is one of the most interesting posts…
I am expecting more from you.

Reply

logo design company March 14, 2013 at 5:25 am

Interesting read! Nick, you have really shown some great approaches here for every SEO. Indeed, a very impressive and useful post. Great job!!

Reply

Aaron March 16, 2013 at 9:33 am

This is a very useful article for people that are ready to take their SEO to the next level. I think it’s important to track and estimate your costs even as a beginner or amateur; it can definitely be a big motivator in improving your processes.

You have a new reader here, thanks Nick!

Reply

Kris Dietz June 11, 2013 at 10:16 am

I like this a lot, people typically stay away from putting a price on it. I like that you went for it with some hard data.

I think a second approach for targeting low competition longtails would make for an interesting comparison.

Reply

Soma Digital Marketing June 18, 2013 at 11:46 am

Very interesting article Nick! Love the way you break down metrics and link them to ROI!

Reply

Brandon June 24, 2013 at 8:43 pm

Way to go on really breaking things down! Very informative.

Reply

joysam July 20, 2013 at 6:04 am

Nice educational post. Would like to emphasize on two important tips for SEO users who are quite new in the industry. For item number 3, backlinks – do it legitimately and do not buy it. lastly, there are now many SEO tools like Colibri (http://colibritool.com) that integrates many of the essential aspects like Google position checker, keyword density, traffic estimator, etc…

Reply

Spook SEO August 14, 2013 at 4:27 am

Hey Nick great post! I was also thinking about giving the potential ROI to clients when pitching but it just doesn’t seem like the way to go.

I still think that emphasizing on the “how we’ll do it” part seems like the safest and more strategic way.

Reply

Nick August 14, 2013 at 1:55 pm

Thank you :)

I’ve found that showing potential ROI, even using very conservative numbers can at least make the connection in their minds that these investments do equate to real dollars. I think it makes it more tangible.

Cheers!

Reply

Prabal Chowdhury August 29, 2013 at 10:43 am

Hi Nick,
Thanks for your research tips about SEO strategy for business. I appreciate your writing. I would like to suggest my best tool for ranking your website. It’s the best SEO monitoring tool for you all. You can try ColibriTool for daily SEO works. As SEO is not a one time work, we need to perform it continuously. This ColibriTool is build for this purpose. You can find ColibriTool here (http://colibritool.com)

Reply

Spook SEO November 25, 2013 at 8:25 am

Hi Nick!

Nice post about SEO strategy for business. I believe that SEO is very important in each business. On the other hand, for the revenue; I think revenue is going to be a bit easier to estimate, especially if we already have analytics in place and we know exactly how much we make per visitor.

Reply

Nick February 14, 2013 at 8:42 am

Joe – This is very interesting, did you take a look at the advanced model tab? It adds adjustment based on DA/PA thresholds using a discount rate.

The only thing I would be cautious of here is you’re using the average of the average SERP CI, so instead of basing the cost multiplier on your website you are basing it on the average competitiveness of all of the keyword SERP’s in the model. It’s important to note that each row is representative of an individual SERP, so taking the average of all of the SERP averages is going to skew cost.

I am pretty thrilled that you are using the model and playing around with the approach, that’s awesome! Thank you.

Reply

Joe February 14, 2013 at 11:52 am

Ahhh I see where I made my mistake – it shouldn’t be based off the average of multiple SERPs, but just one SERP. I guess it COULD work only if the keywords in the models are all closely related and target terms for the new page/content piece.

I didn’t take a good look at the advanced tab until just now and it looks very robust. How do you decide on the threshold and multiplier numbers? Thanks!

Reply

Nick February 14, 2013 at 12:32 pm

Yes, I think it could work if the keywords were all very closely related and the SERP’s had a lot of URL overlap.

The threshold is a guess-and-test number, but your initial thought of staying relatively close to your website’s DA is a good start. You were also right to adjust this so the range is smaller as the DA increases; if your site has a DA of 40, using 50 is fine, but if your site has a DA of 60, 65 would be more appropriate.

As for the multiplier, this should be gleaned from historical performance and how much it has cost you in the past to crack page 1.

Reply
