Realistic SEO – Understanding Rank Potential

by Nick · 40 comments

in SEO


Forrester Research projects U.S. consumers will spend $327 billion online in 2016. With Google seeing approximately 7.1 billion searches per day globally* (across mobile, desktop, and tablet), we can approximate that every single U.S. query is worth $1.26 (based on the U.S. representing 10% of the world's internet users).

*Update: thank you Rand Fishkin for pointing out the original stat was outdated.
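As a sanity check, that per-query figure can be reproduced in a few lines. The inputs below are the post's own estimates, not fresh data:

```python
# Back-of-the-envelope check of the per-query value claim, using the
# post's figures: $327B projected U.S. online spend, ~7.1B global
# searches/day, and the U.S. at ~10% of the world's internet users.
us_online_spend = 327e9            # projected 2016 U.S. online spend ($)
global_searches_per_day = 7.1e9    # approximate searches/day worldwide
us_share = 0.10                    # U.S. share of internet users

us_searches_per_year = global_searches_per_day * us_share * 365
value_per_query = us_online_spend / us_searches_per_year
print(round(value_per_query, 2))   # ≈ 1.26
```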

A recent study by BrightEdge found that SEO accounts for as much as 51% of the traffic driven to B2B and B2C pages, reinforcing that SEO is far from dead and continues to offer a long-term, sustainable ROI channel.

So, why are we talking about SEO rank potential? Regardless of who you need to pitch SEO to (your boss, your client, or a business partner), it will all come down to one question: how much time and money will this cost me?

Prior to launching any SEO campaign, you need to know the resources needed to achieve your business goals. That way you can realistically answer the question.

It’s generally a bad idea to pitch an SEO campaign without first researching the keywords you’re going to be targeting, and understanding who you’re competing with… if you bet the farm on unattainable keywords, you’re gonna have a bad time.

So What is Rank Potential?

Rank potential is an analytical approach to understanding where a given webpage can actually rank in organic search, with respect to two axes of consideration: time and investment*.

*that’s my definition

It’s not realistic to project that you’re going to outrank a Wikipedia page for an informational query, or a Fortune 500 brand for their brand or branded product name – at least not without significant investment, if ever.

My approach is to analyze a page's rank potential based on the qualitative SEO metrics for the current top 10 ranking URLs (what I will refer to as search engine results page 1, or SERP 1).

The metrics I analyze are:

  • Number of Links
  • Number of Linking Root Domains
  • Trust Flow
  • Citation Flow
  • Domain Authority
  • Page Authority

In addition to this core set, I will also evaluate additional relative measures of authority, including Domain Age, Organic Difficulty, Link Strength, Brand Footprint, and Social Media Impact.

Now I promise this is far from a beginner post, but to ensure you’re thinking of the base metrics the same way I do, I’m going to run through a quick and dirty description from my perspective.

My Perception of the Metrics

Number of Links

More important than the raw number of links, this metric is used in conjunction with the number of unique linking root domains to determine the diversity ratio. Organically authoritative websites have high link diversity ratios (LDR: links from many unique, authoritative root domains) versus gobs of links from the same 5 websites, likely owned by the same person.

In addition to the number of links as a consideration for LDR, it is also important to look at link velocity. If a site is picking up tons of links very quickly there should be an obvious cause, such as press coverage, a product launch, a new partnership, or a positive mention on a very large publication. If not, it's a sign that darker activities are afoot.

Number of Linking Root Domains

As mentioned above, this is generally a sound measure of the organic authority of a website. Like everything else in SEO there are always exceptions to the rule, but websites that maintain an organic link profile generally show anywhere from a 10 – 30% diversity ratio.

That means 1 linking root domain for every 3.3 – 10 indexed links.
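That rule of thumb is easy to express in code. Here's a minimal sketch, with the 10 – 30% band taken from the heuristic above (a rule of thumb, not an industry standard):

```python
# Diversity-ratio heuristic: linking root domains divided by total
# indexed links, with the 10-30% "organic-looking" band from the post.
def diversity_ratio(linking_root_domains: int, total_links: int) -> float:
    return linking_root_domains / total_links

def looks_organic(linking_root_domains: int, total_links: int) -> bool:
    ratio = diversity_ratio(linking_root_domains, total_links)
    return 0.10 <= ratio <= 0.30

# 1 LRD per 5 links -> 20% ratio, inside the organic band
print(looks_organic(200, 1000))   # True
# a handful of sites supplying thousands of links -> far below 10%
print(looks_organic(5, 2000))     # False
```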

Trust Flow

This metric goes hand in hand with the next metric, Citation Flow (CF), but for the purposes of this post I will describe it on its own here. This measure, like CF, was developed by Majestic, and is a relative measure of how trustworthy a link is.

Majestic has built a manually reviewed index of trusted websites and uses a general rule developed from a manual analysis of a representative sample of the URLs included.

The rule is based on their finding that:

trustworthy sites tend to link to trustworthy neighbors – Dixon Jones, Founder of Majestic SEO

Trust Flow (TF) is a logarithmic scale from 0 to 100. In my experience you should stay away from links with a TF under 25.

Citation Flow

Citation Flow is a predictive measure of how much influence a given URL is likely to pass on to the links it points to.

With specific respect to link juice, URLs with higher CF are more likely to pass more of that influence downstream to the URLs they link out to.

The practical application is that links from pages with higher CF will send stronger positive signals than their weaker alternatives (generally, anything below 15 is suspect).
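Taken together, the TF and CF rules of thumb above amount to a simple link-vetting filter. A hypothetical sketch — the function and labels are mine, the thresholds are the post's personal heuristics:

```python
# Link-vetting filter built from the post's rules of thumb:
# avoid prospects with Trust Flow under 25, and treat Citation Flow
# under 15 as suspect. These thresholds are heuristics, not Majestic's.
def vet_link(trust_flow: int, citation_flow: int) -> str:
    if trust_flow < 25:
        return "avoid"       # below the post's TF comfort line
    if citation_flow < 15:
        return "suspect"     # little influence to pass downstream
    return "ok"

print(vet_link(30, 28))  # ok
print(vet_link(16, 25))  # avoid
print(vet_link(40, 10))  # suspect
```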

Domain Authority

Domain Authority is an SEO KPI created by Moz, representing a relative measure of a domain's authoritative value on a logarithmic scale from 1 to 100. It is generally very accurate as a barometer of how *powerful* a domain is, based on that domain's link profile and citation flow. One limiting factor is that it is calculated only from links contained within Moz's web index, Mozscape.

Page Authority

The child of Domain Authority, Page Authority uses the same scale, but instead of scoring the authority of the entire domain, it is scored at the individual URL level. It is a good indication of how powerful a specific URL is within a given domain, and a great second-tier barometer for gauging the difficulty of outranking a page.

For Additional Consideration

Domain Age

This one is very obvious: it's exactly what you think it is. How old is the domain, as in how long has it been registered? What is the history of the domain and its indexed URLs and links? How many times has it changed owners (WhoIs), IPs, or nameservers?

The big search engines use some or all of these metrics as trust indicators. In addition, it is speculated that websites less than ~6 months old (think freshly registered domains) are likely to experience what is often referred to as Google Jail if they try to acquire links too quickly.

While Matt Cutts has hinted that domain age may not be a big factor, older, more established, and more trusted domains are going to have an advantage when it comes to ranking.

Brand Footprint

What I am specifically talking about here is a general sentiment analysis that can be quickly and manually run for a website’s brand name. In my experience certain search verticals use variations of Google’s ranking algorithms to serve and rank different kinds of results.

In my very humble opinion this is why you will sometimes see search results with a lot of rich snippets, or packs of video results, and even sometimes results from review or complaint websites. In these instances it is important to consider the diversity of the results and think about the experience that G is trying to provide.

If a brand (or entity) has swaths of negative press around it from specific kinds of review websites, regardless of how weak those URLs may be, it may be harder to unhinge and outrank. I believe this also has a lot to do with the idea behind time to long click, or G's projected measure of quality / user satisfaction.

Social Media Impact

I’ve saved the most variable metric for last – which is also the hardest to define. What I’m looking at here more than anything else is what do the social properties for this brand look like; how often do they update them? How popular are they? Do they own at least 80% of their brand SERP?

If not, what other kinds of websites are ranking? Are there competitor sites in their brand SERP? (that’s generally a good sign – it means that G does not yet see them as a more established Entity for that keyword).

Getting Realistic with SEO

What better way to crush the dreams of all aspiring SEOs out there than to break down just how realistic (also read: expensive) it would be to rank on some of the most coveted enterprise SERPs.

For this I’m going to analyze SERP1 for keywords in the following 3 verticals:

  1. credit cards
  2. used cars
  3. home loans

For each of these verticals I’m going to run a keyword discovery report, select the 3 keywords with the highest search volume (which is not necessarily the seed keyword), and then analyze the rank potential of the SERP’s based on the metrics I listed above.

It’s going to be a relatively rough breakdown but my goal is to illustrate the time/money/resources you need to invest if you’re going to crack big money SEO.

Ready to get realistic with your SEO? Drop me a line ›

The Keyword Discovery Process

The fastest way to generate solid keyword ideas while getting all the important metrics you need, is to use a tool that does all the heavy lifting for you.

Fire up whatever your keyword tool of choice is. For this post I'll be using Term Explorer, mostly because it gives me practically unlimited relevant suggestions (up to 90,000), but also because it provides all the search volume and competitive data I need to get started.

*Update: Term Explorer has published this killer process for how to prioritize your keywords

Using each keyword as a base seed, I’m going to run 3 small keyword jobs and get a quick 1,000 related keywords with most of the directional data I need:

Keyword Discovery Results for Credit Cards (click to enlarge)

CreditCards-kw-discovery

Keyword Discovery Results for Used Cars (click to enlarge)

UsedCars-kw-discovery

Keyword Discovery Results for Home Loans (click to enlarge)

HomeLoans-kw-discovery

Time to Select Our Analysis Pool

Based on the above results I’ve selected the following 9 SERP’s to review for rank potential. I’m going to dissect one of the most coveted terms in search (credit cards) and then select one term from each of the other keyword sets to analyze.

You can click each one of the keywords below to access all the SERP-specific data for each term – I’ve made it publicly available using Term Explorer’s SERP share functionality.

Credit Cards

Used Cars

Home Loans

All SERP screenshots were taken on 3/15/15 using Google Chrome, Logged Out in an Incognito Window.

SERP’s for “Credit Cards” Keywords

I’ve included SERP screenshots for each of the enterprise keywords I’ve chosen for this post.

Even though I’m not going to do a full analysis for all 9 keywords, based on some interesting nuances I observed across these SERP’s while selecting terms for analysis, I wanted to include all of the screenshots for reference – I’ll explain more on this later.

“credit cards”

credit cards - Google Search

“credit cards for bad credit”

credit cards for bad credit - Google Search

“best credit cards”

best credit cards - Google Search

SERP’s for “Used Cars” Keywords

“used cars”

used cars - Google Search

“used cars for sale”

used cars for sale - Google Search

“used car dealerships”

used car dealerships - Google Search

SERP’s for “Home Loans” Keywords

“mortgage rates”

mortgage rates - Google Search

“mortgage payment calculator”

mortgage payment calculator - Google Search

“fha loan”

fha loan - Google Search

A Quick Observation

In case you didn't notice, the SERPs with "Google News" results only show 9 results outside of the news interface element. Even when a local search pack is rendered, if there are no news results there are still 10 URLs shown.

Let’s Analyze Rank Potential

I’m going to approach this 2 ways:

  1. The maximum potential search rank I believe can be achieved and at what cost
  2. What is going to be required to achieve that ranking, including acquisition costs of domain vs. website, links, and content.

Here are the top 10 ranking URL’s for our first credit card keyword (click to enlarge):

SERP1_credit-cards

 

and here are the same results when I highlight them competitively from green to red, i.e. green is GOOD for us and red is BAD, relatively speaking (click to enlarge).

SERP1_credit-cards_analyzer

Breaking this Down

First and foremost, it's important to understand that the analysis we're doing here is based on all metrics at this instantaneous point in time (IPT). This is a particularly relevant consideration when looking at enterprise SERPs, as they tend to change frequently: not necessarily in terms of rankings, but in terms of qualitative metrics, i.e. these pages will likely continue to acquire more links…

Be Sure To Consider Velocity

As further reinforcement of my note above: to get a better sense of which of these URLs is building its arsenal versus potentially slowing down (or leveling out), we will want to look at the velocity at which they are acquiring new links.

For this I personally pay closer attention to the number of new linking root domains a site is adding versus just the number of links from pages. So in the case of creditcards.com's link profile (pictured below), we see they've recently been adding crap tons of new links, but the trajectory of new linking root domains tells a very different story.

link-profile_creditcardscom

They went from 9,567 LRDs on March 1st to 9,273 on March 31st, a net loss of 294. Let's compare this to the chase.com sub-domain that's ranking #2:

link-profile_chasecom

creditcards.chase.com went from 1,511 linking domains on March 1st to 1,443 by March 31st, so another net loss, but only 68. However, in terms of magnitude, creditcards.com lost 3% of its LRDs versus chase.com, which lost closer to 5%.

So at first glance chase.com's link profile looks healthier from a diversity-ratio standpoint (3.6% versus creditcards.com's 1.1%), but chase is shedding LRDs at a faster rate.

It's also worth noting that a proper velocity analysis should be done over at least the trailing 12 months, if not longer, and not just the most recent full calendar month; this was just done to show an example of link relativity for the purposes of this post.
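The month-over-month comparison above can be reproduced directly from the reported figures:

```python
# Net LRD change and percentage loss for the two profiles discussed
# above, using the March 1-31 counts reported in the post.
sites = {
    "creditcards.com": (9567, 9273),
    "creditcards.chase.com": (1511, 1443),
}

for domain, (start, end) in sites.items():
    net = end - start
    pct = net / start * 100
    print(f"{domain}: {net:+d} LRDs ({pct:.1f}%)")
# creditcards.com: -294 LRDs (-3.1%)
# creditcards.chase.com: -68 LRDs (-4.5%)
```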

Circling Back to SEO Metrics

So I’ve made my point about velocity, now I want to return to the Excel screencap above.

What we’re looking for is horizontal bands of green, hence the color coding to assist with visual identification.

So the first row that jumps out at me is #5, creditkarma.com with the following SERP metrics:

Outbound Links: 0
Relevancy Score: 6
Page Links: 391
Page PageRank: 4
Page Authority: 32
Domain Age: 9.5 years
Domain PageRank: 5
Domain Authority: 69
Domain Links: 2,193,824
Trust Score: 10
Link Strength: 10
Difficulty Score: 9.99
Trust Flow: 16
Citation Flow: 25

So as a whole, and without context, this is a pretty scary set of SEO metrics to compete with out of the gate. But as is the case with everything in life, context is king – and given these metrics are the foundation behind the URL ranking #5 for the term credit cards, this actually isn’t so bad.

The scariest numbers are the DA at 69, the trust score at 10, and the domain links at 2.1 million. But on the flip-side of the scary coin is the opportunity coin, and some of these metrics are actually pretty exciting: page links under 400, domain age under 10 years (on a SERP where the average domain age is ~15 years), and a page authority under 40.

Backing in Cost to Rank

This is where the process becomes subjective, relative to the SEO and their available resources for ranking.

My process is based on computing the value of the projected traffic using my keyword valuation model. However, I don't spend my days competing in enterprise SERPs, so I asked for some insight from some SEOs who do.

Here’s how Ian Howells, Owner of Hustle & Savvy, analyzes rank potential:

I take a look at the top 20 for the SERP and pull all the usual suspects on metrics. Then I go into ahrefs and look at how many links each is adding in a typical month.

 

That gives me a ballpark for how many links I would need to build per month to catch the lowest LRD count for page 1 and page 2 in ~12 months (or whatever timeline I’m looking at) given their starting links + new monthly additions.

 

That’s my absolute basement.

 

Beyond that, I use that to inform what I label as my conservative and aggressive max attainable positions – you can pretty confidently say you’re not going to outrank Amex for ‘amex small business credit card’, etc.

 

I then take the average new LRD acquisition cost (call it $300 for sake of argument). $300 times X links/month gives me my monthly cost just for link acquisition, times however many months for the campaign. That compared to the expected return once I achieve my max attainable position gives me my payback period.

 

If the total payback period is beyond Y months (call it 24), I’m probably not interested.
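Ian's process above can be sketched as a small payback calculator. The $300-per-link figure and the 24-month cutoff come from his description; every other input below is an illustrative placeholder, not real SERP data:

```python
# Sketch of Ian's model: find the monthly link-building rate needed to
# catch a target's projected LRD count, then cost it out and compute
# the payback period against expected monthly return.
def links_per_month_needed(my_lrds: int, target_lrds: int,
                           target_monthly_adds: int, months: int) -> float:
    """LRDs/month required to catch the target's projected count."""
    target_future = target_lrds + target_monthly_adds * months
    return (target_future - my_lrds) / months

def payback_months(links_per_month: float, cost_per_link: float,
                   campaign_months: int, monthly_return: float) -> float:
    total_cost = links_per_month * cost_per_link * campaign_months
    return total_cost / monthly_return

# Placeholder scenario: we have 20 LRDs; the weakest page-1 result has
# 100 and adds ~5/month; we want to catch it in 12 months.
rate = links_per_month_needed(my_lrds=20, target_lrds=100,
                              target_monthly_adds=5, months=12)
print(round(rate, 1))  # 11.7 links/month

months = payback_months(rate, 300, 12, 2500)
print(round(months, 1))  # 16.8 -> inside Ian's 24-month cutoff
```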

Running Ian’s Model

So, taking Ian's quantitative model into consideration, let's use some conservative revenue projections and calculate the rank potential and cost to rank for our next enterprise keyword from the "home loans" set: mortgage rates.

For the purposes of my spin on the valuation model I’m going to look only at SERP 1 (whereas Ian looks at SERP’s 1 and 2). Click to enlarge the image – sorry it’s tough with these super wide screenshots.

rank-potential_mortgage-rates

Here’s a rundown of approximately how many links each URL is adding or losing (net) per month (based on available data from Ahrefs including both recrawled and dropped links):

  1. Zillow.com = -30,000
  2. BankRate.com = -600
  3. WellsFargo.com = -1,100
  4. MortgageNewsDaily.com = -4,000
  5. QuickenLoans.com = -23
  6. MLCalc.com = -7,500
  7. LendingTree.com = -5,000 (note major swings from 1-2k per day)
  8. HSH.com = +1,100
  9. Topics.Bloomberg.com = -25

So it seems almost all of these are losing a net number of links each month…

Well that was unexpected.

Doing The Math

I started writing this post a long time ago, sometime over the summer of 2014, so I've had to update the SERP metrics more times than I'd like to admit. I'm going to update them one more time right now: the SERP has changed again since I last grabbed the information above, almost a month ago.

So as of the writing of this sentence, it’s April 13, 2015 and the link counts for the above URL set are different. As it stands right now the lowest LRD count in this SERP is the lendingtree.com page with 83 linking domains (not considering the domain seems to be losing thousands per month).

If we use Ian’s cost projections (which I think are pretty reasonable) we have a base link requirement of 83 links multiplied by $300 per link = $24,900 to crack SERP 1 for this term (in pure link cost).

Starting to put things into perspective?

Big SEO is expensive. Now let’s see if it’s worth it…

First we need a rough approximation of what a mortgage broker makes on these leads. Moneysense.ca estimates a mortgage broker makes ~$2,250 on a $300,000 mortgage, so let's say a lead is worth 10% of that, give or take: $225*.

*I have never worked in the mortgage space; if you have a better measure, please share in the comments and I'll update here.

I'm going to figure that since it's a pretty qualified search, we can assume a lead response rate of 3%. With the head term having a monthly search volume of 246,000, and the first set of results in the discovery-run screenshot coming in conservatively around 2,000,000 searches/month…

We're looking at, conservatively, 3% of 2,000,000 searches making it to that page per month, so ~60,000 qualified visits, 3% of whom are going to convert as leads = 1,800 new leads. If only 10% of those leads convert into mortgages, you're making $40,500.00 per month (180 x $225). But let's dial this way back to more conservative numbers.

If an acceptable break-even period is 12 months, that means you would need to make ~$2,075/mo, which based on this population size would mean getting just over 1% of people searching for just the head term to your website.

If you can even make it to page 2 you have a good chance of doing that…

1% of searches for "mortgage rates" is 2,460 visitors per month; with a 3% response rate that's ~73 leads. Convert 10% of those into customers and you're already at $1,575/mo (7 new customers per month), and your payback period shifts to roughly 16 months, still within Ian's model.
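Running that funnel arithmetic end to end (all rates are the conservative assumptions stated above; note the payback lands just under 16 months, comfortably inside Ian's 24-month cutoff):

```python
# The funnel above: searches -> visits -> leads -> customers ->
# monthly revenue -> payback period on the $24,900 link budget.
searches = 246_000       # monthly volume for "mortgage rates"
capture = 0.01           # ~1% of searchers reach the page
lead_rate = 0.03         # 3% of visits become leads
close_rate = 0.10        # 10% of leads become mortgages
value_per_lead = 225     # ~10% of a broker's ~$2,250 commission
link_cost = 24_900       # 83 LRDs x $300 each

visits = searches * capture                        # 2,460 visits
customers = int(visits * lead_rate * close_rate)   # 7 whole customers
monthly_revenue = customers * value_per_lead       # 7 x $225 = $1,575
payback = link_cost / monthly_revenue
print(customers, monthly_revenue, round(payback, 1))  # 7 1575 15.8
```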

My Rank Potential Model

I look at this just a *bit* differently, in that my model is even more quick and dirty.

I like to see the SERP data as a whole (in Excel), identify the row that stands out as the weakest link, put that in context of the other ranking pages, and back into what it would cost to get there.

So moving onto a keyword from the last enterprise set, I’m going to look at the SERP for “used cars.”

Google-SERP_used-cars

Do you see what I see?

#5, http://www.carsales.com.au/new/, has lots of green in its row… including one major flag: no link flow scores.

Upon further inspection, this page is returning a 404 error AND, when I check this SERP today, is no longer ranking, which I'll come back to in a bit.

But what this means is that these metrics were enough to get this page to this point: ranking in the top 5 for a head keyword that receives 450,000 searches per month.

Now if I was going to go after this SERP specifically, with the hopes of getting a page to #5 – here’s what I would consider my costs to be:

Domain

The ranking domain is only 5 years old, which is great; it's not an EMD, and it has a domain authority of 68 and a domain PageRank of 6. I should be able to pick up a comparable domain in the $3,000 – 5,000 range; however, because this is in the auto vertical, it's likely I would need to spend a bit more, so let's call it $10,000.

Development and Content

From a quick glance it looks like this site is huge, to the tune of ~40M pages. This is definitely bad news, but the good news is that these pages are all database-driven, and the content can be scraped from any of the other car listing sites. Still, this is going to represent a major cost for development; let's say $50,000 for the website and content.

Links

It gets harder still: the domain link profile for this site is serious, with over 30 million links. The good news is the page we're looking to outrank only has 1,602.

Thinking Through This

Without some very compelling reason to get into the used car space, there’s no way I would build this website. But depending on the projected returns – I may try to rank a page for only the head term set of keywords – and sell the leads.

In that case I would chop up my budget into $10,000 for the domain, $5,000 for content, and $5,000 for links. At the $10,000 price-point I would be specifically looking for a domain with a DA60+ and an existing base of links so my link budget could be dedicated to new high-end links with laser relevancy to used car topics.

At that budget I’m fairly confident that I would, at the very least, be able to crack page 1 and if need be sell the website to recover the investment capital (if the average price per lead turned out to be too low).

Key Takeaways

Spot checking enterprise SERP’s has been something I’ve enjoyed more and more each year. What continues to impress me is how often these SERP’s see complete flux.

For example, when I was checking the link numbers above, I re-crawled the "used cars" SERP to find that even logged out and incognito, I was still being served some local results based on my IP address, and it was a 3-pack from Craigslist. Craigslist is so hyper-local that it doesn't even show up on the global SERP.

What's more, every one of the SERP screenshots taken above on 3/15/15 is different as of 4/13/15. This is fantastic news for SEOs, as a SERP in flux represents an opportunity. Another good sign is that the above SERPs are not crowded out by Wikipedia pages or answer boxes.

So in light of the fact that these are all highly commercial keywords with monetization intent, they all represent SERPs with potential. Before understanding rank potential, you need to see if you really deserve a Google first-page rank.

Get your own rank potential analysis right now! Contact us for a quote!

In Closing

Earlier in the post I said I would explain why I was leaving up all 9 SERP screenshots despite the fact that I was only going to analyze 3. The reason is I want to know what you think.

What do you notice based on the signaling UIs that are rendering? If you decide to pull any fresh link data, please leave a comment and I will update the post to include any new findings.

Thank you.

About Nick
Nick is the Co-Founder of an ecommerce consultancy company and the author of this SEO Blog.

Follow me on Twitter · Visit my website →

  • Ian’s model is an interesting insight. Never thought about it from this angle. Interesting! Your post: Need to read it once again as there is so much to learn.

    • Thank you Joshua – Ian really knows what he’s talking about as one of the few SEO’s I know that really works deep in enterprise SERP’s.

  • Wow, incredibly in-depth and well-written article. More than I can take in on one reading, but it all makes sense. My minor complaint is that the graphs are really hard to differentiate for those of us who are partially (or totally) red/green colorblind. (And yeah, I know that’s a minority complaint, but there are more of us than you’d suspect!) Regardless, excellent premise, and the examples really make it clear. Thanks for the time in producing this.

    • Thanks so much Fred – that was my goal 🙂

      The graphs are actually cut right from Ahrefs and are closer to blue and orange, though they definitely change colors when saved down as JPEGs for the web. I'm going to try to get them in here as higher-resolution PNGs and see if that helps with color identification.

      Cheers!

  • Chris

    Hey Nick

    Great post & nice to see some similarities in my own process when looking for new opportunities & had been working on a similar post.

    I generally scrape the top 20 (or grab them from SEMRush) & use URL Profiler to grab all my metrics (DA, PA, LRDs, Word Count etc etc) I’ll then try & cut out the anomalies though as there’s always a few results which just don’t add up (normally a PBN hidden from link analysis bots) & work with that data.

    The link velocity/pricing is again similar to my analysis though I’m happy to go for a slightly lower TF than 25 🙂

    • Hey Chris –

      Anomalies are absolutely something I didn’t think to address but are of critical importance. I’m going to update based on some examples where link counts don’t add up and/or there are *foundational* brands that simply won’t be unseated. Yes I really liked Ian’s projected LRD cost as IME it’s pretty much dead on as an average.

      What can I say, I’m a TF Elitist.

  • Chris

    Oh, and one more thing: a tool like SERPWoo is invaluable for monitoring potential SERPs for a few weeks before taking the next steps. As you say in your post, a SERP in flux can sometimes be a great opportunity 😉

  • Jason

    Hey Nick!

    What's your sense of SERPs that contain a lot of e-comm listings from Amazon, eBay, and Walmart? I have always regarded them as almost ranking by default due to a lack of quality results. Do you think this model supports or refutes that theory?

    • Hey Jason –

      As someone who lives and dies in SERPs competing against Amazon, Wal-Mart, eBay, Home Depot, Lowes, and Grainger, I think more often than not these SERPs are ripe for optimization. The big-box brands are ranking based on domain authority and generally spend no time on on-site optimization for deeper-level sub-category and product detail pages, as well as giving very little attention to link acquisition at those lower levels.

      • Elena

        Nick,
        re SERP with a lot of e-comm listings like Amazon, Wal-Mart, Ebay – I always thought this shows G thinks search query has e-comm (buy) intent and won’t show content (learn) sites for this query, even if e-comm sites are weak and don’t optimize for the term.

      • Elena – You’re right, this keyword has been identified as carrying commercial intent, however, in my experience you can do 1 of 2 things if you don’t have a commercial page to rank here;

        1) Create a piece of informational content reviewing the product or service the query is related to and be sure to use related purchase intent keywords in the content and SEO attributes.
        2) Blast this keyword hard with informational content and lots of high-authority links to cause a “re-shuffle” in the signaling properties to Google to get the SERP to begin to show informational results as well as commercial.

        *the second is much harder than the first

      • Jason

        Thanks Nick! Glad to see you and I are seeing those SERPs roughly the same way. One more question I had though – can you explain how you arrived at the 2,000,000 searches/month estimate in the ‘Doing The Math’ section?

      • Absolutely, I’m using a rough aggregate of the head terms in the keyword set for home loans pictured here.

  • Aston

    Great write-up!

    I’m keen to start following the metrics and just wondered which backlink checker you’d recommend – Ahrefs or majestic?

    Thanks

    Aston

  • Lots of lessons to be learned from this post … was taking notes the entire time!

  • Hey Nick,

    Great article! While I never really looked at the ROI and ROT from your perspective, it does shed some new light that I will need to incorporate into research before a site build-out.

    I do build some personal assets for PPC and PPL, but I'm mainly focused on local client SEO.

    I can see how most of this applies to local. Do you do local client work? If so, do you use these same metrics when putting together a proposal?

    • Hey Jason – Thanks for sharing your thoughts.

      I don’t do much in terms of Local SEO but I would use the same methodology and just tweak the approach to include analysis of NAP’s and citations versus pure play links since they seem to be better local signals from what I’ve seen.

  • Great article! Explained some things I wasn’t totally clear on, thanks for a great article!

  • I don’t agree with most of the SEO metrics that you have chosen to determine a page’s ranking potential. I had a look at the metrics and gave up on reading the rest of the article. What about the most important metric called “content”?
    Anyway, let's discuss the rest of the metrics.

    1. Number of links – it is the quality of links that matters rather than the number.
    2. Number of linking root domains – based on #1, internal links can outperform external links, so the number of linking root domains is irrelevant. If you know how to write high-quality content, a page from your own website can always provide a better link than one from an external website.
    3. Trust Flow – you need to bother with "trust" only if you are trying to spam search results.
    4. Citation Flow – like links, few people have any idea what good citations are, let alone how to measure their potential.
    5. Domain Authority and Page Authority – this has more to do with the domain content and page content than anything else.

    So, my final word is that content is the most important metric to consider when determining the ranking potential of a page. All the metrics you have mentioned have a direct connection with content, which apparently few SEOs feel the need to analyze. I have ranked pages for more than one keyword on new websites with zero external links by focusing purely on content, and the observations I have made above are based on this fact.

    • Hey Satyabrata –

      First off, thank you for stopping by and sharing your thoughts – I appreciate the opportunity to educate.

      Second, and most importantly – you could not possibly be more dead wrong.

      A few things to consider:
      1. This post is about enterprise SEO – review the keywords in the post, and then please show me an example where you’ve cracked page 1 with no links.
      2. Number of linking root domains and link acquisition velocity have been shown, time and time again, to move the needle. The fact that you dismiss linking root domains as irrelevant means you have done a lot of reading and not much ranking.
      3. Internal links are important, yes – but external links will always trump internal links.
      4. Trust Flow is a measure of how trustworthy your website is *overall* based on your link profile, and low trust is actually a great indicator of spam.
      5. Citation flow doesn’t actually back into local citations, at all – it’s a measure of how much link juice a link will pass from one document to another. Amidst all of your slurping from the Google Kool-Aid I urge you to follow the link above to Majestic’s glossary and catch up on what the term means before denouncing it.
      6. DA and PA have nothing to do with content – they’re authority metrics.

      I believe you’re telling the truth when you say you’ve written a great page and ranked it with zero links, however, this post looks at competitive keywords and not ranking for yellow glazed banana walnut bread with sprinkles in a round pan on a sunday afternoon… which based on your understanding of search signaling is likely similar to the terms you “rank” for.

      • Hi Nick, I appreciate the fact that your opinion differs on many aspects and am willing to learn from your thoughts. I hope that my thoughts stated below will sound reasonable to you. It would be foolish of me to publish here the results of any SEO experiment isolating ranking signals.

        1. I admit that I haven’t cracked page 1 for competitive keywords like “credit cards” and “used cars”. I haven’t had the opportunity to do so. I will surely give it a try sometime in the future.
        But I have ranked keywords like “ink cartridges” and “landscaping” on Google’s page 1, although I did make use of a handful of external links.
        Also, let me assure you that the keywords ranked on page 1 by using content and internal links only, are surely not of the type “yellow glazed banana walnut bread with sprinkles in a round pan on a sunday afternoon”. In fact, the copywriters of my current organization, Copytous.com, have done a pretty decent job of writing content and ranking in page 1 for some keywords, without any prior knowledge of SEO.

        2. I didn’t state that linking root domains are irrelevant. I stated that the “number” of linking root domains is irrelevant. Why? Because vendors like Ahrefs and Majestic have no way of knowing or calculating how much value Google has assigned to each link they detect. Hence the reference to link quality in my earlier comment. For all you know, half the links you have specified in your metrics may be classified as “spam” by Google. As your conclusions are data-driven, you need to first filter out the irrelevant data (links discounted by Google or classified as spam); otherwise your data will be skewed.

        3. A link passes different signals to a search engine. An external link passes certain additional signals and hence, is generally considered to be more beneficial than an internal link. The value of a link depends a lot on the content of both the linking page and the linked-to page. As you have absolute control over an internal link (the linking page), on most occasions, you can make it outperform an external link while trying to raise the rank of a particular keyword.

        4. You have based trust, citation and authority on links and deemed them completely independent of content. A link itself is part content (anchor text). The quality/value of a link is determined by content. Moreover, I believe the trust and authority of a page/domain can NEVER be solely determined by links; the page/domain content will always play a role.
        However, we have to accept that all this can be answered correctly only by the people at Google.
        Just because everybody, or some self-proclaimed authority (Majestic?), is stating something, doesn’t necessarily make it true.
        Even content judged as “high quality” by Google, can be untrue.

  • Hey Nick,

    Thanks for sharing such an informative blog. I mostly agree with your thoughts, especially the keyword-research ideas. I have used a different strategy for a long time and was quite satisfied until now. Next time I will try to merge my strategy with yours and will let you know the outcome.

  • Thanks so much Joshua for applying a ranking system to keywords that is easily understandable. You really put a new, in-depth spin on some old ideas. One question I have is how often do you think it is necessary to research keywords if you are writing content every day?

  • Thank you Nick for posting this article.

  • Wow, incredibly in-depth and well-written article. More than I can take in on one reading, but it all makes sense. My minor complaint is that the graphs are really hard to differentiate for those of us who are partially (or totally) red/green colorblind. (And yeah, I know that’s a minority complaint, but there are more of us than you’d suspect!) Regardless, excellent premise, and the examples really make it clear. Thanks for the time in producing this.

  • I really love your post, thanks for sharing this, and I’m looking forward to seeing more from you.
    Excellent post, would love to see more.

  • Pingback: Paid & Organic Approaches To Dig Deeper With An SEO Keyword That's Working

  • Very long post, but it’s all worth it. Lots of valuable information that can really help you in SEO.

  • Seeing SEO reinforced like this is striking. With billions of searches arriving across so many devices, combining all available web channels is key to growing an online strategy. Attainable keywords are a must for reaching the right customers, and organic searches that produce real results lead to ranking potential. SERP analysis with the given metrics is indeed valuable. Good to follow!

  • Very great! Thanks for sharing this SEO tutorial.

  • Great article!

    $225 is a good number for mortgage leads unless they are complete junk.

  • This is a nice post for learning about lead generation through SEO.

  • Thank you for sharing valuable information, Nick.

  • I love your keyword researching ideas. Thanks for posting.

  • Thanks for the awareness of this topic, “Rank Potential”. It’s really a new topic for me to learn, and I learned lots of new things about it from your estimates, but I don’t know whether most marketers work with this strategy or not.

  • Pingback: SEO Basics: Complete Beginner’s Guide to Search Engine Optimization — Octa Eye

  • Pingback: How to Write a Marketing Plan

  • Pingback: How To Increase Qualified B2B SEO Traffic In 2016 & Beyond

  • moses

    This is a splendid post, and truly about rank potential. It is well analysed and gave me more insight into the whole idea. Thank you for going into detail and sharing your personal experience in this article. I am sure this will be a great guide to help in our business.

    Thank you, Nick, for being a great blogger.
