February 14, 2019
It’s not about ranking #1…
Well, it’s not only about ranking #1.
Like everything driven by technology, SEO is constantly changing, and it has reached a tipping point: to compete, you need to own more real estate on the SERPs.
This means moving past the vanity of having your website's URLs rank for your priority keywords and instead accepting that Google ranks different types of content for different types of keywords; the type of content currently on your website may not be able to rank for the terms you're targeting.
This is quite literally what RankBrain is all about.
The practical application is that you may have to accept that to get traffic for the keywords you want, you need to be present on websites that aren't yours.
We see this most often in software verticals, where SERPs are made up of a mix of providers, reviews, and directories, like this SERP for "crm software":
If you go check out this SERP, take note of a few important elements:
The last point is the most important one.
This SERP has 10 organic results that are NOT actual providers of CRM software (out of 12 organic results overall). Instead, they're review sites like Finances Online, Capterra, Webopedia, and Software Advice, or publishers like PCMag, Business News Daily, Forbes, and HBR.
Then take note of the two software brands that ARE actually ranking: Salesforce and Zoho. In short, you have no chance of ranking your brand's website on this SERP.
However, I’m willing to bet you HubSpot sees a fair bit of traffic (and conversions) from this SERP.
HubSpot is the highest-ranked (organic) provider on this SERP (native English speakers scan left to right, then top to bottom), likely sending them a steady stream of new leads.
Now check it out on mobile;
It’s not enough to rank your product or feature page somewhere on page one, diluted among a mix of results from directories, YouTube results, and review blogs.
Instead, you need to own the opinions and awareness across every bit of that SERP, and that's where the SERP Monopoly Strategy comes into play.
Before we get into how to monopolize your target SERPs, the first step is to identify where to direct your efforts so you're not investing in the wrong keywords.
If you haven’t built out a comprehensive keyword matrix to determine this, there’s a shortcut you can take using the commercial value of your priority terms.
The cheat code here is simple: take your priority list of terms and multiply each term's average monthly search volume by its average cost per click (or, if you're a lazy marketer, use KeywordKeg, which will do this for you).
The goal is to identify the terms where buying traffic would be most expensive versus ranking organically and acquiring that sustainable traffic for free, month after month.
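As a rough sketch, the volume-times-CPC shortcut looks like this in Python. The keyword figures below are made-up placeholders, not real data from any tool:

```python
# Estimate commercial value: monthly search volume x average CPC.
# All keyword numbers here are illustrative assumptions, not real data.
keywords = [
    {"term": "crm software", "volume": 30000, "cpc": 25.00},
    {"term": "crm software for sale", "volume": 2400, "cpc": 22.00},
    {"term": "cheap crm software", "volume": 1900, "cpc": 12.00},
]

# Commercial value = what it would cost to buy this traffic every month.
for kw in keywords:
    kw["value"] = kw["volume"] * kw["cpc"]

# Highest value first: the terms most expensive to buy instead of rank for.
for kw in sorted(keywords, key=lambda k: k["value"], reverse=True):
    print(f'{kw["term"]}: ${kw["value"]:,.0f}/mo')
```

Sorting by that value column gives you a quick priority order before you invest in a full keyword matrix.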
So looking at the above data we can see that “crm software for sale” is the second most valuable commercial keyword that we would want to be monopolizing for after “crm software.”
Another big consideration: "crm software," while it's the 800-pound gorilla, might just be too expensive to go after due to its level of difficulty. Instead, it may make more sense to consider some of the "cheap" modifier keywords (even if you don't sell "cheap software" AND don't want your product associated with being cheap).
This just requires a bit of creativity in how you create assets to grab rankings in those SERPs; remember, they don't necessarily need to be assets that carry your brand.
I don’t want to make this post about keyword research, but if you need a refresher on how to find and prioritize your most important terms (based on commercial value), check out my other post that is actually about keyword research.
I've picked a fun local term (that I personally don't care about, but that's another discussion; if you really want to know why, shoot me a tweet): "seo services philadelphia" (for the purposes of this example I've scrolled past the 4-pack of AdWords ads and the map pack with 3 results).
Starting to see where this is heading?
Remember, the main goal here is to build a list of commercially focused terms so you can offset your cost of traffic acquisition via SEO.
Once you've gotten your priorities identified and confirmed, it's time to dial in your strategy to own as much of the boardwalk as possible.
The sheer amount of physical real estate that you’re able to occupy on the SERP has become the equivalent of owning Boardwalk and Park Place on the Parker Bros game board.
Each ranking position on the SERP accounts for an approximate percentage of total clicks, with the breakdown being a relative average (based on a running study by AWR) on desktop of:
and on mobile:
and featured snippets pull in approximately 8.6% of clicks (according to Ahrefs).
This means the gains compound for every additional ranking you're able to own on the first page.
If you have 2 rankings on desktop SERP 1, regardless of position, you’re looking at an average net click-through rate of 14.48%.
If you have 3, it's 21.72%; 4 is 28.96%; and 5 is 36.2% (based on a gross average of ~7.24% per ranking on "page 1"). This is just a rough baseline; the actual totals are likely much higher depending on which positions those rankings hold.
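A quick sketch of that stacking math, using the ~7.24% page-one average from the AWR-based figures above:

```python
# Average aggregate CTR per page-one ranking, per the AWR-based figure above.
AVG_CTR_PER_RANKING = 0.0724

def total_ctr(num_rankings: int) -> float:
    """Rough net CTR for holding several page-one rankings at once."""
    return num_rankings * AVG_CTR_PER_RANKING

# 1 through 5 rankings on the same SERP.
for n in range(1, 6):
    print(f"{n} ranking(s): {total_ctr(n) * 100:.2f}% of searches")
```

This is a flat average; the real curve is steeper for top positions, so treat these as floor estimates.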
When you start backing into revenue based on these increased traffic-share percentages, the numbers become staggering.
Imagine a scenario where your site converts 3% of gross traffic into leads, you convert 34% of those leads into sales, and your average sale is $100.
This means every visit to your website is worth about $1.02, which doesn't seem like much, but follow the math:
100 visits = 3 leads = ~1 sale x $100 = ~$100.00 (gross conversion rate ≈ 1%)
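Spelled out, the funnel math uses the same assumed 3% / 34% / $100 figures from the example:

```python
# Assumed funnel from the example above (illustrative numbers).
visit_to_lead = 0.03    # 3% of visits become leads
lead_to_sale = 0.34     # 34% of leads close into sales
avg_sale = 100.00       # average sale value in dollars

# Each visit's expected value is the product of the whole funnel.
value_per_visit = visit_to_lead * lead_to_sale * avg_sale
print(f"Value per visit: ${value_per_visit:.2f}")

visits = 100
leads = visits * visit_to_lead
sales = leads * lead_to_sale
print(f"{visits} visits -> {leads:.0f} leads -> {sales:.2f} sales -> ${sales * avg_sale:.2f}")
```

Multiplying the funnel stages through gives roughly a 1% gross visit-to-sale conversion rate, which is where the per-visit dollar value comes from.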
Now let’s assume that your current SEO traffic is based on ranking #1 (and only #1) for a keyword with 1,000 searches per month.
Your average CTR for your position #1 ranking is ~22%, which means you’re receiving 220 visits/month, creating 2 sales worth $200.
If you were to increase your total share of real estate on that one SERP, at an average aggregate of ~7.24% CTR per additional ranking, here's how it would break down into revenue:
In this example every additional ranking equals an additional sale each month from the same SERP.
You need to understand that this is at the individual keyword level, so this compounding return applies to every keyword you dial in, and it will continue to pay returns month after month.
Based on the above model, you're ranking 5 URLs on SERP 1 for a keyword averaging 1,000 searches per month, netting you roughly $500 in revenue per month.
If we expand on this and say you're able to do this for a reasonable number of terms, call it 300 keywords (where you're able to eat up 5 rankings anywhere on page 1) with a more realistic aggregate search volume of 85,000 searches/mo, then at a ~36.2% aggregate CTR you're looking at roughly 30,800 visits and about $31,000 in monthly revenue (and that's with just a ~1% gross conversion rate... which kind of sucks).
Compare this to having only the #1 ranking on those same SERPs: a ~22% CTR brings in only 18,700 visits/month, equaling roughly $18,700 in total revenue.
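Putting the model together as a sketch (all inputs are the illustrative assumptions from this example, and real CTRs vary a lot by position, so these outputs will differ slightly from the post's rounded figures):

```python
# Revenue model from the example: 1,000 searches/mo, ~$1.02 value per visit,
# #1 ranking CTR ~22%, and each extra page-one ranking adding ~7.24% CTR.
SEARCHES = 1000
VALUE_PER_VISIT = 1.02       # from the funnel math earlier in the post
TOP_SPOT_CTR = 0.22
EXTRA_RANKING_CTR = 0.0724

def monthly_revenue(num_rankings: int) -> float:
    """Estimated monthly revenue from one SERP given N page-one rankings."""
    ctr = TOP_SPOT_CTR + EXTRA_RANKING_CTR * (num_rankings - 1)
    return SEARCHES * ctr * VALUE_PER_VISIT

for n in range(1, 6):
    print(f"{n} ranking(s): ${monthly_revenue(n):,.2f}/mo")
```

Scaling that per-keyword function across your whole priority list is what turns the monopoly strategy into the aggregate numbers discussed above.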
The best way to take advantage of this is to:
Thoughts? Questions? Criticisms? Drop a comment below; I read and respond to every one.
Nick is the Founder and Chief Strategy Officer of From The Future. When he's not elbow deep in data, he's spending time with his wife, his dogs, or his cars.
Another great one Nick, thanks for sharing!
Great information, Nick. Thanks for sharing your thoughts.
Some nice maths here, good to read something new.
Taking advantage of these search results can be the tricky part, as creating a "best"-type piece of content on your own domain without mentioning competitors is difficult. Ross over at Siege did a good video on that the other day that I think people would find super useful: https://www.siegemedia.com/seo/rank-best-keywords
Your ‘How to Take Advantage’ section at the end is hurting my brain (in a good way).
I’ll definitely have some questions in the TrafficThinkTank #qa section next week.
Hey Nick, great stuff as usual. Last part is not clear to me:
2) “Pull back metrics” — these metrics inform what exactly? Whether or not we can out-do their ranking somehow? (“Low Hanging” from your kw articles) I don’t understand how this fits into the monopoly process.
3) “Map content type to each via Deepcrawl” — how does Deepcrawl do this for the serp urls? I understand how you’d do it for your own sites pages, but aren’t we talking about the pages that are ranking on page 1?
4) “Sort by content type, and create a new list for each.” Sort what by content type? And create a new list of ??? for each what?
Hey Costa –
2) The metrics are to help you gauge the rank potential for those specific terms. If page one is dominated by brand URLs with 200+ LRDs and 10k words of content, you'll want to look for publishers on that SERP where you can potentially get a contribution published.
3) DeepCrawl provides all the page source data (via API) for any URL you give it, it doesn’t yet map type — this is something we’ve built on our own in our toolset.
4) Once you've mapped the content types for all the URLs on page 1 of Google, for all your target keywords, you'll want to sort by that column to filter your list into groups by page type, i.e. blog posts/articles, categories, sub-categories, product pages, landing pages, etc. Then take each of these smaller lists (where all URLs are grouped by content type) and create a new sheet, so you can analyze for patterns at the content-type level versus looking at the whole population of pages at once.
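If it helps, here's a minimal sketch of that grouping step in plain Python. The keywords, URLs, and content types below are hypothetical placeholders; in practice this data would come from your crawl export with types already mapped:

```python
from collections import defaultdict

# Hypothetical page-one results with content types already mapped
# (DeepCrawl supplies the page source data; the type mapping is your own tooling).
rows = [
    {"keyword": "crm software", "url": "https://example.com/best-crm-tools", "content_type": "blog post"},
    {"keyword": "crm software", "url": "https://example.com/crm", "content_type": "product page"},
    {"keyword": "crm tools", "url": "https://example.com/software/crm", "content_type": "category"},
]

# Group URLs by content type so each group can be analyzed for patterns
# on its own, instead of scanning the whole population of pages at once.
by_type = defaultdict(list)
for row in rows:
    by_type[row["content_type"]].append(row)

for content_type, group in by_type.items():
    print(content_type, "->", [r["url"] for r in group])
```

Each `by_type` group maps to one of the "new sheets" Nick describes, ready for per-type pattern analysis.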
Greatly explained, Nick!
Amazing article! With Interesting information and tricks. I would love to share this useful article with my https://www.nichepractice.net/ team.
Hi Nick what did you mean by “Sort by content type, and create a new list for each.”
So the process has you mapping content type at the URL level, i.e. blog post, publisher article, category (or hub page), product page, landing page, video, audio file (think podcast), etc., then filtering the list by each content type and moving all your collected data to new sheets, so it's easier to analyze and manage as you start using this data to inform the content map for your monopoly strategy.
Fantastic article, Nick. The smaller metrics suit me. I'll definitely be using the recommended tools.
What do you do when publishing content doesn’t give you new traffic? i.e. the pages get just 5-10 visitors a day. Thanks
I don’t know — this doesn’t happen for us. We put far too much planning in during the entire development process and then promote on proven channels
“Use this data to inform your SEO content map, and get crackin’.” – I almost read it in my head as ‘crack-al-ackin’
I actually like that way better 🙂
Please do a video on this. It will help us understand a lot better.
I’m going to 🙂
One question, not sure if I missed the answer in the post itself: say you did this all on one website, are you not worried about keyword cannibalisation? I've had a few instances of this on larger affiliate SEO sites where rankings were prevented due to cannibalisation. Interested to get your thoughts!
P.S. If you said rank different web properties etc in the post apologies.
Haha yes Tom, you’re spot on with your concern which is exactly why the prescription here is to rank as many *different* properties as you can on the same SERP 🙂
I have to say this is confusing for me. While it may get more clicks, if you’ve created a piece of content that is ranking #1 on the page, are we looking at making tangentially related content? I always like to put everything I have in that piece that takes the top spot.
I see you answered it above! Rank of different properties. How did I miss that?
Thanks for sharing your strategy. My question is, what do you do in the case of SERPs filled with aggregators? Like in this example with SEO services in Philadelphia...
As an SEO practitioner who really enjoys this work, it makes me feel even more challenged by where SEO is heading.
Especially now that we (Azhima SEO) have also launched a Local SEO service.
Great stuff. I really like the bit about checking the value of the keywords to help you determine where you should put your focus.