Some nice maths here, good to read something new. Taking advantage of these search results can be the hard part, since creating a “best [X]” piece of content on your own domain without mentioning competitors is tricky. Ross over at Siege did a good video on that the other day that I think people would find super useful: https://www.siegemedia.com/seo/rank-best-keywords
Nick, wow. Your ‘How to Take Advantage’ section at the end is hurting my brain (in a good way). I’ll definitely have some questions in the TrafficThinkTank #qa section next week.
Hey Nick, great stuff as usual. The last part is not clear to me:

2) “Pull back metrics” — these metrics inform what, exactly? Whether or not we can out-do their ranking somehow? (“Low Hanging” from your keyword articles.) I don’t understand how this fits into the monopoly process.

3) “Map content type to each via DeepCrawl” — how does DeepCrawl do this for the SERP URLs? I understand how you’d do it for your own site’s pages, but aren’t we talking about the pages that are ranking on page 1?

4) “Sort by content type, and create a new list for each.” Sort what by content type? And create a new list of what, for each what?

Thanks!
Hey Costa –

2) The metrics are to help you gauge the rank potential for those specific terms. If page one is dominated by brand URLs with 200+ LRDs and 10k words of content, you’ll want to look for publishers on that SERP where you can potentially get a contribution published.

3) DeepCrawl provides all the page source data (via API) for any URL you give it; it doesn’t yet map content type — that’s something we’ve built on our own in our toolset.

4) Once you’ve mapped the content types for all the URLs on page 1 of Google, for all your target keywords, you’ll want to sort by that column to filter your list into groups by page type; i.e. blog posts/articles, categories, sub-categories, product pages, landing pages, etc. Then take each of these smaller lists (where all URLs are grouped by content type) and create a new sheet, so you can analyze for patterns at the content-type level instead of looking at the whole population of pages at once.
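The grouping step in (4) can be sketched as a short script. This is a minimal illustration with made-up SERP rows, assuming the content-type label per URL has already been produced (per the comment above, that labeling lives in Nick’s own toolset on top of DeepCrawl’s page-source data, and isn’t shown here):

```python
from collections import defaultdict

# Hypothetical page-1 SERP data for one keyword set:
# (url, content_type, linking_root_domains) — all values are examples.
serp_rows = [
    ("https://example.com/blog/buying-guide", "blog_post", 120),
    ("https://example.com/category/running-shoes", "category", 45),
    ("https://example.org/best-running-shoes", "blog_post", 200),
    ("https://example.net/product/trail-runner-x", "product_page", 30),
]

# Step 4: split the full list into one group per content type,
# mirroring the "new sheet per page type" step in the process above.
sheets = defaultdict(list)
for url, content_type, lrds in serp_rows:
    sheets[content_type].append((url, lrds))

# Each group can now be analyzed for patterns on its own.
for content_type, rows in sorted(sheets.items()):
    print(content_type, len(rows))
```

In a spreadsheet this is just sorting by the content-type column and cutting each contiguous block into its own sheet; the script does the same partitioning programmatically.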
Amazing article, with interesting information and tricks! I would love to share this useful article with my https://www.nichepractice.net/ team.
So the process has you mapping content type at the URL level — i.e. blog post, publisher article, category (or hub page), product page, landing page, video, audio file (think podcast), etc. — then filtering the list by each content type and moving all your collected data to new sheets, so it’s easier to analyze and manage as you start using this data to inform the content map for your monopoly strategy.
What do you do when publishing content doesn’t give you new traffic? i.e. the pages get just 5-10 visitors a day. Thanks
I don’t know — this doesn’t happen for us. We put far too much planning in during the entire development process, and then promote on proven channels.
“Use this data to inform your SEO content map, and get crackin’.” – I almost read it in my head as ‘crack-al-ackin’
Hey Nick, awesome post. One question — not sure if I missed the answer in the post itself: say you did this all on one website, are you not worried about keyword cannibalisation? I’ve had a few instances of this on larger affiliate SEO sites where rankings have been held back by cannibalisation. Interested to get your thoughts. P.S. If you said to rank different web properties etc. in the post, apologies. Cheers, Tom.
Haha yes Tom, you’re spot on with your concern, which is exactly why the prescription here is to rank as many *different* properties as you can on the same SERP 🙂
I have to say this is confusing for me. While it may get more clicks, if you’ve created a piece of content that is ranking #1 on the page, are we then looking at making tangentially related content? I always like to put everything I have into the piece that takes the top spot.
Thanks for sharing your strategy. My question is: what do you do in the case of SERPs filled with aggregators? Like in this example with SEO services in Philadelphia…
As an SEO practitioner who really enjoys this kind of work, posts like this make me feel even more challenged by where SEO is heading — especially now that we (Azhima SEO) also offer a Local SEO service.
Great stuff. I really like the bit about checking the value of the keywords to help you determine where you should put your focus.