Technical SEO Increased Traffic by Over 1 Million Visits Per Month

I wrote this post 3 years ago to show the impact technical SEO can have on your organic traffic.

I’m updating it to show you what happens when you turn your back on it.


[white_box]When I wrote this post in 2015, I was a partner at YourListen, a user-generated music sharing site very similar to SoundCloud. I was able to identify and fix a number of technical errors that increased organic traffic more than 300% in 18 months, to over 1 million visits. We sold the site shortly after, and the new owners failed to manage it properly. The impact on traffic was devastating.[/white_box]

The crazy thing is that if they had maintained the structure we put in place, they’d still be receiving millions of organic visits.

Whether it’s 2013 or 2019, good technical SEO is good technical SEO – the fundamentals remain the same.

In this post, I’ll cover:

  • How to leverage your site’s internal link equity and your strongest pages’ citation flow to send link equity to the pages that need it most.
  • Optimizing your site’s crawl budget to make sure you’re not wasting Googlebot’s time crawling pages with thin or near-duplicate content.
  • The power of properly built and optimized sitemaps, especially for large (enterprise) UGC websites.
  • How to take advantage of “one-time events” to produce bursts in organic traffic.

Let’s dive in.

Website background

  • The site wasn’t new and had a solid amount of equity with search engines.
  • It had a decent base of organic traffic (~230k/mo), although engagement from search wasn’t great.
  • We created NO content (yes, ZERO) – this site ran 100% on user-generated content (UGC).
  • I had no team outside of myself, the owner, and his developer.
  • We had no advertising spend of any kind; all traffic coming to the site was organic.

Many of these principles (if not all) can and should be applied to new sites, but in this case these changes were used to unlock growth potential.

Key Drivers To Growth

The following 3 areas were the strategic elements that had the biggest impact on organic growth.

1. Continuous Change

To revisit a still hugely important piece of SEO: pumpkin hacking. We paid attention to which pages were picking up the most steam in terms of traffic, and we took extra steps to make sure these pages got as much of a boost as we could give them.

To give these pages that little extra boost, we used one of the most powerful tools we had for search: our website itself.

Sounds super obvious and a bit stupid, doesn’t it? But seriously, it’s perhaps the most commonly overlooked factor for driving organic growth, which brings me to my next point:

2. On-Site SEO

We added links with target anchors to the homepage, created new sections on top-level pages to display links for “featured content” around the site, and baked new views into site-wide navigation.

In addition, there were some very specific problems with the way the code was built and rendered on the pages, which I’ll come back to in a bit.

By leveraging a mix of Google Search Console, a lot of blood and sweat in development hours, and amazing tools like URL Profiler, cleaning up the site’s foundation proved invaluable.

Special thanks to Patrick and Gareth at URL Profiler for all your help with the massive crawl logs for YourListen. Also, be sure to check out Sitebulb – it’s amazing.

3. One-time Events

One interesting tool that only some of the smartest digital marketers on the internet know about, and fewer still know how to use, is traffic leaks.

This strategy involves engineering large amounts of short-term traffic to your site by acquiring (or “leaking”) that traffic from another site that already has the traffic you want. Perhaps the most popular site for running these kinds of campaigns is Reddit.

We didn’t use Reddit, but we did put a lot of time into building relationships with people who held similar keys to kingdoms of huge traffic.

As a music and audio hosting site, after taking a hard look at analytics we found that our big traffic events happened when new music was released, and more specifically, when it was released on our site.

So how did we make sure we became the source for music leaks? By building relationships and mindshare with the people who have access to this content.

To do this, we leveraged the relationships with the lowest barrier of rejection: existing power users on the platform.

You’d be amazed at how many people who use your tools are genuinely willing to help, just because you ask.

A Look At Traffic Growth

When I first joined the YourListen team:


18 months later:


When I first came on, the site was averaging between 7,000 and 8,000 visits per day. Not long after finding and implementing these changes, the site shot up to an average of 25,000 visits per day.

We did finally hit that milestone (as you can see above); the site then leveled off to a sustainable run rate closer to 800,000 visits per month.

Here’s how that traffic breaks down in terms of acquisition channels:


How to Leverage “One Time Events”

Coming back to this idea, because it’s critical to hitting these kinds of traffic levels: when you’re able to have these big traffic events a couple of times per year, they have an amazing effect on your overall residual traffic.

What we’ve noticed is that even after the URLs driving these spikes stop receiving huge temporary increases in traffic, the overall site traffic lifts up, and stays there.

Notice in the screenshot above that 3 days after the spike, as traffic starts to level back off, it begins to grow again, all on its own.

What’s Causing This?

Those large spurts of traffic generate awareness, which leads to an increase in traffic from all channels; direct, social, and of course organic.

They directly lead to increases in branded search, type-in traffic, and social referrals. All sending a lot of positive trust signals throughout the interwebs.

All of these contribute to what I’ll call the trust graph of your website from a search engine perspective, not to mention the links 🙂

One of the leaks on YL gained so much attention that it resulted in links from big websites like Billboard, USA Today, and other major media outlets, not to mention a phone call from Prince.

If you want to take a deep dive into the workings of traffic leaks, I highly recommend you check out TrafficLeaks, a free resource site courtesy of evil genius CCarter.

Back To On-Site SEO

There were some really specific things that needed to be fixed, as well as some enhancements that started to really make a difference.

Let’s start with the things that we needed to clean up.

Taking a closer look at the track pages (the individual URLs that comprise more than 95% of the site’s content), I realized that the track titles being displayed lived within the custom Flash player, not in the page’s HTML.


What’s worse? The page had no H1; the track title was actually being rendered in the HTML, but then hidden with a display:none rule in the CSS.

Bad, bad, bad.

Google parses all of this HTML and this looks shady as hell.

The fix was simple: pull the track title out of the Flash player, turn on the HTML version, and wrap it in an <h1> tag.
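A quick way to catch this class of problem in an audit is to parse the markup and flag pages with a missing or hidden main heading. Here’s a minimal sketch using Python’s stdlib html.parser; note it only inspects inline styles, whereas our culprit was a stylesheet rule, so a full audit would also need the rendered CSS:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Flags pages with no <h1>, or an <h1> hidden via an inline display:none."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.hidden_h1 = False

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1
            style = dict(attrs).get("style", "")
            # Normalize whitespace so "display: none" is also caught.
            if "display:none" in style.replace(" ", ""):
                self.hidden_h1 = True

def audit(html: str) -> str:
    parser = HeadingAudit()
    parser.feed(html)
    if parser.h1_count == 0:
        return "no <h1> found"
    if parser.hidden_h1:
        return "<h1> present but hidden"
    return "ok"
```

Run against a track page’s HTML, `audit` would have flagged both problems we hit: the Flash-only title (no `<h1>` at all) and the hidden one.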

URL Architecture

Looking at the way the URLs were built, not only were they not using the most ideal naming conventions for topical relevancy (woohoo, keyword research!), but they were all flat.

Flat URLs work when you have extremely horizontal content sets, like Wikipedia, but they fall short when you have big top-level content groups that all of your other content fits into.

So we organized all the content types into 2 master parent directories:

  1. Audio, and
  2. Music
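Mechanically, this kind of migration boils down to mapping each flat slug into its parent directory and emitting a 301 redirect from the old path. Here’s an illustrative sketch; the content-type labels are hypothetical, not YourListen’s actual taxonomy:

```python
# Hypothetical content-type labels used to pick a parent directory;
# the real site's taxonomy differed.
MUSIC_TYPES = {"track", "song", "album"}
AUDIO_TYPES = {"podcast", "audiobook", "sound-effect"}

def new_path(slug: str, content_type: str) -> str:
    """Map a flat slug into its new parent directory."""
    if content_type in MUSIC_TYPES:
        return f"/music/{slug}"
    if content_type in AUDIO_TYPES:
        return f"/audio/{slug}"
    return f"/{slug}"  # unknown types keep their flat URL

def redirect_pairs(items):
    """items: iterable of (slug, content_type); returns 301 (old, new) pairs."""
    return [(f"/{slug}", new_path(slug, ctype))
            for slug, ctype in items
            if new_path(slug, ctype) != f"/{slug}"]
```

The output pairs feed straight into whatever redirect layer you use (nginx maps, .htaccess rules, etc.), so no legacy flat URL loses its equity.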


Optimizing Crawl Budget

This is far from a new concept; however, I still constantly meet and speak with “SEOs” who give it no consideration.

Which is real stupid.

While Google doesn’t exactly have limited resources, it does start to pay less attention to your site’s pages, and adjusts the importance it assigns them, based on what it finds when it crawls your site.

Using a combination of your site’s robots.txt file and the URL Parameter management console within Google Search Console, you need to help Google help you.

What I’m talking about is filtering out URL parameters that are not priorities.

For YourListen this meant taking the URLs that hosted the embeddable player, the source .swf files, and a handful of code assets, and moving these to their own directories so they could be blocked from being crawled, like so:


and then going into the URL Parameter manager in WMT and blocking all the parameters that don’t merit having every variation crawled:



[white_box]Check out our new guide on crawl optimization.[/white_box]
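On the robots.txt side, it pays to sanity-check your draft rules before shipping them. Here’s a small sketch using Python’s stdlib urllib.robotparser; the directory names are hypothetical stand-ins for where the player, .swf, and asset files were moved:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical directory names standing in for the real ones.
ROBOTS_TXT = """\
User-agent: *
Disallow: /embed/
Disallow: /swf/
Disallow: /assets/
"""

def is_crawlable(url: str, agent: str = "Googlebot") -> bool:
    """Check a URL against the draft rules before deploying them."""
    rp = RobotFileParser()
    rp.parse(ROBOTS_TXT.splitlines())
    return rp.can_fetch(agent, url)
```

With these rules, an embed URL like /embed/12345 is blocked while the track pages themselves stay fully crawlable, which is exactly the split you want: crawl credit spent on content, not code assets.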

Building Optimized Sitemaps

For all the veteran SEOs reading this, you already know how important these files are, but let me share a few lessons with you.

I’ve worked on large sites, but I can honestly say that YourListen was probably the largest – with millions of individual content pages.

In case you didn’t know, Google limits its individual sitemap crawls to 10MB and 50,000 URLs.

But based on my first-hand experience, we found that limiting individual XML sitemaps to 35,000 URLs yielded a far more effective crawl.


Furthermore, we found that nesting sitemap files into indexes improved the efficacy of the crawl rate, and that actually linking to them (with <a> tags) helped us maximize the number of URLs parsed during the crawls.
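The splitting-and-index logic can be sketched roughly like this; the file naming and domain are made up, and a real generator would also want <lastmod> values and gzip compression:

```python
CHUNK = 35_000  # the cap that crawled better for us than Google's 50k limit

def sitemap_xml(urls):
    """Render one sitemap file for a chunk of URLs."""
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>\n")

def build_sitemaps(urls, base="https://example.com"):
    """Split urls into 35k-URL files and return (index_xml, [file_xml, ...])."""
    chunks = [urls[i:i + CHUNK] for i in range(0, len(urls), CHUNK)]
    files = [sitemap_xml(c) for c in chunks]
    refs = "\n".join(
        f"  <sitemap><loc>{base}/sitemap_{n}.xml</loc></sitemap>"
        for n in range(1, len(files) + 1))
    index = ('<?xml version="1.0" encoding="UTF-8"?>\n'
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
             f"{refs}\n</sitemapindex>\n")
    return index, files
```

You’d write the index out as sitemap.xml and each chunk as sitemap_1.xml, sitemap_2.xml, and so on, then submit the index in Search Console.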


Finally, I found that when uploading the sitemaps it helped to not only test them for errors (confirming to Google that they’re properly formatted) but also to re-submit them every day for a few days, essentially brute-forcing the re-crawl.

Unfortunately you can’t set a date range in Search Console, so I can’t show how big a difference the above changes made to the index rate, but I can show the effect this has had for at least the past 10 months:


Key Takeaways

The biggest lesson I hope you take away from this post is just how incredibly important it is to be meticulous about the cleanliness of your site’s markup and structure.

In addition, here are your TL;DR points:

  • Leverage your site’s internal link equity and your strongest page’s citation flow. Use this to your full advantage to send link equity to the pages that need it most.
  • Optimize your site’s crawl budget and make sure you’re not wasting Googlebot’s time crawling pages with thin or near-duplicate content.
  • Don’t underestimate the power of properly built and optimized sitemaps, and make sure you’re submitting them regularly through GSC.
  • Put in the work to engineer a few big traffic pops; while each may only be a flash in the pan in terms of growth, the sum of these parts will create a larger whole.
[white_box]Want our team of experts to review your site? Let’s talk.[/white_box]


  1. Hey Nick – Thanks for the marvelous post! I really enjoyed reading it, you are a great author.

    I want to encourage that you continue your great writing on SEO, have a nice morning!

  2. Hey Nick,

    Cool post. It’s good to see that technical SEO, when done right, can seriously increase the amount of traffic to a site.

    I just had one question on thin pages.

    I noticed you mentioned in your key takeaway to stop Google from crawling thin pages (particularly parameters).

    I was wondering whether you ever took this method a step further to try and increase traffic, by intentionally reducing the number of pages on a site to prevent low quality pages receiving link authority so that your best pages receive more authority to rank better?

    In your experience does generally reducing the number of pages on your site benefit your top pages?

    1. Hey Alan,

      Thanks for reading.

      By using the URL Parameters manager to block specific parameter patterns that create new unique URLs (i.e. sorting by popularity, color, size, brand, etc.), you keep Google away from parameterized pages whose content is not unique and thus adds no value to your crawl, especially if your site has a *lot* of pages.

      So yes, absolutely. You want to optimize your crawl budget to the greatest extent possible by giving Google the most unique, content-rich pages possible with every URL crawled.

      Does that make sense?

  3. Hi Nick & thanks for a great post. I’ve been helping out companies with SEO part-time and the information you give is so appreciated.
    I’m actually moving away from this aspect of my work to focus on writing, but I’m staying a reader because I really enjoy your writing voice.
    Always glad to hear from you!

  4. Hi Nick,

    Looking at these stats, I am amazed to see the difference in traffic; doing some important tweaks can give such good results. I have been blogging for one year and still haven’t reached my goal for organic traffic.

    Now I have 131 articles on my site which get only 50 to 100 visitors from organic traffic. I have to make some tweaks to my content to increase the organic traffic. Thanks for sharing the information; see you soon with another article.

    1. Hey Siddaiah – Thanks for taking the time to leave a comment.

      I’m constantly presented with these discussion points, as I feel SEO (even in late 2015) is still so focused on keywords and links, when site structure and mark-up still plays such a crucial role.

      Best of luck with your organic journey.


  5. Great post Nick.

    I have never gone near anything black hat (or grey hat!) and didn’t know about TrafficLeaks – amazing what can be done 🙂

    I liked that you spent some time on the sitemaps and crawl budget because you are 100% correct – so many completely ignore this and are just happy to let their sites plod along.

    Keep up the great work 🙂


    1. Thanks Andy.

      That’s perhaps the most beautiful part about using traffic leaks: 99% of them are 100% shiny and white hat. The most effective ones I’ve used/seen are when the creator puts a lot of time into researching the audience and making sure the content or pitch aligns closely with the interests of their target demographic.

      If you want to really dive into some live case studies, i.e. updated daily or weekly as people collect results data, I would strongly encourage you to go check out TrafficLeaks – tons of smart info in there and it’s all free to access.

      Cheers man,

      1. Free! My favourite word!

        I’ll fire that up on the tablet later, stick it in the waterproof case, and take it into the jacuzzi at the gym with me – perfect for a post-workout read 😉

        Got a lot to read just on Traffic Leaks alone!

        Matching audience to product is always important and something I carry through my clients’ sites to good effect. No point in getting just numbers to anything – they are doomed to do nothing!



  6. Nice. I’m a bit surprised that there isn’t any limitation on manual re-submissions to WMT for such a large site. Have you replicated this frequent-submission idea to good effect?

    1. Thanks Roshan. There may be some kind of baked-in cache/re-submission timeline that G uses to queue these up (or maybe just flat-out ignores them within a certain time span), but if so, it’s not shown to the user.

      Yes. This specific process is recommended for absorbing another site’s content: first using cross-site canonical tags (until over ~80% of the site’s rankings are re-indexed under the destination domain), then completing the move by adding the necessary redirect rules. I actually wrote about this specifically here: Increase Traffic By Osmosis

  7. Great post Nick what course would you recommend for people who know the basics of SEO but want to get real growth outcomes for sites by improving practices

    1. Thanks Dale. I’m a bit of a keyword research maniac, so my very first recommendation for anyone is to get their head firmly wrapped around an effective research process that delivers a roadmap on which keywords to go after and when, i.e. immediate targets, then 3/6/12 months down the road, and then backing into a ranking strategy based on those targets.

  8. Hi Nick,

    Starting with thanks for such a detailed and informative post – that is massive traffic for any website. My concern is that I am promoting one website. The problem is that I have done everything you stated above; however, I am not getting a good amount of traffic. Could you help me out by checking my on-page? Is there any error in my on-page?

    1. Hey Mishi – The trouble with trying to build significant traffic to a page or website about mobile app development comes down to creativity. Having a clean page and a lot of great content on it about building apps is just not compelling.. it’s not anyone’s fault, it’s just not enough to garner interest, use, repeat visits, etc.

      For a topic like this I think you really need to think outside the box and come up with something that makes this page *useful* and sets you apart from the rest of the competition. What will delight me, inform me, or help me when I come to this page if I am not yet ready to hire you?

  9. Very informative article. What exactly do you mean by “leverage the relationships with the lowest barrier of rejection; existing power users on the platform”?

    Please help!!!

    1. Hey Yorke –

      So my thinking here was this: instead of reaching out cold to content creators within our genre (audio and music producers) who weren’t users and had no connection with the platform, we analyzed and chose the power users of the platform who were either 1) creating an order of magnitude more content or 2) generating significantly more visits or pageviews with their content, and asked them to feature their new content exclusively on our platform in exchange for added visibility around the website.

      This let us leverage existing relationships, since these people were already invested in the success of the platform; it meant greater visibility for them and their content.

      Does that make sense?

  10. Solid one, Nick. Thanks for sharing. Great to have more examples out there of how smart technical SEO can make huge differences.

    I wonder how you found out about those XML sitemap tricks – the 35k URLs and the (a) links. Did you do any testing with that on this or other sites?

    1. Thanks a lot Barry.

      Honestly it was just a lot of trial and error. I was seeing patterns where at the 50k URL mark we were hitting sort of a glass ceiling at one point, say sitemap_10, and then the sitemaps further downstream seemed to follow suit to a further degree. So sitemap 10 might only hit an 80% index rate, and then sitemap 11 would hit 70%, sitemap 12 60%, and so on. So I started to experiment with bringing that number down until we saw the overall successful index rate increase to over 95%.

      The (a) links (syntax adjusted to render in WordPress comments 🙂 ) were more of a hunch and may not really be playing too big a factor in the crawl/index rate. I just figured that since these tags are the natural progression for how Googlebot crawls and indexes a site, the same logic would follow with sitemaps.

  11. Thanks for the article Nick.

    We’ve been seeing some indexing issues for a client’s site, but never thought to dig that deep into sitemap optimization like you’ve detailed here. It’s a great suggestion now that I’m reading it.


    1. Thanks for reading Alex.

      It’s surprising how much some manual massaging here can make a difference. If you are able to make some headway I would love it if you stopped back and gave me an update?

  12. All of these are essential tips for technical on-page optimization. The illustrations with Webmaster Tools & Analytics reports make the difference clear here. I would recommend this article for webmaster groups. Thanks again.

  13. Very impressive post. These days, webmasters are fixated on backlinks, and it’s always good to read about technical SEO. One question:

    “Optimize your site’s crawl budget and make sure you’re not wasting Googlebot’s time crawling pages with thin or near-duplicate content.”

    Instead of disallowing thin pages, why can’t we rewrite the content for those pages?

    1. Hey John – you absolutely could. For YL, since it’s all UGC, this wasn’t really a viable option, so it made more sense in our specific scenario to simply block these pages from using up crawl credit.

  14. Very impressive – your post is really amazing. I enjoyed reading through it. Please keep sharing this kind of work with us.

  15. Listened to Barry Adam on the “How To Find and Fix Common Technical SEO Issues” webinar and got curious about the XML sub-sitemap limit of 35K pages – interesting 🙂

    Great article, thanks!

    Two questions:

    1) What do you mean with “but that actually linking (like with [a] tags) helped us maximize the number of URL’s that were parsed during the crawls.” – from where did you link to the XML sitemaps?

    2) Weren’t you afraid of losing out on link juice due to setting up the URL parameters in GSC? Optimizing crawl budget is important, but it gets interesting when there’s a potential drawback from a link juice perspective. E.g. when someone links to one of the URLs you don’t want G to index – how will any of the link juice be passed on to other pages on your website?

    1. Hey Steven – Love your domain! 😛

      RE: your questions.

      1) On the actual sitemap.xml page (this is how Yoast links down to sitemaps by default), for example;

      2) No not at all, we were blocking a lot of pages that had no content of value or the engagement was so low it was causing more harm than good, so in this case it meant pages with less than 10 visits and less than 100 plays. The parameters we block are all stuff that shouldn’t be indexed anyway.

      1. Hi Nick, thanks for your quick response 🙂

        RE: your response
        1) Ah, thanks all clear.
        2) I see, in that case there’d be no worries indeed. What is your view on the internal flow of link authority from pages that have been blocked from either robots.txt or indexation using GSC ‘s URL Parameters? Is any link authority passed on in your opinion?

        1. My pleasure Steven.

          IMHO internal links from pages blocked from being crawled don’t have the opportunity to pass link equity downstream since they’re not being crawled – and a link needs to be crawled to be followed and flow to be attributed…

  16. I have a blog and worry about Google optimization and SEO. Thanks – I’m now doing well because of your article. Thanks a lot, keep it up and help grow students like me 😉

  17. Wow. I am currently in the process of optimizing and overhauling a website, and while searching for some URL issues, I found this post. Although old, it’s still highly relevant to my issue 🙂
    I can safely say that optimizing URL structure can help you increase traffic by almost 20%.
    Just wanted to put it here that URL Profiler turned out to be a great help in my situation.

  18. Indeed a great post about website traffic.

    Nowadays it is very hard for a blogger to drive targeted traffic to their website, and without targeted traffic we can never drive customers and sales.

    Getting website traffic is the most important thing for any website.

    To have high website traffic, we must write high-quality content, which is very important for holding readers on our website for a long period of time.

    We have to write engaging content which can help readers.

    I am glad that you covered an amazing article on website traffic. I will definitely follow what you said in this article.

    Thanks for sharing it with us. 😀

  19. Really glad I came upon this article – I got an immense amount of information. It’s all about creating good-quality, researched content and getting it to the audience.

    In other words, consistency is key!!

    My thanks 🙂

  20. Interesting stuff for sure, but how can you be sure that your sitemap tweaks/submission tactics resulted in this uptick in crawl rate? Did you not do anything else on the site in that time period?

    1. Hey John – Correct; after noticing the 50k sitemaps getting tapped out around 30k URLs, I adjusted them downward, first to 45, then 40, and so on until we hit closer to 99% index rates at the file level.

  21. I didn’t think it would be possible to boost traffic to my website. After reading your article, I’m more confident that I can do it. Nice post.

  22. Hi, nice post! I would like to introduce Search Engine Land, one of the best SEO blogs. Please include them in your list; they accept guest posts.


  23. Traffic is the most important thing for running a successful business online, and thank you so much for sharing these few important steps.

  24. Nick

    I found your post in “The Best In SEO, 2016”. I was very happy to read the article and find that your site has much more quality content. I will explore the whole site more.


  25. Hey Nick, first of all great post and congrats on hitting the million mark. I was once ranking #3 for a 1.2-million-volume exact keyword and boy, was it yummy. Reading this post made me go back in time to relive it.

    My question, however, lies with robots.txt and disallowing “thin” pages. What about About Us, policy, cookies, etc. pages on smaller websites – should we bother? Is Yoast SEO and archive/media handling enough, or should we be very specific when it comes to links from the front page? Also targeting the footer/sitewide links that are usually not that related.

    Thanks for your time and answer, as well as the post of course.

    Greetings from Slovenia,

    Igor Buyseech

  26. Hi Nick

    Great post!
    I am going to apply some of these SEO concepts to my local home inspection business site. And see if I can get a one time velocity peak in traffic 🙂

    Sending out a Tweet as well.


  27. Hey,
    Thanks for sharing such nice information on how to increase traffic. Really helpful for beginners; great article. Thanks, and keep sharing like this. I will be looking forward to reading more from you.

  28. Hi, I am so grateful I found your blog; I really found you by accident while I was searching on Google for something else. Nonetheless I am here now and would just like to say thanks for a tremendous post and an all-round thrilling blog (I also love the theme/design). I don’t have time to look over it all at the minute but I have saved it and also included your RSS feeds, so when I have time I will be back to read more. Please do keep up the awesome work.

  29. Great article. One thing that made me scratch my head though was a comment you made about Url Architecture.

    “Flat URL’s work when you have extremely horizontal content sets, like Wikipedia, but they fall short when you have big top-level content groups that all of your other content fits into.”

    I can think of one site with 10,000 URLs with a flat structure. Each main category was siloed with multiple subcategories all flat using internal linking to set up the hierarchy. The one I’m thinking about has traffic in the millions (according to Ahrefs) so it doesn’t seem to have been an issue if one understands site structure.

  30. Thank you so much for sharing a wonderful article. In this post, I have learned various point about search engine optimization. I wanna ask you one question about ( pay per click ) PPC. I want to use on my website and increase my website visibility.

  31. Hi, I really enjoyed reading this remarkable post. I am really impressed by the information provided here. Being a digital marketer, this post will be quite helpful when I plan SEO for organizations. Thank You.

  32. Wow that increase in traffic is epic! Especially with the fact that the bounce rate overall came down. That’s pretty incredible!

    Congrats and thanks for such as great article

  33. I’ve been working as an SEO specialist for a few years. I’ve heard about the points you’ve mentioned but never thought of applying the same for increasing the organic traffic of my website. It is unbelievable that the website visit increased from 8,000 to 25,000. Your idea of traffic leaks is a smart one. I had no idea that things like URL architecture and optimizing Google crawls are important for generating website traffic. The URL architecture part is effective, and I had no idea that XML sitemaps can be limited to 35,000. That’s a piece of useful information. Thanks for the article. I’ll share it with my colleagues. I’m sure of getting some better traffic with these tips.

  34. Great case study, Nick! I feel people will understand a lot about traffic retention after reading your article. I, myself, have written a number of SEO articles and there’s no denying in the improvements needed in the world of technical SEO. I recommend people to use tools for technical SEO as well (as you can see on Much appreciation from my end! Cheers!

  35. I have read the entire blog. It is very helpful and significant information for search engine optimization. Thank you.

  36. That’s an awesome article. SEO is still a grey area for many bloggers and after reading your post couples of my doubts are cleared.


  38. Huh, great post again, Nick. Working lately, here on my small Slovenian market, with optimal link equity to internal links. Question: what’s your opinion about impact on improving SERP solely with internal linking improvements? I mean, I would like to, for my small SEO test, measure impact only because of better targeted internal links. Do you think solely improvement of internal links would have any impact on SERPs?

  39. Hey Nick, you have provided great stuff regarding SEO case study and it is essential to know all of these things very well before doing this.


  41. Hey Nick! Kudos on your article. Thanks for sharing such an informative piece on technical SEO and how to optimize websites for keyword rankings. I have been working on a website related to mobile app development, but my keyword rankings went backwards and I don’t know why. I have verified the above-mentioned technical SEO checklist but can’t find the problem. Can you help me find any errors in my on-page?

  42. Hi Nick,

    Thank you for sharing such an informative article.
    This SEO case study is very helpful for new bloggers like me.
    You are doing such great work by providing this free information.

    Thanks and Regards,
    Jay Panchani

  43. Man what an amazing post this was!!!!!!
    SEO plays the most important role when you are to rank a website on Google.
    Even if you have backlinks or do other things right, if you mess up with SEO, you are doomed to fail, in my opinion.
    This SEO analysis really pumped me up!
    Thank you, mate
    Stay safe!!

  44. Hi Nick,

    I initially wondered whether I should try this or not, as being a developer it is very hard to understand some marketing tactics, but when I did, I was able to triple the traffic on one of my blog sites.
    Thanks for your inputs, which helped me throughout 🙂

Want These Results?

See how partnering with us at From the Future can help build your business.

Get in Touch