Keyword Research Training Parts 2 and 3

Update (Jan 15, 2015): After you’re done reading this post, be sure to check out Han and Nate’s new keyword tool that makes it easy to scrape your competitors’ keywords and do bulk keyword difficulty analysis.

Hi again everyone.  So I decided to combine both videos in one post this time.   As usual, if you’re using IE9, please try Chrome or Firefox :)


Below is Part 2 of the video series showing how I do my keyword research.

This video expands on Part 1 and expects you to know everything from that video, so go watch Part 1 if you haven’t yet!


In this video I go over:

  • How I export GAKT data into a spreadsheet and use Macros to turn the data into an earnings calculator.
  • What metrics to track in the calculator, and what my expectations are for each metric.
  • How to find a niche with high SERP CTRs and monopolize it!
  • How to use this spreadsheet to determine your content needs, and keyword targeting for that content.
  • Download the Spreadsheet with Macros here (right click on link and save as).  It comes with some other Macros I use often as well, in case anyone wants them :)
  • Note: Excel can have compatibility issues.  In fact, I’m entirely sure the Macros will not work for some people.  I am no Excel expert; I just created this using Record Macro, and anyone can recreate it.  Perhaps if there is an Excel expert out there, he/she can optimize these Macros.  The reason I’m not turning this into an OpenOffice doc that works for everyone is that I am thinking of releasing a free scripted version for you that includes the PA data (see next video).
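For anyone who would rather see the logic than dig through the Macros, here is a minimal sketch in Python of what the earnings calculator does. The rates below are illustrative assumptions, not the exact numbers from the spreadsheet — plugging in your own is the whole point.

```python
# Minimal sketch of the earnings-calculator logic the Macros perform.
# Every rate here is an illustrative assumption, not a recommendation.

def estimated_monthly_earnings(local_monthly_searches, suggested_bid,
                               serp_ctr=0.30,       # share of searches that click your result
                               ad_ctr=0.25,         # share of visitors who click an ad
                               revenue_share=0.50): # rough cut of the bid you earn per click
    visitors = local_monthly_searches * serp_ctr
    ad_clicks = visitors * ad_ctr
    return ad_clicks * suggested_bid * revenue_share

# With 5,400 LMS and a $1.20 suggested bid this gives roughly $243/month.
estimate = estimated_monthly_earnings(5400, 1.20)
```

Keeping the rates as parameters is exactly what the spreadsheet lets you do: raise the SERP CTR if you expect to rank #1, or swap the bid for a conservative affiliate Visitor Value.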


The next video is less instructional and more of a demonstration of how I built a more automated system to determine keyword competition.   It shows how it works, discusses programming challenges you may run into if you try something like this, and talks about a way I could develop a free version of this for you.



For those that are curious, you can see the exact formula my tool uses to calculate competition in this post (and even copy and paste this to your programmer if you want to recreate our tool).
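I won’t repeat the exact formula here (it’s in that post), but to give you the flavor, a competition score of this general shape — average adjusted page authority (APA) across the top 10, bumped upward when a result’s domain matches the keyword — might look like the sketch below. The +5 domain-match bonus is just the example rule visible above the results table; treat every constant as tunable, not as my exact numbers.

```python
# Illustrative sketch only -- the exact formula lives in the linked post.
# Each top-10 result gets an "adjusted page authority": its PA, plus a
# bonus when its domain matches the keyword ("+5 if domain matches kws").

def adjusted_pa(page_authority, domain, keyword, match_bonus=5):
    normalized = keyword.replace(" ", "")
    if normalized in domain.replace("-", ""):
        return page_authority + match_bonus
    return page_authority

def average_apa(results, keyword):
    """results: list of (domain, page_authority) for the top 10 SERP spots."""
    scores = [adjusted_pa(pa, dom, keyword) for dom, pa in results]
    return sum(scores) / len(scores)
```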

Personally, this is the video I would have really appreciated about a year ago, but I’m very curious as to how it is received, so please let me know!  Also please let me know in the comments if you would like a script that will allow you to do bulk CMI searches using your own SEOMoz API Key.  If you haven’t already, sign up for the free trial of SEOMoz (aff. link).  I get paid whenever anyone signs up for a free trial; you don’t have to purchase after 30 days.  So if you think I deserve a tip from an 80 million dollar organization, use my affiliate link! :)


Update October 17, 2012: Google has made some changes and I have posted an update to my keyword research tactics here:



Hayden “NoHat” Miyamoto

  1. Great videos, gave me a much better understanding of the changes that are made for the APA.

    Personally, I would appreciate having a copy of the script that uses the SEOMoz API Key.

    There is something that I was wondering about that you may have touched on slightly with Spencer. You had mentioned that there is not a huge difference in keyword ranking between an EMD and a non-EMD. Lately, I have been registering batches of EMDs.

    Am I right in assuming that I can allow for a slight difference when determining the amount of backlinking? For instance, with some minor backlinking from a service like socialadr and having an EMD, would it be okay to go for a phrase that had all top ten site APAs at <45 instead of <40?


  2. Thanks for the demo, it has been very helpful to me. I started on a similar project earlier in the year, the difference being I was looking simply for terms with no exact match in Bing. Lots of good info here, thanks again…

  3. Awesome stuff Hayden. +1 for sharing.

    Quick question regarding adsense accounts. Do you maintain multiple adsense accounts or do you have all your 3,000+ domains on a single account?

    • Rotating dozens of private proxies; if one gets blocked by G it gets quarantined and taken out of rotation for a while. Google has changed the way they deal with proxies a couple times in the past 4 months, and we’ve had to adjust as well. I am considering using SEOMoz top 10 results instead (if G changes things again), but as mentioned in the video that has its disadvantages as well.
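The quarantine logic is simple enough to sketch; in Python it’s roughly the following. The one-hour cooldown is purely an assumption — we tune timings as Google changes things.

```python
import time

# Sketch of the proxy-rotation idea: blocked proxies are quarantined and
# skipped for a cooldown period, then allowed back into the rotation.

class ProxyPool:
    def __init__(self, proxies, cooldown=3600, clock=time.monotonic):
        self.proxies = list(proxies)
        self.cooldown = cooldown   # seconds a blocked proxy sits out (an assumption)
        self.clock = clock         # injectable for testing
        self.quarantined = {}      # proxy -> time it was blocked
        self._i = 0

    def quarantine(self, proxy):
        self.quarantined[proxy] = self.clock()

    def next_proxy(self):
        for _ in range(len(self.proxies)):
            proxy = self.proxies[self._i % len(self.proxies)]
            self._i += 1
            blocked_at = self.quarantined.get(proxy)
            if blocked_at is None or self.clock() - blocked_at >= self.cooldown:
                self.quarantined.pop(proxy, None)
                return proxy
        raise RuntimeError("all proxies are quarantined")
```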

  4. Thanks so much for the videos. I am not interested right now in the scripts, but I am very interested in a course on “how to do it right”. I see now how I can easily make my spreadsheets do the work for me, and I am especially grateful for these ideas.

    thank you Hayden.

  5. Great video tutorials. Thank you. I do have a couple questions:

    That macro is going to be a huge time saver for me so thanks. Doing this manually last night I found four more sites that I think are a go and have already purchased the domain for the first using your technique.

    SEOmoz difficulty tool:

    Did you happen to notice that anything under 43% is fairly easy to rank for, or is that common knowledge among heavy users of the SEOmoz difficulty tool?

    The second video is where the majority of my interest was because I want to be able to find those kws with potential to get to the top of Google without much link building. I don’t have a private network of sites to use so I’m relying on article marketing primarily with some squidoos and what not.

    I am very interested in obtaining a script I could use with SEOmoz and my API key to at least get the page authority. The APA would be even better, because that is the hardest part to fully understand: where to draw the line between a no-go and moving forward with a kw and site development.

    Thanks again for sharing your processes. I am interested in your advanced classes.

    Thanks again,


  6. Hi Hayden,

    I have trouble backing up the video, but I think you said the next training is on hosting these new sites?

    Also, I have been trying to figure out the best way to go about finding expired domains. Is that also something you will be covering in your next tutorial?

    • I think I might send a survey out to see what users want covered. There’s a lot. Registration and Hosting seemed logical as that’s the next step after identifying your niche. After that would be content, and then link building which would be expired domains.

      But there seems to be the most interest in the expired domains. I could just dive right in there if that’s what people want. I am but your servant…

  7. Hayden,
    Great videos, and thanks for the spreadsheet and the macros. You have given us some great info and I would like to get much more from you.

    Thanks for sharing with us and keep it coming.

  8. Steve Wyman says:

    Hi Hayden

    It’s the more “advanced” stuff that’s of interest. New ideas and unique methods are what differentiate a site.

    There is a ton of free info now available on niche sites from nichepursuits and the guys at AdsenseFlipper (with their free guide as well).

    So yes, more of your take rather than basic stuff is where the value lies.

    • Hey Steve, thanks for the recommendation.

      I will not re-post stuff that is already in other places (that I know of). Most other people are teaching keyword research very differently, and often select niches “intuitively”, they don’t mention the contextual tool, and I think they are doing their readers a disservice. Or they trust a metric that is just plain wrong (allintitle on MS for example).

      I’m still figuring out where I want to go with this, I could go for just super-advanced stuff ala, but then I’m pushing away the majority of the audience, who frankly I think need to be properly informed.

  9. Nice videos!
    The Dutch AdWords version (I’m from Amsterdam) is a little different from the English version. So not all of it is applicable for me.
    Still, I’ve learned a lot from these videos and also the podcast.
    I will definitely follow the next parts.


  10. In response to your question…I am interested in your free script and any course you would create for someone beginning with niche sites. However, I am more interested in using your methods of generating traffic that I would convert to email lists instead of monetizing with adsense.

    Traffic is traffic, and I would love to learn your ranking methods to build niche lists and monetize with affiliate offers (digital products, amazon,etc.). So instead of having possibly 50-100 niches, I would look at substantially fewer separate niches to build lists…perhaps 5-7.

    Am I on a viable track, or has this been tried before without much success? Your input is appreciated.

    • Hi John,

      Lots of people have success with this, it really depends where your interests lie. But if that is the case you want to use the above techniques and focus on LMS more than anything, and adjust the calculator to extract a potential (and always conservative) Visitor Value based on existing affiliate offers.

      This is the way I started out, but it was too much work and far more difficult to scale. Just wasn’t a good fit for me.

  11. +1 for getting a copy of the script. Can you also post your rules for determining the APA? (Like the rule I see just above the results table of +5 if the domain matches the kws.) Thank you for the good videos so far.

  12. Hayden,
    Thanks very much for sharing this great information. I am really intrigued about your tactics as I have already picked up some great tips.

    I consider myself intermediate but would love to see all three courses as I think I would pick up some more tips from your method.

    That free script sounds great as well!

    Keep up the good work!

    • Hey Marc,

      I have no idea, but probably not. The next script will be hosted on a server so it should work anywhere. Probably a couple weeks away though since my programmer needs to finish his current project first.

      If you’re really impatient you could probably just recreate them easily enough, or pay someone to recreate them (not me, sorry).


  13. Hi Hayden!

    You’ve been mentioning using a huge own network of aged domains for backlinking. Makes perfect sense, especially if the topics of these domains are the same as the topics of money sites.

    However, with adsense you obviously target a huge variety of niches, so it’s quite interesting how you create the sites on aged domains. Do you use multi-topic blogs with different categories, just like any ordinary blog similar to an article directory (there are tons of these now), or do you manage to split your network into batches of niche blogs, with content related to only specific topics, like perhaps: diets->healthy eating plans, diet programs, fad diets and related categories…

    Or is your approach simpler than that? Also, what’s your experience with changing the topic of an aged domain completely? I saw good results from this; some domains that are completely unrelated to my topic of choice even rank high for “new” keywords in the new topic of the old domain.

    Hope to hear your thoughts ;)

    • Hi Antony,

      I actively do all 3 of your examples. I have many sites that have turned into News sites, with sections like any newspaper might have. They are always written as a newspaper would write, and most of the sites will have a specific twist — like right wing conservative news in the UK (if it used to be a right wing political blog in the UK). If I’m linkbuilding a collection of 20 auto related sites, then the article would be about the rise and fall of the UK auto industry, for example.

      I also have some sites that were already in the niche, and I just use them to link to only that niche. Sometimes when I find a very good expired domain I will just search for adsense sites around it. Sometimes I don’t use the expired domains as linkers and instead make them rank for a bunch of terms.

      I have one site that was totally random (some manufacturer of some widget), and I recreated it as a completely different topic. It ranked within a week, and it’s ranking now for really high traffic short tail terms in that new niche. Not sure how long this will last, but it does seem to work in the current environment.

  14. Hey Hayden, thanks for this intriguing information and helpful insight into your highly scaled approach. And I appreciate that you communicate more intelligently and concisely than many others. Count me as interested in your offer of the script, as well as anything complimentary you want to offer related to your system. I am also willing to pay.

    Some of the technical points in vid 3 whizzed by me a bit, but I think I get the gist of every step you went over. I am in the process of developing out my first 50-100 domains, rather than getting ready to buy 200 more. However, clearly my kw research methods for choosing the domains I own were not as robust as what you outline.

    So, I am most inclined to use your approach to vet my current portfolio for the best potential, rather than to buy a bunch of new domains.

    I do think it’s good to keep in mind that the audience has people with 1 site, 10 sites and also 100 and over. And of course, zero sites.

    Anyway, thanks again for the great info!

    • Hi Jacob,

      Thanks for the helpful tip. I am trying to get a handle on what the audience wants. I think offering 3 levels (beginner, intermediate, advanced) is probably helpful for all 3 levels though.

      I will send out a survey to the list. Will bribe you all with some nice goodies for filling it out too :)

  15. Hi Hayden,

    Great videos. Frankly speaking, the more advanced the subject matter, the more I’m interested. Ideas about scaling and automation are what I would like to hear about at the moment, and nobody seems to talk about them. When it comes to keyword research, at the end of the day it comes down to a single goal: pick profitable keywords. The methodology may change, but experience dictates how well you perform your research (i.e. just pick a method, get out there and do it). So I think that there is enough of that discussion out there, though I agree from the posts above that there is clearly a demand for it from you.

    Going back to advanced topics, I’m trying to do stuff like figure out what capital I need to build to cover development costs for a custom CMS, management interface, DNS manager, and SEOmoz subscription plus whatever other recurring costs are involved with that. In addition, to see whether it is beneficial to build a disparate array of custom tools or an all-in-one solution that is ideally modular that can have components added/removed at will.

    I don’t just want to throw money at the problem (although I’m lucky that I can if I want to). Sound strategy and advice for a long-term view is what I’m looking for, since tools like what you’ve developed can be used for things other than mere niche site development. You kinda see where I’m going with this?

    People in this {business|hobby} have associated “niche” with “small” and I think it’s the mindset that hinders massive growth or outside-of-the-box thinking.

  16. Hi Hayden,
    I would be interested in the script also.
    For the csv file, do you just check off global and local monthly searches?


  17. Hayden,
    I, too, would love the script. Quickly taking GKWT results and evaluating them will reduce the time demands tremendously for me.
    BTW where do you find enough keywords to keep multiple people busy evaluating them?


  18. Thanks for the videos–they’ve taught me a lot of new stuff.

    I’m curious where you get your content. With 3000 sites, I’m guessing you outsource content creation–are there any services you can recommend?

    • Iwriter is hit and miss but cheap. Textbroker is decent and is my go-to if I have more needs than my staff can handle. I try to hire part time writers and pay them once they complete 60 articles. I offer referral bonuses to my staff (once the referred writer completes 60 articles). It’s cheaper, better quality, and you get everything done for you (articles + images + posting). I have a full-time article editor as well who controls this process.

  19. This is invaluable stuff – I was just experimenting with downloading the csv for the keyword tool, and your macro is exactly what I was trying to mentally noodle through.

    +1 for being (VERY) interested in the script to allow bulk CMI searches…

  20. Hayden, great series of videos and would be interested in sampling your bulk script too..

    How many google scrapes would you be able to get away with using proxies, and how many proxies would you need if you were not outsourcing it and were doing it yourself?

    I just figured that getting a start on, say, a handful of sites initially would be good before building up to scaling it, as I think growing it organically would be the way to go for me in the beginning.

    good info, thanks for taking the time out to do it


    • The proxies are only needed for the Google side of this scrape. If your needs aren’t huge and you delay your requests enough you wouldn’t need any proxies at all. Just run the task, go to sleep and have everything done by the time you wake up in the morning. It all depends on volume.

      Also if you are content using SEOMoz’s top 10, APA could also be returned. You would just have the downfalls I describe in this video, and it would be a bit slower (a free account allows 1 request every 5 seconds). You could always rotate free accounts though…
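For the free-account route, the pacing is the only tricky part. A sketch of how you might throttle bulk lookups to that limit — the fetch function here is a placeholder for whatever SEOMoz API call you end up using, not a real endpoint:

```python
import time

# Sketch of pacing bulk PA lookups to a free-tier limit of one request
# every 5 seconds. fetch_metrics is a stand-in for your own API call.

class RateLimiter:
    def __init__(self, min_interval=5.0, clock=time.monotonic, sleep=time.sleep):
        self.min_interval = min_interval
        self.clock = clock   # injectable for testing
        self.sleep = sleep
        self._last = None

    def wait(self):
        now = self.clock()
        if self._last is not None:
            remaining = self.min_interval - (now - self._last)
            if remaining > 0:
                self.sleep(remaining)
        self._last = self.clock()

def bulk_lookup(urls, fetch_metrics, limiter):
    results = {}
    for url in urls:
        limiter.wait()                     # pause so we never exceed the quota
        results[url] = fetch_metrics(url)  # your SEOMoz API call goes here
    return results
```

Run it overnight and the volume takes care of itself, exactly as with the no-proxy Google scrape above.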

  21. Hey Hayden, when I export from the keyword tool I get a lot more columns than you do in the video. Can I just delete them or leave them there? Not sure if it will affect the macros at all. Thanks!

  22. Tried to download your spreadsheet and it is just a bunch of .xml files, no .xls files. I have looked at lesson 3 and that excel file downloads fine.

  23. Hello Hayden,
    I follow with great interest your videos.
    I would like to ask you a question and hope it is not out too off topic.
    What type of ON PAGE SEO optimization do you do?
    Keyword density? Presence of the keyword in the title and in the H1, H2, H3?

    Before Penguin, I used a keyword density of 2.5 and put the key in H1, H2, H3… Now I would not want my sites to be penalized.
    Thanks for the valuable advice, Paul

  24. First of all Hayden, thank you very much for sharing your information and process of selecting keywords.

    One important question: how many adsense accounts would one need in order to control, for example, 100 domains? I mean, how many domains would you suggest per adsense account? And do you think creating an LLC with a P.O. box would suffice for creating multiple adsense accounts? Or what would be your process? Last question: would you suggest a VPS or dedicated server? And would you check each adsense account with a different laptop or PC, or would the VPS or dedicated server take care of that?

    I am already coming close to 50 adsense sites and looking to build more; I just want to spread my sites around and not risk putting all my eggs into one basket, if you know what I mean.

    Awaiting your response, thanks again Hayden.

    • 100 domains is no problem with 1 adsense account.

      Once you have that many you’d want to get a VPS or Dedi, and if you want to scale you absolutely need one.

  25. I’m definitely interested in the intermediate and advanced topics of how to automate and scale. Thanks for the info Hayden!

    Maybe you could throw up a forum too? Or is there some irc channel or forum where you hang out?

  26. Is all of that excel calculation stuff absolutely necessary? Couldn’t we just multiply the amount of LMS by the suggested bid to determine how much we “COULD” make, and then just check the competition and go from there?

    Honestly, I got lost several times as the video progressed.


    • You could do that, but this is no extra work with the Macros, and it allows you to change the numbers based on your business. Perhaps you have higher or lower CTRs, or you expect to rank first so can raise the SERP CTR. Perhaps you have an affiliate program, which means you’d adjust that and the suggested bid to a conservative estimate.

  27. Hi Hayden question about domain names.

    Which is better?

    I’ve read that according to Google would be the same as I also read that putting an e in front is the next best thing as it denotes an e-business of some sort.


    • Out of those 3, is definitely the best. But it still doesn’t carry as much ranking benefit as, or even or

      Between hq or e – I’d say they are about equal.

      • Thanks, as this pertains to a niche I want to tackle and the exact match isn’t available in com, net or org. So if I’m reading you right, the hyphen is the next best thing to be able to get a com, net or org extension so that it’s an ‘almost’ exact match?


  28. Hayden,

    I think you missed my earlier post, again if you could please share your thoughts on the best strategies for opening multiple adsense accounts. (Especially if you have a lot of sites).

    Appreciate any help you could give.

    Thanks again,

  29. Hi Hayden,
    thanks for all the great information. Up to May 2011 I had about 40-50 xfactor-style sites bringing in £500+ per month. But after the Panda/Farmer updates that income dropped and last month it was just £20. I tried all I could to return them to their former glory but nothing worked. And I spent a lot of time trying. So I was at a loss how to proceed.

    Your strategy fills in the gaps that none of the other strategies describe. Now I can see that Xfactor was successful more through chance than method. And yet Xfactor was one of the best of the bunch. Your research is far more precise than all of them and it shows me the way through. If I were to apply it to my previous money making sites, it’s likely that only a handful would pass your test.

    So you might call this beginner stuff, but there are plenty of people who (like me) think they are more advanced but are definitely missing some tricks. Thanks!

    Regarding the script, it looks useful for those building bulk and I would like a copy, although it’s probably not essential for me at this moment.

    Since you call your system a ‘set and forget’ system in the interview, I assume you don’t bother much with analytics? Or other analysis? In the past, in my most successful sites, I always find that the popular KWs popping up in my analytics were often NOT the ones my site was primarily optimized for. Bizarre, but true. So although I had gone through the whole process of researching and getting a solid EMD etc, the money was all coming from a secondary KW which I often hadn’t even optimized for.

    There is also the issue with the Google KW tool that no matter how deep you try to dig, there are always ‘hidden’ KWs which don’t show up. I find that reinserting Google suggest KWs around a topic will suddenly produce a bunch of new results that hadn’t come up before – do you have any strategies for digging deeper into Google’s KW cache?

    Finally, you mention above that you don’t want to repeat what others have already covered on other sites. Until today, I had never visited NichePursuits or PassiveOnlineIncome… so what else have I missed? A list of recommended blogs to read would be very useful if you have time.

    Great stuff, and thanks again.

    • Hi Phil,

      Great to hear that these videos have helped you. Keyword Research is the #1 most important part of niche site building in my opinion.

      I have also found that secondary keywords are big earners, and the way I add keywords to my sites is based on grabbing as many phrase match keywords as possible. After running the GAKT3 macro and sorting by LMS and then color, I have a macro that turns them into percentages and generates a CSV for me to upload to my system (it’s actually included in that spreadsheet, I think it’s called Anchors and then CSVExport2 or something). They are then added to my system, which adds them to my rank checker, and also adds them as partial match anchors for link building.

      I don’t use Google Suggest, not even in my scraper, but that’s a great idea (my scraper has used SEMRush data as well though).

      My sites launch with Piwik installed, and your question has added an item to my todo list – to check Piwik for search terms and add them to my rank checker and rank optimizer scripts. Rank optimization is a huge deal… will get into that in a later post.

      I don’t do a lot of reading online honestly. I prefer doing. I would however recommend – it’s a more advanced and open community. I have my main SEO in the Philippines check there for new techniques and send me anything that seems interesting.
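For anyone who wants to recreate that percentages-and-CSV macro outside Excel, the logic is roughly the sketch below. The column names are my guess at a sensible layout, not what the spreadsheet macro actually outputs.

```python
import csv
import io

# Sketch of the "Anchors" step: turn each phrase-match keyword's LMS into
# a percentage of the total, for use as partial-match anchor-text weights,
# then export the result as CSV for upload.

def anchor_percentages(keywords):
    """keywords: list of (keyword, local_monthly_searches) tuples."""
    total = sum(lms for _, lms in keywords)
    return [(kw, round(100.0 * lms / total, 1)) for kw, lms in keywords]

def export_csv(keywords, out):
    writer = csv.writer(out)
    writer.writerow(["anchor", "percent"])   # assumed column names
    for kw, pct in anchor_percentages(keywords):
        writer.writerow([kw, pct])

buf = io.StringIO()
export_csv([("motorcycle helmets", 6000), ("cheap motorcycle helmets", 2000)], buf)
```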

  30. Hi Hayden,
    there’s a WP plugin I use which grabs the visitor’s search term and adds it to a list below the post to assist in ranking the page for unforeseen kws. To be honest, I’m not sure how effective it really is but it seems to me that a script could be devised to take those kws (or from Piwik) and submit them to the GKW tool right from your host. If you’re that way inclined.

    As for extended KW ‘grabs’, Scrapebox is good for trawls through GSuggest. I have an Excel sheet which adds each letter of the alphabet and common combinations of letters to each KW string to squeeze every last kw combination I can out of the GKW tool. Although most tend to be low search numbers. Another good resource is Wordtracker – get lists from them (which tend to be more diverse), and then run them through GKW tool.
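That alphabet sheet is easy to reproduce outside Excel too; roughly, in Python (letter pairs optional, since most of those come back with low search numbers anyway):

```python
from string import ascii_lowercase

# Sketch of the alphabet-append trick: expand a seed keyword with every
# letter (and optionally every letter pair) to squeeze extra suggestions
# out of Google Suggest / the GKW tool.

def alphabet_seeds(keyword, pairs=False):
    seeds = [f"{keyword} {c}" for c in ascii_lowercase]
    if pairs:
        seeds += [f"{keyword} {a}{b}" for a in ascii_lowercase
                                      for b in ascii_lowercase]
    return seeds
```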

    Yes, I know BHW and a few others. It’s a trawl, though. We probably all need your man in the Philippines.

    Finally, I saw what you do in the video, but the Excel sheet I downloaded doesn’t seem to be the same one – I got ‘CMI-Spreadsheet.xls’ from your link – could you relink to the one you describe?

    Thanks again.

  31. When I check the first 10 results of competition in google, seoquake shows backlinks for both link domains and semrush links. Which one should I pay more attention to? Or should I be checking both? It seems like a lot of sites have thousands of link domains but little to no semrush links, or vice versa.


  32. I think that what you have to offer is awesome. I would love it if you could pass all that knowledge to us.

    I’m of course interested in the free methods and, most importantly (I can’t stress this enough), the right and effective methods, where all those errors and stupid mistakes that we make are filtered out.

    Thanks again.


  33. Hi again, I have a few questions. When you start doing keyword research for a new website, how do you actually start? Do you randomly choose a niche from your head and then do keyword research?

    The example that you show in the last two videos – I can’t help but ask: are you really sure that those last two keywords “cheap motorcycle helmets” and “women motorcycle helmets” are that easy to rank number one for? I know they don’t fit the criteria, because they wouldn’t make $20 per month, but let’s say they would.

    The first ten websites almost all have PR above three and many have links built to them. There is, however, only one that has the exact phrase match in the title and description. Are those two things really that important, and would you be able to rank a website without links for those two keywords? Be honest :)

    What would you say is the most accurate and free SEO tool, extension out there, that we can use?

    Thanks again. You made a lot of things clear for me.

    • I have giant lists of keywords with 10 APA <50, and I target them based on that spreadsheet.

      Regarding those examples, I think my tool encountered an error there. That's the problem with going bulk. I always recheck individually those that show 10 for EVERYTHING. I am thinking of moving it back to SEOMoz's top 10 because Google's getting so aggressive vs proxies.

      Accurate and free seo tool? That’s an extremely broad question. But if you mean for keyword research, then use this technique with SEOMoz’s API on a free trial, or do it manually with the SEOmoz toolbar, which is free.

  34. A thousand thank-yous, Hayden, for sharing all this so completely. It’s rare to find a successful IMer who actually possesses a deep understanding of these KW research metrics, and even rarer still to find one who explains it all in such a clear & straightforward manner. This is priceless (or price-full :-) information, with no “guru” price-tag attached. Thanks for your willingness to share your system.

    For those who have been a bit put off by the steep ($99/month) price tag for SEOMoz (after the 30-day trial is up), I just wanted to mention that there are a few KW tools out there that subscribe to SEOMoz for you, and include Page Authority, Domain Authority, Keyword Difficulty and other SEOMoz metrics in their results. SECockpit has included these for over a year, and costs $300 every 6 months, so half the monthly price of SEOMoz. Steve Clayton’s new Keyword Blaze tool also subscribes to SEOMoz and shows all these results, and his tool is just a one-time purchase of under $100. I’m not an affiliate for either of these, but have used them both and found them to provide all the critical SEOMoz numbers plus some other search time-savers.

    • Thanks for the kind words and for sharing. Also you can use the SEOMoz toolbar, and various other toolbars to see PA/DA overlaid on the SERPs. Personally I think you can get what you need in just a week of the tool and cancel your account if you don’t see any other additional value.

  35. I feel like I’m invisible…

    When I check the first 10 results of competition in google, seoquake shows backlinks for both link domains and semrush links. Which one should I pay more attention to? Or should I be checking both? It seems like a lot of sites have thousands of link domains but little to no semrush links, or vice versa.


    • Hi Nick, NoHatSEO utilizes SEOmoz ONLY. There’s a free trial available; sign up using Hayden’s affil link. Currently I’m only using this keyword research method with SEOmoz and nothing else. I’ve found keywords that are 40 and under PA.


      • What I’ve been doing is, when I find a keyword that’s 40 and under after APA, I have been securing domain names, some EMDs. Those 40 and over after APA I’ve been putting to the side until I make a decision on securing expired domains. In my short career I want to be able to rank with very, very little off page SEO and strong on page SEO. My theory, which might be fact, is that if you can find keywords with 40 or less KW difficulty you can rank with no off page SEO or very little.

        Which brings me to a question, Hayden. After we APA the top 10, how do the APAs come into play?


      • Hey Marc,

        Basically if you have all 40s you should be able to rank with links from 10 or so standard domains, or 5 or so premium domains. If all 50s you’d need at least double that. Obviously the more you have the better. This can depend on the niche, so the safest thing would be to double the numbers I mentioned above :)
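If it helps, that rule of thumb as a tiny function — purely a heuristic from the numbers above, with the `safe` flag doing the doubling I just suggested:

```python
# Rough rule of thumb only: all-40s APA -> ~10 standard (or ~5 premium)
# linking domains; all-50s -> at least double that. `safe=True` doubles
# again, as suggested above for niche-dependent variation.

def links_needed(max_apa, premium=False, safe=False):
    if max_apa <= 40:
        base = 5 if premium else 10
    elif max_apa <= 50:
        base = 10 if premium else 20
    else:
        return None  # beyond the rule of thumb quoted above
    return base * 2 if safe else base
```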

  36. Hayden, I’m getting a run-time error when I try to run the GAKT3 Macro:

    Run-time error ‘1004’:

    Application-defined or object-defined error

    It asks me if I want to “Debug” and I can click on that blue-flashing button, but once it enters debug mode, I don’t know what to do.

    Curiously, the GAKT4 Macro runs OK, but of course, I can’t make use of that info until first using GAKT3 to filter KWs.

    Really want to use this; any help would be appreciated.

  37. Thanks, Marc, I was already doing it manually. But the Macros really speed things up…that was the whole point of this post/videos. :)

    Anyone else, who knows Excel well enough to know why I might by getting this error, care to weigh in here?

  38. OK, got the Macros to work. The problem was, I’m on a Mac, running the Mac version of Microsoft Office. Apparently, the Mac version of Excel is not compatible with Hayden’s Macros – I’m posting this just in case other followers of Hayden’s system are on “the better machine.” :-)

    Anyway, I got ahold of the Win-doze version of Office/Excel, fired it up in Parallels (virtual-PC software for the Mac) and it works fine.

  39. Hayden said “Basically if you have all 40s you should be able to rank with links from 10 or so standard domains, or 5 or so premium domains. If all 50s you’d need at least double that. Obviously the more you have the better. This can depend on the niche, so the safest thing would be to double the numbers I mentioned above”

    This is where I am not clear. In your suggestion, do all sites on page 1 for a given keyword need to have an APA of 40? Because I believe you mentioned that if Keyword Difficulty is under 40% you don’t need to do APA.

    I’ll also ask this in my Guest Post.

    • If the keyword difficulty is under 40% you’re almost guaranteed that all sites have an APA under 40. The exceptions are EMDs and phrase-match domains, as SEOmoz’s keyword tool is keyword agnostic. Again, that was just a simple shortcut for people who can’t be bothered to go through this properly, and it should work out for them 90% of the time.

      Yes all sites would need to be in the 40s based on my other post on what it takes to rank.

      • OK, this is my experience so far with KW Difficulty under 40%. This doesn’t seem to be the case in, how can I put it, more product-based niches of the type I posted in the Guest Post. But with niche keyword phrases that have qualifiers like best, reviews, etc., it does seem to hold.
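The shortcut debated in this exchange — treat a SERP as a candidate when every result's PA is under 40, while remembering that EMDs and phrase-match domains punch above their PA — could be sketched like this. All function names and the EMD check are illustrative assumptions, not part of SEOmoz's tooling:

```python
def is_emd(keyword, domain):
    """Rough exact/phrase-match check: does the keyword, stripped
    of spaces, appear in the domain name?"""
    return keyword.replace(" ", "").lower() in domain.lower()

def serp_passes(keyword, results, pa_cutoff=40):
    """results: list of (domain, page_authority) tuples for the top 10.
    Passes when every non-EMD result is under the PA cutoff; EMDs are
    skipped because PA understates their ranking advantage."""
    return all(pa < pa_cutoff
               for domain, pa in results
               if not is_emd(keyword, domain))

serp = [("bestdogbeds.com", 25), ("example.org", 38), ("blog.example.net", 31)]
print(serp_passes("best dog beds", serp))  # -> True (the EMD is ignored)
```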


  40. Just curious as to when you plan to release, if at all, a free script version of your CMI process? I would love to start using this under my free trial of SEOmoz before it expires. I’d be willing to pay for it if it helps persuade you!

    If you aren’t planning on releasing a script, do you have any advice or tips on getting it developed? Perhaps, a programmer that you could easily put me in contact that is fairly knowledgeable on programming scripts using the SEOmoz API ?

    • Hayden,

      I think you missed my post in this section, again if you could please share your thoughts on whether or not you have decided to provide a free script for bulk CMI searches.



    • Hi Seth,

      My programmer is on vacation at the moment, but I think I will be releasing my batch CMI script to this list, probably on a pay-per-request basis.

      • Just looking to clarify, but by pay-per-request, are you saying we can pay you a one time fee and use the script as many times as we’d like?

        I look forward to this as it would be a huge time saver for finding under 40 PA’s.

        Thanks for all your great info,


    • Hayden, I know you are busy, but I am still hoping for an answer to my question :) I am thinking of getting high-quality articles, where one would cost me around $12, and with 3–5 articles per website the costs can get too high.

      Or maybe get just the main website article for $12 and the rest for around $5?
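The batch CMI/PA lookups discussed in this thread boil down to signing url-metrics requests against the Mozscape (SEOmoz) API. A minimal sketch of the signing step, assuming the classic AccessID/Expires/HMAC-SHA1 scheme and the Cols bit for Page Authority from the old API docs; the credentials are placeholders:

```python
import base64, hashlib, hmac, urllib.parse

PA_COL = 34359738368  # Cols bit flag for Page Authority (per old Mozscape docs)

def signed_url(access_id, secret_key, target_url, expires):
    """Build a signed Mozscape url-metrics request. Assumed auth scheme:
    Signature is a base64 HMAC-SHA1 of "<AccessID>\\n<Expires>" keyed
    with the secret, then URL-encoded."""
    to_sign = f"{access_id}\n{expires}".encode()
    sig = base64.b64encode(
        hmac.new(secret_key.encode(), to_sign, hashlib.sha1).digest()
    ).decode()
    return ("http://lsapi.seomoz.com/linkscape/url-metrics/"
            + urllib.parse.quote_plus(target_url)
            + f"?Cols={PA_COL}&AccessID={access_id}"
            + f"&Expires={expires}&Signature={urllib.parse.quote_plus(sig)}")

# Placeholder credentials; a batch script would loop this over each
# keyword's top-10 result URLs and read "upa" from the JSON response.
print(signed_url("member-123", "s3cret", "example.com/page", 1700000000))
```

The free API tier was rate-limited, which is why a batch script like the one discussed here queues requests rather than firing them all at once.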


  41. “The next video is less instructional and more of a demonstration of how I built a more automated system to determine keyword competition. It shows how it works, discusses programming challenges you may run into if you try something like this, and talks about a way I could develop a free version of this for you.”

    Any news on the FREE version you offered to develop for us?

    As always, thanks for all your help, Hayden. You have by far offered the best tutorials on developing blog networks, not to mention your other very useful tutorials. Thx!

  42. Hi Hayden. I’m curious how you get the following keyword statistics programmatically: local monthly searches, global monthly searches, and contextual cost per click. I found that I can get all of these from the Google AdWords API, but the approval process is really lengthy, and there are fairly steep API costs as well.

    Is there a better way to get (at least LMS and GMS) these values programmatically?
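Short of the AdWords API, those values can also be pulled from GAKT's own CSV export — the same file the spreadsheet in this post starts from. A sketch of parsing it, with hypothetical column headers (the real export's header names vary by account language and tool version):

```python
import csv, io

# Hypothetical GAKT export snippet; real header names may differ.
export = """Keyword,Global Monthly Searches,Local Monthly Searches,Approximate CPC
best dog beds,14800,9900,1.25
dog bed reviews,2400,1900,0.85
"""

def load_gakt(text):
    """Parse a GAKT CSV export into keyword -> (GMS, LMS, CPC)."""
    data = {}
    for row in csv.DictReader(io.StringIO(text)):
        data[row["Keyword"]] = (
            int(row["Global Monthly Searches"]),
            int(row["Local Monthly Searches"]),
            float(row["Approximate CPC"]),
        )
    return data

print(load_gakt(export)["best dog beds"])  # -> (14800, 9900, 1.25)
```

This trades full automation for a manual export step, but avoids both the API approval process and its costs.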


  44. Jason Pachomski says:

    Hey Hayden,

    I know I’m way late to the party but wanted to see if you could put the Excel sheet with macros back up on the link above. It’s 404 as of right now. Also, thanks a lot for all the great info you’ve put out. It’s really been a game changer for me.

  45. Lloyd Robinson says:

    Hi Hayden,

    First of all thanks for all the great info you have published about niche sites. I heard your interviews with Spencer on Niche Pursuits and am excited to apply your methods as I get started with creating niche sites.

    Like Jason, I can’t download the Excel sheet with macros as there appears to be no file associated with the link. If you could post it again I would be more than grateful.

    Thanks again for making this process a lot more accessible to complete newbs like me.


  46. Hi Hayden,

    Thank you for the wonderful keyword guide here, it’s simply the best I’ve seen in years.
    I was simply shocked to see that I had set up an Excel file similar to yours, though mine was much more primitive (no macros, coloring, etc.). But I’m really delighted that the logic was similar to yours; it shows me that I’m on the right track.

    As the last two commenters pointed out, your Excel link is dead.

    I would really appreciate it if you could put the Excel file back online.

    Thanks again for your highly informative posts about keyword selection techniques.


    • Great keyword tutorials, Hayden, thanks. I learned some new things! :)

      I agree with Irfan also; if you could repost a working link to kwresearchmacros.xlsm, that would be great.

  47. Hi Hayden,

    First off, thank you for posting so much rich content on the subject – that is very generous of you!

    Quick question on building a personal network of blogs. I was thinking of creating a blog using college-based student groups. That way there would be backlinks to the “news” type of blogs from very high-PR .edu sites (granted, they would be from miscellaneous parts of the .edu site). Do you think this is a good idea and would work well for getting a high PR? I’m a bit of a newb at this stuff, but I thought I would ask.

    Thanks again!


Leave a Reply

Your email address will not be published. Required fields are marked *

You may use these HTML tags and attributes:

<a href="" title=""> <abbr title=""> <acronym title=""> <b> <blockquote cite=""> <cite> <code> <del datetime=""> <em> <i> <q cite=""> <s> <strike> <strong>