Understanding Keyword Research and Analysis for SEO

Search engine optimization (SEO) is the process of structuring sites and content in a way that improves a website's visibility and placement on search engines when users search for certain keywords. Proper keyword research and analysis are essential for SEO, content strategy, website taxonomy and link building, as they can significantly boost a website's qualified, converting traffic. Well, at least for the time being...

Understanding Search Demand 


Keyword research & analysis is the process through which you identify the relevant and profitable (or popular) queries you need to target through the use of keywords in your website's content, structure and link building.
  
When researching keywords in tools such as the AdWords Keyword Tool, Wordtracker, Keyword Spy and Ubersuggest, it is important to keep in mind that search patterns (and keyword competition) can be illustrated by the search demand curve, an adaptation of the long tail probability distribution from statistics.

The long tail entered the internet world in 2004 through a 'Wired' article, written at a time when the entertainment industries were struggling to design new online business models that would help them survive the fight against online piracy. According to the article, "the future of entertainment online lies in the millions of niche markets at the shallow end of the bitstream", meaning that it is not only about blockbusters and mega hits selling millions, but about millions of non-mainstream pieces that collectively can outsell the few mega hits.

Now, back to keyword research: according to the long tail search demand curve, keywords are divided into short tail, mid tail and long tail, depending on their popularity in search queries.

  • Short tail queries (or "fat head" keywords, according to Moz) usually consist of 1-3 words (e.g. insurance, cheap insurance) that drive millions of monthly searches and are almost impossible to rank for, due to fierce competition in the SERPs. The sites typically "posing" in the first positions for fat head keywords have a high domain age (in some cases over 10 years), a PageRank over 6, an enormous number of backlinks from high authority pages and domains (including .edu and .gov) and fully optimized on-page elements.


Short tail query competition example: top results for "insurance" in Google UK


  • Mid tail queries, once comprising two or three words, currently average around four-word terms (e.g. cheap online car insurance) with still-high search volume. This category can be suitable for new websites, but requires competition analysis for each keyword in order to identify its competitiveness in the SERPs.
  • Long tail queries are usually phrases of four or more words that account for 70% of the total search volume, e.g. google plus statistics in 2013 or cheap online car insurance in Albuquerque (Mr. White!). They are ideal for new websites that can specialize in a niche market and claim their presence in the SERPs.
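The three buckets above can be sketched with a few lines of Python. This is only a rough heuristic: the word-count cut-offs used here are illustrative, since in practice the boundaries depend on search volume and market maturity rather than query length alone.

```python
# Rough sketch: bucket a query into short, mid or long tail by word count.
# The cut-offs are illustrative assumptions, not fixed rules.

def tail_bucket(query: str) -> str:
    words = len(query.split())
    if words <= 2:
        return "short tail"
    if words <= 4:
        return "mid tail"
    return "long tail"


print(tail_bucket("insurance"))                                  # short tail
print(tail_bucket("cheap online car insurance"))                 # mid tail
print(tail_bucket("cheap online car insurance in Albuquerque"))  # long tail
```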


Keyword research and the long tail curve (source: moz.com)


Keywords, Rankings & CTRs

When starting your keyword research you need to create an initial keyword list, also called a seed list, of keywords relevant to your website's context. At this point you can start brainstorming, alone or with your friends, about how searchers might look for your content, products or services online.
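One common way to turn a brainstorm into a seed list is to combine head terms with modifiers. A minimal sketch, using made-up head terms and modifiers for an insurance site (swap in your own niche):

```python
from itertools import product

# Hypothetical head terms and modifiers -- placeholders for your own brainstorm.
head_terms = ["car insurance", "home insurance"]
modifiers = ["cheap", "online", "compare"]

# Every modifier + head term combination becomes a candidate seed keyword.
seed_list = sorted({f"{m} {h}" for m, h in product(modifiers, head_terms)})

for keyword in seed_list:
    print(keyword)
```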

Next, you need to use a keyword tool (some are mentioned above) to map the entire search curve by identifying the short, mid and long tail keywords of your niche. By extracting, comparing and sorting keyword search volumes, you can then attach values to the selected keywords (i.e. high search volume = high value).

As mentioned earlier, short tail queries are extremely difficult to compete for, as they drive the majority of the traffic from the SERPs. Back in 2006, after a major leak of click data from AOL, it was revealed that position within the top 10 results for a keyword heavily affects click-through rates, and therefore website traffic: the top 3 positions were driving 60% of the total traffic, while 75% of users never scrolled past the first SERP!

In October 2013, the top ranking site for "dog breeds" in Google Greece, a keyword with 14.8k local monthly queries, received 5k clicks from the top organic position. That is a 33.7% CTR for the top position, significantly lower than the AOL figures below. Considering that CTRs are now affected by a growing number of factors, such as organic and paid results competing for the searcher's attention, the increasing number of multimedia results and more sophisticated copywriting for search, CTRs for top organic rankings can be expected to shrink further in the future.
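The CTR arithmetic behind figures like these is simple: clicks divided by total monthly searches. A quick sketch, using the rounded Google Greece numbers above:

```python
def ctr(clicks: int, monthly_searches: int) -> float:
    """Click-through rate as a percentage of total monthly searches."""
    return 100 * clicks / monthly_searches


# Rounded figures from the "dog breeds" example: 5k clicks on 14.8k searches.
top_position_ctr = ctr(5_000, 14_800)
print(f"{top_position_ctr:.1f}%")  # roughly 34% with these rounded inputs
```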


The top 10 rankings and their hierarchy in CTRs (source: seobook.com)

A new webpage that has completed on-page optimization realistically has a chance to reach the top rankings only for long tail queries, which are, however, far more relevant to searcher intent than generic short tail queries (e.g. "cheap online car insurance in Albuquerque" vs "insurance").

In this case, a realistic approach to your keyword planning would be a granular distribution of long tail keywords across your webpages, in order to drive qualified traffic and improve your acquisition metrics. According to a HubSpot survey, the top 10k keywords of the demand curve make up less than 20% of the overall search traffic, while 70% comes from long tail keywords.
  
It is important to keep in mind that search query volumes and the number of words forming keywords vary by target country. For instance, in larger, mature online markets such as the US, monthly search volumes are measured in tens or hundreds of thousands, or even millions, and short tail keywords usually consist of 2-4 word phrases. In smaller, less mature online markets, such as Greece, monthly search volumes are limited to a few thousand and short tail keywords are usually limited to one- or two-word terms.



Analyzing Keyword Competitiveness

To analyze and determine keyword competition you need to take into account the numerous ranking factors affected by "keyword specific" metrics, such as:
  • the search volumes for selected keywords.
  • the number of competing webpages that have implemented on-page optimization using the selected keywords or include them in their domain names.
  • the domain age, PageRank and page authority of the competing webpages.
  • the number and the authority of the pages and domains that link back to competitors.
  • the frequency of keywords appearing in internal and external anchor texts.
  • the number of paid search ads showing in search results and competing for user attention.
  • the estimated paid search metrics (such as CPC and competition).  
In practice, you can analyze keyword competition either with a set of free tools available online or with just your own custom spreadsheet (at least this is how I started!), or simply subscribe to premium services such as the Keyword Difficulty Tool, Market Samurai and Long Tail Pro, which will save you valuable time. Before putting a hole in your pocket for paid tools, though, it is recommended to start experimenting with Google's advanced search operators and your custom spreadsheet.
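A spreadsheet-style difficulty score simply combines several of the "keyword specific" metrics above into one number. The sketch below is a crude illustration, not an established formula: the weights and the choice of metrics are arbitrary placeholders that you would calibrate against keywords you already rank for.

```python
import math

# Crude keyword difficulty score. Weights are illustrative assumptions only;
# log scaling keeps huge counts (results, backlinks) from dominating the score.

def difficulty_score(optimized_competitors: int, avg_domain_age_years: float,
                     avg_backlinks: int, paid_ads_count: int) -> float:
    return (
        0.4 * math.log10(1 + optimized_competitors)  # on-page optimized rivals
        + 0.2 * avg_domain_age_years                 # age of competing domains
        + 0.3 * math.log10(1 + avg_backlinks)        # competitors' link profiles
        + 0.1 * paid_ads_count                       # paid ads competing for clicks
    )


# A fat head keyword should score far higher than a long tail one.
print(difficulty_score(1_000_000, 12, 50_000, 8))  # e.g. "insurance"
print(difficulty_score(200, 1, 10, 0))             # e.g. a niche long tail query
```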

All processes of keyword research, analysis and planning constitute your keyword strategy, accounting for the major part of all SEO tactics according to the essential SEO framework.


The SEO ranking factors 2013

 


The future of keyword research(ed) 


At present, exploratory search is already transforming into a semantic search scheme (from the Ancient Greek word sēmantikós, "significant") and keywords have started to be replaced by contextual entities, which is the real aim of the latest Hummingbird search algorithm update. The following video shows Google's official attempt in 2012 to shift from keyword to semantic search through the Knowledge Graph, a search technology that understands the real meaning behind keywords. In February 2013, Sergey Brin stated at TED2013 that "My vision when we started Google 15 years ago was that eventually you wouldn't have to have a search query at all."


 
Through the use of structured data, geotargeting, voice and social signals, the emerging semantic web will help search engines deliver more accurate and personalized results, closer to searcher intent. This shift will make the entire SEO process more difficult and, consequently, more fair. The image below illustrates how the Google search algorithm has evolved over time in terms of functions.
November 2013, Vangelis Kovaios
Suggested reading:
  • Google Search Algorithm Updates History
  • Organic Search Acquisition [Infographic]
  • Understanding Inbound Link Analysis