

17 August 2008

Day 4: Search Engine Optimization and Keywords


Search Engine Traffic, SEO and Keyword Choice

Search engines like Google, Yahoo, and MSN use complex algorithms to decide which pages will come up in search results for a given keyword phrase. We can construct content pages with SEO (search engine optimization) in mind in order to rank high in searches, leading to higher levels of SE (search engine) traffic.

Of course, many variables are involved in SEO, but let’s focus on the most important ones.

Keyword Basics

Your website is most likely about a particular topic (e.g. golf swings, poodle manicures, etc.) and so the first thing to do is determine which keywords best describe the content on your page. Try to think of phrases that are specific.

It’s important that Google’s crawler realizes your website is about the keywords you have chosen. Google does a pretty good job of determining the topic of your site, but there are a few things you can do to assist Google’s crawler and make sure it gets things right.

Let’s say I have a website called “All things Poodles”. My keyword list would be short, with just one keyword: poodles.

My website has 4 pages:
Poodles Home
Poodle Manicures
Poodle Puppies
Poodle Breeders

Once you have a list of keywords you need to do a little research to find out who you are competing with for each keyword.

For the keyword poodle manicure I would do a search on Google and record how many pages are competing. This tells me how many pages have my keyword phrase with the words in any order.

Then I would do a phrase match search by searching on Google for “poodle manicure”. This tells me how many pages have the exact phrase.

Then I would do a title search by searching on Google for allintitle:poodle manicure. This tells me how many pages have my keyword phrase in their title.

Then I would do a url search by searching on Google for allinurl:poodle manicure. This tells me how many pages have my keyword phrase in their url.
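To recap, for each keyword phrase I run these four Google searches (using poodle manicure as the example):

poodle manicure
"poodle manicure"
allintitle:poodle manicure
allinurl:poodle manicure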

Note: To learn more about allintitle, allinurl, and other useful operators:
http://www.google.com/help/operators.html

I would do the same thing for the keyword phrases: poodle puppies and poodle breeders.

My table would look something like this:

Keyword            Basic Search   Phrase Search   Title Search   URL Search
poodle manicure    15,000         8               6              1
poodle puppies     145,000        333,000         24,700         3,790
poodle breeders    151,000        49,000          3,840          1,790

Looking at the table above it’s clear that there is little competition for poodle manicure and a lot more competition for poodle puppies.

Optimizing your Page
Once you have your list of keywords and know a little about your competition it’s time to optimize.

Put your main keywords that describe your site in the title tag on every page.

You want your most important keywords to appear in the title with as few other characters as possible.

Do not put your keyword in the title more than 2 or 3 times. Google sees this as keyword spamming and will rank your site lower.
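As a sketch (the exact wording is up to you), the title tag on my Poodle Manicures page might look like this:

<title>Poodle Manicures - All things Poodles</title>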

On each page put your main keywords in an h1 tag.

Please note I would not target the keyword dog on my Poodle Manicures page, because there are 420 million pages related to the word dog and the vast majority of people searching for dog are not looking for poodle manicures.

There are 15,000 pages competing for poodle manicure and only 8 pages competing for the exact phrase “poodle manicure”.

H1 tags are reserved for top level headlines on a page. You should only use 1 or 2 top level headlines (h1 tags) on your page.

Further discussion on h1 tags:

http://www.seroundtable.com/archives/002429.html

It’s also a good idea to use h2 and h3 tags where appropriate.
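Putting the headings together, my Poodle Manicures page might be structured something like this (the subheading text is purely illustrative):

<h1>Poodle Manicures</h1>
<h2>Poodle Manicure Tools</h2>
<h3>Clippers and Files</h3>
<h2>Giving Your Poodle a Manicure Step by Step</h2>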

Adding an alt attribute to your image tag can help, as can having your keywords in the file name. On my Poodle Manicures page I would have something along these lines (the file name and alt text are illustrative):

<img src="poodle-manicure.jpg" alt="poodle manicure" />

Read more about images and SEO

http://www.problogger.net/archives/2005/03/12/formatting-images-for-seo/

I also like to put one of my keywords at the bottom of my page. This way crawlers find my important keywords at the top of my page and at the very end of the page.
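For example (illustrative wording), the last visible line on my Poodle Manicures page might simply be:

<p>Thanks for reading my guide to poodle manicures!</p>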

It helps to have your keyword phrase in your domain name and in the file name of your page. For my Poodle Manicures page the ideal url would be http://www.Poodles.com/PoodleManicures.html

It’s also imperative to embed your keyword phrase throughout your page at a rate of 1% to 2% of the words on your page (e.g. if you’re writing a 500 word article, add your main keyword 5 to 10 times). This is called keyword density. Note: there’s no exact science on how many times you should use your main keywords. To get an idea of how often they should be used do a Google search for your keyword phrase and look at the keyword density for the top 10 pages.

Something to consider is LSI (latent semantic indexing). Google now ranks pages based on words related to your main keywords and topic. LSI comes naturally, yet you should keep it in mind when thinking through word choice.

Here’s a tip: take a look at your main keyword. Which words could you use in your content that are closely related to that topic? Also consider using thesaurus.com to find synonyms for your main keyword.

How do I know which main keywords to use?

You can optimize your pages based on:

What people are searching for year round
What people are searching for right now
What people will be searching for in the coming weeks/months/season

High Gravity Keywords
High Gravity Keywords are the most searched for terms year round, as well as usually the most competitive. They’re good to go after, but don’t expect to rank in the top 10 for those terms on your content pages. However, it’s more likely you can get your index page in the top 10.

You can find these competitive keywords by using a number of tools:

http://tools.seobook.com/general/keyword/

The SEO Book keyword tool is the most comprehensive of the free research tools. After typing in a keyword and searching for your high gravity keywords, you’ll see results from WordTracker, Google, Yahoo, MSN, Yahoo Suggest, Google Trends, Keyword Discovery, Google Traffic Estimator, Google Suggest, Quintura, and the AdWords Keyword Tool.

The downside to the SEO Book tool is that it doesn’t reveal how many pages on the web are competing for keywords.

You can get the number of pages competing for a given keyword by searching for the keyword phrase on Google.

Case Study:
Joshua Spaulding’s ez-onlinemoney.com ranks in the top 10 on Google for “make money online.” Spaulding gets traffic to his site every day from that competitive and highly searched term. He’s an average net entrepreneur like you and me.

Low Gravity Keywords
Low Gravity Keywords are less searched and less competitive. So why use them? It’s better to be in the top 10 for a low-searched term than buried on page 1000 for a high-searched term. If a term is only searched for 30 times each month and your page is in the top 10 for that term, you’re going to get the bulk of that traffic.

Low Gravity keywords can be found using a free version of WordTracker:

http://freekeywords.wordtracker.com/

To find the right Low Gravity Keywords, you need to research how many other pages on the web are competing for each keyword you’ve selected.

Case Study:
Internet marketer Michael Cheney has said that a major tactic behind his traffic-generation success has been creating several pages targeting low-searched keywords and letting the traffic accumulate into a constant stream. We’re talking about a guy who makes six figures in AdSense earnings alone.

Researching Keywords For What’s Hot Right Now

Finding keywords for the topics and events internet users are searching for right now is easy.

You can use Google Trends http://www.google.com/trends/hottrends, which lists hot keywords and updates throughout the day.

Other methods include subscribing to Google News Feeds http://news.google.com and news feeds from top sources in your niche.

Researching Keywords that Will Be Searched For

There’s actually no way of completely predicting what others will be searching for in the coming weeks/months/season.

However, this is one of the best SEO tactics you may have never heard of. The tough part, as said, is that there’s no magic tool that will tell you what terms will be searched for.

One tactic you could use is to capitalize on traffic during Christmas time. There are entire sites dedicated to baking during the holiday season, etc.

Go to Google Trends and search something seasonal like Christmas Recipes.

Case Study:

A couple years ago I made a site for Halloween Costumes and got a ton of traffic in the month of October. I also got a ton of traffic to my Halloween Recipes site in October. I did the same thing for my Christmas Recipes site and saw an avalanche of traffic. How did I know these things would be popular? I used Google Trends:

http://www.google.com/trends?q=christmas+recipes…

Case Study:
ProBlogger.net’s Darren Rowse claims he began hitting the 1,000-visitors-a-day mark when he started predicting which terms would be searched: http://www.problogger.net/archives/2007/09/13/getting-to-1000…

The method worked so well that Rowse and a friend created an Olympics blog months before the 2004 Olympics including names of athletes, events, and other things. When the Olympics came around, Rowse and his pal witnessed unbelievable amounts of traffic.

Using Robots.txt

One simple thing you can do to help web crawlers crawl your site is to have a robots.txt file. This is especially important if you have sections of your website you do not want indexed by search engines.

In addition your robots.txt file can store the location of your site’s sitemap, making it easy for crawlers to find and crawl every page on your site.

“The robots exclusion standard, also known as the Robots Exclusion Protocol or robots.txt protocol is a convention to prevent cooperating web spiders and other web robots from accessing all or part of a website which is, otherwise, publicly viewable.” - Wikipedia

robots.txt files are quite simple and easy to create.

An example robots.txt file:

User-agent: *
Disallow: /tmp/
Disallow: /private/
Sitemap: http://www.example.com/sitemap.xml.gz

The first line, “User-agent: *”, tells crawlers that any crawler may crawl the site.

The next 2 lines tell crawlers not to crawl anything in the tmp and private folders.

The last line tells the crawler where to find the site’s sitemap.
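For comparison, if you wanted to keep cooperating crawlers out of your entire site, a minimal robots.txt would look like this (not something you would normally want on an SEO-focused site):

User-agent: *
Disallow: /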

A robots.txt file always lives in the top-level directory of your domain.

Example: http://www.example.com/robots.txt

I used to create my own robots.txt file but I recently found a site that will generate one for me for free:

http://www.mcanerin.com/EN/search-engine/robots-txt.asp

Read more about robots.txt:

http://en.wikipedia.org/wiki/Robots.txt

SEO Checklist

Learn the basics of HTML
Choose your keywords
Find what other sites are competing for those keywords
Put your keyword phrase in your title and h1 tags.
Include an image on the page with the file name & the alt attribute containing your keyword phrase
Make sure your keyword phrase is found at the very bottom of your page.
Embed your keyword phrase throughout your page at a rate of 1% to 2%.
Have your keyword phrase in domain and in the page’s file name.
Create your robots.txt file.

Other Suggested SEO Resources:

Learn More about PageRank
http://www.seofaststart.com/blog/why-google-cant-just-dump-pagerank

Google Basic Searching
http://www.google.com/help/basics.html

Google Advanced Search
http://www.google.com/help/operators.html

Interpreting Google Search Results
http://www.google.com/help/interpret.html

Google Webmaster Central
http://www.google.com/webmasters/

Google SEO Basics
http://www.interspire.com/content/articles/13/1/Google-SEO-Basics-for-Beginners

Wikipedia & SEO
http://en.wikipedia.org/wiki/Search_engine_optimization

Google’s keyword search tool
https://adwords.google.com/select/KeywordToolExternal
