Thursday, August 20, 2009

Basic SEO Tips - Part 2

Competitor Analysis - If you have a product or service with unique selling points, you should analyze your online competition so you can target your potential customers efficiently. From an SEO point of view, you need to analyze how other websites rank for the search phrases that matter to your own site. First recognize who your competitors are: enter all of your primary and secondary key phrases into the major search engines and build up a list of the sites that appear. Update the list on a regular basis to keep your analysis current. Evaluate each site carefully, analyzing your competitors on the following basis:-
Keyword / Key Phrase Density Analysis – A list of each competitor's keywords is essential (as discussed above) to validate your own analysis.
Pagerank Checkers – When tracking a large number of competitors, use PageRank-checking tools to assess each site's authority for your own comparison.
Search Engine Exposure - Rather than visiting each search engine individually, you can use websites like Netconcepts, which offer tools to determine how many internal pages of a site have been cached by each major search engine.
Whois & Contact Forms - The WHOIS databases will let you match website owners with real-world businesses that are using multiple sites to boost their sales opportunities.
Site Age – web.archive.org is the place to see the history of a competitor's site: when they started their business and how much they have developed since, so you can apply similar improvements to your own sites.
Quote Checker & Mystery Shopper - If your competitors run an online quote system, it is easy to compare your price level against theirs. If they only use a contact form and telephone call-back, you could pose as a prospect requesting a quote.
Design & Website: Website design is one of the most important areas for good SEO practice:
Understanding of HTML - HTML is the computer language that facilitates the creation of web pages; it is essentially a text file with a series of short defining codes (tags) wrapped around text.
Designing a Compelling Homepage - This is the most important page of the website, as most users will land on it rather than on the internal pages.
Simple Navigation - The most vital feature of usability is a clean and simple navigational structure. Your structure must be consistent across all pages so users have a clear understanding of how to find their way between sections and back again.
Simple Page Structure - Try to divide the areas of your page into a top border, side border, bottom border and main body.
Accessibility - Always check your pages with the W3C markup validation tools to ensure they work for both users and search engines.
Implement a Linking Strategy - Meta tag implementation and usability, content optimization and link building are already discussed in my previous posts; please check them and read carefully.
Statistics, Logs, Web Analytics - Website statistics can offer you a huge range of information, such as user behaviour and lead and sale conversion ratios. Most shared hosting comes with statistical packages such as SmartStats, AWStats and Webalizer. There are also online services that provide similar information, such as Google Analytics, OneStat and StatCounter.

Saturday, August 15, 2009

Content optimization overview

“CONTENT IS KING” – this is perhaps the single most important factor in ranking your website highly on the search engines. While all of the other factors help get your website into the top positions, it is your content that will sell your product or service, and it is your content that the search engines read when they take a “snapshot” of your site and determine where it should be placed in relation to the billions of pages on the internet.


Important optimization factors:

  1. Keyword Prominence, Density, Proximity and Frequency
  2. Heading Tags
  3. Anchor Text Link / Inline Text Links
  4. Special Text (bold, colored, etc.)


Key Concepts: Keyword Prominence, Density, Proximity and Frequency:


Prominence: Prominence is a measure of keyword importance that indicates how close a keyword is to the beginning of the analyzed area (e.g. the page title). If a word appears at the beginning of the title, of a heading, or close to the beginning of the visible text of the page, it is considered more important than other words. Prominence is calculated separately for each important area. HTML markup lets you emphasize certain document areas over others: the most important items are placed at the top, and their importance gradually decreases towards the bottom. Keyword prominence works much the same way. Usually, the closer a keyword is to the top of a page and to the beginning of a sentence, the higher its prominence. However, search engines also check whether the keyword is present in the middle and at the bottom of the page, and you should be aware of that.
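As a rough illustration, prominence can be modeled as a score that falls the further the phrase sits from the start of the analyzed area. Here is a toy sketch under that assumption; the scoring is my own invention, not any engine's real formula:

import re

# Toy prominence score: 1.0 if the phrase opens the area, falling
# towards 0.0 the later it appears. Purely illustrative.
def prominence(phrase, area_text):
    words = re.findall(r"[a-z0-9]+", area_text.lower())
    target = re.findall(r"[a-z0-9]+", phrase.lower())
    for i in range(len(words) - len(target) + 1):
        if words[i:i + len(target)] == target:
            return 1.0 - i / max(len(words) - 1, 1)  # earlier match -> higher score
    return 0.0

print(prominence("website design", "Website design and development services"))  # 1.0
print(prominence("website design", "Affordable custom flash website design"))   # 0.25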


Frequency – the number of times the keyword is used in the analyzed area of the page. For example, if the keyword is “website design” and the analyzed area reads “Provides website design and development services including flash website design, logo and corporate ID design”, then the keyword frequency for “website design” is 2.
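A minimal sketch of that count (the simple word-level tokenization is an assumption for the example; real engines tokenize far more carefully):

import re

# Count how many times a key phrase occurs in an area of text.
def keyword_frequency(phrase, area_text):
    words = re.findall(r"[a-z0-9]+", area_text.lower())
    target = re.findall(r"[a-z0-9]+", phrase.lower())
    return sum(words[i:i + len(target)] == target
               for i in range(len(words) - len(target) + 1))

text = ("Provides website design and development services including "
        "flash website design, logo and corporate ID design")
print(keyword_frequency("website design", text))  # 2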


Keyword Weight / Density – Keyword density, or weight, is a measure of how often a keyword is found in a specific area of the web page (title, heading, anchor name, visible text, etc.) relative to all other words. Formula: (number of words in the keyword phrase × frequency) / total words in the analyzed area.
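Plugging that formula into code, under the same toy tokenization as the frequency sketch above:

import re

# density = (words in the key phrase * frequency) / total words in the area
def keyword_density(phrase, area_text):
    words = re.findall(r"[a-z0-9]+", area_text.lower())
    target = re.findall(r"[a-z0-9]+", phrase.lower())
    frequency = sum(words[i:i + len(target)] == target
                    for i in range(len(words) - len(target) + 1))
    return len(target) * frequency / len(words)

text = ("Provides website design and development services including "
        "flash website design, logo and corporate ID design")
print(f"{keyword_density('website design', text):.0%}")  # 2 * 2 / 15, about 27%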


Proximity – Keyword proximity refers to how close the words that make up your key phrase are to each other. For example, if the key phrase is “electric iron”: “International Iron Inc has been selling electric irons for 7 years. International Iron is a company that specializes in electronic devices, irons and other appliances…” Notice that we try to use the words “electric” and “iron” close to each other in the text, so that the spider understands we are targeting people who search for these kinds of irons.
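One way to make this measurable is the smallest word gap between the two parts of the phrase; the gap-based scoring below is my own illustration, not a documented engine metric:

import re

# Smallest number of words separating the two parts of a key phrase;
# 0 means they sit right next to each other (best proximity).
def proximity_gap(word_a, word_b, area_text):
    words = re.findall(r"[a-z0-9]+", area_text.lower())
    pos_a = [i for i, w in enumerate(words) if w == word_a]
    pos_b = [i for i, w in enumerate(words) if w == word_b]
    if not pos_a or not pos_b:
        return None  # one part of the phrase is missing entirely
    return min(abs(a - b) for a in pos_a for b in pos_b) - 1

text = "International Iron Inc has been selling electric irons for 7 years."
print(proximity_gap("electric", "irons", text))  # 0: adjacent words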


Heading:


When using your heading tags try to follow these rules:

  1. Never use the same tag twice on a single page
  2. Try to be concise with your wording
  3. Use heading tags only when appropriate. If bold text will do then go that route
  4. Don’t use CSS to mask heading tags


Try to be concise with your wording. If you are targeting a two-word key phrase and you write a heading that is 10 words long, your key phrase only makes up about 20% of the total verbiage. With a 4-word heading, on the other hand, you would have a 50% density and increased priority given to the key phrase you are targeting.
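The arithmetic from the paragraph above, as a quick sketch (the headings are invented examples):

# Share of a heading taken up by the key phrase: phrase words / heading words.
def heading_share(phrase, heading):
    return len(phrase.split()) / len(heading.split())

print(heading_share("website design", "the ten word heading that buries website design right here"))  # 0.2
print(heading_share("website design", "affordable website design services"))                          # 0.5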


Special Text

To emphasize keywords, use special formatting such as bold, underline, italics or a different color as needed. Search engines do recognize keywords with special formatting.


Don’ts

Do not apply special formatting unnecessarily here and there, as it looks messy and doesn't make any sense to the reader. Moreover, search engines may penalize you for over-optimization.

Wednesday, August 12, 2009

Basic SEO Tips For Beginners:

Part 1

Introduction - Simply put, SEO is the process of improving the number of visitors to a website via search engines. By optimizing your website with the specific key phrases used by your target customers, it is possible for search engines to rank your website more highly than similar non-optimized competitors. According to search engine strategy and principles, SEO should be viewed as a component of your overall professional internet marketing strategy and used ethically to improve the quality of the visitor experience. The first step is to understand how search engines work.
Search Engine Basics - A search engine is an interactive platform that allows anybody to enter a search query and retrieve information from billions of web pages, files, videos, images and music files. Most people are well acquainted with Google, Yahoo and MSN, but there are also literally hundreds of other less well-known specialist search engines providing similar services. When you visit a search engine, search results are traditionally displayed as blue links with a short description of each website. The results relate directly to the user's search query.
How Do Search Engines Work? - Search engines use programmed mathematical algorithms to rank and compare web pages with similar content. The algorithms are very complex, and the search bots they send out automatically look for specific information when visiting a web site, such as the robots.txt file, the sitemap.xml file and WHOIS data. They do this to find new content quickly and to make sure the listings they show users are up to date and relevant. The data is stored by the search engine company in huge server data centers. Just as every company keeps a confidential portfolio of how it works, the search engine companies keep their original mathematical formulae secret.
How Do Search Engines Present Relevant Results? – Traditionally, search engines checked the number of links a website received from other websites and ranked sites on that basis; these links are called inbound links. To combat pages carrying huge numbers of links, known as “link farms”, the algorithms became more refined. Nowadays links are less significant, and the textual relevancy of the words, paragraphs, pages and entire theme of a website is decisive in achieving high search engine results. Search engines utilize advanced anti-spam factors to make certain that users are presented with the most relevant, quality results that best match their search.
Key Phrase Analysis & Selection - The next and most important step is to work out the keywords associated with your product or service that your target customers are typing into search engines. Only then will you be able to effectively strategize and optimize a website according to your market trends and customers' demands. Key phrase selection is the first and primary step in internet marketing. Search engines use mathematical algorithms to evaluate web pages in order to rank them (based on a user's search query). If you speculate incorrectly or without research and target key phrases that don't matter to buyers, your valiant effort goes in vain. On the other hand, if you target the right blend of keywords and phrases (before you even design your website), you will maximize your chances of higher search rankings and create a real prospect to sell. By using keyword selection tools, advertisers can identify which search terms are popular and competitive. Keyword tools are invaluable in identifying a range of niche search terms that can be used to help optimize a website for higher search engine rankings and more website visitors. These tools can also produce derivatives and synonyms, common spelling mistakes, and comparative competitiveness indices showing whether a particular phrase will be hard or easy to achieve top search listings with.
Once you have used your market knowledge and keyword tools to validate the search volumes of phrases, make a list of your top 10 phrases, ranked by search volume. Invariably there will also be derivatives of your top ten target phrases worth keeping.
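As a trivial sketch of that last step (the phrases and search volumes below are invented placeholders, not real data):

# Rank candidate key phrases by (hypothetical) monthly search volume
# and keep the top 10.
volumes = {
    "website design": 9000, "web design company": 4500,
    "cheap website design": 1200, "flash website design": 800,
    # ... more candidate phrases from your keyword tools ...
}
top_phrases = sorted(volumes, key=volumes.get, reverse=True)[:10]
for rank, phrase in enumerate(top_phrases, start=1):
    print(rank, phrase, volumes[phrase])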

This is just the first part of the discussion. In my next post I will discuss some more important parts of SEO, so for the time being stay tuned, guys :)

Sunday, August 9, 2009

301 Redirection

One of the most effective and search-engine-friendly methods of webpage redirection is the 301 redirect. It is not hard to implement, and it should preserve your search engine rankings for the redirected page. If you need to change a file name or move a page to another location, a 301 redirect is the safest way. In SEO, “301” indicates a “permanent” redirect. A couple of 301 redirection methods are described below:

Old domain to New domain Redirection (htaccess redirect):

To redirect all the directories and pages of your old domain to your new domain, create a .htaccess file with the code below and place it in the root directory of your old website (i.e. the same directory where your index file is placed).

Options +FollowSymLinks
RewriteEngine on
RewriteRule (.*) http://www.newdomain.com/$1 [R=301,L]

www.newdomain.com in the above code must be replaced with your actual domain name.

Redirect to www (htaccess redirect)


To redirect all requests coming in to domain.com to www.domain.com, create a .htaccess file with the code below. The .htaccess file needs to be placed in the root directory of your website (i.e. the same directory where your index file is placed).

Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]

domain.com and www.domain.com in the above code must be replaced with your actual domain name.

Note: this .htaccess method of redirection works ONLY on Linux servers running Apache with the mod_rewrite module enabled.
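One way to confirm the server really answers with a 301 (and not a temporary 302) is to request the old URL without following the redirect and inspect the response. Here is a sketch using Python's standard library, with placeholder domains:

import urllib.request
import urllib.error

# Stop urllib from following redirects so we can see the raw 301.
class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None

opener = urllib.request.build_opener(NoRedirect)
try:
    opener.open("http://domain.com/")
except urllib.error.HTTPError as response:
    # Expect: 301 http://www.domain.com/
    print(response.code, response.headers.get("Location"))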

====================================================

There are other redirection methods, but these are the most SEO-friendly ones.


Saturday, August 8, 2009

Static Vs Dynamic URLs

SEO is one of the hottest topics on the web right now, and for each and every website the URL is a basic building block. URLs take on extra importance from an SEO point of view because URL structure matters a great deal in helping search engine spiders notice your site among the millions of websites on the web.

Two types of URLs come to mind: static and dynamic. From an SEO point of view both types have their place, yet static URLs have added advantages over dynamic URLs for the following reasons:

1. Static URLs have higher click-through rates in the SERPs, emails, web pages, etc.
2. Keyword prominence and relevancy are also higher,
3. They are easier to copy, paste and handle on- or offline,
4. They are easy to remember and usable in branding and offline media,
5. They give users an accurate sense of what they're about to see on the page,
6. When linked to directly in URL format, a static URL can contain good anchor text that helps the page rank higher,
7. All 4 of the major search engines and many other minor engines generally handle static URLs more easily than dynamic URLs, particularly when the latter contain multiple parameters.

There are plenty of other facts regarding this topic, which may be debatable, but I am not going to debate with anybody here. I am just stating my view and sharing my thoughts and experiences.

So, I suggest every SEO hand go through the website and divide the pages into SEO and non-SEO pages. Then check whether the SEO pages have dynamic URLs. If they do, there are many URL rewrite tools on the web (e.g. webconfs.com) that make it easy to rewrite the dynamic pages as static ones: you create a .htaccess file, hard-code the rewrite rules, upload the file to the root folder of the server, and your job is simply done. Now relax; the rest will be done by the search engine spider.
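To illustrate the kind of mapping such a rewrite performs (the product.php?id=N URL scheme here is a made-up example):

import re

# Map a hypothetical dynamic URL onto its static-looking counterpart;
# on the server the same mapping would live in a .htaccess rewrite rule.
def to_static(url):
    return re.sub(r"/product\.php\?id=(\d+)", r"/product/\1", url)

print(to_static("http://www.example.com/product.php?id=42"))
# http://www.example.com/product/42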

Always remember that search engines prefer static URLs and treat them favourably. So, before investing a lot of your valuable time in SEO, fix the URL structure first.

Wednesday, August 5, 2009

Top 5 On-Page SEO Factors

Search engine optimization is nowadays becoming a tough task due to intense competition. If you study the theme of a particular website, you can easily collect the relevant keywords for that site. In most cases you target a keyword that will ultimately add value to your business as well as satisfy your client. But you will often find that the keyword you choose is very competitive, and getting a top position for that phrase is no cakewalk. So you have to toil very hard to get within the top 10 rankings for your targeted key phrases, and the main steps are as follows (see the small sketch after the list):-

1. Keyword Use in Title Tag - Placing the targeted search term or phrase in the title tag of the web page’s HTML header will be very effective.

2. Keyword Use in Body Text - Using the targeted search term in the visible HTML text of the page also gives a boost to the ranking.

3. Keyword Use in H1 Tag - Creating an H1 tag containing the targeted search term/phrase is a very meaningful signal.

4. Keyword Use in Domain Name - Including the targeted term/phrase in the registered domain name, e.g. keyword.com, gets half of your job done.

5. Keyword Use in Page URL - Including target terms in the webpage URL, e.g. sherpawebstudios.com/keyword-phrase.
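A small sketch tying the list together: a toy checker that tests whether the target phrase appears in the title, H1 and body of a page. The HTML and phrase are invented for the example, and the regex lookup is a demo shortcut; real pages deserve a proper parser.

import re

page = """
<html><head><title>Website Design Services</title></head>
<body><h1>Affordable Website Design</h1>
<p>We offer professional website design for small businesses.</p>
</body></html>
"""

# Naive tag lookup with a regex; fine for a demo, not for production HTML.
def contains(tag, phrase, html):
    match = re.search(rf"<{tag}[^>]*>(.*?)</{tag}>", html, re.S | re.I)
    return bool(match) and phrase.lower() in match.group(1).lower()

phrase = "website design"
for tag in ("title", "h1", "body"):
    print(f"'{phrase}' in <{tag}>:", contains(tag, phrase, page))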

There are a lot more factors on this topic, which I will discuss in a later post.


Monday, August 3, 2009

How Search Engines Crawl Sites

It is an interesting question for all SEO experts: when a search engine visits a particular website, what does it see, and what path does it follow? Well, different people have different opinions about it. The process never really ends, and it is a very debatable topic too. I will discuss only some points, focusing on the Google search engine's view of a particular webpage.


Search engines run crawlers, or bots, which are nothing but automated programs following algorithms with fixed constraints and instructions for examining a webpage. On its first visit to a site, a crawler wants to seek out HTML pages and ignore all other MIME types. To request only HTML resources, a crawler can issue an HTTP HEAD request to determine a web resource's MIME type before requesting the entire resource with a GET request. To avoid making numerous HEAD requests, a crawler may alternatively examine the URL and only request the resource if the URL ends with .html, .htm or a slash. If everything checks out, the crawler then reads the robots.txt file for its instructions. Robots.txt is a human-made text file in which certain internal files, folders, URLs, images and other resources are blocked; search engine crawlers simply skip whatever it disallows.
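A sketch of that HEAD-before-GET check using Python's standard library (the URL is a placeholder):

import urllib.request

# Issue a HEAD request first and fetch the body only for HTML resources.
req = urllib.request.Request("http://www.example.com/", method="HEAD")
with urllib.request.urlopen(req) as head:
    mime = head.headers.get_content_type()

if mime == "text/html":
    with urllib.request.urlopen("http://www.example.com/") as page:
        print("fetched", len(page.read()), "bytes of HTML")
else:
    print("skipping non-HTML resource:", mime)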


Crawlers then move on to the other pages and examine them according to set rules. A crawler reads a webpage from top to bottom and from left to right. It first looks at the URL of the page, then checks all the meta data (title, description, keywords), the total size of the file, all the text on the page, and the total number of words along with the distinct words. It then checks all the links on the page thoroughly. We have to remember that crawlers like simple text content they can move through freely: complex designs, scripts and images without ALT attributes are avoided by crawlers. So, to keep a webpage crawlable, these rules should be followed by page designers, programmers and, of course, SEO experts.