Tuesday, May 11, 2010

Utility of NOODP & NOYDIR Meta Tags:

We all know that Meta tags are very important for SEO. It's an old concept, but properly optimized Meta tags give a real benefit to the website you are trying to optimize for SERP ranking.

We are already familiar with the famous Meta tags like Title (though it's not strictly a Meta tag, we usually count it as one), description and keywords. But how many of us are familiar with the lesser-known values NOODP and NOYDIR? I think very few SEO professionals among us have used them. OK, no problem, I am here to discuss them right now.

Suppose you search for a particular keyword or key phrase and your site appears in the SERP. Sometimes you will observe that, instead of showing the description you have carefully optimized, the search engine shows a snippet taken from your site's DMOZ (Open Directory) listing, if your site is listed there. In other words, your website description is replaced by your Open Directory listing details, and that is obviously not the result you want to see. The way to prevent this is to tell the search engines not to use your Open Directory listing as the snippet, by incorporating the following Meta tag:

<meta name="robots" content="NOODP">

The same applies to the Yahoo directory. If you don't want your website's description in the search results to be replaced by your site's Yahoo Directory listing details, you can tell the search engines not to use the Yahoo Directory data by incorporating the following tag in the Meta section of your website:

<meta name="robots" content="NOYDIR">
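
Both values can be combined in a single tag, and they can also be aimed at an individual crawler: Googlebot honors NOODP, and Yahoo's crawler Slurp honors NOYDIR. A minimal sketch:

<meta name="robots" content="NOODP, NOYDIR">
<meta name="googlebot" content="NOODP">
<meta name="slurp" content="NOYDIR">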

I think the above post will help you understand these Meta tags well. If you have anything to add, please drop me a comment; I am always eager to hear from you.


Sunday, May 9, 2010

Same Keyword For Subdomains Hampers SERP Ranking

In the SEO field there is a very popular confusion among SEO guys: is using the same keyword or key phrase for the index page and several subdomains of the same domain good or not? It's a matter of debate, and there are lots of opinions about it.

According to some SEO giants in the industry, you should not choose the same key phrase for lots of subdomains, because it dilutes the ultimate SEO value of the main site you are targeting.

When you choose the same key phrase for several pages across your subdomains, Google only shows 2-3 of them in the SERP. Google always tries to work out the relevancy of the pages you are targeting for the same set of key phrases. Suppose you target 4 subdomain pages for the keyword "website promotions" but only see 2 pages in the SERP results; this is due to Google's relevancy calculation across the 4 pages you have taken for granted.

In general, when Googlebot comes to your site for indexing, it looks for relevant terms. When you optimize many pages for the same keyword, Googlebot first chooses the most relevant page for the selected term, keeps it in its database, and then values the other pages on a relevancy basis.
The case gets interesting if you target a key phrase for internal pages with a proper URL definition. Your URL has to be semantic and relevant to the key phrase you choose to optimize, and it should contain the keyword term. That way there is a better chance of SERP ranking for the chosen keyword on the internal pages of your subdomain.

So the best solution is to choose a separate keyword for each page of your subdomains, and if you do want to use the same keyword or key phrase on different pages, the URL of each page should state it clearly.

Wednesday, May 5, 2010

Problems in the Google Local Business Center: Waiting For Next Update

Setting up your account and adding business locations is really easy; that is, if it's only a few.
But what really happens when you add multiple listings using the bulk-upload method? In my view this is the ideal method: it is really, really fast, it tells you which faults to correct (if any), and you see your business listings materialize in a tick, awaiting approval (which takes 2-6 weeks or even more).
So what's the problem? Well, we want to flavor things up a bit: add some pictures and videos. But where is the bulk-upload feature for that? I want a branded icon or my company logo added to all 200 listings I just uploaded with the bulk-upload feature. Impossible! So the very next moment you are busy doing hard manual labour, adding images and videos per business location using an almost antique photo upload system. It might be good for Google to pay some attention to the Local Business Center here; they could simply add some fields to the upload form for the image source destinations to be used.
Instead of adding new features, they should improve the ones that already exist. Speaking of which, when is that category update coming to Europe?
So hereby I'm compiling a list of the features I would like to see added or updated in the Local Business Center:
• Category update (in Europe!)
• Interface redesign / usability upgrade
• Photo upload form
• Bulk-upload of photos / videos
• Statistics (defined impressions / views), integrated with Google Analytics!
• Insight into the (trusted) sources used to gather my information (so I can contact them if something is wrong, missing or incomplete!)
• History of a business listing's status (pending, awaiting approval, flagged, awaiting next update, marked as inappropriate, etc.)
Feel free to leave a comment if you think something is missing!

Friday, April 30, 2010

How To Add Google Analytics @Google Knol

Most of us who have been in the SEO profession for a long time haven't even thought of adding Google Analytics to their Google Knol! Very few people have applied this technique to track their organic visitors. If you have already applied it, fine; but I think this post will help most SEO-loving people who want to groom themselves.

First of all, create a Google Knol on a particular topic. After publishing the topic and making it live, click the "edit knol" button. You will then see various options in the right panel of the post, including a button called "my preferences". Click it and you will be brought to a settings page with multiple tabs, one of which is named "Analytics". In the meantime, create a new account in Google Analytics (or, within an existing account, create a website profile for the Knol); after entering your details you will receive an urchin.js-based tracking ID there. Copy that value, paste it into the "Analytics" section of the Knol, and finally click the "Save" button.
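
For reference, the Knol settings page only needs the tracking ID itself (the UA number), not the full snippet. On a normal website, the classic urchin.js tracker corresponding to such an ID looks like this; a minimal sketch, with a placeholder account number:

<script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
<script type="text/javascript">
_uacct = "UA-XXXXXXX-X"; // your own tracking ID goes here
urchinTracker();
</script>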

You are done, and you have really applied something tedious but modern. Now go to your Google Analytics account and you will see the "pending" status of your newly incorporated tracking code.

After 24 hours you will be able to track all the traffic details you need for your Knol.

Hope this post helps you a lot! If you find it interesting, kindly leave a comment.

Monday, April 26, 2010

Amazing Things - Google Dance & Googlewashing

How many SEO guys among us are aware of the Google dance? Do you know what it is? OK, I am going to discuss the phenomenon explicitly below.

We are all aware that Google constantly updates its algorithm and certain technical areas. For the guys in the SEO field, it's really important to know about these Google updates, because an update can minorly or majorly affect your website's ranking factors. In general, Google makes a major change to its algorithm about once a month (sometimes a major algorithm change has been observed only after as long as 3 months), and the change does not happen instantly: sometimes it takes more than 24 hours at a stretch for the new algorithm to take effect.

Google uses 3 internal servers: http://www.google.com/, http://www2.google.com/ and http://www3.google.com/.
Actually, this is a very interesting topic, because outside Google nobody knows exactly what these servers are for, how the process takes place, or when the two extra servers run; only Googlers know the facts in detail, and they never disclose them to anybody outside Google. When a major update takes place on the prime server, the other two servers run simultaneously, and the most interesting part is that at that moment a particular keyword or key phrase shows 3 different rankings across the 3 servers. This phenomenon is known to us as the "Google Dance": your particular keyword or key phrase dances in ranking across 3 different servers.

...............................................................................

Another very crazy thing, which can even turn you mad, is Googlewashing: copying your original content verbatim and pasting it into a duplicate site line by line. Suppose you write an article after well-made research and upload it to your site, and immediately another guy copies it thoroughly, without changing a single line, puts it on his or her own site, and it gets cached before your own. How disgusting is that! And that person gets all the mileage at your expense.

Through Googlewashing you become a total fool in the eyes of the search engine, as your credit goes to another person who hasn't given a minute's effort to earn such credentials. Googlewashing can kill your competitor, or vice versa.

Wednesday, April 21, 2010

Beware Link Builders – Google Caffeine is Knocking @ Your Door

I am writing on this particular topic from the knowledge I have gathered from different resources. It's said that Google Caffeine, the latest update to Google's algorithm, is knocking at the door. First I heard it would be released in May 2010; now I hear its release date will be in the latter half of 2010. Whenever it may come, it will be with us very soon, and as a die-hard SEO I am already thrilled about its launch.

I have heard that after Big Daddy (a change to Google's algorithm around 2005), this is the first major update to the structure of Google's search engine algorithm. Google's engineers certainly won't reveal their hidden policies around Caffeine, though I have watched some videos by Matt Cutts on this much-discussed topic. SEO businesses, small and large, will need definite strategies for how to do SEO once the update takes place, and most have already started thinking about what their new approach should be to cope with the latest search technique.

From the information gathered to date, three things will matter most for any site's SERP ranking: website speed, social bookmarking and quality link building. Today my discussion is on the third one, quality link building, which matters most of all.

Nowadays most link-building campaigns are going about it the wrong way, I am really sorry to say, because their approach is aimed at making a large amount of money irrespective of the quality of the work. I have watched thousands of such campaigns, and they fail in the long run for sure. Always keep your focus on very high-quality link building; I mean, build natural links for your targeted website. Choose 4-5 keywords and build a very reasonable number of links over a period of time, so that Google and the other search engines don't grow suspicious of your activities. Your anchor text should be relevant and thematic. Get links from high-PR pages, and the third-party link page should be of good quality; I mean it shouldn't be made only for link-building purposes. Those pages should have genuine relevancy and web presence, otherwise your link will have no value and it's a simple waste of your time and your client's valuable money.

Don't rely only on 3-way link building; try to get some links from dofollow blogs through genuinely useful comments, which count as 1-way links of very, very high value.

Remember, Google Caffeine is updating the algorithm so that sites are indexed better for deeper links. And beware, 3-way link builders: the old approach to link building will be gone very soon. Traditional methods will be caught by the Google search engine, especially after the algorithm update, and fake links will not only be valueless, they will severely damage your SERP ranking. So the time has come for all of us to think about doing quality work, not just chasing quantity and money.

Monday, April 19, 2010

GWT Now Shows CTR & Position in SERPs!

I picked a term for which I know I have held the top 5 positions at various times, and it's interesting to see the clickthrough rate on particular keyword searches, and how many clicks the top position in Google gets compared to the number 3 position, number 4 and number 5.

Position   Impressions   Clicks   CTR
1          58            46       79%
2          91            46       51%
3          210           73       35%
4          260           46       18%
5          110           12       11%

(Reading the raw figures as impressions, clicks and clickthrough rate: CTR = clicks / impressions, e.g. 46 / 58 ≈ 79%.)

Obviously, this is just one example; it will take a while to dig into the new data and look at an average, but it shows a number 1 position getting nearly 30 percentage points more of the clicks than a number 2 ranking. You might find some useful nuggets of information in Google Webmaster Tools for your own site.

Of course, clickthrough rate can be skewed by any number of factors: the nature of the query, or how compelling the calls to action in your title and meta description are, to name just a couple.

Friday, April 16, 2010

SEO URLs Structure

URL structure is one of the most important things in the field of search engine optimization. We have all heard of "SEO-friendly URLs", but how many of us have properly analyzed what that means?

Search engine crawlers always prefer simple, static URLs to URLs full of query strings; in a word, dynamic URLs. The examples below make this clear.

http://www.mydomainname.com/product.php?id_1=2m ……….. Dynamic URL

http://www.mydomainname.com/directory1/directory2.php .......... Static URL

Search engines prefer the second one for its simplicity; dynamic URLs tend to be crawled reluctantly by all search engines because of their complex structure. From the SEO point of view, the placement of important, thematic keywords in the URL carries heavy weight and helps the site rank better in the SERP.

Suppose your chosen keyword for a particular page of your site is mls listing services. You are determined to use this keyword for this page: http://www.ismartmls.com/mls-listings.php

Look at the above URL structure with proper attention: the main keyword for the targeted page is inserted right into the URL. What more do you need? Your SEO for this particular page is 75% done at the initial stage. Nowadays many SEO companies ignore this simple fact, and in the end they simply lose SEO value for the website and ultimately can't satisfy their valuable clients even after long, hard work.

Also look at the pattern of keyword placement in the URL above: the words after the main domain are joined with the '-' character. There is a debate here; some people argue for the underscore '_' and some for the dash '-'. As an SEO lover I have had success with both, yet I prefer '-', because search engines treat a dash as a word separator, so "mls-listings" is read as the separate words "mls listings", whereas words joined with '_' tend to be run together and treated as a single term.

So be careful when determining the URL structure of a website you intend to do SEO for, because proper URL structure and keyword placement in the URL get your job nearly done.

Also note that search engine crawlers still favor extensions like .htm, .html and .php over .asp, .aspx and others. So if your site is built with a more complex setup than those above, sit with the designers and programmers and ask them to serve the site with SEO-friendly extensions, if you ultimately need to satisfy your SEO clients.

Wednesday, April 14, 2010

Video XML Sitemap – A Way to Grow the Visibility of Your Site

Most of us who have been doing SEO for a long time are well acquainted with the general XML sitemap. We all know that a normal XML sitemap is created to improve the crawling and indexing of a text-based website. But have we considered whether there is any option to rank our site through the embedded video content of a website? This question grows day by day in an SEO expert's mind. And where there is a question, there is a solution, as we all know; the conclusion is that a video XML sitemap can solve this problem.

We all know that search engines are generally reluctant to crawl the video content of a site. You may have a website containing very important video content provided by the client, with clear instructions from the client to rank those videos in the SERP. There are also clients who demand SERP results for very competitive keywords, like "business" or "internet". A video XML sitemap can help here in a better way.

Video Sitemap Guidelines

A video sitemap should contain only URLs that refer to the video content of the site: web pages with embedded video, or URLs of players for the video.

If we have multiple videos on the same HTML page, we need to use separate entries for them, each with its own <video:content_loc> or <video:player_loc> to provide the information. Try to avoid hosting the same video on different URLs of the same site.

Don't use more than 10,000 video items in a single sitemap. You can use multiple video sitemaps; no nesting is required here.

.mpg, .mpeg, .mp4, .mov, .wmv, .asf, .avi, .ra, .ram, .rm, .flv and .swf are the video file types that Google can crawl. One of the most effective ways to promote a Flash site is to create a video XML sitemap for all the Flash content; it will surely yield results. All files should be accessible over HTTP.

We have to keep in mind that no URL used in the video XML sitemap should be blocked by the robots.txt file.

<video:title> and <video:description> are the two most important tags for the Google web crawler. Here you can put your most competitive keywords to help generate search engine ranking.
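
To pull this together, here is a minimal sketch of a one-entry video sitemap using Google's sitemap-video namespace; all the example.com URLs, file names and text are placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <!-- the page the video is embedded on -->
    <loc>http://www.example.com/videos/product-tour.html</loc>
    <video:video>
      <!-- direct link to the video file (or use video:player_loc instead) -->
      <video:content_loc>http://www.example.com/media/product-tour.flv</video:content_loc>
      <video:title>Product tour video</video:title>
      <video:description>Short, keyword-rich description of the video.</video:description>
    </video:video>
  </url>
</urlset>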


Tuesday, April 13, 2010

The Methods Of Negative SEO

We SEO-loving people always think about boosting our key phrase rankings in the SERP, but how many of us know about negative SEO? Probably very few…

Negative SEO is the tactic of damaging a core competitor's key phrase ranking in the search results to boost your own: it reduces a page's ranking in the search results.

Why do Negative SEO?

This not-yet-favorite SEO tactic is mainly used for the following two reasons:

1. To downgrade your competitor in the search rankings and SERP. This technique is called SERP bubbling. It is generally done alongside normal SEO, so you improve your own site at the same time as suppressing your competitor to boost your own ranking.
2. To conceal news about yourself or your company. Suppose bad things about you are being spread around the web; as a company owner, celebrity or famous person you obviously don't want them spread further, as they can severely damage your reputation. So you downgrade those who are spreading them by reducing those particular pages' rankings, so that fewer people become aware of them. This process is called Online Reputation Management (ORM/SERM).

The Process of Doing Negative SEO


There are a number of ways to eliminate a page from the search engine rankings. Some of the methods listed are legitimate, good practice; others are sneaky, immoral and probably illegal. This information is provided so that you may prevent your site from becoming a victim of negative SEO, not to condone the practice.
1. Removing aberrant content
This is the easiest method. If an insulting post has been made on a forum, you can ask the moderators to remove the content. In general, forum moderators and blog owners are quite happy to delete potentially damaging posts and comments.

2. Promote non-damaging content - "Insulation"
This involves creating or promoting non harmful pages, for example, if a page contains negative messages about your client, you can create positive message content, and do SEO to promote those pages higher up, and force the bad press down the rankings where it will have less impact.
Similarly, you can do this to business competitors. If you sell product A and your competitor sells products A and B, you can promote another company that sells only product B: one that is not in competition with you, but is in competition with the business you are doing negative SEO on.
This method is sometimes called Google Insulation.

3. Google Bowling
This is a technique designed to remove a site from the SERPs by making Google believe the site is spammy. There are two ways this can be done. One is to add links to the site from lots of bad neighbourhoods, link farms and automatically generated spammy pages. If thousands of links back to the site appear within a few days and show up in Google's results, this can trigger a spam alert and affect the rankings of the site. The other way is to find a page on your competitor's website that has dynamic URLs but serves the same content. For example, if they have a page with the URL http://mycompetitors.com/index.php?page=11, and the URL changed to page=12 returns the same content, then they are vulnerable to Google bowling by URL manipulation. What is done then is to create hundreds of links with slightly different URLs but the same content, post these links liberally around forums, blogs, directories and link farms, and sooner or later Google will tag the site as black hat.

4. Infect their site
If a site is infected with a virus, it is flagged in the SERPs as potentially dangerous. A similar effect can be achieved by exploiting cross-site scripting (XSS) vulnerabilities on pages that display content taken from posted form elements or query strings. For example, if a site has a page like search.asp?keyword=mysearch and the page itself says 'there are no results for mysearch', then the link can be manipulated to something like search.asp?keyword=<script>…</script>. If that link is posted on a web page, anyone clicking on it gets the search results page, and where it says 'there are no results for XX' the JavaScript is inserted into the page content and runs with the same security level as the main page itself. (The standard defence is to HTML-encode anything echoed back from the query string.) When Google picks this link up, it may flag the site as infected and possibly remove it from the search results.

5. Tattling
This involves informing Google that a site contravenes its guidelines. Usually this means reporting paid-for links (which you yourself could set up without the target site's involvement) or grey hat SEO tactics used on the site. Another form of tattling is to claim copyright theft of content or images.

6. Guilty by Association
This method involves making your own spammy site, the spammier the better, using a URL similar to your competitor's and, if possible, the same domain registrar and hosting services. Copying the meta tags and home page content of your target site is also useful. Then you do everything in your power to get the site banned (it's not hard). Once you have done this, you install a 301 redirect to your competitor's site and sit back and watch it slide down the rankings like a pig on a greasy pole! This is especially effective if your black site has the same pages as the target site and you set up individual 301 redirects to the target site.

7. False duplicated content
The way this is done is to create a site with the same content as your competitor's, but get the new content crawled before your competitor's. For example, if your competitor changes their home page, you change your honeypot site's home page to the same content and meta tags, then submit a sitemap with just that page on it to Google, Bing and Yahoo Site Explorer, so that your copy is indexed first and your competitor's content is ignored as duplicate content when the search engines get around to indexing it. This is very hard to defend against; only reporting the site as phishing content can save you here. Canonical URLs can also help.

8. Denial of Service Attacks (DOS)
This method of attack uses several different computers to simultaneously flood the target website with requests, so that the volume of traffic uses up the website's bandwidth and essentially cuts off access from the rest of the world. Distributed denial of service attacks (DDoS) are even more damaging because they use hundreds or even thousands of virus-infected zombie PCs, all on different IP addresses, to attack the target site.
If a site is unreachable when Google tries to crawl it, this has negative consequences for the rankings of the site.

9. Click Fraud
If your competitor has AdWords adverts running for their site, you can click on them to use up their budget and affect the number of genuine visitors. Google is generally pretty good at detecting this, so it has limited impact; however, a team of people each doing 3 or 4 clicks a day soon adds up.
Another click mechanism is to get all your friends to click on your own site, or on the sites just below your competitor's. There is some evidence to suggest that the number of clicks affects your rankings.

10. Adsense Banning
If your target site uses Google AdSense, you can click on the adverts on their site many times until the AdSense account is suspended. It's much easier to get someone's AdSense account suspended than it is to get it resumed following allegations of click fraud.

11. Black social bookmarking
This method uses social networking sites like Twitter and Facebook: create lots of bogus accounts, then use them to create spammy links to the target site with phrases like 'viagra', 'porn', 'warez', 'crackz', 'gambling' etc. This is an extension of Google bowling, taking advantage of Google's new real-time search of social networking sites.

Monday, April 12, 2010

Updated SEO News, SEO Tips & Techniques - SEO Discussion Blog

There are lots of websites on the web, and it is no joke to compete with them and bring your own up in the SERPs (Search Engine Result Pages). You have to apply website optimization techniques with sense to drive lots of traffic towards your site. Keywords are the main thing: group them by analyzing the business the website serves. Then each page of the website gets its own targeted keywords, and SEO techniques are applied to them to gain high rankings for those keywords and bring the pages up in the SERPs.

Website Analysis & Domain Age: If you analyze your website well, more than half of your job is done. The domain name is very important; your main targeted keyword in the URL matters a lot in terms of ranking. Domain age is also very important, as search engine bots tend to prefer an older domain to a recently created one.

URL Structure: SEO-friendly URLs are required for your website, and static URLs are always better. Always use a relevant keyword in the URL to fetch maximum benefit. If your website contains lots of dynamic URLs, work to convert them into static ones, because search engines don't prefer URLs with query strings.

Meta Tags: Title is the most important Meta tag for your website, as it appears in the SERP. Each page should have three targeted keywords; at least two of them should appear in the Title, with the most relevant one at the beginning. The Title should be in Title Case and at most 65-70 characters (with spaces). Also include one strategic keyword in the Meta description, and don't use more than 160 characters (with spaces) there; this tag is not so important for search engines, though it can carry a little weight. The Meta keywords tag is less important still: don't use more than three keywords there, and don't spam.

Header Tags: After the Title tag, H1 is the most important tag for gaining SEO benefit for your site. Include one priority keyword there, and if your layout permits, also use H2, H3 and so on up to H6, though this is not always necessary.
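
A minimal sketch of how the Meta and header advice above might look in a page's markup; the keywords, description text and brand name are placeholders:

<head>
<title>Mls Listing Services and Home Search | ExampleBrand</title>
<meta name="description" content="One strategic keyword worked into a compelling sentence of under 160 characters.">
<meta name="keywords" content="mls listing services, home search, real estate">
</head>
<body>
<h1>MLS Listing Services</h1>
...
</body>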

Content: Lean on the colloquial saying "content is king": always use fresh, well-researched, keyword-rich content to draw more traffic to your website. When you write content, keep your focus on keyword prominence, keyword density and keyword weight to make it more SEO-friendly. Don't use your optimized keywords at more than 3-5% density, otherwise your content will lose SEO weight.

Link Popularity: How many back-links your website has is a very crucial factor in gaining SERP ranking. Analyze your core competitors based on their inbound links, and try to gain some links from the sites your competitors got theirs from; it will surely carry your site several steps further. Try to gain links from relevant sites, and if you get plenty of quality links (1-way or 3-way), search engines will automatically give your site more weight than other sites with less important, spammy links.



Thursday, August 20, 2009

Basic SEO Tips - Part 2

Competitor Analysis - Suppose you have a product or service with some unique selling points: you should analyze your online competition so you can target your potential customers efficiently. From an SEO point of view, you need to analyze which other websites are updating regularly relative to your own. First recognize who they are: build a list by entering all of your primary and secondary key phrases into the major search engines. Update the list on a regular basis to keep your analysis current. Evaluate each site cautiously to analyze your competitors on the following basis:-
Keyword / Key Phrase Density Analysis – A list of your competitors' keywords is essential (as per the discussion above) to validate your own analysis.
Pagerank Checkers – For a larger number of competitors, use one of the various tools to check their PageRank for your own assessment.
Search Engine Exposure - Rather than visiting each search engine independently, you can visit websites like Netconcepts that offer tools to determine how many of a site's internal pages have been cached by each major search engine.
Whois & Contact Forms - The WHOIS databases will permit you to match website owners with real-world businesses that are using multiple sites to boost sales possibilities.
Site Age – web.archive.org is the place where you can see the details of your competitor's site: when they started their business and how much they have developed since, so you can also apply those changes to your own sites.
Quote Checker & Mystery Shopper - If your competitors run an online quote system, it's very easy to compare your product's price level against theirs. If they only use a contact form and a telephone call-back, you could pose as a prospect requiring a quote.
Design & Website: Website design is one of the most important areas for good SEO practice:
Understanding of HTML - HTML is the computer language that facilitates the creation of internet web pages; it is fundamentally a text file with a series of short defining codes around the text.
Designing a Compelling Homepage - This is the vital page of the website, as most users will land on it rather than on the internal pages of your website.
Simple Navigation - The most vital feature of usability is a clean and simple navigational structure. Your structure must be consistent across all pages so users have a clear perception of how to find their way between sections and back again.
Simple Page Structure - Try to break the areas of your page up into a top border, side border, bottom border and main body.
Accessibility - Always check the W3C markup tools for user and search engine accessibility.
Implementing a Linking Strategy, Meta tag implementation and usability, content optimization and link building are already discussed in my previous posts; please check them and read them carefully.
Statistics, Logs, Web Analytics - Website statistics can offer you a huge range of information, such as the behaviour of your users and your lead and sale conversion ratios. Most shared hosting comes with statistics options such as SmartStats, AWStats and Webalizer. There are also online services providing similar information, such as Google Analytics, Onestat and StatCounter.

Saturday, August 15, 2009

Content Optimization Overview

"CONTENT IS KING": this is perhaps the single most important factor in ranking your website highly on the search engines. While all of the other factors will help get your website into the top positions, it is your content that will sell your product or service, and it is your content that the search engines will be reading when they take their "snapshot" of your site and determine where it should be placed relative to the billions of pages on the internet.


Important optimization factors:

  1. Keyword Prominence, Density, Proximity and Frequency
  2. Heading Tags
  3. Anchor Text Link / Inline Text Links
  4. Special Text (bold, colored, etc.)


Key Concepts: Keyword Prominence, Density, Proximity and Frequency:


Prominence: Prominence is a measure of keyword importance that indicates how close a keyword is to the beginning of the analyzed area (e.g. the page title). If a word is used at the beginning of the title, a heading, or close to the beginning of the visible text of the page, it is considered more important than other words. Prominence is calculated separately for each important area. HTML markup lets you emphasize certain document areas over others: the most important items are placed at the top, and their importance gradually reduces towards the bottom. Keyword prominence works much the same way. Usually, the closer a keyword is to the top of a page and to the beginning of a sentence, the higher its prominence. However, search engines also check whether the keyword is present in the middle and at the bottom of the page, and you should be aware of that.


Frequency – The number of times the keyword is used in the analyzed area of the page. For example, if the keyword is "website design" and the analyzed area's content is "Provides website design and development services including flash website design, logo and corporate ID design", then the keyword frequency of "website design" = 2.


Keyword Weight / Density – Keyword density, or weight, is a measure of how often a keyword is found in a specific area of the web page (title, heading, anchor name, visible text etc.) relative to all other words. Formula: (number of words in the keyword phrase × frequency) / total words in the analyzed area.
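
To make the formula concrete, take the "website design" snippet from the Frequency example above: the phrase is 2 words long, it occurs 2 times, and the analyzed area contains 15 words in total, so density = (2 × 2) / 15 ≈ 27%.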


Proximity – Keyword proximity refers to how close together the keywords that make up your key phrase are. For example, if the keyword is "electric iron": "International Iron Inc has been selling electric irons for 7 years. International Iron is a company that specializes in electric devices, irons and other stuff…" You can see that we try to use the words "electric" and "iron" close to each other in the text, so that the spider understands we are targeting people who search for these kinds of irons.


Heading:


When using your heading tags try to follow these rules:

  1. Never use the same tag twice on a single page
  2. Try to be concise with your wording
  3. Use heading tags only when appropriate. If bold text will do then go that route
  4. Don’t use CSS to mask heading tags


Try to be concise with your wording. If you are targeting a 2-word keyword phrase and you write a heading that is 10 words long, your keyword phrase makes up only about 20% of the total verbiage. With a 4-word heading, on the other hand, you would have 50% density and increased priority given to the keyword phrase you are targeting.


Special Text

To emphasize keywords, use special formatting such as bold, underline, italic or a different color as needed. Search engines do recognize keywords with special formatting.


Don’ts

Do not apply special formatting unnecessarily here and there, as it looks untidy and doesn't make any sense to readers. Moreover, search engines may penalize you for over-optimization.

Wednesday, August 12, 2009

Basic SEO Tips For Beginners:

Part 1

Introduction - Simply put, SEO is the process of improving the number of visitors to a website via search engines. By optimizing your website for targeted, specific key phrases used by your target customers, it is possible for search engines to rank your website more highly than similar non-optimized competitors. According to search engine strategy and principles, SEO should be viewed as a component of your overall professional internet marketing strategy, used ethically to improve the quality of your visitors' experience. The first step is to recognize how search engines work.
Search Engine Basics - A search engine is an interactive platform that lets anybody enter a search query and get website information drawn from billions of web pages, files, videos, images and music files. Most people are well acquainted with Google, Yahoo and MSN, but there are also literally hundreds of less well-known, specialist search engines providing similar services. When you visit a search engine, results are traditionally displayed as blue links with a short description about each website, relating directly to the user's search query.
How Do Search Engines Work? - Search engines use programmed mathematical algorithms to rank and compare web pages of analogous content. The algorithms are very complex, and their search bots automatically look for specific information when visiting a website, such as the robots.txt file, the sitemap.xml file and WHOIS data. They do this to find new content in microseconds and to make sure their own listings are highly up to date and relevant for users. The data is stored by the search engine company in huge server data centers. Just as every company keeps its work secure and hidden, so the search engine companies guard their original mathematical formulae.
How Do Search Engines Present Relevant Results? – Originally, search engines mainly counted the number of links a website had from other websites, called inbound links, and ranked websites on that basis. To combat pages carrying huge numbers of links, called "link farms", the algorithms became more refined. Nowadays links are less significant, and the textual relevancy of the words, paragraphs, pages and entire theme of a website is decisive in achieving high search engine results. Search engines also utilize advanced anti-spam factors to make certain that users are presented with the most relevant, quality results possible for their search.
Key Phrase Analysis & Selection - The next and most important step is to work out the keywords associated with your product or service that your target customers are typing into search engines. Only then can you effectively strategize and optimize a website according to your market trend and customers' demand. Key phrase selection is the first and primary step in internet marketing. Search engines use mathematical algorithms to evaluate web pages in order to rank them against a user's search query. If you speculate incorrectly or without research and target key phrases that don't matter to buyers, your valiant effort goes in vain. On the other hand, if you target the right blend of keywords and phrases (before you even design your website), you will maximize your chances of higher search rankings and create prospects to sell to. By using keyword selection tools, advertisers can identify which search terms are popular and competitive. Keyword tools are invaluable in identifying a range of niche search terms that can help optimize a website for higher search engine rankings and more website visitors. These tools can also produce derivatives and synonyms, common spelling mistakes, and comparative competitiveness indices showing whether top search listings for a particular phrase are hard or easy to achieve.
Once you have used your market knowledge and keyword tools to validate the search volumes of phrases, make a list of your top 10 phrases, ranked by search volume. Invariably there will also be derivatives of your top ten target phrases.

This is just the 1st part of the discussion. In my next post I will discuss some more important parts of SEO, so for the time being stay tuned guys :)

Sunday, August 9, 2009

301 Redirection

One of the most effective and search-engine-friendly methods of webpage redirection is the 301 redirect. It's not that hard to implement, and it should preserve your search engine rankings for the redirected page. If you need to change a file name or move a file to another location, a 301 redirect is the safest way. In HTTP status terms, "301" indicates a "Permanent" redirect. A couple of 301 redirection methods are described below:

Old domain to New domain Redirection (htaccess redirect):

To redirect all the directories and pages of your old domain to your new domain, create a .htaccess file, which needs to be placed in the root directory of your old website (i.e. the same directory where your index file is placed).

Options +FollowSymLinks
RewriteEngine on
# Send every request, with its path intact, to the new domain as a permanent (301) redirect
RewriteRule (.*) http://www.newdomain.com/$1 [R=301,L]

www.newdomain.com in the above code must be replaced with your actual domain name.

Redirect to www (htaccess redirect)


To redirect all requests coming in to domain.com to www.domain.com, create a .htaccess file with the code below. The .htaccess file needs to be placed in the root directory of your old website (i.e. the same directory where your index file is placed).

Options +FollowSymlinks
RewriteEngine on
# Match requests for the bare domain (no www), case-insensitively
RewriteCond %{HTTP_HOST} ^domain\.com [NC]
# Permanently redirect them to the www host, keeping the path
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,NC]

domain.com and www.domain.com in the above code must be replaced with your actual domain name.

Note: this .htaccess method of redirection works ONLY on Linux servers running Apache with the mod_rewrite module enabled.

====================================================

There are other redirection methods, but these are the most sensible redirects for SEO purposes.
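
For a single renamed or moved page, the scenario mentioned at the top of this post, Apache's mod_alias offers an even simpler one-line permanent redirect; a minimal sketch with placeholder paths:

# Permanently redirect one old page to its new location (requires mod_alias)
Redirect 301 /old-page.html http://www.domain.com/new-page.html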


Saturday, August 8, 2009

Static Vs Dynamic URLs

SEO is one of the hot topics in the world right now, and for each and every website the URL is a basic element. URLs get extra attention when we think from the SEO point of view, because URL structure is very, very important for search engine spiders to notice your site among the millions of websites on the web.

Two types of URL strike our mind: static and dynamic. From the SEO point of view both types of URL have their importance, yet static URLs have extra added advantages over dynamic URLs for the following reasons:

1. Static URLs have higher click-through rates in the SERPs, emails, web pages, etc.
2. Keyword prominence and relevancy are higher,
3. Easier to copy, paste and handle on or offline,
4. Easy to remember and usable in branding and offline media,
5. Gives users an accurate sense of what they're about to see on the page,
6. When linked to directly in URL format, the URL itself provides good anchor text that helps the page rank higher,
7. All 4 of the major search engines, and many minor engines, generally handle static URLs more easily than dynamic URLs, particularly when the latter contain multiple parameters.

There are plenty of other facts relating to this topic, some of them debatable, but I am not going to debate anybody here; I am just stating my view and sharing my thoughts and experiences.

So I suggest every SEO hand go through the website and divide the pages into SEO and non-SEO pages. Then check whether the SEO pages have dynamic URLs. If they do, there are many URL rewrite tools on the web (e.g. webconfs.com) with which you can easily rewrite all the dynamic pages as static pages: you just have to create a .htaccess file, hard-code the rules, and upload the file to the root folder of the server, and your job is simply done. Now relax; the rest will be done by the search engine spider.
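
As an illustration of what such a hand-coded .htaccess rule might look like; a minimal sketch, assuming an Apache server with mod_rewrite enabled, reusing the product.php?id_1=… dynamic URL from my earlier post as a placeholder:

Options +FollowSymLinks
RewriteEngine on
# Serve the static-looking URL /product/2m from the real dynamic script
RewriteRule ^product/([A-Za-z0-9-]+)/?$ product.php?id_1=$1 [L]

With this in place you can link to http://www.mydomainname.com/product/2m while the server quietly fetches product.php?id_1=2m.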

Always remember that search engines prefer static URLs and treat them well. So, before investing lots of your potential SEO time, fix the URL structure first.

Wednesday, August 5, 2009

Top 5 On-Page SEO Factors

Search engine optimization is nowadays becoming a tough ask due to sheer competition. If you look at the theme of a particular website, you can easily collect the relevant keywords for that site. In most cases you target a keyword that will ultimately add value to the business and satisfy your client, but you will often find that the keyword you choose is very competitive, and it is no cakewalk to get a top position for that phrase. So you have to toil very hard to get within the top 10 rankings for your targeted key phrases, and the main steps are as follows:-

1. Keyword Use in Title Tag - Placing the targeted search term or phrase in the title tag of the web page's HTML header is very effective.

2. Keyword Use in Body Text - Using the targeted search term in the visible HTML text of the page also gives some boost to the ranking factors.

3. Keyword Use in H1 Tag - Creating an H1 tag with the targeted search term/phrase is a very meaningful step.

4. Keyword Use in Domain Name - Including the targeted term/phrase in the registered domain name, i.e. keyword.com, gets 50% of your job done.

5. Keyword Use in Page URL - Including target terms in the webpage URL, e.g. sherpawebstudios.com/keyword-phrase.

There are many more factors relating to this topic, which I will discuss in a later post.


Monday, August 3, 2009

How A Search Engine Crawls A Site

It is an interesting question for all SEO experts: when a search engine visits a particular website, what does it see, and what are its paths of observation? Well, different people have different opinions about it; the total process has no end and is a very debatable topic. I will discuss just some points, focusing on the Google search engine's view of a particular webpage.


A search engine runs crawlers or bots, which are nothing but automated programs following algorithms with fixed constraints and instructions for examining a webpage. On its first visit to a webpage, a crawler wants to seek out HTML pages and ignore all other MIME types. In order to request only HTML resources, a crawler can issue an HTTP HEAD request to determine a web resource's MIME type before requesting the entire resource with a GET request. To avoid making numerous HEAD requests, a crawler may alternatively examine the URL and only request the resource if the URL ends with .html, .htm or a slash. If the crawler finds everything in order, it then reads the robots.txt file to check the instructions there. robots.txt is a human-made text file in which internal files, folders, URLs, images and other resources can be blocked, and the search engine crawler simply ignores those pages.
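
For anyone who has not written one, a robots.txt file is as simple as it sounds; a minimal sketch with placeholder paths, placed at the root of the site:

# Applies to every crawler
User-agent: *
# Block internal areas the crawler should ignore
Disallow: /admin/
Disallow: /private/
Disallow: /images/internal/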


The crawler then moves on to the other, unblocked pages and examines them according to its rules. It crawls the webpage from top to bottom and from left to right. It first looks at the URL of the page, then checks all the Meta data (Title, Description, Keywords), the total size of the file, all the text on the page, and the total number of words and distinct words. Then it checks all the links on the page thoroughly. We have to remember that crawlers always like simple text content they can move through freely; complex designs, scripts and images with no ALT text are passed over. So, to keep a webpage crawlable, these rules should be followed by page designers, programmers and, obviously, SEO experts.
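
On the ALT point, giving every meaningful image a short descriptive attribute is the simplest fix; a one-line sketch with placeholder names:

<img src="company-logo.png" alt="Example Company logo">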

Friday, July 31, 2009

What Is PageRank Sculpting?

Until just a few days ago in the world of SEO, the rel="nofollow" attribute was a palatable way to stop the search engine crawler from moving freely through all the pages of a website via its internal links; simply because someone wanted to keep the PageRank on a targeted page rather than let it flow to the other pages. It not only helped sites rank better in the SERPs; it also looked good in the client's eyes.

Under the old methodology Google itself accepted the process, but recently Google has changed its algorithm, and nowadays Google judges a website by plenty of other factors, not merely a link's nofollow attribute.

Google now gives special attention to making great content that attracts links in the first place, and to choosing a site architecture that makes your site usable and crawlable for humans and search engines. Visitors and search engines should be allowed to move freely through all the internal pages of a site, so the value of each and every page can be counted. Obviously, some secure pages, such as client login or testimonial submission, can still be marked nofollow if you choose not to send search engines there; no problem at all with that.
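
For anyone unfamiliar with the attribute, it sits on the individual link; a one-line sketch with a placeholder URL:

<a href="/client-login" rel="nofollow">Client login</a>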



So, finally, avoid the malpractice of scattering the nofollow attribute across the internal pages of your site: the search engine crawler will not be bothered by it, and your site will not gain any extra value.

Thursday, July 30, 2009

Why Is Blogging Important For Business?

A blog, as an essential social marketing tool, is a great way to communicate with colleagues, friends and, more particularly, with potential customers, and to turn them into regular buyers as well. In recent times blogging has become a main platform for making money online; some people use it as a basic source of their income.

Blogging is considered important for most businesses because a blog with good, authentic information can help drive traffic to your main website. By providing a link to your main site in the blog, it drives traffic through and offers double exposure to the world and to your customers and buyers as well.

Remember, without traffic your website will never get good exposure on the web, and your blog becomes the main feeder for your site. Good blogs therefore need to be updated every week, ideally every three days or so, to make the search engines keep coming back to your blog. If you provide valuable information relating to your own business, it will build up credibility in the minds of buyers.




Once you win your readers' trust, you can reach out with your business and recommend things to buy. Today blogging is a profitable business that helps users earn money. Rather than thinking of it as a mere hobby, treat blogging as a business and a good medium for making money.