Monday, August 17, 2009

Stop Worrying And Start Living...

We spend most of our precious time thinking about the future and grieving over the vast past. What is the use of dwelling on those two eternities? Just keep one thing in mind: "Today is the first day of the rest of my life."

I want you to think of your life as an hourglass. There are thousands of grains of sand in the top of the hourglass, and they all pass slowly and evenly through the narrow neck in the middle. Nothing you or I could do would make more than one grain pass through that narrow neck without impairing the hourglass. We are all like this hourglass.

When we start in the morning, there are hundreds of tasks we feel we must accomplish that day. But if we do not take them one at a time and let them pass through the day slowly and evenly, as the grains of sand pass through the narrow neck of the hourglass, then we are bound to wreck our own physical and mental structure.

You and I are standing this very second at the meeting place of two eternities: the vast past that has endured forever, and the future that is plunging on to the last syllable of recorded time. We can't possibly live in either of those eternities, not even for one split second. But by trying to do so, we can wreck both our bodies and our minds. So let's be content to live the only time we can possibly live. Stop Worrying And Start Living... :)

Wednesday, August 05, 2009

Keywords play a major role in SEO

Keywords in "title" tag This is one of the most important places to have a keyword because what is written inside the "title" tag shows in search results as your page title. The title tag must be short (6 or 7 words at most) and the keyword must be near the beginning.

Keywords in URL Keywords in URLs help rank well on SE’s - e.g. - http://domainname.com/seo-services.html, where “SEO services” is the keyword phrase you attempt to rank well for. But if you don't have the keywords in other parts of the document, don't rely on having them in the URL.

Keyword density in document text A density of 3-7% is best for major keywords and 1-2% for minor ones. A keyword density over 10% is suspicious and looks more like keyword stuffing than naturally written text.
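As a rough illustration, keyword density is just the keyword's share of all words on the page. A minimal sketch in Python (the tokenizer here is a simplification; real engines parse the HTML and weigh where the keyword appears):

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that are exactly `keyword`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words)

print(keyword_density("dog food for your dog", "dog"))  # 40.0
```

By this measure, a 500-word page would need the keyword about 15-35 times to land in the 3-7% range.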

Keywords in anchor text The anchor text or link label is the visible, clickable text in a hyperlink. The words contained in the anchor text can determine the ranking the linked page receives from search engines. Keywords in the anchor text of inbound links help the target page's ranking as well.

Keywords in headings ("H1", "H2", etc. tags) Headings offer special clues to the search engines -- H1, H2, H3 in HTML-ese. Since headlines often contain important hints to the content of the webpage, search engines take note of any keywords found here.

Keywords in the beginning of a document Also count, though not as much as anchor text, the title tag or headings. However, keep in mind that the beginning of a document does not necessarily mean the first paragraph -- for instance, if you use tables, the first paragraph of text might be in the second half of the table.

Keywords in "alt" tags Spiders don't read images, but they do read their textual descriptions in the "alt" tag, so if you have images on your page, fill in the "alt" tag with some keywords about them.

Keywords in metatags Less important, especially for Google. Yahoo! and MSN still rely on them, so if you are optimizing for Yahoo! or MSN, fill these tags in properly. In any case, filling these tags properly will not hurt, so do it.

Keyword proximity Keyword proximity measures how close together the keywords appear in the text. It is best if they come immediately one after the other (e.g. "dog food"), with no other words between them. For instance, if you have "dog" in the first paragraph and "food" in the third paragraph, this also counts, but not as much as having the phrase "dog food" with no words in between. Keyword proximity applies to keyword phrases that consist of 2 or more words.
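The idea can be sketched as a function that measures the smallest number of intervening words between two keywords. This is a hypothetical helper for illustration, not how any search engine actually scores proximity:

```python
def min_keyword_gap(text, first, second):
    """Smallest number of words between `first` and `second` (0 = adjacent)."""
    words = text.lower().split()
    first_positions = [i for i, w in enumerate(words) if w == first]
    second_positions = [j for j, w in enumerate(words) if w == second]
    gaps = [abs(j - i) - 1 for i in first_positions for j in second_positions]
    return min(gaps) if gaps else None  # None: one of the words is missing

print(min_keyword_gap("dog food", "dog", "food"))            # 0
print(min_keyword_gap("dog likes his food", "dog", "food"))  # 2
```

A gap of 0 corresponds to the ideal case of the exact phrase; larger gaps count for less.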

Keyword phrases In addition to single keywords, you can optimize for keyword phrases that consist of several words -- e.g. "SEO services". It is best when the keyword phrases you optimize for are popular ones, so you can get a lot of exact matches of the search string, but sometimes it makes more sense to optimize for 2 or 3 separate keywords ("SEO" and "services") than for one phrase that might only occasionally get an exact match.

Secondary keywords Optimizing for secondary keywords can be a gold mine: while everybody else is optimizing for the most popular keywords, there is less competition (and probably more hits) for pages that are optimized for the minor ones. For instance, "real estate new jersey" might get a thousand times fewer searches than "real estate" alone, but if you are operating in New Jersey, you will get fewer but considerably better-targeted visitors.

Keyword stemming For English this is not much of a factor, because words that stem from the same root (e.g. dog, dogs, doggy, etc.) are considered related, and if you have "dog" on your page you will get hits for "dogs" and "doggy" as well. For other languages, however, keyword stemming can be an issue: different words that stem from the same root may be treated as unrelated, and you might need to optimize for all of them.
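To see what stemming does, here is a deliberately naive suffix-stripper in Python. It is a toy illustration only; real engines use proper stemming algorithms such as Porter's:

```python
def naive_stem(word):
    """Toy stemmer: strip a few common English suffixes (illustration only)."""
    word = word.lower()
    for suffix in ("ies", "gy", "es", "s"):
        # Keep at least a 3-letter stem so short words survive intact.
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

print(naive_stem("dogs"))   # dog
print(naive_stem("doggy"))  # dog
print(naive_stem("dog"))    # dog
```

Because all three variants reduce to the same stem, a page containing "dog" can match queries for "dogs" and "doggy" too.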

Keyword synonyms Optimizing for synonyms of the target keywords, in addition to the main keywords, is good for sites in English, for which search engines are smart enough to use synonyms as well when ranking sites. For many other languages, synonyms are not taken into account when calculating rankings and relevancy.

Keyword mistypes Spelling errors are very frequent, and if you know that your target keywords have popular misspellings or alternative spellings (e.g. Christmas and Xmas), you might be tempted to optimize for them. Yes, this might get you some more traffic, but having spelling mistakes on your site does not make a good impression, so you'd better not do it, or do it only in the metatags.

Keyword dilution When you optimize for an excessive number of keywords, especially unrelated ones, the performance of all your keywords suffers, and even the major ones get lost (diluted) in the text.

Keyword stuffing Any artificially inflated keyword density (10% and over) is keyword stuffing, and you risk getting banned from search engines.

Wednesday, July 29, 2009

Hey, do you know about robots.txt?!

What are robots.txt files?
A robots.txt file is a plain text file (yes, the kind you make in Notepad) that resides on your server and controls a whole lot of features of your website (whatever platform it is built on). It's a simple text file with just a few lines of text, but it's so powerful that it can decide whether your website is shown on Google at all, and which parts of your website are shown to the search engines (like Google, Yahoo and MSN).

What is a Robots.txt file technically?

In order to understand what robots.txt files are, you have to first understand what a Robot (the web one) is.

A robot is technically a program from search engines like Google, Yahoo and MSN that is sent out on the internet to find new websites, index them and gather the right information about them. Robots are sometimes called "spiders", "crawlers" or even "bots".

Where do the Robots come from?

Robots are commonly sent out by search engines like Google, Yahoo, MSN, Altavista, Ask.com and others. Mainly, these are web servers of the search engines that are on a constant lookout for information on the internet. They gather information (which ultimately goes into the search engine's index) by visiting new websites, collecting new information from them, following links, and calculating and analyzing a whole lot of information along the way.

What do Robots do?

Robots mainly perform the following types of tasks.


* Site indexing - This is more like taking a copy of a new website the robot finds and storing it on the search engine's servers. It is accomplished by scanning the documents on a website and mirroring them to temporary servers.

* Code validation - This is more like comparing the website's code to W3C standards and grading it according to accuracy.

* Link checks - This includes tracing all possible links (incoming and outgoing) from indexed websites, and calculating the site's grading factors such as authority, relevance, etc.

What does a Robots.txt file do?

The robots.txt file gives commands to the robots visiting the website to help them index it and collect relevant information about it.

It's more like the helpdesk at an event, which gives visitors all the information, guidance and help they need: how to reach the venue, important places, the time schedule, a map, etc.

The commands in the robots.txt file are completely configurable by the webmaster.

Using the right commands, a webmaster can decide everything related to search engines: which search engines are allowed into the website, what information is available to them, which documents are off-limits to them, and even details like how often pages are added to the website and how often the robots should visit.
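For instance, a small hand-written robots.txt could look like this (the paths and sitemap URL here are hypothetical, and note that Crawl-delay is a non-standard directive that not every crawler honors):

```text
# Rules for all robots
User-agent: *
Disallow: /private/
Allow: /public/
Crawl-delay: 10

# Extra rules just for Google's robot
User-agent: Googlebot
Disallow: /no-google/

Sitemap: http://example.com/sitemap.xml
```

Each `User-agent` line starts a group of rules, so different robots can be given different instructions.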

Where to spot the Robots.txt file?

The robots.txt file is located in the root folder of your website. This is most often the public_html or httpdocs folder. The root folder is the topmost directory on the website that is accessible to the public.
It is critical to place the robots.txt file in the root folder. Placing it anywhere else will not make it functional.

Why are the robots.txt file and robots important to a webmaster?

Well, for a webmaster, robots.txt should be important because it helps ensure better indexing of their website, which means more information passed to the search engines and thereby better search engine rankings.

It is possible for webmasters to decide how their websites should be crawled, indexed and ranked by the search engines through well-written robots.txt files. So it gives them complete (well, almost) control over how a search engine "sees" their websites, which is crucial.

What does a robots.txt file look like? Here is an example:
User-agent: *
Allow: /searchhistory/
Disallow: /search
Disallow: /groups
Disallow: /images
Disallow: /catalogs
Disallow: /catalogues
Disallow: /news
Disallow: /nwshp
Allow: /news?btcid=
Disallow: /news?btcid=*&
Allow: /news?btaid=
Disallow: /news?btaid=*&
Disallow: /setnewsprefs?
Disallow: /index.html?
Disallow: /?
Disallow: /addurl/image?
Disallow: /pagead/
Disallow: /relpage/
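The rules above can also be tested programmatically. A minimal sketch using Python's standard-library `urllib.robotparser`, with a shortened copy of the listing fed straight into the parser:

```python
from urllib.robotparser import RobotFileParser

# A shortened copy of the rules shown above.
rules = """\
User-agent: *
Allow: /searchhistory/
Disallow: /search
Disallow: /images
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch(agent, url) answers: may this robot fetch this URL?
print(parser.can_fetch("*", "/searchhistory/"))   # True  (explicitly allowed)
print(parser.can_fetch("*", "/search?q=dogs"))    # False (disallowed)
print(parser.can_fetch("*", "/images/logo.png"))  # False (disallowed)
```

Well-behaved robots run essentially this check before fetching any page from your site.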

If you'd like to see more robots.txt files, just type the domain name followed by /robots.txt into your browser for any website you like; if it uses a robots.txt file, it will show up. (Ex: www.google.com/robots.txt, www.yahoo.com/robots.txt)