Thursday, 25 October 2012

Website Evaluation Tips


This article provides a brief description of:
1. Website evaluation issues,
2. Recommended changes to make the website search engine friendly as well as user friendly, and
3. How to get your website ready for the search engine optimization process.

Simple concept - "Small changes make a BIG difference..."


Because the algorithms, programming, and indexing techniques used by the major search engines and directories change constantly, it is always advisable to make the recommended changes to the website in order to gain faster and better rankings and a higher PageRank value.

Since most website traffic comes from search engines, implementing the proposed changes will make the website search engine friendly and help generate more qualified and relevant traffic to the website through search engines.

Search Engine Compatibility Study

Search engine spiders/crawlers/bots like standard static HTML pages. Such pages are easy to crawl and index in the search engine database, and hence help gain top rankings. Spiders do not like dynamic pages, database-driven pages, Flash pages, or websites whose internal linking is done through JavaScript.

This study reveals several serious search engine incompatibility features/elements in the site that will hamper SEO efforts:

SEO Issues


1. Server Side Details & Issues

Checkpoint 1: Hosting Details
Note the server type (Apache or IIS) and the IP address of the host; several of the fixes below depend on them.

Checkpoint 2: IP Spam Blocklist

Check whether the website sits on an IP address shared with spammers. Being on a spam block list can have a negative impact on your search engine rankings.

Solution:
Move the website onto a dedicated IP address or to a different server. The costs are minimal, and the change can be arranged through your hosting provider.

2. Duplicate Issues:

Checkpoint 1: Canonical URL

Major search engines like Google consider the 'www' and 'non-www' versions of a domain to be two independent websites and index them separately, thus creating a duplicate website problem.

Solution:
Scenario 1: IIS Solution - If your website is hosted on an IIS server, set your preferred domain through Google Webmaster Tools (after verifying the site with a Google sitemap), or place a 301 Permanent Redirect from http://xyz.com/ to http://www.xyz.com/.
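For the redirect option, here is a minimal web.config sketch. It assumes the IIS URL Rewrite module is installed (older IIS 6 servers would use a tool such as ISAPI_Rewrite instead), and xyz.com is a placeholder for your own domain:

  <configuration>
    <system.webServer>
      <rewrite>
        <rules>
          <rule name="Canonical host name" stopProcessing="true">
            <match url="(.*)" />
            <conditions>
              <add input="{HTTP_HOST}" pattern="^xyz\.com$" />
            </conditions>
            <action type="Redirect" url="http://www.xyz.com/{R:1}" redirectType="Permanent" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>
  </configuration>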

Scenario 2: Apache Solution - If your website is hosted on an Apache server, we recommend placing a 301 Permanent Redirect from http://xyz.com/ to http://www.xyz.com/ using the ".htaccess" file.
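As an illustration, a minimal ".htaccess" sketch of this redirect, assuming mod_rewrite is enabled on the server (xyz.com is again a placeholder):

  # Redirect the non-www host to the www host with a 301
  RewriteEngine On
  RewriteCond %{HTTP_HOST} ^xyz\.com$ [NC]
  RewriteRule ^(.*)$ http://www.xyz.com/$1 [R=301,L]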

Checkpoint 2: "Home" & Logo Link

If the "Home" link and the link on the "website logo" on all website pages of http://www.xyz.com are linked to its respective index.html page.

For example: on http://www.xyz.com, the "Home" link and the logo link point to http://www.xyz.com/index.html.

Google considers both of these URLs to be duplicate pages and might penalize one or both of them for duplicate content.

Solution:
In order to solve this issue, we recommend changing the "Home" link and the logo link from http://www.xyz.com/index.html to http://www.xyz.com/.
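In the HTML, the change looks like this (logo.gif is just a placeholder file name):

  <!-- Before: creates a duplicate URL for the home page -->
  <a href="http://www.xyz.com/index.html"><img src="logo.gif" alt="XYZ logo" /></a>

  <!-- After: points to the single canonical root URL -->
  <a href="http://www.xyz.com/"><img src="logo.gif" alt="XYZ logo" /></a>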

Checkpoint 3: Duplicate Domains

Check for a duplicate/mirror of your website, http://www.xyz.com. Here, http://www.xyz.com and http://www.abc.com/ share the same content, style, and design, and are therefore categorized as duplicate websites by search engines. Google has already identified this issue and has placed all pages of http://www.abc.com/ in its "supplemental results".

Solution: We recommend the following options to solve this issue:

Option 1: Drop/ Remove one of the above websites, preferably http://www.abc.com/.
Option 2: Re-design one of them to look different.
Option 3: Place a 301 Permanent Redirection from http://www.abc.com/ to http://www.xyz.com.

Checkpoint 4: Duplicate Pages

Check for internal duplicate pages/URLs within the website. Search engines do not like duplicate pages and penalize websites on which they are found.

Google has already tagged many such duplicate page URLs as "supplemental results".

For Example:
http://www.xyz.com/index.html & http://www.xyz.com/Index.html
In the above example, both pages have the same content and design but different filenames/URLs, thus creating a duplicate URL issue.

Solution: We recommend removing such duplicate pages from the website and placing a 301 Permanent Redirect from the duplicate page (http://www.xyz.com/Index.html) to the actual page (http://www.xyz.com/index.html).
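On an Apache server this can be a one-line ".htaccess" rule (a sketch using mod_alias; adjust the file names to your own case):

  Redirect 301 /Index.html http://www.xyz.com/index.html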

3. Search Engine Unfriendly URLs

Checkpoint 1 (Scenario 1): Dynamic URLs (IIS)
Search engines do not like dynamic URLs, mainly because the presence of variables such as "?", "+", and "=" can create a continuous loop. Search engines avoid crawling dynamic URLs for fear of getting trapped in such a loop.

Solution: Use a URL-rewriting solution to rewrite dynamic URLs into static-looking URLs, e.g. by installing third-party software such as the LinkFreeze or ISAPI_Rewrite rewriting tools.

Checkpoint 1 (Scenario 2): Dynamic URLs (Apache)

Solution: Use the Apache mod_rewrite module to rewrite dynamic URLs into static-looking URLs.
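As a sketch, a mod_rewrite rule in ".htaccess" can map a static-looking URL onto the real dynamic script (product.php and the URL pattern here are hypothetical examples):

  # /products/42.html is served by /product.php?id=42
  RewriteEngine On
  RewriteRule ^products/([0-9]+)\.html$ /product.php?id=$1 [L]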

Checkpoint 2: PHP Session ID

Solution: Disable URL-based PHP session IDs (PHPSESSID), for example as shown below.
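On Apache with mod_php, this can be done from ".htaccess" (a sketch; the same settings can also be placed in php.ini):

  # Keep PHPSESSID out of URLs; use cookies only
  php_flag session.use_trans_sid off
  php_flag session.use_only_cookies on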

Checkpoint 3: osCommerce Session ID

Solution: Disable session IDs in URLs through the osCommerce Admin tools.

4. Redirections:

Checkpoint 1: Domain Level Redirection
Domain-level redirection is bad from a search engine ranking perspective; it may well be the reason why your website is not even indexed in Google.
Solution: In order to solve this issue, we recommend removing the redirection ASAP.

Checkpoint 2: Cloaking

Cloaking, also known as a stealth technique, delivers one page to a search engine for indexing while serving an entirely different page to everyone else.
Solution: Stop cloaking and serve the same page to search engines and human visitors.

5. Page-Level Issues:

Checkpoint 1: Dead Pages
Dead or removed pages that remain indexed tend to end up in Google's "supplemental results".
Solution: Place a "301 Permanent Redirect" from each dead page to the most relevant live page.

Checkpoint 2: Orphan Pages

If you have pages that aren't linked to from anywhere else on the website (called "orphan" pages), they won't get spidered by search engines, or by anyone else for that matter. Such pages are tagged as "supplemental pages" by Google.
Solution: We recommend that you either delete these pages from the server completely or link to them from the sitemap page.

Checkpoint 3: Retrieval of Content

Content on some pages is retrieved (pulled in) from another website/server.
Solution: Place the content/pages under the same domain/server.

6. On-Body Factors:

Checkpoint 1: Excessive use of JavaScript
Inline JavaScript makes the pages heavy and slow to load.
Solution: Place all JavaScript code in a separate file and call it externally (see the sketch after Checkpoint 2).

Checkpoint 2: Heavy Use of CSS

Inline CSS also makes the pages heavy and slow to load.
Solution: Place the CSS (style) code in a separate file and call it externally, as shown below.
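For both checkpoints, the page's head section then only references the external files (the file names are placeholders):

  <head>
    <link rel="stylesheet" type="text/css" href="css/style.css" />
    <script type="text/javascript" src="js/menu.js"></script>
  </head>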

Checkpoint 3: JavaScript Drop-Down Menus

Search engines like plain HTML "a href" links; menus built purely in JavaScript prevent spiders from crawling through the website pages.
Solution: Include text links at the bottom of every page (see the example below) and also list the pages on the sitemap page.
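For example, a plain-text footer navigation block that spiders can always follow (the page names are hypothetical):

  <p id="footer-nav">
    <a href="index.html">Home</a> |
    <a href="about.html">About Us</a> |
    <a href="products.html">Products</a> |
    <a href="sitemap.html">Sitemap</a>
  </p>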

Checkpoint 4: JavaScript Pop-ups

Pages that are reachable only through a JavaScript popup function cannot be followed by search engine robots, which effectively prevents them from being crawled and indexed.
Solution: Provide a plain "a href" link to all such pages from the sitemap page.

Checkpoint 5: JavaScript Mouseover Effects

Navigation that works only through JavaScript mouseover effects prevents search engine spiders from crawling and indexing the website pages.
Solution: List the pages on the sitemap page.

Checkpoint 6: Frameset

A frameset page contains no visible content of its own, and search engines cannot index framed sites properly.
Solution: Place simple text links to all site pages in the bottom navigation, and provide a "noframes" alternative as sketched below.
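A sketch of a frameset carrying such fallback links in a "noframes" section (the frame and page names are placeholders):

  <frameset cols="200,*">
    <frame src="menu.html" />
    <frame src="content.html" />
    <noframes>
      <body>
        <p><a href="menu.html">Menu</a> |
           <a href="content.html">Content</a> |
           <a href="sitemap.html">Sitemap</a></p>
      </body>
    </noframes>
  </frameset>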

Checkpoint 7: Use of Class in Href Hyperlinks

In the site's hyperlinks, class="menu" is placed before the href attribute inside the "a" tag, which makes the links harder for search engine spiders to parse, crawl, and index.
Solution: Place the href attribute first, followed by class="menu".


Checkpoint 8: W3C Coding Standards

The pages do not comply with the coding standards of the W3C (World Wide Web Consortium), the international standards organization for the web.
Solution: Validate your HTML code with the W3C validator and fix the reported errors.

7. Site Maps:

Checkpoint 1: Sitemap
Solution: Create a sitemap page and link to it from the bottom navigation on all website pages.

Checkpoint 2: Google Sitemap

A Google (XML) sitemap is a tool that helps Googlebot spider and index all your website pages. It also lets you learn about specific problems Google faces in accessing the website and analyze the website's performance.
Solution: Create a Google (XML) sitemap, as sketched below, and submit it to the major search engines.
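A minimal XML sitemap following the sitemaps.org protocol looks like this (the URLs, change frequencies, and priorities are placeholders):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>http://www.xyz.com/</loc>
      <changefreq>weekly</changefreq>
      <priority>1.0</priority>
    </url>
    <url>
      <loc>http://www.xyz.com/products.html</loc>
      <changefreq>monthly</changefreq>
      <priority>0.8</priority>
    </url>
  </urlset>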

8. Link Strategies:

Checkpoint 1: Excessive Outbound Links

An excessive number of outbound links adds nothing to your rankings and can dilute the value passed by your pages.

Checkpoint 2: Bad Linking Strategy

Avoid artificial links, identical links, and link farming. The more sites that link to your website, the higher your ranking in the search engine results will be, because more links indicate a higher level of popularity among Internet users; links obtained through the schemes above, however, do more harm than good.

Solution: We recommend filing a reinclusion request with Google.

Checkpoint 3: Link Popularity

Link popularity, the number of back links pointing to a website, is one of the most important criteria for top rankings. Review the current link popularity status on the major search engines: Google, Yahoo!, MSN, AltaVista, and AllTheWeb.




Importance of Inbound Links


Quality inbound links are important for achieving high rankings on search engines; in fact, inbound links are among the most important ranking factors on the major engines. It is quite difficult for a website with few inbound links to achieve top rankings for a highly competitive search term, so getting the right kind of quality inbound links matters a great deal. It is easier to get good rankings with 25-30 good inbound links than with 100 bad ones.

Some important factors that make a link a good incoming link:

1. The inbound link should use a relevant keyword in the anchor text -

If you wish to get high rankings for a particular search term, then the links to your website should use that same term. The anchor text used to link to your site influences the keywords and key phrases for which the website will rank highly.
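For example ("blue widgets" stands in for whatever search term you are targeting):

  <!-- Weak: generic anchor text says nothing about the target page -->
  <a href="http://www.xyz.com/">Click here</a>

  <!-- Better: the anchor text carries the target search term -->
  <a href="http://www.xyz.com/blue-widgets.html">blue widgets</a>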

2. The inbound link should come from a relevant website or page

- Inbound links from relevant web pages are more useful than links from unrelated pages. Although links from unrelated pages will not hurt your rankings, search engines give priority to links coming from the most relevant websites.

3. The inbound link should direct to a relevant page on your site

- Getting links to the page that is most relevant to the chosen anchor text is better than pointing every link at your home page. If you are using particular link text, the link should lead to a page relevant to that text. If the link text matches the content of the linked page, it is more likely that your web page is genuinely relevant to that term, and you'll get high rankings for that search term.

4. A link from an authority site is also useful -

Links from pages with high authority will increase the TrustRank of your website. Links from websites with high PageRank have a positive effect on the rankings of your own site.

5. There should not be a nofollow attribute in the inbound link -

The nofollow attribute tells search engines not to follow (or credit) a link, so links carrying it cannot help you achieve high search engine rankings. Check the HTML code of your link exchange partners to find out whether they use the nofollow attribute when linking to your website.
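In the HTML source, the difference is a single attribute:

  <!-- Passes no ranking credit -->
  <a href="http://www.xyz.com/" rel="nofollow">XYZ Widgets</a>

  <!-- Counts as a normal inbound link -->
  <a href="http://www.xyz.com/">XYZ Widgets</a>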

If link building for the website is done on the basis of these simple factors, the website has every chance of reaching top rankings on the major search engines.


Important Things To Follow To Become A Successful Blogger


A web log (blog) is a website where a user (the blog owner) posts his or her entries in chronological order (commonly displayed in reverse chronological order). Blogs can be created and maintained on a wide range of subjects, from commentary and news to personal interests such as online diaries and photo albums. A blog typically combines text, images, and links to other blogs, web pages, and other media related to its topic, with the important ability for readers to leave comments on the postings. Most blogs are primarily textual, but some bloggers focus on art (artlog), photographs (photoblog), sketches (sketchblog), videos (vlog), music (MP3 blog), or audio (podcasting).

Anybody can start a blog on a topic of his or her interest and post some relevant articles, but the fact is that very few people can successfully pull it off. Anybody who wants to become a successful blogger needs to know some important things:

1. Passion for the Topic

If you choose a topic you love, your passion will show through. If you choose a topic you don't know much about only because of the high paying keywords, that will be apparent as well.

2. Creative and Innovative

You're going to need to update your blog, at the very least, several times a week. A successful blogger is creative enough to come up with fresh, engaging content every time.

3. Good Idea

Not the same idea as everyone else. Not a copycat of someone else's blog. But your own unique, good idea. If you want to make it as a blogger, you have to have great content and a fresh idea.

4. Good Work Ethic

If you think blogging is just writing a quick five minute post and forgetting about it until next time, you couldn't be more wrong. You're going to spend hours on your blog designing it, writing it, promoting it and more. Most bloggers spend two to five hours on each blog every day.

5. Able to Work Alone

Most bloggers work alone. You have to be able to shut out all of the distractions of home or the coffee shop and concentrate on creating a good post.

6. Notebook

You're going to come up with ideas at the oddest times. What if you're not close to your computer? It's helpful to have a notebook or PDA handy so you can jot down ideas.

7. Thick Skin

You're going to be flamed, you're going to be trolled and people are going to pick your blog posts apart and call you names. You can let it get to you or you can develop a thick skin.

8. Thriving Social Network

It helps to have people to talk to, network with, and bounce ideas off. This will not only help eliminate the loneliness some probloggers experience, but by interacting with others you're sure to come up with fresh new ideas.

9. Motivation

What keeps you motivated? Passion for your topic? Advertising revenue? The adoration of your public? Whatever it is, channel it and use it to keep you driven.


10. Longevity

Blogging isn't a fly-by-night operation. If you want to succeed, and if you want your readers to trust your name, you'll want your blog to last through the ages. The most successful blogs have been around for years; it's the people who cut bait and run after two months of blogging that fail.

11. Vision for Long Term

Do you have a plan? Where do you see your blog two years from now? Five years? Ten? A successful blogger has the ability to see into the future. I'm not saying you know exactly what you're going to blog about, but you'll need to think about how your blog is going to succeed for years to come.

12. The Ability to Work Odd Hours

Blogging isn't exactly a 9-to-5 gig. Ideas hit at any time. Many probloggers work well into the night or wake at the crack of dawn to fit blogging into a busy day.

13. Willingness to Learn New Things

As a blogger you want to continue to feed your readers new information. This means you'll have to set a portion of your day aside to read other blogs, books, magazines and websites in your niche.

14. Open Minded

Sometimes bloggers need to completely change their line of thinking. What might have worked a year ago, doesn't work now. Tools and techniques become obsolete. Keep an open mind, don't be afraid to admit defeat and always be on the lookout for new trends and ideas.

15. Teaching Ability

Blogging isn't about talking about yourself. It's about sharing what you know with someone else. Many bloggers are teachers rather than storytellers, though the ability to tell a story is also something a successful blogger must possess.

16. Patience

Patience always pays off, and blogging isn't instant gratification: it takes time and patience to build up a successful blog. If you want an instant payout, stick with copywriting.





Guidelines to get Top Rankings for Affiliate Websites


Some Guidelines to get top 10 rankings for affiliate websites:

1. Always write good-quality, unique content for the home page that helps your visitors and readers solve their problems.

2. Perform thorough keyword research and analysis. Find the top rankers on Google for your main keywords and analyze their content and services. What are they doing to stay on top of the search engines? Is their website a Web 2.0 site or a blog? How many back links and indexed pages do they have?

3. Also collect keywords from competitors' websites. Analyze their titles, meta tags, placement of keywords in H1 header tags, alt text, etc.

4. You can also visit about.com, spyfu.com, or Wikipedia to research the topic of your website and collect more keywords.

5. Create pages on the basis of the keywords you identified.

6. Try to get quality one-way links to your website, for example from directories, blogs, forums, or article directories.

7. Write useful and relevant articles using the keywords you have collected.

8. Create a blog (on Blogspot or WordPress) in a new directory of your site, and write articles every day for a month or two, as Google likes updated web pages with fresh content. WordPress also has an auto-posting feature for scheduling your posts.

9. Submit your blog posts to blog directories and major search engines, including netscape.com.

10. Submit articles with links to all the pages of your site, not only the home page.

11. Create a human-readable sitemap as well as an XML sitemap covering all the pages of your site.

12. Add a membership area or forum to your site, because Google likes to see people coming back to your website again and again, and treats such a site as popular.

13. Also try to get one-way links from authority sites such as ebay.com, facebook.com, 43things.com, work.com, and other popular sites that Google likes and admires.

14. Submit a Press Release with links to your site to renowned and popular PR sites, if possible.

Thanks for reading. If you have any questions, post them in this thread and I will try to help you get better rankings.

Over-Optimization in SEO


Every website wants top rankings on all the major search engines, and to achieve them SEO experts use various techniques, mostly ethical but sometimes unethical. Using unethical SEO practices can result in the temporary or permanent exclusion of a website from the index of major search engines like Google for using so-called 'black hat' techniques. The search engines' reaction is easy to understand: with the many tricks and cheats that some SEOs use in their optimization strategies, the relevancy of results gets compromised to the point where engines would start returning completely irrelevant and manipulated results. And even if the search engines do not catch you red-handed right away, your competitors might report your misdoings.

Keyword Density/Keyword Stuffing -

SEO experts try to push their websites to top positions in many ways, and this can slide into questionable practices such as keyword stuffing. Keyword stuffing is considered unethical because what you actually do is use the keyword in question suspiciously often throughout the text. Bearing in mind that the commonly recommended keyword density is 3% to 7%, anything above that, say 10%, starts to look very much like keyword stuffing and is unlikely to go unnoticed by search engines. A text with 10% keyword density can hardly make sense when read by a human. Some time ago Google implemented the so-called "Florida update", which essentially imposed a penalty on pages that are keyword-stuffed and over-optimized in general.

Generally, keyword density in the title, the headings, and the first paragraphs matters more, so be especially careful not to stuff these areas. Try a keyword density tool (such as Keyword Density Cloud) to check whether your keyword density is within acceptable limits, especially in the places just mentioned. If you have a high density percentage for a frequently used keyword, consider replacing some of its occurrences with synonyms. Also, words in bold and/or italic are generally considered important by search engines, but if every occurrence of a target keyword is bold and italic, this looks unnatural, and at best it will not push your page up.
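As a rough illustration, keyword density is usually computed with the simple rule-of-thumb formula (not an official search engine metric):

  keyword density (%) = (keyword occurrences / total words) x 100

So a 400-word page that uses the target keyword 12 times has a density of 12 / 400 x 100 = 3%, at the bottom of the recommended range, while 40 occurrences in the same text would give 10% and would start to look like stuffing.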

Duplicate Content -

SEO has a basic rule: content is king. But it should be unique and relevant. For Google, duplicate content means text that is the same as text on a different page of the SAME site, on a sister site, or on a site so heavily linked to the site in question that the two can be presumed related; i.e., when you copy and paste the same paragraphs from one page of your site to another, you can expect to see your site's rank drop. Most SEO experts believe that syndicated content is not treated as duplicate content, and there are many examples of this; if syndicated content were duplicate content, the sites of news agencies would have been the first to drop out of the search results. Still, it does not hurt to check from time to time whether your site shares duplicate content with another, if only because somebody might be illegally copying your content without your knowledge. A tool such as a Similar Page Checker will help you see whether you have grounds to worry about duplicate content.

Doorway Pages and Hidden Text -

Another common keyword scam is doorway pages. Before Google introduced the PageRank algorithm, doorways were a common practice, and there were times when they were not considered illegitimate optimization. A doorway page is a page made especially for the search engines: it has no meaning for humans but is used to get high positions in search results and to trick users into coming to the site. Although keywords are still very important, today keywords alone have less effect on a site's position in search results, so doorway pages no longer bring much traffic; but if you use them, don't ask why Google punished you.
Similar to doorway pages is a scam called hidden text. This is text that is invisible to humans (e.g., the text color is the same as the page background) but is included in the HTML source of the page, trying to fool search engines into thinking the page is keyword-rich. Needless to say, neither doorway pages nor hidden text can be qualified as optimization techniques; they are manipulation more than anything else.
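In the HTML source, hidden text typically looks something like this (a sketch of what NOT to do: white text on a white background, invisible to visitors but readable to spiders):

  <p style="color: #ffffff; background-color: #ffffff;">
    cats cat kittens cats cat kittens cats cat kittens
  </p>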

Links Spam -

Links are another major SEO tool, and like the other tools they can be used or misused. Backlinks are certainly important, but while for Yahoo! backlinks matter mainly as quantity, for Google it matters more which sites the backlinks come from, so getting tons of backlinks from a link farm or a blacklisted site is begging to be penalized. Also, if outbound links (links from your site to other sites) considerably outnumber your inbound links (links from other sites to your site), then you have put too much effort into creating useless links, because this will not improve your ranking. You can use a Domain Stats tool to see the number of backlinks (inbound links) to your site and a Site Link Analyzer to see how many outbound links you have.

Using keywords in links (the anchor text), domain names, and folder and file names does boost your search engine rankings, but once again, precise measure is the boundary between topping the search results and being kicked out of them. For instance, suppose you are optimizing for the keyword "cat", a frequently chosen keyword for which, as with all popular keywords and phrases, competition is fierce. You might see no alternative for reaching the top other than registering a domain name like http://www.cat-cats-kittens-kitty.com, which is no doubt packed with keywords to the maximum but is, first, difficult to remember and, second, doomed if the content does not live up to the plenitude of cats in the domain name: you will never top the search results.

Although file and folder names are less important than domain names, now and then you can include "cat" (and synonyms) in them and in the anchor text of links. This counts in your favor, provided the anchors are not artificially stuffed (for instance, using "cat_cats_kitten" as the anchor for internal site links is certainly stuffing). While you have no control over third parties that link to you with anchors you don't like, it is up to you to check periodically what anchor text other sites use to link to you. A handy tool for this task is a Backlink Anchor Text Analysis tool, where you enter your URL and get a listing of the sites that link to you along with the anchor text they use.

To conclude: Google and the other search engines make no distinction between a site that is intentionally over-optimized to cheat them and one that is over-optimized with good intentions. So no matter what your motives are, always keep to reasonable SEO practices and follow the search engine guidelines.

Top SEO Mistakes


1. No Targeted Keywords, or Targeting the Wrong Keywords

Many SEOs make this blunder by selecting not-so-useful, irrelevant, or simply wrong keywords and key phrases when optimizing website pages. They may choose keywords that in their own minds are descriptive of the website, but that normal users would never search for. Choosing the right keywords can make or break your SEO campaign. Regardless of your resources, you can't think up all the useful keywords on your own, but a good keyword suggestion tool will help you find keywords that are useful for your website. Region-specific keywords also play an important role in bringing quality visitors to the website.

2. Ignoring the Title tag

Leaving the title tag empty or not using it properly is also a big mistake. The title tag is one of the most important places for useful, relevant keywords: it helps with optimization, and the text in your title tag is shown in the SERPs (search engine result pages) as your page title.

3. Concentrating only on Meta Tags

Many professionals think SEO is only about putting keywords into the meta description and meta keywords tags. They should know that meta tags no longer carry much weight with search engines. By all means create a meta description and a keywords tag with useful keywords, but do not expect to rank well on the strength of these meta tags alone.

4. A Flash Website Without an HTML Alternative

Flash might be attractive, but not to search engines and users. Search engines don't like Flash sites for a reason: a spider can't read Flash content and therefore can't index it. If you really insist on a Flash-based site and still want search engines to love it, provide an HTML version.

5. JavaScript Menus

Using JavaScript for navigation within a website is not bad, as long as you understand that search engine spiders can't read JavaScript, and you build your web pages accordingly. So if you have JavaScript menus you can't do without, create simple HTML and XML sitemaps listing all the page links, so that every page remains crawlable for the search engine spiders.

6. Using only Images for Headings

Many developers use images in place of text, thinking an image looks better than plain text for headings and menus. Although images can make a website look more attractive and distinctive, using them for headings and menus is a big SEO mistake, because H1 and H2 header tags and menu links are important SEO elements. If you don't want plain-looking text in your H1 and H2 tags, you can style them with a stylesheet, as sketched below.
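For instance, a real H1 tag can be styled to look as polished as an image heading (a minimal stylesheet sketch; the font and colors are placeholders):

  h1 {
    font: bold 26px Georgia, "Times New Roman", serif;
    color: #336699;
    margin: 0 0 10px 0;
  }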

7. Ignoring Keyword Rich URLs

Many SEOs underestimate the importance of a good URL. Dynamic page names are still very common, and keywordless URLs are more the rule than the exception. Although it is possible to rank high even without keywords in the URL, all else being equal, having keywords in the domain name and in file names (inner pages, images, etc.) gives you an additional advantage over your competitors. Keywords in URLs matter more for MSN and Yahoo!, but Google gives them weight as well, so it is very useful to have keywords in your URLs.

8. Backlink Spamming

It is a very common practice to acquire large numbers of backlinks under the delusion that more backlinks are always better. Many SEOs try to get links by hook or by crook, resorting to link farms or forum and newsgroup spam, which can get a website banned. Build quality backlinks for your website through ethical techniques only.

9. Lack of Consistency and SEO Maintenance

SEO has never been a one-time process. People who believe that once they optimize a site it is done forever are mistaken. If you want your website to stay optimized, you have to keep analyzing your competition and the changes in the ranking algorithms of the search engines, and keep updating your pages accordingly.

10. Lack of Keywords in the Page Content

If the content on a page does not contain sufficient keywords, modify the content accordingly and place useful, relevant keywords at sensible places within the text. It is also better to make the most useful keywords bold or to highlight them.

Working Method of Search Engines - General



The term "search engine" is often used generically to describe both crawler-based search engines and human-powered directories. These two types of search engines gather their listings in radically different ways.

Crawler-Based Search Engines

Crawler-based search engines, such as Google, create their listings automatically. They "crawl" or "spider" the web, then people search through what they have found. If you change your web pages, crawler-based search engines eventually find these changes, and that can affect how you are listed. Page titles, body copy and other elements all play a role.

Human-Powered Directories

A human-powered directory, such as the Open Directory, depends on humans for its listings. You submit a short description to the directory for your entire site, or editors write one for sites they review. A search looks for matches only in the descriptions submitted.
Changing your web pages has no effect on your listing. Things that are useful for improving a listing with a search engine have nothing to do with improving a listing in a directory. The only exception is that a good site, with good content, might be more likely to get reviewed for free than a poor site.


The Parts Of A Crawler-Based Search Engine

Crawler-based search engines have three major elements. First is the spider, also called the crawler. The spider visits a web page, reads it, and then follows links to other pages within the site. This is what it means when someone refers to a site being "spidered" or "crawled." The spider returns to the site on a regular basis, such as every month or two, to look for changes.

Everything the spider finds goes into the second part of the search engine, the index. The index, sometimes called the catalog, is like a giant book containing a copy of every web page that the spider finds. If a web page changes, this book is updated with new information. Sometimes it can take a while for new pages or changes that the spider finds to be added to the index. Thus, a web page may have been "spidered" but not yet "indexed." Until it is indexed -- added to the index -- it is not available to those searching with the search engine. This is why you may not find your site on a search engine for relevant keywords even though you have submitted it.

Search engine software is the third part of a search engine. This is the program that sifts through the millions of pages recorded in the index to find matches to a search and rank them in order of what it believes is most relevant.

Major Search Engines: The Same, But Different

All crawler-based search engines have the basic parts described above, but there are differences in how these parts are tuned. That is why the same search on different search engines often produces different results. Information on this page has been drawn from the help pages of each search engine, along with knowledge gained from articles, reviews, books, independent research, tips from others and additional information received directly from the various search engines.