Thursday 25 October 2012

Website Evaluation Tips


This article gives you a brief description of:
1. Website evaluation issues,
2. Recommended changes to make the website both search engine friendly and user friendly, and
3. How to get your website ready for the search engine optimization process.

Simple concept - "Small changes make a BIG difference..."


Because the major search engines and directories constantly change their algorithms, programming and indexing techniques, it is always advisable to make the recommended changes so that the website gains faster and better rankings and a higher PageRank value.

Since most website traffic comes from search engines, implementing the proposed changes will make the website search engine friendly and help generate more qualified, relevant traffic to the website through search engines.

Search Engine Compatibility Study

Search engine spiders/crawlers/bots like standard static HTML pages. Such pages are easy to crawl and index in the search engine database and therefore help the site gain top rankings. They do not like dynamic pages, database-driven pages, Flash pages, or websites whose linking is done through JavaScript.

This study reveals several serious search-engine-incompatible features/elements on the site that will hamper our SEO efforts:

SEO Issues


1. Server Side Details & Issues

Checkpoint 1: Hosting Details
Identify the server type (Apache or IIS) and the site's IP address.

Checkpoint 2: IP Spam Blocklist

Check whether the website is on an IP address shared with spammers. Being on a spam blocklist can have a negative impact on your search engine rankings.

Solution:
Move the website onto a dedicated IP address or to a different server. The cost is minimal and the change can be set up through your ISP/hosting provider.

2. Duplicate Issues:

Checkpoint 1: Canonical URL

Major search engines such as Google consider the 'www' and 'non-www' versions of a domain to be two independent websites and index them separately, thus creating a duplicate-website problem.

Solution:
Scenario 1: IIS Solution - If your website is hosted on an IIS server, either create a Google sitemap (Webmaster Tools) account and set the preferred domain there, OR place a 301 Permanent Redirect from http://xyz.com/ to http://www.xyz.com/.

Scenario 2: Apache Solution - If your website is hosted on an Apache server, we recommend placing a 301 Permanent Redirect from http://xyz.com/ to http://www.xyz.com/ using the ".htaccess" file.
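As a minimal ".htaccess" sketch of the Apache approach (assuming mod_rewrite is enabled and "xyz.com" stands in for the real domain):

# Send every request for the non-www host to the www host with a 301 (permanent) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^xyz\.com$ [NC]
RewriteRule ^(.*)$ http://www.xyz.com/$1 [R=301,L]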

Checkpoint 2: "Home" & Logo Link

If the "Home" link and the link on the "website logo" on all website pages of http://www.xyz.com are linked to its respective index.html page.

For Example: http://www.xyz.com - The "Home" link and the link on the "website logo" linked to http://www.xyz.com/index.html.

Google consider both these types of URLs as duplicate pages and might penalize one or both of them for duplicate content.

Solution:
In order to solve this issue, we recommend changing the "Home" link and the link on the "website logo" from http://www.xyz.com/index.html to http://www.xyz.com/.
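A minimal HTML sketch of the change (the logo file name and alt text are placeholders):

<!-- Before: the logo link points at the index.html file, creating a duplicate URL -->
<a href="http://www.xyz.com/index.html"><img src="logo.gif" alt="XYZ logo"></a>

<!-- After: the link points at the canonical root URL -->
<a href="http://www.xyz.com/"><img src="logo.gif" alt="XYZ logo"></a>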

Checkpoint 3: Duplicate Domains

Check for a duplicate/mirror of your website, http://www.xyz.com. http://www.xyz.com and http://www.abc.com/ share the same content, style and design and are therefore categorized as duplicate websites by search engines. Google has already identified this issue and has placed all pages of http://www.abc.com/ in its "supplemental results".

Solution: We recommend one of the following options to solve this issue:

Option 1: Drop/ Remove one of the above websites, preferably http://www.abc.com/.
Option 2: Re-design one of them to look different.
Option 3: Place a 301 Permanent Redirection from http://www.abc.com/ to http://www.xyz.com.

Checkpoint 4: Duplicate Pages

Check for internal duplicate pages/URLs within the website. Search engines do not like duplicate pages and penalize the website when they are found.

Google has already tagged many such duplicate page URLs as "supplemental results".

For Example:
http://www.xyz.com/index.html & http://www.xyz.com/Index.html
In the above example both pages have the same content and design but different filenames/URLs, thus creating a duplicate URL issue.

Solution: We recommend removing such duplicate pages from the website and placing a 301 Permanent Redirect from the duplicate page (http://www.xyz.com/Index.html) to the actual page (http://www.xyz.com/index.html).
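On an Apache server, a minimal ".htaccess" sketch for the example above could be:

# 301-redirect the duplicate capitalized filename to the canonical lowercase page
Redirect 301 /Index.html http://www.xyz.com/index.html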

3. Search Engine Unfriendly URLs

Checkpoint 1 (Scenario 1): Dynamic URLs (IIS)
Search engines do not like dynamic URLs, mainly because query-string characters such as "?", "+" and "=" can create an effectively endless loop of URL variations. Search engines avoid crawling dynamic URLs for fear of getting trapped in this loop.

Solution: Use a URL-rewriting solution to rewrite dynamic URLs into static-looking URLs, for example by installing third-party rewriting tools such as LinkFreeze or ISAPI_Rewrite.

Checkpoint 1 (Scenario 2): Dynamic URLs (Apache)

Solution: Use the Apache mod_rewrite module to rewrite dynamic URLs into static-looking URLs.
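A minimal ".htaccess" sketch of the mod_rewrite approach, assuming a hypothetical dynamic page at /product.php?id=123 that should be served from the static-looking URL /product/123:

# Internally map the static-looking URL onto the dynamic script
RewriteEngine On
RewriteRule ^product/([0-9]+)/?$ product.php?id=$1 [L,QSA]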

Checkpoint 2: PHP Session ID

PHP session IDs appended to URLs create many duplicate URLs for the same page.
Solution: Disable URL-based PHP session IDs.
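A minimal sketch of the relevant PHP settings, set either in php.ini or (where the host runs mod_php and allows it) via ".htaccess"; the values reflect a typical setup:

; php.ini - stop PHP from appending PHPSESSID to URLs
session.use_trans_sid = 0
; rely on cookies only for session handling
session.use_only_cookies = 1

# .htaccess equivalent (mod_php)
php_flag session.use_trans_sid off
php_flag session.use_only_cookies on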

Checkpoint 3: osCommerce Session ID

osCommerce can likewise append session IDs to URLs.
Solution: Turn this behaviour off through the session settings in the osCommerce Admin tools.

4. Redirections:

Checkpoint 1: Domain Level Redirection
Domain-level redirection is bad from a search engine ranking perspective. This is the reason why your website is not even indexed in Google.
Solution: In order to solve this issue, we recommend removing the redirection as soon as possible.

Checkpoint 2: Cloaking

Cloaking, also known as stealth, is a technique used to deliver one page to a search engine for indexing while serving an entirely different page to everyone else.
Solution: Stop cloaking and serve the same content to search engines and visitors.

5. Page Level Issues:

Checkpoint 1: Dead Pages
Dead pages that remain indexed tend to end up in Google's "supplemental results".
Solution: Place a "301 Permanent Redirect" from each dead page to the most relevant live page.

Checkpoint 2: Orphan Pages

If you have pages that aren't linked to from anywhere else on the website (called "orphan" pages), they won't get spidered by search engines, or by anyone else for that matter. Such pages are tagged as "supplemental pages" by Google.
Solution: We recommend either deleting these pages from the server completely or placing a link to them from the sitemap page.

Checkpoint 3: Retrieval of Content

Check whether page content is retrieved from another website.
Solution: Place the content/pages under the same domain/server as the rest of the site.

6. On Body Factors:

Checkpoint 1: Excessive Use of JavaScript
Large blocks of inline JavaScript make the pages heavy and slow to load.
Solution: Move all JavaScript code into a separate file and call it externally.

Checkpoint 2: Heavy Use of CSS

Large blocks of inline CSS also make the pages heavy and slow to load.
Solution: Move the CSS (style) code into a separate file and call it externally.
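A minimal HTML sketch of externalizing both the styles and the scripts (file names are placeholders):

<head>
  <!-- CSS rules moved into an external stylesheet -->
  <link rel="stylesheet" type="text/css" href="styles.css">
  <!-- JavaScript code moved into an external file -->
  <script type="text/javascript" src="scripts.js"></script>
</head>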

Checkpoint 3: JavaScript Drop-Down Menu

Search engines like plain HTML "a href" links; menus built entirely in JavaScript prevent spiders from crawling through the website pages.
Solution: Include text links at the bottom of every page and also place the pages on the sitemap page.
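A minimal sketch of such a bottom text-link block (page names are placeholders):

<!-- Plain HTML links that spiders can follow even when the JavaScript menu cannot be crawled -->
<p class="footer-links">
  <a href="index.html">Home</a> |
  <a href="about.html">About Us</a> |
  <a href="products.html">Products</a> |
  <a href="contact.html">Contact</a> |
  <a href="sitemap.html">Sitemap</a>
</p>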

Checkpoint 4: JavaScript Pop-up

Pages opened only through a JavaScript popup function are a problem: search engines like plain HTML "a href" links, and such popups effectively disallow search engine robots from crawling and indexing those pages.
Solution: Provide a plain "a href" link to all such pages from the sitemap page.

Checkpoint 5: JavaScript Mouse-over Effects

Navigation driven purely by JavaScript mouse-over effects prevents search engine spiders from crawling and indexing the website pages.
Solution: Place the affected pages on the sitemap page.

Checkpoint 6: Frameset

A frameset page contains no visible content of its own, and search engines cannot index framed sites properly.
Solution: Place simple text links to all site pages in the bottom navigation.

Checkpoint 7: Use of Class in Href Hyperlinks

In the site's hyperlinks, class="menu" is placed before the "a href" part of the tag, which makes it difficult to link the pages of the website and difficult for search engine spiders to crawl and index them.
Solution: Place the 'a href' code first and then the class="menu" attribute.
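A minimal before/after sketch of that change (the class name "menu" comes from the example above; the page name is a placeholder):

<!-- Before: class written before the href -->
<a class="menu" href="products.html">Products</a>

<!-- After: href written first, then class="menu", as recommended above -->
<a href="products.html" class="menu">Products</a>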


Checkpoint 8: W3C Coding Standards

Check the pages against W3C (World Wide Web Consortium) coding standards.
Solution: Validate your HTML code (for example with the W3C markup validator) and fix the reported errors.

7. Site Maps:

Checkpoint 1: Sitemap
Solution: Create a Sitemap page and link to it from the bottom navigation on all website pages.

Checkpoint 2: Google Sitemap

A Google sitemap is a tool that allows Googlebot to spider and index all of your website pages. It also lets you learn about specific problems Google faces in accessing the website and analyze the website's performance.
Solution: Create a Google (XML) sitemap and submit it to the major search engines.
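A minimal Google (XML) sitemap sketch following the standard sitemaps.org protocol (URLs, dates and priorities are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.xyz.com/</loc>
    <lastmod>2012-10-25</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.xyz.com/products.html</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Once created, the file can be submitted through Google's webmaster tools and referenced from robots.txt with a "Sitemap:" line.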

8. Links Strategies:

Checkpoint 1: Excessive Outbound Links

Checkpoint 2: Bad Linking Strategy

Avoid artificial links, identical links and link farming. The more sites that link to your website, the higher your ranking in the search engine results will be, because more links indicate a higher level of popularity among Internet users.

Solution: We recommend filing a reinclusion request with Google.

Checkpoint 3: Link Popularity

Link popularity (the number of back links pointing to a website) is one of the most important criteria for top rankings. Review the current link popularity status on the major search engines: Google, Yahoo, MSN, AltaVista and AlltheWeb.



