Sunday, May 9, 2010

Off Page SEO

Off Page SEO: Off Page SEO is all about link popularity. The optimization of a website is not complete until Off Page Optimization is done. Off Page Optimization, also known as Off Site Optimization, is one of the most important factors in getting a higher ranking in search engines such as Google, Yahoo, MSN, and many more. Off Page SEO covers everything you can do away from your own pages to earn a high rank for your website, and the goal is link popularity: the number of relevant, quality external or inbound links pointing to your website, that is, links on other sites' pages that point to yours. Most major search engines use link popularity as part of the algorithm that helps determine the relevance of your website. If you don't have inbound links, you won't rank well for competitive keywords. Because off page SEO is about link popularity, in this phase you should pay special attention to the following:
* Submission to search engines
* Directory Submission
* Blog Submission
* Blog commenting
* Article Submission
* Forums Posting
* Press Releases Submission
* Social Bookmarking
* Video Optimization
* Link Building
* Viral Marketing
* Affiliate Marketing
* Email Marketing
* SMS Marketing
* Feed submissions

Sunday, April 4, 2010

Internal Linking Strategies

Internal Linking Strategies : Internal linking is the way your site links to other pages within your site. It is one of the most important factors, or strategies, that can help you achieve a good ranking in search engines. The following list covers the internal links that matter most to SEO:

Navigation TEXT Links
Use keyword-rich text links for menu items: no image links, and avoid JavaScript links. Use CSS to combine graphic background images with text links that contain your keywords.

Sitemap Links

The sitemap is the internal link we all agree on and rarely question. It is accepted as necessary, again subject to two constraints: usability and the topical relevance of each page. The site index is a list of links to every page.

Subject & Topic Group Links
Subject and topic group links relate a page to other pages on the same topic. This is where many sites lose focus and fail to map an internal structure for either search engines or visitors. Good sites that rank very well always use "Related Stories" links appropriate to the page, again subject to the same two constraints: usability and the topical relevance of the page. Many sites lose focus and point "Most Popular" links at pages unrelated to the one the visitor is on. If you want to achieve a good ranking in search engines, keep your subject and topic group links focused.

Breadcrumbs links
Breadcrumb Links : A breadcrumb is a navigation aid used in user interfaces. Breadcrumbs are navigational links that appear below the page header on every page of a site and show the page's place in the hierarchy (category – subcategory). Breadcrumb structures define what you think is important and point visitors (and search engine robots) to the overall structure; your opinion of what matters on your site informs the search engines. In other words, breadcrumbs show the path to the current web page and allow the visitor to link to any of the website pages along that path.

Types of Breadcrumbs
There are 3 types of web breadcrumbs:
1. Path: Path breadcrumbs are dynamic and show the route the user has taken to arrive at the current page.

2. Location: Location breadcrumbs are static and show where the page is located in the website hierarchy.

3. Attribute: Attribute breadcrumbs give information that categorizes the current page.
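One common way to mark up a location-type breadcrumb trail is a short row of links; the markup and class name below are a hypothetical sketch, not a standard:

```html
<!-- Location breadcrumb: category - subcategory - current page -->
<div class="breadcrumbs">
  <a href="/">Home</a> &raquo;
  <a href="/seo/">SEO</a> &raquo;
  <span>Internal Linking Strategies</span>
</div>
```

The last item is plain text rather than a link, since it is the page the visitor is already on.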

Thursday, March 25, 2010

URL Structure

URL Structure : URL, or Uniform Resource Locator, is the technical term for what is more commonly known as a website address. URL structure is one of the factors that determine how "search engine friendly" a website is. Search engines like clean, static URLs; they don't like messy dynamic ones. URL structures should be static and should reveal what the page is about. A simple, clear URL structure is much easier for both search engine spiders and human beings. Static URLs that contain keywords will often rank better than dynamic URLs that don't contain any keywords. Another problem with dynamic pages is load time: a dynamically generated URL comes from web pages that a server assembles on the fly from scripts, a database, or multiple files when a user asks to see the page.
These are some general rules you should follow when creating your site's URLs:
1. Length of the URL: Keep your URLs short and descriptive, no more than 3 to 5 words. According to Matt Cutts, if there are more than 5 words, "Google's algorithms typically will just weight those words less and just not give you as much credit."
2. Case sensitivity: URLs can be case sensitive, so keep them consistent (all lowercase is safest) to avoid duplicate or broken links.
3. Keywords in URL: Use keywords in the URL. If a page is about "What is SEO", create a URL built from those keywords rather than one made of numeric IDs or query parameters.
4. Dashes are better than underscores: Google has no individual preference (you won't be penalized for either version), but dashes are preferable because Google "sees" each hyphenated word as an individual one. If you have a URL like word1_word2, Google will only return that page when a user searches for word1_word2 (which almost never happens). If you have a URL like word1-word2, that page can be returned for the searches word1, word2, and even "word1 word2".
5. If you worry that your URLs may be perceived as spammy, check out the SEOmoz URL Spam Detection Tool, which estimates:
* Spam words
* Subdomain depth
* Domain length
6. Mind your file extensions, as they might prevent your pages from being crawled; for example, don't end your URLs with .exe.
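The keyword, length, and dash rules above can be sketched as a small helper function; this is a hypothetical illustration of the rules, not anything a search engine prescribes:

```python
import re

def make_slug(title, max_words=5):
    """Build a short, keyword-rich URL slug from a page title:
    lowercase, punctuation stripped, words joined with dashes,
    and at most max_words words kept (per the 3-5 word rule)."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words[:max_words])

print(make_slug("What is SEO?"))                 # what-is-seo
print(make_slug("One Two Three Four Five Six"))  # one-two-three-four-five
```

Dashes rather than underscores let each word in the slug be matched individually, as described in rule 4.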

Tuesday, March 23, 2010

What are Heading Tags?

Heading Tags : There are 6 heading tags available in HTML. H1 is the largest and sits at the top of the heading hierarchy; H6 is the smallest and sits at the bottom. Heading tags indicate to a search engine what a page is about: they label your headline so that search engines will recognize it as "important" on your web page. The HTML tags <h1> through <h6> are used as headings when creating text content. H1 should be the largest of the tags and is generally used to surround the title of the page or the title of your site. H2 can be used as a secondary header, highlighting specific sections of the page.
Many search engines use sophisticated algorithms to determine the relevancy of a given page in relation to any given search query. As the methods used to spider a page and index relevant Tags and content change over time, so too do the tricks that can be used in order to attract more search engine spiders to your site.
A common practice is to assign a higher relevancy value to pages where a keyword appears in bold or in a heading:

  • In-between the <b>...</b> Tags or as part of a heading.

  • In-between the <H1>..</H1>, <H2>..</H2> etc. heading Tags.

  • A keyword that appears in such emphasized text is often assumed to be a heading or a sub-title, and is thus often assigned a higher relevancy value.
    The basic structure of heading tags
    <h1>Seo Review</h1>
    (a paragraph or so introduction)
    <h2>What about Seo?</h2>
    <h3>Seo Tips Review</h3>
    <h2>Importance of search engine optimization</h2>
    <h2>Future of Seo Service</h2>

Monday, March 22, 2010

Good Fresh Content

Good Fresh Content: Content is king. Good fresh content or a good video attracts visitors. If you have articles that are fresh every day, visitors will come back again and again. Don't confuse this with the idea that fresh content alone will give you higher rankings; that may not be the case. However, the more frequently you update your website with articles, downloads, and new web pages, the more frequently a search engine will stop by to visit your website. When search engines look at your site more frequently, you have the opportunity to achieve higher rankings based on the content you provide.
Search engines use web crawlers, which are simply high-tech programs that scan the internet for websites. The web crawler "indexes" a site based on a number of algorithmic factors determined by the search engine company. Google loves sites that are constantly changing, so fresh content really is one of the keys to successful search engine optimization and thus a successful web site.
Good fresh content is valuable. The benefits of using good fresh content:
* It instantly adds value to your site.
* When you have new headlines every day, other sites will want to link to you, and when you increase the number of inbound links, you boost your ranking on Google and other search engines.
* Visitors come back more often, knowing there's always something new.
* Other sites will want to link to you and your valuable content.

Sunday, March 21, 2010

What is "XML sitemap"?

Sitemaps are an easy way for webmasters to inform search engines about the pages on their sites that are available for crawling. A Sitemap is an XML file that lists the URLs for a site and allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to the other URLs on the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion protocol and complement robots.txt, a URL exclusion protocol.
Sitemaps are particularly beneficial on websites where some pages are not easily discovered by following links. The webmaster can generate a Sitemap containing all accessible URLs on the site and submit it to search engines. Since Google, MSN, Yahoo, and Ask now use the same protocol, having a Sitemap lets the biggest search engines see up-to-date page information.

Sitemaps supplement, and do not replace, the existing crawl-based mechanisms that search engines already use to discover URLs. Using this protocol does not guarantee that web pages will be included in search indexes, nor does it influence the way that pages are ranked in search results. Google first introduced Sitemaps 0.84 in June 2005 so web developers could publish lists of links from across their sites. Google, MSN, and Yahoo announced joint support for the Sitemaps protocol in November 2006; the schema version was changed to "Sitemap 0.90", but no other changes were made.
In April 2007, Ask.com and IBM announced support for Sitemaps, and Google, Yahoo, and MSN announced auto-discovery for Sitemaps through robots.txt. In May 2007, the state governments of Arizona, California, Utah, and Virginia announced they would use Sitemaps on their web sites.
The Sitemaps protocol is based on ideas from "Crawler-friendly Web Servers".
The Sitemap protocol format consists of XML tags. Sitemaps can also be just a plain text list of URLs, and they can be compressed in .gz format.
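As an illustration, a minimal Sitemap for a single URL follows the protocol's XML format; the URL, date, and values below are made up:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-03-21</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only <loc> is required; <lastmod>, <changefreq>, and <priority> are the optional fields described above.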

Sitemap Index
The Sitemap XML protocol is also extended to provide a way of listing multiple Sitemaps in a "Sitemap index" file. The maximum Sitemap size of 10 MB or 50,000 URLs makes this necessary for large sites. Because a Sitemap needs to sit in the same directory as the URLs it lists, Sitemap indexes are also useful for websites with multiple subdomains, allowing the Sitemaps of each subdomain to be referenced from the Sitemap index file and robots.txt.
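A Sitemap index is itself a small XML file pointing at the individual Sitemaps; the hosts and filenames below are hypothetical:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap1.xml.gz</loc>
    <lastmod>2010-03-21</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://blog.example.com/sitemap2.xml.gz</loc>
  </sitemap>
</sitemapindex>
```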
Sitemap limits
Sitemap files have a limit of 50,000 URLs and 10 megabytes per Sitemap. Sitemaps can be compressed using gzip, reducing bandwidth consumption. Multiple Sitemap files are supported, with a Sitemap index file serving as an entry point for up to 1,000 Sitemaps.
As with all XML files, any data values (including URLs) must use entity escape codes for these characters: ampersand (&), single quote ('), double quote ("), less than (<), and greater than (>).
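Python's standard library can apply these escape codes for you; below is a small sketch using xml.sax.saxutils (the URL is made up):

```python
from xml.sax.saxutils import escape

def escape_sitemap_url(url):
    # escape() handles &, < and > by default; the extra entities
    # dictionary also covers single and double quotes.
    return escape(url, {"'": "&apos;", '"': "&quot;"})

print(escape_sitemap_url("http://www.example.com/page?id=1&cat=seo"))
# http://www.example.com/page?id=1&amp;cat=seo
```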

Thursday, March 18, 2010

All Validation

Validation simply means ensuring, through a test program, that your web site is written in valid, error-free code. It is important, and it basically refers to using a program or an online service to check that the web pages you created are free of errors. Validating your web pages is a good habit to get into, helping ensure they function properly in all web browsers and search engines. There are various types of validators: some check only for errors, while others also make suggestions about your code, telling you when a certain way of writing things might lead to unexpected results. Common validators include HTML validators, CSS validators, accessibility validators, and broken link validators.
HTML Validation : HTML validation means ensuring that your web site is written in valid, error-free code. The primary goal of validation is to ensure that the submitted code contains no errors or careless mistakes. The W3 Consortium has its own online validator, which you can use for free to validate both HTML and XHTML pages.
CSS Validators : A CSS validator checks your Cascading Style Sheets in the same manner; most check that they comply with the CSS standards set by the W3 Consortium, and a few will also tell you which CSS features are supported by which browsers. Free validation services for style sheets are available online.
Accessibility Validators : Accessibility validators let you verify that your site is usable by people who are differently abled.
Broken Link Validators : Broken link validation means checking how many of your web site's links are broken. Broken links are bad for search engines: crawling robots stop when they hit them, which can keep parts of your site out of Google, Yahoo, and Bing. Free broken link checkers are available online.
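As a rough sketch of the first step a broken link validator performs, the snippet below collects the href values from a page using Python's standard html.parser; a full checker would then request each URL (for example with urllib.request) and flag any that do not return HTTP 200. The markup fed in here is illustrative:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href attribute of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

parser = LinkCollector()
parser.feed('<p><a href="/about.html">About</a> <a href="http://example.com/">Home</a></p>')
print(parser.links)  # ['/about.html', 'http://example.com/']
```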