Sunday, May 9, 2010

Off Page SEO

Off Page SEO: Off Page SEO is all about link popularity. The optimization of a website is not complete until off-page optimization is done. Off Page Optimization, also known as Off Site Optimization, is one of the most important factors in getting higher rankings in search engines like Google, Yahoo, MSN, and many more. Off Page SEO covers everything you can do away from your own pages to earn a high rank for your website, and it centers on link popularity: the number of relevant, quality external or inbound links pointing to your website, that is, links on other sites that point to your pages. Most major search engines use link popularity as part of the algorithm that helps determine the relevance of your website; if you don't have external links, you won't rank well for competitive keywords. Because off-page SEO means link popularity, in this phase you should pay special attention to the following:
* Submission to search engines
* Directory Submission
* Blog Submission
* Blog commenting
* Article Submission
* Forums Posting
* Press Releases Submission
* Social Bookmarking
* Video Optimization
* Link Building
* Viral Marketing
* Affiliate Marketing
* Email Marketing
* SMS Marketing
* Feed submissions

Sunday, April 4, 2010

Internal Linking Strategies

Internal Linking Strategies: Internal linking is the way your site links to other pages within itself. It is one of the most important factors that can help you achieve a good ranking in the search engines. The following kinds of internal links matter most to SEO:

Navigation Text Links
Avoid image links and JavaScript links; use only keyword text links for menu items. With CSS you can use graphic background images while keeping the menu items as text links built from your keywords.


Sitemap Links

The sitemap is the internal link we all agree on and rarely question. It is accepted as necessary, subject again to two constraints: usability and the topic relevance of each page. The site index lists links to every page.

Subject & Topic Group Links
Subject and topic group links tie together related pages on the site. This is where many sites lose focus and fail to map an internal structure for either search engines or visitors. Good sites that rank very well consistently use "Related Stories" links appropriate to the page, again constrained by usability and the topic relevance of the page. Many sites lose focus and use "Most Popular" links to pages unrelated to the one they are on. If you want to achieve a good ranking in the search engines, keep your subject and topic group links focused.

Breadcrumbs Links
Breadcrumb links are a navigation aid used in user interfaces. They are navigational links that appear on every page of a site, usually below the page header, and they show the page's place in the hierarchy (category - subcategory). A breadcrumb structure defines what you think is important and points visitors (and search engine robots) to the overall structure of the site; your opinion of what matters on your site informs the search engines. In other words, breadcrumbs show the path to the current web page and allow the visitor to link to any of the pages along that path.



Types of Breadcrumbs
There are three types of web breadcrumbs:
1. Path: Path breadcrumbs are dynamic and show the route that the user has taken to arrive at the page.
2. Location: Location breadcrumbs are static and show where the page is located in the website hierarchy.
3. Attribute: Attribute breadcrumbs give information that categorizes the current page.
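A location breadcrumb trail can be derived from the URL path itself. A minimal Python sketch; the function name and sample paths are illustrative, not part of any real site:

```python
def build_breadcrumbs(path):
    """Turn a URL path like "/products/running-shoes" into a
    location-style breadcrumb trail of (label, href) pairs."""
    crumbs = [("Home", "/")]
    parts = [p for p in path.strip("/").split("/") if p]
    href = ""
    for part in parts:
        href += "/" + part
        # Turn "running-shoes" into a readable label "Running Shoes"
        label = part.replace("-", " ").title()
        crumbs.append((label, href))
    return crumbs

trail = build_breadcrumbs("/products/running-shoes")
print(" > ".join(label for label, _ in trail))  # Home > Products > Running Shoes
```

Each crumb keeps the cumulative href, so every level of the trail remains a clickable link.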




Thursday, March 25, 2010

URL Structure

URL Structure: URL, short for Uniform Resource Locator, is the technical term for what is more commonly known as a website address. URL structure is one of the factors determining how "search engine friendly" a website is. Search engines like clean, static URLs; they don't like messy dynamic ones. URL structures should be static and reveal what the page is about. A simple and clear URL structure is much easier for both search engine spiders and human beings, and static URLs that contain keywords will often rank better than dynamic URLs that contain none. Another problem with dynamic pages is load time: a dynamically generated URL comes from web pages assembled on the fly by a server, which pulls the page content together from a database or multiple files whenever a user asks to see the page.
One is a static URL and the other a dynamic URL:
Static URL: http://seo-service-4-u.blogspot.com/2010/03/on-page-seo.html
Dynamic URL: http://example.com/page.php?id=42&cat=seo
These are some general rules you should follow when creating your site's URLs:
1. Length of the URL: Try to keep your URLs short and descriptive, with no more than 3-5 words. According to Matt Cutts, if there are more than 5 words, "Google's algorithms typically will just weight those words less and just not give you as much credit."
2. Case sensitivity: The path portion of a URL can be case sensitive on many web servers, so use lowercase consistently to avoid duplicate content and broken links.
3. Keywords in URL: Try to use keywords in the URL. If a particular page is about "What is SEO", then try to create a URL that looks like "http://seo-service-4-u.blogspot.com/2010/03/what-is-seo.html" as opposed to one that looks like "http://seo-service-4-u.blogspot.com/2010/03/body-text.html". See the difference?
4. Dashes are better than underscores: Google has no individual preference (meaning you won't be penalized for either version), but dashes are preferable because Google sees each hyphenated word as an individual one. If you have a URL like word1_word2, Google will only return that page when the user searches for word1_word2 (which almost never happens). If you have a URL like word1-word2, that page can be returned for the searches word1, word2, and even "word1 word2".
5. If you suspect your URLs may be perceived as spammy, check out the SEOmoz URL Spam Detection Tool, which estimates:
Spam words
Hyphens
Subdomain depth
Domain length
6. Mind your file extensions, as they might prevent your pages from being crawled; for example, don't end your URLs with .exe.
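Several of these rules (short, lowercase, hyphen-separated, at most five words) can be folded into a small slug generator. A sketch; `slugify` and the five-word cap are assumptions taken from rule 1, not an official tool:

```python
import re

def slugify(title, max_words=5):
    """Build a keyword-friendly URL slug: lowercase, punctuation
    stripped, words joined by hyphens, capped at max_words."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words[:max_words])

print(slugify("What is SEO?"))  # what-is-seo
```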



Tuesday, March 23, 2010

What are Heading Tags?

Heading Tags: There are six heading tags available in HTML. H1 is the largest and sits at the top of the heading hierarchy; H6 is the smallest and sits at the bottom. Heading tags indicate to a search engine what a page is about: they label your headline so that search engines will recognize it as "important" on your web page. The HTML tags <h1> through <h6> are used as headings when creating text content. H1 should be the largest of the tags and is generally used to surround the title of the page or the title of your site. H2 can be used as a secondary header, highlighting specific sections of the page.
Many search engines use sophisticated algorithms to determine the relevancy of a given page in relation to any given search query. As the methods used to spider a page and index relevant Tags and content change over time, so too do the tricks that can be used in order to attract more search engine spiders to your site.
A common practice is to assign a higher relevancy value to pages where a keyword appears in BOLD.
Example:

  • In-between the <b>...</b> Tags or as part of a heading.

  • In-between the <H1>..</H1>, <H2>..</H2> etc. heading Tags.

  • A keyword that appears in such emphasized text is often assumed to be a heading or a sub-title, and is thus often assigned a higher relevancy value.
    The basic structure of heading tags
    <h1>Seo Review</h1>
    (a paragraph or so introduction)
    <h2>What about Seo?</h2>
    (content)
    <h3>Seo Tips Review</h3>
    (content)
    <h2>Importance of search engine optimization</h2>
    (content)
    <h2>Future of Seo Service</h2>
    (content)
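To audit an existing page's heading structure like the one above, Python's standard html.parser module can collect every h1 through h6. A sketch; the class name and sample markup are illustrative:

```python
from html.parser import HTMLParser

class HeadingParser(HTMLParser):
    """Collect (tag, text) pairs for every h1-h6 heading on a page."""
    def __init__(self):
        super().__init__()
        self.headings = []
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        # Only record text that sits inside an open heading tag
        if self._current:
            self.headings.append((self._current, data.strip()))

page = "<h1>Seo Review</h1><p>intro</p><h2>What about Seo?</h2>"
parser = HeadingParser()
parser.feed(page)
print(parser.headings)
```

Running it on a full page quickly shows whether the H1/H2 hierarchy matches your keyword plan.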






    Monday, March 22, 2010

    Good Fresh Content

Good Fresh Content: Content is king. Good fresh content or good video attracts visitors. If you have articles that are fresh every day, visitors will come back again and again. Don't confuse this phrase with thinking that fresh content alone will give you higher rankings; that may not be the case. However, the more frequently you update your website with articles, downloads, and new web pages, the more frequently a search engine will stop by to visit your website. When search engines look at your site more frequently, you have the opportunity to achieve higher rankings based on the content you provide.
     
Search engines use web crawlers, which are simply high-tech programs that scan the internet for websites. The web crawler "indexes" a site based upon a number of algorithmic factors determined by the search engine company. Google loves sites that are constantly changing, so fresh content really is one of the keys to successful search engine optimization and thus a successful web site.
Good fresh content is valuable. Benefits of using good fresh content:
#. It instantly adds value to your site.
#. When you have new headlines every day, other sites will want to link to you; and when you increase the number of inbound links, you boost your ranking on Google and other search engines.
#. Visitors come back more often, knowing there's always something new.
#. Other sites will want to link to you and your valuable content.

    Sunday, March 21, 2010

    What is "XML sitemap"?

Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. A Sitemap is an XML file that lists the URLs for a site and allows webmasters to include additional information about each URL: when it was last updated, how often it changes, and how important it is in relation to the other URLs on the site. This allows search engines to crawl the site more intelligently. Sitemaps are a URL inclusion protocol and complement robots.txt, a URL exclusion protocol.
Sitemaps are particularly beneficial on large websites and on sites where some pages are not easily discovered by following links. The webmaster can generate a Sitemap containing all accessible URLs on the site and submit it to search engines. Since Google, MSN, Yahoo, and Ask now use the same protocol, having a Sitemap lets the biggest search engines have up-to-date page information.

Sitemaps supplement and do not replace the existing crawl-based mechanisms that search engines already use to discover URLs. Using this protocol does not guarantee that web pages will be included in search indexes, nor does it influence the way that pages are ranked in search results. Google first introduced Sitemaps 0.84 in June 2005 so web developers could publish lists of links from across their sites. Google, MSN and Yahoo announced joint support for the Sitemaps protocol in November 2006. The schema version was changed to "Sitemap 0.90", but no other changes were made.
    In April 2007, Ask.com and IBM announced support for Sitemaps. Also, Google, Yahoo, MS announced auto-discovery for sitemaps through robots.txt. In May 2007, the state governments of Arizona, California, Utah and Virginia announced they would use Sitemaps on their web sites.
    The Sitemaps protocol is based on ideas from "Crawler-friendly Web Servers".
The Sitemap protocol format consists of XML tags. Sitemaps can also be just a plain text list of URLs, and they can be compressed in .gz format.
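A minimal Sitemap file can be generated with Python's standard xml.etree module, which also handles the XML entity escaping for you. A sketch; the lastmod, changefreq, and priority values here are illustrative:

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """urls: list of (loc, lastmod, changefreq, priority) tuples.
    Returns the serialized <urlset> XML as a string."""
    ET.register_namespace("", NS)
    urlset = ET.Element("{%s}urlset" % NS)
    for loc, lastmod, changefreq, priority in urls:
        url = ET.SubElement(urlset, "{%s}url" % NS)
        ET.SubElement(url, "{%s}loc" % NS).text = loc
        ET.SubElement(url, "{%s}lastmod" % NS).text = lastmod
        ET.SubElement(url, "{%s}changefreq" % NS).text = changefreq
        ET.SubElement(url, "{%s}priority" % NS).text = priority
    return ET.tostring(urlset, encoding="unicode")

doc = build_sitemap([
    ("http://seo-service-4-u.blogspot.com/", "2010-03-21", "weekly", "1.0"),
])
print(doc)
```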


    Sitemap Index
    The Sitemap XML protocol is also extended to provide a way of listing multiple Sitemaps in a 'Sitemap index' file. The maximum Sitemap size of 10 MB or 50,000 URLs means this is necessary for large sites. As the Sitemap needs to be in the same directory as the URLs listed, Sitemap indexes are also useful for websites with multiple subdomains, allowing the Sitemaps of each subdomain to be indexed using the Sitemap index file and robots.txt.
    Sitemap limits
    Sitemap files have a limit of 50,000 URLs and 10 megabytes per sitemap. Sitemaps can be compressed using gzip, reducing bandwidth consumption. Multiple sitemap files are supported, with a Sitemap index file serving as an entry point for a total of 1000 Sitemaps.
As with all XML files, any data values (including URLs) must use entity escape codes for these characters: ampersand (&), single quote ('), double quote ("), less than (<), and greater than (>).

    Thursday, March 18, 2010

    All Validation

Validation simply means ensuring, through a test program or an online service, that your web site is written in valid, error-free code. Validating your web pages is a good habit to get into, because it helps ensure your pages function properly in all web browsers and search engines. There are various types of validators: some check only for errors, while others also make suggestions about your code, telling you when a certain way of writing things might lead to unexpected results. Common validators include HTML validators, CSS validators, accessibility validators, and broken-link validators.
HTML Validation: HTML validation simply means ensuring that your web site is written in valid code, with the primary goal that the submitted markup contains no errors. The W3 Consortium has its own online validator which you can use for free: go to http://validator.w3.org/ to validate both HTML and XHTML pages.
CSS Validator: A CSS validator checks your Cascading Style Sheets in the same manner; most check that they comply with the CSS standards set by the W3 Consortium, and a few will also tell you which CSS features are supported by which browsers. You can go to http://jigsaw.w3.org/css-validator/ for free validation of your style sheets.
Accessibility Validators: Accessibility validators allow you to verify that your site is usable by people who are differently abled. You can go to http://www.section508.info/ to check your accessibility.
Broken Link Validators: Broken-link validation means finding out how many links on your web site are broken. Broken links are bad for search engines because crawling robots stop when they hit them, which can keep pages of your site out of Google, Yahoo, and Bing. You can use http://validator.w3.org/checklink to check your links for free.
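The first step of any broken-link validator is collecting every href on a page. A minimal sketch using Python's standard html.parser (actually requesting each URL and checking its status code is left out); the sample links are illustrative:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather every <a href="..."> value so a checker can then
    request each URL and report the ones that fail."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

collector = LinkCollector()
collector.feed('<a href="/page1">one</a><a href="http://example.com/">two</a>')
print(collector.links)
```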
     

    What is Search Engine Optimization ?

Search Engine Optimization: SEO is the process of improving the volume or quality of traffic to a site or blog from search engines via "organic" or "unpaid" search results, as opposed to Search Engine Marketing (SEM), which deals with paid inclusion.
Search Engine Optimization: SEO is the process of improving a web page's position in the search engines for a specific keyword or keyword phrase.
Search Engine Optimization: SEO is the process of getting your website to the top of the search engines.

    Monday, March 15, 2010

    Frames , Java scripts, Flash

About Frames: DON'T USE FRAMES; they are not spiderable by the search engines. A frame is a method by which a browser window can be divided into multiple sections, each containing its own web page. Frames are bad because search engines don't like them and because you normally cannot bookmark pages inside frames. JavaScript is a programming language that runs in the client's browser and makes a page more attractive. Regarding JavaScript and Flash, do not use them excessively. The reason is that spiders cannot read JavaScript (or Flash), so any link or text created by them will not be counted in your indexing, which is a lost opportunity.


    Navigation

The navigation bar of a site is important because it is through your navigation that search engine spiders are able to access all of your web site's content. There are two types of navigation: internal and external. Internal navigation involves the links that move users from one page to another on your site; external navigation refers to links that take users away from your site. For the navigation bar to be SEO-friendly, we have to use both types of navigation carefully.

    Sunday, March 14, 2010

    Image Tag Optimization

Image Tag Optimization: Another important factor in on-page optimization is the image tag, or alt tag. "Alt" stands for alternative: it is the HTML attribute that tells search engines what an image is about. Since search engines cannot read text embedded in images, the alt text describes the image for them. Always add alt text to your images to make sure search engines recognize all the content on your site. Alt text that includes your keywords can also boost your keyword frequency and help you achieve better rankings. That is why image optimization is another aspect of on-site SEO that can improve your search engine rankings and drive traffic to your website.
In this phase of image tag optimization, you should pay special attention to the following:

#. Use keywords that are present in the title tag, meta tags, and body text.

#. Alt text also adds polish to your website and makes it more accessible.

#. Use 2-3 main keywords rather than repeating a single keyword.

#. Alt text should not be long (4-7 words).

#. Use both plural and singular forms of your keywords; using both will help in getting more visitors.
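A quick audit script can flag images whose alt text is missing or empty. A sketch using Python's standard html.parser; the file names are illustrative:

```python
from html.parser import HTMLParser

class AltChecker(HTMLParser):
    """Collect the src of every <img> whose alt attribute is
    missing or empty, so they can be fixed."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt"):
                self.missing.append(a.get("src", "?"))

checker = AltChecker()
checker.feed('<img src="logo.png" alt="SEO services logo"><img src="banner.jpg">')
print(checker.missing)  # ['banner.jpg']
```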

    Creating Sitemap

Creating Sitemap: A page containing links to all the pages in a site is called a sitemap; in other words, a sitemap is a webpage that shows all the different pages in the site. A sitemap can be XML, TXT, or HTML; usually the XML and TXT versions are served for robots. Sitemaps can improve the search engine optimization of a site by making sure that all of its pages can be found.

    Header and Footer

Header: The header appears at the top of the page and is typically the first thing people see when coming to your site. It is placed at the top because people make sweeping judgments about what they are about to see and read. The main page heading is usually set in an h1 HTML tag.
Footer: The footer appears at the bottom of a page and contains the fine print of a website. Basically, footers provide users with the information they are looking for, like contact details and brief information about the website. A footer is sometimes called a running foot.

    Creating Robots File

Creating a robots file is most important. A robots.txt is a file placed on your server to instruct the various search engine spiders not to crawl or index certain sections or pages of your website or blog. The robots.txt file itself is a very simple text file, which can be created in Notepad and placed in your root directory. An example would be seo-service-4-u.blogspot.com/robots.txt. This file tells the various search engines and other robots which areas of your website or blog they are allowed to visit and index. You can have only one robots.txt on your website or blog.

    GOOD : seo-service-4-u.blogspot.com/robots.txt

    BAD : Won't work: seo-service-4-u.blogspot.com/directory/robots.txt

If you are using WordPress, a sample robots.txt file would be:

User-agent: *

Disallow: /wp-

User-agent: * means that all search bots (Google, Yahoo, Bing, Ask, MSN, Alexa, GigaBlast, Baidu, and so on) should use these instructions to crawl your website.

Disallow: /wp- makes sure that the search engines will not crawl the WordPress system files.
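Python's standard urllib.robotparser can confirm what a robots.txt with these two lines actually blocks; example.com here is purely illustrative:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# parse() accepts the file's lines directly, so we can test the
# WordPress sample without fetching anything over the network.
rp.parse([
    "User-agent: *",
    "Disallow: /wp-",
])

print(rp.can_fetch("Googlebot", "http://example.com/wp-admin/"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/about/"))     # True
```

Because the rule is a path prefix, /wp- blocks /wp-admin/, /wp-includes/, and so on in one line.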

Web robots are sometimes referred to as web crawlers or spiders, so the process of a robot visiting your website is called "spidering" or "crawling". When we say that the search engines spidered a website or blog, it means the search engine robots, or web crawlers, have visited it. Each crawler is known by a name and has an independent IP address. The IP address is not important to us, but knowing robot names will help in creating a robots.txt file; this is why the file is called "robots.txt". Following is a list of popular search engines and their bot names:

Specific Search Engine Robots

Engine: Bot

Google.com: Googlebot
Alexa.com: ia_archiver
MSN.com: Msnbot
Altavista.com: Scooter
Excite.com: ArchitextSpider
Euroseek.net: Arachnoidea
Gendoor.com: GenCrawler
Infoseek.com: UltraSeek
Hotbot.com: Slurp
Naver.com: Naverbot, Yeti
Looksmart.com: MantraAgent
Lycos.com: Lycos_Spider_(T-Rex)
Baidu.com: Baiduspider
Cuil.com: Twiceler
GigaBlast.com: Gigabot
Yuntis.com: Gulper
Teoma.com: Teoma_agent1
SearchHippo.com: Fluffy the Spider
AlltheWeb.com: FAST-WebCrawler

Specific Special Bots

Google Image: Googlebot-Image
Google Mobile: Googlebot-Mobile
Yahoo MM: Yahoo-MMCrawler
MSN PicSearch: Psbot

    Wednesday, March 10, 2010

    Body Text

This is what your surfers will actually see when coming to your site. There are many issues to consider when placing keywords in the text of your pages. Most search engines index the full text of each page, so it's vital to place keywords throughout your text; however, each search engine uses different ranking algorithms. The body element contains all the contents of an HTML document: text, hyperlinks, images, tables, lists, etc. These are general rules that everyone should follow:

#. Make sure your main page has your main keywords or keyphrases. It has a higher chance of being indexed than your other pages, and it will be the only page indexed by some engines.

#. The H1, H2, ..., H6 tags are given special relevancy weight, and you should plan to integrate your keywords into your headings. You don't have to go to extremes: just use one H1 for your most important keyword and two H2s, one for each of your secondary keyword phrases.

    #. Bolding and italicizing your keywords at least once doesn’t hurt and actually gives you a very small boost.

#. When creating your content pages, keep the following four concepts in mind: keyword prominence, proximity, density, and frequency are all very important.
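Of those four concepts, keyword density is the easiest to measure: occurrences of the keyword per 100 words of body text. A rough sketch; the function name and sample sentence are illustrative:

```python
import re

def keyword_density(text, keyword):
    """Occurrences of a single-word keyword per 100 words of text."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return 100.0 * hits / max(len(words), 1)

body = "SEO tips: good SEO content needs natural keyword placement."
print(round(keyword_density(body, "seo"), 1))  # 22.2
```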

    Keyword

Keyword: Keywords are the words users type into search engine queries; a word or phrase that is used when we search for a website in the search engines or directories. The meta keywords tag allows you to provide additional text for crawler-based search engines to index along with your body copy. How does this help you? Well, for most major crawlers, it doesn't, because most crawlers now ignore the tag. The meta keywords tag is sometimes useful as a way to reinforce the terms you think a page is important for, on the few crawlers that support it. The act of using keywords to push a web page higher in the search engine rankings is called keyword marketing.

    Description

The META description tag describes your site's content and gives search engine spiders an accurate, keyword-rich summary of your page. These are the general rules you should follow when optimizing your description tag:

    #. Place the keywords phrase at the beginning of your description to achieve the best possible ranking.

#. If possible, don't use stop words like "is, a, and, or".

#. Many search engines use the description to describe your site in their results, so repeat each of your keyword phrases at least once, make the description a true representation of the page the visitor will be viewing, and try to keep it under 255-270 characters.

    Title

The title tag is one of the most important factors in achieving high rankings. A title tag is the HTML code that creates the words that appear in the top bar of your web browser. Usually, the title tag is the first element in the head area of your page, followed by the meta description and meta keywords tags. These are the general rules you should follow when optimizing your title tag:
#. Use at most 3 keyword phrases and 100 characters in your title.
#. Don't use stop words like "a, and, or".
#. Don't repeat the same keyword in your title more than twice.

    Meta Tags

Meta tags, or meta elements, are HTML or XHTML tags used to provide structured metadata about a web page. They must be placed in the head area of the HTML or XHTML document, after the title tag, and can be used to specify the page description, keywords, and more.
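The title and meta tag rules above can be checked programmatically. A sketch using Python's standard html.parser; the class name and sample head snippet are illustrative:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Pull the title and named meta tags out of a page's head."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in a:
            self.meta[a["name"].lower()] = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

audit = MetaAudit()
audit.feed('<head><title>SEO Review</title>'
           '<meta name="description" content="A short summary."></head>')
# Check the title against the 100-character rule from the Title section
print(audit.title, len(audit.title) <= 100)
```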

    Tuesday, March 9, 2010

    Optimizing Web Pages

To get high search rankings, you must optimize your web site to have good quality content that is useful to your visitors. Let's look at the specific things you can do to optimize your web pages for high search rankings. These are the general rules you should follow when optimizing your website or blog:
#. Web pages that have good content for your visitors will likely earn high rankings from the search engines. This is critical for good rankings.
#. Choose a navigation bar that is logical and natural, one that your visitors and the search engines can easily follow.
#. Have each web page contain unique information, so that each page makes its own contribution to the site.
#. Divide your website or blog into categories that are natural and appropriate for its theme. This will help both your visitors and the search engines find the information they are seeking.
#. Use at most 3 keyword phrases and 100 characters in each title, appropriate to each web page. Because each page is unique, these keywords will be unique to their pages.
#. Keep the most important pages within the first two sub-levels of your website.
#. Add a description meta tag to each page that briefly (200 characters or less) describes that page, giving search engine spiders an accurate summary filled with your keywords.
#. In summary, optimize your website or blog by having excellent content in the web pages, and focus each page on particular keywords or keyphrases that are unique to that page and meaningful to your visitors.

    Site Search Functionality

Site search functionality is also important. Once your website or blog carries a lot of information, your readers will want to search it without having to navigate through numerous menus or wade through long lists of options. A search engine or search utility gives your website or blog more interactivity with your readers and encourages them to stay longer. Your users can just type in a few keywords for what interests them, and within seconds they are shown the exact pages on your website that are relevant.

    What is URL Naming

URL is an abbreviation that stands for "Uniform Resource Locator". It's another name for a web address: the address that you type into your internet browser when you want to go to a website. A URL is also known simply as a web address or internet address.
    Format of a URL:
    Protocol://site address/path/filename
    For example, the URL of my company site is:
    http://seo-service-4-u.blogspot.com/
    and a typical page on this site would be:
    http://seo-service-4-u.blogspot.com/2010/03/what-is-url-naming.html

The above URL consists of:
    * Protocol: http
    * Host name (subdomain): seo-service-4-u
    * Domain name: blogspot.com
    * Path: /2010/03
    * File name: what-is-url-naming.html
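Python's standard urllib.parse module splits a URL into these same parts:

```python
from urllib.parse import urlsplit

url = "http://seo-service-4-u.blogspot.com/2010/03/what-is-url-naming.html"
parts = urlsplit(url)

print(parts.scheme)  # http
print(parts.netloc)  # seo-service-4-u.blogspot.com
print(parts.path)    # /2010/03/what-is-url-naming.html
```

Note that urlsplit treats the whole host (subdomain plus domain) as one netloc; splitting out the subdomain is up to you.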
Every domain name has a suffix that indicates which top-level domain it belongs to. For example:
    # gov - Government agencies
    # edu - Educational institutions
    # org - Organizations (nonprofit)
    # mil - Military
    # com - commercial business
    # net - Network organizations
    # ca - Canada
    # th - Thailand

    Sunday, March 7, 2010

    On Page SEO

On Page SEO: On-page SEO means optimizing the code and content of your own pages.
Search engines are constantly improving their algorithms so they can provide more relevant results. A relevant site is one that provides quality content for its readers, so after you have chosen your keyword phrases, build the content of your pages around those keywords.
In this phase, you should pay special attention to the following:
* URL Naming
* Site Search Functionality
* Optimizing Web Pages
* Meta Tags
* Title
* Description
* Keywords
* Body Text
* Creating Robots File
* Header and Footer
* Creating Sitemaps
* Image Tag Optimization
* Navigation
* Frames, JavaScript, Flash
* Validate Your Code and Feeds
* XML Sitemaps
* Good Fresh Content
* Use Headings
* URL Structure
* Internal Linking Strategy

    SEO vs SEM – Which Is Better ?

Many people confuse these two terms and use them interchangeably, but there is a difference between SEO and SEM. SEO is a component of search engine marketing: the act of optimizing a website for organic or natural search engine listings. It is one of the most cost-effective functions of a search engine marketing campaign, but it is only one element of SEM. SEO is the process of improving web pages so they rank highly in search engines for a specific keyword phrase, and it divides into two categories: on-page optimization, which refers to every method that changes the page code and content itself, and off-page optimization, which equals link popularity.
SEM is the act of marketing a website via search engines, whether by improving rank in organic or natural listings, buying paid or sponsored listings, or a combination of both; in other words, SEM is the overall process of combining search engine optimization and pay-per-click advertising.

    What is Search Engine Marketing ?

Search Engine Marketing: SEM is a form of internet marketing that seeks to promote sites or blogs by increasing their visibility in search engine result pages (SERPs) through the use of Search Engine Optimization (SEO), paid placement, contextual advertising, and paid inclusion.
Search Engine Marketing: SEM is an umbrella term that describes the different methods you can use to make your web site more visible on search engines so that you can drive more traffic to your website or blog.
Search Engine Marketing: SEM is the overall process of combining the two processes of search engine optimization and pay-per-click advertising.