NEED FOR LINK POPULARITY

Link popularity matters a great deal to today's search engines. Most of the top engines treat it as an important ranking factor, and Google in particular gives considerable weight to links. That makes link popularity central to search engine optimization.

In search engine optimization you cannot expect natural links from everyone. There are two remedies for this: buying links or exchanging reciprocal links. Buying links is costly, so reciprocal links are usually the preferred option.

When it comes to link building, be careful not to link to bad neighborhoods. Linking to link farms, FFA (free-for-all) pages, redirect spam sites, and the like can hurt both a site's rankings and its traffic. It is better to stay on the safe side, avoid linking to bad neighborhoods altogether, and check the status of your links regularly.

Have you seen the Google Directory? Some sites at the bottom of the directory have no PageRank at all; most of them have been penalized by Google. So be careful when you choose your link building tactics.

There are many different ways to present reciprocal links. One is to clearly designate them as reciprocal links and confine them to specific areas or pages of your site. Another is to incorporate links to valuable resources within your own content. You can also choose to keep reciprocal links separate from other links or to mix them all together.

STRATEGIES OF LINK BUILDING

Acquiring text links is one of the most important steps in search engine optimization. This does not, however, mean that buying irrelevant links or links from pages with low PageRank will help your site. It is also important that the links on individual pages are relevant to your keywords.

Hence, there are some important strategies for link building. Following them greatly improves the chances that your site will rank above those of your competitors.

PageRank - PageRank is the primary consideration when it comes to link building. Only links from pages with high PageRank carry real value, because those are the links that can boost your site's rankings with search engines such as Google, Yahoo!, AltaVista, and MSN.

Page Rank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page's value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves "important" weigh more heavily and help to make other pages "important."
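
To make the "voting" idea concrete, here is a minimal sketch of the iterative PageRank calculation in Python. The three-page link graph and the damping factor of 0.85 are illustrative assumptions for this example, not Google's actual data or implementation.

    # A minimal PageRank sketch over a hypothetical three-page link graph.
    # links[page] lists the pages that `page` links out to (all names are made up).
    links = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
    }

    def pagerank(links, damping=0.85, iterations=50):
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}            # start with equal rank for every page
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / n for p in pages}
            for page, outlinks in links.items():
                share = rank[page] / len(outlinks)    # each outgoing link gets an equal share of the "vote"
                for target in outlinks:
                    new_rank[target] += damping * share
            rank = new_rank
        return rank

    print(pagerank(links))   # pages that important pages link to end up with higher scores

In this toy graph, page C collects votes from both A and B and ends up with the highest score, which mirrors the idea that important pages help make the pages they link to important.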

Linking to different sites - While links are considered significant in the ranking process, Google also makes sure its results are not manipulated. As part of this control, it checks that your links do not all come from the same site.

Personalize your link exchange letter - Webmasters prefer to exchange links with sites that send personalized letters rather than bulk mail. A personalized mail sets you apart from the spammers and shows that you have actually visited their site. You are more likely to receive reciprocal links when you send such mails.

Relevancy of links - Ensure that the links you receive are relevant to your site. By linking to related sites you also make sure that you don't lose out on business. A few relevant sites linking to your site will benefit you more than a thousand irrelevant links.

Different IP address - Check your links and make sure that they come from different IP addresses. If all the links come from the same IP address, search engines will be able to cross-reference them and your site may be penalized.
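
A quick way to spot-check this is to resolve the IP address of each linking domain and look for shared address blocks. The snippet below is a minimal sketch using only Python's standard library; the domain names are placeholders.

    import socket
    from collections import Counter

    # Hypothetical list of domains that link to your site.
    linking_domains = ["example-partner-one.com", "example-partner-two.net", "example-blog.org"]

    ips = {}
    for domain in linking_domains:
        try:
            ips[domain] = socket.gethostbyname(domain)    # resolve each domain to an IP address
        except socket.gaierror:
            ips[domain] = "unresolved"

    # Flag address blocks (first three octets) that appear more than once.
    blocks = Counter(ip.rsplit(".", 1)[0] for ip in ips.values() if ip != "unresolved")
    for block, count in blocks.items():
        if count > 1:
            print(f"Warning: {count} linking domains share the {block}.* block")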

History of Search Engines

Where would we be without 'em?
Our experience of the Internet is often facilitated through the use of search engines and search directories. Before they were invented, people’s Net experiences were confined to plowing through sites they already knew of in the hopes of finding a useful link, or finding what they wanted through word of mouth.

As author Paul Gilster puts it in Digital Literacy: "How could the world beat a path to your door when the path was uncharted, uncatalogued, and could be discovered only serendipitously?"

This may have been adequate in the early days of the Internet, but as the Net continued to grow exponentially, it became necessary to develop a means of locating desired content.

At first search services were quite rudimentary, but in the course of a few years they have grown quite sophisticated.

Not to mention popular. Search services are now among the most frequented sites on the Web with millions of hits every day.

Even though there is a difference between search engines and search directories (although less so every day), I will adopt the common usage and call all of them search engines.

Archie and Veronica

The history of search engines seems to be the story of university student projects evolving into commercial enterprises and revolutionizing the field as they went. Certainly, that is the story of Archie, one of the first attempts at organizing information on the Net. Created in 1990 by Alan Emtage, a McGill University student, Archie archived what at the time was the most popular repository of Internet files, Anonymous FTP sites.

Archie is short for "archives"; the name was trimmed to conform to the UNIX convention of short program names.

What Archie did for FTP sites, Veronica did for Gopherspace. Veronica was created in 1993 at the University of Nevada. Jughead was a similar Gopherspace index.

Robots
Archie and Veronica were for the most part indexed manually. The first real search engine, in the sense of a completely automated indexing system, was MIT student Matthew Gray's World Wide Web Wanderer.

The Wanderer robot was intended to track the growth of the Web, initially counting only web servers. Soon after its launch it began capturing URLs as well. The resulting list formed the first database of websites, called Wandex.

Robots at this time were quite controversial. For one thing, they consumed a lot of network bandwidth, and they would index sites so rapidly that it was not uncommon for a robot to crash a server.

In the Glossary for Information Retrieval Scott Weiss describes a robot as:

[a] program that scans the web looking for URLs. It is started at a particular web page, and then accesses all the links from it. In this manner, it traverses the graph formed by the WWW. It can record information about those servers for the creation of an index or search facility.

Most search engines are created using robots. The problem is that, if not written properly, a robot can make a large number of hits on a server in a short space of time, degrading the system's performance.
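
To illustrate how a well-behaved robot avoids exactly that problem, here is a minimal sketch in Python of a crawler that follows links breadth-first and pauses between requests. The starting URL is a placeholder, and the sketch deliberately ignores robots.txt and many other real-world concerns.

    import time
    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkParser(HTMLParser):
        """Collects the href attribute of every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=10, delay=1.0):
        seen, queue, index = set(), [start_url], {}
        while queue and len(index) < max_pages:
            url = queue.pop(0)
            if url in seen:
                continue
            seen.add(url)
            try:
                with urllib.request.urlopen(url, timeout=10) as response:
                    html = response.read().decode("utf-8", errors="replace")
            except Exception:
                continue                       # skip pages that fail to load
            index[url] = len(html)             # record something about the page (here, just its size)
            parser = LinkParser()
            parser.feed(html)
            queue.extend(urljoin(url, link) for link in parser.links)
            time.sleep(delay)                  # pause between requests so the server is not overloaded
        return index

    # Example (placeholder URL): crawl("http://example.com/")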

The First Web Directory

In response to the problems with automated indexing of the Web, Martijn Koster created Aliweb (Archie-Like Indexing for the Web) in October 1993. This was the first attempt to create a directory just for the Web.

Instead of relying on a robot, webmasters submitted a file containing their URL and their own description of it. This allowed for more accurate, detailed listings.

Unfortunately, the submission file was difficult to fill out, so many websites were never listed with Aliweb.

Spiders

By December 1993, three more robots, now known as spiders, were on the scene: JumpStation, the World Wide Web Worm (developed by Oliver McBryan and bought out by Goto.com in 1998), and the Repository-Based Software Engineering (RBSE) spider.

RBSE took the important step of listing results based on their relevancy to the keyword. This was crucial: prior to that, results came in no particular order, and finding the right location could require plowing through hundreds of listings.

Excite was launched in February 1993 by Stanford students and was then called Architext. It introduced concept based searching. This was a complicated procedure that utilized statistical word relationships, such as synonyms. This turned up results that might have been missed by other engines if the exact keyword was not entered.

WebCrawler, launched on April 20, 1994, was developed by Brian Pinkerton of the University of Washington.

It added a further degree of accuracy by indexing the entire text of webpages. Other search engines indexed only URLs and titles, which meant that some pertinent keywords might never be indexed. Full-text indexing also greatly improved the relevancy rankings of its results.

As an interesting aside, WebCrawler offers an insightful service, WebCrawler Search Voyeur, that lets you view what people are searching for as they enter their queries. You can even stop it and see the results.

Search Directories

There was still the problem that searchers had to know what they were looking for, which, as I can attest, is often not the case. The first browsable Web directory was EINet Galaxy, now known as Tradewave Galaxy, which went online in January 1994. It made good use of categories, subcategories, and so on.

Users could narrow their search until presumably they found something that caught their eye.

It still exists today and offers users the opportunity to help coordinate directories, becoming active participants in cataloguing the Internet in their own field. It was Yahoo!, however, that perfected the search directory.

Yahoo! grew out of the webpages of two Stanford University students, David Filo and Jerry Yang, which listed their favourite links (such pages were quite popular back then).

Started in April 1994 as a way to keep track of their personal interests, Yahoo soon became too popular for the university server.

Yahoo's user-friendly interface and easy-to-understand directories have made it the most used search directory. But because everything is reviewed and indexed by people, its database is relatively small, covering approximately 1% of webpages.

The Big Guns

When a search fails on Yahoo it automatically defaults to AltaVista’s search.

AltaVista was late onto the scene in December 1995, but made up for it in scope.

AltaVista was not only big, but also fast. It was the first to adopt natural language queries as well as Boolean search techniques. And to aid in this, it was the first to offer "Tips" for good searching prominently on the site. These advances made for unparalleled accuracy and accessibility.

But AltaVista had competition: HotBot, introduced on May 20, 1996 by Paul Gauthier and Eric Brewer at Berkeley. Powered by the Inktomi search engine, it was initially licensed to the Wired magazine website. It has occasionally boasted that it can index the entire Web.

Indexing 10 million pages per day, it is arguably the most powerful search engine.

Meta-Engines

The next important step in search engines was the rise of meta-engines. Essentially they don't offer anything new: they simultaneously compile search results from several different search engines and then list the results according to their collective relevancy.

The first meta-engine was MetaCrawler, developed in 1995 by Erik Selberg, a Master's student at the University of Washington; it is now part of Go2net.com.

Skewing Relevancy

Prior to Direct Hit, launched in the summer of 1998, there were two types of search engines: author-controlled services, such as AltaVista and Excite, in which results are ranked by keyword relevancy, and editor-controlled services, such as the directories Yahoo and LookSmart, in which people manually decide on placement.

Direct Hit, as inventor Gary Culliss relates, "represents a third kind of search, one that's user-controlled, because search rankings are dependent on the choices made by other users." As users choose to follow a listed link, Direct Hit records that data and uses the collected hit ratio to calculate relevancy. So the more people who go to a site from Direct Hit, the higher it will appear in the results.

Google, which has run as a research project at Stanford University since late 1997, also attempts to improve relevancy rankings. Google uses PageRank, which basically monitors how many sites link to a given page. The more sites, and the more important the sites, that link to a given page, the higher its ranking in the result list.

It does give a slight advantage to .gov and .edu domains. Basically, it is trying to do what Yahoo does but without the need for costly human indexing.

Is This Fair?
Another way of fixing relevancy rankings is by selling prominent placement, as Goto.com does. Goto.com was founded by Bill Gross's idealab, and the practice caused quite a controversy. Apparently, there was some doubt as to the actual relevancy of its paid prominent listings. Goto insists that its clients must adhere to a "strict policy" of relevance to the corresponding keywords.

Their corporate site defends its approach:

"In other search engines, there is no cost to spamming or word stuffing or other tricks that advertisers use to increase their placement within search results. When you get conscious decisions involved, and you associate a cost to them, you get better results... GoTo uses a revolutionary new principle for ranking search results by allowing advertisers to bid for consumer attention, and lets the market place determine the rankings and relevance."

For the right amount of money you can ensure your site is placed #1.
Check out the words that are still "unbidden".

Look similar? That is now for the courts to decide: in February 1999, Goto.com filed suit against the Disney-owned Go Network over the resemblance between the two services' logos.

Finding a niche

While some search engines try to index the entire Web, others have found a niche by narrowing their field to a specific subject or geographical region. Argos was the first to offer a Limited Area Search Engine. Launched on October 3, 1996, it indexes only sites dealing with medieval and ancient topics, and a panel decides whether a site is suitable for inclusion.

Its mandate was to combat problems such as this example (from its site):

"At the time of this writing, a search for "Plato" on the Internet search engine, Infoseek, returned 1,506 responses. Of the first ten of these, only five had anything to do with the Plato that lived in ancient Greece, and one of these was a popular piece on the lost city of Atlantis. The other five entries dealt with such things as a home automation system called, PLATO(tm) for Windows, and another PLATO(r), an interactive software package for the classroom. Elsewhere near the top of the Infoseek list was an ale that went by the name of Plato, a guide to business opportunities in Ireland, and even a novel called the "Lizard of Oz."

Such specializing has also proven effective for MathSearch, Canada.com, and hundreds of others.

Ask Jeeves' niche is making searching easier for the average user. (Who really knows Boolean anyway?) Founded in 1996, but not widely used until recently, Ask Jeeves takes a more human approach, refining natural language queries so that users can ask normal questions, for example, "Whatever happened to Upper Volta?"

When a question is asked, Ask Jeeves matches it against similar queries it has already answered and offers those as its results. This is meant to guide users to the desired location when they might not know how else to find it.

The Next Generation

There is no denying that these sites are among the most popular websites. They mark the daily entry point into the Web experience.

Search engines are trying to offer more and to be more, whether it is Northern Light's private fee-based online library or Yahoo offering free email and content (news, horoscopes, etc.). Search engines are continuing to evolve.

We are seeing more sophisticated spiders for finding and indexing sites, more user-friendly searching techniques and interfaces, expanding databases, and improved relevancy of results.

(Now if they could just make some money doing it, as most of the companies mentioned continue to operate at a loss.)

As I learned while researching this topic, search engines may open up the door to the World Wide Web, but not without some difficulty. Searching is far from easy or perfect.

As the Web continues to grow rapidly, the need for better search engines only increases.

Link farms and their reputation

A link farm is any collection of websites that all hyperlink to every other site in the group. Although some link farms can be built by hand, most are created through automated programs and services. A link farm is a form of spamdexing: a way of spamming a search engine's index. Other link exchange systems, which are designed to let individual websites selectively exchange links with related websites, are not considered a form of spamdexing.

Because some webmasters still believe that link weighting influences search engine results on Google, Yahoo!, MSN, and Ask, link farms remain a popular tool for inflating PageRank or its perceived equivalents. PageRank-like measurements apply only to the individual pages being linked to, so those pages must in turn link out to other pages in order for the link weighting to help.

The expression "link farm" has always carried with it a derogative reputation. Many reciprocal link management service operators advertise the value of their resource management and direct networking relationship building. The reciprocal link management services support their industry as an alternative to search engines for finding and attracting visitors to Web sites.

Valuable tactics for link building

Mention without a Link - This one is amazingly effective. The goal is to identify sites or pages that already mention your brand, product, service, or website but have failed to include the actual HTML link. Just send them a pleasant, personal email and request the link; success rates can be very high. To find these willing linkers, you can use Yahoo!'s advanced search parameters (a rough way to check candidate pages programmatically is sketched after this list of tactics).

Profile Sites - All those "Web 2.0" sites that permit the creation of profiles with links, from Frappr and Newsvine to MySpace and Yahoo! 360 to Digg, Del.icio.us, and StumbleUpon, are goldmines for links. Even those that don't have direct links enabled often allow you to submit sites or describe what you've "tagged" or visited.

If there's a particularly strong site in your sector that you desperately want a link from, this tactic can be of occasional use: write a news article with some authority and request a quote from the company or individual you want a link from.

Sites from Citysearch and Yellow Pages to Google provide local links if you sign up with them, and in nearly every mid-to-large metropolitan area there are literally hundreds of directories and lists of local companies in every possible category. Often they're free, and even when they aren't, the prices to be listed are fairly inexpensive.
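
Returning to the first tactic above, the sketch below shows one rough way to check whether a candidate page mentions your brand without linking to your domain. The brand name, domain, and example URL are placeholders, and real pages may need more careful parsing than this.

    import re
    import urllib.request

    BRAND = "Example Widgets"           # placeholder brand or product name
    MY_DOMAIN = "examplewidgets.com"    # placeholder domain you want links to

    def unlinked_mention(page_url):
        """Return True if the page mentions the brand but never links to our domain."""
        with urllib.request.urlopen(page_url, timeout=10) as response:
            html = response.read().decode("utf-8", errors="replace")
        mentions_brand = BRAND.lower() in html.lower()
        links_to_us = re.search(r'href=["\'][^"\']*' + re.escape(MY_DOMAIN), html, re.I) is not None
        return mentions_brand and not links_to_us

    # Example: if unlinked_mention("http://example-blog.org/review") returns True,
    # that page is a good candidate for a polite link request email.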

Contextual Link Building

A powerful technique for increasing link popularity is contextual link building. It involves having blogs and websites write about your keyword topics. In addition to having your topics discussed by these authors, they link back to your website from within a relevant, on-topic post, using anchor text. We contact bloggers asking them to write about strong content on your site; if necessary, we will even create original articles for them. According to Google and the other major search engines, contextual links are some of the best inbound links you can obtain for your site, and they provide permanent text links to your website at a very cost-effective rate.

Microsoft adCenter - A beginner's guide to Microsoft pay per click

As most of us are aware, pay per click (PPC) search engine marketing is an advertising strategy where you, as an advertiser, pay a search engine every time a potential customer clicks on your ad. These ads appear either on search engine results pages or on sites within a search engine's network of partners. The growth of the search industry worldwide has created a big market for paid search advertising, and hence most search engines have developed some type of PPC program, tied, of course, to performance.

To appear in the PPC results, advertisers sign up for the PPC program and create short text, image, or video ads describing the products or services on their site in a way that will tempt searchers to visit it. While setting up the program, advertisers determine which trigger keywords or phrases they wish to bid on and how much they are willing to pay each time a visitor clicks on their ad.



If you look at a search engine results page (SERP), you will be able to distinguish between regular organic search listings and PPC search results, which are actually paid advertisements. The paid ones are generally listed under "sponsored results" or "featured listings" and consist of specially designed text, image, or video ads that are triggered to display when your target keywords are used in a search query.

Microsoft adCenter (formerly MSN adCenter) is Microsoft's latest PPC offering. It is the division of MSN entrusted with providing pay per click advertisements. Microsoft has joined the Big Three league - the other two being Google and Yahoo - by developing its own system for delivering PPC ads.

Strictly speaking, Microsoft is no stranger to PPC: until recently, all of the ads displayed on the MSN search engine were supplied by Overture and later by Yahoo!. MSN claimed a percentage of the ad revenue in return for displaying Yahoo!'s ads on its search engine. As search marketing expanded and presented great opportunities, Microsoft developed its own system, Microsoft adCenter, for selling PPC advertisements directly to advertisers.

Microsoft adCenter lets you bid on the keywords or phrases you associate with your ads (see http://advertising.microsoft.com/wwdocs/user/en-us/adexcellence/flash/18_Bidding/index.html). This bid is the maximum amount you are willing to pay when a user searches for one of your keywords and clicks your ad. You can also raise your bid if you wish to reach a specific audience that fits your buyer profile. Generally, the higher the bid, the more likely your ad will show above your competitors'.

It is good to know that Microsoft sets store by the ability to build brand awareness with its PPC program: your brand gets continued exposure to a wider audience regardless of how many clicks your ad attracts. This is an important feature of PPC programs, though Google and Yahoo have been reluctant to use this strategy.

Borrowing ideas from Google AdWords, Microsoft adCenter uses both the maximum amount an advertiser is willing to pay per click and the advertisement's click-through rate (CTR) to determine how frequently an advertisement is shown. This dual approach is said to encourage advertisers to write effective ads and to advertise only on searches that are relevant to their advertisement.
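
A minimal sketch of that dual calculation might look like the following. The advertisers, bids, and click-through rates are invented for illustration, and real ad platforms factor in many more signals than bid and CTR alone.

    # Hypothetical advertisers bidding on the same keyword: (name, max bid in dollars, click-through rate).
    ads = [
        ("Advertiser A", 0.50, 0.04),
        ("Advertiser B", 0.80, 0.01),
        ("Advertiser C", 0.40, 0.06),
    ]

    # Rank the ads by bid x CTR, so a cheaper but more clicked-on ad can outrank a higher bid.
    ranked = sorted(ads, key=lambda ad: ad[1] * ad[2], reverse=True)

    for position, (name, bid, ctr) in enumerate(ranked, start=1):
        print(f"{position}. {name}: bid ${bid:.2f} x CTR {ctr:.0%} = score {bid * ctr:.4f}")

In this made-up example, Advertiser C wins the top position despite having the lowest bid, because its high click-through rate gives it the best combined score.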

Microsoft adCenter also lets advertisers target their ads by restricting them to a given set of locations. Similarly, adCenter allows advertisers to run their ads only on selected days of the week or at specific hours of the day.

Tips to get more traffic with long tail keywords that convert into sales

In pay per click management, keyword targeting is always a hot topic, and the use of long tail keywords is one of the most discussed tactics. Long tail keywords will bring additional traffic to your sites; all you need to do is spend time optimizing those specific long tail keywords for better search engine rankings.

Once you decide to optimize a keyword, you can build more traffic to your site in two ways:
1. Write more and more content using those keywords. This is one of the most popular ways to optimize your long tail keywords, and your site as well.
2. If a particular long tail keyword is bringing increasing traffic to your site, add more offers, special discounts, or other deals that convert that traffic into sales, which in turn helps you achieve your return on investment and your advertising goals.
These two approaches will help your pay per click campaign turn the traffic you are already receiving into more sales.

How to make your ad relevant with the help of keyword matching options

According to AdWords, there are four keyword matching options that help determine when your ad appears in a Google search. Every keyword you target can be set to use any of these four options.

The four keyword matching options are:

Broad match: keyword
This option allows your ad to be shown for similar phrases. For example, if your ad contains the keyword "wrist band", then your ad will be eligible to appear even when the user searches for either or both of those words. Your ad may show for searches such as wrist, band, buy wrist band, wrist watch, etc.

Phrase match: "keyword"
This option allows your ad to be shown for the exact phrase. For example, if your ad contains the keyword "wrist band", then your ad will be eligible to appear when the user's search contains the exact phrase "wrist band". The ad can also appear for searches that include other terms, as long as they contain that exact phrase.

Exact match: [keyword]
This option allows your ad to be shown for the exact phrase and nothing else. For example, if your ad contains the keyword "wrist band", then your ad will be eligible to appear only when the user searches for exactly "wrist band". The ad will not be eligible to appear if the search contains any other terms.

Negative match: -keyword
This option ensures that your ad is not shown for any search that contains the specified term.

Keyword matching simply requires you to add the appropriate punctuation to a keyword when setting up the target. With appropriate matching you will get more relevant ad impressions, clicks, and conversions, which in turn helps you meet your ROI goals.
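
To summarize how the four options behave, here is a minimal sketch that mimics the matching logic for the "wrist band" examples above. It is only an approximation for illustration, not AdWords' actual matching algorithm (which also handles plurals, misspellings, and related terms).

    def matches(query, keyword, match_type, negatives=()):
        """Rough simulation of broad, phrase, and exact matching with negative keywords."""
        q_words = query.lower().split()
        k_words = keyword.lower().split()
        if any(neg.lower() in q_words for neg in negatives):   # negative match: never show the ad
            return False
        if match_type == "broad":
            return any(word in q_words for word in k_words)    # any keyword word appears in the query
        if match_type == "phrase":
            return keyword.lower() in query.lower()            # exact phrase present, extra terms allowed
        if match_type == "exact":
            return q_words == k_words                          # the query is exactly the keyword
        return False

    print(matches("buy wrist band", "wrist band", "broad"))                        # True
    print(matches("buy wrist band", "wrist band", "phrase"))                       # True
    print(matches("buy wrist band", "wrist band", "exact"))                        # False
    print(matches("free wrist band", "wrist band", "broad", negatives=["free"]))   # False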