How To Get More Traffic From Yahoo and MSN/Live.com

If you're like me, you look at your web stats, see how much traffic you're getting from Google, and wish you could get even half as much from the other big search engines, Yahoo! and MSN/Live.

The bad news is that you aren't likely to get a significant amount of traffic from the other search engines, and the reason is pretty simple: they just don't get nearly as many searches as Google, as you can see from this chart created from Hitwise data. Other services that track search engine usage show similar numbers.

Conversations with other webmasters, responses I've seen on forums and blogs, and my own experience all seem to indicate that even if you rank better in the other search engines, you're still likely to get more traffic from Google. I've seen this even when one of my sites is the first result on one of the other search engines.

There is only so much you can do to get more traffic from Yahoo or Microsoft, but if it looks like you can squeeze a little more out of them, it's worth spending the time. Many people find that while the increase in traffic may not be large, the traffic you do get converts quite well. For instance, on one site I find that visitors from MSN search have a higher AdSense click-through rate (CTR) than visitors from other sources. On another site, I've noticed that Yahoo! visitors tend to buy more products.

This isn't always the case, but it's still worth putting in a little effort to see what happens. I'll cover some of the things you can do to try to increase your search engine traffic from Yahoo and Microsoft.

So far it's been working with Yahoo!. Microsoft is showing promise, but it has still barely indexed the site I'm using as an example.

Already getting more traffic from Yahoo/Microsoft?

Are you already getting more traffic from the other search engines than from Google? As I mentioned before, this is pretty rare. If it's happening to you, take a close look at what you might be doing wrong with Google. Look over your pages, your backlinks, etc.

The major search engines all work in essentially the same way, even though their rankings may differ slightly. It's very unlikely that you've found a niche that gets a lot more searches on one search engine than on the others.

Search Engine Stats from one of my Sites

To the right you'll see two charts showing the search engine traffic on one of my sites. The top chart covers a period before I started working on improving traffic from the Yahoo or Microsoft search engines. You'll notice that most of my search engine traffic comes from Google: over 90%, and that's not counting the traffic from Google's image search, which is the second-highest search engine referrer for that site even though the site doesn't contain many images. The next search engines, in order of the traffic they send, are AOL, MSN/Live and Yahoo!. AOL is powered by Google search, so there isn't much extra you can do there.

As you can see, after I performed some of the steps described below, I started getting a bigger share of my traffic from Yahoo and a bit more from Microsoft. All the search engines wound up sending me more traffic in the second period, which the bottom chart illustrates, so I didn't sacrifice Google traffic for more Yahoo traffic.

This next chart shows the activity of the search engine crawlers: Googlebot, Yahoo Slurp and MSNBot. These are the programs that visit websites and add their pages to each search engine's index.

All the crawlers wound up visiting this site more after the changes, but MSNBot had the largest increase. While I still haven't seen a significant traffic increase from MSN or Live.com, I'm hoping the higher crawl rate is a good sign.

Now let's look at some of the steps I took to get here.

Cleaned up and verified HTML

Here's one thing that sometimes trips up spiders: HTML that isn't well formed. I'm not one of those people who puts badges on my sites to show they contain 100% valid HTML, but I do think validation has some benefits. I use the HTML Validator plug-in for Firefox. The first thing I do is correct all the errors it lists. Then I check the warnings and see whether they need correcting. In my experience, pages don't need to be completely free of errors and warnings; browsers and crawlers can cope. In a video, Matt Cutts from Google cited a stat showing that a large portion of the pages out there aren't valid, and search engines understand this and compensate. The reason I check validation is that it helps me find the errors that may actually trip up crawlers.

Things like unterminated strings in tag attributes, tags that were never closed, typos in tag names, etc. can cause you problems.
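To give a made-up example of the kind of markup a validator flags (the link and text here are just placeholders):

    <a href="/contact.html>Contact us</a>    (the closing quote on the href attribute is missing)
    <p>Latest news                           (the paragraph tag is never closed)

Corrected, those lines become:

    <a href="/contact.html">Contact us</a>
    <p>Latest news</p>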

Cleaned up JavaScript

This site uses quite a bit of JavaScript, and I went through it to clean up the code. Wherever I could, I pulled the little inline bits of code out into one of the external .js files. The less stuff in the HTML that crawlers don't care about, the better.

Since I was working in the JavaScript anyway, I also cleared out unused functions and checked for errors that didn't stop the code from running but could still cause problems.
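As a simple illustration (the function and file name here are made up, not my actual code), an inline block like this in the page:

    <script type="text/javascript">
      function toggleDetails(id) {
        // show or hide the extra detail row with the given id
        var row = document.getElementById(id);
        row.style.display = (row.style.display == 'none') ? '' : 'none';
      }
    </script>

can be moved into the external file and replaced with a single reference, so the crawler has that much less to wade through:

    <script type="text/javascript" src="/js/site.js"></script>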

Added some more content

The site gets updated on a regular basis, but I also expanded the amount of content actually on each page to provide more detail for users and more keywords for crawlers.

While I was at it, I reorganized the pages so that the most important content shows up as early in the HTML as possible.
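One common way to do this (a simplified sketch, not my actual markup) is to put the main content block before the sidebar in the source and let CSS handle the visual layout:

    <style type="text/css">
      #content { float: left; width: 75%; }   /* main content comes first in the source */
      #sidebar { float: right; width: 25%; }  /* sidebar displays beside it but appears later in the HTML */
    </style>
    <div id="content">The article text and keywords crawlers should see first...</div>
    <div id="sidebar">Navigation, ads, widgets...</div>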

Extended the Domain Name Registration

I had registered this domain for only one year. I should have registered it for longer, but I was a bit hesitant. Once I saw that the site was worth keeping up, I extended the registration. Search engines don't like domains that are new and registered for only a year: many spammers use throw-away domains, so search engines prefer domains whose owners indicate they plan to keep them for a while.

This was a mistake on my part; for this website I should have known better. It took some time to get indexed, but because I was doing well with Google, some notable sites found this one and linked to it, which helped Yahoo pay attention.

Use a Sitemap

A Sitemap is an XML document where you list the URLs on your site and provide some extra information to help search engines index it better. These aren't the HTML sitemap pages that many sites used to have, which seem to have fallen out of favor lately. You can learn about Sitemaps from Google.

Sitemaps don't guarantee that your pages will be indexed or necessarily improve your rankings, but they will help search engines know more about your site and the pages on it.

I actually already had a Sitemap set up for this site, and all the search engines seemed able to access it fine, but I noticed it wasn't validating. It was well-formed XML, but it was missing the namespace attribute. You can use this Sitemap validator to check yours.
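For reference, here's what a minimal Sitemap looks like with the namespace attribute in place (the URL, date and other values are just placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2008-06-15</lastmod>
        <changefreq>daily</changefreq>
        <priority>1.0</priority>
      </url>
    </urlset>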

Submit your Sitemap to Yahoo!

Using Yahoo Site Explorer, you can let Yahoo! know where to find your Sitemap. When you get to Site Explorer, enter the URL of your Sitemap and click the Add My Site button.

Then just follow the steps provided. One of the steps involves proving to Yahoo! that you own the site in question. This is a very important step. They have clear directions, and since the process may change, I'll leave it up to you to follow their instructions. Right now you can verify either by placing a meta tag on your home page or by uploading a file to your site.

Add a Sitemap to MSN/Live.com

Microsoft provides tools for webmasters where you can create an account and let them know the location of your sitemap. Just follow the steps they give you. You'll also need to verify your site as you had to with Yahoo! and Google.

Microsoft seemed to be the search engine having the most trouble with the invalid Sitemap. After I made sure the XML was valid, things started to improve with MSNBot.

Monitor your Crawl Stats

All three major search engines provide information and statistics that let you know how well your site is being crawled and whether there are any issues. After you submit your Sitemaps, log in to check for problems and see how many pages are indexed.

Check to make sure that all your important pages are being indexed and if they're not, try to figure out why.

On the site we're talking about, Google indexed it quickly and continues to crawl and index new content very quickly. Yahoo is a bit slower, but it eventually crawled the whole site.

Microsoft is lagging behind the other two. While Yahoo and Google have indexed the entire site, Microsoft has only indexed about 10% of it, even after I made the changes. The increased MSNBot traffic makes me hopeful that this will keep growing, but plenty of people complain about Microsoft's poor ability to crawl sites.

This is very odd, because there are thousands of backlinks to this site, it isn't blocked, and Microsoft gives the domain a score of 5 out of 5. No crawl issues are identified either.

Ping Sitemap Changes

I added code to this application to ping the search engines and let them know when my Sitemap has changed. The site is updated and new pages are added regularly, but I didn't want to ping for every single change; there may be a penalty for sending too many pings, so I didn't want to overdo it. Instead, I set up a thread that pings the search engines a few times a day if the Sitemap has changed. Yahoo! and Google already visit the site frequently enough to pick up changes naturally anyway.

If you'd like to automatically ping changes to your Sitemap, below are the URLs you'll need. Replace www.example.com/sitemap.xml with the URL of your own Sitemap.

Ping Yahoo when Sitemap is updated

http://search.yahooapis.com/SiteExplorerService/V1/ping?sitemap=http://www.example.com/sitemap.xml

Ping Microsoft when Sitemap is updated

http://webmaster.live.com/webmaster/ping.aspx?siteMap=http://www.example.com/sitemap.xml

Ping Google when Sitemap is updated

http://www.google.com/webmasters/tools/ping?sitemap=http://www.example.com/sitemap.xml
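If your site runs something that can schedule a background job, the pinging itself is easy to script. Here's a rough sketch in Python (not the actual code from my application; the Sitemap URL, file path and check interval are just placeholders) that hits the three ping URLs above whenever the local Sitemap file changes:

    import os
    import time
    import urllib.parse
    import urllib.request

    SITEMAP_URL = "http://www.example.com/sitemap.xml"   # public URL of your Sitemap
    SITEMAP_FILE = "/var/www/sitemap.xml"                # local file to watch (placeholder path)

    # The ping endpoints listed above; each takes the Sitemap URL as a query parameter.
    PING_ENDPOINTS = [
        "http://search.yahooapis.com/SiteExplorerService/V1/ping?sitemap=",
        "http://webmaster.live.com/webmaster/ping.aspx?siteMap=",
        "http://www.google.com/webmasters/tools/ping?sitemap=",
    ]

    def ping_search_engines(sitemap_url):
        # Send a simple GET request to each ping URL and report the result.
        for endpoint in PING_ENDPOINTS:
            ping_url = endpoint + urllib.parse.quote(sitemap_url, safe="")
            try:
                with urllib.request.urlopen(ping_url, timeout=10) as response:
                    print("Pinged %s: HTTP %d" % (ping_url, response.getcode()))
            except Exception as error:
                print("Ping failed for %s: %s" % (ping_url, error))

    def watch_and_ping():
        # Check a few times a day and only ping if the Sitemap file actually changed.
        last_modified = os.path.getmtime(SITEMAP_FILE)
        while True:
            time.sleep(6 * 60 * 60)   # wait six hours between checks
            modified = os.path.getmtime(SITEMAP_FILE)
            if modified != last_modified:
                last_modified = modified
                ping_search_engines(SITEMAP_URL)

    if __name__ == "__main__":
        watch_and_ping()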

Block Duplicate Pages

This site has different views of some pages to make things easier for users, plus a search function and ways to group the data together. I decided to add NOINDEX meta tags to those pages to keep them from confusing the search engines.

After those pages have been de-indexed, I'll add rules to robots.txt to keep them from being crawled again.
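For example, each of the duplicate views gets a tag like this in its head section (I'm showing "noindex, follow" so links on those pages can still be followed; plain "noindex" works too):

    <meta name="robots" content="noindex, follow">

and once they've dropped out of the index, a robots.txt entry along these lines keeps the crawlers off them entirely (the /search/ path is just an example; use whatever paths your duplicate views live under):

    User-agent: *
    Disallow: /search/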

In addition to removing duplicate content issues, I'm hoping this helps the crawlers spend more time on the more important pages of my site.

Results so far

As I mentioned, I'm already seeing good results across all the search engines from the changes I've implemented, but Microsoft is still doing very poorly. I need to find out whether there are any more problems; as it stands, they have almost five times as many pages indexed as before, but even Ask.com has more of my pages in its index than Live.com does.

