Save Time with Generic Pages!
Why Create Engine-Specific Pages?
A few years ago, it was common practice to create separate pages for each search engine for each important keyword phrase, then use a robots.txt file to keep the other engines out of those engine-specific pages. However, most of us have now changed our philosophy. After all, that "old" strategy was extremely time-consuming!
We can save an enormous amount of time and energy if we create a "generic" page to begin with. Why go to the trouble to create engine-specific pages when a generic page will work just as well?
So, let's talk about how to work with generic pages effectively, and then
how to take those generic pages and make them engine specific, if needed.
Important Note
As any of my Search Engine Workshop students will tell you, I
believe very strongly in focus, focus, focus. So, when working with your
generic pages, you'll want to create one page focused on one keyword
phrase only. Don't bring in other keyword phrases on the same page. FOCUS!
My Search Engine Workshop students will also tell you that I believe each page of your site should be:
Of value to the search engines, and
Of value to your visitors.
If a page isn't of value to both, the page is junk and needs to be deleted from your site.
Creating Generic Pages . . .
To create a generic page, you simply optimize a page in a very general
manner, using META tags, heading tags, link text, good quality content,
and so forth. Some of the engines don't consider META tags, and that's
okay. Using them won't hurt your rankings for those engines.
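As a simple illustration, a generic page optimized for a single keyword phrase might look something like the sketch below. The phrase "organic dog treats," the file name, and the copy are all hypothetical; the point is that the one phrase appears in the title, the META tags, the heading tag, the body copy, and the link text, with nothing else competing for focus.
<html>
<head>
<!-- One keyword phrase, used consistently throughout the page -->
<title>Organic Dog Treats: All-Natural Treats Your Dog Will Love</title>
<meta name="description" content="Organic dog treats made from all-natural ingredients, with tips on choosing healthy treats for your dog.">
<meta name="keywords" content="organic dog treats">
</head>
<body>
<h1>Organic Dog Treats</h1>
<p>Body copy that genuinely answers what your visitors want to know about
organic dog treats ...</p>
<!-- Link text carrying the keyword phrase -->
<a href="organic-dog-treat-recipes.html">Organic dog treat recipes</a>
</body>
</html>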
Target your audience by using Wordtracker
to find the best keywords focused on the information your target audience
is actually looking for when they go online.
Remember: people use the Internet to look for information. Provide
that information, and you're strengthening your Web site and online
presence.
WebPosition Gold users, be sure to run your page through Page Critic,
choosing the HotBot search engine. HotBot receives primary search results
from Inktomi, which still considers META tags when determining relevancy.
So, it's a good engine to use when creating generic pages.
You've now got a generic page, and you're ready to see how it ranks across
the major engines.
Once you've created your generic page, submit it to the engines by using
pay inclusion or letting the engines find links to the page from other
pages. I don't recommend "free add URL" submissions, and
there's no real reason to use submission software, except for some of the
less important engines.
Once the page has had time to be indexed, check your rankings. Watch them for a month or two, because depending on which submission method you've chosen, it can take a while for your rankings to settle.
Then, look for holes in your strategy. Is your page doing well across the
board? In many cases, by using HotBot and creating a "generic"
page, you'll find that the same page stands an excellent chance of ranking
well across almost all of the major engines.
If your page is doing well in some of the engines but not others, that's
the time to begin creating engine-specific pages.
How to Create Engine-Specific Pages . . .
Take your generic page and run it through WebPosition Gold's Page Critic for each of the major engines where you aren't getting top rankings. Make changes based on Page Critic's recommendations for each engine, and save each version under a slightly different file name. Be sure to make a note of which page was optimized for which engine.
Try not to make it so obvious that you have engine-specific pages. For example, you may not want to name your pages:
name-your-pages-AV.html (for AltaVista), or
name-your-pages-FST.html (for Fast)
What about Duplicate Content?
Let's say your generic page ranks well with Google and the Inktomi-influenced
engines, but it's not ranking well with Teoma or Fast/Lycos. If you create
engine-specific pages for Teoma and Fast, you'll now have three almost
identical pages, which the engines won't like.
Remember that the golden rule when working with content is that the
content must be of value to both the search engines and the users. Having
duplicate content is not of value to the search engines (or the users).
They certainly don't want several versions of the same content cluttering
up their indices.
To keep from getting in trouble with duplicate content, you'll need to
create a robots.txt file and allow certain engines to have access to
certain pages, yet keep them out of other pages. In other words, you'll
direct the engines to whichever pages you want each engine to visit by
using a robots.txt file.
Robots.txt Files
Create a text file with Windows Notepad, NoteTab Pro, or any other editor that can save plain ASCII .txt files. Use the following syntax:
User-agent: (PutSpiderNameHere)
Disallow: /(PutFileNameHere)
The "user-agent" portion lets you specify which engines
you want to keep out, and the "disallow" portion lets you
specify directories or file names.
For example, to tell AltaVista's spider, Scooter, not to index a couple of
pages, create a robots.txt file as follows:
User-agent: Scooter
Disallow: /name-your-pages.html
Disallow: /keyword-phrase.html
By creating a robots.txt file using this information, we're keeping
AltaVista out of our pages created specifically for Fast and Teoma. You'll
want to do the same for each of the other engines. Then, you'll want to
create entries for Fast and Teoma that will keep them out of the original
generic page.
That way, none of the engines will see duplicate content, and they'll only
see the pages created specifically for them.
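To see how the pieces fit together, here's a sketch of the complete file for this scenario. It assumes name-your-pages.html is the Fast page, keyword-phrase.html is the Teoma page, and generic-page.html (a hypothetical name) is the original generic page. The FAST-WebCrawler and Teoma user-agent names are assumptions based on the names those spiders have commonly reported, so verify the current spider names before uploading anything.
User-agent: Scooter
Disallow: /name-your-pages.html
Disallow: /keyword-phrase.html

User-agent: FAST-WebCrawler
Disallow: /generic-page.html
Disallow: /keyword-phrase.html

User-agent: Teoma
Disallow: /generic-page.html
Disallow: /name-your-pages.html

# Repeat this pattern for each of the other engines' spiders.
Note that a blank line separates each spider's record.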
Save the file as robots.txt, then upload it to the root directory of your Web site. The "root directory" is where your index.html (or index.htm) page is located; the spiders will look for the file at http://www.yourdomain.com/robots.txt.
This is a very simple example of a robots.txt file, but they can get quite
complex. One little mistake can cause an engine to find a page that you
don't want found. Plus, you have to know the names of each engine's
"user agent," or spider. That's why I recommend using a software
program that creates the file and does the work for you.
A Software Solution for Creating Robots.txt Files . . .
An excellent software program for creating robots.txt files is Robot
Manager Pro. You can even download a free trial version of the
software, which will create robots.txt files as well as analyze the first
100 spider visits from your log files.
One of my favorite features of Robot Manager Pro is its spider analysis
feature. The software will analyze spider visits to your site, and it will
let you know how far down into your site a spider has visited, which pages
it picked up, whether the pay inclusion spiders are re-indexing on their
designated time schedule, and more.
Like Wordtracker and WebPosition Gold, this is a "must have"
software program for me personally.
In Conclusion
We all live in a very busy world, and we don't need to make
more work for ourselves. Therefore, it makes sense to start with a generic
page and see what kind of results we get. Then, if we don't get the rankings we want with the page, we can create engine-specific pages by running the same page back through WebPosition Gold's Page Critic for the other engines.
You'll be surprised at how well your generic pages will do, and with the
extra time you'll save, you can create more high-performance pages for
your site and continue to increase your traffic!