Registering Dynamic Sites

Many sites with dynamically generated web pages (versus static web pages) are not getting indexed and registered by search engines. And it seems that search engines don’t plan to change their environmental variables to make it easier for dynamic sites to get listed.

As you know, an environmental variable is a piece of information or a criterion that an engine uses in its algorithms to index, register, or rank a web page for placement in its database.

When a robot engine spiders or crawls a web site to index a domain, it generally wants to crawl deeper into the site to determine its suitability for indexing. If, as it crawls deeper into the domain, it encounters pages whose URLs include a question mark (?), an indication that the URL is dynamically generated, the robot interprets that question mark as the start of an endless permutation of possible pages. Not knowing what to do next, it stops crawling the site.
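To illustrate, a database-driven catalog might serve a product page from a URL like the first one below, while a static equivalent would look like the second (both URLs are hypothetical):

    http://www.example.com/catalog.asp?category=12&item=447    (dynamic; the ? begins a query string)
    http://www.example.com/catalog/item447.html                 (static; a plain .html page)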

Consequently, if the robot has only one or two pages of a site to consider for ranking, the site will appear hollow or lacking in content, and the URL is not considered worthy of placement in its database. The result is that any dynamic web site might as well be invisible to robot search engines.

Currently, to my knowledge, none of the robot search engines plan to provide a way to register dynamic pages; it just doesn’t seem to be an option. Hence, owners of dynamic sites have turned to search engine optimization specialists, who use their own methods for creating content as static pages, relying on that professional assistance to obtain good rankings.

What is a static page? It is simply an HTML web page with static content: any .htm or .html page hosted on a domain. The content changes only when a person edits the page and uploads the new version to the site. There are no meta refresh tags redirecting the page elsewhere, and no cloaking or IP redirect delivery.
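As a minimal sketch of what a static page amounts to, the HTML below is a complete, crawlable page; the file name, title, and text are purely illustrative:

    <!-- products.html: a hypothetical static page; nothing here changes until a person uploads a new version -->
    <html>
    <head>
      <title>Acme Widgets Product Catalog</title>
      <meta name="description" content="Plain, crawlable catalog page with no redirects.">
    </head>
    <body>
      <h1>Product Catalog</h1>
      <p>Widget descriptions and prices appear here as ordinary HTML text.</p>
    </body>
    </html>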

What is a meta refresh tag? It’s an HTML tag that tells the browser to wait a specified period of time, perhaps one second, and then replace the page with another. It is commonly referred to as a redirect. Cloaking, or IP delivery, is a more sophisticated form of redirect.
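For example, a meta refresh tag placed in the head of a page looks like this; the one-second delay and the destination URL are illustrative:

    <!-- wait 1 second, then replace this page with the destination page -->
    <meta http-equiv="refresh" content="1; url=http://www.example.com/real-page.html">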

Given a robot’s limitations in registering dynamically generated web pages and its dislike of redirects, dynamically generated sites have very few choices when it comes to being found. A good search engine optimization specialist can help you deliver good static content; doing so, however, takes a significant amount of labor and requires the proper registration techniques.

Registration techniques involve compliance with robot submission practices. Every robot search engine’s “add URL” page sets out compliance guidelines, rules, and preferences on how to submit a page or pages to its robot. When you follow these rules precisely, you will get your best results. Unfortunately, the search engines don’t make it easy, and each has its own submission requirements.

For instance, Google will accept only two pages per day; Excite, 25 per week. In some cases, a robot will not want to see the same document again for a specified period of time; Excite, for instance, doesn’t like to see a document again within 60 days. You must therefore track the submission history and/or check the robot before submitting a document again.

People who believe the answer is to resubmit continuously are largely mistaken. Doing so will only draw attention to your site and cause the robot to avoid it in the future.

If you have dynamically generated web pages, you have special circumstances to consider when it comes to submitting to robot search engines. It’s nothing that can’t be overcome, but it’s not a slam-dunk.

Remember, good static content, together with proper submission and registration procedures, will produce page-one links to your potentially invisible yet content-rich web site.
