Question

I have an ASP.NET .aspx page (say, fruits.aspx) which lists all the fruits (apple, banana, mango, etc.) with a thumbnail, a title, and a link that leads to each fruit's respective detail page. All of this data is retrieved from an XML file by the code-behind, with the help of an XSLT and a user control.

Now, since the data and the URLs of each fruit's detail page are not present statically on this page, my understanding is that they will not be crawled and indexed.

Is there a workaround I can use to get each fruit's detail page crawled and indexed?

If I only had dynamic URLs with something like "?var=value", I could solve this with static/dynamic conversion using URL rewriting. But here the URL itself is not in the page; it is generated from the code-behind.


Solution

Search engines will not see the .aspx file as it sits on your server; instead, they see the same thing your web browser does: the resulting HTML output.

This means that the parameters you speak of will be seen and indexed properly by search engines.
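For instance, here is a minimal code-behind sketch of that pipeline (the file names, the Fruits class, and the FruitList Literal control are assumptions for illustration, not the asker's actual code). Whatever HTML the transform produces, including the <a href> links to the detail pages, is exactly what the crawler receives:

```csharp
using System;
using System.IO;
using System.Web.UI;
using System.Xml.Xsl;

// Hypothetical code-behind for fruits.aspx.
public partial class Fruits : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        var xslt = new XslCompiledTransform();
        xslt.Load(Server.MapPath("~/App_Data/fruits.xslt"));

        using (var writer = new StringWriter())
        {
            // Transform the fruit catalogue into plain HTML.
            xslt.Transform(Server.MapPath("~/App_Data/fruits.xml"), null, writer);

            // The rendered anchors, e.g. <a href="...">Banana</a>, end up
            // in the response body that search engines crawl.
            FruitList.Text = writer.ToString();
        }
    }
}
```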

OTHER TIPS

There is no way around it: each page you want indexed must have a unique URL. When you generate the page, just generate a unique URL for it. Take your query parameters and append them to the end of your script name.

For example, say that fruits.aspx is called with ?fruit=banana as a query parameter. Your best option is to generate a page with a unique, static-looking URL; for example, make the link to the banana page look like /fruits.aspx/fruit/banana.
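A sketch of both halves of that idea, assuming the links are emitted from code-behind and relying on the fact that an .aspx page receives extra path segments through Request.PathInfo (the helper name FruitUrl is made up for illustration):

```csharp
using System;
using System.Web;
using System.Web.UI;

public partial class Fruits : Page
{
    // Build the static-looking link: "banana" becomes
    // /fruits.aspx/fruit/banana instead of /fruits.aspx?fruit=banana.
    private static string FruitUrl(string fruitName)
    {
        return "/fruits.aspx/fruit/" + HttpUtility.UrlPathEncode(fruitName);
    }

    protected void Page_Load(object sender, EventArgs e)
    {
        // For /fruits.aspx/fruit/banana, Request.PathInfo is "/fruit/banana".
        string[] parts = Request.PathInfo.Trim('/').Split('/');
        if (parts.Length == 2 && parts[0] == "fruit")
        {
            string fruit = parts[1];   // "banana"
            // ... look this fruit up in the XML and render its details ...
        }
    }
}
```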

Even better would be to rewrite the URL to remove the .aspx extension. Then the site looks like it is all static content, which is even better for indexing. If a URL looks like it is backed by a database, the search engine is less likely to index everything.
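One way to do that in classic ASP.NET is an internal rewrite in Global.asax; the sketch below (the /fruit/... URL shape comes from the example above) maps the clean URL back onto fruits.aspx before the request is resolved. Note this assumes extensionless requests actually reach ASP.NET, which depends on your IIS configuration; the IIS URL Rewrite module or ASP.NET routing are alternative ways to get the same effect:

```csharp
using System;
using System.Web;

// Hypothetical Global.asax.cs for illustration.
public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        string path = Context.Request.Path;   // e.g. "/fruit/banana"

        if (path.StartsWith("/fruit/", StringComparison.OrdinalIgnoreCase))
        {
            string fruit = path.Substring("/fruit/".Length);

            // Internal rewrite: visitors and crawlers keep seeing the
            // clean extensionless URL, while fruits.aspx handles it.
            Context.RewritePath("/fruits.aspx?fruit=" + HttpUtility.UrlEncode(fruit));
        }
    }
}
```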

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow