Question

I'm trying to implement breadcrumb functionality on an ASP.NET C# site in Visual Studio using the SiteMapPath control.

The company I work for has inherited the site and we're primarily PHP developers so forgive the ignorance.

Originally, when I dropped in the SiteMapPath from the toolbox, I got an error message saying no web.sitemap file was found. I then created one using an application that supposedly does the job for ASP.NET sites.

The error message we get now tells us that you're not allowed to have the same URL twice in the XML structure. This seems pretty ridiculous, as many pages will have the same links.

Some research has told me to append a unique, essentially useless query string to each of the URLs in the XML. This also seems a bit ridiculous, and a total hack - especially for a site containing potentially hundreds of repeated URLs.

Can anyone shed a little light on this, or maybe even suggest a totally different approach?

Thanks so much!


Solution

"The error message we get now tells us that your not allowed to have the same URL twice in the xml structure. This seems pretty ridiculous as many pages will have the same links."

I think there's a bit of confusion: pages having the same links is irrelevant, because web.sitemap is just an XML map of page locations; the file doesn't record cross-linking between pages. Each URL should appear in the map only once, at the place in the hierarchy where you want the breadcrumb to show it. You can still nest nodes like this, even when pages share the same file name:

<?xml version="1.0" encoding="utf-8" ?>
<siteMap xmlns="http://schemas.microsoft.com/AspNet/SiteMap-File-1.0" >
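    <!-- Each url value below appears exactly once; duplicate url values are what trigger the error -->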
    <siteMapNode url="/" title="Home">
        <siteMapNode url="/subdir" title="Subdir">
            <siteMapNode url="/subdir/page.aspx" title="Nested Page" />
        </siteMapNode>
        <siteMapNode url="/page.aspx" title="Root Page" />
    </siteMapNode>
</siteMap>
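
For reference, once a valid Web.sitemap sits at the application root, the SiteMapPath control needs no data source at all; it reads the default provider directly. A minimal markup sketch (the ID and the property values below are just illustrative):

<asp:SiteMapPath ID="Breadcrumb" runat="server"
    PathSeparator=" &gt; "
    RenderCurrentNodeAsLink="false" />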

OTHER TIPS

Basically, the default site map provider (System.Web.XmlSiteMapProvider) requires all URLs to be unique so that it can easily resolve the currently selected node via the SiteMap.CurrentNode property.

This is a bit frustrating, and it is what leads people to tack on bogus query strings, as you've noted. For the simple case with only a few duplicates, that is usually acceptable.
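
For illustration, here is a hypothetical fragment in which the same contact page is listed under two sections and disambiguated only by a throwaway query string (the from values mean nothing to the page; they exist purely to make the URLs distinct):

<siteMapNode url="/products" title="Products">
    <siteMapNode url="/contact.aspx?from=products" title="Contact" />
</siteMapNode>
<siteMapNode url="/support" title="Support">
    <siteMapNode url="/contact.aspx?from=support" title="Contact" />
</siteMapNode>

Note that for SiteMap.CurrentNode to resolve to the intended entry, the links that navigate to the page generally have to carry the matching query string as well, which is a large part of why this approach becomes a maintenance burden on bigger sites.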

You can, however, implement your own site map provider; see Implementing ASP.NET Site-Map Providers on MSDN. By doing this, you can supply your own logic for processing your sitemap data and get the behavior you want.

A custom sitemap provider would probably be the cleanest approach in this case.
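
As a rough sketch only (not the code from the MSDN article): a provider that derives from System.Web.SiteMapProvider keeps full control over how nodes are stored and how the current node is resolved, so node keys stay unique while URLs are free to repeat. The class name, the hard-coded pages, and the naive URL matching below are all assumptions made for the example.

using System;
using System.Collections.Generic;
using System.Web;

// Sketch: a provider that allows duplicate URLs by keying every node on a
// unique name and doing its own (naive) current-node resolution.
public class DuplicateUrlSiteMapProvider : SiteMapProvider
{
    private readonly object _buildLock = new object();
    private SiteMapNode _root;
    private List<SiteMapNode> _allNodes;
    private Dictionary<SiteMapNode, SiteMapNode> _parentOf;
    private Dictionary<SiteMapNode, SiteMapNodeCollection> _childrenOf;

    private void EnsureSiteMap()
    {
        if (_root != null) return;
        lock (_buildLock)
        {
            if (_root != null) return;
            _allNodes   = new List<SiteMapNode>();
            _parentOf   = new Dictionary<SiteMapNode, SiteMapNode>();
            _childrenOf = new Dictionary<SiteMapNode, SiteMapNodeCollection>();

            // Keys (second argument) must be unique; URLs are free to repeat.
            SiteMapNode root     = BuildNode("home",     "~/Default.aspx",  "Home",     null);
            SiteMapNode products = BuildNode("products", "~/Products.aspx", "Products", root);
            SiteMapNode support  = BuildNode("support",  "~/Support.aspx",  "Support",  root);
            BuildNode("contact-under-products", "~/Contact.aspx", "Contact", products);
            BuildNode("contact-under-support",  "~/Contact.aspx", "Contact", support);
            _root = root;
        }
    }

    private SiteMapNode BuildNode(string key, string url, string title, SiteMapNode parent)
    {
        SiteMapNode node = new SiteMapNode(this, key, url, title);
        _allNodes.Add(node);
        if (parent != null)
        {
            _parentOf[node] = parent;
            if (!_childrenOf.ContainsKey(parent))
                _childrenOf[parent] = new SiteMapNodeCollection();
            _childrenOf[parent].Add(node);
        }
        return node;
    }

    public override SiteMapNode FindSiteMapNode(string rawUrl)
    {
        EnsureSiteMap();
        // Naive resolution: return the first node whose path matches the
        // request path. Real logic could also inspect the query string or
        // referrer to choose between duplicates.
        string requestPath = rawUrl.Split('?')[0];
        foreach (SiteMapNode node in _allNodes)
        {
            string nodePath = VirtualPathUtility.ToAbsolute(node.Url).Split('?')[0];
            if (string.Equals(nodePath, requestPath, StringComparison.OrdinalIgnoreCase))
                return node;
        }
        return null;
    }

    public override SiteMapNodeCollection GetChildNodes(SiteMapNode node)
    {
        EnsureSiteMap();
        SiteMapNodeCollection children;
        return _childrenOf.TryGetValue(node, out children) ? children : new SiteMapNodeCollection();
    }

    public override SiteMapNode GetParentNode(SiteMapNode node)
    {
        EnsureSiteMap();
        SiteMapNode parent;
        return _parentOf.TryGetValue(node, out parent) ? parent : null;
    }

    protected override SiteMapNode GetRootNodeCore()
    {
        EnsureSiteMap();
        return _root;
    }
}

Such a provider would then be registered under the siteMap/providers section of web.config and selected on the control through the SiteMapPath's SiteMapProvider property; in a real implementation the node tree would of course be loaded from your own sitemap file or database rather than hard-coded.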

Licensed under: CC-BY-SA with attribution