Is there room for optimizing the existing code?
Absolutely. The first place I'd change has nothing to do with how you load the XML - it's the string concatenation. I'd change your RecursiveOperation method to:
private static string RecursiveOperation(XmlNodeList xmlNodeList)
{
    StringBuilder result = new StringBuilder();
    foreach (XmlNode currentNode in xmlNodeList)
    {
        XmlNode clonedNode = currentNode;
        // Remove try/finally block - if an exception is thrown your
        // result will be lost anyway
        string interimMarkup = RecursiveOperation(currentNode.ChildNodes);
        clonedNode.InnerXml = interimMarkup;
        result.Append(Environment.NewLine)
              .Append(clonedNode.OuterXml)
              .Append(Environment.NewLine);
    }
    return result.ToString();
}
It's possible you could optimize this further by passing a single StringBuilder into RecursiveOperation, but I haven't quite got to grips with your code sufficiently to say yet. (It's before the first coffee of the morning.)
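For what it's worth, the single-StringBuilder idea might look something like the sketch below. Note the caveat: because InnerXml can only be assigned from a string, each level still needs an interim buffer, so this mainly saves the repeated string returns up the call chain rather than eliminating intermediate strings entirely. (This is my own sketch, not tested against your markup.)

```csharp
private static string RecursiveOperation(XmlNodeList xmlNodeList)
{
    StringBuilder result = new StringBuilder();
    RecursiveOperation(xmlNodeList, result);
    return result.ToString();
}

private static void RecursiveOperation(XmlNodeList xmlNodeList, StringBuilder result)
{
    foreach (XmlNode currentNode in xmlNodeList)
    {
        // An interim buffer is still needed here, because InnerXml
        // can only be set from a string.
        StringBuilder interimMarkup = new StringBuilder();
        RecursiveOperation(currentNode.ChildNodes, interimMarkup);
        currentNode.InnerXml = interimMarkup.ToString();

        result.Append(Environment.NewLine)
              .Append(currentNode.OuterXml)
              .Append(Environment.NewLine);
    }
}
```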
In terms of the XML handling itself, you're currently doing a lot of reparsing by setting the InnerXml of each node (recursively). I suspect that if I had a better grasp of what you were doing, it would be feasible to change the whole approach. Given that this is functionality that XmlReader really doesn't have (it wouldn't make sense for a forward-only reader), it's not clear that you should be taking much notice of the other article at the moment.
How would I go about directly instantiating an XmlTextReader from a Control, StringWriter or XhtmlTextWriter object? Or do I really need to render it as a string first and then instantiate the XmlTextReader?
It's not clear what it would even mean to create an XmlTextReader from any of those objects - they're not inherently a source of XML data. I think what you've got already looks reasonable to me.
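If you do want a reader over a control's rendered output, the usual route is exactly what you suggested: render to a string first, then wrap it in a StringReader. A sketch, assuming a Web Forms Control (and using the XmlReader.Create factory, which is generally preferred over constructing XmlTextReader directly):

```csharp
// Render the control to a string first.
StringWriter stringWriter = new StringWriter();
using (HtmlTextWriter htmlWriter = new HtmlTextWriter(stringWriter))
{
    control.RenderControl(htmlWriter);
}

// Then parse the rendered markup. This only works if the output is
// well-formed XML (i.e. valid XHTML), otherwise Read() will throw.
using (XmlReader reader = XmlReader.Create(new StringReader(stringWriter.ToString())))
{
    while (reader.Read())
    {
        // process nodes...
    }
}
```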
If you're still concerned about the performance, you should avoid guesswork and use a profiler to measure where the time is being taken. You should set yourself a target first though, otherwise you won't know when you've finished optimizing.