Re: Optimising multiple document() calls
I am pre-processing batches of about 1000 XML files at a time using Saxon. Part of the pre-processing involves aggregating linked XML documents into the current document. Naturally, I use the document() function for this: ... How would you optimise this? Would a deep copy with <xsl:copy-of> be faster? Or am I better off writing my own processor for this aggregation step (easy enough)?
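(The elided stylesheet fragment presumably looks something like the following; the <link> element and href attribute names are my own assumptions, not taken from the poster's files.)

```xml
<!-- Hypothetical: wherever a <link href="..."/> element appears,
     inline the root element of the document it points to. -->
<xsl:template match="link">
  <xsl:copy-of select="document(@href)/*"/>
</xsl:template>
```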
Since the XSLT processor is obliged to keep every tree returned by document() around somehow (it cannot know whether the transform will need a given tree again, and any generated identifiers for its nodes must remain stable for the duration of the transform), I would recommend a simple SAX process for this kind of aggregation. Not only would it be faster (no node tree is built), it would also have a very small footprint (no input document is kept in memory).
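A minimal sketch of such a SAX aggregation step, assuming a hypothetical layout in which each <link href="..."/> element should be replaced by the contents of the file it references (the element and attribute names, and the class names, are assumptions for illustration):

```python
import xml.sax
from xml.sax.saxutils import XMLGenerator

class Embedded(xml.sax.ContentHandler):
    """Forwards events to a shared output handler, but inherits the
    no-op startDocument/endDocument so a nested parse does not emit a
    second XML declaration."""
    def __init__(self, out):
        super().__init__()
        self.out = out
    def startElement(self, name, attrs):
        self.out.startElement(name, attrs)
    def endElement(self, name):
        self.out.endElement(name)
    def characters(self, content):
        self.out.characters(content)

class Aggregator(Embedded):
    """Streams the main document through, replacing each
    <link href="..."/> element with the linked file's events.
    No node tree is ever built for either document."""
    def startElement(self, name, attrs):
        if name == "link":
            # Parse the linked document with a fresh parser; its
            # events stream straight into the same output.
            xml.sax.parse(attrs["href"], Embedded(self.out))
        else:
            self.out.startElement(name, attrs)
    def endElement(self, name):
        if name != "link":
            self.out.endElement(name)

def aggregate(main_file, out_stream):
    """Writes the aggregated document to out_stream."""
    gen = XMLGenerator(out_stream, encoding="utf-8")
    gen.startDocument()
    xml.sax.parse(main_file, Aggregator(gen))
    gen.endDocument()
```

Because the linked files are parsed one at a time and discarded, memory use stays flat no matter how many documents a batch aggregates.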
Some XSLT processors will let you feed your output SAX events directly into a transform as its input; in that case you would not even have the aggregated file sitting around to worry about.
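A sketch of that chaining idea: the aggregation handler writes its events to the next consumer's ContentHandler instead of to a file (in Java/JAXP the receiving end would be a SAXSource handed to the Transformer; Python's standard library has no XSLT engine, so a simple counting handler stands in for the transform's input here, and all class names are my own):

```python
import xml.sax

class Forward(xml.sax.ContentHandler):
    """Stands in for the aggregation step: forwards every event to
    the downstream handler that would otherwise have to re-parse an
    intermediate aggregated file."""
    def __init__(self, downstream):
        super().__init__()
        self.downstream = downstream
    def startElement(self, name, attrs):
        self.downstream.startElement(name, attrs)
    def endElement(self, name):
        self.downstream.endElement(name)
    def characters(self, content):
        self.downstream.characters(content)

class CountingConsumer(xml.sax.ContentHandler):
    """Stands in for the transform's SAX input stage."""
    def __init__(self):
        super().__init__()
        self.elements = 0
    def startElement(self, name, attrs):
        self.elements += 1
```

The point of the pattern is that the events flow handler-to-handler in memory; nothing is serialised between the two stages.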
I hope this helps.
. . . . . . . . . Ken