RE: Improving performance for huge files with XSLT
Yes, it's a problem. Every XSLT processor builds an in-memory tree of the source document, and this tree typically occupies about 100 bytes per node. One solution, if you only need to access part of the data, is to write a SAX filter that subsets it on the way into the XSLT processor. Saxon has an extension, <saxon:preview>, which processes the document one subtree at a time; this is rather messy, but it can sometimes help. Sebastian Rahtz has published performance comparisons of various processors on large XML files.

Mike Kay

> -----Original Message-----
> From: Ornella Piva [mailto:Ornella.Piva@xxxxxx]
> Sent: 13 September 2000 08:14
> To: XSL-List@xxxxxxxxxxxxxxxx
> Subject: Improving performance for huge files with XSLT
>
> Hi,
> I'm using XSLT to convert XML files into other XML files. I have to
> convert huge XML files (containing, e.g., 50000-100000 nodes), but
> performance is becoming a real problem: it takes more or less 20 minutes
> to convert a file with 100000 nodes.
> Are there general methods to improve performance with huge XML files?
> Has anybody encountered the same problem? How did you solve it?
>
> Thanks,
> Ornella Piva

XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list
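A minimal sketch of the SAX-filter idea Mike describes, using JAXP's XMLFilterImpl: the filter sits between the parser and the transformer and swallows events for subtrees the stylesheet does not need, so those nodes never reach the in-memory tree. The element name "bulk" (the subtree being pruned), the inline document, and the identity transform (standing in for a real stylesheet) are all assumptions for illustration.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.XMLFilterImpl;

// A SAX filter that drops every subtree rooted at a <bulk> element
// before the events reach the XSLT processor.
public class SubsetFilter extends XMLFilterImpl {
    private int skipDepth = 0; // > 0 while inside a pruned subtree

    @Override
    public void startElement(String uri, String local, String qName, Attributes atts)
            throws SAXException {
        if (skipDepth > 0 || "bulk".equals(qName)) {
            skipDepth++;                       // swallow the event
        } else {
            super.startElement(uri, local, qName, atts);
        }
    }

    @Override
    public void endElement(String uri, String local, String qName) throws SAXException {
        if (skipDepth > 0) {
            skipDepth--;                       // swallow the matching close
        } else {
            super.endElement(uri, local, qName);
        }
    }

    @Override
    public void characters(char[] ch, int start, int len) throws SAXException {
        if (skipDepth == 0) {
            super.characters(ch, start, len);  // text inside pruned subtrees is dropped
        }
    }

    // Parse the XML through the filter and run the (identity) transform on the result.
    public static String filterAndTransform(String xml) throws Exception {
        SubsetFilter filter = new SubsetFilter();
        filter.setParent(SAXParserFactory.newInstance().newSAXParser().getXMLReader());
        SAXSource source = new SAXSource(filter, new InputSource(new StringReader(xml)));

        StringWriter out = new StringWriter();
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        t.transform(source, new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<doc><keep>a</keep><bulk><x>huge</x></bulk><keep>b</keep></doc>";
        System.out.println(filterAndTransform(xml));
    }
}
```

With a real stylesheet you would replace the identity Transformer with one built from your XSLT; the point is that the filter discards the unwanted events before the processor ever builds its tree, so memory use scales with the subset, not the whole file.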