[XSL-LIST Mailing List Archive Home]
Juggy writes:

> Date: Sun, 08 Sep 2002 12:39:28 +0200
> From: juggy@xxxxxxx
> Subject: speed questions
>
> Hi there,
>
> I have an XML dictionary file with about 95,000 entries, 20 megabytes in size. Due to its nature I need to search on several different criteria (languages, substring matching, ...), and I intend to use XSL for it. Now, judging from my latest experiments, I wonder whether XML/XSL is a good choice for implementing such a thing, since, as far as I understand XML/XSL, the whole file is read again each time I invoke the XSLT processor. Given how big this file is, I wonder if that is efficient. I have also thought about generating separate, smaller XML files holding additional statistical data, which I could preprocess with another stylesheet in order to save some time, but I am not sure whether this would be useful.
>
> <snip>

We use msxml with a VB app as the "driver", and in that environment it is NOT necessary to reload the XML input file each time. We load it once and then reuse the DOM across multiple XSLT transforms.

In my experience, the single biggest contributor to speed is the use of xsl:key. We have seen speedups on the order of 50x by defining keys. My guess is that, with a 20MB XML file, you will likely see some long processing times if you don't use xsl:key.

Regards,
Bill

XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list
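To illustrate the xsl:key suggestion: a key declaration tells the processor to build an index over the document once, after which key() lookups avoid a full scan of all 95,000 entries. The sketch below is a minimal XSLT 1.0 example; the element and attribute names (entry, word, @lang) are assumptions about the dictionary's schema, not taken from the original post.

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

  <!-- Declare an index over every <entry>, keyed on its lang attribute.
       The processor builds this index once per source document. -->
  <xsl:key name="by-lang" match="entry" use="@lang"/>

  <xsl:template match="/">
    <!-- key() fetches the matching entries from the index instead of
         walking the entire 20MB tree on every lookup. -->
    <xsl:for-each select="key('by-lang', 'en')">
      <xsl:value-of select="word"/>
      <xsl:text>&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>

</xsl:stylesheet>
```

Combined with Bill's point about loading the DOM once in the driver application, the index is also built only once and then reused across transforms.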