
RE: document() runs out of memory

Subject: RE: document() runs out of memory
From: Scott_Boag@xxxxxxxxx
Date: Tue, 21 Mar 2000 20:57:33 -0500
OK, thanks for trying.

> Can someone confirm that the XSL segment I included in my initial post
> will read in each file, one at a time, process it and then read in the next
> without creating a huge union of all of the file contents?

Well, it does not create a union.  But Xalan currently stuffs the documents
into a hashtable and keeps them around until processing is done.  It really
ought to keep them in a limited LRU pool.  If you are using the Xerces DOM
liaison, this will be more expensive because it creates a full DOM each
time.

It's open source, so if you want to work with us to put in an LRU pool, we
would be glad to help.  You could also install your own version of the
document() function.
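[Editor's note: the bounded LRU pool Scott suggests can be sketched in a few
lines of Java using LinkedHashMap in access order.  This is an illustrative
sketch under assumed names (DocumentPool is made up), not Xalan's actual
code; a real fix would hang this off Xalan's document cache.]

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of a bounded LRU document pool (hypothetical, not Xalan code).
// LinkedHashMap with accessOrder = true iterates from least- to most-recently
// accessed, so evicting the eldest entry on overflow gives LRU behavior.
public class DocumentPool<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public DocumentPool(int capacity) {
        // accessOrder = true: get() and put() move an entry to the tail
        super(16, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Drop the least-recently-used document once the pool is over capacity
        return size() > capacity;
    }
}
```

With a pool of size 2, putting a third document evicts whichever of the first
two was touched least recently, so memory stays bounded no matter how many
files document() pulls in.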

-scott




                                                                                                                           
From: Brad Sommerfeld <bsommerfeld@xxxxxxx>
Sent by: owner-xsl-list@mulberrytech.com
To: "'Scott_Boag@xxxxxxxxx'" <Scott_Boag@xxxxxxxxx>
cc: xsl-list@xxxxxxxxxxxxxxxx
Subject: RE: document() runs out of memory
Date: 03/21/00 05:21 PM
Please respond to: xsl-list
                                                                                                                           



I just tried the Xalan 1.0.0 version with the same results.  I'm not
convinced the issue is the processor as much as the XSL that I have
constructed.


Can someone confirm that the XSL segment I included in my initial post will
read in each file, one at a time, process it and then read in the next
without creating a huge union of all of the file contents?


Also, I tried the code suggested by Myriam, but it wouldn't parse because it
contained a reference to the variable within its own declaration.


> Xalan keeps an internal cache of the documents, so it shouldn't be
> recreating a source tree or parsing each time.  I suspect you
> are running
> into a variable bug that has since been fixed.  We just released Xalan
> 1.0.0, so I suggest you try that.
>

 XSL-List info and archive:  http://www.mulberrytech.com/xsl/xsl-list

