Re: Processing huge XML files
From: "Michael Kay" <michael.h.kay@n...>
> But really, when you get above 50Mb or so, you need to start looking at
> XML databases.

Another approach is to use streaming languages such as Perl and OmniMark (and, I guess, Python?), especially if you are not updating the data, just extracting information. Of course, you may need to take several passes. And you may need to have one pass over the data generate a program to be used for the next pass, a venerable technique that is often overlooked. But multiple passes with streaming languages is the way that many large-scale publishing systems work.

A lot can depend on whether your document has an order that is amenable to your application: storing metadata and keys before the data, in particular.

A very typical way of constructing streaming programs on large data sets is to do two passes:

1) Run over the data and extract all the information that will be needed for decisions that would otherwise require random access or lookahead.

2) Run over the data again and perform the extractions/analysis, using those decision points.

Cheers
Rick Jelliffe
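As a minimal sketch of the two-pass idea in Python (the document and element names here are made up for illustration), `xml.etree.ElementTree.iterparse` streams events without building the whole tree; the first pass collects the decision data, the second pass uses it:

```python
# Two-pass streaming sketch using xml.etree.ElementTree.iterparse.
# The <orders> document and its attributes are hypothetical examples;
# a real file would be opened twice instead of re-wrapping a string.
import io
import xml.etree.ElementTree as ET

DOC = """<orders>
  <order id="1" status="shipped"><item>widget</item></order>
  <order id="2" status="pending"><item>gadget</item></order>
  <order id="3" status="shipped"><item>sprocket</item></order>
</orders>"""

# Pass 1: gather the information that would otherwise need
# random access or lookahead (here: which orders shipped).
shipped = set()
for event, elem in ET.iterparse(io.StringIO(DOC)):
    if elem.tag == "order":
        if elem.get("status") == "shipped":
            shipped.add(elem.get("id"))
        elem.clear()  # discard the element to keep memory bounded

# Pass 2: run over the data again and extract, using the
# decision points collected in pass 1.
items = []
for event, elem in ET.iterparse(io.StringIO(DOC)):
    if elem.tag == "order":
        if elem.get("id") in shipped:
            items.extend(i.text for i in elem.iter("item"))
        elem.clear()

print(items)  # → ['widget', 'sprocket']
```

Because each element is cleared as soon as it has been processed, memory use stays roughly constant regardless of file size, which is the property that makes the multi-pass approach viable on files too large for an in-memory tree.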