Re: Got a huge XML document?
- From: Michael Sokolov <msokolov@safaribooksonline.com>
- To: Damian Morris <damian@moso.com.au>
- Date: Thu, 12 Sep 2013 19:40:19 -0400
We've worked with a lot of reference
works, which tend to be large, and for which we've needed to
implement streaming processes (pre-XSLT 3.0 these have been
awkward cranky hand-baked SAX streams). I can't share any of the
XML sadly, since it's all proprietary, but I'm always happy for an
excuse to talk about them.
An example is the OED, which comes as a set of 26 XML files: the
largest is S, which is about 350MB. In total, the work is 2.6G of
XML, but of course this is basically a list of 280,000-odd
entries. Still, some are quite large (I think "set" is the
biggest) and all have a complex internal structure (entries broken
into senses which further have quotations, all of which are
independently searchable entities).
Even when we deal with books with more complex global structure
(like the complete annotated works of an author, or a scholarly
bible with commentary), we tend to atomize them into chapters or
sections or the like. It's the only way for humans to work with
them. In those cases, it is true that the file contains
essentially a sequence of chunks -- however, preserving the
hierarchy is important, and this introduces a lot of complexity
for streaming (especially with those nasty SAX parsers), because
you want to maintain essentially all of the content of your
ancestor nodes *that is not part of some other chunk* since that
represents contextual metadata of interest.
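To make that concrete, here is roughly the shape of the thing using Python's stdlib iterparse rather than raw SAX (element names like "dictionary", "letter", and "entry" are invented for illustration; real schemas are more involved):

```python
# Sketch: stream chunk elements out of a large XML file while keeping
# the ancestor context (tags + attributes) available for each chunk.
# Element names here are hypothetical, not any real product's schema.
import io
import xml.etree.ElementTree as ET

SAMPLE = b"""<dictionary edition="2">
  <letter name="S">
    <entry id="set"><sense>1</sense></entry>
    <entry id="sun"><sense>2</sense></entry>
  </letter>
</dictionary>"""

def stream_entries(source, chunk_tag="entry"):
    """Yield (ancestor_context, element) for each chunk element.

    ancestor_context is a list of (tag, attrib) pairs for every open
    ancestor -- the contextual metadata the surrounding text talks about.
    """
    stack = []  # (tag, attrib) of currently open ancestors
    for event, elem in ET.iterparse(source, events=("start", "end")):
        if event == "start":
            stack.append((elem.tag, dict(elem.attrib)))
        else:  # "end"
            stack.pop()
            if elem.tag == chunk_tag:
                yield list(stack), elem
                # Free the subtree we just handed out so memory stays flat
                # no matter how big the file is.
                elem.clear()

for context, entry in stream_entries(io.BytesIO(SAMPLE)):
    print(entry.get("id"), "under", [tag for tag, _ in context])
    # prints e.g.: set under ['dictionary', 'letter']
```

Note the `elem.clear()` after each yield: the caller must consume a chunk before advancing the generator, but in exchange the parse never holds more than one chunk plus its ancestors in memory.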
Probably the largest example of a very deep nesting structure that
I've worked on is the Biblioteca Teubneriana Latina, which is
essentially every known classical Latin work of literary interest
(it doesn't include laundry lists). It's about 795MB, delivered
in 36 files, but the first one of these incorporates the rest
using XML entity references! So we process it as a single file --
there is a tiny fragment of useful metadata in that first file.
That is broken down by a table of contents scheme that goes like:
letter, author, work, and then an arbitrary scheme that depends on
the work, down to the individual page or line. Even there, I
think the deepest node in the TOC is about 12 levels down. You
can browse the TOC here: http://www.degruyter.com/db/btl, although you
have to pay a lot to get the text of the entries (your library or
institution might have it).
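For anyone who hasn't seen that trick: the master file declares the other files as external entities in its DTD and references them in the body, so any parser with entity expansion enabled sees one enormous logical document. Roughly like this (file names invented):

```xml
<!DOCTYPE btl [
  <!ENTITY vol01 SYSTEM "vol01.xml">
  <!ENTITY vol02 SYSTEM "vol02.xml">
  <!-- ...declarations for the remaining volumes... -->
]>
<btl>
  <metadata><!-- the small amount of useful metadata --></metadata>
  &vol01;
  &vol02;
</btl>
```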
-Mike
On 9/12/2013 6:48 PM, Damian Morris wrote:
I've got some XML in my test suite for Xmplify - individual files
- that are from the WIPO, and weigh in at 360 MB...
Cheers,
Damian
--
MOSO Xmplify XML Editor - Xmplary XML for Mac OS X
w: http://xmplifyapp.com
t: @xmplify
> Date: Friday, 13 September 2013 6:49 AM
>
>> On 12 Sep 2013, at 19:47, David Lee wrote:
>>
>> In my experience, ALL large XML files are really collections of smaller files.
>> I have never seen a single XML document of any large size that isn't simply
>> <root>
>> <row> document 1 .... </row>
>> ..... 10 bizillion times
>> </root>
>
> That's certainly a very common pattern, but I've seen a few examples that
> don't quite fit it. For example, a database dump of 50 tables, each of which
> fits the above pattern. Or GIS data consisting of large numbers of objects of
> a wide variety of different kinds. What does seem to be true is that as files
> get larger, it's rare for the hierarchy to get deeper.
I agree with that and wanted to share a brief note on
our experience, dealing
primarily with XML that is to be printed in some
format.
While XML for things like parts catalogues can get quite large, it tends to
follow the pattern of repeating sets of data. Some of the
larger XML documents we
deal with (which are not "database dumps") tend to be
lengthy pieces of
legislation.
While legislation can be broken down into provisions
and so on, there is still
enough cross-referencing and relationships between the
information to make it
tricky to break up into standalone components.
Having said that, I don't think I've seen a single piece of legislation (e.g. a
Bill or Act) exceed 100MB in XML document size.
-Gareth