At 17:24 10/04/2001 +0100, Henry S. Thompson wrote:

>Tim Bray <tbray@t...> writes:
>
><snip/>
>
> > Get some data on how much space and time a binary representation
> > will save, then you'll be able to make intelligent quantitative
> > decisions on where it's worthwhile deploying it.
> >
> > Until then, it's just amusing speculation. -Tim
>
>I strongly endorse Tim's point. I just wasted a weekend getting my
>schema validator to dump the internal form of the 'compiled'
>schema-for-schemas, on the _assumption_ that reloading that would be
>faster than parsing/compiling the schema-document-for-schemas every
>time I needed it. Wrong. Takes more than twice as long to reload the
>binary image than to parse/compile the XML.

I spent weeks sweating over a system that took over 3 minutes on average to render an HTML page through a long series of server-side processes involving lots of XML parsing of configuration files and such. The design was clean and understandable, but performance [expletive deleted]. The rush of code to the hand to speed up the parsing bit was hard to resist.

We calmed down, took some deep breaths, and ran the system with profiling probes turned on. Dropped the data into Excel and drew some graphs after a good night's sleep. The performance problems had *nothing* to do with the XML parsing and everything to do with unnecessary file IO. We dropped in a cache - basically a one-page change to a very large system - and voila, latency dropped from 3 minutes to about 15 seconds!

You gotta run your figures on this sort of thing. If I had a day's holiday for every week I have wasted on an "obvious" performance improvement over the last 18 years, I would not be seen on XML-DEV till next April Fools' Day.

regards, Sean
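[A minimal sketch of the kind of one-page cache Sean describes - all names here are hypothetical, not from his system. The idea: parse each XML config file once, key the result by path and modification time, and skip the file IO entirely on repeat requests.]

```python
import os
import xml.dom.minidom

# path -> (mtime at parse time, parsed DOM document)
_config_cache = {}

def load_config(path):
    """Return the parsed DOM for `path`, re-reading the file only if
    its modification time has changed since we last parsed it."""
    mtime = os.path.getmtime(path)
    cached = _config_cache.get(path)
    if cached is not None and cached[0] == mtime:
        return cached[1]          # cache hit: no file IO, no parse
    doc = xml.dom.minidom.parse(path)
    _config_cache[path] = (mtime, doc)
    return doc
```

[The mtime check keeps the cache honest when a config file is edited, while every unchanged request is served from memory - which is where the 3-minutes-to-15-seconds kind of win comes from when the real cost was file IO, not parsing.]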