Re: Fast text output from SAX?
Elliotte (in support): I just want to add here that we have been using ASCII storage in our database for everything, including numbers, for over 20 years, and we still run as fast as databases that use binary formats (integers, doubles, etc.), faster in fact. I pointed this out in my paper on the subject. The problem was, and remains, that conversion to and from readable formats is very "time expensive" (slow), and very often the data is not manipulated at all, so you're always copping conversion charges but not always manipulation charges. The only possible saving can come from transfer speeds, and that is domain dependent: how many significant places are you swapping for always having to send 16 bytes (or whatever precision you're using)? And the binary is only useful for the numbers.

I remain committed to text-only formats, particularly for data interchange.

Rick

On Sat, 2004-04-17 at 01:13, Elliotte Rusty Harold wrote:
> At 9:00 AM -0400 4/16/04, Stephen D. Williams wrote:
>
> >Binary doesn't imply there isn't any well-formedness checking, obviously.
>
> For once I agree. Obviously, binary doesn't imply that. However, in
> practice, the binary formats I do see rarely do as much
> well-formedness checking as a parser does, either in the XML domain or
> elsewhere. Speed gains that come from eliminating well-formedness
> checking should not be attributed to a binary format. Similar gains can
> be had by eliminating well-formedness checking when processing real XML
> (not that I recommend doing that, of course).
>
> >Incremental, or lazy evaluation, well-formedness is useful and
> >potentially protects applications just as well as full
> >well-formedness with better efficiency.
>
> Simply not true. It doesn't come close to protecting applications as
> well as full checking.
>
> >Allowing the application the option of avoiding repetitive or
> >unnecessary well-formedness checking is a valid strategy. Your
> >argument that data that a 'program' receives must always be fully
> >validated in any situation could just as easily be extended to
> >libraries and modules receiving DOM references or similar. What one
> >system may do with libraries or software modules, another may do
> >with plugins, and another may do with n-tier processing steps. Does
> >the granularity of the implementation somehow necessarily change the
> >fundamental likelihood of corruption?
>
> All published interfaces should verify their preconditions. Libraries
> should (and indeed the ones I write do) verify that input passed to
> them from outside the library is correct. For instance, if I publish an
> interface that expects a DOM object that can be serialized as namespace
> well-formed XML, and you pass me a DOM object that can't be, then the
> precondition is violated. The method should detect this and throw an
> exception. And indeed this is exactly what I do:
>
> http://www.cafeconleche.org/XOM/apidocs/nu/xom/converters/DOMConverter.html#convert(org.w3c.dom.Document)
>
> Cowboy coding like you suggest scares me. I certainly wouldn't want to
> have to trust any software written to such standards. Regrettably I
> know there's a lot of such software out there, which is more reason
> than ever for my software to be very careful about what it receives
> from your software.
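A minimal sketch of the trade-off Rick describes: when a value is merely passed through rather than manipulated, text storage pays nothing, while a binary representation pays a parse on the way in and a format on the way out. The class name, loop count, and 16-digit decimal value below are all illustrative, and a serious measurement would use a harness such as JMH rather than raw System.nanoTime(); this only shows the shape of the comparison.

    // Sketch: conversion cost vs. pass-through for unmanipulated data.
    // All names and counts here are illustrative, not from any real system.
    public class ConversionCost {
        public static void main(String[] args) {
            byte[] textValue = "1234567890123456".getBytes(); // 16-byte decimal text

            long t0 = System.nanoTime();
            long parsed = 0;
            for (int i = 0; i < 10_000_000; i++) {
                // Text -> binary -> text: pay the conversion both ways.
                long v = Long.parseLong(new String(textValue));
                parsed += Long.toString(v).length();
            }
            long t1 = System.nanoTime();

            long copied = 0;
            for (int i = 0; i < 10_000_000; i++) {
                // Pass-through: no conversion, just move the bytes.
                byte[] out = java.util.Arrays.copyOf(textValue, textValue.length);
                copied += out.length;
            }
            long t2 = System.nanoTime();

            System.out.printf("convert: %d ms, copy: %d ms (%d/%d)%n",
                    (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, parsed, copied);
        }
    }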
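And a sketch of the fail-fast precondition check Elliotte describes, in the spirit of XOM's DOMConverter.convert but not its actual code. The class name, method, and the particular check are hypothetical; the point is only the pattern of verifying input at the published boundary and throwing immediately.

    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    // Sketch: a published library entry point that verifies its
    // precondition before doing any work. Hypothetical names throughout.
    public final class DocumentGate {

        private DocumentGate() {}

        /**
         * Accepts a DOM document from outside the library.
         * Precondition: non-null, with a document element.
         *
         * @throws IllegalArgumentException if the precondition is violated
         */
        public static Element accept(Document doc) {
            if (doc == null) {
                throw new IllegalArgumentException("doc must not be null");
            }
            Element root = doc.getDocumentElement();
            if (root == null) {
                // Detect the violated precondition here, at the boundary,
                // rather than failing obscurely deeper inside the library.
                throw new IllegalArgumentException("document has no root element");
            }
            return root;
        }
    }

A real namespace well-formedness check would walk the whole tree rather than inspect one node, but the fail-fast shape at the boundary is the same.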