[XML-DEV Mailing List Archive]

Another way to optimize XML
Based on what Stephen said (below), I offer a model I've invented for optimizing XML to render large data sets as charts, tables, etc. It:

1. Converts XML documents to a delimited text file (e.g., CSV) prior to transmission to clients
2. Combines the XML data elements with relational database dumps/queries, legacy-system flat-file reports, and even OLAP data [Optional]
3. Manipulates/processes the data (data mining, statistical analysis) [Optional]
4. Organizes the CSV's contents into logically/semantically configured arrays optimized for rapid rendering
5. Transmits the CSV to clients for local storage, parsing, and rendering.

This process:

(a) Transforms XML documents into the smallest possible file (as much as 25 times smaller) for transmission and client-side storage
(b) Enables integration with other data sources
(c) Simplifies and speeds the parsing process
(d) Requires minimal processing and memory overhead
(e) Uses semantics/pointers based on logical data structures (column/row locations)
(f) Removes the threat of viruses, since delimited ASCII text files carry no executable content
(g) Offers an additional level of security through the ability to "scramble" the data prior to transmission, then unscramble it client-side.

If presentation is done via COM, a macro-driven spreadsheet quickly and easily renders the CSV's contents client-side to provide offline, interactive charts, etc. Users could, for example, slice, dice, and drill down into the data instantaneously using pivot tables. They could also generate hundreds or thousands of completely different charts in seconds and jump from viewing one to another instantaneously. Furthermore, since all the data are stored locally in the CSV, and since data updates are extremely rapid due to the CSV's very small size, the information is portable and requires minimal online time, which is ideal for mobile users.
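Step 1 of the model above (flattening XML into CSV) can be sketched as follows. This is a minimal illustration, not Steve's actual implementation; the element names (`sales-report`, `record`, `region`, etc.) are hypothetical, and real documents would need schema-aware flattening. The size comparison at the end shows where the claimed reduction comes from: the repeated tag names disappear, leaving a single header row as the "semantics."

```python
# Hypothetical sketch of XML-to-CSV flattening; element names are invented.
import csv
import io
import xml.etree.ElementTree as ET

xml_doc = """<sales-report>
  <record><region>East</region><quarter>Q1</quarter><sales>1250</sales></record>
  <record><region>West</region><quarter>Q1</quarter><sales>980</sales></record>
</sales-report>"""

root = ET.fromstring(xml_doc)
out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["region", "quarter", "sales"])  # header row carries the semantics
for rec in root.iter("record"):
    writer.writerow([rec.findtext("region"),
                     rec.findtext("quarter"),
                     rec.findtext("sales")])

csv_text = out.getvalue()
print(csv_text)
print("XML bytes:", len(xml_doc), "CSV bytes:", len(csv_text))
```

The savings grow with the number of rows, since the per-row tag overhead of XML is paid once (in the header) rather than on every record.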
For browser presentation, the CSV's contents are used client-side to generate graphic images of charts (e.g., GIFs) and to create an XML document, which is then transformed via XSLT into XHTML, etc. We're now looking for a way a browser can render the CSV client-side without having to transform it back to XML. Does this sound like a valuable XML optimization model for large data sets?

Steve Beller

-----Original Message-----
From: Stephen D. Williams [mailto:sdw@l...]
Sent: Monday, April 12, 2004 3:16 PM
To: bob@w...
Cc: 'Michael Champion'; 'xml-dev'
Subject: Re: XML Binary Characterization WG public list available

+1 My design does all of the things you mention, and more. I disagree, however, about optimizing multiple aspects: you must optimize on as many axes as are appropriate to you if you want a solution that is the best combination of tradeoffs. It is more difficult to do this, of course, but anything else doesn't meet the requirements, my requirements at least. I do of course have a strong ordering of what is important, in this order: CPU processing overhead, memory processing overhead, new semantics for libraries (fast pointers to support any logical data structure, deltas), storage/transmission space efficiency, and support for binary payloads. This has led me to consider solutions that don't seem to have been tried seriously before, such as avoiding parsing and serialization altogether for my main mode of usage.

sdw
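The CSV-to-XML round trip Steve describes above (rebuilding an XML document client-side so XSLT can produce XHTML) can be sketched as below. Again this is an illustrative assumption, not the actual system: the header row is reused as element names, which presumes CSV headers are valid XML names.

```python
# Hypothetical sketch: rebuilding XML from CSV so XSLT can take over.
# Assumes header cells are valid XML element names.
import csv
import io
import xml.etree.ElementTree as ET

csv_text = "region,quarter,sales\nEast,Q1,1250\nWest,Q1,980\n"
rows = list(csv.reader(io.StringIO(csv_text)))
header, data = rows[0], rows[1:]

root = ET.Element("sales-report")          # invented root element name
for row in data:
    rec = ET.SubElement(root, "record")    # invented row element name
    for name, value in zip(header, row):
        ET.SubElement(rec, name).text = value

xml_out = ET.tostring(root, encoding="unicode")
print(xml_out)
```

The resulting document can then be fed to any XSLT processor; avoiding this rebuild step entirely is exactly the open question the post raises.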