[XML-DEV Mailing List Archive] Re: XML not ideal for Big Data
> Three limitations to processing XML files are:
>
> 1. XML file size as set by the OS.
> 2. RAM consumption.
> 3. CPU consumption.
>
> Most XML parsers can be used on big files (100 GB) without exceeding these
> limitations. This is because XML parsers are stream based, reading small
> chunks at a time. If you want to process the XML file, however, you will
> need to use a streaming technology like SAX. Other XML processing
> technologies, like many DOM implementations, will cause you to exceed RAM.
>
> Choosing a maximum like 1 MB to 50 MB will allow you to more freely use a
> wide variety of XML and other technologies (like email attachments),
> making your XML less constrained. Again, it depends on your use cases for
> the XML.

Jim, in many ways you have hit the nail on the head. XML can be used successfully with large data dumps. The underlying problem is not necessarily the data being in XML format, but the tools and frameworks used to process it.

The knee-jerk reaction of many programmers, when they have to deal with XML, is to try to data-bind against it. There is a wide variety of ways to process XML, and the most common method usually isn't the right one for large data stores. XML databases, streaming, StAX, SAX, etc. are far more efficient than trying to data-bind and hold everything in memory (which is typically the first instinct). When the data binding fails, the programmer typically blames XML rather than their own choice of processing technology.

Dave
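To make the DOM-versus-streaming point concrete, here is a minimal sketch of the SAX approach in Python. The element name `record` and the in-memory document are illustrative assumptions; with a real multi-gigabyte dump you would pass a filename or open stream instead, and the parser would still only hold a small buffer at a time rather than the whole tree.

```python
import io
import xml.sax


class RecordCounter(xml.sax.ContentHandler):
    """Counts <record> elements via SAX callbacks, never building a full tree."""

    def __init__(self):
        super().__init__()
        self.count = 0

    def startElement(self, name, attrs):
        # Called once per opening tag as the parser streams through the input.
        if name == "record":
            self.count += 1


# A tiny in-memory document stands in for a huge file on disk (assumed shape).
doc = b"<dump>" + b"<record/>" * 1000 + b"</dump>"

handler = RecordCounter()
xml.sax.parse(io.BytesIO(doc), handler)
print(handler.count)  # 1000
```

The same pattern scales because memory use is bounded by the handler's own state (here, one integer), not by the size of the document, which is exactly why SAX and StAX succeed where naive DOM loading or whole-document data binding exhausts RAM.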