RE: Validating Bulk XML Data
> Broadly we can have two approaches: either change the schema a bit so
> that the XSD grammar sees each of the files as one giant XML document
> with many nodes, or parse each of the 1000. The disadvantage with the
> first approach seems to be that if one of the original gomls is
> invalid, the whole aggregated goml gets invalidated. The second
> approach is potentially slow (we are using this, multi-threading it,
> and tweaking bits of it).

I can't see any intrinsic reason why validating 1000 small documents should take longer than validating one document formed by concatenating the content, provided the schema itself is only prepared once.

Michael Kay
http://www.saxonica.com
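To illustrate the "prepare the schema once" point: a minimal sketch using the standard `javax.xml.validation` API, where the (comparatively expensive) schema compilation happens a single time and only a lightweight `Validator` is created per document. The schema and document strings here are hypothetical stand-ins for the poster's XSD and 1000 files; per-document validation also means one invalid file no longer invalidates the rest.

```java
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;
import javax.xml.validation.Validator;
import java.io.StringReader;

public class BulkValidate {
    public static void main(String[] args) throws Exception {
        // Hypothetical minimal schema: each document is one <item> element.
        String xsd =
            "<xs:schema xmlns:xs='http://www.w3.org/2001/XMLSchema'>"
          + "  <xs:element name='item' type='xs:string'/>"
          + "</xs:schema>";

        // Compile the schema ONCE; Schema objects are immutable and thread-safe,
        // so the same instance can be shared across worker threads.
        SchemaFactory factory =
            SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI);
        Schema schema = factory.newSchema(new StreamSource(new StringReader(xsd)));

        // Stand-ins for the many small documents; one is deliberately invalid.
        String[] docs = { "<item>ok</item>", "<wrong/>", "<item>also ok</item>" };

        int valid = 0;
        for (String doc : docs) {
            // Creating a Validator is cheap relative to compiling the schema.
            Validator validator = schema.newValidator();
            try {
                validator.validate(new StreamSource(new StringReader(doc)));
                valid++;
            } catch (org.xml.sax.SAXException e) {
                System.err.println("invalid document: " + e.getMessage());
            }
        }
        System.out.println(valid + " of " + docs.length + " documents valid");
    }
}
```

Note that `Validator` itself is not thread-safe, which is why the sketch creates one per document; in a multi-threaded setup each worker would call `schema.newValidator()` on the shared `Schema`.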