
  • To: "'xml-dev@l...'" <xml-dev@l...>
  • Subject: XML Schema for large database
  • From: Bill Riegel <BRiegel@l...>
  • Date: Thu, 15 Aug 2002 09:17:29 -0500
  • Return-receipt-to: Bill Riegel <BRiegel@l...>

Wanting to serialize the contents of portions of a large database, i.e. 500 -
1000 table definitions.
The purpose of the file is to allow it to be loaded by someone else, somewhere
else.

Trying to understand the implications of creating an XML Schema that reflects
the rules of the entire database. The XML Schema will be thousands, if not
tens of thousands, of lines long.

I would auto create the XML Schema from the database's metadata. 
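As a minimal sketch of that auto-generation step (the metadata format and the
SQL-to-XSD type mapping below are assumptions, not any particular vendor's
dictionary layout):

```python
# Hypothetical sketch: emit a per-table XSD fragment from database
# metadata.  The column-tuple format and type mapping are assumptions.
from xml.sax.saxutils import escape

# Assumed mapping from database column types to XSD built-in types.
SQL_TO_XSD = {"VARCHAR": "xs:string", "INTEGER": "xs:int",
              "NUMBER": "xs:decimal", "DATE": "xs:date"}

def table_to_xsd(table_name, columns):
    """columns: list of (name, sql_type, nullable) tuples."""
    lines = ['<xs:element name="%s">' % escape(table_name),
             "  <xs:complexType>",
             "    <xs:sequence>"]
    for name, sql_type, nullable in columns:
        # Nullable columns become optional elements.
        min_occurs = ' minOccurs="0"' if nullable else ""
        lines.append('      <xs:element name="%s" type="%s"%s/>'
                     % (escape(name), SQL_TO_XSD[sql_type], min_occurs))
    lines += ["    </xs:sequence>",
              "  </xs:complexType>",
              "</xs:element>"]
    return "\n".join(lines)

print(table_to_xsd("WELL", [("WELL_ID", "INTEGER", False),
                            ("WELL_NAME", "VARCHAR", True)]))
```

Looping that over all 500-1000 tables and wrapping the fragments in one
xs:schema element is what produces the very large single-file schema.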

But would there be problems with parsers or transformation engines being able
to consume it?

I have been toying with the idea of breaking the logical model up into several
smaller sets, and only allowing the user to select tables that are defined in
a given set.
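One way to express that split in XSD itself is xs:include, which lets each set
of tables live in its own file under a common master schema. A sketch, with
hypothetical file and namespace names:

```xml
<!-- master.xsd: file names and the target namespace are illustrative.
     Each included file defines one set of related table elements. -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="urn:example:db"
           xmlns="urn:example:db"
           elementFormDefault="qualified">
  <xs:include schemaLocation="wells.xsd"/>
  <xs:include schemaLocation="seismic.xsd"/>
</xs:schema>
```

Note that xs:include requires the included files to share the master's target
namespace (or have none), and a validator given master.xsd will still pull in
every include; the payoff of the split comes from validating an instance
against only the sub-schema for the set it uses.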

Or have the minimal amount of data in one XML file, and collect the files in a
zip file. Each file would then have a small(er) schema.
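That packaging step is straightforward with the standard library; a minimal
sketch, where the file naming convention and placeholder content are
assumptions:

```python
# Sketch of the "zip of small files" approach: each table's data goes in
# its own XML file alongside a small per-table schema.  Naming convention
# (<TABLE>.xsd / <TABLE>.xml) is an assumption for illustration.
import zipfile

def build_export(zip_path, exports):
    """exports: dict mapping table name -> (xsd_text, xml_text)."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for table, (xsd_text, xml_text) in exports.items():
            zf.writestr("%s.xsd" % table, xsd_text)
            zf.writestr("%s.xml" % table, xml_text)

# Placeholder content, just to show the shape of the archive.
build_export("export.zip", {
    "WELL": ("<xs:schema/>", "<WELL/>"),
})
```

The receiver then validates each small file against its own small schema,
instead of asking one parser to hold the whole 1000-table schema at once.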

Is it possible/probable that a large schema simply could not be processed?

Bill Riegel
Landmark Graphics
Phone: 713-839-3388
