Re: Soft Landing
> I always felt that feeding them automagically from services such as
> full-text indexing and analysis was dicey. If you use semantic nets to
> create semantic nets, it is a bit like using an a-bomb to detonate an
> h-bomb.

If I were king of the world, with unlimited budget and unlimited cooperation, I'd start with a taxonomy and domain experts. Let them define a domain vocabulary (again, I keep pointing to MeSH for medical literature). Then, when new literature is published each month, run it through machine analysis to identify new terms that start popping up in the literature (e.g., XML a few years ago). Also identify relationships to existing concepts or terms (similarity searches), and so on.

The domain experts set an alert level (e.g., 5 citations), and when a term or concept exceeds that level, it's included in a monthly update they receive -- new terms and concepts in the literature. They use that information when updating domain vocabularies on a quarterly basis.

Using a pre-defined domain vocabulary is probably more efficient than doing it all automagically with inference engines, machine analysis of schemas, RDF, parsing, and so on. Look at the portals that migrated to a classification scheme instead of remaining simple keyword-container searches.
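The monthly alert pass described above could be sketched roughly like this. Everything here is an illustrative assumption -- the alert level, the vocabulary, the whitespace tokenization, and the citation texts are placeholders, not part of any real indexing service:

```python
from collections import Counter

# Hypothetical sketch of the monthly alert pass. The alert level,
# vocabulary, and citations below are illustrative assumptions only.
ALERT_LEVEL = 5  # expert-chosen threshold: distinct citations per month

domain_vocabulary = {"schema", "design", "taxonomy", "notes", "metadata"}

def new_term_alerts(citations, vocabulary, alert_level=ALERT_LEVEL):
    """Count terms across this month's citations and flag any term that
    is not yet in the domain vocabulary but appears in at least
    `alert_level` distinct citations."""
    counts = Counter()
    for text in citations:
        # Count each term once per citation, so the threshold means
        # "appeared in N distinct citations", not N raw occurrences.
        for term in set(text.lower().split()):
            counts[term] += 1
    return sorted(
        term for term, n in counts.items()
        if n >= alert_level and term not in vocabulary
    )

# Example: "xml" appears in 5 citations and is not yet in the vocabulary,
# so it lands in the monthly update for the domain experts.
citations = ["XML schema design"] * 5 + ["taxonomy notes"]
print(new_term_alerts(citations, domain_vocabulary))  # -> ['xml']
```

The experts stay in the loop: the machine only surfaces candidates past the threshold, and the quarterly vocabulary update remains a human editorial decision.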