XML-DEV Mailing List Archive

Statistical vs "semantic web" approaches to making sense of the Net

  • To: "xml-dev@l..." <xml-dev@l...>
  • Subject: Statistical vs "semantic web" approaches to making sense of the Net
  • From: Mike Champion <mc@x...>
  • Date: Wed, 23 Apr 2003 21:09:48 -0400
  • User-agent: Opera7.03/Win32 M2 build 2670


There was an interesting conjunction of articles on the ACM "technews" page 
[http://www.acm.org/technews/current/homepage.html] -- one on "AI" 
approaches to spam filtering 
[http://www.nwfusion.com/news/tech/2003/0414techupdate.html] and the other 
on the Semantic Web.

What struck me is that the "AI" approach (which I'd guess makes heavy use 
of pattern matching and statistical techniques such as Bayesian inference) 
works with raw text whose meaning the authors deliberately obfuscate to get 
past "keyword" spam filters, while the Semantic Web approach seems to 
require explicit, honest markup.  Given the "metacrap" argument about 
semantic metadata (http://www.well.com/~doctorow/metacrap.htm), I suspect 
that in general the only way we're going to see a "Semantic Web" is for 
statistical/pattern-matching software to create the semantic markup and 
metadata.  That is, if such tools can make useful inferences today about 
spam that pretends to be something else, they should be very useful 
tomorrow in making inferences about text written by people who try to say 
what they mean.
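To make the statistical side of this concrete, here is a minimal sketch of the kind of naive Bayes text classification that Bayesian spam filters use -- an illustration of the general technique, not the code of any filter mentioned in the articles, and with toy training data invented for the example:

```python
import math
from collections import Counter

def train(spam_docs, ham_docs):
    """Count word frequencies per class; returns (spam_counts, ham_counts, vocab)."""
    spam_counts = Counter(w for d in spam_docs for w in d.lower().split())
    ham_counts = Counter(w for d in ham_docs for w in d.lower().split())
    vocab = set(spam_counts) | set(ham_counts)
    return spam_counts, ham_counts, vocab

def spam_probability(text, model, prior_spam=0.5):
    """Naive Bayes with Laplace smoothing: estimate P(spam | words in text)."""
    spam_counts, ham_counts, vocab = model
    spam_total = sum(spam_counts.values())
    ham_total = sum(ham_counts.values())
    log_spam = math.log(prior_spam)
    log_ham = math.log(1 - prior_spam)
    for w in text.lower().split():
        # Add-one smoothing so unseen words don't zero out the whole product
        log_spam += math.log((spam_counts[w] + 1) / (spam_total + len(vocab)))
        log_ham += math.log((ham_counts[w] + 1) / (ham_total + len(vocab)))
    # Convert the log-odds back to a probability
    return 1 / (1 + math.exp(log_ham - log_spam))

# Toy corpora, purely for illustration
model = train(
    spam_docs=["buy cheap pills now", "cheap pills cheap"],
    ham_docs=["meeting agenda for monday", "xml schema question"],
)
print(spam_probability("cheap pills", model))  # > 0.5: flagged as spam
print(spam_probability("xml meeting", model))  # < 0.5: passes as ham
```

The key point for the argument above is that nothing in this code inspects markup or metadata -- it learns directly from the raw, even deliberately obfuscated, text.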

This raises a question, for me anyway:  If it will take a "better Google 
than Google" (or perhaps an "Autonomy meets RDF") that uses Bayesian or 
similar statistical techniques to create the markup that the Semantic Web 
will exploit, what's the point of the semantic markup?  Why won't people 
just use the "intelligent" software directly?  Wearing my "XML database 
guy" hat, I hope the answer is that it will be much more efficient and 
programmer-friendly to query bot-generated databases of markup and metadata 
to find the information one needs.  But I must admit that 5-6 years ago I 
thought the world would need standardized, widely deployed XML markup 
before we could get the quality of searches that Google delivers today 
using only raw HTML and the PageRank heuristic algorithm.

So, anyone care to pick holes in my assumptions or reasoning?  If one does 
accept the hypothesis that it will take smart software to produce the 
markup that the Semantic Web will exploit, what *is* the case for believing 
that it will be ontology-based logical inference engines rather than 
statistically-based heuristic search engines that people will be using in 
5-10 years?  Or is this a false dichotomy?  Or is the "metacrap" argument 
wrong, and can people really be persuaded to create honest, accurate, 
self-aware, etc. metadata and semantic markup?

[Please note that my employer, and many colleagues at W3C, may have a very 
different take on this; please don't blame anyone but me for this.]


