[XML-DEV Mailing List Archive] RE: indexing and querying XML (not XQuery)
From: 'Alan Gutierrez' [mailto:alan-xml-dev@e...]

> * Bullard, Claude L (Len) <len.bullard@i...> [2005-08-23 15:34]:
> > One other thought: suppose that rather than providing Google with
> > original content, the index was locally generated and submitted in
> > a standard format. Wouldn't that be something like a locally
> > generated topic map that is published as the sole interface to the
> > content? Is that your idea, Alan? As long as the code is
> > inspectable (no compiled components), this is palatable.
>
> Sorry, fading in and out today. Programming, for one. A lot of
> new information to try to absorb, for another.
>
> Topic maps? You mean these?
> http://www.topicmaps.org/xtm/index.html

Yes. They may not be exactly what you want, but the idea is that local
search engines return a spec'd index to enable smooth, integrated
control of results rather than spidering and scraping, thus returning
local control.

> In essence, most blogs have a search box, so most blogs already
> have a search engine of some kind on board.
>
> I'd like to provide a REST interface that generates results in
> XML, so that a blog can be searched via scripting.

Fine so far.

> I'd like to create a REST interface that would allow a script,
> or a user interface, to create a comment regarding relevance,
> and have the blog store that comment.

Blogs have comments now. Microsoft web pages have ranking boxes, but
these are preset questions or a simple pick-a-number-between-n-and-m.
Relevance is a tricky concept. Relevant to what?

> Gaming is countered by only permitting scripts or persons to
> comment if they also have a blog, and that blog is listed as a
> trusted blog somewhere.

So, a social network of members who can comment. Blogger has that now.
Having scripts comment is tricky. That is where the 'do we trust this
algorithm' question comes up.
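As an aside, the "search results in XML, consumable by scripts" idea is
straightforward to sketch. The snippet below is only an illustration of
what such a results document might look like; the element names
(`results`, `hit`, `url`, `title`), the `score` attribute, and the
example URL are all invented here, since the thread does not specify
any vocabulary.

```python
# Hypothetical sketch: a blog's local search engine serializing its
# hits as an XML document that a remote script could fetch via REST.
# All element/attribute names are invented for illustration.
import xml.etree.ElementTree as ET

def results_to_xml(query, hits):
    """Serialize (url, title, score) hits as an XML results document."""
    root = ET.Element("results", query=query)
    for url, title, score in hits:
        hit = ET.SubElement(root, "hit", score="%.2f" % score)
        ET.SubElement(hit, "url").text = url
        ET.SubElement(hit, "title").text = title
    return ET.tostring(root, encoding="unicode")

doc = results_to_xml("topic maps", [
    ("http://example.org/2005/08/indexing", "Indexing and querying XML", 0.92),
])
print(doc)
```

The point of returning a spec'd document like this, rather than HTML,
is that the consuming script gets a stable contract it can parse,
merge, and rank locally instead of scraping.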
The article Koberg referenced mentions this issue and notes that, in
Cutting's opinion, most of Google's secret-sauce algorithms have been
reverse engineered by the gamers. Given a choice, I'd ask for full
transparency for any algorithm used to evaluate my blog or any network
of blogs in which I elect membership. I would want right of first
refusal if my blog were selected for membership, and the ability to
block engines that attempt to pirate it into a social network (no
press gangs).

> A lot of voting. A lot of consensus. No single authority.

Except for the script writer and the creator of the ontological basis
for relevance. To make this work openly, you will need some form of
folksonomy.

If this all seems paranoid, well, it is. The World Wide Web has
evolved into a universe full of one-click nasties, neglinks, secret
police, and the full rot of propaganda. It has all of the good stuff
too, but 'trust until violated' rules aren't working, so defensive
measures became necessary, and they emerged too, in the form of
AdAware, etc. With blogs, it is now our publications that are being
used, as well as our surfing behaviors. Folksonomies are one thing;
having opaque scripts based on highly local, no-opt-in categories
create the linkages is suspect. Opt-with or opt-not.

len