
re: Re: A processing instruction for robots

  • From: rev-bob@g...
  • To: xml-dev@i...
  • Date: 09 Dec 99 05:32:38 -0500

I suppose this is where I'm supposed to come in....  ;)

> > Adding the robots info to every DTD in the world requires
> > unanimous agreement. Adding a PI requires non-interference
> > with other PIs, a vastly simpler task. Waiting for XML
> > to support mixin vocabularies and for those to be widely
> > used, could take a few years.
> 
> Walter is right on both counts, but I'm having trouble getting comfortable
> with his PI idea.  Not violently against it, but two things make me
> uncomfortable.  First of all, PIs basically [expletive deleted].  Having said that, if you
> gotta use them, this is the kind of thing to use them for.

Agreed.  This is an instruction to a specific class of processor, hence it's a good fit as a 
PI from that angle.

> But my big problem is with the idea that individual resources ought to embed
> robot-steering information. It just feels like the wrong level of
> granularity.  Either this ought to be done externally in something like
> robots.txt but smarter, at the webmaster/administrator level, or, with a
> namespaced vocabulary at the individual element level.

This has been tried.  The problem is that the current robots.txt mechanism just doesn't work 
for everybody - robots.txt is supposed to reside in the domain's root [1], and not everybody 
has that access.  (Big examples: Geocities, Angelfire, Tripod, AOL....)  Granted, a tweak 
to that specification allowing a local copy of robots.txt to govern its own subdirectory 
tree would be *most* helpful in that regard, but no such tweak exists.

[1] - See http://info.webcrawler.com/mak/projects/robots/norobots.html under the "The 
Method" heading.  The filename is "/robots.txt" - which forces the file into the root.
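For anyone who hasn't dealt with the file directly, here's a minimal sketch of how a 
conscientious bot consults robots.txt, using Python's standard robot-parser module.  (The 
rules and URLs below are invented for illustration, not taken from any real site.)

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for some site:
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved bot asks before fetching each URL:
print(rp.can_fetch("SomeBot", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("SomeBot", "http://example.com/public/page.html"))   # True
```

Note the whole scheme hinges on that single file at a fixed location - which is exactly 
the problem for authors who can't write to the root.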

Because of this overwhelming gap, there's a hack in HTML that uses META to 
granularize this at a per-document level - e.g. <meta name="robots" 
content="noindex,nofollow"> in the document's head - and a few bots are good about 
obeying that syntax.

> The PI has the characteristic that it *has* to be in the document and can modify
> *only* the whole document.  Also I question the ability of authors to do the right
> thing with this kind of a macro-level control.

I do it all the time.  In fact, I have a default value I can specify in my templates.  (Yes, I 
could use a robots.txt file - the current method is a holdover from before I had a domain 
name for my site.)

> Also I question the ability of robot authors to do the right thing at the individual
> document level.

That's already a current issue.  Bot authors who are conscientious enough to obey the 
META hack will have no problem modifying their source to obey the XML PI as well; 
it's a trivial transformation.  (Especially if the PI uses syntax that's as close to the META 
version as possible!)
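To show how trivial that transformation really is, here's a sketch that recognizes both 
the existing META hack and a hypothetical PI of the form <?robots 
content="noindex,nofollow"?>.  (The PI target name and pseudo-attribute are my invention 
for illustration - no such PI has been standardized.)

```python
import re

# Existing HTML hack: <meta name="robots" content="noindex,nofollow">
META_RE = re.compile(
    r'<meta\s+name=["\']robots["\']\s+content=["\']([^"\']*)["\']', re.I)

# Hypothetical XML PI reusing the same value syntax:
# <?robots content="noindex,nofollow"?>
PI_RE = re.compile(r'<\?robots\s+content=["\']([^"\']*)["\']\s*\?>', re.I)

def robot_directives(doc):
    """Return the set of robot directives found via either syntax."""
    m = META_RE.search(doc) or PI_RE.search(doc)
    if not m:
        return set()
    return {tok.strip().lower() for tok in m.group(1).split(",")}

html = '<html><head><meta name="robots" content="NOINDEX, nofollow"></head></html>'
xml = '<?xml version="1.0"?><?robots content="noindex,nofollow"?><doc/>'
print(robot_directives(html))  # {'noindex', 'nofollow'}
print(robot_directives(xml))   # {'noindex', 'nofollow'}
```

If the PI keeps the META value syntax, a bot's existing directive-handling code doesn't 
change at all - only the pattern that finds the directive does.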

> In any case, there really should be a namespace with a bunch of predeclared
> attributes for this purpose; then for those who want to do fancy things,
> they can do so in a clean way at the individual element level.

Fine - swipe the existing values and go from there.  The fewer changes made, the better - 
from all viewpoints.  Not only will there be fewer deltas for page authors to learn, but 
bot authors will be better able to reuse their existing META code to accommodate the PI.

Note that I'm not saying that a local robots file wouldn't be a wonderful idea - just that 
since you currently have only the choices of "global" and "per document" with HTML, 
you ought to have *at least* those same choices with XML.  A local robots.txt would be 
tasty gravy indeed.
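For the record, here's a sketch of what such a per-directory lookup might mean for a bot: 
check the nearest robots.txt first, then fall back toward the root.  (This is *not* 
current practice - the real specification only ever checks /robots.txt - and the URL is 
hypothetical.)

```python
from urllib.parse import urlsplit

def candidate_robots_urls(url):
    """Hypothetical lookup order for a per-directory robots.txt scheme:
    nearest directory first, falling back to the site root."""
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    urls = []
    # Walk from the deepest directory up to the root.
    for i in range(len(segments) - 1, -1, -1):
        prefix = "/".join(segments[:i])
        path = "/" + (prefix + "/robots.txt" if prefix else "robots.txt")
        urls.append(f"{parts.scheme}://{parts.netloc}{path}")
    return urls

print(candidate_robots_urls("http://example.com/users/alice/page.html"))
# ['http://example.com/users/alice/robots.txt',
#  'http://example.com/users/robots.txt',
#  'http://example.com/robots.txt']
```

Under that scheme, an author on a shared host could drop a robots.txt into their own 
directory and have it honored - which is exactly the access problem described above.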

> Anyhow, is there enough XML on the web to make this interesting?  Serious
> question, I don't know the answer. -T.

I have enough X(HT)ML up to be very interested in this matter - and there's only going 
to be more online as the spec progresses.  Why not address the issue *before* there's a 
huge amount of X(HT)ML online, instead of waiting until a few assorted hacks spring 
up?



 Rev. Robert L. Hood  | http://rev-bob.gotc.com/
  Get Off The Cross!  | http://www.gotc.com/

Download NeoPlanet at http://www.neoplanet.com


xml-dev: A list for W3C XML Developers. To post, mailto:xml-dev@i...
Archived as: http://www.lists.ic.ac.uk/hypermail/xml-dev/ and on CD-ROM/ISBN 981-02-3594-1
To unsubscribe, mailto:majordomo@i... the following message;
unsubscribe xml-dev
To subscribe to the digests, mailto:majordomo@i... the following message;
subscribe xml-dev-digest
List coordinator, Henry Rzepa (mailto:rzepa@i...)


