
Re: Too much power? was RE: 2007 Predictions

  • From: "Kurt Cagle" <kurt.cagle@g...>
  • To: "Michael Champion" <mc@x...>
  • Date: Mon, 22 Jan 2007 11:17:18 -0800


I'm not sure what I said to inspire this very fine rant ...

Some really very good coffee, taken in entirely too large a quantity.

> For years, there are two messages that have consistently come from
> Microsoft -

Only two? :-) I would have thought there would be a lot more inconsistent
messages, considering that nobody below Steve Ballmer is really in charge of
something as pervasive and multifaceted as XML, and that every team has a
somewhat different take on it.

This reality is not widely appreciated outside Redmond -- there's no grand
XML strategy coming from the top down; there are a zillion little project
proposals and bug reports and customer requests and competitive moves being
noticed and processed bottom up.  If things like SVG or XForms support
aren't happening, it's a good bet that nobody in a relevant team can make a
compelling case for spending money on them.  If some customers who need SVG
or XForms support in the browser migrate to Firefox, so be it -- the whole
point of XML is interoperability across platforms and applications.  Of
course, if enough people need technologies we don't support so badly that
they migrate away from a revenue-generating product to get them, that's
another story entirely.  That kind of thing does happen, e.g. the surge in
demand for ODF support in Office after Office 2007 was locked down.  Unless
the competitors spend gazillions of lobbying bucks advocating SVG and
XForms as the Only Real Standards, I don't think that is likely to happen.

There's something of a chicken-and-egg problem here. The gateway effect that I discussed earlier is, I think, still present, though somewhat weakened compared to 1999. Had Microsoft supported SVG in 2000, people would very quickly have found a use for it (the fact that VML has been revived from the dead to serve as an alternative is, I think, a compelling point in that regard). Your customers (and I'm speaking here of you as an advocate of Microsoft, not as Michael Champion) can generally tell you what they are currently using that they would like to see improved, but customers (business customers in particular) are VERY conservative in their ability to forecast a need for a technology that doesn't yet exist (which is one of the reasons why it's usually the small, agile companies staking their future on a new technology that become the most successful players in a new space).

What's more, I really don't think it's a question of SVG and XForms being "the Only Real Standards". They are standards, and they are becoming fairly heavily used formats (especially the former). No one, least of all me, is advocating that you incorporate the W3C standards as the default underlying implementation - that they exist as options for input or output is sufficient. If you have XAML support in IE7 (via WPF), it's not a big jump to make static SVG a loadable format that gets mapped via a transformation into XAML. My suspicion at this stage is that SVG usage is rapidly converging on the static definition anyway. Similarly, SVG input and output is supported in Visio, and it's fairly robust.
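To make that concrete, here is a minimal sketch of such a mapping done as an XSLT transform. It covers only svg:rect as an illustration, and assumes the standard WPF namespace and element names; a real profile would have to deal with paths, text, styling, and the rest:

    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
        xmlns:svg="http://www.w3.org/2000/svg"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation">

      <!-- Wrap the SVG document in a XAML Canvas and recurse. -->
      <xsl:template match="svg:svg">
        <Canvas>
          <xsl:apply-templates/>
        </Canvas>
      </xsl:template>

      <!-- Map a static SVG rectangle onto its XAML counterpart;
           x/y coordinates become Canvas attached properties. -->
      <xsl:template match="svg:rect">
        <Rectangle Canvas.Left="{@x}" Canvas.Top="{@y}"
                   Width="{@width}" Height="{@height}"
                   Fill="{@fill}"/>
      </xsl:template>
    </xsl:stylesheet>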

In other words, I think the issue here is not one of customer support; it's one of interoperability. In 1999, Microsoft could effectively dictate the standards for interoperability. Nearly a decade later, that's no longer anywhere near as true. The WS-* services stack has been well adopted in specific business sectors (finance chief among them) for all the reasons you would expect - it has a strong security layer, supports audit trails, provides the pieces necessary for federation and identity management, and so forth. However, the rest of the world is quickly converging on a RESTian model for interchange that takes advantage of the strengths such a system has in a distributed network (syndication being a big part of that). Microsoft has lost the lead in innovation on the web browser front, is facing challenges from ODF-based office systems (in essence the long tail is adopting ODF, even if the tall head is still using Microsoft standards), has essentially reached stasis on the database front, and is seeing at best slow incremental growth on the server front. As the rest of the world moves to W3C standards, Microsoft will either have to adapt or lose customers, as interoperability between other systems makes Microsoft products look too obviously oriented toward vendor lock-in.

Now, I'm not going to argue that there aren't customers who are perfectly happy with vendor lock-in. There are some fundamental benefits to it, some of which Len Bullard has articulated well in other posts. Interoperability by its very definition implies a consensus or compromise view that forces a lowest-common-denominator approach. As long as you are in one universe and are willing to abide by the rules of that universe, you can get the best that universe has to offer - high-performance apps, a centralized source of distribution, consistent APIs. Apple went that route for many years, and OS X still has a lot of those characteristics ... but Apple also made the decision to build OS X on a Unix (BSD) base, and as such has managed to take advantage of developments in that sector - and in response is beginning to feed back into that base. They chose to give up being completely monolithic in exchange for that interoperability, and for the most part I think the gamble paid off. In a sense, they had to ... the developer base for Apple had been drying up for years, because it was too closed an environment, and the only real way they could survive was to reach out to a different development base (Unix in this particular case).

There is a process in programming known as refactoring. Refactoring can be painful - after a product has reached a certain point in development (especially if that development was ad hoc), you realize that what exists is an interesting proof of concept, but the foundation needs to be rethought in light of subsequent developments. We talk a lot about consensus-based development as stepping back in time, and to a certain extent that's true; what is important in refactoring, however, is that you carry everything you learned into how you develop the product the next time. The rebuilt foundation generally sits at a considerably higher level than the original did, usually with an eye toward generalizing the API, making the documentation and nomenclature more consistent, and increasing applicability to other domains. This is what I see happening on the browser front right now - a refactoring of general programming principles. While you can't (today) do in a browser what you can do in a stand-alone application, I'd be willing to bet that in a couple of years you will be able to - and in a much more open and distributed manner than with those same stand-alone apps.
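A trivial, hypothetical illustration of the kind of consolidation refactoring buys you (the price and tax element names are invented for the example):

    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">

      <!-- Before refactoring: price and tax each had their own
           template with an identical, copy-pasted body. After:
           one generalized rule, and any new numeric element simply
           joins the union pattern rather than getting another copy. -->
      <xsl:template match="price | tax">
        <td class="num">
          <xsl:value-of select="format-number(., '#,##0.00')"/>
        </td>
      </xsl:template>
    </xsl:stylesheet>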

What's the best alternative? It depends upon your needs. If you need THIS functionality RIGHT NOW, then using the existing applications makes perfect sense. If instead you are looking at where you want to be two or three years down the road, then it makes sense to invest in forward-looking technologies. It's a gamble, to be sure, but that's the nature of business - you can't reap the rewards of risk without taking the risk in the first place.


> I think before throwing stones, it would be worth taking a good hard look
> at where the rest of the world (Microsoft's potential customers) are going
> to.

That's pretty much how I spend my pathetic life, and I hear extremely few
actual or plausible customers expressing the views on the importance of the
second-generation XML standards that you are espousing.  It appears that
people want solid, performant implementations of the very basic XML specs
and tools to make them usable.  The few exceptions, e.g. people voting with
their feet against IE's stasis or the outpouring of love for XSLT2 (and
indifference to the old story about XQuery in .NET), convince me that it's
not just a matter of MS not listening to the customers, or existing
customers blindly obeying the mind control rays emanating from Redmond.  The
reality I see is that most of the benefit from XML comes from the simple
fact that it is the universally supported way of exchanging data.  The
actual benefits from and demand for other XML-related technologies drops
very steeply once you move outside geekdom.

Geeks are, by the nature of their profession, forward-looking. They have to be. I'm a geek (I'm also a technology author, which puts me even more into a forward-looking profession), and in order to stay employed tomorrow, I need to know what the hot technologies will be; I need to read the trend lines and the potential disruptors. If I'm wrong, I'm unemployed. That's why you listen not just to your customers - they are lagging indicators - but also to your developer base and everyone else's developer base. Developers will often be the ones playing with tomorrow's technologies even while your customers are still integrating yesterday's; they are the ones who will typically be implementing the technologies and will have the first encounters with whether such tech is in fact viable; and they may very well be the ones making the purchasing decisions two or three or four years down the road. They are your canaries in the coal mine.

Software companies sell the future as a commodity. This makes them different from car companies or shoe companies or cereal companies - you're not just forecasting how much product you will sell, you're forecasting whether the problem space at that time will be even remotely close to your vision. When you write a piece of commercial software, you're making a bet that the services the software will offer in six months, one year, three years, seven years will in fact be the ones customers will buy then. It's why the failure rate for software projects is so high, and why things such as ODF can broadside you.

I don't envy you your position - you need to maintain a consistent dialog between the (often conflicting) viewpoints of the geeks who are trying to see the future and your customers who are hoping that YOU have the solution to the problem they have right now. It's a constant balancing act, and it becomes easy to dismiss naysayers like me who seem to be articulating solutions to problems that don't exist yet. That may be because my glasses are dirty and smudged, or it could be because I see those problems emerging in a time frame where they WILL become important to your customers.

> If all I'm looking for is a list of scheduled flights sorted by time, then
> yes, I would choose the first. The point I don't understand here is that
> this has nothing to do with JSON or XML - it is a matter of what is
> exposed by the data provider.

My point is that the travel *service* is offering a lot more than simply the
data -- it's combining and processing the data and providing consumable
answers or physical world actions.  Interoperable data is a Good Thing, but
it's not the only thing that's important.

I understood the point - but I'm also saying that data that goes unused is simply noise. One of the things I offer as a consulting service is schema design. Schemas are complicated, not because the language is complex but because you are in essence building a data model that can effectively scale to the requirements at hand. Typically, good schema design means finding the minimal subset necessary to provide the minimal level of support for a given domain, then adding to that, in a modular fashion, those things that provide additional targeted functionality to specific subdomains. That minimal subset is the interop layer - I can build transformations or work with manipulators to ensure that this minimal set will provide sufficient data to be useful to the broad set of people who may need those minimal services. And because I've spent time on that minimal subset, the overlays can also be modular and can scale up in complexity without impairing the core usage.
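A sketch of what I mean, as a schema fragment (the contact vocabulary and the urn:example namespace are invented for the illustration): the named elements are the minimal interop core, and the wildcard is the extension point where subdomain overlays plug in without touching that core.

    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
               targetNamespace="urn:example:contact:core"
               elementFormDefault="qualified">
      <xs:element name="contact">
        <xs:complexType>
          <xs:sequence>
            <!-- The minimal core every consumer can rely on. -->
            <xs:element name="name" type="xs:string"/>
            <xs:element name="email" type="xs:string" minOccurs="0"/>
            <!-- Modular overlays add elements in their own
                 namespaces; lax processing keeps the core stable. -->
            <xs:any namespace="##other" processContents="lax"
                    minOccurs="0" maxOccurs="unbounded"/>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:schema>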

The Atom spec is a good example of this. It's a minimally functional syndication format; as people work with it, they have begun to extend it (usually, but not always, in separate namespaces) to handle everything from calendaring to job processing. I find this happening with most of the specs I work with or advocate. You really have two approaches here - build on minimal schemas in a modular fashion, or create superschemas. In my experience, unless those superschemas are actually just ontologies with no relational associations, the latter approach generally doesn't work, because it becomes too specific (and complex) for people to work with.
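For instance, a minimal Atom entry carrying a calendaring extension might look like this (the Atom elements are the real ones from RFC 4287; the cal: namespace and event element are invented for the example - Atom processors that don't understand the foreign markup simply ignore it):

    <entry xmlns="http://www.w3.org/2005/Atom"
           xmlns:cal="urn:example:calendar">
      <title>Planning meeting</title>
      <id>urn:example:entry:2007-01-22:1</id>
      <updated>2007-01-22T11:17:18-08:00</updated>
      <!-- Foreign-namespace extension layered onto the minimal core. -->
      <cal:event start="2007-01-25T09:00:00-08:00"
                 end="2007-01-25T10:00:00-08:00"/>
    </entry>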

> Put another way (and to get back to your analogy) - I need a certain
> voltage and amperage to run my computer. I rely upon the energy grid to
> ensure that there is comparatively little variability in the line, and I
> rely upon a transformer to get just the precise amount of power that's
> needed to keep my laptop happy.

At some point in the future, that will be a good analogy.  At the moment a
more apt analogy seems to be Edison saying "DC current is all you need, just
learn to live with that clunky generator on every block" ... but there are
lots of Teslas out there offering more and different kinds of power.  Real
standards will come along *behind* innovation. As you note, most of the W3C
is doing a good job of accepting and adapting to this reality.  It's the
people who want to lead with committee-generated standards or authoritative
pronouncements, as opposed to following up on successful innovations with
standards, that I object to.

My suspicion at this stage is that this era is largely coming to a close, for the reasons I cited earlier. Sometimes you need to jump-start things - which I believe was the case in the 1998-2003 era. XSLT would not have emerged on its own; it required people to basically throw together a standard and then hammer down the rough edges. SVG is somewhat more debatable. I will agree that it went too far in its approach, but graphical standards have always been contentious (I remember the introduction of PostScript, and the battles that ensued around THAT). On the other hand, is there that much difference between Tim Berners-Lee making a pronouncement about a standard and Steve Ballmer making one?

Concerning the Tesla argument, however, putting Microsoft into the role of Tesla is something I find rather amusing. W3C standards have generally been adopted on their merits, not because of their marketing budget (which, from personal experience, is non-existent). Adopting an open standard has generally had to be done in opposition to the interests of senior management, because programmers are much closer to the code and are less likely to be swayed by a sales presentation quoting arbitrarily derived ROIs. It has often meant going with inferior products in the short term in order to "grow" with those products in the long term, and typically that investment has paid decent dividends.

In general, what I see from my perspective is that Microsoft is now trying to push technology in which it holds the patents (and which offers some advantages - so did Edison's DC push, which would have given consumers immediate power in the short term, at the expense of a General Electric generator on every block; shades of a Microsoft desktop on every desk). Alternating current came largely from GE's competitors, primarily because alternating-current systems consolidated the power distribution grid at a time when consolidation was necessary - although it was, ironically, a GE employee, Charles Steinmetz, who was responsible for most of the contemporary innovations that made alternating current the standard.

Perhaps this is the role that Microsoft should play in the future - not the role of Edison, but the role of Steinmetz: the industry leader that recognizes that core standardization is necessary and works to improve it, rather than trying to stand in its way. GE was certainly not hurt by accepting what was almost but not quite an industry standard. If you were to take even a portion of those Vista developers and shift them onto IE8 in order to support those standards at a level higher than anyone else can, is there any real harm in doing so?


