Subject: Re: Re: Simple problem - complicated solution - performance
From: Antonio Fiol Bonnín <fiol@xxxxxxxxxx>
Date: Fri, 17 May 2002 09:01:40 +0200
I had already seen FXSL before posting to this list.
As for performance, AFAICT the O(n) approach may seem worth using, except
for the fact that I am thinking of using this in a browser-side
transformation. That is: I have the XML file and the XSLT file on a server. The XML
is dynamically generated, and includes a "call" to the XSLT file (an
xml-stylesheet processing instruction). With FXSL, the size of the
"functions" themselves makes it not worth downloading to the client.
I have not deeply studied what was proposed for O(n log n), but that may
be interesting if it is simple.
Otherwise, the --very simple-- O(n^2) approach will do, as my data sets
will invariably have 10 elements each, which should not be a big deal.
IIRC, for a 10-element data set O(n) and O(n log n) come out roughly the
same, and O(n^2) is worse, but only about 100 steps instead of 10 (one order
of magnitude). The processing time spent on it may well be cheaper than the
network transfer time over a 56K analog modem (yes, still more common in
Europe than DSL lines).
I will study that, and maybe do some performance tests. It seems to me
that I will get some interesting results.
Thank you very much, especially for such a quick answer.
Antonio Fiol Bonnín
Dimitre Novatchev wrote:
The performance will remain linear if a DVC (divide and conquer) algorithm is used.
Read about DVC algorithms and their optimisation at:
Many functions in FXSL have a DVC implementation.
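[For illustration, here is what a DVC maximum might look like as an XSLT 1.0 named template. This is a minimal sketch, not FXSL's actual implementation; the template name and the assumption of a non-empty node-set of numeric values are the author's of this illustration. Splitting the node-set in half at each step keeps the call stack O(log n) deep instead of O(n):]

```xml
<!-- Hypothetical template name; expects a node-set of numeric values. -->
<xsl:template name="max-dvc">
  <xsl:param name="nodes"/>
  <xsl:variable name="n" select="count($nodes)"/>
  <xsl:choose>
    <!-- Base case: zero or one node left. -->
    <xsl:when test="$n &lt;= 1">
      <xsl:value-of select="$nodes[1]"/>
    </xsl:when>
    <xsl:otherwise>
      <xsl:variable name="half" select="floor($n div 2)"/>
      <!-- Recurse on each half; depth grows as log2(n), not n. -->
      <xsl:variable name="left">
        <xsl:call-template name="max-dvc">
          <xsl:with-param name="nodes" select="$nodes[position() &lt;= $half]"/>
        </xsl:call-template>
      </xsl:variable>
      <xsl:variable name="right">
        <xsl:call-template name="max-dvc">
          <xsl:with-param name="nodes" select="$nodes[position() > $half]"/>
        </xsl:call-template>
      </xsl:variable>
      <!-- The comparison coerces both fragments to numbers. -->
      <xsl:choose>
        <xsl:when test="$left > $right">
          <xsl:value-of select="$left"/>
        </xsl:when>
        <xsl:otherwise>
          <xsl:value-of select="$right"/>
        </xsl:otherwise>
      </xsl:choose>
    </xsl:otherwise>
  </xsl:choose>
</xsl:template>
```

[For 10 elements the recursion is only about four calls deep; for a million elements, about twenty.]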
"Stuart Celarier" <stuart at ferncrk dot com> wrote:
Dimitre raises an interesting point about using recursion for computing
the minimum and maximum values of a set of data. Let me throw this
question back out to the list, especially to people with XSLT
implementation experience.
It seems like there must be some practical limits to recursion, since
that would involve a call stack in memory. Is it reasonable to think
about recursion that stacks up a couple of thousand or tens of thousands
of calls deep? Taking a page fault on the call stack seems like it could
get very expensive very quickly.
Clearly, computing the minimum and maximum should require linear time,
O(n). But if the computation itself doesn't scale well, then an
O(n) algorithm could perform much worse in practice. Comments?
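[The kind of recursion in question can be sketched as a straightforward XSLT 1.0 named template — hypothetical names, assuming a node-set of numeric values. Each call strips one node off the list, so the stack grows to n calls for n elements:]

```xml
<!-- Hypothetical template name; expects a node-set of numeric values. -->
<xsl:template name="max">
  <xsl:param name="nodes"/>
  <!-- Seed the accumulator with the first value on the initial call. -->
  <xsl:param name="max-so-far" select="$nodes[1]"/>
  <xsl:choose>
    <!-- Empty list: the accumulated maximum is the answer. -->
    <xsl:when test="not($nodes)">
      <xsl:value-of select="$max-so-far"/>
    </xsl:when>
    <xsl:otherwise>
      <!-- One recursive call per remaining node: stack depth is O(n). -->
      <xsl:call-template name="max">
        <xsl:with-param name="nodes" select="$nodes[position() > 1]"/>
        <xsl:with-param name="max-so-far">
          <xsl:choose>
            <xsl:when test="$nodes[1] > $max-so-far">
              <xsl:value-of select="$nodes[1]"/>
            </xsl:when>
            <xsl:otherwise>
              <xsl:value-of select="$max-so-far"/>
            </xsl:otherwise>
          </xsl:choose>
        </xsl:with-param>
      </xsl:call-template>
    </xsl:otherwise>
  </xsl:choose>
</xsl:template>
```

[A processor that optimises tail recursion can run this in constant stack space, but XSLT 1.0 processors are not required to do so — hence the DVC rewriting discussed above.]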
XSL-List info and archive: http://www.mulberrytech.com/xsl/xsl-list