Topic: Strengthening and weakening requirements on standard algorithms in TR2
Author: David Abrahams <dave@boost-consulting.com>
Date: Mon, 6 Jun 2005 21:56:58 CST
caj@cs.york.ac.uk (chris jefferson) writes:
> Various of the searching algorithms (find, find end, search_n, search,
> etc) specify their complexity in terms of applications of the
> corresponding predicate. There are two major problems I have with the
> current specifications:
>
> 1) Their requirements are not in terms of the result.
>
> For example, I would prefer find's complexity to be:
> At most "i - first + 1" applications of the corresponding predicate
> when such an iterator i (the returned iterator) exists, or
> last - first applications otherwise.
>
> 2) They don't place a limit on the number of "other things" the
> algorithm can do (this would be more important if (1) were fixed).
>
> Some implementations tend to write something like
>
>     dist = std::distance(first1, last1);
>
> at the beginning of algorithms which take forward or bidirectional
> iterators, meaning that even if there is a match near the start, the
> algorithm still takes O(n).
Even with a perfect implementation the algorithm is still O(N): big-O
gives an upper bound, so returning early on a match near the start
doesn't change the worst case.
:)
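
For illustration, here's a minimal sketch of a find that meets the
tightened bound (the name find_sketch and the comments are mine, not
proposed wording):

    // Sketch only: applies pred exactly i - first + 1 times when the
    // first match is at i, and last - first times when there is none.
    template <class InputIterator, class Predicate>
    InputIterator find_sketch(InputIterator first, InputIterator last,
                              Predicate pred)
    {
        for (; first != last; ++first)
            if (pred(*first))   // one predicate application per element
                return first;   // stop immediately on the first match
        return last;
    }

Any reasonable implementation already looks like this; the question is
only whether the standard should *require* the early return.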
> Was there a reason not to place stronger requirements, or was it simply
> thought the existing requirements would be strong enough for most people?
I don't know, but at some point, unless you specify the code that
implements the algorithm (and we don't want to do that for many good
reasons) it will come down to quality-of-implementation. You have to
draw the line somewhere.
> Out of curiosity, if people consider this reasonable, would it be
> feasible to add these tightened requirements to TR2? (I'm not asking
> someone else to write it for me; I'm asking whether any experts think
> there is a reasonable chance such a proposal would be accepted.)
Yeah, there's a reasonable chance.
> Another similar thing I'd like to see in TR2 is a weakening of the
> requirements of various algorithms (in particular, having sort and
> stable_sort require only bidirectional iterators being the most
> important,
What time and space complexity guarantees would you offer when
bidirectional iterators are passed?
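
For concreteness, one plausible answer is "O(N log N) comparisons when
a buffer is available, O(N log^2 N) otherwise" -- what a merge sort
built on inplace_merge already achieves. A minimal sketch, assuming
the weakened requirement (the name bidirectional_stable_sort is mine,
purely for illustration):

    #include <algorithm>
    #include <iterator>

    // Sketch only: a stable merge sort needing just bidirectional
    // iterators.  inplace_merge itself requires only bidirectional
    // iterators; with its temporary buffer the sort does O(N log N)
    // comparisons, degrading to O(N log^2 N) when no buffer can be
    // allocated.  distance/advance add O(N) traversal per level.
    template <class BidirectionalIterator>
    void bidirectional_stable_sort(BidirectionalIterator first,
                                   BidirectionalIterator last)
    {
        typename std::iterator_traits<BidirectionalIterator>
            ::difference_type n = std::distance(first, last);
        if (n < 2)
            return;
        BidirectionalIterator middle = first;
        std::advance(middle, n / 2);    // O(N) split, no random access
        bidirectional_stable_sort(first, middle);
        bidirectional_stable_sort(middle, last);
        std::inplace_merge(first, middle, last);  // stable merge step
    }

Whether guarantees that weak are worth standardizing for sort and
stable_sort themselves is exactly the question.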
> but other weakenings have been discussed in this group). Does
> this sound reasonable?
It does to me.
> or would it be necessary to add a new algorithm,
> tr2::bidirectional_stable_sort, or something similar (which seems
> messy).
Definitely not.
--
Dave Abrahams
Boost Consulting
www.boost-consulting.com