Topic: should std::vector<>'s exponential growth rate be followed strictly in times of low available memory?
Author: johnchx2@yahoo.com (johnchx)
Date: Fri, 5 Nov 2004 15:52:42 GMT
dhruvbird@gmx.net ("Dhruv Matani") wrote
> 2. Every insertion involves touching a lot of data, so places where data
> is coming in at a very fast rate, this is unacceptable.
If you need *consistently* fast push_back() times -- to keep up with a
fast input source, for instance -- then vector's amortized-constant
time probably won't be good enough. Amortized-constant time is fine
if you don't mind the application grinding to a halt from time to time
while reallocation is performed, but it sounds like you may not be
able to afford it.
> Here, I know in advance the maximum size of the vector, but in other cases
> I might not.
Your best bet is probably to use vector only if you know the maximum
size in advance. If you can reserve() sufficient space up-front, you
don't have to worry about re-allocation. If you can't, and you can't
afford long copy times while receiving input, you'll probably be
better off with a deque.
> Now, given the fact that I have, say, 2GB of Phy. mem., I
> should not be constrained by the exponential growth policy.
Physical memory is probably irrelevant. The real constraint is more
likely to be the amount of unfragmented space available in the portion
of the virtual address space of the process which is earmarked for
heap allocation. If you don't absolutely need the entire data
structure to be in a single contiguous chunk of address space, you'll
get better results (fewer bad_alloc exceptions, no copies, no
reallocation delays) from a deque.
Of course, it's not hard to implement your "reallocation fallback"
policy with a wrapper function like this:
template < class T, class A >
void modified_push_back ( std::vector<T,A>& v, const T& t )
{
    try {
        v.push_back(t);
    }
    catch (std::bad_alloc&) {
        std::vector<T,A> v2;
        v2.reserve( v.capacity() + 1 ); // or whatever
        std::copy( v.begin(), v.end(), std::back_inserter(v2) );
        v.swap(v2);
        v.push_back(t); // retry: the failed call never inserted t
    }
}
---
[ comp.std.c++ is moderated. To submit articles, try just posting with ]
[ your news-reader. If that fails, use mailto:std-c++@ncar.ucar.edu ]
[ --- Please see the FAQ before posting. --- ]
[ FAQ: http://www.jamesd.demon.co.uk/csc/faq.html ]
Author: johnchx2@yahoo.com (johnchx)
Date: Mon, 8 Nov 2004 02:03:07 GMT
dhruvbird@gmx.net ("Dhruv Matani") wrote
> On Fri, 05 Nov 2004 15:52:42 +0000, johnchx wrote:
> > Of course, it's not hard to implement your "reallocation fallback"
> > policy with a wrapper function like this:
> >
> > template < class T, class A >
> > void modified_push_back ( std::vector<T,A>& v, const T& t )
> > {
> >     try {
> >         v.push_back(t);
> >     }
> >     catch (std::bad_alloc&) {
> >         std::vector<T,A> v2;
> >         v2.reserve( v.capacity() + 1 ); // or whatever
> >         std::copy( v.begin(), v.end(), std::back_inserter(v2) );
> >         v.swap(v2);
> >     }
> > }
>
> I don't think this would work, for the simple reason that a vector
> implementation is not required to allocate ONLY n bytes (memory for n
> objects) after a reserve(n) call. The guarantee is that at least n
> objects can be inserted (assuming that it is empty initially) without
> reallocation. The vector can do this internally:
>
> void vector<T>::reserve(size_type n)
> {
>     size_type new_size = std::max(this->capacity() * 2, n);
>     // Normal stuff.
> }
Maybe you didn't notice that we're calling reserve() on a brand new
vector, whose capacity is zero (or near zero, if the implementation
happens to do a little pre-allocation). Under such conditions,
reasonable implementations -- such as the one you illustrate -- will
allocate memory for n "slots," or perhaps a bit more. It would be a
poor (though formally compliant) implementation that reserved much
more than n slots when its previous capacity() was zero or near-zero.