Topic: should std::vector<> exponential growth rate be followed strictly in times of low memory?
Author: dhruvbird@gmx.net ("Dhruv Matani")
Date: Fri, 29 Oct 2004 17:55:39 GMT
On Mon, 25 Oct 2004 22:36:44 +0000, Dave Harris wrote:
> dhruvbird@gmx.net ("Dhruv Matani") wrote (abridged):
>> Now, in this case, the vector had an old capacity of init_sz, and
>> 2*init_sz would be such that init_sz + 2*init_sz > 3GB. Linux (32-bit)
>> supports a maximum of 3GB of data segments per application, so clearly
>> the allocator would not be able to provide the required memory. What
>> I'm now wondering is whether the standard should be permissive and
>> allow the vector to start growing slowly at that point, say by 10
>> elements at a time, to prevent bad_alloc from being thrown?
>
> But then adding N elements would mean copying the 1.2GB already in the
> vector N/10 times. That would be terribly slow.
>
> std::vector has a max_size() member, which in your case should reflect
> the 3GB limit. It would be reasonable for a vector to grow to
> min( max_size(), capacity()*2 ). And it would be conforming, too - no
> conforming program could tell the difference.
>
> Incidentally, many implementations use a factor of 1.5 rather than 2, so
> they don't grow quite as quickly. 1.2GB + 1.5*1.2GB = 3GB, so you are in
> luck, as long as you don't have anything else consuming memory :-)
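For concreteness, here is a minimal sketch of the capped growth policy
suggested above. The function name grow_capacity and its exact shape are
illustrative only, not code from any actual standard library:

    #include <cstddef>

    // Illustrative sketch: compute the next capacity for a vector-like
    // container.  Double the current capacity, but clamp the result to
    // max_size so that growth near the limit degrades gracefully
    // instead of requesting memory that cannot exist.
    std::size_t grow_capacity(std::size_t current, std::size_t max_size)
    {
        if (current == 0)
            return 1;                 // first allocation
        if (current > max_size / 2)   // doubling would exceed max_size
            return max_size;          // clamp: min(max_size, 2*current)
        return current * 2;
    }

As noted above, a conforming program could not distinguish this from
plain doubling, since the standard only requires capacity() to be at
least the requested size.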
Yes, but in practice that is rarely the case, and IMHO failing outright
when the user just wanted to add one more entry is much worse than
spending a long time copying 1.2GB and having the operation succeed.
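To put rough numbers on the cost being traded off here, a small
self-contained count of element copies under fixed-increment growth
versus doubling (illustrative only; the real cost also includes the
allocations themselves):

    #include <cstddef>
    #include <iostream>

    // Illustrative sketch: count the element copies caused by
    // reallocation while growing from 0 to n elements, (a) adding a
    // fixed 10 slots per reallocation, (b) doubling the capacity.
    int main()
    {
        const std::size_t n = 100000;

        std::size_t cap = 0, copies = 0;
        for (std::size_t size = 0; size < n; ++size)
            if (size == cap) { copies += size; cap += 10; }
        std::cout << "fixed +10: " << copies << " copies\n"; // ~n*n/20

        cap = 0; copies = 0;
        for (std::size_t size = 0; size < n; ++size)
            if (size == cap) { copies += size; cap = cap ? 2 * cap : 1; }
        std::cout << "doubling:  " << copies << " copies\n"; // < 2*n
        return 0;
    }

With n = 100000 this prints roughly 5*10^8 copies for the fixed
increment against about 1.3*10^5 for doubling, which is the quadratic
versus linear gap behind the "terribly slow" remark above.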
Regards,
-Dhruv.