Topic: should std::vector<> exponential growth rate be followed strictly in times of


Author: brangdon@cix.co.uk (Dave Harris)
Date: Mon, 25 Oct 2004 22:36:44 GMT
dhruvbird@gmx.net ("Dhruv Matani") wrote (abridged):
> Now, in this case, the vector had an old capacity of init_sz, and
> 2*init_sz would be such that init_sz+2*init_sz is > 3GB. Linux(32-bit)
> supports a max of 3GB segments for the application's data, so clearly
> the allocator would not be able to provide the required memory. What
> I'm now thinking about is should the standard be permissive and
> allow the vector to start growing slowly now, say increase the size
> by 10 elements to prevent bad_alloc from being thrown?

But then adding N elements would mean copying the 1.2Gb already in the
vector roughly N/10 times. That would be terribly slow.

std::vector has a max_size() member, which in your case should reflect the
3Gb limit. It would be reasonable for a vector to grow to min( max_size(),
capacity()*2 ). And it would be conforming, too - no conforming program
could tell the difference.

Incidentally, many implementations use a factor of 1.5 rather than 2, so
they don't grow quite as quickly. 1.2Gb + 1.5*1.2Gb = 3Gb, so you are in
luck, as long as you don't have anything else consuming memory :-)

-- Dave Harris, Nottingham, UK

---
[ comp.std.c++ is moderated.  To submit articles, try just posting with ]
[ your news-reader.  If that fails, use mailto:std-c++@ncar.ucar.edu    ]
[              --- Please see the FAQ before posting. ---               ]
[ FAQ: http://www.jamesd.demon.co.uk/csc/faq.html                       ]