memory management - Space efficient C++ vector allocator for large vectors?
I'm working with C++ code that implements a graph algorithm which uses lots of small chunks of memory (a relative of gSpan, but that doesn't matter). The code is implemented in C++ and uses std::vector to store many small elements (on the order of 64 bytes each). However, I'm using it on much larger data sets than the original authors, and I'm running out of memory.
It appears, however, that I'm running out of memory prematurely. Fragmentation? I suspect it's because std::vector tries to grow every time it needs more memory, and vectors insist on contiguous memory. I have 8 GB of RAM and 18 GB of swap, yet when std::bad_alloc is thrown, I'm only using 6.5 GB resident and ~8 GB virtual. I've caught the bad_alloc calls and printed out the vector sizes, and here's what I see:
    size: 536870912
    capacity: 536870912
    maxsize: 1152921504606846975

    terminate called after throwing an instance of 'std::bad_alloc'
      what():  std::bad_alloc
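For reference, a minimal sketch of the kind of guard that produces output like the above (Element is a stand-in for the real ~64-byte type, and the growth loop is illustrative rather than the actual algorithm):

    #include <cstdint>
    #include <iostream>
    #include <new>
    #include <vector>

    // Stand-in for the real ~64-byte element type (illustrative only).
    struct Element {
        std::uint8_t payload[64];
    };

    int main() {
        std::vector<Element> v;
        try {
            // Keep appending until a reallocation fails, which is effectively
            // what the algorithm does on the larger data sets.
            for (;;) {
                v.push_back(Element{});
            }
        } catch (const std::bad_alloc&) {
            std::cerr << "size: "     << v.size()     << '\n'
                      << "capacity: " << v.capacity() << '\n'
                      << "maxsize: "  << v.max_size() << '\n';
            throw;  // rethrowing leads to the terminate/what() message above
        }
    }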
So, clearly, the vector has filled its current capacity, the library is trying to allocate a larger block, and that allocation is failing.
So my questions are:
- Am I correct in assuming that this is the problem?
- What is the solution (besides "buy more RAM")? I'm willing to trade CPU time for fitting in memory.
- Should I convert the entire code to use std::list (and somehow implement operator[] for the places the code uses it)? Would that be more RAM efficient? At the very least it would allow the list elements to be non-contiguous... right?
- Is there a better allocator out there that I can use to override the standard one on vectors for this use case? (A minimal sketch of the allocator plumbing follows this list.)
- What other solutions am I missing?
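For the allocator question, this is a minimal sketch of how a replacement allocator is plugged into std::vector. The ChunkAllocator name is made up, and this one just forwards to global new/delete; a real pool or chunk allocator would change only allocate() and deallocate():

    #include <cstddef>
    #include <new>
    #include <vector>

    // Minimal C++11-style allocator: forwards to global operator new/delete.
    // A real replacement (pool, mmap-backed, etc.) would change only
    // allocate() and deallocate().
    template <typename T>
    struct ChunkAllocator {
        using value_type = T;

        ChunkAllocator() = default;
        template <typename U>
        ChunkAllocator(const ChunkAllocator<U>&) {}

        T* allocate(std::size_t n) {
            return static_cast<T*>(::operator new(n * sizeof(T)));
        }
        void deallocate(T* p, std::size_t) {
            ::operator delete(p);
        }
    };

    template <typename T, typename U>
    bool operator==(const ChunkAllocator<T>&, const ChunkAllocator<U>&) { return true; }
    template <typename T, typename U>
    bool operator!=(const ChunkAllocator<T>&, const ChunkAllocator<U>&) { return false; }

    // Usage: the allocator is the second template parameter of std::vector.
    using ElementVector = std::vector<int, ChunkAllocator<int>>;

Note that even with a custom allocator, std::vector still stores all of its elements in one contiguous block, so an allocator can reduce fragmentation and over-allocation but cannot remove the contiguity requirement itself.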
Since I don't know how much memory will ultimately be used, I'm aware that even if I make changes there still might not be enough memory for my calculations, but I suspect I can at least get a lot further than I'm getting now, since it seems to be giving up very quickly.
I would try using std::deque as a direct drop-in for vector. There's a possibility that, since it (often) uses a collection of chunks, extending the deque could be much cheaper than extending the vector (in terms of the extra memory needed).
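A sketch of what that drop-in swap might look like (Element is a placeholder for the real type; std::deque supports push_back, size(), and operator[], so most call sites compile unchanged):

    #include <cstdint>
    #include <deque>

    // Placeholder for the real ~64-byte element type.
    struct Element {
        std::uint8_t payload[64];
    };

    // Before: std::vector<Element> needs one contiguous block, and growing it
    // may temporarily require old + new storage at the same time.
    // After: std::deque<Element> allocates fixed-size chunks, so growth only
    // needs one more chunk, never a full contiguous reallocation.
    using ElementContainer = std::deque<Element>;

    void example() {
        ElementContainer elems;
        elems.push_back(Element{});   // same interface as vector for push_back
        Element& first = elems[0];    // operator[] still works
        (void)first;
    }

The main things that won't compile after the swap are vector-specific calls such as reserve(), capacity(), and data(); those would have to be removed or reworked.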