Description
Hello.
I am running a multithreaded application performing regression on many observations. The regression uses expectation maximization, so it is a little complex. I break the data into blocks and pass each block to a different thread. Each thread gets its own instance of the regression object that does the math.
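For reference, the threading setup looks roughly like this (a minimal sketch; `TEMRegression`, `TObservationBlock`, and `Fit` are simplified stand-ins for my actual types):

```pascal
uses
  System.Classes;

type
  // Hypothetical stand-ins for the real types:
  TObservationBlock = TArray<Double>;

  TEMRegression = class
  public
    procedure Fit(const ABlock: TObservationBlock);
  end;

  // One worker per block; each worker owns its own regression
  // instance, so nothing is shared between threads.
  TRegressionWorker = class(TThread)
  private
    FRegression: TEMRegression;
    FBlock: TObservationBlock;
  protected
    procedure Execute; override;
  public
    constructor Create(const ABlock: TObservationBlock);
    destructor Destroy; override;
  end;

procedure TEMRegression.Fit(const ABlock: TObservationBlock);
begin
  // The EM iterations run here in the real code.
end;

constructor TRegressionWorker.Create(const ABlock: TObservationBlock);
begin
  FBlock := ABlock;
  FRegression := TEMRegression.Create; // per-thread instance
  inherited Create(False);
end;

destructor TRegressionWorker.Destroy;
begin
  FRegression.Free;
  inherited;
end;

procedure TRegressionWorker.Execute;
begin
  FRegression.Fit(FBlock);
end;
```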
If I run this in release mode, without FullDebugMode, the entire process completes and the maximum memory usage according to Windows is 41 MB.
If I run with FullDebugMode, memory usage grows until I run out of memory. If I shrink the dataset enough that the regression completes (I can only use a few thousand observations of the million or so I need to process) and then exit the program, it takes a long time to exit, but no memory leaks are reported. As a sanity check I created a deliberate leak, and the report did flag it, so I know leak detection is working.
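The deliberate-leak sanity check was essentially this (a minimal sketch):

```pascal
procedure CreateDeliberateLeak;
begin
  // Allocated and never freed; FullDebugMode reports this at shutdown,
  // which confirms leak detection is active in my build.
  TObject.Create;
end;
```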
The complication is that this runs inside a DUnitX test case. When the test ends, Windows still reports 8 GB of memory in use while the testing GUI just sits there. All my objects are freed by the time the test completes, so closing the GUI itself should not somehow "clean up" my memory in any way.
I am unclear how to debug this. My guess is that the logs associated with FullDebugMode keep growing, but I cannot figure out why they would if no leaks are reported after the regression completes.
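One thing I am considering is logging the memory manager's own view of allocated bytes periodically during the run, to see whether live allocations are actually growing or whether the extra process memory is overhead held by the debug memory manager itself. A rough sketch, assuming a Delphi version where `System.GetMemoryManagerState` is available:

```pascal
uses
  System.SysUtils;

// Sums the bytes the memory manager reports as currently allocated.
// Writeln assumes a console test runner; a GUI runner could use
// OutputDebugString instead.
procedure LogAllocatedBytes(const ALabel: string);
var
  State: TMemoryManagerState;
  I: Integer;
  Total: NativeUInt;
begin
  GetMemoryManagerState(State);
  Total := State.TotalAllocatedMediumBlockSize +
           State.TotalAllocatedLargeBlockSize;
  for I := Low(State.SmallBlockTypeStates) to High(State.SmallBlockTypeStates) do
    Inc(Total, State.SmallBlockTypeStates[I].InternalBlockSize *
               State.SmallBlockTypeStates[I].AllocatedBlockCount);
  Writeln(Format('%s: %u bytes allocated', [ALabel, Total]));
end;
```

If the total stays flat while the process size climbs, that would point at bookkeeping held by the debug mode rather than leaked application allocations.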