c++ - How does a memory leak improve performance?


I am building a large RTree (spatial index) full of nodes. It needs to be able to handle many queries and updates; objects are constantly being created and destroyed. The basic test I am running is to measure the performance of the tree as the number of objects in it grows: I insert uniformly-sized, randomly-located objects in counts from 100 up to 20,000, in increments of 100. Searching and updating are irrelevant to the issue I am currently facing.
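For reference, a minimal sketch of the timing loop described above; the RTree header, class name, and insert signature are assumptions for illustration, not the poster's actual code:

    #include <chrono>
    #include <cstdio>
    #include <random>

    #include "RTree.h"  // hypothetical header; the real class is not shown in the question

    int main() {
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> coord(0.0, 1000.0);

        // Insert 100..20000 uniformly sized, randomly located boxes in steps of 100,
        // timing each run separately.
        for (int count = 100; count <= 20000; count += 100) {
            RTree tree;
            auto start = std::chrono::steady_clock::now();
            for (int i = 0; i < count; ++i) {
                double x = coord(rng), y = coord(rng);
                // fixed-size 1x1 boxes at random locations (hypothetical signature)
                tree.insert(i, x, y, x + 1.0, y + 1.0);
            }
            auto end = std::chrono::steady_clock::now();
            std::printf("%d objects: %.2f s\n", count,
                        std::chrono::duration<double>(end - start).count());
        }
    }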

Now, when there is no memory leak, the "insert into tree" performance is all over the place: it runs anywhere from 10.5 seconds for 15,000 objects up to ~18,000 objects, with no pattern to it.

When I deliberately add a leak, something as simple as adding "new int;" (I don't assign it to anything; it is just that one line), the performance instantly falls onto a nice gentle curve, sloping from roughly 1.5 seconds for 100 objects to 20 seconds for the full 20k.
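To be concrete, the change being described is just one unassigned allocation added somewhere in the insertion path, along these lines (the function name and signature here are hypothetical):

    void RTree::insert(int id, double minX, double minY, double maxX, double maxY) {
        new int;  // deliberate leak: allocated, never assigned, never freed
        // ... the normal insertion logic is unchanged ...
    }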

Now, at this point I am sure a lot of you will want the source code. I can include it, but it is huuugggee, and the only thing that makes any difference is that one "new int;" line.

Thanks in advance!

I'm not sure how you came up with this "new int;" trick, but it is not a good way to fix things :) Run your code through a profiler and find out where the real slowdowns are, then focus on fixing the hot spots.

g++ has one built in - just compile with -pg.
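For anyone who has not used it, the typical -pg workflow looks roughly like this (the file and binary names are placeholders):

    g++ -pg -O2 rtree_test.cpp -o rtree_test   # build with profiling instrumentation
    ./rtree_test                               # run normally; writes gmon.out
    gprof ./rtree_test gmon.out > profile.txt  # flat profile and call graph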

