+ 1
Application RAM usage
Hi, I have an application whose process shows about 12 GB of RAM usage in Task Manager. When the user clicks a button, a callback function runs. Once the callback finishes, memory usage returns to around 12 to 12.1 GB. BUT the callback takes 20 to 25 minutes to complete, and while it is running memory usage climbs to 30 GB. Does this point to an issue in my callback code, or could it be a Windows operating system issue? What should I check to explain the extra memory usage? Is it a memory leak, or just the space complexity of the algorithm? I am storing around 38 million records in a vector of class objects. Where should I start investigating this issue?
11 Replies
+ 3
Giannis, I understand it is difficult to say without the actual code, but unfortunately I will not be able to share it because it is legacy code.
Also, 12 GB to 12.1 GB is honestly not my concern.
The app reaching up to 30 GB is the issue right now, because the application might crash if an extra 18 GB of RAM is not available when it starts.
I am aware of Valgrind and other memory leak tools, but will they help me? The extra 18 GB is not retained once the callback completes. How do I analyze whether that 18 GB (12 to 30) is genuinely needed by the application, or whether it is an OS issue or an algorithm issue?
+ 3
Ketan,
With your recent reply I began to see it's somewhat more complex, since this is per-user data.
However, with such a huge amount of data, I personally would still prefer using a DBMS, for data centralization, ease of maintenance, and data security.
Then again, your company may have its own strategies and plans for each case to be solved, including this one.
I'm standing by to hear and learn :)
+ 3
Thanks Giannis and Ipang
One basic question:
Let's talk about something simple, apart from the data I shared.
Take a simple function in a basic C++ console app that takes 10+ minutes: just a 1-minute sleep inside a for loop that runs 10 times. After each sleep, I append 1 GB of data to a vector<int> (ints are 4 bytes, so that is 1024*1024*1024 / 4 = 268,435,456 ints per iteration).
Does this mean my app will show 9 GB of memory usage after the 9th sleep, and occupy 10 GB of RAM at the end of the loop? Once the function returns, the stack unwinds, the vector<int> is destroyed, and all 10 GB of RAM is released. Is that correct, or is space complexity not the same thing as RAM usage?
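Roughly like the sketch below (simplified, not my real code; the sleep just stands in for the long-running work):

#include <chrono>
#include <cstddef>
#include <iostream>
#include <thread>
#include <vector>

int main()
{
    // 1 GB worth of 4-byte ints: 1024*1024*1024 / 4 = 268,435,456 elements per step
    const std::size_t intsPerGB = (1024ull * 1024ull * 1024ull) / sizeof(int);

    std::vector<int> data;

    for (int step = 1; step <= 10; ++step)
    {
        std::this_thread::sleep_for(std::chrono::minutes(1));
        data.insert(data.end(), intsPerGB, 0);  // append roughly 1 GB of zeroed ints
        std::cout << "after step " << step << ": ~" << step << " GB held\n";
    }

    // when main() returns, 'data' is destroyed and its heap buffer is released
    return 0;
}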
+ 2
Hello Ketan,
I'm not going to say anything about the application you use or its memory usage; I don't know what it is or how it works, and the details are unavailable.
Rather, I would like to suggest using one of the available DBMSs, because you mentioned 38 million records. DBMS software is designed specifically for such tasks: handling huge amounts of data.
There is a lot to consider when developing an application that deals with huge amounts of data. Perhaps a partial data load would help, so it doesn't eat up too much memory: only load the portion of data that is to be worked on, instead of loading the whole thing into memory.
Just a suggestion though ;)
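Just to illustrate the partial-load idea, here is a rough sketch; the Record type, batch size, and file name are made up, and the real per-file calculation would go where process() is:

#include <cstddef>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

// hypothetical record type and processing step, only to illustrate the idea
struct Record { double value; };

void process(const std::vector<Record>& batch)
{
    double sum = 0.0;
    for (const Record& r : batch) sum += r.value;
    std::cout << "processed " << batch.size() << " records, sum = " << sum << '\n';
}

void processFileInBatches(const std::string& path, std::size_t batchSize = 100000)
{
    std::ifstream in(path);
    std::vector<Record> batch;
    batch.reserve(batchSize);

    double v;
    while (in >> v)
    {
        batch.push_back(Record{v});
        if (batch.size() == batchSize)
        {
            process(batch);  // work on this slice only
            batch.clear();   // capacity is kept, but only one batch is alive at a time
        }
    }
    if (!batch.empty())
        process(batch);      // leftover records
}

int main()
{
    processFileInBatches("values.txt");  // placeholder file name
}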
+ 2
Thanks Giannis
+ 1
12 GB is insanely huge for just one application, so going from 12 to 12.1 GB is actually a big difference.
If you use std::vector without specifying the capacity, it automatically allocates a bit more space than it currently needs each time it grows, so it doesn't have to reallocate for every object you add (which would take too much time). In that case there is no memory leak, because when these vectors are destroyed, everything is freed.
It's hard to tell without seeing the code whether you have a memory leak or not, but there are a few ways to check for one, such as sanitizers:
https://developers.redhat.com/blog/2021/05/05/memory-error-checking-in-c-and-c-comparing-sanitizers-and-valgrind
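For example, assuming gcc or clang on Linux, a build with AddressSanitizer will report a leak like the one in this toy program when it exits:

// leak.cpp -- a deliberately leaky toy program, just to show the workflow.
// Build with AddressSanitizer:
//   g++ -g -fsanitize=address leak.cpp -o leak   (or clang++)
// Running ./leak prints a leak report at exit pointing at the 'new' below.
#include <vector>

int main()
{
    std::vector<int>* leaked = new std::vector<int>(1000, 42);  // never deleted
    (void)leaked;
    return 0;
}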
+ 1
Ipang, thanks for sharing this, but I am not sure whether it will help me or not.
The data is just calculations based on a few values in a file... Those values are not under my control, as they are the property of another vendor.
Apart from that, everything is file specific, and a user may have thousands of such files... A file is local to a user, and its calculated details are of no use to other users. Even for a single user, everything is file specific, and a change in the file changes everything.
With such complexity, and with the data local to each file, does it make sense to go with a DBMS? How would I store all the details locally rather than on a server?
Thanks again for sharing your views; it's great to learn different perspectives.
+ 1
Ketan Lalcheta, it will be much easier for you if you can run this code without loading so much data, just for testing (keep a backup first if needed). Then try running that function multiple times (if possible, in a loop a thousand times or more) and see whether the RAM usage keeps increasing after each execution. If you use multiple threads, make sure they all exit and never get stuck in a loop. Also, if you are on Windows, make sure you close all your handles; you can see the total number of open handles in Task Manager, under the CPU graph.
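Something like this rough sketch is what I mean (Windows only, link with psapi.lib; doCallbackWork() is just a placeholder for your real callback):

#include <windows.h>
#include <psapi.h>
#include <cstdio>
#include <vector>

void doCallbackWork()
{
    // placeholder: the real callback goes here (ideally with a reduced data set)
    std::vector<int> tmp(10'000'000, 1);
}

void printMemoryUsage(int run)
{
    PROCESS_MEMORY_COUNTERS_EX pmc{};
    if (GetProcessMemoryInfo(GetCurrentProcess(),
                             reinterpret_cast<PROCESS_MEMORY_COUNTERS*>(&pmc),
                             sizeof(pmc)))
    {
        std::printf("run %d: working set %zu MB, private bytes %zu MB\n",
                    run,
                    pmc.WorkingSetSize / (1024 * 1024),
                    pmc.PrivateUsage / (1024 * 1024));
    }
}

int main()
{
    for (int run = 1; run <= 1000; ++run)
    {
        doCallbackWork();
        printMemoryUsage(run);  // if this keeps climbing run after run, something is not released
    }
}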
+ 1
Ketan Lalcheta
1. Besides the data you keep in the vector, a console program needs a bit of extra RAM (the executable, thread data, process data and much more); on Windows 10 this is, I think, less than 10 MB. Also, if you don't initialize the vector with a specific capacity at the beginning, it may allocate a bit more space than it strictly needs.
For example, calling size() on the vector would return 268435456 (how many ints are stored), but capacity() may return more (see the sketch below).
So after 9 loops it will not be exactly 9 GB, but almost.
2. The data you insert into the vector is not stored on the stack but on the heap. When the vector is destroyed, that memory is freed automatically.
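A tiny demo of both points (shrunk to 1000 elements so it runs instantly):

#include <cstdio>
#include <vector>

int main()
{
    {
        std::vector<int> v;                       // no reserve() up front
        for (int i = 0; i < 1000; ++i)
            v.push_back(i);                       // grows by reallocating on the heap

        std::printf("size = %zu, capacity = %zu\n", v.size(), v.capacity());
        // capacity() is typically somewhat larger than size() here

        std::vector<int> w;
        w.reserve(1000);                          // pre-allocate for the known count
        for (int i = 0; i < 1000; ++i)
            w.push_back(i);
        std::printf("with reserve: size = %zu, capacity = %zu\n", w.size(), w.capacity());
    }   // both vectors are destroyed here: their heap buffers are freed automatically
    return 0;
}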