c# - CLR / High memory consumption after switching from 32-bit process to 64-bit process
I have a backend application (Windows Service) built on top of the .NET Framework 4.5 (C#). It runs on a Windows Server 2008 R2 machine with 64GB of RAM.
Due to native dependencies, I used to compile and run this application as a 32-bit process (compiled as x86) and set the LARGEADDRESSAWARE flag so it could use more than 2GB of the user address space. With this configuration, the average memory consumption (according to the "Memory (Private Working Set)" column in Task Manager) was approximately 300-400MB.
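For reference, the C# compiler does not expose LARGEADDRESSAWARE directly; one common way to set it on a .NET executable is a post-build step using editbin from the VC++ build tools. A sketch under that assumption (the path to editbin.exe depends on the Visual Studio installation):

    rem Post-build event: mark the x86 executable as large-address-aware.
    editbin /LARGEADDRESSAWARE "$(TargetPath)"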
The reason I needed the LARGEADDRESSAWARE flag, and the reason I later switched to 64-bit, is that although 300-400MB is the average, once in a while the application does work that involves loading a lot of data into memory (and it is much easier to develop and manage such workloads when you are not close to the 2GB limit).
Recently (after removing those x86 native dependencies), I changed the compilation target to "Any CPU", so on the production server it now runs as a 64-bit process. Since this change, the average memory consumption (according to Task Manager) has reached new levels: 3-4GB, even though there is no other change that could explain this behavior.
There are some additional facts about the current situation:
- According to the "# Bytes in all Heaps" counter, the total amount of managed memory is approximately 600MB.
- Debugging the process with WinDbg + SOS, !dumpheap -stat showed that approximately 250-300MB is free, but all the other objects together accounted for much less than the total amount of memory the process uses.
- According to the GC performance counters, Gen0 collections happen on a regular basis. In fact, the "% Time in GC" counter indicates that, on average, 10-20% of the time is spent in GC (which makes sense given the nature of the application: a lot of allocations of information and data structures that are in use only for short periods).
- I am using Server GC in this app (see the snippet after this list for a way to verify this, and the heap size, from inside the process).
- There is no memory pressure on the server; of the available memory (64GB), it uses only 50-60%.
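As an aside, some of these numbers can be cross-checked from inside the process. A minimal C# sketch (the class name is my own) that prints the GC flavor, the live managed heap size, and the process-level counters Task Manager reports:

    using System;
    using System.Diagnostics;
    using System.Runtime;

    static class GcDiagnostics
    {
        static void Main()
        {
            // Bitness and GC flavor actually in effect at runtime.
            Console.WriteLine("64-bit process: {0}", Environment.Is64BitProcess);
            Console.WriteLine("Server GC:      {0}", GCSettings.IsServerGC);

            // Live managed objects -- roughly the "# Bytes in all Heaps" counter.
            Console.WriteLine("Managed heap:   {0:N0} bytes",
                GC.GetTotalMemory(forceFullCollection: false));

            // Process-level numbers for comparison with Task Manager; these include
            // committed GC segments and other runtime overhead, not just live objects.
            using (var p = Process.GetCurrentProcess())
            {
                Console.WriteLine("Private bytes:  {0:N0}", p.PrivateMemorySize64);
                Console.WriteLine("Working set:    {0:N0}", p.WorkingSet64);
            }
        }
    }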
My questions:
- Why is there such a difference between the memory allocated to the process (according to Task Manager) and the actual size of the CLR heap? There is no unmanaged code in this process that could explain it.
- Why does the 64-bit process take so much more memory than the same process running as 32-bit? Even considering that pointers are twice the size, the difference is far bigger.
- Can I do something to reduce the memory consumption, or at least get a better understanding of the issue?
Thank you!
There are some things to consider:
1) You mentioned that you are using Server GC mode. In Server GC mode, the CLR creates one heap per CPU core on the machine, which is more efficient for throughput-oriented, multi-threaded processing in server processes (e.g. ASP.NET workers). Each heap has two segments: one for small objects and one for large objects, and each segment starts with 4GB of reserved memory. Basically, Server GC mode trades a larger memory footprint on the system for better overall throughput (see the config sketch after this list).
2) Pointers are bigger on 64-bit, of course.
3) Foreground Gen2 GCs become very expensive in Server GC mode because the heap is so big, so the CLR tries hard to reduce the number of foreground Gen2 collections, sometimes using background Gen2 GCs instead.
4) Depending on the usage pattern, fragmentation can become a real issue. I have seen heaps with 98% fragmentation (98% of the heap consisting of free blocks).
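If footprint matters more than raw throughput, the GC flavor from point 1 can be changed in configuration. A minimal app.config sketch using the standard .NET Framework runtime switches (values shown for illustration):

    <configuration>
      <runtime>
        <!-- enabled="false" switches to workstation GC: one heap instead of
             one per core, smaller segments, lower footprint, less throughput. -->
        <gcServer enabled="true" />
        <!-- background (concurrent) GC; on by default -->
        <gcConcurrent enabled="true" />
      </runtime>
    </configuration>

Note also that starting with .NET Framework 4.5.1, GCSettings.LargeObjectHeapCompactionMode can be set to compact the large object heap on demand, which can help with the kind of fragmentation described in point 4.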
To really get to the bottom of your problem, you need to capture an ETW trace plus a memory dump, and then use tools like PerfView for detailed analysis.
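For example, PerfView can capture a GC-focused ETW trace from the command line; a sketch of a typical invocation (check PerfView's built-in help for the exact flags):

    PerfView.exe /GCCollectOnly /AcceptEULA /nogui collect gcTrace.etl

The resulting trace can then be opened in PerfView's GCStats view to see collection counts, pause times, and heap sizes per generation.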