
Showing posts with label Heap Dump. Show all posts

Sunday, May 31, 2020

How to Setup a Standalone Memory Analyzer for Windows 10

The Eclipse Memory Analyzer is a fast and feature-rich Java heap analyzer that helps you find memory leaks and analyze high memory consumption issues.

Standalone vs Plug-ins


You can install the Memory Analyzer into an Eclipse IDE (see [1]).  However, a standalone Memory Analyzer is useful if you do not want to install a full-fledged IDE on the system where you run the heap analysis.  To download a standalone Memory Analyzer, click here.

Prerequisite


For this illustration, we have downloaded Eclipse Memory Analyzer Version 1.10.0 for Windows (x86_64), which requires a minimum Java version of 1.8.0.

Setup


After unzipping the file, a new folder named mat was created:
C:\Users\<username>\Downloads\mat

You can create a Windows Command Script (.cmd) with the same name in the same folder using the following content:

set PATH=C:\Program Files\Java\jdk-10\bin;%PATH%
start MemoryAnalyzer.exe

To avoid running MAT with the wrong JRE, we have set the PATH environment variable in the script to point to a JDK 10 installation (whose runtime version string is 18.3):

$ cd "/cygdrive/c/Program Files/Java/jdk-10/bin"
$ ./java.exe --version
java 10 2018-03-20
Java(TM) SE Runtime Environment 18.3 (build 10+46)
Java HotSpot(TM) 64-Bit Server VM 18.3 (build 10+46, mixed mode)

To enable MAT to handle large heap dumps (i.e., .hprof files), you can increase the heap size of the MAT runtime by editing MemoryAnalyzer.ini in the same folder:[7]

-startup
plugins/org.eclipse.equinox.launcher_1.5.0.v20180512-1130.jar
--launcher.library
plugins/org.eclipse.equinox.launcher.win32.win32.x86_64_1.1.700.v20180518-1200
-vmargs
-Xmx10g

For example, we have set its max heap size to be 10 GB.

Getting a Heap Dump


The Memory Analyzer can work with HPROF binary formatted heap dumps. Those heap dumps are written by HotSpot and any VM derived from HotSpot. Depending on your scenario, your OS platform and your JDK version, you may have different options to acquire a heap dump.[2]

As a developer, you may want to trigger a heap dump on demand.  On Windows, you can use JConsole, which ships with the JDK.  On Linux and Mac OS X, you can also use jmap or jcmd, which come with the JDK.


Via MAT: use File > Acquire Heap Dump to list locally running Java processes and trigger a dump of one of them.

Via Java VM parameters:
  • -XX:+HeapDumpOnOutOfMemoryError  -XX:HeapDumpPath=[file path]
    • writes a heap dump on the first OutOfMemoryError (recommended)
  • -XX:+HeapDumpOnCtrlBreak 
    • writes heap dump together with thread dump on CTRL+BREAK
    • -XX:+HeapDumpOnCtrlBreak in the HotSpot JVM (by Sun/Oracle) is present in 1.4.2_12 or higher and 1.5.0_14 or higher.  In JVMs 1.6, 1.7, and 1.8 this option is no longer present, but you can use the jmap or jcmd tools instead.

Via Tools:
  • jmap
    • jmap -dump:format=b,file=snapshot.jmap <process id>
    • It is recommended to use the newer jcmd utility instead of jmap for enhanced diagnostics and reduced performance overhead.
  • jcmd
    • jcmd <process id/main class> GC.heap_dump Myheapdump
  • JConsole
    • Launch jconsole.exe and invoke operation dumpHeap() on HotSpotDiagnostic MBean
  • SAP JVMMon
    • Launch jvmmon.exe and call menu for dumping the heap
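The JConsole route can also be scripted.  Below is a minimal sketch (the class name and output file name are our own) that invokes the same dumpHeap() operation from inside the application via the com.sun.management.HotSpotDiagnosticMXBean platform MXBean:

```java
import java.lang.management.ManagementFactory;
import java.nio.file.Files;
import java.nio.file.Paths;
import com.sun.management.HotSpotDiagnosticMXBean;

public class DumpHeap {
    public static void main(String[] args) throws Exception {
        // Look up the same HotSpotDiagnostic MBean that JConsole exposes
        HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        // dumpHeap refuses to overwrite an existing file, so remove it first
        Files.deleteIfExists(Paths.get("MyProgrammaticHeapdump.hprof"));
        // true => dump only live objects (forces a full GC first)
        bean.dumpHeap("MyProgrammaticHeapdump.hprof", true);
        System.out.println("Heap dump written to MyProgrammaticHeapdump.hprof");
    }
}
```

The resulting .hprof file can be opened in MAT just like a dump produced by jmap or jcmd.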
Example 1 Create a Heap Dump using jmap

bash-4.2$ ${JAVA_HOME}/bin/jmap -dump:format=b,file=MyJmapHeapdump.hprof 82734
Dumping heap to /tmp/NM_OOM/MyJmapHeapdump.hprof ...
Heap dump file created

Example 2 Create a Heap Dump using jcmd

$ /bi/app/jdk/bin/jcmd 82734 GC.heap_dump MyJcmdHeapdump.hprof
82734:
Heap dump file created

Notes

  • By default, the heap dump will be generated in the "current directory" of the Java application.[6] 
    • jmap generates heap dump in the directory where you run jmap, which is different from jcmd's behavior.
  • The dump file can be huge, up to Gigabytes, so ensure that the target file system has enough space.
  • You need to be logged in as the same user as the target process in order to attach to it

References

Saturday, August 10, 2013

Diagnosing Heap Stress in HotSpot

Heap stress is characterized as OutOfMemory conditions or frequent Full GCs accounting for a certain percentage of CPU time[6].  To diagnose heap stress, either heap dumps or heap histograms can help.

In this article, we will discuss the following topics:
  1. Heap histogram vs. heap dump[1]
  2. How to generate heap histogram or heap dump in HotSpot


Heap Histogram vs. Heap Dump 


Without much ado, read this companion article for the comparison.  For heap analysis, you can use either jmap or jcmd to do the job[5].  Here we focus only on using jmap.

$ jdk-hs/bin/jmap -help
Usage:
    jmap [option] <pid>
        (to connect to running process)
    jmap [option] <executable> <core>
        (to connect to a core file)
    jmap [option] [server_id@]<remote server IP or hostname>
        (to connect to remote debug server)

where <option> is one of:
    <none>               to print same info as Solaris pmap
    -heap                to print java heap summary
    -histo[:live]        to print histogram of java object heap; if the "live"
                         suboption is specified, only count live objects
    -permstat            to print permanent generation statistics
    -finalizerinfo       to print information on objects awaiting finalization
    -dump:<dump-options> to dump java heap in hprof binary format
                         dump-options:
                           live         dump only live objects; if not specified,
                                        all objects in the heap are dumped.
                           format=b     binary format
                           file=<file>  dump heap to <file>
                         Example: jmap -dump:live,format=b,file=heap.bin <pid>
    -F                   force. Use with -dump:<dump-options> <pid> or -histo
                         to force a heap dump or histogram when <pid> does not
                         respond. The "live" suboption is not supported
                         in this mode.
    -h | -help           to print this help message
    -J<flag>             to pass <flag> directly to the runtime system


Generating Heap Histogram


Heap histograms can be obtained by using jmap (note that you need to use jmap from the same JDK installation which is used to run your applications):

$~/JVMs/jdk-hs/bin/jmap -histo:live 7891 >hs_jmap_7891.txt


 num     #instances         #bytes  class name
----------------------------------------------
   1:       2099805      195645632  [C
   2:        347553       49534472  <constMethodKlass>
   3:       2055692       49336608  java.lang.String
   4:        347553       44501600  <methodKlass>
   5:         30089       36612792  <constantPoolKlass>
   6:       1044560       33425920  java.util.HashMap$Entry
   7:         90868       24909264  [B
   8:         30089       23289072  <instanceKlassKlass>
   9:         22323       18194144  <constantPoolCacheKlass>
  10:        177458       15661816  [Ljava.util.HashMap$Entry;
  11:        642260       15414240  javax.management.ObjectName$Property
  12:        159785       15405144  [Ljava.lang.Object;

In the output, it shows the total size and instance count for each class type in the heap.  For example, there are 2099805 instances of character arrays (i.e., [C), with a total size of 195645632 bytes.  Because the suboption live was specified, only live objects were counted (i.e., a full GC was forced before the histogram was collected).
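As a quick sanity check on such a histogram, dividing #bytes by #instances gives the average shallow size per instance.  A small sketch using the [C row from the output above (the class name is our own):

```java
public class HistogramAverages {
    public static void main(String[] args) {
        // [C row from the histogram above: 2099805 instances, 195645632 bytes
        long instances = 2_099_805L;
        long bytes = 195_645_632L;
        // Average shallow size of one char[] instance, in bytes
        long avgShallowSize = bytes / instances;
        System.out.println("avg shallow size: " + avgShallowSize + " bytes");
    }
}
```

Here the average char[] is only ~93 bytes, i.e., the heap is dominated by many small strings rather than a few huge arrays.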

Generating Heap Dump


A heap dump is a file containing all the memory contents of a Java application. It can be generated via:

$ ~/JVMs/jdk-hs/bin/jmap -dump:live,file=/tmp/hs_jmap_dump.hprof 7891
Dumping heap to /tmp/hs_jmap_dump.hprof ...
Heap dump file created

Including the live option in jmap forces a full GC before the heap is dumped so that it contains only live objects.  We recommend taking multiple heap dumps, for example, 30 minutes and 1 hour into the run.  Then use Eclipse MAT[2,3] to examine the heap dumps.

Finally, there are other ways to generate a java heap dump:
  • Use jconsole option to obtain a heap dump via HotSpotDiagnosticMXBean at runtime
  • Heap dump will be generated when OutOfMemoryError is thrown by specifying
    • -XX:+HeapDumpOnOutOfMemoryError VM option
  • Use hprof[7]


References

Wednesday, August 7, 2013

Diagnosing OutOfMemoryError or Memory Leaks in JRockit

When you run into OutOfMemoryError or other memory-leak issues, generating a heap histogram or heap dump can help you diagnose the memory-bloating issues.

In [1], it lists the following Java Heap related problems:
  • Exceeding max heap 
    • The heap is full and cannot fit a new object
  • Large allocation
    • The new object is too large for the contiguous free space
  • Native exhaustion
    • There is not enough native heap for the requested object[10]
  • GC Starvation
    • The heap is almost full and causing frequent garbage collection[9]
  • Optimization
    • Heap utilization is higher than expected for the current number of users
In this article, we will discuss the following topics:
  1. Heap histogram vs. heap dump (see also [8])
  2. How to generate heap histogram or heap dump in JRockit
  3. JVM options that are useful for heap analysis

Heap Histogram vs. Heap Dump


A heap dump is a snapshot of all the objects in the Java Virtual Machine (JVM) heap at a certain point in time. The JVM software allocates memory for objects from the heap for all class instances and arrays. The garbage collector reclaims the heap memory when an object is no longer needed and there are no references to the object. By examining the heap you can locate where objects are created and find the references to those objects in the source.  However, dumping the Java heap is time-consuming, and the resulting file can be very large.

On the other hand, a heap histogram gives a very good summary of the heap objects used in the application without doing a full heap dump. It can help you quickly narrow down a memory leak. This information can be obtained in several ways:
  • Attach to a running process using the command jrcmd.
  • Generate it from a core file or heap dump
Note that we refer to heap histogram, heap summary, or heap diagnostics interchangeably in this article.

Generating Heap Histogram


A heap histogram can be obtained from a running process using the command:
  • jrcmd 20488 heap_diagnostics


--------- Detailed Heap Statistics: ---------
30.0% 65027k   672176     -1k [C
10.2% 22119k   943754     -2k java/lang/String
10.0% 21592k   183036     -3k [Ljava/lang/Object;
 4.7% 10185k   434587     +0k java/util/HashMap$Entry
 4.7% 10114k    27029  -1254k [B
 4.5% 9783k   111539     +0k [Ljava/util/HashMap$Entry;
 1.9% 4075k    34777     +0k java/lang/Class
 1.9% 4058k    86590     +0k java/util/HashMap
 1.6% 3448k   147156     +0k javax/management/ObjectName$Property
 1.2% 2593k    82994     +0k java/util/LinkedHashMap$Entry
 1.1% 2398k    76765     +0k java/util/concurrent/ConcurrentHashMap$Segment
 1.0% 2215k     9311     +0k [I
 0.9% 1975k    18469     +0k [J

In the output, there is a "Detailed Heap Statistics" section, which shows the total size and instance count for each class type in the heap:
  • The first column is the class type's contribution to the Java heap footprint, in percent
  • The second column is the class type's memory footprint, in KB
  • The third column is the number of instances of that class type
  • The fourth column is the delta (-/+) in the memory footprint of that class type
As you can see from the above snapshot, the biggest data types are [C (i.e., character array) and java.lang.String. To see which data types are leaking, you will probably need to generate several snapshots and look for a trend that can lead to further analysis.
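When comparing several snapshots, it helps to split each statistics row into its columns programmatically.  A minimal sketch (class and method names are our own) that parses the rows shown above:

```java
// Minimal sketch: split one "Detailed Heap Statistics" row into its columns.
// Column layout per the list above: percent, size, instance count, delta, class name.
public class HeapStatsRow {
    public static String[] parse(String row) {
        // Limit of 5 keeps any whitespace inside the class name intact
        return row.trim().split("\\s+", 5);
    }

    public static void main(String[] args) {
        String[] cols = parse("30.0% 65027k   672176     -1k [C");
        System.out.println(cols[4] + " has " + cols[2] + " instances");
    }
}
```

Feeding successive snapshots through such a parser lets you diff the instance counts per class type and spot the growing ones.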

Generating Heap Dump


A heap dump is a file containing all the memory contents of a Java application. It can be generated via
  • jrcmd 20488 hprofdump
    • Wrote dump to /.../appmgr/APPTOP/instance/debug/jrockit_20488.hprof
Then you can use various tools to load that file and look at the heap: how much memory each kind of object is using, which objects are holding onto the most memory, and so on.  The size of a heap dump file is proportional to the size of the Java heap and can be large.

Three of the most common tools are:

  • jhat
    • This is the original heap analyzer tool, which reads the heap dump and runs a small HTTP server that lets you look at the dump through a series of web page links.
  • VisualVM [3]
  • MAT [4,5]

Heap-Related JVM Options


When your JVM runs into OutOfMemoryError, you can set:
  • -XX:+HeapDumpOnOutOfMemoryError
to get a heap dump just before the JVM dies, when the heap is big and bloated.  Also, you can provide the following flags:
  • -XX:HeapDumpPath=<path to the destination>
  • -XX:+ExitOnOutOfMemoryError
Similarly, you can get a heap histogram instead of a full heap dump using[7]:
  • -XX:+HeapDiagnosticsOnOutOfMemoryError 
  • -XX:HeapDiagnosticsPath=<path to the destination>

Wednesday, March 27, 2013

Analyzing the Performance Issue Caused by WebLogic Session Size Too Big

In this article, we will show you a number of interesting things:
  1. How to investigate an issue with too many live objects kept in the old generation, which cannot be reclaimed by Full GC in the HotSpot VM
    • For background information regarding Garbage Collectors, read [1-7]
  2. How important the session-timeout element in web.xml is
  3. How to use the Java tool jmap to generate a heap dump
  4. How to use the Eclipse Memory Analyzer to analyze a heap dump (i.e., the fod.hprof file)
    • Download link: here

The Issues


We have seen slow performance in one of our benchmarks (i.e., FOD).  Here are the symptoms:
  • GC took about 50% of the application's CPU time
  • Full GC's were not triggered by full Perm Gen, but by full Old Gen
    • The old generation is completely full and it isn't clearing up anything but a few soft references during Full GC.
So, the symptoms all point to too many live objects kept in Old Gen and they have caused many Full GC's.  There could be different causes for the above symptoms.  To investigate, you need to create heap dumps and check out what objects HotSpot VM is holding onto.  Below we will show you how to investigate this.

Generating Heap Dump


First, we used the Java tool jmap to generate a summary of the heap contents:

$./jmap -histo 16345 >/scratch/aime1/tmp/fod.summary

 num     #instances         #bytes  class name
----------------------------------------------
   1:      15777231      822030680  [Ljava.lang.Object;
   2:       3074270      463615192  [C
   3:       4076972      331840256  [Ljava.util.HashMap$Entry;
   4:       9287101      297187232  java.util.HashMap$Entry
   5:       2105437      151591464  org.apache.myfaces.trinidad.bean.util.PropertyHashMap
   6:       4075444       97810656  java.util.HashMap$FrontCache
   7:       1893014       90864672  java.util.HashMap
   8:       2018043       80721720  org.apache.myfaces.trinidad.bean.util.FlaggedPropertyMap
   9:       3017161       72411864  java.lang.String


Unfortunately, it didn't tell us much regarding the biggest Java objects, which have taken up ~800 MB.  But we did notice that two other property maps also took up a lot of memory space.  So, the next step is to generate a full dump as follows:

$./jmap -dump:live,file=/scratch/aime1/tmp/fod.hprof 16345

To analyze the full heap dump, you need to use the Eclipse Memory Analyzer; we have chosen the standalone version.

Analyzing Heap Dump With Memory Analyzer (MAT)


I started the Memory Analyzer with the following command:

$ cd /scratch/sguan/mat/
$ ./MemoryAnalyzer &

When I opened the heap dump file (i.e., fod.hprof), MAT failed with a message related to Java heap space.  So, I needed to modify the following lines in the MemoryAnalyzer.ini file:

-vmargs
-Xmx1024m

by changing -Xmx1024m to -Xmx6240m.  Note that how large you should set the -Xmx option depends on:
  • Size of heap dump
    • Our original file size was 6.4 GB, which was reduced to 3.6 GB after loading.  Note that MAT removes unreachable objects during the loading process.
  • Size of physical RAM in your system

The Culprit— WebLogic Session Objects


From the MAT, we have found out that the biggest live object kept in the heap was:
  • weblogic.servlet.internal.session.MemorySessionContext @ 0x705e0ff08 
The two previously-mentioned property map objects are also related to the session object.  If you trace those objects back to the GC root, you will see that they are referenced by the session state. They are the reason the session state is so big.

session-timeout Element in web.xml


A session object is used by the WebLogic container to track a user's session with the StoreFrontModule web application in our FOD benchmark.  Every time a new user comes in, a session starts. This session object keeps the user's data or state in it. If the user leaves, the session is considered idle, and the server retains it in memory for a period of time before reclaiming it.

Java web applications can be configured with a session timeout value which specifies the number of minutes a session can be idle before it is abandoned.  This session timeout element is defined in the Web Application Deployment Descriptor web.xml[9]. For example, this was the session timeout value we had:

<session-config>
<session-timeout>35</session-timeout>
</session-config>
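To see why a high timeout hurts, note that an idle session survives for the full timeout window before it can be reclaimed.  A back-of-the-envelope sketch (the arrival rate and per-session size below are purely assumed numbers, not measurements from our benchmark):

```java
public class SessionRetentionEstimate {
    public static void main(String[] args) {
        int newSessionsPerMinute = 1_000; // assumed arrival rate
        int timeoutMinutes = 35;          // the value from web.xml above
        int sessionSizeKb = 100;          // assumed per-session state size
        // Idle sessions accumulate for the whole timeout window
        long retainedSessions = (long) newSessionsPerMinute * timeoutMinutes;
        long retainedMb = retainedSessions * sessionSizeKb / 1024;
        System.out.println(retainedSessions + " idle sessions, ~" + retainedMb + " MB retained");
    }
}
```

Under these assumed numbers, a 35-minute timeout keeps tens of thousands of idle sessions (several GB) live in the old generation, which is exactly the kind of pressure we observed.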

The slowness of our benchmark was due to this session timeout value being too high, which caused sessions to pile up and take up lots of memory. After setting it to a lower value (see [10] on how to patch web.xml in an EAR), our benchmark performed normally.
