KBEC-00423 - How to configure HPROF file generation, and details on how to analyze HPROF files

Article ID: 360033184011

An HPROF file can be created as a heap dump of the memory of a Java™ process. It is typically created when an Out of Memory error occurs on the system. The following sections contain details on how to configure HPROF file generation and how to analyze the resulting files:

How to configure where HPROF files are created?

Due to the nature of a heap dump, HPROF files can be extremely large. To preserve these files in a separate location with more space, you can configure where they are generated by changing the value of '-XX:HeapDumpPath'. From the Java documentation for this configuration option:

For example -XX:HeapDumpPath=/disk2/dumps will cause the heap dump to be generated in the /disk2/dumps directory.

Please see the Oracle documentation for more details.

You will need to add a line similar to the following to the wrapper.conf of the machine:

wrapper.java.additional.121=-XX:HeapDumpPath=

Please ensure that the number chosen for the "additional" ID is unique within the wrapper.conf.
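
For example, a wrapper.conf that writes heap dumps to the /disk2/dumps directory from the earlier example might contain lines similar to the following (-XX:+HeapDumpOnOutOfMemoryError is the option, discussed in the next section, that triggers the dump; the property numbers 120 and 121 are examples only, so choose numbers that are unique in your wrapper.conf):

# Generate a heap dump when an OutOfMemoryError occurs
wrapper.java.additional.120=-XX:+HeapDumpOnOutOfMemoryError
# Write the HPROF file to a location with sufficient free space
wrapper.java.additional.121=-XX:HeapDumpPath=/disk2/dumps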

How to disable generation of HPROF files?

The HPROF files are usually an indication of an OutOfMemory issue and are used to analyze these errors, so it is not recommended to stop generating them. You can, however, move them to a different location. To disable the generation of these files, you would need to comment out the "wrapper.java.additional.120=-XX:+HeapDumpOnOutOfMemoryError" line in the wrapper.conf on the server. The wrapper.conf is within the /conf folder.
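
For example, once commented out, the line would look similar to the following (the property number may differ in your wrapper.conf):

#wrapper.java.additional.120=-XX:+HeapDumpOnOutOfMemoryError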

If you are going to disable it, please note that it should only be done temporarily. Even temporarily, disabling HPROF generation is not recommended, as the information generated in the files is useful for investigating the errors.

How to analyze an HPROF file?

It is possible to use the Eclipse Memory Analyzer (MAT) to analyze the HPROF file. However, it is also beneficial to share the HPROF file with CloudBees support so that we can analyze it as well.
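
As an illustration, assuming a standard Eclipse Memory Analyzer installation that includes the headless ParseHeapDump script, a large dump can be pre-parsed and a Leak Suspects report generated from the command line (the dump file name below is only an example):

# Parse the dump and write a Leak Suspects report alongside the .hprof file
./ParseHeapDump.sh /disk2/dumps/java_pid1234.hprof org.eclipse.mat.api:suspects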

Is a partial HPROF of any use?

Yes. Any dump from the server will contain data and error messages that can help debug the situation.

Also, please review the memory settings within your wrapper.conf. This file can be found in /conf. We recommend increasing the memory settings and then checking whether you continue to hit out-of-memory errors.
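
As a sketch, assuming your wrapper.conf uses the standard Java Service Wrapper memory properties, the settings to review look similar to the following (the values shown are examples only, not recommendations):

# Initial Java heap size, in MB
wrapper.java.initmemory=2048
# Maximum Java heap size, in MB
wrapper.java.maxmemory=4096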

If your system generates an HPROF file, please send your wrapper.conf files and any other relevant details to support@cloudbees.com for analysis, then split the heap dump as described in How to generate a heap dump? and upload it to https://uploads.cloudbees.com.
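
As a sketch only (follow the referenced article for the exact procedure expected by support), a large heap dump can be split into smaller chunks with the standard split utility before uploading, for example:

# Split the dump into 1 GB chunks named heapdump.part-aa, heapdump.part-ab, ...
split -b 1G /disk2/dumps/java_pid1234.hprof heapdump.part-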