Java JIT Compilation, Inlining and JITWatch

Dr. Srinath recently shared an InfoQ article with us titled "Is Your Java Application Hostile to JIT Compilation?". In this blog post, I'm writing down what I learnt from that article.

Overview of Just-In-Time (JIT) compiler

Java code is usually compiled into platform-independent bytecode (class files) using the "javac" command, which is the Java programming language compiler.

The JVM loads the class files and executes the Java bytecode via the Java interpreter. Even though this bytecode is usually interpreted, it may also be compiled into native machine code using the JVM's Just-In-Time (JIT) compiler.

Unlike javac, the JIT compiler compiles code (bytecode) only when required. The JVM monitors the methods executed by the interpreter and identifies the "hot methods" as candidates for compilation. Once a method is identified as hot, the JVM compiles its bytecode into more efficient native code.

In this way, the JVM avoids interpreting a method each time it is executed, and thereby improves the run-time performance of the application.
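As a minimal illustration (the class and method names here are my own, not from the article), the method below is called in a tight loop, so the interpreter's invocation counters will eventually mark it as hot and hand it to the JIT compiler. Running it with "-XX:+PrintCompilation" should show a log line for the method once it is compiled.

```java
public class HotLoop {

    // A small, frequently called method -- a typical JIT compilation candidate.
    static long compute(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        long total = 0;
        // Call compute() many times so its invocation counter passes the
        // compilation threshold and the method is compiled to native code.
        for (int i = 0; i < 100_000; i++) {
            total += compute(1_000);
        }
        System.out.println("total = " + total);
    }
}
```

Run it as `java -XX:+PrintCompilation HotLoop` to watch the compilation happen.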

The -client and -server systems

It's important to note that there are different JIT compilers for the JVM's -client and -server modes. A server application typically runs for a long time, so it benefits from heavier optimizations. A client application, on the other hand, values fast startup and may not need as many optimizations as a server application.


There are many optimization techniques used in JIT compilation, such as "inlining" and "dead code elimination".

Let's look at "inlining" optimization technique in this blog post.

The inlining process optimizes the code by substituting the body of a method into the places where that method is called.

Following are some advantages of inlining:

  • Eliminates the need for virtual method lookup
  • Saves the cost of calling another method
  • Avoids creating a new stack frame
  • No performance penalty for writing small, well-factored methods
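A sketch of what inlining does (the class below is my own example): the JIT compiler can replace a call to a small accessor with the accessor's body, so the optimized code effectively reads the field directly instead of performing a call.

```java
public class InlineExample {

    private final int value;

    InlineExample(int value) {
        this.value = value;
    }

    // Tiny accessor: well under the 35-byte bytecode limit,
    // so it is a natural inlining candidate.
    int getValue() {
        return value;
    }

    static int sum(InlineExample a, InlineExample b) {
        // After inlining, this effectively becomes "a.value + b.value":
        // no virtual dispatch and no extra stack frames.
        return a.getValue() + b.getValue();
    }

    public static void main(String[] args) {
        System.out.println(sum(new InlineExample(2), new InlineExample(3)));
    }
}
```

This is also why small accessors are "free" in practice: the fourth advantage above.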

Whether a method can be inlined depends on its bytecode size. The limit is configured by the "-XX:MaxInlineSize" flag and the default value is 35 bytes of bytecode.

For "hot" methods, which are called with high frequency, the threshold is increased to 325 bytes. This threshold is configured by the "-XX:FreqInlineSize" flag.
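To see these thresholds in action, HotSpot's diagnostic "-XX:+PrintInlining" flag (unlocked with "-XX:+UnlockDiagnosticVMOptions") reports each inlining decision, including whether a callee was rejected for being too large. A small sketch (my own example; the log output format varies by JVM version):

```java
public class InlineThresholds {

    // Small helper: well under the default 35-byte inlining limit.
    static int small(int x) {
        return x + 1;
    }

    public static void main(String[] args) {
        long total = 0;
        // Make small() hot so the JIT considers inlining it.
        for (int i = 0; i < 1_000_000; i++) {
            total += small(i);
        }
        System.out.println(total);
        // Observe the inlining decisions (diagnostic flags):
        //   java -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining InlineThresholds
        // Lowering the thresholds should make the JIT reject small():
        //   java -XX:MaxInlineSize=1 -XX:FreqInlineSize=1 \
        //        -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining InlineThresholds
    }
}
```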

JarScan tool in JITWatch

JITWatch is an open source tool developed to give much better insight into how the JIT compiler affects the code.

JarScan is a tool included in JITWatch that analyzes jar files and counts the bytes of each method's bytecode.

With this tool, we can identify the methods that are too large for the JIT compiler to inline.

PrintCompilation JVM Flag

The “-XX:+PrintCompilation” flag shows basic information about the HotSpot method compilation.

It generates logs like:
37    1      java.lang.String::hashCode (67 bytes)
124   2  s!  java.lang.ClassLoader::loadClass  (58 bytes)

In the example, the first column shows the time in milliseconds since the JVM process started.

The second column is the compile ID, which tracks an individual method as it is compiled, optimized, and possibly deoptimized again.

The next column shows additional information in the form of flags (s - "synchronized", ! - "has exception handlers").

The last two columns show the method name and the size of the method's bytecode in bytes.
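The column layout above can be pulled apart with a few lines of Java. This is only a rough sketch for the two sample lines shown earlier; real PrintCompilation output has more variants (for example, tiered-compilation levels on newer JVMs), and it can mis-split unusual lines.

```java
public class CompileLogLine {

    // Parse one PrintCompilation line into: timestamp, compile ID,
    // optional flags, method name, and bytecode size, e.g.
    //   "124   2  s!  java.lang.ClassLoader::loadClass  (58 bytes)"
    static String[] parse(String line) {
        java.util.regex.Matcher m = java.util.regex.Pattern
            .compile("(\\d+)\\s+(\\d+)\\s+([sb!n%*]*)\\s*(\\S+)\\s+\\((\\d+) bytes\\)")
            .matcher(line.trim());
        if (!m.matches()) {
            throw new IllegalArgumentException("Unrecognized line: " + line);
        }
        return new String[] { m.group(1), m.group(2), m.group(3),
                              m.group(4), m.group(5) };
    }

    public static void main(String[] args) {
        String[] fields =
            parse("124   2  s!  java.lang.ClassLoader::loadClass  (58 bytes)");
        System.out.println("method=" + fields[3] + ", bytes=" + fields[4]);
    }
}
```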

This flag doesn't have much impact on the JIT compiler's performance and therefore we can use it in production.

We can use the PrintCompilation output and the JarScan output to determine which methods are compiled.

There are two minor problems with the PrintCompilation output.

  1. The method signature is not printed, which makes it difficult to identify overloaded methods.
  2. There is no way to direct the log output to a separate file.

Identifying JIT-friendly methods

Following is a simple process to determine whether methods are JIT-friendly.
  1. Identify the methods that are on the critical path for the application's transactions.
  2. The JarScan output should not list those methods (i.e. they are small enough to inline).
  3. The PrintCompilation output should show those methods being compiled.
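Putting the three checks together can be sketched as a simple set comparison (the method names below are made up for illustration, not real JarScan output): flag any critical-path method that JarScan reported as too large, or that never appeared as compiled in the PrintCompilation log.

```java
import java.util.Set;
import java.util.TreeSet;

public class JitFriendlyCheck {

    // Critical-path methods that JarScan flagged as too large to inline,
    // or that never show up as compiled in the PrintCompilation log,
    // deserve a closer look.
    static Set<String> problematic(Set<String> criticalPath,
                                   Set<String> tooLargeFromJarScan,
                                   Set<String> compiledFromLog) {
        Set<String> result = new TreeSet<>();
        for (String method : criticalPath) {
            if (tooLargeFromJarScan.contains(method)
                    || !compiledFromLog.contains(method)) {
                result.add(method);
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Set<String> critical = Set.of("Order::total", "Order::validate");
        Set<String> tooLarge = Set.of("Order::validate");
        Set<String> compiled = Set.of("Order::total");
        System.out.println(problematic(critical, tooLarge, compiled));
        // prints [Order::validate]
    }
}
```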

Comparison of Java 7 and Java 8 methods

The InfoQ article compares the “$JAVA_HOME/jre/lib/rt.jar” of JDK 7 and JDK 8 to identify the changes in inlining behaviour.

Java 7 has 3684 inline-unfriendly methods and Java 8 has 3576 such methods. It’s important to know that methods like “split”, “toLowerCase”, and “toUpperCase” in String are not inline-friendly in either version. This is due to the cost of handling UTF-8 data rather than ASCII.


The JITWatch tool can analyze the compilation logs generated with the “-XX:+LogCompilation” flag.

The logs generated by LogCompilation are XML-based and contain a lot of information related to JIT compilation; hence these files can be very large.
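As a rough sketch of what processing such a log might look like (the XML fragment below is illustrative only, not real HotSpot output; JITWatch itself does the real parsing), one could count the compilation entries in a line-oriented way:

```java
public class LogCompilationScan {

    // Count occurrences of a given XML element name in a log.
    // A real parser (like JITWatch's) handles the full LogCompilation
    // format; this is only a rough line-oriented sketch.
    static long countElements(String log, String element) {
        return log.lines()
                  .filter(line -> line.trim().startsWith("<" + element + " "))
                  .count();
    }

    public static void main(String[] args) {
        // Illustrative fragment, not real HotSpot output.
        String log = String.join("\n",
            "<task compile_id='1' method='java/lang/String hashCode ()I'>",
            "</task>",
            "<task compile_id='2' method='java/lang/String equals ...'>",
            "</task>");
        System.out.println(countElements(log, "task"));
    }
}
```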


This blog post looked at the Just-In-Time (JIT) compiler and its "inlining" optimization technique. The JIT compiler mainly helps to optimize the run-time performance of applications on the HotSpot JVM.

With the JITWatch tools and the PrintCompilation flag, we can understand the JIT behaviour of our applications, and with a quick analysis we can figure out potential performance impacts.

The important point is that if a method is too large, the inlining optimization will not be applied to it. Therefore it's important to write small, JIT-friendly methods when we care about the performance of a system.

It’s also important to measure the performance of the original system and compare it with the performance after applying fixes. We should never apply performance-driven changes blindly.

