However fast object creation is in Java, an excessive allocation rate can dramatically impact an application's overall performance. Too many objects created in too little time increase pressure on the garbage collector, resulting in more frequent stop-the-world pauses, which in turn translate into jitter and/or degraded response times for the end user.
Low latency applications follow two broad strategies to work around this issue:
1- GC tuning: increasing the total heap available to the JVM will reduce the frequency of stop-the-world pauses (but not their duration). Allocating more threads to parallel GC, or re-sizing the eden/survivor/tenured spaces, may also help. However, these settings become out of date sooner or later as the volume or distribution of the data processed by the application changes, and will then need to be re-evaluated.
2- use non-allocating patterns: strive to reduce the number of objects created, and hence the workload on the garbage collector. For example:
Profile, profile, profile
Identify allocation hotspots in the code using a profiler such as the Eclipse memory analyzer, YourKit, JProfiler… Make fixes to remove the hotspot and repeat as long as necessary.
Use primitives instead of primitive wrappers and objects
Prefer int to Integer, double to Double, char to Character, etc. Primitive local variables live on the stack (or in registers) rather than on the heap, so they create no work for the garbage collector.
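A minimal sketch of the difference: the boxed version below re-boxes its accumulator on most iterations, while the primitive version allocates nothing (the class and method names are illustrative).

```java
// Sketch: summing with a primitive accumulator vs a boxed one.
public class PrimitiveSum {
    static long sumPrimitive(int n) {
        long total = 0;          // lives on the stack / in a register
        for (int i = 0; i < n; i++) {
            total += i;          // no allocation
        }
        return total;
    }

    static long sumBoxed(int n) {
        Long total = 0L;         // heap object
        for (int i = 0; i < n; i++) {
            total += i;          // unbox, add, re-box: a fresh Long on most iterations
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sumPrimitive(1_000_000)); // 499999500000
        System.out.println(sumBoxed(1_000_000));     // same result, ~1M extra allocations
    }
}
```

Both return the same answer; a profiler run against the boxed version will show the Long allocations immediately.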
The same reasoning extends to data structures: a SparseArray (a map which uses primitives for its keys) will be more memory-efficient than a HashMap, which uses objects for both keys and values.
Also worth mentioning Trove, a library dedicated to primitives-only collections in Java.
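To make the idea concrete, here is a minimal primitives-only map: an open-addressing int-to-int hash table backed by two int arrays. Unlike a HashMap&lt;Integer, Integer&gt;, puts and gets allocate nothing after construction (no Entry nodes, no boxing). This is a sketch, not a substitute for a library such as Trove's TIntIntHashMap: it has a fixed power-of-two capacity and no removal.

```java
// Open-addressing int -> int map with linear probing; all names are illustrative.
public class IntIntMap {
    private static final int EMPTY = Integer.MIN_VALUE; // sentinel: assumes keys never equal this
    private final int[] keys;
    private final int[] values;
    private final int mask;

    IntIntMap(int capacityPow2) {           // capacity must be a power of two
        keys = new int[capacityPow2];
        values = new int[capacityPow2];
        mask = capacityPow2 - 1;
        java.util.Arrays.fill(keys, EMPTY);
    }

    void put(int key, int value) {          // no allocation: just array writes
        int i = key & mask;
        while (keys[i] != EMPTY && keys[i] != key) {
            i = (i + 1) & mask;             // linear probing (assumes the table never fills up)
        }
        keys[i] = key;
        values[i] = value;
    }

    int get(int key, int missing) {         // no allocation, no boxing
        int i = key & mask;
        while (keys[i] != EMPTY) {
            if (keys[i] == key) return values[i];
            i = (i + 1) & mask;
        }
        return missing;
    }

    public static void main(String[] args) {
        IntIntMap m = new IntIntMap(16);
        m.put(42, 7);
        System.out.println(m.get(42, -1)); // 7
        System.out.println(m.get(99, -1)); // -1
    }
}
```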
Reconsider your logging strategy
Logging is a source of allocation. Try to reduce the logging level and the amount of information being logged. If you must log, choose your logger implementation carefully and pick the one that allocates the least.
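Even without switching loggers, the call sites matter. A hedged sketch using java.util.logging (the onTick method and its fields are illustrative): an unguarded statement builds its message string even when the level is disabled, whereas a guard or a Supplier defers that work.

```java
import java.util.logging.Level;
import java.util.logging.Logger;

// Sketch: keeping a disabled log statement from allocating its message.
public class QuietLogging {
    private static final Logger LOG = Logger.getLogger(QuietLogging.class.getName());

    static void onTick(long price, long qty) {
        // Bad: the concatenation builds a String even when FINE is disabled:
        // LOG.fine("tick price=" + price + " qty=" + qty);

        // Guard: skip building the message entirely when the level is off.
        if (LOG.isLoggable(Level.FINE)) {
            LOG.fine("tick price=" + price + " qty=" + qty);
        }

        // Or defer via a Supplier: the message is only built if FINE is enabled.
        // (A lambda capturing locals still allocates the lambda instance itself.)
        LOG.fine(() -> "tick price=" + price + " qty=" + qty);
    }

    public static void main(String[] args) {
        LOG.setLevel(Level.INFO);   // FINE disabled: neither message is built
        onTick(101, 5);
    }
}
```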
Use direct buffers
Direct buffers are allocated outside of the heap and hence are not subject to the vagaries of the garbage collector. They are best suited to long-lived objects such as the app's static data.
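A minimal sketch with java.nio.ByteBuffer: the buffer's backing memory sits off-heap, so the collector never scans or moves it; only the small ByteBuffer wrapper object lives on the heap. The record layout below (an id at offset 0, a price at offset 8) is purely illustrative.

```java
import java.nio.ByteBuffer;

// Sketch: long-lived static data kept in off-heap memory via a direct buffer.
public class OffHeapTable {
    public static void main(String[] args) {
        ByteBuffer table = ByteBuffer.allocateDirect(1024 * 1024); // 1 MiB off-heap

        // Write fixed-width fields straight into off-heap memory.
        table.putLong(0, 42L);        // hypothetical instrument id at offset 0
        table.putDouble(8, 101.25);   // hypothetical reference price at offset 8

        System.out.println(table.isDirect());   // true
        System.out.println(table.getLong(0));   // 42
        System.out.println(table.getDouble(8)); // 101.25
    }
}
```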
Use object pools
It is well established that the concept of immutability leads to better quality code. This is a fundamental tenet of the functional programming paradigm and most functional languages enforce immutability, at least by default.
In certain circumstances this can lead to the creation of an excessive number of objects: e.g. listening to market data updates from multiple external feeds where each feed publishes thousands (if not millions) of messages per second. If each incoming message creates an object in memory, that's a lot of objects for the JVM to keep up with. An alternative is to create a pool of objects which are kept in memory and reused for each incoming feed update.
Now pooling has a bad rap in Java land, and this is deserved to some extent. The technique does lead to more complicated code, is more error-prone, and can actually hurt performance in multi-threaded environments where each resource managed by the pool has to be thread-safe. However, there are ways to achieve pooling without elaborate locking schemes, and the end result is well worth the additional development time.
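One simple lock-light scheme is to pre-allocate all the message objects up front and hand them out through a bounded queue. A hedged sketch, with all names (Tick, TickPool) illustrative rather than from any library: acquire and release just move references, so steady-state processing creates no new objects.

```java
import java.util.concurrent.ArrayBlockingQueue;

// Sketch of a bounded pool of reusable market-data message objects.
public class TickPool {
    static final class Tick {
        long price;
        long qty;
        void reset() { price = 0; qty = 0; }   // scrub state before reuse
    }

    private final ArrayBlockingQueue<Tick> free;

    TickPool(int size) {
        free = new ArrayBlockingQueue<>(size);
        for (int i = 0; i < size; i++) {
            free.add(new Tick());              // the only allocations, done once
        }
    }

    Tick acquire() {
        return free.poll();                    // null if the pool is exhausted
    }

    void release(Tick t) {
        t.reset();
        free.offer(t);                         // back into circulation
    }

    public static void main(String[] args) {
        TickPool pool = new TickPool(2);
        Tick t = pool.acquire();
        t.price = 100;
        t.qty = 5;
        // ... process the update ...
        pool.release(t);                       // the same object serves the next update
        System.out.println(pool.free.size());  // 2
    }
}
```

ArrayBlockingQueue keeps the sketch short and thread-safe; a latency-critical pool would more likely use a pre-sized ring buffer to avoid the queue's internal locking.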