The UseCompressedOops feature (an oop, or ordinary object pointer, is the pointer used to hold a Java object reference), introduced in Java 6, is designed to save memory and CPU cycles by allowing applications to use large heaps of up to 32GB while still using 32-bit pointers to hold object references. How can 32-bit pointers address more than 4GB (2^32 bytes) of memory? The trick is that the lower 3 bits of every object address are always 0 (all objects are aligned on an 8-byte boundary), so the JVM can drop them and use the freed bits to address an 8x larger space (4GB * 8 = 32GB).
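To make the address arithmetic concrete, here is a minimal conceptual sketch (not the actual HotSpot code, which in some modes also adds a heap base) showing how an 8-byte-aligned 64-bit address round-trips through a 32-bit compressed oop:

    // Conceptual illustration only; the real compressed-oop encoding lives inside the JVM.
    public class CompressedOopSketch {
        public static void main(String[] args) {
            long address = 0x7_0000_0008L;                      // an 8-byte-aligned address (~30GB)
            int compressedOop = (int) (address >>> 3);          // drop the 3 always-zero low bits
            long decoded = ((long) compressedOop & 0xFFFFFFFFL) << 3; // restore the full address
            System.out.println(decoded == address);             // true for any address below 32GB (2^35)
        }
    }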
This is a great feature for many applications with moderate-to-high memory requirements. Oracle notes that the same application, when running without the UseCompressedOops flag, may consume as much as 50% more memory than when the flag is used. The good news is that in all recent Java releases UseCompressedOops is on by default for heap sizes below 32GB, which means that in most cases we don't need to worry about it at all.
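If you want to confirm what the JVM actually decided on your machine, -XX:+PrintFlagsFinal prints the final value of every flag; a quick check (the grep is just for convenience) might look like this on a 64-bit JVM:

    # on by default while the heap fits the compressed-oops range
    java -Xmx30g -XX:+PrintFlagsFinal -version | grep UseCompressedOops

    # switched off once the requested heap is too large for 32-bit oops
    java -Xmx40g -XX:+PrintFlagsFinal -version | grep UseCompressedOops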
We need to be aware of the UseCompressedOops setting, however, in two scenarios:
A. When trying to find the right heap size for a memory-hungry application. A process for determining that "right size" could be to start an experiment with a larger heap, e.g., 60-70GB, and lower it until both the application and the garbage collector are happy. But what happens if the sweet spot you find is around 35-40GB? That size means UseCompressedOops is off and you're stuck with expensive 64-bit object pointers. Trying heap sizes below 32GB may reveal that your application (and the GC) can be happy with as little as 25-30GB of memory, which means both cheaper hardware and faster execution (see the example commands after this list).
B. Under certain combinations of command-line parameters, specifying -XX:+UseCompressedOops (or simply relying on it being on by default) may effectively limit the heap size to 32GB. This won't happen when the -Xmx parameter is used: in that case a warning is printed if -Xmx conflicts with -XX:+UseCompressedOops.
But it can happen when the -XX:MaxRAMFraction and -XX:MaxRAM parameters are used without specifying the maximum heap size via -Xmx. It's a rare scenario, and it should probably be addressed by better checks on parameter combinations in the java executable, but it can still happen. As always, a helpful parameter for detecting such anomalies is -XX:+PrintCommandLineFlags; examples for both scenarios follow.
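For scenario A, a minimal sketch of the sizing experiment, assuming a hypothetical application jar app.jar and using unified GC logging (-Xlog:gc, available since JDK 9; older releases can use -verbose:gc) to judge how happy the collector is:

    # just above the compressed-oops limit: 64-bit oops, typically more memory and GC work
    java -Xmx36g -Xlog:gc -XX:+PrintCommandLineFlags -jar app.jar

    # below the 32GB limit: UseCompressedOops stays on, 32-bit oops
    java -Xmx30g -Xlog:gc -XX:+PrintCommandLineFlags -jar app.jar

For scenario B, -XX:+PrintCommandLineFlags shows the flags the JVM chose ergonomically, including MaxHeapSize and UseCompressedOops. A hedged example, assuming a machine where -XX:MaxRAM and -XX:MaxRAMFraction would otherwise produce a heap well above 32GB:

    # no -Xmx here: the ergonomically selected MaxHeapSize may end up capped near 32GB
    # so that UseCompressedOops can stay enabled
    java -XX:MaxRAM=128g -XX:MaxRAMFraction=2 -XX:+PrintCommandLineFlags -version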