Solr Out of Memory (OOM): Causes and Solutions


The SearchStax Managed Search service makes it easy to set up, manage, and maintain Apache Solr, an open-source enterprise-search platform from the Apache Lucene project. One issue with Java applications is that they sometimes run out of memory, and Solr deployments are no exception.

When Solr runs out of memory, we intuitively assume that the index is too large or that the application is overwhelmed by a very high indexing rate. Although these issues are common, they might not be the real or the only causes.

Below we review the key reasons why your Solr deployment might throw an out-of-memory error and what you can do to resolve it.

Requesting a large number of rows

Queries requesting a large number of rows can run the system out of memory.

When investigating performance issues in client deployments, we often see that the queries are asking for more than a million rows! Although Solr might not return that many documents, it internally allocates memory for the number of results that the query requested. 

Out of Memory Solution:

You should configure your application to request only the number of rows that you actually display in the search results. Even if you are using faceting, requesting only 10 or 20 rows still computes the facets over the entire result set.
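For example, a request along the following lines (the query term and field names here are only illustrative) returns just the ten documents you display, while the facet counts are still computed across every matching document:

    /solr/<collection>/select?q=laptop&rows=10&start=0&facet=true&facet.field=category

There is no need to inflate the rows parameter to get accurate facet counts.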

Queries starting at a large page number 

Queries starting at a large page number use unexpected amounts of memory. A similar performance issue happens when queries do deep pagination by passing a large start parameter. Solr has to collect all results up to the value of the start parameter before it can return the requested page, resulting in heavy memory utilization.

Out of Memory Solution:

If your application cannot be restructured to avoid deep pagination, you can use cursors (the cursorMark parameter) to page through large result sets efficiently. You can learn more about them in the Solr pagination documentation.
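As a rough sketch of how cursors work (assuming the collection's uniqueKey field is named id), the first request sets cursorMark=* and sorts on the uniqueKey as a tiebreaker; each response then returns a nextCursorMark value that you pass into the following request:

    /solr/<collection>/select?q=*:*&rows=100&sort=id asc&cursorMark=*
    /solr/<collection>/select?q=*:*&rows=100&sort=id asc&cursorMark=<nextCursorMark from previous response>

You stop when the nextCursorMark returned in a response is the same as the cursorMark you sent. (Remember to URL-encode the sort parameter when sending the request.) Unlike a large start value, a cursor does not force Solr to track all of the preceding results in memory.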

Faceting, Sorting and Grouping Queries

Faceting, sorting and grouping queries use a lot of memory, especially when they run against fields that do not have docValues. In general, faceting, sorting and grouping queries are expensive and have high memory utilization. Setting docValues=true in the schema field definition reduces the Java heap requirement by keeping the field data on disk where it can be memory-mapped, instead of building it up on the heap.

Out of Memory Solution:

If you are having out-of-memory issues, you should investigate the fields that are being used for faceting, grouping, and sorting, and make sure that their schema sets docValues=true. (If you change a docValues setting in the schema, you’ll have to reindex your content.)
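For example, a field used for faceting or sorting might be defined in the schema along these lines (the field name category is just an illustration; docValues apply to string, numeric, date, and boolean field types, not to analyzed text fields):

    <field name="category" type="string" indexed="true" stored="true" docValues="true"/>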

Large Caches – filterCache, queryResultCache, documentCache, fieldCache

Caching makes Solr fast and reliable by trading memory for speed. Large caches can be one of the reasons behind your out-of-memory problems.

There are different kinds of caches that are configured in solrconfig.xml:

  1. filterCache: This cache stores unordered sets of document IDs matching the “fq” (filter query) parameters of your queries.
  2. queryResultCache: This cache stores ordered lists of document IDs for the top results of previous searches.
  3. documentCache: This caches the field values that are defined as “stored” in the schema, so that Solr does not have to go back to the index to fetch and return them for display.
  4. fieldCache: This cache is used to hold all of the values for a field in memory rather than on disk. For a large index, the fieldCache can occupy a lot of memory, especially if many fields are cached.

The settings for each cache define its initial size, its maximum size, and its autowarmCount – the number of entries that are copied over from an old searcher to the new one.
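As a rough illustration, the cache definitions in solrconfig.xml look something like the following (the sizes shown are only examples, and the cache class depends on your Solr version – recent versions use solr.CaffeineCache, while older ones use solr.FastLRUCache or solr.LRUCache):

    <filterCache class="solr.CaffeineCache" size="512" initialSize="512" autowarmCount="128"/>
    <queryResultCache class="solr.CaffeineCache" size="512" initialSize="512" autowarmCount="64"/>
    <documentCache class="solr.CaffeineCache" size="512" initialSize="512" autowarmCount="0"/>

The documentCache is normally not autowarmed because it is keyed by internal Lucene document IDs, which change from one searcher to the next.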

Out of Memory Solution:

Looking at Plugins/Stats in the Solr Admin dashboard, you can check the hit ratio of each cache to see whether it is being used. If the hit ratio is very low, the cache is not really being utilized, and you can make it smaller to reduce the memory footprint.

Also, if the number of evictions is very large, chances are that cached entries are being tossed out without ever being used. You may well benefit from reducing the cache sizes to relieve your out-of-memory problems.

You should also note that these caches are per core/collection. The memory requirements will be multiplied by the number of collections. If your application uses a large set of collections, the memory impact of caching will be magnified. 

How To Add More Memory

If you are using SearchStax Managed Search to host your Solr deployments and need more memory, learn how you can upgrade your SearchStax deployment.

In addition, if you’d like to understand whether your Solr process died because of an out-of-memory exception, review SearchStax Product FAQs entry on Solr Out of Memory errors.

Switch To An Easier Solution

If you’re feeling stuck with your current Solr deployment or if it’s no longer fitting the needs of your company, contact one of our experts to see how easy it is to switch to SearchStax Managed Search.

About the Author

By Dipsy Kapoor

VP, Engineering

February 18, 2023
