Just as maintaining unneeded indexes can degrade performance, so can the absence of a needed index.

The best way to identify expensive searches processed in the server is to examine the access logs for search operations with a high etime (elapsed processing time) value. Once you have identified these search operations, you can weed out any that do not need to be fast. For example, you may have applications that generate reports by performing inefficient searches (for example, searches that retrieve all entries), and it is probably acceptable for those searches to be slow.
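As a starting point, slow search operations can be pulled out of an access log with standard text tools. The sketch below is hedged: the log lines use an illustrative UnboundID-style field layout (`SEARCH RESULT ... etime=<millis>`), and the exact format of your server's access log may differ, so adjust the pattern and threshold to match your environment.

```shell
# Write two illustrative access-log lines (hypothetical data; the real
# log format depends on the server and its logging configuration).
cat > /tmp/access.log <<'EOF'
[01/Jan/2024:10:00:00 -0500] SEARCH RESULT conn=12 op=3 resultCode=0 etime=2.114
[01/Jan/2024:10:00:01 -0500] SEARCH RESULT conn=13 op=1 resultCode=0 etime=8452.773
EOF

# Keep only search results, then filter on the etime value (in
# milliseconds here), printing operations slower than the threshold.
slow=$(grep 'SEARCH RESULT' /tmp/access.log |
  awk 'match($0, /etime=[0-9.]+/) {
         t = substr($0, RSTART + 6, RLENGTH - 6)
         if (t + 0 > 1000) print
       }')
printf '%s\n' "$slow"
```

The surviving lines (here, the `conn=13` operation) identify candidate searches to investigate further.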

For any remaining searches that should be fast but are not, the best way to understand why a search is expensive is to issue a search with the same base DN, scope, and filter, but requesting only the debugsearchindex attribute. This special attribute causes the server to return debug information about the index processing performed in the course of evaluating the search, including how long each step of the evaluation took. From this output, you may be able to see which indexes were used and which could not be used, either because there was no applicable index or because the index entry limit had been exceeded for the target key. You may also see expensive accesses to exploded indexes. This information can help you identify indexes that need to be added, indexes that may benefit from being converted to composite indexes, or indexes for which there may be a legitimate need to increase the index entry limit. Alternatively, you may determine that there is a different way to perform the search so that it does not depend on components that are unindexed or that match a very large number of entries.
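The technique above can be sketched as the following command. The hostname, port, credentials, base DN, and filter are all hypothetical placeholders; substitute the values from the slow search you identified in the access log. This is not run against a live server here, so treat it as an illustration of the invocation shape.

```shell
# Re-issue the expensive search, but request only the debugsearchindex
# attribute so the server returns index-evaluation details instead of
# the matching entries. All connection parameters below are examples.
ldapsearch --hostname ds.example.com --port 636 --useSSL \
  --bindDN "cn=Directory Manager" --promptForBindPassword \
  --baseDN "dc=example,dc=com" --searchScope sub \
  "(&(objectClass=person)(sn=Smith))" \
  debugsearchindex
```

The returned debugsearchindex value can then be read to see which filter components were evaluated against an index, which were unindexed or exceeded the index entry limit, and how long each step took.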