If you're using full-text search with MongoDB and filtering out soft-deleted or duplicate records with fields like `removed: { $exists: false }` and `dupSlug: { $exists: false }`, you might wonder: should I index those fields for performance?
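Concretely, the query shape under discussion looks something like this (the `swimmers` collection name and search string are placeholders, not taken from your code):

```javascript
db.swimmers.find({
  $text: { $search: "jane doe" },   // served by the text index
  removed: { $exists: false },      // exclude soft-deleted records
  dupSlug: { $exists: false }       // exclude flagged duplicates
})
```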
MongoDB applies the `$text` search first, using your defined text index (e.g., `SearchSwimmerByNameTeamLsc`). This stage is handled efficiently with `IXSCAN` and `TEXT_MATCH`.
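For reference, a text index with that name would have been created along these lines; the indexed field names here are assumptions for illustration, not taken from your schema:

```javascript
db.swimmers.createIndex(
  { name: "text", team: "text", lsc: "text" },
  { name: "SearchSwimmerByNameTeamLsc" }
)
```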
Filters like `removed` and `dupSlug` are applied later, during the `FETCH` stage. At that point:

- MongoDB has already narrowed down the candidates using the text index.
- It only needs to examine a small number of documents (roughly 2,865 in our case), which is very fast.
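You can verify this yourself by inspecting the query plan; `totalDocsExamined` tells you how many documents reached the `FETCH` stage:

```javascript
db.swimmers.find({
  $text: { $search: "jane doe" },
  removed: { $exists: false },
  dupSlug: { $exists: false }
}).explain("executionStats")
// Look at executionStats.totalDocsExamined: a few thousand documents
// filtered in memory is cheap compared to the text-index scan itself.
```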
Indexing `removed` or `dupSlug` won't help, because:

- `$exists: false` conditions cannot use sparse indexes; a sparse index only contains documents where the field exists, so it can only serve `$exists: true` queries.
- Compound indexes that include a text component are heavily restricted: non-text prefix fields must be matched with equality predicates, and `$exists: false` is not one.
- You could create a partial index, but that only helps if you're querying those fields independently of the text search.
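To make the partial-index point concrete, here is what such an index could look like. Note that `partialFilterExpression` supports `$exists: true` but not `$exists: false`, so it would only serve standalone queries on documents where the field is present:

```javascript
db.swimmers.createIndex(
  { dupSlug: 1 },
  { partialFilterExpression: { dupSlug: { $exists: true } } }
)
// Useful for a query like db.swimmers.find({ dupSlug: "some-slug" }),
// but irrelevant to the $text query above.
```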
Your query is already efficient and does not require additional indexes on `removed` or `dupSlug`. Indexing them would add storage and write overhead with no gain in performance.
Only consider indexing if:

- you start querying these fields without full-text search, or
- you refactor your logic to use something like `isActive: true`, which is easy to index.
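A sketch of that second option, assuming a hypothetical `isActive` boolean maintained on every document:

```javascript
db.swimmers.createIndex({ isActive: 1 })
// Equality on a boolean is a predicate a regular index can serve:
db.swimmers.find({ isActive: true })
```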
Still seeing performance issues? You might benefit from replacing `$exists: false` with explicit flag fields (`isDeleted`, `isDuplicate`) that default to `false` and can be indexed cleanly.
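A possible migration sketch, assuming you can backfill the collection once and then keep the flags current in application code:

```javascript
// One-time backfill: materialize an explicit flag on every document
db.swimmers.updateMany(
  { removed: { $exists: false } },
  { $set: { isDeleted: false } }
)
db.swimmers.updateMany(
  { removed: { $exists: true } },
  { $set: { isDeleted: true } }
)
// (repeat the same pattern for dupSlug / isDuplicate)

// The filters are now equality predicates that index cleanly:
db.swimmers.createIndex({ isDeleted: 1, isDuplicate: 1 })
```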
Let me know if you want to explore that option or review your actual query plans.