
Bulk indexing - Saving a model with a long indexable name triggers as many queries as the number of ngrams #43

@Startouf

Description


The update code triggers one insert per ngram.

This leads to as many insert requests as there are ngrams to be indexed, which becomes terribly slow when doing batch updates (especially if one forgets to set the update_if option).
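To make the cost concrete, here is an illustrative sketch (not the gem's actual code; `all_ngrams`, `index_record_naively`, and `CountingCollection` are hypothetical names) of the per-ngram insert pattern described above, where each ngram is written with its own request:

```ruby
# Build the list of contiguous ngrams for a string (illustrative,
# assuming a fixed ngram width of 3).
def all_ngrams(str, ngram_width = 3)
  return [] if str.nil? || str.length < ngram_width
  (0..str.length - ngram_width).map { |i| str[i, ngram_width] }
end

# Minimal stand-in for a MongoDB collection that just counts inserts.
class CountingCollection
  attr_reader :insert_count

  def initialize
    @insert_count = 0
  end

  def insert_one(_doc)
    @insert_count += 1
  end
end

def index_record_naively(collection, document_id, indexed_value)
  all_ngrams(indexed_value).each do |ngram|
    # One request per ngram: a 50-character name issues 48 inserts.
    collection.insert_one(document_id: document_id, ngram: ngram)
  end
end
```

With this pattern, the number of round-trips grows linearly with the length of the indexed name, which is exactly the slowdown reported here.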

I suggest the following

  • Use a batch update so that all the ngrams are added in a single request
  • (might be an idea for a new feature) provide a default implementation of update_if based on dirty tracking: if the indexed fields are Mongoid fields, the _changed? methods can tell whether an update is needed at all
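The first suggestion could look roughly like the following hedged sketch: collect every ngram document up front and send them in a single insert_many call. The names here (`all_ngrams`, `index_record_in_bulk`, `BatchCollection`) are illustrative assumptions, not the gem's actual API:

```ruby
# Build the list of contiguous ngrams for a string (illustrative,
# assuming a fixed ngram width of 3).
def all_ngrams(str, ngram_width = 3)
  return [] if str.nil? || str.length < ngram_width
  (0..str.length - ngram_width).map { |i| str[i, ngram_width] }
end

# Minimal stand-in for a MongoDB collection that records batch calls.
class BatchCollection
  attr_reader :call_count, :docs

  def initialize
    @call_count = 0
    @docs = []
  end

  def insert_many(documents)
    @call_count += 1
    @docs.concat(documents)
  end
end

def index_record_in_bulk(collection, document_id, indexed_value)
  docs = all_ngrams(indexed_value).map do |ngram|
    { document_id: document_id, ngram: ngram }
  end
  # One round-trip regardless of how many ngrams the name produces.
  collection.insert_many(docs) unless docs.empty?
end

# For the dirty-tracking idea, a default update_if could lean on
# Mongoid's ActiveModel::Dirty integration, e.g. (assumed field name):
#   update_if: ->(record) { record.name_changed? }
```

The real MongoDB Ruby driver exposes insert_many on collections, so the batch version should be a drop-in replacement for the per-ngram loop.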
