Changes from 38 commits (42 commits total)
c181d5c
pre-design review PoC of my bbq lucene integration, added an encoder …
adityamachiroutu Jul 15, 2025
f8b81e0
renamed encoder from bbq to binary to match faiss implementation, spo…
adityamachiroutu Jul 22, 2025
b4c1d89
added unit tests
adityamachiroutu Jul 25, 2025
d925212
unit tests
adityamachiroutu Jul 28, 2025
5d15c2f
starting BWC tests
adityamachiroutu Jul 28, 2025
3307128
integration/bwc tests
adityamachiroutu Jul 28, 2025
f68cd43
spotless
adityamachiroutu Jul 28, 2025
011fe7d
removing comments, spotless, removing unneeded tests
adityamachiroutu Jul 28, 2025
ad6c48e
removing comments, spotless, removing unneeded tests
adityamachiroutu Jul 28, 2025
f5d8a4a
updated bwc test
adityamachiroutu Jul 28, 2025
b4b44f3
changes bwc tests
Jul 29, 2025
862bcf3
fixed bwc
adityamachiroutu Jul 29, 2025
b68d0f1
test exclusion for bbq, matches sq bwc test
adityamachiroutu Jul 29, 2025
7541f90
added methodcomponentcontext within bbqvectorsformat
adityamachiroutu Jul 30, 2025
9274a76
spotless
adityamachiroutu Jul 30, 2025
a863a3c
added version check as a knnresttestcase, and check versioning in bwc…
adityamachiroutu Jul 30, 2025
ab86018
spotless
adityamachiroutu Jul 30, 2025
302a67d
Revert "Add KNN Profiling Info to Core Profiler - KNNQuery, Native, R…
adityamachiroutu Jul 31, 2025
4a477f3
changed compression level to 32x for rescoring
adityamachiroutu Jul 31, 2025
b92dfbc
changelog
adityamachiroutu Aug 4, 2025
d4c859c
Reapply "Add KNN Profiling Info to Core Profiler - KNNQuery, Native, …
adityamachiroutu Aug 4, 2025
95eff40
tweaking version in test case
adityamachiroutu Aug 8, 2025
1206a98
rebased, changed version for integ test, addressed comments
adityamachiroutu Aug 11, 2025
6785f10
added an awaitsfix to avoid changes to build.gradle
adityamachiroutu Aug 11, 2025
459eb6a
more integ tests
adityamachiroutu Aug 11, 2025
509403c
Merge branch 'main' into bbq-lucene-integration
adityamachiroutu Aug 11, 2025
49b5db9
spotless
adityamachiroutu Aug 12, 2025
a9052cf
automatically use bbq encoder when compression level is 32x and the l…
adityamachiroutu Aug 14, 2025
78a5509
set the default oversample for lucene bbq to 5 for vectors with dimen…
adityamachiroutu Aug 20, 2025
70f2e0c
spotless
adityamachiroutu Aug 20, 2025
83a2aa2
Merge branch 'main' into bbq-lucene-integration
adityamachiroutu Aug 21, 2025
02b0cd7
Merge branch 'test-oversample' into bbq-lucene-integration
adityamachiroutu Aug 22, 2025
6812655
upgrade to 3.3
adityamachiroutu Aug 25, 2025
79cdd79
fixed defaulting rescoring functionality
adityamachiroutu Aug 25, 2025
0abc8dc
spotless
adityamachiroutu Aug 25, 2025
2a85559
added unit tests for compression level change
adityamachiroutu Aug 26, 2025
379520b
spotless
adityamachiroutu Aug 26, 2025
a8a31c7
Merge branch 'main' into bbq-lucene-integration
adityamachiroutu Aug 26, 2025
aa29c46
addressed comments
adityamachiroutu Aug 27, 2025
6e783da
addressed further comments
adityamachiroutu Aug 27, 2025
e3bc2dc
resolved comments
adityamachiroutu Aug 29, 2025
ca8cb09
fixed versioning for compression level
adityamachiroutu Aug 29, 2025
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,10 @@ All notable changes to this project are documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). See the [CONTRIBUTING guide](./CONTRIBUTING.md#Changelog) for instructions on how to add changelog entries.

## [Unreleased 3.3](https://github.com/opensearch-project/k-NN/compare/main...HEAD)

### Features
* Integrated Lucene's better binary quantization [#2838](https://github.com/opensearch-project/k-NN/pull/2838)
Contributor: Nit: present tense


### Refactoring
* Refactored the KNN Stat files for better readability.

1 change: 1 addition & 0 deletions qa/restart-upgrade/build.gradle
@@ -252,6 +252,7 @@ testClusters {
knn_bwc_version.startsWith("2.15.")) {
filter {
excludeTestsMatching "org.opensearch.knn.bwc.IndexingIT.testKNNIndexLuceneQuantization"
excludeTestsMatching "org.opensearch.knn.bwc.IndexingIT.testKNNIndexLuceneBBQ"
Member: Why exclude this test only for versions up to 2.15? Won't it work only from version 3.2 onward?

excludeTestsMatching "org.opensearch.knn.bwc.IndexingIT.testKNNIndexBinaryForceMerge"
}
}
@@ -24,7 +24,6 @@

import java.io.IOException;
import java.util.Collections;
import java.io.IOException;
import java.util.List;
import java.util.Map;

@@ -51,6 +50,7 @@
import static org.opensearch.knn.common.KNNConstants.MODE_PARAMETER;
import static org.opensearch.knn.common.KNNConstants.NAME;
import static org.opensearch.knn.common.KNNConstants.PARAMETERS;
import static org.opensearch.knn.common.KNNConstants.ENCODER_BBQ;

public class IndexingIT extends AbstractRestartUpgradeTestCase {
private static final String TEST_FIELD = "test-field";
@@ -64,6 +64,8 @@ public class IndexingIT extends AbstractRestartUpgradeTestCase {
private static final int NUM_DOCS = 10;
private static int QUERY_COUNT = 0;

private static final String ALGO = "hnsw";
Member: Remove this constant.


// Default Legacy Field Mapping
// space_type : "l2", engine : "nmslib", m : 16, ef_construction : 512
public void testKNNIndexDefaultLegacyFieldMapping() throws Exception {
@@ -659,4 +661,75 @@ public void testRandomRotationBWC() throws Exception {
deleteKNNIndex(newIndex);
}
}

@AwaitsFix(bugUrl = "https://github.com/opensearch-project/k-NN/issues/2805")
Contributor: Is this meant to be here?

public void testKNNIndexLuceneBBQ() throws Exception {
waitForClusterHealthGreen(NODES_BWC_CLUSTER);
Member: We don't need the same BWC-version check in both the if and else blocks; add it once here, after the cluster is green:

if (!isBBQEncoderSupported(getBWCVersion())) {
    logger.info("Skipping testKNNIndexLuceneBBQ as BBQ encoder is not supported in version: {}", getBWCVersion());
    return;
}

int k = 4;
int dimension = 2;

if (isRunningAgainstOldCluster()) {
// Skip test if BBQ encoder is not supported in the old cluster version
if (!isBBQEncoderSupported(getBWCVersion())) {
logger.info("Skipping testKNNIndexLuceneBBQ as BBQ encoder is not supported in version: {}", getBWCVersion());
return;
}

String mapping = XContentFactory.jsonBuilder()
.startObject()
.startObject("properties")
.startObject(TEST_FIELD)
.field(VECTOR_TYPE, KNN_VECTOR)
.field(DIMENSION, dimension)
.startObject(KNN_METHOD)
.field(NAME, METHOD_HNSW)
.field(METHOD_PARAMETER_SPACE_TYPE, SpaceType.INNER_PRODUCT.getValue())
.field(KNN_ENGINE, LUCENE_NAME)
.startObject(PARAMETERS)
.startObject(METHOD_ENCODER_PARAMETER)
.field(NAME, ENCODER_BBQ)
.endObject()
.field(METHOD_PARAMETER_EF_CONSTRUCTION, 256)
.field(METHOD_PARAMETER_M, 16)
.endObject()
.endObject()
.endObject()
.endObject()
.endObject()
.toString();
createKnnIndex(testIndex, getKNNDefaultIndexSettings(), mapping);

Float[] vector1 = { -10.6f, 25.48f };
Float[] vector2 = { -10.8f, 25.48f };
Float[] vector3 = { -11.0f, 25.48f };
Float[] vector4 = { -11.2f, 25.48f };
addKnnDoc(testIndex, "1", TEST_FIELD, vector1);
addKnnDoc(testIndex, "2", TEST_FIELD, vector2);
addKnnDoc(testIndex, "3", TEST_FIELD, vector3);
addKnnDoc(testIndex, "4", TEST_FIELD, vector4);

float[] queryVector = { -10.5f, 25.48f };
Response searchResponse = searchKNNIndex(testIndex, new KNNQueryBuilder(TEST_FIELD, queryVector, k), k);
List<KNNResult> results = parseSearchResponse(EntityUtils.toString(searchResponse.getEntity()), TEST_FIELD);
assertEquals(k, results.size());
for (int i = 0; i < k; i++) {
assertEquals(k - i, Integer.parseInt(results.get(i).getDocId()));
}
} else {
// Skip test if BBQ encoder is not supported in the old cluster version
if (!isBBQEncoderSupported(getBWCVersion())) {
logger.info("Skipping testKNNIndexLuceneBBQ validation as BBQ encoder is not supported in version: {}", getBWCVersion());
return;
}
float[] queryVector = { -10.5f, 25.48f };
Response searchResponse = searchKNNIndex(testIndex, new KNNQueryBuilder(TEST_FIELD, queryVector, k), k);
List<KNNResult> results = parseSearchResponse(EntityUtils.toString(searchResponse.getEntity()), TEST_FIELD);
assertEquals(k, results.size());
for (int i = 0; i < k; i++) {
assertEquals(k - i, Integer.parseInt(results.get(i).getDocId()));
}
deleteKNNIndex(testIndex);
}
}

}
@@ -102,6 +102,7 @@ public class KNNConstants {
public static final double MAXIMUM_CONFIDENCE_INTERVAL = 1.0;
public static final String LUCENE_SQ_BITS = "bits";
public static final int LUCENE_SQ_DEFAULT_BITS = 7;
public static final String ENCODER_BBQ = "binary";

// nmslib specific constants
@Deprecated(since = "2.19.0", forRemoval = true)
@@ -16,6 +16,7 @@
import org.opensearch.knn.index.KNNSettings;
import org.opensearch.knn.index.codec.KNN990Codec.NativeEngines990KnnVectorsFormat;
import org.opensearch.knn.index.codec.nativeindex.NativeIndexBuildStrategyFactory;
import org.opensearch.knn.index.codec.params.KNNBBQVectorsFormatParams;
import org.opensearch.knn.index.codec.params.KNNScalarQuantizedVectorsFormatParams;
import org.opensearch.knn.index.codec.params.KNNVectorsFormatParams;
import org.opensearch.knn.index.engine.KNNEngine;
@@ -44,7 +45,8 @@ public abstract class BasePerFieldKnnVectorsFormat extends PerFieldKnnVectorsFor
private final int defaultBeamWidth;
private final Supplier<KnnVectorsFormat> defaultFormatSupplier;
private final Function<KNNVectorsFormatParams, KnnVectorsFormat> vectorsFormatSupplier;
private Function<KNNScalarQuantizedVectorsFormatParams, KnnVectorsFormat> scalarQuantizedVectorsFormatSupplier;
private final Function<KNNScalarQuantizedVectorsFormatParams, KnnVectorsFormat> scalarQuantizedVectorsFormatSupplier;
private final Function<KNNBBQVectorsFormatParams, KnnVectorsFormat> bbqVectorsFormatSupplier;
private final NativeIndexBuildStrategyFactory nativeIndexBuildStrategyFactory;
private static final String MAX_CONNECTIONS = "max_connections";
private static final String BEAM_WIDTH = "beam_width";
@@ -56,7 +58,7 @@ public BasePerFieldKnnVectorsFormat(
Supplier<KnnVectorsFormat> defaultFormatSupplier,
Function<KNNVectorsFormatParams, KnnVectorsFormat> vectorsFormatSupplier
) {
this(mapperService, defaultMaxConnections, defaultBeamWidth, defaultFormatSupplier, vectorsFormatSupplier, null);
this(mapperService, defaultMaxConnections, defaultBeamWidth, defaultFormatSupplier, vectorsFormatSupplier, null, null);
}

public BasePerFieldKnnVectorsFormat(
Expand All @@ -65,7 +67,8 @@ public BasePerFieldKnnVectorsFormat(
int defaultBeamWidth,
Supplier<KnnVectorsFormat> defaultFormatSupplier,
Function<KNNVectorsFormatParams, KnnVectorsFormat> vectorsFormatSupplier,
Function<KNNScalarQuantizedVectorsFormatParams, KnnVectorsFormat> scalarQuantizedVectorsFormatSupplier
Function<KNNScalarQuantizedVectorsFormatParams, KnnVectorsFormat> scalarQuantizedVectorsFormatSupplier,
Function<KNNBBQVectorsFormatParams, KnnVectorsFormat> bbqVectorsFormatSupplier
) {
this(
mapperService,
@@ -74,6 +77,7 @@ public BasePerFieldKnnVectorsFormat(
defaultFormatSupplier,
vectorsFormatSupplier,
scalarQuantizedVectorsFormatSupplier,
bbqVectorsFormatSupplier,
new NativeIndexBuildStrategyFactory()
);
}
@@ -110,6 +114,11 @@ public KnnVectorsFormat getKnnVectorsFormatForField(final String field) {

if (engine == KNNEngine.LUCENE) {
if (params != null && params.containsKey(METHOD_ENCODER_PARAMETER)) {
KNNBBQVectorsFormatParams bbqParams = new KNNBBQVectorsFormatParams(params, defaultMaxConnections, defaultBeamWidth);
if (bbqParams.validate(params)) {
Contributor: Can we please add a debug log like the one below? (A possible shape of that log is sketched after this file's diff.)

return bbqVectorsFormatSupplier.apply(bbqParams);
}

KNNScalarQuantizedVectorsFormatParams knnScalarQuantizedVectorsFormatParams = new KNNScalarQuantizedVectorsFormatParams(
params,
defaultMaxConnections,
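The review above asks for a debug log when the BBQ format is chosen. A minimal sketch of what that could look like inside getKnnVectorsFormatForField; the log message, and the assumption that a logger field is available on this class, are mine and not part of the PR:

if (bbqParams.validate(params)) {
    // Hypothetical debug log suggested in review; wording and logger field are assumptions
    log.debug(
        "Mapping field [{}] to Lucene BBQ vectors format (maxConnections={}, beamWidth={})",
        field,
        bbqParams.getMaxConnections(),
        bbqParams.getBeamWidth()
    );
    return bbqVectorsFormatSupplier.apply(bbqParams);
}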
@@ -5,6 +5,7 @@

package org.opensearch.knn.index.codec.KNN9120Codec;

import org.apache.lucene.codecs.lucene102.Lucene102HnswBinaryQuantizedVectorsFormat;
import org.apache.lucene.codecs.lucene99.Lucene99HnswScalarQuantizedVectorsFormat;
import org.apache.lucene.codecs.lucene99.Lucene99HnswVectorsFormat;
import org.opensearch.common.collect.Tuple;
@@ -76,6 +77,15 @@ public KNN9120PerFieldKnnVectorsFormat(
mergeThreadCountAndExecutorService.v2()
);
},
knnBBQVectorsFormatParams -> {
final Tuple<Integer, ExecutorService> mergeThreadCountAndExecutorService = getMergeThreadCountAndExecutorService();
return new Lucene102HnswBinaryQuantizedVectorsFormat(
knnBBQVectorsFormatParams.getMaxConnections(),
knnBBQVectorsFormatParams.getBeamWidth(),
mergeThreadCountAndExecutorService.v1(),
mergeThreadCountAndExecutorService.v2()
);
},
nativeIndexBuildStrategyFactory
);
}
@@ -5,6 +5,7 @@

package org.opensearch.knn.index.codec.backward_codecs.KNN990Codec;

import org.apache.lucene.codecs.lucene102.Lucene102HnswBinaryQuantizedVectorsFormat;
import org.apache.lucene.codecs.lucene99.Lucene99HnswScalarQuantizedVectorsFormat;
import org.apache.lucene.codecs.lucene99.Lucene99HnswVectorsFormat;
import org.opensearch.index.mapper.MapperService;
@@ -37,6 +38,12 @@ public KNN990PerFieldKnnVectorsFormat(final Optional<MapperService> mapperServic
knnScalarQuantizedVectorsFormatParams.isCompressFlag(),
knnScalarQuantizedVectorsFormatParams.getConfidenceInterval(),
null
),
knnBBQVectorsFormatParams -> new Lucene102HnswBinaryQuantizedVectorsFormat(
knnBBQVectorsFormatParams.getMaxConnections(),
knnBBQVectorsFormatParams.getBeamWidth(),
Contributor: Is there no constructor parameter for bits like above?
Contributor Author: No, Lucene does not have that constructor.

NUM_MERGE_WORKERS,
null
)
);
}
@@ -0,0 +1,46 @@
/*
* Copyright OpenSearch Contributors
* SPDX-License-Identifier: Apache-2.0
*/

package org.opensearch.knn.index.codec.params;

import org.opensearch.knn.index.engine.MethodComponentContext;
import java.util.Map;
import static org.opensearch.knn.common.KNNConstants.ENCODER_BBQ;
import static org.opensearch.knn.common.KNNConstants.METHOD_ENCODER_PARAMETER;

/**
* Class provides params for Lucene102HnswBinaryQuantizedVectorsFormat
*/
public class KNNBBQVectorsFormatParams extends KNNVectorsFormatParams {

public KNNBBQVectorsFormatParams(Map<String, Object> params, int defaultMaxConnections, int defaultBeamWidth) {
super(params, defaultMaxConnections, defaultBeamWidth);
MethodComponentContext encoderMethodComponentContext = (MethodComponentContext) params.get(METHOD_ENCODER_PARAMETER);
Map<String, Object> bbqEncoderParams = encoderMethodComponentContext.getParameters();
}

@Override
public boolean validate(Map<String, Object> params) {
if (params.get(METHOD_ENCODER_PARAMETER) == null) {
return false;
}

if (!(params.get(METHOD_ENCODER_PARAMETER) instanceof MethodComponentContext)) {
Member: Nit: use an == false check; it's standard across OpenSearch:

if ((params.get(METHOD_ENCODER_PARAMETER) instanceof MethodComponentContext) == false)

return false;
}

MethodComponentContext encoderMethodComponentContext = (MethodComponentContext) params.get(METHOD_ENCODER_PARAMETER);
return ENCODER_BBQ.equals(encoderMethodComponentContext.getName());
}

/**
* Check if BBQ is enabled
* @return true if BBQ is enabled, false otherwise
*/
public boolean isBBQEnabled() {
Contributor: What's the purpose of this method?

// BBQ is enabled if this class is being used, which means the encoder parameter was validated
return true;
}
}
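For clarity, a minimal illustration of how validate() behaves. This is a hypothetical test-style snippet, not part of the PR; it reuses the MethodComponentContext(String, Map) constructor seen elsewhere in this diff and assumes JUnit assertions, and "some-other-encoder" is a placeholder name:

Map<String, Object> params = new HashMap<>();
params.put(METHOD_ENCODER_PARAMETER, new MethodComponentContext(ENCODER_BBQ, new HashMap<>()));
KNNBBQVectorsFormatParams bbqParams = new KNNBBQVectorsFormatParams(params, 16, 100);
assertTrue(bbqParams.validate(params)); // encoder named "binary" selects the BBQ format

// Any other encoder falls through to the existing (non-BBQ) handling
params.put(METHOD_ENCODER_PARAMETER, new MethodComponentContext("some-other-encoder", new HashMap<>()));
assertFalse(new KNNBBQVectorsFormatParams(params, 16, 100).validate(params));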
@@ -0,0 +1,42 @@
/*
* Copyright OpenSearch Contributors
* SPDX-License-Identifier: Apache-2.0
*/

package org.opensearch.knn.index.engine.lucene;

import com.google.common.collect.ImmutableSet;
import org.opensearch.knn.index.VectorDataType;
import org.opensearch.knn.index.engine.Encoder;
import org.opensearch.knn.index.engine.KNNMethodConfigContext;
import org.opensearch.knn.index.engine.MethodComponent;
import org.opensearch.knn.index.engine.MethodComponentContext;
import org.opensearch.knn.index.mapper.CompressionLevel;

import java.util.Set;

import static org.opensearch.knn.common.KNNConstants.ENCODER_BBQ;

/**
* Lucene BBQ (Better Binary Quantization) encoder
*/
public class LuceneBBQEncoder implements Encoder {
private static final Set<VectorDataType> SUPPORTED_DATA_TYPES = ImmutableSet.of(VectorDataType.FLOAT);

private final static MethodComponent METHOD_COMPONENT = MethodComponent.Builder.builder(ENCODER_BBQ)
.addSupportedDataTypes(SUPPORTED_DATA_TYPES)
Member: As discussed offline, can you add support for a bits parameter with a default of 1 bit (32x compression), so that if Lucene adds 2- and 4-bit support in the future we can reuse this parameter? (A possible shape is sketched right after this file.)
Member: For the naming convention, please keep it consistent with the Faiss BQ encoder.

.build();

@Override
public MethodComponent getMethodComponent() {
return METHOD_COMPONENT;
}

@Override
public CompressionLevel calculateCompressionLevel(
MethodComponentContext methodComponentContext,
KNNMethodConfigContext knnMethodConfigContext
) {
return CompressionLevel.x32;
}
}
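The review comments above ask for a bits parameter defaulting to 1 bit (32x compression). A hedged sketch of how the method component might declare it, modelled on the existing Lucene SQ encoder; the addParameter call, the Parameter.IntegerParameter signature, and the LUCENE_BBQ_BITS constant are assumptions, not part of this PR:

private static final String LUCENE_BBQ_BITS = "bits"; // hypothetical constant, mirroring LUCENE_SQ_BITS
private static final int LUCENE_BBQ_DEFAULT_BITS = 1; // 1 bit == 32x compression

private final static MethodComponent METHOD_COMPONENT = MethodComponent.Builder.builder(ENCODER_BBQ)
    .addSupportedDataTypes(SUPPORTED_DATA_TYPES)
    // Assumed builder/parameter API: only 1 bit accepted until Lucene supports 2 and 4 bits
    .addParameter(LUCENE_BBQ_BITS, new Parameter.IntegerParameter(LUCENE_BBQ_BITS, LUCENE_BBQ_DEFAULT_BITS, (v, context) -> v == 1))
    .build();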
@@ -45,7 +45,8 @@ public class LuceneHNSWMethod extends AbstractKNNMethod {
);

final static Encoder SQ_ENCODER = new LuceneSQEncoder();
final static Map<String, Encoder> SUPPORTED_ENCODERS = Map.of(SQ_ENCODER.getName(), SQ_ENCODER);
final static Encoder BBQ_ENCODER = new LuceneBBQEncoder();
final static Map<String, Encoder> SUPPORTED_ENCODERS = Map.of(SQ_ENCODER.getName(), SQ_ENCODER, BBQ_ENCODER.getName(), BBQ_ENCODER);

final static MethodComponent HNSW_METHOD_COMPONENT = initMethodComponent();

@@ -24,11 +24,14 @@
import static org.opensearch.knn.common.KNNConstants.METHOD_ENCODER_PARAMETER;
import static org.opensearch.knn.common.KNNConstants.METHOD_HNSW;
import static org.opensearch.knn.index.engine.lucene.LuceneHNSWMethod.HNSW_METHOD_COMPONENT;
import static org.opensearch.knn.index.engine.lucene.LuceneHNSWMethod.SQ_ENCODER;

public class LuceneMethodResolver extends AbstractMethodResolver {

private static final Set<CompressionLevel> SUPPORTED_COMPRESSION_LEVELS = Set.of(CompressionLevel.x1, CompressionLevel.x4);
private static final Set<CompressionLevel> SUPPORTED_COMPRESSION_LEVELS = Set.of(
CompressionLevel.x1,
CompressionLevel.x4,
CompressionLevel.x32
);

@Override
public ResolvedMethodContext resolveMethod(
@@ -69,10 +72,18 @@ protected void resolveEncoder(KNNMethodContext resolvedKNNMethodContext, KNNMeth
}

MethodComponentContext methodComponentContext = resolvedKNNMethodContext.getMethodComponentContext();
MethodComponentContext encoderComponentContext = new MethodComponentContext(SQ_ENCODER.getName(), new HashMap<>());

String encoderName = (resolvedCompressionLevel == CompressionLevel.x32)
Contributor: Nit: since there are multiple clauses, let's make this a single if check on x32 compression.

? LuceneHNSWMethod.BBQ_ENCODER.getName()
: LuceneHNSWMethod.SQ_ENCODER.getName();
MethodComponent encoderComponent = (resolvedCompressionLevel == CompressionLevel.x32)
? LuceneHNSWMethod.BBQ_ENCODER.getMethodComponent()
: LuceneHNSWMethod.SQ_ENCODER.getMethodComponent();

MethodComponentContext encoderComponentContext = new MethodComponentContext(encoderName, new HashMap<>());
Map<String, Object> resolvedParams = MethodComponent.getParameterMapWithDefaultsAdded(
encoderComponentContext,
SQ_ENCODER.getMethodComponent(),
encoderComponent,
knnMethodConfigContext
);
encoderComponentContext.getParameters().putAll(resolvedParams);
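With x32 added to the supported compression levels, a Lucene-engine field that requests 32x compression resolves to the "binary" (BBQ) encoder without naming it. A hedged illustration of such a mapping, built the same way as the BWC test above; the field name and dimension are placeholders, and compression_level is the standard k-NN field mapping parameter:

String mapping = XContentFactory.jsonBuilder()
    .startObject()
    .startObject("properties")
    .startObject("my_vector") // hypothetical field name
    .field(VECTOR_TYPE, KNN_VECTOR)
    .field(DIMENSION, 128)
    .field("compression_level", "32x") // resolver maps this to the BBQ encoder for the Lucene engine
    .startObject(KNN_METHOD)
    .field(NAME, METHOD_HNSW)
    .field(KNN_ENGINE, LUCENE_NAME)
    .endObject()
    .endObject()
    .endObject()
    .endObject()
    .toString();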