3780: Be able to sort facet values by alpha or count r=dureuill a=Kerollmops

This PR introduces a new `sortFacetValuesBy` settings parameter to expose the facet distribution in either count or lexicographic/alpha order.

## Mini Spec of the `sortFacetValuesBy` Settings Parameter

This parameter can be set in the settings to change how the engine returns the facet values. There are two possible values for this parameter.

Please note that the current behavior changed a bit: keys are now returned in lexicographic order instead of an undefined order. The previous order was undefined because we were using a `HashMap`, which returns entries in hash order; we now use an `IndexMap`, which returns them in insertion order (the order we actually want).
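
To illustrate the difference, here is a small standalone sketch (not code from this PR; it only assumes the `indexmap` crate that this PR adds as a dependency):

```rust
use std::collections::HashMap;

use indexmap::IndexMap; // indexmap = "1.9"

fn main() {
    // HashMap iterates in an arbitrary, hash-dependent order.
    let mut hash: HashMap<&str, u64> = HashMap::new();
    hash.insert("Drama", 7337);
    hash.insert("Comedy", 5883);
    hash.insert("Action", 3215);
    // The order of this printout is unspecified and can change between runs.
    println!("{:?}", hash.keys().collect::<Vec<_>>());

    // IndexMap preserves insertion order, so inserting the facet values in the
    // order we computed them (alpha or count) is enough to keep that order.
    let mut ordered: IndexMap<&str, u64> = IndexMap::new();
    ordered.insert("Drama", 7337);
    ordered.insert("Comedy", 5883);
    ordered.insert("Action", 3215);
    // Always prints ["Drama", "Comedy", "Action"].
    println!("{:?}", ordered.keys().collect::<Vec<_>>());
}
```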

Also, note that there are performance issues when the dataset is enormous. Here are the timings of the engine running on my MacBook Pro M1 (16 GB of RAM). [The dataset is a 40-million-songs file](https://www.notion.so/meilisearch/Songs-from-MusicBrainz-686e31b2bd3845898c7746f502a6e117), and the database size is about 50GiB. Even if you think 800ms is not that high, don't forget that the API is public, and anybody can ask for multiple facets in a single query.

| Search Kind | Get Facets | Max Values per Facet | Time for Alpha | Time for Count | Count but with #3788 |
|------------:|------------|----------------------|:--------------:|----------------|----------------------|
| Placeholder | genres     | default (100)        | 7ms            | 187ms          | 122ms                |
| Placeholder | genres     | 20                   | 6ms            | 124ms          | 75ms                 |
| Placeholder | album      | default (100)        | 9ms            | 808ms          | 677ms                |
| Placeholder | album      | 20                   | 8ms            | 579ms          | 446ms                |
| Placeholder | artist     | default (100)        | 9ms            | 462ms          | 344ms                |
| Placeholder | artist     | 20                   | 9ms            | 341ms          | 246ms                |

### Order Values in Alphanumeric Order

This is the default. Values are returned in lexicographic order, ascending from A to Z.

```bash
# First, update the settings
curl -X PATCH 'localhost:7700/indexes/movies/settings/faceting' \
  -H "Content-Type: application/json"  \
  -d '{ "sortFacetValuesBy": { "*": "alpha" } }'

# Then, ask for the facet distribution
curl 'localhost:7700/indexes/movies/search?facets=genres'
```

```json5
{
    "hits": [
        /* list of results */
    ],
    "query": "",
    "processingTimeMs": 0,
    "limit": 20,
    "offset": 0,
    "estimatedTotalHits": 1000,
    "facetDistribution": {
        "genres": {
            "Action": 3215,
            "Adventure": 1972,
            "Animation": 1577,
            "Comedy": 5883,
            "Crime": 1808,
            // ...
        }
    },
    "facetStats": {}
}
```

### Order Values in Count Order

Facet values are sorted by decreasing count. The count is the number of records containing this facet value in the query results.

```bash
# First, update the settings
curl -X PATCH 'localhost:7700/indexes/movies/settings/faceting' \
  -H "Content-Type: application/json"  \
  -d '{ "sortFacetValuesBy": { "*": "count" } }'

# Then, ask for the facet distribution
curl 'localhost:7700/indexes/movies/search?facets=genres'
```

```json5
{
    "hits": [
        /* list of results */
    ],
    "query": "",
    "processingTimeMs": 0,
    "limit": 20,
    "offset": 0,
    "estimatedTotalHits": 1000,
    "facetDistribution": {
        "genres": {
            "Drama": 7337,
            "Comedy": 5883,
            "Action": 3215,
            "Thriller": 3189,
            "Romance": 2507,
            // ...
        }
    },
    "facetStats": {}
}
```

## Todo List
 - [x] Add tests
 - [x] Send analytics when a user changes the `sortFacetValuesBy`
 - [x] Create a prototype and announce it in https://github.com/meilisearch/product/discussions/519.

Co-authored-by: Kerollmops <clement@meilisearch.com>
Co-authored-by: Clément Renault <clement@meilisearch.com>
Committed by meili-bors[bot] via GitHub on 2023-06-29 12:43:25 +00:00 (commit c9b3f80947).
20 changed files with 516 additions and 179 deletions.

Cargo.lock (generated):

@@ -2731,6 +2731,7 @@ dependencies = [
  "grenad",
  "heed",
  "hnsw",
+ "indexmap",
  "insta",
  "itertools",
  "json-depth-checker",

@@ -208,12 +208,13 @@ pub(crate) mod test {
     use std::str::FromStr;

     use big_s::S;
-    use maplit::btreeset;
+    use maplit::{btreemap, btreeset};
+    use meilisearch_types::facet_values_sort::FacetValuesSort;
     use meilisearch_types::index_uid_pattern::IndexUidPattern;
     use meilisearch_types::keys::{Action, Key};
+    use meilisearch_types::milli;
     use meilisearch_types::milli::update::Setting;
-    use meilisearch_types::milli::{self};
-    use meilisearch_types::settings::{Checked, Settings};
+    use meilisearch_types::settings::{Checked, FacetingSettings, Settings};
     use meilisearch_types::tasks::{Details, Status};
     use serde_json::{json, Map, Value};
     use time::macros::datetime;
@@ -263,7 +264,12 @@ pub(crate) mod test {
             synonyms: Setting::NotSet,
             distinct_attribute: Setting::NotSet,
             typo_tolerance: Setting::NotSet,
-            faceting: Setting::NotSet,
+            faceting: Setting::Set(FacetingSettings {
+                max_values_per_facet: Setting::Set(111),
+                sort_facet_values_by: Setting::Set(
+                    btreemap! { S("age") => FacetValuesSort::Count },
+                ),
+            }),
             pagination: Setting::NotSet,
             _kind: std::marker::PhantomData,
         };

@@ -362,6 +362,7 @@ impl<T> From<v5::Settings<T>> for v6::Settings<v6::Unchecked> {
             faceting: match settings.faceting {
                 v5::Setting::Set(faceting) => v6::Setting::Set(v6::FacetingSettings {
                     max_values_per_facet: faceting.max_values_per_facet.into(),
+                    sort_facet_values_by: v6::Setting::NotSet,
                 }),
                 v5::Setting::Reset => v6::Setting::Reset,
                 v5::Setting::NotSet => v6::Setting::NotSet,

@@ -0,0 +1,33 @@
+use deserr::Deserr;
+use milli::OrderBy;
+use serde::{Deserialize, Serialize};
+
+#[derive(Debug, Default, Copy, Clone, PartialEq, Eq, Serialize, Deserialize, Deserr)]
+#[serde(rename_all = "camelCase")]
+#[deserr(rename_all = camelCase)]
+pub enum FacetValuesSort {
+    /// Facet values are sorted in alphabetical order, ascending from A to Z.
+    #[default]
+    Alpha,
+    /// Facet values are sorted by decreasing count.
+    /// The count is the number of records containing this facet value in the results of the query.
+    Count,
+}
+
+impl From<FacetValuesSort> for OrderBy {
+    fn from(val: FacetValuesSort) -> Self {
+        match val {
+            FacetValuesSort::Alpha => OrderBy::Lexicographic,
+            FacetValuesSort::Count => OrderBy::Count,
+        }
+    }
+}
+
+impl From<OrderBy> for FacetValuesSort {
+    fn from(val: OrderBy) -> Self {
+        match val {
+            OrderBy::Lexicographic => FacetValuesSort::Alpha,
+            OrderBy::Count => FacetValuesSort::Count,
+        }
+    }
+}
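
For readers skimming the diff, a minimal sketch of how these two conversions round-trip (the import paths are taken from the new file above; this snippet is not part of the PR itself):

```rust
use meilisearch_types::facet_values_sort::FacetValuesSort;
use milli::OrderBy;

fn main() {
    // Settings-level value converted to the engine-level ordering...
    let order: OrderBy = FacetValuesSort::Count.into();
    assert_eq!(order, OrderBy::Count);

    // ...and back, e.g. when exposing the stored settings through the API.
    let sort: FacetValuesSort = OrderBy::Lexicographic.into();
    assert_eq!(sort, FacetValuesSort::Alpha);
}
```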

@@ -2,6 +2,7 @@ pub mod compression;
 pub mod deserr;
 pub mod document_formats;
 pub mod error;
+pub mod facet_values_sort;
 pub mod features;
 pub mod index_uid;
 pub mod index_uid_pattern;

@@ -14,8 +14,9 @@ use serde::{Deserialize, Serialize, Serializer};
 use crate::deserr::DeserrJsonError;
 use crate::error::deserr_codes::*;
+use crate::facet_values_sort::FacetValuesSort;

-/// The maximimum number of results that the engine
+/// The maximum number of results that the engine
 /// will be able to return in one search call.
 pub const DEFAULT_PAGINATION_MAX_TOTAL_HITS: usize = 1000;
@@ -102,6 +103,9 @@ pub struct FacetingSettings {
     #[serde(default, skip_serializing_if = "Setting::is_not_set")]
     #[deserr(default)]
     pub max_values_per_facet: Setting<usize>,
+    #[serde(default, skip_serializing_if = "Setting::is_not_set")]
+    #[deserr(default)]
+    pub sort_facet_values_by: Setting<BTreeMap<String, FacetValuesSort>>,
 }

 #[derive(Debug, Clone, Default, Serialize, Deserialize, PartialEq, Eq, Deserr)]
@@ -398,13 +402,25 @@ pub fn apply_settings_to_builder(
         Setting::NotSet => (),
     }

-    match settings.faceting {
-        Setting::Set(ref value) => match value.max_values_per_facet {
-            Setting::Set(val) => builder.set_max_values_per_facet(val),
-            Setting::Reset => builder.reset_max_values_per_facet(),
-            Setting::NotSet => (),
-        },
-        Setting::Reset => builder.reset_max_values_per_facet(),
+    match &settings.faceting {
+        Setting::Set(FacetingSettings { max_values_per_facet, sort_facet_values_by }) => {
+            match max_values_per_facet {
+                Setting::Set(val) => builder.set_max_values_per_facet(*val),
+                Setting::Reset => builder.reset_max_values_per_facet(),
+                Setting::NotSet => (),
+            }
+            match sort_facet_values_by {
+                Setting::Set(val) => builder.set_sort_facet_values_by(
+                    val.iter().map(|(name, order)| (name.clone(), (*order).into())).collect(),
+                ),
+                Setting::Reset => builder.reset_sort_facet_values_by(),
+                Setting::NotSet => (),
+            }
+        }
+        Setting::Reset => {
+            builder.reset_max_values_per_facet();
+            builder.reset_sort_facet_values_by();
+        }
         Setting::NotSet => (),
     }
@@ -476,6 +492,13 @@ pub fn settings(
         max_values_per_facet: Setting::Set(
             index.max_values_per_facet(rtxn)?.unwrap_or(DEFAULT_VALUES_PER_FACET),
         ),
+        sort_facet_values_by: Setting::Set(
+            index
+                .sort_facet_values_by(rtxn)?
+                .into_iter()
+                .map(|(name, sort)| (name, sort.into()))
+                .collect(),
+        ),
     };

     let pagination = PaginationSettings {

@@ -14,14 +14,27 @@ default-run = "meilisearch"

 [dependencies]
 actix-cors = "0.6.4"
-actix-http = { version = "3.3.1", default-features = false, features = ["compress-brotli", "compress-gzip", "rustls"] }
-actix-web = { version = "4.3.1", default-features = false, features = ["macros", "compress-brotli", "compress-gzip", "cookies", "rustls"] }
+actix-http = { version = "3.3.1", default-features = false, features = [
+    "compress-brotli",
+    "compress-gzip",
+    "rustls",
+] }
+actix-web = { version = "4.3.1", default-features = false, features = [
+    "macros",
+    "compress-brotli",
+    "compress-gzip",
+    "cookies",
+    "rustls",
+] }
 actix-web-static-files = { git = "https://github.com/kilork/actix-web-static-files.git", rev = "2d3b6160", optional = true }
 anyhow = { version = "1.0.70", features = ["backtrace"] }
 async-stream = "0.3.5"
 async-trait = "0.1.68"
 bstr = "1.4.0"
-byte-unit = { version = "4.0.19", default-features = false, features = ["std", "serde"] }
+byte-unit = { version = "4.0.19", default-features = false, features = [
+    "std",
+    "serde",
+] }
 bytes = "1.4.0"
 clap = { version = "4.2.1", features = ["derive", "env"] }
 crossbeam-channel = "0.5.8"
@@ -57,7 +70,10 @@ prometheus = { version = "0.13.3", features = ["process"] }
 rand = "0.8.5"
 rayon = "1.7.0"
 regex = "1.7.3"
-reqwest = { version = "0.11.16", features = ["rustls-tls", "json"], default-features = false }
+reqwest = { version = "0.11.16", features = [
+    "rustls-tls",
+    "json",
+], default-features = false }
 rustls = "0.20.8"
 rustls-pemfile = "1.0.2"
 segment = { version = "0.2.2", optional = true }
@@ -71,7 +87,12 @@ sysinfo = "0.28.4"
 tar = "0.4.38"
 tempfile = "3.5.0"
 thiserror = "1.0.40"
-time = { version = "0.3.20", features = ["serde-well-known", "formatting", "parsing", "macros"] }
+time = { version = "0.3.20", features = [
+    "serde-well-known",
+    "formatting",
+    "parsing",
+    "macros",
+] }
 tokio = { version = "1.27.0", features = ["full"] }
 tokio-stream = "0.1.12"
 toml = "0.7.3"
@@ -99,7 +120,10 @@ yaup = "0.2.1"
 anyhow = { version = "1.0.70", optional = true }
 cargo_toml = { version = "0.15.2", optional = true }
 hex = { version = "0.4.3", optional = true }
-reqwest = { version = "0.11.16", features = ["blocking", "rustls-tls"], default-features = false, optional = true }
+reqwest = { version = "0.11.16", features = [
+    "blocking",
+    "rustls-tls",
+], default-features = false, optional = true }
 sha-1 = { version = "0.10.1", optional = true }
 static-files = { version = "0.2.3", optional = true }
 tempfile = { version = "3.5.0", optional = true }
@@ -109,7 +133,17 @@ zip = { version = "0.6.4", optional = true }
 [features]
 default = ["analytics", "meilisearch-types/all-tokenizations", "mini-dashboard"]
 analytics = ["segment"]
-mini-dashboard = ["actix-web-static-files", "static-files", "anyhow", "cargo_toml", "hex", "reqwest", "sha-1", "tempfile", "zip"]
+mini-dashboard = [
+    "actix-web-static-files",
+    "static-files",
+    "anyhow",
+    "cargo_toml",
+    "hex",
+    "reqwest",
+    "sha-1",
+    "tempfile",
+    "zip",
+]
 chinese = ["meilisearch-types/chinese"]
 hebrew = ["meilisearch-types/hebrew"]
 japanese = ["meilisearch-types/japanese"]

@@ -401,12 +401,17 @@ make_setting_route!(
     analytics,
     |setting: &Option<meilisearch_types::settings::FacetingSettings>, req: &HttpRequest| {
         use serde_json::json;
+        use meilisearch_types::facet_values_sort::FacetValuesSort;

         analytics.publish(
             "Faceting Updated".to_string(),
             json!({
                 "faceting": {
                     "max_values_per_facet": setting.as_ref().and_then(|s| s.max_values_per_facet.set()),
+                    "sort_facet_values_by_star_count": setting.as_ref().and_then(|s| {
+                        s.sort_facet_values_by.as_ref().set().map(|s| s.iter().any(|(k, v)| k == "*" && v == &FacetValuesSort::Count))
+                    }),
+                    "sort_facet_values_by_total": setting.as_ref().and_then(|s| s.sort_facet_values_by.as_ref().set().map(|s| s.len())),
                 },
             }),
             Some(req),
@@ -545,6 +550,10 @@ pub async fn update_all(
                     .as_ref()
                     .set()
                     .and_then(|s| s.max_values_per_facet.as_ref().set()),
+                "sort_facet_values_by": new_settings.faceting
+                    .as_ref()
+                    .set()
+                    .and_then(|s| s.sort_facet_values_by.as_ref().set()),
             },
             "pagination": {
                 "max_total_hits": new_settings.pagination

@@ -6,6 +6,7 @@ use std::time::Instant;
 use deserr::Deserr;
 use either::Either;
 use index_scheduler::RoFeatures;
+use indexmap::IndexMap;
 use log::warn;
 use meilisearch_auth::IndexSearchRules;
 use meilisearch_types::deserr::DeserrJsonError;
@@ -14,7 +15,7 @@ use meilisearch_types::heed::RoTxn;
 use meilisearch_types::index_uid::IndexUid;
 use meilisearch_types::milli::score_details::{ScoreDetails, ScoringStrategy};
 use meilisearch_types::milli::{
-    dot_product_similarity, FacetValueHit, InternalError, SearchForFacetValues,
+    dot_product_similarity, FacetValueHit, InternalError, OrderBy, SearchForFacetValues,
 };
 use meilisearch_types::settings::DEFAULT_PAGINATION_MAX_TOTAL_HITS;
 use meilisearch_types::{milli, Document};
@@ -226,6 +227,26 @@ impl From<MatchingStrategy> for TermsMatchingStrategy {
     }
 }

+#[derive(Debug, Default, Clone, PartialEq, Eq, Deserr)]
+#[deserr(rename_all = camelCase)]
+pub enum FacetValuesSort {
+    /// Facet values are sorted in alphabetical order, ascending from A to Z.
+    #[default]
+    Alpha,
+    /// Facet values are sorted by decreasing count.
+    /// The count is the number of records containing this facet value in the results of the query.
+    Count,
+}
+
+impl From<FacetValuesSort> for OrderBy {
+    fn from(val: FacetValuesSort) -> Self {
+        match val {
+            FacetValuesSort::Alpha => OrderBy::Lexicographic,
+            FacetValuesSort::Count => OrderBy::Count,
+        }
+    }
+}
+
 #[derive(Debug, Clone, Serialize, PartialEq)]
 pub struct SearchHit {
     #[serde(flatten)]
@@ -253,7 +274,7 @@ pub struct SearchResult {
     #[serde(flatten)]
     pub hits_info: HitsInfo,
     #[serde(skip_serializing_if = "Option::is_none")]
-    pub facet_distribution: Option<BTreeMap<String, BTreeMap<String, u64>>>,
+    pub facet_distribution: Option<BTreeMap<String, IndexMap<String, u64>>>,
     #[serde(skip_serializing_if = "Option::is_none")]
     pub facet_stats: Option<BTreeMap<String, FacetStats>>,
 }
@@ -554,10 +575,30 @@ pub fn perform_search(
                 .unwrap_or(DEFAULT_VALUES_PER_FACET);
             facet_distribution.max_values_per_facet(max_values_by_facet);

+            let sort_facet_values_by =
+                index.sort_facet_values_by(&rtxn).map_err(milli::Error::from)?;
+            let default_sort_facet_values_by =
+                sort_facet_values_by.get("*").copied().unwrap_or_default();
+
             if fields.iter().all(|f| f != "*") {
+                let fields: Vec<_> = fields
+                    .iter()
+                    .map(|n| {
+                        (
+                            n,
+                            sort_facet_values_by
+                                .get(n)
+                                .copied()
+                                .unwrap_or(default_sort_facet_values_by),
+                        )
+                    })
+                    .collect();
                 facet_distribution.facets(fields);
             }
-            let distribution = facet_distribution.candidates(candidates).execute()?;
+
+            let distribution = facet_distribution
+                .candidates(candidates)
+                .default_order_by(default_sort_facet_values_by)
+                .execute()?;
+
             let stats = facet_distribution.compute_stats()?;
             (Some(distribution), Some(stats))
         }

@@ -36,7 +36,7 @@ async fn import_dump_v1_movie_raw() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({"displayedAttributes": ["*"], "searchableAttributes": ["*"], "filterableAttributes": [], "sortableAttributes": [], "rankingRules": ["typo", "words", "proximity", "attribute", "exactness"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 } })
+        json!({"displayedAttributes": ["*"], "searchableAttributes": ["*"], "filterableAttributes": [], "sortableAttributes": [], "rankingRules": ["typo", "words", "proximity", "attribute", "exactness"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 } })
     );

     let (tasks, code) = index.list_tasks().await;
@@ -128,7 +128,7 @@ async fn import_dump_v1_movie_with_settings() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({ "displayedAttributes": ["genres", "id", "overview", "poster", "release_date", "title"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "sortableAttributes": ["genres"], "rankingRules": ["typo", "words", "proximity", "attribute", "exactness"], "stopWords": ["of", "the"], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": { "oneTypo": 5, "twoTypos": 9 }, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 } })
+        json!({ "displayedAttributes": ["genres", "id", "overview", "poster", "release_date", "title"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "sortableAttributes": ["genres"], "rankingRules": ["typo", "words", "proximity", "attribute", "exactness"], "stopWords": ["of", "the"], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": { "oneTypo": 5, "twoTypos": 9 }, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 } })
     );

     let (tasks, code) = index.list_tasks().await;
@@ -220,7 +220,7 @@ async fn import_dump_v1_rubygems_with_settings() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({"displayedAttributes": ["description", "id", "name", "summary", "total_downloads", "version"], "searchableAttributes": ["name", "summary"], "filterableAttributes": ["version"], "sortableAttributes": ["version"], "rankingRules": ["typo", "words", "fame:desc", "proximity", "attribute", "exactness", "total_downloads:desc"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 }})
+        json!({"displayedAttributes": ["description", "id", "name", "summary", "total_downloads", "version"], "searchableAttributes": ["name", "summary"], "filterableAttributes": ["version"], "sortableAttributes": ["version"], "rankingRules": ["typo", "words", "fame:desc", "proximity", "attribute", "exactness", "total_downloads:desc"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 }})
     );

     let (tasks, code) = index.list_tasks().await;
@@ -310,7 +310,7 @@ async fn import_dump_v2_movie_raw() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({"displayedAttributes": ["*"], "searchableAttributes": ["*"], "filterableAttributes": [], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 } })
+        json!({"displayedAttributes": ["*"], "searchableAttributes": ["*"], "filterableAttributes": [], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 } })
     );

     let (tasks, code) = index.list_tasks().await;
@@ -402,7 +402,7 @@ async fn import_dump_v2_movie_with_settings() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({ "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": ["of", "the"], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": { "oneTypo": 5, "twoTypos": 9 }, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 } })
+        json!({ "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": ["of", "the"], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": { "oneTypo": 5, "twoTypos": 9 }, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 } })
     );

     let (tasks, code) = index.list_tasks().await;
@@ -494,7 +494,7 @@ async fn import_dump_v2_rubygems_with_settings() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({"displayedAttributes": ["name", "summary", "description", "version", "total_downloads"], "searchableAttributes": ["name", "summary"], "filterableAttributes": ["version"], "sortableAttributes": [], "rankingRules": ["typo", "words", "fame:desc", "proximity", "attribute", "exactness", "total_downloads:desc"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 }})
+        json!({"displayedAttributes": ["name", "summary", "description", "version", "total_downloads"], "searchableAttributes": ["name", "summary"], "filterableAttributes": ["version"], "sortableAttributes": [], "rankingRules": ["typo", "words", "fame:desc", "proximity", "attribute", "exactness", "total_downloads:desc"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 }})
     );

     let (tasks, code) = index.list_tasks().await;
@@ -584,7 +584,7 @@ async fn import_dump_v3_movie_raw() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({"displayedAttributes": ["*"], "searchableAttributes": ["*"], "filterableAttributes": [], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 } })
+        json!({"displayedAttributes": ["*"], "searchableAttributes": ["*"], "filterableAttributes": [], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 } })
     );

     let (tasks, code) = index.list_tasks().await;
@@ -676,7 +676,7 @@ async fn import_dump_v3_movie_with_settings() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({ "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": ["of", "the"], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": { "oneTypo": 5, "twoTypos": 9 }, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 } })
+        json!({ "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": ["of", "the"], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": { "oneTypo": 5, "twoTypos": 9 }, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 } })
     );

     let (tasks, code) = index.list_tasks().await;
@@ -768,7 +768,7 @@ async fn import_dump_v3_rubygems_with_settings() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({"displayedAttributes": ["name", "summary", "description", "version", "total_downloads"], "searchableAttributes": ["name", "summary"], "filterableAttributes": ["version"], "sortableAttributes": [], "rankingRules": ["typo", "words", "fame:desc", "proximity", "attribute", "exactness", "total_downloads:desc"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 } })
+        json!({"displayedAttributes": ["name", "summary", "description", "version", "total_downloads"], "searchableAttributes": ["name", "summary"], "filterableAttributes": ["version"], "sortableAttributes": [], "rankingRules": ["typo", "words", "fame:desc", "proximity", "attribute", "exactness", "total_downloads:desc"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 } })
     );

     let (tasks, code) = index.list_tasks().await;
@@ -858,7 +858,7 @@ async fn import_dump_v4_movie_raw() {
     assert_eq!(code, 200);
    assert_eq!(
         settings,
-        json!({ "displayedAttributes": ["*"], "searchableAttributes": ["*"], "filterableAttributes": [], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 } })
+        json!({ "displayedAttributes": ["*"], "searchableAttributes": ["*"], "filterableAttributes": [], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 } })
     );

     let (tasks, code) = index.list_tasks().await;
@@ -950,7 +950,7 @@ async fn import_dump_v4_movie_with_settings() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({ "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": ["of", "the"], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": { "oneTypo": 5, "twoTypos": 9 }, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 } })
+        json!({ "displayedAttributes": ["title", "genres", "overview", "poster", "release_date"], "searchableAttributes": ["title", "overview"], "filterableAttributes": ["genres"], "sortableAttributes": [], "rankingRules": ["words", "typo", "proximity", "attribute", "exactness"], "stopWords": ["of", "the"], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": { "oneTypo": 5, "twoTypos": 9 }, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 } })
     );

     let (tasks, code) = index.list_tasks().await;
@@ -1042,7 +1042,7 @@ async fn import_dump_v4_rubygems_with_settings() {
     assert_eq!(code, 200);
     assert_eq!(
         settings,
-        json!({ "displayedAttributes": ["name", "summary", "description", "version", "total_downloads"], "searchableAttributes": ["name", "summary"], "filterableAttributes": ["version"], "sortableAttributes": [], "rankingRules": ["typo", "words", "fame:desc", "proximity", "attribute", "exactness", "total_downloads:desc"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100 }, "pagination": { "maxTotalHits": 1000 } })
+        json!({ "displayedAttributes": ["name", "summary", "description", "version", "total_downloads"], "searchableAttributes": ["name", "summary"], "filterableAttributes": ["version"], "sortableAttributes": [], "rankingRules": ["typo", "words", "fame:desc", "proximity", "attribute", "exactness", "total_downloads:desc"], "stopWords": [], "synonyms": {}, "distinctAttribute": null, "typoTolerance": {"enabled": true, "minWordSizeForTypos": {"oneTypo": 5, "twoTypos": 9}, "disableOnWords": [], "disableOnAttributes": [] }, "faceting": { "maxValuesPerFacet": 100, "sortFacetValuesBy": { "*": "alpha" } }, "pagination": { "maxTotalHits": 1000 } })
     );

     let (tasks, code) = index.list_tasks().await;

@@ -21,6 +21,9 @@ static DEFAULT_SETTINGS_VALUES: Lazy<HashMap<&'static str, Value>> = Lazy::new(|
         "faceting",
         json!({
             "maxValuesPerFacet": json!(100),
+            "sortFacetValuesBy": {
+                "*": "alpha"
+            }
         }),
     );
     map.insert(
@@ -63,6 +66,9 @@ async fn get_settings() {
         settings["faceting"],
         json!({
             "maxValuesPerFacet": 100,
+            "sortFacetValuesBy": {
+                "*": "alpha"
+            }
         })
     );
     assert_eq!(

@@ -34,6 +34,7 @@ heed = { git = "https://github.com/meilisearch/heed", tag = "v0.12.6", default-f
     "sync-read-txn",
 ] }
 hnsw = { version = "0.11.0", features = ["serde1"] }
+indexmap = { version = "1.9.3", features = ["serde"] }
 json-depth-checker = { path = "../json-depth-checker" }
 levenshtein_automata = { version = "0.2.1", features = ["fst_automaton"] }
 memmap2 = "0.5.10"

@@ -26,7 +26,8 @@ use crate::readable_slices::ReadableSlices;
 use crate::{
     default_criteria, CboRoaringBitmapCodec, Criterion, DocumentId, ExternalDocumentsIds,
     FacetDistribution, FieldDistribution, FieldId, FieldIdWordCountCodec, GeoPoint, ObkvCodec,
-    Result, RoaringBitmapCodec, RoaringBitmapLenCodec, Search, U8StrStrCodec, BEU16, BEU32,
+    OrderBy, Result, RoaringBitmapCodec, RoaringBitmapLenCodec, Search, U8StrStrCodec, BEU16,
+    BEU32,
 };

 /// The HNSW data-structure that we serialize, fill and search in.
@@ -71,6 +72,7 @@ pub mod main_key {
     pub const EXACT_WORDS: &str = "exact-words";
     pub const EXACT_ATTRIBUTES: &str = "exact-attributes";
     pub const MAX_VALUES_PER_FACET: &str = "max-values-per-facet";
+    pub const SORT_FACET_VALUES_BY: &str = "sort-facet-values-by";
     pub const PAGINATION_MAX_TOTAL_HITS: &str = "pagination-max-total-hits";
 }
@@ -1298,6 +1300,31 @@ impl Index {
         self.main.delete::<_, Str>(txn, main_key::MAX_VALUES_PER_FACET)
     }

+    pub fn sort_facet_values_by(&self, txn: &RoTxn) -> heed::Result<HashMap<String, OrderBy>> {
+        let mut orders = self
+            .main
+            .get::<_, Str, SerdeJson<HashMap<String, OrderBy>>>(
+                txn,
+                main_key::SORT_FACET_VALUES_BY,
+            )?
+            .unwrap_or_default();
+        // Insert the default ordering if it is not already overwritten by the user.
+        orders.entry("*".to_string()).or_insert(OrderBy::Lexicographic);
+        Ok(orders)
+    }
+
+    pub(crate) fn put_sort_facet_values_by(
+        &self,
+        txn: &mut RwTxn,
+        val: &HashMap<String, OrderBy>,
+    ) -> heed::Result<()> {
+        self.main.put::<_, Str, SerdeJson<_>>(txn, main_key::SORT_FACET_VALUES_BY, &val)
+    }
+
+    pub(crate) fn delete_sort_facet_values_by(&self, txn: &mut RwTxn) -> heed::Result<bool> {
+        self.main.delete::<_, Str>(txn, main_key::SORT_FACET_VALUES_BY)
+    }
+
     pub fn pagination_max_total_hits(&self, txn: &RoTxn) -> heed::Result<Option<usize>> {
         self.main.get::<_, Str, OwnedType<usize>>(txn, main_key::PAGINATION_MAX_TOTAL_HITS)
     }

@@ -58,7 +58,7 @@ pub use self::heed_codec::{
 pub use self::index::Index;
 pub use self::search::{
     FacetDistribution, FacetValueHit, Filter, FormatOptions, MatchBounds, MatcherBuilder,
-    MatchingWords, Search, SearchForFacetValues, SearchResult, TermsMatchingStrategy,
+    MatchingWords, OrderBy, Search, SearchForFacetValues, SearchResult, TermsMatchingStrategy,
     DEFAULT_VALUES_PER_FACET,
 };

@@ -1,19 +1,22 @@
-use std::collections::{BTreeMap, HashSet};
+use std::collections::{BTreeMap, HashMap, HashSet};
 use std::ops::ControlFlow;
 use std::{fmt, mem};

 use heed::types::ByteSlice;
 use heed::BytesDecode;
+use indexmap::IndexMap;
 use roaring::RoaringBitmap;
+use serde::{Deserialize, Serialize};

 use crate::error::UserError;
 use crate::facet::FacetType;
 use crate::heed_codec::facet::{
-    FacetGroupKeyCodec, FacetGroupValueCodec, FieldDocIdFacetF64Codec, FieldDocIdFacetStringCodec,
-    OrderedF64Codec,
+    FacetGroupKeyCodec, FieldDocIdFacetF64Codec, FieldDocIdFacetStringCodec, OrderedF64Codec,
 };
 use crate::heed_codec::{ByteSliceRefCodec, StrRefCodec};
-use crate::search::facet::facet_distribution_iter;
+use crate::search::facet::facet_distribution_iter::{
+    count_iterate_over_facet_distribution, lexicographically_iterate_over_facet_distribution,
+};
 use crate::{FieldId, Index, Result};

 /// The default number of values by facets that will
@@ -24,10 +27,21 @@ pub const DEFAULT_VALUES_PER_FACET: usize = 100;
 /// the system to choose between one algorithm or another.
 const CANDIDATES_THRESHOLD: u64 = 3000;

+/// How should we fetch the facets?
+#[derive(Debug, Default, Clone, Copy, PartialEq, Eq, Serialize, Deserialize)]
+pub enum OrderBy {
+    /// By lexicographic order...
+    #[default]
+    Lexicographic,
+    /// Or by number of docids in common?
+    Count,
+}
+
 pub struct FacetDistribution<'a> {
-    facets: Option<HashSet<String>>,
+    facets: Option<HashMap<String, OrderBy>>,
     candidates: Option<RoaringBitmap>,
     max_values_per_facet: usize,
+    default_order_by: OrderBy,
     rtxn: &'a heed::RoTxn<'a>,
     index: &'a Index,
 }
@@ -38,13 +52,22 @@ impl<'a> FacetDistribution<'a> {
             facets: None,
             candidates: None,
             max_values_per_facet: DEFAULT_VALUES_PER_FACET,
+            default_order_by: OrderBy::default(),
             rtxn,
             index,
         }
     }

-    pub fn facets<I: IntoIterator<Item = A>, A: AsRef<str>>(&mut self, names: I) -> &mut Self {
-        self.facets = Some(names.into_iter().map(|s| s.as_ref().to_string()).collect());
+    pub fn facets<I: IntoIterator<Item = (A, OrderBy)>, A: AsRef<str>>(
+        &mut self,
+        names_ordered_by: I,
+    ) -> &mut Self {
+        self.facets = Some(
+            names_ordered_by
+                .into_iter()
+                .map(|(name, order_by)| (name.as_ref().to_string(), order_by))
+                .collect(),
+        );
         self
     }
@@ -53,6 +76,11 @@ impl<'a> FacetDistribution<'a> {
         self
     }

+    pub fn default_order_by(&mut self, order_by: OrderBy) -> &mut Self {
+        self.default_order_by = order_by;
+        self
+    }
+
     pub fn candidates(&mut self, candidates: RoaringBitmap) -> &mut Self {
         self.candidates = Some(candidates);
         self
@@ -65,7 +93,7 @@ impl<'a> FacetDistribution<'a> {
         field_id: FieldId,
         facet_type: FacetType,
         candidates: &RoaringBitmap,
-        distribution: &mut BTreeMap<String, u64>,
+        distribution: &mut IndexMap<String, u64>,
     ) -> heed::Result<()> {
         match facet_type {
             FacetType::Number => {
@@ -134,9 +162,15 @@ impl<'a> FacetDistribution<'a> {
         &self,
         field_id: FieldId,
         candidates: &RoaringBitmap,
-        distribution: &mut BTreeMap<String, u64>,
+        order_by: OrderBy,
+        distribution: &mut IndexMap<String, u64>,
     ) -> heed::Result<()> {
-        facet_distribution_iter::iterate_over_facet_distribution(
+        let search_function = match order_by {
+            OrderBy::Lexicographic => lexicographically_iterate_over_facet_distribution,
+            OrderBy::Count => count_iterate_over_facet_distribution,
+        };
+
+        search_function(
             self.rtxn,
             self.index
                 .facet_id_f64_docids
@@ -159,9 +193,15 @@ impl<'a> FacetDistribution<'a> {
         &self,
         field_id: FieldId,
         candidates: &RoaringBitmap,
-        distribution: &mut BTreeMap<String, u64>,
+        order_by: OrderBy,
+        distribution: &mut IndexMap<String, u64>,
     ) -> heed::Result<()> {
-        facet_distribution_iter::iterate_over_facet_distribution(
+        let search_function = match order_by {
+            OrderBy::Lexicographic => lexicographically_iterate_over_facet_distribution,
+            OrderBy::Count => count_iterate_over_facet_distribution,
+        };
+
+        search_function(
             self.rtxn,
             self.index
                 .facet_id_string_docids
@@ -189,94 +229,48 @@ impl<'a> FacetDistribution<'a> {
         )
     }

-    /// Placeholder search, a.k.a. no candidates were specified. We iterate throught the
-    /// facet values one by one and iterate on the facet level 0 for numbers.
-    fn facet_values_from_raw_facet_database(
-        &self,
-        field_id: FieldId,
-    ) -> heed::Result<BTreeMap<String, u64>> {
-        let mut distribution = BTreeMap::new();
-
-        let db = self.index.facet_id_f64_docids;
-        let mut prefix = vec![];
-        prefix.extend_from_slice(&field_id.to_be_bytes());
-        prefix.push(0); // read values from level 0 only
-
-        let iter = db
-            .as_polymorph()
-            .prefix_iter::<_, ByteSlice, ByteSlice>(self.rtxn, prefix.as_slice())?
-            .remap_types::<FacetGroupKeyCodec<OrderedF64Codec>, FacetGroupValueCodec>();
-
-        for result in iter {
-            let (key, value) = result?;
-            distribution.insert(key.left_bound.to_string(), value.bitmap.len());
-            if distribution.len() == self.max_values_per_facet {
-                break;
-            }
-        }
-
-        let iter = self
-            .index
-            .facet_id_string_docids
-            .as_polymorph()
-            .prefix_iter::<_, ByteSlice, ByteSlice>(self.rtxn, prefix.as_slice())?
-            .remap_types::<FacetGroupKeyCodec<StrRefCodec>, FacetGroupValueCodec>();
-
-        for result in iter {
-            let (key, value) = result?;
-
-            let docid = value.bitmap.iter().next().unwrap();
-            let key: (FieldId, _, &'a str) = (field_id, docid, key.left_bound);
-            let original_string =
-                self.index.field_id_docid_facet_strings.get(self.rtxn, &key)?.unwrap().to_owned();
-
-            distribution.insert(original_string, value.bitmap.len());
-            if distribution.len() == self.max_values_per_facet {
-                break;
-            }
-        }
-
-        Ok(distribution)
-    }
-
-    fn facet_values(&self, field_id: FieldId) -> heed::Result<BTreeMap<String, u64>> {
-        use FacetType::{Number, String};
-
-        match self.candidates {
-            Some(ref candidates) => {
-                // Classic search, candidates were specified, we must return facet values only related
-                // to those candidates. We also enter here for facet strings for performance reasons.
-                let mut distribution = BTreeMap::new();
-                if candidates.len() <= CANDIDATES_THRESHOLD {
-                    self.facet_distribution_from_documents(
-                        field_id,
-                        Number,
-                        candidates,
-                        &mut distribution,
-                    )?;
-                    self.facet_distribution_from_documents(
-                        field_id,
-                        String,
-                        candidates,
-                        &mut distribution,
-                    )?;
-                } else {
-                    self.facet_numbers_distribution_from_facet_levels(
-                        field_id,
-                        candidates,
-                        &mut distribution,
-                    )?;
-                    self.facet_strings_distribution_from_facet_levels(
-                        field_id,
-                        candidates,
-                        &mut distribution,
-                    )?;
-                }
-
-                Ok(distribution)
-            }
-            None => self.facet_values_from_raw_facet_database(field_id),
-        }
-    }
+    fn facet_values(
+        &self,
+        field_id: FieldId,
+        order_by: OrderBy,
+    ) -> heed::Result<IndexMap<String, u64>> {
+        use FacetType::{Number, String};
+
+        let mut distribution = IndexMap::new();
+        match (order_by, &self.candidates) {
+            (OrderBy::Lexicographic, Some(cnd)) if cnd.len() <= CANDIDATES_THRESHOLD => {
+                // Classic search, candidates were specified, we must return facet values only related
+                // to those candidates. We also enter here for facet strings for performance reasons.
+                self.facet_distribution_from_documents(field_id, Number, cnd, &mut distribution)?;
+                self.facet_distribution_from_documents(field_id, String, cnd, &mut distribution)?;
+            }
+            _ => {
+                let universe;
+                let candidates = match &self.candidates {
+                    Some(cnd) => cnd,
+                    None => {
+                        universe = self.index.documents_ids(self.rtxn)?;
+                        &universe
+                    }
+                };
+
+                self.facet_numbers_distribution_from_facet_levels(
+                    field_id,
+                    candidates,
+                    order_by,
+                    &mut distribution,
+                )?;
+                self.facet_strings_distribution_from_facet_levels(
+                    field_id,
+                    candidates,
+                    order_by,
+                    &mut distribution,
+                )?;
+            }
+        };
+
+        Ok(distribution)
+    }

     pub fn compute_stats(&self) -> Result<BTreeMap<String, (f64, f64)>> {
         let fields_ids_map = self.index.fields_ids_map(self.rtxn)?;
@@ -291,6 +285,7 @@ impl<'a> FacetDistribution<'a> {
             Some(facets) => {
                 let invalid_fields: HashSet<_> = facets
                     .iter()
+                    .map(|(name, _)| name)
                     .filter(|facet| !crate::is_faceted(facet, &filterable_fields))
                     .collect();
                 if !invalid_fields.is_empty() {
@@ -300,7 +295,7 @@ impl<'a> FacetDistribution<'a> {
                     }
                     .into());
                 } else {
-                    facets.clone()
+                    facets.iter().map(|(name, _)| name).cloned().collect()
                 }
             }
             None => filterable_fields,
@@ -337,7 +332,7 @@ impl<'a> FacetDistribution<'a> {
         Ok(distribution)
     }

-    pub fn execute(&self) -> Result<BTreeMap<String, BTreeMap<String, u64>>> {
+    pub fn execute(&self) -> Result<BTreeMap<String, IndexMap<String, u64>>> {
         let fields_ids_map = self.index.fields_ids_map(self.rtxn)?;
         let filterable_fields = self.index.filterable_fields(self.rtxn)?;
@@ -345,6 +340,7 @@ impl<'a> FacetDistribution<'a> {
             Some(ref facets) => {
                 let invalid_fields: HashSet<_> = facets
                     .iter()
+                    .map(|(name, _)| name)
                     .filter(|facet| !crate::is_faceted(facet, &filterable_fields))
                     .collect();
                 if !invalid_fields.is_empty() {
@@ -354,7 +350,7 @@ impl<'a> FacetDistribution<'a> {
                     }
                     .into());
                 } else {
-                    facets.clone()
+                    facets.iter().map(|(name, _)| name).cloned().collect()
                 }
             }
             None => filterable_fields,
@@ -363,7 +359,12 @@ impl<'a> FacetDistribution<'a> {
         let mut distribution = BTreeMap::new();
         for (fid, name) in fields_ids_map.iter() {
             if crate::is_faceted(name, &fields) {
-                let values = self.facet_values(fid)?;
+                let order_by = self
+                    .facets
+                    .as_ref()
+                    .and_then(|facets| facets.get(name).copied())
+                    .unwrap_or(self.default_order_by);
+                let values = self.facet_values(fid, order_by)?;
                 distribution.insert(name.to_string(), values);
             }
         }
@@ -374,25 +375,34 @@ impl<'a> FacetDistribution<'a> {

 impl fmt::Debug for FacetDistribution<'_> {
     fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
-        let FacetDistribution { facets, candidates, max_values_per_facet, rtxn: _, index: _ } =
-            self;
+        let FacetDistribution {
+            facets,
+            candidates,
+            max_values_per_facet,
+            default_order_by,
+            rtxn: _,
+            index: _,
+        } = self;

         f.debug_struct("FacetDistribution")
             .field("facets", facets)
             .field("candidates", candidates)
             .field("max_values_per_facet", max_values_per_facet)
+            .field("default_order_by", default_order_by)
             .finish()
     }
 }

 #[cfg(test)]
 mod tests {
+    use std::iter;
+
     use big_s::S;
     use maplit::hashset;

     use crate::documents::documents_batch_reader_from_objects;
     use crate::index::tests::TempIndex;
-    use crate::{milli_snap, FacetDistribution};
+    use crate::{milli_snap, FacetDistribution, OrderBy};

     #[test]
     fn few_candidates_few_facet_values() {
@@ -417,14 +427,14 @@ mod tests {
         let txn = index.read_txn().unwrap();

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
            .execute()
             .unwrap();

         milli_snap!(format!("{map:?}"), @r###"{"colour": {"Blue": 2, "RED": 1}}"###);

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .candidates([0, 1, 2].iter().copied().collect())
             .execute()
             .unwrap();
@@ -432,7 +442,7 @@ mod tests {
         milli_snap!(format!("{map:?}"), @r###"{"colour": {"Blue": 2, "RED": 1}}"###);

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .candidates([1, 2].iter().copied().collect())
             .execute()
             .unwrap();
@@ -443,7 +453,7 @@ mod tests {
         milli_snap!(format!("{map:?}"), @r###"{"colour": {" blue": 1, "RED": 1}}"###);

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .candidates([2].iter().copied().collect())
             .execute()
             .unwrap();
@@ -451,13 +461,22 @@ mod tests {
         milli_snap!(format!("{map:?}"), @r###"{"colour": {"RED": 1}}"###);

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .candidates([0, 1, 2].iter().copied().collect())
             .max_values_per_facet(1)
             .execute()
             .unwrap();

         milli_snap!(format!("{map:?}"), @r###"{"colour": {"Blue": 1}}"###);
+
+        let map = FacetDistribution::new(&txn, &index)
+            .facets(iter::once(("colour", OrderBy::Count)))
+            .candidates([0, 1, 2].iter().copied().collect())
+            .max_values_per_facet(1)
+            .execute()
+            .unwrap();
+
+        milli_snap!(format!("{map:?}"), @r###"{"colour": {"Blue": 2}}"###);
     }

     #[test]
@@ -489,14 +508,14 @@ mod tests {
         let txn = index.read_txn().unwrap();

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .execute()
             .unwrap();

         milli_snap!(format!("{map:?}"), @r###"{"colour": {"Blue": 4000, "Red": 6000}}"###);

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .max_values_per_facet(1)
             .execute()
             .unwrap();
@@ -504,7 +523,7 @@ mod tests {
         milli_snap!(format!("{map:?}"), @r###"{"colour": {"Blue": 4000}}"###);

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .candidates((0..10_000).collect())
             .execute()
             .unwrap();
@@ -512,7 +531,7 @@ mod tests {
         milli_snap!(format!("{map:?}"), @r###"{"colour": {"Blue": 4000, "Red": 6000}}"###);

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .candidates((0..5_000).collect())
             .execute()
             .unwrap();
@@ -520,7 +539,7 @@ mod tests {
         milli_snap!(format!("{map:?}"), @r###"{"colour": {"Blue": 2000, "Red": 3000}}"###);

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .candidates((0..5_000).collect())
             .execute()
             .unwrap();
@@ -528,13 +547,22 @@ mod tests {
         milli_snap!(format!("{map:?}"), @r###"{"colour": {"Blue": 2000, "Red": 3000}}"###);

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .candidates((0..5_000).collect())
             .max_values_per_facet(1)
             .execute()
             .unwrap();

         milli_snap!(format!("{map:?}"), @r###"{"colour": {"Blue": 2000}}"###);
+
+        let map = FacetDistribution::new(&txn, &index)
+            .facets(iter::once(("colour", OrderBy::Count)))
+            .candidates((0..5_000).collect())
+            .max_values_per_facet(1)
+            .execute()
+            .unwrap();
+
+        milli_snap!(format!("{map:?}"), @r###"{"colour": {"Red": 3000}}"###);
     }

     #[test]
@@ -566,14 +594,14 @@ mod tests {
         let txn = index.read_txn().unwrap();

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .execute()
             .unwrap();

         milli_snap!(format!("{map:?}"), "no_candidates", @"ac9229ed5964d893af96a7076e2f8af5");

         let map = FacetDistribution::new(&txn, &index)
-            .facets(std::iter::once("colour"))
+            .facets(iter::once(("colour", OrderBy::default())))
             .max_values_per_facet(2)
             .execute()
             .unwrap();
@@ -581,7 +609,7 @@ mod tests {
         milli_snap!(format!("{map:?}"), "no_candidates_with_max_2", @r###"{"colour": {"0": 10, "1": 10}}"###);
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.candidates((0..10_000).collect()) .candidates((0..10_000).collect())
.execute() .execute()
.unwrap(); .unwrap();
@ -589,7 +617,7 @@ mod tests {
milli_snap!(format!("{map:?}"), "candidates_0_10_000", @"ac9229ed5964d893af96a7076e2f8af5"); milli_snap!(format!("{map:?}"), "candidates_0_10_000", @"ac9229ed5964d893af96a7076e2f8af5");
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.candidates((0..5_000).collect()) .candidates((0..5_000).collect())
.execute() .execute()
.unwrap(); .unwrap();
@ -626,14 +654,14 @@ mod tests {
let txn = index.read_txn().unwrap(); let txn = index.read_txn().unwrap();
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
milli_snap!(format!("{map:?}"), "no_candidates", @"{}"); milli_snap!(format!("{map:?}"), "no_candidates", @"{}");
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.candidates((0..1000).collect()) .candidates((0..1000).collect())
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
@ -641,7 +669,7 @@ mod tests {
milli_snap!(format!("{map:?}"), "candidates_0_1000", @r###"{"colour": (0.0, 999.0)}"###); milli_snap!(format!("{map:?}"), "candidates_0_1000", @r###"{"colour": (0.0, 999.0)}"###);
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.candidates((217..777).collect()) .candidates((217..777).collect())
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
@ -678,14 +706,14 @@ mod tests {
let txn = index.read_txn().unwrap(); let txn = index.read_txn().unwrap();
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
milli_snap!(format!("{map:?}"), "no_candidates", @"{}"); milli_snap!(format!("{map:?}"), "no_candidates", @"{}");
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.candidates((0..1000).collect()) .candidates((0..1000).collect())
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
@ -693,7 +721,7 @@ mod tests {
milli_snap!(format!("{map:?}"), "candidates_0_1000", @r###"{"colour": (0.0, 1999.0)}"###); milli_snap!(format!("{map:?}"), "candidates_0_1000", @r###"{"colour": (0.0, 1999.0)}"###);
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.candidates((217..777).collect()) .candidates((217..777).collect())
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
@ -730,14 +758,14 @@ mod tests {
let txn = index.read_txn().unwrap(); let txn = index.read_txn().unwrap();
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
milli_snap!(format!("{map:?}"), "no_candidates", @"{}"); milli_snap!(format!("{map:?}"), "no_candidates", @"{}");
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.candidates((0..1000).collect()) .candidates((0..1000).collect())
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
@ -745,7 +773,7 @@ mod tests {
milli_snap!(format!("{map:?}"), "candidates_0_1000", @r###"{"colour": (0.0, 999.0)}"###); milli_snap!(format!("{map:?}"), "candidates_0_1000", @r###"{"colour": (0.0, 999.0)}"###);
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.candidates((217..777).collect()) .candidates((217..777).collect())
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
@ -786,14 +814,14 @@ mod tests {
let txn = index.read_txn().unwrap(); let txn = index.read_txn().unwrap();
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
milli_snap!(format!("{map:?}"), "no_candidates", @"{}"); milli_snap!(format!("{map:?}"), "no_candidates", @"{}");
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.candidates((0..1000).collect()) .candidates((0..1000).collect())
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
@ -801,7 +829,7 @@ mod tests {
milli_snap!(format!("{map:?}"), "candidates_0_1000", @r###"{"colour": (0.0, 1998.0)}"###); milli_snap!(format!("{map:?}"), "candidates_0_1000", @r###"{"colour": (0.0, 1998.0)}"###);
let map = FacetDistribution::new(&txn, &index) let map = FacetDistribution::new(&txn, &index)
.facets(std::iter::once("colour")) .facets(iter::once(("colour", OrderBy::default())))
.candidates((217..777).collect()) .candidates((217..777).collect())
.compute_stats() .compute_stats()
.unwrap(); .unwrap();
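
Throughout these tests the `facets` builder method now takes `(field, OrderBy)` pairs instead of bare field names. The diff above does not show the `OrderBy` definition itself; given that `OrderBy::default()` stands for alpha order and `OrderBy::Count` for count order, a plausible shape would be the following (hypothetical sketch; the real definition in `facet_distribution.rs` may use different derives or serde attributes):

```rust
/// Hypothetical sketch of the `OrderBy` type passed to `FacetDistribution::facets`
/// in the tests above; the real milli definition may differ.
#[derive(Debug, Default, Clone, Copy, PartialEq, Eq)]
pub enum OrderBy {
    /// Facet values are returned in lexicographic (alpha) order; the default.
    #[default]
    Lexicographic,
    /// Facet values are returned by decreasing count.
    Count,
}

fn main() {
    // `OrderBy::default()` in the tests therefore means alpha ordering.
    assert_eq!(OrderBy::default(), OrderBy::Lexicographic);
}
```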


```diff
@@ -1,3 +1,5 @@
+use std::cmp::Reverse;
+use std::collections::BinaryHeap;
 use std::ops::ControlFlow;

 use heed::Result;
@@ -19,7 +21,7 @@ use crate::DocumentId;
 ///
 /// The return value of the closure is a `ControlFlow<()>` which indicates whether we should
 /// keep iterating over the different facet values or stop.
-pub fn iterate_over_facet_distribution<'t, CB>(
+pub fn lexicographically_iterate_over_facet_distribution<'t, CB>(
     rtxn: &'t heed::RoTxn<'t>,
     db: heed::Database<FacetGroupKeyCodec<ByteSliceRefCodec>, FacetGroupValueCodec>,
     field_id: u16,
@@ -29,7 +31,7 @@ pub fn iterate_over_facet_distribution<'t, CB>(
 where
     CB: FnMut(&'t [u8], u64, DocumentId) -> Result<ControlFlow<()>>,
 {
-    let mut fd = FacetDistribution { rtxn, db, field_id, callback };
+    let mut fd = LexicographicFacetDistribution { rtxn, db, field_id, callback };
     let highest_level = get_highest_level(
         rtxn,
         db.remap_key_type::<FacetGroupKeyCodec<ByteSliceRefCodec>>(),
@@ -44,7 +46,102 @@ where
     }
 }

-struct FacetDistribution<'t, CB>
+pub fn count_iterate_over_facet_distribution<'t, CB>(
+    rtxn: &'t heed::RoTxn<'t>,
+    db: heed::Database<FacetGroupKeyCodec<ByteSliceRefCodec>, FacetGroupValueCodec>,
+    field_id: u16,
+    candidates: &RoaringBitmap,
+    mut callback: CB,
+) -> Result<()>
+where
+    CB: FnMut(&'t [u8], u64, DocumentId) -> Result<ControlFlow<()>>,
+{
+    /// # Important
+    /// The order of the fields determines the order in which the facet values will be returned.
+    /// This struct is inserted in a BinaryHeap and popped later on.
+    #[derive(Debug, PartialOrd, Ord, PartialEq, Eq)]
+    struct LevelEntry<'t> {
+        /// The number of candidates in this entry.
+        count: u64,
+        /// The key level of the entry.
+        level: Reverse<u8>,
+        /// The left bound key.
+        left_bound: &'t [u8],
+        /// The number of keys we must look for after `left_bound`.
+        group_size: u8,
+        /// Any docid in the set of matching documents. Used to find the original facet string.
+        any_docid: u32,
+    }
+
+    // Represents the list of keys that we must explore.
+    let mut heap = BinaryHeap::new();
+    let highest_level = get_highest_level(
+        rtxn,
+        db.remap_key_type::<FacetGroupKeyCodec<ByteSliceRefCodec>>(),
+        field_id,
+    )?;
+
+    if let Some(first_bound) = get_first_facet_value::<ByteSliceRefCodec>(rtxn, db, field_id)? {
+        // We first fill the heap with values from the highest level
+        let starting_key =
+            FacetGroupKey { field_id, level: highest_level, left_bound: first_bound };
+        for el in db.range(rtxn, &(&starting_key..))?.take(usize::MAX) {
+            let (key, value) = el?;
+            // The range is unbounded on the right and the group size for the highest level is MAX,
+            // so we need to check that we are not iterating over the next field id
+            if key.field_id != field_id {
+                break;
+            }
+            let intersection = value.bitmap & candidates;
+            let count = intersection.len();
+            if count != 0 {
+                heap.push(LevelEntry {
+                    count,
+                    level: Reverse(key.level),
+                    left_bound: key.left_bound,
+                    group_size: value.size,
+                    any_docid: intersection.min().unwrap(),
+                });
+            }
+        }
+
+        while let Some(LevelEntry { count, level, left_bound, group_size, any_docid }) = heap.pop()
+        {
+            if let Reverse(0) = level {
+                match (callback)(left_bound, count, any_docid)? {
+                    ControlFlow::Continue(_) => (),
+                    ControlFlow::Break(_) => return Ok(()),
+                }
+            } else {
+                let starting_key = FacetGroupKey { field_id, level: level.0 - 1, left_bound };
+                for el in db.range(rtxn, &(&starting_key..))?.take(group_size as usize) {
+                    let (key, value) = el?;
+                    // The range is unbounded on the right and the group size for the highest level is MAX,
+                    // so we need to check that we are not iterating over the next field id
+                    if key.field_id != field_id {
+                        break;
+                    }
+                    let intersection = value.bitmap & candidates;
+                    let count = intersection.len();
+                    if count != 0 {
+                        heap.push(LevelEntry {
+                            count,
+                            level: Reverse(key.level),
+                            left_bound: key.left_bound,
+                            group_size: value.size,
+                            any_docid: intersection.min().unwrap(),
+                        });
+                    }
+                }
+            }
+        }
+    }
+
+    Ok(())
+}
+
+/// Iterate over the facets values by lexicographic order.
+struct LexicographicFacetDistribution<'t, CB>
 where
     CB: FnMut(&'t [u8], u64, DocumentId) -> Result<ControlFlow<()>>,
 {
@@ -54,7 +151,7 @@ where
     callback: CB,
 }

-impl<'t, CB> FacetDistribution<'t, CB>
+impl<'t, CB> LexicographicFacetDistribution<'t, CB>
 where
     CB: FnMut(&'t [u8], u64, DocumentId) -> Result<ControlFlow<()>>,
 {
@@ -86,6 +183,7 @@ where
         }
         Ok(ControlFlow::Continue(()))
     }
+
     fn iterate(
         &mut self,
         candidates: &RoaringBitmap,
@@ -98,10 +196,10 @@ where
         }
         let starting_key =
             FacetGroupKey { field_id: self.field_id, level, left_bound: starting_bound };
-        let iter = self.db.range(self.rtxn, &(&starting_key..)).unwrap().take(group_size);
+        let iter = self.db.range(self.rtxn, &(&starting_key..))?.take(group_size);
         for el in iter {
-            let (key, value) = el.unwrap();
+            let (key, value) = el?;
             // The range is unbounded on the right and the group size for the highest level is MAX,
             // so we need to check that we are not iterating over the next field id
             if key.field_id != self.field_id {
@@ -116,7 +214,7 @@ where
                 value.size as usize,
             )?;
             match cf {
-                ControlFlow::Continue(_) => {}
+                ControlFlow::Continue(_) => (),
                 ControlFlow::Break(_) => return Ok(ControlFlow::Break(())),
             }
         }
@@ -132,7 +230,7 @@ mod tests {
     use heed::BytesDecode;
     use roaring::RoaringBitmap;

-    use super::iterate_over_facet_distribution;
+    use super::lexicographically_iterate_over_facet_distribution;
     use crate::heed_codec::facet::OrderedF64Codec;
     use crate::milli_snap;
     use crate::search::facet::tests::{get_random_looking_index, get_simple_index};
@@ -144,7 +242,7 @@ mod tests {
            let txn = index.env.read_txn().unwrap();
            let candidates = (0..=255).collect::<RoaringBitmap>();
            let mut results = String::new();
-            iterate_over_facet_distribution(
+            lexicographically_iterate_over_facet_distribution(
                &txn,
                index.content,
                0,
@@ -161,6 +259,7 @@ mod tests {
            txn.commit().unwrap();
        }
    }
+
    #[test]
    fn filter_distribution_all_stop_early() {
        let indexes = [get_simple_index(), get_random_looking_index()];
@@ -169,7 +268,7 @@ mod tests {
            let candidates = (0..=255).collect::<RoaringBitmap>();
            let mut results = String::new();
            let mut nbr_facets = 0;
-            iterate_over_facet_distribution(
+            lexicographically_iterate_over_facet_distribution(
                &txn,
                index.content,
                0,
```
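
The count-based traversal above leans entirely on the derived `Ord` of `LevelEntry`: `count` is compared first and `level` is wrapped in `Reverse`, so the `BinaryHeap` always pops the entry with the most matching candidates and, on a tie, prefers level-0 entries (concrete facet values) over higher-level groups. A minimal, self-contained sketch of that ordering, using a simplified stand-in type rather than the real `LevelEntry`:

```rust
use std::cmp::Reverse;
use std::collections::BinaryHeap;

// Simplified stand-in for `LevelEntry`, keeping only the fields that drive the
// ordering. Field order matters for the derived `Ord`: `count` is compared
// first, then `Reverse(level)`.
#[derive(Debug, PartialEq, Eq, PartialOrd, Ord)]
struct Entry {
    count: u64,
    level: Reverse<u8>,
}

fn main() {
    let mut heap = BinaryHeap::new();
    heap.push(Entry { count: 10, level: Reverse(2) }); // a group node
    heap.push(Entry { count: 25, level: Reverse(1) }); // a bigger group node
    heap.push(Entry { count: 25, level: Reverse(0) }); // a leaf with the same count
    heap.push(Entry { count: 3, level: Reverse(0) }); // a small leaf

    // Pops by decreasing count; on a tie the leaf (level 0) comes out first
    // because `Reverse(0) > Reverse(1)`.
    while let Some(Entry { count, level: Reverse(level) }) = heap.pop() {
        println!("count={count}, level={level}");
    }
    // Prints: (25, 0), (25, 1), (10, 2), (3, 0)
}
```

Because a group's count is at least the count of any of its children, every popped group is expanded into its children and pushed back, and the first level-0 entry to surface is guaranteed to be the facet value with the highest count overall; this is what makes the early `ControlFlow::Break` (triggered by `maxValuesPerFacet`) correct.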


```diff
@@ -4,7 +4,7 @@ use heed::types::{ByteSlice, DecodeIgnore};
 use heed::{BytesDecode, RoTxn};
 use roaring::RoaringBitmap;

-pub use self::facet_distribution::{FacetDistribution, DEFAULT_VALUES_PER_FACET};
+pub use self::facet_distribution::{FacetDistribution, OrderBy, DEFAULT_VALUES_PER_FACET};
 pub use self::filter::{BadGeoError, Filter};
 use crate::heed_codec::facet::{FacetGroupKeyCodec, FacetGroupValueCodec, OrderedF64Codec};
 use crate::heed_codec::ByteSliceRefCodec;
```


```diff
@@ -7,7 +7,7 @@ use log::error;
 use once_cell::sync::Lazy;
 use roaring::bitmap::RoaringBitmap;

-pub use self::facet::{FacetDistribution, Filter, DEFAULT_VALUES_PER_FACET};
+pub use self::facet::{FacetDistribution, Filter, OrderBy, DEFAULT_VALUES_PER_FACET};
 pub use self::new::matches::{FormatOptions, MatchBounds, Matcher, MatcherBuilder, MatchingWords};
 use self::new::PartialSearchResult;
 use crate::error::UserError;
```


```diff
@@ -14,7 +14,7 @@ use crate::error::UserError;
 use crate::index::{DEFAULT_MIN_WORD_LEN_ONE_TYPO, DEFAULT_MIN_WORD_LEN_TWO_TYPOS};
 use crate::update::index_documents::IndexDocumentsMethod;
 use crate::update::{IndexDocuments, UpdateIndexingStep};
-use crate::{FieldsIdsMap, Index, Result};
+use crate::{FieldsIdsMap, Index, OrderBy, Result};

 #[derive(Debug, Clone, PartialEq, Eq, Copy)]
 pub enum Setting<T> {
@@ -122,6 +122,7 @@ pub struct Settings<'a, 't, 'u, 'i> {
     /// Attributes on which typo tolerance is disabled.
     exact_attributes: Setting<HashSet<String>>,
     max_values_per_facet: Setting<usize>,
+    sort_facet_values_by: Setting<HashMap<String, OrderBy>>,
     pagination_max_total_hits: Setting<usize>,
 }

@@ -149,6 +150,7 @@ impl<'a, 't, 'u, 'i> Settings<'a, 't, 'u, 'i> {
            min_word_len_one_typo: Setting::NotSet,
            exact_attributes: Setting::NotSet,
            max_values_per_facet: Setting::NotSet,
+            sort_facet_values_by: Setting::NotSet,
            pagination_max_total_hits: Setting::NotSet,
            indexer_config,
        }
@@ -275,6 +277,14 @@ impl<'a, 't, 'u, 'i> Settings<'a, 't, 'u, 'i> {
         self.max_values_per_facet = Setting::Reset;
     }

+    pub fn set_sort_facet_values_by(&mut self, value: HashMap<String, OrderBy>) {
+        self.sort_facet_values_by = Setting::Set(value);
+    }
+
+    pub fn reset_sort_facet_values_by(&mut self) {
+        self.sort_facet_values_by = Setting::Reset;
+    }
+
     pub fn set_pagination_max_total_hits(&mut self, value: usize) {
         self.pagination_max_total_hits = Setting::Set(value);
     }
@@ -680,6 +690,20 @@ impl<'a, 't, 'u, 'i> Settings<'a, 't, 'u, 'i> {
         Ok(())
     }

+    fn update_sort_facet_values_by(&mut self) -> Result<()> {
+        match self.sort_facet_values_by.as_ref() {
+            Setting::Set(value) => {
+                self.index.put_sort_facet_values_by(self.wtxn, value)?;
+            }
+            Setting::Reset => {
+                self.index.delete_sort_facet_values_by(self.wtxn)?;
+            }
+            Setting::NotSet => (),
+        }
+
+        Ok(())
+    }
+
     fn update_pagination_max_total_hits(&mut self) -> Result<()> {
         match self.pagination_max_total_hits {
             Setting::Set(max) => {
@@ -714,6 +738,7 @@ impl<'a, 't, 'u, 'i> Settings<'a, 't, 'u, 'i> {
         self.update_min_typo_word_len()?;
         self.update_exact_words()?;
         self.update_max_values_per_facet()?;
+        self.update_sort_facet_values_by()?;
         self.update_pagination_max_total_hits()?;

         // If there is new faceted fields we indicate that we must reindex as we must
@@ -1515,6 +1540,7 @@ mod tests {
                    exact_words,
                    exact_attributes,
                    max_values_per_facet,
+                    sort_facet_values_by,
                    pagination_max_total_hits,
                } = settings;
                assert!(matches!(searchable_fields, Setting::NotSet));
@@ -1532,6 +1558,7 @@ mod tests {
                assert!(matches!(exact_words, Setting::NotSet));
                assert!(matches!(exact_attributes, Setting::NotSet));
                assert!(matches!(max_values_per_facet, Setting::NotSet));
+                assert!(matches!(sort_facet_values_by, Setting::NotSet));
                assert!(matches!(pagination_max_total_hits, Setting::NotSet));
            })
            .unwrap();
```
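
The new `sort_facet_values_by` setting follows the same `Setting<T>` lifecycle as `max_values_per_facet`: set, reset, or leave untouched, then persist during `execute` through `put_sort_facet_values_by` / `delete_sort_facet_values_by`. Below is a hedged sketch of how an embedder of milli might drive it through this API; the helper name is made up, and the `milli::heed` re-export and the `execute(|_| (), || false)` signature are assumptions based on the surrounding milli API rather than on this diff:

```rust
use std::collections::HashMap;

use milli::heed::RwTxn;
use milli::update::{IndexerConfig, Settings};
use milli::{Index, OrderBy};

// Hedged sketch: opening the index and the write transaction is omitted, and
// the exact `Settings::execute` signature can differ between milli versions.
fn sort_genres_by_count<'t>(index: &'t Index, wtxn: &mut RwTxn<'t, '_>) -> milli::Result<()> {
    let config = IndexerConfig::default();
    let mut builder = Settings::new(wtxn, index, &config);

    // Mirrors the `sortFacetValuesBy` setting: `*` keeps the default alpha
    // order for every facet, while `genres` is sorted by descending count.
    let mut sort_by = HashMap::new();
    sort_by.insert("*".to_string(), OrderBy::Lexicographic);
    sort_by.insert("genres".to_string(), OrderBy::Count);
    builder.set_sort_facet_values_by(sort_by);

    builder.execute(|_| (), || false)
}
```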


```diff
@@ -5,7 +5,7 @@ use heed::EnvOpenOptions;
 use maplit::hashset;
 use milli::documents::{DocumentsBatchBuilder, DocumentsBatchReader};
 use milli::update::{IndexDocuments, IndexDocumentsConfig, IndexerConfig, Settings};
-use milli::{FacetDistribution, Index, Object};
+use milli::{FacetDistribution, Index, Object, OrderBy};
 use serde_json::Deserializer;

 #[test]
@@ -63,12 +63,12 @@ fn test_facet_distribution_with_no_facet_values() {
     let txn = index.read_txn().unwrap();

     let mut distrib = FacetDistribution::new(&txn, &index);
-    distrib.facets(vec!["genres"]);
+    distrib.facets(vec![("genres", OrderBy::default())]);
     let result = distrib.execute().unwrap();
     assert_eq!(result["genres"].len(), 0);

     let mut distrib = FacetDistribution::new(&txn, &index);
-    distrib.facets(vec!["tags"]);
+    distrib.facets(vec![("tags", OrderBy::default())]);
     let result = distrib.execute().unwrap();
     assert_eq!(result["tags"].len(), 2);
 }
```