ulysse31 changed the title (Jul 31, 2024) to: 🐞🐋 Either missing parameter to allow max index search output, or missing default fields declaration on created elasticsearch indices
For this bug, I already applied a mitigation manually on my side by setting indices.query.bool.max_clause_count to 4096 in elasticsearch.yml. I would suppose that with a lighter time window the error would not appear in my case: in the end, it is caused by a query that expands to more fields than the default maximum allowed per query, so it depends on the amount of traffic ingested in the selected timeline.
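For reference, that mitigation is a one-line addition to elasticsearch.yml (a sketch; 4096 is simply a value comfortably above the 1153 fields reported in the error, adjust it to your own field count):

```yaml
# elasticsearch.yml -- raise the maximum number of clauses/fields a single
# query may expand to (default 1024). Static setting: requires a restart
# of the Elasticsearch node to take effect.
indices.query.bool.max_clause_count: 4096
```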
Is there an existing issue for this?
Current Behavior
After ingesting high-throughput traffic, if you go to the SN-STATS dashboard, you get a shard exception error as follows:
query_shard_exception at shard 0, index logstash-2024.07.30, node zG64wDeYSXqMvXshrIkLiA
Type: query_shard_exception
Reason: failed to create query: field expansion for [] matches too many fields, limit: 1024, got: 1153
Index uuid: 3KP8KqqxTN2c0bGOEQCHRg
Index: logstash-2024.07.30
Caused by type: illegal_argument_exception
Caused by reason: field expansion for [] matches too many fields, limit: 1024, got: 1153
Searching the internet about this error shows it comes from a default limit that should be raised, either via elasticsearch.yml, or via the index settings themselves by declaring default fields for the index. Here are links to the discussions about it:
https://gitlab.com/gitlab-com/gl-infra/scalability/-/issues/3311
https://discuss.elastic.co/t/error-since-7-4-upgrade-field-expansion-matches-too-many-fields/204087
https://stackoverflow.com/questions/57895269/elasticsearch-7-3-field-expansion-matches-too-many-fields
https://www.elastic.co/guide/en/kibana/current/upgrade-assistant-api-default-field.html
On my side, I can "patch" this with the somewhat overkill mitigation of manually setting indices.query.bool.max_clause_count in elasticsearch.yml. So the fix is either to edit the index mapping/settings to declare default fields, or to preset elasticsearch.yml, in order to correct this issue.
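A sketch of the index-settings approach (declaring default fields so unqualified queries no longer expand to every field in the mapping): the host, index name, and field list below are assumptions for illustration, and the field list should contain the fields your dashboards actually query. For the fix to apply to future daily logstash-* indices, the same setting would need to go into the index template rather than a single index:

```shell
# Hypothetical example: set index.query.default_field on one daily index.
# Assumes Elasticsearch is reachable on localhost:9200; the field names
# are placeholders, not the actual SELKS mapping.
curl -s -XPUT 'http://localhost:9200/logstash-2024.07.30/_settings' \
  -H 'Content-Type: application/json' \
  -d '{"index.query.default_field": ["message", "src_ip", "dest_ip", "event_type"]}'
```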
Expected Behavior
No shard exception errors (and therefore correct output in Kibana).
Steps To Reproduce
Docker version
Docker version 27.1.1, build 6312585
Docker Compose version
Docker Compose version v2.29.1
OS Version
Debian GNU/Linux 12 (bookworm)
Content of the environment file
COMPOSE_PROJECT_NAME=selks
INTERFACES= -i bond1
ELASTIC_MEMORY=64G
SCIRIUS_SECRET_KEY=
PWD=${PWD}
Version of SELKS
commit 4af455c (HEAD -> master, origin/master, origin/HEAD)
Author: Peter Manev [email protected]
Date: Thu Jun 13 13:18:18 2024 +0200
Anything else?
No response