
ES client indices.getDataStream method fails on long name list - HTTP line is larger than 4096 bytes #2861

Closed as not planned
@tonyghiani

Description


🐛 Bug report

The indices.getDataStream() method works well for retrieving a short list of data streams by name, but there are scenarios where we need to dynamically filter the available data streams by name and then query their definitions.

When a cluster has many data streams and the composed request becomes very long, we hit the HTTP line is larger than 4096 bytes error.
We currently work around this with a chunking algorithm wherever the list of requested names might grow large, but we have found more than one place where this workaround is needed, so it would be great to have it implemented as a core feature of the JS client.
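For reference, the chunking workaround mentioned above can be sketched as a small helper that splits the name list into batches whose comma-joined length stays under a byte budget (leaving headroom below Elasticsearch's default 4096-byte request-line limit). This is an illustrative sketch, not the client's actual implementation; `chunkNames` and the 3000-byte budget are assumptions.

```typescript
// Hypothetical helper (not part of the client): split data stream names
// into batches so that each comma-joined batch stays under `maxBytes`,
// leaving headroom for "GET /_data_stream/" and the rest of the HTTP line.
function chunkNames(names: string[], maxBytes = 3000): string[][] {
  const batches: string[][] = [];
  let current: string[] = [];
  let bytes = 0;
  for (const name of names) {
    // +1 accounts for the comma separator when the batch is non-empty
    const cost = name.length + (current.length ? 1 : 0);
    if (current.length && bytes + cost > maxBytes) {
      batches.push(current);
      current = [];
      bytes = 0;
    }
    current.push(name);
    bytes += current.length > 1 ? name.length + 1 : name.length;
  }
  if (current.length) batches.push(current);
  return batches;
}

// Usage sketch against the JS client (assumes an existing `client`
// instance; merging the per-batch `data_streams` arrays is illustrative):
//
// const results = await Promise.all(
//   chunkNames(allNames).map((batch) =>
//     client.indices.getDataStream({ name: batch })
//   )
// );
// const dataStreams = results.flatMap((r) => r.data_streams);
```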

This is how the error appears when we hit the API in ES directly (which is expected; the issue is about having the JS esClient overcome this behaviour):

GET _data_stream/logs-dataset-default,logs-... // Very long

// Response
{
  "error": {
    "root_cause": [
      {
        "type": "too_long_http_line_exception",
        "reason": "An HTTP line is larger than 4096 bytes."
      }
    ],
    "type": "too_long_http_line_exception",
    "reason": "An HTTP line is larger than 4096 bytes."
  },
  "status": 400
}

To reproduce

Trigger any request with the JS client such as

scopedClusterClient.asCurrentUser.indices.getDataStream({
  name: ['logs-dataset-default' /* ...many others, exceeding the limit */],
})

Expected behavior

The client should transparently chunk the request behind the scenes and return the combined results at once.

Node.js version

Latest

@elastic/elasticsearch version

Any

Operating system

Any

Any other relevant environment information

No response
