# troubleshoot
r
Hey! We've run into an issue with v0.10.0 and GraphQL. When running searchAcrossLineage on datasets that have a lot of downstream links, we end up getting a DataFetchingException, which seems to be related to:
{
  "error": {
    "root_cause": [
      {
        "type": "max_bytes_length_exceeded_exception",
        "reason": "max_bytes_length_exceeded_exception: bytes can be at most 32766 in length; got 45576"
      }
    ],
    "type": "search_phase_execution_exception",
    "reason": "all shards failed",
    "phase": "query",
    "grouped": true,
    "failed_shards": [
      {
        "shard": 0,
        "index": "datasetindex_v2_1668526466773",
        "node": "W7nwRsS-SlOu01ySlY3N-w",
        "reason": {
          "type": "max_bytes_length_exceeded_exception",
          "reason": "max_bytes_length_exceeded_exception: bytes can be at most 32766 in length; got 45576"
        }
      }
    ]
  },
  "status": 500
}
Since the upgrade required reindexing in Elasticsearch, we're not exactly sure what we should do to address this new issue. Wondering if anyone has any ideas! It only pops up for datasets that have a lot of downstream lineage links.
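For context, the max_bytes_length_exceeded_exception comes from Lucene's hard limit on a single indexed keyword term: at most 32766 bytes, measured after UTF-8 encoding. A minimal Python sketch (the field name and entry format below are hypothetical, for illustration only) shows how a field concatenating many column-lineage entries can blow past that limit and produce a byte count like the 45576 in the error above:

```python
# Lucene rejects any single untruncated keyword term longer than
# 32766 bytes (UTF-8 encoded) -- the limit quoted in the error.
LUCENE_MAX_TERM_BYTES = 32766

def exceeds_term_limit(value: str) -> bool:
    """Return True if `value` would be rejected as a single keyword term."""
    return len(value.encode("utf-8")) > LUCENE_MAX_TERM_BYTES

# Hypothetical lineage field: many 18-byte entries concatenated together.
# 2532 entries * 18 bytes = 45576 bytes, matching the "got 45576" above.
long_field = "col_lineage_entry;" * 2532

print(len(long_field.encode("utf-8")))  # 45576
print(exceeds_term_limit(long_field))   # True
```

Note the limit is on bytes, not characters, so multi-byte characters hit it sooner.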
a
Hi @rhythmic-stone-77840, @dazzling-yak-93039 might be able to help you here!
s
Amazing! Laura is out today, but I am looking into this
h
x-post here: this is a known problem in 0.10.0 that happens when there are too many fields in column-level lineage. It should be resolved when you upgrade to 0.10.1, which also includes many other perf fixes for searchAcrossLineage. Our suggestion is to wait a couple of days for the release to go out.
i
Just updated to 0.10.1 and got the same:
Caused by: org.elasticsearch.ElasticsearchStatusException: Elasticsearch exception [type=search_phase_execution_exception, reason=all shards failed]
Suppressed: org.elasticsearch.client.ResponseException: method [POST], host [http://elasticsearch-master:9200], URI [/datahubpolicyindex_v2/_search?typed_keys=true&max_concurrent_shard_requests=5&ignore_unavailable=false&expand_wildcards=open&allow_no_indices=true&ignore_throttled=true&search_type=query_then_fetch&batched_reduce_size=512&ccs_minimize_roundtrips=true], status line [HTTP/1.1 400 Bad Request] {"error":{"root_cause":[{"type":"query_shard_exception","reason":"[simple_query_string] analyzer [query_word_delimited] not found","index_uuid":"bgzSaLp8Tzi3PbQTBs0mdA","index":"datahubpolicyindex_v2"}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"datahubpolicyindex_v2","node":"hnDnNu2sTZiXcxgTIJBFdQ","reason":{"type":"query_shard_exception","reason":"[simple_query_string] analyzer [query_word_delimited] not found","index_uuid":"bgzSaLp8Tzi3PbQTBs0mdA","index":"datahubpolicyindex_v2"}}]},"status":400}
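An "analyzer [query_word_delimited] not found" error from query_shard_exception typically means the index's settings predate the analyzer definition the query expects, i.e. the index was not rebuilt with the new mappings. The actual failure is easy to miss inside that long suppressed-exception string; a small sketch (the payload below is a hand-trimmed copy of the error above) pulls out the root cause and the missing analyzer name:

```python
import json
import re

# Hand-trimmed root-cause section of the 400 response from the log above.
payload = """{"error":{"root_cause":[{"type":"query_shard_exception",
"reason":"[simple_query_string] analyzer [query_word_delimited] not found",
"index":"datahubpolicyindex_v2"}]},"status":400}"""

root_cause = json.loads(payload)["error"]["root_cause"][0]
print(root_cause["type"])   # query_shard_exception
print(root_cause["index"])  # datahubpolicyindex_v2

# Extract the analyzer name the index is missing from the reason string.
match = re.search(r"analyzer \[(\w+)\] not found", root_cause["reason"])
print(match.group(1))       # query_word_delimited
```

If the analyzer really is absent from the index settings, the usual fix is to rebuild the affected indices rather than patch the query.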
a
Hi @icy-flag-80360, is this still affecting you?
i
@astonishing-answer-96712, yes. After login I got many popup 500 errors. As I understand it, authentication is OK (OIDC auth) but authorization fails: I can't see any datasets or search for anything.
a
Oh no! And you’re logging in as the root user or another?
i
@astonishing-answer-96712 Checked with a local user as well (with the default datahub:datahub auth): same problem.
a
@dazzling-yak-93039 should be able to help here!