calm-sunset-28996
09/07/2021, 8:54 AM
exception: java.util.concurrent.CompletionException: java.lang.RuntimeException: Failed to retrieve entities of type Dataset
Caused by: java.lang.RuntimeException: Failed to batch load Datasets
Caused by: com.linkedin.r2.RemoteInvocationException: com.linkedin.r2.RemoteInvocationException: Failed to get response from server for URI <https://datahub-gms.net:443/entities>
at com.linkedin.restli.internal.client.ExceptionUtil.wrapThrowable(ExceptionUtil.java:135)
Caused by: io.netty.handler.codec.TooLongFrameException: Response entity too large: HttpObjectAggregator$AggregatedFullHttpResponse(decodeResult: success, version: HTTP/1.1, content: CompositeByteBuf(ridx: 0, widx: 2096929, cap: 2096929, components=335))
So the entities are too large, causing the lookup to fail. To give a bit of context: this is only happening with really specific searches, where it has to retrieve multiple datasets which have a huge amount of columns (1000k+). Then it times out. If I search for these individual entities it's fine, and the same when I go to their respective pages. Any idea on how to fix this? I'm looking for some Netty settings at the moment, like maxResponseKB, which I could potentially set.
It's probably the same error as https://github.com/linkedin/datahub/issues/3106 mammoth-bear-12532
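(Editor's note, hedged: the aggregated content size in the TooLongFrameException, widx: 2096929, sits just under 2 MiB, which matches the usual default cap on aggregated responses in the R2/Netty client; Netty's HttpObjectAggregator rejects anything above its maxContentLength. It isn't confirmed here where DataHub wires up this client, but as a minimal sketch of the knob involved, rest.li's HttpClientFactory exposes the limit via the HTTP_MAX_RESPONSE_SIZE property. The 8 MB value and the client construction below are illustrative assumptions, not DataHub's actual bootstrap code.)

```java
import java.util.HashMap;
import java.util.Map;

import com.linkedin.r2.transport.common.Client;
import com.linkedin.r2.transport.common.bridge.client.TransportClientAdapter;
import com.linkedin.r2.transport.http.client.HttpClientFactory;
import com.linkedin.restli.client.RestClient;

public class LargeResponseRestliClient {

  public static RestClient buildClient() {
    // Raise the cap on aggregated HTTP responses (the default is roughly 2 MB);
    // 8 MB here is an arbitrary illustrative value, not a recommended setting.
    Map<String, Object> properties = new HashMap<>();
    properties.put(HttpClientFactory.HTTP_MAX_RESPONSE_SIZE, String.valueOf(8 * 1024 * 1024));

    HttpClientFactory factory = new HttpClientFactory.Builder().build();
    Client r2Client = new TransportClientAdapter(factory.getClient(properties));

    // URI prefix taken from the stack trace above; adjust for your deployment.
    return new RestClient(r2Client, "https://datahub-gms.net:443/");
  }
}
```

This only lifts the client-side response cap; batching or paging the search so fewer wide datasets are fetched per request would sidestep the limit without touching client configuration.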