Thread
#getting-started

    gentle-exabyte-43102

    1 year ago
    fresh install of datahub, browsing to /browse/datasets i see "An error occurred. Please try again shortly." and in the console a request to api/v2/browse?type=dataset&count=100&start=0 is a 400 with "Bad Request. type parameter can not be null"
    to be more clear, i'm just running everything with the docker-compose quickstart shell script
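    for reference, the failing request can be replayed from the host outside the browser; port 9001 is assumed from the quickstart defaults:
    $ curl -i 'http://localhost:9001/api/v2/browse?type=dataset&count=100&start=0'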

    green-football-43791

    1 year ago
    are there any more informative errors printed in the logs of quickstart?
    one other thing you can try is ingesting data and seeing if that helps at all (not sure if it would, but worth a shot)
    ./docker/ingestion/ingestion.sh
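    and if you want to pull logs for the individual services, the container names from the quickstart compose file should work:
    $ docker logs --tail 100 datahub-frontend
    $ docker logs --tail 100 datahub-gms
    $ docker logs --tail 100 datahub-mae-consumer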

    gentle-exabyte-43102

    1 year ago
    the logs are only really printing messages from mysql about a Bad Handshake and from the broker
    i did run the ingestion script to write the sample data into datahub and that succeeded
    all the containers are up

    green-football-43791

    1 year ago
    does search work?

    big-carpet-38439

    1 year ago
    is the elasticsearch container up? sometimes it goes down if you haven't allocated enough memory
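    quick ways to check both, assuming the default container name:
    $ docker ps --filter name=elasticsearch     # is the container still running?
    $ docker stats --no-stream elasticsearch    # memory usage vs. the limit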

    gentle-exabyte-43102

    1 year ago
    it is up
    $ docker ps
    CONTAINER ID        IMAGE                                   COMMAND                  CREATED             STATUS              PORTS                                                      NAMES
    8d82c9f53720        linkedin/datahub-frontend:latest        "datahub-frontend/bi…"   5 days ago          Up 5 days           0.0.0.0:9001->9001/tcp                                     datahub-frontend
    60346a3c9359        linkedin/datahub-mce-consumer:latest    "/bin/sh -c /datahub…"   5 days ago          Up 5 days           0.0.0.0:9090->9090/tcp                                     datahub-mce-consumer
    d80631862e71        linkedin/datahub-mae-consumer:latest    "/bin/sh -c /datahub…"   5 days ago          Up 5 days           9090/tcp, 0.0.0.0:9091->9091/tcp                           datahub-mae-consumer
    d0c564809d05        linkedin/datahub-gms:latest             "/bin/sh -c /datahub…"   5 days ago          Up 5 days           0.0.0.0:8080->8080/tcp                                     datahub-gms
    5106c96ce0f0        landoop/kafka-topics-ui:0.9.4           "/run.sh"                5 days ago          Up 5 days           0.0.0.0:18000->8000/tcp                                    kafka-topics-ui
    1739e6e7f955        landoop/schema-registry-ui:latest       "/run.sh"                5 days ago          Up 5 days           0.0.0.0:8000->8000/tcp                                     schema-registry-ui
    fcf9b36a1459        confluentinc/cp-kafka-rest:5.4.0        "/etc/confluent/dock…"   5 days ago          Up 5 days           0.0.0.0:8082->8082/tcp                                     kafka-rest-proxy
    f53871ab5b0e        confluentinc/cp-schema-registry:5.4.0   "/etc/confluent/dock…"   5 days ago          Up 5 days           0.0.0.0:8081->8081/tcp                                     schema-registry
    d1708d89f4c1        kibana:5.6.8                            "/docker-entrypoint.…"   5 days ago          Up 5 days           0.0.0.0:5601->5601/tcp                                     kibana
    5de6abf821dd        confluentinc/cp-kafka:5.4.0             "/etc/confluent/dock…"   5 days ago          Up 5 days           0.0.0.0:9092->9092/tcp, 0.0.0.0:29092->29092/tcp           broker
    e0af8ee0fc28        confluentinc/cp-zookeeper:5.4.0         "/etc/confluent/dock…"   5 days ago          Up 5 days           2888/tcp, 0.0.0.0:2181->2181/tcp, 3888/tcp                 zookeeper
    92dd17889477        neo4j:4.0.6                             "/sbin/tini -g -- /d…"   5 days ago          Up 5 days           0.0.0.0:7474->7474/tcp, 7473/tcp, 0.0.0.0:7687->7687/tcp   neo4j
    e13d9ab43ba2        elasticsearch:5.6.8                     "/docker-entrypoint.…"   5 days ago          Up 5 days           0.0.0.0:9200->9200/tcp, 9300/tcp                           elasticsearch
    a8d3789ea1d6        mysql:5.7                               "docker-entrypoint.s…"   5 days ago          Up 5 days           0.0.0.0:3306->3306/tcp, 33060/tcp                          mysql

    green-football-43791

    1 year ago
    what if you look at the logs of elastic?
    docker logs elasticsearch

    gentle-exabyte-43102

    1 year ago
    $ docker logs elasticsearch
    [2021-02-27T00:34:15,283][INFO ][o.e.n.Node               ] [] initializing ...
    [2021-02-27T00:34:15,528][INFO ][o.e.e.NodeEnvironment    ] [3s_iZhN] using [1] data paths, mounts [[/usr/share/elasticsearch/data (/dev/nvme0n1p1)]], net usable_space [11.8gb], net total_space [19.9gb], spins? [possibly], types [xfs]
    [2021-02-27T00:34:15,529][INFO ][o.e.e.NodeEnvironment    ] [3s_iZhN] heap size [990.7mb], compressed ordinary object pointers [true]
    [2021-02-27T00:34:15,713][INFO ][o.e.n.Node               ] node name [3s_iZhN] derived from node ID [3s_iZhNxRCizV4OyW1riAA]; set [node.name] to override
    [2021-02-27T00:34:15,714][INFO ][o.e.n.Node               ] version[5.6.8], pid[1], build[688ecce/2018-02-16T16:46:30.010Z], OS[Linux/4.14.209-160.335.amzn2.x86_64/amd64], JVM[Oracle Corporation/OpenJDK 64-Bit Server VM/1.8.0_162/25.162-b12]
    [2021-02-27T00:34:15,714][INFO ][o.e.n.Node               ] JVM arguments [-Xms2g, -Xmx2g, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -XX:+AlwaysPreTouch, -Xss1m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djna.nosys=true, -Djdk.io.permissionsUseCanonicalPath=true, -Dio.netty.noUnsafe=true, -Dio.netty.noKeySetOptimization=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Dlog4j.shutdownHookEnabled=false, -Dlog4j2.disable.jmx=true, -Dlog4j.skipJansi=true, -XX:+HeapDumpOnOutOfMemoryError, -Xms1g, -Xmx1g, -Des.path.home=/usr/share/elasticsearch]
    [2021-02-27T00:34:20,187][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [aggs-matrix-stats]
    [2021-02-27T00:34:20,193][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [ingest-common]
    [2021-02-27T00:34:20,193][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [lang-expression]
    [2021-02-27T00:34:20,203][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [lang-groovy]
    [2021-02-27T00:34:20,203][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [lang-mustache]
    [2021-02-27T00:34:20,203][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [lang-painless]
    [2021-02-27T00:34:20,203][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [parent-join]
    [2021-02-27T00:34:20,203][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [percolator]
    [2021-02-27T00:34:20,204][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [reindex]
    [2021-02-27T00:34:20,204][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [transport-netty3]
    [2021-02-27T00:34:20,204][INFO ][o.e.p.PluginsService     ] [3s_iZhN] loaded module [transport-netty4]
    [2021-02-27T00:34:20,205][INFO ][o.e.p.PluginsService     ] [3s_iZhN] no plugins loaded
    [2021-02-27T00:34:30,023][INFO ][o.e.d.DiscoveryModule    ] [3s_iZhN] using discovery type [zen]
    [2021-02-27T00:34:32,509][INFO ][o.e.n.Node               ] initialized
    [2021-02-27T00:34:32,509][INFO ][o.e.n.Node               ] [3s_iZhN] starting ...
    [2021-02-27T00:34:33,022][INFO ][o.e.t.TransportService   ] [3s_iZhN] publish_address {127.0.0.1:9300}, bound_addresses {127.0.0.1:9300}
    [2021-02-27T00:34:33,053][WARN ][o.e.b.BootstrapChecks    ] [3s_iZhN] max virtual memory areas vm.max_map_count [65530] is too low, increase to at least [262144]
    [2021-02-27T00:34:36,171][INFO ][o.e.c.s.ClusterService   ] [3s_iZhN] new_master {3s_iZhN}{3s_iZhNxRCizV4OyW1riAA}{gxmQVaWTSmufSSXWbLd4Qg}{127.0.0.1}{127.0.0.1:9300}, reason: zen-disco-elected-as-master ([0] nodes joined)
    [2021-02-27T00:34:36,248][INFO ][o.e.h.n.Netty4HttpServerTransport] [3s_iZhN] publish_address {172.17.0.4:9200}, bound_addresses {0.0.0.0:9200}
    [2021-02-27T00:34:36,248][INFO ][o.e.n.Node               ] [3s_iZhN] started
    [2021-02-27T00:34:38,184][INFO ][o.e.g.GatewayService     ] [3s_iZhN] recovered [7] indices into cluster_state
    [2021-02-27T00:34:38,981][WARN ][o.e.d.r.RestController   ] Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header.
    [2021-02-27T00:34:39,495][WARN ][o.e.d.r.RestController   ] Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header.
    [2021-02-27T00:34:39,644][WARN ][o.e.d.r.RestController   ] Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header.
    [2021-02-27T00:34:39,730][WARN ][o.e.d.r.RestController   ] Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header.
    [2021-02-27T00:34:39,909][WARN ][o.e.d.r.RestController   ] Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header.
    [2021-02-27T00:34:40,509][WARN ][o.e.d.r.RestController   ] Content type detection for rest requests is deprecated. Specify the content type using the [Content-Type] header.
    [2021-02-27T00:34:42,287][INFO ][o.e.c.r.a.AllocationService] [3s_iZhN] Cluster health status changed from [RED] to [YELLOW] (reason: [shards started [[chartdocument][3], [chartdocument][2], [chartdocument][1]] ...]).
    [2021-02-27T00:34:43,601][INFO ][o.e.m.j.JvmGcMonitorService] [3s_iZhN] [gc][11] overhead, spent [318ms] collecting in the last [1s]
    [2021-03-03T23:51:31,210][INFO ][o.e.c.m.MetaDataMappingService] [3s_iZhN] [corpuserinfodocument/EBT5fAC4Reqk_L2BXQPvxw] update_mapping [doc]
    [2021-03-03T23:51:35,167][INFO ][o.e.c.m.MetaDataMappingService] [3s_iZhN] [dataprocessdocument/ekQx-_PnS0yZIReIOl5EMg] update_mapping [doc]
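    (the only WARN in there that looks actionable is the vm.max_map_count bootstrap check; probably unrelated to the 400, but the usual fix on the docker host would be
    $ sudo sysctl -w vm.max_map_count=262144
    persisted in /etc/sysctl.conf if it needs to survive a reboot)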

    green-football-43791

    1 year ago
    hmm. that seems fine
    what about the gms logs (linkedin/datahub-gms)?

    gentle-exabyte-43102

    1 year ago
    bunch of AnnotationParser logs at startup and then
    2021-02-27 00:35:13.999:INFO:oejsh.ContextHandler:main: Started o.e.j.w.WebAppContext@3339ad8e{Open source GMS,/,file:///tmp/jetty-0_0_0_0-8080-war_war-_-any-8496186350013556629.dir/webapp/,AVAILABLE}{file:///datahub/datahub-gms/bin/war.war}
    2021-02-27 00:35:14.011:INFO:oejs.AbstractConnector:main: Started ServerConnector@5fdef03a{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}
    2021-02-27 00:35:14.012:INFO:oejs.Server:main: Started @27679ms
    00:36:11.475 [qtp626202354-14] INFO  c.l.r.s.c.ResourceMethodConfigProviderImpl - RestLi MethodLevel Configuration for property timeoutMs sorted by priority - first match gets applied:
    *.* = 0
    
    00:36:11.522 [qtp626202354-14] INFO  c.l.r.s.c.ResourceMethodConfigProviderImpl - RestLi MethodLevel Configuration for property timeoutMs sorted by priority - first match gets applied:
    *.* = 0
    
    00:36:11.523 [qtp626202354-14] INFO  c.l.r.s.c.ResourceMethodConfigProviderImpl - RestLi MethodLevel Configuration for property timeoutMs sorted by priority - first match gets applied:
    *.* = 0
    
    00:50:17.855 [qtp626202354-11] INFO  c.l.parseq.TaskDescriptorFactory - No provider found for TaskDescriptor, falling back to DefaultTaskDescriptor
    23:51:29.846 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.847 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@2
    23:51:29.847 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.855 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.855 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@3
    23:51:29.855 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.861 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.861 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@4
    23:51:29.861 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.884 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.884 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@5
    23:51:29.884 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.889 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.889 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@6
    23:51:29.889 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.892 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.892 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@7
    23:51:29.893 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.901 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.902 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@8
    23:51:29.902 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.919 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.920 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@9
    23:51:29.920 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.933 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.934 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@10
    23:51:29.934 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.935 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.935 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@11
    23:51:29.935 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.936 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.936 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@12
    23:51:29.937 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.966 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.967 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@13
    23:51:29.968 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    23:51:29.968 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback - Kafka producer callback:
    23:51:29.968 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Metadata: MetadataAuditEvent_v4-0@14
    23:51:29.968 [kafka-producer-network-thread | producer-1] INFO  c.l.m.d.p.KafkaProducerCallback -   Exception: null
    i'm guessing those kafka producer callbacks are from when i ran the sample ingestion
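    (if it helps, i can also confirm the sample data landed in the search index by listing the elasticsearch indices, since 9200 is published:
    $ curl 'http://localhost:9200/_cat/indices?v'
    and checking for a non-zero docs.count on the dataset index)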

    green-football-43791

    1 year ago
    can you try to browse again?
    it should print some output when you browse 🤔

    gentle-exabyte-43102

    1 year ago
    same error message in the browser
    nothing new in the gms logs
    that seems strange to me. i thought a request to api/v2/browse?type=dataset&count=100&start=0 routes to the gms container?

    green-football-43791

    1 year ago
    yep, it should
    could you have a local gms running?
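    one way to rule that out (lsof flags for linux/mac):
    $ lsof -iTCP:8080 -sTCP:LISTEN    # expect only the docker proxy process here
    $ lsof -iTCP:9001 -sTCP:LISTEN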

    gentle-exabyte-43102

    1 year ago
    ah, damn. i'm sorry, i bet it's the nginx proxy not forwarding the param
    i can browse to api/v2/browse?type=dataset&count=100&start=0 directly in my browser and see the same error

    green-football-43791

    1 year ago
    ah, do you have a proxy up?

    gentle-exabyte-43102

    1 year ago
    i have nginx in front of everything, but i just looked and i see this line
    proxy_pass http://127.0.0.1:9001$uri
    ah, so the front end is having its requests to api sent to 9001 rather than gms

    green-football-43791

    1 year ago
    yes, it will forward through datahub-frontend

    gentle-exabyte-43102

    1 year ago
    oh
    ok, so that's fine then to route everything from the public url to 9001?
    i have this in nginx
    location / {
                    proxy_set_header Connection '';
                    proxy_http_version 1.1;
                    chunked_transfer_encoding off;
                    proxy_buffering off;
                    proxy_cache off;
                    proxy_set_header X-Forwarded-Host $host;
                    proxy_set_header X-Forwarded-Server $host;
                    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
                    proxy_pass http://127.0.0.1:9001$uri;
            }
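    hmm, actually, reading the nginx docs: $uri is the normalized path only and does not carry the query string, which would line up with the "type parameter can not be null" error. a sketch of what i think the location block should pass instead, same upstream assumed:
    location / {
            # with no URI part on proxy_pass, nginx forwards the original
            # path and query string untouched ($request_uri would also keep
            # the args; $uri drops them)
            proxy_pass http://127.0.0.1:9001;
    }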

    green-football-43791

    1 year ago
    that should work

    gentle-exabyte-43102

    1 year ago
    i see a bunch of these in the frontend logs
    22:26:25 [application-akka.actor.default-dispatcher-1662] ERROR application -
    
    ! @7j191d9bm - Internal server error, for (GET) [/api/v1/user/me] ->
    
    play.api.UnexpectedException: Unexpected exception[RuntimeException: com.linkedin.restli.client.RestLiResponseException: Response status 404]
    	at play.api.http.HttpErrorHandlerExceptions$.throwableToUsefulException(HttpErrorHandler.scala:247)
    	at play.api.http.DefaultHttpErrorHandler.onServerError(HttpErrorHandler.scala:176)
    	at play.core.server.AkkaHttpServer$$anonfun$2.applyOrElse(AkkaHttpServer.scala:363)
    	at play.core.server.AkkaHttpServer$$anonfun$2.applyOrElse(AkkaHttpServer.scala:361)
    	at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:346)
    	at scala.concurrent.Future$$anonfun$recoverWith$1.apply(Future.scala:345)
    	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:36)
    	at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:55)
    	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:91)
    	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
    	at akka.dispatch.BatchingExecutor$BlockableBatch$$anonfun$run$1.apply(BatchingExecutor.scala:91)
    	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    	at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:90)
    	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
    	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:43)
    	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
    Caused by: java.lang.RuntimeException: com.linkedin.restli.client.RestLiResponseException: Response status 404
    	at controllers.api.v1.User.getLoggedInUser(User.java:51)
    	at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$12$$anonfun$apply$12.apply(Routes.scala:814)
    	at router.Routes$$anonfun$routes$1$$anonfun$applyOrElse$12$$anonfun$apply$12.apply(Routes.scala:814)
    	at play.core.routing.HandlerInvokerFactory$$anon$3.resultCall(HandlerInvoker.scala:134)
    	at play.core.routing.HandlerInvokerFactory$$anon$3.resultCall(HandlerInvoker.scala:133)
    	at play.core.routing.HandlerInvokerFactory$JavaActionInvokerFactory$$anon$8$$anon$2$$anon$1.invocation(HandlerInvoker.scala:108)
    	at play.core.j.JavaAction$$anon$1.call(JavaAction.scala:88)
    	at play.http.DefaultActionCreator$1.call(DefaultActionCreator.java:31)
    	at play.mvc.Security$AuthenticatedAction.call(Security.java:69)
    	at play.core.j.JavaAction$$anonfun$9.apply(JavaAction.scala:138)
    	at play.core.j.JavaAction$$anonfun$9.apply(JavaAction.scala:138)
    	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
    	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
    	at play.core.j.HttpExecutionContext$$anon$2.run(HttpExecutionContext.scala:56)
    	at play.api.libs.streams.Execution$trampoline$.execute(Execution.scala:70)
    	at play.core.j.HttpExecutionContext.execute(HttpExecutionContext.scala:48)
    	at scala.concurrent.impl.Future$.apply(Future.scala:31)
    	at scala.concurrent.Future$.apply(Future.scala:494)
    	at play.core.j.JavaAction.apply(JavaAction.scala:138)
    	at play.api.mvc.Action$$anonfun$apply$2.apply(Action.scala:96)
    	at play.api.mvc.Action$$anonfun$apply$2.apply(Action.scala:89)
    	at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2$$anonfun$1.apply(Accumulator.scala:174)
    	at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2$$anonfun$1.apply(Accumulator.scala:174)
    	at scala.util.Try$.apply(Try.scala:192)
    	at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2.apply(Accumulator.scala:174)
    	at play.api.libs.streams.StrictAccumulator$$anonfun$mapFuture$2.apply(Accumulator.scala:170)
    	at scala.Function1$$anonfun$andThen$1.apply(Function1.scala:52)
    	at play.api.libs.streams.StrictAccumulator.run(Accumulator.scala:207)
    	at play.core.server.AkkaHttpServer$$anonfun$14.apply(AkkaHttpServer.scala:357)
    	at play.core.server.AkkaHttpServer$$anonfun$14.apply(AkkaHttpServer.scala:355)
    	at akka.http.scaladsl.util.FastFuture$.akka$http$scaladsl$util$FastFuture$$strictTransform$1(FastFuture.scala:41)
    	at akka.http.scaladsl.util.FastFuture$$anonfun$transformWith$extension1$1.apply(FastFuture.scala:51)
    	at akka.http.scaladsl.util.FastFuture$$anonfun$transformWith$extension1$1.apply(FastFuture.scala:50)
    	... 13 common frames omitted
    Caused by: com.linkedin.restli.client.RestLiResponseException: com.linkedin.restli.client.RestLiResponseException: Response status 404
    	at com.linkedin.restli.internal.client.ExceptionUtil.wrapThrowable(ExceptionUtil.java:130)
    	at com.linkedin.restli.internal.client.ResponseFutureImpl.getResponseImpl(ResponseFutureImpl.java:130)
    	at com.linkedin.restli.internal.client.ResponseFutureImpl.getResponse(ResponseFutureImpl.java:94)
    	at com.linkedin.identity.client.CorpUsers.get(CorpUsers.java:59)
    	at com.linkedin.datahub.dao.view.CorpUserViewDao.get(CorpUserViewDao.java:33)
    	at com.linkedin.datahub.dao.view.CorpUserViewDao.getByUserName(CorpUserViewDao.java:51)
    	at controllers.api.v1.User.getLoggedInUser(User.java:49)
    	... 45 common frames omitted
    Caused by: com.linkedin.restli.client.RestLiResponseException: RestException{_response=RestResponse[headers={Content-Length=7148, Date=Wed, 03 Mar 2021 22:26:25 GMT, Server=Jetty(9.4.20.v20190813), X-RestLi-Error-Response=true, X-RestLi-Protocol-Version=2.0.0},cookies=[],status=404,entityLength=7148]}
    	at com.linkedin.restli.internal.client.ExceptionUtil.exceptionForThrowable(ExceptionUtil.java:102)
    	at com.linkedin.restli.client.RestLiCallbackAdapter.convertError(RestLiCallbackAdapter.java:66)
    	at com.linkedin.common.callback.CallbackAdapter.onError(CallbackAdapter.java:86)
    	at com.linkedin.r2.message.timing.TimingCallback.onError(TimingCallback.java:81)
    	at com.linkedin.r2.transport.common.bridge.client.TransportCallbackAdapter.onResponse(TransportCallbackAdapter.java:47)
    	at com.linkedin.r2.filter.transport.FilterChainClient.lambda$createWrappedClientTimingCallback$0(FilterChainClient.java:113)
    	at com.linkedin.r2.filter.transport.ResponseFilter.onRestError(ResponseFilter.java:79)
    	at com.linkedin.r2.filter.TimedRestFilter.onRestError(TimedRestFilter.java:92)
    	at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:166)
    	at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:132)
    	at com.linkedin.r2.filter.FilterChainIterator.onError(FilterChainIterator.java:101)
    	at com.linkedin.r2.filter.TimedNextFilter.onError(TimedNextFilter.java:48)
    	at com.linkedin.r2.filter.message.rest.RestFilter.onRestError(RestFilter.java:84)
    	at com.linkedin.r2.filter.TimedRestFilter.onRestError(TimedRestFilter.java:92)
    	at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:166)
    	at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:132)
    	at com.linkedin.r2.filter.FilterChainIterator.onError(FilterChainIterator.java:101)
    	at com.linkedin.r2.filter.TimedNextFilter.onError(TimedNextFilter.java:48)
    	at com.linkedin.r2.filter.message.rest.RestFilter.onRestError(RestFilter.java:84)
    	at com.linkedin.r2.filter.TimedRestFilter.onRestError(TimedRestFilter.java:92)
    	at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:166)
    	at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:132)
    	at com.linkedin.r2.filter.FilterChainIterator.onError(FilterChainIterator.java:101)
    	at com.linkedin.r2.filter.TimedNextFilter.onError(TimedNextFilter.java:48)
    	at com.linkedin.r2.filter.message.rest.RestFilter.onRestError(RestFilter.java:84)
    	at com.linkedin.r2.filter.TimedRestFilter.onRestError(TimedRestFilter.java:92)
    	at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:166)
    	at com.linkedin.r2.filter.FilterChainIterator$FilterChainRestIterator.doOnError(FilterChainIterator.java:132)
    	at com.linkedin.r2.filter.FilterChainIterator.onError(FilterChainIterator.java:101)
    	at com.linkedin.r2.filter.TimedNextFilter.onError(TimedNextFilter.java:48)
    	at com.linkedin.r2.filter.transport.ClientRequestFilter.lambda$createCallback$0(ClientRequestFilter.java:102)
    	at com.linkedin.r2.transport.http.common.HttpBridge$1.onResponse(HttpBridge.java:82)
    	at com.linkedin.r2.transport.http.client.rest.ExecutionCallback.lambda$onResponse$0(ExecutionCallback.java:64)
    	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    	at java.lang.Thread.run(Thread.java:748)
    Caused by: com.linkedin.r2.message.rest.RestException: Received error 404 from server for URI http://datahub-gms:8080/corpUsers/($params:(),name:foo)
    	at com.linkedin.r2.transport.http.common.HttpBridge$1.onResponse(HttpBridge.java:76)
    	... 4 common frames omitted

    green-football-43791

    1 year ago
    I see
    Caused by: com.linkedin.r2.message.rest.RestException: Received error 404 from server for URI http://datahub-gms:8080/corpUsers/($params:(),name:foo)
    was the foo corpUser ever ingested into the system?
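    you can also ask gms directly; the urn/query syntax below is copied from the stack trace, and the rest.li header is my guess based on the response headers it shows:
    $ curl -H 'X-RestLi-Protocol-Version: 2.0.0' 'http://localhost:8080/corpUsers/($params:(),name:foo)'
    a 404 there would just confirm the foo user was never ingested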

    gentle-exabyte-43102

    1 year ago
    i'm pretty sure not. the only thing i've ever tried to ingest is the sample data. i might have tried to log in with foo

    green-football-43791

    1 year ago
    ah, try logging in with datahub / datahub
    that is the default username and password

    gentle-exabyte-43102

    1 year ago
    right, i am logged in with datahub now

    green-football-43791

    1 year ago
    do any new frontend logs show up when you try to browse?

    gentle-exabyte-43102

    1 year ago
    nothing

    green-football-43791

    1 year ago
    are you using quickstart or quickstart-react?

    gentle-exabyte-43102

    1 year ago
    quickstart

    green-football-43791

    1 year ago
    hmm, ok. just to see what happens, one thing to try would be nuking your current instance and booting back up with quickstart-react
    we may see a more helpful error there
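    something like this, guessing the script names from the docker/ layout above (the ingestion script lives at docker/ingestion/ingestion.sh):
    $ ./docker/nuke.sh              # tear everything down, including volumes
    $ ./docker/quickstart-react.sh  # boot the react flavor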

    gentle-exabyte-43102

    1 year ago
    ok, i'll give that a shot. appreciate your help 🙂

    green-football-43791

    1 year ago
    did you say search worked, or no?

    gentle-exabyte-43102

    1 year ago
    search did not work (as far as i could tell)

    green-football-43791

    1 year ago
    sure, sorry I don't have a quick answer

    gentle-exabyte-43102

    1 year ago
    no worries

    green-football-43791

    1 year ago
    what happens when you search?

    gentle-exabyte-43102

    1 year ago
    i see "An error occurred while querying the search index. Please try again shortly." and the request to
    <http://datahub.adp.autodesk.com/api/v2/search?input=foo&type=dataset&start=0&count=10&as[>…]kedin.common.EntityTopUsage%2Ccom.linkedin.common.Status
    returns the same 400
    Bad Request. type parameter can not be null or empty

    green-football-43791

    1 year ago
    do you have any dataset entities you know the id of?
    if so, the url would be http://datahub.adp.autodesk.com/dataset/<dataset_urn>
    try to see if that works

    gentle-exabyte-43102

    1 year ago
    the only urn i know would be from the sample data, something like
    urn:li:dataset:(urn:li:dataPlatform:kafka,SampleKafkaDataset,PROD)
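    so presumably the full url to try is
    http://datahub.adp.autodesk.com/dataset/urn:li:dataset:(urn:li:dataPlatform:kafka,SampleKafkaDataset,PROD)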

    green-football-43791

    1 year ago
    yep, does that work?

    gentle-exabyte-43102

    1 year ago
    no, it just redirects me to the main Browse datasets page

    green-football-43791

    1 year ago
    hmm

    big-carpet-38439

    1 year ago
    Can we get in a meeting and debug this? Want to get to the bottom of what's going wrong here