breezy-camera-11182
01/05/2022, 4:49 AM
03:58:27.120 [main] WARN o.s.w.c.s.XmlWebApplicationContext:558 - Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'authorizationManagerFactory': Unsatisfied dependency expressed through field 'entityClient'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'javaEntityClientFactory': Unsatisfied dependency expressed through field '_entityService'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'ebeanAspectDao' defined in com.linkedin.gms.factory.entity.EbeanAspectDaoFactory: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.linkedin.metadata.entity.ebean.EbeanAspectDao]: Factory method 'createInstance' threw exception; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'ebeanServer' defined in com.linkedin.gms.factory.entity.EbeanServerFactory: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [io.ebean.EbeanServer]: Factory method 'createServer' threw exception; nested exception is java.lang.NullPointerException
This happens when running datahub-gms in minikube. I followed the deployment guide from https://datahubproject.io/docs/deploy/kubernetes and only changed MySQL to point at my local MySQL, with this configuration (datahub-helm/charts/prerequisites/values.yaml):
sql:
  datasource:
    host: "host.minikube.internal:3306"
    hostForMysqlClient: "host.minikube.internal"
    port: "3306"
    url: "jdbc:mysql://host.minikube.internal:3306/datahub?verifyServerCertificate=false&useSSL=true&useUnicode=yes&characterEncoding=UTF-8&enabledTLSProtocols=TLSv1.2"
    driver: "com.mysql.cj.jdbc.Driver"
    username: "datahub"
    password:
      secretRef: mysql-secrets-datahub
      secretKey: mysql-root-password
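(Editor's note: one common cause of a NullPointerException thrown from EbeanServerFactory's `createServer` is that the password secret referenced by `secretRef`/`secretKey` does not exist in the cluster, so the resolved password is null. A minimal sketch of creating and verifying that secret is below; the password value is a placeholder, and you should confirm this is actually the cause from your own logs first.)

```shell
# Create the Kubernetes secret that the values.yaml above references.
# '<your-mysql-password>' is a placeholder -- substitute your real password.
kubectl create secret generic mysql-secrets-datahub \
  --from-literal=mysql-root-password='<your-mysql-password>'

# Confirm the secret and key exist before redeploying datahub-gms.
kubectl get secret mysql-secrets-datahub \
  -o jsonpath='{.data.mysql-root-password}' | base64 -d
```

If the secret already exists, also check that it lives in the same namespace as the datahub-gms deployment, since secret references do not cross namespaces.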
few-air-56117
01/05/2022, 2:11 PM
millions-notebook-72121
01/05/2022, 2:53 PM
fresh-memory-20741
01/06/2022, 7:25 AM
ancient-hair-10877
01/06/2022, 8:29 AM
few-air-56117
01/10/2022, 8:30 AM
wide-helicopter-97009
01/10/2022, 3:43 PM
better-orange-49102
01/11/2022, 7:16 AM
billions-receptionist-60247
01/12/2022, 4:37 AM
few-air-56117
01/12/2022, 3:47 PM
billions-receptionist-60247
01/12/2022, 7:13 PM
red-window-75368
01/17/2022, 2:18 PM
brave-businessperson-3969
01/18/2022, 1:16 PM
handsome-football-66174
01/18/2022, 7:05 PM
few-air-56117
01/19/2022, 11:24 AM
late-bear-87552
01/21/2022, 6:08 AM
billions-receptionist-60247
01/24/2022, 5:04 AM
ElasticsearchStatusException[method [HEAD], host [https://xxxxxxxxxxx.es.amazonaws.com:443], URI [/graph_service_v1?ignore_throttled=false&ignore_unavailable=false&expand_wildcards=open%2Cclosed&allow_no_indices=false], status line [HTTP/1.1 400 Bad Request]]
Any idea why I'm getting this error?
I'm using AWS Elasticsearch, version 6.4.5.
glamorous-controller-12246
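(Editor's note: the `ignore_throttled` request parameter is tied to frozen indices and is only understood by newer Elasticsearch releases; older clusters may reject it with 400 Bad Request, which would match the exception above. A hedged way to check is to ask the domain which version it is really running and reproduce the failing HEAD request by hand; the hostname below is the redacted endpoint from the error and must be replaced with your real one.)

```shell
# Ask the cluster root endpoint which Elasticsearch version it reports.
curl -s 'https://xxxxxxxxxxx.es.amazonaws.com:443/'

# Reproduce the failing index-existence check (HEAD request) and print the
# HTTP status, to see whether ignore_throttled is the parameter being rejected.
curl -s -o /dev/null -w '%{http_code}\n' -I \
  'https://xxxxxxxxxxx.es.amazonaws.com:443/graph_service_v1?ignore_throttled=false'
```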
01/24/2022, 6:02 PM
adorable-flower-19656
01/25/2022, 2:00 AM
few-air-56117
01/25/2022, 8:47 AM
https://graph.microsoft.com/v1.0/me/photo/$value
Thx 😇
bland-wolf-37286
01/25/2022, 5:55 PM
datahub-frontend/conf/routes, datahub-web-react/src/conf/Global.ts and datahub-web-react/.env to add the path prefix, then rebuild and make a Docker image off that. Is that correct?
strong-iron-17184
01/25/2022, 6:19 PM
billions-twilight-48559
01/26/2022, 11:29 AM
hallowed-airline-89779
01/27/2022, 6:48 AM
few-air-56117
01/27/2022, 9:31 AM
late-bear-87552
01/27/2022, 12:22 PM
few-air-56117
01/27/2022, 2:42 PM
gorgeous-dinner-4055
01/27/2022, 5:00 PM
ambitious-pharmacist-14608
01/28/2022, 12:00 AM
late-bear-87552
01/28/2022, 7:00 AM
05:24:01.419 [kafka-coordinator-heartbeat-thread | mce-consumer-job-client] WARN o.apache.kafka.clients.NetworkClient:969 - [Consumer clientId=consumer-mce-consumer-job-client-2, groupId=mce-consumer-job-client] Error connecting to node broker:29092 (id: 1 rack: null)
java.net.UnknownHostException: broker
at java.net.InetAddress.getAllByName0(InetAddress.java:1282)
at java.net.InetAddress.getAllByName(InetAddress.java:1194)
at java.net.InetAddress.getAllByName(InetAddress.java:1128)
at org.apache.kafka.clients.ClientUtils.resolve(ClientUtils.java:110)
at org.apache.kafka.clients.ClusterConnectionStates$NodeConnectionState.currentAddress(ClusterConnectionStates.java:403)
at org.apache.kafka.clients.ClusterConnectionStates$NodeConnectionState.access$200(ClusterConnectionStates.java:363)
at org.apache.kafka.clients.ClusterConnectionStates.currentAddress(ClusterConnectionStates.java:151)
at org.apache.kafka.clients.NetworkClient.initiateConnect(NetworkClient.java:962)
at org.apache.kafka.clients.NetworkClient.access$600(NetworkClient.java:74)
at org.apache.kafka.clients.NetworkClient$DefaultMetadataUpdater.maybeUpdate(NetworkClient.java:1135)
at org.apache.kafka.clients.NetworkClient$DefaultMetadataUpdater.maybeUpdate(NetworkClient.java:1023)
at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:548)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:262)
at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.pollNoWakeup(ConsumerNetworkClient.java:303)
at org.apache.kafka.clients.consumer.internals.AbstractCoordinator$HeartbeatThread.run(AbstractCoordinator.java:1280)
05:24:01.419 [kafka-coordinator-heartbeat-thread | mce-consumer-job-client] WARN o.apache.kafka.clients.NetworkClient:969 - [Consumer clientId=consumer-mce-consumer-job-client-2, groupId=mce-consumer-job-client] Error connecting to node broker:29092 (id: 1 rack: null)
java.net.UnknownHostException: broker
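(Editor's note: `UnknownHostException: broker` usually means the consumer received the broker's advertised listener name, `broker:29092`, in cluster metadata, but is running somewhere that internal Docker/Kubernetes hostname does not resolve. Two hedged workarounds are sketched below; `KAFKA_BOOTSTRAP_SERVER` is the variable name used by DataHub's docker setup, so verify it matches your deployment before relying on it.)

```shell
# Option 1: point the consumer at a bootstrap address that resolves from
# where the mce-consumer job actually runs.
export KAFKA_BOOTSTRAP_SERVER=localhost:9092

# Option 2: make the internal listener name 'broker' resolvable on this host.
echo '127.0.0.1 broker' | sudo tee -a /etc/hosts
```

Note that fixing only the bootstrap address is not enough if the broker keeps advertising `broker:29092` back to clients; in that case the broker's advertised listeners configuration has to expose an externally resolvable name as well.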