# troubleshoot
b
Hi all, I tried to use
datahub docker quickstart --quickstart-compose-file ./docker/quickstart/docker-compose-without-neo4j-m1.quickstart.yml
to install DataHub on a Mac M1, and I hit this issue:
Unable to run quickstart - the following issues were detected:
- elasticsearch-setup is still running
- mysql-setup is still running
Docker log: qemu: uncaught target signal 11 (Segmentation fault) - core dumped
Any ideas? Thanks
m
Hey @breezy-florist-18916, which version of the DataHub CLI are you on?
b
the latest version 0.8.16.5
m
Hmm
Can you kill all the existing containers first? What happens if you just run it without the quickstart file specified?
b
same issue
m
Hmm really surprising. @dazzling-judge-80093 any ideas?
d
Ahh, I think we missed building a multiplatform image for that one. I will check it.
@breezy-florist-18916 can you check it now? The fix was merged and we now generate a multiplatform Docker image for elasticsearch-setup, which should run on M1.
f
I got exactly the same issue on my Mac M1, with the error below. Is it because of the M1 Mac?
Unable to run quickstart - the following issues were detected:
- mysql-setup is still running
- elasticsearch-setup is still running
My DataHub version:
DataHub CLI version: 0.8.17.4
Python version: 3.9.5 (default, May  3 2021, 19:12:05) 
[Clang 12.0.5 (clang-1205.0.22.9)]
d
@fast-airplane-26839 If you are fine with recreating the local DataHub, can you try running
datahub docker nuke
and try
datahub docker quickstart
again and check if it works after that? If it still fails then can you check which containers are running:
docker ps
and if there is any error in the
elasticsearch-setup
container logs:
docker logs elasticsearch-setup
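The reset sequence above, collected into one hedged sketch (the `reset_datahub` wrapper and the `RUN=1` switch are my own additions for safety, not part of the DataHub CLI; with `RUN` unset it only prints each command):

```shell
# Hypothetical wrapper around the commands suggested above. With RUN unset it
# only echoes each command (dry run); set RUN=1 to actually execute them.
reset_datahub() {
  run() { if [ "${RUN:-0}" = "1" ]; then "$@"; else echo "+ $*"; fi; }
  run datahub docker nuke               # wipe the local DataHub deployment
  run datahub docker quickstart         # recreate the stack from scratch
  run docker ps                         # see which containers came up
  run docker logs elasticsearch-setup   # check the setup container for errors
}
reset_datahub
```

Note `datahub docker nuke` deletes your local DataHub data, so only run it if you are fine starting over.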
f
@dazzling-judge-80093 It still does not work. Here’s the docker ps output and the elasticsearch-setup container logs:
CONTAINER ID   IMAGE                                       COMMAND                   CREATED          STATUS                    PORTS                                                                                      NAMES
c4d6fbbb146e   eugenetea/schema-registry-arm64:latest      "/etc/confluent/dock…"    22 minutes ago   Up 22 minutes             0.0.0.0:8081->8081/tcp, :::8081->8081/tcp                                                  schema-registry
ecb10b40f361   linkedin/datahub-frontend-react:head        "datahub-frontend/bi…"    22 minutes ago   Up 22 minutes (healthy)   0.0.0.0:9002->9002/tcp, :::9002->9002/tcp                                                  datahub-frontend-react
8e78de28347b   kymeric/cp-kafka:latest                     "/etc/confluent/dock…"    22 minutes ago   Up 22 minutes             0.0.0.0:9092->9092/tcp, :::9092->9092/tcp, 0.0.0.0:29092->29092/tcp, :::29092->29092/tcp   broker
28a79354a064   linkedin/datahub-elasticsearch-setup:head   "/bin/sh -c 'if [ \"$…"   22 minutes ago   Up 22 minutes                                                                                                        elasticsearch-setup
f6f44272a0a1   linkedin/datahub-gms:head                   "/bin/sh -c /datahub…"    22 minutes ago   Up 22 minutes (healthy)   0.0.0.0:8080->8080/tcp, :::8080->8080/tcp                                                  datahub-gms
f753504c2b40   acryldata/datahub-mysql-setup:head          "/bin/sh -c 'dockeri…"    22 minutes ago   Up 22 minutes                                                                                                        mysql-setup
23a216cd7e1e   kymeric/cp-zookeeper:latest                 "/etc/confluent/dock…"    22 minutes ago   Up 22 minutes             2888/tcp, 0.0.0.0:2181->2181/tcp, :::2181->2181/tcp, 3888/tcp                              zookeeper
a00c09824f46   mariadb:10.5.8                              "docker-entrypoint.s…"    22 minutes ago   Up 22 minutes             0.0.0.0:3306->3306/tcp, :::3306->3306/tcp                                                  mysql
4ba8fc487b73   elasticsearch:7.9.3                         "/tini -- /usr/local…"    22 minutes ago   Up 22 minutes (healthy)   0.0.0.0:9200->9200/tcp, :::9200->9200/tcp, 9300/tcp                                        elasticsearch
docker logs elasticsearch-setup
2021/12/08 20:07:55 Waiting for: <http://elasticsearch:9200>
2021/12/08 20:07:55 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:07:56 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:07:57 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:07:58 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:07:59 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:08:00 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:08:01 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:08:02 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:08:03 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:08:04 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:08:05 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:08:06 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:08:07 Problem with request: Get <http://elasticsearch:9200>: dial tcp 172.19.0.3:9200: connect: connection refused. Sleeping 1s
2021/12/08 20:08:08 Received 200 from <http://elasticsearch:9200>

creating datahub_usage_event_policy
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": {
            "max_age": "7d"
          }
        }
      },
      "delete": {
        "min_age": "60d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   271  100    21  100   250     45    541 --:--:-- --:--:-- --:--:--   586
}{"acknowledged":true}
creating datahub_usage_event_index_template
{
  "index_patterns": ["*datahub_usage_event*"],
  "data_stream": { },
  "priority": 500,
  "template": {
    "mappings": {
      "properties": {
        "@timestamp": {
          "type": "date"
        },
        "type": {
          "type": "keyword"
        },
        "timestamp": {
          "type": "date"
        },
        "userAgent": {
          "type": "keyword"
        },
        "browserId": {
          "type": "keyword"
        }
      }
    },
    "settings": {
      "index.lifecycle.name": "datahub_usage_event_policy"
    }
  }
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   541  100    21  100   520    781  19350 --:--:-- --:--:-- --:--:-- 20807
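For reference, the "Waiting for" / "Sleeping 1s" lines in the log above come from a dockerize-style wait loop inside the setup container: poll the URL, sleep a second after each failure, and continue once a request succeeds. A hedged sketch of that behavior (`wait_for_url` is my own illustration, not the actual dockerize code):

```shell
# Illustrative retry loop, roughly what the log output above reflects:
# keep polling the URL, sleeping 1s after each failure, until it responds.
wait_for_url() {
  url=$1
  max_tries=${2:-60}
  tries=0
  until curl -fsS "$url" >/dev/null 2>&1; do
    tries=$((tries + 1))
    if [ "$tries" -ge "$max_tries" ]; then
      echo "gave up waiting for $url" >&2
      return 1
    fi
    echo "Problem with request: Get $url. Sleeping 1s"
    sleep 1
  done
  echo "Received 200 from $url"
}
```

So a long run of "connection refused ... Sleeping 1s" lines just means Elasticsearch was still starting; the loop only fails if it never comes up.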
m
@fast-airplane-26839 this seems to indicate that things worked out ok
what does
datahub docker check
say?
s
Hello, I am having exactly the same problem. Since some images cannot be built for this platform, the images that do build cannot communicate with them and stop as well. I am trying to force the correct platform via the source code, but still no success.
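For anyone trying the platform-forcing route: Docker Compose supports a per-service `platform` key, so a compose override is one place to pin it. A hedged fragment (the service/image names are taken from the docker ps output above, but the platform value is illustrative, and running under qemu emulation may still hit the segfault reported earlier in this thread):

```yaml
# Hypothetical override in the quickstart compose file: pin a service that
# lacks a native arm64 image to an explicit platform.
services:
  elasticsearch-setup:
    image: linkedin/datahub-elasticsearch-setup:head
    platform: linux/amd64
```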