gifted-diamond-19544
10/21/2022, 9:13 AM
ECS.5 ECS containers should be limited to read-only access to root filesystems
This control checks if ECS containers are limited to read-only access to mounted root filesystems. This control fails if the ReadonlyRootFilesystem parameter in the container definition of ECS task definitions is set to 'false'.
We tried to enable read-only access to root filesystems, but then the containers do not run. Is there any way we can fix this? Thank you!
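For context, the usual fix is to keep readonlyRootFilesystem enabled and mount explicit writable volumes only for the paths the container actually writes to (/tmp, cache dirs, log dirs). A minimal boto3 sketch with placeholder names, not the asker's actual task definition:

import boto3

ecs = boto3.client("ecs")
ecs.register_task_definition(
    family="my-task",  # placeholder
    containerDefinitions=[
        {
            "name": "app",          # placeholder
            "image": "my-image:1",  # placeholder
            "readonlyRootFilesystem": True,  # keeps the ECS.5 control passing
            # Writable mounts only where the process needs them, e.g. /tmp:
            "mountPoints": [
                {"sourceVolume": "scratch", "containerPath": "/tmp", "readOnly": False}
            ],
        }
    ],
    # A Docker-managed volume; it stays writable even with a read-only root.
    volumes=[{"name": "scratch"}],
)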
microscopic-mechanic-13766
10/21/2022, 11:43 AM
I used * to obtain all the things that have been created/ingested into DataHub, and I saw this thing. Is this supposed to be like this?
Thanks in advance!
microscopic-mechanic-13766
10/21/2022, 12:04 PM
There is the Asset Owners - Metadata Policy, which grants all metadata privileges ONLY to asset owners.
I have also created one group, which has 2 users inside: one with the read role and the other with the edit role.
After adding the group as the owner of a dataset, the read user is able to add glossary terms, domains, etc.
Is this correct? Shouldn't roles take precedence over policies?
best-umbrella-88325
10/21/2022, 1:36 PMsink:
type: datahub-rest
config:
server: '<http://a35f8626d7XXXXXbeec24fdaa5720-XXX.us-west-1.elb.amazonaws.com:8080/>'
source:
type: s3
config:
path_spec:
include: '<s3://XX-bkt/*.*>'
platform: s3
aws_config:
aws_access_key_id: XXXXXXX
aws_region: us-west-1
aws_secret_access_key: XXXXXXXXX
pipeline_name: 'urn:li:dataHubIngestionSource:f751376f-ec1a-4dee-a71f-7f4f96c3cdda'
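For reference, the same recipe can be run from Python through DataHub's ingestion Pipeline API; a sketch using the redacted placeholders from the recipe above (pipeline_name trimmed for brevity):

from datahub.ingestion.run.pipeline import Pipeline

pipeline = Pipeline.create(
    {
        "source": {
            "type": "s3",
            "config": {
                "path_spec": {"include": "s3://XX-bkt/*.*"},
                "platform": "s3",
                "aws_config": {
                    "aws_access_key_id": "XXXXXXX",
                    "aws_region": "us-west-1",
                    "aws_secret_access_key": "XXXXXXXXX",
                },
            },
        },
        "sink": {
            "type": "datahub-rest",
            "config": {
                "server": "http://a35f8626d7XXXXXbeec24fdaa5720-XXX.us-west-1.elb.amazonaws.com:8080"
            },
        },
    }
)
pipeline.run()
pipeline.raise_from_status()  # fail loudly if the run reported errors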
microscopic-mechanic-13766
10/24/2022, 10:59 AM
Traceback (most recent call last):
  File "/home/airflow/.local/bin/airflow", line 5, in <module>
    from airflow.__main__ import main
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/__init__.py", line 35, in <module>
    from airflow import settings
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/settings.py", line 35, in <module>
    from airflow.configuration import AIRFLOW_HOME, WEBSERVER_CONFIG, conf # NOQA F401
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 1187, in <module>
    conf.validate()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 224, in validate
    self._validate_config_dependencies()
  File "/home/airflow/.local/lib/python3.7/site-packages/airflow/configuration.py", line 267, in _validate_config_dependencies
    raise AirflowConfigException(f"error: cannot use sqlite with the {self.get('core', 'executor')}")
airflow.exceptions.AirflowConfigException: error: cannot use sqlite with the CeleryExecutor
After doing some testing, I found that the source of the errors might be that in this version SQLAlchemy is downgraded to 1.3.24. Is this done for a particular reason?
Collecting sqlalchemy==1.3.24
  Downloading SQLAlchemy-1.3.24-cp37-cp37m-manylinux2010_x86_64.whl
......
Attempting uninstall: sqlalchemy
  Found existing installation: SQLAlchemy 1.4.9
  Uninstalling SQLAlchemy-1.4.9:
    Successfully uninstalled SQLAlchemy-1.4.9
I am using Airflow 2.3.2.
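For reference, a rough sketch (not Airflow's exact code) of the dependency check that raises this exception, assuming Airflow 2.3.x option names: SQLite is only allowed with the single-process executors, so the metadata DB must point at Postgres/MySQL before CeleryExecutor will start.

from airflow.configuration import conf

executor = conf.get("core", "executor")
# In 2.3.x this option lives under [database]; older configs used [core].
db_url = conf.get("database", "sql_alchemy_conn")
if "sqlite" in db_url and executor not in ("SequentialExecutor", "DebugExecutor"):
    raise RuntimeError(f"error: cannot use sqlite with the {executor}")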
rhythmic-judge-41554
10/24/2022, 6:17 PM
no matching manifest for linux/arm64/v8 in the manifest list entries
Any chance of supporting those in the future? I can get around this myself, so this is just an FYI and a question.
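For what it's worth, one common stopgap on Apple Silicon is pulling the amd64 images under emulation; a hedged sketch, where the image tag is just an example rather than a specific recommendation:

import subprocess

# --platform forces the amd64 manifest even on an arm64 host (runs emulated).
subprocess.run(
    ["docker", "pull", "--platform", "linux/amd64", "linkedin/datahub-gms:v0.9.0"],
    check=True,
)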
full-apple-16103
10/27/2022, 1:46 PM
When I run datahub docker quickstart, I get the following:
[ec2-user@ip- ~]$ datahub version
DataHub CLI version: 0.9.0.4
Python version: 3.7.10 (default, Jun 3 2021, 00:02:01)
[GCC 7.3.1 20180712 (Red Hat 7.3.1-13)]
[ec2-user@ip- ~]$ datahub docker quickstart
No Datahub Neo4j volume found, starting with elasticsearch as graph service.
To use neo4j as a graph backend, run
datahub docker quickstart --quickstart-compose-file ./docker/quickstart/docker-compose.quickstart.yml
from the root of the datahub repo
Fetching docker-compose file https://raw.githubusercontent.com/datahub-project/datahub/master/docker/quickstart/docker-compose-without-neo4j.quickstart.yml from GitHub
Pulling docker images...
unknown shorthand flag: 'f' in -f
See 'docker --help'.
Error while pulling images. Going to attempt to move on to docker compose up assuming the images have been built locally
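For reference, the quickstart shells out to Docker Compose, and this "unknown shorthand flag" error commonly means the Docker CLI lacks the compose plugin; a small probe of which compose flavor the host actually has:

import shutil
import subprocess

# The standalone v1 binary, if any:
print("docker-compose binary:", shutil.which("docker-compose"))
# The v2 plugin ("docker compose"); stderr carries the error if it is missing.
probe = subprocess.run(["docker", "compose", "version"], capture_output=True, text=True)
print("docker compose plugin:", (probe.stdout or probe.stderr).strip())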
bitter-dog-24903
10/27/2022, 6:01 PM
SyntaxError: Unexpected token '<', " <!DOCTYPE "... is not valid JSON
bitter-dog-24903
10/27/2022, 6:02 PM
com.datahub.util.exception.ESQueryException: Search query failed:
    at com.linkedin.metadata.search.elasticsearch.query.ESSearchDAO.executeAndExtract(ESSearchDAO.java:73)
    at com.linkedin.metadata.search.elasticsearch.query.ESSearchDAO.search(ESSearchDAO.java:100)
    at com.linkedin.metadata.search.elasticsearch.ElasticSearchService.search(ElasticSearchService.java:67)
    at com.linkedin.entity.client.JavaEntityClient.search(JavaEntityClient.java:288)
    at com.datahub.authorization.PolicyFetcher.fetchPolicies(PolicyFetcher.java:50)
    at com.datahub.authorization.PolicyFetcher.fetchPolicies(PolicyFetcher.java:42)
    at com.datahub.authorization.DataHubAuthorizer$PolicyRefreshRunnable.run(DataHubAuthorizer.java:222)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Request cannot be executed; I/O reactor status: STOPPED
    at org.elasticsearch.client.RestClient.extractAndWrapCause(RestClient.java:857)
    at org.elasticsearch.client.RestClient.performRequest(RestClient.java:259)
    at org.elasticsearch.client.RestClient.performRequest(RestClient.java:246)
    at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1613)
    at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1583)
    at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1553)
    at org.elasticsearch.client.RestHighLevelClient.search(RestHighLevelClient.java:1069)
    at com.linkedin.metadata.search.elasticsearch.query.ESSearchDAO.executeAndExtract(ESSearchDAO.java:60)
    ... 13 common frames omitted
Caused by: java.lang.IllegalStateException: Request cannot be executed; I/O reactor status: STOPPED
    at org.apache.http.util.Asserts.check(Asserts.java:46)
    at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase.ensureRunning(CloseableHttpAsyncClientBase.java:90)
    at org.apache.http.impl.nio.client.InternalHttpAsyncClient.execute(InternalHttpAsyncClient.java:123)
    at org.elasticsearch.client.RestClient.performRequest(RestClient.java:255)
    ... 19 common frames omitted
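For context, "I/O reactor status: STOPPED" generally means the Elasticsearch client inside GMS has shut down after losing its connection, so a first check is whether the cluster responds at all; host and port below are the quickstart defaults, not necessarily this deployment's:

import json
import urllib.request

# Expect "green" or "yellow"; a refused connection points at Elasticsearch itself.
with urllib.request.urlopen("http://localhost:9200/_cluster/health") as resp:
    print(json.load(resp)["status"])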
high-hospital-85984
10/28/2022, 11:31 AM
linkedin/datahub-ingestion:v0.9.0
seems to be missing on Docker Hub. Is this intentional?
ancient-apartment-23316
11/03/2022, 6:25 PM
Failed to redirect to Single Sign-On provider. Please contact your DataHub Administrator, or refer to server logs for more information.
Here is my values.yaml:
datahub-frontend:
  enabled: true
  image:
    repository: linkedin/datahub-frontend-react
    tag: "v0.9.1"
  # Set up ingress to expose react front-end
  ingress:
    enabled: false
  service:
    port: 80  # Not 9002
  oidcAuthentication:
    enabled: true
    provider: okta
    clientId: "q"
    clientSecret: "q"
    oktaDomain: "https://q.com"
    baseUrl: "dev-datahub.q.com/sso"
    discoveryUrl: "q.com/.well-known/openid-configuration"
  extraEnvs:
    # - name: AUTH_OIDC_ENABLED
    #   value: "true"
    # - name: AUTH_OIDC_CLIENT_ID
    #   value: "q"
    # - name: AUTH_OIDC_CLIENT_SECRET
    #   value: "q"
    # - name: AUTH_OIDC_DISCOVERY_URI
    #   value: "https://qq.com/.well-known/openid-configuration"
    - name: AUTH_OIDC_BASE_URL
      value: "q.com/sso"
    # - name: AUTH_OIDC_SCOPE
    #   value: "openid profile email groups"
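One sanity check before digging further into the chart: fetch the OIDC discovery document the frontend needs for the redirect. A hedged Python sketch using the placeholder URL from the values above; note the real baseUrl/discoveryUrl generally need to be absolute URLs, scheme included:

import json
import urllib.request

# Placeholder from the values.yaml above; substitute the real Okta domain.
discovery_url = "https://q.com/.well-known/openid-configuration"
with urllib.request.urlopen(discovery_url) as resp:
    doc = json.load(resp)
# The frontend redirects the browser to this endpoint during login.
print(doc["authorization_endpoint"])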
green-intern-1667
11/03/2022, 6:54 PM
Error response from daemon: invalid mount config for type "bind": bind source path does not exist: /Users/my_user/.datahub/mysql/init.sql
Any clue on that? I'm just following the quickstart but have been facing this for a while.
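For reference, the quickstart compose file bind-mounts that host path into the mysql container, so a quick check is whether the file actually exists where Docker expects it:

from pathlib import Path

# The path from the error message, relative to the current user's home.
init_sql = Path.home() / ".datahub" / "mysql" / "init.sql"
print(init_sql, "->", "exists" if init_sql.exists() else "missing")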
microscopic-mechanic-13766
11/04/2022, 12:27 PM
FATAL: remaining connection slots are reserved for non-replication superuser connections
Thanks in advance!
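For reference, that FATAL means Postgres has run out of connection slots; a hedged way to see how close the database is to its limit, assuming psycopg2 and a superuser login (the connection string is a placeholder):

import psycopg2

conn = psycopg2.connect("host=localhost dbname=datahub user=postgres")
with conn.cursor() as cur:
    cur.execute("SELECT count(*) FROM pg_stat_activity;")
    used = cur.fetchone()[0]
    cur.execute("SHOW max_connections;")
    print(f"{used} connections in use, max_connections = {cur.fetchone()[0]}")
conn.close()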