bland-orange-13353
01/27/2023, 1:27 PM
python3 -m pip install --upgrade pip wheel setuptools
python3 -m pip install --upgrade acryl-datahub
datahub version
damp-greece-27806
01/27/2023, 7:18 PM
Starting upgrade with id NoCodeDataMigrationCleanup...
Executing Step 1/4: UpgradeQualificationStep...
Found qualified upgrade candidate. Proceeding with upgrade...
Completed Step 1/4: UpgradeQualificationStep successfully.
Executing Step 2/4: DeleteLegacyAspectRowsStep...
Completed Step 2/4: DeleteLegacyAspectRowsStep successfully.
Executing Step 3/4: DeleteLegacyGraphRelationshipStep...
Failed to delete legacy data from graph: java.lang.ClassCastException: class com.linkedin.metadata.graph.elastic.ElasticSearchGraphService cannot be cast to class com.linkedin.metadata.graph.neo4j.Neo4jGraphService (com.linkedin.metadata.graph.elastic.ElasticSearchGraphService and com.linkedin.metadata.graph.neo4j.Neo4jGraphService are in unnamed module of loader org.springframework.boot.loader.LaunchedURLClassLoader @7ca48474)
Failed to delete legacy data from graph: java.lang.ClassCastException: class com.linkedin.metadata.graph.elastic.ElasticSearchGraphService cannot be cast to class com.linkedin.metadata.graph.neo4j.Neo4jGraphService (com.linkedin.metadata.graph.elastic.ElasticSearchGraphService and com.linkedin.metadata.graph.neo4j.Neo4jGraphService are in unnamed module of loader org.springframework.boot.loader.LaunchedURLClassLoader @7ca48474)
Failed Step 3/4: DeleteLegacyGraphRelationshipStep. Failed after 1 retries.
Exiting upgrade NoCodeDataMigrationCleanup with failure.
Upgrade NoCodeDataMigrationCleanup completed with result FAILED. Exiting...
I'm not fully understanding this bit, as it seems specific to Neo4j, and I don't know why that step would or should be running for setups that aren't using it. Thanks for your help in advance.
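A hedged thought rather than a confirmed answer: in the Docker-based deployments the graph backend used by GMS and by the datahub-upgrade job is selected through the GRAPH_SERVICE_IMPL environment variable, so it may be worth checking what the upgrade container actually received (the container name below is illustrative):
# Assumes a docker-compose style deployment; substitute your upgrade container's name.
docker exec <datahub-upgrade-container> env | grep GRAPH_SERVICE_IMPL
# For an Elasticsearch-only setup one would expect: GRAPH_SERVICE_IMPL=elasticsearch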
ambitious-notebook-45027
01/28/2023, 9:40 AM
WHZ-Authentication {
  com.sun.security.auth.module.LdapLoginModule sufficient
    java.naming.security.authentication="simple"
    userProvider="ldap://192.168.3.75:389"
    authIdentity="cn={USERNAME},dc=lenovoedu,dc=cn"
    userFilter="(&(objectClass=inetOrgPerson)(uid={USERNAME}))"
    debug="true"
    useSSL="false";
};
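As a sketch of a sanity check outside DataHub (the bind user and base DN below are placeholders): the DN pattern and filter above can be exercised directly against the LDAP server with ldapsearch before being wired into jaas.conf.
# Prompts for the bind user's password; the filter mirrors the userFilter above.
ldapsearch -H ldap://192.168.3.75:389 \
  -D "cn=<some-user>,dc=lenovoedu,dc=cn" -W \
  -b "dc=lenovoedu,dc=cn" \
  "(&(objectClass=inetOrgPerson)(uid=<some-user>))"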
brainy-piano-85560
01/29/2023, 1:12 PM
bland-orange-13353
01/29/2023, 1:12 PM
python3 -m pip install --upgrade pip wheel setuptools
python3 -m pip install --upgrade acryl-datahub
datahub version
elegant-state-4
01/29/2023, 10:27 PM
./gradlew :datahub-frontend:dist -x yarnTest -x yarnLint
[16:15:15] Generate [started]
[16:15:16] Generate [completed]
[16:15:16] Generate src/types.generated.ts [completed]
[16:15:16] Load GraphQL documents [completed]
[16:15:16] Generate [started]
[16:15:17] Generate [completed]
[16:15:17] Generate to src/ (using EXPERIMENTAL preset "near-operation-file") [completed]
[16:15:17] Generate outputs [completed]
Creating an optimized production build...
Browserslist: caniuse-lite is outdated. Please run:
npx browserslist@latest --update-db
Why you should do it regularly:
https://github.com/browserslist/browserslist#browsers-data-updating
Failed to compile.
/Users/eyomi/datahub/datahub-web-react/src/App.tsx
TypeScript error in /Users/eyomi/datahub/datahub-web-react/src/App.tsx(109,10):
'ThemeProvider' cannot be used as a JSX component.
Its instance type 'Component<ThemeProviderProps<DefaultTheme, DefaultTheme>, any, any>' is not a valid JSX element.
The types returned by 'render()' are incompatible between these types.
Type 'React.ReactNode' is not assignable to type 'import("/Users/eyomi/datahub/node_modules/@types/react/index").ReactNode'. TS2786
107 |
108 | return (
> 109 | <ThemeProvider theme={dynamicThemeConfig}>
| ^
110 | <Router>
111 | <Helmet>
112 | <title>{dynamicThemeConfig.content.title}</title>
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
error Command failed with exit code 1.
> Task :datahub-web-react:yarnQuickBuild FAILED
FAILURE: Build failed with an exception.
Any idea what could be causing this issue?
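One possible angle, offered as an assumption rather than a diagnosis: TS2786 on ThemeProvider is frequently caused by two different copies of @types/react being resolved, and the error above references datahub/node_modules/@types/react rather than the copy under datahub-web-react. A quick way to check:
# Lists every resolved version of @types/react and which packages pull each one in.
cd datahub-web-react
yarn why @types/react
If more than one version shows up, pinning a single version with a yarn "resolutions" entry in package.json is a common workaround; the exact version to pin would need to be confirmed against the repo.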
average-dinner-25106
01/30/2023, 7:23 AM
many-solstice-66904
01/30/2023, 9:22 AM
I'm using the ./docker/dev.sh script to get started on local development. When I do this I run into the following issue:
+ docker-compose -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.dev.yml pull
[+] Running 9/12
⠿ zookeeper Pulled 1.2s
⠿ broker Pulled 1.2s
⠿ elasticsearch-setup Warning 1.5s
⠿ kafka-setup Warning 1.7s
⠿ mysql Pulled 1.2s
⠿ datahub-actions Pulled 1.3s
⠿ mysql-setup Pulled 1.4s
⠿ datahub-frontend-react Warning 1.5s
⠿ neo4j Pulled 1.3s
⠿ datahub-gms Pulled 1.4s
⠿ schema-registry Pulled 1.4s
⠿ elasticsearch Pulled 1.2s
WARNING: Some service image(s) must be built from source by running:
docker compose build datahub-frontend-react elasticsearch-setup kafka-setup
3 errors occurred:
* Error response from daemon: manifest for linkedin/datahub-frontend-react:debug not found: manifest unknown: manifest unknown
* Error response from daemon: manifest for linkedin/datahub-elasticsearch-setup:debug not found: manifest unknown: manifest unknown
* Error response from daemon: manifest for linkedin/datahub-kafka-setup:debug not found: manifest unknown: manifest unknown
Running
COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1 DOCKER_DEFAULT_PLATFORM="$(uname -m)" docker-compose -p datahub -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.dev.yml build kafka-setup
exits with exit code 0, so that works fine. My local build also succeeds with no issues. I've had a look through the documentation and this channel, but I haven't found a similar issue, so I'm at a bit of a loss as to what to do here.
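For what it's worth, the pull output above only warns that the :debug tags are not published, and the warning itself names the images that have to be built locally; the equivalent manual step (reusing the exact flags from the command quoted above) would be:
# Builds the three images the registry cannot provide for the dev compose files.
COMPOSE_DOCKER_CLI_BUILD=1 DOCKER_BUILDKIT=1 DOCKER_DEFAULT_PLATFORM="$(uname -m)" \
  docker-compose -p datahub -f docker-compose.yml -f docker-compose.override.yml -f docker-compose.dev.yml \
  build datahub-frontend-react elasticsearch-setup kafka-setup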
fresh-cricket-75926
01/30/2023, 11:17 AM
delightful-orange-22738
01/30/2023, 12:34 PM
helm upgrade --install datahub datahub/datahub --values charts/datahub/values.yaml --version 0.2.128
LogIn.tsx:85 POST http://datahub.bdp-prod/logIn 500 (Internal Server Error)
fierce-garage-74290
01/30/2023, 1:00 PM
delightful-orange-22738
01/30/2023, 1:26 PM
datahub/datahub with full admin rights 😢
helm upgrade --install datahub datahub/datahub --values charts/datahub/values.yaml --version 0.2.128
cuddly-plumber-64837
01/30/2023, 2:51 PM
bland-orange-13353
01/30/2023, 2:51 PM
python3 -m pip install --upgrade pip wheel setuptools
python3 -m pip install --upgrade acryl-datahub
datahub version
early-student-2446
01/30/2023, 3:59 PM
bulky-jackal-3422
01/30/2023, 4:16 PM
Invalid version: '[<current_datetime>] {_plugin.py:349} INFO - Patching datahub policy 2.4.3'
I'm trying to use apache-airflow==2.4.3, apache-airflow-providers-amazon==6.0.0, and acryl-datahub-airflow-plugin==0.9.6.2. I've configured the gms_host in the datahub utility, but I haven't set the connection on the Airflow side because I would normally do this through the UI. Is there a way for me to set the configuration in my meltano.yml? I've tried setting datahub_conn_id under config.core, but no luck. Any idea what policy this error is referring to?
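A hedged sketch for the connection part only (it does not explain the 'Invalid version' error): the connection that the plugin looks up by datahub_conn_id can also be created with the Airflow CLI instead of the UI, which may be easier to drive from a Meltano setup. The connection id and host below are illustrative.
# Creates the DataHub REST connection that the plugin resolves by its conn id.
airflow connections add 'datahub_rest_default' \
  --conn-type 'datahub_rest' \
  --conn-host 'http://datahub-gms:8080'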
numerous-ram-92457
01/30/2023, 6:44 PM
fierce-garage-74290
01/30/2023, 11:20 PM
Is there a way to indicate in the Business Glossary File Format that a certain glossary term is deprecated? I cannot see any option in the documentation:
https://datahubproject.io/docs/generated/ingestion/sources/business-glossary/
I'd like to manage this via the repo, not the UI.
average-dinner-25106
01/31/2023, 1:47 AM
brash-helicopter-28341
01/31/2023, 8:21 AM
brash-helicopter-28341
01/31/2023, 8:23 AM
lively-jackal-83760
01/31/2023, 8:56 AM
wooden-breakfast-17692
01/31/2023, 12:49 PM
numerous-application-54063
01/31/2023, 5:07 PM
faint-painting-38451
01/31/2023, 5:14 PM
I tried /entities?action=ingest and aspects?action=ingestProposal, and both went through successfully with the reader token. The GMS is validating the token, because I tried the request with no token and then with an invalid one, and those requests both got 401 errors. I also noticed that a GraphQL mutate with the reader user was rejected; it's just the GMS calls that go through. Do the GMS endpoints check whether the actor has the permissions to be making the call?
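One possible explanation, stated as an assumption to verify against your deployment: token authentication and policy-based authorization are separate switches on GMS, and the Rest.li endpoints only enforce policies when REST API authorization is turned on. A quick check (container name illustrative):
# If this is unset or false, authenticated callers are not policy-checked on the Rest.li ingest endpoints.
docker exec datahub-gms env | grep REST_API_AUTHORIZATION_ENABLED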
creamy-machine-95935
01/31/2023, 7:58 PM
elegant-state-4
01/31/2023, 9:03 PM
gentle-lifeguard-88494
02/01/2023, 1:06 AM
# query = """mutation {
# createAccessToken(input: {type: PERSONAL, actorUrn: "urn:li:corpuser:datahub", duration: NO_EXPIRY, name: "my personal token"}) {
# accessToken
# metadata {
# id
# name
# description
# }
# }
# }
# """
Then I went ahead and added it to my recipe:
source:
  type: postgres
  config:
    # Coordinates
    host_port: localhost:5432
    database: andreasmartinson
    # Credentials
    username: ${USERNAME}
    password: ${PASSWORD}
    # Options
    database_alias: postgres
    # https://datahubproject.io/docs/authentication/introducing-metadata-service-authentication/
sink:
  type: "datahub-rest"
  config:
    server: "http://localhost:8080"
    token: ${DATAHUB_TOKEN}
I then re-ingested the metadata and passed in the token:
datahub ingest -c config/postgres_datahub_config.dhub.yaml
Then I tried this query:
import requests
query = """query search {
  search(input: {type: DATASET, query: "*", start: 0, count: 100}) {
    searchResults {
      entity {
        ... on Dataset {
          properties {
            name
            description
          }
          schemaMetadata {
            name
            fields {
              fieldPath
              nativeDataType
            }
          }
        }
      }
    }
  }
}
"""
url = 'http://localhost:9002/api/graphiql'
r = requests.post(url, json={'query': query})
print(r.status_code)
print(r.text)
I verified that the query worked using the GraphQL frontend; it's just that I keep getting a 401 permissions error. Could someone help me figure out why I might be getting a permissions error? I feel like I'm probably missing something simple here.
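A sketch of what may be missing, under two assumptions worth verifying: the request needs the personal access token in an Authorization header, and it should target a GraphQL API endpoint (GMS serves one on port 8080; the GraphiQL path on 9002 is the interactive UI rather than the API).
import os
import requests

# Assumed endpoint: the GMS GraphQL API; the frontend equivalent is typically /api/v2/graphql on :9002.
url = 'http://localhost:8080/api/graphql'
token = os.environ['DATAHUB_TOKEN']  # the same token the recipe uses

r = requests.post(
    url,
    json={'query': query},  # `query` as defined in the snippet above
    headers={'Authorization': f'Bearer {token}'},
)
print(r.status_code)
print(r.text)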
average-dinner-25106
02/01/2023, 1:44 AM
silly-accountant-1288
02/01/2023, 7:14 AM
final String multiHopTemplateDirect = "MATCH (a {urn: '%s'})-[r:%s*1..%d]->(b) WHERE b:%s RETURN a,r,b";
final String multiHopTemplateIndirect = "MATCH (a {urn: '%s'})<-[r:%s*1..%d]-(b) WHERE b:%s RETURN a,r,b";
Should they become the following?
final String multiHopTemplateDirect = "MATCH shortestPath((a {urn: '%s'})-[r:%s*1..%d]->(b)) WHERE b.urn <> '%s' AND b:%s RETURN a,r,b";
final String multiHopTemplateIndirect = "MATCH shortestPath((a {urn: '%s'})<-[r:%s*1..%d]-(b)) WHERE b.urn <> '%s' AND b:%s RETURN a,r,b";
and also remove:
// It is possible to have more than 1 path from node A to node B in the graph and previous query returns all the paths.
// We convert the List into Map with only the shortest paths. "item.get(i).size()" is the path size between two nodes in relation.
// The key for mapping is the destination node as the source node is always the same, and it is defined by parameter.
neo4jResult = neo4jResult.stream()
    .collect(Collectors.toMap(
        item -> item.values().get(2).asNode().get("urn").asString(),
        Function.identity(),
        (item1, item2) -> item1.get(1).size() < item2.get(1).size() ? item1 : item2))
    .values()
    .stream()
    .collect(Collectors.toList());
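Not an authoritative answer, but one detail worth double-checking in the proposed templates: Cypher generally wants shortestPath bound to a path variable, with the relationships pulled out of that path afterwards. A hypothetical variant keeping the original format arguments (whether dropping the Java-side dedup is then safe would still need verifying):
// Sketch only; the WHERE b.urn <> '%s' predicate from the proposal could be added back if the extra argument is passed.
final String multiHopTemplateDirect =
    "MATCH p = shortestPath((a {urn: '%s'})-[:%s*1..%d]->(b)) WHERE b:%s RETURN a, relationships(p), b";
final String multiHopTemplateIndirect =
    "MATCH p = shortestPath((a {urn: '%s'})<-[:%s*1..%d]-(b)) WHERE b:%s RETURN a, relationships(p), b";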