# integrate-databricks-datahub
You can do this. Create a cluster in Single User Mode; it connects to Unity Catalog. Then create a personal access token for that user and configure it in DataHub using the Hive connector.
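On the DataHub side, the ingestion recipe looks roughly like this. This is only a sketch assuming the standard Hive source with the `databricks+pyhive` scheme; the host, token, and sink server values are placeholders you must replace with your own:

```yaml
source:
  type: hive
  config:
    host_port: "<workspace-host>:443"   # your Databricks workspace hostname
    username: token                     # literal string "token" for PAT auth
    password: "<personal-access-token>" # the PAT created for the single user
    scheme: "databricks+pyhive"
sink:
  type: datahub-rest
  config:
    server: "http://localhost:8080"     # your DataHub GMS endpoint
```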
```
spark.databricks.sql.initial.catalog.name <unity catalog name>
```
Add this to your Spark cluster configuration and you're good to go. Make sure the user has SELECT permission on the tables. If not, run this:
```python
# Walk every catalog/database and grant the `datahub` user what it
# needs for ingestion: USAGE on each database, SELECT on each table.
catalogs = spark.sql("SHOW CATALOGS")
for catalog in catalogs.toPandas()['catalog']:
  # Skip the built-in catalogs
  if catalog in ['default', 'samples']:
    continue
  print(catalog)
  use_catalog = f"USE CATALOG {catalog}"
  print(use_catalog)
  spark.sql(use_catalog)
  dbs = spark.sql("SHOW DATABASES")
  for db in dbs.toPandas()['databaseName']:
    # USAGE is granted on every database, including scratch ones
    spark.sql(f"GRANT USAGE ON DATABASE {db} TO `datahub`")
    # ...but skip per-table SELECT grants on scratch databases
    if db in ['temp_notebooks', 'temp']:
      continue
    tables = spark.sql(f"SHOW TABLES IN {db}")
    for idx, row in tables.toPandas().iterrows():
      table = row['database'] + "." + row['tableName']
      grant_query = f"GRANT SELECT ON TABLE {table} TO `datahub`"
      print(grant_query)
      spark.sql(grant_query)
```
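As a quick sanity check of what the loop above issues, here is a small pure-Python helper that formats the same GRANT statements. No Spark session is needed, and the table names are made up purely for illustration:

```python
def grant_statements(tables, principal="datahub"):
    """Build the GRANT SELECT statement for each fully
    qualified table name, mirroring the loop above."""
    return [f"GRANT SELECT ON TABLE {t} TO `{principal}`" for t in tables]

# Hypothetical table names, just to show the output shape.
for stmt in grant_statements(["sales.orders", "sales.customers"]):
    print(stmt)
```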