# integrate-tableau-datahub

  • orange-coat-2879 (05/12/2022, 2:28 AM)
    Hello, I tried to ingest data from Tableau but got the error below. I have installed acryl-datahub[tableau] successfully. What is the real problem? Thanks
    Copy code
    Installing collected packages: tableauserverclient
    Successfully installed tableauserverclient-0.18.0
    ubuntu@ip-172-31-16-11:~$ datahub ingest -c /home/ubuntu/datahub/tableau.yml
    [2022-05-12 02:07:40,616] INFO     {datahub.cli.ingest_cli:96} - DataHub CLI version: 0.8.34.1
    [2022-05-12 02:07:40,722] ERROR    {datahub.entrypoints:165} - tableau is disabled; try running: pip install 'acryl-datahub[tableau]'
    [2022-05-12 02:07:40,722] INFO     {datahub.entrypoints:176} - DataHub CLI version: 0.8.34.1 at /home/ubuntu/.local/lib/python3.8/site-packages/datahub/__init__.py
    [2022-05-12 02:07:40,722] INFO     {datahub.entrypoints:179} - Python version: 3.8.13 (default, Apr 19 2022, 02:32:06)
    [GCC 11.2.0] at /usr/bin/python3.8 on Linux-5.15.0-1005-aws-x86_64-with-glibc2.35
    [2022-05-12 02:07:40,722] INFO     {datahub.entrypoints:182} - GMS config {'models': {}, 'versions': {'linkedin/datahub': {'version': 'v0.8.34', 'commit': '5cce3acddcb46443c748bf2eb0b1e5e53994d936'}}, 'managedIngestion': {'defaultCliVersion': '0.8.34.1', 'enabled': True}, 'statefulIngestionCapable': True, 'supportsImpactAnalysis': True, 'telemetry': {'enabledCli': True, 'enabledIngestion': False}, 'datasetUrnNameCasing': False, 'retention': 'true', 'noCode': 'true'}
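    This error usually means the tableau plugin was installed into a different Python environment than the one the datahub CLI runs from (here the CLI lives under /home/ubuntu/.local for python3.8). A minimal check-and-fix sketch, assuming an environment mismatch is the cause:
    Copy code
    # install the plugin with the same interpreter that runs the datahub CLI
    python3.8 -m pip install 'acryl-datahub[tableau]==0.8.34.1'
    # verify that the tableau source is now listed as enabled
    datahub check plugins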

  • fresh-napkin-5247 (05/12/2022, 1:33 PM)
    Hello. Is there any way I can get DataHub to scrape all the projects on Tableau Online, instead of me having to pass a list?
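    A minimal recipe sketch: leaving projects unset (or explicitly null, as in a recipe later in this thread) makes the connector walk every project instead of an allow-list. The URL and token name below are hypothetical:
    Copy code
    source:
      type: tableau
      config:
        connect_uri: 'https://tableau.example.com'  # hypothetical
        token_name: my_token                        # hypothetical
        token_value: '${TABLEAU_TOKEN}'
        projects: null  # no allow-list: ingest all projects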

  • wonderful-dream-38059 (05/30/2022, 9:44 PM)
    Hello team - the docs for the tableau integration say that Detect Deleted Entities is currently not supported. What does this mean in practice? My reading of the docs makes me think entities just persist past deletion and are never removed. If that is the case, has anyone done any design work to allow removal of stale records post-deletion? I'd be happy to help contribute if not.
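    For later readers: removal of stale records is handled by DataHub's stateful ingestion. On sources that support it (the Tableau source gained support after this message was written; see the stateful-ingestion question further down this thread), the recipe shape is roughly this sketch:
    Copy code
    source:
      type: tableau
      config:
        stateful_ingestion:
          enabled: true
          remove_stale_metadata: true  # soft-delete entities missing from the latest run
    pipeline_name: tableau_prod  # state is keyed by pipeline name; hypothetical value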

  • wonderful-dream-38059 (06/13/2022, 12:22 PM)
    Me again 🙂. In testing the tableau connector more, I'm getting a big memory explosion. A large Snowflake or dbt ingestion job comfortably runs in a container with ~1GB of memory, but my Tableau ingestion job is still getting OOM-killed at 16GB of memory! Before I go down a big debugging hole - has anyone else seen very high memory usage when running the tableau ingestion source? For any of the people who wrote the original, any hints on where the issue might be would be very helpful - otherwise I'll start diving into this one myself. (I'll work on this before I pick up any of the deleted-entities stuff I mentioned above - I need to get the job to complete before I start upgrading it! 😄)

  • faint-advantage-18690 (07/12/2022, 8:00 AM)
    Hi all, I am trying to get the lineage of one of my workbooks, but it seems the Tableau lineage is not linked to the BigQuery table even though it uses one as a source. What I expect is: BigQuery table -> Published data source -> Embedded data source -> Charts. But what I get is: Published data source -> Embedded data source -> Charts.

  • purple-analyst-83660 (07/18/2022, 7:53 AM)
    Hi all, I am trying to ingest metadata corresponding to a project. I first get a NODE_LIMIT_EXCEEDED error; when I try to include page_size: 5, I still get the error. Can anybody help? (I have attached the config YAML that I am using.)
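    A sketch of where page_size sits in the recipe; lowering it reduces how many objects each Tableau Metadata API request returns, keeping responses under the node limit:
    Copy code
    source:
      type: tableau
      config:
        connect_uri: 'https://tableau.example.com'  # hypothetical
        page_size: 5  # fewer objects per page keeps each response under the node limit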

  • careful-insurance-60247 (08/10/2022, 3:00 PM)
    I used to be able to see lineage to my mssql boxes but now I only see tableau datasets. Did this functionality change?

  • magnificent-lawyer-97772 (08/25/2022, 2:03 PM)
    Hi folks, I am not sure whether this is the correct channel, but with some colleagues we are thinking of implementing some improvements to the Tableau connector. Namely, we want to add the Platform Instance to the connector. Our idea would be for the platform instance to represent a Tableau site, so a 1:1 relationship between them. What do folks think?
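    A sketch of the proposed mapping, with one platform_instance per Tableau site (the names below are hypothetical):
    Copy code
    source:
      type: tableau
      config:
        connect_uri: 'https://tableau.example.com'  # hypothetical
        site: my_site
        platform_instance: my_site  # 1:1 with the Tableau site, per the proposal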

  • modern-artist-55754 (08/29/2022, 4:01 PM)
    I’m facing some issues with the node limit being exceeded. I noticed a few things:
    • PublishedDatasourcesConnection and CustomSQLTablesConnection don’t have page_size implemented the way the workbooks connection does. https://github.com/datahub-project/datahub/blob/7e15947a372f6f627f29f5a1c783383d49[…]daf6/metadata-ingestion/src/datahub/ingestion/source/tableau.py
    • The workbooksConnection is a little complex (I have some complex workbooks, and even with page_size = 1 it still exceeds the node limit). I think we can refactor the EmbeddedDatasourcesConnection into a separate call like PublishedDatasourcesConnection (at least it seems to help with my issue, although I still have some issues I haven’t worked out yet). https://github.com/datahub-project/datahub/blob/7e15947a372f6f627f29f5a1c783383d49[…]tadata-ingestion/src/datahub/ingestion/source/tableau_common.py

  • witty-butcher-82399 (08/31/2022, 6:16 AM)
    Before starting the implementation 😅, any feedback on the discussion in this thread?

  • cuddly-butcher-39945 (09/26/2022, 11:40 PM)
    Hey Team, I am trying to ingest Tableau Metadata and running into some snags. I've created a personal access token and my recipe looks like this...
    Copy code
    source:
      type: tableau
      config:
        # Coordinates
        connect_uri: 'https://tableautest/#/home'
        site:
        projects: ["HOSPITALS"]
    
        # Credentials
        token_name: JGTest
        token_value: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
    
        # Options
        ingest_tags: True
        ingest_owner: True
        default_schema_map:
          mydatabase: public
          anotherdatabase: anotherschema
    
    sink:
      type: datahub-rest
      config:
        server: 'http://datahub-gms:8080'
    I have read through the debug log but have not found anything meaningful, other than the generic message at the bottom: ConnectionError: HTTPConnectionPool(host='datahub-gms', port=8080): Max retries exceeded with url: /config (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f14da1cdf50>: Failed to establish a new connection: [Errno -2] Name or service not known')). I've also attached my debug log. Thanks!
    TableauIngestionDebugTrace.txt
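    The "Name or service not known" error means the machine running the CLI cannot resolve the hostname datahub-gms, which normally only exists inside the quickstart Docker network. A sink sketch for running the CLI from the host, assuming the default quickstart port mapping:
    Copy code
    sink:
      type: datahub-rest
      config:
        server: 'http://localhost:8080'  # the GMS port as mapped to the host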

  • gifted-diamond-19544 (09/27/2022, 1:29 PM)
    Hello. I am currently facing a problem in my Tableau ingestion pipeline, where I am getting the NODE_LIMIT_EXCEEDED warning. I am making the ingestion from the UI. I have set the page_size to 1, as instructed in the docs; however, I am still getting the error. So instead of trying to ingest all the Tableau projects in the same pipeline, I created several pipelines with just a subset of the projects each, and scheduled them with a few minutes' offset. This seems to be working, but it is kind of cumbersome. I think it would be great to add an option to the Tableau ingestion recipe that specifies a time interval between the extraction of each Tableau project. I tried this using the Python emitter (basically I put a sleep statement between the extraction of each project), and it solved the problem. However, that approach doesn't use the UI, and I don't see an easy way to achieve the same thing there. Does anyone have a solution for this problem when making the ingestion via the UI? Thank you!

  • average-dusk-91249 (09/30/2022, 4:46 PM)
    Hi, new to DataHub and excited to play around with it! We use Snowflake + Tableau and I was curious if there's a way for DataHub to extract the dashboard lineage in a way that can be connected back to the Snowflake ingestion (if that makes sense). Tableau ingestion stores embedded Snowflake datasource names in the form database.schema.table_name. Separately, Snowflake ingestion has a hierarchical structure of database > schema > table_name. My assumption is that connecting the lineage between something like Tableau Dataset: mydb.myschema.mytable and Snowflake Dataset: mydb > myschema > mytable would not show up in lineage graphs automatically. For Tableau ingestion, I did see a section for default_schema_map in the YAML settings. I don't know if something would need to change there to make a scenario like this work.
    Copy code
    source:
        type: tableau
        config:
            ingest_owner: true
            default_schema_map:
                mydatabase: public
                anotherdatabase: anotherschema
            connect_uri: 'https://tableau.site.com'
            password: '${tableau_password}'
            ingest_tags: true
            username: tableau_username
            projects: null
    pipeline_name: 'blah_blah'
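    Lineage links up when both sources produce the same dataset URN. A sketch of the URN the Tableau source emits for a Snowflake upstream, which must match what the Snowflake source ingested (both lowercase the name; PROD is the default environment):
    Copy code
    urn:li:dataset:(urn:li:dataPlatform:snowflake,mydb.myschema.mytable,PROD)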

  • average-dusk-91249 (10/03/2022, 6:12 PM)
    A separate question/scenario: I attempted to run a full import of our Tableau metadata, setting projects to null to bring in everything. The run(s) were successful, but the ingestion appears to have missed published data sources that are not being used by workbooks/dashboards. Is it possible to bring in all Tableau assets, including published data sources without downstream workbooks?

  • full-engineer-98290 (10/24/2022, 9:31 PM)
    Hey there, I am facing a couple of issues with Tableau metadata. I started noticing these issues after upgrading to v0.9.0, but I'm not 100% sure they are related to the upgrade:
    1. Prior to the upgrade I was able to see many reports, dashboards and datasets in Tableau, but now I can see only a very few of these.
    2. Also prior to the upgrade I was able to see the lineage between Tableau datasets and Redshift datasets, but now I can't see that.
    While running the ingestion recipe, I noticed a message on the console: "Skipping upstream table <<...uid>> from lineage since its name is none". Attaching the recipe file, Python ingestion code and console message for reference. We have a demo coming up later this week, and Tableau metadata and Redshift-Tableau lineage were two of the key capabilities we wanted to demonstrate. Help and guidance will be greatly appreciated. Thanks.

  • chilly-truck-63841 (10/26/2022, 9:30 PM)
    Hi team - does the tableau connector support stateful ingestion? I see this comment from a month ago suggesting it doesn't, and I see an open feature request, but I also see options in the config details for the connector that suggest it does.

  • orange-intern-2172 (03/13/2023, 10:03 AM)
    Does anyone here know how to stop the Tableau tokens from expiring? I need something that is permanent... I'm using the Personal access token under my profile.

  • acoustic-quill-54426 (03/13/2023, 4:11 PM)
    Cross-posting for visibility. We would be happy to go over it with you if needed. For context, this affects 552/1141 custom SQL tables at my company. We have a script that uses the Tableau metadata catalog API to fetch the queries, parses them, and emits the lineage using a DataHub mutation. It would be great if we could have this integrated with the ingestion!
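    A minimal sketch of the kind of mutation such a script emits, using the DataHub Python emitter (the URNs and server address below are hypothetical):
    Copy code
    from datahub.emitter.mce_builder import make_dataset_urn
    from datahub.emitter.mcp import MetadataChangeProposalWrapper
    from datahub.emitter.rest_emitter import DatahubRestEmitter
    from datahub.metadata.schema_classes import UpstreamClass, UpstreamLineageClass

    emitter = DatahubRestEmitter(gms_server="http://localhost:8080")

    # upstream table recovered by parsing the custom SQL (hypothetical names)
    upstream = UpstreamClass(
        dataset=make_dataset_urn("snowflake", "mydb.myschema.mytable"),
        type="TRANSFORMED",
    )

    # attach the upstream-lineage aspect to the Tableau custom SQL dataset
    mcp = MetadataChangeProposalWrapper(
        entityUrn=make_dataset_urn("tableau", "custom_sql_table_id"),
        aspect=UpstreamLineageClass(upstreams=[upstream]),
    )
    emitter.emit(mcp)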

  • acoustic-quill-54426 (03/13/2023, 5:22 PM)
    I think we introduced a breaking change here: Validation error of type FieldUndefined: Field 'projectLuid' in type 'Workbook' is undefined @ 'workbooksConnection/nodes/projectLuid'. It affects all Tableau versions prior to 2022.3: https://help.tableau.com/current/api/metadata_api/en-us/docs/meta_api_release_notes.html

  • lively-jackal-83760 (05/24/2023, 11:11 AM)
    Hi guys, I noticed some weird behavior. I used the Tableau source with the stateful_ingestion option. The first time, everything was fine and all the necessary entities were created. The second time, my Tableau server was down and DataHub's client couldn't connect. But the ingestion didn't stop: it logged an HTTP connection error and then carried on with the stateful_ingestion logic. With no connection there was no data at all, so the client decided to remove all my Tableau data from the first run 🙂. I guess this isn't expected behavior. What do you think? Or is there some option to prevent this?
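    Newer CLI versions guard against exactly this failure mode with a fail-safe threshold on stale-entity removal; a recipe sketch, assuming the option exists in your version:
    Copy code
    source:
      type: tableau
      config:
        stateful_ingestion:
          enabled: true
          remove_stale_metadata: true
          fail_safe_threshold: 20.0  # abort removal if more than 20% of entities would be deleted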

  • happy-belgium-57206 (06/06/2023, 2:08 PM)
    Hi team - I have Tableau metadata available in the file system (CSV format). Is there an API which will consume this metadata and update the DataHub backend storage (MySQL, ES, Kafka)? Or any other approach to achieve the above goal? Thanks, Dharmendra

  • miniature-painter-94073 (07/10/2023, 1:10 PM)
    Hi all, quick Q - I'm super new to DataHub, just hosting locally on Docker atm. I've set up some Tableau ingestions via the GUI; can you programmatically ingest every site on your server with code in the backend?

  • numerous-address-22061 (07/13/2023, 12:50 AM)
    I ran our Tableau ingestion and the entities are in and look good; however, it seems that none of the Embedded Data Sources were able to link upstream to their Snowflake tables. Is this not an out-of-the-box feature of the Tableau ingestion? Almost every dashboard connects to Snowflake tables and we are hoping to see the lineage.

  • numerous-address-22061 (07/15/2023, 12:32 AM)
    ^ Seems to be close to working; however, I am now noticing that a bunch of datasets were generated by the Tableau ingestion. These seem to be parsed from the Custom SQL, but the databases are off: because the Custom SQL doesn't use a fully qualified table name (e.g. select * from analytics.table1;), DataHub generates a dataset called analytics.analytics.table1 and can't figure out how to connect that Custom SQL to the actual Snowflake table, which is database1.analytics.table1. How can I help it along here? I'd rather it not map upstream at all than just guess and generate a dataset that sits in DataHub. Ideally I'd get it to map back to its actual Snowflake table (which is already in DataHub).

  • fast-xylophone-28117 (08/02/2023, 7:28 PM)
    Hello everyone, we are trying to ingest data from an on-premise Tableau server. We made sure the user has all the permissions specified in the Tableau prerequisites here: https://datahubproject.io/docs/quick-ingestion-guides/tableau/setup, but we are not able to ingest metadata yet using a UI-based ingestion recipe/scheduler. We keep getting the error below. We know for sure the username and password are correct; does anyone have an idea how to resolve this SSL certificate verification error?
    Copy code
    "tableau-login": [
              "Unable to login (check your Tableau connection and credentials): HTTPSConnectionPool(host='172.25.160.82', port=443): Max retries exceeded with url: /api/2.4/auth/signin (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:997)')))"
    With the same user, we can log in from the UI fine. We also confirmed the firewall is not blocking anything; curl to the Tableau server IP address runs fine from the datahub-actions pod. On the other hand, when I use the exact same recipe on the backend and run a CLI-based ingestion manually, I get past this error and hit something else (some charts, tags, and projects are ingested while some fail with this error):
    Copy code
    {
      "error": "Unable to emit metadata to DataHub GMS: java.lang.RuntimeException: Unknown aspect browsePathsV2 for entity container",
      "info": {
        "exceptionClass": "com.linkedin.restli.server.RestLiServiceException",
        "message": "java.lang.RuntimeException: Unknown aspect browsePathsV2 for entity container",
        "status": 500,
        "id": "urn:li:container:00eafb6262a384f1fc4e9582f576ba3d"
      }
    }
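    Two separate problems are visible here. For the certificate failure, the Tableau source exposes an ssl_verify option that can point at a CA bundle containing the self-signed chain (or be set to false for a quick test); the path below is hypothetical. The browsePathsV2 failure usually means the ingestion CLI is newer than the GMS server, which does not yet know that aspect; aligning the CLI version with the server version resolves it.
    Copy code
    source:
      type: tableau
      config:
        connect_uri: 'https://172.25.160.82'
        ssl_verify: '/etc/ssl/certs/tableau-ca.pem'  # hypothetical CA bundle path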

  • numerous-address-22061 (08/09/2023, 4:10 PM)
    Hello, every couple of days or so I see that the Tableau ingestion failed due to this error. It seems pretty random; has anyone seen this before?

  • numerous-address-22061 (08/17/2023, 5:13 PM)
    I hate to paste this big a trace into the chat, but our Tableau ingestion has been failing for the last week, ever since we upgraded... has anyone seen a similar error?
    Copy code
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO - [2023-08-17, 05:06:24 PDT] ERROR    {datahub.entrypoints:199} - Command failed: 'NoneType' object has no attribute 'get'
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO - Traceback (most recent call last):
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/entrypoints.py", line 186, in main
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     sys.exit(datahub(standalone_mode=False, **kwargs))
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1130, in __call__
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     return self.main(*args, **kwargs)
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1055, in main
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     rv = self.invoke(ctx)
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     return _process_result(sub_ctx.command.invoke(sub_ctx))
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1657, in invoke
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     return _process_result(sub_ctx.command.invoke(sub_ctx))
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/click/core.py", line 1404, in invoke
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     return ctx.invoke(self.callback, **ctx.params)
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/click/core.py", line 760, in invoke
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     return __callback(*args, **kwargs)
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/click/decorators.py", line 26, in new_func
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     return f(get_current_context(), *args, **kwargs)
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 448, in wrapper
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     raise e
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/telemetry/telemetry.py", line 397, in wrapper
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     res = func(*args, **kwargs)
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/utilities/memory_leak_detector.py", line 95, in wrapper
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     return func(ctx, *args, **kwargs)
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 198, in run
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     ret = loop.run_until_complete(run_ingestion_and_check_upgrade())
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     return future.result()
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 182, in run_ingestion_and_check_upgrade
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     ret = await ingestion_future
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 140, in run_pipeline_to_completion
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     raise e
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/cli/ingest_cli.py", line 132, in run_pipeline_to_completion
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     pipeline.run()
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/run/pipeline.py", line 367, in run
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     for wu in itertools.islice(
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 119, in auto_stale_entity_removal
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     for wu in stream:
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 143, in auto_workunit_reporter
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     for wu in stream:
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 208, in auto_browse_path_v2
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     for urn, batch in _batch_workunits_by_urn(stream):
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 346, in _batch_workunits_by_urn
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     for wu in stream:
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 156, in auto_materialize_referenced_tags
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     for wu in stream:
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/api/source_helpers.py", line 70, in auto_status_aspect
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     for wu in stream:
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/tableau.py", line 2590, in get_workunits_internal
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     yield from self.emit_sheets()
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/tableau.py", line 2028, in emit_sheets
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     yield from self.emit_sheets_as_charts(
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/tableau.py", line 2107, in emit_sheets_as_charts
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     project_luid: Optional[str] = self._get_workbook_project_luid(workbook)
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -   File "/usr/local/lib/python3.10/site-packages/datahub/ingestion/source/tableau.py", line 1438, in _get_workbook_project_luid
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO -     if wb.get(tableau_constant.LUID) and self.workbook_project_map.get(
    [2023-08-17, 05:06:24 PDT] {{pod_manager.py:235}} INFO - AttributeError: 'NoneType' object has no attribute 'get

  • numerous-address-22061 (08/17/2023, 5:14 PM)
    Related to workbook_project_map not being set to a value?

  • brainy-musician-50192 (08/22/2023, 8:54 AM)
    Hi guys, I ingested metadata from Snowflake and Tableau, running on the most recent CLI (0.10.5). What is already amazing is that lineage between Snowflake tables/views and Tableau data source objects was picked up. However, there is no column-level lineage between Snowflake and Tableau. Is this expected behavior? I'm asking because one of the config options is:
    Copy code
    extract_column_level_lineage (boolean, default: true):
    When enabled, extracts column-level lineage from Tableau Datasources
    Does this mean lineage between a Tableau data source and a Tableau chart, rather than between an external table/view and a Tableau data source? If so, are there any future plans to add column lineage between Snowflake and Tableau?

  • strong-author-11562 (08/31/2023, 9:45 PM)
    Hi, I am getting this error when trying to ingest metadata from Tableau:
    log.txt