# advice-data-ingestion
  • a

    Aaron Bell

    05/10/2022, 4:36 PM
Can Airbyte send data to Google Sheets?
  • h

    hendrik

    05/10/2022, 8:16 PM
Hey all, I have a few connections into BigQuery and I'm thinking about changing the Loading Method from "Standard Uploads" to "GCS Staging" based on the recommendations in the docs. It seems like an obvious choice given what's written there, but I'm a bit worried about migrating since I have quite a few dependencies on the existing connections. If I change the loading method, will this affect the destination schemas at all? I'm using the Incremental | Append sync mode and want to ensure continuity, or at least plan for any potential breaks. I couldn't find anything in the docs about changing from one to the other.
  • s

    Saad Anwar

    05/10/2022, 9:34 PM
    hello
  • r

    Romain LOPEZ

    05/11/2022, 6:58 PM
Hi Team, @Vasyl Lazebnyk, I experienced an issue loading a file encoded with ISO-8859-1 (seems similar to https://github.com/airbytehq/airbyte/issues/9059#issuecomment-1005754619). My source is a CSV file on an SFTP server. The first row of the file contains the character é, which is encoded as 0xE9. I set my CSV read option as {"encoding":"latin-1"}, but it seems the encoding option is not used:
But I'm still seeing the issue in the log:
    2022-05-11 18:52:14 source > Failed to read data of PERMONLY at <scp://Fuze_BI_PermOnly.CSV>: UnicodeDecodeError('utf-8', b'"PERM_INV_NO","INVOICE_DATE","CLIENT_NO","CLIENT_NAME","PO","CANDIDATE_NAME","MANDATE","SALARY","BILLED_AMT","NET_MARGIN_PCT","TERRITORY","SALES_REP","SALES_REP_NAME","SALES_REP2","SALESREP2_NAME","SALES_PAID","RECRUITER","RECRUITER_NAME","RECRUITER_PAID","RECRUITER_2","RECRUITER_2_NAME","NOTES"\r\n"043573","2019/01/15","01022","AESP GROUP                    ","               ","Vincent Godard                "," Op\xe9rateur De Machine CNC BEAM","        .00","    1365.00","     .00","QR","ME ","6443-CHENIER, MARIE-EVE       ","   ","                              ","  ","PB","Pamela Badran                 ","  "\r\n', 416, 417, 'invalid continuation byte')
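A minimal sketch of the decode behaviour described above, using a tiny stand-in file (not the actual Fuze_BI_PermOnly.CSV): a 0xE9 byte fails under UTF-8 with exactly this "invalid continuation byte" error but decodes fine as latin-1, so the symptom is consistent with the reader option not being passed through to the underlying CSV reader.

import io
import pandas as pd

# Stand-in bytes containing 0xE9 (é in ISO-8859-1); not the real file contents.
raw = b'"NAME"\r\n"Op\xe9rateur De Machine CNC BEAM"\r\n'

try:
    raw.decode("utf-8")
except UnicodeDecodeError as err:
    print("utf-8 fails:", err)  # same "invalid continuation byte" as in the log

# Reading with the latin-1 / ISO-8859-1 encoding succeeds.
df = pd.read_csv(io.BytesIO(raw), encoding="latin-1")
print(df)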
  • c

    Cody K.

    05/11/2022, 9:58 PM
Hi All, Using CDC Incremental | Deduped + History (Raw data - no Normalization) for the MSSQL connector. I'm getting multiple rows for a single key. My understanding was that dedupe would look at the underlying table's primary key and insert/update/delete the record based off that. I tried to test with Basic Normalization; however, I'm getting problems running it. Do I need to do my own max date on the record using the PK to dedupe, since Airbyte is keeping the History? Does this get cleaned up in Basic Normalization? Here's an example (CARRIERID & VEHICLEID are the PK):
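A hedged sketch of the "own max date per PK" dedupe mentioned above (not the example referenced in the message): keep only the most recent record per primary key from the history-keeping output. The column name _airbyte_emitted_at is an assumption based on Airbyte's raw-table conventions, and the sample values are invented.

import pandas as pd

# Toy stand-in for the raw/history table; values are invented.
raw = pd.DataFrame({
    "CARRIERID": [1, 1, 2],
    "VEHICLEID": [10, 10, 20],
    "STATUS": ["old", "new", "only"],
    "_airbyte_emitted_at": pd.to_datetime(["2022-05-10", "2022-05-11", "2022-05-11"]),
})

# Keep the latest row per (CARRIERID, VEHICLEID) - the "max date per PK" approach.
latest = (
    raw.sort_values("_airbyte_emitted_at")
       .drop_duplicates(subset=["CARRIERID", "VEHICLEID"], keep="last")
)
print(latest)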
  • k

    Kavin Rajagopal

    05/12/2022, 8:01 PM
    Hi team! I have 2 GA sources going to one Postgres destination. How do I make sure the second source appends to the first source while writing into the Postgres database?
  • t

    Tony Mao

    05/13/2022, 5:48 AM
Hi, after I upgraded to 0.38.2-alpha, the Postgres source connector to 0.4.14, and the Redshift destination connector to 0.3.33, Airbyte is no longer able to detect the source schema for my Azure Postgres database. I tried re-creating the source (all connection tests pass) and restarting the Docker image, but it is still not working. For some reason, my MySQL connections to the same destination work fine. I am able to see the source schema normally in a database explorer tool (DataGrip). I have also reset the data successfully.
  • c

    Cédric Malet

    05/13/2022, 9:35 AM
    Hello, is it possible to use MariaDB as a source? I only see the MariaDB connector as a destination 😕
    ✅ 1
    🙏 1
  • n

    Nikita Kotlyarov

    05/13/2022, 10:36 AM
    Hi there! I have just noticed that I have started getting the following error messages in BigQuery destination:
    2022-05-13 00:35:25 destination > java.lang.RuntimeException: com.google.cloud.bigquery.BigQueryException: 400 Bad Request
    2022-05-13 00:35:25 destination > PUT <https://www.googleapis.com/upload/bigquery/v2/projects/supabase-analytics-internal-eu/jobs?uploadType=resumable&upload_id=ADPycdsAeFJDZQbZn8o92WF15WQX5s7B88ktLx8igZO2_gDeWi-6t07ajNEkGufqowRKhjuLnoQpIFLeQ2XeoAShTeLWnqhTA58T>
    2022-05-13 00:35:25 destination > {
    2022-05-13 00:35:25 destination >   "error": {
    2022-05-13 00:35:25 destination >     "code": 400,
    2022-05-13 00:35:25 destination >     "message": "Invalid credential",
    2022-05-13 00:35:25 destination >     "errors": [
    2022-05-13 00:35:25 destination >       {
    2022-05-13 00:35:25 destination >         "message": "Invalid credential",
    2022-05-13 00:35:25 destination >         "domain": "global",
    2022-05-13 00:35:25 destination >         "reason": "invalid"
    2022-05-13 00:35:25 destination >       }
    2022-05-13 00:35:25 destination >     ],
    2022-05-13 00:35:25 destination >     "status": "INVALID_ARGUMENT"
    2022-05-13 00:35:25 destination >   }
    2022-05-13 00:35:25 destination > }
    2022-05-13 00:35:25 destination >
It looks like multiple connections with the BigQuery destination are affected, but the error occurs from time to time, so subsequent runs for some connections are successful. I have checked the BigQuery docs, but it is still not clear to me what exactly is incorrect:
    This error returns when there is any kind of invalid input other than an invalid query, such as missing required fields or an invalid table schema. Invalid queries return an invalidQuery error instead.
    Would you please advise how to troubleshoot such an issue?
    logs-1000.txt
  • a

    Anatole Callies

    05/13/2022, 10:48 AM
Hi, does anybody know why Airbyte keeps a cursor value for each table when configured in incremental mode? It would be simpler to just retrieve the maximum value of the cursor column in the destination when launching a new sync. The issue with maintaining a separate cursor value is that when there is a problem during the sync, it can happen that the cursor value gets updated but the new rows do not reach the destination tables. For instance, this is the case with the ongoing BigQuery bug.
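A small sketch of the alternative described above, with an in-memory SQLite table standing in for the destination; table and column names are illustrative only.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "2022-05-01"), (2, "2022-05-12")])

# Derive the cursor from what actually landed in the destination,
# instead of trusting separately stored state.
(max_cursor,) = conn.execute("SELECT MAX(updated_at) FROM orders").fetchone()
print("resume next sync from cursor:", max_cursor)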
  • s

    Sania Zafar

    05/13/2022, 3:34 PM
Hi, I have been using Airbyte with a Docker-based implementation. My syncs keep failing because the disk space is completely used. Can somebody please help? How should I approach this problem?
  • s

    Sania Zafar

    05/13/2022, 7:08 PM
    I am getting this error after updating the connector:
    2022-05-13 17:18:50 ERROR i.a.w.p.a.DefaultAirbyteStreamFactory(lambda$create$1):70 - Validation failed: {"type":"TRACE","trace":{"type":"ERROR","emitted_at":1.652462330735E12,"error":{"message":"Something went wrong in the connector. See the logs for more details.","internal_message":"java.lang.RuntimeException: Exceptions thrown while closing consumer: java.lang.RuntimeException: com.google.cloud.bigquery.BigQueryException: 400 Bad Request\nPUT <https://www.googleapis.com/upload/bigquery/v2/projects/snoonu-rudderstack/jobs?uploadType=resumable&upload_id=ADPycdvHwQPlV3KW5qsg_grRKcw7k3gc_VnthGFruu2rvjRzH5kDUe6vaKifRqRUmiXAW47Aj1Sj2_vDVbvgdVWaRxFosi_E6F22>\n{\n  \"error\": {\n    \"code\": 400,\n    \"message\": \"Invalid credential\",\n    \"errors\": [\n      {\n        \"message\": \"Invalid credential\",\n        \"domain\": \"global\",\n        \"reason\": \"invalid\"\n      }\n    ],\n    \"status\": \"INVALID_ARGUMENT\"\n  }\n}\n","stack_trace":"[io.airbyte.integrations.destination.bigquery.BigQueryRecordConsumer.close(BigQueryRecordConsumer.java:69), io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.lambda$close$0(FailureTrackingAirbyteMessageConsumer.java:67), io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54), io.airbyte.integrations.base.FailureTrackingAirbyteMessageConsumer.close(FailureTrackingAirbyteMessageConsumer.java:67), io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:166), io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:107), io.airbyte.integrations.destination.bigquery.BigQueryDestination.main(BigQueryDestination.java:314)]","failure_type":"system_error"}}}
    ✅ 1
  • s

    Sania Zafar

    05/14/2022, 4:35 PM
I am setting up a connection using Postgres (0.4.15) as source and BigQuery (1.15) as destination; the Airbyte version is 0.38.3. However, the sync does not sync all records completely. It takes only the first 200 MB and syncs that data.
  • d

    David Effiong

    05/16/2022, 9:01 AM
Hello Everyone, I successfully deployed Airbyte on Google Compute Engine and my connections have been syncing correctly and properly. I opened Airbyte this morning and the UI shows the error UNKNOWN ERROR OCCURRED. I have checked the Compute Engine instance and everything seems okay. Are there any suggestions on how I could resolve this? Thank you.
  • a

    Adam Bloom

    05/16/2022, 7:06 PM
    Hi everyone! Just joining the slack, so apologies if this is the wrong channel. In addition to a few issues I've reported on github, we're having some serious performance issues with the S3 Source on a bucket with a large number of small objects. Unfortunately, our experiments so far show that data will actually accumulate in S3 faster than we can sync it with the current connector. I've reviewed the source logic and believe this is surmountable. Is the team already aware of these limitations and planning to address them? If not, where is the best place to discuss?
  • m

    Melker Öhrman

    05/17/2022, 9:02 AM
Hello, I have a question about the incremental - append pattern when using S3 as a source. I've got millions of files that I want to load into Redshift. Once I've done the initial load, I only want to load the new/modified files on my next syncs. Will Airbyte scan all my files to determine which files are new/updated? I imagine that would take a lot of time for every incremental update?
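For reference, a hedged sketch of the kind of scan the question is about - listing the bucket and keeping only objects modified since the last sync. The bucket, prefix, and cursor value are made up, and this is not the connector's actual implementation.

from datetime import datetime, timezone
import boto3

last_sync = datetime(2022, 5, 16, tzinfo=timezone.utc)  # illustrative cursor
s3 = boto3.client("s3")

new_keys = []
for page in s3.get_paginator("list_objects_v2").paginate(Bucket="my-bucket", Prefix="events/"):
    for obj in page.get("Contents", []):
        # The listing itself still touches every key under the prefix, which is
        # why buckets with millions of small objects make each incremental run slow.
        if obj["LastModified"] > last_sync:
            new_keys.append(obj["Key"])
print(f"{len(new_keys)} objects to sync")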
  • m

    Mohit Reddy

    05/17/2022, 10:23 AM
Hi, I have a question regarding the Kafka destination connector - does it support custom transformations? I tried setting up a connection between Amplitude (source) and Kafka (destination). I could see the data being ingested, but could not define any transformations. NOTE: This is Airbyte running locally in the "dev" version (I had a small change which I wanted to apply as well - https://github.com/airbytehq/airbyte/pull/12876)
  • l

    lucien

    05/18/2022, 2:36 PM
Hello, I have a question regarding the Postgres (v14) source and Incremental | Append. I have one big Postgres table which is partitioned by YYYY/MM/DD. Is it possible to replicate the "root" table, or should I replicate every partition table? If so, how can I add a new partition table every day without resetting everything and reading the whole table again? Thanks for your answer.
  • g

    gunu

    05/19/2022, 12:02 AM
Is there someone on the Airbyte team who understands incremental streams? I'm looking to create a fix for the SurveyMonkey issue and need some help navigating a solution - I imagine it's just that this source connector is outdated / built incorrectly. There are plenty of functioning incremental streams in other sources, so I'm looking to copy-pasta similar logic to this source.
  • u

    이진규

    05/19/2022, 12:45 AM
Hi Team~ I have some questions. I'm trying to move all the files in my S3 bucket (log files compressed to gz) to another S3 bucket using Airbyte. However, all attempts failed because the S3 source connector only supports CSV, Parquet, and Avro file formats. Is there a way to move all the files in the S3 bucket to another S3 bucket using Airbyte?
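If a plain object copy outside Airbyte is acceptable, a minimal boto3 sketch might look like the following; the bucket names are illustrative and this is only a workaround for formats the S3 source doesn't read.

import boto3

s3 = boto3.resource("s3")
dst_bucket = "destination-logs"  # illustrative name

# Copy every object (including .gz log files) from one bucket to the other.
for obj in s3.Bucket("source-logs").objects.all():
    s3.Object(dst_bucket, obj.key).copy({"Bucket": obj.bucket_name, "Key": obj.key})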
  • j

    Jaime Farres

    05/19/2022, 10:23 AM
Hello, we are heavy users of Airbyte and would like to move on to the next step: having version control over what we set up in Airbyte, so we can then have different environments and CI/CD. We've researched the blog and Slack and haven't found much. What do you recommend?
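One hedged sketch of a starting point: dumping connection definitions from the OSS Config API into files that can be committed to git. The host, workspace id, and exact response shape are assumptions, not a recommendation from the Airbyte team.

import json
import os
import requests

API = "http://localhost:8000/api/v1"                     # assumed local deployment
WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"    # placeholder

resp = requests.post(f"{API}/connections/list", json={"workspaceId": WORKSPACE_ID})
resp.raise_for_status()

os.makedirs("connections", exist_ok=True)
for conn in resp.json().get("connections", []):
    # One JSON file per connection, with stable key ordering so git diffs stay readable.
    with open(f"connections/{conn['connectionId']}.json", "w") as fh:
        json.dump(conn, fh, indent=2, sort_keys=True)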
  • s

    Shubham Kalloli

    05/19/2022, 11:00 AM
Hi team, I am currently using Airbyte to read CSV files from S3. We are getting an error if a file only has headers. Is there a configuration or workaround that we can use to skip files with headers only?
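A hedged sketch of a pre-filter outside Airbyte that detects header-only files so they can be skipped or moved aside; bucket and key names are illustrative.

import io
import boto3
import pandas as pd

s3 = boto3.client("s3")

def has_data_rows(bucket: str, key: str) -> bool:
    # A header-only CSV parses to an empty DataFrame, so df.empty flags it.
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    return not pd.read_csv(io.BytesIO(body)).empty

print(has_data_rows("my-bucket", "exports/2022-05-19.csv"))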
  • y

    Yanni Iyeze - Toucan Toco

    05/19/2022, 2:58 PM
Hello, is there a way to speed up or monitor this process?
    We are fetching the schema of your data source.
    This should take less than a minute, but may take a few minutes on slow internet connections or data sources with a large amount of tables.
Sometimes it takes a loooong time; I don't even know what's happening 😂
  • d

    Dwayne Rudy

    05/19/2022, 3:28 PM
    I've searched for definitive information on this but haven't found it, so I'll ask here. Are you able to resume a sync after cancelling a sync in progress? Does Airbyte sync partial tables? If it matters this would be between MySQL and Snowflake. There would also be a separate scenario of Airbyte syncing between MongoDB and Snowflake.
  • e

    Eric

    05/19/2022, 4:15 PM
Is there an easy way to migrate an existing incremental sync from one Airbyte server to another without having to do a full resync? It seems like recreating it on the new server requires a full resync.
  • p

    Pavan Katta

    05/19/2022, 10:40 PM
Hi, we're using Airbyte to sync our customers' data. In one of the streams, a particular file is returning 403 and it's failing the entire stream. Is there a way to specify that files which throw errors should be skipped so the stream can continue?
  • j

    Jenny Brown

    05/20/2022, 8:26 PM
    I'm working on options for bringing metadata from a variety of sources, into data tables in snowflake. Airbyte came to mind since this follows the same general pattern as syncing data from data sources, except that what I want my source connector to read is the metadata instead, and treat it like data. I do not need to sync the actual data, just the metadata alone, from that connector. I'm aware that others might have already tried this and found gotchas or interesting workarounds, so I thought I'd ask. What I hope to sync: Tableau Dashboard and Workbook metadata, Tableau Data Source metadata, Tableau column names/descriptors/tags, Kafka Schema Registry schemas and schema change information, Snowflake schema/table/column metadata, and possibly Metabase metadata/column descriptions, as well as the contents of a data dictionary held in a Google Sheets file. All of this gets dumped into its own separate tables in a snowflake schema, and used by views created elsewhere. The end result is the foundation for a data catalog. All would be self-hosted, so no worries about firewalls; complex auth might be a problem. Thread for discussion. Have others tried this? Are there hangups with using Airbyte to move metadata instead of regular data? Tips/warnings?
    👋 1
  • m

    Michael

    05/23/2022, 6:29 AM
    Hello everyone, I've looked everywhere but I cannot seem to find how to automatically export sync logs (fetched data size, timestamp etc.). Is there an option to automatically export these whenever the connector is set to run?
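A hedged sketch of one way to get at this from outside the UI: polling the jobs endpoint of the Config API after each run and writing the response to a file. The host, connection id, and exact response shape are assumptions.

import json
import requests

API = "http://localhost:8000/api/v1"                    # assumed local deployment
CONNECTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder

resp = requests.post(f"{API}/jobs/list",
                     json={"configTypes": ["sync"], "configId": CONNECTION_ID})
resp.raise_for_status()

# Dump the returned job/attempt metadata so it can be archived or shipped elsewhere.
with open("sync_jobs.json", "w") as fh:
    json.dump(resp.json(), fh, indent=2)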
  • d

    Delbert

    05/24/2022, 12:11 AM
Hi, I created an Airbyte Cloud account. My goal is to connect Shopify, but I can't select Shopify as a source. Is Shopify not available on the cloud version? Thanks!
  • s

    Shubhransh Bhargava

    05/24/2022, 4:27 PM
Hey, I am creating a connection between our Google BigQuery and Google Cloud Storage. I have given the permission roles BigQuery Data Editor and BigQuery User to the service account, but I'm getting the following error during sync:
    2022-05-24 16:24:25 source > Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: com.google.cloud.bigquery.BigQueryException: Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
    2022-05-24 16:24:25 source > 	at io.airbyte.integrations.source.relationaldb.AbstractRelationalDbSource.lambda$queryTable$0(AbstractRelationalDbSource.java:64)
    2022-05-24 16:24:25 source > 	at io.airbyte.commons.util.LazyAutoCloseableIterator.computeNext(LazyAutoCloseableIterator.java:37)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.TransformedIterator.hasNext(TransformedIterator.java:46)
    2022-05-24 16:24:25 source > 	at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.TransformedIterator.hasNext(TransformedIterator.java:46)
    2022-05-24 16:24:25 source > 	at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    2022-05-24 16:24:25 source > 	at io.airbyte.commons.util.CompositeIterator.computeNext(CompositeIterator.java:63)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    2022-05-24 16:24:25 source > 	at io.airbyte.commons.util.DefaultAutoCloseableIterator.computeNext(DefaultAutoCloseableIterator.java:38)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:146)
    2022-05-24 16:24:25 source > 	at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:141)
    2022-05-24 16:24:25 source > 	at java.base/java.util.Iterator.forEachRemaining(Iterator.java:132)
    2022-05-24 16:24:25 source > 	at io.airbyte.integrations.base.IntegrationRunner.lambda$runInternal$0(IntegrationRunner.java:154)
    2022-05-24 16:24:25 source > 	at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:54)
    2022-05-24 16:24:25 source > 	at io.airbyte.integrations.base.sentry.AirbyteSentry.executeWithTracing(AirbyteSentry.java:38)
    2022-05-24 16:24:25 source > 	at io.airbyte.integrations.base.IntegrationRunner.runInternal(IntegrationRunner.java:154)
    2022-05-24 16:24:25 source > 	at io.airbyte.integrations.base.IntegrationRunner.run(IntegrationRunner.java:105)
    2022-05-24 16:24:25 source > 	at io.airbyte.integrations.source.bigquery.BigQuerySource.main(BigQuerySource.java:175)
    2022-05-24 16:24:25 source > Caused by: java.lang.RuntimeException: com.google.cloud.bigquery.BigQueryException: Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
    2022-05-24 16:24:25 source > 	at io.airbyte.db.bigquery.BigQueryDatabase.waitForQuery(BigQueryDatabase.java:201)
    2022-05-24 16:24:25 source > 	at io.airbyte.db.bigquery.BigQueryDatabase.executeQuery(BigQueryDatabase.java:185)
    2022-05-24 16:24:25 source > 	at io.airbyte.db.bigquery.BigQueryDatabase.executeQuery(BigQueryDatabase.java:135)
    2022-05-24 16:24:25 source > 	at io.airbyte.db.bigquery.BigQueryDatabase.query(BigQueryDatabase.java:106)
    2022-05-24 16:24:25 source > 	at io.airbyte.db.bigquery.BigQueryDatabase.unsafeQuery(BigQueryDatabase.java:102)
    2022-05-24 16:24:25 source > 	at io.airbyte.integrations.source.relationaldb.AbstractRelationalDbSource.lambda$queryTable$0(AbstractRelationalDbSource.java:61)
    2022-05-24 16:24:25 source > 	... 24 more
    2022-05-24 16:24:25 source > Caused by: com.google.cloud.bigquery.BigQueryException: Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.translate(HttpBigQueryRpc.java:115)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.spi.v2.HttpBigQueryRpc.getQueryResults(HttpBigQueryRpc.java:643)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1393)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.BigQueryImpl$36.call(BigQueryImpl.java:1388)
    2022-05-24 16:24:25 source > 	at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1387)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.BigQueryImpl.getQueryResults(BigQueryImpl.java:1371)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.Job$1.call(Job.java:338)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.Job$1.call(Job.java:335)
    2022-05-24 16:24:25 source > 	at com.google.api.gax.retrying.DirectRetryingExecutor.submit(DirectRetryingExecutor.java:105)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.BigQueryRetryHelper.run(BigQueryRetryHelper.java:86)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.BigQueryRetryHelper.runWithRetries(BigQueryRetryHelper.java:49)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.Job.waitForQueryResults(Job.java:334)
    2022-05-24 16:24:25 source > 	at com.google.cloud.bigquery.Job.waitFor(Job.java:244)
    2022-05-24 16:24:25 source > 	at io.airbyte.db.bigquery.BigQueryDatabase.waitForQuery(BigQueryDatabase.java:199)
    2022-05-24 16:24:25 source > 	... 29 more
    2022-05-24 16:24:25 source > Caused by: com.google.api.client.googleapis.json.GoogleJsonResponseException: 403 Forbidden
    2022-05-24 16:24:25 source > GET <https://www.googleapis.com/bigquery/v2/projects/localeai-314712/queries/bbb99068-9ffb-4b14-8399-d0c231c0bc47?location=us-east1&maxResults=0&prettyPrint=false>
    2022-05-24 16:24:25 source > {
    2022-05-24 16:24:25 source >   "code" : 403,
    2022-05-24 16:24:25 source >   "errors" : [ {
    2022-05-24 16:24:25 source >     "domain" : "global",
    2022-05-24 16:24:25 source >     "message" : "Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.",
    2022-05-24 16:24:25 source >     "reason" : "accessDenied"
    2022-05-24 16:24:25 source >   } ],
    2022-05-24 16:24:25 source >   "message" : "Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.",
    2022-05-24 16:24:25 source >   "status" : "PERMISSION_DENIED"
    2022-05-24 16:24:25 source > }
    2022-05-24 16:24:25 source > 	at com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:146)
Any idea what could be the issue?