# ingestion
p
```
Sink report:
{'failures': [{'error': 'Unable to emit metadata to DataHub GMS',
               'info': {'exceptionClass': 'com.linkedin.restli.server.RestLiServiceException',
                        'message': 'java.lang.RuntimeException: java.lang.reflect.InvocationTargetException',
                        'stackTrace': 'com.linkedin.restli.server.RestLiServiceException [HTTP Status:500]: java.lang.RuntimeException: '
                                      'java.lang.reflect.InvocationTargetException\n'
                                      '\tat com.linkedin.metadata.restli.RestliUtils.toTask(RestliUtils.java:39)\n'
                                      '\tat com.linkedin.metadata.restli.BaseEntityResource.ingestInternal(BaseEntityResource.java:182)\n'
                                      '\tat com.linkedin.metadata.restli.BaseEntityResource.ingest(BaseEntityResource.java:176)\n'
                                      '\tat com.linkedin.metadata.resources.dataset.Datasets.ingest(Datasets.java:310)\n'
                                      '\tat sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)\n'
                                      '\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n'
                                      '\t... 88 more\n',
                        'status': 500}}],
 'records_written': 9555,
 'warnings': []}

Pipeline finished with failures
```
a
I got this error once; in my case the kafka-setup Docker container had exited with an error/timeout.
g
Could you try running `datahub check local-docker`?
g
^ Also, could you paste the sink config of your recipe?
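For reference, the sink section of a DataHub ingestion recipe that emits to GMS over REST typically looks roughly like this; the server URL and token below are placeholders, so adjust them to your own deployment:

```yaml
# Sketch of a recipe's sink section (values are placeholders).
# "datahub-rest" sends metadata to the GMS REST endpoint, which is
# where the HTTP 500 in the report above is coming from.
sink:
  type: "datahub-rest"
  config:
    server: "http://localhost:8080"   # your GMS address
    # token: "<personal-access-token>"  # only if auth is enabled
```

If the server address is wrong, or GMS is unhealthy because a dependency such as kafka-setup failed, emits will fail with errors like the one in the report.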