# ingestion
c
Example json passed to GMS API:
```json
{
  "snapshot": {
    "urn": "urn:li:dataJob:(urn:li:dataFlow:(glue,logistics-load,PROD),logistics-load)",
    "aspects": [
      {
        "com.linkedin.common.Ownership": {
          "owners": [
            {
              "owner": "urn:li:corpuser:dataservices",
              "type": "DATAOWNER"
            }
          ],
          "lastModified": {
            "time": 1581407189000,
            "actor": "urn:li:corpuser:dataservices"
          }
        }
      },
      {
        "com.linkedin.datajob.DataJobInfo": {
          "name": "logistics-load",
          "description": "Tranform and load logistics data into Redshift",
          "type": "SQL"
        }
      },
      {
        "com.linkedin.datajob.DataJobInputOutput": {
          "inputDatasets": [
            "urn:li:dataset:(urn:li:dataPlatform:s3,logistics_raw.shipment,PROD)"
          ],
          "outputDatasets": [
            "urn:li:dataset:(urn:li:dataPlatform:redshift,redshift_edw_production.edw_logistics_box,PROD)"
          ]
        }
      }
    ]
  }
}
```
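(Editor's note: a payload like this gets POSTed to GMS's Rest.li `ingest` action. Below is a hypothetical Python sketch of that call; the `/dataJobs?action=ingest` path, host, and port are assumptions about a stock local deployment, not something confirmed in this thread.)

```python
# Hypothetical sketch: POST the snapshot above to GMS. The URL path, host,
# and port are assumptions about a default local deployment -- adjust as needed.
import json
import requests

GMS_INGEST_URL = "http://localhost:8080/dataJobs?action=ingest"  # assumed

payload = {
    "snapshot": {
        "urn": "urn:li:dataJob:(urn:li:dataFlow:(glue,logistics-load,PROD),logistics-load)",
        "aspects": [],  # paste the aspect list from the JSON above
    }
}

resp = requests.post(
    GMS_INGEST_URL,
    headers={
        # Rest.li endpoints expect this protocol-version header.
        "X-RestLi-Protocol-Version": "2.0.0",
        "Content-Type": "application/json",
    },
    data=json.dumps(payload),
)
resp.raise_for_status()
```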
b
Do you mind trying:
```json
{
  "snapshot": {
    "urn": "urn:li:dataJob:(urn:li:dataFlow:(glue,logistics-load,PROD),logistics-load)",
    "aspects": [
      {
        "com.linkedin.common.Ownership": {
          "owners": [
            {
              "owner": "urn:li:corpuser:dataservices",
              "type": "DATAOWNER"
            }
          ],
          "lastModified": {
            "time": 1581407189000,
            "actor": "urn:li:corpuser:dataservices"
          }
        }
      },
      {
        "com.linkedin.datajob.DataJobInfo": {
          "name": "logistics-load",
          "description": "Tranform and load logistics data into Redshift",
          "type": {
            "com.linkedin.datajob.azkaban:AzkabanJobType": "SQL"
          }
        }
      },
      {
        "com.linkedin.datajob.DataJobInputOutput": {
          "inputDatasets": [
            "urn:li:dataset:(urn:li:dataPlatform:s3,logistics_raw.shipment,PROD)"
          ],
          "outputDatasets": [
            "urn:li:dataset:(urn:li:dataPlatform:redshift,redshift_edw_production.edw_logistics_box,PROD)"
          ]
        }
      }
    ]
  }
}
```
c
ah, will try
hmm, still getting validation error on 'type'
```
Parameters of method 'ingest' failed validation with error 'ERROR :: /snapshot/aspects/1/com.linkedin.datajob.DataJobInfo/type :: \"com.linkedin.datajob.azkaban:AzkabanJobType\" is not a member type of union [ { \"type\" : \"enum\", \"name\" : \"AzkabanJobType\", \"namespace\" : \"com.linkedin.datajob.azkaban\", \"doc\" : \"The various types of support azkaban jobs\", \"symbols\" : [ \"COMMAND\", \"HADOOP_JAVA\", \"HADOOP_SHELL\", \"HIVE\", \"PIG\", \"SQL\" ], \"symbolDocs\" : { \"COMMAND\" : \"The command job type is one of the basic built-in types. It runs multiple UNIX commands using java processbuilder.\\nUpon execution, Azkaban spawns off a process to run the command.\", \"HADOOP_JAVA\" : \"Runs a java program with ability to access Hadoop cluster.\\n<https://azkaban.readthedocs.io/en/latest/jobTypes.html#java-job-type>\", \"HADOOP_SHELL\" : \"In large part, this is the same Command type. The difference is its ability to talk to a Hadoop cluster\\nsecurely, via Hadoop tokens.\", \"HIVE\" : \"Hive type is for running Hive jobs.\", \"PIG\" : \"Pig type is for running Pig jobs.\", \"SQL\" : \"SQL is for running Presto, mysql queries etc\" } } ]\n'
```
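(Editor's note: the validator is complaining about Pegasus union encoding. In Rest.li JSON, a union value is an object whose single key is the fully-qualified, dot-separated name of the member type; the colon in `com.linkedin.datajob.azkaban:AzkabanJobType` makes the key match nothing in the union. A minimal contrast using the values from this thread:)

```python
# Pegasus union encoding: the single key must be the member type's
# fully-qualified name, dot-separated.
wrong = {"type": {"com.linkedin.datajob.azkaban:AzkabanJobType": "SQL"}}  # colon: rejected
right = {"type": {"com.linkedin.datajob.azkaban.AzkabanJobType": "SQL"}}  # dot: accepted
```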
b
So weird
Oh
That's my own typo
Should be:
```json
{
  "snapshot": {
    "urn": "urn:li:dataJob:(urn:li:dataFlow:(glue,logistics-load,PROD),logistics-load)",
    "aspects": [
      {
        "com.linkedin.common.Ownership": {
          "owners": [
            {
              "owner": "urn:li:corpuser:dataservices",
              "type": "DATAOWNER"
            }
          ],
          "lastModified": {
            "time": 1581407189000,
            "actor": "urn:li:corpuser:dataservices"
          }
        }
      },
      {
        "com.linkedin.datajob.DataJobInfo": {
          "name": "logistics-load",
          "description": "Tranform and load logistics data into Redshift",
          "type": {
            "com.linkedin.datajob.azkaban.AzkabanJobType": "SQL"
          }
        }
      },
      {
        "com.linkedin.datajob.DataJobInputOutput": {
          "inputDatasets": [
            "urn:li:dataset:(urn:li:dataPlatform:s3,logistics_raw.shipment,PROD)"
          ],
          "outputDatasets": [
            "urn:li:dataset:(urn:li:dataPlatform:redshift,redshift_edw_production.edw_logistics_box,PROD)"
          ]
        }
      }
    ]
  }
}
```
c
ah, got it, works! thanks!