Hey Pedro,
We're currently running Great Expectations on in-memory batches through Spark, and we'd like to get the expectation results into DataHub, but we're not sure of the best way to proceed. We know which DataHub dataset we're validating, but using the GE operator would require us to pass this information through Great Expectations itself, and then modify the validation action code to accept our new batch type. Maybe some batch metadata fields could be standardized for this, so that similar cases can work with DataHub?
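To make it concrete, here's a rough sketch of what we had in mind. The field names (`datahub_platform`, `datahub_dataset_urn`, etc.) and the resolver function are purely hypothetical on our side, not anything that exists in GE or DataHub today:

```python
from typing import Optional

# Hypothetical sketch: standardized batch metadata fields that a DataHub
# validation action could read to resolve the target dataset URN.
# The field names below are our own invention, not an existing convention.

def resolve_datahub_urn(batch_metadata: dict) -> Optional[str]:
    """Return a DataHub dataset URN from batch metadata, if one can be derived."""
    # An explicitly provided URN wins over anything derived.
    if "datahub_dataset_urn" in batch_metadata:
        return batch_metadata["datahub_dataset_urn"]
    # Otherwise, build one from platform + dataset name, if both are present.
    platform = batch_metadata.get("datahub_platform")
    name = batch_metadata.get("datahub_dataset_name")
    if platform and name:
        return f"urn:li:dataset:(urn:li:dataPlatform:{platform},{name},PROD)"
    return None

# Example: metadata we'd attach to one of our in-memory Spark batches.
batch_metadata = {
    "datahub_platform": "spark",
    "datahub_dataset_name": "analytics.events",
}
print(resolve_datahub_urn(batch_metadata))
# → urn:li:dataset:(urn:li:dataPlatform:spark,analytics.events,PROD)
```

The idea is that the validation action wouldn't need to know about our batch type at all; it would only look for these standardized keys.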
Would you have any pointers? Of course, we're willing to upstream some code :)