# general
h
Hi, we have tried to use Azure Blob Storage for inputDirURI and outputDirURI while submitting the Spark job that pushes data into the Pinot server, and we ran into the issue below:

Exception thrown while calling mkdir (uri=wasbs://test@dev1.blob.core.windows.net/test/segments/, errorStatus=409) com.azure.storage.file.datalake.models.DataLakeStorageException: Status code 409, {"error":{"code":"EndpointUnsupportedAccountFeatures","message":"This endpoint does not support BlobStorageEvents or SoftDelete. Please disable these account features if you would like to use this endpoint."}}

It works fine if we disable the “SoftDelete” option, but we need the soft delete feature on our blob storage. It looks like the Pinot code only supports ADLS Gen2, which is built on top of blob storage, so the blob credentials work when creating the ADLS Gen2 client. However, SoftDelete and BlobStorageEvents are unsupported features on the ADLS Gen2 endpoint, which is why we get the error above. Is it possible to support Azure Blob Storage with SoftDelete and BlobStorageEvents in Pinot?
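For reference, the error appears to come from the Data Lake (dfs) endpoint itself rather than from Pinot; a minimal sketch like the one below, which calls that endpoint directly with the Data Lake SDK, should fail the same way on an account with these features enabled (account name, key, and container are placeholders):
```java
import com.azure.storage.common.StorageSharedKeyCredential;
import com.azure.storage.file.datalake.DataLakeFileSystemClient;
import com.azure.storage.file.datalake.DataLakeServiceClient;
import com.azure.storage.file.datalake.DataLakeServiceClientBuilder;
import com.azure.storage.file.datalake.models.DataLakeStorageException;

public class DfsEndpointCheck {
  public static void main(String[] args) {
    // Placeholder account name/key -- the "dfs" endpoint is the Data Lake (ADLS Gen2) API surface.
    StorageSharedKeyCredential credential =
        new StorageSharedKeyCredential("dev1", "<account-key>");

    DataLakeServiceClient serviceClient = new DataLakeServiceClientBuilder()
        .endpoint("https://dev1.dfs.core.windows.net")
        .credential(credential)
        .buildClient();

    DataLakeFileSystemClient fileSystemClient = serviceClient.getFileSystemClient("test");

    try {
      // On an account with SoftDelete or BlobStorageEvents enabled, this directory creation is
      // expected to fail with 409 EndpointUnsupportedAccountFeatures, matching the Pinot mkdir error.
      fileSystemClient.createDirectory("test/segments");
      System.out.println("Directory created");
    } catch (DataLakeStorageException e) {
      System.out.println("Failed: " + e.getMessage());
    }
  }
}
```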
x
@User do u have any idea on this?
m
IIRC, @User did a pros/cons comparison of using ADLS vs ABS as the deep store for Pinot. We went with ADLS based on those findings. Will let him share once he is back.
h
Hi team, is there any update on the above request?
m
I think @User is on vacation, but from what I recall, ADLS gave a better abstraction for what PinotFS needs. If using ABS is a must, you might explore adding support for it; it shouldn’t be that hard, and we can help.
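Something along these lines is roughly the surface an ABS-backed filesystem plugin would need to cover. This is only a sketch; the real contract is the PinotFS SPI (org.apache.pinot.spi.filesystem.PinotFS) and the method set here is approximate:
```java
import java.io.File;
import java.io.IOException;
import java.net.URI;

// Sketch only: a real plugin would extend org.apache.pinot.spi.filesystem.PinotFS
// and delegate to a com.azure.storage.blob.BlobServiceClient built in init().
public class AzureBlobStoragePinotFsSketch {

  public boolean mkdir(URI uri) throws IOException {
    // ABS has no real directories; this would be a no-op or write a zero-byte "directory marker" blob.
    return true;
  }

  public boolean delete(URI segmentUri, boolean forceDelete) throws IOException {
    // Would delete the blob, or every blob under the prefix when the URI refers to a "directory".
    return true;
  }

  public boolean move(URI srcUri, URI dstUri, boolean overwrite) throws IOException {
    // No atomic rename on ABS: would do a server-side copy to dstUri, then delete srcUri.
    return true;
  }

  public boolean copy(URI srcUri, URI dstUri) throws IOException {
    // Would do a server-side copy between blobs.
    return true;
  }

  public boolean exists(URI fileUri) throws IOException {
    // Would check blob existence.
    return true;
  }

  public String[] listFiles(URI fileUri, boolean recursive) throws IOException {
    // Would list blobs under the prefix encoded in the URI path.
    return new String[0];
  }

  public void copyToLocalFile(URI srcUri, File dstFile) throws Exception {
    // Would download the blob to a local file (segment download path).
  }

  public void copyFromLocalFile(File srcFile, URI dstUri) throws Exception {
    // Would upload a local file to a blob (segment push path).
  }
}
```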
h
Thank you
s
If ADLS doesn’t support the soft delete feature and you need it, then I think we need to add an ABS-based PinotFS implementation. I chose ADLS over ABS because ADLS provides atomic rename + hierarchical namespace (directory structure) support. Technically, we don’t need the atomic rename, so we can replace it with copy & delete (of the original) when using ABS.
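A rough sketch of what that copy & delete “move” could look like with the azure-storage-blob SDK; the class, container, and blob names here are only illustrative, not actual Pinot code:
```java
import java.time.Duration;

import com.azure.core.util.polling.SyncPoller;
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import com.azure.storage.blob.models.BlobCopyInfo;

public class AbsMoveSketch {

  // Emulates a "move" on Azure Blob Storage: server-side copy to the destination,
  // then delete the source once the copy has completed. Unlike an ADLS rename, this is not atomic.
  static void move(BlobContainerClient container, String srcBlobName, String dstBlobName) {
    BlobClient src = container.getBlobClient(srcBlobName);
    BlobClient dst = container.getBlobClient(dstBlobName);

    // Start a server-side copy from the source blob's URL and wait for it to finish.
    SyncPoller<BlobCopyInfo, Void> poller = dst.beginCopy(src.getBlobUrl(), Duration.ofSeconds(1));
    poller.waitForCompletion();

    // Only remove the original after the copy has succeeded.
    src.delete();
  }

  public static void main(String[] args) {
    // Placeholder connection string and container name.
    BlobServiceClient service = new BlobServiceClientBuilder()
        .connectionString("<connection-string>")
        .buildClient();
    BlobContainerClient container = service.getBlobContainerClient("test");

    move(container, "segments/example-segment.tar.gz", "moved/example-segment.tar.gz");
  }
}
```
The copy is server-side, but the overall move is not atomic, so a failure between the copy and the delete can leave both blobs behind.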