# troubleshooting
j
I’m trying to set up a dateTimeFieldSpec with a dateTrunc and I’m getting an error:
{
      "name": "created_at_1_week_seconds",
      "dataType": "LONG",
      "defaultNullValue": 0,
      "transformFunction": "dateTrunc('week', created_at, 'MILLISECONDS')",
      "format": "1:SECONDS:EPOCH",
      "granularity": "1:SECONDS"
    },
Error:
Exception in getting arguments for transform function 'dateTrunc('week', created_at, MILLISECONDS)' for column 'created_at_1_week_seconds'
This works:
{
      "name": "created_at_1_day_seconds",
      "dataType": "LONG",
      "defaultNullValue": 0,
      "transformFunction": "toEpochSecondsRounded(fromEpochDaysBucket(toEpochDaysBucket(created_at, 1), 1), 1)",
      "format": "1:SECONDS:EPOCH",
      "granularity": "1:SECONDS"
    },
Something about the dateTrunc function doesn’t seem to be compatible. Looking through the docs and the code, it looks like dateTrunc may not be registered as a valid transform function in all use cases. The main issue is that I want a toEpochDaysBucket(7) that lines up with the calendar week (Sunday or Monday) instead of the epoch week, which starts on Thursday since 1970-01-01 was a Thursday. Any ideas?
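To illustrate the offset, here’s a minimal plain-Java sketch (illustration only, not Pinot code; the class name is made up) of why fixed 7-day epoch buckets land on Thursdays:

import java.time.Instant;
import java.time.ZoneOffset;
import java.util.concurrent.TimeUnit;

public class WeekBucketDemo {
  public static void main(String[] args) {
    long ts = 1609459200000L; // 2021-01-01T00:00:00Z, a Friday
    // A 7-day bucket floors the timestamp to a multiple of 7 days since the epoch.
    long days = TimeUnit.MILLISECONDS.toDays(ts);
    long bucketStartMillis = TimeUnit.DAYS.toMillis((days / 7) * 7);
    // 1970-01-01 was a Thursday, so every 7-day boundary is also a Thursday.
    System.out.println(Instant.ofEpochMilli(bucketStartMillis)
        .atZone(ZoneOffset.UTC).getDayOfWeek()); // prints THURSDAY
  }
}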
x
I think the issue is that dateTrunc is implemented as an internal BaseTransformFunction, not as a ScalarFunction, which is what the transformFunction field can use.
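For reference, a scalar function is just an annotated static method, something like this minimal sketch (the class, method name, and Monday-shift logic here are hypothetical; I’m assuming the pinot-spi annotation package):

import org.apache.pinot.spi.annotations.ScalarFunction;

public class MyDateFunctions {
  // Hypothetical helper: floor an epoch-millis value to the start of its
  // calendar week (Monday). Once registered, a method like this would be
  // callable by name in a schema's transformFunction field.
  @ScalarFunction
  public static long truncToMondayWeekMillis(long epochMillis) {
    long days = Math.floorDiv(epochMillis, 86_400_000L);
    // 1970-01-01 was a Thursday; a 3-day shift aligns buckets to Monday.
    long mondayAlignedDays = Math.floorDiv(days + 3, 7) * 7 - 3;
    return mondayAlignedDays * 86_400_000L;
  }
}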
j
I see. Is there an alternate query anyone can think of? Would there be a Groovy syntax that would work?
x
hmmm
I saw some function like:
/**
   * The sql compatible date_trunc function for epoch time.
   * @param unit truncate to unit (millisecond, second, minute, hour, day, week, month, quarter, year)
   * @param timeValue value to truncate
   * @param inputTimeUnitStr TimeUnit of value, expressed in Java's joda TimeUnit
   * @return truncated timeValue in same TimeUnit as the input
   */
  @ScalarFunction
  public static long dateTrunc(String unit, long timeValue, String inputTimeUnitStr) {
    return dateTrunc(unit, timeValue, inputTimeUnitStr, TimeZoneKey.UTC_KEY.getId(), inputTimeUnitStr);
  }
does this
dateTrunc(week, created_at, MILLISECONDS)
work?
without single quotes?
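btw, to sanity-check what dateTrunc('week', …) should return, here is a rough Joda-Time equivalent (Pinot uses Joda internally; treat this as an approximation of the semantics, not the exact implementation):

import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;

public class DateTruncCheck {
  public static void main(String[] args) {
    // Presto-style date_trunc('week', ...) floors to the ISO week start
    // (a Monday), which is the calendar alignment asked for above.
    DateTime dt = new DateTime(1609459200000L, DateTimeZone.UTC); // 2021-01-01, a Friday
    DateTime weekStart = dt.weekOfWeekyear().roundFloorCopy();
    System.out.println(weekStart); // 2020-12-28T00:00:00.000Z, a Monday
  }
}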
do you have the stack trace?
j
I’ll try that. How do I retrieve the stack trace?
x
check the Pinot server log
if you are on k8s you may need to enter the container and look at the pinot-server.log file
usually
kubectl logs pinot-server-0
should tell you
e
@Jai Patel won't this just truncate to the day? Looks like you wanted it by week.
lmk if you need more help with that, we can experiment with it.
j
@Elon we are creating time buckets to help with segmenting our data for different graphical views. We are able to use the time bucket functions for 15 minutes, hour, and day, but the time bucket function has an undesirable offset for week. I think dateTrunc will work, and we already use a Presto dateTrunc function to get the right behavior when using reconcile with offline tables. I want the same behavior with realtime.
e
Sounds good, we can work on this.
j
@Xiang Fu I tried the syntax without the single quotes and got the same error. I checked the controller and server containers and didn’t see any logs/exceptions connected with this.
x
hmm, it should be in the server log, so it just silently has no messages consumed?
j
I didn’t see anything. I tried a few times while tailing the log with -f.
x
hmm, have you changed the log level to info?
e
we can’t :) the logging team said we generate too many logs
x
oic 😛
then maybe see if you can reproduce it locally
e
but it should generate a warn or error msg, right? 😃
x
like quickstart and create a table
I feel we should generate an error
e
yep, @Jai Patel you can use quickstart to do that
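e.g. launch it with
bin/pinot-admin.sh QuickStart -type batch
and then try the schema with
bin/pinot-admin.sh AddSchema -schemaFile <your-schema>.json -exec
(going from memory on the flags, double-check them against the pinot-admin help for your version)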
x
@Neha Pawar not sure if you have more insight
j
the error is happening on schema import, so I can’t even get to the point of creating a table.
so what I’m hearing is: run locally with logging and see if I can get a more detailed message in the log.
e
yep
j
where do the logs dump to?
e
if you run a local container you can just tail the logs from the Docker container - if you run via IntelliJ it should be stdout/stderr
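for the container case, e.g.
docker logs -f <your-pinot-container>
(substitute the container name from docker ps)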
j
Well… that was a bit of a surprise. Schema addition worked when run locally on Pinot 0.7.1. I’ll try again on Pinot 0.6.0.
Confirmed, dateTrunc parsing fails on Pinot 0.6.0 and succeeds on Pinot 0.7.1. This may be an issue with the version we’re using.
@Elon @Xiang Fu ^^