# general
  • k

    Kishore G

    12/08/2019, 11:02 PM
    No. We added a new field type called dateTimeField which will replace TimeFieldSpec
  • p

    Paulo Silva

    12/08/2019, 11:18 PM
    I'm new to Pinot... I couldn't see how "dateTimeFieldSpec" can work with multiple fields. In the JSON schema, is "dateTimeFieldSpec" a list/collection or an object? Can you show a basic example with multiple date/datetime fields?
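    For context: `dateTimeFieldSpecs` is declared as a JSON array in the schema, so multiple date/time columns each get their own entry. A minimal sketch of what that looks like (the schema name, field names, and format strings here are illustrative, not from the thread):

    ```json
    {
      "schemaName": "myTable",
      "dateTimeFieldSpecs": [
        {
          "name": "createdAtMillis",
          "dataType": "LONG",
          "format": "1:MILLISECONDS:EPOCH",
          "granularity": "1:MILLISECONDS"
        },
        {
          "name": "updatedAtHours",
          "dataType": "INT",
          "format": "1:HOURS:EPOCH",
          "granularity": "1:HOURS"
        }
      ]
    }
    ```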
  • e

    Elon

    12/09/2019, 7:11 PM
    We deployed release 0.2.0 and get the following error when creating a schema which worked with release 0.1.0:
  • e

    Elon

    12/09/2019, 7:11 PM
    Cannot find a deserializer for non-concrete Map type [map type; class javax.ws.rs.core.MultivaluedMap, [simple type, class java.lang.String] -> [collection type; class java.util.List, contains [simple type, class java.lang.String]]]
    
     at [Source: (org.glassfish.jersey.message.internal.ReaderInterceptorExecutor$UnCloseableInputStream); line: 1, column: 1]
  • e

    Elon

    12/09/2019, 7:12 PM
    Here is the schema:
  • e

    Elon

    12/09/2019, 7:12 PM
    and here is the curl we used:
  • e

    Elon

    12/09/2019, 7:15 PM
    {
      "schemaName": "flattened_orders_hours",
      "dimensionFieldSpecs": [
        {
          "name": "xxxx",
          "dataType": "STRING"
        },
        {
          "name": "xxxx",
          "dataType": "STRING"
        },
        {
          "name": "xxxx",
          "dataType": "STRING"
        },
        {
          "name": "xxxx",
          "dataType": "STRING"
        },
        {
          "name": "xxxx",
          "dataType": "STRING"
        },
        {
          "name": "xxxx",
          "dataType": "STRING"
        },
        {
          "name": "xxxx",
          "dataType": "STRING"
        },
        {
          "name": "xxxx",
          "dataType": "STRING"
        },
        {
          "name": "xxxx",
          "dataType": "STRING"
        }
      ],
      "metricFieldSpecs": [
        {
          "name": "xxxx",
          "dataType": "FLOAT"
        },
        {
          "name": "xxxx",
          "dataType": "FLOAT"
        },
        {
          "name": "xxxx",
          "dataType": "FLOAT"
        },
        {
          "name": "xxxx",
          "dataType": "FLOAT"
        },
        {
          "name": "xxxx",
          "dataType": "FLOAT"
        },
        {
          "name": "xxxx",
          "dataType": "FLOAT"
        },
        {
          "name": "xxxx",
          "dataType": "INT"
        },
        {
          "name": "xxxx",
          "dataType": "FLOAT"
        },
        {
          "name": "xxxx",
          "dataType": "INT"
        },
        {
          "name": "xxxx",
          "dataType": "FLOAT"
        }
      ],
      "timeFieldSpec": {
        "incomingGranularitySpec": {
          "name": "xxxx",
          "dataType": "LONG",
          "timeType": "SECONDS"
        },
        "outgoingGranularitySpec": {
          "name": "xxxx",
          "dataType": "INT",
          "timeFormat" : "EPOCH",
          "timeType": "HOURS"
        }
      }
    }
  • e

    Elon

    12/09/2019, 7:16 PM
    curl -k -X POST --header 'Content-Type: application/json' -d '@pinot/schema.json' localhost:9000/schemas
  • e

    Elon

    12/09/2019, 7:37 PM
    Is this an issue with the 0.2.0 release, or did something change with the API between releases 0.1.0 and 0.2.0?
  • e

    Elon

    12/10/2019, 1:01 AM
    FYI, updating the presto connector again - got rid of the thread context classloader and only use netty4 - looks like netty3 might be used in Pinot by some other 3rd-party libs (zkclient I think, from mvn dependency:tree)
  • s

    Subbu Subramaniam

    12/10/2019, 1:45 AM
    @User I was able to reproduce your problem with the 0.2.0 release. It does work with the quick-start currently built from the top of master. A lot of changes have happened, so I am not sure where the problem got introduced.
  • e

    Elon

    12/10/2019, 1:53 AM
    Oh great! So I'll pull from master and retry
  • e

    Elon

    12/10/2019, 1:54 AM
    Thanks @User!
  • s

    Subbu Subramaniam

    12/10/2019, 1:55 AM
    that will get you moving forward, but I will try to find the offending checkin
  • s

    Subbu Subramaniam

    12/10/2019, 1:55 AM
    rather, the correcting one 🙂
  • s

    Subbu Subramaniam

    12/10/2019, 2:04 AM
    I think you are seeing https://github.com/apache/incubator-pinot/issues/4603 and it was fixed in https://github.com/apache/incubator-pinot/pull/4639/
  • s

    Subbu Subramaniam

    12/10/2019, 2:04 AM
    @User
  • s

    Subbu Subramaniam

    12/10/2019, 2:05 AM
    I am not sure how the issue got introduced between 0.1.0 and 0.2.0, perhaps @User can explain (he fixed it)
  • e

    Elon

    12/10/2019, 2:06 AM
    I see that commit in the build, could it have gotten reintroduced again?
  • e

    Elon

    12/10/2019, 2:06 AM
    Either way I am trying to build from head
  • s

    Subbu Subramaniam

    12/10/2019, 2:23 AM
    @User ok, mystery solved. This type of schema upload was not supported in 0.1.0 either. It was added after the commit from which 0.2.0 was cut. For now, in the official releases, the way to upload a schema is using HTTP multipart form data, as in:
    curl -k  -X POST -F schema=@schema-file.json localhost:9000/schemas
  • s

    Subbu Subramaniam

    12/10/2019, 2:24 AM
    So, in 0.3.0, this command will be supported, yes.
  • e

    Elon

    12/10/2019, 2:24 AM
    Oh great, so for now do it the way you showed and after 0.3.0 go back to the command I pasted above?
  • s

    Subbu Subramaniam

    12/10/2019, 2:24 AM
    correct.
  • e

    Elon

    12/10/2019, 2:35 AM
    This worked, much appreciated!
  • a

    Alex

    12/10/2019, 3:32 AM
    hi, another question about offline segments. Can they have overlapping time ranges, or must each segment be responsible for a unique time range?
  • k

    Kishore G

    12/10/2019, 3:40 AM
    overlapping is fine
  • a

    Alex

    12/10/2019, 3:41 AM
    @User cool. And another one, any plans of refactoring SegmentIndexCreationDriverImpl ?
  • a

    Alex

    12/10/2019, 3:41 AM
    it's currently tied to the filesystem (it uses RecordReader as an iterator).
  • a

    Alex

    12/10/2019, 3:42 AM
    I'm trying to marry it with Flink streams, and had to build a somewhat hacky bridge between RecordReader and Flink's data streams
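    The bridge Elon describes is essentially an adapter from push-style stream delivery to the pull-style iteration that segment creation expects. A minimal sketch of that pattern, using a bounded blocking queue for back-pressure (the `push`/`hasNext`/`next` interface below is a simplified stand-in, not Pinot's actual `RecordReader` API):

    ```java
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;
    import java.util.NoSuchElementException;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class StreamRecordReaderBridge {
        // Sentinel placed on the queue to mark end-of-stream.
        // (Compared by reference, so an empty map pushed as a real record would be misread;
        // acceptable for a sketch.)
        private static final Map<String, Object> EOS = Map.of();

        private final BlockingQueue<Map<String, Object>> queue;
        private Map<String, Object> pending; // pre-fetched next record, or null

        public StreamRecordReaderBridge(int capacity) {
            this.queue = new LinkedBlockingQueue<>(capacity);
        }

        // Called from the stream side (e.g. a Flink sink function). Blocks when
        // the segment builder falls behind, giving natural back-pressure.
        public void push(Map<String, Object> record) throws InterruptedException {
            queue.put(record);
        }

        public void close() throws InterruptedException {
            queue.put(EOS);
        }

        // RecordReader-style iteration, called from the segment-creation side.
        public boolean hasNext() throws InterruptedException {
            if (pending == null) {
                pending = queue.take(); // blocks until a record or EOS arrives
            }
            return pending != EOS;
        }

        public Map<String, Object> next() throws InterruptedException {
            if (!hasNext()) {
                throw new NoSuchElementException();
            }
            Map<String, Object> record = pending;
            pending = null;
            return record;
        }

        public static void main(String[] args) throws InterruptedException {
            StreamRecordReaderBridge bridge = new StreamRecordReaderBridge(16);
            // Producer thread stands in for the Flink stream.
            Thread producer = new Thread(() -> {
                try {
                    for (int i = 0; i < 3; i++) {
                        bridge.push(Map.of("id", i));
                    }
                    bridge.close();
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            producer.start();
            List<Object> ids = new ArrayList<>();
            while (bridge.hasNext()) {
                ids.add(bridge.next().get("id"));
            }
            producer.join();
            System.out.println(ids); // prints "[0, 1, 2]"
        }
    }
    ```

    The bounded queue is the key design choice: it decouples the two threading models without unbounded buffering, which is the usual pain point when adapting a pull-based reader to a push-based stream.
    
    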