Jagannath Timma
04/01/2022, 5:33 PM
Weixiang Sun
04/01/2022, 5:44 PM
Prateek Singhal
04/01/2022, 10:57 PM
Alice
04/02/2022, 5:30 PM
Alice
04/02/2022, 5:43 PM
Satyam Raj
04/04/2022, 9:58 AM
Mohemmad Zaid Khan
04/04/2022, 10:59 AM
Has anyone tried perf testing Pinot query performance by using pinot-jdbc-client in JMeter with the JDBC Request Sampler?
Diana Arnos
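For the JMeter question above: a JDBC Request sampler mostly needs a driver class and a database URL. Below is a sketch of the equivalent plain-JDBC settings, assuming pinot-jdbc-client is on the classpath and a Pinot controller at localhost:9000 — host, port, and table name are placeholders, not details from the thread.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PinotJdbcSmokeTest {

    // JMeter's "Database URL" field: the Pinot JDBC URL points at the controller.
    static String buildJdbcUrl(String controllerHost, int controllerPort) {
        return "jdbc:pinot://" + controllerHost + ":" + controllerPort;
    }

    public static void main(String[] args) throws Exception {
        // JMeter's "JDBC Driver class" field:
        Class.forName("org.apache.pinot.client.PinotDriver");
        try (Connection conn = DriverManager.getConnection(buildJdbcUrl("localhost", 9000));
             Statement stmt = conn.createStatement();
             // Any query you want to measure; the table name is a placeholder.
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM myTable")) {
            while (rs.next()) {
                System.out.println(rs.getLong(1));
            }
        }
    }
}
```

In JMeter itself, the URL goes into the JDBC Connection Configuration's "Database URL" field, the driver class into "JDBC Driver class", and the query into the JDBC Request sampler.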
04/05/2022, 8:44 AM
Bordin Suwannatri
04/05/2022, 9:19 AM
francoisa
04/05/2022, 2:25 PMmvn install package -DskipTests -Pbin-dist
and it take more than 20 minutes to build 😕 Anyway to get faster ?Daniel
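A few standard Maven levers can cut local build time. The flag names below are stock Maven/plugin flags; how much each helps on the Pinot build specifically is an assumption, not something confirmed in the thread:

```shell
# 'install' already runs every phase up to and including 'package',
# so the extra 'package' goal is redundant.
# -T 1C builds modules in parallel, one thread per CPU core.
mvn install -DskipTests -Pbin-dist -T 1C

# Skipping style/enforcer checks (if wired into the build) saves more time locally:
mvn install -DskipTests -Pbin-dist -T 1C -Dcheckstyle.skip -Denforcer.skip

# Rebuild only the module you changed, plus the modules it depends on:
mvn install -DskipTests -pl pinot-core -am
```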
04/05/2022, 5:29 PM
Ashwin
04/06/2022, 5:02 AM
Alice
04/06/2022, 9:25 AM
Prashant Pandey
04/06/2022, 1:57 PM
2022/04/06 12:06:04.179 ERROR [LLRealtimeSegmentDataManager_span_event_view_1__50__287__20220406T1205Z] [span_event_view_1__50__287__20220406T1205Z] Exception while in work
2022/04/06 12:06:04.365 INFO [FileUploadDownloadClient] [span_event_view_1__50__287__20220406T1205Z] Sending request: <http://controller-0.controller-headless.pinot.svc.cluster.local:9000/segmentStoppedConsuming?reason=java.lang.NullPointerException&streamPartitionMsgOffset=1059610656&instance=Server_server-span-event-view-realtime-7.span-event-view-realtime-headless.pinot.svc.cluster.local_8098&offset=-1&name=span_event_view_1__50__287__20220406T1205Z> to controller: controller-0.controller-headless.pinot.svc.cluster.local, version: Unknown
2022/04/06 12:06:04.366 INFO [ServerSegmentCompletionProtocolHandler] [span_event_view_1__50__287__20220406T1205Z] Controller response {"isSplitCommitType":false,"streamPartitionMsgOffset":null,"buildTimeSec":-1,"status":"PROCESSED","offset":-1} for <http://controller-0.controller-headless.pinot.svc.cluster.local:9000/segmentStoppedConsuming?reason=java.lang.NullPointerException&streamPartitionMsgOffset=1059610656&instance=Server_server-span-event-view-realtime-7.span-event-view-realtime-headless.pinot.svc.cluster.local_8098&offset=-1&name=span_event_view_1__50__287__20220406T1205Z>
2022/04/06 12:06:04.366 INFO [LLRealtimeSegmentDataManager_span_event_view_1__50__287__20220406T1205Z] [span_event_view_1__50__287__20220406T1205Z] Got response {"isSplitCommitType":false,"streamPartitionMsgOffset":null,"buildTimeSec":-1,"status":"PROCESSED","offset":-1}
I have attached the server logs from when this happened.
Nicolas Kovacs
04/06/2022, 2:47 PM
dmitry H
04/06/2022, 5:13 PM
Facundo Bianco
04/06/2022, 6:14 PM
id,timestamp,application
1,1649268351,"{'app_name': 'foo', 'version': '1.0.0', 'app_id': None, 'business': 'ponzico'}"
And when I load that data I get:
| id | timestamp | application |
|----|------------|-------------------|
| 1 | 1649268351 | foo,1.0.0,ponzico |
(In table-schema.json the "_application_" column is configured as "STRING".)
Is there a way to query the "_application_" column based on one of the values inside? (e.g. SELECT * FROM testing WHERE application.app_name = "foo"). Thanks in advance!
Diana Arnos
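For the question above: since the column is stored as a STRING, dotted access like `application.app_name` won't work, but Pinot's JSON scalar functions can extract a key at query time. This sketch assumes the stored value is valid JSON — the sample row's Python-style single quotes and `None` would need normalizing (e.g. to `{"app_name": "foo", ...}`) first:

```sql
-- Table/column names are the ones from the thread; JSON_EXTRACT_SCALAR
-- takes (column, jsonPath, resultType [, defaultValue]).
SELECT *
FROM testing
WHERE JSON_EXTRACT_SCALAR(application, '$.app_name', 'STRING') = 'foo'
```

If the field is queried this way often, declaring the column with Pinot's JSON type (with a JSON index) may be a better fit than STRING, depending on the Pinot version in use.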
04/07/2022, 2:39 PM
abhinav wagle
04/07/2022, 5:18 PM
Alice
04/08/2022, 8:56 AM
Satyam Raj
04/08/2022, 9:12 AM
francoisa
04/08/2022, 9:17 AM
Peter Pringle
04/08/2022, 11:28 AM
Weixiang Sun
04/08/2022, 5:15 PM
Evan Galpin
04/08/2022, 7:41 PM
Alice
04/09/2022, 9:08 AM
sunny
04/11/2022, 1:14 AM
pinot.broker.access.control.principals.<user>.tables=test_table
It seems that restarting the broker is required whenever a privileged table is added.
Do you restart the broker every time you grant table privileges in a production environment? Or is there another way?
suraj kamath
04/11/2022, 7:42 AM
Exception in thread "main" java.sql.SQLFeatureNotSupportedException
at org.apache.pinot.client.base.AbstractBaseStatement.setQueryTimeout(AbstractBaseStatement.java:167)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:60)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:226)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:35)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:355)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:325)
at org.apache.spark.sql.DataFrameReader.$anonfun$load$3(DataFrameReader.scala:307)
at scala.Option.getOrElse(Option.scala:189)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:307)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:225)
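The trace above shows Spark's `JDBCRDD.resolveTable` unconditionally calling `Statement.setQueryTimeout`, which Pinot's `AbstractBaseStatement` rejects with `SQLFeatureNotSupportedException`. One generic workaround pattern — a sketch, not a Pinot-specific or thread-confirmed fix — is a delegating wrapper that swallows just that one unsupported call; in practice it would have to live inside a custom delegating JDBC driver registered in place of `PinotDriver`, so that Spark receives the wrapped `Statement`:

```java
import java.lang.reflect.InvocationTargetException;
import java.lang.reflect.Proxy;
import java.sql.SQLFeatureNotSupportedException;
import java.sql.Statement;

public class LenientStatement {

    // Wrap a java.sql.Statement so that an unsupported optional JDBC feature
    // (setQueryTimeout, which Spark calls unconditionally) is ignored instead
    // of aborting the whole read. All other calls pass through unchanged.
    public static Statement wrap(Statement delegate) {
        return (Statement) Proxy.newProxyInstance(
                Statement.class.getClassLoader(),
                new Class<?>[] {Statement.class},
                (proxy, method, args) -> {
                    try {
                        return method.invoke(delegate, args);
                    } catch (InvocationTargetException e) {
                        if ("setQueryTimeout".equals(method.getName())
                                && e.getCause() instanceof SQLFeatureNotSupportedException) {
                            return null; // swallow: behave as if no timeout was set
                        }
                        throw e.getCause();
                    }
                });
    }
}
```

A dynamic proxy keeps the wrapper short; hand-implementing the full `Statement` interface would work too but is far more code for the same effect.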
Alice
04/11/2022, 1:00 PM