Rishab Dawar
11/25/2022, 7:01 AM
Rohit Anilkumar
11/25/2022, 7:28 AM
GOOD, but all the columns have NULL values and the time column has 1662356536 (looks like time of ingestion?). Adding more details in the thread.
Pratik Tibrewal
11/25/2022, 9:15 AM
Lars-Kristian Svenøy
11/25/2022, 4:41 PM
Caleb Shei
11/28/2022, 2:39 PM
{
"name": "optAdditional",
"type": {
"type": "map",
"values": "string"
},
"default": {},
"doc": "collection of miscellaneous logged strings"
}
Caleb Shei
11/28/2022, 3:17 PM
In transcript-table-offline.json, what is MMAP load mode vs. HEAP load mode?
{
"tableName": "transcript",
"segmentsConfig": {
"timeColumnName": "timestampInEpoch",
"timeType": "MILLISECONDS",
"replication": "1",
"schemaName": "transcript"
},
"tableIndexConfig": {
"invertedIndexColumns": [],
"loadMode": "MMAP"
},
"tenants": {
"broker": "DefaultTenant",
"server": "DefaultTenant"
},
"tableType": "OFFLINE",
"metadata": {}
}
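For reference, a rough reading of the Pinot docs: MMAP memory-maps the segment files so the OS pages data in and out on demand, while HEAP loads segment data into memory up front so it stays resident; MMAP is the usual choice when segments may not all fit in memory. A minimal sketch of the same tableIndexConfig with the alternative mode (everything else unchanged from the config above):
"tableIndexConfig": {
"invertedIndexColumns": [],
"loadMode": "HEAP"
}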
Caleb Shei
11/28/2022, 8:49 PM
I have a long column called dirTs for a record's timestamp. How can I convert this long data type into the timestamp data type? Which transformation function can I use?
"ingestionConfig":{
"transformConfigs": [
{
"columnName": "dirTs",
"transformFunction": "toTimestamp(\"dirTs\")"
}
]
},
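One approach that may be enough here, assuming dirTs already holds epoch milliseconds: Pinot's TIMESTAMP type is stored as milliseconds since epoch, so declaring the column as a dateTimeFieldSpec with dataType TIMESTAMP in the schema can work without any ingestion transform. A sketch (schema fragment, field name taken from the question):
"dateTimeFieldSpecs": [
{
"name": "dirTs",
"dataType": "TIMESTAMP",
"format": "1:MILLISECONDS:EPOCH",
"granularity": "1:MILLISECONDS"
}
]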
Caleb Shei
11/28/2022, 8:49 PM
toTimestamp?
Caleb Shei
11/28/2022, 9:34 PM
director. But they are not ONLINE. How can I make them online?
Steven Hall
11/28/2022, 10:20 PM
Failed to initialize PinotMetricsFactory. Please check if any pinot-metrics related jar is actually added to the classpath
Following this example:
https://docs.pinot.apache.org/basics/getting-started/running-pinot-locally
Successfully built the project using JDK11
Started all components, including the server, through the admin scripts; everything works as expected.
Then started the following via the admin scripts:
Zookeeper
Controller
Broker
Attempting to start the server component in IntelliJ using a run configuration as shown here: https://docs.pinot.apache.org/basics/getting-started/running-pinot-locally#start-pinot-component-in-debug-mode-with-intellij
Details on IntelliJ
Build version: IntelliJ IDEA 2021.1.1 Build #IU-211.7142.45 April 30, 2021
Java version: 11.0.10+9-b1341.41x86_64
metrics-core-2.2.0.jar can be found under External Libraries
The issue is seemingly when we attempt to scan for annotated classes. I have evaluated the code fragment below from PinotReflectionUtils; it does not return any found classes. See the screen capture below.
new Reflections(new ConfigurationBuilder().setUrls(ClasspathHelper.forPackage(packageName))
.filterInputsBy(new FilterBuilder.Include(regexPattern)))
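For reference, a minimal standalone check (hypothetical class, using the same Reflections helper as the fragment above) that prints which classpath URLs a scan for the given package would see; if it prints zero URLs when run with the same IntelliJ run configuration, the pinot jars are not actually on that configuration's classpath:
import java.net.URL;
import java.util.Collection;
import org.reflections.util.ClasspathHelper;

public class ClasspathScanCheck {
  public static void main(String[] args) {
    // Assumed package prefix; use whatever value PinotReflectionUtils actually passes.
    String packageName = "org.apache.pinot";
    Collection<URL> urls = ClasspathHelper.forPackage(packageName);
    System.out.println("Classpath URLs visible for " + packageName + ": " + urls.size());
    for (URL url : urls) {
      System.out.println(url);
    }
  }
}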
Steven Hall
11/28/2022, 10:20 PM
Caleb Shei
11/29/2022, 3:43 AM
The segments are ONLINE now, but when I go to the Query Console and click on the table director, I get an HTTP 500 error.
Below is from the controller's log:
2022/11/29 03:30:45.857 ERROR [PinotQueryResource] [grizzly-http-server-57] Caught exception in sendQueryRaw
java.io.IOException: Failed : HTTP error code : 500
at org.apache.pinot.controller.api.resources.PinotQueryResource.sendPostRaw(PinotQueryResource.java:306) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
at org.apache.pinot.controller.api.resources.PinotQueryResource.sendRequestRaw(PinotQueryResource.java:344) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
at org.apache.pinot.controller.api.resources.PinotQueryResource.sendRequestToBroker(PinotQueryResource.java:243) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
at org.apache.pinot.controller.api.resources.PinotQueryResource.getQueryResponse(PinotQueryResource.java:214) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
at org.apache.pinot.controller.api.resources.PinotQueryResource.executeSqlQuery(PinotQueryResource.java:139) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
at org.apache.pinot.controller.api.resources.PinotQueryResource.handlePostSql(PinotQueryResource.java:103) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:52) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:124) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:167) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$TypeOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:219) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:79) ~[pinot-all-0.11.0.0-jar-with-dependencies.jar:0.11.0.0-cc1ceb04dc39a1d76eee11b480e34370c6344424]
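One way to get more detail than the controller's generic 500 (a sketch, assuming a local broker on the default port 8099 and the table name from the question) is to send the query straight to the broker's /query/sql endpoint and inspect the error in the response body:
curl -X POST -H "Content-Type: application/json" \
  -d '{"sql": "SELECT * FROM director LIMIT 10"}' \
  http://localhost:8099/query/sql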
Priyank Bagrecha
11/29/2022, 7:34 AM
Rohit Anilkumar
11/29/2022, 9:19 AM
{
"code": 400,
"error": "TableConfigs: linkpage already exists. Use PUT to update existing config"
}
Is there something I am missing? I have used "tableType": "REALTIME" in the table config.
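The 400 means a TableConfigs entry named linkpage already exists, so the create call has to become an update. A sketch of the update call against the combined tableConfigs endpoint the error refers to, assuming the default controller port 9000 and a local file linkpage-table-configs.json holding the config (both are placeholders):
curl -X PUT -H "Content-Type: application/json" \
  -d @linkpage-table-configs.json \
  http://localhost:9000/tableConfigs/linkpage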
Rohit Anilkumar
11/29/2022, 1:31 PM
Thomas Steinholz
11/29/2022, 3:08 PM
I get
2022/11/29 15:07:01.989 INFO [AddTableCommand] [main] {"code":500,"error":"org.apache.pinot.shaded.org.apache.kafka.common.KafkaException: Failed to construct kafka consumer"}
when setting up the tables, but I am not finding any error logs in the Pinot server or controller.
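"Failed to construct kafka consumer" at AddTableCommand time usually points at the streamConfigs block of the REALTIME table config (for example an unreachable or malformed stream.kafka.broker.list). A minimal sketch of that block, assuming the standard Kafka 2.x plugin and JSON payloads; the broker address and topic name are placeholders:
"streamConfigs": {
"streamType": "kafka",
"stream.kafka.topic.name": "my-topic",
"stream.kafka.broker.list": "kafka-broker:9092",
"stream.kafka.consumer.type": "lowlevel",
"stream.kafka.consumer.factory.class.name": "org.apache.pinot.plugin.stream.kafka20.KafkaConsumerFactory",
"stream.kafka.decoder.class.name": "org.apache.pinot.plugin.stream.kafka.KafkaJSONMessageDecoder"
}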
Mahesh babu
11/30/2022, 9:50 AM
Rostan TABET
11/30/2022, 10:26 AM
I'm using GAPFILL and it works perfectly in most cases.
However, I would like to aggregate the result of a GAPFILL by a column that isn't the time column. When I try this, I get the following error message:
[...]
java.io.IOException: Failed : HTTP error code : 500. Root Cause: No Group By timebucket.
[...]
Is there a workaround? Why is it impossible to group by columns that aren't the time bucket?
francoisa
12/01/2022, 3:51 PM
Lewis Yobs
12/01/2022, 3:58 PM
Gerrit van Doorn
12/01/2022, 8:38 PM
I'm seeing
2022/11/27 23:57:09.794 ERROR [JobDispatcher] [HelixController-pipeline-task-PinotCluster-(9d89f973_TASK)] No available instance found for job!
in my logs (0.10.0 version). Could this be related to the RealtimeToOfflineSegmentsTask?
Ward
12/03/2022, 10:16 PM
Mahesh babu
12/05/2022, 6:13 AM
Eaugene Thomas
12/05/2022, 6:53 AM
Chris London
12/05/2022, 9:29 AM
I'm getting error code 450; does anyone have any idea how I can get more context?
ProcessingException(errorCode:450, message:InternalError:
org.apache.pinot.sql.parsers.SqlCompilationException: Caught exception while parsing query: select * from Pageviews ORDER BY time DESC
at org.apache.pinot.sql.parsers.CalciteSqlParser.compileToSqlNodeAndOptions(CalciteSqlParser.java:136)
at org.apache.pinot.controller.api.resources.PinotQueryResource.executeSqlQuery(PinotQueryResource.java:135)
at org.apache.pinot.controller.api.resources.PinotQueryResource.handlePostSql(PinotQueryResource.java:103)
at jdk.internal.reflect.GeneratedMethodAccessor263.invoke(Unknown Source)
...
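A guess at the cause, since the compilation fails on ORDER BY time: time is a reserved keyword in the Calcite parser Pinot uses, so quoting the column name may be all that is needed:
select * from Pageviews ORDER BY "time" DESC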
Alice
12/05/2022, 10:21 AM
Stuart Millholland
12/05/2022, 4:13 PM
Caleb Shei
12/05/2022, 4:31 PM
Is it the case (or just not implemented yet) that Pinot cannot create one segment from multiple input files (say, limited by the total size of the files)?
Aaron Weiss
12/05/2022, 5:39 PM
Damon
12/05/2022, 5:59 PM
I'm using prometheus-community/kube-prometheus-stack, but the CPU usage panels show no data since I am missing the metric container_cpu_`user`_seconds_total; I only have container_cpu_`usage`_seconds_total.
Is there a way to enable the missing user metric?
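If the cluster only exposes container_cpu_usage_seconds_total, one workaround is to point the panel at that metric instead of the user-mode one. A sketch of a per-pod CPU PromQL query for such a panel, with the namespace label as a placeholder:
sum(rate(container_cpu_usage_seconds_total{namespace="pinot"}[5m])) by (pod)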