Vengatesh Babu
04/27/2021, 2:10 PM
https://stackoverflow.com/questions/65886253/pinot-nested-json-ingestion
Even the JSON data type examples bundled with the build (githubEvents) are not working:
https://github.com/apache/incubator-pinot/tree/master/pinot-tools/src/main/resources/examples/batch/githubEvents
Schema file:
{
  "metricFieldSpecs": [],
  "dimensionFieldSpecs": [
    {
      "dataType": "STRING",
      "name": "name"
    },
    {
      "dataType": "LONG",
      "name": "age"
    },
    {
      "dataType": "STRING",
      "name": "subjects_str"
    },
    {
      "dataType": "STRING",
      "name": "subjects_name",
      "singleValueField": false
    },
    {
      "dataType": "STRING",
      "name": "subjects_grade",
      "singleValueField": false
    }
  ],
  "dateTimeFieldSpecs": [],
  "schemaName": "myTable"
}
Table Config:
{
  "tableName": "myTable",
  "tableType": "OFFLINE",
  "segmentsConfig": {
    "segmentPushType": "APPEND",
    "segmentAssignmentStrategy": "BalanceNumSegmentAssignmentStrategy",
    "schemaName": "myTable",
    "replication": "1"
  },
  "tenants": {},
  "tableIndexConfig": {
    "loadMode": "MMAP",
    "invertedIndexColumns": [],
    "noDictionaryColumns": [
      "subjects_str"
    ],
    "jsonIndexColumns": [
      "subjects_str"
    ]
  },
  "metadata": {
    "customConfigs": {}
  },
  "ingestionConfig": {
    "batchIngestionConfig": {
      "segmentIngestionType": "APPEND",
      "segmentIngestionFrequency": "DAILY",
      "batchConfigMaps": [],
      "segmentNameSpec": {},
      "pushSpec": {}
    },
    "transformConfigs": [
      {
        "columnName": "subjects_str",
        "transformFunction": "jsonFormat(subjects)"
      },
      {
        "columnName": "subjects_name",
        "transformFunction": "jsonPathArray(subjects, '$.[*].name')"
      },
      {
        "columnName": "subjects_grade",
        "transformFunction": "jsonPathArray(subjects, '$.[*].grade')"
      }
    ]
  }
}
Data.json:
{"name":"Pete","age":24,"subjects":[{"name":"maths","grade":"A"},{"name":"maths","grade":"B--"}]}
{"name":"Pete1","age":23,"subjects":[{"name":"maths","grade":"A+"},{"name":"maths","grade":"B--"}]}
{"name":"Pete2","age":25,"subjects":[{"name":"maths","grade":"A++"},{"name":"maths","grade":"B--"}]}
{"name":"Pete3","age":26,"subjects":[{"name":"maths","grade":"A+++"},{"name":"maths","grade":"B--"}]}
Please help me rectify this issue.
Ingestion job output (no errors):
bin/pinot-admin.sh LaunchDataIngestionJob -jobSpecFile /home/sas/apache-pinot-incubating-0.7.1-bin/examples/batch/jsontype/ingestionJobSpec.yaml
SegmentGenerationJobSpec:
!!org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec
cleanUpOutputDir: false
excludeFileNamePattern: null
executionFrameworkSpec: {extraConfigs: null, name: standalone, segmentGenerationJobRunnerClassName: org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner,
segmentMetadataPushJobRunnerClassName: null, segmentTarPushJobRunnerClassName: org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner,
segmentUriPushJobRunnerClassName: org.apache.pinot.plugin.ingestion.batch.standalone.SegmentUriPushJobRunner}
includeFileNamePattern: glob:**/*.json
inputDirURI: examples/batch/jsontype/rawdata
jobType: SegmentCreationAndTarPush
outputDirURI: examples/batch/jsontype/segments
overwriteOutput: true
pinotClusterSpecs:
- {controllerURI: 'http://localhost:9000'}
pinotFSSpecs:
- {className: org.apache.pinot.spi.filesystem.LocalPinotFS, configs: null, scheme: file}
pushJobSpec: {pushAttempts: 2, pushParallelism: 1, pushRetryIntervalMillis: 1000,
segmentUriPrefix: null, segmentUriSuffix: null}
recordReaderSpec: {className: org.apache.pinot.plugin.inputformat.json.JSONRecordReader,
configClassName: null, configs: null, dataFormat: json}
segmentCreationJobParallelism: 0
segmentNameGeneratorSpec: null
tableSpec: {schemaURI: 'http://localhost:9000/tables/myTable/schema', tableConfigURI: 'http://localhost:9000/tables/myTable',
tableName: myTable}
tlsSpec: null
Trying to create instance for class org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner
Creating an executor service with 1 threads(Job parallelism: 0, available cores: 40.)
Initializing PinotFS for scheme file, classname org.apache.pinot.spi.filesystem.LocalPinotFS
Submitting one Segment Generation Task for file:/home/sas/apache-pinot-incubating-0.7.1-bin/examples/batch/jsontype/rawdata/test.json
Initialized FunctionRegistry with 119 functions: [fromepochminutesbucket, arrayunionint, codepoint, mod, sha256, year, yearofweek, upper, arraycontainsstring, arraydistinctstring, bytestohex, tojsonmapstr, trim, timezoneminute, sqrt, togeometry, normalize, fromepochdays, arraydistinctint, exp, jsonpathlong, yow, toepochhoursrounded, lower, toutf8, concat, ceil, todatetime, jsonpathstring, substr, dayofyear, contains, jsonpatharray, arrayindexofint, fromepochhoursbucket, arrayindexofstring, minus, arrayunionstring, toepochhours, toepochdaysrounded, millisecond, fromepochhours, arrayreversestring, dow, doy, min, toepochsecondsrounded, strpos, jsonpath, tosphericalgeography, fromepochsecondsbucket, max, reverse, hammingdistance, stpoint, abs, timezonehour, toepochseconds, arrayconcatint, quarter, md5, ln, toepochminutes, arraysortstring, replace, strrpos, jsonpathdouble, stastext, second, arraysortint, split, fromepochdaysbucket, lpad, day, toepochminutesrounded, fromdatetime, fromepochseconds, arrayconcatstring, base64encode, ltrim, arraysliceint, chr, sha, plus, base64decode, month, arraycontainsint, toepochminutesbucket, startswith, week, jsonformat, sha512, arrayslicestring, fromepochminutes, remove, dayofmonth, times, hour, rpad, arrayremovestring, now, divide, bigdecimaltobytes, floor, toepochsecondsbucket, toepochdaysbucket, hextobytes, rtrim, length, toepochhoursbucket, bytestobigdecimal, toepochdays, arrayreverseint, datetrunc, minute, round, dayofweek, arrayremoveint, weekofyear] in 942ms
Using class: org.apache.pinot.plugin.inputformat.json.JSONRecordReader to read segment, ignoring configured file format: AVRO
Finished building StatsCollector!
Collected stats for 4 documents
Using fixed length dictionary for column: subjects_grade, size: 20
Created dictionary for STRING column: subjects_grade with cardinality: 5, max length in bytes: 4, range: A to B--
Using fixed length dictionary for column: subjects_name, size: 5
Created dictionary for STRING column: subjects_name with cardinality: 1, max length in bytes: 5, range: maths to maths
Using fixed length dictionary for column: name, size: 20
Created dictionary for STRING column: name with cardinality: 4, max length in bytes: 5, range: Pete to Pete3
Created dictionary for LONG column: age with cardinality: 4, range: 23 to 26
Start building IndexCreator!
Finished records indexing in IndexCreator!
Trying to create instance for class org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner
Initializing PinotFS for scheme file, classname org.apache.pinot.spi.filesystem.LocalPinotFS
Start pushing segments: []... to locations: [org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec@4e31276e] for table myTable
Ken Krugler
04/27/2021, 4:01 PM
Doesn't "Start pushing segments: []... to locations:" indicate that no segments were created? Also, what's the column that's being used for the dateTimeFieldSpecs?
Syed Akram
04/28/2021, 5:44 AM
Vengatesh Babu
04/28/2021, 7:59 AM
Syed Akram
04/28/2021, 9:03 AM
Xiang Fu
➜ bin/pinot-admin.sh LaunchDataIngestionJob -jobSpecFile examples/batch/jsontype/ingestionJobSpec.yaml
Plugins root dir is [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins]
Loading all plugins. Please use env variable 'plugins.include' to customize.
Trying to load plugin [pinot-gcs] from location [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-file-system/pinot-gcs]....
Successfully Loaded plugin [pinot-batch-ingestion-spark] from dir [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-batch-ingestion/pinot-batch-ingestion-spark]
Trying to load plugin [pinot-protobuf] from location [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-protobuf]
Successfully loaded plugin [pinot-protobuf] from jar file [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-protobuf/pinot-protobuf-0.7.1-shaded.jar]
Successfully Loaded plugin [pinot-protobuf] from dir [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-protobuf]
Trying to load plugin [pinot-orc] from location [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-orc]
Successfully loaded plugin [pinot-orc] from jar file [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-orc/pinot-orc-0.7.1-shaded.jar]
Successfully Loaded plugin [pinot-orc] from dir [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-orc]
Trying to load plugin [pinot-csv] from location [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-csv]
Successfully loaded plugin [pinot-csv] from jar file [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-csv/pinot-csv-0.7.1-shaded.jar]
Successfully Loaded plugin [pinot-csv] from dir [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-csv]
Trying to load plugin [pinot-confluent-avro] from location [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-confluent-avro]
Successfully loaded plugin [pinot-confluent-avro] from jar file [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-confluent-avro/pinot-confluent-avro-0.7.1-shaded.jar]
Successfully Loaded plugin [pinot-confluent-avro] from dir [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-confluent-avro]
Trying to load plugin [pinot-parquet] from location [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-parquet]
Successfully loaded plugin [pinot-parquet] from jar file [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-parquet/pinot-parquet-0.7.1-shaded.jar]
Successfully Loaded plugin [pinot-parquet] from dir [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-parquet]
Trying to load plugin [pinot-avro] from location [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-avro]
Successfully loaded plugin [pinot-avro] from jar file [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-avro/pinot-avro-0.7.1-shaded.jar]
Successfully Loaded plugin [pinot-avro] from dir [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-avro]
Trying to load plugin [pinot-json] from location [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-json]
Successfully loaded plugin [pinot-json] from jar file [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-json/pinot-json-0.7.1-shaded.jar]
Successfully Loaded plugin [pinot-json] from dir [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-json]
Trying to load plugin [pinot-thrift] from location [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-thrift]
Successfully loaded plugin [pinot-thrift] from jar file [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-thrift/pinot-thrift-0.7.1-shaded.jar]
Successfully Loaded plugin [pinot-thrift] from dir [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-input-format/pinot-thrift]
Trying to load plugin [pinot-kafka-2.0] from location [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-stream-ingestion/pinot-kafka-2.0]
Successfully loaded plugin [pinot-kafka-2.0] from jar file [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-stream-ingestion/pinot-kafka-2.0/pinot-kafka-2.0-0.7.1-shaded.jar]
Successfully Loaded plugin [pinot-kafka-2.0] from dir [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/plugins/pinot-stream-ingestion/pinot-kafka-2.0]
SegmentGenerationJobSpec:
!!org.apache.pinot.spi.ingestion.batch.spec.SegmentGenerationJobSpec
cleanUpOutputDir: false
excludeFileNamePattern: null
executionFrameworkSpec: {extraConfigs: null, name: standalone, segmentGenerationJobRunnerClassName: org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner,
segmentMetadataPushJobRunnerClassName: null, segmentTarPushJobRunnerClassName: org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner,
segmentUriPushJobRunnerClassName: org.apache.pinot.plugin.ingestion.batch.standalone.SegmentUriPushJobRunner}
includeFileNamePattern: glob:**/*.json
inputDirURI: examples/batch/jsontype/rawdata
jobType: SegmentCreationAndTarPush
outputDirURI: examples/batch/jsontype/segments
overwriteOutput: true
pinotClusterSpecs:
- {controllerURI: 'http://localhost:9000'}
pinotFSSpecs:
- {className: org.apache.pinot.spi.filesystem.LocalPinotFS, configs: null, scheme: file}
pushJobSpec: {pushAttempts: 2, pushParallelism: 1, pushRetryIntervalMillis: 1000,
segmentUriPrefix: null, segmentUriSuffix: null}
recordReaderSpec: {className: org.apache.pinot.plugin.inputformat.json.JSONRecordReader,
configClassName: null, configs: null, dataFormat: json}
segmentCreationJobParallelism: 0
segmentNameGeneratorSpec: null
tableSpec: {schemaURI: 'http://localhost:9000/tables/myTable/schema', tableConfigURI: 'http://localhost:9000/tables/myTable',
tableName: myTable}
tlsSpec: null
Trying to create instance for class org.apache.pinot.plugin.ingestion.batch.standalone.SegmentGenerationJobRunner
Creating an executor service with 1 threads(Job parallelism: 0, available cores: 16.)
Initializing PinotFS for scheme file, classname org.apache.pinot.spi.filesystem.LocalPinotFS
Submitting one Segment Generation Task for file:/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/examples/batch/jsontype/rawdata/data.json
Initialized FunctionRegistry with 119 functions: [fromepochminutesbucket, arrayunionint, codepoint, mod, sha256, year, yearofweek, upper, arraycontainsstring, arraydistinctstring, bytestohex, tojsonmapstr, trim, timezoneminute, sqrt, togeometry, normalize, fromepochdays, arraydistinctint, exp, jsonpathlong, yow, toepochhoursrounded, lower, toutf8, concat, ceil, todatetime, jsonpathstring, substr, dayofyear, contains, jsonpatharray, arrayindexofint, fromepochhoursbucket, arrayindexofstring, minus, arrayunionstring, toepochhours, toepochdaysrounded, millisecond, fromepochhours, arrayreversestring, dow, doy, min, toepochsecondsrounded, strpos, jsonpath, tosphericalgeography, fromepochsecondsbucket, max, reverse, hammingdistance, stpoint, abs, timezonehour, toepochseconds, arrayconcatint, quarter, md5, ln, toepochminutes, arraysortstring, replace, strrpos, jsonpathdouble, stastext, second, arraysortint, split, fromepochdaysbucket, lpad, day, toepochminutesrounded, fromdatetime, fromepochseconds, arrayconcatstring, base64encode, ltrim, arraysliceint, chr, sha, plus, base64decode, month, arraycontainsint, toepochminutesbucket, startswith, week, jsonformat, sha512, arrayslicestring, fromepochminutes, remove, dayofmonth, times, hour, rpad, arrayremovestring, now, divide, bigdecimaltobytes, floor, toepochsecondsbucket, toepochdaysbucket, hextobytes, rtrim, length, toepochhoursbucket, bytestobigdecimal, toepochdays, arrayreverseint, datetrunc, minute, round, dayofweek, arrayremoveint, weekofyear] in 704ms
Using class: org.apache.pinot.plugin.inputformat.json.JSONRecordReader to read segment, ignoring configured file format: AVRO
Finished building StatsCollector!
Collected stats for 4 documents
Using fixed length dictionary for column: subjects_grade, size: 20
Created dictionary for STRING column: subjects_grade with cardinality: 5, max length in bytes: 4, range: A to B--
Using fixed length dictionary for column: subjects_name, size: 5
Created dictionary for STRING column: subjects_name with cardinality: 1, max length in bytes: 5, range: maths to maths
Using fixed length dictionary for column: name, size: 20
Created dictionary for STRING column: name with cardinality: 4, max length in bytes: 5, range: Pete to Pete3
Created dictionary for LONG column: age with cardinality: 4, range: 23 to 26
Start building IndexCreator!
Finished records indexing in IndexCreator!
Finished segment seal!
Converting segment: /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0 to v3 format
v3 segment location for segment: myTable_OFFLINE_0 is /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0/v3
Deleting files in v1 segment directory: /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0
Computed crc = 2489432204, based on files [/var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0/v3/columns.psf, /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0/v3/index_map, /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0/v3/metadata.properties]
Driver, record read time : 2
Driver, stats collector time : 0
Driver, indexing time : 15
Tarring segment from: /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0 to: /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0.tar.gz
Size for segment: myTable_OFFLINE_0, uncompressed: 5.79K, compressed: 1.59K
Trying to create instance for class org.apache.pinot.plugin.ingestion.batch.standalone.SegmentTarPushJobRunner
Initializing PinotFS for scheme file, classname org.apache.pinot.spi.filesystem.LocalPinotFS
Start pushing segments: [/Users/xiangfu/temp/rc-test/apache-pinot-incubating-0.7.1-bin/examples/batch/jsontype/segments/myTable_OFFLINE_0.tar.gz]... to locations: [org.apache.pinot.spi.ingestion.batch.spec.PinotClusterSpec@b978d10] for table myTable
Pushing segment: myTable_OFFLINE_0 to location: http://localhost:9000 for table myTable
Sending request: http://localhost:9000/v2/segments?tableName=myTable to controller: 192.168.86.73, version: Unknown
Response for pushing table myTable segment myTable_OFFLINE_0 to location http://localhost:9000 - 200: {"status":"Successfully uploaded segment: myTable_OFFLINE_0 of table: myTable"}
Xiang Fu
bin/quick-start-batch.sh
Xiang Fu
Finished segment seal!
Converting segment: /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0 to v3 format
v3 segment location for segment: myTable_OFFLINE_0 is /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0/v3
Deleting files in v1 segment directory: /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0
Computed crc = 2489432204, based on files [/var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0/v3/columns.psf, /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0/v3/index_map, /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0/v3/metadata.properties]
Driver, record read time : 2
Driver, stats collector time : 0
Driver, indexing time : 15
Tarring segment from: /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0 to: /var/folders/kp/v8smb2f11tg6q2grpwkq7qnh0000gn/T/pinot-6a7b22be-9d75-4668-bcdd-430e1aee249d/output/myTable_OFFLINE_0.tar.gz
Size for segment: myTable_OFFLINE_0, uncompressed: 5.79K, compressed: 1.59K
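Once the push succeeds as above, a quick way to verify the flattened columns is to query the broker. A minimal Python sketch, assuming a local quick-start style deployment with the broker on its default port 8099 and the standard /query/sql endpoint:

import requests

# Query the Pinot broker's SQL endpoint (default broker port 8099 in a local setup)
sql = "SELECT name, age, subjects_name, subjects_grade FROM myTable LIMIT 10"
resp = requests.post("http://localhost:8099/query/sql", json={"sql": sql})
print(resp.json())

# The JSON index on subjects_str should additionally allow JSON_MATCH filters on the
# nested array; see the Pinot JSON index docs for the exact filter syntax in this version.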