rich-furniture-49524
05/16/2022, 10:48 PM
{
"messageId": "",
"originalTimestamp": "2022-05-16T22:42:29.417Z",
"receivedAt": "2022-05-16T22:42:29.514Z",
"request_ip": "",
"rudderId": "",
"sentAt": "2022-05-16T22:42:29.419Z",
"timestamp": "2022-05-16T22:42:29.512Z",
"type": "track",
"userId": ""
}
The reason for this is that we need to clear and reprocess many events that are already in our bucket.
I'll use the Python SDK.
better-pager-8742
05/17/2022, 4:29 AM
this.client = new Analytics(RUDDER_STACK_WRITE_KEY, DATA_PLANE_URL);
this.client.track({ event, properties })
Can't seem to figure out why it's not logging to RS.
better-pager-8742
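A common reason a server-side `track` call never shows up is a payload that carries neither `userId` nor `anonymousId`, or a short-lived process exiting before the SDK's queue is flushed (the Node SDK exposes a `flush` method for that). A minimal sketch of the required fields — `validateTrack` is a hypothetical helper for illustration, not part of the SDK:

```javascript
// Hypothetical helper: checks that a track payload carries the fields
// RudderStack generally requires before an event can be attributed.
function validateTrack(payload) {
  const errors = [];
  if (!payload.event) {
    errors.push("missing event name");
  }
  if (!payload.userId && !payload.anonymousId) {
    errors.push("missing both userId and anonymousId");
  }
  return errors;
}

// The snippet above passes only { event, properties }:
const fromSnippet = { event: "Order Completed", properties: { total: 42 } };
console.log(validateTrack(fromSnippet)); // flags the missing identifier

// Adding a userId (or anonymousId) resolves it:
const fixed = { ...fromSnippet, userId: "user-123" };
console.log(validateTrack(fixed)); // []
```

If the identifiers are already present, flushing before process exit is the next thing worth checking.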
05/17/2022, 5:22 AM
dry-pizza-75186
05/17/2022, 12:21 PM
Order Completed
track events as a Conversion to Google Ads.
I have created the conversion on Google Ads and have the Conversion ID and Label values.
I then create a Google Ads RudderStack destination and input:
• Conversion ID (do you need the “AW-” prefix here? I’ve tried with and without it, the same result, no events sent)
• Client-side Events Filtering (Is this needed? Given that I specify the Click Event Conversion fields below. I’ve tried with/without, same result, no events sent)
• Click Event Conversion. On Conversion Label, I fill out the Google Ads label for that conversion, and on Name, I fill out “Order Completed,” which is the name of the track call I want to map
• Send Page View --> On
• Conversion Linker --> On
• Dynamic Remarketing --> On
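One thing worth double-checking with the mapping above: the event name in the `track` call has to match the mapped Name exactly, since destination event mappings are typically case- and whitespace-sensitive. A rough sketch of that lookup (illustrative only, with a placeholder label — not RudderStack's actual implementation):

```javascript
// Illustrative event-to-conversion mapping, mirroring the
// Click Event Conversion rows in the destination config.
const clickEventConversions = {
  "Order Completed": { conversionLabel: "EXAMPLE_LABEL" }, // placeholder label
};

// Exact, case-sensitive match: "order completed" or
// "Order Completed " (trailing space) would NOT resolve.
function findConversion(eventName) {
  return clickEventConversions[eventName] ?? null;
}

console.log(findConversion("Order Completed")); // matches
console.log(findConversion("order completed")); // null
```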
Can anyone help me with how I can resolve this? Below you can find the entire payload of an example track call that I want to forward to GAds:
dry-pizza-75186
05/17/2022, 12:21 PM
worried-tailor-13264
05/17/2022, 12:28 PM
shy-cartoon-10869
05/17/2022, 8:40 PM
steep-gold-12616
05/18/2022, 3:09 AM
damp-traffic-32919
05/18/2022, 3:22 AM
many-refrigerator-87639
05/18/2022, 9:59 AM
many-refrigerator-87639
05/18/2022, 10:27 AM
rich-furniture-49524
05/18/2022, 11:32 AM
identify nor track events? We are using the JS SDK.
fresh-winter-89121
05/19/2022, 7:46 AM
{
"appType": "EMBEDDED",
"server": "UP",
"db": "UP",
"acceptingEvents": "TRUE",
"routingEvents": "FALSE",
"mode": "DEGRADED",
"goroutines": "372",
"backendConfigMode": "API",
"lastSync": "2022-05-19T10:40:28+03:00",
"lastRegulationSync": ""
}
We have installed it using Helm charts, and the images below are being used. Can you please help with this?
rudder-server:15022022.115114
rudder-transformer:15022022.133053
bland-fall-68870
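The health payload above shows the server ingesting but not forwarding: `acceptingEvents` is `TRUE` while `routingEvents` is `FALSE` and `mode` is `DEGRADED`, which usually points at the processing/routing side (e.g. the transformer or a destination) rather than ingestion. A small sketch of reading that state — field names are taken from the payload above, but the "healthy" criteria here are assumptions, not an official contract:

```javascript
// Interpret a rudder-server /health response. Field names come from the
// payload above; the healthy-state values are assumed for illustration.
function isRouting(health) {
  return health.acceptingEvents === "TRUE" &&
         health.routingEvents === "TRUE" &&
         health.mode !== "DEGRADED";
}

const health = {
  appType: "EMBEDDED",
  server: "UP",
  db: "UP",
  acceptingEvents: "TRUE",
  routingEvents: "FALSE",
  mode: "DEGRADED",
};

console.log(isRouting(health)); // false: accepting but not routing
```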
05/19/2022, 11:11 AM
bland-fall-68870
05/19/2022, 11:11 AM
great-coat-91156
05/19/2022, 1:02 PM
brief-pager-63582
05/19/2022, 1:33 PM
dazzling-petabyte-60244
05/19/2022, 4:22 PM
stocky-tailor-33483
05/19/2022, 5:18 PM
stocky-tailor-33483
05/19/2022, 7:52 PM
shy-cartoon-10869
05/19/2022, 11:25 PM
refined-exabyte-53440
05/20/2022, 10:40 PM
client_id
for cloud mode GA4?
thankful-electrician-16984
05/22/2022, 11:44 AM
bright-oyster-66890
05/23/2022, 4:26 AM
nice-butcher-94150
05/23/2022, 7:06 AM
lemon-dog-49453
05/23/2022, 7:43 AM
config.json
as opposed to relying on the app.rudderstack.com
control plane, since it's subject to frequent change. Is there a way to have the open-source control plane generate the correct config.json
- happy to submit a PR for this if someone can guide me through the fixes that need to be made.
fancy-answer-79663
05/23/2022, 9:31 AM
nice-telephone-72729
05/23/2022, 10:29 AM
nice-mouse-19898
05/23/2022, 11:54 AM
lemon-dog-49453
05/23/2022, 2:49 PM
2022-05-23T14:45:04.215Z ERROR warehouse/upload.go:1346 [WH]: Failed during exporting_data_failed stage: 1 errors occurred:
1 errors occurred:
Skipping pages table because it previously failed to load in an earlier job: 2 with error: '[CLICKHOUSE][27y9c7hIwbMlAspMej3RAtMEkWD][27yAaqp6TdCOPUXL0rgmseaXzCw][default][pages] Error occurred while committing with error:[CLICKHOUSE][27y9c7hIwbMlAspMej3RAtMEkWD][27yAaqp6TdCOPUXL0rgmseaXzCw][default][pages] Error while committing transaction as there was error while loading in table with error:code: 241, message: Memory limit (for query) exceeded: would use 9.43 GiB (attempt to allocate chunk of 4422040 bytes), maximum: 9.31 GiB: (while reading column user_id): (while reading from part /var/lib/clickhouse/store/1e8/1e86d536-fdca-4dbf-a5e5-852e8b9e8bbe/20220519_39_46_1/ from mark 375 with max_rows_to_read = 16384): While executing MergeTreeThread: while pushing to view default.new_pages_aggregate_info (5e3d9f16-74d0-4a81-9bf2-2fb791aa08f0)'
2022-05-23T14:45:04.224Z ERROR warehouse/warehouse.go:219 [WH] Failed in handle Upload jobs for worker: %!w(*fmt.wrapError=&{Upload Job failed: 1 errors occurred:
1 errors occurred:
Skipping pages table because it previously failed to load in an earlier job: 2 with error: '[CLICKHOUSE][27y9c7hIwbMlAspMej3RAtMEkWD][27yAaqp6TdCOPUXL0rgmseaXzCw][default][pages] Error occurred while committing with error:[CLICKHOUSE][27y9c7hIwbMlAspMej3RAtMEkWD][27yAaqp6TdCOPUXL0rgmseaXzCw][default][pages] Error while committing transaction as there was error while loading in table with error:code: 241, message: Memory limit (for query) exceeded: would use 9.43 GiB (attempt to allocate chunk of 4422040 bytes), maximum: 9.31 GiB: (while reading column user_id): (while reading from part /var/lib/clickhouse/store/1e8/1e86d536-fdca-4dbf-a5e5-852e8b9e8bbe/20220519_39_46_1/ from mark 375 with max_rows_to_read = 16384): While executing MergeTreeThread: while pushing to view default.new_pages_aggregate_info (5e3d9f16-74d0-4a81-9bf2-2fb791aa08f0)' +0xc0007a0300})
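The failure above is ClickHouse's per-query memory cap (error code 241, `MEMORY_LIMIT_EXCEEDED`), hit while loading the `pages` table and pushing rows into the materialized view `default.new_pages_aggregate_info`; the log shows a 9.31 GiB ceiling. One hedged avenue, assuming you control the ClickHouse user profile, is to raise the per-query limit via the `max_memory_usage` setting (the value below is illustrative):

```sql
-- Illustrative only: raise the per-query memory cap for the loading user.
-- The log's ceiling was 9.31 GiB; 16 GiB (17179869184 bytes) is an example.
SET max_memory_usage = 17179869184; -- session-level override

-- Or persist it in the user's profile (users.xml / users.d override):
-- <profiles><default><max_memory_usage>17179869184</max_memory_usage></default></profiles>
```

Alternatively, reducing the warehouse load batch size or simplifying the aggregate materialized view may keep individual load queries under the existing limit.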