prehistoric-honey-22655
10/20/2022, 9:13 AM
copy into test_table from (select $1:_timestamp::varchar from @my_s3_stage ); -- gives me 0 files scanned
Or the following gives me this error: "_PARQUET file format can produce one and only one column of type variant or object or array. Use CSV file format if you want to load more than one column._"
copy into test_table
from (
select
$1:_timestamp::timestamp,
$2:anonymous_id::varchar,
$3:channel::varchar,
$4:context_app_build::varchar
from
@my_s3_stage)
file_format = (
TYPE = 'PARQUET'
SNAPPY_COMPRESSION = TRUE
)
FORCE = TRUE
ON_ERROR = 'SKIP_FILE_1%'
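A likely fix (a sketch; the field names are taken from the question and must match the actual Parquet schema): a Parquet stage exposes each row as a single variant column `$1`, so every field has to be addressed as `$1:<field>` — referencing `$2:`, `$3:`, `$4:` is what triggers the "one and only one column" error. Snappy-compressed Parquet is normally detected automatically on load, so the compression option can usually be left out.

```sql
-- Sketch of a corrected COPY (field names assumed from the question):
-- with Parquet, every field is addressed off the single $1 variant column.
copy into test_table
from (
  select
    $1:_timestamp::timestamp,
    $1:anonymous_id::varchar,
    $1:channel::varchar,
    $1:context_app_build::varchar
  from @my_s3_stage
)
file_format = (TYPE = 'PARQUET')
FORCE = TRUE
ON_ERROR = 'SKIP_FILE_1%';
```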
brief-pager-69878
10/23/2022, 11:14 PM"{\"last will and testament\": \"in progress\"}"
• Column with VARCHAR datatype in Redshift, so the data looks like {"last will and testament": "in progress"}
Everything that I have tried, though, results in Braze viewing the data as a string and therefore not being able to treat it like JSON (and, for example, parse out values for certain keys).
Can you help? Should it be possible to use Reverse ETL to send user traits that are JSON objects? If so, how? Thanks!
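One possible approach (a sketch, with assumed table and column names; whether the object survives end-to-end depends on the Reverse ETL connector): parse the VARCHAR JSON into Redshift's SUPER type in the model query that feeds the sync, so the trait leaves the warehouse as structured data rather than a quoted string.

```sql
-- Sketch: JSON_PARSE converts a VARCHAR containing JSON into a SUPER value.
-- Table and column names (my_schema.users, traits_json) are assumptions.
SELECT
  user_id,
  JSON_PARSE(traits_json) AS traits  -- VARCHAR -> SUPER (structured JSON)
FROM my_schema.users;
```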
great-processor-9812
10/26/2022, 6:11 AM
Taboola Pixel
but was not able to find the same in destinations. Just wanted to know if there is a way to integrate that?
bright-afternoon-44693
10/26/2022, 10:37 PM
metadata(event)
square-policeman-22381
10/28/2022, 9:30 AM
<script type="text/javascript">
!function(){var e=window.rudderanalytics=window.rudderanalytics||[];e.methods=["load","page","track","identify","alias","group","ready","reset","getAnonymousId","setAnonymousId","getUserId","getUserTraits","getGroupId","getGroupTraits"],e.factory=function(t){return function(){var r=Array.prototype.slice.call(arguments);return r.unshift(t),e.push(r),e}};for(var t=0;t<e.methods.length;t++){var r=e.methods[t];e[r]=e.factory(r)}e.loadJS=function(e,t){var r=document.createElement("script");r.type="text/javascript",r.async=!0,r.src="https://cdn.rudderlabs.com/v1.1/rudder-analytics.min.js";var a=document.getElementsByTagName("script")[0];a.parentNode.insertBefore(r,a)},e.loadJS(),
e.load("XXXXXXXXXXXXXXXXXXXXXXX","https://YYYYYYYYYYYYYYY.dataplane.rudderstack.com"), e.page(),e.track()}();
</script>
Any idea of what to do to have the same in both 🙏🏾?
wide-lawyer-50229
10/28/2022, 1:30 PM
`context.device.id` for GA4 `client_id` mapping out of the box in the transformer?
We are now mostly dependent on the `anonymousId` value for the deviceId, but that gets us some wrong mappings, e.g. in PostHog it maps to Anon Distinct ID according to PostHog engineers.
But deleting `anonymousId` for not being used in PostHog breaks the mapping for GA4. It would be great if you would standardise the source for device id or client id, since it's a standard data point used in multiple tools.
Please consider it. 🙏
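Until the source of the device id is standardised, one workaround is to copy the value explicitly in a user transformation. A minimal sketch (the function shape follows RudderStack's transformation convention, but the fallback logic here is an assumption, not the official mapping):

```javascript
// Sketch: populate context.device.id from anonymousId when the SDK did not
// set it, so destinations that read device.id (e.g. for GA4 client_id)
// keep working without depending directly on anonymousId.
function transformEvent(event) {
  const context = event.context || (event.context = {});
  const device = context.device || (context.device = {});
  // Only fall back; never overwrite a real device id.
  if (!device.id && event.anonymousId) {
    device.id = event.anonymousId;
  }
  return event;
}
```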