Uploading a CSV via the API

Hugo Peran

05/25/2023, 9:09 PM
Hello, CSV import is not working for me, so I had to use the API for importing. Here's a part of my code:

```js
// Imports inferred from usage (csv() here is assumed to be csv-parser)
const fs = require('fs');
const csv = require('csv-parser');
const { createClient } = require('@supabase/supabase-js');

process.env.NODE_TLS_REJECT_UNAUTHORIZED = "0";
const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY);

fs.createReadStream('transformed-results-test.csv')
  .pipe(csv())
  .on('data', async (row) => {
    // Map each CSV row to a prospects record, defaulting missing text fields
    const newCompany = {
      first_name: row.first_name || 'N/A',
      last_name: row.last_name || 'N/A',
      job_title: row.job_title || 'N/A',
      reworked_job_title: row.reworked_job_title || 'N/A',
      skills: row.skills || 'N/A',
      past_company: row.past_company || 'N/A',
      company_city: row.company_city || 'N/A',
      company_linkedin_url: row.company_linkedin_url || 'N/A',
      revenues: row.revenues ? parseInt(row.revenues) : null,
      funding: row.funding ? parseInt(row.funding) : null,
      company_description: row.company_description || 'N/A',
      company_secondary_industry: row.company_secondary_industry || 'N/A',
      tags: row.tags || 'N/A',
    };

    // One insert per CSV row
    const { data, error } = await supabase.from('prospects').insert(newCompany).select();
    if (error) {
      console.log(`Error inserting data: ${error.message}`);
    } else if (data && data.length > 0) {
      data.forEach((record) => {
        console.log(`Inserted company data: ${JSON.stringify(record)}`);
      });
    } else {
      console.log('No data inserted');
    }
  })
  .on('end', () => {
    console.log('CSV file successfully processed');
  });
```

Terminal response in the first comment.
Here's what my terminal is writing back to me:

```
CSV file successfully processed
Inserted company data: {"id":"58242d98-8107-4e01-b993-182d4dfad2eb","created_at":"2023-05-25T18:11:59.618868+00:00","updated_at":"2023-05-25T18:11:59.618868+00:00","deleted_at":null,"first_name":"christoph","last_name":"N/A","job_title":"N/A","reworked_job_title":"N/A","business_email":"N/A","personal_email":"N/A","phone":"N/A","social_url":"N/A","description":"N/A","keywords":"N/A","connections_count":null,"country":"N/A","city":"N/A","linkedin_id":"N/A","skills":"N/A","past_company":"N/A","rewards":"N/A","company_name":"N/A","company_domain":"N/A","company_industry":"N/A","company_address":"N/A","company_country":"N/A","company_city":"N/A","company_founded":null,"company_size":"N/A","company_linkedin_url":"N/A","company_phone":"N/A","company_type":"N/A","company_id":"N/A","email_format":"N/A","revenues":null,"funding":null,"company_description":"N/A","company_secondary_industry":"N/A","tags":"N/A"}
```

As you can see, the first name is included and creates the row, but the object being sent is not picking up the other values of the same row. How could I make sure that the object being sent is full? (Remember it's from a CSV.)
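For reference, supabase-js's insert() also accepts an array of rows, so a common alternative to firing one independent async insert per 'data' event is to buffer the parsed rows and send them in one bulk request when the stream ends. A minimal sketch of that pattern (not from the thread), reusing the imports and mapping from the snippet above:

```js
// Sketch: buffer rows during parsing, then bulk-insert once on 'end',
// instead of one independent async insert per 'data' event.
const rows = [];

fs.createReadStream('transformed-results-test.csv')
  .pipe(csv())
  .on('data', (row) => {
    rows.push({
      first_name: row.first_name || 'N/A',
      last_name: row.last_name || 'N/A',
      // ...map the remaining columns exactly as in the snippet above
    });
  })
  .on('end', async () => {
    // insert() accepts an array, so the whole file goes up in one request
    const { data, error } = await supabase.from('prospects').insert(rows).select();
    if (error) console.log(`Error inserting data: ${error.message}`);
    else console.log(`Inserted ${data.length} rows`);
  });
```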

garyaustin

05/25/2023, 9:45 PM
Have you looked at using psql or pgAdmin through the database port? Assuming this is a one-time setup and not an ongoing process you need automated through the API.
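For the psql route, the usual tool is \copy over a direct database connection; a minimal sketch, where the connection-string placeholders and the three-column list are illustrative and would need to match the real project, table, and CSV header:

```sh
# Sketch only: bulk-load the CSV through the database port with psql's \copy.
# [YOUR-PASSWORD] and [PROJECT-REF] are placeholders; the column list is
# illustrative and must match the CSV header order.
psql "postgresql://postgres:[YOUR-PASSWORD]@db.[PROJECT-REF].supabase.co:5432/postgres" \
  -c "\copy prospects(first_name, last_name, job_title) from 'transformed-results-test.csv' with (format csv, header)"
```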

Hugo Peran

05/25/2023, 10:17 PM
pgAdmin to mass import data? This is sort of a one-time setup for now, but this code will be reused for mass updates and row insertion through automation, so the pgAdmin option is not the best.
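Since the script is meant to handle mass updates as well as inserts on later runs, supabase-js's upsert() fits that case; a minimal sketch, assuming the buffered rows array from earlier and that company_linkedin_url carries a unique constraint (that conflict key is hypothetical; use whatever the table actually enforces):

```js
// Sketch: upsert so re-running the import updates existing rows instead of
// duplicating them. 'company_linkedin_url' as the conflict target is an
// assumption; it must have a unique constraint in the prospects table.
const { error } = await supabase
  .from('prospects')
  .upsert(rows, { onConflict: 'company_linkedin_url' });
if (error) console.log(`Upsert failed: ${error.message}`);
```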
Hello? @garyaustin

silentworks

05/26/2023, 8:52 AM
Please don't @ mods. If you were having a conversation with them before, they will reply when they get the chance, if they know the answer.

Hugo Peran

05/26/2023, 9:12 AM
Ok, sorry. I solved my problem (it was a CSV formatting problem), but a weird thing happened: my script imported 56k rows in 3 minutes, but then it stopped 200 rows before the end. Is the API rate limited?

silentworks

05/26/2023, 9:13 AM
It likely is. Maybe chunk them into intervals and see if that works.
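A minimal sketch of that chunking idea, assuming the buffered rows array from earlier and run inside an async function; the batch size and delay are arbitrary starting points:

```js
// Sketch: insert in fixed-size batches with a short pause between them,
// to avoid tripping any rate limit. Tune BATCH_SIZE and the delay as needed.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
const BATCH_SIZE = 1000;

for (let i = 0; i < rows.length; i += BATCH_SIZE) {
  const batch = rows.slice(i, i + BATCH_SIZE);
  const { error } = await supabase.from('prospects').insert(batch);
  if (error) console.log(`Batch starting at row ${i} failed: ${error.message}`);
  await sleep(500); // pause between batches
}
```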