# prisma-client
c
Hello prisma crew. I’m in a situation where I need to process and update a table with 100k+ records, and it’s going to take a while running each update one at a time. Each update is unique. `updateMany` really only seems to handle one data change applied to multiple records. What I really need is something like `createMany`, where you can prep lots of updates for many records, then execute them all at once. I take it the only real option to speed things up is batching inside of a transaction? In sets of like 10 or so?
```ts
await prisma.$transaction([
  prisma.table.update({ where: { id: id1 }, data: { field: field1 } }),
  prisma.table.update({ where: { id: id2 }, data: { field: field2 } }),
  prisma.table.update({ where: { id: id3 }, data: { field: field3 } }),
  // ...
])
```
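Something roughly like this is what I have in mind, chunking the updates and running each chunk as its own transaction (the `table` model, `field` column, update shape, and batch size are just placeholders):

```ts
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

// Hypothetical shape of one pending update
type PendingUpdate = { id: number; field: string };

const BATCH_SIZE = 10; // "sets of like 10 or so"

async function runInBatches(updates: PendingUpdate[]) {
  for (let i = 0; i < updates.length; i += BATCH_SIZE) {
    const chunk = updates.slice(i, i + BATCH_SIZE);
    // Each chunk becomes one transaction of individual update calls
    await prisma.$transaction(
      chunk.map((u) =>
        prisma.table.update({
          where: { id: u.id },
          data: { field: u.field },
        })
      )
    );
  }
}
```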
a
I don't think that would speed things up much -- you're still doing 100k writes
I think you have two options. The first: use Prisma's `$queryRawUnsafe` method, with a function that turns the records that need to be updated into a values string, and write the query yourself as an insert-on-conflict-update.
(that's assuming you're using Postgres)
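A rough sketch of what I mean (the table and column names are made up, the escaping is the bare minimum, and I'm calling `$executeRawUnsafe` rather than `$queryRawUnsafe` since it's a write):

```ts
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

type Row = { id: number; field: string };

// Build one INSERT ... ON CONFLICT ... DO UPDATE statement from the
// rows that need updating. The values are interpolated into the SQL
// string, so they must be escaped and trusted: hence "unsafe".
function buildUpsertSql(rows: Row[]): string {
  const values = rows
    .map((r) => `(${r.id}, '${r.field.replace(/'/g, "''")}')`)
    .join(", ");
  return `
    INSERT INTO "table" (id, field)
    VALUES ${values}
    ON CONFLICT (id) DO UPDATE SET field = EXCLUDED.field;
  `;
}

async function upsertMany(rows: Row[]) {
  // One round trip updates every row in the batch
  await prisma.$executeRawUnsafe(buildUpsertSql(rows));
}
```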
What I actually do is install Knex alongside Prisma, pointed at the same Postgres database. It uses another database connection, but Knex itself is fairly low overhead.
Knex lets you pass an array to `insert` and chain `.onConflict(...).merge()` to update/upsert many records at once. I use this to periodically update ~8k records, and it takes a small fraction of the time that 8k individual update statements would.
Prisma's DX is much nicer for me, though, so I use Knex specifically for batch updates and Prisma otherwise.
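For reference, the Knex side looks roughly like this (the connection config, table name, and columns are placeholders for whatever your schema actually is):

```ts
import knex from "knex";

// A second connection alongside Prisma, pointed at the same Postgres DB
const db = knex({
  client: "pg",
  connection: process.env.DATABASE_URL,
});

type Row = { id: number; field: string };

// Insert the whole array in one statement; on a primary-key conflict,
// merge (update) the existing row instead of failing.
async function upsertMany(rows: Row[]) {
  await db("table").insert(rows).onConflict("id").merge();
}
```

That one helper only gets called for the bulk updates; everything else stays in Prisma.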
c
@Austin Zentz Thank you for the response, I’ll look into your recommendation. It ended up taking about 4 or 5 hours to run 🙃