# orm-help
h
I’m not seeing any information in the docs regarding maximum query sizes, etc. for various providers. As an example, I’m querying 20k records from a PGSQL table with sizeable JSON fields. As it stands, I can request about 12k records in a single query before it crashes with:
```
Failed to convert rust `String` into napi `string`
```
I realize this has to do with “underlying technologies” (and yes, I should certainly be batching my queries), but besides a couple of vaguely related GH issues dealing with the Node version, I’m not really seeing anything on it. Any guidance/best practices on this besides “don’t run big queries”? 😆
a
Hey 👋🏾 This is an interesting question, and I found this issue, which might be related to yours. One approach would be paginating your queries to fetch smaller chunks of data. We have a guide on pagination in our docs that you could refer to. 🙂
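If it helps, here’s a minimal sketch of what that could look like with cursor-based pagination, assuming a Prisma Client setup (the Rust-to-napi error suggests one) and a hypothetical `record` model with a numeric `id`; adapt the names and chunk size to your schema:

```ts
import { PrismaClient } from '@prisma/client'

const prisma = new PrismaClient()

async function fetchAllRecords() {
  const pageSize = 1000 // well below the ~12k rows that triggered the napi error
  const results: unknown[] = []
  let cursor: { id: number } | undefined

  while (true) {
    const page = await prisma.record.findMany({
      take: pageSize,
      // when a cursor is set, skip the cursor row itself so it isn't duplicated
      ...(cursor ? { skip: 1, cursor } : {}),
      orderBy: { id: 'asc' },
    })
    if (page.length === 0) break

    results.push(...page)
    // remember the last row's id as the cursor for the next page
    cursor = { id: page[page.length - 1].id }

    if (page.length < pageSize) break // last page reached
  }
  return results
}
```

Each call returns a small enough result that the Rust-to-JS serialization stays within the string-size limit.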
With help from our engineering team, it seems Node-API (napi) / JS has a limit on string size. When you query massive amounts of data, it fails when the result is passed back to JS from Rust.
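Since you mentioned the JSON fields are sizeable, narrowing the selection per page can also keep each serialized result smaller. A sketch under the same assumptions (`record`, `id`, and `createdAt` are hypothetical names):

```ts
import { PrismaClient } from '@prisma/client'

const prisma = new PrismaClient()

// Select only narrow columns and omit the heavy JSON field, so each
// page's serialized result stays well under the napi string limit.
const slimPage = await prisma.record.findMany({
  take: 1000,
  orderBy: { id: 'asc' },
  select: { id: true, createdAt: true },
})
```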
If you don’t mind, could you create an issue or comment on the existing issue with a reproduction and your use case, to help our engineering team look into this further?