#graphql
2020-08-16
steveb8n 04:08:15

Q: I’ve started using Lacinia to return results for a Vega-Lite chart, which means thousands of objects. For big datasets, it’s creating quite a lot of GC pressure. While some of this may come from my own internal allocations (I’m not using transducers fully yet), I’m wondering how much of the allocation/GC comes from the resolvers. Has anyone else got insight into this?

steveb8n 04:08:15

I will dig in and measure the internal transformations, but I’m just curious about best practice for this at the resolver level. That may include not using GraphQL for these results at all and just returning EDN.

hlship 22:08:42

Well, essentially, Lacinia has to build the data structure up from the leaves, and only THEN can it start converting all that EDN to JSON (or to a stream of characters); there isn’t a good way to make that happen lazily.
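
For illustration, a minimal sketch of what that looks like at the call site, assuming a pre-compiled Lacinia schema and Cheshire for the JSON step; the `schema` var and the query are made up:

```clojure
(require '[com.walmartlabs.lacinia :as lacinia]
         '[cheshire.core :as json])

;; `schema` is assumed to be an already-compiled Lacinia schema.
;; `execute` calls the resolvers and assembles the complete nested
;; result map in memory first...
(def result
  (lacinia/execute schema "{ points { x y } }" nil nil))

;; ...and only once that whole structure exists can it be turned
;; into JSON and written to the response.
(def response-body
  (json/generate-string result))
```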

hlship 21:08:07

We haven't done this, but for large data sets you could consider implementing it as a subscription, since subscriptions support a streaming model that gets the data to the client over time rather than in one giant response.
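
A rough sketch of that streaming approach as a Lacinia streamer function; the field, the chunk size, and the `fetch-chart-rows` helper are hypothetical, and wiring the streamer into the schema (e.g. via the field's :stream key) is assumed:

```clojure
;; Streamer for a hypothetical `chartData` subscription field.
;; Lacinia calls this with (context args source-stream); each call to
;; source-stream pushes one value to the client, and nil closes the stream.
(defn stream-chart-data
  [context args source-stream]
  (let [worker (future
                 (doseq [chunk (partition-all 500 (fetch-chart-rows args))]
                   (source-stream {:rows chunk}))
                 (source-stream nil))]        ; signal completion
    ;; Return a cleanup fn, invoked when the client unsubscribes.
    (fn [] (future-cancel worker))))
```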

hiredman 22:08:39

Depending on the object graph, you may want to implement pagination.

hiredman 22:08:58

https://graphql.org/learn/pagination/ has a discussion of some possible implementations
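
For reference, a rough Clojure sketch of cursor-style pagination in a Lacinia resolver; the connection field names and the `fetch-page` data-access helper are hypothetical:

```clojure
;; The cursor here is just a stringified offset; a real implementation
;; might base64-encode it or encode a stable sort key instead.
(defn decode-cursor [cursor]
  (if cursor (Long/parseLong cursor) 0))

;; Lacinia resolvers take (context args value).
(defn resolve-points-connection
  [context {page-size :first after :after} _value]
  (let [offset (decode-cursor after)
        limit  (or page-size 100)
        rows   (fetch-page offset limit)]   ; hypothetical data-access fn
    {:edges    (map-indexed (fn [i row]
                              {:cursor (str (+ offset i 1))
                               :node   row})
                            rows)
     :pageInfo {:hasNextPage (= (count rows) limit)
                :endCursor   (str (+ offset (count rows)))}}))
```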

steveb8n 01:08:25

Thanks both. I am using idiomatic GraphQL pagination already and it’s working well. The subscription idea is a good one, since I plan to use WebSockets for another requirement as well. I’ll take a look at that.

steveb8n 01:08:27

I don’t think laziness will help with GC, as it will still need to allocate the same number of short-lived objects. I could be wrong in my thinking on this; happy to be corrected.

steveb8n 01:08:12

Subscriptions could be challenging, as my stack is API Gateway -> Datomic Ions. I am considering adding EC2 to the stack, where Lacinia would run, but for either design I’m not sure how well subscriptions would run through AWS API Gateway.

steveb8n 01:08:32

But that’s on me to test out. I’ll report back if I find anything interesting.