
Re: dataloader: on the plus side, Lacinia has its preview API, so you know in general terms what's coming "up" further down the query document tree. On the other hand, the async/promise stuff and the building of the result bottom to top might pose some challenges. I'm not sure how the other GraphQL implementations operate internally.


Dataloader says that it batches things by pushing loads to the next event loop tick - that seems quite clever but also quite messy to me.


I was thinking about caching the other day, just passing around an atom in the resolver context or even the resolver result would be enough, no?


I've been thinking also of an atom in the resolver context as a mechanism for coordination and caching. It feels like there might be a higher level of abstraction, but I haven't seen it yet.
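A minimal sketch of that idea, assuming a plain map context — all names here (`:cache`, `cached-fetch`, `resolve-user`) are hypothetical, not part of Lacinia's API:

```clojure
;; Hypothetical sketch: a per-request cache as an atom stashed in the
;; resolver context. Not Lacinia API; just the general mechanism.
(defn make-context []
  {:cache (atom {})})

(defn cached-fetch
  "Look up `k` in the request-scoped cache, computing it with `f` on a miss.
  Note: under contention two threads may both compute `f`; the atom still
  stays consistent, one result simply wins."
  [context k f]
  (let [cache (:cache context)]
    (if-let [hit (find @cache k)]
      (val hit)
      (let [v (f k)]
        (swap! cache assoc k v)
        v))))

(defn resolve-user
  "A resolver that goes through the cache instead of hitting the backend twice."
  [context args _value]
  (cached-fetch context [:user (:id args)]
                (fn [[_ id]]
                  ;; stand-in for a real DB/service call
                  {:id id :name (str "user-" id)})))
```

Within a single query execution, repeated resolutions of the same key hit the atom instead of the backend; the atom is discarded with the context, so there's no cross-request staleness to manage.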


yeah, I hadn’t remembered that about dataloader, but it is quite clever. I don’t know how that could work in a threaded context, but like you said @hlship, you have the ability to preview children to fetch pessimistically


I think that an atom solves some of the thread-safety concerns and gives you the general mechanism, but the nice thing about dataloader is that it gives you a simple API to target: a batch function of shape Array<key> => Map<key, value> is easy enough to write
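That contract — a batch of keys in, a map of results out — is small enough to show concretely. This is just an illustration of the shape (the function and its data are made up), not any particular library's API:

```clojure
;; Hypothetical batch function: seq of keys -> map of key -> value.
(defn batch-load-users
  "One backend round trip for a whole batch of user ids
  (stand-in for e.g. SELECT ... WHERE id IN (...))."
  [ids]
  (into {}
        (map (fn [id] [id {:id id :name (str "user-" id)}]))
        ids))
```

Keys missing from the returned map naturally resolve to nil on the caller's side, which is usually the behavior you want for "not found".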


I read the source code for Elixir’s Absinthe dataloader last night and I don’t think it does the fancy next-tick batching; I think it just allows you to call “load” or “load_many”
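The no-next-tick version is roughly: `load` queues the key and hands back a promise, and an explicit flush runs the batch function once and delivers all the promises. A sketch of that shape in Clojure — all names hypothetical, not Absinthe's or Lacinia's API:

```clojure
;; Hypothetical explicit-flush loader: no event-loop trickery. Callers
;; queue keys with :load, then something calls :flush! to run the batch.
(defn make-loader
  "batch-fn takes a seq of keys and returns a map of key -> value."
  [batch-fn]
  (let [pending (atom {})]   ; key -> promise; deduplicates repeated loads
    {:load   (fn [k]
               ;; reuse the existing promise for k, or install a new one
               (-> (swap! pending update k #(or % (promise)))
                   (get k)))
     :flush! (fn []
               (let [[batch _] (reset-vals! pending {})]
                 (when (seq batch)
                   (let [results (batch-fn (keys batch))]
                     (doseq [[k p] batch]
                       (deliver p (get results k)))))))}))
```

Usage is: resolvers call `:load` and deref the promise lazily; the executor (or whatever drives the query) calls `:flush!` once per "level" of the tree, which is where Lacinia's preview API could tell you what to expect in the batch.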


which might be OK combined with Lacinia’s preview API


another thing that people will want is the ability to change the backing of the cache to something like Redis/ElastiCache rather than just an atom. we do this at work


it would be nice to have a simple protocol to create those backings rather than have to write those over and over again
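The protocol for that could be tiny. A hedged sketch with hypothetical names (not an existing library): an in-memory atom backing here, and a Redis/ElastiCache implementation would just satisfy the same three methods.

```clojure
;; Hypothetical sketch: a protocol so the cache backing is pluggable.
(defprotocol CacheBacking
  (lookup [this k] "Return the cached value for k, or nil.")
  (store! [this k v] "Associate k with v; return v.")
  (evict! [this k] "Remove k."))

;; Reference implementation over an atom, for tests and single-node use.
(defrecord AtomBacking [state]
  CacheBacking
  (lookup [_ k] (get @state k))
  (store! [_ k v] (swap! state assoc k v) v)
  (evict! [_ k] (swap! state dissoc k) nil))

(defn atom-backing []
  (->AtomBacking (atom {})))
```

A Redis-backed record would implement the same protocol with GET/SET/DEL calls, and the loader/cache code above it never needs to know which one it got.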


anyway I decided to put my money where my mouth is and started the beginnings of a simple dataloader API after talking about it the other night:
