#graphql
2023-02-10
hlship17:02:04

Anyone else care to share what they have built on top of Lacinia? I'm always looking for usage details, and experiences good and bad. What's your experience in a Clojure/ClojureScript only environment? How well does your Lacinia GraphQL work with non-Clojure front end teams? Take a few minutes to share!

vlaaad18:02:58

When I worked at Arbetsförmedlingen (the Swedish Government Employment Agency), we built a product that exposed an employment-related word taxonomy (e.g. relations between terms like “Software Developer” and “Clojure Developer”). We implemented a GraphQL endpoint that is now widely used (at least internally). Our DB is Datomic Cloud, meaning every query requires network access. That was our main issue with Lacinia: it does not provide an N+1 solution. Initially we used Superlifter, but we wanted lower latency rather than higher throughput, because we have a caching layer in front of us, so first response time matters, but after that it’s cache cache cache. Therefore I built plusinia 🙂
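
(For readers unfamiliar with the N+1 issue vlaaad describes, here is a minimal, self-contained sketch of how plain per-field Lacinia resolvers turn one list query into N extra fetches; the toy in-memory db stands in for Datomic Cloud, where each fetch is a network call. This is illustrative, not vlaaad's actual code or plusinia's API.)

```
(require '[com.walmartlabs.lacinia :as lacinia]
         '[com.walmartlabs.lacinia.schema :as schema]
         '[com.walmartlabs.lacinia.util :as util])

;; Toy "database"; in the real system every lookup is a network round trip.
(def db
  {"c1" {:id "c1" :name "Software Developer" :related ["c2"]}
   "c2" {:id "c2" :name "Clojure Developer"  :related []}})

(defn concepts-resolver [_ctx _args _value]
  ;; 1 query for the root list
  (vals db))

(defn related-resolver [_ctx _args concept]
  ;; invoked once per concept in the list -> the "N" in N+1
  (map db (:related concept)))

(def compiled-schema
  (-> {:objects {:Concept {:fields {:id      {:type 'String}
                                    :name    {:type 'String}
                                    :related {:type '(list :Concept)
                                              :resolve :resolve-related}}}}
       :queries {:concepts {:type '(list :Concept)
                            :resolve :resolve-concepts}}}
      (util/attach-resolvers {:resolve-concepts concepts-resolver
                              :resolve-related  related-resolver})
      schema/compile))

;; A 100-item list would mean 101 queries without some form of batching.
(lacinia/execute compiled-schema
                 "{ concepts { name related { name } } }" nil nil)
```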

vlaaad18:02:52

We have a non-Clojure frontend team, and for them the GraphQL endpoint was a lifesaver, because our REST APIs historically return taxonomy information with slashes in key names, e.g. {"concept/name": "foo"}, and GraphQL doesn’t support such names 😄

vlaaad18:02:42

And slashes in names are a PITA in JS
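
(A small sketch of the kind of workaround this implies: GraphQL field names cannot contain /, so namespaced Datomic keys like :concept/name have to be renamed before or while they are exposed. The helpers below are illustrative, not vlaaad's actual code.)

```
;; Option 1: a per-field resolver that reads the namespaced attribute.
(defn concept-name-resolver [_ctx _args concept]
  (:concept/name concept))

;; Option 2: strip namespaces from a whole entity before handing it to
;; Lacinia, so default field resolution works on plain keys.
(defn un-namespace-keys [m]
  (into {} (map (fn [[k v]] [(keyword (name k)) v])) m))

(un-namespace-keys {:concept/name "foo" :concept/id "abc"})
;; => {:name "foo", :id "abc"}
```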

enn18:02:43

Our SPA previously did a big up-front load (via REST API) of all the data it might need, and filtering and sorting happened on the client. We had a syncing mechanism to keep the client’s view of the world up-to-date. We are trying to use GraphQL to move toward a system with a thinner client. Our GraphQL schema uses Relay-style connections for all cardinality-many references and has pagination, filtering, and sorting baked in, with the goal that the client should only ever need to pull stuff it’s actually going to render immediately. Our frontend team doesn’t use Clojure or ClojureScript. Being able to use SDL as a lingua franca to talk about data shapes has been really helpful. They also use the schema to generate TypeScript types. We don’t use anything like Superlifter. Our database is on-prem Datomic, which is plenty fast once the (in-process) object cache is warm, but can be tricky during warmup. We’ve done some experimentation with parallel resolvers to speed warmup but found it hard to get the concurrency semantics we wanted with Lacinia. (Basically what we wanted was to fork the resolver tree at a particular node, and have all resolvers for that node’s children run in the same thread.)
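
(For readers who haven't seen Relay-style connections expressed in Lacinia's EDN schema, a sketch of the shape enn describes is below. The type, field, and argument names are illustrative, not their actual schema.)

```
{:objects
 {:ArticleEdge
  {:fields {:cursor {:type '(non-null String)}
            :node   {:type :Article}}}

  :ArticleConnection
  {:fields {:edges    {:type '(list :ArticleEdge)}
            :pageInfo {:type :PageInfo}}}

  :PageInfo
  {:fields {:hasNextPage {:type '(non-null Boolean)}
            :endCursor   {:type 'String}}}

  ;; Every cardinality-many reference goes through a connection, with
  ;; pagination / filtering / sorting arguments baked in.
  :Author
  {:fields {:name     {:type 'String}
            :articles {:type :ArticleConnection
                       :args {:first   {:type 'Int}
                              :after   {:type 'String}
                              :orderBy {:type 'String}
                              :filter  {:type 'String}}
                       :resolve :resolve-author-articles}}}}}
```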

oliy20:02:43

I built a UI for a risk system at an investment bank. We had a 'drill down' UI which went from broad and shallow to narrow and deep. Lacinia allowed us to use the same graph for every layer and the subscription support made it easy to keep the UI up-to-date. We were able to reduce the memory usage of the UI by keeping the data it held to the minimum required by the view. The resolvers all pulled data out of redis.

hlship20:02:18

It's very hard to keep the momentum going when squeezing in time to work on open source projects. It's very motivating to hear these stories.

oliy20:02:24

Is the team at Walmart still contributing to lacinia or is it just you now?

Daniel Craig21:02:52

I am using/used Lacinia for a proposed project at my company. Clojure is a tough sell there, but I used Lacinia atop Neptune DB and wrote a frontend so that our airline-related data could be plotted as a force-directed graph.

Daniel Craig21:02:21

This past December I had some 20% time that coincided with your updates to the documentation, so I dusted off my code with the aim of making a vector service that reads the stream of Neptune DB change records and provides the data as vectors that one can subscribe to.
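
(A sketch of how this kind of service can hang off Lacinia's subscription support: a streamer function receives a source-stream callback, pushes values to it as they arrive, and returns a cleanup function. The consumer stubs below stand in for the Neptune/Kinesis plumbing and are purely illustrative.)

```
;; Stub standing in for a Kinesis consumer of Neptune change records.
(defn start-change-consumer! [graph-id on-record]
  (future (on-record {:graph-id graph-id :op :add-edge :from "A" :to "B"})))

(defn stop-consumer! [consumer]
  (future-cancel consumer))

(defn change-record->vectors [record]
  {:edges [[(:from record) (:to record)]]})

(defn vector-updates-streamer
  [_context args source-stream]
  ;; push each change record to the subscriber as it arrives
  (let [consumer (start-change-consumer!
                   (:graphId args)
                   #(source-stream (change-record->vectors %)))]
    ;; return a cleanup fn, invoked when the client unsubscribes
    (fn [] (stop-consumer! consumer))))

;; In the EDN schema this hangs off :subscriptions, roughly:
;; {:subscriptions
;;  {:vectorUpdates {:type   :VectorUpdate
;;                   :args   {:graphId {:type 'String}}
;;                   :stream :stream-vector-updates}}}
```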

Daniel Craig21:02:54

Unfortunately Kinesis Data Streams can be quite expensive, so I haven't found any traction for it yet in my company

hlship22:02:33

It was increasingly hard to get any new open source code out the door at Walmart (lack of priority combined with entrenched bureaucracy). None of the other developers had as much inclination as I did to work in open source. The open source bug doesn't hit everyone - maybe they need less affirmation in their lives? 🙂

hlship22:02:02

It's good that Nubank has at least a few projects built on Lacinia, otherwise it might be more difficult to devote any of my 20% time to it.

hlship22:02:13

And I'm now split between this and Pedestal.

Daniel Craig22:02:50

Yeah I get the impression that Walmart and my company have a few things in common in that way

orestis07:02:13

We decided on GraphQL and Lacinia as part of our rewrite to Clojure. We started with Mongo, which had relatively low latency, so resolvers were quite chatty. We are migrating to Postgres to be able to do more complex joins and accurate filtering, so now on “hot” paths we are trying to do all the work at the root level. To do that, we calculate all possible data upfront in a big SQL query, which is then pruned to only include selected fields. Postgres is usually clever enough to avoid joins if no field is selected. We also wrote a little macro to load GraphQL queries in the front end, which validates them against the actual schema, so a broken query will actually break the front-end build. We are quite happy with Lacinia. The timings extension is very useful; we even expose it (conditionally) in production so we can diagnose performance issues live.
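
(A sketch of the "do all the work at the root, pruned by selected fields" idea, using Lacinia's field-selections API. The build-query / run-query! helpers and the field names are assumptions for illustration; selections-tree is part of Lacinia's preview API, so treat the exact key shapes with some caution.)

```
(require '[com.walmartlabs.lacinia.executor :as executor])

;; Hypothetical query-builder stubs standing in for the big SQL query.
(defn build-query [args {:keys [join-customer? join-items?]}]
  {:where args
   :joins (cond-> []
            join-customer? (conj :customer)
            join-items?    (conj :line-items))})

(defn run-query! [_db query]
  ;; real version: hand the assembled query to Postgres
  query)

(defn orders-resolver
  [context args _value]
  ;; selections-tree describes which fields the client actually asked for,
  ;; so the root query only joins what will be rendered.
  (let [selections (executor/selections-tree context)]
    (run-query! (:db context)
                (build-query args
                             {:join-customer? (contains? selections :Order/customer)
                              :join-items?    (contains? selections :Order/lineItems)}))))
```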

steveb8n18:02:34

At http://Nextdoc.io we use Lacinia w/ Pedestal, running in Datomic Ions (so no N+1 problem). We have re-frame, TypeScript and Salesforce Apex clients talking to it. The TypeScript tooling was very nice, i.e. being able to generate types from the introspection query. We disable introspection in prod and use interceptors to implement server-side “persisted” GQL operations. I also built a Clojure client application generator that uses introspection queries - I hope to open-source it in the future. Lacinia is in the middle of our stack and is critical, so we are very grateful for your efforts.
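
(Roughly how the server-side "persisted" operations idea can look as a Pedestal interceptor: the client sends only an operation id and the server swaps in the stored query text before anything is parsed. The request keys and where this sits in lacinia-pedestal's interceptor chain are assumptions for the sketch, not steveb8n's implementation.)

```
(require '[io.pedestal.interceptor :refer [interceptor]])

;; The only operations the production endpoint will execute.
(def persisted-queries
  {"concept-by-id" "query($id: ID!) { concept(id: $id) { id name } }"})

(def persisted-operation-interceptor
  (interceptor
    {:name ::persisted-operation
     :enter (fn [context]
              ;; :graphql-operation-id / :graphql-query are illustrative keys;
              ;; the real ones depend on how the request body is parsed upstream.
              (let [op-id (get-in context [:request :graphql-operation-id])]
                (if-let [query (get persisted-queries op-id)]
                  (assoc-in context [:request :graphql-query] query)
                  (assoc context :response {:status 400
                                            :body   "Unknown operation id"}))))}))
```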

steveb8n18:02:51

We are starting to use AWS WebSocket APIs, which moves us away from GraphQL, but if Lacinia subscriptions could work with AWS API Gateway and serverless, that would bring us back for that effort as well

Rowland Watkins14:02:02

I’m using Lacinia/Pedestal for B2B services: a mixture of GraphQL, REST, and multipart (I haven’t seen anything concrete for file upload using GraphQL). @U04VDKC4G fantastic library, thank you for all the work you have put into it 🙏

dazld09:03:05

We used Lacinia in a recent project to create a unified API facade over Datomic, ClickHouse and Typesense for a standard TypeScript/React frontend team. I have to say, it was one of the best experiences I’ve ever had in terms of building an API, and any friction was absolutely minimal - stuff like keywords being returned as :string instead of string, or date parsing... nothing crazy. Being able to use tools like GraphiQL locally for talking to the DB was an absolute joy, compared to how much work I’ve had to do in the past to get a graphical API tool going - it almost entirely negates the need for things like Swagger or Postman collections. There was a very minor bump, which was trying to get the SDL out of the schema EDN - unless we missed it, I couldn’t find a built-in way to do that, which would have been useful. From the frontend team’s perspective, the tech used to create the API was totally transparent, and they could just focus on the data - which is how it should be. Things we didn’t do: look into subscriptions or very fine-grained ACL. They’d be part of any future work, I imagine.
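
(On the keyword-vs-string friction mentioned above, one common fix is a small wrapper that stringifies keyword values before Lacinia serializes them. A minimal sketch, assuming resolvers that return plain values rather than async ResolverResults; the function names are illustrative.)

```
(defn stringify-keywords [value]
  (cond
    (keyword? value)    (name value)
    (map? value)        (update-vals value stringify-keywords)   ; Clojure 1.11+
    (sequential? value) (mapv stringify-keywords value)
    :else               value))

(defn wrap-stringify [resolver]
  ;; wrap a resolver so :active comes out as "active"
  (fn [context args parent]
    (stringify-keywords (resolver context args parent))))

(stringify-keywords {:status :active :tags [:clojure :graphql]})
;; => {:status "active", :tags ["clojure" "graphql"]}
```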