Using your example, are you suggesting that we build a new endpoint instance on each request?
No, you run the endpoint function when your system starts up, and that gives you your handler that you can feed into your adapter.
We're trying to see if we can reduce the number of functions that take connections because testing against database instances is easier than testing against connections.
```clojure
(defn endpoint [conn]
  (fn [request]
    (let [db (d/db conn)
          rest-of-handlers (-> (some-handler db)
                               (some-middleware db)
                               (some-other-middleware db))]
      (rest-of-handlers request))))
```
@annataberski and I were just talking about how amazing your response time was to our question. 😄
I see the problem you have. With other databases I’d be inclined to use a protocol as a form of encapsulation.
```clojure
(defprotocol FooDatabase
  (get-foo [abstract-db id]))

(extend-type datomic.Connection ;; or a component record
  FooDatabase
  (get-foo [conn id]
    (get-datomic-foo (db conn) id)))

(defn endpoint [datomic]
  (GET "/foo/:id" [id]
    (get-foo datomic id)))
```
Datomic’s immutable DBs are really useful, but since you usually need to perform writes as well, they’re not a complete abstraction.
Whereas if you wrap the datomic connection, or a containing component, in a protocol, then you have a line of separation.
My general feeling is that it’s a good idea to wrap I/O-like operations in a protocol, both as a way of improving testing, and as a formal border between your functional internal code and the not-so-functional outside.
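One payoff of the protocol boundary is testability: a minimal sketch, assuming a `FooDatabase` protocol like the one above, is to back it with a plain map via `reify` so handlers can be exercised without a real connection.

```clojure
;; Hypothetical protocol standing in for the I/O boundary.
(defprotocol FooDatabase
  (get-foo [db id]))

;; A test double: an anonymous implementation backed by a plain map.
(defn stub-db [foos]
  (reify FooDatabase
    (get-foo [_ id] (get foos id))))

(def fake-db (stub-db {1 {:foo/name "widget"}}))

(get-foo fake-db 1)
;; => {:foo/name "widget"}
```

Anything written against the protocol can now be tested against `stub-db` instead of a live Datomic connection.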
So you're saying that we should coerce the connections into databases dynamically if necessary?
The problem with that is that each middleware will get a different database instance, right?
In which case you might need to add the database to the request. To be clear, I’m not against adding additional information to the request map, but it does represent additional complexity.
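A sketch of that "add the database to the request" idea: one middleware snapshots the connection once per request, so every handler beneath it sees the same immutable database value. Here an atom stands in for the Datomic connection and `deref` for `d/db` (both hypothetical stand-ins so the example is self-contained).

```clojure
;; Snapshot the connection once, then pass the value downstream.
(defn wrap-db [handler conn]
  (fn [request]
    (handler (assoc request :db @conn))))

(def conn (atom {:users {1 "alice"}}))

(def handler
  (wrap-db (fn [req] (get-in req [:db :users 1])) conn))

(handler {})
;; => "alice"
```

Because the snapshot happens in exactly one place, inner middleware and handlers cannot accidentally observe different database instances.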
I didn't like adding it to the request map because it doesn't feel like it's part of the request.
Since I tend to approach web apps with the idea of “How do I throw away information as quickly as possible?”
The sooner we can do that, the less data we have to deal with, and the simpler our software can be.
So you're saying we should put validation in a simple function (rather than middleware) and flatten our middleware down?
Hmm. Does that mean that the request map should be pared down as you go? Or does it even mean that the value flowing through the middleware will cease to be a request map after a certain point?
My feeling is that it’s useful for the request map to be a request map, in that all the information inside tends to be connected. With a lobotomised request map you could run into type problems: something that looks like a request but is missing data.
But once you no longer need the request map, I think it makes sense to throw away all the information you don’t need.
So we could have another chain of handlers inside the bottom-most handler that operates on something other than the request map (and closes over the database).
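That inner chain could be sketched like this (all names are illustrative): the outermost handler pulls what it needs out of the request, then an inner pipeline works on a plain domain map while closing over the database value.

```clojure
;; Inner pipeline: operates on a domain map, not a request map.
(defn inner-chain [db]
  (fn [{:keys [user-id] :as action}]
    (assoc action :user (get-in db [:users user-id]))))

(defn endpoint [db]
  (let [process (inner-chain db)]
    (fn [request]
      ;; Throw the request map away as early as possible.
      (process {:user-id (get-in request [:params :id])}))))

(def handler (endpoint {:users {7 {:name "alice"}}}))

(handler {:params {:id 7}})
;; => {:user-id 7, :user {:name "alice"}}
```

Past the `endpoint` boundary, nothing downstream ever sees (or depends on) a request map.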
I think I’d need to think about this, and it really depends on what you’re doing! Have you taken a look at Duct?
So… let’s say you’re getting some data in. You need to validate it, and if it’s okay you put it in the database, and return some result. To me that feels like a polymorphic function. Or something that calls polymorphic functions.
On the type of the data you’re receiving. Or… since you’re using Datomic, maybe you just treat each attribute as individually validated.
```clojure
(defmethod validate :user/password [m k v]
  (and (string? v)
       (= v (:user/retype-password m))))
```
For example, required attributes for each "model" are expressed via an attribute attached to the entity of the attribute that is required. (Hope that makes sense.)
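One way that required-attribute check might look, as a sketch with an in-memory stand-in for the schema (the `required-attrs` map and keyword names are illustrative, not Datomic’s):

```clojure
;; Hypothetical per-"model" required attribute sets.
(def required-attrs
  {:model/user #{:user/email :user/password}})

;; Which required attributes is this entity map missing?
(defn missing-attrs [model entity]
  (remove #(contains? entity %) (required-attrs model)))

(missing-attrs :model/user {:user/email "a@b.c"})
;; => (:user/password)
```

In a real system the sets would be read from the schema itself rather than hard-coded, but the shape of the check is the same.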
I think so… I tend to look at functions and ask “What does this function need to know?”
I think in our early design of this system we overused request middleware. We're coming around, but it's taking a while.
Clojure has a lot of tools for dealing with complexity and encouraging simplicity.
There’s no strict type system, framework, or whatever to enforce the idea of keeping things simple.
And because Clojure programming is different from OOP, it’s easy to layer in complexity without realizing it.
OOP takes a defensive approach to complexity: you have classes and objects that set up walls to contain complexity and mutable state.
With Clojure, at least to me, it feels like you have to go on the offensive. Be very aggressive and eliminate instead of contain.
So OOP says “let’s contain mutable state in objects”, and Clojure says, “let’s eliminate mutable state where possible”, as one example.
So I think my approach is to ask what a function needs, and whether its needs can be simplified. For instance, an immutable database is simpler than a raw connection.
So I’m in favour of your approach of trying to make as many things as possible use databases rather than connections. But ideally you’d want to do it without drawing in the request, if possible.