2021-01-11
Channels
- # aws (3)
- # babashka (67)
- # beginners (284)
- # calva (19)
- # cider (12)
- # cljdoc (9)
- # clojure (111)
- # clojure-austin (4)
- # clojure-europe (34)
- # clojure-france (12)
- # clojure-greece (2)
- # clojure-nl (14)
- # clojure-taiwan (2)
- # clojure-uk (11)
- # clojurescript (34)
- # community-development (2)
- # conjure (8)
- # datomic (15)
- # events (3)
- # fulcro (12)
- # jobs (3)
- # leiningen (4)
- # malli (3)
- # meander (11)
- # mount (2)
- # off-topic (29)
- # pathom (11)
- # re-frame (31)
- # reagent (19)
- # remote-jobs (3)
- # reveal (8)
- # rewrite-clj (1)
- # sci (1)
- # shadow-cljs (8)
- # spacemacs (4)
- # sql (1)
- # startup-in-a-month (2)
- # tools-deps (2)
- # vim (7)
- # xtdb (6)
Does anyone have an idea how you could detect that *ns* has changed? Or more like, that we left the current namespace?
Does anyone here have experience parameterizing a UUID string?
I am not able to get the following query to work in next-jdbc:
["SELECT * FROM sessions WHERE (session_id = ?)" "1594d14d-91f7-4c8e-88cb-b06ca14ada0f"]
It throws the following error:
ERROR in (logout) (QueryExecutorImpl.java:2433)
Uncaught exception, not in assertion.
expected: nil
actual: org.postgresql.util.PSQLException: ERROR: syntax error at or near "WHERE"
Position: 18
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse (QueryExecutorImpl.java:2433)
org.postgresql.core.v3.QueryExecutorImpl.processResults (QueryExecutorImpl.java:2178)
org.postgresql.core.v3.QueryExecutorImpl.execute (QueryExecutorImpl.java:306)
org.postgresql.jdbc.PgStatement.executeInternal (PgStatement.java:441)
org.postgresql.jdbc.PgStatement.execute (PgStatement.java:365)
org.postgresql.jdbc.PgPreparedStatement.executeWithFlags (PgPreparedStatement.java:155)
org.postgresql.jdbc.PgPreparedStatement.execute (PgPreparedStatement.java:144)
I think
["SELECT * FROM sessions WHERE (session_id = ?)" (UUID/fromString "1594d14d-91f7-4c8e-88cb-b06ca14ada0f")]
should work, after importing java.util.UUID of course.
Curious why we need to change a string to a UUID before querying against a string column. session_id is a varchar in the db.
Trying the above line I get the following error:
Uncaught exception, not in assertion.
expected: nil
actual: org.postgresql.util.PSQLException: ERROR: operator does not exist: character varying = uuid
Hint: No operator matches the given name and argument types. You might need to add explicit type casts.
Position: 42
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse (QueryExecutorImpl.java:2433)
Ah, it is a varchar? I thought it was a UUID in Postgres. If it isn't, then it is just a normal String parameter, isn't it?
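For reference, a minimal sketch of two ways around the varchar-vs-UUID mismatch, assuming a next.jdbc datasource named ds (a hypothetical name, not from the thread):
(require '[next.jdbc :as jdbc])
;; the column is varchar, so a plain string parameter should bind fine:
(jdbc/execute! ds ["SELECT * FROM sessions WHERE session_id = ?"
                   "1594d14d-91f7-4c8e-88cb-b06ca14ada0f"])
;; if you already hold a java.util.UUID, cast the parameter in the SQL itself:
(jdbc/execute! ds ["SELECT * FROM sessions WHERE session_id = ?::varchar"
                   (java.util.UUID/fromString "1594d14d-91f7-4c8e-88cb-b06ca14ada0f")])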
It is strange that it says it is a syntax error. Does it work if you pass in the full SQL statement without the param? i.e. ["SELECT * FROM sessions WHERE session_id = '1594d14d-91f7-4c8e-88cb-b06ca14ada0f'"]
@U1CQ2GB3M Log all the PG statements (via the server config, not via CLJ) and see what the actual statement is that gets executed.
@U2A8HLYQ7 yup it works if I pass it within the sql statement.
@U1CQ2GB3M Sorry, was AFK. Next step would be to try it with ["SELECT * FROM sessions WHERE session_id = ?" "1594d14d-91f7-4c8e-88cb-b06ca14ada0f"].
If it fails, can we see the code that gets executed and the table definition?
Also, what @U2FRKM4TW suggested (getting the logs of the queries received) makes a lot of sense if it is feasible to get to the DB somehow.
this was a good excuse to look a bit at the next.jdbc source code, but I don't think anything "special" is happening in there that might cause this error, so I suspect there is something wrong with that SQL and param vector from your initial post.
Anyways, you could do the following to see the SQL statement that will be sent to PostgreSQL for execution:
(import org.postgresql.jdbc.PgPreparedStatement)
(def ps (.prepareStatement (.getConnection YOUR_DATASOURCE) "SELECT * FROM sessions WHERE (session_id = ?)"))
(.setObject ps 1 "1594d14d-91f7-4c8e-88cb-b06ca14ada0f")
(str ps)
"SELECT * FROM sessions WHERE (session_id = '1594d14d-91f7-4c8e-88cb-b06ca14ada0f')"
Just in case - what PostgreSQL ends up doing can be infinitely complex. It all depends on your schema. E.g. you can have a bazillion INSTEAD OF rules and some triggers that end up trying to execute broken SQL that has absolutely nothing to do with the original query.
What makes me especially suspicious is that the error says Position: 18. But the 18th character of that query is in the middle of the sessions word. It doesn't make any sense.
anyways, the code above does about what next.jdbc is doing, so if the SELECT string looks legit (and I don't see why it wouldn't) the OP will know that the problem isn't in next.jdbc as originally suspected
@U2A8HLYQ7 @U2FRKM4TW You guys are right. The issue was with one of my queries. Apparently I was focusing on the wrong query. The query below works fine.
(db/query! (sql "select *"
"from sessions s"
"inner join users on users.id = s.user_id"
(where {:s.session_id session_id})))
The issue was with the following query:
(db/query! (sql "UPDATE sessions"
(set {:idle_timeout 0})
(where {:session-id session-id})))
Instead of set I should have used set-values, as per the doc here: https://github.com/ludbek/sql-compose#set-values . It's funny cos I wrote that little package.
Anyways thanks guys. As suggested, watching the db log was helpful.
Hey guys. In this code:
(defn factors [n]
(filter #(zero? (mod n %)) (range 1 (inc n))))
(defn prime? [n]
(= (factors n) [1 n]))
(def all-primes
(filter prime? (range)))
if you do (time (take 10000 all-primes)) the reported time will print instantly. Why is that, although take executes the first n all-primes? I am familiar with the fact that take returns a lazy sequence, but is there a better way to calculate the execution time? Do I have to do, for example, (last (take n all-primes))?
user=> (do (time (filter odd? (range 1000000))) nil)
"Elapsed time: 0.014845 msecs"
nil
user=> (do (time (doall (filter odd? (range 1000000)))) nil)
"Elapsed time: 45.723745 msecs"
nil
The do with nil in there is just so that the REPL does not realize the seq by printing it all out.
doall would work, but I would use either (last (take n)) or (first (drop n)), since doall holds the seq in memory and you only want the time.
dorun is better still :)
but in practice you would probably want the nth prime ... dorun is good for just timing it
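For example, a minimal sketch of the dorun approach, reusing the defs above:
;; realizes the first 10000 primes only for the timing side effect,
;; without holding on to the whole seq (note: the naive prime? above
;; makes 10000 quite slow; try a smaller n first)
(time (dorun (take 10000 all-primes)))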
just for completeness, (time (nth all-primes 10000)) does the job as well and it is shorter 🙂
One more question, if you do
(defn pinc [n] (println "*") (inc n))
(def arr (map pinc (range)))
(take 10 arr)
The last line will indeed print * 10 times (so the function will execute on the first n elements). Why does time not report that?
(take 10 arr) returns the head of a seq. That gets passed to time when you do (time (take 10 arr)). time prints the elapsed time (a few milliseconds) and returns the head of the seq to the REPL.
The REPL prints it out and that's when it gets evaluated.
if you do something larger again, e.g. (time (take 10000 arr)), it might be more obvious that time prints out the elapsed time first and then the arr is printed out.
I have a seq of foos that I want to process sequentially, and for each foo I might generate zero, one or "a few" bars. The way I'd usually do this is with a (mapcat identity (for …)), just because I like the way for looks, but I'm not so wild about mapcat identity (and the apply-concat issue: http://chouser.n01se.net/apply-concat/)… Is this a transducer-shaped hole?
@orestis I think so:
user=> (into [] (comp (map (fn [x] [x x])) cat) [1 2 3])
[1 1 2 2 3 3]
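Worth noting: the transducer arity of mapcat is defined as (comp (map f) cat), so the same thing can also be written as:
user=> (into [] (mapcat (fn [x] [x x])) [1 2 3])
[1 1 2 2 3 3]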
So I guess as a follow-up: is it common for internal APIs to implement business logic by returning a transducer? In my case, the logic is all in the function that takes a foo and returns some bars. There will be multiple of those functions (probably a multimethod), but I guess I'm a little bit concerned about the plumbing since they need to be a bit efficient. I guess what I'm asking is: is it better to return a transducer that can be directly applied to a seq, or a function that can only work with the correct plumbing?
when faced with this decision (which pops up from time to time) I tend to choose vanilla functions. My reasoning is - which of these is more agnostic and reusable?
(defn logic1 []
(fn []
...))
(defn logic2 []
(map (fn []
...)))
(defn logic3 [xs]
(map (fn []
...)
xs))
(fn, transducer and call to a coll-processing function, respectively)
In a way, a vanilla fn is a superset of the other two options. By picking the vanilla fn, I can choose laziness or a transducer, map or filter, a la carte
what do you mean by "concerned about plumbing"? do you mean tied to particular transducing contexts, or coupled with the shape of the input or something?
@orestis You can also consider an API that returns an eduction which in loosey-goosey terms is a transducer coupled with its input source
Anyone know any good examples of eduction being used like this? I feel like it's the part of transducers I never fully "got". I've re-read the official transducers page and the clojuredocs page for eduction a few times but it's not quite clicking
@jjttjj we use a couple of functions that return an eduction as the result from a SQL query. You can do transformations on this, but not until you consume the eduction will the SQL query be executed and the results streamed
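A rough sketch of what such a function could look like with next.jdbc, with hypothetical names and a datasource ds assumed (not the poster's actual code):
(require '[next.jdbc :as jdbc])

(defn session-ids
  "Returns an eduction over the query; nothing hits the database
   until the result is reduced/consumed."
  [ds]
  (eduction (map :session_id)
            (jdbc/plan ds ["SELECT session_id FROM sessions"])))

;; consuming it executes the query and streams each row through the transform:
(into #{} (session-ids ds))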
by transformations you mean like (map my-fn eduction-result), right? like the regular sequence functions?
(into [] (map my-fn) eduction-result)
though this is what I'm not sure of: whether this will create a single transducer pass over the contained eduction collection or whether it will be two passes
@U0K064KQV eductions don't do passes until you consume them. you can compose them with other transducers and they will be "merged"
I also wasn't sure if they'd be merged in all contexts. Like maybe you can:
(->> (eduction (map inc) [1 2 3 4])
(eduction (filter even?)))
But can you:
(->> (eduction (map inc) [1 2 3 4])
(into [] (filter even?)))
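For what it's worth, both forms should work: the eduction's transform is applied as elements are reduced into the outer transducing context, so the source is still traversed only once. A quick REPL check:
user=> (->> (eduction (map inc) [1 2 3 4])
            (into [] (filter even?)))
[2 4]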
Neato, used like this eduction is pretty nice actually. Never really thought of using it.
I'd say it depends, are they going to process the returned collection further? If so, and you care about performance, probably best you return a transducer or an eduction.
I just want to make Clojure fans aware that in #startup-in-a-month you will be able to follow @andyfry01 picking up Clojure for his project of starting 12 apps in 12 months. As I understand things, he is not completely new to functional programming, but there sure will be Clojure things he needs to figure out. So maybe consider being part of his line of support and helping him create as many success stories for Clojure as possible over the following year? ❤️
Thanks for the plug @pez! I appreciate it. You can indeed check out the #startup-in-a-month channel for posts, and I'll also plug a few links as well: • My Twitch channel, where you can watch me livestream the whole development process! https://www.twitch.tv/a_fry_ • The project blog, which has a bunch of related links and more info on the project: https://startupinamonth.net/
I guess the semantics I'm debating is: should the API itself be concerned with transforming a single element, or with transforming streams/sequences?
I've implemented things as transducers, and then just have another arity that adds an xs arg and does (into [] (this-fn xf-args) xs) for easier use at the REPL with sequences
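A minimal sketch of that two-arity shape, with a hypothetical foo->bars step (names invented for illustration):
(defn expand-foos
  "Transducer arity: turns each foo into zero or more bars.
   Collection arity: applies the same transform eagerly, handy at the REPL."
  ([foo->bars]
   (mapcat foo->bars))
  ([foo->bars xs]
   (into [] (expand-foos foo->bars) xs)))

;; usage sketch
(expand-foos (fn [foo] (when (odd? foo) [foo (* 10 foo)]))
             [1 2 3])
;; => [1 10 3 30]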
@jjttjj that’s an interesting approach, I’ll have a think about it. I need a REPL to try things out :) but it helps discussing here in the abstract too
A random observation (or two): I'm converting an important CI (bash) script, which mainly contains a series of small inline awk scripts, to Clojure (babashka). It is such bliss being equipped with a REPL! Interestingly, the LOC count more than doubles. I hadn't expected that.
Indeed. Back in the day I wrote whole systems in awk. That is actually a thing I like about Clojure: it somehow reminds me of when I was so productive with awk.
awk is an incredible tool when you have an awk-shaped problem
This is one such problem, plus the fact that I tend to shape my problems like that. However, I am so happy that babashka helps me write it in Clojure that I can hardly put it into words. The requirements list for this pipeline used to stress me out, but now I find myself smiling when I look at it.
There's probably some utils you could build up to imitate a more awk-like workflow, in terms of read line, process, repeat
that would be a cool library
can you give a clojure (pseudo-code) example of what you would normally do in awk? I am lacking awk knowledge
So, off the top of my mind, when I think about what awk is… It has this pattern -> action structure that it applies to all rows of the input. The pattern is a predicate of sorts, having access to the current row as input. The action is code executed when the pattern matches; it also has access to the current row as input. Both the pattern and the action "automatically" have the current row split up into columns. Both also have access to any state set up by actions. Patterns can be of a special FROM, TO variant, which I think of as two patterns: it will match from when FROM is true until TO is true. (Both the pattern and the action have syntactic convenience access to the current row, but I don't think that is what makes so much difference.) Pseudocode… I have to think about it a bit…
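A rough sketch of what a pattern -> action helper could look like in Clojure, assuming rules are pairs of [pattern action] functions over the split-up row and some accumulated state (a hypothetical helper, just to make the idea concrete):
(require '[clojure.string :as str])

(defn awk-over-lines
  "Runs each [pattern action] rule against every line.
   Both pattern and action receive the row's columns and the accumulated
   state; actions return the new state (awk-style variables)."
  [rules init-state lines]
  (reduce (fn [state line]
            (let [cols (str/split line #"\s+")]
              (reduce (fn [st [pattern action]]
                        (if (pattern cols st) (action cols st) st))
                      state
                      rules)))
          init-state
          lines))

;; usage sketch: sum the second column of rows whose first column is "ERROR"
(awk-over-lines
 [[(fn [cols _] (= (first cols) "ERROR"))
   (fn [cols st] (update st :total + (Long/parseLong (second cols))))]]
 {:total 0}
 ["ERROR 3" "OK 5" "ERROR 4"])
;; => {:total 7}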
Haha, yes, some of the problems would lend themselves to awk. It is a quite full-featured language as such, but a bit like early JavaScript, where sourcing in existing code bases is not really solved. (But there is cat and stuff in the shell that bridges this to some extent.)
Here's an example which displays some of what I tried to describe, and also highlights some things I forgot to mention. It's from https://stackoverflow.com/a/33110779/44639, an answer to a question about how to implement tail in awk:
awk -v tail=10 '
{
output[NR % tail] = $0
}
END {
if(NR < tail) {
i = 0
} else {
i = NR
}
do {
i = (i + 1) % tail;
print output[i]
} while (i != NR % tail)
}'
The things it shows that I forgot to mention are that the default pattern is to match all rows, and that there are special patterns, BEGIN and END, that match before and after the input. I somehow think that in a Clojure context BEGIN/END is not so important, but I might be overlooking something.
> should print the last 10 lines of a text file
cat README.md | bb -io -e '(drop (- (count *input*) 10) *input*)'
Indeed. Babashka has a lot of the awk feeling in my book. Though awk works one row at a time, so slurping up the whole input is not usually what you do.
*input* is lazy, but to get the count means that it will be realized fully. how does awk do this?
AWK has three phases: do something at the beginning, do something for each line (<process-line>), and do something at the end. And you can set variables in the begin phase, set them in the process phase, and use them in the end phase
Yeah. The awk script gets called before the input, once for each input row (where "row" is defined by a separator regex), and then after the input.
So how does it know the amount of lines at the start, it reads the entire file probably?
It can't know the number of lines at the start. That is known only in the action matching the END pattern.
tail is the variable fed to this particular script. NR keeps track of the current row number. (Which then, in the END pattern, is the total number of lines.)
output is an array. An associative array, as it happens, because all arrays in awk are like that.
ah, ok, so it assigns the current line to the (NR mod tail)-th element, so in the end you will have the last 10 lines in the array
Print a random line from a file:
awk 'rand() * NR < 1 { line = $0 } END { print line }' file.name
This seems to do what awk does then, sort of:
$ cat README.md | bb -io '(first (reduce (fn [[tail nr] line] [(assoc tail (mod nr 10) line) (inc nr)]) [(vec (repeat 10 nil)) 0] *input*))'
Cool. The awk script for printing a random line uses a Knuth trick to avoid realizing the whole file. So a reduce might be in order again. Maybe a reduce context is what we can imagine any awk script to "live" in.
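A sketch of that Knuth-style trick as a reduce (each line replaces the current pick with probability 1/n, so the whole file never has to be realized), which should slot into a bb -io one-liner like the one above:
;; pick one random line from *input* without holding all lines in memory
(first
 (reduce (fn [[pick n] line]
           (let [n (inc n)]
             [(if (< (rand) (/ 1.0 n)) line pick) n]))
         [nil 0]
         *input*))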
Hey guys, I am seeing strange behavior with tolitius/mount. Is anyone here an expert?
@doubleagent it's probably more productive to just post the question - I've solved people's mount problems in the past based on clojure knowledge despite never having used mount
I would normally do that, but I'm taking a different approach since I can't post the code.
I fixed the issue but I don't have any explanation for why the issue was happening in the first place.
#mount for details