2018-11-07
# clojure-uk
morning
Morning all, cheers @jasonbell. I’m in Bristol and try as often as possible to get myself along to our Clojure (study) meetup. Found the UK channel and thought I’d stick my head in 😄
Welcome Rod 🙂 Where are you based in Bristol? Do you work for Ovo on their Clojure team there?
I saw Juxt's blog post about Crossref (https://juxt.pro/blog/posts/clojure-in-crossref.html)
iirc they or somebody else were hiring and posted in the jobs channel recently
All the major hat tipping and thank yous to @dominicm for the Aleph/Yada SSL demo on GitHub. Worked first time, thank you!
Anyone got any idea how I can create a Transit reader that does not need a ByteArray as a parameter...?
There are 2016-vintage tutorials that show this:
(def r (cognitect.transit/reader :json))
(cognitect.transit/read r "{}")
type of thing, but when I try to create a reader like this I get an arity error, and it seems to be the case that the transit/reader function wants a ByteArray, and indeed for that to be the content I want to read... Have I misunderstood...?
I've looked at the source code - I need to supply an InputStream, I can see that... Need to fix this tomorrow...
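(A minimal sketch, assuming transit-clj on the JVM: the one-argument (transit/reader :json) form in those tutorials is the ClojureScript API; the JVM reader wants an InputStream, so a string can be wrapped like this.)
(require '[cognitect.transit :as transit])
(import 'java.io.ByteArrayInputStream)
;; wrap the string's bytes in an InputStream for the JVM reader
(let [in (ByteArrayInputStream. (.getBytes "{}" "UTF-8"))
      r  (transit/reader in :json)]
  (transit/read r))
;; => {}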
I have a Luminus app that’s coercing date/time values incorrectly when they’re pulled from MySQL.
| daa2f7fe-fafb-4525-8bac-c991c37b799b | 2018-11-30 13:00:00 |
from the db becomes #object[org.joda.time.DateTime 0x422e20d2 2018-11-30T07:00:00.000Z]
, is there any verbose logging I can turn on? I see that [clj-time.jdbc]
is used when the database connection is established with Luminus, but I’m at a loss on how to fix this one. Wondering if @seancorfield has come across this one before?
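(For context, a rough paraphrase of what clj-time.jdbc does: requiring it extends clojure.java.jdbc's protocols so java.sql.Timestamp columns come back as org.joda.time.DateTime, roughly via clj-time.coerce.)
(require '[clj-time.coerce :as tc])
;; from-sql-time yields a DateTime pinned to UTC, hence the Z offset
(tc/from-sql-time (java.sql.Timestamp. (System/currentTimeMillis)))
;; => #object[org.joda.time.DateTime ... "...Z"]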
And the local time of the db server is correct….
$ date
Wed 7 Nov 2018 21:16:53 GMT
@jasonbell where is the conversion code?
What is the timezone of your database and what is the timezone of your JVM/application?
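(A quick way to check the JVM side from the REPL:)
(java.util.TimeZone/getDefault)       ;; the JVM's default time zone
(System/getProperty "user.timezone")  ;; the same setting as a property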
Hmm, I've only ever seen that sort of thing when the TZ of the JVM pulling the data is different to the DB...
user> (t/now)
#object[org.joda.time.DateTime 0x44cf71f4 "2018-11-07T21:21:03.981Z"]
And that's kind of an odd TZ offset to adjust by (6 hours). That's neither West Coast nor East Coast nor anywhere in Europe...?
My thinking too. I’ll keep digging around. If I come up with anything I’ll let you know.
Is it consistently wrong by 6 hours on all date/times coming from the DB?
mysql> select now();
+---------------------+
| now() |
+---------------------+
| 2018-11-07 21:22:38 |
+---------------------+
1 row in set (0.00 sec)
Yes it is, it’s weird. It might be some strange HugSQL bug; that’s the only black-box bit where I can’t see what’s going on
In your REPL, just as a sanity check, what do (java.util.Date.)
and (clj-time.coerce/from-date (java.util.Date.))
return?
user> (java.util.Date.)
#inst "2018-11-07T21:25:04.296-00:00"
user> (clj-time.coerce/from-date (java.util.Date.))
#object[org.joda.time.DateTime 0x7621213d "2018-11-07T21:25:26.398Z"]
user>
The Luminus/HugSQL combo is using Conman to do its SQL execution, so I’m going to REPL that and see what happens
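(One way to narrow that down, as a sketch; db-spec and the table/column names are hypothetical: bypass HugSQL/Conman and see what raw JDBC returns.)
(require '[clojure.java.jdbc :as jdbc])
;; db-spec, table and column names are placeholders
(jdbc/query db-spec
            ["SELECT id, start_time FROM bookings WHERE id = ?"
             "daa2f7fe-fafb-4525-8bac-c991c37b799b"])
;; if the DateTime is already shifted here, the coercion is happening in
;; the driver / clj-time.jdbc layer rather than in HugSQL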
Two things: 1) check that your mysql client does not have some weird default time zone per session which auto-converts the timestamp
^ I’ve gotten bitten by that when my app was showing different times than my command line client
2) what happens if you store the data as a timestamp without time zone? (Not sure what the MySQL equivalent of that type is)
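(For 1), the session and global zones can be checked over the same connection; db-spec is again hypothetical:)
(jdbc/query db-spec
            ["SELECT @@global.time_zone AS global_tz, @@session.time_zone AS session_tz"])
;; 'SYSTEM' means the server's OS zone; MySQL converts TIMESTAMP values
;; to and from UTC using the session time zone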
we had a problem like 2) recently, where Postgres was pushing UTC times into BST
it was something silly, like times being coerced wrongly by some library; we had to make sure everything had a Z at the end or something
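(A common fix for this class of bug, offered here as an assumption rather than the chat's conclusion: pin both ends to UTC so nothing converts behind your back.)
;; JVM side: start with -Duser.timezone=UTC, or force it early on
(java.util.TimeZone/setDefault (java.util.TimeZone/getTimeZone "UTC"))
;; connection side: serverTimezone is a MySQL Connector/J URL parameter
;; (the URL itself is a placeholder)
(def db-spec
  {:connection-uri "jdbc:mysql://localhost/mydb?serverTimezone=UTC"})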