This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2017-02-06
Channels
- # aleph (2)
- # aws (3)
- # bangalore-clj (3)
- # beginners (119)
- # boot (263)
- # cider (13)
- # cljs-dev (16)
- # clojars (2)
- # clojure (114)
- # clojure-austin (1)
- # clojure-chicago (1)
- # clojure-finland (1)
- # clojure-france (24)
- # clojure-italy (6)
- # clojure-russia (28)
- # clojure-serbia (7)
- # clojure-spain (1)
- # clojure-spec (89)
- # clojure-uk (139)
- # clojurescript (216)
- # community-development (3)
- # core-async (135)
- # css (2)
- # cursive (31)
- # datomic (44)
- # emacs (15)
- # hoplon (2)
- # jobs (3)
- # lein-figwheel (14)
- # leiningen (2)
- # lumo (21)
- # off-topic (16)
- # om (7)
- # om-next (1)
- # onyx (53)
- # perun (9)
- # planck (15)
- # portland-or (29)
- # protorepl (2)
- # re-frame (32)
- # reagent (8)
- # ring-swagger (22)
- # rum (51)
- # spacemacs (4)
- # untangled (2)
Something like https://github.com/Gonzih/cljs-electron ?
Another one is http://descjop.org/ (although I haven't tried it yet).
hi all, has anyone here managed to use clj-http with a client-side cert? When I try to make a call after providing the required keystores and trust stores I get the following error: (CertPathValidatorException Path does not chain with any of the trust anchors sun.security.provider.certpath.PKIXCertPathValidator.validate (PKIXCertPathValidator.java:153)). I'm too much of a clojure/java noob to know how I'm supposed to fix that
Navigating to source using M-. displays "Visit tags table (default TAGS)" instead of navigating. Don't know what happened; it was working fine before.
@rc1140 That error is very generic and merely means that the certificate presented could not be identified as trusted. That may happen for several trivial reasons not related to the content of the certificates. It could be a configuration error or, something that has tripped me up several times, an assumption in Java keystore handling about what names the certificates should be stored under in the keystore.
@rc1140 Have you looked at https://github.com/aphyr/less-awful-ssl ?
@fnil thanks for getting back to me, forgot to update the channel; the issue in my case was that the server cert was outdated 😞
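For anyone who lands on the same error: a minimal sketch of passing a client keystore and truststore to clj-http. The paths and passwords below are placeholders; clj-http accepts these options directly on the request map.

```clojure
(require '[clj-http.client :as http])

;; Placeholder paths/passwords -- substitute your own JKS files.
;; :keystore holds the client cert + private key; :trust-store must
;; contain the server's CA chain, or you get the "does not chain with
;; any of the trust anchors" error discussed above.
(http/get "https://example.com/api"
          {:keystore         "client.jks"
           :keystore-pass    "keystore-pw"
           :trust-store      "truststore.jks"
           :trust-store-pass "truststore-pw"})
```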
Yeah, this is the one I was thinking about: https://github.com/gerritjvv/clj-tcp
I have used Netty from Java before and was quite happy with it. Never tried the above library (clj-tcp) though.
Can anyone say why I get the following exception?
user=> (import 'java.util.zip.ZipOutputStream)
java.util.zip.ZipOutputStream
user=> ZipOutputStream.
CompilerException java.lang.ClassNotFoundException: ZipOutputStream., compiling:(/tmp/form-init5711048387653942292.clj:1:7619)
Actually, I presume it's because it isn't in openjdk.
@danielstockton works for me here w/openjdk.
Hmm, yeah. I get the same exception with GZIPOutputStream too.
openjdk version "1.8.0_121"
I do get a different error, however:
CompilerException java.lang.IllegalArgumentException: No matching ctor found for class java.util.zip.ZipOutputStream, compiling:(/private/var/folders/lj/7jsmpmqs3f18n08f1gm5l6rw0000gn/T/form-init526093045682586576.clj:1:1)
This means the class can actually be found. Did you do (ZipOutputStream.)? Because I get that error trying to wrap it in parens.
yes, give it something and it works:
(java.util.zip.ZipOutputStream. (ByteArrayOutputStream.))
Oh, I see. I am trying to give it something, but I was using the threading macro. I guess that's a no-no.
(-> ZipOutputStream.
(Channels/newChannel)
(.write (fress/write {:bob :test})))
I lied (unintentionally) 😄 I'm not giving it a baos
(-> (ZipOutputStream. (java.io.ByteArrayOutputStream.))
(Channels/newChannel)
(.write (fress/write {:bob :test})))
is my code now. I've got a different error but I should be good to continue on my own. Thanks.
It works if I use a GZIPOutputStream.
just to be clear about what mpenet just said: the first arg of most (all?) core threading macros is a form that is evaluated, and a bare ZipOutputStream. is not what you want. Even if there were a no-arg constructor for the class, you would still have to call the constructor by using (ZipOutputStream.)
Yep, I understood, thanks.
I knew that but was getting confused by all the errors.
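A minimal sketch of the point above: a bare `ZipOutputStream.` is just a symbol that the compiler tries (and fails) to resolve as a class, while the parenthesized form actually calls the constructor, so it works inside `->`.

```clojure
(import '(java.util.zip ZipOutputStream ZipEntry)
        '(java.io ByteArrayOutputStream))

;; (-> ZipOutputStream. ...) fails: "ZipOutputStream." is treated as a
;; class name, not a constructor call. Call the constructor explicitly:
(-> (ZipOutputStream. (ByteArrayOutputStream.))  ; evaluated form, not a bare symbol
    (.putNextEntry (ZipEntry. "data.txt")))
```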
I'm not really familiar with Executors... I have a fixed threadpool in which I'd like to initialize every thread with a Kafka consumer before they start work (pulling data from Kafka using the consumer)
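One way to get per-thread initialization with a fixed pool is a `ThreadLocal`: each worker thread builds its own consumer on first use and reuses it afterwards. A hedged sketch; `make-consumer` below is a hypothetical stand-in for real KafkaConsumer construction.

```clojure
(import '(java.util.concurrent Executors))

;; Hypothetical: replace with actual KafkaConsumer construction.
(defn make-consumer [] (Object.))

;; ThreadLocal/withInitial calls the Supplier once per thread, on first .get.
(def thread-consumer
  (ThreadLocal/withInitial
    (reify java.util.function.Supplier
      (get [_] (make-consumer)))))

(def pool (Executors/newFixedThreadPool 4))

(defn poll-task []
  (let [consumer (.get thread-consumer)]  ; same instance per thread, never shared
    ;; poll and process with consumer here...
    consumer))
```

This avoids sharing one consumer across threads (Kafka consumers are not thread-safe) without needing a custom ThreadFactory.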
does anyone have a basic working electron app I can copy/paste? the first two results I got from github are not working for me
hello everyone, using schema.core
how can I create a field with a varying type? something like s/Str :or s/Num
cond-pre (https://github.com/plumatic/schema/blob/master/src/cljx/schema/core.cljx#L561) is what you want for that, @plins
still struggling to get this working
(schema/defschema Table
(schema/cond-pre {:headers [schema/Str]
:rows [[schema/Num]]}
{:headers [schema/Str]
:rows [[schema/Str]]}))
any ideas on how this should work?
I think you can also use s/conditional
(def EitherFooOrBar
(s/conditional #(foo-like? %) Foo :else Bar))
@plins don't understand your example
Is anyone here familiar with high performance in clojure? I'm hitting a bottleneck using an executor fixed thread pool and I'm not sure how to reason about it
@pesterhazy, i want to create a field inside a schema, which can be either an integer or a String
https://gist.github.com/shayanjm/1ab27912f94219728fbc3cb8b4eb704b#file-consumer-clj-L30
Basically, I'm trying to figure out how to speed up this particular consumer. Its only job is to consume from a single topic and write the data to a DB. The timestamp is calculated before the DB query is even fired
but as the input speeds up, the average elapsed time gets huge (>2 secs for ~500 bursted messages)
Not really sure what's going on. Would love to speed this up but just throwing threads at it hasn't really been helping either
schema works well for many of us
@plins here's what I think you want from your example:
(schema/defschema Table
{:headers [schema/Str]
:rows [[(schema/cond-pre schema/Str schema/Num)]]})
this returns a schema where the given item must be a map that contains only the keys :headers and :rows; :headers must be a (possibly empty) list/vector of strings; and :rows must be a vector of vectors (any of them possibly empty, or the entire containing vector might be empty); each of these sub-vectors/lists will be composed of a mix of strings and numbers
so 10 db transactions take 100ms — and 500 transactions take 2000ms (or more) … is that right? @shayanjm
if you want each of the subvectors/lists to be composed of only strings or only numbers, but to allow either of them as possibilities:
(schema/defschema Table
{:headers [schema/Str]
:rows [(schema/cond-pre [schema/Str] [schema/Num])]})
(def str-or-number (schema/cond-pre schema/Num schema/Str))
(schema/defschema Table
{:headers [str-or-number]
:rows [[str-or-number]]})
ended up with this
yep, that would do it
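To round out the schema thread, a small sketch of how the final schema would be exercised with `s/validate`, which returns the value when it conforms and throws otherwise (the sample data here is illustrative).

```clojure
(require '[schema.core :as s])

(def str-or-number (s/cond-pre s/Num s/Str))

(s/defschema Table
  {:headers [str-or-number]
   :rows    [[str-or-number]]})

;; Returns the validated value; mixed strings and numbers are allowed
;; anywhere thanks to cond-pre on the leaf schema.
(s/validate Table {:headers ["name" "count"]
                   :rows    [["a" 1] [2 "b"]]})
```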
hi there, just compiled an uberjar for my project (which is basically just dependencies for the moment) and it's spitting out a 25MB jar. Is this normal?
@shayanjm step 1: turn on reflection warnings via (set! warn-on-reflection true) at the top of the file
bleh....
(set! *warn-on-reflection* true)
@joshjones - a better statement would be that "getting" 500 messages takes 2000ms or more
The compiler will then warn you about several reflection issues your code has. That can make a huge difference. Especially in get-next-message
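A tiny illustration of the kind of fix those warnings point at: an unhinted interop call is resolved by reflection at runtime, while a type hint lets the compiler emit a direct method call.

```clojure
(set! *warn-on-reflection* true)

;; Without a hint, the compiler can't resolve .getBytes and emits a
;; reflection warning; every call goes through reflection.
(defn slow-bytes [s] (.getBytes s))

;; A ^String hint resolves the method at compile time.
(defn fast-bytes [^String s] (.getBytes s))
```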
@tbaldridge before or after the ns block?
after
aside from that, I'd look into what your db/insert-match<! function is doing. I'd stub that out for testing at first, and afterwards perhaps look into batching writes to the database
@tbaldridge: Yeah, so that timestamp is recorded before insert-match<! is fired. The elapsed time on the timestamp is irrespective of how long it takes to transact to the DB
@shayanjm also, I'd look a bit into how you're measuring time here. Depending on how you are enqueuing these messages, I'm not sure you're measuring what you think you're measuring
If the thing writing to the queue is on another box you could be experiencing time skew.
@tbaldridge: A timestamp is slapped onto the message when written to the queue (process is local)
Also, at some point you're measuring the delivery time of Kafka.
Yeah I guess my assumption here is that there's no way Kafka could possibly be the bottleneck... that might be an unfair assumption
does anyone have datascript working with persistent storage? (no, I don't want datomic -- I'm writing a re-frame/electron app, and I need a way to persist my datascript)
@qqq I’ve done some thinking on this and I’ve come up with 2 options: save an edn file to disk or persist to localStorage
@mruzekw : either way, is it with https://github.com/tonsky/datascript-transit read-transit-str and write-transit-str? I found a github issue where someone wanted "incremental logging" and the answer was "no, you have to write the entire db out even after 1 change"
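A hedged sketch of the localStorage route with datascript-transit in ClojureScript; as noted above, the whole db is serialized on every save, since there is no incremental log. The "app-db" key name is an arbitrary choice.

```clojure
(require '[datascript.core :as d]
         '[datascript.transit :as dt])

;; Serialize the entire db value to a transit string and store it.
(defn save-db! [db]
  (js/localStorage.setItem "app-db" (dt/write-transit-str db)))

;; Returns the restored db, or nil if nothing was saved yet.
(defn load-db []
  (some-> (js/localStorage.getItem "app-db")
          dt/read-transit-str))
```

For an Electron app you could just as well `spit`/`slurp` the same transit string to a file via Node's fs instead of localStorage.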
Is there a good way of using Datascript with re-frame without having to rewrite the reference implementation?
alanspringfield have you checked out https://github.com/mpdairy/posh?
@schmee thanks for the suggestion
so posh is a complete replacement for re-frame?
@schmee Thanks again. Seems like a pretty interesting talk