#clojure
2022-11-08
emccue01:11:47

In data.json the :indent option is undocumented - https://github.com/clojure/data.json/blob/master/src/main/clojure/clojure/data/json.clj#L637 - can I rely on it? And as a corollary, the :indent-depth option - is that just book-keeping or also a tweakable option?
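For context, a sketch of how the option is invoked (assuming it keeps working the way the linked source suggests):

(require '[clojure.data.json :as json])

;; :indent is passed like any other write option; whether it is officially
;; supported is exactly the question above.
(json/write-str {:a 1 :b [1 2 3]} :indent true)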

kwladyka11:11:42

Not sure what you mean by “rely on it”. You can overwrite these default values like here https://github.com/clojure/data.json/blob/master/src/main/clojure/clojure/data/json.clj#L701 BTW I always use jsonista; I am curious how clojure.data.json is different

Lior Neria11:11:25

i have a JSON file that contains a vector of JSON objects. I want to save each one as a separate JSON file in the project's resources folder. Can anyone help? Here is an example of what my JSON file looks like:

[
  {
    "name": "dyyi1",
    "topics": ["merge"],
    "filters": ["no"]
  },
  {
    "name": "exc",
    "topics": ["keeper"],
    "filters": ["no"]
  }
]

p-himik11:11:27

The question sounds more appropriate for #C053AK3F9. But the overall workflow is "read the data, iterate over each map, write it to a file". Each step needs implementation. Some required things are available within Clojure itself, some are third-party libraries.
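A rough sketch of that workflow with clojure.data.json, for reference (the input path, output directory, and using "name" for the file names are assumptions):

(require '[clojure.data.json :as json]
         '[clojure.java.io :as io])

;; read the vector, then write each map to its own file named after :name
(let [items (json/read-str (slurp "resources/input.json") :key-fn keyword)]
  (doseq [{:keys [name] :as item} items]
    (spit (io/file "resources" (str name ".json"))
          (json/write-str item))))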

kwladyka11:11:25

A defmacro question: how to achieve this

(let [x "test"]
    (or (.startsWith x "a")
        (.startsWith x "b")
        (.startsWith x "te")))
(in reality it is more complicated; this is a simplified example) from
(defmacro foo [rules]
  `(let [x# "test"]
     (or ~@(for [rule rules]
             `(.startsWith x# ~rule)))))


  (foo ["a" "b" "te"])

  (macroexpand-1 '(foo ["a" "b" "te"]))
(this doesn’t work) I have an issue with x# in (.startsWith x# ~rule).

p-himik11:11:08

Why do you need a macro here? Why not just some?
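For reference, the macro-free alternative with some might look like this (a sketch):

;; some returns the first truthy result, or nil if no prefix matches
(let [x "test"]
  (boolean (some #(.startsWith x %) ["a" "b" "te"])))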

p-himik11:11:37

And what do you mean exactly by "doesn't work"? The expansion seems fine to me, when I run it on my end.

kwladyka11:11:54

I am writing a macro to convert a vector like

[{:ns "consistency.logs.google.json-payload-integrant-test" :level :info}
...
]
into java.util.logging.Filter. This Filter will be triggered on each log, so I want it to perform as well as possible. (.startsWith x# ~rule) will also be extended to check the level

kwladyka11:11:28

> doesn’t work

(foo ["a" "b" "te"])
Syntax error compiling at (src/consistency/logs/google/json_payload.clj:99:3).
Unable to resolve symbol: x__11725__auto__ in this context
(macroexpand-1 '(foo ["a" "b" "te"]))
=>
(clojure.core/let
 [x__11726__auto__ "test"]
 (clojure.core/or
  (.startsWith x__11725__auto__ "a")
  (.startsWith x__11725__auto__ "b")
  (.startsWith x__11725__auto__ "te")))

p-himik11:11:42

Ah, right.

kwladyka11:11:42

I tried with ' etc. and something is always wrong

p-himik11:11:26

I'd just use gensym explicitly.

kwladyka11:11:48

but all in all my main issue is that I don’t know how to write a macro which uses let in that way

p-himik11:11:53

Assuming you need it at all. But it doesn't look like it, so you can just use 'x.

kwladyka11:11:57

Can you give an example?

kwladyka11:11:20

I am a macro avoider and this is probably the first time I’ve wanted to use one 🙂

p-himik11:11:02

(let [x (gensym "x")]
  `(let [~x "test"]
     (or ~@(for [rule rules]
             `(.startsWith ~x ~rule)))))
or something like that.

👍 1
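Putting that suggestion into the full macro, a sketch for reference (the gensym name in the expansion below is illustrative):

(defmacro foo [rules]
  (let [x (gensym "x")]
    `(let [~x "test"]
       (or ~@(for [rule rules]
               `(.startsWith ~x ~rule))))))

(foo ["a" "b" "te"])
;; => true

(macroexpand-1 '(foo ["a" "b" "te"]))
;; => (clojure.core/let [x12345 "test"]
;;      (clojure.core/or (.startsWith x12345 "a")
;;                       (.startsWith x12345 "b")
;;                       (.startsWith x12345 "te")))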
p-himik11:11:38

`(let [~'x "test"]
   ...)

👍 1
p-himik11:11:04

Given that you want to have the best performance, also make sure that there's no reflection on those calls to .startsWith.

👍 1
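A quick way to check for that (a sketch; in the real filter the hint would go on whichever local ends up holding the logger name):

;; enable reflection warnings at the REPL (or in the build config)
(set! *warn-on-reflection* true)

;; if a warning shows up for .startsWith, a String hint on the binding removes it
(let [^String x "test"]
  (.startsWith x "a"))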
kwladyka11:11:34

thank you. I will experiment with it and paste final code to get more hints 🙂

👍 1
kwladyka21:11:58

(defmacro code-handler-filter [rules]
  (proxy [Filter] []
    (^Boolean isLoggable [^LogRecord record]
      `(let [~'record-logger-name (.getLoggerName record)
             ~'record-level (.getLevel record)]
         (println ~'record-logger-name ~'record-level)
         (or ~@(for [{:keys [logger-name level] :as rule} rules]
                 `(and (.startsWith ~'record-logger-name ~logger-name)
                       (< ~'record-level ~level))))))))
(l/info "bar")
Execution error (ClassCastException) at consistency.logs.google.json_payload.proxy$java.lang.Object$Filter$41b16ca5/isLoggable (REPL:-1).
class clojure.lang.Cons cannot be cast to class java.lang.Boolean (clojure.lang.Cons is in unnamed module of loader 'app'; java.lang.Boolean is in module java.base of loader 'bootstrap')
What am I doing wrong here? It works without the macro. The same issue occurs with this simpler version:
(defmacro code-handler-filter [rules]
  (proxy [Filter] []
    (^Boolean isLoggable [^LogRecord record]
      `(let [~'record-logger-name (.getLoggerName record)
             ~'record-level (.getLevel record)]
         (println ~'record-logger-name ~'record-level)
         true))))
But this (without the macro) works:
(proxy [Filter] []
  (^Boolean isLoggable [^LogRecord record]
    (let [record-logger-name (.getLoggerName record)
          record-level (.getLevel record)]
      (println record-logger-name record-level)
      true)))

kwladyka21:11:15

Alternative question: is this macro worth it for code which will run on every log in the system? Does it make a difference to have (or (and ...) (and ...)) vs running some or another function over the collection each time?

p-himik22:11:21

> What am I doing wrong here?
That macro, instead of expanding into a (proxy ...) expression, expands into a proxy object that's been constructed during macro expansion. You gotta quote that (proxy ...) form itself.
> Does it make a difference to have (or (and ...) (and ...)) vs running some or another function over the collection each time?
Whenever you have such questions, you should just profile and see for yourself. It might very well be that some somehow gets inlined by the JVM. Or you might see drastic differences between JVM or Clojure versions.

kwladyka22:11:22

It was working without proxy. proxy is also a macro. The issue is the same with and without ' for proxy.

kwladyka22:11:11

I am looking at it and have no idea why it doesn’t work

kwladyka23:11:26

it looks like the issue is let - whenever there is a let inside, it breaks

kwladyka23:11:04

actually let is also a macro hmm

kwladyka23:11:20

I give up for today. I just can’t with macros.

kwladyka23:11:28

(defmacro code-handler-filter [rules]
  (let [rules->conditions `(or ~@(for [{:keys [logger-name level] :as rule} rules]
                                   `(and (.startsWith ~'record-logger-name ~logger-name)
                                         (< (.intValue ~'record-level) ~(.intValue ~level)))))]
    `(proxy [Filter] []
       (^Boolean ~'isLoggable [^LogRecord record#]
         (let [~'record-logger-name (.getLoggerName record#)
               ~'record-level (.getLevel record#)]
           (println ~'record-logger-name ~'record-level)
           ~rules->conditions)))))
I was pretty close, but then… ~(.intValue ~level) and all kinds of variations give Attempting to call unbound fn: #'clojure.core/unquote - it is too much for my defmacro experience
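For reference, a hedged sketch of one way the last attempt above could work. It keeps the :logger-name/:level keys from that attempt, assumes the keyword levels map onto java.util.logging.Level names, and resolves the level to an int at macroexpansion time instead of the nested unquote; type hints are left out, so reflection should be checked as noted earlier.

(import '(java.util.logging Filter Level))
(require '[clojure.string :as str])

(defn- level->int
  "Resolve a keyword like :info to the int value of java.util.logging.Level/INFO."
  [level-kw]
  (.intValue (Level/parse (str/upper-case (name level-kw)))))

(defmacro code-handler-filter [rules]
  (let [conditions `(or ~@(for [{:keys [logger-name level]} rules]
                            `(and (.startsWith ~'record-logger-name ~logger-name)
                                  (< ~'record-level ~(level->int level)))))]
    `(proxy [Filter] []
       (~'isLoggable [record#]
         (let [~'record-logger-name (.getLoggerName record#)
               ~'record-level (.intValue (.getLevel record#))]
           ;; isLoggable must return a boolean, and or can return nil
           (boolean ~conditions))))))

;; illustrative usage
(comment
  (macroexpand-1
   '(code-handler-filter [{:logger-name "consistency.logs" :level :info}])))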

kwladyka11:11:43

--- another question --- java.util.logging: do we have something ready to use for java.util.logging.Filter to filter by logger name (namespace) and level? I am writing my own Filter, but maybe something already exists and I just can’t find it.

the-alchemist14:11:21

for such a small use case, I would just write my own filter instead of pulling in an extra dependency just for that. but, more broadly, are you tied to java.util.logging? it’s quite dated and limited. most people in the Java community don’t use it for a variety of reasons (perf, lack of features, no MDC, etc.) if you’re open to something more advanced:
• https://github.com/BrunoBonacci/mulog
• https://github.com/ptaoussanis/timbre
both allow filtering on a variety of criteria
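As an illustration of that filtering in Timbre, a sketch assuming Timbre v5's vector :min-level syntax (the namespace patterns and levels are made up):

(require '[taoensso.timbre :as timbre])

;; per-namespace minimum levels, checked in order; "*" is the catch-all
(timbre/merge-config!
  {:min-level [[#{"consistency.logs.*"} :info]
               [#{"*"} :warn]]})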

kwladyka15:11:37

I know https://github.com/ptaoussanis/timbre and I prefer https://github.com/clojure/tools.logging. I found a few very good reasons for that, but I don’t remember them 🙂 Probably the main one is that Clojure uses Java under the hood and clojure.tools.logging does the job for the whole system, not only the Clojure part. https://github.com/BrunoBonacci/mulog - I don’t know this one
> are you tied to java.util.logging? it’s quite dated and limited.
Hmm, I think it changed in Java 9 or 11 and it is ok to use it. About 2 years ago I was trying to make a custom JSON logging structure to stdout and I lost a huge amount of time trying each logging framework; each had serious issues. For example it would work in the REPL, but crash when compiling to a JAR because of conflicts etc. I found then that java.util.logging just worked as expected and did all I needed. I don’t know. Maybe the situation has changed since then or I missed something, but after this experience I don’t like logging frameworks anymore. Really don’t like them 🙂

kwladyka15:11:29

But feel free to change my mind! 🙂

kwladyka15:11:25

The only framework which was reasonable to choose but had conflicts when compiling to a jar had plans to fix it, but it wasn’t critical for them. The issue only affected Clojure, not regular Java. At least that’s what I remember. So maybe it works today. I don’t remember which one it was.

Jing Guo11:11:47

I searched the history but couldn't find anything helpful (and also searched some curated Clojure lists). Out of curiosity, are there any quantum computing/information libraries written in Clojure? One example I know of is Q# from Microsoft...

John Tran14:11:23

@U0306EUQCP6 There is a larger curated list not specific to clojure: https://quantiki.org/wiki/list-qc-simulators

👍 1
the-alchemist15:11:29

it would make sense to use a Java QC library because of Clojure’s good Java interop. Q# looks like a C# variation, from a quick glance at the screenshot, which makes sense based on Microsoft’s target demographic

the-alchemist15:11:19

i think u could even run Q# code via https://github.com/clojure/clojure-clr, but I won’t pretend to be a Q# expert 😉

Jing Guo15:11:14

Thank you @U06TTFDB8 I was thinking about relevant ones written in Clojure. It’s helpful to know.

🙇 1
Ben Sless16:11:54

I know there's work being done on QC in the Common Lisp world, might be worth taking a look there

🙂 1
👍 1
John Tran20:11:01

I wonder if this is an opportunity for a project. I loosely used clojure (as a tool) for my dissertation (which was quantum-related). But not much more than that.

agorgl15:11:03

Hello! What is the equivalent of lein's

:uberjar {:aot :all}
for tools.build?
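For reference, a minimal sketch with plain tools.build (the lib/main names and paths are placeholders); the compile-clj step is roughly what :aot :all did in lein:

(ns build
  (:require [clojure.tools.build.api :as b]))

(def basis (b/create-basis {:project "deps.edn"}))

(defn uber [_]
  (b/delete {:path "target"})
  (b/copy-dir {:src-dirs ["src" "resources"] :target-dir "target/classes"})
  ;; AOT-compile the sources
  (b/compile-clj {:basis basis :src-dirs ["src"] :class-dir "target/classes"})
  (b/uber {:class-dir "target/classes"
           :uber-file "target/app.jar"
           :basis     basis
           :main      'my.app.core}))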

kwladyka15:11:18

clj -T:build uber '{:version "0.0.1"}'

(ns build
  (:refer-clojure :exclude [test])
  (:require [org.corfield.build :as bb]))

(def lib 'data-collector/start)
(def main 'data-collector.start)
(def jar-path "target/app.jar")

(defn clean [opts]
  (bb/clean opts))

(defn test [opts]
  (bb/run-tests opts))

(defn uber [opts]
  (-> (assoc opts
             :lib lib
             :main main
             :uber-file jar-path)
      (bb/uber)))

(defn ci [opts]
  (-> opts
      (test)
      (clean)
      (uber)))

agorgl15:11:26

I am using seancorfield's

:build {:deps {io.github.seancorfield/build-clj
                 {:git/tag "v0.6.3" :git/sha "9b8e09b"
                  ;; since we're building an app uberjar, we do not
                  ;; need deps-deploy for  deployment:
                  :deps/root "slim"}}
          :ns-default build}
does it apply aot by default on uberjar builds?

agorgl15:11:22

Thank you!

Alex Miller (Clojure team)15:11:43

sorry, I'm not sure what build-clj does - my answer was about tools.build itself

Alex Miller (Clojure team)15:11:05

and fyi, there is a #C02B5GHQWP4 channel if you have followups

agorgl15:11:03

Ah thank you, sorry for posting in the wrong channel

hoynk16:11:54

Not a Clojure question, but does anybody have some resources on modeling something like a calendar with recurring events? My difficulty lies mostly in keeping track of things as the user edits the schedule and also indicates that on a certain week an event actually took place at another time/date… well, the real-life part.
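One common shape for this (roughly the iCalendar approach: a recurrence rule plus per-occurrence overrides; all keys and values below are illustrative, not from any library):

(def weekly-standup
  {:event/id    "standup"
   :event/title "Team standup"
   ;; the rule describes the whole series
   :recurrence  {:freq :weekly
                 :by-day #{:mon :wed}
                 :start "2022-11-07T10:00"
                 :duration-min 15}
   ;; user edits to single occurrences live here, keyed by the original date,
   ;; so the series stays intact and "it actually happened on Tuesday" is recorded
   :overrides   {"2022-11-14" {:moved-to "2022-11-15T11:00"}
                 "2022-11-21" {:cancelled? true}}})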

dpsutton16:11:13

Asking how you might accomplish this. We are sending snowplow events using the snowplow-java-tracker jar. That defaults to using okhttp3 for the requests but takes an optional HttpClient, and we choose to use an apache commons one. The problem is that the https://github.com/snowplow/snowplow-java-tracker/blob/master/src/main/java/com/snowplowanalytics/snowplow/tracker/configuration/NetworkConfiguration.java class has a field of type okhttp3.CookieJar. This field is unused if a custom httpclient is provided, and this all works at runtime with no classes from okhttp3 present. However, in CI we lint with eastwood, and eastwood enumerates all of the fields and errors when it cannot find the interface okhttp3.CookieJar. I figure one workaround is to create the interface so eastwood can find it:

(do
  (create-ns 'okhttp3)
  (in-ns 'okhttp3)
  (clojure.core/refer-clojure)
  (definterface CookieJar
    (saveFromResponse [url cookies])
    (loadForRequest [url]))
  (in-ns 'metabase.analytics.snowplow))
My questions are:
1. Is the JVM really that dynamic? The NetworkConfiguration class has a field and a method that reference the CookieJar class, but it seems the class itself is never loaded if the method is never called and the field is never referenced?
2. We never intend to have okhttp3 on our classpath, but is this interface creation a fine way to allow eastwood to find the class? Or is there perhaps another way?
We intend to open a bug report that an optional dependency is referenced in a class required to construct their tracker, but it does seem to “work” except for the linting.

dpsutton16:11:31

yup. but its type is okhttp3.CookieJar, a type in an optional dependency that is not on our classpath

dpsutton16:11:55

and eastwood is inspecting this class and enumerating its fields, and asks the reasonable question “what is a CookieJar” and cannot find any class by that name

hiredman16:11:59

I would double check what classes are there

kwladyka16:11:16

Why not add this dependency if you need to use okhttp3.CookieJar?

dpsutton16:11:33

because we do not want to bring in that whole jar. it is an optional dependency and it works 100% without it. Except that eastwood probes a bit more than the runtime code and needs to know the class

dpsutton16:11:40

@U0NCTKEV8 what do you mean?

kwladyka16:11:16

> because we do not want to bring in that whole jar.
Correct me if I am wrong, but it will not be the “whole jar” if you do not import those files. Only CookieJar will be in the jar, because the code will import only that one.

dpsutton16:11:46

<scope>compile</scope>
<optional>true</optional>

dpsutton16:11:51

these make me think okhttp3 is involved when compiling (makes sense) but is not in the final jar

dpsutton17:11:13

and indeed the snowplow-java-tracker-0.12.0.jar has no okhttp3 classes in it

hiredman17:11:48

it wouldn't, it isn't an uberjar

dpsutton17:11:31

and clj -Sdeps '{:deps {com.snowplowanalytics/snowplow-java-tracker {:mvn/version "0.12.0"}}}' -Spath shows no okhttp3 jar

dpsutton17:11:58

so i think it is not including any of the okhttp3 files. Just surprised that a class can reference types not on the classpath as long as they are not referenced at runtime

vemv21:11:48

this is something I'd be willing to support in Eastwood if a problem remains. it is certainly a feature that if something is ok for the Clojure compiler, then it also is for Eastwood - no config required :)

markbastian22:03:25

I’ve filed an issue (https://github.com/jonase/eastwood/issues/444) describing this, along with a repro repo (https://github.com/metabase/snowplow-eastwood-issue). Hopefully this helps to nail down the issue.

dpsutton18:11:47

We are hoping to host the amazon athena driver in s3. Does anyone have a link to an article about how to do this? Lots of things appear to be a plugin so you can build and deploy a project, but we don't want to modify and build it, just host the jars themselves without building it. Does anyone have a guide to the easiest way to accomplish this?

Cam Saul19:11:34

Already got it (mostly) working, and answered in our private Metabase Slack but I'll post it here in case anyone else wants to know how to do it. Basically you just need a Maven-y folder structure like this

$ tree ~/athena-maven/

/home/cam/athena-maven/
└── com
    └── metabase
        └── athena-jdbc
            ├── 2.0.33
            │   ├── athena-jdbc-2.0.33.jar
            │   ├── athena-jdbc-2.0.33.jar.sha1
            │   ├── athena-jdbc-2.0.33.pom
            │   └── athena-jdbc-2.0.33.pom.sha1
            └── maven-metadata.xml
and a maven-metadata.xml like so
$ cat ~/athena-maven/com/metabase/athena-jdbc/maven-metadata.xml

<metadata>
  <groupId>com.metabase</groupId>
  <artifactId>athena-jdbc</artifactId>
  <versioning>
    <release>2.0.33</release>
    <versions>
      <version>2.0.33</version>
    </versions>
    <lastUpdated>20221108000000</lastUpdated>
  </versioning>
</metadata>
Basically that's it, then create an S3 bucket and upload it and give it a policy that allows public read access. I don't think you need list bucket permissions, just object read permissions.

Cam Saul19:11:57

And then your deps.edn looks something like

{:mvn/repos
 {"athena" {:url ""}}

 :deps
 {com.metabase/athena-jdbc {:mvn/version "2.0.33"}}}

🙏 1
Cam Saul19:11:38

You can poke around Maven central itself if you want more examples of how the structure should look -- e.g. https://repo.maven.apache.org/maven2/c3p0/c3p0/0.9.0/ -- for example, you can also include .md5 in addition to .sha1

Cam Saul19:11:09

You can actually have Maven generate all the checksums and .pom and everything for you like so

$ mvn install:install-file -Dfile=AthenaJDBC42-2.0.33.jar -DgroupId=com.metabase -DartifactId=athena-jdbc -Dversion=2.0.33 -Dpackaging=jar -DcreateChecksum=true
then it installs it to your local .m2 directory:
$ tree /home/cam/.m2/repository/com/metabase/athena-jdbc

/home/cam/.m2/repository/com/metabase/athena-jdbc
├── 2.0.33
│   ├── athena-jdbc-2.0.33.jar
│   ├── athena-jdbc-2.0.33.jar.md5
│   ├── athena-jdbc-2.0.33.jar.sha1
│   ├── athena-jdbc-2.0.33.pom
│   ├── athena-jdbc-2.0.33.pom.md5
│   ├── athena-jdbc-2.0.33.pom.sha1
│   └── _remote.repositories
├── maven-metadata-local.xml
├── maven-metadata-local.xml.md5
└── maven-metadata-local.xml.sha1
and you can just rename maven-metadata-local* to maven-metadata* and upload that

dpsutton19:11:25

oh wow. I wonder if there’s an env var to set the local repo? MVN_LOCAL_REPO=local/mirror mvn install … and then a little cleanup and push to s3

Cam Saul19:11:56

Actually I'm not 100% sure this does the right thing... it doesn't seem to pull <dependencies> from pom.xml inside the JAR itself and include that in the .pom file. Trying to figure out how to make that work

dpsutton19:11:22

https://maven.apache.org/plugins/maven-install-plugin/examples/specific-local-repo.html can specify -DlocalRepositoryPath=path-to-specific-local-repo so it can go in a different location than just the ~/.m2 repo

dpsutton19:11:21

actually, https://maven.apache.org/guides/mini/guide-3rd-party-jars-local.html seems like more what we want: mvn install:install-file to just put a single jar in a local repository

Cam Saul19:11:04

install:install-file is what I did above 🙃

Cam Saul19:11:18

So you can tell it what POM file to use instead of generating one with -DpomFile=<path-to-pomfile>, and you can use jar xf to extract the appropriate pom.xml from the JAR. But I don't think this is the right way to do it here, at least. The pom.xml file inside the JAR has several different profiles and <build> info and other stuff that I don't think is supposed to go in the .pom file

Cam Saul20:11:30

allegedly you're supposed to be able to do something like

mvn install:install-file -Dfile=AthenaJDBC42-2.0.33.jar
(or mvn org.apache.maven.plugins:maven-install-plugin:3.0.1:install-file to force it to use 3.x instead of 2.x) and it will read the pom.xml info from META-INF inside the JAR but in this case at least it doesn't seem to work... I think the POM inside this specific JAR is busted. It's a template instead of an actual POM file with actual dependency information. I'm going to try to fix it by hand and see if I can make that work

dpsutton20:11:26

you aren’t kidding

clj -Sdeps '{:deps {athena/athena {:local/root "/Users/dan/Downloads/AthenaJDBC41-2.0.33.jar"}}}'
Error building classpath. 22 problems were encountered while building the effective model for Athena:AthenaJDBC${env.JDBC_V}:${env.MAJOR_V}.${env.MINOR_V}.${env.REVISION_V}.${env.BUILD_V}
[WARNING] 'artifactId' contains an expression but should be a constant. @
[WARNING] 'version' contains an expression but should be a constant. @
[ERROR] 'artifactId' with value 'AthenaJDBC${env.JDBC_V}' does not match a valid id pattern. @

dpsutton20:11:34

i’m not sure how to even use it

Cam Saul21:11:00

it actually looks like all of those deps are bundled in the JAR and shadowed so I don't think you actually NEED any additional deps. Seems to work fine for me without them at least.