2019-10-11
Channels
- # announcements (6)
- # architecture (9)
- # beginners (120)
- # calva (13)
- # cider (28)
- # clj-kondo (8)
- # cljs-dev (7)
- # clojure (113)
- # clojure-europe (13)
- # clojure-italy (7)
- # clojure-nl (9)
- # clojure-spec (44)
- # clojure-uk (7)
- # clojuredesign-podcast (15)
- # clojurescript (18)
- # cursive (9)
- # data-science (3)
- # datomic (32)
- # defnpodcast (1)
- # events (2)
- # fulcro (6)
- # jobs (1)
- # kaocha (5)
- # london-clojurians (2)
- # luminus (2)
- # nyc (2)
- # off-topic (54)
- # ring (6)
- # shadow-cljs (136)
- # sql (1)
- # testing (2)
- # tools-deps (64)
- # vim (83)
Hi Yes we do for Atlassian Plugins
I cannot because we have a closed source project 😞
But I can help you get up and running
Is there something in particular that is not working?
We use maven for our Clojure project and then we point to an activator
No, I'm just trying to find projects actually using OSGi with Clojure, because I can't find current ones.
like this:
<Bundle-Activator>nl.avisi.atlascrm.atlascompat.ClojureActivator</Bundle-Activator>
I'd like to have a plugin system for an application I develop, and I'm evaluating OSGi for that.
Is it a Clojure project for which you want to create a Plugin System?
Then I would not go that route if I were you
OSGi is pretty complex, and Clojure by default is not compatible
So probably every Plugin ends up with their own Clojure runtime, which gives quite a lot of overhead
Because every OSGi bundle bundles its own dependencies
You could export some Clojure API to require things, but then you get problems with which Classloader should be used to load stuff
It is just a rabbit hole that I would not recommend; we only do this because we had an existing Clojure project and wanted to use that in our Atlassian Plugin
ok that is a good point. because of possible dependency issues, I started to look at OSGi.
Yeah that is a very logical thing to do
There are some clojure-osgi projects which can load namespaces from other bundles
But they all have weird problems with AOT, and clojure.tools.logging does not work
If you just use uberjars for plugins and load them into your classpath, there will be dependency issues if you have different versions of the same dependency.
And clojure.pprint fails on lots of them
If you want everyone to bundle their own Clojure version with all deps then it could work
I really haven't thought about needing a separate Clojure runtime for every plugin. So you are right, that would be just too much overhead.
Do you know the plugin system of metabase? https://github.com/metabase/metabase/wiki/Writing-a-Driver:-Packaging-a-Driver-&-Metabase-Plugin-Basics
They are using DynamicClassLoader: https://github.com/metabase/metabase/blob/master/src/metabase/plugins/classloader.clj
No I have never looked at that tbh
We once looked at: https://github.com/puppetlabs/trapperkeeper/blob/master/documentation/Plugin-System.md
The metabase thing looks pretty interesting though
Sorry I don't have any better news about the OSGI front 😞
But hopefully this will save you a lot of time
If you get anywhere with the metabase approach I would find that very interesting
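For what it's worth, a rough sketch of the DynamicClassLoader trick (this is not metabase's actual code; load-plugin!, the jar path, and the namespace are made up for illustration): find a clojure.lang.DynamicClassLoader in the current thread's classloader chain, add the plugin uberjar to it, then require the plugin's entry namespace.
(import '(clojure.lang DynamicClassLoader)
        '(java.net URL))

(defn dynamic-classloader
  "Walk up from the thread's context classloader to find a DynamicClassLoader, if any."
  []
  (->> (.getContextClassLoader (Thread/currentThread))
       (iterate (fn [^ClassLoader cl] (.getParent cl)))
       (take-while some?)
       (filter #(instance? DynamicClassLoader %))
       first))

(defn load-plugin!
  "Hypothetical helper: add a plugin jar to the dynamic classloader and require its entry namespace."
  [jar-path entry-ns]
  (if-let [^DynamicClassLoader loader (dynamic-classloader)]
    (do (.addURL loader (URL. (str "file:" jar-path)))  ; DynamicClassLoader exposes addURL publicly
        (require entry-ns))
    (throw (ex-info "No DynamicClassLoader on this thread" {:jar jar-path}))))

;; e.g. (load-plugin! "plugins/my-driver.jar" 'my.driver.core)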
FWIW we got OSGi working with Clojure as a shared dependency using https://github.com/talios/clojure.osgi - which includes an example of creating a manifest with maven-bundle-plugin
the only class-loading issue we ran into was stale namespace state, which was solved with :reload-all
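i.e. just re-requiring the affected namespaces with the :reload-all flag (namespace name here is illustrative):
(require 'my.plugin.core :reload-all)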
this is a good intro if (like me) you know nothing about OSGi https://cwiki.apache.org/confluence/download/attachments/7956/Learning_to_ignore_OSGi.pdf
what is a good way to pass a callable thing to Clojure from Java? E.g. a function of two arguments. Passing lambdas doesn't work. Implementing an IFn is awkward since none of the methods are marked as default
.
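One option, as a sketch: have the Java side pass a java.util.function.BiFunction (or any other SAM interface) and adapt it on the Clojure side; alternatively, the Java class can extend clojure.lang.AFn and override just the two-argument invoke.
(defn bifunction->fn
  "Wrap a java.util.function.BiFunction so it can be called like a normal 2-arg Clojure fn."
  [^java.util.function.BiFunction f]
  (fn [a b] (.apply f a b)))

;; e.g., exercised from the Clojure side:
;; (def add (bifunction->fn (reify java.util.function.BiFunction
;;                            (apply [_ a b] (+ a b)))))
;; (add 1 2) ;=> 3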
Hello, I have some data in a vector and for each element, I want to see if it has something in common with the rest of the elements after it. What would you think of reduce for this? Since I'm looking at the rest of the items, not just the next one, it seems a little unnatural…
A couple of example inputs and expected outputs would make it very clear 🙂
Usually finding a specific problem like that, will give you specific solutions which contain the general ideas
Maybe I can make do with loop but was wondering if there's something more "higher level"
@ilyab partition-by:
(partition-by :a [{:a 1} {:a 1} {:a 2} {:a 3}])
=> (({:a 1} {:a 1}) ({:a 2}) ({:a 3}))
I'm taking a stab in the dark, but if you reversed the vector, and then did reduce on that, the 'accumulator' data that you pass from one step to the next could be something about all of the reversed elements seen so far, and compared against the next one?
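A sketch of that reversed-reduce idea (the key :a and the "has something in common" test are placeholders for whatever the real comparison is):
(defn shared-with-rest
  "For each element, collect the elements *after* it that share the value of k."
  [k coll]
  (first
   (reduce (fn [[out seen] x]
             [(conj out {:item x :shared (filter #(= (k %) (k x)) seen)})
              (conj seen x)])
           [() []]
           (reverse coll))))

;; (shared-with-rest :a [{:a 1} {:a 2} {:a 1} {:a 3}])
;; => ({:item {:a 1}, :shared ({:a 1})}
;;     {:item {:a 2}, :shared ()}
;;     {:item {:a 1}, :shared ()}
;;     {:item {:a 3}, :shared ()})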
another common operation for comparing to other inputs is group-by
user=> (group-by :a [{:a 1} {:a 1} {:a 2} {:a 3}])
{1 [{:a 1} {:a 1}], 2 [{:a 2}], 3 [{:a 3}]}
Some strange behavior perhaps someone can explain. I have a large-ish data structure (vector containing maps) that raises an exception in my program when I use it, but doesn't cause an exception when I first stringify it and do read-string on it. The exception itself doesn't make any sense either; it is:
Execution error (ExceptionInfo) at datahike.db/validate-val (db.cljc:989).
Bad entity value 0 at [:db/add 58 :ImportFrom/level 0], value does not match schema definition. Must be conform to: (= (class %) java.lang.Long)
I think 0 is a long.
My code:
(defn tryme []
  (let [ex     (code2ast (slurp "resources/example.py"))
        schema (->> ex
                    (learn-schema (schema-as-map base-schema))
                    schema-as-schema)]
    (d/delete-database uri)                    ; cleanup previous database
    (d/create-database uri :initial-tx schema) ; create the in-memory database
    (let [conn (d/connect uri)]                ; connect to it (returns an atom)
      (d/transact conn ex #_(read-string (cl-format nil "~S" ex)))
      (simple-example conn))))
You can see where I've commented out the read-string to use ex in this example.
@peterd examine the exception, which appears to be a data-bearing exception, by calling ex-data on it
> (try (tryme) (catch Exception e (ex-data e)))
{:error :transact/schema,
:value 0,
:attribute :ImportFrom/level,
:schema
#:db{:valueType :db.type/long,
:ident :ImportFrom/level,
:cardinality :db.cardinality/one}}
I tried doing it like the above.
Here it is in detail:
#error {
:cause "Bad entity value 0 at [:db/add 58 :ImportFrom/level 0], value does not match schema definition. Must be conform to: (= (class %) java.lang.Long)"
:data {:error :transact/schema, :value 0, :attribute :ImportFrom/level, :schema #:db{:valueType :db.type/long, :ident :ImportFrom/level, :cardinality :db.cardinality/one}}
:via
[{:type clojure.lang.ExceptionInfo
:message "Bad entity value 0 at [:db/add 58 :ImportFrom/level 0], value does not match schema definition. Must be conform to: (= (class %) java.lang.Long)"
:data {:error :transact/schema, :value 0, :attribute :ImportFrom/level, :schema #:db{:valueType :db.type/long, :ident :ImportFrom/level, :cardinality :db.cardinality/one}}
:at [datahike.db$validate_val invokeStatic "db.cljc" 989]}]
:trace
[[datahike.db$validate_val invokeStatic "db.cljc" 989]
[datahike.db$validate_val invoke "db.cljc" 981]
[datahike.db$transact_add invokeStatic "db.cljc" 1200]
[datahike.db$transact_add invoke "db.cljc" 1198]
[datahike.db$transact_tx_data invokeStatic "db.cljc" 1434]
[datahike.db$transact_tx_data invoke "db.cljc" 1280]
[datahike.core$with invokeStatic "core.cljc" 231]
[datahike.core$with invoke "core.cljc" 224]
[datahike.core$_transact_BANG_$fn__41228 invoke "core.cljc" 438]
[clojure.lang.Atom swap "Atom.java" 37]
[clojure.core$swap_BANG_ invokeStatic "core.clj" 2352]
[clojure.core$swap_BANG_ invoke "core.clj" 2345]
[datahike.core$_transact_BANG_ invokeStatic "core.cljc" 437]
[datahike.core$_transact_BANG_ invoke "core.cljc" 434]
[datahike.core$transact_BANG_ invokeStatic "core.cljc" 529]
[datahike.core$transact_BANG_ invoke "core.cljc" 444]
[datahike.core$transact invokeStatic "core.cljc" 636]
[datahike.core$transact invoke "core.cljc" 629]
[datahike.core$transact invokeStatic "core.cljc" 633]
[datahike.core$transact invoke "core.cljc" 629]
[datahike.connector$eval55768$fn__55770$fn__55772 invoke "connector.cljc" 31]
[clojure.core$binding_conveyor_fn$fn__5754 invoke "core.clj" 2030]
[clojure.lang.AFn call "AFn.java" 18]
[java.util.concurrent.FutureTask run "FutureTask.java" 264]
[java.util.concurrent.ThreadPoolExecutor runWorker "ThreadPoolExecutor.java" 1128]
[java.util.concurrent.ThreadPoolExecutor$Worker run "ThreadPoolExecutor.java" 628]
[java.lang.Thread run "Thread.java" 834]]}
I don't see a problem with that.
I can print the variable to the cider REPL, assign it to a var, and likewise it runs fine, just like doing (read-string (cl-format nil "~S" var))
If you can store that map in a Clojure Var at a REPL, or in your program, and print out (class <expression-that-extracts-the-number-0>), it would be interesting to know whether it was java.lang.Long or some other type. It could be java.lang.Integer, for example, and would look exactly the same in that output either way.
Sorry, I should have said vector instead of map, but you get the idea.
Printing out a java.lang.Integer type value (or Byte, or Short, etc.) and reading it back in would definitely change it to a Long
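A quick REPL illustration of that round-trip:
user=> (class (int 0))
java.lang.Integer
user=> (class (read-string (pr-str (int 0))))
java.lang.Long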
I am having a problem with expression-that-extracts-the-number-0. #error {…}
I thought (-> foo :error/data ….)
Are you able to capture it in a REPL? That lets you retry multiple variations more quickly, perhaps.
And the key might be just :data, not :error/data?
you need the ex-data function to get the data map from the Exception object
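For example, at the REPL, where *e holds the most recent exception:
user=> (-> *e ex-data :value class)
If that prints java.lang.Integer rather than java.lang.Long, that's the mismatch.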
Two possibilities suggest themselves -- figure out where the Integer type value is coming from, and find a convenient place to convert it to Long. Or, find out who wrote the spec that checks to see if it is Long, and see if they can loosen it to allow other integer types.
There is a #datahike channel where people familiar with that spec might hang out.
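A sketch of the first option, assuming the stray value really is a boxed Integer: coerce any Integer in the nested data to Long before transacting (ex is the vector from the earlier example).
(require '[clojure.walk :as walk])

(defn longify
  "Walk the nested data and turn any boxed Integer into a Long."
  [data]
  (walk/postwalk #(if (instance? Integer %) (long %) %) data))

;; (d/transact conn (longify ex))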
int? is a pred for fixed size integers. integer? is a pred for any integer (fixed or arbitrary precision)
in case those are helpful
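A quick REPL check of the difference (a boxed Integer counts as int?, a BigInt does not):
user=> (int? (int 0))
true
user=> (int? 10000000000000000000N)
false
user=> (integer? 10000000000000000000N)
true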
They might have efficiency concerns about allowing other integer types and doing the conversion on the library side, but I suspect you will find out when interacting with the datahike devs.
They allow these:
#{:db.type/instant :db.type/boolean :db.type/uuid
:db.type/value :db.type/string :db.type/keyword :db.type/ref
:db.type/bigdec :db.type/float :db.type/bigint :db.type/double
:db.type/long :db.type/symbol}
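For reference, the attribute from the error above would just be declared with one of those types; something like this matches the :schema map shown in the ex-data:
[{:db/ident       :ImportFrom/level
  :db/valueType   :db.type/long
  :db/cardinality :db.cardinality/one}]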
I've got a function that needs to return a map with, say, twenty calculations, and most of the values in the map are dependent on one of the other values. Right now I've got something like this:
(defn f [...]
  (let [foo ...
        bar (something-with foo)
        baz (something-else bar)
        ...]
    {:foo foo
     :bar bar
     :baz baz
     ...}))
Is there a better way to write this? As you can see, I'm currently repeating foo, bar, etc. three times where I'd prefer it to be repeated just once.
IMO - this is better than special threaders. The dependencies are clear. The desired execution order is clear. The way the code is structured tells the reader useful information.
@dale I might reach for as-> in this case and thread the "map built so far" into each calculation that can add each new computed var
I'm just finding suggestions to write a macro using &env, but I'm wondering if there's a better way without resorting to macros.
(normally I advocate as-> inside -> but here I think it's worth making an exception)
@seancorfield beat me to it. I would use as->
(as-> {} m
  (assoc m :foo (compute-foo))
  (assoc m :bar (compute-bar (:foo m)))
  ,,,)
@seancorfield, @peterd That is just what I was looking for, thanks!
I might still start it with -> TBH
(-> {}
    (assoc :foo (compute-foo))
    (as-> m
      (assoc m :bar (compute-bar (:foo m)))
      ,,,))
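A complete (if contrived) version of that shape, with hypothetical compute-* functions, just to show how the later steps read earlier keys out of the map:
(defn compute-foo [] 1)                   ; hypothetical
(defn compute-bar [foo] (* 10 foo))       ; hypothetical
(defn compute-baz [foo bar] (+ foo bar))  ; hypothetical

(-> {}
    (assoc :foo (compute-foo))
    (as-> m
      (assoc m :bar (compute-bar (:foo m)))
      (assoc m :baz (compute-baz (:foo m) (:bar m)))))
;; => {:foo 1, :bar 10, :baz 11}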
I picked up the habit of using a ? in the variable. Thus (as-> {:foo (compute-foo)} ?m (assoc ?m ,,,))
I'm torn on that. I tend to use short, mnemonic names (`m` for map, for example), but I do have sympathy for just using symbols, such as % or $ ...
In the example, I’m not sure. You’d like to bind a name and carry it through each form, thus (as->...)