#datomic
2016-10-24
timgilbert16:10:17

Say, if I want to include another jar in the datomic transactor, do I just copy it into the resources directory? Specifically, I'm trying to get this logback appender working: https://github.com/logzio/logzio-logback-appender

marshall16:10:34

@timgilbert Yes, the resources dir should be on the classpath if you're using the bin/transactor script to launch

timgilbert16:10:46

Great, thanks

timgilbert18:10:52

Hmm, actually it appears that doesn't work; the transactor gets started with java -server -cp lib/*:datomic-transactor-pro-0.9.5404.jar:samples/clj:bin:resources

timgilbert18:10:38

...but just sticking a jar file inside resources doesn't work; I think the jar needs to be explicitly named on the classpath

timgilbert18:10:00

I tried sticking it in lib but the glob lib/* is never actually expanded

timgilbert19:10:13

Making this change to bin/classpath seems to do the trick:

$ diff /tmp/cp bin/classpath
6c6
<   s="`echo lib/*`:`echo *transactor*.jar`"
---
>   s="lib/*:`echo *transactor*.jar`"

timgilbert19:10:56

Hmm, on further inspection it looks like some kind of dependency problem with the appender, will poke around more.

Oct 24 19:05:19 ip-172-31-14-11 transactor[25414]: Reported exception:
Oct 24 19:05:19 ip-172-31-14-11 transactor[25414]: java.lang.NoSuchMethodError: ch.qos.logback.core.Context.getScheduledExecutorService()Ljava/util/concurrent/ScheduledExecutorService;
Oct 24 19:05:19 ip-172-31-14-11 transactor[25414]:     at io.logz.logback.LogzioLogbackAppender.start(LogzioLogbackAppender.java:166)
Oct 24 19:05:19 ip-172-31-14-11 transactor[25414]:     at ch.qos.logback.core.joran.action.AppenderAction.end(AppenderAction.java:96)
timgilbert19:10:25

Ah, and I just realized that Java itself supports the foo/* glob syntax, so the shell has nothing to do with it
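A quick way to see the distinction (a minimal sketch with throwaway paths, not the actual transactor layout): the shell expands an unquoted lib/* itself, while a quoted lib/* reaches the JVM literally, and the java launcher has expanded classpath wildcards itself since Java 6.

```shell
# Set up a throwaway lib/ directory with two empty jars
mkdir -p /tmp/cpdemo/lib
touch /tmp/cpdemo/lib/a.jar /tmp/cpdemo/lib/b.jar
cd /tmp/cpdemo

echo lib/*     # unquoted: the shell expands it -> lib/a.jar lib/b.jar
echo 'lib/*'   # quoted: the literal lib/* survives, for the JVM to expand
```

This is why the bin/classpath edit above turns out to be unnecessary: passing the literal lib/* in -cp is enough, as long as nothing quotes it away or mangles it before the JVM sees it.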

timgilbert19:10:38

Ok, peering at the code more closely, it seems as though the problem is that the appender uses logback-classic 1.1.7, but the transactor itself uses 1.0.13
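One way to spot this kind of clash before the NoSuchMethodError shows up at runtime is simply to list every logback jar the transactor script puts on the classpath. A minimal sketch (the directory and jar names below are illustrative, not the real transactor contents):

```shell
# Simulate a transactor lib/ dir that ended up with two logback-classic versions
mkdir -p /tmp/txdemo/lib
touch /tmp/txdemo/lib/logback-classic-1.0.13.jar \
      /tmp/txdemo/lib/logback-classic-1.1.7.jar

# List logback jars; two different versions of the same artifact is a red flag
ls /tmp/txdemo/lib | grep -i logback | sort
```

Which copy actually wins depends on classpath order, so even with both present the appender can still resolve the older Context class and fail on the 1.1.x-only method.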

joshg19:10:45

Does Datomic still have a 10 billion datom limitation?

joshg19:10:48

I was looking into using it for a data warehousing application instead of data vault.

robert-stuttaford19:10:19

it’s not a hard limit @joshg, but an anticipation of the likely max size that a peer process can deal with

robert-stuttaford19:10:36

because peers need to hold onto all the roots of the index and still have space for actual data

robert-stuttaford19:10:52

theoretical limit. if you have biiiig instances, your limit is higher

robert-stuttaford19:10:26

afaik nothing in the code imposes a limit

robert-stuttaford19:10:56

i’m not sure about the entity id space limitations though, perhaps @stuarthalloway can share 🙂

joshg19:10:38

Thank you for the clarification, that makes sense. So it’s possible to use Datomic with more datoms than 10 billion, but perhaps that wouldn’t be its best use case.