
Is there a way, in one's $HOME/.clojure/deps.edn file, to have a :local/root dependency with a relative path starting from your home directory? I tried {:local/root "$HOME/path/to/my.jar"} and {:local/root "~/path/to/my.jar"}, but neither seemed to work.


I ask because it would be helpful for sharing the contents of a ~/.clojure/deps.edn file across multiple systems, where my home directory's absolute path differs between them.


tildes are not expanded in .edn files


neither is $HOME


but I get the problem, it would be nice to see a solution for it


my current work-around is to put all source under ~/src/ (or some other directory right under my home directory) and make paths relative, e.g. "../REBL/REBL.jar". ofc, that means on each system the source code needs to live in a directory right under ~. on a related note, to get the effect of "variables" in deps.edn, I've been thinking about generating deps.edn (possibly from some template).
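A minimal sketch of that layout, assuming a project checked out at ~/src/my-project with REBL unpacked next to it at ~/src/REBL (all names here are illustrative, not from the discussion):

```clojure
;; ~/src/my-project/deps.edn - a sketch of the relative-path workaround.
;; "../REBL/REBL.jar" resolves relative to this project directory, so it
;; works on any machine that follows the ~/src/ convention, regardless of
;; the absolute home directory path.
{:paths ["src"]
 :aliases
 {:rebl {:extra-deps {rebl/rebl {:local/root "../REBL/REBL.jar"}}}}}
```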


with some lightweight scripting / templating you can get around it though
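One lightweight way to do that templating, sketched with sed (the @HOME@ token and the REBL path are made up for illustration, not a real tools.deps feature):

```shell
# Generate deps.edn from a template, substituting the absolute home
# directory at generation time. @HOME@ is an arbitrary placeholder token.
cat > deps.edn.template <<'EOF'
{:aliases
 {:rebl {:extra-deps {rebl/rebl {:local/root "@HOME@/REBL/REBL.jar"}}}}}
EOF
sed "s|@HOME@|$HOME|g" deps.edn.template > deps.edn
cat deps.edn
```

Re-running the script on each machine regenerates deps.edn with that machine's absolute home path.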


ah, you also suggested that sogaiu 😉

Alex Miller (Clojure team)13:02:21

I'm going to continue to resist variables in deps.edn for as long as possible :)

👍 4

Is tilde as an alias for the home directory in the same category?


Not an urgent need by any means, just asking while we are in the neighborhood.

Alex Miller (Clojure team)17:02:47

it's in the category of "processing step beyond reading as data"

Alex Miller (Clojure team)17:02:42

having none of those is a huge advantage for downstream work


Just hacking on my own build tool right now: I created a new file build.edn and code for processing it, with this comment in the code: "This section dropped to build.edn because Alex Miller continues to resist variables in deps.edn" 😎


I mean, you can give that as the reason if you want, but as an engineer, I'd say addressing the reasons for that design decision, rather than focusing on the person that made them, is a good practice. Sure, include the name of one or more people responsible for making the decisions if you want, but without the reasons, it can sound like you believe the decisions are arbitrary, or not grounded in any kinds of reasons. Not saying you have done that -- just pointing out the possibilities of misinterpretation.


I think generating your own deps.edn is a good idea


Are there tools/methodologies for adapting lein projects containing java to be amenable to clj/`deps.edn`? I'm looking at a potential dependency that is fairly new and fast-moving, and would like to be able to reliably pin my reference to a known SHA, so I can reference the code at that point rather than depending on potentially out-of-date docs, etc. I could manually run lein and deploy it locally, but something more automated, where I can just update the SHA and have clj do the right thing, would be nice-to-have.
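For reference, pinning a git dependency to a SHA in deps.edn looks like this (the coordinates and SHA below are made up):

```clojure
;; Illustrative only - not a real library or commit.
{:deps {example/fastlib
        {:git/url "https://github.com/example/fastlib.git"
         :sha     "0000000000000000000000000000000000000000"}}}
```

This only works out of the box when the repo's own :paths contain ready-to-use code, which is exactly the catch for projects containing java.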


@dave.dixon The most reliable way I know is to fork it and add a deps.edn to it.


Yes. But then how do I get the downstream project to compile the java and put the classes on the classpath? I think I could do it for the specific case of that project using an alias and a tool like badigeon, but curious if anybody had come up with anything more general.


if it contains java, it's a bit of a different story. I would just deploy it as a leiningen artifact to your own org on clojars in this case


(my personal take on it, for compiling java I always use lein, since it just works)


Sort of hacked it by building the java classes and adding that directory to the :paths. Pushed to github, and that works. Curious if there's any reason not to do this.
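The hack just described boils down to something like this (directory names are illustrative):

```clojure
;; deps.edn in the forked repo: "classes" holds the pre-built .class
;; files, compiled once with something like
;;   javac -d classes $(find src/java -name '*.java')
;; and committed alongside the source.
{:paths ["src" "classes"]}
```

A consumer pinning the fork by :sha then gets both the Clojure source and the committed compiled classes on the classpath.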


> Curious if there’s any reason to not do this.
It’s not good to commit compiled (and binary) artefacts into source control. It would be better to compile the java classes and deploy them as a library: either a shell script with javac and some mvn calls, or just move the java code into its own lein project and use lein & project.clj to compile and deploy the java bit, and depend on it from the tools.deps lib. Whether to break out the java as a separate lib largely depends on how stable it is, how often you expect to change it, etc.


> It’s not good to commit compiled (and binary) artefacts into source control.
I've heard this, and my understanding (maybe wrong) of the rationale is that git doesn't handle large binaries well, as might be the case for a jar or fully linked executable. Class files, however, are pretty small, and map closely to the corresponding java source.


The key risk is having the compiled class files be out of sync with the source. But that's kind of the problem I'm facing anyway, source is changing, and the clojars artifacts may not match, which makes it hard to figure out which features are there etc. More ideal would be a tool that built the java classes after clj pulled from the indicated :sha.


Well, there are a whole bunch of reasons:
1. git doesn’t handle large files in a repo very well (operations on git repos slow to a crawl on large repos, e.g. changing branches)
2. you can’t diff a binary very well, so merging branches becomes painful. Should you accept a change or not?
3. you have a dependency between source and output files in the repo
4. related to 3, but for non-binary output files: e.g. compiling and minifying css/js, you get huge diffs in your repo that can cause some tools to crash or struggle to render. You can set attributes on the files in git, but not all tools honor them properly.
5. different versions of javac generate different byte code for the same source code, so you may get diffs and deltas just by upgrading java versions
6. tools.deps can’t handle files in a repo over ~100mb (due to limitations in jgit)
7. adopting this practice on a repo early on normalises the behaviour, and all sorts of binaries get committed; git performance tanks, and then you have to rewrite the whole history with filter-branch, or worse, for lack of patience around rewriting the history correctly, the team decides to drop all the git history and start fresh in a new repo.
I’ve worked in many repos where this has been done, and it’s almost always caused problems. Not saying you won’t get away with it; just be aware of the problems.


I’ve also personally experienced all of those at some point in the last 15+ years or so using git.


Agreed in general. And most of this sounds like something that should be fixed in git, making the abstraction less leaky, but that's a whole other thread. Clearly it would be better to have a way to trigger the java build after clj pulls and update the classpath accordingly.


There are fixes in/around git for things like large-files, and git handles binaries ok… However you can’t really solve the issue of how to merge a random binary; it’s essentially pick one or the other or commit a new one. I’m not sure git can really do much more to help there. For me philosophically it’s just the wrong thing to do; git is for source control and generated files aren’t source. I’m saying this as someone who has committed binaries into repos though; and may at some point do it again… I’d just always very much prefer to avoid it.


I have not been working on this full time by any means, but from a programmatically generated file I have a list of 18,493 projects and the URLs they list in the artifact, which appear to be on GitHub; downloading now. This is partial progress towards answering the question "what fraction of Leiningen project.clj files are readable using clojure.edn/read".


As an aside on the kinds of weird things I now expect more often in such large-ish data sets: 672 projects with an incorrect URL, likely because no name was filled in somewhere when the artifact was created on Clojars.


447 have this as the only URL my code finds in the data about the artifact, from Leiningen's template, it appears:


Those are all outside of my list of 18,493, which is reduced from the original list size of 25,675. 18K seemed like a representative subset 🙂


It might be quicker to just hit up the github url for project.clj


Not at this point it wouldn't be 🙂


And I am curious to ask more questions than about just the project.clj file, too, e.g. how many have a deps.edn or pom.xml file checked into the source repo?


Basically I am not going for quick results here, but the potential for thorough follow up question asking capability.


... but I should first say: thanks for the suggestion. I had not considered that, and might be useful later.


The idea of cloning all those repos hurts my disk


Somewhere on a back disk or three of mine, I have about half a terabyte of Internet packet traces recorded at various times and places from live links on the Internet, headers only, plus timestamps, used in some research on various things in computer networking. Disk space is pretty cheap 🙂


I’m trying to run a project with a private git dependency on Windows (in PowerShell). The doc says “ssh authentication works by connecting to the local ssh agent (ssh-agent on *nix or Pageant via PuTTY on Windows)”. I tried creating a key pair in PuTTYgen, added it to Pageant (and of course added the public key to my github account). When that didn’t work, I tried generating a new key pair with ssh-keygen and adding it to the OpenSSH agent via ssh-add. I think the relevant part of the stacktrace is:

Caused by: org.eclipse.jgit.errors.TransportException: : Authentication is required but no CredentialsProvider has been registered 


Is there something I’m missing? I was able to get it working fine on my mac.


That's https, you need to use ssh @samwagg0583


Change the url to be the ssh one on github
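For anyone hitting the same error: the two GitHub URL forms in deps.edn look like this (repo name and SHA are illustrative). Only the ssh form goes through the ssh agent; the https form makes jgit ask for a CredentialsProvider, hence the exception above.

```clojure
;; https form - triggers "no CredentialsProvider has been registered":
;; {:git/url "https://github.com/example/private-repo.git" :sha "..."}

;; ssh form - authenticates via ssh-agent / Pageant:
{:deps {example/private-repo
        {:git/url "git@github.com:example/private-repo.git"
         :sha "0000000000000000000000000000000000000000"}}}
```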


oh wow, I can’t believe I did that and have been spinning my wheels for hours!

🦆 4

That did it, @dominicm. I’m embarrassed, but thanks so much.

👍 4

one nix package (Lumo) suddenly started to fail with

Caused by: org.eclipse.aether.resolution.ArtifactResolutionException: Could not transfer artifact org.clojure:clojure:pom:1.10.1 from/to central (): : Name or service not known
I'm using tools-deps to resolve the dependencies, but I highly doubt this is related to tools-deps. But it could be some mirror setting, I guess? This at least isn't a network error, since Hydra (the CI of nixpkgs) also reported this exact same error. Maybe this error sounds familiar to someone? Full stack here, fwiw


In a web browser, I can go to

successfully. Maybe some transient failure, or networking problem between that host and public Internet?


you need to paste your settings for fetching these deps


or whatever nixpkgs is doing


seems pretty clear: Caused by: Name or service not known at java.base/ Method)


either transient name resolution failure, or JVM is mistakenly resolving maven through IPv6 instead of IPv4


let repos = [

      name = "org.clojure/clojure";
      path = pkgs.fetchMavenArtifact {
        inherit repos;
        artifactId = "clojure";
        groupId = "org.clojure";
        sha512 = "f28178179483531862afae13e246386f8fda081afa523d3c4ea3a083ab607d23575d38ecb9ec0ee7f4d65cbe39a119f680e6de4669bc9cf593aa92be0c61562b";
        version = "1.10.1";
      };
so I download the jars then call clojure basically


also the error you're pasting is a different version than the one in nix


1.10.1 vs 1.10.0-beta5


the latter one uses beta, yes, same error still. I tried bumping.


but still, the error cause is accurate


it has nothing to do with the version


there's a name resolution error


yes, something in that direction was my instinct


just dns lookup?


instinct / gut feeling unnecessary 🙂 you just have to read the stacktrace


it appears like it's resolving through IPv6


Well, or it is using IPv6 when IPv6 is disabled somewhere in the network between you and the DNS resolver, or between your host and where the JAR file is.


not sure if the JVM always appears like it's resolving through IPv6


well, I was only explaining why the first paste was for a different version, but yeah, never mind


you can try to set -Djava.net.preferIPv4Stack=true on the JVM to see if it resolves differently


or try to resolve something that you know is invalid, and see if you get the same trace
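One way to test that outside the JVM entirely, sketched in Python (the hostnames are just examples; the .invalid TLD is reserved and never resolves):

```python
# Compare how a known host and a deliberately invalid one resolve,
# to separate "DNS is broken here" from "the JVM resolves oddly".
import socket

def resolve(host):
    """Return the resolved addresses, or the gaierror on failure."""
    try:
        return sorted({info[4][0] for info in socket.getaddrinfo(host, 443)})
    except socket.gaierror as err:
        return err

print(resolve("repo1.maven.org"))                   # addresses, if DNS works
print(resolve("name-that-does-not-exist.invalid"))  # always a gaierror
```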


nice, that sounds like something I can try. I'm using like always. Maybe maven isn't discoverable there (for me dns lookups are black boxes)


thanks for the tip fellas 🙂


np. you should test your hypothesis


also check if nix changed their global ipv6 settings or something


that's also plausible, though only the package manager could have such power; all else is configured by the user


it's the beginning of a bug hunt, I guess. again, thanks


Sounds like it. We can throw the hot potato into what looks like the most likely hands for the problem, which at least in this case strongly appears not to be deps.edn code, nor any other particular package manager type of thing.

Alex Miller (Clojure team)22:02:19

I was on a thread last week about some Maven deploy errors (different problem) and it sounded like they've been doing some server updates. could be something on their side. They have a jira system for reporting issues if you wanted to ask there.

👍 4

as a first step you should try to figure out what DNS settings you have in /etc/resolv* and what address types you have set, with ip addr show

👍 4

reddest herring this week: I actually had a stable and good connection to maven. The error was arriving at installation time and not build time, which in nix means no unwanted side effects (calling http(s)) are allowed. So, to take my question in a new direction: why would the clojure executable try to download clojure-1.10.1 when I specify its jar location on the classpath? My classpath is (stored as classp)

and command is
${clojure}/bin/clojure -Scp ${classp} -e "${compileClojurescript true}"


well, I'll open a ticket if I find a clear regression between previous versions. No need to solve it here now.


As a guess (I do not know the implementation details), there might be cases where clojure finds a different version in the local file system, and timestamps or I-don't-yet-know-what cause it to make network connections to see if a newer version is available?


that could be; for security reasons, all file timestamps are set to 0 (1 Jan 1970).


If you want to absolutely prevent this possibility, you could use the clojure command to create a classpath as its output, and save that in a one-line bash script


Using -Spath option.


before you reach installation time, or whatever step doesn't allow you to make network connections.


hmm, good point


that should be an easy thing to test. The maven endpoints are initially made this way: I make tools-deps create the classpath and maven endpoints, have nix download them for me, and then provide them to clojure again. It has worked fine for the last 2 years.


then the question is whether the generator and the build environment are mismatched; though I updated 2 months ago, there's a chance that nixpkgs has even newer clojure data.


Something changed, but I don't know what. Depending upon how much time you like tracking down such root causes, and how important it is, you can try, or just change steps to avoid it happening.

👍 4
Alex Miller (Clojure team)00:02:50

looking at the clojure script, I think it will still do staleness checking (whether one of the deps.edn files is newer than the cached classpath file), and potentially download and create an unused classpath file even if you -Scp

Alex Miller (Clojure team)00:02:02

that seems like a bug, I don't think there's any reason to do that


I was thinking one could use clojure -Spath ... > mypath.txt when network connections were allowed, and java -cp "$(cat mypath.txt)" plus other needed options to the java command where you were not supposed to make network connections, but sure, making network connections unnecessarily in clojure / clj programs seems like a bad idea.
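Spelled out, that two-step approach looks like this (assumes the clojure CLI and java are on the PATH; the -e expression is just a placeholder):

```shell
# Step 1: resolve deps and capture the classpath while network is allowed.
clojure -Spath > mypath.txt

# Step 2: later, run with plain java and no dependency resolution at all.
java -cp "$(cat mypath.txt)" clojure.main -e '(println "hello")'
```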

Alex Miller (Clojure team)00:02:06

using clojure -Scp for the latter should have that effect


Maybe it is completely unrelated, but here we discovered that our Java mvn builds were throwing warnings because we were still using , and it seems it is not supported anymore. The error reporting of this particular failure path is not really clear. It appeared all of a sudden, around two weeks ago...


Is supported still, and they are deprecating/phasing-out http?

Alex Miller (Clojure team)05:02:32

they used to transfer you, but now it's just dead - that changed at end of 2019 I think


Yeah just http is dead 😁