
Docker Desktop is becoming a pain. The config.json for authentication is no longer written in a form that lets you create a Kubernetes secret out of it.


Someone sent me a blog post about using an existing config.json from a Linux box and fudging that in instead, which is hardly the best way of doing things IMHO. The M1 is a joy all in all, but there are some small gaps that are costing me time, all down to Docker and private registries.


It's interesting to me, as an in-house dev, why Docker is even a thing. I can understand it if you're working on a multitude of projects for different clients, but for me it's just another layer of indirection that provides little to no value; in fact I'd say its value is net negative for me.
1) I can't access my database directly (now it's `docker exec mongo` or some such).
2) I have to give the Docker thingy resources up front, like a bunch of cores, memory, and disk. This may or may not be a problem, and might very well be me not understanding how things work. Anyhow, I have no interest in figuring it out.
3) I could agree that it's somewhat nice to have my dev env up and running rather quickly, but that could also have been solved by a quick babashka script.
4) Everything we run of our own software is in Clojure, which AFAIK runs on the JVM, i.e. a virtual machine.
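For point 3, a babashka script for bringing up a dev environment might look something like the sketch below. The service, paths, and ports are all hypothetical, not taken from the chat; it just illustrates the "quick babashka script instead of Docker" idea:

```clojure
;; dev.clj — hypothetical sketch of a babashka dev-env script.
;; Assumes mongod is installed locally; paths/flags are placeholders.
(require '[babashka.process :refer [shell]]
         '[babashka.fs :as fs])

(defn start-mongo! []
  ;; Run mongod directly on the host, so you can connect with the
  ;; normal CLI tools instead of going through `docker exec`.
  (fs/create-dirs "data/db")
  (shell "mongod" "--dbpath" "data/db"
         "--fork" "--logpath" "data/mongod.log"))

(defn -main [& _]
  (start-mongo!)
  (println "dev services up"))
```

Run it with `bb dev.clj`; since everything is a host process, there's no up-front resource allocation and the database is reachable directly.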

☝️ 1

Strong endorsement from me

Rachel Westmacott 10:11:35

there's definitely some tension between docker and the JVM

Martynas Maciulevičius 11:01:04

You can use different versions of the same database at the same time


Yah, I understand that docker provides some value, but docker enthusiasts know the value of everything and the cost of nothing.


Rant mode off.


@slipset I agree that it's nice to run things directly when you can, but in my previous job I think Docker really was the right choice. We needed to run a specific version of Postgres with a specific plugin that only works with that specific Postgres version. Being able to run this on macOS regardless of your global environment saves many headaches. Similar for other services we used, Elasticsearch with specific plugins etc.
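That "specific Postgres version plus plugin" setup is the kind of thing a pinned Compose file captures well. A minimal sketch, where the image tag and the init script are assumptions (the chat doesn't name the actual version or plugin):

```yaml
# docker-compose.yml — sketch; version tag and plugin script are hypothetical
services:
  db:
    image: postgres:11.6          # the exact version the plugin requires
    environment:
      POSTGRES_PASSWORD: dev
    ports:
      - "5432:5432"
    volumes:
      # install/enable the plugin on first startup
      - ./init-plugin.sql:/docker-entrypoint-initdb.d/init.sql
```

`docker compose up -d` then gives every dev the same Postgres, regardless of what is installed globally on their machine.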


I'm more interested in Docker as a delivery artifact. For example, we're using AWS Elastic Beanstalk and that still doesn't have support for JVM 17. But I could package our app in a container image and then I'd have full control over that.


But I still need a way to run said image locally, to troubleshoot.


Also nice to have a replicable environment for building artefacts, as with multi stage builds.
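Both points, shipping a JVM 17 image and replicable builds, come together in a multi-stage Dockerfile. A sketch for a Clojure app; the build task and jar path are assumptions about the project layout, not details from the chat:

```dockerfile
# Sketch: multi-stage build for a Clojure uberjar on JVM 17.
# Stage 1: build inside a pinned toolchain image (replicable builds).
FROM clojure:temurin-17-tools-deps AS build
WORKDIR /app
COPY . .
RUN clojure -T:build uber        # assumes a tools.build uberjar task

# Stage 2: slim runtime image — this is the delivery artifact.
FROM eclipse-temurin:17-jre
WORKDIR /app
COPY --from=build /app/target/app.jar app.jar
CMD ["java", "-jar", "app.jar"]
```

The same image then runs locally with `docker run` for troubleshooting, and ships to the platform unchanged, sidestepping what the platform does or doesn't support JVM-wise.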


One thing I struggled with was using it for development with ClojureScript, where it just got too slow.


I find Docker very useful for dealing with services I need for development but don't need/want to have to manage: Redis, Elastic Search (two instances), and Percona -- all specific versions. I don't want to have to deal with installing all that on each dev machine. I also use it to run a variety of databases needed for testing stuff like java.jdbc and next.jdbc.
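That set of services, pinned to specific versions, is straightforward to express as a Compose file. The tags below are placeholders (the chat doesn't say which versions); the two Elasticsearch entries mirror the "two instances" mentioned:

```yaml
# Sketch: pinned dev services; all version tags are assumptions.
services:
  redis:
    image: redis:6.2
  es1:
    image: elasticsearch:7.17.0
    environment:
      - discovery.type=single-node
  es2:
    image: elasticsearch:7.17.0
    environment:
      - discovery.type=single-node
  percona:
    image: percona:8.0
    environment:
      MYSQL_ROOT_PASSWORD: dev
```

One `docker compose up` replaces installing and version-managing all of those on each dev machine.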


If you only have jvm things and no external services you need to run locally then I agree docker doesn't solve many problems. The R stuff I run is really, really brittle and often breaks in point versions

Rachel Westmacott 10:11:51

We use docker for our R code. I pity/empathise with anyone who tries to put R into production.


Luckily production is still a local machine. I'd still like things to work consistently. That doesn't really seem to be a design goal of R tho

😂 1

We use docker to spin up localstack, which gives us a lot of parity with AWS services locally, allowing us to test things out without affecting (or incurring costs on) AWS

❤️ 1

(as well as our db, redis, es etc...)
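A LocalStack service slots into the same Compose file as the db, redis, and es containers. A minimal sketch; which AWS services to emulate is an assumption here:

```yaml
# Sketch: LocalStack alongside other dev services.
services:
  localstack:
    image: localstack/localstack
    ports:
      - "4566:4566"               # LocalStack's single edge port
    environment:
      - SERVICES=s3,sqs,dynamodb  # services to emulate (assumption)
```

Client code and the AWS CLI are then pointed at the local endpoint, e.g. `aws --endpoint-url=http://localhost:4566 s3 ls`, so nothing touches (or bills against) the real AWS account.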


I've also used localstack for testing an AWS library