
Heck, even Clojure had them before then with structs :)


this has been forever in Haskell probably? probably earlier in OCaml?

data Record = Record { a :: String } deriving (Eq, Show)


Probably earlier than that? • ALGOL Bulletin 1967, page 7: Hoare record classes, not implemented in ALGOL, but ended up in Simula 67 • Pascal records ~1970 • defstruct from the Lisp Machine manual, 1984


category theory goes back to at least 1945: 😉


eh eh if it wasn’t implemented in a language at that point it doesn’t count :)


just for the … record (ba dum tss), record syntax was introduced in 1996 (Haskell 1.3), but it’s just syntactic sugar
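To make the "just syntactic sugar" point concrete, here's a rough sketch (my gloss, not from the thread) of what the record declaration above desugars to: a plain constructor plus a generated accessor function.

```haskell
-- The record declaration
--   data Record = Record { a :: String } deriving (Eq, Show)
-- is (roughly) sugar for a plain constructor plus an accessor:
data Record = Record String deriving (Eq, Show)

a :: Record -> String
a (Record x) = x

main :: IO ()
main = putStrLn (a (Record "hello"))  -- prints "hello"
```

(The sugar also buys construction and update by field name, e.g. `r { a = "new" }`, which the plain-constructor version above doesn't show.)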

Lennart Buit12:05:50

and super annoying to use

Nathan Knox13:05:43

Man I love this chain...something about interesting people sharing interesting info...Thanks everyone!

Daniel Hines19:05:55

Thoughts on advanced degrees in computer science?

Daniel Hines19:05:08

How useful are they in a career?


Depends on the career. Pretty critical if you want to be a professor of Comp. Sci. at most universities.

Daniel Hines19:05:44

What about in industry?

Daniel Hines19:05:01

I was kind of inspired by @niko963’s Conj talk. When he described working with PhD’s on sweet Clojure tech as part of a Master’s thesis… that sounded pretty cool.


a lot of the people at Galois working on provably correct software are PhDs (at least masters)


everyone else I know has a bachelors at best, but still making good money


a more advanced degree might enable you to work on more interesting problems but there are probably better ways to make more money

Daniel Hines19:05:46

I have a BS in Exercise Science. Needless to say, useless for software, beyond a few basic understandings of science in general. I wonder if I could jump straight into a Master’s though. I really wouldn’t want to do a BS again.

Nathan Knox19:05:24

@d4hines Anecdotal at best, but I do not have a degree in CS. I have useless for software undergraduate degrees, and am a consulting technical architect and software engineer. Granted, I don't work exclusively with Clojure, so I may not be your ideal test subject. However, my point is more that advanced degrees (in any subject) are not strict prerequisites for a career in software (general, I know...but I don't know what specific careers you are considering).


The more life-threatening something is the more likely a degree is required.


with the exception of contract killers



Nathan Knox20:05:43

Here I go coding again!


@ztellman just went to work for applied semantic machines which has the chair of the linguistics department from stanford and this guy:


^ that is absolutely amazing

Nathan Knox20:05:55

@dpsutton Oh man...that guy is my hero. He knows exactly what he's doing, putting that last line there. He could have omitted it, but he really needed to say here's my e d u c a t i o n haha

Lennart Buit20:05:59

[Other story, maybe thread this?] Let's say I have a Git branch ‘feature’ that I started off another branch, ‘skeleton’. Now I've started writing another feature, ‘feature2’, that depends on changes made to the skeleton in the feature(!) branch (stupid!). What is the cleanest way to move these (obviously mediocrely committed) changes to the skeleton branch?

Lennart Buit20:05:00

I went ahead and upgraded react & deps in feature, that was … stupid


the guy above him is a marshall scholar with a phd, below him is a woman with a phD from university of washington. this guy is awesome

Nathan Knox20:05:31

@lennart.buit from a Git perspective, or Clojure-specific?

Lennart Buit20:05:52

Git ^^! That's why it's in this chan 🙂

Nathan Knox20:05:56

Sure, just confirming! Sounds like a use case for rebase to me.

Daniel Hines20:05:35

I’ve recently discovered git cherry-pick as well - super useful.

Lennart Buit20:05:37

Hmm, I fail to understand how. There are changes in feature that should have been in skeleton. I could rebase feature onto skeleton, but that is already the case…

Lennart Buit20:05:37

Yeah, if I were properly Git’ing, cherry-pick + rebase -i would have saved my skin. But the changes on feature are sadly intertwined with the changes that should have been in skeleton.

Lennart Buit20:05:41

I think I’ll bite the bullet, and manually filter the feature branch with an interactive rebase

Lennart Buit20:05:47

… but after a good night's rest :’)
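For the record, the cherry-pick-then-rebase route discussed above can be played out in a throwaway repo. This is a sketch only: the branch and file names are invented to mirror the situation in the thread (a change for ‘skeleton’ accidentally committed on ‘feature’).

```shell
set -e
dir=$(mktemp -d) && cd "$dir"
git init -q
git config user.email you@example.com
git config user.name you

# Recreate the mess: one commit on 'feature' really belongs on 'skeleton'.
echo base > base.txt && git add . && git commit -qm "base"
git checkout -qb skeleton
echo skel > skeleton.txt && git add . && git commit -qm "skeleton work"
git checkout -qb feature
echo fix > skeleton-fix.txt && git add . && git commit -qm "belongs in skeleton"
echo feat > feature.txt && git add . && git commit -qm "feature work"

# 1. Copy the misplaced commit onto skeleton:
git checkout -q skeleton
git cherry-pick feature~1

# 2. Rebase feature; the now-duplicate commit is dropped automatically,
#    because a patch-identical commit already exists upstream:
git checkout -q feature
git rebase -q skeleton
git log --oneline
```

When the misplaced changes are intertwined with other work in the same commits (as in the thread), an interactive rebase (`git rebase -i`) to split the commits first is the extra step this sketch skips.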

Daniel Hines20:05:56

@dpsutton Are you citing him as a model example or as an exception that proves a rule?


i'm citing him as a badass 🙂


and i guess proof that you can work anywhere despite pedigree. although it certainly helps 🙂


at my current company everybody but me has PhDs or MScs in either math or physics or cs, so as an outlier I'd definitely say that a degree helps, in certain domains


there are MIT, CMU, and stanford phds in there and they make up the vast majority though


Yeah, I have a GED and I make money writing Clojure :man-shrugging:


well you are in my club of badasses @john


I've also enjoyed classes at universities throughout my life, but I have no intention of getting the piece of paper proving my indentured servitude


@dpsutton you're making me blush 😊

Eric Ervin20:05:13

A degree in mathematics is more useful than a degree in computer science? (neither as useful as my philosophy degree)


I wouldn't have ever even considered programming/data science as a career track if it weren't for the formal logic I did in my philosophy degree. Sometimes the "useless" things pay off in unexpected ways.

Eric Ervin20:05:22

Slack thinks your react is a facepunch but I'll take it as a fistbump


😅 whoops, the latter was definitely my intent!


it's all up to the person


I just finished a philosophy class last year. I'd definitely recommend it for programmers.


I liked my mathematics degree studies. not sure how much it directly helped me get a job as a programmer


other than people seeing it and going “you must be smart”


i already programmed before it, and didn’t switch to programming as a career until I was desperate for $$$

Eric Ervin20:05:26

A graduate degree probably helps. I've worked with some people with undergraduate comp sci degrees and they only know computer hardware and applications.

Daniel Hines20:05:14

So, for those pro-graduate degree - would most faculties be amenable to something like, “Hey, I didn’t do a BS in CS, but look at all this Clojure code I’ve written, and here’s a formal proof of its correctness”?


can you do formal proofs on clojure?


bookmarked! thanks!

Daniel Hines20:05:46

Well, I was more thinking about proving the algorithm, and then implementing it in Clojure.

Daniel Hines20:05:53

Via something like TLA+

Daniel Hines20:05:53

Although, Erlang has a model checker. I wonder if Clojure could have one too.


I think core.logic could also be made the brains of a model checker with the right extensions, clp(set)


that was put in the thread under my question


and i'm excited to read it later


it is basically a dependently typed edsl in clojure


and you can use the type system of the edsl to do proofs, which is not exactly the same thing as proofs in clojure

Daniel Hines20:05:37

So what you’re saying is you can write proofs in clojure, not prove things about a particular clojure program?


from my understanding of how dependently typed proof stuff works, that is kind of the opposite of what they do


they don't prove things about a program, a program is an existence proof of the type of the program


so you could use such a thing to prove things about a program, but I don't think you get information about your program encoded in your logic (types) for free
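A tiny Curry–Howard illustration of the point above (my gloss, not something from the thread): the program is the proof of its type, and says nothing about any other program's behavior.

```haskell
-- Any total function of this type is an existence proof of the
-- proposition "A and B implies B and A"; writing the program *is*
-- doing the proof. Nothing here proves anything about some other
-- program -- the proof and the program are the same artifact.
swapPair :: (a, b) -> (b, a)
swapPair (x, y) = (y, x)

main :: IO ()
main = print (swapPair (1 :: Int, "proof"))  -- prints ("proof",1)
```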


I haven't thought about it a whole lot, but doing automated proofs that a Clojure/Java (or ClojureScript) program satisfies some kind of spec seems fraught with the implementation details of Clojure. It also depends on how much of the system you want to prove operates as desired, e.g. are you including the JVM implementation plus JIT behavior? For example, you could prove behaviors about a Clojure program assuming that conj, disj, clojure.set, and things in clojure.core "work correctly" (whatever that means formally), but then if there were a bug in the implementation of transients, and your program used those, your proof would be assuming something that wasn't true.


Decades ago, I worked with MALPAS -- a dialect of Pascal used by the Ministry of Defence, etc., that was amenable to formal proof systems.


My company was commissioned to write a C to MALPAS translator so that companies could "prove" C programs. It was an interesting project using mechanical translations of whole C codebases -- MALPAS had no global variables so all globals in C had to be traced and lifted into the main function and then passed down through the entire call tree. Our assumption regarding the standard library was that "it worked" and we mapped calls to a MALPAS "equivalent" (mostly stubs that specified the appropriate semantics).


We ended up with a three-pass analyzer/translator because of the whole-program-globals issue.


(so it was slow and anything beyond fairly simple C programs took a huge amount of time to analyze and generated massive MALPAS programs)


For arbitrary C code, or some "proof-friendly subset"? I've seen papers and articles by John Regehr that changed my opinion from "I think I know what things are safe to write in C code" to "Maybe I did with particular old C compilers that didn't take advantage of all kinds of undefined behavior, but now I don't trust myself any more".


I mean, the Linux kernel bug exposed by certain GCC compiler flag settings here just makes me wonder how the Linux kernel developers stay sane. Particularly scary is a (half-remembered, but I think true) claim that there is no static way to check whether your C program results in undefined behavior, and no warnings from most C compilers when they take advantage of it, because common C programs embed such things in #define macros.


My team back then created one of the first ANSI-validated C compiler systems. It also highlighted every undefined, unspecified, and implementation-defined behavior (based on what the standard said).


No static way is, indeed, correct for the vast majority of such things. We wrote our own VM for the runtime, and our own standard library, as well.


(IIRC, we got our ANSI certification the same day as two other vendors -- the first three certified vendors)


So, yeah, code had to be run in order to track the undefined etc behavior. That also factored into the MALPAS translation -- some constructs had to be flagged in the MALPAS code as potentially incorrect so that static analysis alone wouldn't be sufficient to verify the translated program.


And we were only doing this at the level of the ANSI standard -- we weren't generating machine code, nor optimizing any of that.