This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2019-05-09
Channels
- # announcements (12)
- # beginners (159)
- # boot (3)
- # calva (41)
- # cider (48)
- # clara (2)
- # clj-kondo (8)
- # cljdoc (8)
- # clojure (70)
- # clojure-dev (10)
- # clojure-europe (2)
- # clojure-losangeles (1)
- # clojure-nl (12)
- # clojure-spec (7)
- # clojure-uk (63)
- # clojurescript (24)
- # cursive (24)
- # datomic (22)
- # expound (17)
- # figwheel (1)
- # fulcro (176)
- # graphql (23)
- # jobs (9)
- # jobs-discuss (56)
- # kaocha (1)
- # mount (3)
- # nyc (1)
- # off-topic (91)
- # onyx (3)
- # overtone (4)
- # pathom (3)
- # pedestal (1)
- # re-frame (11)
- # reitit (19)
- # ring (8)
- # shadow-cljs (16)
- # test-check (5)
- # testing (2)
- # tools-deps (20)
- # vim (9)
Heck, even Clojure had them before then with structs :)
this has been in Haskell forever, probably? and probably earlier in OCaml?
data Record = Record { a :: String } deriving (Eq, Show)
Probably earlier than that?
• Algol Bulletin 1967, page 7. Hoare's record classes, not implemented in Algol, but they ended up in Simula 67 https://archive.computerhistory.org/resources/text/algol/ACM_Algol_bulletin/1061086/p5-naur.pdf
• Pascal structs ~1970
• defstruct, from the Lisp Machine manual, 1984 https://archive.org/details/bitsavers_mitcadrchiualJun8420Defstruct_1832300
category theory goes back to at least 1945: https://en.wikipedia.org/wiki/Product_(category_theory) 😉
just for the … record (ba dum tss), record syntax was introduced in 1996 (Haskell 1.3), but it’s just syntactic sugar
and super annoying to use
Man I love this chain...something about interesting people sharing interesting info...Thanks everyone!
Thoughts on advanced degrees in computer science?
How useful are they in a career?
Depends on the career. Pretty critical if you want to be a professor of Comp. Sci. at most universities.
What about in industry?
I was kind of inspired by @niko963’s Conj talk. When he described working with PhD’s on sweet Clojure tech as part of a Master’s thesis… that sounded pretty cool.
a lot of the people at Galois working on provably correct software are PhDs (at least masters)
a more advanced degree might enable you to work on more interesting problems but there are probably better ways to make more money
I have a BS in Exercise Science. Needless to say, useless for software, beyond a few basic understandings of science in general. I wonder if I could jump straight into a Master’s though. I really wouldn’t want to do a BS again.
@d4hines Anecdotal at best, but I do not have a degree in CS. I have useless-for-software undergraduate degrees, and am a consulting technical architect and software engineer. Granted, I don't work exclusively with Clojure, so I may not be your ideal test subject. However, my point is more that advanced degrees (in any subject) are not strict prerequisites for a career in software (general, I know...but I don't know what specific careers you are considering).
Here I go coding again!
@ztellman just went to work for applied semantic machines, which has the chair of the linguistics department from Stanford and this guy:
@dpsutton Oh man...that guy is my hero. He knows exactly what he's doing, putting that last line there. He could have omitted it, but he really needed to say here's my e d u c a t i o n haha
[Other story, maybe thread this?] Let's say that I have a branch in Git, feature, that I started off another branch, skeleton. Now I started writing another feature, feature2, that depends on changes made to the skeleton in the feature (!) branch (stupid!). What is the cleanest way to move these (obviously mediocrely committed) changes to the skeleton branch?
I went ahead and upgraded react & deps in feature, that was … stupid
the guy above him is a Marshall Scholar with a PhD, below him is a woman with a PhD from the University of Washington. this guy is awesome
@lennart.buit from a Git perspective, or Clojure-specific?
Git ^^! That's why it's in this chan 🙂
Sure, just confirming! Sounds like a use case for rebase to me.
I’ve recently discovered git cherry-pick as well - super useful.
Hmm, I fail to understand how. There are changes in feature that should have been in skeleton. I could rebase feature onto skeleton, but that is already the case…
Yeah, if I were properly Git’ing, cherry-pick + rebase -i would have saved my skin. But the changes on feature are sadly intertwined with the changes that should have been in skeleton.
I think I’ll bite the bullet, and manually filter the feature branch with an interactive rebase … but after a good night’s rest :’)
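For the simpler case where the misplaced commits are self-contained (unlike the intertwined ones described above), the cherry-pick + rebase recipe plays out roughly like this. The branch names come from the discussion; the repo, file names, and commit messages are invented for illustration:

```shell
set -e
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email "demo@example.com"
git config user.name "Demo"

git commit -q --allow-empty -m "base"
git branch skeleton                        # skeleton starts at base
git checkout -q -b feature skeleton
echo "belongs on skeleton" > skel.txt
git add skel.txt && git commit -q -m "skeleton change"   # misplaced commit
echo "feature work" > feat.txt
git add feat.txt && git commit -q -m "feature work"

# 1. Replay the misplaced commit onto skeleton:
git checkout -q skeleton
git cherry-pick feature~1
# 2. Rebase feature; git skips the patch skeleton now already has:
git checkout -q feature
git rebase -q skeleton
git log --oneline
```

Note that plain `git rebase` compares patch-ids against the upstream branch, so the duplicated "skeleton change" commit is dropped from feature automatically; for intertwined commits you'd need `rebase -i` (or `git add -p` plus new commits) to split the changes apart first.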
@dpsutton Are you citing him as a model example or as an exception that proves a rule?
and i guess proof that you can work anywhere despite pedigree. although it certainly helps 🙂
at my current company everybody but me has a PhD or MSc in either math, physics, or CS, so as an outlier I'd definitely say that a degree helps, in certain domains
there are MIT, CMU, and Stanford PhDs in there and they make up the vast majority though
I've also enjoyed classes at universities throughout my life, but I have no intention of getting the piece of paper proving my indentured servitude
A degree in mathematics is more useful than a degree in computer science? (neither as useful as my philosophy degree)
I wouldn't have ever even considered programming/data science as a career track if it weren't for the formal logic I did in my philosophy degree. Sometimes the "useless" things pay off in unexpected ways.
Slack thinks your react is a facepunch but I'll take it as a fistbump
😅 whoops, the latter was definitely my intent!
I just finished a philosophy class last year. I'd definitely recommend it for programmers.
I liked my mathematics degree studies. Not sure how much it directly helped me get a job as a programmer
I already programmed before it, and didn’t switch to programming as a career until I was desperate for $$$
A graduate degree probably helps. I've worked with some people with undergraduate comp sci degrees and they only know computer hardware and applications.
So, for those pro-graduate degree - would most faculties be amenable to something like, “Hey, I didn’t do a BS in CS, but look at all this Clojure code I’ve written, and here’s a formal proof of its correctness”?
why yes, yes you can! https://github.com/latte-central/LaTTe
Well, I was more thinking about proving the algorithm, and then implementing it in Clojure.
Via something like TLA+
Although, Erlang has a model checker. I wonder if Clojure could have one too.
I think core.logic could also be made the brains of a model checker with the right extensions, clp(set)
and you can use the type system of the EDSL to do proofs, which is not exactly the same thing as proofs in Clojure
So what you’re saying is you can write proofs in clojure, not prove things about a particular clojure program?
from my understanding of how dependently typed proof stuff works, that is kind of the opposite of what they do
they don't prove things about a program; a program is an existence proof of its type
so you could use such a thing to prove things about a program, but I don't think you get information about your program encoded in your logic (types) for free
I haven't thought about it a whole lot, but doing automated proofs that a Clojure/Java (or ClojureScript) program satisfies some kind of spec seems fraught with the implementation details of Clojure. It also depends upon how much of the system you want to prove operates as desired, e.g. are you including the JVM implementation plus JIT behavior? For example, you could prove behaviors about a Clojure program assuming that conj, disj, clojure.set, and things in clojure.core "work correctly" (whatever that means formally), but then if there is a bug in the implementation of transients, and your program used those, your proof would be assuming something that wasn't true.
Decades ago, I worked with MALPAS -- a dialect of Pascal used by the Ministry of Defence etc that was amenable to formal proof systems.
My company was commissioned to write a C to MALPAS translator so that companies could "prove" C programs. It was an interesting project using mechanical translations of whole C codebases -- MALPAS had no global variables so all globals in C had to be traced and lifted into the main function and then passed down through the entire call tree. Our assumption regarding the standard library was that "it worked" and we mapped calls to a MALPAS "equivalent" (mostly stubs that specified the appropriate semantics).
We ended up with a three-pass analyzer/translator because of the whole-program-globals issue.
(so it was slow and anything beyond fairly simple C programs took a huge amount of time to analyze and generated massive MALPAS programs)
For arbitrary C code, or some "proof-friendly subset"? I've seen papers and articles by John Regehr that changed my opinion from "I think I know what things are safe to write in C code" to "Maybe I did with particular old C compilers that didn't take advantage of all kinds of undefined behavior, but now I don't trust myself any more".
I mean, the Linux kernel bug exposed by certain GCC compiler flag settings here just makes me wonder how the Linux kernel developers stay sane: https://lwn.net/Articles/575563/ Particularly scary is a (half-remembered, but I think true) claim that there is no static way to check whether your C program results in undefined behavior, and that most C compilers give no warnings when they take advantage of it, because common C programs embed such things in #define macros.
My team back then created one of the first ANSI-validated C compiler systems. It also highlighted every undefined, unspecified, and implementation-defined behavior (based on what the standard said).
No static way is, indeed, correct for the vast majority of such things. We wrote our own VM for the runtime, and our own standard library, as well.
(IIRC, we got our ANSI certification the same day as two other vendors -- the first three certified vendors)
So, yeah, code had to be run in order to track the undefined etc behavior. That also factored into the MALPAS translation -- some constructs had to be flagged in the MALPAS code as potentially incorrect so that static analysis alone wouldn't be sufficient to verify the translated program.
And we were only doing this at the level of the ANSI standard -- we weren't generating machine code, nor optimizing any of that.
Has any attempt been made to model HTML in Datalog? It would seem relatively straightforward to go from a tree to a graph and back, and conceptually I would think there might be some query and transaction benefits to being able to use Datalog. I was able to find this article https://juxt.pro/blog/posts/datascript-dom.html and some initial fiddling and hammock time both confirm the idea is at least possible, if not practical. As a hypothetical possibility, you could then query your HTML as part of understanding the project, e.g. show me all the places we have lists. Another idea is that it would be interesting to consider building your HTML more as individual transactions, rather than editing the HTML tree itself (though I currently can't imagine why this would be better).
I’m highly interested in this topic, hopeful it will lead to some breakthrough toward a WYSIWYG GUI designer. Not exactly sure how, but Datalog hasn’t failed to be fruitful in any domain I’ve applied it to yet, so, here’s hoping 🥂
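For what it's worth, the "HTML as datoms" idea above can be sketched in a few lines of DataScript. The schema and the :node/* attribute names here are invented for illustration, not taken from the linked article:

```clojure
;; A minimal sketch of modeling an HTML tree in DataScript.
;; Assumes datascript is on the classpath; :node/* attributes are made up.
(require '[datascript.core :as d])

(def schema
  {:node/children {:db/valueType   :db.type/ref
                   :db/cardinality :db.cardinality/many}})

;; <div><ul><li>one</li><li>two</li></ul></div> as entities
;; (negative ids are tempids resolved at transaction time):
(def db
  (d/db-with (d/empty-db schema)
             [{:db/id -1 :node/tag :div :node/children [-2]}
              {:db/id -2 :node/tag :ul  :node/children [-3 -4]}
              {:db/id -3 :node/tag :li  :node/text "one"}
              {:db/id -4 :node/tag :li  :node/text "two"}]))

;; "Show me all the places we have lists":
(d/q '[:find ?e ?text
       :where
       [?e :node/tag :ul]
       [?e :node/children ?li]
       [?li :node/text ?text]]
     db)
;; => a set of [ul-entity-id item-text] tuples, one per list item
```

The "build HTML as transactions" idea then falls out naturally: each edit is a tx-data vector against the previous db value, rather than a mutation of the tree.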