
@john immutable OO still has the issue of classes breaking when other classes change


not if classes are immutable 😉


immutable ... change 😄


this is why it's so good that clojure makes it hard to use OO without designing interfaces first


yknow if i wrote my classes into stone tablets it probably wouldn't be that bad 🙂


bet they'd be slow


idk StoneVM has gotten a lot better


the fact that defrecord doesn't just let you make up new methods ad-hoc has saved me from so many half-assed poor design decisions...
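a minimal sketch of what I mean (Shape/Rect are made-up names):

```clojure
;; defrecord can only implement methods that some protocol (or
;; interface) declares up front -- no inventing methods ad hoc.
(defprotocol Shape
  (area [this]))

(defrecord Rect [w h]
  Shape
  (area [_] (* w h)))

(area (->Rect 3 4)) ;=> 12
```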


I love looking back on bad design choices


it's a good feeling


Every mistake is a learning experience!


only after you debug it 🙂


shudders java php


I really haven't messed much with clojure's OO.


I prefer to build everything in raw data first


If something shines, then I'll make it a datatype.


well - every time you call seq (or call something that implicitly calls seq) you are leveraging clojure's OO
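for example (all of these go through the same seq abstraction):

```clojure
;; first doesn't care what the concrete collection is;
;; it calls seq on its argument behind the scenes
(first [1 2 3])     ;=> 1
(first {:a 1 :b 2}) ;=> [:a 1] (a map entry)
(first "abc")       ;=> \a
```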


though you don't have to think about it


I built an entire OO library in clojure before I knew better. Chucked it out and replaced it with a more functional design in 1/3 the code that runs faster and uses less memory.


I switched to Java from C++ in '97 because I'd gotten frustrated with the complexity and obscurity of a lot of C++'s quirks. At first I liked the simple syntax and English-like "prose" of Java... up until around Java 5 🙂


C++ is getting kind of crazy. The lambda syntax uses every kind of bracket




@noisesmith I think that's a confusing way to look at Clojure tho' ... thinking about the OO-ness inherent in the abstractions ... you're thinking about implementation details, IMO.


Once I wrote an elisp program that produced every unicode character that was a "matching delimiter" - I wonder if I still have that around somewhere - there are so many that are unused


@benbot Yeah, I agree. I have a hard time with what C++ has become since the '98 standard


algol had some that could only be created if you could backspace and overstrike 2 characters 🙂


@wottis yeah it really does change your mindset


(I think it was algol, never used it)


how did they implement that?


@terry.poot I wanted to write my final year uni project in Algol 68 but my supervisor wouldn't allow it...


well, you were sitting at a typewriter/terminal, so you typed a paren, hit backspace and typed a dash


@seancorfield I think implicit datatype polymorphism based on shared interfaces is OO whatever you call it - eg. when I compare using collections in OCaml where I have to always remember to call List.fold vs. Array.fold etc... (until I start using OO in OCaml, then I can use A#fold and B#fold regardless of implementation type, but I probably need to define that myself because OCaml library writers don't tend to like to use the object system in that language)


The thing about clojure though... I'm not smart enough to wax poetic on the "expression problem", but from what I've read I have a strong feeling that clojure solves that problem with namespaces in a way that, going forward, all new languages will have to at least match. You don't hear much in clojure about monkey patches that break things everywhere.
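a tiny sketch of what I mean (Describable is a made-up protocol):

```clojure
;; extend a protocol to types we don't own, from our own
;; namespace, without touching the original classes at all
(defprotocol Describable
  (describe [x]))

(extend-protocol Describable
  String
  (describe [s] (str "a string of length " (count s)))
  clojure.lang.PersistentVector
  (describe [v] (str "a vector of " (count v) " items")))

(describe "hello") ;=> "a string of length 5"
(describe [1 2 3]) ;=> "a vector of 3 items"
```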


@seancorfield at my college, the profs were arrogant enough to say they could understand anything we could write, so we could use any language available at the university. I honestly did 90% of my homework in APL


@noisesmith You seem to think in very OOP terms... and I suspect static typing terms too... no wonder Clojure seems to be error prone 🙂


and when you pair that with how extensible clojure is, in datatypes and macros... are there any features that cannot be bolted onto clojure with libraries?


@noisesmith FWIW, I've been trying to learn Haskell properly for 25 years and the static type system just doesn't sit well with me


it's hard to bolt on a type system


haskell is poison for me - too theoretical and I get nothing done - I can manage in OCaml but there's no way I'll be making an Enterprise app with OCaml anytime soon


all projects that have tried so far have failed


For those that don't know APL, it was the first language I saw described as a write-only language (perl was the second, and doesn't hold a candle to it)


Is core.typed really a failure? or does annotating types just suck?


it's kind of uphill to try to use it, from what I gather


@terry.poot My final year project was to write an APL interpreter. Small world I guess.


@seancorfield : did you succeed? I've always gotten stuck at parsing APL ....


(I wrote it in Pascal b/c I wasn't allowed to write it in Algol 68)


I've heard that was the way to do it, but never did it.


Oh yes, fully functioning APL interpreter. It was fun.


what's really fun is when the only printer you can print on only does ascii (APL has its own character set). That turns barely readable into illegible. Kudos to the faculty for keeping their word though, they never told me to stop. Even had one prof try to help me turn my two line homework program into a one-liner, but we failed.


The department bought an APL print head for their Diablo teletype so I could print programs properly...


I wrote an apl interpreter in fortran based on a series of articles in Byte when I was in high school. Until the teacher made me stop, because my program was a whole box of punch cards and he was tired of carrying it back and forth to the district office where the mainframe was.


I'd encountered APL during my industrial placement year at an insurance company and I was totally fascinated.


Ah, Byte magazine... happy days!


lol teletypes and punch cards


At uni we backed programs up onto punch tape. Have a care, young whipper-snapper! 😆


yep, one run per day because the cards went to the district office at night and came back with printouts the next morning.


the first person to teach me to program was an experimental composer who had a fortran program that he'd drop off at the mainframe (as a stack of cards), and the next day he would pick up the cards and a reel of audio tape. Due to the nature of his hand-rolled audio software he had to use a desk calculator and slide rule to figure out the parameters that would lead to a specific pitch output.


this was all well before I met him though


I don't have any cool ancient coding stories 🙂


But I was on the clojure bandwagon since like 2007 or 8!


counts for something 😉


actually being ancient is a prereq 🙂


Always been fascinated by lisp and its ilk. Was totally shocked when my boss agreed to let me use clojure. I was going to campaign for scala because I thought that was as far as I could get. Note to people trying to sell your boss on functional programming, the phrase "100% test coverage" is effective.


maybe it's just my team, but I find people don't write tests. I keep leading workshops and encouraging people, but I don't have the authority to mandate writing tests, and people just play with it in the repl until it works and don't check in any test code


probably just a culture thing I guess


you guys are old; I started with Visual Basic.


the more I talk on this channel tonight the more I feel like I need a vacation, hah


a lot of it is. management has to buy in and mean it. TDD can actually be effective in an OO language, but yeah, the repl is quick and easy and effective, but ephemeral. In my case, my stuff got shoved from R&D straight into production, so I'm only up to about 50% or so, but I try to extend that when I can.


Re tests:

Clojure build/config 15 files 1382 total loc
Clojure source 181 files 38772 total loc,
    2582 fns, 679 of which are private,
    329 vars, 16 macros, 39 atoms
Clojure tests 115 files 12395 total loc


I like having good tests. I have a tendency towards scorched earth refactorings, and I like having a safety net. 🙂


not really old but I came from a hardware background and started with assembler for the intel C51 chips... crazy stuff.


One of the things I loved about C was i didn't have to write any more assembler


I enjoyed writing assembler...


... for a little while. Then it just got to be too much work to get anything done.


ehhh.... finding stuff in the thick paperback programming manual wasn't all that fun. mine had bookmarks sticking out all over the place


When folks ask me "What books would you recommend for learning X?" and I think "Gee, there were no books on X when I learned it..." /graybeard


I still have my copy of "Programming the 6502" by Rodnay Zaks, if that helps!


Lost mine along the way (and it might have been a different book but the title was similar)


Haha, nowadays people are asking for online interactive learning experiences 😂


I like text. I read faster than people talk, and I don't need to work all the problems (only the ones I don't know how to solve).


I remember when the "Head First" series started to appear and I was like "WTF?"... 😆


I think things have changed so much in this industry that some of the basic touchstones have changed too.


No one learns about memory models and pointers and registers any more...


tutorial style books drive me bonkers. "Just do this, see what happened, isn't that cool?" Why did that work? "That's in the next chapter"


(witness the recent discussion of barriers and memory models on the mailing list)


I had to teach a young co-worker about bit manipulation. They don't teach that anymore. He had a masters degree.


On the one hand, understanding those things is undeniably helpful. On the other hand most people will hardly use it at all. So while part of me thinks everyone should learn C and have to debug pointer problems, part of me realizes this is tantamount to torture, since they'll (probably) never have to do it again.


and I thought my school changing the CPU design course project from using a hardware description language to C was insane....


apparently none of the CompSci students got the hang of writing declarative code that models hardware


I'm in two minds... we did a compiler design course and being able to understand how some (virtual) machine works is, to me, very important as a fundamental basis for computer science.


It doesn't need to be particularly practical or specific.


But the concepts are important.


I agree, but then I do have that grounding. Yet most of my co-workers don't have that background, and for the most part, it doesn't really hurt them.


modern systems are too civilized. there's benefit to throwing the kids out in the wild with a knife and a loin cloth as a rite of passage. 🙂


or manage their own memory space, as the case may be


Yeah... that's why I'm in two minds...


In so many ways, modern computer systems shouldn't require any of that knowledge (but I don't quite think we're there yet?).


Well, they don't. Unless you're one of the people building the stuff that makes that possible. The problem is, too few people can do that, so things like JVMs are black magic to most coders.


In java classes, they'd start by saying "let's do something simple and print something to the screen with public static void main(String args[])... oh, don't worry what all those words mean - we'll tell you what void means later."


just a whole lot of convention without starting with what the meaning of the terms are.


so I'm working on a CLJS concurrency library using webworkers. I'm trying to model them as closely to Agents as possible. But they're mostly built on simple atoms. For remotely owned state, you leverage send to change it. But I'm thinking about supporting ISwap for the worker that locally owns the state.


Anyway, I'm wondering about a name... Just Agents? or I was thinking maybe Molecule, like a network of atoms?


Aw, I'm sorry @john but you've wandered back on topic and you should take that to #clojurescript 🙂 🙂


sorta 🙂


I'm too on-topic for off-topic lol


Re: ancient topics in CS degrees. One of the biggest enlightenments I had was when I finally understood pointers in C, which was actually recommended before I started university; it took me several months back then. We also did a lot of graphics and database stuff at our university, none of which I use today.

During my years as a professional and student (I did a lot of computer-related work back then already) I did a lot of very different stuff: network administration, exchange server stuff, active directory, linux setup in small offices, creating several programs with the W32 API, building web pages, ... I hardly ever directly applied what I learned in my studies there, but there are two things that are important and where I agree with @seancorfield: 1. the concepts you learn, 2. the ability to learn about a topic/problem and solve it, as well as the self-esteem to believe you can do so.

That's why I also find it intriguing that so many jobs are tied to a specific programming language, especially in corporate environments. Getting into a new technology takes me days to weeks, depending on the topic and the tooling environment, which is nothing compared to the other skills one acquires over the years. To make my point: I think it does not matter if you teach punch cards, C, OOP concepts, or whatever, as long as you teach, and teach students to learn for themselves. And of course, as long as you teach mathematics 😄


Any idea if there's already a tool to parse email list archives like this?


@noisesmith I find Haskell enlightening in many parts, but sometimes frustrating because of the “academic language”. But that’s because, well, it’s the community they’re in. Now that people are seeing value in functional programming languages, maybe the early adopters - and I mean the early adopters outside of academia - will bring it closer to the masses without all that Mathematical-ish jargon.


But that "mathematical-ish jargon" really is just the core underpinnings of computer science and the root of being able to build correct, simple (in the maintainable sense) software.


yeah but it can hinder you when you’re trying to be a bit more practical. The ideals are what you want to take away more than the semantics imo


it’s kind of the difference between science (or academia) and engineering. I think haskell is a good science language that people like using for engineering


kind of like how python is a good engineering language that was taken up and made good for science


It's all a matter of learning curve ultimately. It takes ages to learn to produce non-shitty quality haskell code, and then there's a good chance that if you get employed to code in haskell, the people there might have a very different style too (many ways to do the same task). Most popular langs make things simple and boring: go, python, etc.


and that's why these are the languages used in "science" (at large) too; most don't have the patience/time to get up to speed


most of the people I know working in "science" (physics, ee, etc) use python/go/c/java/fortran(!) and probably never even heard of Haskell


Haskell is much bigger in academic mathematics


The same could be said for, e.g., chemistry. Anyone can start mixing chemicals without knowing what's going on, but you'd better have some idea of the underpinning theories when things go wrong, or when you're asked to alter things arbitrarily.


A lot of what happens in a CS degree is intuition tuning.


idk if i agree with that


CS degrees vary waaay too much depending on what uni you go to


The uni near me has an awful CS program… it’s basically poor training for 30 year old problems


yes but a lot of the stuff that's considered basic CS knowledge (algo, data structures, sys etc) is not something exclusive to functional programming (or haskell)


that isn’t even taught well (or at all) at some unis


The one I went to actually removed their discrete math class


replaced it with a more practical, more watered down version


The mathematical theory of computation is a very different space than the mathematical basis used for functional langs such as Haskell. One's an underlying reality (discrete math and logic), the other is a modeling tool.


The underlying reality is highly important to know, even if it's just enough to get by.


yeah but a basis in that reality can give context to the computation theory


I'd still argue a functional language is a better way to start students thinking about modeling, but don't know if it extends to pragmatism in a career.


oh for sure


SICP was the intro to programming textbook for MIT for like 40 years


isn't it python now 🙂 ?


but until recently it was much harder to get a job without OOP concepts


but I wouldn't worry for MIT students


With our luck it'd be JS 😛


I think it was because the professor got bored though lol


I remember reading an interview about why they changed and it sounded kind of like a mix of “i’m bored of that book” and “programming today is more about using other people’s work than making your own”


well as boring as python is, it's used everywhere and quite good at a lot of things


I think the professor who wrote SICP left. A year or two later they switched.


I think he’s still there


idk last update 2014


I might be thinking of something else tbh 🙂


I hate that we consider "programming by poking" the new norm.


I think it’s a really great way to introduce yourself to programming


if you're in the field it shouldn't be, but nowadays everybody uses programming


then when you want to start writing good software, the lessons and “aha” moments you get are much more impactful


@mpenet s/programming/poking ?


I learned by that “poking” method until I started reading about good OOP design in ruby, then went through that Land of Lisp book and the amount of realizations i got


left quite an impact


I think you get a different view going from the theory up


Me too. Clojure feels significantly less "pokey," the source code is readable, the libraries are simple, it's a nice world.


When I first learned about immutable state I didn’t really get why my team was so into it… but then something happened where I traced all the state changes back to some weird function call… and my eyes were opened 0.0


idk… I’m a fan of the programming by poking method for intro courses


@dominicm I like how funny the community is with a lot of things… like lein and figwheel


good sense of humor


python is a good one too for humour.


i’ve only used python for playing around with ML… no humor there 😞


There's an article about ML and fizzbuzz. It's fiction where the interviewee uses ML to solve fizz buzz


The name python comes from Monty Python, and a few things make references to them, like the cheese shop, which is the python package index.


My first functional language was Standard ML... taught in a programming languages course. (oh wait, are we talking about Machine Learning? 😛)


I'm really not a huge fan of python... the tooling just feels like using some shell script people hacked together. I'd take npm over pip any day, it may be slow but it works!


I think poking is a great way to get started with a language; it gets me motivated to learn more about it after seeing things work.... This didn't work for me with Haskell though: I spent 2 days trying to get a JSON endpoint up and failed (gave up after that). I think I can sacrifice some elegance/correctness for simplicity by sticking with Clojure for now...


Haskell puts a lot of weight on modeling things up front based on a very wide set common abstractions; if you screw up the modeling, everything else becomes difficult to change. Comprehensive knowledge of your modeling tools is essential. Clojure uses a narrow set of common abstractions instead; perhaps harder to specify exquisitely mathematical models in the same way Haskell can, but since Clojure uses more general-purpose abstractions, it's easier to mold and shape it to shifting needs.


there are also things like databases, where the schema is provided at runtime; in clojure you just use a map; in haskell, you have to do some crazy generic category foo to get a piece of data that satisfies a schema you only learn at runtime
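e.g. (a toy sketch, made-up keys):

```clojure
;; a "row" whose columns are only known at runtime is just a map
(def row {:id 1 :name "Ada" :column-we-learned-about-at-runtime 42})

;; and all the generic map functions work on it regardless
(select-keys row [:id :name]) ;=> {:id 1, :name "Ada"}
(update row :id inc)          ;=> {:id 2, :name "Ada", ...}
```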


@akiroz have you seen yarn? It’s a billion times better than npm


Also I think lisps are pretty elegant… maybe not correct… but still elegant


I quite like haskell but it’s super brittle, all of the work is in trawling through unintelligible errors getting your code to compile


which isn’t always fun


I like clojure but all the effort is making sure it runs in a number of edge cases, usually always compiles 😄


currently looking at spec tho, that should help me avoid some schoolboy errors


I don’t like haskell actually, but I’m amused to see a clojure fan call its errors unintelligible


benbot: yep, been using yarn for a while now~ 😄


Emojis are great in it :)


they are pretty unintelligible, and I do a lot of C++ template metaprogramming 😄


Code tree on github .... better file navigation ....


speaking of haskell, I found the language pragmas to be quite a pain....


Wow that code tree is awesome


cheers for the tip 🙂


is there a "law" for how progress slows as you get closer to the finish line? i've been about 2 days from an official release for about 2 months now. the farther in you get, the more expensive it is to change anything, even if it's small, since at least you have to go through all of your wonderful sample code and documentation. so the last 10% seems to take about 95% of the time. seems like there ought to be a witty "Fuber's Law" along those lines, but i can't recall one. e.g. once you get past x% done, every additional % done will cost you 100-x + delta, so you're not done until you've spent well over 100%. :)


sounds like the natural consequence of the pareto principle (you get 80% of the result in 20% of the time, meaning your progress slows by 400% once you are in the last 20% of actual result left)


if the pareto principle was a recursive phenomenon, that would mean each remaining portion would take 4x as long as the previous step
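here's a toy model of that recursion (just illustrative arithmetic, not from anywhere): at every level, 80% of the remaining work takes 20% of the remaining time, so the cost per unit of work quadruples with each chunk.

```clojure
;; returns the time-per-unit-of-work for each successive chunk
;; under a recursively applied 80/20 rule
(defn pareto-chunks [n]
  (loop [work 1.0 time 1.0 acc []]
    (if (= n (count acc))
      acc
      (let [w (* 0.8 work)  ; work finished in this chunk
            t (* 0.2 time)] ; time that chunk takes
        (recur (- work w) (- time t)
               (conj acc (/ t w)))))))

(pareto-chunks 3) ;=> roughly [0.25 1.0 4.0]
```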


hey, i've even heard of that. sounds about right. but it lacks charm.😉


also, in my experience, if I find that changes are getting difficult to implement, that can be a learning experience I can apply to my next design


what i'm experiencing now is not so much that a particular change is hard to implement, but that the system has attained a degree of complexity that means it takes longer and longer even if it is simple (you not only make the source changes, you have to make all the test, doc, etc. changes - and then you have to review and double-check it all ad infinitum).


the closer you get to "done", the longer it takes to finish?


sounds exactly like Xeno’s paradox


wow! i just looked that up on google!


i mean before i saw your msg. must be right.


Xeno's (Zeno's?) Law of Software Development?


oh, right, it’s Zeno


"You can't get there (done) from here (almost done)."


of course we have "real artists ship" (S. Jobs, or so I've heard), but i prefer Valéry's "A poem is never finished, only abandoned."