2018-01-26
Channels
- # arachne (80)
- # beginners (76)
- # boot (16)
- # cider (66)
- # cljs-dev (62)
- # cljsjs (1)
- # clojure (106)
- # clojure-dev (5)
- # clojure-greece (2)
- # clojure-italy (9)
- # clojure-russia (1)
- # clojure-spec (61)
- # clojure-uk (130)
- # clojurescript (21)
- # core-async (9)
- # cursive (3)
- # datomic (37)
- # events (41)
- # figwheel (31)
- # fulcro (27)
- # hoplon (1)
- # jobs (2)
- # lumo (11)
- # off-topic (155)
- # re-frame (71)
- # reagent (27)
- # ring-swagger (3)
- # shadow-cljs (132)
- # spacemacs (5)
- # specter (1)
- # sql (37)
- # test-check (10)
- # uncomplicate (5)
- # unrepl (2)
- # yada (3)
Going back a bit to exposure to OOP vs functional, etc... I think a lot could be said on that subject. I won't try to say it all.
My first real programming language was C (BASIC was the first, and Dijkstra had some pungent words about BASIC and exactly this question), and I wound up doing Scheme/CL and C++ in parallel for a while after that. But for a long time I treated C++ as C with a novel scoping mechanism (a not uncommon approach among graphics programmers to this day). I'd say the C++ I wrote back then was essentially procedural rather than object-oriented in any real sense.
But in the early aughties I started paying attention to the whole OOPSLA, GoF, Ward's Wiki thing, and eventually wound up pretty deep in the rabbit hole of Object Oriented Design.
I've come to think that much of that turned out to be a dead end for me, in the long run, but I learned quite a bit from it.
The thing is, procedure is the natural coin of the von Neumann architecture. And any approach that isn't entirely procedural must be a matter of encoding procedure. The encoded procedure is tucked away somewhere in every program that can run on a computer.
I eventually wrote some complicated software in a true OOD style. I was never a huge "patterns" guy, and I at least had the good sense to stay away from factory factory factories, and such, but...
I eventually found that the way I was encoding procedure was through run-time relationships between objects with mutable state, and... well, that was too clever by half, and there were way too many ways to do it, none obviously more straightforward than the rest.
And that these architectures could become difficult to understand because the procedure was embodied in run-time relationships. You had to model the changing run-time behavior of the system in your head to understand even pretty simple things.
Don't get me wrong: I had a lot of fun writing software in that style. But I had a bad time maintaining and extending it.
Suffering from the effects of that style was educational for me, and made me more inclined to prefer a functional style, even in languages where that was not entirely natural.
So I'd say that, unlike Dijkstra, I think that exposure to OOD is not a permanent impediment, and can in fact be an asset.
And I really did learn a lot from the Smalltalk community, OOPSLA, etc. A lot of it was pretty OOD specific, if not C++ weakness specific, but a lot of it was and is applicable to all sorts of programming.
I do try my hardest to avoid people who are still enamored with architecture for the sake of architecture though.
Even abstraction for the sake of abstraction, or DRY for the sake of DRY. Everyone's first complaint about beginners is that their code is not DRY enough. My main complaint with experienced programmers is that their code is often far too DRY.
@tagore yeah, and with OOP style i've ended up backing myself into a corner far too often by over-abstracting things, only to find a small new use case that breaks everything i've meticulously built
It's funny: when I was just starting to be reasonably adept at programming, say after a couple of years of doing it...
These days... I write a lot of fairly plodding code, much more than I would have then.
at one of my first jobs half the company was in Germany, and often I would come to work in the morning to find my code refactored, with a stern message about DRY waiting for me.
@jgdavey I know the feeling. There is maybe a bit of a struggle for me even now over this where I work.
wrt abstraction I find that for whatever reason it's relatively easy to avoid too much abstraction in c++ (maybe because writing the boilerplate for the abstraction is painful), but languages like java and c# make it very easy to go down the abstraction rabbit hole
a while back when i worked for a company here, a task fell on me last minute and i had 8 hours to finish something that was someone else's mess. had to navigate through like 6 interconnected classes where it made absolutely no sense why it was that way. ended up just ignoring all that and doing it from scratch; it was way shorter, got done in time, and had no interconnected class dependencies.
ooh, storytime? 😄
I once worked on a .NET app that used events a lot. Basically they were the equivalent of a list of functions that could be attached to an object's property. So when you fire the event, the functions all get called. Each of those functions was on a class with a polymorphic tree about 3-4 levels deep. It was insane, because any sort of change in the object could send you flying in the debugger to the other side of the app.
.NET also has properties, which are methods that look like fields. So foo.bar = 32 could end up calculating the factorial of 32 for all you know
If someone can follow your code as easily as they can follow 'Goldilocks,' and with as much engagement, you have done your job better than I generally do mine 😉
For that matter foo.bar
could do the same, so that can hang your debugger when it tries to watch a class.
speaking of preventing nesting, peter introduced me to this and i use it quite liberally https://github.com/ptaoussanis/encore/blob/master/src/taoensso/encore.cljx#L212
It's really possible these days with C++14
Something like Ferret, but a bit more opinionated on how to do RCs and dynamic typing.
It's just that it was for a very long time close to impossible to do serious work in graphics in any other language.
and of course there are funny little quirks like c++17 structured bindings don’t have ignore, so you have to do stuff like auto [a, b] = func(whatever); (void)b;
to avoid compiler warnings
So you think you kind of know C++, and then you try to read someone else's code and realize that they are using a subset of the language you aren't at all familiar with.
i have that experience, except instead of c++ it’s every new version of javascript that comes out biannually
It is the most bizarre templatey thing I've ever seen, but according to him it actually made sense to write it.
there's a talk from cppcon last year I think called "constexpr all the things" and they basically demonstrate writing a program that runs at compile time and just emits the answer at runtime. I think they mentioned (or someone maybe commented on the video or something) that the code takes 45 minutes to compile.
Erwin Unruh showed something like that to the committee back in the mid-90's. We had quite a bit of fun with it... writing C++ programs that didn't actually compile but the error messages produced had the answers to the computation 🙂
Reminds me of aphyr solving the n-queens problem in the haskell type system https://aphyr.com/posts/342-typing-the-technical-interview
Except they are so hard to write that it's amazing when someone manages to compute something interesting with them.
yeah the templating language is pretty dense…at least constexpr gets part of the way to making it comprehensible
I haven't kept up with C++, but is there anything there that isn't trivial with CL-style macros?
But the whole point of programming languages is to make the theoretical possible, given our very limited brains.
I think I'd find writing an acceptable web-server challenging in my favorite language.
The amazing thing about a dog who speaks isn't his conversational skills: it is that he speaks at all.
It was even more amazing when people figured out that C++ templates are basically a Turing-complete language.
I'm inclined to think that if the amazing thing about a language is that it's Turing-complete at all, I'd rather not do general programming in it.
Of course it also turns out that macros in Lisp are overrated, but that's another topic.
metaprogramming is only going to be so useful, because ultimately you can't do everything at compile time, but at least it shouldn't have to be a completely different language from the one you're already working in for the runtime component.
I'm inclined to think that the only good use of macros is to introduce new syntax, and that it had better be useful new syntax.
You could in fact consider a lot of Clojure's implementation to be macros that provide a convenient interface to its lower-level JVM bits.
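To make that concrete, even something as everyday as when is just a macro over the if and do special forms; the names ready? and launch! below are made up, and since the form is quoted they never need to exist:
```clojure
;; when ships with Clojure as an ordinary macro; macroexpand-1 shows
;; the rewrite (ready? and launch! are hypothetical placeholders)
(macroexpand-1 '(when ready? (launch!)))
;; => (if ready? (do (launch!)))
```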
Erwin Unruh showed something like that to the committee back in the mid-90's. We had quite a bit of fun with it... writing C++ programs that didn't actually compile but the error messages produced had the answers to the computation 🙂
Blame vi
again 🙂
What's the most meaningful message you might plausibly accidentally send because you thought you were in normal mode?
😂 My wife refers to vi
as "vicious instrument" after using it at work decades ago...
There's def some garble in many of my github comments. A lot of them end with :wq for no very good reason.
I am curious about the longest string someone can come up with that is both comprehensible English and a plausible vim command sequence.
I bet you could write a clojure.spec
for that and then use generative testing and a dictionary to find the answer 🙂
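something like this sketch, maybe, with a toy word list standing in for a real dictionary and only a handful of single-key Normal-mode commands (all the names are made up, and gen/sample needs test.check on the classpath):
```clojure
(require '[clojure.spec.alpha :as s]
         '[clojure.spec.gen.alpha :as gen])

;; toy stand-in for a real dictionary file like /usr/share/dict/words
(def dictionary #{"dew" "paw" "wed" "hello" "dab" "yaw"})

;; a deliberately small subset of single-key Normal-mode commands
(def vim-keys #{\a \b \d \e \g \p \w \x \y})

(s/def ::english-word dictionary)

;; an English word every character of which is also a vim command
(s/def ::vim-typable-word
  (s/and ::english-word
         (fn [w] (every? vim-keys w))))

;; sample the generator; words like "hello" get filtered out
(gen/sample (s/gen ::vim-typable-word) 5)
;; => e.g. ("dew" "paw" "dab" "yaw" "wed")
```
Swap the toy set for the real word list and sort by length and you'd have your answer.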
Interesting you should say that though. I just spent a bit more on a graphics card than I consider entirely reasonable in order to be able to experiment with things like that.
I'm pretty convinced that generative testing is a step in the direction of generative coding. That said, figuring out what you want has always been the most difficult part of writing software.
In a very real sense if you can write down what you want, completely, and it doesn't run, your compiler is not sufficiently smart.
I'm interested in the gaps... how do we get from partially expressed to completely specified?
https://www.zerohedge.com/news/2018-01-26/over-400-million-stolen-hacked-japanese-cryptocurrency-exchange <-- is this what's causing most of http://coinmarketcap.com to go red?
This "scientist" thing is an interesting idea. So you want to push a change to production (or have a ci system which does it all the time), but you dont want your production users to experience the suprise of getting new code. The idea is that instead you have a "use" and a "try" block. The users always get the outcome of the "use" block, but the system also runs the "try" or "experiement" block. The system keeps track of the two outcomes and shares them with you, the developer. This way you get to test your code in production (only if your code is side effect free) and get reporting on if your new technique worked... so you can feel comfortable switching to the newer approach when the stats look good. https://github.com/yeller/laboratory
There's a GitHub engineering blog post describing the motivation for their "science" library for Ruby. This article led to some fun discussions with the C-suite about why one would ever wish to push experimental code into a production environment. https://githubengineering.com/scientist/
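The shape of the pattern, as a rough Clojure sketch; to be clear, this is just the idea, not laboratory's actual API:
```clojure
(defn science
  "Run the trusted `use-fn` and the candidate `try-fn`, report whether
  they agreed, and always hand the caller the `use-fn` result.
  Only safe when both paths are side-effect free."
  [report! use-fn try-fn]
  (let [control   (use-fn)
        ;; candidate exceptions are captured, never propagated
        candidate (try (try-fn) (catch Exception e e))]
    (report! {:control   control
              :candidate candidate
              :match?    (= control candidate)})
    control))

;; usage with stand-in fns: callers still see the trusted result
(science println
         #(+ 1 1)   ;; current implementation
         #(+ 2 0))  ;; experimental implementation
;; prints {:control 2, :candidate 2, :match? true} and returns 2
```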
I'm looking for a hosted database, ideally postgresql or mysql, but it could be anything. Requirements:
- most importantly, easy to set up (ideally zero setup, free for small amounts of data)
- usable from a hosted service like https://zeit.co/now
- only tiny data requirements (<10M)
- only simple queries, a k/v store might be enough
I'm aware of AWS's RDS and gcloud sql. Anything else I could try?
Basically I need something hosted to use for coordination, for simple services.
@rauh, interesting. Does that persist to disk?
hm that's only for ephemeral data (mostly caches) then
perhaps firebase is a better option
how hard is your requirement that it be an RDBMS? DynamoDB would be the fir-
yeah that would be fine
good idea, didn't think of that
today’s #justclojuristthings moment for me - that feeling the first time you correctly identify some->>
as the appropriate tool for a “real” problem, write it out, and it works the first time
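for anyone who hasn't had that moment yet, it buys you things like this (the users data shape here is made up):
```clojure
(require '[clojure.string :as str])

;; users may be nil, or may contain no active accounts
(defn active-emails [users]
  (some->> users
           (filter :active?)
           seq                    ; nil when nothing survived the filter
           (map :email)
           (str/join ", ")))

(active-emails nil)                               ;; => nil, short-circuits
(active-emails [{:email "a@b.c" :active? true}])  ;; => "a@b.c"
```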
power …overwhelming…
from my friend who went to lambdaconf:
> It was awesome but dunno if clojure people would like it. It was very haskell centric
A room full of types types
@pesterhazy: seconding aws dynamodb/rds
@qqq, RDS postgresql db.t1.micro is USD 18/month
interesting summary of the lambdaconf 2016 situation mentioned in #events for those of us who missed it: http://medium.com/@codepaintsleep/lambdaconf-2016-controversy-2d4b13c338cf
@qqq, GCloud SQL postgresql db-f1-micro is USD 8/month
I guess an acceptable price...
@pesterhazy digitalocean has 1 vcpu/512mb/2gb(?) instances for $5/mo that you could throw postgres or mysql on
@pesterhazy A bunch of the MongoDB cloud providers have a free tier. Though you'll share the instance...
@jgh, true! but RDS/GCloud give you reliable backups, that seems worth the premium
@rauh, any ones you've tried?
For $6 a month you get a fully managed 100MB redis instance with failover and persistence...
@pesterhazy: if your work is tiny, does it not qualify for the amazon free tier?
Yes but only for a year right? @qqq
@michael740 was asking in the beginners channel yesterday about other discussions of, and/or the significance of, the Y combinator (not the startup incubator, but the lambda calculus construct after which that company was named).
I've never studied lambda calculus, but I know that an oft-explained aspect of the Y combinator is that you can use it to define anonymous recursive functions, and I've heard that perhaps its origin was in proving that the lambda calculus is a Turing complete model of computation.
Does anyone here know if there is anything more to its significance than that? I do believe that is hugely significant, but was wondering if there was something else to it beyond that. It is of course a mind-twisting example of code, too, that makes for a puzzler to figure out what is going on.
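For the mind-twisting part, here is the applicative-order variant (often called the Z combinator) written out in Clojure, producing a recursive factorial with no named self-reference anywhere:
```clojure
;; Applicative-order Y (the Z combinator): the (fn [& args] ...)
;; wrapper delays (x x) so evaluation terminates under Clojure's
;; eager semantics.
(def Y
  (fn [f]
    ((fn [x] (x x))
     (fn [x] (f (fn [& args] (apply (x x) args)))))))

;; factorial as the fixed point of a non-recursive function;
;; note the body never refers to itself by name
(def fact
  (Y (fn [self]
       (fn [n] (if (zero? n) 1 (* n (self (dec n))))))))

(fact 5) ;; => 120
```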
Here is a link to a potentially-informative StackOverflow question on the topic, with many answers that I haven't fully digested: https://stackoverflow.com/questions/93526/what-is-a-y-combinator
Hmm: Searching a bit more, some discussion on possible origins of the idea here: https://mathoverflow.net/questions/31893/what-is-the-history-of-the-y-combinator