#off-topic
2024-03-10
Victor Inacio01:03:29

Seems legit

5
not-sure-fry 9
😂 12
Nundrum03:03:46

I have a desperate need for the twitching eye emoji

adi03:03:59

The "haskjure" logo is a bit off, but legit otherwise... :) https://github.com/carp-lang/Carp A statically typed lisp, without a GC, for real-time applications. It's written in Haskell, and has inferred static types.

Drew Verlee06:03:46

I love this, my brain needs more antibranding.

Martynas Maciulevičius07:03:14

I need one for firefox that's shaped as chrome or IE

Ivar Refsdal19:03:32

Looks like this is from https://github.com/mkrl/misbrands Funny stuff indeed 🙂

respatialized02:03:04
replied to a thread:Seems legit

😂 7

henrik11:03:54

My wife uses some kind of mix of prefix and infix notation for boolean operations in spoken language: “We can have; or the chocolate, or the vanilla.” Is there a word for this kind of notation? (btw, it’s kind of neat, as it allows my brain a microsecond to prepare for the upcoming choice, I find)

😂 1
henrik12:03:29

(or "chocolate" or "vanilla")

p-himik12:03:55

I read it as (combine {:op :or, :operand "chocolate"} {:op :or , :operand "vanilla"}).

👍 1
andy.fingerhut12:03:33

Is her native language something other than English, perhaps? If so, what language? Just curious.

andy.fingerhut12:03:38

TLA+ is a formal notation for describing systems that uses something like that. TLA+ documentation calls it "bullet point notation", but I have never seen the syntax, nor the term "bullet point notation", anywhere else. Search for "bullet" on this page: https://learntla.com/core/operators.html

👍 1
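
For reference, TLA+'s bulleted-list notation looks like this (a minimal sketch, assuming a state variable `flavor`; `FlavorChosen` is a hypothetical definition name):

```
FlavorChosen == \/ flavor = "chocolate"
                \/ flavor = "vanilla"
```

Each disjunct gets its own leading `\/` bullet, so the "or" precedes every operand, much like the spoken "or the chocolate, or the vanilla".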
vemv12:03:51

> Is her native language something other than English, perhaps? If so, what language?
It would sound to me like a transliteration of Spanish, where that construction is natural-sounding

henrik13:03:36

She’s Brazilian, but spent her childhood in the US, so Portuguese and English are equally native to her. Though of course it might have something to do with two parsers coexisting somehow. I don’t want to ask, because I don’t want to put her attention on it. I love the quirk. Bullet point notation makes sense! It kind of seems like a natural AST expansion.

henrik13:03:51

I don’t know Spanish, but I can imagine that the grammar has a lot of overlap with Portuguese.

1
vraid19:03:00

Portuguese has exactly that construction too https://en.wiktionary.org/wiki/ou#Portuguese

1
craftybones14:03:59

Are there any blog posts or talks that discuss when and why we regressed from being plugged into a runtime? Even in the 80s, with Smalltalk, people were still doing some sort of live programming. Why did this compile/run cycle come to dominate?

👀 1
craftybones14:03:31

I only see videos or posts on the glorious past or cursing the inglorious present…not much by way of analysis on what the hell happened in between

craftybones14:03:51

Obviously C++ and Java took over. But why did the world go that way?

craftybones14:03:56

My first two languages were Logo and BASIC. Both were systems you interacted with. That was as late as the 90s.

craftybones14:03:28

Ironically, machines got faster at that point. Wouldn’t that have given people more license to write even fancier interpreters and IDEs?

craftybones14:03:07

The PC as a gaming platform gained a lot of traction then. Is this a reason for moving over to C or C++ (en masse, that is)?

valtteri14:03:47

Alan Kay has talked a lot about this https://youtu.be/ubaX1Smg6pY?t=1785

💯 1
valtteri14:03:05

☝️ is one of the videos where he talks about why we’re stuck with the programming languages and tools from the ’60s and ’70s. TL;DR: no one wants to invest in long-term research to come up with dramatically better tools, so we’re stuck with small incremental improvements driven by short-sighted economics.

craftybones14:03:26

Thanks a lot. I’ll check it out

valtteri14:03:40

Also, if you haven’t watched Jack Rusher’s talk from Strange Loop 2022, it’s also on this topic https://youtu.be/8Ab3ArE8W3s

craftybones14:03:34

I’ve seen this. However, he doesn’t address exactly why we’ve become this way. It is an amazing talk nevertheless

valtteri14:03:42

Ah true. I don’t remember which Alan Kay video it was, but in one of them he said that e.g. Sun spent hundreds of millions in marketing Java so corporations adopted it, then universities etc…

craftybones14:03:11

I mean, the fact that the industry grew and required skilled labour, but not so skilled that there wouldn’t be an industry

craftybones14:03:17

that is certainly one reason

craftybones14:03:42

It is easier to keep to a C/Algol syntax and semantics than to retrain your workforce to rethink their models with something like Erlang

valtteri14:03:48

Also, people often mix “good” with “popular” 😉

craftybones14:03:01

It is remarkable. Most universities don’t teach much JavaScript, yet it is one of the most used languages in the world. Some universities teach Python, but very few teach computation at a fundamental level. I’m not sure how many people are taught assembly today. It is hard to appreciate computation without running through several levels of programming languages and systems

craftybones14:03:50

A lot of grads (I run an internship program) simply don’t know a dynamic language. Even their Python knowledge is limited. They know how to write simple programs or use packages, but nothing about Python itself

valtteri14:03:52

They invested in Java in the early 2000s and it would cost money to change the curriculum

valtteri14:03:56

My gut says that how and what you initially learn is mostly based on luck.. What there happens to be in the uni curriculum, what tech happens to be used at your first job etc… what kind of people you happen to hang out with.

valtteri14:03:51

Some people stick with the learning & career path that semi-randomly just happens.

valtteri14:03:21

Some lucky ones get to start with smart people around them, some not. Some are curious by nature and that drives them to places like this Slack. 🙂

valtteri14:03:27

I guess the large masses will follow what happens to be popular at that time… or what they happen to encounter first.

Ben Sless14:03:09

Even before Java, C, C++ and Unix ruined live programming

valtteri14:03:23

There was a gap in the ’80s-’90s when machines had very limited computing power and compiled static systems provided a significant perf benefit. I guess that’s part of the reason?

craftybones15:03:40

Partially. My annoyance is that even classrooms shifted. For instance, my entire high school class learned Logo as their first language. As far as first languages go for children, that can’t be beat. Our second language, which we learned in high school, was BASIC. Again, while not supreme, not bad at all. You could draw, play music and sounds, animate… all of that with very few lines of code. Today, most people aren’t learning this in school. Some learn Scratch if they get lucky, though that is really not the same. Some learn Python, but they don’t get to do anything as cool as what we did. I wrote a program in school that generated music randomly if you gave it a scale as an input. A project I turned in in my 10th grade was a Mandelbrot set program, and another included encryption (I didn’t even know encryption, I made it up by manipulating ASCII).

craftybones15:03:45

My school wasn’t special

craftybones15:03:01

I had some access to people who programmed, and that certainly helped, but the curriculum itself was decent.

craftybones15:03:18

My friends wrote programs like Mastermind.

craftybones15:03:25

And they had no exposure to programming.

craftybones15:03:45

This aspect shouldn’t have changed, but it did.

phill16:03:31

Hardware is why interactivity died out, and Education is why it has been slow to return! In the late '70s and into the '80s, 2-4 MHz and 16k-64k days, "Creative Computing" and "BYTE" owned the open-minded, hobbyist space. The interactive BASIC common in ROM was addictive, but compared to C it was slow, limited-in-capability, and scale-resistant. Serious mass-market programmers moved away from BASIC and didn't miss it at all. The next mass-market innovation was Turbo Pascal, which restored interactivity in the debugger. In 2024, this remains the state of the art for most programmers! IBM entered the market, PCs entered business, and the education machine took over the mind-control of new entrants to the practice of programming. The transactional unit of education is the document.

vemv16:03:03

Interesting topic. My Clojure journey started simultaneously with an increased appreciation for interactive programming. This talk and article were particularly influential: https://www.youtube.com/watch?v=tz-Bb-D6teE https://steve-yegge.blogspot.com/2007/01/pinocchio-problem.html I remain disappointed (but not bitter) when people keep writing systems, libraries, and tooling that are what Yegge calls dead systems in the article. There's the flip side that dead systems tend to be simpler (easier?) to understand. They remind me of functional programming in a way - data in, data out - nothing 'alive' in between :)

vemv16:03:51

My all-time favorite example is related to Ruby.
Problem: you want better IDE completions.
Solution if you are Stripe: turn Ruby into what it is not (rigid type system, nothing like Spec), producing a behemoth of a solution.
Solution if you're an Emacser: inspect Ruby objects at runtime https://github.com/dgutov/robe
(fun fact - Ruby was in good part modelled after Elisp)
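
The runtime-introspection approach is easy to sketch in plain Ruby (a hedged illustration of the general idea, not robe's actual code; `completions` is a hypothetical helper):

```ruby
# Ask a live object what it responds to, and filter by the typed prefix.
# This is the essence of runtime-driven completion: no static types needed.
def completions(obj, prefix)
  obj.methods.map(&:to_s).select { |m| m.start_with?(prefix) }.sort
end

completions("hello", "upca")  # => ["upcase", "upcase!"]
```

Because the object is inspected at runtime, the candidate list automatically reflects monkey-patches and dynamically defined methods that a static analyzer would miss.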

seancorfield17:03:17

My university computer science course taught a little BASIC then used Pascal for the rest of the course (UCSD p-System). In my sandwich/industrial year, I worked at an insurance company and mostly did assembler and COBOL but was exposed to APL. In my final year, my big project was writing an APL interpreter in Pascal. My best friend's big project was writing a Lisp interpreter (in Pascal). Those two things encouraged me to stay on and do three years of PhD research into the design and implementation of functional programming languages. Then I got a job and had to learn C, then C++, then Java. Dark times 😁 I'm glad there's a Lisp on the JVM now!

Nundrum18:03:26

Years and years ago I worked just a little bit with IBM's Lotus Notes. In a way that was much like a runtime and very "live". I bring that up because there is a tension between describable, repeatable systems like Nix and NixOS and ... whatever Lotus was. It looks to me like maybe we're starting to find ways to resolve that tension. Divergent paths trying to come back together.

chucklehead19:03:50

Notes + Sametime circa 2004-5 is still the best collaboration software I’ve ever encountered in a professional environment. Not the prettiest or most pleasant to use or administer, but by far the most useful. I do think there’s something to the comments re C and Unix, with the Unix philosophy of do one thing and do it well tending to lead to a proliferation of small programs that avoid runtimes other than libc, which doesn’t provide any interactive facilities. I think to get the full benefit the interactivity needs to be inherent throughout the system and not isolated to programs running under higher-level runtimes like the BEAM or JVM.

D21:03:32

Well, your post reminds me of this talk: https://youtu.be/8pTEmbeENF4?si=SoLaJOewFdDhimyy