#off-topic
2022-09-17
eggsyntax20:09:03

@U042622QS30 personally I'm mainly impressed by the incredible effort this must have taken. Writing anything (an addition function, say) in pure untyped lambda calculus is extremely painstaking and time-consuming. To write a whole Lisp interpreter, and to keep it totally pure by representing numbers and strings directly in the lambda calculus -- my jaw drops even imagining the time it must have taken. Presumably/hopefully the author used some intermediary tools to do it rather than writing everything from scratch purely as lambda expressions? It's still pretty mind-blowing even if so.

eggsyntax20:09:15

There's also a very deep elegance to it, because Lisp is ultimately based, conceptually, on the lambda calculus (in a way that other languages generally aren't), but as far as I know no one has ever before proved the point by writing a lisp interpreter purely in the lambda calculus.

eggsyntax20:09:40

> I don't see what problem it is solving
Sure. I expect that this was made for purely aesthetic/challenge/learning reasons, like building a full-size Volkswagen out of Lego bricks. It doesn't have to solve a problem; it's just a remarkable and beautiful achievement.

Ben Sless20:09:41

On the contrary. It's purely elegant. Building a Lisp on pure LC gives you a self-contained computer you can send to aliens over radio

💯 4
☝️ 2
Ben Sless20:09:43

You just need an LC interpreter and IO and you have everything

Ben20:09:50

Wow, a triumph of the mind! Looking at https://woodrush.github.io/lambdalisp.pdf, I really wonder what the development process was like. 😅

Ben Sless20:09:40

Just like ogres, it's got layers

😆 1
eggsyntax20:09:30

Wow, for last year's project, the author did in fact write a Lisp interpreter in Conway's Game of Life 🏆 https://twitter.com/woodrush924/status/1473669203278594049

Ben Sless20:09:49

Not really. If you build up layers of abstraction, it's actually clean, elegant and impressive

eggsyntax20:09:53

@U042622QS30 sounds like it's not your cup of tea; that's fine 🤷

Ben Sless20:09:26

The JVM is just a SECD machine, which is sort of a Lisp Machine, which is sort of an enhanced LC interpreter

Ben Sless20:09:33

These sorts of circular/reflective things are pearls. They're more like Maxwell's equations

1
Ben20:09:28

> How does one interpret that PDF?
The lambda calculus is a model of computation equivalent to Turing machines. So you can write any program with it. The program this guy chose to write is a Lisp interpreter, and the PDF is that program. If by “interpret” you mean “make sense of individual parts”, I cannot do that, but would love to be able to 🙂

eggsyntax20:09:37

^ it'll be extremely interesting to actually run it and see what kind of performance they've managed to achieve 😁

Ben Sless20:09:58

This, and I wonder how it'll work on something like HVM

👀 1
eggsyntax20:09:20

> it'll be extremely interesting to actually run it and see what kind of performance they've managed to achieve
It's at least performant enough that entering & running simple code at the REPL feels snappy; I wasn't sure they would have been able to achieve even that. Will write something in a while that loops a lot of times, see how that does...

eggsyntax21:09:01

Just to give a data point for anyone who's curious, the following took 18 seconds for me on a ~6 month old Mac Studio with M1 Max chip:

(defparameter i 0)
(defparameter j 0)
(loop
  (if (> i 100)
    (return))
  (if (> j 100)
    (progn
     (setq j 0)
     (setq i (+ i 1))))
  (setq j (+ j 1)))
(running in the https://github.com/irori/clamb lambda calculus interpreter; no idea what kind of perf that intrinsically has)

Ben Sless21:09:18

Sounds like a reasonable price to pay for purity and elegance. I'll take it!

😆 1
Drew Verlee21:09:10

So, numbers are represented by cons cells? I'm just trying to understand. How is this more lambda calculus than normal?

borkdude21:09:27

numbers in lambda calculus are lambda expressions themselves

borkdude21:09:23

(everything in lambda calculus is a lambda)
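To make this concrete (a sketch of the textbook Church encoding, not LambdaLisp's actual code — the names `zero`, `succ`, `add`, and `to_int` are illustrative):

```python
# Church numerals: the number n is the lambda that applies f to x n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

# Escape hatch back to a native int, just so we can inspect results.
to_int = lambda n: n(lambda k: k + 1)(0)

two = succ(succ(zero))
print(to_int(add(two)(two)))  # 4
```

(LambdaLisp may well use a more efficient encoding internally; this is just the standard illustration of "numbers are lambdas".)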

Drew Verlee21:09:34

That's what's novel here right? That normally the system doesn't actually go to the lengths to make everything a um.. list?

borkdude21:09:51

lambda, yes

Drew Verlee21:09:22

Kk. Thanks. 👍

eggsyntax23:09:28

Beyond the sheer feat of doing this at all, it really packs in additional cool or impressive qualities:
• Object orientation built in via closures
• (Limited) garbage collection
• Not just macros (as first-class objects) but reader macros
• Uses both a stack and a heap
• The whole thing's written in continuation-passing style
• Purely functional implementation of the environment (which in fairness, how could it not be)
• Optionally self-hosting-ish
• And just as a totally crazy bonus, if I'm understanding the docs correctly, the author's Lisp-to-LC compiler (written for this project) also includes the SKI combinator calculus as a target, so you can run LambdaLisp on a SKI interpreter 🤯
Just ridiculously neat 😍
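For anyone curious about the SKI combinator calculus mentioned above: it needs only three combinators, each definable as a one-line lambda. A quick Python sketch (an illustration, not the compiler's actual output format):

```python
# The three SKI combinators as curried Python lambdas.
S = lambda x: lambda y: lambda z: x(z)(y(z))
K = lambda x: lambda y: x
I = lambda x: x

# Classic identity: S K K behaves exactly like I.
print(S(K)(K)(42))  # 42
```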

Ben Sless05:09:40

> That's what's novel here right? That normally the system doesn't actually go to the lengths to make everything a um.. list?
Not even a list, a lambda term. Even lists are encoded as lambdas
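A minimal sketch of "even lists are lambdas" in Python (illustrative names, not LambdaLisp's code): a cons cell is just a closure over its two halves, and car/cdr are selector functions.

```python
# A pair closes over its two parts; a selector lambda picks one out.
cons = lambda a: lambda b: lambda sel: sel(a)(b)
car  = lambda p: p(lambda a: lambda b: a)
cdr  = lambda p: p(lambda a: lambda b: b)

# A "list" is then just nested pairs (None standing in for nil here).
lst = cons(1)(cons(2)(cons(3)(None)))
print(car(cdr(lst)))  # 2
```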

adi07:09:03

I've had to have several cups of tea to calm down after seeing lambdalisp on HN this morning 😅 ... such fine, fine work! Continuation-passing style has been an enigma to me. I had a bit of a lightbulb moment on reading this bit:
> ... by carefully writing the entire program in continuation-passing style, the evaluation flow can be controlled so that the program only proceeds when the value under attention is in beta-normal form. In this situation, since values are already evaluated to their final form, the need for memoization becomes unnecessary in the first place.
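A tiny Python sketch of what that quote means by continuation-passing style (hypothetical helper names, not LambdaLisp's code): every function takes an extra "what to do next" argument, so by the time a continuation runs, its argument has already been reduced to a plain value.

```python
def add_cps(a, b, k):
    # a + b is computed *before* k is invoked, so the continuation only
    # ever sees a finished value -- the "already in normal form" idea.
    return k(a + b)

def square_cps(x, k):
    return k(x * x)

# (2 + 3)^2, written as a chain of continuations.
print(add_cps(2, 3, lambda s: square_cps(s, lambda r: r)))  # 25
```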

Ben Sless07:09:26

Does that mean you can use CPS for partial evaluation?

🤔 1
adi09:09:21

I'm just guessing... If we don't have a final (irreducible) value, i.e. we can still do more beta reductions, then we have a partially evaluated lambda term. If I understand it right, the lambdalisp interpreter is a single lambda expression that takes initial world state as input (the Lisp program to interpret) and cycles through beta reductions of the input until it gets to the final value. Would that not imply that the intermediate terms are partials?

Ben Sless09:09:55

Yes. The problem here, which I'm trying to understand how CPS solves, is duplication. Imagine you close over a value and return a lambda. The value itself is an expression. You beta reduce it and get to a state where the unevaluated expression is INSIDE the lambda. Then you map it over 1e9 elements 🙂

bronsa12:09:01

> Lisp is ultimately based, conceptually, on the lambda calculus
this is false, McCarthy is quoted as saying he did not understand lambda calculus while creating LISP

1
bronsa12:09:05

[..] And so, the way in which to do that was to borrow from Church's Lambda Calculus, to borrow the lambda notation. Now, having borrowed this notation, one of the myths concerning LISP that people think up or invent for themselves becomes apparent, and that is that LISP is somehow a realization of the lambda calculus, or that was the intention. The truth is that I didn't understand the lambda calculus, really.

👀 1
👍 1
bronsa12:09:27

semantically lisps are also historically very far from lambda calculus. it's more the Scheme side of the family that got closer

andy.fingerhut14:09:50

Such things are sometimes useful for having an even simpler “substrate” at the foundation, which can aid in formal proofs of various kinds about such code or systems. But I doubt anyone considers such things as practical bases for day-to-day use in software development

eggsyntax14:09:18

>> Lisp is ultimately based, conceptually, on the lambda calculus
> this is false, McCarthy is quoted as saying he did not understand lambda calculus while creating LISP
Oh, interesting! Not sure where I ran across the claim myself. Thanks! Intuitively it does feel like Lisp is in some sense more similar to the LC than other languages, which is presumably why people keep reinventing the myth 😆

adi16:09:28

@UK0810AQ2 re: https://clojurians.slack.com/archives/C03RZGPG3/p1663493275366689?thread_ts=1663444001.681039&cid=C03RZGPG3... I don't quite grok how it's beta reducing the input such that it obviates the need for memoization. And I don't quite get CPS either. In other words, my very partial understanding prevents me from evaluating the partial evaluation 😅

eggsyntax17:09:44

re: the McCarthy quote, the end of that paragraph matches my intuition & is roughly what I was trying to get at (from https://books.google.com/books?id=Hy-jBQAAQBAJ&q=%22borrow+the+lambda+notation%22#v=snippet&q=%22borrow%20the%20lambda%20notation%22&f=false)

👍 1
adi05:09:49

Accepted; however, lambda notation is one thing and the lambda calculus itself is an entirely different thing. So it's worth not conflating the two!

1
respatialized14:09:38

https://tromp.github.io/cl/diagrams.html I would love to feed this LC definition into one of these LC diagram generators and see the fabric of computation rendered like a tapestry in front of my eyes.

❤️ 1
respatialized14:09:16

> My new language is taking shape. It is gestalt oriented, rendering it beautifully suited for thought, but impractical for writing or speech. It wouldn't be transcribed in the form of words arranged linearly, but as a giant ideogram, to be absorbed as a whole. Such an ideogram could convey, more deliberately than a picture, what a thousand words cannot. The intricacy of each ideogram would be commensurate with the amount of information contained; I amuse myself with the notion of a colossal ideogram that describes the entire universe. Ted Chiang, "Understand" (from Stories of Your Life and Others)

😍 2
eggsyntax14:09:26

@UFTRLDZEW & then the next step, at the risk of being over-literal, would be to feed it to one of those companies that makes actual tapestries from images (like https://www.customphototapestry.com/) 😁