Karol Wójcik 08:10:30

Does anyone know how to disable the annoying red error state in spaceline?


@chase-lambert there are several aspects of Clojure that work great for business apps, but are less than optimal for game development.


As a friend of mine said: "game development is about solving every hard problem in computer science 60 times a second"


I think Clojure could be used in some games (Civ-like or turn-based games). But for normal applications Clojure simply doesn't optimize for things that games need.


Doing (+ x y) in Clojure where x and y are boxed integers isn't a big deal for a web server. In a game? That will murder your performance due to the allocations that later have to be GC'd.
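The cost of boxed arithmetic shows up in any dynamic runtime. As a rough analogy (Python rather than Clojure, purely for illustration), tracemalloc makes the heap traffic from boxed integer results visible:

```python
import tracemalloc

# Rough analogy: each arithmetic result on a "boxed" integer is a fresh
# heap object (CPython only caches small ints up to 256).
tracemalloc.start()
results = [i + 1 for i in range(100_000)]  # 100k boxed results kept alive
current, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()

print(f"peak heap traffic: {peak} bytes")  # megabytes, just for boxed ints
```

In a web handler that garbage is amortized across a 100ms request; in a game loop it lands inside a 16ms frame, over and over.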


If a game runs at 60fps, that gives you 16ms to run each frame. I don't think there's a web service I've worked on in years that averages less than 100ms per request. And if we cared about performance we'd simply allocate more servers.
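The budget arithmetic above is worth spelling out (the 100ms web figure is the speaker's rough estimate, not a benchmark):

```python
# 60 frames per second leaves a fixed time budget per frame.
FPS = 60
frame_budget_ms = 1000 / FPS
print(f"{frame_budget_ms:.2f} ms per frame")  # ~16.67 ms

# Compare with a (rough, assumed) 100 ms web request budget:
web_budget_ms = 100
print(f"a frame has roughly 1/{web_budget_ms / frame_budget_ms:.0f} the time of a request")
```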


So in order to get the level of performance needed for games, you end up needing languages that understand things like memory ordering, struct layout, vector math, etc. And the JVM doesn't give you access to any of that.
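As a sketch of what "control over struct layout" means, here is the idea in Python's ctypes (used only because it exposes C-style layout rules; game engines get this from C/C++/C# structs directly):

```python
import ctypes

# A packed 3-float vector: exactly 12 bytes, fields adjacent in memory,
# so an array of them is one contiguous, cache-friendly block -- the kind
# of layout guarantee the JVM's object model doesn't give you.
class Vec3(ctypes.Structure):
    _fields_ = [("x", ctypes.c_float),
                ("y", ctypes.c_float),
                ("z", ctypes.c_float)]

print(ctypes.sizeof(Vec3))         # 12 bytes, no object header
print(ctypes.sizeof(Vec3 * 1000))  # 12000 bytes, contiguous
```

On the JVM, an array of 1000 vector objects is an array of 1000 pointers to separately allocated, headered objects scattered across the heap.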

๐Ÿ” 4

I think Arcadia is Mono-based, isn't it?


yep, so it gets you part of the way there, as the CLR has way more memory layout features than the JVM, but it doesn't solve the problem that Clojure has no way of expressing these CLR constructs.


Which is why one of the devs of Arcadia has been writing his own Clojure-inspired lisp for the CLR.


A type-inferred lisp with some lower-level struct constructs would get you much closer to what's needed for game development


For example, in Clojure this function:

(defn add [x y]
  (+ x y))

(add 1 2)


is compiled as (fn ^Object [^Object x ^Object y])


but the type could be inferred as (fn ^int [^int x ^int y]) because we see it being called with ints.


F# does this sort of thing: all functions are generic, and a different version is compiled depending on how it's called.


On the JVM that would require whole program analysis. On the CLR you have generics so it gets a lot simpler (and faster).


The JN language does something like that on the CLR, right @nasser ?


partially, yeah, though jn does its own specialization to get around the limits of CLR generics. it's more like Julia in that regard.

๐Ÿ‘ 4

Carp lang is pretty young but looks promising for this:

➕ 4

Typed Racket is already a thing, isn't it?


Does typed racket actually modify the generated code?


I'm trying to get some code to run on the JVM regardless of whether or not it's imported, and I'm quite happy to use Java for part of this solution. I'm trying to do something akin to Clojure's socket REPL. I think I've done the same thing, but my code isn't running until the class is referenced. Is there a named approach to doing this I should look at? What I mean is: the jar goes on the classpath, some JVM opts are provided, and then "magic happens".


What I'm talking about is leveraging static analysis to remove allocations in a lisp.


I don't know enough about it to say 🙂


But yeah for game development you're looking at about 2 possible engines: Unity (CLR based) and Unreal (C++ based). So at least you have a JIT/reflection with Unity.


And then the mad people who roll their own engines, based on all the stuff from yonks back 😉


cf. Star Citizen == Amazon Lumberyard == CryEngine


Frontier seem to do their own thing and then use it for all their games, so Elite: Dangerous and Planet Coaster use the same engine


But then you get stuff like XCOM built using the Unreal engine


Yeah, there are plenty of people who roll their own, but now you have two problems: you've got to write a game and an engine


Welp I guess we know where the $200mil went for Star Citizen 😛


That's one expensive tech demo.


Hey, it might be a game by the time I retire


Or they'll have run off with all the cash


I mean, the latest trailer showed god knows how many big name stars, plus more they haven't yet announced


I recall... Gillian Anderson, Mark Hamill, Henry Cavill, Liam Cunningham, John Rhys-Davies, Gary Oldman, Mark Strong, Ben Mendelsohn


And I've probably forgotten some of them


That's the part that blew my mind... $200mil across 500 employees doesn't give you a whole lot of wiggle-room for salaries/time


And it's not like you need or want that level of acting for a MMO (or whatever the game is). Spend a ton on motion-capture actors and you are either a) going to have a really linear, short story or b) have that jarring effect where 20 actors are recognizable, and the other 200 in the game are generic faces created via some character creator.


Either way, someone should write a textbook on feature creep and use Star Citizen as the case-study 😄

โ˜๏ธ 8

I want them to succeed, but every bone in my body says it's never going to be released.


i love it when the slack talks about games


ah, I was wondering when you'd show up


(and correct my misconceptions :P)


I've been doing some F# work lately, and it's pretty cool how it works. Functions are generic by default:

let add x y =
    x + y


Looks like threads are trickling into V8's wasm impl


Next step? We get to write a multi-threaded, pause-less, concurrent GC for JS 😁


Hey, we can dream 😉


Have another meetup coming up this wednesday, so I'll have to make a little more progress between now and then 🙂


That is compiled down to something like Func<T, Func<T, T>>, so each use of add can create its own variant based upon the input types it needs. And from what I can tell that's done at runtime.


That sort of approach seems like it would work well for a lisp (but would also require a JIT in the runtime)


@tbaldridge in Haskell this sort of thing is done at compile time using type classes (not sure if this is relevant to the conversation)


at compile time all such T are known through the typeclass annotation yes? Haskell requires extra work whereas the generic waits to see what you do with it. Is that a fair summary?


@tbaldridge i might be doing it wrong, but when i do

let add x y =
    x + y

that compiles to

public static int add(int x, int y)
{
    return x + y;
}



let add x y =
    x + y

let result = add 5.5 6
errors out
error FS: The type 'int' does not match the type 'float'


yeah thats part of the problem for me


that operators are treated specially with special inference rules


a related issue i have with F# is its lack of type overloading


so you cannot, as i understand it, define an add function that operates on both ints and floats


For Arcadia, we are working towards a cut of Clojure with support for generics and value types. The goal is zero-alloc numerics, etc, and we have limited support for that already with the MAGIC library


Oh wow - would that be a generic .net thing or Unity specific?


the solution is to change add to something like let add plus x y = plus x y, so you have to pass in some plus that matches the types of x and y, which is more or less how I understand type classes are implemented in GHC (they desugar into dictionary passing)
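That dictionary-passing idea can be sketched in a few lines (Python here, just to show the shape; a real type-class dictionary would carry a whole set of operations, not one):

```python
import operator

# Type-class style: the caller supplies the "dictionary" of operations
# (here a single operation, plus) instead of the function guessing types.
def add(plus, x, y):
    return plus(x, y)

print(add(operator.add, 1, 2))         # int addition -> 3
print(add(operator.add, 1.5, 2.25))    # float addition -> 3.75
print(add(operator.concat, "a", "b"))  # even sequences work -> "ab"
```

The compiler (or in GHC's case, the type checker) picks which dictionary to pass at each call site, so the function body itself never needs to branch on types.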


@orestis generic to .NET


the somewhat neglected dev blog is at and i expect it to get more attention as we get underway with the compiler


@nasser @timsgardner yeah I was wrong about the + part of that. I've done that with other types, but it seems the inferencer works differently with numerics


And ah, now I remember, it's down to a CLR bytecode issue. There are different CLR bytecodes for integer vs. float add. All generics can do is change the input types, not rewrite the bytecode.
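A fully dynamic runtime goes the opposite way: CPython, for instance, compiles + to one generic add instruction and dispatches on type at runtime. Shown here with the dis module (opcode names vary by Python version; this is just an illustration of the contrast with the CLR's distinct int/float add instructions):

```python
import dis

def add(x, y):
    return x + y

# One generic add opcode (BINARY_ADD on older CPythons, BINARY_OP on
# 3.11+) regardless of whether x and y turn out to be ints or floats.
for instr in dis.get_instructions(add):
    print(instr.opname)
```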


Seems like the runtime could do that though


If the function being called doesn't have a compiled overload for type X, go compile the function with the argument type-hinted as X, and then link that into the caller function.
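That compile-on-first-use-per-type scheme (Julia-style specialization) can be sketched with a cache keyed by argument types. Everything below is a hypothetical illustration of the mechanism, not jn's or Julia's actual implementation:

```python
def specializing(fn):
    """Keep one variant of fn per argument-type tuple."""
    cache = {}
    def wrapper(*args):
        key = tuple(type(a) for a in args)
        variant = cache.get(key)
        if variant is None:
            # A real JIT would emit type-specialized code for `key` here
            # and link it into the caller; we just populate the cache so
            # the per-type-signature behavior is visible.
            variant = cache[key] = fn
        return variant(*args)
    wrapper.cache = cache
    return wrapper

@specializing
def add(x, y):
    return x + y

add(1, 2)      # first (int, int) call: would trigger a compile
add(3, 4)      # cache hit, no recompile
add(1.5, 2.5)  # first (float, float) call: a second specialization
print(len(add.cache))  # 2 specializations
```

The point is that the slow path (compile and link) runs once per type signature, after which every call goes straight to the specialized variant.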


now it's clear to me that it works like Julia