#off-topic
2023-01-17
respatialized00:01:10

I wonder how difficult it would be to create a custom eval that obeys this constraint

phronmophobic00:01:33

easy. you just write an implementation that logs everything somewhere.
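
A minimal sketch of that idea, assuming we only care about values produced at the REPL (the var and `logging-eval` names are illustrative, not an existing library):

```clojure
;; remember every value the REPL produces
(def evaluated-values (atom []))

(defn logging-eval
  "Evaluate a form like clojure.core/eval, but also retain the result."
  [form]
  (let [result (eval form)]
    (swap! evaluated-values conj result)
    result))

;; could be plugged into a REPL, e.g.
;; (clojure.main/repl :eval logging-eval)
```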

respatialized00:01:15

I've wondered sometimes why I can't just query the JVM for non-GC'd values directly and see what's lying between my REPL's couch cushions before I clean up

phronmophobic00:01:04

more seriously, I'm pretty sure enforcing that constraint would run into the halting problem. For example, it would completely disallow the usage of (range) since it can't be fully printed. If you allow partial representations, then you could circumvent the constraint by claiming that any output is a valid partial representation for any value.
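
On the partial-representation point, Clojure already has a knob for this; a small illustration of how an infinite seq gets a bounded printed form:

```clojure
;; (range) can never be printed in full, but a partial
;; representation is easy to get via *print-length*
(binding [*print-length* 5]
  (pr-str (range)))
;;=> "(0 1 2 3 4 ...)"
```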

jpmonettas01:01:08

I'm not sure I'm following, but in case I am, in https://github.com/jpmonettas/flow-storm-debugger you can instrument entire code bases so you retain values when anything runs, and you get to look at them with a UI. It even works for things like (range) since you can lazily explore it

adi04:01:59

What if the language is a client-server system that uses an observable database as operating memory?

adi04:01:23

Maybe Self comes closest to the kind of total observability the tweet asks for? AFAIK, one can introspect any object at any time in the running program, including https://handbook.selflanguage.org/2017.1/monitor.html.

adi04:01:46

Like if everything is available by construction, there is no need for a programmer to labour over evidence.

moe09:01:21

tangential, but what's the name of that notebook-like clj GUI/(browser?) thing you can pipe data to so you can visually inspect it for debugging purposes?

solf10:01:57

There are two similar ones, portal and reveal

moe10:01:30

thanks, portal was the one i had seen

moe10:01:18

i could definitely do with some of that in my life

quoll15:01:32

> What if the language is a client-server system that uses an observable database as operating memory?

I was thinking about this a while ago, and a few people showed me examples of systems that do indeed use a database as working memory. I was thinking about it with Asami, since the on-disk database is accessed via memory mapping, so it’s not too horribly slow if you don’t allocate so much memory that you’re paging too heavily. But in reality, you could write a compiler that allocates both the stack and the heap in a memory-mapped file. That would work… except it would be horribly slow because you’re relying on L2 cache a lot (and worse), and not properly using the registers, which are an order of magnitude faster.
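
As a concrete illustration of the memory-mapping part, here is a minimal sketch of mapping a file from the JVM with plain java.nio interop (the file name `data.bin` and the 4 KB size are arbitrary; this is not Asami's actual implementation):

```clojure
(import '[java.io RandomAccessFile]
        '[java.nio.channels FileChannel$MapMode])

(with-open [raf (RandomAccessFile. "data.bin" "rw")]
  (let [ch  (.getChannel raf)
        ;; map the first 4096 bytes of the file into the address space
        buf (.map ch FileChannel$MapMode/READ_WRITE 0 4096)]
    ;; writes land in the page cache; the OS schedules dirty pages
    ;; to be flushed back out to data.bin
    (.putLong buf 0 42)
    (.getLong buf 0)))
```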

respatialized18:01:25

I’ve sometimes wondered whether there could be something analogous to GC for a system like that; could you track object usage patterns and move the less-frequently used stuff out of the heap portion of this hypothetical “working memory database” and into the disk or memory mapped portion?

quoll20:01:23

Well, I think that the heap and memmapping are about the same. Mmapped data can get into L2 with no problems, and will be flushed back out to RAM like any other memory. It’s just that the OS knows that those pages are dirty, and will schedule them for writing to disk. The issue is if everything is managed in memory, and you don’t make proper use of registers. If you ARE using registers (and a compiler needs to), then that’s OK, but then it’s not externally accessible data. And I’ve fallen behind on compiler design, but I think that these days the stack uses a lot more registers in a CPU, since register frames are a thing. Basically, you might end up subverting some of the benefits of modern hardware and OSs.

adi05:02:14

Thanks for sharing, @U051N6TTC That's cool and insightful 🙇

> Basically, you might end up subverting some of the benefits of modern hardware and OSs.

Idle speculation had led me to this thought as well. From what little I understand, all the modern-day SIMD / vectorization or speculative-execution type magic assumes some guarantee of physical memory layout / access / locality, right?

adi05:02:39

More crudely, if I cast a database as abstract memory, it stands to reason that such a layer of indirection would break the world-model of CPU architecture.

quoll11:02:16

I don’t really think so. If you consider a C program, when a process starts it requests memory for the stack and heap via a system call. On Linux, that’s brk. Functions like malloc allocate from this, and if it runs out, then brk is called again to get more. When there isn’t much available, the OS will start swapping pages in and out of the swap file, which used to be recommended to be twice RAM (though it can be anything, including nothing). When you memory map a file, the difference is that instead of memory being swapped in and out of the swap file, it’s swapped in and out of the file you specify. Also, pages are always scheduled to be written out, while they are not scheduled to be written out if they’re destined for swap. That only happens when memory gets tight. But otherwise mmapped files act just like all other memory. It’s why a lot of file access is done this way, eg. process and dylib loading is all done with mmap.

quoll11:02:25

The concern I have is what the compiler does with this. Use registers a lot, and you’re not using the allocated memory, and you can’t rely on your data going to disk.

adi12:02:21

Thanks for the explanation, quoll :gratitude:

👍 2
Dallas Surewood19:01:20

I'm going to be getting a second monitor today. Are there any dos and don'ts from the double-monitor crowd on here? I have never had two monitors before. I'm going to be getting a high-refresh-rate one just because a lower framerate is gonna drive me crazy

p-himik19:01:59

Unless you're absolutely certain that a dual-monitor setup suits you, I really hope that the new monitor can simply replace the old one. I myself am a huge anti-fan of dual-monitor setups. I used to use one 15 years ago and was quite glad when I could afford one large monitor instead of two smaller ones.

☝️ 2
Dallas Surewood19:01:46

I have a pretty big monitor, but I have hated the workflow of window management in Windows 10. I pretty much just use alt+tab exclusively 'cause that's easiest for me

p-himik19:01:20

Heheh, tiling WMs FTW.

Dallas Surewood19:01:58

What do you use?

p-himik19:01:23

AwesomeWM at the moment, quite happy with it.

p-himik19:01:08

Probably. I myself use Linux, but tiling window managers for Windows should also exist; at least, they used to back when I was using Windows, although I can't recall which one I used.

borkdude19:01:24

I think PowerToys has stuff like this (I don't use it myself)

Dallas Surewood19:01:27

What didn't you like about having two monitors?

Dallas Surewood19:01:40

I have a pretty big monitor already

jumar19:01:36

Health aspect: it might worsen (or cause) neck & back pain.

2
p-himik19:01:53

Having to rotate my head too much, noticing differences between two monitors, especially color-wise.

Ben Lieberman19:01:23

FancyWM is on the Microsoft Store; I used it without too many issues

Noah Bogart19:01:36

always put a single monitor directly in front of you. that should be your primary. and then whatever secondary monitors you have, you can put them to the sides. but always have your primary workflow where your head looks in a neutral position

👍 2
cp4n19:01:47

I have used the tiling feature in PowerToys. It's not bad. Can remember layouts and stuff too. Would recommend it if you want tiling in Windows

Martynas Maciulevičius20:01:13

> I pretty much just use alt+tab exclusively cause that's easiest to me

When I still used Windows I configured a vertical taskbar and used Win+number to switch between the windows. This avoids alt-tab, but Windows likes to group windows, so it's still kind of alt-tab-y when you have several Explorer windows open. On Linux I use i3wm, which has similar Win+number behavior, but I can tile the windows inside each tag. I had two monitors back then because I worked at a company where I could usurp some monitors. But even though they were from the same vendor, the colors were actually different. I ended up using one monitor as my primary because you have to remember which number corresponds to which window and where you put it. With my current config the "tags" are only on one monitor. So if I switch to a tag and it resides on another monitor, it only changes that monitor, not the current one.

Dallas Surewood20:01:38

I'm surprised that most people have preferred not to have two monitors since I always see it as a recommendation

solf20:01:43

I feel like I used to hear a lot about dual monitors 10 years ago, now not so much. It would make sense, as monitors have gotten bigger and bigger, and there's only so much screen space you can use before having more isn't that useful

Martynas Maciulevičius20:01:59

I think you should get a taller monitor, not dual. For programming 4:3 is probably better than 16:9 or whatever the "fake wide 4k" is these days

solf20:01:15

I have a 4K 32" 16:9 monitor; it's already so tall that I don't like putting my Emacs in fullscreen

2
Martynas Maciulevičius20:01:41

Is 32" even a good idea? You said you don't like your Emacs in full screen. I found that these monitors can cost €1k

mdiin20:01:22

I’m using a 34 inch curved ultra-wide, and I love it. It works really well both with Mac and my Linux machines, both with tiling and non-tiling WMs.

👍 2
mdiin20:01:35

And Emacs full screen just means room for more buffers 🥳

Daniel Craig21:01:32

I prefer to have an odd number of monitors, 1 or 3, so that one is always in the middle. I had a bit of neck pain in the past when I used 2 monitors

2
isak21:01:36

I just thought of a cool idea for multiple monitors - alt-tab, except it just rotates what is on each screen. Anyway, for now, I have 3, and one of the side monitors is in portrait orientation.

solf21:01:49

@U028ART884X I think my (curved) 32" was around 450 euros. I do like it, but I also have a non-curved 30" and I prefer that one. The 32" takes too much space on my desk too

Martynas Maciulevičius21:01:39

> I just thought of a cool idea for multiple monitors - alt tab, except it just rotates what is on each screen.

At this moment I have 12 open windows (workspaces) and currently I use a small laptop screen. You get into O(n) complexity if you only have alt-tab to switch between those. If you use alt-tab and alt-shift-tab then you're still in linked-list world :thinking_face:

And in my Linux config I have:
• Index-based addressing (i3's default with super+num)
• A search on window title that I implemented
• The scroll wheel to switch between workspace tags (comes with i3 by default)
• i3 also comes with the ability to mark windows, as in Vim. So I can even mark a window and then jump into it without even thinking about workspaces. But I never use this one because it should somehow work out of the box, not configure-and-lose-on-restart.

So if you think about linked lists, then why not use a stack to handle windows? Push a window, pop a window 😄

I think I should implement workspace contexts. This way I'd be able to handle double or triple the amount of my open apps by somehow configuring a context for each of them and switching between contexts, not workspace tags.

Edit: I saw a video of jpmonettas where he had a terminal and debugger toggle key. He has one floating terminal and one floating debugger window that he can hide and recall. But that's not dynamic; I don't prefer that way of doing things, I want to be able to do it more than once.

2
phill22:01:30

I got a 2nd monitor when I should've just gotten a bigger one. Now I have learned the rule of thumb: If you don't need two hammocks, you don't need two monitors.

phill23:01:01

P.S. The 2nd monitor made a big improvement, except that I immediately ran into the problem that the right place to put a monitor already had something in it. :-)

Dallas Surewood23:01:25

Well I guess I have a 27". I can't really increase the size by much while still keeping it centered (maybe 33"). But I could easily add space off the right end

Cora (she/her)00:01:57

one of my friends has a wide curved monitor and swears by it, so that's an option. personally I use a 27" monitor right in front of me and my little laptop screen off to the side. I put all the distractions onto my laptop screen so I'm not always seeing them while I work

didibus07:01:51

Going 32"+ was a game changer for me. I feel at 27 you could still use two monitors. But at 32 one is basically the size of two. Curved is also great.

didibus07:01:19

Keep in mind it's the diagonal size, so 27" -> 32" is going from about 311 square inches to about 437 square inches: that's roughly 40% bigger.
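
A quick sanity check of those numbers, assuming 16:9 panels (the helper name here is just for illustration):

```clojure
(defn screen-area
  "Viewable area in square inches of a 16:9 panel with the given diagonal."
  [diagonal]
  (let [d (Math/sqrt (+ (* 16.0 16.0) (* 9.0 9.0)))   ; diagonal of a 16x9 unit box, ~18.36
        w (* diagonal (/ 16.0 d))
        h (* diagonal (/ 9.0 d))]
    (* w h)))

(screen-area 27)                        ;=> ~311.5
(screen-area 32)                        ;=> ~437.5
(/ (screen-area 32) (screen-area 27))   ;=> ~1.40, i.e. about 40% more area
```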

mauricio.szabo15:01:46

@U042LKM3WCW I basically love having two monitors - I feel I can leave things "separated", like a terminal on one monitor and what I am doing on the other. Things that are absolute must-haves for me: good color accuracy, 125Hz at least, and a blue light filter. It makes working on it way less tiring!