#off-topic
2023-10-28
Joseph Graham16:10:33

I wonder if we could replace git with just a series of GPT-engineer prompts that make the desired changes. Git repos could be a lot smaller then! In fact it would probably be quite easy to implement such a system. And it's self-documenting!

Joseph Graham16:10:21

I wrote an entire script with nothing but a series of GPT-engineer prompts earlier today.

p-himik16:10:51

It would also require a very specific model with a fixed RNG seed, no?

Joseph Graham16:10:19

yes, the RNG seed would have to be recorded along with the prompt

Joseph Graham16:10:13

actually, a problem does occur to me: it would be very resource-intensive to reverse a commit

Joseph Graham16:10:43

because if you ask gpt-engineer to remove a feature it's not clear you would end up with the same thing as when you asked it to add the feature

Joseph Graham16:10:57

so you would have to re-compute the entire history just to roll back a commit
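The replay model being described here can be sketched roughly as follows. Everything in this snippet is a hypothetical stand-in (there is no real gpt-engineer call; `apply_prompt` fakes a deterministic model with a hash), but it shows why "rollback" means replaying the whole prompt history from scratch:

```python
import hashlib

def apply_prompt(source: str, prompt: str, seed: int) -> str:
    """Stand-in for a deterministic LLM call (prompt + pinned RNG seed).

    A real system would invoke the model here; we fake it with a hash so
    the example is runnable and reproducible.
    """
    digest = hashlib.sha256(f"{source}|{prompt}|{seed}".encode()).hexdigest()[:8]
    return source + f"\n# edit {digest}: {prompt}"

def checkout(history: list[tuple[str, int]], n: int) -> str:
    """Rebuild the working tree at commit n by replaying prompts 0..n."""
    source = ""
    for prompt, seed in history[: n + 1]:
        source = apply_prompt(source, prompt, seed)
    return source

# Each "commit" is just (prompt, seed) -- tiny to store...
history = [("create a fizzbuzz script", 42), ("add a --count flag", 7)]
full = checkout(history, 1)
# ...but rolling back one commit means re-running every earlier prompt.
rolled_back = checkout(history, 0)
```

Note that `checkout` is only deterministic because the fake model is a pure function of (source, prompt, seed), which is exactly the property a real model would have to guarantee.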

Howon Lee17:10:01

git is already packed but with a lossless algorithm
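For reference, the lossless packing git uses is zlib deflate; a minimal roundtrip demonstrating the lossless property (the exact bytes always come back out):

```python
import zlib

# Repetitive source code compresses well, and decompression is exact.
blob = b"def hello():\n    print('hello')\n" * 100
packed = zlib.compress(blob, level=9)
assert zlib.decompress(packed) == blob  # lossless: perfect reconstruction
```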

Howon Lee17:10:27

neural net compression is like 40 years old, as technology goes. of course the modern stuff is better but brutally resource-impractical

Howon Lee17:10:11

spending $25 million USD on a GPU bank to save 0.05 cents on your storage...

Howon Lee17:10:57

(or even spending $5 on your LLM prompt series to save 0.05 cents on your storage)

Joseph Graham17:10:13

is the $25 million number legit?

jeroenvandijk17:10:03

I think simple "prompts" like "rename this var" would save some time in code reviews hehe. But you don't need an LLM for this.

Howon Lee18:10:10

OpenAI's main GPU bank is like 30,000 GPUs, $10k each

Howon Lee18:10:47

probably a neural compressor thing would have a smaller total addressable market, and you can scale that up or down as you will, so i guessed 10x less. this would be for a centralized firm, but a consumer-grade GPU can't deal with LLM bullcrap, or can only deal marginally

Howon Lee18:10:36

i mean, OpenAI gets a volume discount obviously

Howon Lee18:10:22

your own personal GPU is still like 2 grand, plus 5 cents/hr usage for viciously worse models, and amortization is far worse of course

Howon Lee18:10:56

AI is a magical land of computing where cost-of-goods-sold matters again

Howon Lee18:10:42

total git societal effort is Junio Hamano and his buddies writing some C occasionally, and everyone reading their manpages and books and articles and stuff

mmer09:10:43

When you use prompts to do something, would it be true that the next time you use the same prompts you might get a different answer, as there might have been extra learning input? Doesn't that make it less deterministic than we have been used to?

Joseph Graham09:10:10

Software stops working over time when it's not maintained, due to libraries changing etc. This method might actually solve that problem, even as it creates a new one.

mmer09:10:36

Agreed, but there is lots of software that is in "it works, don't change it" mode. Some of my earliest code was running at least 25 years after I first wrote it.

Howon Lee14:10:23

neural nets are only nondeterministic because of sensitive dependence on initial conditions. if you slam down initial conditions perfectly (like, "ok, crc32 matches on model, same seed" perfectly), it'll be deterministic
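A toy illustration of that point with Python's stdlib PRNG (no real model involved, just showing that a pinned seed yields a bit-identical stream):

```python
import random

def sample(seed: int) -> list[float]:
    # Pin the "initial conditions": a dedicated PRNG instance with a fixed seed.
    rng = random.Random(seed)
    return [rng.random() for _ in range(5)]

# Same seed -> identical sequence; different seed -> different sequence.
assert sample(1234) == sample(1234)
assert sample(1234) != sample(5678)
```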

mmer14:10:17

I was not commenting on whether the same model would give you the same results, but that these LLMs are constantly evolving, so over time we could see very different results.

p-himik14:10:51

"neural nets are only nondeterministic because of sensitive dependence on initial conditions"
It's not entirely correct - it depends on the implementation. Even without explicit randomness, there can still be non-deterministic results due to, at the very least, the unpredictable order of multithreaded computations. Floating-point operations are not associative.
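The non-associativity point is easy to demonstrate:

```python
# Floating-point addition is not associative: summing the same three terms
# in a different order changes the result, which is why the scheduling order
# of a multithreaded reduction can change its output bit-for-bit.
a, b, c = 1e16, -1e16, 1.0
left = (a + b) + c    # 0.0 + 1.0 == 1.0
right = a + (b + c)   # the 1.0 is lost rounding into -1e16, so this is 0.0
assert left != right
```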

Howon Lee14:10:32

the unpredictability of multithreaded stuff is also due to SDIC (sensitive dependence on initial conditions), lol

p-himik14:10:40

If you treat the state of your OS/hardware/clock/whatever else as IC, yes. :) Abstractions leak and shower us all.