This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2023-10-04
Channels
- # announcements (6)
- # babashka (7)
- # beginners (2)
- # biff (5)
- # calva (2)
- # cherry (17)
- # cider (3)
- # clj-kondo (8)
- # clojure (202)
- # clojure-brasil (8)
- # clojure-europe (20)
- # clojure-norway (23)
- # clojure-uk (4)
- # clojuredesign-podcast (5)
- # conjure (1)
- # cursive (9)
- # eastwood (22)
- # events (8)
- # fulcro (3)
- # hyperfiddle (22)
- # introduce-yourself (7)
- # lsp (67)
- # malli (1)
- # matrix (1)
- # meander (6)
- # off-topic (76)
- # pedestal (8)
- # polylith (17)
- # quil (12)
- # re-frame (2)
- # reagent (8)
- # releases (3)
- # shadow-cljs (67)
- # sql (93)
- # squint (39)
- # tools-deps (46)
- # vim (7)
Would be great if you could demonstrate a problem set.
Often you choose your paradigm first, e.g. for request/response go with Clojure + an HTTP REST service + a load balancer, or for message processing go with Clojure + Redis/Kafka etc. If you can isolate your state (keep it external) then you can scale vertically or horizontally as much as you need. Like most Clojure development, you can often build up by assembling libraries together rather than having to write your code inside a framework that promises scalability. Clojure is very flexible: what often takes a whole new language (e.g. Erlang for distributed systems, or Go for go-block async), Clojure can achieve with just a library and no change to the core language. That's the broad advice; happy to discuss a more specific version. Also note that Clojure is super fast - you may need 10x fewer servers compared to a Python/Ruby/etc implementation.
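A minimal sketch of the "just a library" point above: Go-style channels and go blocks come from core.async, an ordinary library (this assumes core.async is on the classpath).

```clojure
;; Go-style concurrency as a library: no change to the core language.
(require '[clojure.core.async :as a :refer [go chan >! <!!]])

(let [c (chan)]
  (go (>! c (+ 1 2)))   ; a lightweight "go block", built by macro transformation
  (println (<!! c)))    ; blocking take on the calling thread; prints 3
```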
The imaginary use case is building a Discord: many users chatting in DMs and in chatrooms.
Erlang/OTP is hard to beat for this. But you are leaving a lot of questions open: persistence? max size of channels? etc. It's achievable, but the solution would likely be very different from Erlang's and/or leverage more than "just" Clojure.
If you make it self-hosted/federated you move the problem to someone else; you don't really need to support scale so much as usability.
https://github.com/redplanetlabs/rama-clojure-starter/blob/master/README.md Rama comes to mind
> So many users chat in dms and in chatrooms. Because chatrooms are relatively isolated from each other, you can scale these out more easily than one large social network (e.g. Facebook). I think Clojure (or lots of other programming languages) would handle this fine.
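One hedged sketch of what that isolation buys you: route each room to a shard by hashing its id, so rooms never need cross-shard coordination (the shard count and room name here are invented for illustration).

```clojure
;; Hypothetical room-to-shard routing: isolated rooms shard trivially.
(def n-shards 16)

(defn shard-for [room-id]
  (mod (hash room-id) n-shards))   ; stable assignment; no cross-room state

(shard-for "clojure-beginners")    ; => an index in [0, 16)
```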
Parallelism as a very broad answer... The internet scales very well (if you can afford it).
An effective architecture is also very important for ensuring a system can be scaled that way. There are lots of common design options that help with scalability and enable parallelism, e.g. CQRS as just one of many examples. Or just add Kafka everywhere.
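A minimal, hypothetical sketch of the CQRS idea (all names invented for illustration): commands and queries take separate paths, so each side can be scaled independently.

```clojure
;; Write side: commands append to a log and update a projection.
(defonce event-log (atom []))   ; append-only record of what happened
(defonce read-view (atom {}))   ; denormalized view optimized for queries

(defn handle-command [{:keys [room user text] :as cmd}]
  (swap! event-log conj cmd)
  (swap! read-view update room (fnil conj []) [user text]))

;; Read side: queries only touch the projection, never the log.
(defn query-room [room]
  (get @read-view room []))

(handle-command {:room "general" :user "ada" :text "hi"})
(query-room "general")   ; => [["ada" "hi"]]
```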
To add on to respatialized's comment on Rama: the Twitter-scale blog post is good: https://blog.redplanetlabs.com/2023/08/15/how-we-reduced-the-cost-of-building-twitter-at-twitter-scale-by-100x/
If you just mean a cluster, where different processes are aware of each other and can exchange messages, then depending on your environment you can use something like JGroups.
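A hedged interop sketch of the JGroups idea (this assumes JGroups 4.x, where org.jgroups.Message is a concrete class and Receiver is an interface with default methods; the cluster name and handler are made up).

```clojure
(import '(org.jgroups JChannel Message Receiver))

(defn join-cluster [cluster-name on-msg]
  (let [ch (JChannel.)]                ; default protocol stack
    (.setReceiver ch
      (reify Receiver
        (receive [_ msg]               ; called for each delivered message
          (on-msg (.getObject msg)))))
    (.connect ch cluster-name)         ; discover peers and join the group
    ch))

(comment
  (def ch (join-cluster "chat-cluster" println))
  (.send ch (Message. nil "hello"))    ; nil destination = broadcast
  (.close ch))
```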
Erlang is probably most famed for its availability promises - supposedly you get more 'nines' of uptime with less effort. There's probably good stuff in it, but I also sense (from my experience pestering my Elixir friends with questions) that this fame is largely legend, not unlike the 'legend' that Go-style concurrency is more scalable than thread-based concurrency. It's stuff that's been repeated so many times on HN that people take it for granted.
https://www.youtube.com/watch?v=ROor6_NGIWU has Rich directly addressing Erlang
For fans of complexity (the study of), this one is recommended: https://www.youtube.com/watch?v=8XCXvzQdcug

So I've been wondering: what's the sales pitch for Clojure with the advent of AI tools? Clojure ticks a lot of boxes that matter for humans writing applications (immutability, interactive development, ease of reasoning with pure functions, expressiveness), but why would someone care in the age of "tell the computer in plain language what to make" (or even just take a picture of the GUI you want)?

I've seen some arguments that AI sucks at Clojure compared to, say, Python and JS, so Clojure jobs are safe, but if I'm running a business and I just need software that works, I don't care what it's written in. If the models are well-trained on Python and JS, and there's no shortage of successful products out there built on those ecosystems, then having them generate code in those languages is likely going to lead to a better outcome.

I wonder if languages like Clojure become the equivalent of a car enthusiast building their own dream car, or a woodworker making custom furniture. For the overwhelming majority of use cases, production-line cars and furniture are what's going to sell. And in software, I don't think you can even make the woodworking argument of "some people like bespoke, handcrafted furniture and will pay a premium for it". Businesses want to make money, so software in hand quickly is better for that goal (obviously as long as it works). We already know this is the case, because no business person ever cares how we factor our code or test it or layer it or whatever; they care that it meets the requirements. We care about those things insofar as they make our lives easier as we maintain and evolve these applications, but if the AI is instructed to add features to its own generated code and can do it, then it doesn't matter, right? Are we really just banking on "AI isn't ever going to be able to evolve software without it falling apart"?

FYI, I thoroughly enjoy Clojure and found the learning journey very rewarding. I'm wondering these things for my son, who's going to grow up in a very different technical landscape. For decades there's been value in learning different programming paradigms and understanding algorithmic complexity, but does all of that become moot? We've always needed to understand the problem statement to build good software, but I wonder if we're not on track for "software developer" and "business analyst" to become the same job.
I'm firmly convinced that the "AI is about to take all our jobs" premise of your question should be thoroughly questioned. The hype around LLMs is creating a thick fog that makes it difficult to see their capabilities accurately. From everything I've seen, the only people getting benefits from LLM-based tools to make software are skilled programmers who know how to correct the defects of a model's output, rather than non-technical business users replacing programmers entirely.
If software gets so good that software development isn't necessary anymore, I think software developers are in a good position to use that software better than most. It's still a kind of 'talking to the computer', and you need to be precise, which programmers are good at.
Yeah, I'm not arguing that point; I'm wondering about Clojure's value proposition. Sure, skilled programmers still need to tweak the output right now, but what motivates, say, a skilled Python or JS dev to take the plunge and learn an entirely new way of doing things "manually" rather than leveraging those existing skills to work with an LLM?
Like imagine you've built your career doing OOP in one of the mainstream languages. You're now at a crossroads: do you learn to leverage LLMs, or do you climb the steep learning curve of Clojure?
> I've seen some arguments that AI sucks at Clojure compared to say Python and JS, so Clojure jobs are safe, but if I'm running a business and I just need software that works The programming language landscape (which is indirectly but largely influenced by the whims of investors) may be changing. I don't find it a coincidence that language diversity in job offers has decreased. I wish Clojure were something "so good that it cannot be ignored", but IMO that isn't true; one often simply trades one set of problems for another. Claiming that Clojure programming is simpler can come across as a suspiciously self-serving claim. 2023 has been a weird year; hopefully our community will come out stronger.
I'm a self-taught programmer. A preference for simple but powerful tools, ones that map cleanly onto the problem I'm trying to solve in understandable ways, led me to Clojure from Python. The same impulse is what makes me extremely averse to using something as incomprehensible as an LLM as part of my professional work. I wrote an essay about my thoughts on the topic not too long ago: https://respatialized.net/the-will-to-understand
I didn't read your post in its entirety, but while AI can write some initial software, can it maintain it? What happens when the AI gets itself into an edge case where it does not know how to proceed? Do we call the humans in to vet what the AI has created? Can the AI upskill and train humans on what it has written (onboard new developers)? Can the AI mentor new developers?
@U04RG9F8UJZ Flexiana asked me that question when they interviewed me for "Clojure Corner": https://flexiana.com/2023/07/clojure-corner-with-sean-corfield-part-2
And yeah, the claim about Clojure's simplicity is self-serving, much in the same way that "I should have a weekend" or "I should have control over how I do my work" is also self-serving.
Is AI going to join the call with your most profitable clients and triage their issues and create a plan to address their concerns?
What you're assuming is that the only value of developers is writing code, but that isn't the truth; there is more to being a software engineer.
Being able to reason about the code is something that will be important even if AI is doing more of the writing. So if you believe Clojure makes that easier, as many here do, that would be the value prop.
I'll mention one other thing, if you're not as skeptical as I am or are otherwise still worried about the use of AI to replace you, or about the much more likely outcome of it being used to degrade your pay and working conditions: I would suggest the solution to the problem lies outside the realm of software tools entirely. The Writers Guild of America recently obtained a deal that protects the rights of writers against the use of AI to duplicate their work and force them into a position subordinate to the outputs of an LLM. I'm in a labor union. Consider joining with your programmer and non-programmer colleagues alike to form one! https://www.theverge.com/2023/9/26/23891835/wga-contract-summary-ai-streaming-data
A couple of ways this could be wrong, and Clojure actually would not be worth the time investment: • it turns out static typing helps AI a lot • it turns out that the 100x+ quantity of mainstream code is a big advantage for AI, so that it works much better for those languages • these dwarf the advantages of Clojure (e.g., better ability to reason about state)
Yeah, those points are along the lines of what I'm thinking. I'm not too concerned about AI taking my job: I'm close enough to the end of my career that by the time AI is (possibly) good enough, I'll be retired. I'm more interested from the angle of people who aren't yet in tech but may want to get into it, or people at the crossroads I mentioned. As another analogy, we don't really use assembly languages anymore. Sure, there are places. But in general, does it benefit someone to wade into that? Most of the stuff around CPU architecture and instruction sets I learned in school probably isn't exactly true anymore anyway, so was it worth the course time and brain cycles to learn it? I do agree that there's a fog around this topic. A lot of the hype articles seem to stop at one-and-done experiments, but that's not really the hard part of the job. The hard part is understanding and tweaking and growing the application. But just like higher-level languages made punch cards a novelty, LLMs may be positioned to filter out more niche languages and refine implementations of the more popular ones.
One thing I can say is that Clojure's clear-headedness makes it a prime educational language - consider that a student who understands map, filter, reduce, and lazy sequences knows enough to describe streaming architectures, and hypothetically they would be able to use an LLM to create some sort of streaming service, for example using Flink or Spark or something along those lines (see the sketch below).
In that way, learning Clojure is quite valuable
Hypothetically speaking
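To make that mental model concrete, here is a minimal sketch: a lazy, potentially unbounded sequence processed element by element, which is conceptually what Flink/Spark pipelines do at scale (the event stream here is simulated).

```clojure
(def events (repeatedly #(rand-int 100)))   ; stand-in for an unbounded event stream

(->> events
     (map inc)        ; transform each event
     (filter even?)   ; keep only the events we care about
     (take 10)        ; laziness: realize only what we consume
     (reduce +))      ; aggregate the "window" into one result
```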
Following on from my comments in the Clojure Corner interview I linked above, I think that software development as an industry will change a lot over time as LLMs etc get better at reading/explaining/writing code but I think niche languages will be affected less than mainstream ones and most of the reasons for choosing a niche language in the first place will continue to hold true for longer. As mentioned above tho', there's a lot more to being a software developer than just reading/explaining/writing code and many of us will learn to use LLMs etc in our day-to-day work like we've learned to use any other improvements in tooling.
With my OSS projects, I use Copilot for two main areas: writing documentation and writing tests. It's pretty good at the former (although it sometimes makes some pretty insane leaps about what it thinks I'm intending to write). For the latter, the "random generation" aspect means I have to spend less time "inventing" data for example-based tests, and it mostly gets the syntax somewhat correct (even in Clojure -- although it tends to add a lot more ))) than necessary).
I agree with @U04V70XH6’s approach, or rather I found a similar one. Copilot is very good at quickly and efficiently spitting out repeated data(like) things. Once it "gets" what the shape is, it generates stuff extremely quickly. So in a sense it's copy paste on steroids. Unfortunately I don't write much Clojure at work (I use the REPL and write personal tools), so I have a perspective of using Copilot and similar with mainstream languages: I mostly have it turned off, because it is distracting in many cases. • repeatedly breaks my flow • makes horrid suggestions too often • it's very slow • it conflicts with my editing/navigation in effectively non-deterministic ways (mostly because it's slow and unpredictable). I turn it on for things I know it's good at: rote work.
> As another analogy, we don't really use assembly languages anymore. Sure, there are places. But in general, does it benefit someone to wade into that? Most of the stuff around CPU architecture and instruction sets I learned in school probably isn't exactly true anymore anyways, so was it worth the course time and brain cycles to learn it? I disagree. I don't have a degree; I'd been a programming hobbyist since I was a kid and somehow got a job, so I'm an autodidact, in part taught by my dad. My dad would often tell me war stories from the time he wrote assembly. I always thought of learning (stuff like) assembly and how computers work under the hood as arcane / things of the past / things compiler writers have to worry about, AKA not my problem. But I've discovered that learning those things is extremely beneficial and surprisingly simple, or rather not as scary as I thought! It helps me think in a more machine-like way, if that makes sense; it dispels some of the magical aura around what's happening inside the computer. Perhaps most importantly, it opens up new questions that I wasn't able to ask before. I've found that the more foundational knowledge I absorb, the more opportunities emerge. It's not always immediate. It's also very fun 🙂
Those who learned these things in school might not always appreciate how impactful that kind of knowledge is (I mean that in the most respectful way), because they never experienced the lack of it.
Thanks for the interesting discussion and perspective folks 🙂 @U04V70XH6 @U01EFUL1A8M are you using freely available tooling or paid versions?
Another thought that I want to share: Fundamentally programming is about thinking and communicating about goals and problems precisely. I don't think that skill will go away anytime soon! I don't have a son to worry about, but we have a new apprentice. It's of course not the same, but we feel very much responsible to teach him useful stuff. Our overall strategy is to focus on the thinking tools just as much as on the practical mechanics of programming.
@U04RG9F8UJZ I get Copilot for free because of my OSS work but it is the "paid" version, as far as I can tell.
I pay for ChatGPT. I don't use it in a mindless copy-and-paste manner. Normally it helps me write something like a regex, or fill some knowledge gap. I'll build by hand on top of that - maybe GPT gave me some inspiration, but 'productionizing' that code is where human expertise comes into play.
@U01EFUL1A8M for sure on the thinking angle. That's kind of my point when I suggested "software developer" and "business analyst" converging. There's already significant overlap between the two, especially if you aren't just a "code monkey" and are diving deep into the domain. I think teaching those kinds of skills doesn't go away, regardless of the tools. My thinking, and motivation for this discussion, is entirely speculative: what does the job of the future look like, and is there really a sales pitch for learning to "drive manual", especially in a language that's niche, when "self-driving" is (allegedly) on the horizon? I really appreciate your take on being self-taught. My sister-in-law is in the same boat (my wife and I have mentored her and helped her get into the career from an arts background), and she's super keen on learning the "under the hood" details as well. I'm trying to separate things like "learning for curiosity" from "do I need this to put food on the table". I think it's a very fuzzy line, though. As you said, some of those things help you ask better questions, so it's hard to know beforehand what's going to come in handy and what isn't.
The driving analogy is interesting because I learned to drive manual/stick as a teenager and drove nearly all manual cars up until 2005 (26 years) because I really enjoyed the experience of driving. In 2005, we bought a Prius (which doesn't have a manual option, as I recall). Last year we bought a new car that is our first to have some degree of "self-driving" (it follows lane markings and will steer for you, unless the curve is too much) and I love that feature and use it pretty much all the time. My wife hates it so it's lucky that it is off by default and resets to off every time you start the car! 🙂 My early programming included a lot of assembly language and then a lot of C and C++ so it was pretty low-level and you had to know how the machine worked under the hood. As I've gotten further into my career (aka getting older), I've leaned toward higher and higher-level languages with more abstraction so I (mostly) don't have to worry about how the machine works now. Right now, I'm still extremely skeptical about AI/LLMs in general and their ability to generate useful code in particular, but I can see it becoming more useful to me (for documentation and for tests, certainly, and perhaps eventually for writing some source code) and maybe I'll grow to like the "abstraction" of describing a problem and having it sketch out the code for me... :thinking_face:
The car thing makes me think about risk/engineering and manual overrides... Just this week we had an issue with a notification system. The data coming in was lacking and some notifications didn't go off. I'll spare you the details. Long story short: it would have been really nice if we'd had a "manual override" to trigger (parts of) the thing directly, and we noted that this is something we will think about more when we write these kinds of programs. Ultimately someone needs to understand how stuff works at some point and get their hands dirty.
@U04V70XH6 you mentioned how you climbed up the abstraction layers over your career so to speak. I think an important point here is that LLM are not providing actual computational abstractions. Or at least it doesn't feel that way to me. To quote Dijkstra: "The purpose of abstraction is not to be vague, but to create a new semantic level in which one can be absolutely precise."
You, "first-worlders" where it's hard to find a manual car 😄 Anyway, agree with some points, disagree with lots of others. Everything so far, with one single exception, that I asked ChatGPT game me wrong code. Not like "marginally wrong that needs tweaking", an actual "so wrong that I can't use anything"
I had some GPT-4 credits for a while, and I tried it on some arduous work that I had to do. The result was literally atrocious - it basically copied the whole source code and pasted it in the middle of a function.
A few times I got good and correct code from ChatGPT in C#: • Count the number of non-white pixels in a bitmap using the byte layout • Merge two C# records given the schema (would be a 2-level map merge in Clojure that could be done with built-in functions, but was ~50 lines of C#)
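For reference, the 2-level map merge mentioned above really is a one-liner with built-ins (the example data here is invented).

```clojure
(def defaults  {:user {:name "?" :role :guest} :flags {:beta false}})
(def overrides {:user {:name "Ada"}            :flags {:beta true}})

;; merge-with merges top-level keys, merging their nested maps on conflict
(merge-with merge defaults overrides)
;; => {:user {:name "Ada", :role :guest}, :flags {:beta true}}
```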
@U01EFUL1A8M Dijkstra said a lot of things for effect 😄 I consider "abstraction" in the context of being able to describe solutions to problems by using language that is further away from the machine. The examples Isak gave are good ones, IMO.
> What's the sales pitch for Clojure with the advent of AI tools? What is the sales pitch for hand manufacturing with the advent of assembly-line manufacturing? For in-person meetings with the advent of virtual conferencing? For oratory with the advent of writing? Even assuming LLMs become reliable utility infrastructure (I am a skeptic), I don't think they will remove what exists. It will be a "yes, and" proposition. More will be possible, and more avenues for work will reveal themselves. Truly landmark inventions create surpluses. As long as those surpluses become reasonably widely available, we will always have things to occupy us. One of the things that will change is the premise of the question... e.g. why would such a sales pitch even be necessary?
If LLM tech really legitimately becomes that capable, will a specific programming language even matter?
One would be able to use literally anything, because today's languages would become mere bytecode output, compiled from natural-language prompts. (And there would be some high-paying work for the bytecode-knowers, for when "low-level" hacking becomes necessary.)
Something that perhaps wasn't mentioned is the "garbage in garbage out" problem. Authoritative, battle hardened information is hard to come by.
Agreed. And, as a bytecode producer, I would like to charge a hefty premium for feeding any for-profit LLM. I hope the class action lawsuits come thick and fast and force them to release derivative work back into the commons from which they have taken. (The tech itself is okay I think, the grift and the gatekeeping is not, when almost the entire value of your system derives from the input it consumes.)

This matter, like many that are related to Clojure, was the subject of a definitive pronouncement by a real computer scientist: > 93. When someone says "I want a programming language in which I need only say what I wish done," give him a lollipop. -- http://www.cs.yale.edu/homes/perlis-alan/quotes.html
So, here's a thing: there are some specific areas where LLMs really can replace people - and for now, they aren't. As an example: I had a problem with CircleCI (another bug related to GitLab - I collect these). I reported it, and the "Senior Support Engineer" gave me an answer that was so bad that I asked ChatGPT to write one. Turns out the ChatGPT one was better. Neither actually solved my problem, obviously, but if I could choose between the two, I would prefer the automated one.
So maybe these people will be replaced in the future; maybe we'll need a higher bar for humans to "take over" (which I am totally fine with); or maybe something incredibly different will happen (maybe a monstrous programming language that literally accepts the syntax of any of the existing ones, so any code these LLMs ever ingested can be re-used in a single, unified thing that nobody would expect to write by hand but automated tools would nail). I mean, I am quite fine with any of these - I have a lot of ideas and so little time to actually try them, so something that let me test my ideas and see if they have value would be very welcome 😄
I've developed a fairly liberal view of what I consider "programming". The definition I like to use is "getting a computer to do what you want". Notably, this includes methods that aren't typically considered programming (eg. Excel, Photoshop, Siri) and doesn't include other methods that might typically be considered programming (eg. AbstractFactoryManagerImpl, overly complicated config files, monads 😛). From this perspective, programming has already become a common, widespread activity, even before the advent of LLMs. LLMs aren't the first technology with an approachable interface that spits out "code". In general, most of programming is filled with unnecessary complexity and I'm excited about the possibilities where LLMs might help. I haven't used LLMs to write more than a function or two, but I have found LLMs very useful as part of my design process.

For building a framework for how to think about the future of programming, I've looked into the history of literacy. I think there are lots of interesting thoughts there, but the summary is that literacy has become widespread. People use writing for all sorts of things at home and in the office. It would be ridiculous to pay someone to write a grocery list for you, but that doesn't mean people don't get paid for more specialized forms of writing (eg. contracts, law, novels, copywriting, etc). This is the direction that I think "programming" will take in the future. I hope it will become possible to make a simple desktop, web, or mobile app without a CS degree (if that's not already true). Conversely, I think there will always be a place for "professional" programming.

Back to the original question: how good will Clojure be in this brave, new world? It's hard to predict the future, but I think Clojure has a few superpowers that make it particularly well suited:
• Clojure is data and information oriented: LLMs thrive off of good data and flail with bad data. Being data oriented makes it easier to get the most out of LLMs.
• Clojure is designed to be hosted: Clojure already has access to the Python ecosystem through libpython-clj (see the sketch below). Further, if deeper integration makes sense, it's possible to write a dialect that specifically targets the Python platform. Another benefit of being hosted is that it makes it easier to take Clojure to new devices. We have computers in our desktops, laptops, and phones, and we'll soon be getting them in our cars, TVs, fridges, etc. If you want programs that go anywhere, then a language that's designed to be hosted is a good start.
• Clojure is stable: The more programs we have, the more important it is to build on something stable. Folks might be used to replacing their phone every few years, but I think folks aren't going to be happy about replacing every device or appliance (which now have computers) every few years. If you're building on top of an ecosystem that requires rewrites every 6 months, then your old code and old devices will quickly stop working.

Finally, I'll leave you with one of my favorite Clojure talks, which seems prescient: https://www.youtube.com/watch?v=ShEez0JkOFw
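A hedged sketch of the libpython-clj point above (this assumes libpython-clj2 and a local Python install with numpy available; API names are per the library's docs, so treat the details as approximate).

```clojure
(require '[libpython-clj2.python :as py])

(py/initialize!)                          ; boot the embedded Python runtime
(def np (py/import-module "numpy"))       ; any installed Python package works

(py/py. np sum [1 2 3])                   ; call numpy.sum from Clojure; => 6
```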
My day job is Python. I use ChatGPT for a few things (mainly through the API because it's cheaper): 1. Generating peripheral code that's not business specific, generic algorithms, etc. 2. Understanding industry jargon specific to the business domain I'm operating in. 3. Quick catch-ups on concepts I have forgotten about. ChatGPT has so far been poor at useful suggestions for simplifying business logic etc., and if the code gets long the errors add up and it's not worth the time to debug. Overall I don't see AI being close to replacing the core value of programmers, but it does eliminate a lot of so-called junior-programmer work, where they mainly write code for stuff that's peripheral to the business. I would expect it to impact Clojure even less, since Clojure makes writing peripheral code less of a chore.
@U7RJTCH6J in a sense, I agree with you. I never understood why people say that CSS, for example, is not "programming". I mean, if you didn't do the layout or the animation in CSS, you would need to do it in JavaScript, and then it's programming? Just because you changed the way you do it? The same for Excel - I once solved a huge problem for my wife using some Google Drive spreadsheets and a 15-line script that sent her an e-mail with a reminder. Was that not programming? It's weird when people "gatekeep" the concept...
LLMs don't have the capability of thinking; they are just probabilistic models. Our industry is all about thinking. BTW, current AIs are automating plagiarism, which I think is not right.
Thoughts from a beginner, rephrased a little by ChatGPT and edited by myself: • 1. Clojure's Unique Stand: While Clojure has its roots in Lisp, it's more than just another Lisp variant. It's evident that a lot of thought has gone into its design. Many of us are drawn to Clojure not merely because of its features, but because of its underlying design philosophy. Language features are cheap now; most modern languages offer multi-paradigm capabilities. It's the core philosophy that sets Clojure apart, making it, in my opinion, even superior to Scheme. • 2. The Role of Tools: While tools play a crucial role in problem-solving, they aren't the sole factor. It's the underlying design and principles of a tool that make it universally applicable across various domains. History suggests this is a recurring theme. • 3. Perspective on AI: To me, AI represents a new layer of abstraction that makes you rethink many questions. Like concurrency and availability, those questions always exist in the real world; they aren't unique to programming. So I believe Clojure will be reborn in a new form in the new era, since it is a "hosted language".
AI is very far from replacing programmers in any language. It is closer to saving some time, e.g. if you need help writing documentation, searching, having better developer tooling, etc. There is the argument about writing functions, but that's more for boilerplate-heavy languages, which already have tooling/frameworks for that. And even if AI were good at producing running code, that barrier is already not so high for anyone to clear. On the same premises you could speak (and people were speaking) about Google, Stack Overflow, companies that make software for building apps from simple blocks, etc. The development that really matters for business is making software that is reliable and flexible to change over long periods of time, from which AI in its current form is really, really far away. So not only does Clojure not have to worry; other languages don't either. We are so far, technologically, from the point where we need to worry about it that we should worry more about our energy use, and whether in those X years we will have the resources to use computers this much, along with all the infrastructure that comes with them.
I personally don't use it for Clojure. Clojure is so familiar to me, and I write code with such intention, that anything else spitting out code is just going to be a distraction for me, and nothing is going to be the way I want. I've seen over the years that people have such a diverse range of experience in getting work done through programming that someone writing a post explaining and asserting their opinion on a particular subject is probably very useful in their current context, but it doesn't apply to all domains.
> have better developer tooling That's where it scares me. It can create the "Framework Curse" that Ruby, for example, has - where a single tool is recommended over and over again, and the simple existence of alternatives is buried.
I didn't say it in the Flexiana interview but I have said it here in past AI/LLM discussions (and where the topic has cropped up elsewhere): I think AI/LLMs have the potential to eliminate some programming jobs in the "enterprise"... I'm talking about large companies that have huge numbers of (poor to mediocre) programmers and treat them as fungible resources. Those 9-to-5 warm bodies that fill seats and who mostly write boilerplate code into large framework-based apps -- that stuff can often be automated, with just a sprinkling of senior programmers to fill in the gaps. Anecdote: many, many years ago, a friend of mine who wasn't a very good programmer got a job at a company that heavily used Visual C++ and they built their apps mostly via a drag'n'drop UI so their actual "programming" work was usually just filling in the body of a single method. He'd been working there for about six months, happily dragging'n'dropping widgets into the company app and writing a few basic lines of code here and there, and then he got a ticket where the body of the function would be a bit more complex, so he came over to my place to ask for my help. I was shocked that he'd managed to do a "programming job" for six months but couldn't write what was a fairly simple function to manipulate some external API! He showed me some of the code in the application, "written" by this huge team he was on -- it was awful because it was nearly all generated for them by the visual tooling, and the pieces the programmers had actually written themselves were also mostly awful because almost the entire team was pretty low-skilled. But that was how the company had operated for years: programmers were just "cheap, interchangeable resources", moved around as priorities changed... ...I think that sort of company could use AI/LLMs to eliminate a lot of the warm bodies because so much of the codebase was generated boilerplate already. But they'd still need "a sprinkling of senior programmers" to write the harder parts.
I also agree with you, @U04V70XH6 - being in the same boat, I had my share of people who basically made "random changes until the code compiled", then ran git commit, opened a PR, and squashed everything (so any info on why that variable was being initialized with -2 was lost to time), and that was it...
Then, when the company decided that we had to write tests, they basically wrote tests just to fulfill that requirement - in the end, every single change meant deleting all the tests and writing them again, because they were too tied to internal data structures, full of mocks, and everything else. I could see these AI tools replacing some of those people, for better or worse...