#off-topic
2022-12-21
eggsyntax 00:12:25

I found this analysis insightful, from one of the couple of tech industry analysts I enjoy reading: https://www.thediff.co/archive/how-companies-think-about-layoffs Of note: > Layoffs.fyi shows the kind of growth trend you never want to see: over the course of 2021, they tracked a total of 15,000 or so tech layoffs. This year, the number is 151,648.

Martynas Maciulevičius 06:12:27

There is some analysis with actual numbers, but over-rationalization like "restructuring for growth" is straight-up copium... If you got f-ed because of your mortgage, then what growth are we talking about 😄 It's not individual growth. Also, what about those employers who laid off 100% of staff? How do they grow from 0% of staff? This piece is probably directed at the upper management who decided on the layoffs. This way they don't need to think that they did anything bad; it's just a "correction". https://i.kym-cdn.com/entries/icons/facebook/000/035/699/pepe.jpg

eggsyntax 15:12:53

I found it useful as a window into the business's perspective on layoffs; I've definitely tended to see it from the perspective of an individual employee. > Also what about those employers who laid off 100% of staff? How does it grow from 0% of staff? That...doesn't sound like the typical case?

Martynas Maciulevičius 15:12:33

102 companies laid off 100% of their staff, and 236 companies (including those 102) laid off more than 40% of their workers. In total the site has information on 1833 companies. So 236/1833 = 12.88% of the companies in that statistic got hit really hard. And for the others... alright, maybe a 38% correction is still a lot, but to some degree this article could apply to them. The link to the statistic is in your article: https://layoffs.fyi/ The screenshot shows row 102 with the last company that has a 100% staff layoff:
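The arithmetic above can be checked in a couple of lines. This is a minimal sketch; the counts (102, 236, 1833) are the ones quoted from layoffs.fyi in the message, not a live query of the site:

```python
companies_total = 1833    # companies tracked with layoff-percentage data
laid_off_100 = 102        # companies that cut 100% of staff
laid_off_over_40 = 236    # companies that cut >40% (includes the 102 above)

# Share of tracked companies hit really hard (>40% of staff laid off).
share_hit_hard = laid_off_over_40 / companies_total
print(f"{share_hit_hard:.2%}")  # → 12.88%
```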

pez 13:12:54

A young friend of mine studies digital creation, photography, movie making, et cetera. He works mostly in Blender and DaVinci Resolve, iiuc. He knows nothing (and cares not) about building PCs. His dad wants to buy him a computer, budget about €/$ 2,000, maybe tops 2,500. What are some options/considerations, would you say?

thomas 13:12:23

Good graphics card?

thomas 13:12:53

I think most people go with Nvidia... especially on Windows

thomas 13:12:15

and lots of RAM, I'd say. More disk can be added later... a 1 TB USB drive was €70 or so the last time I checked.

pez 14:12:05

Thanks. I wonder if there are any pre-built configs. If this was for me I would build it, but here that is not really an option.

thomas 14:12:37

I went to my local computer shop and specified what I wanted, and they made a proposal for me. Took a few days (maybe 2 weeks, can't remember; it was 2.5 years ago).

Mno 15:12:30

Nvidia is highly recommended as the GPU for 3D design apps, because some of these apps are terribly unoptimized for AMD

Mno 15:12:12

I don't recall if Blender or DaVinci specifically had any issues.

Mno 15:12:36

A previous-gen GPU, maybe a 3070, and 32 GB of RAM will probably be achievable. For the CPU I think I'd go Ryzen, but if you see anything heavily discounted I'd go for that. The lower end of the new gen (for AMD) is maybe worth it for upgrading down the line, since they changed socket this generation.

Mno 15:12:23

I'm not super versed in DaVinci Resolve, so I don't know what it uses. That being said, I'm sure almost any 2k gaming pre-built will probably do a dang good job.

👍 1
pez 15:12:06

Awesome. Thanks, both of ya!

timo 17:12:59

for linux maybe consider system76

Martynas Maciulevičius 20:12:20

Gaming computers may not fit, because they render the display from that same card, while you would want to dedicate it to the workload. This is how I got burned when I bought a laptop that they said was "for machine learning": the GPU actually owns the ports, and I can't avoid using Nvidia's GPU for rendering if I want to connect any screen. I don't know how it would work on a workstation, but it still takes resources of the card that could be dedicated to the workload. In gaming that wins a couple of FPS, but IMO that isn't useful for non-gaming things. Maybe rendering is different and I'm thinking about machine learning too much here. But even in rendering it will still render into RAM/HDD and only display the final output later (I have no idea if it's this way) :thinking_face:

Mno 20:12:42

Full-sized cards do not have this issue, as far as I know; they can handle it just fine (a lot of desktop CPUs don't have integrated graphics, and they're stable). Laptop GPUs have limited power and cooling and tend to die because of this. They're just generally more flaky, unfortunately. Then again, I'm only a hardware hobbyist; please take the opinion of an expert over mine.

Martynas Maciulevičius 20:12:23

A laptop can work in power-save mode if it renders the UI from the CPU's integrated graphics, and then you'd activate the GPU only when you actually want it, as if it were a plug-in. But my laptop must run the GPU full-time. I bought my laptop from Tuxedo Computers, and it's basically System76 but in Europe. So don't buy GPU laptops from System76, or make sure the MUX circuitry allows for this power saving. My laptop lasts a couple of hours even though the battery is 97 Wh. This is an exceptionally power-hungry machine (but yes, it packs a punch for sure, at least in the laptop league; I think it was pretty good on Geekbench: https://browser.geekbench.com/search?utf8=%E2%9C%93&q=5900hx+stellaris). Edit: Also it's a noisy one if turned to max power (I think System76 and Tuxedo Computers actually sell the same laptops from Tongfang and Clevo, and those are known to be noisier; mine is a Tongfang one). So I had to adjust the fan speeds and throttle it. But it has more power than I use, as I don't allow the temperatures to ramp up.
