this post was submitted on 04 Jun 2024
1823 points (96.5% liked)

linuxmemes

     
    [–] rbn@sopuli.xyz 109 points 5 months ago (3 children)

If you consider only the RAM on the developers' PCs, maybe. If you count in thousands of customer PCs, then optimizing the code outperforms hardware upgrades pretty fast. If millions of people have to buy new hardware because of a new Windows feature, that's pretty disastrous from a sustainability point of view.

    [–] vithigar@lemmy.ca 46 points 5 months ago

    But that's just more business!

    [–] huginn@feddit.it 13 points 5 months ago (2 children)

    Last time I checked - your personal computer wasn't a company cost.

Until it is, nothing changes - and to be totally frank, the last thing I want is to be on a corporate machine at home.

    [–] CosmicTurtle0@lemmy.dbzer0.com 10 points 5 months ago (2 children)

    When I was last looking for a fully remote job, a lot of companies gave you a "technology allowance" every few years where they give you money to buy a computer/laptop. You could buy whatever you wanted but you had that fixed allowance. The computer belonged to you and you connected to their virtual desktops for work.

    Honestly, I see more companies going in this direction. My work laptop has an i7 and 16GB of RAM. All I do is use Chrome.

    [–] huginn@feddit.it 10 points 5 months ago (1 children)

It'd be nice to have that - yeah. My company issued me a laptop with only 16GB of RAM to try and build Android projects.

Idk if you know Gradle builds, but a multi-module project regularly consumes 20+GB of RAM during a build. Despite the cost difference being paid back in productivity gains within a month, it took 6 months and a lot of fighting to get a 32GB laptop.

    My builds immediately went from 8-15 minutes down to 1-4.
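For anyone hitting the same wall: the Gradle daemon's heap is capped by default, so big multi-module builds thrash unless you raise it in gradle.properties. A minimal sketch - the 12g figure and the parallel/caching flags are assumptions to tune for your own project, not settings from the comment above:

```properties
# Raise the Gradle daemon's JVM heap so large multi-module builds
# don't spill into constant GC; size this to your machine's RAM.
org.gradle.jvmargs=-Xmx12g -XX:MaxMetaspaceSize=1g

# Build independent modules in parallel and reuse task outputs
# between builds - both help more as module count grows.
org.gradle.parallel=true
org.gradle.caching=true
```

Of course, none of this helps if the heap you configure is bigger than the RAM in the laptop - which is the point being made here.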

    [–] CosmicTurtle0@lemmy.dbzer0.com -2 points 5 months ago (1 children)

    I always felt that this is where cloud computing should be. If you're not building all the time, then 32GB is overkill.

    I know most editing and rendering of TV shows happen on someone's computer and not in the cloud but wouldn't it be more efficient to push the work to the cloud where you can create instances with a ton of RAM?

    I have to believe this is a thing. If it isn't, someone should take my idea and then give me a slice.

    [–] huginn@feddit.it 4 points 5 months ago (1 children)

It's how big orgs like Google do it, sure. Working there I had 192GB of RAM on my cloudtop.

That's not exactly reducing the total spend on dev RAM though - quite the opposite. It's giving devs access to more RAM than you could fit in a device.

    But you can't have it both ways: you can't bitch and moan about "always on internet connections" and simultaneously push for an always on internet connected IDE to do your builds.

    I want to be able to work offline whenever I need to. That's not possible if my resource starved terminal requires an Internet connection to run.

RAM is dirt cheap and only getting cheaper.

    [–] bufalo1973@lemmy.ml 2 points 5 months ago

    "Use cloud if available"?

    [–] Ardyssian@sh.itjust.works 3 points 5 months ago (1 children)

Alternatively, they could just use Windows VDI and give you a card + card reader that allows a Remote Desktop connection, avoiding the hardware cost - like what my company is doing. Sigh.

    [–] cmnybo@discuss.tchncs.de 2 points 5 months ago

    If the job is fully remote, then the workers could be living on the other side of the country. Using remote desktop with 100ms of latency is not fun.

    [–] trollbearpig@lemmy.world -1 points 5 months ago* (last edited 5 months ago)

    Or maybe you could actually read the comment you are replying to instead of being so confrontational? They are literally making the same point you are making, except somehow you sound dismissive, like we just need to take it.

In case you missed it, they were literally saying that because the real cost of running software (like the AI Recall bullshit) is externalized to consumers, companies don't give a shit about fixing this. Literally the same thing you are saying. And this means that we all, as a society, are just wasting a fuck ton of resources. But capitalism is so efficient hahaha.

But come on man, you really think the only option is for us to run corporate machines in our homes? I don't know if I should feel sorry for your lack of imagination, or if you are trying to strawman us here. I'm going to assume lack of imagination - don't assume malice and all that.

For example, simple legislation could fix this. Let's say I buy a cellphone/computer, then buy an app/program for that device, and the device has the required specifications to run the software. The company that sold me that software should be obligated by law to give me a version that runs on my machine forever. This is not a lot to ask for; it's literally how software worked before the internet.

But now, under the cover of security and convenience, this is all out the window. Each new Windows/macOS/iOS/Android/Adobe/fucking-anything update demands more and more hardware while delivering little to no meaningful new functionality. So we need to keep upgrading and upgrading, and spending and spending.

But this is not a given; we can do better with very little sacrifice.

    [–] kibiz0r@midwest.social 4 points 5 months ago

    As a developer, my default definition of “slow” is whether it’s slow on my machine. Not ideal, but chimp brain do chimp brain things. My eyes see my own screen all day, not yours.