this post was submitted on 13 May 2025

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

[–] frezik@midwest.social 9 points 21 minutes ago (1 children)

The general comments that Ben received were that experienced developers can use AI for coding with positive results because they know what they’re doing. But AI coding gives awful results when it’s used by an inexperienced developer. Which is what we knew already.

That should be a big warning sign that the next generation of developers is not going to be very good. If they're waist-deep in AI slop, they're only going to learn how to deal with AI slop.

As a non-programmer, I have zero understanding of the code and the analysis and fully rely on AI and even reviewed that AI analysis with a different AI to get the best possible solution (which was not good enough in this case).

What I'm feeling after reading that must be what artists feel like when AI slop proponents tell them "we're making art accessible".

[–] Pixel_Crafter@lemm.ee 3 points 9 minutes ago

As an artist, I can confirm.

[–] swlabr@awful.systems 15 points 1 hour ago (1 children)

The headlines said that 30% of code at Microsoft was AI now! Huge if true!

Something like MS Word has like 20-50 million lines of code. MS altogether probably has like a billion lines of code. 30% of that being AI-generated is infeasible given the timeframe. People just ate this shit up. AI grifting is so fucking easy.
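A quick back-of-envelope check of that feasibility claim (the line counts below are the commenter's rough guesses, not measured figures):

```python
# Sanity-check the "30% of Microsoft's code is AI-generated" headline.
# All inputs are the commenter's rough estimates, not real measurements.
total_loc = 1_000_000_000        # guessed total lines of code across Microsoft
ai_fraction = 0.30               # the claimed AI-generated share
ai_loc = int(total_loc * ai_fraction)

# Assume the claim covers roughly the ~2.5 years since LLM coding tools
# went mainstream (an assumption, not a figure from the thread).
days = int(2.5 * 365)
lines_per_day = ai_loc // days

print(f"Implied AI-generated lines: {ai_loc:,}")       # 300,000,000
print(f"That's about {lines_per_day:,} lines per day")  # ~329,000/day
```

Even with generous assumptions, the implied sustained output is hundreds of thousands of merged lines per day, which is the kind of number you only reach by counting things like machine-generated boilerplate, as discussed below.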

[–] froztbyte@awful.systems 3 points 1 hour ago (1 children)

yeah, the "some projects" bit is applicable, as is the "machine generated" phrasing

@gsuberland pointed out elsewhere on fedi just how much of the VS-/MS- ecosystem does an absolute fucking ton of code generation

(which is entirely fine, ofc. tons of things do that and it exists for a reason. but there's a canyon in the sand between A and B)

[–] swlabr@awful.systems 3 points 38 minutes ago (1 children)

All compiled code is machine generated! BRB gonna clang and IPO, bye awful.systems! Have fun being poor

[–] frezik@midwest.social 3 points 19 minutes ago (1 children)

No joke, you probably could make tweaks to LLVM, call it "AI", and rake in the VC funds.

[–] froztbyte@awful.systems 3 points 16 minutes ago (1 children)

(not in the compute side, but in the lying-obstructionist hustle side)

[–] swlabr@awful.systems 1 point 3 minutes ago

would I be happier if I abandoned my scruples? I hope neither I nor anybody I know ever finds out.

[–] BlueMonday1984@awful.systems 22 points 2 hours ago

Baldur Bjarnason's given his thoughts on Bluesky:

My current theory is that the main difference between open source and closed source when it comes to the adoption of “AI” tools is that open source projects generally have to ship working code, whereas closed source only needs to ship code that runs.

I’ve heard so many examples of closed source projects that get shipped but don’t actually work for the business. And too many examples of broken closed source projects that are replacing legacy code that was both working just fine and genuinely secure. Pure novelty-seeking

[–] TheObviousSolution@lemm.ee 5 points 1 hour ago* (last edited 1 hour ago) (7 children)

Had a presentation where they told us they were going to show us how AI can automate project creation. In the demo, after several attempts at using different prompts, failing and trying to fix it manually, they gave up.

I don't think it's entirely useless as it is; it's just that people have created a hammer they know gives something useful, and have stuck with it through iterative improvements that do a lot of compensation beneath the engine. It's artificial because it is being developed to artificially fulfill prompts, which they do succeed at.

When people do develop true intelligence-on-demand, you'll know, because you will lose your job rather than simply gain another tool. The prompts and conversation flows people pay to submit to the training are really helping advance the research into their replacements.
