this post was submitted on 17 Aug 2024
277 points (98.3% liked)

Ask Lemmy


I've seen reports and studies showing that products advertised as including / involving AI are off-putting to consumers. And this matches what almost every person I hear irl or online says. Regardless of whether they think that in the long term AI will be useful, problematic or apocalyptic, nobody is impressed by Spotify offering an "AI DJ" or by "AI coffee machines".

I understand that AI tech companies might want to promote their own AI products if they think there's a market for them. And they might even try to create a market by hyping the possibilities of "AI". But rebranding your existing service or algorithms as being AI seems like a super dumb move: obviously stupid to tech-literate people, and off-putting / scary to others. Have they just completely misjudged the world's enthusiasm for this buzzword? Or is there some other reason?

top 50 comments
[–] slazer2au@lemmy.world 110 points 1 month ago* (last edited 1 month ago) (3 children)

Hype brings investment money to the table. When an emerging technology appears, you can say we are looking to develop those technologies into our existing products and you will see a bump up in your share price.

After a few years of failed products, the hype dies and moves on to the next thing; you never mention the old hype again, but you keep the bump in share price.

Think about 5-7 years ago: Blockchain was all the hype. 5-7 years before that it was Machine Learning and XaaS, and before that, Big Data.

[–] Ephera@lemmy.ml 21 points 1 month ago (1 children)

Yeah, investors kind of amplify hype. When there is hype, you will have some investors investing money.
If there's investors investing money, it makes sense for other investors to try to invest first, so that their invested money gains value (the share price rises).
And then it becomes somewhat of a self-fulfilling prophecy, because suddenly you do have companies equipped with money to pursue that hype, which can feed back into the hype.

But similarly, you'll eventually reach a point where it does not live up to the inflated hype and then shareholders can just as well be extremely quick to pull out their money and amplify the crash.

[–] slazer2au@lemmy.world 6 points 1 month ago

Investors also know not every product will sell. So pad the bet and spread wide to increase your chances to score big.

[–] jjjalljs@ttrpg.network 69 points 1 month ago (1 children)

My understanding is that a lot of venture capitalist funding is driven by gut feel and personal connection. Like, they'll tell you that they're the vanguard of the future with a vision, but most of the time they're just cliquey bros going "dude, sick" and burning money.

There's an anecdote in the book "The Cold Start Problem" about how Zoom got funding even though the guys funding it thought video was a solved problem and that a new video company wouldn't go anywhere, but the Zoom guy was their bro, so they gave him millions of dollars.

I feel like it's possible some future generation will look back at this the way we look at feudalism. Just like, that's such a bad system, why did people put up with it?

[–] betterdeadthanreddit@lemmy.world 51 points 1 month ago (2 children)

Suits heard about this secret sauce called AI that can cut down on the need for those pesky humans that are always looking for handouts and luxuries like a living wage and benefits. The consumer will have to accept it when the only choices they're offered are varying flavors of the same shit.

[–] androogee@midwest.social 17 points 1 month ago (1 children)

(but also it doesn't work and they are, in fact, just dumb)

[–] ivanafterall@lemmy.world 13 points 1 month ago* (last edited 1 month ago)

And then when it doesn't work, you blame other people and fire humans anyway and give yourself a raise for saving the company money. Stock prices rise.

[–] jet@hackertalks.com 41 points 1 month ago

Attracts investors.

When people are evaluating companies, and see a company missing out on the current trend, how is that going to factor into their valuation of the stock prices?

[–] SeikoAlpinist@slrpnk.net 38 points 1 month ago

It hypes investors. Investors are the customers.

[–] TheHobbyist@lemmy.zip 31 points 1 month ago (1 children)

Because the boss thinks it sounds cool and doesn't want to be the only kid on the block without an AI product to sell.

[–] Just_Pizza_Crust@lemmy.world 31 points 1 month ago (1 children)
  1. OpenAI struck gold, NVIDIA followed suit, and everyone else bought shovels hoping to get investors even though they have no plans on striking gold (developing useful AI).

  2. Would you like to buy a timeshare to the moon? If we all buy, you'll be able to sell your spot for 10x the price! Don't wait! Spots are limited!

[–] JASN_DE@lemmy.world 27 points 1 month ago (2 children)

Nvidia is the biggest shovel seller out there.

[–] Just_Pizza_Crust@lemmy.world 12 points 1 month ago

Nvidia sells the hardware (the shovels), but it also develops portions of the software to make it run more efficiently, like OpenAI does. Nobody but Microsoft seems to be actually developing software, though AMD is slowly working towards having comparable performance.

[–] Phen@lemmy.eco.br 4 points 1 month ago

We kinda need to adapt the saying now: when someone finds gold, you sell wood and iron to all the shovel makers that show up.

[–] gencha@lemm.ee 27 points 1 month ago

Money was already spent. The hype companies were backed by big capital in their early days. Now the people who provided that capital want to cash out and collect their winnings. So you will have AI shoved down your throat on every media channel those people also own. AI is a hype term that has appeared periodically since before the 2000s. This is nothing new. https://en.m.wikipedia.org/wiki/AI_winter

LLMs are toys that sparkle for a brief moment. Their value is laughable compared to their cost.

[–] pH3ra@lemmy.ml 26 points 1 month ago

They believe that supply and demand in the market is a chicken-and-egg situation, so right now they're force-feeding us the supply while waiting for the demand to adapt.

[–] abigscaryhobo@lemmy.world 25 points 1 month ago

A lot of business people also think that AI is a "force multiplier" meaning that if they use it they can get more done in less time. Anything that can do that is basically a money printer at the business level, which is why all these execs and companies are so excited about it.

The problem is that it's not, or at least it's not reliably proven to be. All these companies are jumping on board thinking "shove some AI in there and get 20% growth", when in reality there's no evidence it works like that. And that's why a lot of customers are turned off: from the consumer side, AI is just sloppy, unoriginal junk. But on the business side they just see "productivity is up", never mind that the productivity is garbage quality.

[–] JustZ@lemmy.world 19 points 1 month ago (1 children)

It was super cool for like three weeks. Now it's the gambler's fallacy they're hanging on to.

[–] Linkerbaan@lemmy.world 19 points 1 month ago

It attracts investors in the company not customers.

[–] Honytawk@lemmy.zip 18 points 1 month ago

Because it attracts shareholders

[–] EnderMB@lemmy.world 18 points 1 month ago (1 children)

It's a symptom of shareholder-driven development.

Many companies pushing AI have had huge layoffs, and haven't launched anything worthwhile in years. Many of these companies have a metric fuck-ton of data, and already do some kind of AI (they've probably had LLMs for years too). This way, they can spend money and make it look like they're doing groundbreaking stuff to ensure shareholders are happy.

They'll continue to do stealth layoffs of people outside of AI, until the hype dies down, and they'll move to the next grift - after laying off all of their AI folks.

[–] Moah@lemmy.blahaj.zone 18 points 1 month ago

It's a "don't want to miss the boat" thing, where companies have to invest in whatever's trending in case it becomes successful and gives them an advantage. If they wait until it's proven, they might lose a competitive edge (having to start learning after everyone else). In the case of AI it's even more important, since the promise actually sounds useful (the summarize-anything-quickly bit, at least), unlike, say, NFTs. At least that's roughly how it was explained to me at one of my jobs.

[–] Zak@lemmy.world 18 points 1 month ago

Think like a venture investor.

A small chance of huge growth via new technology can have a big payoff. They expect most companies to fail and are more worried about missing an opportunity than losing money in a single bad investment.

Nobody is quite sure where AI technology will be in ten years, but if it's big, it's going to make people who got in early very rich. It doesn't matter that it sucks now; the web sucked in 1995, but it made people who got in (and out) at the right time very rich.
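That venture bet can be made concrete with a toy expected-value calculation. The probabilities and return multiples below are purely hypothetical, made up for illustration; the comment itself gives no numbers:

```python
# Toy venture-portfolio model. All numbers are hypothetical:
# most investments go to zero, a few return modestly, and a
# rare outlier returns 100x the money put in.
outcomes = [
    (0.90, 0.0),    # 90% chance: total loss
    (0.09, 2.0),    # 9% chance: modest 2x return
    (0.01, 100.0),  # 1% chance: outlier 100x return
]

# Expected multiple per dollar invested: sum of probability * payoff.
expected_multiple = sum(p * m for p, m in outcomes)
print(expected_multiple)  # 0.90*0 + 0.09*2 + 0.01*100 = 1.18
```

With these made-up numbers, 90% of bets fail outright, yet the portfolio still returns 1.18x overall because the single outlier dominates. That's why missing the one breakout hurts an investor more than funding several duds.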

[–] PanArab@lemm.ee 17 points 1 month ago (1 children)

For the same reason blockchain was hyped for a while.

[–] craigers@lemmy.world 5 points 1 month ago

Oh dude but so many great and useful applications came out of blockchain that are completely unrelated to crypto... /s

[–] Fedizen@lemmy.world 16 points 1 month ago
  1. Investor FOMO

  2. Threatening Labor

[–] JackbyDev@programming.dev 15 points 1 month ago (1 children)

Shareholders and investors are more profitable than customers.

[–] peopleproblems@lemmy.world 14 points 1 month ago

I was discussing this with a friend. We came to the conclusion that "entrepreneur" means "unskilled, uneducated and unable to work" and that the harder a product is marketed, the more worthless it is.

[–] savvywolf@pawb.social 13 points 1 month ago (3 children)

I personally think companies were "burned" by the whole NFT situation. During the NFT "hype" a year or two ago, a lot of companies were slow to get on board with releasing NFT products, and so they missed the bubble entirely. NFTs are, of course, silly, but if they had taken off, companies would have loved to have been part of the boom.

Fast forward to now, and you have AI bros shilling AI the same way cryptobros were shilling NFTs. However, this time it's different! They have results, they have technology. Microsoft is on board! They have fancy tech demos which are not staged at all! If you didn't have experience with the technology and its limitations, you would be led to believe that, unlike the NFT bubble, this is actually going to be a real technology rather than snake oil.

I think there's also the issue that it takes a long time to bring a product to market. Imagine you've spent millions developing software and hardware for your AI coffee machine or whatever, and it turns out there's no market demand. You can't really turn to your stakeholders and say "oops, we made a mistake and have to cancel this product, sorry!"; you have to finish the product and try to recoup losses where you can. That's why there are all these weird posts advertising AI products: they can't just not release a product, and AI bros might be tempted to buy it.

I also wonder if the whole AI hate is bias due to us being here on Mastodon/Lemmy... We tend to be fairly cynical people who are critical of new technology and corporations. Maybe actual consumers who aren't online all day and clued into the tech scene are wowed by AI. I've certainly seen people here casually remark that they use ChatGPT and Copilot.

[–] Lemminary@lemmy.world 15 points 1 month ago

I also wonder if the whole AI hate is bias due to us being here on Mastodon/Lemmy…

Yeah, we're cynical but we have every right to be.

I use ChatGPT, Copilot, and image generators for different things, and I'm generally not on board with the blind hate because it's been nice to have an assistant that can do all these menial things. But honestly, I've gotten mixed results and don't see this tech correcting its obvious problems. The GPT-4o release was great with its web browsing, images, and speech, but it still struggles with accuracy to a tangible degree. Or worse, other companies use it for the wrong things as a cash grab to change perfectly working products. Even the applications that do seem perfect for it are not.

For example, I can't get Gemini to answer anything but simple questions about Google Docs without it getting confused and repeating the same thing. Copilot will sometimes reach conclusions wildly different from the sources it cites. ChatGPT will give you suboptimal code samples, be subtly wrong about the meaning of words in other languages, or suddenly forget part of my instructions. And now people are adding it to the fucking coffee machine for crying out loud. I'd have a different opinion if it were more accurate most of the time and genuinely useful, but using it more often only cements it in my mind as a secondary productivity tool rather than the main feature.

I hope the hype dies down and AI is seen as an afterthought enhancement rather than a stupid selling point. Anybody selling AI now looks clueless to me.

[–] conciselyverbose@sh.itjust.works 4 points 1 month ago* (last edited 1 month ago) (2 children)

You can't really turn to your stakeholders and say "oops, we made a mistake and have to cancel this product. Sorry!", you have to finish the product and try to recoup losses where you can.

You can and should. You're describing the sunk cost fallacy, which is pretty much universally understood to be a terrible money vacuum of a flaw in our reasoning. (I would have made this comment even if I hadn't read Quit literally yesterday, but it really is an excellent book about the value of abandoning bad decisions when new information makes it clear that they're bad decisions.) Buying time and raising expectations with dead-end nonsense tech might look better 6 months from now, but 5 years from now, being the guy who saw the writing on the wall, that continued investment was lighting money on fire, will leave you better off.

LLMs have limited applications now, and will in the future, but nowhere near enough to warrant the obscene amount of resources companies are burning to get there.

I also wonder if the whole AI hate is bias due to us being here on Mastodon/Lemmy... We tend to be fairly cynical people who are critical of new technology and corporations

Corporations, sure, but tech? Anti-tech people aren't early adopters of new tech products. Early adopters are just generally more aware of the actual shape of the field than people jumping on hype trains once they've already started moving.

[–] originalfrozenbanana@lemm.ee 13 points 1 month ago

The things that make a company successful are not the same as the things that make executives successful

[–] nednobbins@lemm.ee 11 points 1 month ago (1 children)

A lot of people have come to realize that LLMs and generative AI aren't what they thought they were. They're not electric brains that are a reasonable replacement for humans. People get really annoyed at the idea of a company trying to use them that way.

Some companies are just dumb and want to do it anyway because they misread their customers.

Some companies know their customers hate it, but their research shows that they'll still make more money doing it.

Many people who are actually working with AI realize that it's great for a much larger set of problems. Many of those problems are worth a ton of money (e.g. monitoring biometric data to predict health risks earlier, natural disaster prediction, and fraud detection).

[–] Tar_alcaran@sh.itjust.works 7 points 1 month ago (1 children)

Many people who are actually working with AI realize that it's great for a much larger set of problems. Many of those problems are worth a ton of money (e.g. monitoring biometric data to predict health risks earlier, natural disaster prediction, and fraud detection).

None of those are LLMs though, or particularly new.

[–] auzy@lemmy.world 11 points 1 month ago* (last edited 1 month ago)

Because whilst technical people know it has limited applications (like blockchain), business people tend to fall for buzzwords easily, because they don't realise that a lot of the things it does were already solved in other ways.

[–] qx128@lemmy.world 10 points 1 month ago* (last edited 1 month ago)

I see two basic reasons.

  1. It gives companies a plausible argument to embed telemetry into their products. Should your TV manufacturer or coffee maker manufacturer be able to monitor every single button you press on your device? Probably not, but they would like to “because AI”! Now they have an excuse to be as invasive as they want, “to serve you better”. The dream, for them, would be total surveillance of your habits to sell you more shit. Remember, it always comes back to money.

  2. The old adage never fails: if it’s free, you are the product. Imagine AI being so pervasive, that now everywhere you look, everything you interact with can subtly suggest things. It doesn’t have to be overt. But if AI can nudge the behavior of the masses to do a thing, like buy more soda, or favor one brand over another, then it has succeeded in boosting company bottom line. Sure the AI can do useful shit for you, but the true AI problem companies want to solve is “say or do the right shit to influence this consumer to buy my thing”. You are the target the AI is operating on. And with billions of interactions and tremendous training, it will find the optimal way to influence the masses to buy the thing.

[–] linearchaos@lemmy.world 10 points 1 month ago* (last edited 1 month ago)

AI is the new ad driven model.

Everything that AI touches ends up as machine learning content.

AI DJ? I now have your name, your email address, and every taste you have in music. As you use the app, I gain more insight into more music that you are or might be interested in.

That Roomba that's running around your house looking for socks and cables not to run over is also running image processing on everything in your house. They know how big your house is; they probably know how big your TV is.

They're not just farming your email and text messages to figure out what to sell you; they know at a core, intimate level what your interests are.

They're in for a rude awakening in a few years, though. All of this AI information gathering is a bubble. You have companies like anovo complaining that they can't afford to host a single website. All this AI training is not cheap, and the return on investment is not great after the initial plunge, right?

[–] andrew_bidlaw@sh.itjust.works 9 points 1 month ago (1 children)

It delights me to lie to myself that they are nervous someone somewhere would pick a golden ticket with their AI application and they'd miss out. But the more obvious explanation for big corpos is that they hide problematic data-mining, content appropriation, ad personalization and other stuff behind this curtain, maybe not for these crude tools alone, but to force a precedent into existence that they can do it whenever they like in the future. They make you give up your personal stonks for a shiny penny that is corporate LLM genies, and they probably pay a lot to showcase their beauty at a loss.

[–] Jocker@sh.itjust.works 8 points 1 month ago

AI normalising data collection

[–] TrippyHippyDan@lemmy.world 7 points 1 month ago (1 children)

It's all sunk-cost fallacy. They've dumped all of this money into it, so they feel they have to double down on it.

Especially if they're trying to get a bunch of money from Wall Street and other investors.

The biggest contributor being all of these companies believe they can now just lay off a bunch of workers and make up the difference with these LLMs even though they are not at all a replacement for humans.

Fewer workers means fewer people they have to pay, and more money can be funneled to the top.

[–] Anticorp@lemmy.world 7 points 1 month ago* (last edited 1 month ago)

I don't know if it's AI specifically or connected devices in general that people are starting to avoid. I don't want anything with the word "smart" in it. I don't want anything that requires an app, and I try to avoid things that require an account. I think people are finally fucking figuring out that all this shit is designed to spy on you for profit, and will probably eventually require a subscription if it doesn't already. It's a bullshit business model. Make a decent fucking product, sell it to people for a flat fee, and they'll be your customers for life. Stop trying to be a data company or a subscription company.

As for the second part of your question, companies are pushing it so hard because it's like having a money printing press. They can turn a few months worth of work into an endless stream of money.

[–] Don_alForno@feddit.org 6 points 1 month ago

If my workplace is in any way representative, it's because decisions are made by close to retirement out of touch old geezers who want to virtue signal very hard that they are not out of touch old geezers. So they push the "new thing" for lack of any actually innovative ideas of their own. Then, when the younger team members who do have some rough knowledge of the "new thing" try to explain why it might be a bad idea, they call them afraid of progress and double down on the "new thing" even harder.

[–] angelmountain@feddit.nl 6 points 1 month ago

Just dumb. In the case of the company i work for at least.

[–] homesweethomeMrL@lemmy.world 6 points 1 month ago

They think they’ll get money.

Is why.

Unless a hyped-up investor gives it to them, they won’t.

[–] hanabatake@lemmy.ml 5 points 1 month ago (1 children)

Two main reasons:

Attracting investors

Attracting talented workers by signaling they are doing technical research

Also, people working in the industry might not even use those products. They want a cool job, not a cool product.

[–] HurlingDurling@lemmy.world 10 points 1 month ago

As a tech worker, it's more towards attracting investors.

[–] Grofit@lemmy.world 4 points 1 month ago

AI has some useful applications; it's just that most of them are a bit niche and/or have ethical issues, so while it's worth having the tools and functionality to do things, no one can do much with them.

Like, for example, we pretty much have AIs that could generate really good audiobooks using your favourite actor's voice likeness, but it's a legal nightmare, and audiobooks are a niche already.

In game development, being able to use AI for texture generation, rigging, and animation is pretty good and can save lots of time, but it comes at the cost of jobs.

Some useful applications for end users are things like noise removal and dynamic audio enhancement AIs, which can make your mic not sound like you're talking from a tunnel under a motorway when in meetings, or basic voice activation of certain tools, or even spam filtering.

The whole using-AI-to-sidestep-being-creative thing, or pretending to collate knowledge in any meaningful way, is a bit out of reach at the moment. Don't get me wrong, it has a good go at it, but it's not actually intelligent; it's just throwing out lots of nonsense and hoping for the best.
