this post was submitted on 19 Dec 2023
1300 points (95.0% liked)

Technology

(page 2) 50 comments
[–] nicetriangle@kbin.social 16 points 10 months ago

It is incredibly cheap and easy to artificially bump a post to the top of a decent-sized subreddit. I've seen it done before, and the cost per impression/click puts most advertising to shame. And that was being done unsophisticatedly by some dude with a cheap bot. Now imagine what major corporations can do with all the resources they have to burn.

[–] RedditEnjoyer@lemmy.world 15 points 10 months ago (1 children)
[–] m13@lemmy.world 12 points 10 months ago (4 children)

Exactly. And on major subreddits it would be much higher. Worldnews at the moment just feels like IDF posting pro-genocide content, commenting, upvoting and agreeing with each other.

Reddit goes in the bin. 🚮

[–] SuddenDownpour@sh.itjust.works 9 points 10 months ago

The thing with r/worldnews isn't only bots, it's also that the mods are trigger-happy when banning people for making unabashed criticisms of Israel and zionism. Keep that attitude for long enough and you'll end up with an echo chamber anywhere.

[–] chitak166@lemmy.world 15 points 10 months ago

Ya think? I noticed when all the top comments on /r/worldnews were the exact same thing, just said in slightly different ways.

It's a science at this point.

[–] _xDEADBEEF@lemm.ee 14 points 10 months ago

Isn't that just astroturfing, and haven't they been doing it there forever?

[–] Jarmer@slrpnk.net 13 points 10 months ago (1 children)

I was really surprised recently when I was searching for help with a mod for a video game and a result popped up on my DuckDuckGo search page for a Reddit thread about it. So I clicked it and BAM: "error, this subreddit has not been reviewed, so it is not possible to view it. Either use the app or go to the home page"... wtf? I mean, this basically destroys the entire site, right? I was 100% unable to view whatever content had been posted in that subreddit, so I just closed it and went somewhere else. I don't see how Reddit can even continue to exist if they don't allow people to view the site. How did this happen?

[–] numberfour002@lemmy.world 13 points 10 months ago (1 children)

There's a theory that certain email scams are so obvious and easy to spot because that acts as a self-selection mechanism. A person who sees the obvious scam and immediately recognizes it as such was probably never going to fall for it. The ones who respond in spite of all the signs tend to be easier or more lucrative targets.

I could see forcing people to download an app just to see the content as operating on a similar (but not 100% analogous) principle. The type of person who willingly installs the app to see the content (without knowing beforehand whether it was worthwhile/relevant) may be exactly the type of person they prefer to join their site. Perhaps they are easier targets for marketing, less likely to understand or complain about the ramifications of user-adverse changes to the site, care less about privacy, etc., and that makes them more lucrative?

[–] GutsBerserk@lemmy.world 12 points 10 months ago (1 children)

Only 15%? More like 99%! The most recent Gaza genocide was truly an eye-opener for me.

[–] dual_sport_dork@lemmy.world 11 points 10 months ago (1 children)

The percentage is that low?

[–] theodewere@kbin.social 9 points 10 months ago* (last edited 10 months ago)

The impact those accounts have is much higher than a normal 15% slice of the comments would suggest. What they produce is generally non-random, so it's all going toward whatever set of ideas they need to bombard with bullshit. They intentionally shut down and/or control discussion.

[–] NounsAndWords@lemmy.world 11 points 10 months ago

Crazy thing I've been noticing more and more: when I search "[thing I want to know] reddit", there are always one or two comments in the top results from Reddit, usually much more recent than the others, very clearly shilling a product. Sometimes it's an edit added purely to plug a product the user supposedly just thinks is really great, which sends you to an affiliate-link-ridden site.

[–] fossilesque@mander.xyz 10 points 10 months ago (2 children)

I still have a few subreddits I passively maintain, and every three days on the most popular one I'm banning some new app someone is shilling to a vulnerable group. It's absolutely disgusting, and it makes me so incredibly angry/jaded how heavily they're targeted.

[–] CosmicCleric@lemmy.world 9 points 10 months ago (6 children)

From the article...

The study’s demographic analysis further highlighted the targeted nature of corporate trolling. Younger users, particularly those aged 18–29, were significantly more likely to be contacted by corporate trolls, with 17% of them reporting such experiences, compared to only 7% of users aged 65 and over. This age-based discrepancy underscores the strategic approach of corporate trolls in engaging with a demographic that is often more susceptible to their influence.

Wow. Corporations are tagging younger generations as dumb shits. That is not cool.

[–] LemmyIsFantastic@lemmy.world 9 points 10 months ago (1 children)

The study found that 11% of the respondents had been contacted by a bot or troll attempting to promote a product or service. Even more concerning was the discovery that 13% of the respondents had witnessed a company manipulate public opinion on the platform.

Self-reported garbage. Asking users to self-identify manipulation is ripe for abuse.

[–] CosmicCleric@lemmy.world 9 points 10 months ago

That number has got to be higher than 15%. Everywhere.

[–] Kushia@lemmy.ml 7 points 10 months ago (4 children)

This is paywalled, can you please post the text?
