this post was submitted on 17 Aug 2023
316 points (94.6% liked)
Technology
...then they ought to ban every app with user-generated content. YouTube might be the biggest propaganda vector we have right now, and the powers that be would not be able to tell which nation is pumping out anti-US videos, because it's so easy to lie about stuff like that.
That's quite a bad response if that's really the reason. If they're worried about propaganda, the best defense against it is education on what propaganda is and how to spot it, not banning a single vector of it when so many others already exist.
This only supports my theory that this is red-scare nonsense (though please let me know if I misunderstood, I really do want to understand)
I think the propaganda fear isn’t necessarily due to user content, but rather the algorithm being set to intentionally push certain content/narratives that China wants pushed. Politicians don’t target YouTube because it’s not China setting the algorithm, it’s an American corporation and politicians like that. Personally I don’t have any real opinions on TikTok, although I wish all social media killed their algorithms that seem intentionally designed to push radical content (because clicks).
Algorithms push content to produce engagement and retention. If someone is getting radicalized by algorithms it’s because they are engaged by the radicalism.
My TikTok fyp (algorithmic content) is full of artists - mostly vocals and strings, but some brass too, along with Creepy Dave animal videos and a few British (and British-style humor) comedy sketches. It's quite enjoyable. I think the only political content I can remember is a couple of Jeff Jackson videos (NC, US House of Representatives), and that's because I looked him up after my wife forwarded me one of his online bits.
People who get radicalized are ripe for picking, not converted. And if we are concerned about SM, we need to outlaw most talking head TV and talk radio along with it.
The algorithm literally just shows you shit you like.
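The "shows you what you like" claim above can be sketched in code. This is a toy illustration, not TikTok's actual system: all names (`rank_feed`, the topic labels) are hypothetical. The point it demonstrates is that a feed ranked purely by past engagement amplifies whatever topics already hook you.

```python
# Toy sketch of an engagement-driven feed (hypothetical, for
# illustration only): candidates are ranked by how often the
# viewer has engaged with that topic before.
from collections import Counter

def rank_feed(candidates, watch_history):
    """Order candidate videos by past engagement with their topic."""
    topic_engagement = Counter(v["topic"] for v in watch_history)
    return sorted(candidates,
                  key=lambda v: topic_engagement[v["topic"]],
                  reverse=True)

history = [{"topic": "music"}, {"topic": "music"}, {"topic": "comedy"}]
feed = rank_feed(
    [{"id": 1, "topic": "politics"},
     {"id": 2, "topic": "music"},
     {"id": 3, "topic": "comedy"}],
    history,
)
# The music clip ranks first because music dominates the history;
# politics, with zero prior engagement, sinks to the bottom.
```

Under this sketch, radical content only rises for a viewer who already engages with it, which is the commenter's point.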
I argue the government is nefarious for using propaganda instead of the facts of the matter to sway public opinion. (This opinion extends to any government, which likely means every government because, in my general assumption, power holders seem to be unable to not lie to hold on to and amass more power.)
At this point, I can't really argue with your point, you're right, but I sure hate it regardless.
As for your edit, I don't think that really applies. It'd make sense if the issue was voting for someone from China vs someone from the US, but this is more like "who's allowed to spy on you? Our creepy guy who has a higher chance of doing something with the info they spied from you, or this other creepy offshore guy who is less likely to do anything with that info (but the offshore guys are creepier! And worse!!! Trust me, bro!!!!)"
In your example, obviously I'd rather the guy who is invested in the neighborhood to some degree so if it turns to shit, he's going down with it. In my example, I'd much rather the guy who's not.
Ah I can't really argue with that.
Republicans "trust me, bro"'d the US into shitty situations time and time again, and I'm slightly worried if we give them the ability to ban apps, they will not stop until the only shit left is Truth Social. They love their fucking echo chambers, as much as they used to rail against them.
I think that's why my kneejerk reaction is to distrust their reasoning, but I did get what I was looking for in another part of this thread:
If the CCP can monitor you indefinitely, and have enough man/AI power to pull it off, they could theoretically social engineer infrastructure attacks without putting themselves at almost any legal risk (blackmail is always illegal, but the methods used to get the blackmail material would be hidden in the T&Cs)
Unfortunately, all I hear from republicans is tiktok=China, China=bad without any of these sorts of details, which is why I approached this with such skepticism initially
I literally can not imagine what life in China must be like. Especially for someone of about my socioeconomic class. I may have seen American Factory a while ago, but in case I'm thinking of the wrong thing, I'll check it out again. Thanks for the rec!
(Also, in case it's not clear, I wasn't saying the US was anywhere near as bad as or better than China in regards to anything, just that I couldn't tell if this specific rattling was just republican red scare bs or if it actually had substance to it. Turns out, there's some substance to it, they're just not articulating a position well on it, imo)
It's not about the propaganda they can push but the data they can harvest with full access to everything your phone can provide. It's the difference between simple algorithms and full-on machine learning models targeting individuals. Combine that with actual malicious intent and you get some pretty interesting and terrifying possibilities in information warfare, and even kinetic warfare.
What are these interesting and terrifying possibilities in information and kinetic warfare?
That's specifically what I'm after. It sounds like boogeyman fear right now, but it sounds like you have something more concrete. What is it that you have in mind, exactly?
They could use unmitigated microphone access to develop an exact profile of what would get you to spy for them and betray the country, and to what extent. Location-based models built from minute GPS data could predict exactly where you are, to maximize attacks on infrastructure. 3D models of the inside of every federal building and US military base from direct camera access.
Combining models can create increasing impacts with automation and cyber capabilities.
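The location-profiling idea the commenter describes can be sketched with nothing more than counting. This is a hypothetical toy (the function name and data are invented for illustration): given timestamped location pings, a simple frequency model predicts where a person usually is at a given hour.

```python
# Hypothetical sketch: building a routine profile from coarse,
# timestamped GPS pings. Each ping is (hour_of_day, place); the
# "model" is just the most common place seen at each hour.
from collections import Counter, defaultdict

def routine_profile(pings):
    """Return the most frequently observed place for each hour."""
    by_hour = defaultdict(Counter)
    for hour, place in pings:
        by_hour[hour][place] += 1
    return {h: c.most_common(1)[0][0] for h, c in by_hour.items()}

pings = [(9, "office"), (9, "office"), (9, "cafe"),
         (22, "home"), (22, "home")]
profile = routine_profile(pings)
# profile now predicts "office" at hour 9 and "home" at hour 22.
```

Even this crude counting recovers a daily routine; a real adversary with years of fine-grained data could do far better, which is the commenter's concern.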
An individual example would be automatic ai generation of images of your family being threatened sent to you while you are at work. All they are asking you to do is go and flush the toilet.
Everyone does that at the same time and overwhelms the sewage system creating a manufactured infrastructure crisis in Arlington VA around the Pentagon. This multiplied by a hundred issues all at once as they invade Taiwan.
This is the type of attack that's only possible with direct access to literally all the data that the little sensor node that is your phone can provide over time. It doesn't include the types of profiling that they could do with your political inclinations to magnify any discontent with the world around you at the most inconvenient time for the government. It also doesn't include the possibility that you might be a decision maker for government contracts someday or a politician influencing policy towards China.
Ever done anything wrong and still had your phone powered on that could possibly be used against you? Are you sure? What if they have enough image data to make a video that looks just like you doing something against your loved ones that they also have profiles on?
Beautiful, this is what I was looking for.
I think, still, that the best defense against this type of thing is education. Education on what nefarious actors can do with data types X, Y, and Z, and education on how to not get owned by a foreign actor.
I still think an app ban is principally un-American, but this is the imagination part that I couldn't come up with that at least gives what I'm still calling republican hysteria a veneer of legitimacy.
Thank you for that!
Agree. I want to ban TikTok('s algorithms) too, but on grounds of algorithm addiction and the national security issues it causes. That also means Instagram Reels and YouTube Shorts.