> I hate TikTok but it feels like banning an entire website is a serious slippery slope situation.

I truly hope this is a slippery slope and welcome similar focus on dismantling Facebook, X and other destructive platforms. Social media is the worst invention in human history (the societal equivalent of a nuclear bomb) and doesn't deserve to exist in its current form on Facebook, X and other major platforms.
> Telegram is not Chinese.

No, it is Russian. They are also considered an adversary.
> There are strong indications that Telegram is actually controlled by Russia.

Namely?
Facebook might benefit, but only if TikTok is not sold. If they sell to anyone, US based purchaser or not, then Facebook doesn't benefit.
Also, the idea that the general public knows what is best has been proven false time and time again. I suppose you could argue that people should be left to deal with their own consequences, but since we live in a society and not in isolation, that does not work. It is why there are rules around alcohol, gambling, etc. Maybe you would also consider it government overreach to require vehicles to contain seatbelts, or to bar further Huawei chips from the US or many EU countries, but then you would likely be in the minority, as most people prefer the safety these rules provide.
> Namely?

Perhaps. But he is alive, while every other former inner-circle figure who presents any real opposition to the regime has been poisoned, thrown out of a tall building, sent to a Siberian camp or neutralized in other ways. Makes you wonder why he’s the exception. 🤔
The fact that Durov is Russian isn’t. Given his previous ownership of a Russian social media business (VKontakte) and how he was forced out of it, he doesn’t look like an ally or agent of the Russian government.
> Let’s face it. The US just isn’t happy that there is another big player on the market that isn’t American. Where is the actual PROOF that they are doing anything wrong? So far I have only read assumptions because … BOOOO China.
> And no, I don’t even use TikTok and don’t like the government of China either, but this seems like a „business decision“ more than anything, just like the whole Huawei ordeal. „Gotta slow em down cuz we can’t have China be ahead in the whole phone bizz.“

I’m Canadian and I agree with it.
> I kept hearing all this hate and how TikTok is poison until I finally downloaded it... my algorithm has gotten me videos to catch up with the NBA highlights, DIY help for home projects, comedy and the news updates. Also, they added a STEM section for people to focus on that. It has been by far more beneficial than my IG or FB account.

Short-form video is brain rot and kills your attention span regardless of the content.
> TikTok's problem isn't the app itself or the algorithm.
> It's the vast amount of non-related information it sends back via the app.
> Why does a video reels app need all the data it collects?

TikTok’s problem isn’t the data collection.
> Doesn’t that kind of say it all right there?

You were responding to this: "China would not approve the divestiture of TikTok and its algorithm." I wonder: what if China told Meta that Facebook could operate in China if Meta sold Facebook to ByteDance, and Meta agreed to do so? Imagine the storm of hot criticism that would trigger in the U.S., from the government and the people.
> But Meta and X collect similar data and will sell it to anyone, including China, Russia, North Korea, Iraq, etc.

And if they were reluctant to sell it directly, I imagine China could act through other companies with superficially legitimate purposes for wanting that data.
> China already bans US-owned social media like Facebook and YouTube, so I don't see any difference with the US banning a Chinese-owned social media platform.

That's a good point... is there a difference, and is there supposed to be?
> That is a very slippery slope. What else is going to be considered a threat to national security? Unpopular dissenting opinion? Protests? Once the door is opened it is very difficult to close.

Yes. There was considerable concern about how much power the Department of Homeland Security got due to fears after the 9/11 attacks. I imagine the surveillance programs revealed by Edward Snowden appeared justified in the interests of national security. National security makes a very convenient bogeyman to justify doing sketchy things.
> The US doesn't have a First Amendment for foreign corporations.

Do we really want the government censoring what they're allowed to say to us?
> That’s why the law is strictly limited to TikTok. Every similar ban would require Congress to pass a new law, which is no easy task. So don’t worry.

It sets a precedent. Worry.
> You do realize that EVERY social media app/website that is owned by a US-based entity is banned in China, right?
> You are deriding the US for banning TikTok or Huawei, but China bans and/or seriously hamstrings many US companies from operating within their borders.
> I say that the US should allow TikTok once China allows Google, Facebook and Twitter (X) to operate in their country. I won't hold my breath for that, though.

I wish we could pressure them into allowing Google, Facebook and X into China, but as you know it wouldn't work.
> TikTok offers the Chinese government direct, unrestricted influence over millions of Americans.

To do what? Selectively serve up TikTok videos? Are we afraid TikTok will start sending users a stream of videos saying President Xi is the greatest thing since sliced bread and Communism rocks?
> This is a good start. Now we just need to ban Twitter, Instagram and Facebook. Maybe there's a chance we can still save what's left of our society.

Ironic that this discussion is in the context of our feeling threatened by a totalitarian state that paternalistically controls what people are allowed to access and use, ostensibly for their 'own good' and the good of the nation (according to the ruling party).
> Also, now China can slightly tweak your algorithm and pump content to benefit their national interests.

How many people predominantly get their news from TikTok, and would it be so hard to alert the public and issue counter arguments if it became an issue?
> TikTok’s problem isn’t the data collection.
> It’s not as if Meta or Google are any different or better in that regard.
> It’s simply that TikTok aren’t controlled by the U.S.

I agree that the TikTok problem is who controls the platform (the truth of what we’re observing). However, control is only a problem because of the data being collected (the truth behind the truth). The real problem is the data being collected and used in an essentially unregulated manner in the first place. No commercial, unregulated entity (foreign or domestic) should have this much impact on an individual’s or a society’s well-being or general social cohesion.
> The real problem is the data being collected and used in an essentially unregulated manner in the first place. No commercial, unregulated entity (foreign or domestic) should have this much impact on an individual’s or a society’s well-being or general social cohesion.

'Data' is pretty vague, yet often cited. What are the specific concerns people have? I often see mention of 'personal data,' but that seems mainly used to inform Amazon what to tempt me with via e-mails or when I log on, and to 'cross advertise' so when I go to websites, the ad banners show things that suggest the system recognizes some of what I've shopped for before. In other words, it's about targeted ads replacing random ads.
> 'Data' is pretty vague, yet often cited. What are the specific concerns people have? I often see mention of 'personal data,' but that seems mainly used to inform Amazon what to tempt me with via e-mails or when I log on, and to 'cross advertise' so when I go to websites, the ad banners show things that suggest the system recognizes some of what I've shopped for before. In other words, it's about targeted ads replacing random ads.
> It's my understanding that a lot of this data collection gets into aggregate data, such as what % of platform users chose to view this story, or upvoted it, or spend x amount of time watching videos vs. y amount of time sharing which memes, etc. It's not Facebook selling Walmart my Social Security number or trying to hack into my bank account.
> The profits off this data collection, along with ads, help keep some services like Facebook effectively 'free.'
> They can identify people who, by their usage patterns, might be more responsive to appeals for a product or cause and selectively reach out to them, yes. In the U.S. we have a whole profession dedicated to that very thing... it's called Marketing.
> What specifically is the moral objection to companies collecting, using and monetizing aggregate data? This isn't new... social media companies are simply good at it.
> I'm not asking about dislike; it can be creepy when you shop Site A and later browse Site B where the banners show things you looked at on Site A. Creepy, yes, but not clearly wrong.

This perspective whitewashes a problem that is obvious. When the issue is reduced to the specific question asked, the answer is 'Everything is fine. This data collecting is actually helpful in many ways.'
> Short-form video is brain rot and kills your attention span regardless of the content.

Different argument, and it has nothing to do with the Supreme Court appearance; brain rot is the least of the government's cares or worries...
Also, now China can slightly tweak your algorithm and pump content to benefit their national interests. You and millions more people.
> Yet, here we are: industrial scale data harvesting and weaponized A/B testing being applied to the online choices of everyone including kids

Minors are a legitimate and separate issue, since individual informed decisional consent is not presumed.

> the results of this data harvesting and personal profiling being used to target individuals and bespoke groups to coerce them into believing and acting

I don't think they're in a position to coerce anyone into doing anything. To the extent there is free speech, and ready online access to myriad resources, people aren't coerced.

> on ideas that are often against their own interests

Maybe the 'their' people should get to decide about that?

> disinformation and misinformation being used to tear families and communities apart.

Maybe it should be up to consumers to decide what to consume?

> the real issue which is unregulated personal data collection and use with zero accountability for the harm created or enabled in the process.

Aside from minors, what should be regulated? They shouldn't be giving away our social security numbers and credit card info. What else?
> China tweaking the algorithm is a ridiculous argument that keeps getting thrown around without any basis.

We are interrupting your cute cat video to bring you important late breaking news... Taiwan sucks! It's not a country!
> Yeah this is a major reach comparing alcohol and wearing a seatbelt to an app on your phone. And even if you wanted to compare them, the Govt doesn't completely ban alcohol or force you to wear a seatbelt. They put rules around them, but the person driving can still decide if he wants to wear the seatbelt and the person buying alcohol can drink as much as they want.

First, to address something you said: unless you live in New Hampshire, it is the LAW to wear your seatbelt. A driver can't choose not to wear it without also choosing to break the law. So it's strange to call it a choice. Technically yes, but practically no.
> Minors are a legitimate and separate issue, since individual informed decisional consent is not presumed.

A/B testing for marketing messaging and A/B testing for personal profile building are two completely different things; if the distinction is unclear, some homework is in order, but here's the Cliffs Notes version: A/B testing in personal profiling allows the identification of conditions under which a person is most likely to take a prescribed action. This information can be used to engineer individualized contexts for information presentation to drive desired behavior. Actions taken under these conditions are the result of coercion, not free will.
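To make that more concrete, here is a rough, hypothetical sketch of the kind of per-user bookkeeping being described: log which presentation context each person responds to, then serve whichever context that particular person has historically been most responsive to. The context labels and numbers are invented for illustration; this is not any platform's actual code.

```python
# Hypothetical sketch of "A/B testing for personal profile building" as described above:
# track per-user response rates by presentation context, then exploit the best one.
import random
from collections import defaultdict

shown = defaultdict(lambda: defaultdict(int))  # shown[user][context] -> times shown
acted = defaultdict(lambda: defaultdict(int))  # acted[user][context] -> times the action was taken

def record(user, context, took_action):
    """Log one exposure and whether the user took the prescribed action."""
    shown[user][context] += 1
    if took_action:
        acted[user][context] += 1

def best_context(user, contexts):
    """Return the context this user has historically been most responsive to,
    trying any context not yet shown to them (a crude explore/exploit split)."""
    untried = [c for c in contexts if shown[user][c] == 0]
    if untried:
        return random.choice(untried)
    return max(contexts, key=lambda c: acted[user][c] / shown[user][c])

# Invented example: two ways of framing the same item to one user.
contexts = ["late_night_emotional_framing", "daytime_neutral_framing"]
record("user_42", "late_night_emotional_framing", True)
record("user_42", "daytime_neutral_framing", False)
print(best_context("user_42", contexts))  # -> late_night_emotional_framing
```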
Hadn't heard of A/B testing. A quick grab off Oracle.com: "A/B testing—also called split testing or bucket testing—compares the performance of two versions of content to see which one appeals more to visitors/viewers. It tests a control (A) version against a variant (B) version to measure which one is most successful based on your key metrics."
That's... common in marketing. Years ago I saw a video where people were shown photos or videos of two grocery-store multi-product displays for a time, and then I think quizzed on which had them remember a specific product better. Taking marketing to improve selling a product or idea and calling it 'weaponized' sounds like a bit of a stretch.
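To put that Oracle.com definition in concrete terms, here is a minimal split-test sketch in the plain marketing sense: a control (A) and a variant (B) compared on one key metric. The visitor and conversion counts are made up purely for illustration, not real data from any platform.

```python
# Minimal split test: compare a control (A) against a variant (B) on one metric.
# All numbers are invented for illustration.

def conversion_rate(conversions, visitors):
    return conversions / visitors

control = {"visitors": 1000, "conversions": 50}   # version A
variant = {"visitors": 1000, "conversions": 65}   # version B

rate_a = conversion_rate(control["conversions"], control["visitors"])
rate_b = conversion_rate(variant["conversions"], variant["visitors"])

print(f"A: {rate_a:.1%}   B: {rate_b:.1%}")
if rate_b > rate_a:
    print("B appeals more on this metric; a real test would also check statistical significance.")
else:
    print("A holds up; keep the control.")
```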
I don't think they're in a position to coerce anyone into doing anything. To the extent there is free speech, and ready online access to myriad resources, people aren't coerced.
To grab an example from the recent past, many were frustrated with the seemingly intractable stance of some of the more diehard COVID-19 anti-vax people. But they weren't forced to believe as they did.
As for 'bespoke groups,' I think in marketing they call those demographics.
Maybe the 'their' people should get to decide about that?
There's a disturbing trend in U.S. culture to believe it's better for paternalistic media managers to decide which messages ought to get out, rather than to let free discussion reign and people decide for themselves. (If you want evidence, sign up for a free Economist account online and read the article "When the New York Times Lost Its Way": a liberal editor got ousted for letting a conservative senator publish a piece with an unpopular, to liberals, point of view.)
It's like when I read some complaining about the conservative bent in rural areas and stating those people 'vote against their own interests.' But when an affluent urban leftist votes for those who'll likely raise his taxes, they don't say that...they'd say he's voting his conscience and his values.
Maybe it should be up to consumers to decide what to consume?
Again, if people want to learn the truth, it's usually out there waiting to be found. On some issues, like climate change, there's enough legitimate debate on some specifics that it takes more review and thought, but eventually the mounting evidence should direct the legitimate truth seeker.
As for things tearing families and communities apart, I don't think Social Media is the cause of that, just a reflection of our culture. A 2008 book The Big Sort delves into the progressive polarization and self-segregation of Americans into ideologically concentrated (and exclusion-prone) areas, and factors contributing to that. I'd heard of the ideological echo chambers in social media derived from friends lists and algorithms, but that book showed it goes way beyond that.
Is there a real world example of TikTok videos tearing a neighborhood apart?
Some families seemingly fly apart at the drop of a hat, like finding out somebody voted for the other side's guy.
Aside from minors, what should be regulated? They shouldn't be giving away our social security numbers and credit card info. What else?
Accountability judged how? This is a big issue. In the U.S., vendors have large latitude to offer products to the public, which the public is free to partake of or not. As long as the product is not defective and it's evident what it is, we don't tend to blame the vendor for bad outcomes.
It's not Toyota's fault if I run over somebody.
It's not McDonald's (or Sonic's, or Chick-fil-A's, or Little Caesar's) fault I'm fat.
It's not La-Z-Boy's fault my big leather recliner's too comfy, encouraging me to be sedentary with health risks.
It's not FaceBook's fault if I spend too much time scrolling my feed, Sony's fault if some people spend too much time playing video games on PlayStations, Netflix's fault if I waste time binge watching junk, etc...
FaceBook, TikTok, X and others are not obligated to only put out a product if it appears to be a net benefit to society according to your (or my) personal values, anymore than McDonalds or Burger King are.
All these platforms and vendors are voluntary. Nobody ever forced me to eat a Big Mac or log onto FaceBook.
So far, it sounds like TikTok does nothing objectionable that FaceBook doesn't, and the only concern is that China could have them introduce content with pro-China propaganda (which they could do on FaceBook anyway, simply by having people post content to it), and in theory the Chinese government could get broad aggregate data on usage patterns (e.g.: what trends)...which shouldn't be that hard to come by for an entity with the resources of the Chinese government anyway.
P.S.: It amazes me the people on this thread who'd like to prohibit the mainstream social media platforms, basically preventing anyone from having the choice to use them. Big Brother much?