It's not uncommon to see misinformation or outright fabricated information appear on many SNS platforms, including Facebook and Twitter. It's also not unheard of for Russia to use popular US-based platforms to influence elections. All SNS are subject to the same problem; TikTok just has more active users and therefore a wider reach. ~~But again, this is a content moderation problem, not an inherent fault of TikTok itself.~~ Who should perform content moderation is a business decision. It shouldn't be dictated by law, though lawmakers can set moderation standards that companies have to comply with. I think it's a bit unfair to target only TikTok; any rules should apply universally.
EDIT:
Hasn't TikTok already opened access to its algorithm for review?
Actually, it is not solely a content moderation problem. While some dumb and physically harmful content should be subject to moderation, speech should be protected. Isn't America all about the word "Freedom"? People should be free to say what they believe, right?
However, the recommendation algorithms might need some regulation that categorizes content and applies relevant display policies. For example, political content, whether user-generated or paid advertising, should be distributed equally across viewpoints (i.e. a user sees content for every candidate for roughly the same amount of time). The "addictive" part shouldn't be regulated, since that is the whole point of the algorithm: maximizing user engagement. However, there could be a rating system, similar to game ratings, that determines who can use which platform at what age. Otherwise, people should be free to get addicted to something, as long as it doesn't cause physical harm to themselves or others.
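To make the "roughly equal exposure" idea concrete, here's a toy Python sketch. Everything in it (the `balance_political_feed` name, the `candidate` field, the round-robin re-ranking) is my own illustration of what such a display policy could look like, not how TikTok or any real platform actually works:

```python
import itertools
from collections import defaultdict

def balance_political_feed(posts, key=lambda p: p["candidate"]):
    """Interleave political posts round-robin by candidate so that,
    as a user scrolls, every candidate gets roughly equal exposure.
    `posts` is assumed to be the already-ranked political slice of a feed."""
    buckets = defaultdict(list)
    for post in posts:  # group by candidate, preserving the original ranking
        buckets[key(post)].append(post)

    balanced = []
    # zip_longest walks the buckets in lockstep: one post per candidate per round
    for round_of_posts in itertools.zip_longest(*buckets.values()):
        balanced.extend(p for p in round_of_posts if p is not None)
    return balanced

# Example: a ranking skewed toward candidate A becomes alternating at the top
feed = [
    {"id": 1, "candidate": "A"},
    {"id": 2, "candidate": "A"},
    {"id": 3, "candidate": "A"},
    {"id": 4, "candidate": "B"},
]
print([p["candidate"] for p in balance_political_feed(feed)])
# -> ['A', 'B', 'A', 'A']
```

A platform would presumably apply something like this only to the slice of the feed it has already classified as political, and leave the rest of the recommendation ranking untouched.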