staircase@programming.dev to Technology@lemmy.world · English · edited 1 month ago
Meta found liable in child exploitation case (www.theguardian.com)
159 comments
MyMindIsLikeAnOcean@piefed.world · English · 1 month ago
If social media companies were required to moderate their content, if they were responsible for what's posted, these problems would go away.
As it stands, bad actors use bots to stay one step ahead of automated moderation.