

The minimum would be algorithmic transparency. If users could see exactly what a social media algorithm is doing with their content feed, they would always have a way to identify and escape dark patterns of addiction.
But even this minimum would require the power to compel tech companies to give up what they would describe as intellectual property, which would probably require something like a digital bill of rights.
The most practical option would be to just ask your kids directly about the kinds of content they've been consuming, and why. Dinner-table conversations can probably reveal those dark patterns just as well.






Because if you don't see it, it disappears! Like magic!
The angry chud whose economic stability has been shattered by a neoliberal race to the bottom, the chud who can't afford to live in the same town he grew up in, who has been directed by the corporate media to view every moral or class grievance in purely racial terms. When that same chud dares to express any anger or resentment or hostility toward his cultural out-group, at least YOU will be there! To wag your finger in his face and moralise to him about how people's feelings are important, even as you ignore his.
Censoring someone doesn't change their mind; if anything, it hardens their belief and encourages them to spread it elsewhere.