Dozens of leading apps accused of putting children in danger | Financial Times
The companies that 5Rights analysed range from the well-known, such as TikTok, Snap, Twitter and Instagram, to lesser-known platforms such as Omegle, Monkey and Kik. The alleged violations include design tricks and nudges that encourage children to share their locations or receive personalised advertising; data-driven features that serve harmful material, including content on eating disorders, self-harm and suicide; and insufficient age assurance before allowing inappropriate actions such as video-chatting with strangers.
These harms affect adults too.
There are numerous guidelines, congressional hearings, increased media scrutiny, “app-store rules”, parental guidance and so on. Yet things don’t change. Data leaks, privacy breaches and user subversion are common, even in countries where privacy laws are “strict”.
The root causes of these problems are inconsistent coding, indiscriminate incorporation of third-party software development kits, sketchy platforms and viral networks. With immense competition to game metrics and capture user attention, there is little accountability.
As hospitals transition towards a marketplace model for applications and services, they need to be aware of these issues.
Dark patterns are even more ominous:
Subtle ways of nudging children to give up their privacy contravene the Children’s Code. 5Rights’ investigation found that the video-chat social-media app Monkey uses pop-up memes to encourage users to give the app access to their location, which is then used to match users with others in their area, including unknown adults. The app did not respond to a request for comment.
No one responds to requests for comment.