Social media platforms may soon need to acquiesce to the demands of Congress and upset parents.
The prospect comes after the U.S. Senate Judiciary Committee called for testimony by Snap, X, TikTok, Discord, and Meta.
As their executives responded to angry questioning, parents stood silently behind them, holding up photographs of children whose deaths they link to the platforms.
“You have blood on your hands,” ranking member Lindsey Graham (R-S.C.) told Meta CEO Mark Zuckerberg as the hearing began. “You have a product that’s killing people.”
Household stress was a major reason why many children developed problematic media use during the height of the pandemic, while household screen rules had little effect on media usage, according to new research.
Emily Kroshus had three children under age 6 during the pandemic lockdowns. She remembers how she coped with online work meetings: “I would turn to screens.” Not to co-view and discuss content with her children. “It’s more like: ‘please, can I hypnotize you for an hour?’”
“I’m not proud of that,” the child behavioral researcher recalls. “And I don’t think I’m alone.”
Curiosity prompted Kroshus and her lab at Seattle Children’s Research Institute to survey other parents in the fall of 2020. A diverse group of 1,000 American families responded, each with at least one child between the ages of 6 and 17.
According to the survey results published in the journal Pediatrics, one in three children displayed “problematic media use,” which Kroshus describes as “the child is unwilling or unable to stop using media.”
About one in three households had rules around media use, such as keeping devices out of the bedroom at night and not bringing screens to meals. But did those rules prevent problematic use of phones, laptops and computers? Continue reading →
Robin thought she was “being Super Mom” as she made nice dinners and tidied her midwestern U.S. home, her toddler son quietly sitting nearby watching made-for-babies TV. She didn’t know that, with so much viewing, he was developing the newly described condition termed “Virtual Autism.”
Took a While to Realize
For weeks, Robin rationalized the changes she saw, but finally had to admit something was wrong. Her formerly happy, lively 14-month-old had stopped making eye contact, no longer said words, and began to display hand-flapping, spinning and other autistic-like symptoms.
“The big one was,” she recalls, “he had stopped answering to his name.” Continue reading →
With a new frame of mind, designers can create humane technology. Former Google tech ethicist Tristan Harris wants to teach them how.
“This talk is about the wisdom we need to steer technology, and our future.” The words from his new message shone brightly from the screen at the 2022 mindfulness in technology conference, Wisdom 2.0.
Harris was back at the place where in 2015, he pulled back the curtain on how tech companies used “persuasive design.” They were in “a race to the bottom of our brainstems to seduce our instincts.”
Their products did not support human well-being, he claimed. “It’s like being on a diet, but you are only handed menus with burgers and fries.”
Harris believes the founders’ intentions were badly misguided when they started Google, Facebook, and other platforms. He should know, having trained in the Stanford University Persuasive Technology Lab.
Since tech products were free to use, users’ personal data became fair game, and companies made unprecedented sums selling and re-selling it. Individuals were hyper-targeted under the guise of “giving users what they want.”
Silicon Valley founders saw tech as a neutral vessel. That users became trapped in polarized filter bubbles was not the platforms’ problem.
The result today: the loudest and meanest social media opinions seem to be the majority. As Harris observes, “we start to believe the extreme voices and stereotypes represent the world.”
Besides political turmoil, he blames early Silicon Valley attitudes for creating problems ranging from information overload and addiction to synthetic charlatans such as bots and deepfakes.
Over the years, it became standard practice to use psychological sleight of hand to keep users engaged.
Children have been especially affected. Since the dawn of social media, youth mental health has significantly eroded.