As Views Of Tech Turn Negative, Remorse Comes To Silicon Valley

Things like smartphone addiction, false stories and election interference leave some tech executives regretful about what they've created.
Manan Vatsyayana / AFP/Getty Images

There are a lot of regrets coming out of Silicon Valley these days as the dark side of the tech revolution becomes increasingly apparent, from smartphone addiction to the big scandal involving the misuse of personal information from some 87 million Facebook users.

Facebook Chief Operating Officer Sheryl Sandberg expressed her regrets in an interview last week with NPR. "We know that we did not do enough to protect people's data," Sandberg said. "I'm really sorry for that. Mark [Zuckerberg] is really sorry for that, and what we're doing now is taking really firm action."

Facebook CEO Mark Zuckerberg will have a chance to express his regrets when he testifies in front of Congress later this week.

But the remorse coming out of Silicon Valley isn't just from high-profile leaders like Sandberg and Zuckerberg. Investors and people who worked to build some of the problematic technologies are also taking the blame; some are even turning their attention to fixing the problems.

When Sandy Parakilas went to work for Facebook in 2011, he says, he deeply believed in its mission of bringing the world closer together and building community. At the time, the Arab Spring was in full bloom and social media companies were getting credit for helping to launch a revolution.

"I was extremely excited about the power of social media to advance democracy all over the world," Parakilas says.

But his optimism would be tempered by the reality of Facebook's hunger for raw data about its users. He didn't like the direction it was going.

"They have a business model that is going to push them continuously down a road of deceiving people," he says. "It's a surveillance advertising business model."

Parakilas says he tried to warn his managers at Facebook that they were at risk of putting private information into the wrong hands. But the company was growing fast and making money. Its leaders believed connecting people was inherently good.

Many of its earliest investors believed in its mission, too. But now Roger McNamee, who helped mentor Zuckerberg, says he feels bad about what has happened, "because at the end of the day these were my friends. I helped them be successful. I wanted them to be successful."

As part of his penance, McNamee helped found the Center for Humane Technology. The center is trying to "realign technology with humanity's best interests." Parakilas has also joined the effort as an adviser.

While Facebook may be in the headlines now, there is plenty of regret going around Silicon Valley from people who were part of other companies.

Guillaume Chaslot joined Google, YouTube's parent company, in 2010. He, too, started as a true believer. "We could only make things better if people were more connected," he says. "If everybody could say what he wanted to say, things would naturally get better."

But Chaslot says he noticed the main goal at YouTube wasn't to inform people; it was to keep people watching videos for as long as possible. "This goal has some very bad side effects and I started to notice the side effect as I worked at YouTube," he says.

Among the side effects he noticed: People tended to get only one point of view on a topic — and not always the right one. For example, a search for "moon landing" might bring up videos from conspiracy theorists arguing that NASA faked the whole event.

Chaslot tried to create an algorithm that would show people different points of view. But, he says, his bosses weren't interested.

A YouTube spokesperson says the company has updated its algorithms since Chaslot left. According to the company, the goal is no longer just to keep people on the site for as long as possible; instead, it uses surveys to measure how satisfied users are with the time they spend there.

Chaslot left in 2013. But he continued to lose sleep over what was happening on YouTube. From the outside, he watched the site fill up with conspiracy theories and divisive content. He privately met with former colleagues and tried to warn them. But nothing began to change until after the 2016 presidential election, when news of Russian interference brought more attention to the kinds of videos circulating on YouTube.

Chaslot now says he wishes he'd gone public sooner. "Now that's what I'm doing but with a bit of a delay," he says. He even started a site to track what kinds of videos surface when you search terms like "Is the Earth flat or round?" and "vaccine facts." The results include plenty of factually incorrect conspiracy theories.

Of course, it may be easier for many techies to speak out now — investors have done well and employees were paid well for their work. Still, it's probably good news that the very people who helped create the problem are now using their inside knowledge to fix it.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Corrected: April 8, 2018 at 11:00 PM CDT
A previous version of the Web story identified Facebook Chief Operating Officer Sheryl Sandberg as the company's CEO.
Laura Sydell fell in love with the intimate storytelling qualities of radio, which combined her passion for theatre and writing with her addiction to news. Over her career she has covered politics, arts, media, religion, and entrepreneurship. Currently Sydell is the Digital Culture Correspondent for NPR's All Things Considered, Morning Edition, Weekend Edition, and NPR.org.