Silicon Valley needs a new approach to studying ethics now more than ever

Lisa Wehden
4 min read · Apr 20, 2021


Next month, Apple and Google will unveil features to enable contact tracing on iOS and Android to identify people who have had contact with someone who tests positive for the novel coronavirus.

Security experts have been quick to point out the possible dangers, including privacy risks, such as revealing the identities of COVID-19-positive users or helping advertisers track them, as well as the possibility of false positives from trolls.

These are fresh concerns in familiar debates about tech’s ethics. How should technologists think about the trade-off between the immediate need for public health surveillance and individual privacy? And between misinformation and free speech? Facebook and other platforms are playing a more active role than ever in assessing the quality of information: prominently promoting official sources and removing some posts from users who defy social distancing.

As the pandemic spreads and, along with it, the race to develop new technologies accelerates, it’s more critical than ever that the technology industry finds a way to fully examine these questions. Technologists today are ill-equipped for this challenge: striking healthy balances between competing concerns, like privacy and safety, while explaining their approach to the public.

Over the past few years, academics have worked to give students ways to address the ethical dilemmas technology raises. Last year, Stanford announced a new (and now popular) undergraduate course on “Ethics, Public Policy, and Technological Change,” taught by faculty from philosophy, as well as political and computer science. Harvard, MIT, UT Austin and others teach similar courses.

If the only students are future technologists, though, solutions will lag. If we want a more ethically knowledgeable tech industry today, we need ethical study for tech practitioners, not just university students.

To broaden this teaching to tech practitioners, our venture fund, Bloomberg Beta, agreed to host the same Stanford faculty for an experiment. Based on their undergraduate course, could we design an educational experience for senior people who work across the tech sector? We adapted the content (incorporating real-world dilemmas), structure and location of the class, creating a six-week evening course in San Francisco. A week after announcing the course, we received twice as many applications as we could accommodate.

We selected a group of students, diverse in every way we could manage, all of whom hold positions of responsibility in tech. They told us that when they faced an ethical dilemma at work, they lacked a community to turn to: some confided in friends or family, others admitted they looked up answers on the internet. Many felt afraid to speak freely within their companies. Despite several company-led ethics initiatives, including worthwhile efforts to appoint chief ethics officers and Microsoft’s and IBM’s principles for ethical AI, the students in our class told us they had no space for open and honest conversations about tech’s behavior.


Like undergraduates, our students wanted to learn from both academics and industry leaders. Each week featured experts like Marietje Schaake, former Member of the European Parliament from the Netherlands, who debated real issues, from data privacy to political advertising. The professors facilitated the discussions, encouraging our students to explore multiple, often opposing, views with our expert guests.

Over half of the class came from a STEM background and had received little explicit education in ethical frameworks. Our class discussed principles from other fields, like medical ethics, including the physician’s guiding maxim (“first, do no harm”) in the context of designing new algorithms. Texts from science fiction, like “The Ones Who Walk Away from Omelas” by Ursula K. Le Guin, also offered ways to grapple with these issues, leading students to evaluate how to collect and use data responsibly.

The values-based questions we explored (such as the trade-off between curbing misinformation and protecting free speech) didn’t converge on clear “right” or “wrong” answers. Instead, participants told us that the discussions were crucial for developing the skills to check their own biases and make more informed decisions. One student said:

After walking through a series of questions, thought experiments or discussion topics with the professors, and thinking deeply about each of the subtending issues, I often ended up with the opposite positions to what I initially believed.

When shelter-in-place meant the class could no longer meet, participants reached out within a week to request virtual sessions — craving a forum to discuss real-time events with their peers in a structured environment. After our first virtual session examining how government, tech and individuals have responded to COVID-19, one participant remarked: “There feels like so much more good conversation to come on the questions, what can we do, what should we do, what must we do?”

Tech professionals seem to want ways to engage with ethical learning; the task now is to provide more opportunities. We plan to host another course this year and are exploring ways to offer an online version and publish the materials.

COVID-19 won’t be the last crisis in which we rely on technology for solutions and need them immediately. If we want more informed discussions about tech’s behavior, and we want the people who make choices to enter these crises prepared to think ethically, we need to start that training now with the people who already work in tech.

To allow students to explore opposing, uncomfortable viewpoints and share their personal experiences, class discussions were confidential. I’ve received explicit permission to share any insights from students here.

Originally published at https://techcrunch.com.
