Ethical Leadership of Emerging Technologies

Incorporating ethics into data practices requires an ongoing, daily investment in learning how to work with data well and responsibly—an effort beyond completing a finite set of tasks. Embedded data ethics relies on rigorous and repeatable organizational practices and a diverse and empowered team reflecting the range of expertise needed in tackling technological ethical challenges.

Shannon Vallor is the Baillie Gifford Professor in the Ethics of Data and Artificial Intelligence at the Edinburgh Futures Institute, The University of Edinburgh. She is the director of the Centre for Technomoral Futures.

The Critical Questions

As we think critically about emerging technologies, there are three key questions to consider. First, what problem are we trying to solve, and are we creating more problems along the way, such as abridging users’ privacy? Second, what are the expected outcomes of the technologies we are creating, and is there potential for harm to vulnerable groups? Third, whom does the technology help, and who might be left out of the decision-making process about the deployment of new technologies?

Jasmine McNealy is an Associate Professor in the College of Journalism and Communications at the University of Florida and Technology Associate Director of the Marion B. Brechner First Amendment Project. Her research focuses on privacy, online media, and communities.

Ethical leadership of new technologies starts from an understanding that technology design is not value-neutral. We need clear methodologies that translate our values into design in transparent and accountable ways. Responsible innovation depends on a diverse group of stakeholders representing different value perspectives and mechanisms that turn responsibility into an actionable value understood in a larger, systemic context.

Jeroen van den Hoven is a Professor of Ethics and Technology at the Delft University of Technology. He is the Editor in Chief of Ethics and Information Technology and the Scientific Director of the Delft Design for Values Institute. He was the founding scientific director of the 4TU.Centre for Ethics and Technology, which brings together the expertise of three philosophy departments in the Netherlands (Delft, Eindhoven, Twente) in the field of ethics of science, technology, and engineering.

Public interest technology is the application of technologies to promote and protect the public benefit. Achieving this requires incorporating values such as privacy, bias awareness, and inclusion and diversity into our design process. This points to a more general truth about technology development: it involves a series of choices, prioritizations, and de-prioritizations. With many possible choices during the technology design process comes great power and, thus, even greater responsibility. Public interest technology provides a framework for handling this power in ways that help us address global technological challenges ethically.

Afua Bruce is the founder of the ANB Advisory Group LLC, a consulting firm that specializes in supporting organizations that develop, implement, or fund responsible data and technology. She has a background in software engineering, data science, and artificial intelligence and has experience developing and deploying technology in and with communities. She is also an adjunct professor at Carnegie Mellon University and a frequent speaker on community-centered inclusive innovation. Her latest book, The Tech That Comes Next: How Changemakers, Philanthropists, and Technologists Can Build an Equitable World, describes how technology can advance equity.

An effective review board is built on “the three Ps”: people, program, and perspective. People on the board must represent a diverse range of possible stakeholders and have access to meaningful input mechanisms. Ethical review boards should rely on a clear and actionable program embedded in a larger culture of integrity within the organization. Lastly, boards gain perspective through external collaboration, which is especially crucial in the fast-moving field of new technologies.

Melissa Stapleton Barnes is Board Director and Retired Senior Vice President of Enterprise Risk Management and Chief Ethics and Compliance Officer at Eli Lilly and Company.

In a world where technology shapes our social, political, and ecological conditions, risks associated with emerging technologies can erode social trust. Responsible technology is a multidisciplinary field that responds to these challenges by working to align the development of new digital technologies with individual and societal values. Responsible technology strives to make the tech field more accessible and welcoming to people from diverse educational and career backgrounds, reflecting the wide range of social impacts new technologies have and the goal of directing their use toward broad societal benefit.

Karina Alexanyan is the Founder and CEO of Humanication.io and Director of Strategy for All Tech is Human.

An ethical approach to software development relies on five core elements: reliability, security, data protection, transparency, and accountability. Companies developing software must recognize a responsibility to these ethical standards beyond the initial release of their products. At the same time, user agency and engagement are also critically important. Users are responsible for challenging potential hazards stemming from the use of the software.

Vint Cerf, widely recognized as one of the “Fathers of the Internet,” was a co-developer of the TCP/IP communication protocols used to build the Internet. He is a recipient of the Presidential Medal of Freedom, the National Medal of Technology, and the Turing Award, among others. He now serves as Vice President and Chief Internet Evangelist at Google.