Taking Ownership
Ethical technology requires new approaches to education, research and inclusion.
For the first few decades of the computing age, computers were monolithic machines in big places, out of reach for most of the general public.
When personal computers came along, everything changed, according to CU Boulder’s Bobby Schnabel. Since then, the growing ubiquity of computing has compounded both the number of devices and the ethical issues inherent in their development and use.
“When people started being able to interact with those computers, they became two-way devices,” said Schnabel, external chair of the Department of Computer Science and former CEO of the Association for Computing Machinery. “All sorts of things have arisen that impact people’s lives.”
Today, the field is grappling with many of those impacts, like bias in machine learning algorithms and social media networks that are easily manipulated.
“As a discipline, we need to take ownership of that and go fix it,” said department Chair Ken Anderson. “Computer science has to mature as a discipline and start to say, ‘How do we bake in discussions of what’s important first before the technology starts to roll out?’”
‘Biases as bad as ours’
At CU Boulder, some of those discussions are happening at the research stage.
Assistant Professor Chris Heckman works with advanced autonomous systems as director of the Autonomous Robotics & Perception Group. Though he sees great promise in technology as an augmenter of human ability, he is concerned by the use of AI to make moral decisions.
“I can’t say that humans are beyond reproach when it comes to this decision-making, and our autonomous systems that we build will have biases as bad as ours, if not worse,” Heckman said.
For technologists, dual-use concerns are often at the forefront. A system designed to connect can isolate. A system built with good intentions can be weaponized. Unfortunately, human ingenuity makes it next to impossible to design meaningful technology that could never be used in a dangerous manner.
Technologists can, Heckman argues, choose which systems they will or will not work on and which entities they will or will not partner with. But once the technology moves further downstream, responsibility shifts to managers and end-users.
“It is an organizational process that needs to ensure that autonomous systems are actually behaving according to the values and the mission that we have as a society … and that means a much more robust education for organizations and end-users,” he said.
Educational opportunities
But what about technologists like Elon Musk, Jeff Bezos and Mark Zuckerberg, who are engineers-turned-business-leaders? When you create a technology and also implement it, how do you develop that ethical foundation?
Since 1989, CU Boulder has been answering that question with a program that educates engineers in both ethics and technology, the Herbst Program for Engineering, Ethics & Society. The program introduces the “great books” of Western civilization, which have been used in the humanities for centuries to spark inquiry into ethics.
From the Herbst program tradition also came the Engineering Leadership Program, led today by Shilo Brooks. Brooks believes that in the modern era, engineers often become leaders in business, and that working through classical ethical dilemmas prepares them to make better decisions in those roles.
“The best way to equip these future leaders is to think through some of these problems. It gives a foundation of curiosity and an intellectual agility that provides a map for how they ought to think through problems confronting them,” Brooks said.
The value of varied perspectives
As valuable as that age-old struggle for moral excellence is, it is also important to consider which viewpoints have been left out, and what context they could provide for the difficult ethical dilemmas we face today.
For Shaz Zamore (they/them), head of science, technology, engineering, art and math (STEAM) outreach at the ATLAS Institute, the greatest ethical question today is how to increase space for different, equally valued perspectives.
“When you’re all working together in an equitable system with parity, with everyone’s background, experience and knowledge valued equally, that is where you’re going to see truly genius developments and life-changing knowledge come about,” Zamore said.
Zamore thinks about ethics in relation to who has access. Who can make technology? Who can use it? Who learns about it, and how?
“When it comes to outreach and engagement, one of the biggest barriers with underrepresented and severely underserved populations is that they are not told what their options are,” they said. “They don’t know that you can ask questions and do experiments and get paid to do it.”
If students from different backgrounds are continually left out of the tech pipeline, their valuable insights are minimized and the technologies that get built will not be as robust, Zamore said.
Anderson agreed and said that’s why the department has invested so heavily in diversity efforts, like creating the Bachelor of Arts in Computer Science, building logic and ethics courses into its curriculum, and partnering with groups like ATLAS and the National Center for Women & Information Technology.
“It’s all intertwined,” Anderson said. “The diversity programs that we started are going to help us change these things over time so that the systems, as they’re being designed, have more diverse thinking behind them. We’re going through this phase in which the exclusionary practices that made this a white man’s world, people are now working to try to dismantle those as best they can.”