Ethical Design Requires Systemic Support


By Lisa Morgan, program director, IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems Outreach Committee

Members of the IEEE’s Global Initiative on Ethics of Autonomous and Intelligent Systems (A/IS) just completed the community review of its Ethically-Aligned Design (EAD) document, version three. Despite the Herculean amount of work that’s been put into the document by its authors and the impressive community of individuals providing feedback on the section drafts, much more work has to be done.  Practically speaking, designers must understand how to embed ethical principles and values into the A/IS systems they’re building. In addition, the users of those systems must subscribe to their ethical use.

“It’s a very complex problem,” said Robert (“Bob”) Donaldson, a retired computer scientist. “I don’t know how we’re going to codify ethics when the interpretation or definition of ethics changes based on where people are in the world or what they believe.”

Donaldson spent his entire career in risk management, first as a systems engineer working at Exxon-owned oil refineries and later as the CIO of the State of Pennsylvania’s Department of Revenue. During the decades in between, he worked his way up the corporate ladder while maintaining a focus on risk management.

“A safety and failsafe-focused culture was beaten into me at a very early stage of my career by the engineers, managers, and executives working at the Edmonton refinery,” said Donaldson. “You don’t want to kill or injure people or be responsible for one of those explosions you see happening from time to time around the world. There was very conscious activity and awareness of the importance of designing and implementing computer systems for use in real-time operating environments. There’s a parallel happening with A/IS now.”

One of the parallels is that A/IS systems and the systems used in the oil and gas industry need to be monitored to ensure their safety. Keeping entire oil and gas ecosystems safe is a very complex problem that is decomposed into smaller pieces that can be individually monitored and managed. When combined, the integrity of the individual pieces helps ensure the safety of the aggregate system.

“We didn’t call it ethics. We called it good engineering,” said Donaldson.

Ethical Use Is as Important as Ethical Design

When Donaldson worked for Pennsylvania’s Department of Revenue, his job involved the discovery of elaborate tax evasion schemes. Given its dependence on personally identifiable information (PII) to operate, the Department had to ensure compliance with various state and federal rules. He thinks the same level of rigor needs to be applied to the design and use of A/IS systems.

“If I’m a corporation that builds an A/IS system to do facial recognition for security management at airports, then that needs to follow the rules and regulations of TSA or whoever is deploying it and where it’s being deployed,” said Donaldson. “When a user acquires a system from me as a supplier, how they use it and what levels of discrimination they put into it is as important – if not more important – than the value that’s encoded into the tool.”

It would be easier to embed values in A/IS if all humans agreed to and accepted the same norms. However, as the EAD points out, one set of norms and values cannot apply equally in all parts of the world, given their respective differences. The question then becomes whether the spirit of a stated set of principles and values can survive when interpreted differently by different groups.

“It’s a very complicated problem, even at the principles level. If we get the principles right, everything else can flow from it,” said Donaldson. “I think you need an immutable purpose for an A/IS system. If you have that and get the purpose right, you can hold a developer responsible for complying with that purpose and only that purpose. That way, if a user utilizes the system for an unintended purpose, then I as a developer can prove I at least tried to protect my shareholders. Whether that argument would hold up in a court is questionable, though.”

Donaldson also believes the penalties for non-compliance must be swift and severe to effectively dissuade bad and negligent actors, whether they’re designers or users. However, enforcement doesn’t just happen; it requires tools that can identify and track compliance and non-compliance — in other words, A/IS monitoring and tracking other A/IS.

“The Department of Revenue used all sorts of tools to identify tax evaders. We also had a law department that ensured we were in compliance with state and federal laws. All of us were aware of the level of scrutiny and very sensitive to it,” said Donaldson. “If an employee accessed their spouse’s tax return because they were going through a divorce, they were fired. We had the tools to verify that they looked at it. A/IS needs that level of rigor.”

A/IS and Humans Are Necessary to Ensure Ethical Implementations

At least two elements would be necessary to ensure the ethical design and use of A/IS systems: supervisory A/IS systems and humans. Supervisory A/IS systems are necessary to monitor systems because they can identify things at scale that surpass human capabilities. However, humans are also necessary because ultimately a person must decide whether A/IS systems are making decisions that are in the best interest of humanity.

Donaldson believes that some sort of global oversight committee will be necessary, given the positive and negative impacts A/IS systems may have on the world’s citizens.

“I don’t see anybody worrying about A/IS at a macro level,” said Donaldson. “I don’t see us maintaining an accounting of responsibilities or how misuse is being managed and monitored. There are certainly a lot of forces at play that want A/IS to work in their favor, whether it’s corporations, governments or individuals.”

The reality is that A/IS involves many complex issues that are not easily addressed in a manner that is necessarily fair or even acceptable to everyone. Meanwhile, A/IS innovation is moving at light speed with and without ethical considerations. As time passes, the tension between fast product delivery and A/IS safety assurance will become more acute as product delivery cycles continue to shrink and A/IS roles expand.

Slowly but surely, the high-tech industry and other industries are waking up to the fact that the speed of innovation isn’t everything, especially when A/IS systems can expedite positive and negative outcomes at scale at near real-time speed. The question for all organizations is whether their implementation of A/IS ethics will be swift enough and effective enough to avert, or at least minimize, the unintended consequences of their A/IS systems.

Clearly, the answer to that question is both “yes” and “no” since some individuals and organizations will delay implementing A/IS ethics unless and until it becomes a brand issue or legal requirement. Even then, some will choose unethical paths because the potential rewards are too great to ignore.

Hope is not an effective strategy to propel A/IS ethics, in other words. Principles and values must be supplemented with enforcement mechanisms to effectively manage risks.