Everyone involved in software development has an ethical responsibility to minimize the harms that software can cause to people and to maximize its benefits.
Software systems on two Boeing 737 MAX planes interpreted the signal from a faulty sensor and decided on a course of action that crashed the planes, killing hundreds of people. The UK Post Office prosecuted more than 700 sub-postmistresses and sub-postmasters for “shortfalls” and “false accounting” wrongly reported by its Horizon software system, forcing them to “repay” thousands of pounds they hadn’t stolen. People went bankrupt, losing their businesses, their livelihoods, and in many cases their homes and marriages. Many went to prison. One suicide occurred as a direct result; there were likely others. At least one suicide has also been attributed to widespread errors in Phoenix, the Canadian federal government payroll system, which wrongly calculated pay amounts, causing major life stresses and financial problems for thousands of employees over several years.
Software is everywhere, permeating our society and impacting every aspect of our lives. It can and does bring enormous benefits, but it can also cause great harm to people. As software practitioners, we have an ethical responsibility to maximize the benefits of software and minimize the harms. In this keynote, Fiona Charles explores the practical things that testers can do.