From the course: Foundations of Responsible AI

Autonomous systems and society

- People often react to autonomous systems in ways developers and researchers don't expect. For some, these systems bring up fears of automation replacing human jobs. Others assume autonomous devices are surveillance devices, and their skepticism isn't too far from the truth. Many autonomous devices deployed in public spaces have cameras on them, and the public doesn't know how that data is stored, when it gets deleted, if it gets deleted, or how to revoke access to their image. This is a pretty big deal, and the chance that you're in a facial recognition database without knowing it is high: 50% of Americans have their images in facial recognition databases used by law enforcement. With these concerns in mind, we need to ask, how do we develop autonomous systems that take special precautions to avoid creating products that reinforce and strengthen these fears? Before we implement any AI system, we should understand whether the system is even necessary, meaning it measurably outperforms people at the same task with regard to fairness, privacy, and accuracy. Outlining the potential value of new technologies weighed against the potential harms in the earliest design stages can reduce the unintended negative consequences once a new autonomous system is deployed. So what does this look like? We need a diversity of perspectives, including those from populations that have been overpoliced, those from areas where autonomous weapons are frequently used, and those who have no stake in the success of autonomous systems. We also have to consider scenarios that are not routine and not ripe for deploying autonomous systems. Be aware that the public wants clear, concise information about why we deploy these systems, especially in the public sphere. One reason there are few standards here is that there are few policies and frameworks to guide organizations.
There will always be a debate around whether these systems should be used at all, and I encourage you to seek out and brainstorm alternatives to robot cop dogs and surveillance drones. Deploying these systems will almost always cause a decrease in public trust, even in cases where you feel like you're doing the right thing. Many harmful technologies have been developed in response to world events, such as an increase in violence. However, these tools don't actually address the causes of the issues. Instead, we unintentionally, and sometimes intentionally, instill fear in people, when we could allocate time and financial resources to curbing the root causes of these events. Take rising crime as an example. When we drill into the details of recent crime spikes, as reported by various police departments, we notice there are far more crimes related to an individual's financial stability than to their propensity for violence. Theft, carjacking, and vandalism are more frequently tied to socioeconomic status, but we assume technical solutions will address these issues, when they can't. Studies done by both the governments of California and New York have shown that providing financial relief to people who are struggling with food and housing insecurity slows the rate at which they become homeless and need to turn to alternative methods of survival. If the millions of dollars spent on surveillance tools were given to those struggling financially, especially considering the economic impact of COVID-19, we'd see a drop in crime related to financial instability. However, this isn't the approach many want to take, despite proven results. One way to avoid harmful technologies is to develop the lowest-tech solution that provides some results, and slowly expand it without adding new technology or machine learning models. Then perform studies to understand the impact of these interventions, without building AI as a default.
Next, decide what other methods can be used to scale the interventions that successfully meet your objectives, whether that's reducing crime or something else. I challenge you to think outside the chip and opt for low-tech solutions when attempting to tackle societal issues.