Wednesday, May 2, 2018

Technology, Risk and Facebook's Dating Service

Facebook, thanks for providing the perfect demonstration of my recent paper on risk communication and technology, presented at WeRobot 2018. In launching a new dating service, Facebook shows exactly the cluelessness about public attitudes that I worried tech world had.

Full disclosure: I really like Facebook. I enjoy it immensely, and there is a decent chance readers will find this post through Facebook. Further, on the whole I am a fan of new technology; we've seen great things happen in recent decades. My paper is a bit of a cri de coeur to tech world, urging them to consider public perceptions so that when they do something ill-advised they don't face a backlash that stymies innovation.

Let me start from the beginning.

Communicating Risk
Robots are, in the words of the inestimable Laurel Riek, physically embodied systems capable of enacting physical change in the world. This capacity for change, whether locomotion or manipulation, means these systems are able to do harm. The paper includes an in-depth taxonomy of the vectors by which robots may do harm. Further, many, although not all, robots are directed by non-deterministic algorithms, which means that they may act in unpredictable ways. (The tragic Uber self-driving vehicle accident is an example.) On the whole, these systems have enormous potential to bring benefits, but that potential must be balanced against the potential for harm.
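
To make "non-deterministic" concrete, here is a minimal sketch in Python of a controller that samples its action from a probability distribution rather than following a fixed rule. The policy, states, and probabilities are all invented for illustration; the point is only that identical inputs can produce different, occasionally harmful, outputs.

    import random

    # Hypothetical stochastic policy: for a given perceived state, the
    # controller samples an action from a probability distribution rather
    # than always returning the same choice. All values are illustrative.
    POLICY = {
        "pedestrian_detected": (["brake", "swerve", "proceed"], [0.90, 0.08, 0.02]),
    }

    def choose_action(state):
        """Sample an action for the given state from the policy distribution."""
        actions, weights = POLICY[state]
        return random.choices(actions, weights=weights, k=1)[0]

    # Identical input, potentially different output on each run -- the
    # unpredictability discussed above.
    for trial in range(5):
        print(trial, choose_action("pedestrian_detected"))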

To make informed decisions about using and interacting with robots, people need essential information. Ensuring people have this information is the role of risk communication.

The paper begins with a summary of risk communication, a well-developed field. There has been extensive research on risk communication around public health and environmental issues, as well as in the subfield of crisis communication. There has been extensive research on how best to communicate probabilities, what type of language to use, and what mechanisms are most effective for reaching audiences. This is not to say it is a settled science; much of the work is intuitive, and every issue and situation will require new approaches.
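
As a trivial illustration of why the framing of probabilities matters, the same figure can be presented in forms that land very differently on an audience (a small sketch; the probability itself is made up):

    # The same made-up risk, framed three ways. Research on risk
    # communication suggests frequency formats ("1 in N") are often
    # easier for lay audiences to grasp than percentages.
    p = 0.0004  # hypothetical annual probability of some harm

    print(f"{p:.2%}")                        # "0.04%"
    print(f"1 in {round(1 / p):,}")          # "1 in 2,500"
    print(f"{p * 100_000:.0f} per 100,000")  # "40 per 100,000"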

Further - and this is big - people vastly overestimate how well they understand others and how well they are understood. This, from my cursory reading, is central to communications theory and makes all of this really hard. A communicator might think they did a bang-up job, but the key points the recipient recalls may not be what was intended.

The obvious conclusion is that the robotics industry should start studying this field and figuring out how to apply it. But there's more (and I'm coming back to Facebook - even though they aren't building robots).

Risk Perception
But there's more. A lot more. The risk communication process described above assumes a rational cost-benefit analysis. But that is not how people make decisions. Certain types of risks and benefits loom large in people's minds out of proportion to their probability. The classic case is terrorism, which your TerrorWonk will readily point out is far less likely to kill someone in the U.S. than a car accident. But this offers little comfort: people understand car accidents and feel they have some control over them. Terrorism is poorly understood, uncontrolled, and potentially catastrophic. Terrorism, in particular, inspires dread because there is an active adversary behind it.
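
A rough back-of-envelope comparison makes the gap vivid. The figures below are approximate, order-of-magnitude numbers for illustration, not precise statistics:

    # Back-of-envelope comparison of annual per-capita death risk in the U.S.
    # All inputs are rough, order-of-magnitude approximations.
    US_POPULATION = 325_000_000      # approximate U.S. population, 2018
    CAR_DEATHS_PER_YEAR = 37_000     # roughly the recent annual motor-vehicle toll
    TERROR_DEATHS_PER_YEAR = 30      # rough recent annual average, excluding 9/11

    car_risk = CAR_DEATHS_PER_YEAR / US_POPULATION        # ~1 in 9,000
    terror_risk = TERROR_DEATHS_PER_YEAR / US_POPULATION  # ~1 in 11,000,000

    print(f"Car accident: ~1 in {round(1 / car_risk):,}")
    print(f"Terrorism:    ~1 in {round(1 / terror_risk):,}")
    print(f"Car accidents are ~{round(car_risk / terror_risk):,}x more likely")

Yet, as the risk perception literature shows, that thousand-fold difference does almost nothing to reassure people, because dread, control, and familiarity matter more than raw probability.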

It would be easy to dismiss these concerns as irrational, but it would also be unwise. Paul Slovic, one of the giants of this field, wrote:
Perhaps the most important message from this research is that there is wisdom as well as error in public attitudes and perceptions. Lay people sometimes lack certain information about hazards. However, their basic conceptualization of risk is much richer than that of the experts and reflects legitimate concerns that are typically omitted from expert risk assessments.
Donald MacGregor, of MacGregor-Bates Applied Decision Concepts, put it more bluntly, telling me:
Familiarity with the technical risk analysis can breed contempt for those who don't share the same views of risk. 
This is where businesses, industries, and governments can get into trouble: when they do not take these perceptions of risk into account, a failure can become a "signal" event that triggers public fear of catastrophic impact. The classic case is the nuclear power industry. It did not seriously consider the potential for an accident and did not engage in serious risk communication, so when Three Mile Island occurred the public was frightened. Even though the accident did no real damage, public perception shifted quickly and nuclear power development was effectively halted in the United States.

In the paper I discussed how robots, because they are perceived as having agency and because their actions may not be well understood, may trigger high levels of perceived risk and could produce a signal event. I'll go further and say that tech world more broadly is not immune to this possibility.

A Matter of Trust
Effective risk communication relies on trust. If the communicator is trusted, audiences will hear their message and bear some risks. Trust, however, is very hard to build, requiring extensive two-way communication. It is also very, very easy to lose.

My concern is that tech world is assuming a high level of public trust. Facebook (remember them?) has the famous slogan: "move fast and break things."

Former Google CEO Eric Schmidt told an audience at MIT: "Instead of spending all day worrying, why don't you wait until there's a near miss... Let's not translate that worry into premature constraints on the innovators...."

In tech world the belief appears to be that when inevitable failures occur, the public will understand. But this reservoir of goodwill may not exist, and when signal events occur, there will be broad regulatory and public backlash.

Back to Facebook
That brings us back to Facebook's dating service (I don't doubt, by the way, the company's ability to do some effective analytics here). After the Cambridge Analytica imbroglio, the company is under increasing scrutiny. That scrutiny is not going to be limited to the issues around the 2016 election; it will extend more broadly into what Facebook does with the data it gathers. They are, to their credit, making some moves to better address privacy concerns.

Given this situation, is now a good time to launch a bold new endeavor that leverages very personal information? Further, there are going to be incidents of violence and harassment linked to this dating service - this is a matter of percentages, given the unfortunate and terrible reality of violence against women. Even if Facebook does a masterful job and is incredibly effective at screening out those with violent tendencies (and this is very hard to do), their algorithm will not be perfect.
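
A quick sketch of the percentages shows why even a very accurate screen fails at scale. Every number here is hypothetical:

    # Hypothetical base-rate arithmetic: even a highly accurate screening
    # algorithm lets some bad actors through when the user base is large.
    # All numbers are invented for illustration.
    users = 10_000_000         # hypothetical dating-service user base
    bad_actor_rate = 0.001     # hypothetical: 0.1% of users are dangerous
    screen_sensitivity = 0.99  # hypothetical: the screen catches 99% of them

    bad_actors = users * bad_actor_rate
    missed = bad_actors * (1 - screen_sensitivity)
    print(f"Bad actors in user base: {bad_actors:,.0f}")   # 10,000
    print(f"Slip past the screen:    {missed:,.0f}")       # ~100

With these made-up numbers, a 99%-effective screen still lets roughly a hundred dangerous people through, and each one is a potential signal event.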

When this happens, Facebook is very likely to be held responsible. Their arguments that they have done everything possible to protect the participants in their dating site will not sway an angry public. When they appeal to the public on the basis of the good they've done, Facebook may find that it has few friends.
