
Clicking “Accept” Is Not Informed Consent

Social media companies claim that their terms of use permit them to run research trials on users.

A recent article in Science reported the results of an experiment conducted on 20 million LinkedIn users over five years involving the “People You May Know” algorithm. The experiment randomly manipulated the algorithm to measure its effect on users’ likelihood of getting jobs. None of these people knew they were part of an experiment, nor did they consent to participate.

Informed consent is a bedrock of human subjects research ethics in the United States. Any study conducted with federal funds, or at an institution that receives federal funds, must have its proposed human subjects research reviewed by an institutional review board (IRB). Part of that process is ensuring that subjects are adequately protected from harm and that they consent to participating in the experiment. Such consent means that subjects know the risks, benefits, and alternatives, and that they have an opportunity to ask questions and to refuse to participate.

Since LinkedIn is a private company, owned by Microsoft, it does not legally fall under the requirements for human subjects review. But these requirements are so widely accepted that most human research studies in the U.S. abide by them. Although the study was approved by the MIT Institutional Review Board, one must question what was included in the protocol application and which elements of this study were debated. After all, potential subjects should at the very least know that they are subjects. There are some experiments in which knowledge of the design would influence the outcome, and IRBs have mechanisms for such situations: people agree to be subjects and are debriefed afterwards. Even that ethical practice is missing here.

Eight years ago, Facebook was criticized for conducting a social experiment that manipulated the emotional content of users’ news feeds; the company learned that people who saw more negative content displayed more signs of depression in their own posts. In all this time, no standards or regulations have been created to address the gap in human research oversight of studies conducted on social media.