8/19/2019 HLS - Algorithmic Allegories
1/23
By Marcus Comiter, Ben Sobel, and Jonathan Zittrain, April 2015
Algorithmic Allegories (version 1.0)
Case Study
Copyright © 2015 Harvard University. No part of this publication may be reproduced, stored in a retrieval system, used in a spreadsheet, or transmitted in any form or by any means – electronic, mechanical, photocopying, recording, or otherwise – without permission.
http://casestudies.law.harvard.edu
Background Material
The Experiment
In the June 17, 2014 issue of the Proceedings of the National Academy of Sciences, researchers
from Facebook and Cornell University jointly published a paper on the results of an experiment they conducted in 2012 entitled “Experimental Evidence of Massive-Scale
Emotional Contagion through Social Networks.”1 The experiment aimed to address whether
“emotional states can be transferred to others via emotional contagion, leading people to
experience the same emotions without their awareness.”2
To study this phenomenon, Facebook data scientists randomly selected 689,003 active
Facebook users and manipulated their News Feeds in a particular way.3 The Facebook News
Feed is the primary activity and content list on Facebook. More specifically, it is a filtered list
of content containing activities of a user’s Facebook Friends, such as posts and “Likes,” as
well as advertisements.4 A proprietary Facebook algorithm determines the content displayed on the News Feed.5 Importantly, not all of the activities of a user’s Facebook Friends are displayed in the News Feed, but only the subset selected for display by this algorithm.
Figure 1: The Facebook News Feed ca. February 2014
In the experiment, posts set to appear on selected users’ News Feeds were automatically
scored for emotional content based on the language used in the posts. Users were split into
three groups: one that would be exposed to proportionally more positive emotional content, one that would be exposed to proportionally more negative emotional content, and a control group. After being divided into these groups, the users’ News Feeds were modified.
For users in the positive group, some posts deemed emotionally negative in content would
be excluded from the News Feed. For users in the negative group, some posts deemed
emotionally positive in content would be excluded from the News Feed. For users in the
control group, random posts would be excluded from the News Feed with some probability,
independent of the posts’ emotional content.6
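The grouping and filtering procedure described above can be sketched in code. This is a minimal illustration, not the study’s actual implementation: the word lists, function names, and omission probability are all assumptions (the real study scored posts with the LIWC dictionaries and matched omission rates per condition).

```python
import random

# Hypothetical word lists; the actual study used the LIWC dictionaries.
POSITIVE_WORDS = {"happy", "great", "love"}
NEGATIVE_WORDS = {"sad", "awful", "hate"}

def score_post(text):
    """Return (positive, negative) word counts for a post."""
    words = text.lower().split()
    return (sum(w in POSITIVE_WORDS for w in words),
            sum(w in NEGATIVE_WORDS for w in words))

def filter_feed(posts, condition, omit_prob=0.5, rng=random):
    """Withhold a fraction of posts according to the experimental condition.

    condition is 'reduce_negative', 'reduce_positive', or 'control'.
    In the control condition, posts are omitted at random, independent
    of their emotional content.
    """
    shown = []
    for post in posts:
        pos, neg = score_post(post)
        eligible = (
            (condition == "reduce_negative" and neg > 0) or
            (condition == "reduce_positive" and pos > 0) or
            condition == "control"
        )
        if eligible and rng.random() < omit_prob:
            continue  # excluded from this user's News Feed
        shown.append(post)
    return shown
```

The key design point the controversy turned on is visible here: the manipulation is purely subtractive (posts are withheld, never fabricated), which is part of why defenders likened it to ordinary feed ranking.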
The users selected for participation had not volunteered or explicitly opted in. They were not
notified of their participation in the study. Instead, the researchers relied on the Facebook
Data Use Policy, to which all users agree in order to use the service. The researchers stipulated this fact in the publication, claiming, “As such, it [the experiment] was consistent
with Facebook’s Data Use Policy, to which all users agree prior to creating an account on
Facebook, constituting informed consent for this research.”7 However, the extent to which
research purposes were mentioned in the Data Use Policy at the time has been contested.8
Ordinarily, the US Department of Health and Human Services Policy for the Protection of
Human Research Subjects, also known as the “Common Rule,” mandates that federally-
funded human subjects research pass an ethical review by an Institutional Review Board
(IRB).9 However, Facebook is a private company and therefore is not obligated to follow the
Common Rule. Because the data in this study were collected by Facebook, Cornell
University’s IRB concluded that its affiliated researchers were “not directly engaged in human research and that no review by the Cornell Human Research Protection Program was
required.”10
The research found that “…for people who had positive content reduced in their News Feed,
a larger percentage of words in people’s status updates were negative and a smaller
percentage were positive. When negativity was reduced, the opposite pattern occurred.”
Thus, the experimenters concluded, “These results suggest that the emotions expressed by
friends, via online social networks, influence our own moods, constituting, to our knowledge,
the first experimental evidence for massive-scale emotional contagion via social
networks...”11
The Reaction
The PNAS article received large amounts of media attention soon after its June 17, 2014
publication. Public reaction to the study was largely negative. The Atlantic referred to the
study as “Facebook’s Secret Mood Manipulation Experiment,”12 and a Slate headline posited
that “Facebook’s Unethical Experiment Manipulated Users’ Emotions.”13 Privacy advocate Lauren Weinstein went so far as to tweet, “I wonder if Facebook KILLED anyone with their
emotion manipulation stunt. At their scale and with depressed people out there, it’s
possible.”14 Technology journalist David Auerbach compared it to the infamous Stanford
Prison Experiment,15 in which students designated as guards seriously mistreated those
asked to pretend to be prisoners.16
Amidst the condemnation, several currents emerged. Some critics maintained that Facebook
crossed a line by manipulating its users’ emotions under the aegis of academic science as
opposed to doing so for business research purposes.17 (To many, the latter purpose was far less objectionable – perhaps because it has become such a staple of service optimization across the Web.) Others, in a similar vein, emphasized the lack of ethical oversight from an
institutional review board (IRB).18 Still others found fault with the discrepancy between the
way Facebook is publicly perceived and the way the company behaves: Tarleton Gillespie, a
professor at Cornell University, wrote that “Facebook is complicit in this confusion, as they
often present themselves as a trusted information conduit, and have been oblique about the
way they curate our content into their commodity.”19
Some objectors sought government sanctions against Facebook for facilitating the
experiment. The Electronic Privacy Information Center (EPIC), a self-described “independent
non-profit research center,”20 filed a complaint with the Federal Trade Commission against
Facebook, alleging that the company’s “failure to adequately disclose that it shared
consumer data with third-party researchers constitutes a deceptive act or practice in
violation of Section 5(a) of the FTC Act, 15 U.S.C. § 45(a).”21 This section of the FTC Act
dictates that, among other things, “unfair or deceptive acts or practices in or affecting
commerce are...unlawful.”22 Furthermore, EPIC contended that Facebook violated a 2012
Consent Order issued by the FTC in the wake of a previous privacy claim.23
Not all responses were hostile to Facebook’s study. The dating website OkCupid’s official
blog published a post addressing the experiment, in which co-founder Christian Rudder
wrote, “…guess what everybody: if you use the Internet, you’re the subject of hundreds of
experiments at any given time, on every site. That’s how websites work.”24
Some commentators endeavored to contextualize the study amidst common practices
within the current technological environment. Harvard Law School Professor Jonathan
Zittrain posed, “What makes Facebook’s experiment…different from the A/B testing
Internet firms routinely do? That FB shared results?”25 As the quote suggests, a frequent
point of contention between the more and less outraged parties was whether or not the
PNAS study should be regarded as similar to A/B testing,26 a business practice designed to
improve online platforms that is frequently conducted by developers and managers.
Responses encompassed the pragmatic as well, such as Northeastern University researcher
Brian Keegan’s response to Zittrain, “I fear the gathering tar & feather mob will have a
profoundly chilling effect on the kinds of research FB discloses from now on.”27
In response to the public’s reaction to the study, Facebook issued the following comment:
This research was conducted for a single week in 2012 and none of the data used
was associated with a specific person’s Facebook account. We do research to
improve our services and to make the content people see on Facebook as
relevant and engaging as possible. A big part of this is understanding how
people respond to different types of content, whether it’s positive or negative in
tone, news from friends, or information from pages they follow. We carefully
consider what research we do and have a strong internal review process. There
is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.28
Further, Facebook employee and study author Adam Kramer stated that Facebook’s internal
review procedures had advanced since the study was conducted, and that the review
procedures would continue to evolve by taking the lessons learned from the emotion
contagion study into account. Beyond addressing the review procedures, Kramer also
rearticulated the study’s motivations, stating, “The reason we did this research is because we
care about the emotional impact of Facebook and the people that use our product. We felt
that it was important to investigate the common worry that seeing friends post positive
content leads to people feeling negative or left out.”29
On July 22, PNAS’s editor-in-chief, Inder M. Verma, published an “Editorial Expression of
Concern” about the paper, noting that “This paper represents an important and emerging
area of social science research that needs to be approached with sensitivity and with
vigilance regarding personal privacy issues.”30 Because it is PNAS policy to adhere to the
Common Rule’s ethical mandates, Verma commented that it was “a matter of concern that the
collection of the data by Facebook may have involved practices that were not fully consistent
with the principles of obtaining informed consent and allowing participants to opt out.”31
Legal Background
Commentators on both sides of the issue have explored the legality of the Facebook
experiment. Below, potentially relevant legal issues are broadly introduced:
Regulatory32
• Under Section 5(a) of the FTC Act, 15 U.S.C. § 45(a), the Federal Trade Commission
(FTC) has the authority to regulate “unfair or deceptive acts or practices in or
affecting commerce.” The Commission defines “unfair” practices as those that
“cause[] or [are] likely to cause substantial injury to consumers which is not
reasonably avoidable by consumers themselves and not outweighed by
countervailing benefits to consumers or to competition.”33
• The FTC could take two primary courses of action to use its regulatory authority in
response to this incident:
o Adjudication targets individual companies. When the Commission has
“reason to believe” a violation of the law has occurred, it can issue a
complaint to the allegedly offending company. The company may settle the
complaint or contest it in an administrative trial. Appeals may be taken as far
as the Supreme Court. FTC sanctions take the form of a “final order,” which,
if violated, can carry civil penalties of up to $16,000 for each violation.
o Rulemaking addresses industry-wide violations. Under some circumstances, the Commission can propose to enact “trade regulation rules” that categorize specific actions as unfair trade practices. If enacted rules are violated, companies may be assessed fines of up to $11,000 per violation, which are
extracted by the Commission filing suit in district court. However, the
Commission may issue a notice of proposed rulemaking “only where it has
reason to believe that the unfair or deceptive acts or practices which are the
subject of the proposed rulemaking are prevalent…” In order to make such a
finding, the Commission must first have “issued cease and desist orders
regarding such acts or practices” or otherwise ascertained “a widespread
pattern of unfair or deceptive acts or practices.”34, 35
Administrative
• The US Department of Health and Human Services Policy for the Protection of
Human Research Subjects, also known as the “Common Rule”, mandates that
federally-funded researchers seek Institutional Review Board (IRB) approval when
conducting human-subjects research.36 An IRB is a board that evaluates and then
approves or rejects proposed research on human subjects. In this role, IRB guidelines
require a number of important conditions to be met before approving research. For
example, one such requirement is “informed consent” of study participants, which
stipulates that,
...no investigator may involve a human being as a subject in research covered by this policy unless the investigator has obtained the legally effective informed consent of the subject or the subject’s legally authorized representative. An investigator shall seek such consent only under circumstances that provide the prospective subject or the representative sufficient opportunity to consider whether or not to participate and that minimize the possibility of coercion or undue influence. The information that is given to the subject or the representative shall be in language understandable to the subject or the representative. No informed consent, whether oral or written, may include any exculpatory language through which the subject or the representative is made to waive or appear to waive any of the subject’s legal rights, or releases or appears to release the investigator, the sponsor, the institution or its agents from liability for negligence.37
• Informed consent requires that:
o “(1) A statement that the study involves research, an explanation of the
purposes of the research and the expected duration of the subject’s
participation, a description of the procedures to be followed, and
identification of any procedures which are experimental;
o (2) A description of any reasonably foreseeable risks or discomforts to the
subject;
o (3) A description of any benefits to the subject or to others which may
reasonably be expected from the research;
o (4) A disclosure of appropriate alternative procedures or courses of
treatment, if any, that might be advantageous to the subject;
o (5) A statement describing the extent, if any, to which confidentiality of
records identifying the subject will be maintained;
o (6) For research involving more than minimal risk, an explanation as to
whether any compensation and an explanation as to whether any medical treatments are available if injury occurs and, if so, what they consist of, or
where further information may be obtained;
o (7) An explanation of whom to contact for answers to pertinent questions
about the research and research subjects’ rights, and whom to contact in the
event of a research-related injury to the subject; and
o (8) A statement that participation is voluntary, refusal to participate will
involve no penalty or loss of benefits to which the subject is otherwise
entitled, and the subject may discontinue participation at any time without
penalty or loss of benefits to which the subject is otherwise entitled.”38
• Further, “An IRB may approve a consent procedure which does not include, or which alters, some or all of the elements of informed consent set forth in this section, or
waive the requirements to obtain informed consent provided the IRB finds and
documents that:
o The research involves no more than minimal risk to the subjects;
o The waiver or alteration will not adversely affect the rights and welfare of the
subjects;
o The research could not practicably be carried out without the waiver or
alteration; and
o Whenever appropriate, the subjects will be provided with additional
pertinent information after participation.”39
Because Facebook’s data collection procedures were not evaluated by an IRB, law professor
James Grimmelmann argues that the company’s actions were illegal under Maryland
legislation that requires all “research” conducted in-state to comply with the Common
Rule.40 Criticizing this argument, Mike Masnick of Techdirt responded that such a line of
reasoning would require bureaucratic vetting to ensure the legality of “very basic forms of
experiments and modifications in all sorts of industries...a result incompatible with basic
common sense.”41 Facebook claimed the study was justified by a stipulation in its privacy
policy that asserts the company may use data for “research,” although several journalists
have reported that Facebook’s privacy policy at the time of the study did not make this
stipulation.42, 43
However, not all academics believe IRB evaluation would offer an all-purpose bulwark
against unethical research. Brian Keegan writes that proposals to subject commercial data
collection to the ethical standards of the Common Rule “are emblematic of ‘ethics creep’
wherein regulatory systems expand far beyond their originally-intended scope and ethical
conduct is uncritically equated with following bureaucratic rules.”44 Keegan concludes that
these and other deficiencies with IRB procedures can “have a corrosive effect on scientists’ freedom to pursue legitimate research without politically-motivated interference and
[prioritize] managing bureaucratic routines over substantive evaluations of harm.”45
Hypothetical Problem: Materials are Fictional Representations
Hypothetical Case One: Newspapers
The Anytown Gazette, an online local newspaper, is interested in the way the emotional tenor
of its content influences subscriber activity. Its editor has planned a confidential, month-long study to examine this correlation in digital media. You are an independent consultant hired
by the Gazette. Read the brief sent to you by the Gazette attached on the next page. Your
task is to evaluate whether this study constitutes ethical business practice and to anticipate
the response of Gazette readership when the nature and results of the study are publicly
disclosed. Write a response to the Gazette outlining your opinion on the ethics of this study
and the public reaction it may elicit.
Exhibit 1: Anytown Gazette Research Brief
To: Andrew Lexis, Consultant
From: Ronald Smith, Editor, Anytown Gazette
Subject: Improving Anytown Gazette Reader Engagement
Dear Andrew,
Please find attached a brief of our proposed upcoming research. I look forward to your
response.
Anytown comprises three equally populated and demographically equivalent zip codes: 11111,
22222, and 33333. The Gazette’s online subscriber base is restricted to, and equally spread
among, these three zip codes.
Each evening, before the next issue is printed, the Gazette staff curates the website’s landing
page to feature the day’s most important stories. To test the manner in which emotional
content affects its subscribers, before pushing the articles the Gazette will use a computer
program to label the emotional content of all new articles either positive or negative.
Following this, the front page of the website will be modified in the following manner based
on subscriber zip code:
1. In the version of the website that subscribers from zip code 11111 access, the Gazette will move a proportion of homepage articles marked negative by the computer program to a secondary page. It will replace these negative articles with articles marked positive that were scheduled to appear that day, but were not deemed homepage-worthy by the editorial staff’s normal journalistic practices.
2. In the version of the website that subscribers from zip code 22222 access, the Gazette will alter its normally-curated homepage by moving a proportion of all articles marked positive and replacing them with articles marked negative in a similar fashion.
3. In the version of the website that subscribers from zip code 33333 access, the paper will display the homepage it originally composed according to the editorial staff’s journalistic sensibilities, without regard for the page’s overall emotional content.
During the month that it distributes these differing website versions, the Gazette will monitor
the letters to the editor and online feedback that it receives, score them as positive or
negative, and categorize each by the zip code of the return address.
The newspaper hopes to use this information to ascertain whether positive/negative
homepages engender positive/negative reader reaction (as measured by the average tone of
letters to the editor and website comments). In turn, the paper will cross-reference these
data points with its sales figures from each zip code in order to ascertain whether
positive/negative front pages affect subscription and purchasing rates. After refining its
business practices accordingly, the Gazette will also release the data for publication in a
research study conducted by Anytown University.
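The three-way homepage curation the brief proposes can be sketched as follows. This is an illustrative reconstruction under stated assumptions: `label_article` stands in for the brief’s unspecified scoring program, and the swap fraction is a made-up parameter.

```python
# Hypothetical sketch of the Gazette's plan; all names are illustrative.

def label_article(article):
    """Assumed classifier output: 'positive' or 'negative'."""
    return article["label"]  # stubbed for illustration

def build_homepage(editorial_page, backlog, zip_code, swap_fraction=0.5):
    """Return the homepage variant served to a given zip code.

    11111: demote a share of negative articles, backfill with positive backlog.
    22222: the mirror image.  33333: the editors' page, unchanged (control).
    """
    if zip_code == "33333":
        return list(editorial_page)
    demote = "negative" if zip_code == "11111" else "positive"
    promote = "positive" if zip_code == "11111" else "negative"
    n_swap = int(swap_fraction *
                 sum(label_article(a) == demote for a in editorial_page))
    replacements = [a for a in backlog if label_article(a) == promote][:n_swap]
    page, used = [], 0
    for article in editorial_page:
        if label_article(article) == demote and used < len(replacements):
            page.append(replacements[used])  # backfill from the backlog
            used += 1
        else:
            page.append(article)
    return page
```

Note the structural parallel to the Facebook study: one valence-skewed condition in each direction plus an untouched control, with the swapped-in articles drawn only from content the staff had already written.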
Hypothetical Case Two: WebWatch
Background Information: WebWatch is an online video entertainment business that allows
users to select movies, television shows, and other videos from WebWatch’s curated video collection, and then stream them instantly to their computers. An important part of the
service is WebWatch’s proprietary content recommendation algorithm, which recommends
content to users based on past viewing history and ratings given to content by other
viewers. To refine its recommendation algorithm, WebWatch held an open contest in which
researchers from academia and industry were given WebWatch user data with which to build
new algorithms, the best of which would receive a one million dollar prize.46
Scenario: You are the general counsel for WebWatch. WebWatch is considering changes to
its recommendation system in order to increase the number of videos each user watches.
WebWatch has hired a social scientist to suggest possible avenues to achieve this goal. Based
on her domain knowledge and research experience, the social scientist believes that when
users are in a sad emotional state and then watch a depressing movie, they are more likely to
continue to watch more movies on WebWatch relative to those who watch an uplifting
movie. As a consequence, if WebWatch can infer when a user is sad and then recommend a sad movie, the data scientist believes WebWatch can increase the amount of content its users view.
To this end, the data scientist has developed and presented the following plan to the
WebWatch CEO. You are asked to provide your counsel on the appropriateness of the plan.
Read the proposal below, and then make a recommendation to the CEO.
Exhibit 2: Email from Researcher Janet Datali to WebWatch CEO Alex Tamsco
To: Alex Tamsco, WebWatch CEO
From: Janet Datali, WebWatch Data Scientist
Subject: Proposal to Increase WebWatch Viewership
Dear Alex,
Here is my research proposal aimed at increasing WebWatch viewership:
My professional experience suggests that inferring, and then acting upon, the current
emotional state of a user can increase the amount of content viewed by WebWatch
subscribers. Specifically, our initial research shows that if a user is in a sad emotional state
(e.g., following a break-up) when he logs on to WebWatch, viewing a sad movie will make
him more likely to immediately watch another movie. However, viewing a happy movie will make him more likely to stop using WebWatch after the movie ends. By leveraging these
findings, if we can infer the emotional state of our users when they log on, we can modify
the WebWatch recommendation algorithm to suggest sad movies when the user is in a sad
emotional state. The following is a brief synopsis of a plan to develop this strategy.
As a first step, I recommend forming a strategic partnership with Facebook, through which
WebWatch would receive the most recent Facebook activity for a given user visiting the
WebWatch website. Recall that WebWatch already has a partnership with Facebook by
which users can see what movies their friends are watching, as well as share their own
watching activity, so receiving this additional information will not be difficult given the
current integration.47
I then suggest that WebWatch assign scores to each movie, where a positive score denotes a
happy movie, a negative score denotes a sad movie, and a small or zero score denotes a
neutral movie. I have been assured that WebWatch’s engineering team is able to build an
algorithm to do this quite easily.
Finally, given this information, I suggest the following modifications to the WebWatch
algorithm:
1. When a user visits the WebWatch home page, WebWatch scans that user’s recent Facebook activity and scores the emotional content of that activity in order to infer the user’s emotional state.
2. If the user is determined to be in a sad state based on his Facebook activity,
WebWatch modifies the list of suggested content made by the recommendation
algorithm in order to include proportionally more sad movies.
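Datali’s two-step proposal can be sketched as below. Everything here is a hypothetical illustration: `infer_mood`, the scoring callables, and the Facebook-activity feed are assumptions, not a real WebWatch or Facebook API.

```python
# Illustrative sketch of the proposal in Exhibit 2; names are assumptions.

def infer_mood(recent_activity, score_text):
    """Step 1: infer a coarse mood from a user's recent Facebook posts.

    score_text is an assumed sentiment scorer; a negative total is read
    as a sad emotional state.
    """
    total = sum(score_text(post) for post in recent_activity)
    return "sad" if total < 0 else "neutral"

def recommend(candidates, movie_score, mood):
    """Step 2: when the user seems sad, surface sad movies first.

    movie_score(m) < 0 denotes a sad movie, > 0 a happy one. A stable
    sort preserves the original recommendation order within each group.
    """
    if mood != "sad":
        return list(candidates)
    return sorted(candidates, key=lambda m: 0 if movie_score(m) < 0 else 1)
```

The sketch makes the counsel question concrete: the code never asks the user anything; the emotional inference and the skewed ranking both happen silently, using data obtained for a different purpose (the social sharing integration).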
I am confident that this plan will help us to better monetize our subscriber base. Please
advise if I should instruct my team to begin this project.
Sincerely,
Janet Datali
Hypothetical Case Three: The Facebook “Altruism Initiative”
On May 1, 2012, Facebook introduced a feature that permits users to designate themselves as
Organ Donors.48 This feature includes links to state organ donor registries, where individuals can find information about organ donation and complete the forms required to legally
become donors. Additionally, users can choose to publish their self-declared organ donor
statuses on their friends’ news feeds, along with a message asking a reader to “Share Your
Donor Status.”49
This relatively small feature had a powerful effect. A study in the American Journal of
Transplantation documents that “On the first day of the Facebook organ donor initiative,
there were 13,054 new online registrations, representing a 21.1-fold increase over the baseline
average of 616 registrations. This first-day effect ranged from 6.9 (Michigan) to 108.9
(Georgia). Registration rates remained elevated in the following 12 days.”50
Now, Facebook is introducing a new site-wide Altruism Initiative, designed to encourage
particular altruistic actions at moments when users are most receptive to the idea. For
example, in addition to publicizing organ donation when a friend publishes his or her status,
Facebook will post messages about organ donation below News Feed entries that announce
deaths or mention life-threatening accidents, as well as next to memorial pages
commemorating the deceased. Beta tests on a subset of users indicate that this
methodology results in more donor registrations per user than the company’s previous
approach, likely because poignant posts may motivate more immediate altruistic responses.
As an additional component of this Altruism Initiative, Facebook will introduce subsidized
advertising opportunities for select non-partisan nonprofits that will reinforce their ads by
pairing them with relevant content (e.g., a gun control advocacy group could attach its
exhortations to users’ posts about gun violence, or a charity for foster children could deploy
its messages alongside users’ pregnancy announcements or posts by new parents).
You are the technology editor for the Metro News, a major news outlet. The editorial board
has asked you to write an article assessing and critiquing the ethics and overall impact of
Facebook’s new altruism initiative.
Hypothetical Case Four: A Controlled Facebook Study of Emotions
Facebook conducts a series of A/B tests on a sample of users, designed to maximize the time
users spend on the site and therefore increase the company’s advertising revenues. These tests involve gradually changing the emotional content of some users’ News Feeds and
assessing whether these changes affect the time the users spend on the site. A subset of
users in this sample spent more time on Facebook when more “negative” posts appeared on
their News Feeds. As a result, this subset’s News Feeds were permanently altered to display
more negative posts. Similarly, users who spent more time on Facebook when exposed to
more “positive” posts had their News Feeds altered to display positive posts more
frequently. Further analysis of the results of these A/B tests reveals that users who would
respond to negative or positive News Feeds by spending more time on the site can be easily
predicted using data on their Facebook profiles.
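The two mechanisms the scenario describes, permanent assignment from A/B results and profile-based prediction for untested users, can be sketched as below. This is a hypothetical illustration; the condition names, the similarity function, and the nearest-neighbor prediction rule are all assumptions.

```python
# Sketch of the Case Four procedure; all names are illustrative.

def assign_condition(times_by_condition):
    """Permanently assign the feed skew under which the user spent most time.

    times_by_condition maps 'positive'/'negative'/'control' to time on site
    measured during the A/B tests.
    """
    return max(times_by_condition, key=times_by_condition.get)

def predict_condition(profile, tested_users, similarity):
    """Profile an untested user by copying the condition assigned to the
    most similar tested user, per the researchers' claim that responders
    can be predicted from profile data."""
    best_profile, best_condition = max(
        tested_users, key=lambda pc: similarity(profile, pc[0]))
    return best_condition
```

The sketch highlights what question 2 is really asking: `predict_condition` manipulates the feeds of users who were never in any experiment, on the strength of a statistical resemblance to users who were.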
You are on Facebook’s Board of Advisers, and have been presented with the results of this
study. The researchers recommend profiling other users site-wide to determine whether they
may be likely to spend more time on the site if the content of their feeds were skewed more
positive or more negative. You have been asked to give an evaluative briefing to the rest of
the Board of Advisers, specifically commenting on the following two questions:
1. Is it acceptable for News Feeds to continue to be curated by emotional content for the subset of users who responded to the A/B tests?
2. Is it acceptable to profile users not included in the original A/B test and, if they are likely to spend more time on Facebook when their News Feeds have a particular emotional tenor, display more negative or more positive content on their Feeds?
Hypothetical Case Five: Facebook and Elections51
Following the success of Facebook’s use of emotions to increase donations to charities, the
Facebook data team finds another paper that demonstrates that during elections, when the populace is happy, the incumbent tends to be reelected, and when the populace is unhappy,
the challenger tends to be elected.
Over the past five years, Facebook has spent over $14 million on lobbying efforts.52 Not
wanting to lose the money they have spent in establishing relationships with the incumbents
in Congress, Facebook would like to see as many incumbents reelected as possible.
In order to keep the incumbents safe in the election cycle, Facebook wishes to manipulate
their users’ emotions such that they feel happier. To do this, Facebook, using an improved
version of the emotional content scoring algorithm used in the Emotional Contagion study,
scores each News Feed item based on how negative, or unhappy, it is. Facebook then updates the algorithm governing which content is displayed on the News Feed so that it does not display any items with negative scores.
There is one exception to the plan: the incumbent from Washington’s Sixth District has
vowed to bring onerous regulations to the social media industry if reelected. Facebook
specifically will be hurt by this proposed legislation. In this district alone, Facebook wants to
invert the modified algorithm so that only negatively scored content appears to Facebook
users in this incumbent’s district.
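The filtering scheme the hypothetical describes can be sketched roughly as follows. Everything here is illustrative: the toy word-count `sentiment_score` function and the `"WA-06"` district code are invented stand-ins, not Facebook’s actual (proprietary) scoring algorithm.

```python
# Illustrative sketch of the hypothetical feed-filtering scheme:
# score each item, suppress negative items by default, and invert
# the filter for users in one targeted congressional district.

def sentiment_score(post: str) -> int:
    """Toy scorer: positive-word count minus negative-word count."""
    positive = {"happy", "great", "love", "wonderful"}
    negative = {"sad", "angry", "terrible", "hate"}
    words = post.lower().split()
    return sum(w in positive for w in words) - sum(w in negative for w in words)

def filter_feed(posts, user_district, targeted_district="WA-06"):
    """Default: drop negatively scored items. In the targeted
    district, invert the filter so only negative items remain."""
    if user_district == targeted_district:
        return [p for p in posts if sentiment_score(p) < 0]
    return [p for p in posts if sentiment_score(p) >= 0]

posts = ["What a wonderful day", "Traffic made me angry", "Lunch was fine"]
print(filter_feed(posts, "CA-12"))  # positive and neutral items only
print(filter_feed(posts, "WA-06"))  # negative items only
```

The sketch makes the hypothetical’s asymmetry concrete: the same scoring machinery serves opposite ends depending only on a district lookup.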
You are a member of the Board of Directors for Facebook. Facebook’s General Counsel has
assured you that no law precludes this action. You have been asked to evaluate whether
Facebook should deploy this modified algorithm. Make your recommendation to the rest of
the Board of Directors.
(Non-Hypothetical) Case Six: OkCupid53
OkCupid is a dating website that assesses users’ compatibility using a “match percentage,” a
quotient designed to “predict relationships” between users. The match percentage
“correlates with message success, conversation length, whether people actually exchange
contact information, and so on.” On July 28, 2014, OkCupid co-founder Christian Rudder
wrote an article on OkTrends, the company’s official blog. The post was framed as a
response to the outrage over the Facebook News Feed manipulation study. Rudder wrote,
“We noticed recently that people didn’t like it when Facebook ‘experimented’ with their
News Feed...But guess what, everybody: if you use the Internet, you’re the subject of
hundreds of experiments at any given time, on every site. That’s how websites work.” The
blog post then outlined three experiments that OkCupid has conducted on its user base in
the past.
1. Experiment 1: “Love Is Blind, or Should Be”
After launching an application that arranged blind dates, OkCupid “chose to
celebrate the app’s release by removing all the pictures from OkCupid on launch
day.” “Love Is Blind Day” took place on January 15, 2013. OkCupid observed that on
“Love is Blind Day,” compared with an ordinary Tuesday, “people responded to
messages 44% more often, conversations went deeper, [and] contact details (phone
numbers, emails) were exchanged more quickly.”
2. Experiment 2: “So What’s a Picture Worth?”
OkCupid initially featured two five-point scales by which users could rate others’ profiles:
“looks” and “personality.” After noticing from correlations in rating data
that “to our users, ‘looks’ and ‘personality’ were the same thing,” the site phased out
the two scales and implemented a single scale instead. To measure the influence of
the picture versus the text of a user’s profile on that user’s scores, OkCupid “took a
small sample of users and half the time we showed them, we hid their profile text.”
This yielded two different score sets for each of the users: “one score for ‘the picture
and the text together’ and one for ‘the picture alone.’” OkCupid subsequently
examined these data and concluded that the presence or absence of the text exerted
relatively little influence on an individual’s score.
3. Experiment 3: “The Power of Suggestion”
In order to measure whether “the mere suggestion [of compatibility] cause[s]
people to actually like each other” (in other words, whether the match percentage
simply induces confirmation bias in users instead of reflecting fundamental
compatibility), OkCupid “took pairs of bad matches (actual 30% match) and told
them they were exceptionally good for each other (displaying a 90% match).” OkCupid
also took pairs of well-matched users and displayed a poor match. In the post,
Rudder claims that “the users were notified of the correct match percentage” after
the conclusion of the experiment.

The users who were given falsely high compatibility rankings displayed behavior
similar to the behavior expected from users who were actually assigned high
compatibility rankings by OkCupid’s algorithm. Similarly, users with falsely low
compatibility rankings displayed behavior similar to the behavior expected from
users who were actually assigned low compatibility rankings. Rudder concludes that
the data demonstrate that “if you have to choose only one or the other, the mere
myth of compatibility works just as well as the truth.”
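The design of the “Power of Suggestion” experiment can be sketched as follows. The pair data and the alternating truth/deception assignment are invented for illustration; Rudder’s post does not describe how OkCupid actually randomized pairs.

```python
# Illustrative sketch of the "Power of Suggestion" design: each pair
# has a known actual match percentage, and pairs in the deception
# condition are shown the opposite extreme (30% displayed as 90%,
# and vice versa) so later behavior can be compared across cells.

def displayed_match(actual_pct: int, show_truth: bool) -> int:
    """Return the percentage a pair is shown: the real one, or the
    opposite extreme when the pair falls in the deception condition."""
    if show_truth:
        return actual_pct
    return 90 if actual_pct == 30 else 30

# Alternate pairs between truthful and falsified display (a stand-in
# for whatever randomization OkCupid actually used).
actual = [30, 30, 90, 90]
shown = [displayed_match(pct, i % 2 == 0) for i, pct in enumerate(actual)]
print(shown)  # [30, 90, 90, 30]
```

Comparing message behavior between pairs whose `shown` value matches `actual` and pairs whose value was falsified is what lets Rudder claim the displayed number, not the underlying score, drives user behavior.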
You are an outside ethicist asked to comment on OkCupid’s recent revelations that they, too,
“experiment on human beings,” and contrast these revelations with Facebook’s
controversial “emotional contagion” study. Note that unlike the previous hypothetical cases,
this blog post is an actual, official release from OkCupid.
The blog post is available online: Christian Rudder, “We Experiment on Human Beings!”,
OKTRENDS, July 28, 2014, accessed August 1, 2014, http://blog.okcupid.com/index.php/we-
experiment-on-human-beings/.
1 Adam D.I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock, “Experimental evidence of massive-scale emotional contagion through social networks,” Proceedings of the National Academy of Sciences (2014): 201320040.
2 Ibid., 8789.
3 Ibid.
4 Facebook, “How News Feed Works,” accessed July 2014, https://www.facebook.com/help/327131014036297/.
5 Ibid.
6 Kramer, Guillory, and Hancock, “Experimental Evidence,” 1-2.
7 Ibid., 2.
8 Gregory McNeal, “Controversy Over Facebook Emotional Manipulation Study Grows As Timeline Becomes More Clear,” Forbes, June 30, 2014, accessed September 29, 2014, http://www.forbes.com/sites/gregorymcneal/2014/06/30/controversy-over-facebook-emotional-manipulation-study-grows-as-timeline-becomes-more-clear/.
9 United States Department of Health and Human Services, “Federal Policy for the Protection of Human Subjects ('Common Rule'),” accessed April 16, 2015, http://www.hhs.gov/ohrp/humansubjects/commonrule/index.html.
10 John Carberry, “Media statement on Cornell University’s role in Facebook ‘emotional contagion’ research,” Cornell University Media Relations Office, June 30, 2014, accessed April 15, 2015, http://mediarelations.cornell.edu/2014/06/30/media-statement-on-cornell-universitys-role-in-facebook-emotional-contagion-research/ [http://perma.cc/C3N2-H24B].
11 Kramer, Guillory, and Hancock, “Experimental Evidence,” 1.
12 Robinson Meyer, “Everything We Know About Facebook’s Secret Mood Manipulation Experiment,” The Atlantic, June 28, 2014, accessed November 12, 2014, http://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/.
13 Katy Waldman, “Facebook’s Unethical Experiment,” Slate, June 28, 2014, accessed November 12, 2014, http://www.slate.com/articles/health_and_science/science/2014/06/facebook_unethical_experiment_it_made_news_feeds_happier_or_sadder_to_manipulate.html.
14 Lauren Weinstein, “I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it’s possible.,” Twitter feed, June 28, 2014, 5:55 PM, https://twitter.com/laurenweinstein/status/483051171255312384.
15 David Auerbach, “Facebook’s next psych experiment: dividing users into ‘prisoners’ and ‘guards.’ http://www.avclub.com/article/facebook-tinkered-users-feeds-massive-psychology-e-206324,” Twitter feed, June 28, 2014, 7:33 AM, https://twitter.com/AuerbachKeller/status/482894538449903616.
16 The Stanford Prison Experiment was a 1971 study conducted by psychology professor Philip Zimbardo in which participants were asked to assume roles as prisoners and prison guards. Specifically, “For six days, half the study’s participants endured cruel and dehumanizing abuse at the hands of their peers...Some of them rebelled violently; others became hysterical or withdrew into despair.” Romesh Ratnesar, “The Menace Within,” Stanford Alumni, July/August 2011, accessed November 12, 2014, http://alumni.stanford.edu/get/page/magazine/article/?article_id=40741. The experiment gained notoriety for illustrating, in Zimbardo’s words, “that these ordinary college students could do such terrible things when caught in that situation,” and drew criticism because of the researchers’ complicity in the subjects’ suffering. Ibid.
17 Cf. Sara Watson’s contention that it is improper to “unquestioningly [accept Facebook’s] claims to the prestige, expertise, and authority of science”; Sara M. Watson, “Data Science: What the Facebook Controversy is Really About,” The Atlantic, July 1, 2014, accessed November 12, 2014, http://www.theatlantic.com/technology/archive/2014/07/data-science-what-the-facebook-controversy-is-really-about/373770/.
18 Gail Sullivan, “Cornell ethics board did not pre-approve Facebook mood manipulation study,” Washington Post, July 1, 2014, accessed November 12, 2014, http://www.washingtonpost.com/news/morning-mix/wp/2014/07/01/facebooks-emotional-manipulation-study-was-even-worse-than-you-thought/.
19 Tarleton Gillespie, “Facebook’s algorithm — why our assumptions are wrong, and our concerns are right,” Culture Digitally, July 4, 2014, accessed November 12, 2014, http://culturedigitally.org/2014/07/facebooks-algorithm-why-our-assumptions-are-wrong-and-our-concerns-are-right/.
20 The Electronic Privacy Information Center, “About EPIC,” accessed August 1, 2014, http://epic.org/epic/about.html.
21 The Electronic Privacy Information Center, In the Matter of Facebook, Inc. (2014) (Complaint, Request for Investigation, Injunction, and Other Relief before the Federal Trade Commission), http://epic.org/privacy/ftc/facebook/Facebook-Study-Complaint.pdf.
22 FTC Act, 15 U.S.C. § 45(a).
23 Federal Trade Commission, In the Matter of FACEBOOK, INC., a corporation, Agreement Containing Consent Order (2011), http://www.ftc.gov/enforcement/cases-proceedings/092-3184/facebook-inc.
24 Christian Rudder, “We Experiment on Human Beings!,” OkTrends, July 28, 2014, accessed August 1, 2014, http://blog.okcupid.com/index.php/we-experiment-on-human-beings/.
25 Jonathan Zittrain, “What makes Facebook’s experiment…different from the A/B testing Internet firms routinely do? That FB shared results?,” Twitter feed, June 28, 2014, 8:01 AM, https://twitter.com/zittrain/status/482901553024884736.
26 Website A/B Testing: a form of user experience testing in which two different versions of a site are shown to two different groups of users. Attributes such as time spent on the website, clicks, products purchased, and other characteristics are recorded and then evaluated. Note that A/B Testing is a general framework and does not apply only to websites.
27 Brian Keegan, “I fear the gathering tar & feather mob will have a profoundly chilling effect on the kinds of research FB discloses from now on.,” Twitter feed, June 28, 2014, 8:23 AM, https://twitter.com/bkeegan/status/482907084095094785.
28 Robinson Meyer, “Everything We Know About Facebook’s Secret Mood Manipulation Experiment,” The Atlantic, June 28, 2014, accessed February 25, 2015, http://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/.
29 Adam D.I. Kramer, “OK so. A lot of people…,” Facebook post, June 29, 2014, https://www.facebook.com/akramer/posts/10152987150867796.
30 Inder M. Verma, “Editorial Expression of Concern: Experimental evidence of massive scale emotional contagion through social networks,” PNAS, accessed April 15, 2015, http://www.pnas.org/content/111/29/10779.1.full [http://perma.cc/SQ7Y-MY33].
31 Ibid.
32 Federal Trade Commission, “A Brief Overview of the Federal Trade Commission’s Investigative and Law Enforcement Authority,” July 1, 2008, accessed September 29, 2014, http://www.ftc.gov/about-ftc/what-we-do/enforcement-authority.
33 The Electronic Privacy Information Center filed an FTC complaint against Facebook, alleging that the company’s “failure to adequately disclose that it shared consumer data with third-party researchers constitutes a deceptive act or practice” that violates the aforementioned statute. See The Electronic Privacy Information Center, In the Matter of Facebook, Inc., http://epic.org/privacy/ftc/facebook/Facebook-Study-Complaint.pdf.
34 15 U.S.C. § 57a(b)(3).
35 See, generally, http://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings.
36 Research is defined by the Common Rule as “a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge.”
37 United States Department of Health and Human Services, “Code of Federal Regulations,” January 15, 2009, accessed September 29, 2014, http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html#46.116.
38 Ibid.
39 Ibid.
40 James Grimmelmann, “Illegal, Immoral, and Mood-Altering: How Facebook and OkCupid Broke the Law When They Experimented on Users,” Medium, September 23, 2014, accessed February 8, 2015, https://medium.com/@JamesGrimmelmann/illegal-unethical-and-mood-altering-8b93af772688.
41 “Law Professor Claims Any Internet Company ‘Research’ On Users Without Review Board Approval Is Illegal,” Techdirt, September 24, 2014, accessed September 29, 2014, https://www.techdirt.com/articles/20140924/00230628612/law-professor-claims-any-internet-company-research-users-without-review-board-approval-is-illegal.shtml.
42 Kashmir Hill, “Facebook Added 'Research' To User Agreement 4 Months After Emotion Manipulation Study,” Forbes, June 30, 2014, accessed April 15, 2015, http://www.forbes.com/sites/kashmirhill/2014/06/30/facebook-only-got-permission-to-do-research-on-users-after-emotion-manipulation-study/.
43 McNeal, “Controversy over Facebook Emotion Study.”
44 Brian Keegan, “Research in the Time of Technopanics,” Association for Computing Machinery Conference on Computer-Supported Cooperative Work and Social Computing, March 14, 2015, accessed April 15, 2015, https://cscwethics2015.files.wordpress.com/2015/02/keegan.pdf.
45 Ibid.
46 See, generally, http://www.netflixprize.com/.
47 See, generally, Netflix, “How do Netflix Social Settings and features work,” Netflix Help Center, accessed February 2, 2015, https://help.netflix.com/en/node/464.
48 Steve Almasy, “Facebook encouraging organ donations,” CNN, May 1, 2012, accessed February 3, 2015, http://www.cnn.com/2012/05/01/health/facebook-organ-donors/index.html.
49 Facebook, “How do I share that I’m an organ donor on Facebook?,” accessed February 2015, https://www.facebook.com/help/338196302902319.
50 A. M. Cameron et al., “Social Media and Organ Donor Registration: The Facebook Effect,” American Journal of Transplantation (2013): 2059.
51 This hypothetical is inspired by Professor Jonathan Zittrain’s June 2014 article, “Facebook Could Decide an Election Without Anyone Ever Finding Out.” Jonathan Zittrain, “Facebook Could Decide an Election Without Anyone Ever Finding Out,” New Republic, June 1, 2014, accessed November 12, 2014, http://www.newrepublic.com/article/117878/information-fiduciary-solution-facebook-digital-gerrymandering.
52 OpenSecrets, “Facebook Inc.,” accessed August 1, 2014, https://www.opensecrets.org/lobby/clientsum.php?id=D000033563.
53 This actual scenario and all quotes come from Rudder, “We Experiment on Human Beings!”.