Published on August 31, 2015.
No More Secrets: Gmail and Facebook can determine your political values
Can software programs (“algorithms”) used by email providers such as Gmail, and by social media websites such as Facebook, learn an individual's political values? For quick insight into this question, I created Gmail and Facebook accounts to represent users with different political values: one a Democrat and the other a Republican. After just 10 days of monitoring Gmail’s advertisements and email sorting and Facebook’s suggested pages, I found evidence that the algorithms of both Gmail and Facebook may learn personal preferences from the user’s behavior, may then infer other preferences such as political values, and may steer content to and away from the person accordingly.
Results summary: During the 10-day study period, the Facebook algorithm made a total of 61 suggestions: 30 to the Democratic profile and 31 to the Republican profile. Suggestions related to the political party of the profile numbered 16 for the Democrat and 21 for the Republican. Suggestions related to non-partisan or opposing-party politics numbered 6 for the Democrat and 4 for the Republican. Non-political suggestions, such as TV shows and sports teams, numbered 8 for the Democrat and 6 for the Republican. These numbers suggest different experiences, with the Republican profile receiving more pro-party suggestions than the Democratic profile. Only 68 advertisements were delivered on Gmail: 34 went to the Democrat and 34 to the Republican. Most of the ads (52, or 76 percent) were the same on both profiles, and the few ads delivered uniquely to each profile suggest a slight, though not clearly political, distinction. This study encourages future work with a longer study period.
In today’s technological, connected world, can you keep information private? Even if you do not explicitly share some information about yourself, other information you may be willing to share may leak further hints about you. Email and social media make it fast and easy to continuously share personal information. Once personal information is shared, companies such as Gmail and Facebook have algorithms that use your personal information and online behaviors to target advertisements and suggest other pages that may be of interest to you. By their very nature, these algorithms are attempting to learn more about you; therefore, it can be difficult for you to limit what they learn. Additionally, because they deliver ads and make recommendations that steer content toward you and away from you, your online experience may be largely shaped by their interpretation. Exactly what might these algorithms learn from your online behaviors? Can they learn your political values (i.e., your commonly shared political views or preferences)? If so, does the response differ based on your apparent partisan leanings?
Gmail’s terms of service state that automated software analyzes incoming and outgoing emails in order to target advertisements to users. Facebook is not as clear about the means it uses to determine a user’s interests, though its advertising policy disclosure states that a user’s ad preferences derive from the information shared with Facebook, apps and websites used, and pages engaged with or liked. Businesses advertising on Facebook select the audience for their ads using those preferences.
The algorithms of both Gmail and Facebook seek to deliver advertisements relevant to the user, with the hope that doing so will increase the likelihood the user clicks on links and thereby increase revenue from advertisers.
What kind of information do the algorithms steer to and away from a user? I expect a Democrat’s Gmail inbox and Facebook page to contain liberal advertisements and suggestions, while a Republican’s Gmail inbox and Facebook page should contain conservative ads and suggestions. Do they?
In addition, I expect algorithms associated with Gmail to start sorting emails that disagree with a user’s political beliefs into the “Promotions” tab or “Spam” folder, while sorting the emails that agree with the user’s political beliefs into the “Primary” tab. Do they?
To answer these questions, and discover whether the algorithms used by Google’s Gmail and by Facebook can identify a user’s political values, I set up two parallel identities, one Democrat and one Republican, and then monitored advertisements and suggestions to see how they correspond to the unstated beliefs of the account holders. The algorithms rely on email content and subscriptions to lists and newsletters to learn political values. As a first step, I created two email accounts through Gmail representing users with two different political affiliations. To ensure the political preference was clear, one email address used the name Dominant Democrat with the address “firstname.lastname@example.org,” while the other account used the name Ruling Republican and the address “email@example.com.” To keep external variables constant, I listed both as 48-year-old men and did not specify a geographic location. I then subscribed each user to the same set of 36 email lists and newsletters. This set included daily updates from a mixture of liberal and conservative news sources, such as MSNBC and Fox News, and magazines such as The Nation and Red State. It also included subscriptions to mailing lists of 2016 presidential campaigns, such as Hillary Clinton’s and Marco Rubio’s, as well as the Democratic National Committee and the Republican National Committee. For a politically neutral source, I subscribed both accounts to 10 newsletters from various Harvard Athletic teams.
Then I monitored where Gmail automatically sorted incoming emails in its webmail interface: into either of the default tabs, the “Primary” tab or “Promotions” tab, or the spam mailbox. I also monitored the advertisements displayed by Google within the “Promotions” tab. After carefully recording and taking screenshots of the email locations and types of advertisements, I deleted the emails that disagreed with the specific user’s political beliefs without reading them, and read the emails that agreed with those beliefs. For example, every time an email from the Marco Rubio campaign appeared in Dominant Democrat’s inbox, I deleted it, while I read each email from Hillary Clinton’s campaign and left it in the inbox. In order to create a pattern of behavior indicating partisan preference for each user, I continued this process throughout a two-week period, from April 25 to May 9, 2015.
For the second part of this experiment, I used the Gmail addresses to create Facebook profiles for the two users five days after creating the initial email addresses. The Democrat was named Dom Dem and the Republican was named Ruling Rep. To keep external variables constant and to match the new profiles to the Gmail profiles, I listed both as 48-year-old self-employed men. I used Facebook’s like button to indicate political preferences. While signed into Dom Dem’s Facebook page, I “liked” 19 pages supported by a general Democrat, such as the Democratic National Convention, MSNBC, U.S. Senator Elizabeth Warren, and Barack Obama. In addition, I liked 13 non-political pages including Michael Jackson, the Boston Red Sox, and the television show Law & Order. For Ruling Rep, I likewise liked 19 pages supported by a Republican, including the Republican National Committee, the Conservative Post, Speaker John Boehner, and George W. Bush, as well as the same 13 non-political pages that I liked on behalf of Dom Dem. I collected data from Facebook over a 10-day period from April 30 to May 9, 2015. Even though the original goal was to record the advertisements displayed on Facebook for each profile, no ads appeared over this 10-day period. Instead, I recorded the suggested pages listed on the right side of each user’s Facebook profile, where advertisements would normally appear.
For Gmail, I found that while the advertisements started out as identical for both accounts, with ads for Uber and a Fisher Investments retirement account, over time the ads started to diverge. A total of 68 advertisements were delivered on Gmail: 34 to the Democratic profile and 34 to the Republican profile. About 76 percent (52 of 68) of the advertisements displayed by Google appeared at least once for both addresses, with several, including Uber and Nova Scotia Cruises, appearing multiple times in each inbox. Of the remainder, 7 ads were unique to the Republican address and 9 ads were unique to the Democratic address. Figure 1 lists the ads that were unique to each profile.
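The overlap figures above can be rechecked with a short tally. The counts come directly from the text; the script is purely illustrative, and the percentage uses standard rounding.

```python
# Ad counts reported for the Gmail portion of the study (taken from the text above).
total_ads = 68
shared_ads = 52           # ads that appeared at least once in both inboxes
unique_democrat = 9       # ads seen only by the Democratic profile
unique_republican = 7     # ads seen only by the Republican profile

# Sanity check: shared and unique ads should account for every ad delivered.
assert shared_ads + unique_democrat + unique_republican == total_ads

# Share of ads common to both profiles, rounded to the nearest percent.
overlap_pct = round(100 * shared_ads / total_ads)
print(f"Shared ads: {overlap_pct}%")  # → Shared ads: 76%
```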
Figure 1. Lists of ads that were unique to the Democratic profile (left) and the Republican profile (right) using Gmail.
One of the most common advertisement categories for both inboxes was higher education, or more specifically online MA, MS, and MBA programs. While the University of Phoenix Online and Gonzaga Online were present in both inboxes, Northeastern University’s MS in Computer Science and MS in Energy Systems appeared only in the Democrat’s inbox, while USC’s online MA in Library Science and UNC Online’s MBA program appeared only in the Republican’s inbox. Another notable difference was in the products offered to each user. Both inboxes received advertisements for “Mack Weldon: Men’s Best Underwear” and Harry’s Razors, while only the Democratic inbox received advertisements for furniture and fitness products and only the Republican inbox received advertisements for Samsung’s portable portfolio and the “Publisher’s Clearing House 2015 Dream Home Giveaway.” These differences are slight, but they encourage further investigation with a longer study period.
Gmail’s algorithm initially sorted all incoming emails into the same tabs for each user. For both addresses, Gmail placed emails received from Fox News, the RNC, Facebook, and Gmail itself in the “Primary” tab, while Gmail placed emails from MSNBC, the DNC, and Harvard Athletics into the “Promotions” tab. Gmail divided emails from The Nation magazine and Marco Rubio between both tabs for each address. Emails from Hillary Clinton first appeared in the “Primary” tab for both addresses, but over time they moved to the “Promotions” tab for the Republican address while remaining in the “Primary” tab for the Democratic address; this was the only email source whose placement diverged between the two accounts. Only one email went into the Spam folder: an early message from The Nation, a liberal news source, landed in the Republican’s Spam folder, though subsequent emails from The Nation did not.
For Facebook, my results concentrate on the groups suggested by Facebook’s algorithm that appear on the right side of the page, in place of advertisements, on each user’s profile. I found a total of 61 suggestions, with 30 to the Democratic profile and 31 to the Republican profile. Of these, 16 (or 53 percent) suggestions to the Democratic profile were pro-Democrat, while 6 (or 20 percent) were either Republican or non-partisan, and 8 (or 27 percent) were non-political. For the Republican profile, 21 (or 68 percent) of suggestions were pro-Republican, while 4 (or 13 percent) were either Democratic or non-partisan, and 6 (or 19 percent) were non-political. In total, 37 (or 61 percent) suggestions related to the political party of the user, 10 (or 16 percent) related to the politics of a non-partisan or opposing party, and 14 (or 23 percent) were non-political suggestions, such as TV shows or sports teams. Figures 2 and 3 present summaries.
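The per-profile proportions above can likewise be rechecked with a short tally. The counts come from the text; the script is purely illustrative, and percentages use standard rounding, so individual figures may differ by a point from other rounding conventions.

```python
# Facebook suggestion counts reported for each profile (taken from the text above).
suggestions = {
    "Democratic profile": {"pro-party": 16, "opposing or non-partisan": 6, "non-political": 8},
    "Republican profile": {"pro-party": 21, "opposing or non-partisan": 4, "non-political": 6},
}

for profile, counts in suggestions.items():
    total = sum(counts.values())
    # Convert each raw count into a rounded percentage of that profile's total.
    shares = {category: round(100 * n / total) for category, n in counts.items()}
    print(profile, total, shares)
```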
Figure 2. Groups suggested by Facebook’s algorithm. Bars show the kinds of suggestions received by the Democratic profile (blue) and the Republican profile (red).
Figure 3. Proportions of the three types of suggestions made by Facebook’s algorithm to (a) the Democratic profile and (b) the Republican profile.
Another finding suggests a possible connection between the algorithms of Facebook and Gmail. As described in the methods, I created the Facebook pages five days after creating the Gmail accounts and listed both users as self-employed on Facebook. It was only after creating the Facebook pages that I started to collect evidence of small-business and other self-employment-targeted advertisements on both Gmail accounts, such as “Me: How I Start My Business,” “QB Self Employed: The 1099 Tax Calculator,” and “Biz2Credit: Business Loan to help your small businesses grow.”
The goal of this study was to see whether a quick experiment could justify a larger effort, and these findings do encourage a longer study. The biggest limitation of this study is the short time frame for data gathering. Observing the effects longer should provide results that more clearly show the workings of the algorithms. These findings are also limited because I only engaged with the specific services mentioned and then logged out of the accounts. Google uses many aspects of its service, such as YouTube videos viewed and Google searches made, to make advertising decisions and help determine which ads to show on an email account. I did not use any of these services while signed into either account, so Google’s algorithms had less information to determine user preferences than they might have had if the users had fully engaged with Google’s services.
Despite these limitations, the experiment indicates there are concerns to be aware of when using Gmail and Facebook. Their algorithms are able to determine political values and use the resulting inferences to steer information to and from users. Escaping the impact of these algorithms is difficult because there are no strong alternatives.
Melissa Hammer is a junior at Harvard College concentrating in Social Studies with a secondary in Economics and a citation in French. She was an intern at the Office of the Attorney General for the District of Columbia and is currently a Campus Campaign Coordinator for Teach for America.
Hammer M. No More Secrets: Gmail and Facebook can determine your political values. Technology Science. 2015090105. August 31, 2015. https://techscience.org/a/2015090105/
Hammer M. Replication Data for: No More Secrets: Gmail and Facebook can determine your political values. Harvard Dataverse. August 31, 2015. http://dx.doi.org/10.7910/DVN/4KFVQV