WORKSHOP ON PROBLEMS IN THE PUBLIC INTEREST - JAN 2016
Happy New Year. We are pleased to announce the formation of the Technology Science Research Collaboration Network. This is our next step after successfully launching the Technology Science publication last August.
As editors of Technology Science, we are more than 50 researchers and educators at a variety of universities who undertake research with students aimed at better understanding how new and emerging technologies affect public interest issues. Over the next 18 months, the Research Network will engage with civil society and government agencies to articulate problem statements that seem solvable by researchers. We expect to produce scientific facts and document relevant experimental results that will provide new insights for the media, policymakers, and the broader public about the ways technology impacts civil society. Possible topics range from consumer protection to criminal justice, elections, employment, and gender issues.
Thanks to support from the Ford Foundation, the MacArthur Foundation, and other groups, we have resources available to help researchers, educators and students tackle these tough problems.
Welcome to our first workshop!
Our goal for the next two days is to distill discussion into problem areas that are likely to be fruitful. We are bringing civil society organizations together with researchers to brainstorm across technology and civil society issues. The outcome of the workshop will be high-level descriptions of research topics and problems. These will be available to all the members of the Research Network.
Specific discussion tracks are described at the end of this program.
The first day of the workshop explores areas in breadth, and the second day looks at specific issues in depth. All activities will be held in the CGIS South Building, 1730 Cambridge St., Cambridge, Room S020, lower concourse.
Let's get started.
Day 1 (Breadth)
8:00 AM - 9:00 AM: Registration and Breakfast
Have breakfast, meet people, and pick up: (1) your name tag and (2) your code sheet, which will be used throughout the day to place you in groups.
9:00 AM - 9:30 AM: Opening Remarks
Opening remarks from Prof. Latanya Sweeney, Dean Michael Smith, and Dean Frank Doyle of Harvard University
9:30 AM - 10:15 AM: Meet-and-Greet
Now let's hear everyone's voice. We will go around the room quickly.
10:15 AM - 11:15 AM: Public Interest Primer
Introducing the five target discussion tracks to ignite brainstorming.
11:15 AM - 12:30 PM: Breakout #1
Rooms S030, S050, S153, S250 (depending on your code sheet)
Our goal is to identify problems we can solve over the next 12-18 months. If we cannot solve a problem, can we describe experiments that would help characterize the issue, bring public attention to the issue, or serve as a proof of concept? You want to describe problems, approaches, and concerns with those approaches. The Post-it sheets you bring back will be your work products. Go ahead and brainstorm.
12:30 PM - 1:30 PM: Lunch
Eat and enjoy conversation. We have assigned tables so that you are next to people who were mostly not in your breakout session. (This is our last time assigning seats, so bear with us.)
1:30 PM - 2:30 PM: Get back together
As a group, we will examine the sheets made in the breakout sessions and discuss them. We will spend about 15 minutes on the sheets produced in each breakout session. Feel free to comment on, question, or explain any sheet.
2:30 PM - 3:00 PM: Break
3:00 PM - 4:45 PM: Breakout #2
Rooms S030, S050, S153, S250 (You choose)
Go to whichever session you want, even if you were a facilitator in the previous breakout session. When you get to the room associated with the topic of your choice, you will find the Post-it sheets from the previous session and a room of like-minded people. Feel free to work in small or large groups, or even alone, as you prefer. You will now produce your own descriptions of work: work you plan to do, work you want done, or work you envision students doing as assignments or labs. Be as concrete as possible. You can reiterate previously described work, but use your own words and your own rationale. Newly inspired problem statements are great. Draft at least 5 problem statements. They can be inspirational. Submit them online. Submitted statements have no attribution.
4:45 PM - 5:00 PM: Recap of the Day
Behold all you have done today: Post-it sheets describing work, and problem statements online. Look at all the new personal connections you made. Well done. See you tomorrow for a different approach.
5:00 PM - 6:00 PM: Happy Hour
Day 2 (Depth)
8:00 AM - 9:00 AM: Breakfast
9:00 AM - 9:30 AM: Today's Remarks
9:30 AM - 10:00 AM: Evaluating Problem Statements
Go online and vote for the problem statements that interest you.
10:00 AM - 11:30 AM: Public Interest Reactions
Hear how public interest groups react to the most popular and less popular problem statements.
11:30 AM - 12:15 PM: Lunch
No assigned seating this time. Start talking to those who share your interests in specific problem statements.
12:15 PM - 1:00 PM: Presentation: Teaching Technology Science
Hear from Prof. Latanya Sweeney and Prof. James Waldo on how they teach and encourage students to research technology science problems.
1:00 PM - 2:00 PM: Breakout #3
There will be 10 stations set up for the top 10 most popular problem statements. Join the stations that most interest you to connect with like-minded individuals on next steps. Interested in multiple problem statements? No problem. At the 30-minute mark, switch to a new station.
2:00 PM - 4:00 PM: Final session
We will review the final problem statements and discuss the next steps with research funding, future workshops, and publishing research in the journal.
Election Vulnerability
How can someone manipulate technology or data to steal the 2016 election? Technology and the availability of personal data have changed communication norms and sources of trust. What manipulations of social media and technology may significantly influence election outcomes? What other vulnerabilities are introduced by recent changes in absentee voting and mail-in ballots?
For example, in the 11th hour of the Kennedy-Nixon election, the Kennedy campaign leaked false and damaging information about Nixon, too late for the Nixon campaign to respond. Kennedy won the election. Could that happen again using social media or search optimization? Can false statements live long enough to influence voters before the voters learn the truth?
Near election day, phone calls and letters often bombard poor and black communities with false reports of last-minute changes to voting locations, dates, times, or criteria. Could this tactic extend into online communities of trust?
First Amendment Tools
What is the best way to record the police? How can we stop revenge porn? How can we protect the free speech of female gamers against backlash? Technology makes it possible to express free speech in many new ways, which has led to concerns. What are ways to quantify the problems? What are possible solutions?
For example, the ACLU has an app called Mobile Justice that records video from a mobile device and automatically uploads the footage to ACLU servers. All kinds of problems exist. Most people are unaware they have the ability to record the police. What makes a good education campaign? The number of uploaded videos is overwhelming, and most videos are off-target. How can relevant videos be found? Should videos just be uploaded to YouTube?
When a person who shared intimate moments with you betrays your trust by sharing your secrets or photos widely and publicly online, what can you do?
Consumer Protection
How can I control what my bed, devices, and appliances say about me? I purchase them, but they constantly forward information about me to others who share it further still. I don't even have a copy or know all the places my information goes.
Technology is changing the way consumers purchase and use products. Issues include the Internet of Things, data as currency, children's apps, health apps, credit and fraud. What are ways to quantify the problems? What are possible solutions?
On the other hand, recent papers in Technology Science report on uses of technology to document price discrimination and to detect potentially fraudulent ads in real time. Are there other new uses of technology to improve consumer protection?
Open Government Data Sharing
How can we share data freely with privacy protections? The government holds lots of data that, if freed, could lead to improved social and societal benefits. Privacy seems to be the big challenge to all this open government data. What are ways to quantify the problems? What are possible solutions?
For example, the State of Massachusetts recently passed legislation requiring the aggregation of personal opioid abuse data to answer questions about the nature and prevalence of abuse. Bringing personal health, criminal, and social data together from many different sources is now legally possible, but the tools necessary to match records across disparate datasets and to find statistical patterns require sharing the data with researchers beyond state representatives. How can data be shared for research purposes while addressing privacy? What are the privacy risks anyway? What about open records requests, which may demand different views of the same data?
As another example, many local municipalities have open data portals that make lots of government data public. What are the actual risks? Rumors claim that GPS location data may appear in crime data, identifying the locations of witnesses or victims who phoned the police. Is this true? If so, how should the data be shared?
Impact on People of Color and Disadvantaged Communities
All of the issues described above (election vulnerability, First Amendment issues such as the right to record the police, consumer protection, and open government data sharing) hold perils for all Americans, but even more so for people of color and disadvantaged communities. What are ways to quantify the problems specific to these communities? What are possible solutions?
Submit Problem Statements
Online submission of problem statements is closed at this time.