Stanford University - Prototyping a Deliberative Toolkit for Multistakeholder Governance
05 December 2016 - A Pre-event in Guadalajara, Mexico
>> MAX SENGES: Good afternoon. We're not going to start right now because the room has a location that I think a lot of people will take a moment to figure out where it is, so give us another two, three minutes, and then we will start.
All right. Good afternoon. Let's start. I'm sure every minute we wait now we will regret later when the conversation is up and we want to get more points and more perspectives in.
Good afternoon. My name is Max Senges. I'm chairing our session today on Deliberation on Encryption Governance. This is going to be, on the one hand, an introduction to deliberative polling and to the possibilities, the benefits, and challenges that it offers for multistakeholder governance, and in particular Internet governance, as well as a feedback session and really a collaborative workshop session on prototyping a deliberative toolkit, which is a smaller version -- as we will share during our session, I shouldn't even say version, it's inspired by deliberative polling, and it brings together a number of elements, but it's quite different, as we will discuss during the day.
We have -- the group that's organizing this session works at the Center for Democracy, Development, and the Rule of Law at Stanford and at the Center for Deliberative Democracy at Stanford, and Professor Jim Fishkin is here to -- really as our expert on deliberative polling and discussing with us how the benefits can be reaped and how some of the challenges can be overcome.
We hope to make this a really interactive session, so please, by all means, come forward, sit close to the microphones so we can hear you. Just give me a hand signal if you want to come in. We're going to have a number of slides, more to guide the conversation than to have us give you a frontal lecture or knowledge sharing. It's supposed to be as interactive as possible. A short hand signal -- when you do speak, please introduce yourself, if you want, with or without affiliation. One of the questions that we will discuss, in fact, is how to organize such a deliberative democracy experience in a multistakeholder world where your stakeholder perspective is important: whether there is a representation of the stakeholder group, or whether, in an exercise like this, it's more adequate to participate as citizens of the Internet. One last comment before I give you an overview of how we are structuring this workshop: last year we did a pilot of deliberative polling on access for the next billion, and it went really well. We had about 69, I believe, participants online and during the session, and got some really interesting insights that we've published in a report that is available on the Stanford website.
By all means, if you're interested in access and the policy options and debates around that, do come up and we're happy to share more and think about further applications, as this is not the last workshop to understand where you folks -- where the community, the Internet governance participants, see a need and good application for this form of democratic practice and values.
So here's an overview of how we're going to structure the day. We're going to start with a short overview and introduction by Professor Fishkin and myself about what deliberative polling is and how it differs from the deliberative toolkit for multistakeholder governance.
Then we wanted to give you a kind of taste for what this is actually like, so for about an hour we're going to have a walk-through of the briefing materials that you have in front of you on encryption governance and the pros and cons of the options. Jackie will introduce you to those.
Then you'll break up into small groups and get an experience of how the deliberation is happening, moderated by Jim Fishkin and Kathleen Giles for about half an hour. Then we come back together and discuss what you heard and bring more insights from a plenum discussion and questions that you have come up with in those smaller groups.
Then we'll just have a look at what a survey looks like that allows you to understand how opinions shift before and after such a deliberation, and then we would like to gather some feedback from you, both about the process overall as well as on the briefing materials in particular. Then we'll take a short break, and then we have another hour of deliberation and collaboration about how these elements of deliberative democracy can be useful and can be brought to the multistakeholder world. We'll give you a better rundown, a more detailed rundown of what that means at 3:45.
With that, I pass it over -- oh, no. The goals for the day are clearly we want you to have a good understanding of the concept of deliberative democracy and the tools that come with it and how they can be applied to multistakeholder governance and how deliberative polling can be applied here.
We want you to actually have a better understanding of what encryption governance means and what the different options in the field are, how that relates to multistakeholder governance, and then, of course, we want you to have a really good and high-quality open deliberation and see how different trade-offs and different perspectives come together when you're considering the tradeoffs and options in the field.
And last but not least, of course, we want you to think about how your thinking evolves, and to perceive how your thinking and knowledge evolve in such an exercise of critical thinking and civic engagement.
With that, except if there are any questions or suggestions to alter the agenda, which are always welcome, we're set in an open workshop format -- I want to pass it to Jim Fishkin to give a quick overview of the deliberative polling process as such.
>> JIM FISHKIN: This doesn't move, so I will move. Great.
Well, great to be back here. So some of you are familiar with what we're talking about; some not, so let me just say the notion of deliberative polling is to take a random sample of a population and engage it: give it a survey, engage it in serious, balanced discussion, the best conditions we can create, with vetted materials, access to competing experts, moderated small group discussions for several hours or even several days. Sometimes it's done online with video -- small group video, Google Hangouts or some such. Often it's done face-to-face, and then, at the end of the process, people take another questionnaire and we see the changes of opinion. This has been mostly conducted with the mass public. It's now been conducted, with various collaborators, in 26 countries and maybe 80 projects around the world -- in developing countries, in developed countries, all kinds of countries on all six inhabited continents.
Last year we got interested in doing this in the context of multistakeholder governance for the Internet, and people said, well, you're dealing with an expert community, so you won't get any change of opinion, people won't feel free to say what they really think because they'll feel that they're sort of ambassadors from their institutions, and there won't be any knowledge gain because they know too much. So we did this project here at the IGF last year in Joao Pessoa, and even though the turnout was relatively limited, nevertheless, compared to the people who took the survey and didn't come, the sample was in its attitudes and demographics highly representative, and there was considerable knowledge gain and considerable opinion change.
Now -- and so we viewed it by our criteria as a success, and there are a lot more things to say about -- looking at the results quantitatively -- why it was a success.
Now, what does this have to do with multistakeholder governance? Well, the whole idea of multistakeholder governance, as the U.N. and other related documents have said, is that it should be in some way a democratic process, it should in some way be deliberative, and it should in some way be inclusive, and how do you -- and it should in some way generate some conclusions, whether those are recommendations or reference points or whatever, there should be some conclusions, and we think -- and there's been calls for methods for more than just bland statements but some organized process.
We think that the deliberative poll, as a model, has something to recommend it in this context, but because it should be in some way democratic, it's conceptually difficult: you've got such different entities. You've got countries, you've got companies, you've got people from Civil Society, people from academia, different sectors, people from all around the world. How are we going to have democracy among such different groups? So the approach that we've been taking with deliberative polling is to say let's see if we can treat the participants at forums such as this one and other Internet governance venues -- let's see if we can treat them as netizens, take a stratified random sample of them as individuals. If we get a good sample, it's representative of the population, and we treat them under conditions of equality, where everybody's voice counts. We have moderated small-group discussion in a civil way where people count equally, and they share information, they clarify the key questions that they want answered, they raise those questions -- usually in a deliberative poll it's to panels of competing experts; in this workshop, we've got so many experts, we're going to raise the questions for the group to answer -- but they get more information, and then they register their opinions in confidential questionnaires because we want their sincere judgments. They're not acting as ambassadors, they're acting as netizens. People said they would never act as netizens, but we did this last year. At least on the topic of increasing access, there was considerable opinion change, and it was motivated for coherent reasons.
The whole process, even though it wasn't as large as we would like, generated a number of statistically significant changes, and it really was credible by our analysis. There's a report on the Center for Deliberative Democracy website, and a version of that will be submitted for publication.
Now, that was access. Today we're prototyping such a project on encryption, for which there would be a full-scale deliberative poll, and we're looking for the kinds of partners, contexts, and venues where this topic of encryption could really have an impact, both within the Internet governance community and potentially with the mass public, perhaps in various countries. We think it's a very timely, important topic, but to do that, we want your feedback on the materials, on the way we frame the topic. This is a draft in progress. We think it could be improved. So that's one of the purposes.
Another purpose is to distinguish deliberative polling, which is meant to be representative of a broader population -- that's why we do stratified random sampling -- from a somewhat less ambitious but nevertheless still extremely useful use of this kind of process: in various contexts we've had what we call a deliberative toolkit, usually in schools or for various kinds of civic groups, where they can, for the group in question, consider the issues in the same way as if they were the population in a deliberative poll. They can discuss the issues with balanced, vetted briefing materials, small group discussions, and questions to competing experts, and register their opinions before and after. And what does that do? It helps a group clarify its considered judgments about an issue. It's all very simple and commonsensical if you think about it, but it's a route to getting something more than bland consensus statements on a contested issue. Instead, when the people really think about the tradeoffs, you get the actual distribution of opinion, and people offer their answers without the social pressure of just going along with the group and arriving at a rough consensus. We see what the real distribution of opinion is, and we can do various analyses about why the opinions shift the way they do.
So we think there will be contexts in which the sort of toolkit version of this is useful, and so we're going to sort of pilot the materials with both of those things in mind today.
>> MAX SENGES: Perfect. And, in fact, the hour from 4:00 to 5:00 is really to think about what adaptations should be made to that toolkit, which has been used in other contexts, for a multistakeholder environment like this. We don't think we have all the answers, and we really want to hear what you think. Especially the balanced briefing materials -- just the production of those seems valuable -- and the deliberation experience, which is a bit more organized but a very egalitarian deliberation rather than the panel discussions that we are used to at the IGF: how can we balance that more, make it a bit more democratic?
But with that, we want to start the actual deliberative experience with an overview of the briefing materials. After Jackie Kerr is done, we'll move to smaller group deliberations, probably divided into the left and right sides of the table.
>> JIM FISHKIN: Let me say this. Normally, if there were a deliberative process, we would have sent you the materials beforehand, but Jackie, who has helped us develop the materials, among others, has very kindly agreed to give you a quick walk through the materials because you have not had a chance to read them in advance. You should say who Jackie is.
>> MAX SENGES: Absolutely. Jackie is a post-doc scholar at Stanford and various other institutions, but I think you'd better say a couple of words about yourself. We have vetted the materials with a group of about 15 to 20 experts and shared them broadly through various stakeholder groups, and with that, over to you, Jackie.
>> JACKIE KERR: Okay. Hi, there. So it's an honor to be here, and I will try to do justice to these briefing materials that you've been given. I'll walk you through them and invite you to look more closely at the text and the charts. They're in the red folders. There's a document that's titled "DP Encryption Briefing Material," and that's the one I'll be walking you through. By way of a few words about myself, I'm a post-doctoral research fellow at the Center for Global Security Research at Lawrence Livermore National Lab and a research affiliate at the Center for International Security and Cooperation at Stanford, and I've been lucky to be involved in a couple of stages of this process in development, and it's just a very interesting project. So -- how to get this a little closer.
As everyone here is aware, the issue of encryption and debates over encryption policy have become extremely significant in the public sphere in the last several years. Encryption is widely used as a way to secure data and communications -- to secure these data against criminals, spies, governments -- but there's significant concern also, especially on the part of governments and law enforcement, over the effect that encryption, especially strong encryption on devices or end-to-end encryption of communications, plays in obstructing the efforts of law enforcement, hindering law enforcement's ability to access and read encrypted data, and so there's been a significant debate over what the correct policy position is on this subject.
Technologists and civil liberties advocates say weakening encryption or allowing a back door would lower data security. On the other hand, law enforcement and governments often take the position that there is a way to create some sort of legal mechanism or back door or a key that the government also has access to, one that would keep risk at an acceptable level while allowing very limited access to encrypted data for law enforcement purposes.
This debate came to the fore in the U.S. in the last year with the debate following the San Bernardino massacre and Apple's pushback against the FBI's efforts to get assistance in decrypting one of the iPhones involved in the attack. So if you look through this packet of information, we walk through several of the major concerns and issues and then lay out several policy options that we invite you to think about, and also to think about other policy options that we might not have included in the list.
So if you turn to Page 2, the second paragraph from the bottom -- one second -- we discuss the focus of this deliberative exercise, which is on cases where government actors have sought legal authorization to access data but cannot access it due to encryption, and what this exercise aims to do is to assess the optimal solution to these sorts of cases from a legal and technical perspective.
If you turn to Page 3, there's a discussion here of the issues concerning lawful access regimes, whereby government and law enforcement would have access in one form or another to this sort of encrypted data, and the discussion is how to balance legal processes that protect individuals' civil liberties but at the same time allow for access to decrypted, plain text versions of such data.
There are concerns that there's already a significant degree of assistance from corporations to governments on more of an ad hoc basis, answering requests for data, and the question is whether there's some way to make this into a more formal legal regime that avoids the kinds of drawn-out legal debates and processes like those that happened following San Bernardino, or whether that actually is not desirable.
One of the major issues that comes up in this is the concern over terrorism. If you look at Page 4, we have a more developed discussion of this issue. There's concern about the use of TrueCrypt and Telegram and other technologies by terrorist networks such as ISIS, and there's also been the discussion that as strong encryption technologies become more ubiquitous, this is a fundamental change in the access to data available to law enforcement.
Now, this brings us, on Page 5, to the going dark versus the golden age of surveillance debate. The one side of this issue, usually posed by law enforcement, is the concern that access to the types of data that have traditionally been used by law enforcement is becoming less and less possible. This data is becoming less and less available -- so-called going dark -- because of encryption technologies.
The counter-argument posed by many civil libertarians, for example, is that, in fact, this is far from going dark; this is the golden age of surveillance, in which there are new forms of data available that never were available before. Encrypted data is just the tip of the iceberg: there's metadata, there's data on personal communications, there are many forms of data, and sometimes data that is encrypted is also stored in an unencrypted version in the Cloud, for example, and there are ways of accessing that data.
Overall, people's lives and contacts and communications are much more traceable than they've ever been, and so this is a golden age of surveillance, far from going dark. This debate continues, and there are arguments on both sides.
So that brings us to the international dimension of this puzzle. It's very easy, and I think it's been something that's happened a bit too much in the U.S. context, to think about this in a bubble -- just thinking about what's going on and what's the ideal policy solution in the United States -- but this does not happen in a vacuum, for multiple reasons.
First, encryption technologies are not just produced in the United States. There are 546 encryption solutions available globally. That number might have changed since we researched this. There are hundreds of technologies, and furthermore hundreds produced outside the U.S. that are quality encryption solutions, and so if you have one country that takes a particular legal approach to this issue that differs from others, this could affect the global economics of import and export of encryption technologies. It could also affect what technologies individuals actually choose to utilize, even if those technologies are not legal within their country.
And it's necessary to consider how that affects rights activists in other contexts, including repressive political regimes, how that affects criminal networks, terrorist networks, and so on.
Other dimensions of the international debate include a normative dimension over how these policy choices in one country might reflect on and influence policy decisions in other countries. There's ongoing debate right now in the U.S., China, India, the UK, France, and this is on Page 8, by the way, Russia and Germany, over these very policy questions. New laws requiring companies to assist in decrypting or hand over encryption keys to governments have been passed in the last year and a half in Russia and China, for example.
Meanwhile, the U.N. has issued several reports, including a report by the Human Rights Council declaring encryption as necessary for the exercise of the right to freedom of opinion and expression. This is on Page 9, which brings us to the discussion we're going to have today.
There are a number of possible policy options. Now, for your reference, on Pages 10 and 11 there's a discussion of key terms, so if you want to look for definitions of things like lawful access, end-to-end encryption, or metadata, these are all there and described.
If you turn to Page 12 now, this is where we talk about and lay out a handful of different policy options, and also, for your reference, on Pages 14 through the end of the document there's a chart, which will help you look at these options in more detail, giving pros and cons of each policy option. What we've tried to do here is incorporate options which deal with both the domestic policy debates in any given country and the international dimensions and normative dimensions.
So to begin with, I will not start with Option 1; we'll come back to that at the very end. Starting with Option 2, which is under bullet point B, called Approaches: Accessing Encrypted Data -- Break, Circumvent, and Compel, we've laid out several options that deal with policies concerning access to encrypted data.
The first one, Option 2, says to mandate exceptional access: create technical means for law enforcement to decrypt and access data when legally authorized. This is definitely one of the strong positions that has been held and supported by law enforcement, and there are several possible ways to further that objective. For example -- 2.1 here -- you can impose costs for noncompliance: you can make companies pay fines or be liable if they block this exceptional access.
Policy Option 3, to reject mandatory exceptional access, we made a distinct option. It's clearly the opposite of Option 2, but it could be made into a legal framework in its own right, and we've laid out in the chart a number of pros and cons that have been argued on both sides of this position.
Option 4 deals directly with the issue of whether or not to compel companies to assist in decrypting data -- to either provide keys or back doors or direct assistance to the government in decrypting data -- and this could be done in such a way that it affected design decisions, so that companies were legally liable to decrypt data, and if they didn't retain that ability themselves, then they would face problems legally and they wouldn't have the ability to push back legally that this was a violation of the design of their product.
Option 5 is to authorize government hacking, which is clearly another, different solution to this issue, and it deals with, among other things, what we discuss in bullet 5.1, markets for vulnerabilities and governments storing vulnerabilities to allow them to hack into encryption protocols or encrypted devices.
And moving on to Section C, regulating the availability of encryption technology -- deny, restrict, or promote -- we've laid out a couple of options, again. One would be to restrict ubiquitous, strong encryption for public use; another would be to support ubiquitous, strong encryption for public use. Again, I refer you to the chart, which goes into a great deal more detail than I have time for right now, for more on the pro and contra arguments for each of these as you debate and discuss these policies.
Obviously, there's a relationship between the legal frameworks discussed in the first set of points and this technological decision, and the two will influence each other as policies.
Moving to Part D, Approaches: Consult and deliberate to find solutions to the encryption problem, this deals with the question of what process should be used to come to an answer. I think one frustration for many in different communities has been the siloedness and the disjunctures in the settings where this debate has been happening. You have people in law schools deliberating legal issues and technologists deliberating security issues, and to an extent the two don't necessarily happen in the same place. There's a lot of secret discussion and so on, and so one option would be to have a national public expert commission appointed to deliberate on and identify possible solutions.
Clearly, an interesting topic in this particular setting where we're talking about deliberative democracy as well.
Moving to Point E, dealing with the international dimension, we could establish a set of voluntary global norms to protect encryption against restrictions and weakening due to national security and economic interests, so this would be a set of best practices that then states could attempt to conform to.
Now, I'd like to come back to that meta question which I skipped at the beginning of the policy discussion. We've been debating a bit amongst ourselves today what the right framework for this even is. As it's written on the sheet that you have, the meta debate is private and ubiquitous, strong encryption vs. exceptional access: which is more important to security, private and ubiquitous strong encryption, or exceptional access following due process? I would suggest in debating this that you maybe think about what you think of that framing of the question, and possibly consider it without the "for security" and instead think of which is more important, period, for whichever values you think are most important.
This is a question we could debate at length. Security can mean many things. It can mean personal security, with protection of civil liberties being part of that, but there's some concern that this framing primes the debate to be focused more on national security, so the meta debate is over which of these options is better, period.
And on this note, I want to turn it over to you all for the discussion. The main thing here is not to try to take every single one of these policy options and debate and come to a solution on each of them, but more to think about which you think are most important and discuss those in depth, and perhaps there might also be other policy options that we haven't thought of or put here that you would like to consider. So on that note, I'll open it up to the small group discussion.
>> MAX SENGES: Thank you, Jackie. You did the impossible. We gave you 15 minutes, and I think you actually stuck to time. Thanks a lot. Just a couple of very small comments before we take off for the conversation and deliberations. Importantly, there is innately a very strong technology component in this topic, and we do not expect anybody to be a technology expert. In fact, if technology questions come up, please signal and we will try to get you a good answer, but this is really more on the policy dimension, and if somebody has a technology solution or wants to champion something that is not in here and we haven't considered, let's park that for the moment and try to find an answer following our conversations.
Also, of course, we want your feedback on the material, so do give us any annotations and comments that you have -- as Jackie noted, we just made some refinements earlier today that we think made sense -- and it will get better and improve over time. Of course, if you're interested in using these materials in any context, we do think their application goes far beyond a deliberative poll at this location. This is really an attempt to map a debate that is very fast-moving and difficult to understand, and if we actually get together and have a Wikipedia-style, neutral-point-of-view overview of what the debate looks like, that might be very handy to inform decision-makers around the world and to have good debates.
So with that, let's split off into small groups.
>> JIM FISHKIN: Yeah. Let's -- why don't we -- why don't we go around the room and count off and then we'll use the number that you get to divide you, so would you like to go first and just say you're number 1, you're number 2? Just remember your number. Just speak loudly. Go around. Hello.
>> MAX SENGES: No, 1, 2, 1, 2.
>> JIM FISHKIN: They know which are odd and which are even. We could just go around, 1, 2, 3, 4, 5, 6 -- the participants, people who are willing to participate. So just go around the room and count off from 1 to whatever, maybe 40. Go ahead. 5, 6, 7, 8. All right. You want to do 1, 2, 1, 2? You can do that. Whatever's quick. We don't want to use --
>> (Off microphone).
>> JIM FISHKIN: That's fine. Okay. 1, 2, 1, 2, okay. Go around.
>> AUDIENCE MEMBER: 1.
>> AUDIENCE MEMBER: 2.
>> AUDIENCE MEMBER: 2.
>> JIM FISHKIN: Whatever. Just --
>> AUDIENCE MEMBER: 1.
>> AUDIENCE MEMBER: 2.
>> AUDIENCE MEMBER: 1.
>> AUDIENCE MEMBER: 2.
>> AUDIENCE MEMBER: 2.
>> AUDIENCE MEMBER: 1.
>> AUDIENCE MEMBER: 1.
>> JIM FISHKIN: Are we all finished? You're what? And there's some people here. 1, 2, 1, 2, 1, 2. Just please -- keep going. Who else is --
>> AUDIENCE MEMBER: (Off microphone)
>> JIM FISHKIN: We've done it. Okay. I just didn't want everybody to sit with their friends, so if we do 1, 2. So all the 1s come over here, all the 2s go over there, and we're going to -- we're going to circle the chairs, and then we'll come back to these --
>> MAX SENGES: 1s over here. Jim is going to moderate that conversation. Kathleen in the back, who's raising her arm right now, is going to moderate the Number 2 deliberations.
>> JIM FISHKIN: After that, we'll reconvene the chairs and come back to where you were.
>> MAX SENGES: We come back at 3:20.
>> JIM FISHKIN: So you can hear me. No. Max is saying no microphone. Okay.
(Small group deliberations taking place until 1520)
>> MAX SENGES: Just a quick housekeeping announcement. We have five more minutes. Maybe you have some questions you'd like to ask in the plenum round.
(Small group deliberations continue)
All right. I know now the conversations just really got deep, but I have to ask you to come back to the Plenum and we can continue the conversation there, maybe finish the last -- yeah.
(Small group deliberations continue)
Hello, hello. Okay. Okay. So let's get started with the plenum conversation. In a deliberative poll as Professor Fishkin developed it, there would be experts that we've briefed and chosen for various perspectives and their expertise in the field.
Now, in an IGF context, the expertise is spread across the whole community and different viewpoints come from that, so we will have one of the group participants that you've selected ask the question, and then we will see who from the group wants to offer a perspective and an answer to the particular question, and have a short debate on each of the questions. Rebecca, may I ask you to ask the first question?
>> REBECCA MacKINNON: Okay. Sure. Am I okay? All right. So one of the questions we came up with is: what do government commitments to a free, open, and human rights compatible Internet, among those governments who have made such commitments, demand in relation to encryption, and in what institutions should deliberations about this take place?
>> MAX SENGES: So just give me a shorthand signal if you feel you can offer a perspective, a partial answer, one answer. I'm sure there's more than one to this very difficult question.
>> REBECCA MacKINNON: Or if it's not clear what I meant, feel free to ask me more about what I meant.
>> MAX SENGES: Yes, please.
>> PENG HWA ANG: Yeah, I think in this area, okay, human rights, as you know, is not absolute; it is contextual in the sense of culture and all that, but also, from observation, depending on time. So in the UK the act was passed at a time of panic, and that, unfortunately, got rolled over into the Commonwealth. All Commonwealth countries have such an act. So I feel at this point in time we're having a sort of panic with respect to terrorism, and so in this area, if you ask that question, the answer would depend on the time. If you recall, after the September 11 attacks, there was a rollback of privacy, and now that things have softened a bit, there's some pushback against that rollback, so when you ask that question, I guess it depends on the context. If you feel you are under threat, you will definitely have, in a sense, more rolling back of privacy concerns, and if you feel safer, you allow more room for human rights concerns. I guess here the human rights concern we know is respect for privacy, so my big point is about contextual culture as well as time.
>> MAX SENGES: Thank you, and may I invite you to say your name and, you know, what stakeholder group you want to speak for or what regional perspective you want to bring in, as that might be interesting for the conversation as well. Do we have anybody else who wants to tackle this very difficult question? Oh, yes. Thank you. Please choose to introduce yourself or not.
>> AUDIENCE MEMBER: I think every perspective should be considered for making the Internet human rights compatible. For that reason, I think the universal answer is to adopt international instruments which can preserve human rights. The issue of encryption is very crucial for everyone. The Russian experience shows that, so we have to ensure human rights work on the Internet. Thank you very much.
>> MAX SENGES: Thank you. I'll take another look, okay, if anybody wants to comment or build on what has been said. Otherwise, let's move to the next question, maybe from Group Number 2. Let me see. Do you want to ask the first one?
>> NALINI ELKINS: Sure. Yes. Nalini Elkins from the United States. One of the questions that we had was no matter what we do and what we decide on, how can it be enforced? I mean, who's going to regulate what we do? I mean, is there really a protocol police?
So I guess that was kind of a question we had about whatever -- you know, whatever happens.
>> MAX SENGES: Thank you. Who governs the Internet? I think that's probably one of the questions that motivated the forum in the first place, but observations, how can the space be policed and any -- you know, any of the measures be enforced?
>> FARZANEH BADIEI: Thanks. Farzaneh Badiei. I have to say my group? Yes. Civil Society. I think that question is kind of in line with our first question, which is about what institutions should talk about this, but also: do we want to go towards a standard-setting institution, similar to what the IETF does, and then have some kind of norm there for the technologists, or do we want -- by enforcement, do you mean more like hard law? I think a standard-setting institution would be a good idea, but I have no idea about the hard law, because different countries have different laws, and it would be difficult to enforce that.
>> MAX SENGES: Thank you. I see a direct reaction, and I'm going to move over to you in a moment.
>> NALINI ELKINS: Yeah. Thanks. Yeah, I actually work at the IETF, and I do standards, and that's why I say -- that's why I say, like, that's -- we -- that comes up all the time, and we have standards. They're adopted, adopted in certain OSs, and then everybody says, you need to make sure this and that standard is implemented, and we're like, we're not the protocol police.
>> MAX SENGES: Thank you. Very multifaceted.
>> EDMON CHUNG: Edmon Chung here from DotAsia here, but I guess I'm bringing this up as an individual, so Civil Society. I think the two questions may be related. I apologize again, I'm coming in late, and I'm probably going to ask a stupid question here, but the question I have is, you know, if the governments are trying to say that, you know, we need a back door, the problem is those who won't -- those who are going to use technology that doesn't have a back door are probably going to be the criminals, so what is the gain anyway in trying to do this kind of thing because you can't enforce the criminals not to use the technology that has no back door, so I -- I'm just scratching my head, you know, what -- what gain does it give law enforcement in, you know, trying to enforce that there would be back doors in certain devices or technology?
>> MAX SENGES: That's bringing up a completely new question: how can you limit, even if you wanted to mandate or forbid, the use of solutions without a back door? Interesting one. Direct reaction, please.
>> WALID AL-SAQAF: I'm Walid. I come from Yemen in the Middle East. Often I ask the question: can we limit terrorists' use of technology? Because, you know, the Middle East is often labeled with that notion, and I tell them that what you're trying to do is driving the problem underground. You're causing criminals to become more adaptive, more creative, more revolutionary in their ways of working, and you're making it much more difficult. In fact, you're trying to deal with the symptoms rather than the roots of the problems, and so if governments keep on thinking in this way, that applies, actually, beyond simply the Internet.
And then they're actually causing more of the devastation that's happening, and so when we think of encryption -- I mean, by trying to create back doors to track criminal activity, you're actually making it worse, and, in fact, you're somewhat contributing to the backlash.
>> MAX SENGES: Okay. Thank you. In the interest of time, we've got to move on to keep the train rolling. James, may I ask you to ask the second question from Group Number 1.
>> JAMES EDWARDS: It's the classic policy question, and we've just heard good questions from Edmon and Walid, so it's going to be: what's the disease, or what's the cure? How important is lawful circumvention of local security to, one, public safety and, two, the administration of justice? What difference does it make?
>> MAX SENGES: How could you come up with these very hard questions? Thank you. Does anybody feel he or she can contribute to the beginning of an answer to this big question?
>> AUDIENCE MEMBER: I have one comment. I think even if we don't want to allow governments to be able to get in through the back door, they're going to do it anyway. I think no matter what, if we say no, it's still going to happen, but by making it lawful, then I believe any evidence gained would be admissible in court; otherwise it's gained through unlawful means, so I think it's better to do it in a lawful way; however, I don't believe we should allow back-door access.
>> MAX SENGES: Any more observations maybe from different parts of the planet or from different stakeholder groups, different perspectives? Do we have somebody who would support the other side of the argument?
>> NALINI ELKINS: I hate to say I support back-door access, but I -- there's a point that hasn't been brought up, really, which we discussed in our group, is I work with a lot of financial institutions and large corporations, and really, they decrypt for lawful reasons, for fraud detection, for -- you know, I mean, for any number of things, and literally, you're going to bring corporations to their knees, which I'm sure you don't want, by getting rid of all this kind of access. This is -- it's a huge problem that we've had, and so, you know, it's a very complex question, I'll just say that.
>> MAX SENGES: Thank you. Do we have any other perspectives? Peng Hwa.
>> PENG HWA ANG: I'm Peng Hwa from Singapore. I have a concrete example of what it means if you don't have the kind of lawful circumvention that you talked about. I have this wonderful Chinese tracker. It's a fitness tracker, of course, with heartbeat measurement. It tells me after an hour that I've been sitting for too long, and, with delivery, right now it's on sale for $25 U.S. It pairs with my Chinese phone; when I pick up my phone, my tracker unlocks my phone. What a brilliant idea. But I have a friend who will not use the Chinese phone because it has a record in the past of sending data back to its head office, and my friends in China tell me I should be using an Apple phone because Apple doesn't allow people to access the data.
>> MAX SENGES: Thank you very much for that perspective. Let's move on to the second question of Group Number 2.
>> ANA: Good afternoon. I'm Ana from Brazil. I came here with Youth@IGF, so the question that I brought to the small group is whether there is really a tendency of communications going dark, since a lot of companies still rely on data as their business model, really as a commodity. Is there really an interest in completely blacking out communications?
>> MAX SENGES: So the question is whether the going dark argument is even in the interest of the private sector and, hence, would ever come to bear. Anybody want to comment and elaborate a little bit on their perspective on going dark?
>> AUDIENCE MEMBER: (Off microphone)
>> MAX SENGES: So the going dark argument is that if we used widespread, strong encryption, then the police and the enforcement agencies couldn't do their job because they couldn't access any data, and that is, of course, you know, fueled by companies like WhatsApp and similar who have started to encrypt all communications end-to-end, which makes them much more difficult to access.
I mean, I can say, if I take my Stanford hat off and put my Google hat on, that, you know, we are obviously interested in providing value to the user, and if the user wants the communication to be encrypted, then we will do our best to make that happen, and there are technological means, like homomorphic encryption, where you can actually do certain queries on data and provide services on encrypted data that allow you, you know, not to identify the person that is being served. You can, of course, go up in granularity when you address certain value propositions or, you know, your products in that space, so, again, unfortunately, I don't think it's a black-or-white environment, but it's certainly true that data is information and knowledge and that there's a lot of good that can be taken out of it, so I don't think switching it off and going dark would be the right thing to do, nor is it practically possible, really.
Do I provoke any reactions with that statement?
>> WALID AL-SAQAF: It's Walid again. I think having encryption peer-to-peer will be different than peer-to-central-service, such as Google, et cetera, and so the idea that companies would embrace end-to-end encryption in all traffic and communications means that they would have to change their business model altogether, because they would not be able to understand what data there is on the servers and what people are communicating, so ads will change in terms of how they are streamlined. Of course, that's if it's by default, but I think the more likely situation is that companies will have an encryption button that you have to dig in and find on your own and then turn on, and by default, it will not be encrypted. That's just my understanding of how it might be.
>> MAX SENGES: Okay. Thank you. I think we have one last contribution, then we'll move on.
>> JAMES EDWARDS: So, James. On the going dark argument: in practice, sophisticated privacy technology seems to be not that widely adopted. There are lots of users who make themselves very public and accessible, and even those who do deliberately use privacy technologies interface with the physical world in some way that is probably discoverable by lawful authorities, so it just doesn't seem to be the concern, from the point of view of administering justice or protecting public safety, that it might be argued to be.
>> JACKIE KERR: I'd also bring back here the golden age of surveillance argument, the counterargument that's often brought up in reaction to this. There's a period of time, the last ten years, say, where a lot of digital information and communication has been available for law enforcement purposes, and that's the easiest it's ever been to access people's private communications, so maybe that's being pushed back a little bit now, but think about before that, before the Internet, the ways in which law enforcement had to go through significant legwork to get that same level of access to people's communications: get embedded in networks, get wiretaps on specific phones in specific places, et cetera. It was much more difficult to get access to that kind of communication data, so to say that it's going dark is looking at a very limited, short-term historical horizon, and in fact, there's more data out there and available than there ever was before. That's the counterargument that is often brought up: the golden age of surveillance.
>> MAX SENGES: Okay. So let's go to the next item on our agenda, and that is Jim giving you a bit of background on the survey, which we have not administered to you because of the shortness of today's format; this is just a taste of the deliberation experience. Normally a survey would be handed out before the conversations and before you've read the balanced briefing material, and then again after, so that even in a small group like this, the purpose is to notice for yourself how your perspectives have changed and how the perspectives of the group have changed.
In a full deliberative poll, you'd have a random stratified sampling, and you could see how the population would shift its perspective and more importantly, for the second part, for the second survey, really have a reference point of what a community would say if they had thought through and deliberated about the tradeoffs of different options, so Jim, over to you.
>> JIM FISHKIN: Just in the interest of -- well, to explain: you have in the back a survey. We're not asking you to take the survey, but let me say a couple things about it.
When we do these projects, we find that there's substantial, statistically significant opinion change. About 70% of the items we ask about change significantly.
If you ask an individual person, did you change your mind, we find they never say they changed their mind, but if you actually look at what they said before and what they said after, there's significant changes, so if you look at this and say, well, I wouldn't change my mind, that's fine, but, in fact, many of you would on many of the items.
Now, what use does that have? Well, sometimes the changes are quite big, and we also -- we have a number of different kinds of questions. We have the policy proposals, we have information questions to see if people became more informed, and as I said, last year on the -- in the pilot we did, with this kind of community, they did become quite significantly more informed, so you guys know a lot, but you don't know everything.
And how do we measure informed? Well, we have questions that relate to the issue, and not true/false questions, because those are 50/50, you can guess them, but multiple-choice questions, which is usually the way it's done, and these are just snippets of information -- information's usually correlated with other information, so if you answer some of those more correctly at the end, it's a way of measuring that you probably gained a lot of other knowledge too.
And then there are questions that we feel might be used to explain the changes -- tradeoffs, values, other things -- so that we understand, and we can do regressions with those. That's what we would do with a full-scale deliberative poll, where you would be asked the questions on initial contact -- not here at the meeting but when we first contact you.
Now, in this toolkit, if this were done just with a toolkit and a smaller group, we still think it's interesting to consider whether it's useful. The reason is that if we just got a group together to discuss -- first of all, if it was just a group together discussing in a big meeting, you wouldn't have any product from the result.
Then, if you had a show of hands or tried to get some consensus statement, usually it's quite bland, a bland consensus statement, but if you fill out confidential questionnaires like this, we can really -- without the social pressure of seeing how everybody else is going, so there aren't the bandwagon effects, you can see what you really think, and one thing we do know is that after a discussion like this -- and this was a very brief discussion, but let's say you had several hours of this or more. You really do have opinions. You have -- and your opinions have become more considered and you've considered arguments on either side. We think that that would be quite useful for any group trying to get its collective, informed judgment, so I only point out the questionnaire as an exhibit to consider, and we'll refine it -- we're going to be refining everything in light of what we learn in this discussion. Did you want to comment?
>> (Off microphone)
>> JIM FISHKIN: You need a microphone.
>> (Off microphone)
>> JIM FISHKIN: Okay. Let it boom.
>> AUDIENCE MEMBER: One of the things I --
>> JIM FISHKIN: Give him the microphone.
>> AUDIENCE MEMBER: I'm sorry. I was involved in this last year. By the way, I really like this, that's why I sort of came back at it, but one of the -- if I had any sort of overarching, it's what we want and what this platform can deliver --
>> JIM FISHKIN: Right.
>> AUDIENCE MEMBER: -- are really two different things in many, many, many instances. You know, you really do need people that live in this soup all the time to understand that you don't have the right to be forgotten because you can't be forgotten. The architecture doesn't --
>> JIM FISHKIN: Right.
>> AUDIENCE MEMBER: -- permit for that, okay, and that -- you know, and God knows if you just follow the news, governments can't be trusted.
>> JIM FISHKIN: Well --
>> AUDIENCE MEMBER: You've got problems with the private sector too, but quite frankly, all they want to do is sell you something, so it's not like they're going to come knocking at your door or something, but -- so just more framing it in the environment of what the platform will actually permit, right? And then, you know, finally, I had this discussion earlier very briefly, but I think it bears on things down the road here. This is really arms warfare. This is going to be very much like the nuclear weapons age, and only a handful of companies and countries will have the power -- I mean power, electrical power, hundreds and hundreds of megawatts -- to drive the technology that will allow them to have the keys to the kingdom anyway, so everybody else is going to be road kill at the final end game, is what I'm trying to say.
So we're only going to have a few kingmakers the way this is all going to shake out anyway, so --
>> JIM FISHKIN: So are you saying this is worth deliberating about or not? If we're all going to be road kill inevitably --
>> AUDIENCE MEMBER: No, no, I do think -- quite frankly, I do think we should continue it, because I do think, at some point -- I'll say this, just because of what just happened in America and what happened to the Catholic church and what's happened throughout our history -- the poor little guy with a pitchfork is going to say, no, I'm tired of the church service in Latin, or I'm tired of the Papal dispensation, or I'm tired of being treated like some -- and then you will have a -- but long before that, you really will have just a handful of players that will have all the keys. That's the technology.
>> MAX SENGES: So let me pick up on your point -- and I want to let you come in in a second -- about, you know, what we want this instrument to deliver and what it can deliver. I think that's exactly what we're going to spend the next part of the session on.
First a quick round of feedback from you, what you liked, what you didn't like about the materials, but before we come to that, I think Rebecca wants a word.
>> REBECCA MacKINNON: Yeah, no, I just had a follow-on based on what you just said, that, you know, it's going to be a handful of powerful government and corporate players that are really going to drive this at the end of the day or have the power to shape it, and that kind of relates to the initial question I posed, which is: whom should we be engaging with as sort of the priority? You know, which powers, private and government, need to sort of be the target of engagement? Do you have to engage with everybody in the United Nations, or do you need to just make sure that you engage with those companies and those governments that have the bandwidth and the power, in all senses of power, that are going to drive this thing, and try to get some common agreements and standards around the kind of world we want to have and how approaches to encryption fit into that?
>> JIM FISHKIN: She wants to say something too.
>> MAX SENGES: Yes. I guess that depends on your world view too, right? Whether you have a perspective where you only address those that are actually in the position to cause change, or whether you go with a more democratic approach where you say, let's get a baseline understanding with enough people, you know, the small people beating the drum --
>> REBECCA MacKINNON: The small people are the stakeholders. The big powers ultimately will only be powerful if their stakeholders trust them to a certain amount, and, at least on the corporate side, those people are not limited to just one nation-state, so in a sense, the little people end up having some power, you know, or being able to -- there are ways to exercise that power.
>> MAX SENGES: So I agree.
>> AUDIENCE MEMBER: (Off microphone) support for exactly that reason, because if we can inform the -- the IGF is important for exactly that reason, because if you have a multistakeholder approach and you continue to chip away at education and communication, involvement, collaboration, then those handful of people, you can hold them accountable, you can show your displeasure in a lot of different ways. If it's a company, by boycotting their products or whatever; if it's a government, by ridicule, by -- you know, there are ways, and we've had that throughout history, I mean, short of a guillotine.
>> MAX SENGES: Just because it's an energetic conversation, I want you to come in too, and let me take off my Stanford hat for a second and say Google is probably one of those players you're thinking about, right? I mean, we do sit here; there's always a big delegation, and I would say there are three things that get companies -- that get everybody -- to move: carrots, sticks, and reason, and, you know, I don't think the IGF is particularly good with carrots and sticks, but we are good at doing things like what we're doing right here; that is, getting to the bottom of things and really hearing out what a good solution is, because, as you have actually just mentioned, you know, Google's intention is not to take over the world and, you know, to rule. It's a business, you know. It's pretty straightforward, so we want people to be happy, so we're absolutely here to listen, some of us more to engage and participate, but obviously, it's a collective action problem, and these are very complex, big organizations -- I mean, we're about 100,000 people, I think, at this point, you know. If there are seven or ten of us in a conversation like this, that also needs to carry over and inform the rest.
So great conversation. Let's have two more folks come in and then get more on the -- how do we shape this tool to make it as powerful and as reasonable and useful as it can be for this multistakeholder conversation.
>> NALINI ELKINS: Yeah. I'm going to push back a little bit on some of what I've heard here. One of the things about the Internet, and one of the reasons I really believe in the Internet, is that it levels the playing field in a way that we have not seen for many, many -- maybe ever in our lives or maybe in the history of the world. I mean, not to use too much hyperbole, but really, you can have a small company that goes right up against IBM or Google. If you've got a better algorithm -- I'm not saying always, but oftentimes you can make it work, and we see this all the time; all the time there's stuff coming out, and so I think the monopoly of the big powers, you know, I don't know so much about that. Of course, of course, there are some things for which you need capital. You're not going to go up against Chevron, because that just takes too much capital, but some of the other players, sure, you can go up against a lot of that, and the open source community is a huge thing which just shows you that, and so I think this is probably a whole discussion in itself, so I'm going to stop now. Thanks.
>> MAX SENGES: Snapchat is probably the example that's most appropriate. Martin, quick comment, please.
>> MARTIN: Yes. Thanks, Max. My issue was the deliberative polling and what comes out of that, and the danger of getting carried away; the way we end up talking now, we talk a little bit in circles, where the question is, well, is it useful or not?
>> MAX SENGES: That's why you're the last one.
>> MARTIN: And I think basically what I see is that it's a useful instrument that can help us better understand where we stand, but not as a final conclusion, particularly with the number of people we're working with here; then big data analysis is much better. But it can help a group in a specific setting like the IGF, which means global level, multiple cultures, multiple jurisdictions, discuss subjects and then take stock once in a while: this is what we think now, this is where progress is, these are the issues remaining. So in that way I see a useful role for it in global debates, but please st