WORKSHOP REPORT: the “Free and Safe in Cyberspace” workshop, held on July 21st 2016 in New York City, brings meaningful e-privacy and increased public safety one step closer
A small workshop held on 21 July 2016 in New York City, part of the “Free and Safe in Cyberspace” international event series, focused on discussing and planning possible solutions that provide meaningful levels of e-privacy and e-security for all users, while also increasing public safety and cyber-investigation capabilities. Following the great success of the 2015 edition, a larger two-day 2nd EU Edition will follow on September 22-23rd 2016, again in Brussels, where a major comprehensive proposal will be presented by a number of speakers involved in the event series, along with selected results of EIT Digital innovation projects.
In introducing the July 21st event, Rufo Guerreschi (executive director of Open Media Cluster and event co-organizer) summarized a few crucial points for the entire Free and Safe in Cyberspace event series: “Recent episodes showed that, on the one hand, citizens and institutions suffer a great loss of civil rights and sovereignty, while, on the other, EU and US IT companies are struggling to find ways to offer the levels of trustworthiness required by both national customers and legislation. But this clash between the need to ensure the public safety and security of nation-states and the need to protect user privacy can actually be reconciled. In fact, if you had to choose only one of the two, you would not be able to sustain democracy. Democracy and freedom require both citizen safety and privacy protection. We hope that our discussion events can bridge this gap and find a common ground to build a more equitable, effective toolkit for all stakeholders involved”.
Expanding on this introduction, Jovan Golic (EIT Digital Privacy, Security and Trust Action Line Leader and renowned cryptographer) provided a general overview of the deeply complex technical issues at stake: “It is not true that there is a tradeoff between cyber-security and cyber-privacy; they are both on the same side. We need to talk about having more of both, and at the same time ensure data protection. If you don’t protect data, then you cannot help cyber-security, because the data will be prone to attacks. However, there is a tradeoff between cyber-surveillance and cyber-security. And by talking about these topics, we can try to change the existing trend, where governments have their own ways of controlling things in the security area, including legislation, and big security companies prefer to just stay quiet and comply with government mandates. This is the reason why we are still lacking good solutions in regard to data protection practices”.
In his keynote speech, Professor Joe Cannataci (UN Special Rapporteur on Privacy, SRP) explained that “the safeguards and remedies available to citizens cannot ever be purely legal or operational”. Therefore, a much better option is to “involve all stakeholders in the development of international law relevant to privacy” and to “engage with the technical community in an effort to promote the development of effective technical safeguards including encryption, overlay software and privacy protection”. Both goals are at the forefront of the SRP’s overall efforts, added Cannataci, while also pointing out an important recent advancement: “Both the Netherlands and the USA have moved more openly towards a policy of no back-doors to encryption, a step that should be encouraged by the UN and other international bodies”.
In the second keynote speech, Max Schrems (leading Austrian privacy activist) summarized the story of his lawsuit for the invalidation of the Safe Harbor Agreement that allowed US companies to store European citizens’ personal data. “What was the reason for the lawsuit? Even if the European Union talks a lot about mass surveillance, with EU resolutions, angry letters and so on, we knew that this kind of ‘public outrage’ was not going anywhere. Therefore, we looked at what I call ‘public/private surveillance’: companies like Facebook are subject to both US and EU jurisdictions, so this conflict of laws must be resolved. In turn, this gave us the possibility to bring a legal case (mostly opposing mass surveillance) in a European court and even have jurisdiction there, because obviously we cannot have jurisdiction in other countries”. This lawsuit (and its ongoing outcomes) was just a first step to making public some problems with global mass surveillance procedures. Another important issue, according to Schrems, is that “given the policies now being adopted and/or rewritten around the world, the de-identification and anonymization of data is no longer a sufficient safeguard if governments & corporations continue to repurpose data originally collected for one specific purpose”. His possible solutions to move forward? “First we need some codes of conduct that could possibly be drafted by and implemented throughout the industrial sector. And then we should establish shared certification options and make sure that companies are fully compliant (with some help from an independent monitoring body)”.
The event included four discussion panels, or Challenges, focused on a series of inter-related questions (A – How can we achieve ultra-high assurance ICTs?; B – Can ultra-high assurance ICT services comply with lawful access requests while meaningfully protecting civil rights?; C – What is the role of AI in providing ultra-high assurance ICTs?; D – What national policies or international treaties can we envision to support ultra-high assurance ICT standards?).
Here are a few highlights:
Jovan Golic delivered an introductory keynote for panel B on the interplay between cyber-security, cyber-privacy, and cyber-investigation; on the need to reconcile cyber-investigation with cyber-security and cyber-privacy through widely accepted, transparent solutions, which would foster business opportunities in the area of digital security; and on already-practical advanced crypto techniques for data protection, including threshold cryptography based on shared key escrow and practical fully homomorphic encryption, as well as EIT Digital’s innovation and business results in this area.
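The “shared key escrow” idea Golic refers to is typically built on threshold secret sharing: a decryption key is split among n custodians so that any t of them together can reconstruct it, while fewer than t learn nothing. A minimal sketch of Shamir’s (t, n) scheme, the standard building block (illustrative toy code, not a production implementation):

```python
# Toy sketch of Shamir's (t, n) threshold secret sharing, the usual
# basis for "shared key escrow": any t of n shares reconstruct the key.
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is mod PRIME

def split_secret(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    # Random polynomial of degree t-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = 0xDEADBEEF
shares = split_secret(key, t=3, n=5)
assert reconstruct(shares[:3]) == key   # any 3 shares suffice
assert reconstruct(shares[1:4]) == key
```

In an escrow setting, the shares would be held by distinct parties (e.g., a service provider, a court, an oversight body), so that no single privileged party can decrypt on its own.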
Roman Yampolskiy delivered an introductory keynote for panel C on the security threats related to modern AI systems and smart things, which, on the one hand, are getting ever more powerful and helpful for humans, but, on the other, may threaten their lives and work through improper design and implementation.
Daniel Castro, Vice President of the Information Technology and Innovation Foundation: “How do we create a situation where secure software and hardware systems can be developed? Let’s make a comparison with the construction industry, where developed countries established certain types of regulations and guidelines, and today we have buildings that can sustain an earthquake or a fire. We got rid of poor standards and introduced a system based on specific building codes, inspectors and so on, thus achieving a level of safety that seemed impossible just a few years ago. We need to promote public-private partnerships and formalize strong standards and accountability in this area, pushing hard to have governments and businesses working together”.
Yvo Desmedt, renowned cryptographer and pioneer of threshold cryptography: “What can you do when you really, really worry about privacy? The answer is very simple: don’t use a smartphone. I do not carry a smartphone. Secondly, if you are worried about being eavesdropped on, use paper and pen, or do what the Russians have done for decades: use typewriters. But given that these are radical and extreme security options, will most people want to use them? No. Can we achieve today economically-feasible and effective security? The answer is no”.
Rufo Guerreschi: “Today’s ‘smart technologies’ (deployed via wi-fi in our homes or to help in natural disasters, etc.) are not at all resistant to hacking by criminals or by authorities. And despite recent advancements, technologists seem unable to ensure a decent level of individual privacy, and there is little hope that national legislation can protect it either”.
Rufo Guerreschi: “We currently do not have solutions which are meaningfully private, even if you pay a lot of money or are willing to deal with the inconvenience. That’s also proven by the fact that the market for crypto devices is practically nonexistent. It’s a matter of a few thousand devices. Not to mention the fact that, if you buy a crypto-phone, you’re flagging yourself, suggesting that you’re probably trying to hide something, and most likely you have no clue that you are doing so”.
Jovan Golic: “We need to look at the reality of data protection at different stages. At the first stage of data collection, there are privacy policies and user consent, but they do not prevent uncontrollable mass data collection by big Internet service providers. What is protected in practice is data communications, typically between a client and a server, rarely end-to-end between two clients. However, data encryption is endangered by various so-called backdoors at different levels of the data security chain, including crypto algorithms and protocols, key generation and management, and software and hardware implementations. Backdoors are by definition secret and proprietary before they get revealed to the public, and their presence essentially means that the cryptosystem in use is inherently insecure. In practice, they are used for cyber-investigation by privileged parties. But they are also used by hackers and cyber-criminals, which renders the cyberspace insecure. Instead, for the same purpose, one may use so-called front doors, which are by definition transparent and may be based on properly implemented threshold cryptography with shared key escrow, providing forward and backward secrecy and focused cyber-surveillance. Data storage is protected by encryption and controlled access, but there are too many breaches of database servers storing sensitive data, because of cryptographic key management issues and various software vulnerabilities. Data processing is practically not protected at all, not even for sensitive data such as e-health data, because service providers work on plain data to provide their services, regardless of the emerging practical techniques for fully homomorphic encryption, which enable data processing in the encrypted domain. Consequently, what is needed in order to improve the current unsatisfactory situation and trends is the application of existing, but rarely applied, trustworthy technologies for data protection”.
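The “data processing in the encrypted domain” Golic describes can be illustrated with a toy additively homomorphic scheme (Paillier), a simpler relative of fully homomorphic encryption: anyone can add two encrypted values without ever decrypting them. The primes below are tiny and hard-coded purely for illustration; this is not a secure implementation.

```python
# Toy Paillier cryptosystem: ciphertext multiplication mod n^2
# corresponds to plaintext addition mod n (additive homomorphism).
import math
import random

p, q = 17, 19               # toy primes -- NOT secure
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)    # Carmichael's lambda(n)
mu = pow(lam, -1, n)            # with g = n + 1, mu = lam^-1 mod n

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(c):
    # L(x) = (x - 1) // n, then multiply by mu mod n
    return (pow(c, lam, n2) - 1) // n * mu % n

c1, c2 = encrypt(42), encrypt(100)
c_sum = c1 * c2 % n2            # addition performed on ciphertexts
assert decrypt(c_sum) == 142    # server never saw 42 or 100
```

A server holding only `c1` and `c2` can compute the encrypted sum without learning the inputs; fully homomorphic schemes extend this idea to arbitrary computations, at a much higher cost.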
Rufo Guerreschi: “A large majority of people think that secure products are already out there and easily available, including Apple iPhones and the Tor system. But there’s an incredible alignment of interest between Apple, Tor makers and security agencies. Why? Apple and Tor makers have an interest in people believing their products are secure, so that people buy their stuff instead of our stuff. Security agencies have a huge interest in this security being oversold, so that people use these tools to communicate secret stuff, and the agencies can spy on them using directly implanted backdoors, or vulnerabilities that they discover or buy and do not publicize”.
Daniel Castro: “I think we can have highly regulated systems, for example financial systems, where we are going to want recovery in general, and discuss what that looks like and how we enable lawful access. It might even make sense in some regulated communication services. There are multinational companies that have a large user base, and we need to consider how to regulate them. In many cases, I can write software and have communications with someone else around the world, and we are using software that we’ve written that nobody else has access to. That’s going to be secure and outside the scope of what law enforcement can reach. But we still need to figure out, on the policy side, what we are going to do in those situations”.
Zachary Goldman: “At least in the US, there are questions about the circumstances under which you can compel individuals to provide decrypted information. There are questions about the circumstances in which you can require the manufacturers of systems to build systems and networks in a way that clear-text data will always be available. There are questions about whether and under what circumstances you can compel device or app manufacturers to provide clear-text data. … I don’t feel comfortable living in a world in which the law enforcement community doesn’t have the ability to infiltrate and take down such communication networks”.