Lessons learned at MAPPING, on intellectual property and privacy on the Internet
During the first three days of this November, Safe Creative travelled to Prague to attend the Second General Assembly of the MAPPING project. This is a European initiative created to propose alternative approaches to privacy, intellectual property rights, and Internet governance.
We wish to thank the event’s organisational team for the opportunity to participate in these three days of talks and debates.
On the occasion of this event, we would like to share three important matters that might be relevant to you.
Human rights and intellectual property on the Internet
In order to talk about the current situation of human rights and intellectual property online, we briefly interviewed Christof Tschohl, scientific director at the Research Institute of the Digital Human Rights Center in Austria.
As he began by explaining, intellectual property has been explicitly addressed in the European Union Charter of Fundamental Rights only very recently.
C. T. : “In Article 17 […] this is the first time with the EU Charter of Fundamental Rights that we have it explicitly laid down as a fundamental right.”
We asked for his opinion regarding the current situation of authors’ rights, and whether there are means right now to achieve a balance between the right to intellectual property protection and the right to freely participate in one’s culture (Article 27 of the Universal Declaration of Human Rights).
So far, according to Christof, such a balance does not exist; there are no solutions capable of satisfying all parties involved in the discussion. Moreover, this matter directly touches on another one of the utmost importance on the Internet: privacy.
C. T. : “All these file-sharing platforms do necessarily – by technical means – require that you can never just download data, you also need to upload data. […] The problem is not the downloads, the private use, the problem is to publish it and maybe to merchandise in direct or indirect ways. […]
The problem is- it’s going down to these end users who are, just by means of technology, the ones who also spread the content, but they’re not really interested in sharing the content. It’s about the model.
We have the privacy issue on one side, particularly regarding the disclosure of subscribers behind an IP address, and on the other side we have the content, the person who has the right to intellectual property, usually represented by bigger groups or law firms.
I don’t think we’ve found a balance yet, from both sides. It’s not convincing for none of the sides. For the law enforcement it is not convincing, for the data subject it is not convincing, and, by the way, I think that most models are not either convincing for the artists, for those who actually produce the content.”
What seems evident at this point is that, instead of coming to an understanding, all parties keep reinforcing their respective positions to pursue their own interests, which widens the gap. These parties are the content creators, the entertainment industry, and users.
New business models need to be supported in order to reconcile these positions. There is no other way: content creators and the public depend on each other. The solution lies in the creation of new models that prove effective and satisfy all stakeholders, and many emerging businesses are pursuing this idea. But there is still much work to do.
Privacy on the Internet
One of the core ideas of the event was privacy on the Internet, in a post-Snowden era where users are increasingly aware of, and critical about, the situation of their privacy rights. A growing number of online businesses are reconsidering their current policies in order to safeguard their users’ data more effectively.
In this sense, we also wanted to talk to Christof about the concept of “privacy by design”. Platforms applying this idea protect personal data and inform their users, in a transparent and understandable way, of how every piece of information they provide is used.
C. T. : “By design means informed consent, to make really clear what data is used for what, what is disclosed, and what’s your internal use of the data.”
At Safe Creative we have always addressed this matter with the care and attention we believe it deserves. We consider user privacy to be fundamental, so we discussed our own system with Christof, who confirmed some of the practices we already use while providing more information about the privacy by design concept.
C. T. : “By design means what you say: giving possibilities. Depending on the use case, the user can choose what can be disclosed and for what. […]
One thing is the use case and other is the risks connected to the use case for the data subject. If you have no sensitive information, and there’s low risk… then you can choose a one-factor identification model. The moment you have some increased risk, then […] a one-factor identification model is not secure. What’s the risk? If there is an important work with high value, and the question is, who would be the author, it would be possible for someone else to gather the full rights to that work. I would recommend that you would need to have further details.
Otherwise you can say “ok, you can keep this private and verify by your email address” but you should inform about the risks.”
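The risk-based approach Christof describes, where low-risk actions accept a single identification factor while higher-risk actions demand more, can be sketched in a few lines. This is a minimal, hypothetical illustration: the action names, risk scores, and factor names are our own assumptions, not any real platform’s policy.

```python
# Hypothetical sketch of risk-based ("step-up") identification:
# low-risk use cases accept one factor (e.g. a verified email address),
# while increased risk requires additional factors.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Action:
    name: str
    risk_score: int  # 0 (harmless) .. 10 (e.g. transferring full rights)


def required_factors(action: Action) -> list[str]:
    """Return the identification factors to demand for a given action."""
    if action.risk_score <= 3:
        # Low risk, non-sensitive data: one-factor identification is acceptable.
        return ["verified_email"]
    # Increased risk: a one-factor model is no longer secure enough,
    # so further details are required.
    return ["verified_email", "one_time_code", "identity_document"]


print(required_factors(Action("view_registration", 1)))
print(required_factors(Action("transfer_full_rights", 9)))
```

The design choice mirrors the interview: the factors demanded depend on the use case and on the risk that use case carries for the data subject, rather than being fixed for the whole platform.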
In connection with these comments, we would like to remind you to check and update your personal data (as well as any available security measures) on the platforms that you use most frequently or that hold sensitive information.
On this page you can check how secure your password is.
New EU law proposals
We also had the chance to talk about some of the main concerns we have detected among our users: how to track where their content is being shared online, and how to make their rights as authors known.
In light of these concerns, the European Union has proposed new regulations that we think you should know about, although they have not been approved yet. Articles 11 and 13 would have a direct impact on content creators, users, and intermediaries on the Internet, and would pose great political, technological, and logistical challenges:
Article 11. Protection of press publications concerning digital uses.
Article 13. Use of protected content by information society service providers storing and giving access to large amounts of works and other subject-matter uploaded by their users.
To sum up, the two articles target two kinds of platforms: those that link to news, such as Google or Reddit, and services that host and share large quantities of content online.
Article 11 would regulate the right to link to news items and reuse them. (Further information)
According to Article 13, services that store and give access to “large amounts of works” would have to implement filtering systems to ensure that the content they distribute does not infringe copyright. An example of this type of system is YouTube’s Content ID.
Similar regulatory measures have already been applied in Spain and Germany.
What are your thoughts on these articles? We look forward to reading your opinions. We’ll keep you informed. Stay safe!