IACL Younger Scholars Forum (Fukuoka, 25 July 2018) – A Brief Overview

‘Technology and Innovation: Challenges for Traditional Legal Boundaries’ Workshop

The 20th Congress of the International Academy of Comparative Law (IACL) took place this year in Fukuoka, Japan, between 22 and 28 July. Apart from bringing together established comparative law scholars from different fields and jurisdictions, the Congress also hosted the first edition of the IACL Younger Scholars Forum, convened by Richard Albert (Professor of Law at the University of Texas at Austin), the former president of the Younger Comparativists Committee of the American Society of Comparative Law.

On this occasion, 200 young scholars from around the world had the opportunity to engage in international academic debate and discuss their research across eight different workshops. Sofia Ranchordás (Professor of European and Comparative Public Law, University of Groningen), András Koltay (Associate Professor of Constitutional Law, Pázmány Péter Catholic University) and I had the pleasure of organizing one of the eight workshops, titled ‘Technology and Innovation: Challenges for Traditional Legal Boundaries’. The workshop covered discussions on 23 papers, which we grouped around six themes: Privacy and Data Protection; Media Law and Free Speech; Challenges in Intellectual Property; Online Platforms; Business Law, Blockchain & RegTech; and AI Law. The papers were commented upon by the Distinguished Provocateur-Discussant (Sofia Ranchordás), with the aim of encouraging the authors to consider new angles for their submissions.


Facebook’s Data Sharing Practices under Unfair Competition Law

Crosspost from the Stanford Transatlantic Technology Law Forum Newsletter, Issue 2/2018 

This is a brief analysis of Facebook’s data sharing practices under unfair competition rules in the US and the EU. A paper on this topic, co-authored with MEPLI research fellow Stephan Mulders, will be available shortly and will be presented at the Amsterdam Privacy Conference in October 2018.

---

2018 has so far not been easy on the tech world. The first months of the year brought a lot of bad news: two accidents involving self-driving cars (Tesla and Uber) and the first human casualty [1], another Initial Coin Offering (ICO) scam costing investors $660 million [2], and Donald Trump promising to go after Amazon [3]. But the scandal that made the most waves had to do with Facebook data being used by Cambridge Analytica [4].

Data brokers and social media

In a nutshell, Cambridge Analytica was a UK-based company that claimed to use data to change audience behavior in either political or commercial contexts [5]. Without going into too much detail about the company’s identity, ties or political affiliations, one of the key points of the Cambridge Analytica whistleblowing conundrum is that it shed light on Facebook’s data sharing practices, which, unsurprisingly, have been around for a while. To create psychometric models that could influence voting behavior, Cambridge Analytica used the data of around 87 million users, obtained through Facebook’s Graph Application Programming Interface (API), a developer interface providing industrial-level access to personal information [6].

The Facebook Graph API

The first version of the API (v1.0), launched in 2010 and available until 2015, could be used to gather public information not only about a given pool of users but also about their friends, in addition to granting access to private messages sent on the platform (see Table 1 below). The amount of information belonging to users’ friends that Facebook allowed third parties to tap into is astonishing. The extended profile properties permission facilitated the extraction of information about: activities, birthdays, check-ins, education history, events, games activity, groups, interests, likes, location, notes, online presence, photo and video tags, photos, questions, relationships and relationship details, religion and politics, status, subscriptions, website and work history. Extended permissions changed in 2014 with the second version of the Graph API (v2.0), which has undergone many other changes since (see Table 2) [7]. One interesting thing that stands out when comparing versions 1.0 and 2.0, however, is that less information is gathered from targeted users than from their friends, even though v2.0 withdrew the extended profile properties (but not the extended permissions relating to reading private messages).

Table 1 – Facebook application permissions and availability in API v1 (x) and v2 (y) (Symeonidis et al, 2015)
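To give a rough sense of what this ‘industrial-level access’ looked like in practice, the sketch below shows how a third-party app might have pulled friend-level data through the Graph API once a single user had authorised it. It is a minimal illustration only: the version string, endpoint and field names are assumptions made for the example, not a verified reproduction of the pre-2015 interface.

```python
# Illustrative sketch only – endpoint, API version and field names are assumed
# for the purpose of the example, not a verified copy of Graph API v1.0.
import requests

ACCESS_TOKEN = "<token granted by one user, with friends_* extended permissions>"  # hypothetical

# Under the v1.0 model described above, an app authorised by a single user
# could also request profile details of that user's friends in the same call.
resp = requests.get(
    "https://graph.facebook.com/v1.0/me/friends",
    params={
        "fields": "name,birthday,likes,location",  # friend-level fields (assumed names)
        "access_token": ACCESS_TOKEN,
    },
)
resp.raise_for_status()

# Each entry corresponds to a friend who never interacted with the app directly.
for friend in resp.json().get("data", []):
    print(friend.get("name"), friend.get("birthday"))
```

The point of the sketch is the asymmetry discussed above: one user’s consent opened a channel to data about many people who had never installed, or even heard of, the app in question.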


The Land Portal Foundation Partners Up with Maastricht University for Student Research Project on Land Governance

Press release

Research education is one of Maastricht University’s CORE values: to take the university’s social responsibility seriously by linking the university to society, from the local to the global level, and to do so by creating open access knowledge that can further strengthen connections with society. One of the educational projects in the current academic year that aims to meet these goals is the collaboration between the Faculty of Law’s Skills: Introduction to Comparative Law course and the Land Portal Foundation. Established in 2009 as a partnership project dedicated to supporting the efforts of the rural poor to gain equitable access to land by addressing the fragmentation of information resources on land, the Land Portal became an independent not-for-profit organization based in the Netherlands in 2014. Through a variety of initiatives and partnerships, the Land Portal works to create a better information ecosystem for land governance through a platform based on cutting-edge open data technologies. According to Professor Leon Verstappen (University of Groningen), the chairman and founder of the Foundation:

‘The Land Portal is world leading in providing access to information and data on land issues. We adhere to linked open data principles. We dig deep into countries to find and open up information on land.’

The Foundation has recently received a new £1.3 million subsidy from the Department for International Development of the British Government.


You Don’t Need to Be a Superhero to Be in the Justice League: Rethinking Justice Hackathon (3-4 March 2018, Brightlands Smart Services Campus)

Making the world a better place is easier said than done. Ours is a shared world: citizens, businesses, states and institutions all face the same risks and challenges, so there is a constant need for society to innovate – to find better ways of doing things. Ideally, this innovation can bring about more justice in the world. What we mean by justice is simply more fairness in the way citizens, civil society, businesses and public institutions interact with one another. While thinking about such a broad theme has its advantages, we want to create a nurturing environment and mindset in which someone with an idea can go ahead and do something about it. This is how the Rethinking Justice Hackathon came to life: students, staff and alumni from Maastricht University, as well as friends from industry, coming together in a 24-hour hackathon to celebrate free thinking and enthusiastic doing.

As one of the youngest Dutch universities, Maastricht has always stood out for its Problem-Based Learning (PBL) approach to pedagogy: departing from real-life problems and learning by doing, either through independent inquiry or group collaboration. For this reason, we consider hackathons and PBL to be a match made in heaven: creativity, leadership, perseverance, empathy, communication – all of these 21st-century skills that are so central to modern work experiences have friendly roots in the pedagogical concepts of Maastricht University education.

Organized by the independent law & tech community Technolawgeeks with the support of Maastricht University and the Brightlands Smart Services Campus, the hackathon celebrated rethinking justice through four different challenges: The Hague Institute for the Innovation of Law (Social Justice challenge); eBay (E-Commerce Conflicts challenge); Dubai International Financial Centre (DIFC) Courts (Courts of the Future challenge); and Maastricht University’s Institute of Data Science (Data-Driven Justice challenge). Each of the partners hosted a workshop for participants (online or offline) to share how to approach the challenges from the perspective of their own disciplines and expertise, while also allowing participants to immerse themselves in the partners’ way of thinking.


Innov-AI-tion Law for Technology 4.0 – An Interdisciplinary Conference

European and global society is gradually transitioning into the fourth industrial revolution, marked by exponential technological advances in Artificial Intelligence (AI) – from works of art created by AI systems to algorithmic decision-making and autonomous vehicles. This profound transformation of our society creates a pressing need for a clear legal framework, which the EU is currently seeking to develop within the Digital Single Market, notably through the adoption of the recent European Parliament (EP) Resolution on Civil Law Rules on Robotics.

The conference will be composed of three panels, tackling respectively questions of private law, IP and privacy (please see the conference programme below). The concept of the conference is distinctive, as each topic will be covered by two presenters with different backgrounds (one from law and one from technology).

The conference responds to the call for a regulatory framework by putting up for debate unresolved questions that touch upon several fields of law:

(i) Private law: What types of regulation should govern AI liability, and which actors should be involved in these regulatory approaches?

(ii) IP law: Who should hold copyright over works of art created by AI agents, and can AI-generated inventions be patented? How does the interface between IP and data work in the context of AI?

(iii) Privacy law: How can privacy be protected, and accountability ensured, for decisions taken by autonomous AI agents that affect humans (e.g. automated tax decisions)?
