“White Paper” of the Digital Freedom Committee

In order to launch the activities of the Digital Freedom Committee, the Ministry of Justice requested the public bodies concerned (Cabinet Office of the Prime Minister, Ministry of Finance, Ministry for Innovation and Technology, Commissioner for Fundamental Rights, National Office for the Judiciary, Hungarian National Authority for Data Protection and Freedom of Information, National University of Public Service, National Media and Infocommunications Authority) to inform our Ministry about their experience with the application of law in cyberspace in relation to freedom of expression, taxation, media regulation and data protection, and about the problems observed on the basis of citizens’ requests submitted to them.

The public bodies welcomed the Ministry of Justice’s initiative to establish a committee examining online activities and the implementation of fundamental rights in the online world, and, on the basis of their communications, they are pleased to participate in its work. On the basis of the answers received, we can establish that numerous questions and issues have arisen regarding the regulations in force, including the following:

1. Freedom of expression and the protection of privacy:

  • the effect of online platforms’ own regulations on freedom of expression: the vast majority of online platforms set rules (general terms and conditions) for the users of their services and, if these rules are violated, restrict the publication of content or sometimes users’ access to the platform altogether. Decisions based on this complicated system of standards have a serious impact on the exercise of constitutional rights, as the platforms themselves set the boundaries of freedom of speech and the related supervisory procedures without any constitutional guarantees. This also raises issues affecting national sovereignty (a pseudo legal system),

  • the meaning and duration of public figure status, particularly with regard to the right to the protection of privacy,
  • the fairness of election campaigns on social media platforms: establishing the conditions for participating in political campaigns and ensuring the transparency of such participation – avoiding deliberate misinformation – and the effect of fake news on the implementation of fundamental rights,

  • protection of users’ personal rights in the online world: it is difficult to identify the person publishing content, and the extent to which platforms play a role in the removal of illegal content needs to be determined (responsibility, legal assessment of comments, clear definition of the platforms’ obligations).

2. Data protection:

  • reconciling the implementation and enforcement of the right to be forgotten (Article 17 GDPR) with other fundamental rights (e.g. the publicity of trials, the right to a fair trial, the right of access to and dissemination of data of public interest, freedom of expression),
  • the joint application of the Civil Code and the provisions of the GDPR, and the relationship between the legal remedies available for the protection of personal rights and those available when data protection provisions are violated,

  • difficulties of law enforcement on social media platforms: online platforms rarely respond to official requests, which makes it rather difficult, for example, to conduct proceedings concerning the violation of personal rights. Hungarian authorities have difficulties in contacting companies seated in the United States, and submitting requests usually takes more time than the data retention period determined by law. In order to improve detectability, data request practices need to be updated and special communication channels set up for official requests,

  • the effects of information monopolies of non-public entities (e.g. virtual communities) on the right related to the protection of personal data and on freedom of information,

  • the opportunities available in legislation and law enforcement to mitigate the negative effects of activities in cyberspace on fundamental rights,

  • data integration: online service providers handle unprecedented volumes of data, which they link to build user profiles, as a result of which complete personal profiles of individual users are created.

 

3. Issues of media regulation:

  • transparency of platform operations: without transparency of the algorithms (or the “code” governing the platform’s operation) that are crucial for a large number of decisions, the principles of and reasons for highlighting, deleting or demoting certain content cannot be determined,

  • ensuring the pluralism of social media platforms, including internal pluralism,

  • the role of platforms in news consumption: as a result of changing media consumption habits, citizens collect an increasing proportion of news and information from social media and news aggregator websites. In relation to this, it is worth examining the effects of social media related phenomena on the public, on the future of professional media and on how users are informed; and, also, the effects of the tools by which service providers influence the flow of information,

  • consumer protection in the online world: all the issues related to consumer protection are also relevant in the online world, including the boundaries of commercial communication, the content of agreements concluded between service providers (e.g. an online platform provider) and consumers, and the adequacy of electronic commerce services,

  • new forms of online commercial communication: the increased role of influencers in reaching and addressing consumers. It is important to examine

 o whether their activities should be considered commercial communication or advertising,

 o on the above basis, what their fundamental obligations are,

 o how the arrangement concerning responsibility for content should be approached within the triangle of influencer – advertiser – platform provider.


4. Taxation: examining the introduction of a digital services tax on revenues generated by providing specific digital services:

  • in 2018, the European Commission recommended the introduction of a revenue-based special tax. The draft directive intended to impose a tax of 3% on the revenues that the services of digital companies generate in connection with a specific Member State (see the illustrative calculation after this list). The directive intended to tax specifically those taxpayers that have significant revenues. However, the proposal could not gain unanimous political support even after several attempts. At the same time, following the proposal, Member States started to declare their intentions and their specific plans to tax the revenues of digital companies. Almost all of these proposals follow the draft EU directive,
  • in parallel with the above, the OECD compiled a work plan based on two pillars for taxing the digital economy, which concentrates partly on the future taxation of digital companies and partly on reducing tax competition and sanctioning excessively low tax rates.

 o Pillar 1, related to the digital economy, addresses the question of how digital companies can be taxed in the users’ countries.

 o Pillar 2, related to minimum taxation, is independent of the digital economy and serves the purpose of reducing competition in tax rates.
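As a minimal illustration of how the revenue-based tax in the 2018 draft directive would operate (a hypothetical sketch: the revenue figures are invented, and allocating revenue to a Member State in proportion to the users located there is a simplifying assumption):

```python
# Hypothetical sketch of a revenue-based digital services tax (DST).
# The 3% rate follows the 2018 Commission draft; the revenue figure and the
# allocation of revenue by user share are illustrative assumptions only.

DST_RATE = 0.03  # 3% of the revenue attributed to the Member State

def dst_payable(in_scope_revenue_eur: float, member_state_user_share: float) -> float:
    """Tax due in one Member State: 3% of the revenue attributed to that state."""
    attributed_revenue = in_scope_revenue_eur * member_state_user_share
    return DST_RATE * attributed_revenue

# Example: EUR 500 million of in-scope revenue, 2% of the users located in the Member State
print(dst_payable(500_000_000, 0.02))  # -> 300000.0, i.e. EUR 300,000 payable
```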


The public bodies also raised the following topics:

5. Protection of copyright in the online world:

  • the latest online services have significantly increased access to digital content, including copyrighted works. All this poses new challenges for the enforcement of copyright, and meeting those challenges involves the fundamental interests of society. The spread of news aggregators and search engines has had a significant effect on how online media products function. This process is addressed by the new EU copyright directive adopted in spring 2019 (Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019).

 

6. Enforcing criminal law in the online world:

  • fighting online hate speech: the operation of the platforms challenges the practical implementation of the most important boundaries of freedom of speech. The most important prohibitions must be complied with on the platforms as well, and the platforms cannot be neutral players that merely witness incitement to hatred, terrorism and the appearance of otherwise harmful content; they must actively take action against this type of user-generated content. The relevant regulation can also be provided by the State. In the course of their inquiries over the past 20 years, the Commissioners for Fundamental Rights have encountered a wide range of hate speech: comments by private persons inciting hatred, works inciting hatred published by extremist groups and bands, as well as portals created against minority groups, the subsequent removal of which is extremely difficult due to the speed of information flow,

  • difficulties concerning the detection of crimes in the online world: in order to reduce the number of undetected offences (latency), the practice of legal assistance and the enforceability of requests by the authorities should be reviewed,
  • the fight against cyberbullying,
  • the role and use of social media in committing crimes.

 

7. Child protection:

  • the protection of child rights in the online world: promoting the informed use of media and awareness education are areas of crucial significance. Currently, online filtering programmes are only marginally used in Hungary, although several applications are available to help parents better meet their child protection obligations. Developing skills related to the use of the online space is the task of several state and non-profit organisations (the Digital Wellbeing Programme for developing children’s online awareness). More emphasis should be placed on the dangers of committing online infringements and on increasing online legal awareness,

  • more effective enforcement of the state’s obligation to provide judicial protection in relation to child rights, in order to promote prevention and efficient institutional legal protection,
  • the problem of handling the data of minors registered on social networking sites, as social platforms handle the data of minors in the same manner as those of adult users. Although social media services have reporting channels that aim to exclude those under 14, the functioning and efficiency of those channels are not transparent.

 

8. Issues concerning national sovereignty:

  • the issue of the digital quasi-sovereignty of online tech companies.