Human Rights and Political Advertising Regulations in the European Union
Alia Yofira Karunian1
1 Faculty of Law, University of Edinburgh, E-mail: [email protected]
Article Information
Received: 21st May 2023
Accepted: 19th December 2023
Published: 30th December 2023
Keywords: Human Rights, Political Advertising, European Union
Corresponding Author: Alia Yofira Karunian, E-mail: [email protected]
DOI: 10.24843/KP.2023.v45.i03.p01

Abstract
Political advertising, and the AdTech industry that powers it, have raised concerns about data protection, potential manipulation, and selective information exposure to voters in the European Union. Against this backdrop, this article aims to answer the following question: What are the human rights implications of political advertising, and to what extent does the European Union’s regulatory framework address these problems? The research method applied in this article is normative legal research, combining statutory and critical analytical approaches. In conclusion, sectoral regulations on elections and political advertising, strengthened by comprehensive data protection regulation, are essential to mitigate the harms caused by political advertising on online platforms.
1. Introduction
There are significant differences between how political messages are advertised in traditional media (TV, radio, newspapers) and in new media (for instance, social media and online newspapers), particularly in the level of precision with which people can be targeted. While traditional media generally target a large demographic audience, new media allow political parties and candidates to target voters at a far more granular, individual level, through a technique commonly known as political microtargeting (“PMT”). PMT essentially consists of two phases of activity: first, “collecting information about people” and second, “using that information to show them targeted political advertisements”.1 Political microtargeting shows that political actors are partially shifting from sending messages to a broad audience (“broadcasting”) to sending tailored messages personalized to the “needs, wants, expectations, beliefs, preferences, and interests” of individuals (“narrowcasting”).2
While PMT, in general, has various benefits such as increased efficiency in campaign resource allocation3, it also poses numerous risks, especially when combined with advertisement technology (“AdTech”) – a set of technological “tools and services that connect advertisers with target audiences and publishers”.4 The ‘marriage’ between the two – PMT and AdTech – is transforming how political parties and candidates carry out their political campaigns.5 Political candidates and parties now increasingly collect personal data, without individuals’ consent, and utilize it for the benefit of their campaigns.6 The scope of personal data used in PMT is also expanding greatly, using seemingly mundane data observed from our online behavior to predict our political opinions. This technique is known as Political Behavioural Targeting (“PBT”), a type of PMT that profiles voters based on their online behavioral data and other data (often acquired from data brokers), and uses these profiles to target voters individually with tailored political messages.7
The majority of privacy concerns around the practices of PMT and PBT mirror the problems found in the commercial AdTech industry in general, including a lack of transparency and failure to comply with the data minimisation principle. Moreover, political advertising, particularly on social media, risks amplifying the spread of disinformation during election periods. Political advertising disseminated through social media also tends to receive less scrutiny than political advertising in traditional media such as TV, radio and newspapers.
A 2018 study documented that political actors organized social media manipulation campaigns in 48 countries, “making increasing use of paid advertisements … on Internet platforms”.8 Political disinformation during election periods, amplified by the advertising features of social media, can be seen as “the pollutants” of the information ecosystem, interfering with voters’ right to make an informed voting decision without manipulation. Against this backdrop, this article aims to answer the following question:
What are the human rights implications of political advertising, and to what extent does the European Union’s regulatory framework address these problems?
Prior research, including the studies conducted by van Drunen, Helberger, and Ó Fathaigh (2022)9 and Ruohonen (2023)10, has assessed the regulatory obstacles associated with political advertising within the European Union. Nonetheless, none of these works offers an in-depth examination of the human rights implications of political advertising within the European Union. Consequently, this article introduces a novel approach by first undertaking a critical analysis of the human rights ramifications before delving into the ensuing regulatory challenges.
The research method applied in this article is normative legal research, combining statutory and critical analytical approaches. Critically reviewing the existing literature on political advertising is an important step toward gaining a comprehensive understanding of the human rights implications of political advertising. For this reason, the critical analytical approach is used to identify the human rights implications of political advertising by first identifying the most relevant and significant literature on political advertising and then discussing and evaluating that literature.11 The writer chooses the EU regulatory framework as the focus of the statutory analysis because the EU is the first jurisdiction to enact comprehensive regulation on online advertising, including political advertising.
Political advertising, and the AdTech industry that powers it, have raised concerns about data protection, potential manipulation, and selective information exposure to voters.12 This section elaborates on the nexus between these challenges and human rights, namely (3.1.1) the right to privacy and data protection, as well as (3.1.2) freedom of expression and the right to vote.
The right to privacy is an internationally recognized human right, protected under Article 12 of the Universal Declaration of Human Rights (“UDHR”) and Article 17 of the International Covenant on Civil and Political Rights (“ICCPR”). Similar protection exists in regional human rights instruments in Europe, such as Article 8 of the European Convention on Human Rights (“ECHR”) and Article 7 of the Charter of Fundamental Rights (“CFR”). Moreover, Article 8 of the CFR provides a specific right to data protection, while international human rights instruments generally treat data protection as an integral part of the right to privacy.13 In 2020, the European Commission underlined the nexus between privacy and democracy, stating that “effective privacy online means a strengthened democracy offline”.14
The rapid development of the AdTech industry, coupled with sophisticated data analysis techniques such as PMT and PBT, has enabled new and innovative ways for political actors to deliver their campaign messages. However, the AdTech industry has long been criticized for its opaque business model, which creates privacy concerns. The industry’s heavy reliance on the Real Time Bidding (“RTB”) system – “an automated process that enables advertisers to target very specific groups of people on different websites, videos, apps without having to negotiate prices directly” – is highly criticized due to the risks the RTB system poses to privacy.15 The Information Commissioner’s Office (“ICO”), the UK’s data protection authority, launched an investigation into the AdTech industry and published a report in 2019 highlighting problematic issues in the AdTech industry, such as the opaque supply chain of data,16 where “a single RTB request can result in personal data being processed by hundreds of organizations”, often without the individual’s knowledge,17 thereby raising data protection issues around data minimization and fair and transparent data processing.18
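To illustrate why a single RTB request can expose personal data to so many parties, the following is a minimal, purely illustrative sketch in Python. The participants, field names, and bidding logic are hypothetical and do not represent any real ad exchange or the OpenRTB specification; the point is simply that every connected bidder receives the user’s data, whether or not it wins the auction.

```python
from dataclasses import dataclass

@dataclass
class BidRequest:
    # Hypothetical fields; real bid requests carry many more attributes.
    user_id: str          # pseudonymous cookie or device identifier
    location: str         # coarse or precise geolocation
    interests: list[str]  # inferred interest segments, possibly political
    page_url: str         # the page on which the ad slot appears

def run_auction(request: BidRequest, bidders: list) -> tuple[str, float]:
    """Broadcast the request to every connected bidder and pick the highest bid.

    Every bidder receives the user's personal data regardless of whether it
    wins -- this fan-out is the opacity problem highlighted by the ICO report.
    """
    bids = [(name, bid_fn(request)) for name, bid_fn in bidders]
    return max(bids, key=lambda b: b[1])

# Hypothetical demand-side platforms; each could in turn share the data onward.
bidders = [
    ("dsp_alpha", lambda req: 0.8 if "politics" in req.interests else 0.1),
    ("dsp_beta",  lambda req: 1.2 if req.location == "Edinburgh" else 0.2),
    ("dsp_gamma", lambda req: 0.5),
]

request = BidRequest("user-123", "Edinburgh", ["politics", "news"], "news.example/article")
winner, price = run_auction(request, bidders)
print(f"Winning bidder: {winner} at {price}")  # all three bidders saw the data
```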
Political actors have long joined forces with data brokers to create detailed voter profiles by combining publicly available voter data with commercial data.19 In most countries, political parties and candidates have access to the electoral roll containing voters’ personal data. The scope of personal data contained in the electoral roll, however, differs from country to country. For instance, the UK’s electoral roll contains voters’ names and addresses,20 while Indonesia’s electoral roll contains personal data collected under the national ID system, such as voters’ names, Family Card numbers, ID numbers, addresses, gender, dates of birth, and disability status.21
The personal data contained in the electoral roll is an essential starting point for political candidates and parties to profile their voters, but to build a more comprehensive profile they are now combining it with commercial data, and this is where data brokers play a major role. Data brokers, “companies that collect consumers’ personal information and resell or share that information with others”,22 are among the major actors contributing to the opaqueness of the data supply chain in the AdTech industry. They generally collect both public and non-public data, available online and offline.23 These data are collected and analyzed by data brokers “without interacting directly with them [data subjects]” and shared with third parties, while “consumers are largely unaware that data brokers are engaging in these practices”.24 This is a direct violation of the obligation to ensure lawfulness in data processing, a key data protection principle.
Furthermore, the data profiling conducted by political candidates and parties in their campaigns also poses privacy risks, particularly when combined with PBT. From seemingly mundane online behavioural data, political parties and candidates are now able to predict personality traits, socioeconomic background, and, most importantly, voters’ political opinions.25 Data profiling poses significant risks to privacy largely because data subjects are not informed that their data is used to predict something beyond the initial purpose of data processing to which they consented.26 It is also worth noting that personal data revealing political opinions fall within the special categories of data under most data protection laws around the world, which warrant higher data protection measures.
A growing body of scholarly work shows that ‘mundane’ behavioural data like phone metadata (e.g. call logs, contacts) can be used to accurately infer users’ personality traits27 and their socioeconomic status.28 Similarly, these mundane data are now increasingly used to infer our political opinions. For instance, researchers have pointed out that our social media data (e.g. the list of accounts we follow)29 and online search history30 can be used to infer our political views. These practices show how powerful actors like political parties and candidates are now able to profile us and make important decisions (e.g. which campaign content to show us and which to withhold) without our knowledge or input.31 Inaccuracy in data profiling in PMT and PBT also poses significant risks due to its political nature.
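As a purely illustrative sketch of the inference mechanism described above, the following hypothetical Python snippet scores a voter’s political leaning from the accounts they follow. The seed accounts, weights, and scoring rule are invented for illustration only; real profiling systems, such as the ideal-point estimation cited above, rely on far richer data and statistical models.

```python
# Hypothetical "seed" accounts with an assumed political alignment.
SEED_ACCOUNTS = {
    "@party_a_leader": +1.0,   # leans towards party A
    "@party_a_news":   +0.8,
    "@party_b_leader": -1.0,   # leans towards party B
    "@party_b_news":   -0.8,
}

def infer_leaning(follow_list: list[str]) -> float:
    """Average the alignment of the seed accounts a user follows.

    Returns a score in [-1, 1]; 0 means no signal. This captures the basic
    'birds of a feather' intuition behind inference from social media data.
    """
    scores = [SEED_ACCOUNTS[a] for a in follow_list if a in SEED_ACCOUNTS]
    return sum(scores) / len(scores) if scores else 0.0

voter_follows = ["@party_a_leader", "@party_a_news", "@local_football_club"]
print(infer_leaning(voter_follows))  # 0.9 -> profiled as leaning towards party A
```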
In addition to privacy concerns, profiling in this context also raises concerns about selective information exposure to voters, which interferes with the right to receive information as part of freedom of expression.
Freedom of expression is protected under international human rights instruments such as Article 19 of the UDHR and the ICCPR, as well as regional instruments such as Article 11 of the CFR and Article 10 of the ECHR. Similarly, the right to vote is protected under various human rights instruments such as Article 25 of the ICCPR, Article 39 of the CFR and Article 3 of Protocol 1 to the ECHR. Freedom of expression and the right to vote are closely linked to each other. Freedom of expression is “the foundation stone for every free and democratic society”32 and among the “essential conditions for the effective exercise of the right to vote and must be fully protected”.33
At the core of freedom of expression is the “free communication of information and ideas about public and political issues” between voters, political candidates, and parties.34 The deployment of PMT and PBT techniques by political candidates and parties has resulted in a gradual shift from addressing undifferentiated mass audiences (broadcasting) to micro-targeted messages tailored to the “needs, wants, expectations, beliefs, preferences, and interests” in voters’ profiles.35 This selective information exposure violates non-targeted voters’ right to receive information and to participate collectively in public discourse,36 thereby distorting the democratic process.37
Paid advertising is one of the tools of organized social media manipulation campaigns38 and is gaining popularity in the 48 countries “where political parties are spreading disinformation during elections.”39 Disinformation is one of the three types of information disorder and is defined as “information that is false and deliberately created to harm a person, social group, organization or country.”40 The use of online political advertising to amplify politically motivated disinformation during election periods interferes with the right to vote, due to its manipulative nature.41
The European Union highlighted the need for regulatory interventions “to ensure greater transparency in the area of sponsored political content (‘political advertising’)” as one of the key actions under its Democracy Action Plan.42 In November 2021, the European Commission introduced a proposal for a regulation on the transparency and targeting of political advertising (“RPA”), which sets out detailed transparency provisions and specifically tackles privacy concerns by requiring the disclosure of information about targeting and amplification.43 Transparency of political ads is a crucial modality for combatting disinformation campaigns during elections, and the proposed regulation is thus closely linked to the EU 2022 Code of Practice on Disinformation,44 a code of conduct recognized under the Digital Services Act (“DSA”).45
Any regulatory attempt to mitigate the risks posed by political advertising must conform to the principles of necessity and proportionality, particularly since political advertising is a form of political expression and is thus afforded the highest protection under human rights law.46 This section examines the existing EU regulatory framework for political advertising across various sectoral regulations and policies on (3.2.1) political advertising and (3.2.2) data protection, and lastly provides recommendations on what forms of regulation are necessary and proportionate.
Several countries impose varying limitations on what political parties and candidates can do when campaigning online for elections. Some online platforms, such as Twitter, have even banned political advertising altogether.47 However, limitations on political advertising disseminated through online platforms “generally should be content-specific”, and “generic bans on the operation of certain sites and systems” are considered disproportionate.48 A total ban on political advertising on online platforms is therefore arguably disproportionate, and crafting careful limitations on political advertising should instead be prioritized. In practice, several countries adopt a stricter approach to regulating political advertising, whereas others are more risk averse.49
Defining what constitutes political advertising is an important starting point for policymakers. In the EU, the RPA defines ‘political advertising’ as “the preparation, placement, promotion, publication or dissemination, by any means, of a message: (a) by, for or on behalf of a political actor, unless it is of a purely private or a purely commercial nature; or (b) which is liable to influence the outcome of an election or referendum, a legislative or regulatory process or voting behavior.”50 In other words, political advertising is defined to include not only election-related advertising, but also issues of public interest such as proposed laws and regulations. Such a broad scope has been criticized, “since it subjects certain forms of dissemination of political speech to tight constraints within the context of the Regulation, as well as opening the door for further limitations at the national level.”51
Additionally, the RPA’s definition of political advertising has also been criticized for lacking a ‘commercial’ element. As stated by Joan Barata, “the proposal does not clearly identify the need for an agreed remuneration in order for a message to be considered as political advertising.”52 This distinction is paramount in ensuring that regulations on political advertising on online platforms do not interfere with voters’ legitimate, non-commercial political speech disseminated both publicly and privately through online platforms.
The RPA further provides a list of entities considered political actors in the EU, which includes the following: (a) a political party, (b) a political alliance, (c) a European political party, (d) a candidate for any elected office at European, national, regional or local level, or for a leadership position within a political party, (e) an elected official within a public institution at European, national, regional or local level, (f) an unelected member of government at European, national, regional or local level, (g) a political campaign organisation with or without legal personality, established to achieve a specific outcome in an election or referendum, and (h) any natural or legal person representing or acting on behalf of any of the persons or organisations mentioned previously, promoting the political objectives of any of those.53
Moreover, improving transparency around political advertising practices on online platforms is also an area worthy of regulatory attention. In practice, the level of transparency set by election laws and regulations varies greatly. In the EU, political advertising regulation is fragmented, and the RPA intends to harmonize it by laying down several transparency obligations, such as: identification of political advertising services (Article 5), record-keeping and information transmission (Article 6), transparency requirements for each political advertisement (Article 7), periodic reporting on political advertising services (Article 8), and indicating possibly unlawful political advertisements (Article 9).54 Critics have pointed out that the transparency obligations enshrined under the RPA fall primarily on political parties and candidates as ad buyers, and that “political advertising publishers (including platforms) do not face any independent duty to monitor for political ads”.55
Election laws and regulations in general tend to overlook the important role of online platforms in implementing transparency obligations for political advertising. To ensure effectiveness, any regulatory attempt to oblige political candidates and parties to authorize political ads on social media should be accompanied by provisions creating direct obligations for online platforms to facilitate compliance. Such direct obligations for online platforms should also cover other general provisions on basic ad transparency requirements, as well as ensuring political parties’ and candidates’ compliance with campaign budget limitations. Limiting political campaign spending is a justified restriction, necessary “to ensure that the free choice of voters is not undermined, or the democratic process distorted by the disproportionate expenditure on behalf of any candidate or party.”56 Moreover, since the Cambridge Analytica scandal and the 2016 US election, policymakers are also more aware of the threat of foreign interference in democratic processes. To mitigate this, many countries around the globe strictly prohibit political parties and candidates from receiving donations from foreign actors (countries, companies, CSOs, foreign citizens). This prohibition should also be reflected in how online platforms sell their political advertising.
In the EU, as part of their record-keeping obligations as providers of political advertising services, technology platforms are required to retain information on “the amounts they invoiced for the service or services provided, and the value of other benefits received in part or full exchange for the service or services provided.”57 In other words, the current approach in the RPA relies heavily on transparency provisions and does not yet directly limit foreign actors from funding political advertising in other countries. The initial information provided by the transparency requirements, together with other measures such as limiting payment methods to local currency and disclosing the political ad spending of political parties and candidates on the platform, can also aid election commissions in monitoring compliance with budget limitation provisions.
In addition to provisions on transparency, the RPA also includes a data protection provision, in particular a prohibition on the use of sensitive data for the targeting and amplification of political advertising.58 This brings us to the next sub-section on the data protection aspects of political advertising regulation in the EU.
In order to address privacy concerns arising from the PMT and PBT techniques deployed for political advertising on social media, data protection regulation plays a crucial role. Hailed as the world’s toughest data protection regulation, the European Union General Data Protection Regulation (“GDPR”) contains provisions that hinder intrusive data profiling using PMT and PBT techniques. The GDPR provides data subjects with a set of rights relevant to data profiling, such as the right to information59 and the right to object to data processing.60
With regard to the opaqueness of the data supply chain in the AdTech industry in particular, the right to information is an essential right that provides data subjects with greater transparency about data processing, particularly in situations where personal data have not been collected directly from the data subject.61 The GDPR obliges the data controller to provide information on the legal basis of the processing and the original source of the personal data acquired, including whether it was collected from publicly available sources.62 In relation to political advertising on online platforms, information about the scope of data used to micro-target us with political ads should also be provided as part of the right to information.63 If enforced effectively, the right to information is a powerful tool that can shed light on the behind-the-scenes workings of the PMT and PBT techniques used by political candidates and parties.
Moreover, another privacy concern around PMT and PBT is that data profiling takes place without the consent of data subjects. The GDPR sets out the conditions for obtaining consent from data subjects,64 elaborating that consent should be “freely given, specific, informed and unambiguous”.65 Moreover, personal data revealing political opinions are considered sensitive data under the GDPR and are thus afforded a higher level of protection, which includes explicit consent.66 Therefore, any data processing aimed at inferring voters’ political opinions should only take place on the basis of voters’ explicit consent. Even if social media companies argue that they can rely on legitimate interests for delivering political ads, the GDPR allows voters to object to data processing based on legitimate interests.67
The article discusses the profound impact of rapid technological developments in the commercial AdTech industry on political advertising strategies, where political campaigns increasingly integrate with AdTech, tailoring messages to individuals based on online behavioral data. This integration, exemplified by PMT and data profiling techniques, raises privacy and data protection concerns akin to those in the broader AdTech industry. The article highlights the potential for political advertising on online platforms to amplify harmful disinformation during elections, compromising the right to make informed voting decisions. Given its human rights implications, particularly concerning privacy, the right to vote, and freedom of expression, the article argues that regulation of political advertising on online platforms is essential. The inherent problems within the AdTech industry, its heavy reliance on the RTB system, and data profiling techniques pose significant threats to privacy and democratic processes. To address these challenges, the article advocates for sectoral regulations on elections and political advertising, reinforced by comprehensive data protection measures. It emphasizes the need for carefully crafted regulations to avoid undue interference with freedom of expression, urging policymakers to define political advertising, establish transparency obligations, and require prior authorization. The article concludes by highlighting how data protection regulations complement election-related measures, providing data subjects with relevant rights and higher protection for sensitive political opinions, including the right to object to data processing based on legitimate interests.
References
Book
Montjoye, Yves-Alexandre de, Jordi Quoidbach, Florent Robic, and Alex (Sandy) Pentland. “Predicting Personality Using Novel Mobile Phone-Based Metrics.” In Social Computing, Behavioral-Cultural Modeling and Prediction, edited by Ariel M. Greenberg, William G. Kennedy, and Nathan D. Bos, 48–55. Lecture Notes in Computer Science. Berlin, Heidelberg: Springer, 2013.
https://doi.org/10.1007/978-3-642-37210-0_6.
Journal
Barberá, Pablo. “Birds of the Same Feather Tweet Together: Bayesian Ideal Point Estimation Using Twitter Data.” Political Analysis 23, no. 1 (ed 2015): 76–91. https://doi.org/10.1093/pan/mpu011.
Bayer, Judit. “Double Harm to Voters: Data-Driven Micro-Targeting and Democratic Public Discourse.” Internet Policy Review 9, no. 1 (March 31, 2020).
Bi, Bin, Milad Shokouhi, Michal Kosinski, and Thore Graepel. “Inferring the Demographics of Search Users: Social Data Meets Search Queries.” In Proceedings of the 22nd International Conference on World Wide Web - WWW ’13, 131–40. Rio de Janeiro, Brazil: ACM Press, 2013. https://doi.org/10.1145/2488388.2488401.
Blumenstock, Joshua, Gabriel Cadamuro, and Robert On. “Predicting Poverty and Wealth from Mobile Phone Metadata.” Science 350, no. 6264 (November 27, 2015): 1073–76. https://doi.org/10.1126/science.aac4420.
Bodó, Balázs, Natali Helberger, and Claes H. de Vreese. “Political Micro-Targeting: A Manchurian Candidate or Just a Dark Horse?” Internet Policy Review 6, no. 4 (December 31, 2017). https://policyreview.info/articles/analysis/political-micro-targeting-manchurian-candidate-or-just-dark-horse.
Bradshaw, Samantha, and Philip N. Howard. “The Global Organization of Social Media Disinformation Campaigns.” Journal of International Affairs 71, no. 1.5 (2018): 23– 32.
Chester, Jeff, and Kathryn C. Montgomery. “The Role of Digital Marketing in Political Campaigns.” Internet Policy Review 6, no. 4 (December 31, 2017).
https://policyreview.info/articles/analysis/role-digital-marketing-political-campaigns.
Dobber, Tom, Damian Trilling, Natali Helberger, and Claes H. de Vreese. “Two Crates of Beer and 40 Pizzas: The Adoption of Innovative Political Behavioural Targeting Techniques.” Internet Policy Review 6, no. 4 (December 31, 2017). https://policyreview.info/articles/analysis/two-crates-beer-and-40-pizzas-adoption-innovative-political-behavioural-targeting.
Drunen, Max Zeno van, Natalie Helberger, and Ronan Ó Fathaigh. “The Beginning of EU Political Advertising Law: Unifying Democratic Visions through the Internal Market.” International Journal of Law and Information Technology 30, no. 2 (June 1, 2022): 181–99. https://doi.org/10.1093/ijlit/eaac017.
Nickerson, David W., and Todd Rogers. “Political Campaigns and Big Data.” Journal of Economic Perspectives 28, no. 2 (May 2014): 51–74.
https://doi.org/10.1257/jep.28.2.51.
Rubinstein, Ira. “Voter Privacy in the Age of Big Data.” SSRN Scholarly Paper. Rochester, NY, April 26, 2014. https://doi.org/10.2139/ssrn.2447956.
Ruohonen, Jukka. “A Note on the Proposed Law for Improving the Transparency of Political Advertising in the European Union,” November 2023. https://arxiv.org/pdf/2303.02863.pdf.
Saunders, Mark N.K., and Céline Rojon. “On the Attributes of a Critical Literature Review.” Coaching: An International Journal of Theory, Research and Practice 4, no. 2 (September 2011): 156–62. https://doi.org/10.1080/17521882.2011.596485.
Zuiderveen Borgesius, Frederik, Judith Moeller, Sanne Kruikemeier, Ronan Ó Fathaigh, Kristina Irion, Tom Dobber, Balázs Bodó, and Claes H. de Vreese. “Online Political Microtargeting: Promises and Threats for Democracy.” SSRN Scholarly Paper. Rochester, NY, February 9, 2018.
https://papers.ssrn.com/abstract=3128787.
Reports
Bradshaw, Samantha, and Philip N Howard. “Challenging Truth and Trust: A Global Inventory of Organized Social Media Manipulation.” Oxford Internet Institute, 2018. https://demtech.oii.ox.ac.uk/research/posts/challenging-truth-and-trust-a-global-inventory-of-organized-social-media-manipulation/.
“Data Brokers: A Call For Transparency and Accountability: A Report of the Federal Trade Commission,” May 27, 2014. https://www.ftc.gov/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014.
Joint Research Centre (European Commission), S. Lewandowsky, L. Smillie, D. Garcia, R. Hertwig, J. Weatherall, S. Egidy, et al. Technology and Democracy: Understanding the Influence of Online Technologies on Political Behaviour and Decision Making. LU: Publications Office of the European Union, 2020.
https://data.europa.eu/doi/10.2760/709177.
“Update Report into Adtech and Real Time Bidding.” ICO, 2019. https://ico.org.uk/media/about-the-ico/documents/2615156/adtech-real-time-bidding-report-201906-dl191220.pdf.
Wardle, Claire, and Hossein Derakhshan. “Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making.” Council of Europe, 2017. https://edoc.coe.int/en/media/7495-information-disorder-toward-an-interdisciplinary-framework-for-research-and-policy-making.html.
Websites
“AdTech | Privacy International.” Accessed December 18, 2022.
https://privacyinternational.org/learn/adtech.
“European Democracy Action Plan.” Accessed December 16, 2022.
GOV.UK. “The Electoral Register and the ‘Open Register.’” Accessed December 17, 2022. https://www.gov.uk/electoral-register.
(HRC), UN Human Rights Committee. “CCPR General Comment No. 16: Article 17 (Right to Privacy), The Right to Respect of Privacy, Family, Home and Correspondence, and Protection of Honour and Reputation.” Accessed December 18, 2022. https://www.refworld.org/docid/453883f922.html.
———. “CCPR General Comment No. 25: The Right to Participate in Public Affairs, Voting Rights and the Right of Equal Access to Public Service,” July 12, 1996. https://www.equalrightstrust.org/ertdocumentbank/general%20comment%2025.pdf.
———. “CCPR General Comment No. 34, Article 19, Freedoms of Opinion and Expression,” September 12, 2011. https://documents-dds-ny.un.org/doc/UNDOC/GEN/G11/453/31/PDF/G1145331.pdf?OpenElement.
https://policyreview.info/articles/news/transparency-and-no-more-political-advertising-regulation/1616.
https://commission.europa.eu/document/aef260df-a85e-482c-8d5b-323b636a0179_en.
Proposal for a Regulation of the European Parliament and of the Council on the Transparency and Targeting of Political Advertising (2021). https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/12826-Political-advertising-improving-transparency_en.
“The 2022 Code of Practice on Disinformation | Shaping Europe’s Digital Future.” Accessed December 20, 2022. https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation.
“The Digital Services Act: Ensuring a Safe and Accountable Online Environment.” Accessed December 20, 2022. https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en.
“Twitter to Ban All Political Advertising - BBC News.” Accessed December 19, 2022. https://www.bbc.co.uk/news/world-us-canada-50243306.
Laws and Regulations
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance) (2016). http://data.europa.eu/eli/reg/2016/679/2016-05-04/eng.