Public values under threat
Robust and free public debate is essential to our democratic society. Digitisation has created new opportunities for such debate: never before have ordinary people had such easy access to information, or such ability to communicate ideas widely, initiate debate and organise themselves. Nevertheless, the current digital landscape also represents a threat to our public values. A handful of large technology corporations have disproportionate control over the digital landscape, thus placing public debate at risk.
The risk comes from various quarters. For example, the opaque nature of the algorithms used by tech giants contributes to the dissemination of disinformation. The development of 'filter bubbles' leads to people being in contact mainly with others who think like themselves, which in turn facilitates radicalisation and forms a barrier to broad public debate. Moreover, targeted ad campaigns enable politicians to make different promises to different groups, without one group knowing what has been said to another. Meanwhile, online hate campaigns conducted by armies of trolls are creating new forms of intimidation and suppressing freedom of expression. Collectively, such developments are undermining the information position of the individual and interfering with the working of democracy.
Against that background, at the end of 2019 SIDN Fund and the Adessium Foundation called for proposals linked to the theme of Public Values in a Digital Society. Support was made available to projects involving the development of alternative technologies, research and educational methods aimed at protecting and promoting public values in a digital society. The nine funded projects have since all been completed. This article reflects on the initiative, describing how the projects went and the lessons we can learn from them. It also asks how public values can be protected and promoted in a digital society, and what opportunities and challenges lie ahead.
Nine projects with four sub-themes
The projects were grouped on the basis of four sub-themes. Each of the projects is briefly summarised below. For more detailed descriptions, refer to the separate feature articles linked to at various points below.
1. Use of personal data for political purposes
Three projects addressed the question of how personal data obtained through tech companies is used for political purposes. For the project Personal Data for Political Purposes run by Berlin-based Tactical Tech, freelance journalist Marjolein van Trigt investigated the situation in the Netherlands. She cited various examples of targeted online political campaigning by parties including GroenLinks, D66, VVD and Forum voor Democratie. VVD and Forum voor Democratie worked with an Austrian marketing agency that uses controversial influencing methods similar to those of Cambridge Analytica. Van Trigt also highlighted the fact that few transparency requirements apply to political parties in the Netherlands in relation to targeted online campaigning. With a view to making people aware of the online strategies used by political parties and ways of protecting themselves against them, Tactical Tech developed the Data Detox Kit. The funded project involved translating the Data Detox Kit into Dutch, adding information specific to the Dutch situation and producing a supporting information video.
Tech companies are not generally transparent about how their platforms are used by political parties. Two projects supported in the context of the call involved the development of browser plugins that enable volunteers to contribute to research into political advertising on tech platforms. One was Berlin-based AlgorithmWatch's Towards a Monitoring of Instagram, a project set up to investigate the Instagram algorithm. During the 2021 Dutch general election, AlgorithmWatch linked up with Dutch media partners NOS, Pointer and De Groene Amsterdammer to study how politicians use Instagram. Among the findings, which generated considerable media attention, was the discovery that politicians do better when they post selfies on Instagram than when they post substantive content.
Who's Using Facebook Ads to Win Your Vote? is a project run by British organisation WhoTargetsMe, based on a browser plugin that's made available for the study of political advertising on Facebook. Since 2017, the plugin has been installed more than 50,000 times in more than a hundred different countries. The funded project involved the analysis of political Facebook ads during the Dutch general election of 2021. WhoTargetsMe did the work in tandem with the University of Amsterdam's Institute for Information Law (IViR). The results indicated that the Christian Democrats made the most use of targeted Facebook advertising, although the tactic didn't enable them to win the election. That observation reinforces WhoTargetsMe's conclusion from other countries that Facebook ads don't appear to be decisive in election campaigns.
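To illustrate how data donated via such plugins can be analysed, the sketch below tallies observed ad sightings per party from a simplified record format. It is a minimal illustration only: the field names and the aggregation step are assumptions made for this example, not WhoTargetsMe's or IViR's actual data model or pipeline.

```python
# Minimal sketch: aggregating plugin-donated ad sightings per party.
# The record format (party, advertiser, seen_at) is an illustrative assumption,
# not the actual schema used by WhoTargetsMe.
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AdSighting:
    party: str          # party behind the ad, as labelled in the donated record
    advertiser: str     # name shown in the ad's "paid for by" disclosure
    seen_at: datetime   # when the volunteer's browser encountered the ad

def impressions_per_party(sightings: list[AdSighting]) -> Counter:
    """Count how often each party's ads were seen across the volunteer panel."""
    return Counter(s.party for s in sightings)

# Usage with clearly fabricated placeholder records:
sample = [
    AdSighting("CDA", "CDA campaign", datetime(2021, 3, 10, 9, 15)),
    AdSighting("CDA", "CDA campaign", datetime(2021, 3, 10, 21, 40)),
    AdSighting("D66", "D66", datetime(2021, 3, 11, 8, 5)),
]
print(impressions_per_party(sample))  # Counter({'CDA': 2, 'D66': 1})
```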
2. Strengthening the position of internet users
Of the projects supported in the context of the call, two were aimed at strengthening the position of internet users. One was the Short Course on Manipulation project by Bits of Freedom, the digital citizens' rights organisation. The project involved a study by Holly Robbins (postdoc at Eindhoven University of Technology) into the ways that tech companies manipulate our information landscape. On the basis of a literature study, interviews with former tech company designers, and design analysis, Robbins devised a taxonomy of manipulation by online platforms, distinguishing five forms: profiling, prioritisation, censorship, 'dark patterns' and self-manipulation.
Bits of Freedom then used Robbins' findings to develop a Short Course on Manipulation: an online resource for use by the general public. People who do the course learn about the five forms of manipulation and how to protect against them. The course was drawn to the attention of a large audience by means of an advert on the Dutch catch-up TV service, Uitzending Gemist. The research also formed the basis for an information report about online manipulation, aimed at specialists such as policy-makers, activists and researchers. Bits of Freedom's European counterpart EDRi will also be circulating the report internationally.
Campaign platform DeGoedeZaak ran a project called More Power to You! How Social Influencers Can Face Down Hate to address a particular problem: online hate campaigns against social change leaders. For the project, researcher Ibtissam Abaâziz interviewed various social change leaders in order to build up a picture of the strategies used by 'trolls'. Her findings demonstrated that online hate campaigns are organised activities. The research was used to guide development of DeGoedeZaak's Online Hate First Aid Kit: a set of tools and strategies designed to help social change leaders who are targeted by trolls. The kit attracted considerable media attention and was showcased in a publicity campaign by DeGoedeZaak.
3. Diversity in news algorithms
As well as informing our world view, the media are a cornerstone of democracy. A comprehensive and diverse news supply is vital for a healthy public domain. However, the use of online recommendation systems by no means guarantees diversity. Algorithms generally make recommendations on the basis of what people have liked before, which can lead to a biased news supply and the creation of 'filter bubbles'. Two of the projects facilitated by our recent call aimed to get developers to think differently about their algorithms in order to make news recommendations more diverse.
The University of Amsterdam's Institute for Information Law (IViR) ran a project entitled Algorithms for Freedom of Expression and a Well-informed Public, designed to investigate ways of measurably increasing news algorithms' diversity. A diversity toolkit was developed to help news organisations make their algorithms more diverse in various ways. Later this year, the toolkit is due to be tested by several national and international news organisations. The research is also to be brought to the attention of programmers at the Conference on Human Information Interaction and Retrieval (CHIIR 2021).
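The toolkit itself is not described in detail here, but one common way of making recommendations measurably more diverse is to re-rank candidate articles so that each pick balances relevance against similarity to what has already been selected (a maximal-marginal-relevance approach). The sketch below illustrates that general idea under assumed inputs; it is not IViR's actual method.

```python
# Illustrative greedy re-ranking that trades relevance off against topical diversity.
# Generic maximal-marginal-relevance sketch, not the IViR diversity toolkit.

def rerank_for_diversity(candidates, relevance, topics, k=10, lam=0.7):
    """
    candidates: list of article ids
    relevance:  dict id -> relevance score from the base recommender (assumed given)
    topics:     dict id -> set of topic labels for the article (assumed given)
    lam:        weight on relevance; (1 - lam) weights the diversity penalty
    """
    def overlap(a, b):
        # Jaccard similarity between two articles' topic sets
        ta, tb = topics[a], topics[b]
        return len(ta & tb) / len(ta | tb) if (ta | tb) else 0.0

    selected = []
    remaining = list(candidates)
    while remaining and len(selected) < k:
        def mmr(a):
            max_sim = max((overlap(a, s) for s in selected), default=0.0)
            return lam * relevance[a] - (1 - lam) * max_sim
        best = max(remaining, key=mmr)
        selected.append(best)
        remaining.remove(best)
    return selected
```

Raising `lam` keeps the ranking close to the base recommender; lowering it pushes harder for topical spread at the cost of raw relevance.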
Leuven University's Institute for Media Studies set up the project Who Would You Like to be Guided by? to study news algorithms from the user's perspective. News consumers currently have very little influence over the recommendations they receive. With the ultimate goal of changing that, Leuven University investigated the possibility of using 'recommendation avatars' to get active and conscious input from users. After gathering data through workshops and interviews with news consumers, the team has defined three optimised recommendation avatars: the Expert, the Challenger and the Relaxer. The research is to be presented at the Conference on Human Factors in Computing Systems (CHI 21).
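In practice, a 'recommendation avatar' can be thought of as a named bundle of preferences that the user selects and the recommender then applies when scoring articles. The sketch below shows one hypothetical way of encoding the three avatars named in the study as weight profiles; the attribute names, weights and scoring formula are assumptions for illustration, not the Leuven team's implementation.

```python
# Hypothetical encoding of user-selectable recommendation avatars as weight profiles.
# Attribute names and weights are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class AvatarProfile:
    depth: float       # preference for in-depth, expert coverage
    challenge: float   # preference for viewpoints unlike the user's own
    lightness: float   # preference for lighter, low-effort items

AVATARS = {
    "Expert":     AvatarProfile(depth=1.0, challenge=0.3, lightness=0.0),
    "Challenger": AvatarProfile(depth=0.5, challenge=1.0, lightness=0.1),
    "Relaxer":    AvatarProfile(depth=0.1, challenge=0.1, lightness=1.0),
}

def score(article: dict, avatar: AvatarProfile) -> float:
    """Weight an article's (assumed) depth/challenge/lightness features by the chosen avatar."""
    return (avatar.depth * article["depth"]
            + avatar.challenge * article["challenge"]
            + avatar.lightness * article["lightness"])
```

A reader who picks the Challenger would then see the same candidate pool ranked with more weight on opposing viewpoints, giving users active and conscious influence over their recommendations.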
4. Education in a Digital Society
Finally, two of the supported projects focused on education. Today's children – the citizens of the future – are growing up in a world where a significant part of the public domain is online. How should we prepare them for life in a digital society and educate them about the role of public values in such a society?
SkillsDojo's teaching package The Ethics of Artificial Intelligence and Democracy for children aged ten to fourteen provides creative and programming projects that introduce youngsters to artificial intelligence and prompt them to think about its effects. SkillsDojo programmes are suitable both for independent home use and for group use in a classroom setting. The accompanying videos have now been watched about ten thousand times. On 1 January 2021, SkillsDojo became part of NBD Biblion. As a result, the teaching materials are now part of the service package for public libraries and media centres, and thus available to an even bigger audience.
Civinc's Vox Pop Academy Secondary Vocational Pilot has been developed for secondary-school pupils following vocational courses. The Vox Pop Academy is a digital environment for online debates between young people with contrasting opinions. The ability to engage with people who hold different views is vital for a healthy democracy, but the current internet landscape tends to polarise opinion and thus divide society.
A Vox Pop Academy session starts with the student responding to a number of statements. Next, they are put in contact with someone with a different take on the subject for an anonymous one-to-one chat. During the discussion, a bot chips in with tips to help the debate along and keep everything polite and on-topic. At the end, the teacher discusses the responses to the opening statements and how the debates went. Last year, three vocational secondary schools took part in a pilot, which was well received by the students.
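The matching step described above can be implemented in many ways; a simple approach is to represent each student's responses to the opening statements as a vector of agreement scores and pair the students whose vectors differ most. The sketch below shows that idea; it is an assumption about how such matching could work, not Civinc's actual algorithm.

```python
# Illustrative pairing of students with contrasting responses to the opening statements.
# Responses are assumed to be integers per statement (e.g. 1 = disagree ... 5 = agree).
from itertools import combinations

def disagreement(a: list[int], b: list[int]) -> int:
    """Total absolute difference across all statements."""
    return sum(abs(x - y) for x, y in zip(a, b))

def pair_for_debate(responses: dict[str, list[int]]) -> list[tuple[str, str]]:
    """Greedy matching: repeatedly pair the two unmatched students who disagree most."""
    unmatched = set(responses)
    pairs = []
    while len(unmatched) >= 2:
        a, b = max(combinations(unmatched, 2),
                   key=lambda p: disagreement(responses[p[0]], responses[p[1]]))
        pairs.append((a, b))
        unmatched -= {a, b}
    return pairs
```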
Civinc used the findings to refine its technology, and is now looking to scale up to a nationwide pilot. The organisation also deployed an adult version of the concept during the 2021 Dutch general election. No fewer than 330,000 people with wide-ranging political preferences visited Waaromkiesjij.nl and exchanged views in online 'speed dates'. The project attracted a lot of media attention, thanks partly to Civinc's partnership with a PR agency. The project's reach was additionally extended by collaboration with Stemwijzer, an online voter support tool used by 7.8 million people. A PhD student was also recruited to investigate the effect of engaging in debate with someone who holds opposing political views.
Public Values in a Digital Society: opportunities and challenges
In the context of the call, SIDN Fund and the Adessium Foundation supported nine diverse projects designed to help protect public values in our digital society in a variety of ways. What lessons were learned? How can the impact of the projects be maximised? More generally, how can public values be protected and promoted in a digital society? A series of meetings were held at the outset, mid-point and conclusion of the programme, at which those questions were put to the project leaders. The outcome was the following list of opportunities and challenges:
Small projects versus big tech corporations
Today's digital landscape is characterised by a power imbalance. A handful of big technology corporations define the rules, and initiatives that seek to offset the influence of big tech – such as the projects funded by SIDN Fund and the Adessium Foundation – are very small by comparison. That presents various technical, legal and strategic challenges.
On the technical front, one of the biggest challenges is the closed nature of the big tech corporations' data systems. The companies in question allow little scrutiny of their algorithms. Two of the supported projects, those run by AlgorithmWatch and WhoTargetsMe, involved the use of specially developed browser plugins to gather data on, and thus provide insight into, the way that big tech companies operate. However, both found themselves facing significant technical constraints. For example, the plugins don't work with mobile apps, whose role is steadily growing. Such initiatives can also be vulnerable to legal challenge, and the tech companies have far more legal muscle than the investigators. AlgorithmWatch received a warning from Facebook, but WhoTargetsMe was not contacted, despite the fact that Facebook had actively opposed a similar earlier project involving the monitoring of political adverts on the platform. Collaboration on technical and legal matters is seen as a useful way of strengthening the position of small players.
Relations with big tech corporations can be strategically challenging too. Some projects call the practices of tech companies into question and seek to get regulations introduced to oblige the companies to change their policies; others aim to engage the tech companies in dialogue with a view to getting them to look at their algorithms in a different way, so that change is realised from within. One thing that all project leaders said was that it's difficult to contact the tech giants, regardless of whether the aim is to initiate dialogue, to question certain practices, or to offer alternative technologies. It is similarly difficult to contact policy-makers who might be able to require tech companies to comply with legal or technical rules. Again, collaboration is regarded as a way to strengthen the campaigners' hand.
International versus national
Digitisation doesn't recognise international borders; the big tech corporations operate globally. By contrast, democracy and the public realm are organised largely at the national and local levels. That presents various challenges. On the one hand, it's important to know what is happening at the local level and to maintain contact with relevant media partners and others in order to communicate with a local audience. On the other hand, international collaboration and knowledge exchange are required to build a picture of what is happening and to develop an effective response.
Tactical Tech therefore operates on both the local and international levels. The organisation is active in various countries, studying how political parties are using citizens' data for political purposes. They work with local journalists and researchers to develop a thorough understanding of what's happening in a particular country and have now initiated a total of sixteen national studies. By comparing the findings from the various countries, the organisation is able to identify common characteristics and patterns. The approach has enabled Tactical Tech to build up a picture of the complex impact of targeted political advertising at the local and international levels. The challenge is finding suitable local partners and identifying the best way of combining the findings in order to exert influence.
Some project organisers seek to amplify the impact of their work through international collaboration. Much depends, however, on having the financial resources to get material translated and on finding partners who can deliver access to the relevant target groups. For example, Bits of Freedom is getting its policy-makers' report translated into English with a view to sharing it with EDRi, an organisation that lobbies at the European level for the protection and reinforcement of digital citizens' rights. The Short Course on Manipulation will also be translated into English and made freely available to any digital citizens' rights organisations that want to use it locally (in English or translated into the local language). For its part, DeGoedeZaak is looking at the possibility of collaborating with campaign organisations in other countries in order to boost the impact of its Online Hate First Aid Kit.
What's the best way to reach the target audience?
Online manipulation, privacy and democracy are relatively complex, abstract themes. Bringing them to the attention of a wide audience can therefore be difficult. One cheap approach is to advertise on big tech platforms. That isn't a comfortable option, however, for campaigners seeking to question the power of the corporations behind those platforms. Bits of Freedom therefore opted to advertise on Uitzending Gemist, an online platform where viewers can catch up with previously televised public service broadcasters' programmes.
Another challenge is making contact with journalists. Civinc therefore chose to invest in partnering with a PR agency to promote Waaromkiesjij.nl. The collaboration resulted in the initiative being picked up by various important news outlets. A tie-in with Stemwijzer also enabled Civinc to extend its reach considerably.
Several of our supported projects were realised with the help of media partners. Such partnerships are viable mainly for projects that have significant news-generating potential. For its Dutch activities, AlgorithmWatch struck up relationships with NOS, Pointer and De Groene Amsterdammer, for example. The arrangement gave the media outlets access to newsworthy data, while AlgorithmWatch got more data donors, and thus more opportunity to study how the Instagram algorithm works.
WhoTargetsMe came up with an interesting way of increasing its reach: the organisation developed a special service for journalists, called Notify. Notify users were sent daily automated e-mails detailing all the Facebook advertisements that individual political parties were using that day. The service enabled journalists to stay up-to-date with, for example, what the candidates they were covering were promising to Facebook users. The aim was to generate a flywheel effect, where journalists got material for their stories, which in turn raised WhoTargetsMe's profile, leading to more plugin downloads and thus an improved data supply, adding further momentum to the process. The tactic had worked well during elections in the UK and elsewhere, but in the Netherlands WhoTargetsMe secured relatively little media exposure and there weren't as many plugin downloads as in other countries. Securing appropriate media access channels and activating them at the right moment is challenging and likely to remain so, especially for organisations that operate internationally.
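The basic mechanics of a service like Notify are easy to sketch: group the day's observed ads by party and turn each group into one e-mail body. The snippet below is a generic illustration of that digest step under an assumed record format; it is not WhoTargetsMe's actual Notify code.

```python
# Generic sketch of a daily digest grouping observed ads by party,
# loosely modelled on the idea behind Notify (record format is assumed).
from collections import defaultdict

def build_daily_digest(ads: list[dict]) -> str:
    """ads: records with 'party' and 'text' keys observed on a given day."""
    by_party = defaultdict(list)
    for ad in ads:
        by_party[ad["party"]].append(ad["text"])

    lines = []
    for party, texts in sorted(by_party.items()):
        lines.append(f"{party} ({len(texts)} ads today):")
        lines.extend(f"  - {t}" for t in texts)
    return "\n".join(lines)
```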
Finally, the project organisers helped each other generate publicity where possible. For example, Bits of Freedom featured AlgorithmWatch's browser plugin in a newsletter, while Tactical Tech's Data Detox Kit flagged up the WhoTargetsMe plugin. Evidently, collaboration is just as important for maximising reach as it is for securing other goals.
Online versus offline
Pandemic-related restrictions meant that most of us spent a large part of 2020 confined to our homes. The internet therefore came to play an even greater part in our lives than was already the case, extending its influence into new areas. That served to amplify the urgency of the projects supported in the context of this call. However, the situation also impacted project organisers' ability to progress their activities. The importance to some projects of face-to-face interaction was made abundantly clear.
That was the case with the educational projects, for instance. With schools all closed, Civinc's pilot with vocational secondary pupils had to be held online. Although the participating schools were grateful for an activity that was ideal for remote participation, the absence of a teacher introduced unexpected difficulties. Some pupils simply didn't 'show up', or dropped out mid-session, for example. In many cases, such problems could probably have been prevented by having a teacher on hand to encourage and explain. Civinc is therefore looking to scale up on the basis of collaboration with community organisations that can supervise and guide the debates.
SkillsDojo's tools are in principle suitable for young people to use independently. Consequently, use of the organisation's lesson programmes rose sharply during lockdown. Again, however, the absence of a teacher to provide guidance and context had a negative effect on the impact. That's one of the reasons why SkillsDojo has merged with NBD Biblion. NBD Biblion's media coaches can deliver SkillsDojo lessons in schools, thus increasing the programme's reach and impact.
DeGoedeZaak's research into online trolling showed that there is significant overlap between online and offline intimidation. Trolls will often harass an influencer's family or employer, or threaten physical violence, for example. Online intimidation is merely one of many ways of putting pressure on social change leaders, and it should not be treated as separate from physical intimidation. In the fight against trolling, it can therefore be advantageous to forge ties with community organisations, with a view to maximising impact.
Conclusion: collaboration is vital for standing up to Big Tech
The protection and promotion of public values in a digital society is a complex challenge, for which there is no universally applicable solution. The projects supported in the context of the recent joint call by SIDN Fund and the Adessium Foundation involved initiatives at various levels designed to curb the power of big tech companies and safeguard democracy, the rule of law and privacy. The projects focused on education, clarifying how big tech platforms' algorithms work, promoting awareness in the general public and amongst policy-makers, advancing alternative technologies, delivering strategies for strengthening the position of internet users, increasing the diversity of automated news story recommendations and countering the creation of filter bubbles.
Structural protection of public values in a digital society requires effective legislation and regulation to limit the power of Big Tech. In recent years, the political pressure on tech platforms has increased considerably. Both the EU and the US Federal Trade Commission have imposed huge fines on Google and Facebook for privacy infringements and abuses of power. And Australia has decided to challenge Facebook's dominant role in news delivery. Meanwhile, in the Netherlands, organisations such as the Council for Public Administration are calling for the appointment of a Minister of Digital Affairs to protect citizens against the power of tech corporations and make them more accountable for automated decision-making.
It also seems that public perceptions of Big Tech are changing. Corporations previously held in great esteem are now attracting criticism. However, it must be acknowledged that there has been no consequent large-scale migration to alternative technologies. The 'lock-in' effects of tech platforms are so strong that widespread dependency has developed. The earnings and profits of big tech corporations continue to grow exponentially, and the pandemic has served only to consolidate their power.
What about the big tech corporations themselves? While there is nothing to suggest that they are inclined to pursue fundamental change of their own volition, a combination of effective legislation and regulations, public pressure and constructive proposals regarding smart algorithm design could ultimately bring about a rethink. In its own way, each of the projects supported through the recent call contributed to that process. Local and international knowledge exchange, partnership and media mobilisation can amplify the impact of the supported projects with a view to securing public values in a digital society.