<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
    <channel>
        <atom:link href="https://www.beyond-eve.com/technialarticles/rss" rel="self" type="application/rss+xml" />
        <title><![CDATA[Beyond EVE: Events]]></title>
        <link><![CDATA[https://www.beyond-eve.com/technialarticles/rss]]></link>
        <description><![CDATA[]]></description>
        <language>de-DE</language>
        <pubDate>Sun, 22 Feb 2026 13:12:37 +0100</pubDate>

                    <item>
                <title><![CDATA[International Observatory on Information and Democracy (OID): A Major New Report on the State of News Media, AI and Data Governance]]></title>
                <link>https://www.beyond-eve.com/en/events/international-observatory-on-information-and-democracy-oid-a-major-new-report-on-the-state-news-media-ai-and-data-governance</link>
                <description><![CDATA[<p>As part of the <a href="https://informationdemocracy.org/mission/" rel="noopener noreferrer" target="_blank">Forum on Information and Democracy</a>’s <strong>Global Dissemination month</strong>, the Alexander von Humboldt Institute for Internet and Society (HIIG) is opening its doors for a special multi-stakeholder event. This gathering will bring together academics, activists, and policymakers for an exclusive showcase of the OID’s latest findings. The OID’s results will be presented by Professor Jeanette Hofmann and Professor Robin Mansell, followed by a “<a href="https://www.hiig.de/en/events/digitaler-salon-gespraechsklimawandel/" rel="noopener noreferrer" target="_blank">Digitaler Salon</a>” in the evening to publicly discuss key issues of information flows, the changing discourse climate, and the action necessary in different sectors.</p><p>After more than a year of work and a review of more than 3,000 sources, the OID is launching its first meta-analysis. The report is set to provide a global understanding of the current structure of the information and communication space and its impact on public debate and democracy around the world.</p>]]></description>
                <author><![CDATA[Alexander von Humboldt Institute for Internet and Society (HIIG)  <info@hiig.de>]]></author>
                <pubDate>Sun, 05 Jan 2025 18:23:47 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Globalizing the European Media Order: The Brussels Effect in Times of Crisis]]></title>
                <link>https://www.beyond-eve.com/en/events/globalizing-the-european-media-order-the-brussels-effect-in-times-of-crisis</link>
                <description><![CDATA[<p>What are the challenges of finding consensus in a diverse Europe, and what is Europe’s potential as a global regulator of the digital field? Renate Nikolay is a key voice in the development of European digital regulation, with deep experience in data protection, hate speech, and disinformation. She will talk with Prof. Dr. Wolfgang Schulz about the challenges of finding consensus in a diverse Europe and the potential of Europe as a global regulator of the digital field. Together they will ask whether Europe’s regulatory approaches are sufficiently anchored in scientific insights into platforms and their regulation, and whether the “Brussels effect” of digital rules underlines the importance of European digital law.</p><p>The event will be held in English and moderated by <a href="https://www.hans-bredow-institut.de/en/staff/matthias-c-kettemann" rel="noopener noreferrer" target="_blank">Prof. Dr. Matthias C. Kettemann</a>.</p><p><br></p><p><strong>Renate Nikolay</strong> is Head of Cabinet of Věra Jourová, Vice-President for Values and Transparency, working on matters such as the rule of law and disinformation. She was Director in charge of Asia and Latin America in DG Trade and, from 2014 to 2019, Head of Cabinet of the Commissioner for Justice, Consumers and Gender Equality, where she played a key role in the adoption of the data protection reform, the establishment of the European Public Prosecutor’s Office, and the Code of Conduct with platforms on online hate speech. She holds a law degree (erstes und zweites Staatsexamen) from the Free University of Berlin.</p><p><strong>Wolfgang Schulz</strong> is Director of the Leibniz Institute for Media Research │ Hans Bredow Institute (HBI) and holds the university professorship “Media Law and Public Law including its Theoretical Foundations” at the Faculty of Law of the University of Hamburg.
Since February 2012, he has also been Research Director of the Alexander von Humboldt Institute for Internet and Society (HIIG). His work focuses on freedom of communication, the legal regulation of media content, questions of law in new media (above all in digital television), the legal bases of journalism, the jurisprudential foundations of freedom of communication, and the implications of the changing public sphere for the law.</p>]]></description>
                <author><![CDATA[Alexander von Humboldt Institute for Internet and Society (HIIG)  <info@hiig.de>]]></author>
                <pubDate>Sun, 24 Apr 2022 12:04:59 +0200</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Krisztina Rozgonyi and Marius Dragomir: Freedom of expression in Central and Eastern Europe]]></title>
                <link>https://www.beyond-eve.com/en/events/krisztina-rozgonyi-and-marius-dragomir-freedom-of-expression-in-central-and-eastern-europe</link>
                <description><![CDATA[<p>Media systems in Central and Eastern Europe have been subject to significant transformation processes over the past two decades. In the face of populist tendencies and with large swathes of the media being captured by governments and oligarchs, the space for independent journalism has dramatically shrunk in most of the region’s nations. For citizens, this results in experiencing their everyday realities through multiple layers of distorted communication channels. This situation is further compounded by global digital technologies such as the algorithm-driven manipulation of content. These rapid developments have additional negative effects on national and regional media diversity.</p><p>How can economically vulnerable media regain editorial independence and stand up against the powerful propaganda channels? To find answers to this question, this edition of the lecture series features two scholars who specialize in the digital transformation of media systems in democratic societies and the particular challenges in Central and Eastern Europe.</p><p><em>Marius Dragomir</em> talks about the changes experienced by media systems in Central and Eastern Europe and presents the findings of his research into the impact of media capture on independent journalism. He highlights risks that independent journalism is likely to face in the near future. <em>Krisztina Rozgonyi</em> analyses how digital societies in Central and Eastern Europe are embedded in a politically manipulated communicative context and sheds light on its historical roots. This unique situation is further complicated by social media platforms, resulting in an increased vulnerability of the public to hate speech, disinformation, and propaganda.</p><p><br></p><p><strong>Marius Dragomir</strong> is the Director of the Center for Media, Data, and Society. 
In 2015, he founded <a href="https://mpmonitor.org/" rel="noopener noreferrer" target="_blank"><em>MediaPowerMonitor</em></a>, a community of experts in media policy covering trends in regulation, business, and politics that influence journalism. He has spent the past two decades in the media research field, specializing in media and communication regulation, digital media, governing structures of public service media, and media and ownership regulation. Marius now runs a number of comparative research projects, including the <a href="https://cmds.ceu.edu/media-influence-matrix-whats-it-all-about" rel="noopener noreferrer" target="_blank"><em>Media Influence Matrix</em></a>, a global research project looking into power relations and undue influence in news media.</p><p><strong>Krisztina Rozgonyi</strong> is a senior scientist at the <a href="https://www.oeaw.ac.at/cmc/the-institute/staff/krisztina-rozgonyi" rel="noopener noreferrer" target="_blank">Institute for Comparative Media and Communication Studies (CMC)</a> of the Austrian Academy of Sciences (ÖAW) and a senior expert on international media, telecommunications, and intellectual property law and policy. She works with international and European organizations, national governments, and regulators as an advisor on media freedom, spectrum policy, and digital platform governance. Her recent work for the Venice Commission focused on <a href="https://www.venice.coe.int/webforms/documents/default.aspx?pdffile=CDL-LA(2018)002-e" rel="noopener noreferrer" target="_blank">responding to disinformation online</a>. Krisztina Rozgonyi has also engaged recently with the OSCE Representative on Media Freedom as an expert on <a href="https://www.osce.org/representative-on-freedom-of-media/452452" rel="noopener noreferrer" target="_blank">Artificial Intelligence &amp; media pluralism</a>.</p>]]></description>
                <author><![CDATA[Alexander von Humboldt Institute for Internet and Society (HIIG)  <info@hiig.de>]]></author>
                <pubDate>Sun, 24 Apr 2022 11:49:50 +0200</pubDate>
                            </item>
                    <item>
                <title><![CDATA[The 2021 Salant Lecture on Freedom of the Press: Maria Ressa, CEO of Rappler and 2021 Nobel Peace Prize Winner]]></title>
                <link>https://www.beyond-eve.com/en/events/the-2021-salant-lecture-on-freedom-of-the-press-maria-ressa-ceo-of-rappler-and-2021-nobel-peace-prize-winner</link>
                <description><![CDATA[<p>On Tuesday, November 16th at 6:00 pm ET in the John F. Kennedy Jr. Forum, this year’s Salant Lecture on Freedom of the Press will be delivered by&nbsp;<strong>Maria Ressa</strong>, 2021 Nobel Peace Prize winner, co-founder and CEO of Rappler.com, Fall 2021 Shorenstein Center Fellow, and Center for Public Leadership Hauser Leader. Shorenstein Center on Media, Politics and Public Policy Director&nbsp;<strong>Nancy Gibbs</strong>&nbsp;will moderate a conversation with Maria after her remarks.</p><p>A journalist in Asia for 35 years, Maria Ressa co-founded Rappler.com, the top digital-only news site that is leading the fight for press freedom in the Philippines. As Rappler’s CEO and president, Maria has endured constant political harassment and arrests by the Duterte government.</p><p>Maria was awarded the 2021 Nobel Peace Prize for her courageous fight to uphold freedom of expression. She was also Time Magazine’s 2018 Person of the Year, and has received numerous other awards and recognition for her journalism and fearlessness in the face of efforts to silence her.</p><p>The <a href="https://shorensteincenter.org/programs/prizes-lectures/salant-lecture/" rel="noopener noreferrer" target="_blank">Salant Lecture on Freedom of the Press</a> is delivered annually by a prominent journalist, scholar, or practitioner. Named for Richard Salant, a former president of CBS News, a defender of the freedom of the press, and a champion of high ethical and news standards, the annual lecture is made possible through a fund established in 2007 by the estate of Dr. Frank Stanton. Stanton, a former president of CBS and staunch defender of First Amendment rights, set up the fund in honor of his longtime friend and colleague Salant, a graduate of Harvard College and Harvard Law School.</p>]]></description>
                <author><![CDATA[Shorenstein Center on Media, Politics and Public Policy]]></author>
                <pubDate>Thu, 21 Oct 2021 21:04:15 +0200</pubDate>
                            </item>
                    <item>
                <title><![CDATA[AI and Content Moderation]]></title>
                <link>https://www.beyond-eve.com/en/events/ai-and-content-moderation</link>
                <description><![CDATA[<p>Public pressure on platform companies to monitor the content on their sites more rigorously is constantly increasing. To address this, platforms are turning to algorithmic content moderation systems. These systems prioritize content that promises to increase engagement and block content that is deemed illegal or infringes the platform's own policies and guidelines. But content moderation is a ‘wicked problem’ that raises many questions, all of which defy simple answers. Where is the line between hate speech and freedom of expression – and how can this be automated and deployed on a global scale? Are platforms overblocking legitimate content, or are they rather failing to limit illegal speech on their sites?&nbsp;</p><p>Within the framework of a ten-week virtual research sprint hosted by the HIIG, thirteen international researchers from various disciplines came together to tackle the challenges posed by automation in content moderation. Their work resulted in policy briefings focused on algorithmic audits and on increasing the transparency and accountability of automated content moderation systems.
We warmly invite you to learn more about their findings and attend their output presentations followed by a panel discussion.</p><h4><strong>Agenda</strong></h4><p>Opening remarks on the project and the research sprint by research director Wolfgang Schulz and research lead Alexander Pirang</p><p>Presentations of the research outputs by the sprint fellows:</p><p><br></p><ul><li><strong>David Morar,</strong> guest researcher at <a href="https://datagovhub.elliott.gwu.edu/staff/" rel="noopener noreferrer" target="_blank">George Washington University</a>, Elliott School of International Affairs, USA</li><li><strong>Aline Iramina,</strong> PhD candidate at the <a href="https://www.gla.ac.uk/" rel="noopener noreferrer" target="_blank">University of Glasgow</a>, Great Britain</li><li><strong>Sunimal Mendis,</strong> lecturer at the <a href="https://research.tilburguniversity.edu/en/persons/sunimal-mendis" rel="noopener noreferrer" target="_blank">University of Tilburg</a>, Netherlands</li></ul><p>Followed by a panel discussion moderated by Jennifer Boone with:</p><ul><li><strong>Angelica Fernandez</strong>, fellow of the research sprint and PhD candidate at the University of Luxembourg</li><li><strong>Philipp Otto</strong>, founder and director of the iRights.lab</li><li><strong>Matthias Kettemann</strong>, associated researcher at the HIIG and scientific lead of the research project “Regulatory Structures and the Emergence of Rules in Online Spaces” at the Leibniz-Institut für Medienforschung │ Hans-Bredow-Institut</li></ul>]]></description>
                <author><![CDATA[Alexander von Humboldt Institute for Internet and Society (HIIG)  <info@hiig.de>]]></author>
                <pubDate>Sun, 22 Feb 2026 13:12:37 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[AlgorithmWatch forced to shut down Instagram monitoring project after threats from Facebook]]></title>
                <link>https://www.beyond-eve.com/en/technicalarticles/algorithmwatch-forced-to-shut-down-instagram-monitoring-project-after-threats-from-facebook</link>
                <description><![CDATA[<p><strong>Digital platforms play an ever-increasing role in structuring and influencing public debate. Civil society watchdogs, researchers and journalists need to be able to hold them to account. But Facebook is increasingly fighting those who try. It shut down New York University’s Ad Observatory last week, and went after AlgorithmWatch, too. The European Parliament and EU Member States must act now to prevent further bullying.</strong></p><p>On 3 March 2020, AlgorithmWatch launched a project to monitor Instagram’s newsfeed algorithm. Volunteers could install a browser add-on that scraped their Instagram newsfeeds. Data was sent to a database we used to study how Instagram prioritizes pictures and videos in a user’s timeline.</p><p>Over the last 14 months, about 1,500 volunteers installed the add-on. With their data, we were able to show that Instagram likely <a href="https://algorithmwatch.org/en/story/instagram-algorithm-nudity/" rel="noopener noreferrer" target="_blank">encouraged</a> content creators to post pictures that fit specific representations of their body, and that politicians were likely to <a href="https://algorithmwatch.org/en/instagram-algorithm-politicians/" rel="noopener noreferrer" target="_blank">reach a larger audience</a> if they abstained from using text in their publications (Facebook denied both claims). Although we could not conduct a precise audit of Instagram’s algorithm, this research is among the most advanced studies ever conducted on the platform. The project was supported by the European Data Journalism Network and by the Dutch foundation SIDN. 
It was done in partnership with <a href="https://www.mediapart.fr/journal/international/150620/sur-instagram-la-prime-secrete-la-nudite-se-deshabiller-pour-gagner-de-l-audience" rel="noopener noreferrer" target="_blank">Mediapart</a> in France, <a href="https://web.archive.org/web/20210303082809/https:/nos.nl/artikel/2371016-het-algoritme-van-instagram-verslaan-best-lastig-voor-een-politicus.html" rel="noopener noreferrer" target="_blank">NOS</a>, <a href="https://www.groene.nl/artikel/de-poppetjes-zijn-op-instagram-belangrijker-dan-de-inhoud" rel="noopener noreferrer" target="_blank">Groene Amsterdammer</a> and <a href="https://pointer.kro-ncrv.nl/politieke-campagnes-met-veel-selfies-worden-beloond-door-het-instagram-algoritme" rel="noopener noreferrer" target="_blank">Pointer</a> in the Netherlands, and <a href="https://www.sueddeutsche.de/wahlfilter" rel="noopener noreferrer" target="_blank">Süddeutsche Zeitung</a> in Germany, and was covered by dozens of news outlets around the world.</p><p><em>by Nicolas Kayser-Bril</em></p>]]></description>
                <author><![CDATA[AlgorithmWatch gGmbH <info@algorithmwatch.org>]]></author>
                <pubDate>Tue, 14 Sep 2021 20:20:41 +0200</pubDate>
                            </item>
                    <item>
                <title><![CDATA[“News you don’t believe”: User perspectives on f*ke news and misinformation]]></title>
                <link>https://www.beyond-eve.com/en/events/news-you-dont-believe-user-perspectives-on-fke-news-and-misinformation</link>
                <description><![CDATA[<p>Users’ perspectives on what fake news and misinformation are and are not, who drives them, and where people say they see them are important for understanding the scale and scope of public concern, for how that concern corresponds with research insights and aligns with proposed responses, and for the credibility and even the effect of those responses. In this presentation, Professor Rasmus Kleis Nielsen, Director of the Reuters Institute for the Study of Journalism, uses survey data and focus group material from Reuters Institute research to present an overview of user perspectives on “fake news” and misinformation more broadly, and to identify some commonalities and differences between how the public, researchers, and policymakers talk about these problems.</p><p><strong>Professor Rasmus Kleis Nielsen</strong>&nbsp;is Director of the Reuters Institute for the Study of Journalism and Professor of Political Communication at the University of Oxford. He was previously Director of Research at the Reuters Institute and Editor-in-Chief of the International Journal of Press/Politics. His work focuses on changes in the news media, on political communication, and the role of digital technologies in both. He has done extensive research on journalism, American politics, and various forms of activism, and a significant amount of comparative work in Western Europe and beyond.</p>]]></description>
                <author><![CDATA[Shorenstein Center on Media, Politics and Public Policy]]></author>
                <pubDate>Thu, 21 Oct 2021 21:01:17 +0200</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Iyad Rahwan: How to trust machines?]]></title>
                <link>https://www.beyond-eve.com/en/events/iyad-rahwan-how-to-trust-machines</link>
                <description><![CDATA[<p>Machine intelligence plays a growing role in our lives. Today, machines recommend things to us, such as news, music, and household products. They trade in our stock markets and optimise our transportation and logistics. They are also beginning to drive us around, play with our children, and diagnose our illnesses. How do we ensure that these machines will be trustworthy? This lecture explores various psychological, social, cultural, and political factors that shape our trust in machines and argues that mastering the challenges of the information revolution must not be understood solely as a problem of computer science.</p><p>&nbsp;</p><p><strong>Iyad Rahwan</strong> is director of the Max Planck Institute for Human Development in Berlin, where he founded and leads the Center for Humans and Machines. He is also an honorary professor of Electrical Engineering and Computer Science at the Technische Universität Berlin. Until June 2020, he was an Associate Professor of Media Arts &amp; Sciences at the Massachusetts Institute of Technology (MIT). Rahwan holds a PhD in Information Systems (Artificial Intelligence) from the University of Melbourne, Australia. His work lies at the intersection of computer science and human behavior, with a focus on collective intelligence, large-scale cooperation, and the societal impact of artificial intelligence and social media. In addition to various journal articles, Iyad Rahwan is co-author of the study <em>Reply to: Life and death decisions of autonomous vehicles</em> and, together with Jean-François Bonnefon, he published the paper <em>Machine Thinking, Fast and Slow</em> (both 2020).</p><p>&nbsp;</p><p><strong>The event will be held in English. </strong></p>]]></description>
                <author><![CDATA[Alexander von Humboldt Institute for Internet and Society (HIIG)  <info@hiig.de>]]></author>
                <pubDate>Fri, 18 Dec 2020 11:13:55 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Alexander von Humboldt Institute for Internet and Society (HIIG) ]]></title>
                <link>https://www.beyond-eve.com/en/organisations/alexander-von-humboldt-institute-for-internet-and-society-hiig</link>
                <description><![CDATA[<p>The Alexander von Humboldt Institute for Internet and Society (HIIG) was founded in 2011 to research the development of the internet from a societal perspective and better understand the digitalisation of all spheres of life. As the first institute in Germany with a focus on internet and society, HIIG has established an understanding that centres on the deep interconnectedness of digital innovations and societal processes.&nbsp;The development of technology reflects norms, values and networks of interests, and conversely, technologies, once established, influence social values.</p><p><br></p><h3>We explore new models of thought and action</h3><p>Modern societies are based on ever-changing sets of norms, procedures and structures that are intended to enable free and democratic coexistence. In times of fundamental social, economic and technical transformation, however, some of these institutions are reaching the limits of their ability to change and "broken concepts" are emerging. This term refers to ways of thinking, patterns of action or explanatory models that are so deeply connected to their previous context that they now seem to have come from a different era and need to be rethought. We want to research such broken concepts – such as the once-meaningful distinction between the offline and online world – and help overcome them by offering new models of thought and action.&nbsp;</p><p>By doing so, we are actively shaping the society of the future. Based on the scientific competences brought together at the institute and its dedication to interdisciplinarity, HIIG can engage with current topics such as the "platformisation" of the economy and society or the use of artificial intelligence and question the underlying concepts, structures and norms.</p>]]></description>
                <author><![CDATA[Alexander von Humboldt Institute for Internet and Society (HIIG)  <info@hiig.de>]]></author>
                <pubDate>Fri, 11 Dec 2020 16:46:20 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Berkman Klein Center for Internet & Society]]></title>
                <link>https://www.beyond-eve.com/en/organisations/harvard-university-berkman-klein-center-for-internet-society</link>
                <description><![CDATA[<p>The Berkman Klein Center's mission is to explore and understand cyberspace; to study its development, dynamics, norms, and standards; and to assess the need or lack thereof for laws and sanctions. We are a research center, premised on the observation that what we seek to learn is not already recorded. Our method is to build out into cyberspace, record data as we go, self-study, and share. Our mode is entrepreneurial nonprofit. </p><p><br></p><p><strong>The Center in Brief</strong></p><p>We bring together the sharpest, most thoughtful people from around the globe to tackle the biggest challenges presented by the Internet. As an interdisciplinary, University-wide center with a global scope, we have an unparalleled track record of leveraging exceptional academic rigor to produce real-world impact. We pride ourselves on pushing the edges of scholarly research, building tools and platforms that break new ground, and fostering active networks across diverse communities. United by our commitment to the public interest, our vibrant, collaborative community of independent thinkers represents a wide range of philosophies and disciplines, making us a unique home for open-minded inquiry, debate, and experimentation.</p>]]></description>
                <author><![CDATA[Berkman Klein Center for Internet & Society]]></author>
                <pubDate>Sun, 06 Dec 2020 12:02:59 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Privacy International]]></title>
                <link>https://www.beyond-eve.com/en/organisations/privacy-international</link>
                <description><![CDATA[<p><strong>Privacy International</strong> challenges overreaching state and corporate surveillance, so that people everywhere can have greater security and freedom through greater personal privacy. Privacy International (PI) is a registered charity based in London that works at the intersection of modern technologies and rights. PI envisions a world in which the right to privacy is protected, respected, and fulfilled. Privacy is essential to the protection of autonomy and human dignity, serving as the foundation upon which other human rights are built. In order for individuals to fully participate in the modern world, developments in law and technologies must strengthen and not undermine the ability to freely enjoy this right.</p><p><strong>How We Fight</strong></p><p>We challenge governments' powers by advocating and litigating for stronger protections. We lead research and investigations to shine a light on powers and capabilities, and to instigate and inform debate. We advocate for good practices and strong laws worldwide to protect people and their rights. We equip civil society organisations across the world to increase public awareness about privacy. We raise awareness about technologies and laws that place privacy at risk, to ensure that the public is informed and engaged.</p>]]></description>
                <author><![CDATA[Privacy International]]></author>
                <pubDate>Fri, 04 Dec 2020 14:21:42 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[When scholars sprint, bad algorithms are on the run]]></title>
                <link>https://www.beyond-eve.com/en/technicalarticles/when-scholars-sprint-bad-algorithms-are-on-the-run</link>
                <description><![CDATA[<p><em>The first research sprint of the </em><a href="https://www.hiig.de/en/project/the-ethics-of-digitalisation/" rel="noopener noreferrer" target="_blank"><em>Ethics of Digitalisation</em></a><em> project, funded by Stiftung Mercator, has crossed the finish line. Thirteen international fellows tackled pressing issues concerning the use of AI in content moderation. Looking back at ten intense weeks of interdisciplinary research, we share highlights and key outcomes.</em></p><p>In response to increasing public pressure to tackle hate speech and other challenging content, platform companies have turned to algorithmic content moderation systems. These automated tools promise to be more effective and efficient in identifying potentially illegal or unwanted material. But algorithmic content moderation also raises many questions – all of which defy simple answers. Where is the line between hate speech and freedom of expression – and how can this be automated on a global scale? Should platforms use AI tools only against illegal online speech, such as the promotion of terrorism, or also for regular content governance? Are platforms’ algorithms over-enforcing against legitimate speech, or are they rather failing to limit hateful content on their sites? And how can policymakers ensure an adequate level of transparency and accountability in platforms’ algorithmic content moderation processes?</p>]]></description>
                <author><![CDATA[Alexander von Humboldt Institute for Internet and Society (HIIG)  <info@hiig.de>]]></author>
                <pubDate>Sun, 03 Jan 2021 16:47:28 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[The True Costs of Misinformation]]></title>
                <link>https://www.beyond-eve.com/en/events/the-true-costs-of-misinformation</link>
                <description><![CDATA[<p>It all feels like a precursor to a bad joke: What do foreign agents, white supremacists, conspiracists, snake oil salesmen, political operatives, white academics, and a disgruntled bunch of zoomers have in common? The groups have collided in a centrifuge of chaos online, where the tactics they use to hide their identities and manipulate audiences are more prevalent than ever. Social media companies are trying to patch the holes in a failing sociotechnical system, where the problems their products have created are now shouldered by journalists, universities, and health professionals, just to name a few. What can be done to restore moral and technical order in a time of pandemonium?&nbsp;</p><p>Recommended resources:</p><ul><li><a href="http://www.mediamanipulation.org/" rel="noopener noreferrer" target="_blank">MediaManipulation.Org</a></li><li>Donovan, J. 2020. “<a href="https://www.technologyreview.com/2020/10/05/1009231/social-media-facebook-tobacco-secondhand-smoke" rel="noopener noreferrer" target="_blank">Thank You for Posting: Smoking’s Lessons for Regulating Social Media</a>. MIT Technology Review.</li><li>Donovan, J. 2020. “<a href="https://www.technologyreview.com/2020/04/30/1000881/covid-hoaxes-zombie-content-wayback-machine-disinformation/" rel="noopener noreferrer" target="_blank">Covid Hoaxes Are Using a Loophole to Stay Alive—Even after Content Is Deleted</a>.” MIT Technology Review. 2020.</li><li>Donovan, J. 2020. “<a href="https://www.nbcnews.com/think/opinion/why-trump-s-viral-covid-flu-misinformation-hard-facebook-twitter-ncna1242665" rel="noopener noreferrer" target="_blank">Trump’s Viral Flu Tweet Proves America Is Losing the Battle against Covid Misinfo</a>.” NBC News.</li><li>Donovan, J., and boyd, d. 2019. “<a href="https://doi.org/10.1177/0002764219878229" rel="noopener noreferrer" target="_blank">Stop the Presses? 
Moving From Strategic Silence to Strategic Amplification in a Networked Media Ecosystem</a>.” American Behavioral Scientist, September.</li></ul><p><br></p>]]></description>
                <author><![CDATA[Berkman Klein Center for Internet & Society]]></author>
                <pubDate>Sun, 30 Jun 2024 16:52:04 +0200</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Creating New Futures for Local Newspapers]]></title>
                <link>https://www.beyond-eve.com/en/events/creating-new-futures-for-local-newspapers</link>
                <description><![CDATA[<p>Local newspapers are in peril. Although they continue to serve millions of Americans with vital information about their communities, newspapers face an extremely difficult environment. Private capital has stepped in to manage the business risk and take advantage of the remaining asset strength of newspapers. But the ownership, governance, and values of private capital do not foster the business or social transformation that local newspapers need to serve their communities. What can we do to help local papers find a new footing?</p><p><br></p><p><strong>Elizabeth Hansen</strong> and <strong>Marc Hand</strong> have released a <a href="https://shorensteincenter.org/the-national-trust-for-local-news/" rel="noopener noreferrer" target="_blank">paper</a> proposing a new National Trust for Local News that would support the financing and transition of local newspapers to new ownership and governance structures. They will present the outline of their proposal in this webinar, and engage in conversation with fellow panelists about the future of local newspapers. <strong>Steve Waldman</strong>, CEO and cofounder of Report for America, will discuss his similar “replanting” proposal, recently published by the Open Markets Institute. <strong>Geoff Davis</strong>, CEO of the Sorenson Impact Center at the University of Utah, will comment on how social impact capital might be mobilized to respond to the business crisis in local journalism, and how these proposals relate to other new institutions being built to solve major social challenges. <strong>Setti Warren</strong>, Executive Director of the Shorenstein Center and former Mayor of Newton, MA, will moderate and comment on the importance of local journalism to public life in cities and towns.</p>]]></description>
                <author><![CDATA[Shorenstein Center on Media, Politics and Public Policy]]></author>
                <pubDate>Sat, 05 Dec 2020 22:41:59 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Countering the COVID-19 Misinfodemic with Text Similarity and Social Data Science]]></title>
                <link>https://www.beyond-eve.com/en/events/countering-the-covid-19-misinfodemic-with-text-similarity-and-social-data-science</link>
                <description><![CDATA[<p>The Oxford Internet Institute is proud to present faculty member <strong>Dr Scott A. Hale </strong>for this next session in our Wednesday Webinar Series. The session will be moderated by Dr Chico Camargo, Postdoctoral Researcher in Data Science at the OII.</p><p><br></p><p>Misinformation about COVID-19 has led to severe harms in multiple instances: as an example, a rumor that drinking methanol would cure the virus resulted in hundreds of deaths. While end-to-end encryption is an important privacy safeguard, this encryption prevents platforms such as WhatsApp, Signal, and others from employing centralized interventions and warnings about misinformation. Several options, however, from user interface changes to tip lines to having more intelligence on client devices offer hope.</p><p><br></p><p>In this presentation Dr Scott A. Hale will discuss how text similarity algorithms are being used to help fact-checkers locate misinformation, cluster similar misinformation, and identify existing fact-checks in the context of tip lines on platforms with end-to-end encryption. The presentation will detail research at the Oxford Internet Institute and Meedan, a global technology not-for-profit developing open-source tools for fact-checking and translation, that is actively being used by fact-checkers to improve the information available online.</p>]]></description>
                <author><![CDATA[The Oxford Internet Institute <enquiries@oii.ox.ac.uk>]]></author>
                <pubDate>Sun, 06 Dec 2020 13:04:47 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Terms of Disservice Book Launch]]></title>
                <link>https://www.beyond-eve.com/en/events/terms-of-disservice-book-launch</link>
                <description><![CDATA[<p>The Shorenstein Center hosted an&nbsp;<strong>online book launch for&nbsp;<em>Terms of Disservice</em></strong>, authored by senior fellow and co-director of the Digital Platforms &amp; Democracy Project,&nbsp;<strong>Dipayan Ghosh</strong>. The event featured Shorenstein Center director&nbsp;<strong>Nancy Gibbs</strong>, former Hillary Clinton 2016 campaign manager and HKS Defending Digital Democracy program director&nbsp;<strong>Robby Mook</strong>, and&nbsp;<em>Politico</em>&nbsp;chief technology correspondent&nbsp;<strong>Mark Scott,</strong> discussing the structure of the modern digital economy and its interface with social issues in America today. Ghosh contends that the business model underlying the consumer internet sector implicates our welfare from economic, political, and social perspectives.&nbsp;<a href="https://urldefense.proofpoint.com/v2/url?u=https-3A__shorensteincenter.us1.list-2Dmanage.com_track_click-3Fu-3D30699762a3826bbf132818652-26id-3D46f7bd679d-26e-3Dc6de6c7fd0&amp;d=DwMFaQ&amp;c=WO-RGvefibhHBZq3fL85hQ&amp;r=OItGXn4rLkJFn1pUn1Fh9XSbO_qbiTqsyGb_mvLAvgw&amp;m=Y0r3FXEw9NBtKArdevUPMehms2Tp916tncpQgBpRE5g&amp;s=DjfWyKxDKdcsuEtT4F0zzVoSgASZ2zchepUxA7lXRmc&amp;e=" rel="noopener noreferrer" target="_blank"><strong><em>Terms of Disservice</em> (Brookings Institution Press)</strong></a>&nbsp;attempts to chart a path forward for a new digital social contract to establish better economic equity.</p><p><br></p><p>Key findings in&nbsp;<em>Terms of Disservice</em>&nbsp;include:</p><ul><li><strong>“The exploitative rake of data and attention on the path to natural monopoly”</strong>: The dominant internet firms deal in a novel currency with consumers based on data and attention — and through it have become natural monopolies.</li><li><strong>“The radical commercialization of decision making”</strong>: Personal data is collected at a mind-blowing rate and level of granularity. Internet firms engage in radical, “commercialized bias” — and have marketized the segmentation and splicing of society.</li><li><strong>“The dilemma of attending to content policy reform”</strong>: Our immediate attention to matters of content policy reform is misplaced; the more important target in the realm of Big Tech reform is fundamental economic regulation of the industry.</li></ul><p><br></p>]]></description>
                <author><![CDATA[Shorenstein Center on Media, Politics and Public Policy]]></author>
                <pubDate>Sun, 06 Dec 2020 11:23:09 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Corona crisis fuels hate against Chinese on Twitter: Commentary]]></title>
                <link>https://www.beyond-eve.com/en/technicalarticles/corona-crisis-fuels-hate-against-chinese-on-twitter-commentary</link>
                <description><![CDATA[<p><strong>The COVID-19 pandemic has the world firmly in its grip with millions of confirmed cases worldwide and whole countries in full or partial lockdown. Despite calls for solidarity across borders and countless local support initiatives, various incidents prove that the corona outbreak has also given rise to a series of racist attacks against Chinese people and people with Asian-looking features both on the streets and in social media networks. A team of researchers from the Potsdam Institute for Climate Impact Research has now investigated the use of discriminating language against Chinese people in the context of the COVID-19 pandemic on Twitter. </strong> </p><p>“When normally we analyse societal effects of weather extremes, now we used the corona outbreak as a study case to better understand social responses to extreme events”, explains Leonie Wenz, author and researcher at the Potsdam Institute for Climate Impact Research (PIK). “Using social media data, we basically counted English language tweets containing a set of key word combinations like Chinese AND Corona AND hate and examined the evolution of this daily tweet count since the corona outbreak”, Annika Stechemesser, co-author and also at PIK, adds. “The picture thus unveiled was quite striking: On March 11th the WHO declared the COVID-19 outbreak a pandemic; stock markets around the world crashed – and within the first half of March, the number of offensive tweets in our dataset increased by more than 1000%”, she lays out.</p>]]></description>
                <author><![CDATA[PIK Potsdam Institut für Klimafolgenforschung]]></author>
                <pubDate>Sat, 12 Dec 2020 18:34:30 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Commercial Content Moderation during the Pandemic]]></title>
                <link>https://www.beyond-eve.com/en/events/commercial-content-moderation-during-the-pandemic</link>
                <description><![CDATA[<p>As we clumsily shift our lives online, the cracks in the information infrastructure are bursting open. While there’s been an uptick in boosting trusted content by credible sources, like the Centers for Disease Control and Prevention and the World Health Organization, there have simultaneously been sweeping purges of suspicious accounts and of advertisements seeking to capitalize on the crisis, leaving us to wonder who’s heard and who’s harmed in the current infodemic. Amidst this sliding scale of uncertainty, we turn to leading voices in the field,&nbsp;UCLA professors<strong>&nbsp;Safiya Umoja Noble, PhD&nbsp;</strong>and<strong>&nbsp;Sarah T. Roberts</strong>,&nbsp;<strong>PhD&nbsp;</strong>and&nbsp;<em>Washington Post</em>&nbsp;Reporter,<strong>&nbsp;Elizabeth Dwoskin,</strong>&nbsp;who have been taking stock of how commercial content is being moderated during the pandemic.&nbsp;</p><p><br></p><p><strong>Safiya Umoja Noble</strong> is an Associate Professor at the University of California, Los Angeles, in the Department of Information Studies and serves as the Co-Director of the UCLA Center for Critical Internet Inquiry. She is the author of a best-selling book on racist and sexist algorithmic bias in search engines titled:&nbsp;<a href="https://nyupress.org/9781479837243/algorithms-of-oppression/" rel="noopener noreferrer" target="_blank"><em>Algorithms of Oppression: How Search Engines Reinforce Racism</em></a>.</p><p><br></p><p><strong>Sarah T. Roberts</strong> serves as an Assistant Professor of Information Studies at UCLA’s School of Education and Information Studies. 
Roberts is a leading authority on “commercial content moderation”, the term she coined to describe the work of those responsible for making sure the photos, videos and stories posted to commercial websites fit within legal and ethical bounds as well as the site’s own guidelines and standards.&nbsp;Her book,&nbsp;<a href="https://yalebooks.yale.edu/book/9780300235883/behind-screen" rel="noopener noreferrer" target="_blank">Behind the Screen: Content Moderation in the Shadows of Social Media</a>, was published by Yale University Press in 2019.</p><p><br></p><p><strong>Elizabeth Dwoskin</strong>, a Silicon Valley correspondent at&nbsp;<em>The Washington Post,</em>&nbsp;covers the rise of data mining, machine learning and AI throughout the tech industry and in the economy at large. Dwoskin’s recent articles – from&nbsp;<a href="https://www.washingtonpost.com/technology/2020/04/10/apple-google-tracking-coronavirus/?itid=ap_elizabethdwoskin" rel="noopener noreferrer" target="_blank">smartphone apps</a>&nbsp;that map infection pathways to new trends in consumer habits that give way to greater&nbsp;<a href="https://www.washingtonpost.com/technology/2020/04/27/big-tech-coronavirus-winners/?itid=ap_elizabethdwoskin" rel="noopener noreferrer" target="_blank">market monopolization&nbsp;</a>– offer readers around the world fresh insight on what’s at play amid the coronavirus pandemic.</p>]]></description>
                <author><![CDATA[Shorenstein Center on Media, Politics and Public Policy]]></author>
                <pubDate>Sat, 05 Dec 2020 22:52:21 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Don’t panic. It’s just the collapse of neoliberalism.]]></title>
                <link>https://www.beyond-eve.com/en/events/dont-panic-its-just-the-collapse-of-neoliberalism</link>
                <description><![CDATA[<p><strong><em>Part of the speaker series on misinformation, co-sponsored by the&nbsp;</em></strong><a href="https://web.northeastern.edu/nulab/" rel="noopener noreferrer" target="_blank"><strong><em>NULab</em></strong></a><strong><em>&nbsp;at Northeastern University.</em></strong></p><p><strong>&nbsp;</strong></p><p><strong>Yochai Benkler</strong> is the Berkman Professor of Entrepreneurial Legal Studies at Harvard Law School, and faculty co-director of the Berkman Klein Center for Internet and Society at Harvard University. Since the 1990s he has helped characterize the contribution of information commons and decentralized collaboration to innovation, information production, and freedom in the networked economy and society. His books include&nbsp;<strong><em>The Wealth of Networks: How social production transforms markets and freedom</em></strong>&nbsp;(Yale University Press 2006), which won academic awards from the American Political Science Association and the American Sociological Association, as well as the McGannon award for social and ethical relevance in communications. In 2012 he received a lifetime achievement award from Oxford University in recognition of his contribution to the study and public understanding of the Internet and information goods. His work is socially engaged, winning him the Ford Foundation Visionaries Award in 2011, the Electronic Frontier Foundation’s Pioneer Award for 2007, and the Public Knowledge IP3 Award in 2006. It is also anchored in the realities of markets, cited as “perhaps the best work yet about the fast moving, enthusiast-driven Internet” by the Financial Times and named best business book about the future in 2006 by Strategy and Business. Benkler has advised governments and international organizations on innovation policy and telecommunications, and serves on the boards or advisory boards of several nonprofits engaged in working towards an open society. 
</p><p><br></p><p><em>His work can be freely accessed at </em><a href="http://www.benkler.org" rel="noopener noreferrer" target="_blank"><em>http://www.benkler.org</em></a><em>.</em></p>]]></description>
                <author><![CDATA[Shorenstein Center on Media, Politics and Public Policy]]></author>
                <pubDate>Sun, 06 Dec 2020 11:46:20 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Rasmus Kleis Nielsen: The power of platforms and how publishers adapt]]></title>
                <link>https://www.beyond-eve.com/en/events/rasmus-kleis-nielsen-the-power-of-platforms-and-how-publishers-adapt</link>
                <description><![CDATA[<p>Today, more people follow the news via platform companies like Facebook and Google than via any news organisation in human history, and smaller platforms like Twitter serve news to more people than all but the biggest publishers. Most news content is still produced by professional journalists. But the way in which we discover it and the way it is distributed are changing rapidly. But who decides what is displayed and what is not? And who profits from our behaviour? All this goes along with the increasing use of search engines, social media, and the like for news.</p><p>In this lecture, Rasmus Kleis Nielsen will revisit the history of the first twenty years of relations between platforms and news publishers to identify the underlying dynamics that have shaped the development of our digital society, and will shape it for years to come. He argues that publishers have – sometimes reluctantly, but often actively – fueled the rise of platform companies by embracing the very real opportunities they provide. This is the case even though they also challenge publishers’ historically dominant position by competing for attention and advertising and by controlling key parts of the infrastructure of free expression. In the process publishers, like the rest of us, become increasingly empowered by and dependent upon a small number of centrally placed and powerful platforms.</p><p><a href="https://rasmuskleisnielsen.net/about/" rel="noopener noreferrer" target="_blank"><strong>Rasmus Kleis Nielsen</strong></a> is Director at the Reuters Institute for the Study of Journalism and Professor of Political Communication at the University of Oxford and Editor-in-Chief of the <a href="http://hij.sagepub.com/" rel="noopener noreferrer" target="_blank">International Journal of Press/Politics</a>. 
Most of his research deals with news media organisations and their ongoing transformations, changing forms of digital media use in political and news-related contexts, and political communication and campaign practices. He is involved in a wide range of comparative research projects on the future of news, the changing business of journalism and the rise of digital media.</p>]]></description>
                <author><![CDATA[Alexander von Humboldt Institute for Internet and Society (HIIG)  <info@hiig.de>]]></author>
                <pubDate>Fri, 18 Dec 2020 13:05:12 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Who Governs the Internet?]]></title>
                <link>https://www.beyond-eve.com/en/technicalarticles/who-governs-the-internet</link>
                <description><![CDATA[<p>Based on the guiding principle “digital policy means social policy”, this publication follows the idea that internet governance affects everyone. An open, free and global Internet is vital for all. Therefore, infrastructures for surveillance and censorship should not be established.</p><p>This publication gives an overview of actors and areas of action and stresses that collective engagement is needed more than ever to further develop Internet governance, to strengthen multistakeholderism as well as multilateralism and to prevent the fragmentation of the net. The publication was created by iRights.Lab on behalf of the FES.</p><p><a href="http://library.fes.de/pdf-files/akademie/15917.pdf" rel="noopener noreferrer" target="_blank">Here</a> you can find the online version of "Who Governs the Internet?"</p>]]></description>
                <author><![CDATA[Friedrich Ebert Stiftung]]></author>
                <pubDate>Sat, 12 Dec 2020 19:43:49 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[The Politics of Difference: Race, Technology, and Inclusion]]></title>
                <link>https://www.beyond-eve.com/en/events/the-politics-of-difference-race-technology-and-inclusion</link>
                <description><![CDATA[<p>The <a href="https://shorensteincenter.org/about-us/areas-of-focus/technology-social-change/" rel="noopener noreferrer" target="_blank">Technology and Social Change Research Project</a> and the <a href="https://shorensteincenter.org/about-us/areas-of-focus/news-equity-race-gender/" rel="noopener noreferrer" target="_blank">Initiative for Institutional Anti-Racism and Accountability</a> – both core research projects at the Shorenstein Center – recently co-sponsored an event at the IOP JFK Jr. Forum on “The Politics of Difference: Race, Technology, and Inclusion.”</p><p><br></p><p>Panelists included: </p><ul><li><strong>Prof. Khalil Gibran Muhammad</strong>, faculty director of the Initiative for Institutional Anti-Racism and Accountability and Professor of History, Race, and Public Policy at the Kennedy School</li><li><strong>Prof. Ruha Benjamin, </strong>Associate Professor of African American Studies at Princeton University</li><li><strong>Latoya Peterson</strong>, journalist, digital media consultant, co-founder of Racialicious, and current Director of Culture at Glow Up Games</li></ul><p><br></p><p>Moderator: <strong>Dr. Joan Donovan, </strong>Research Director of the Technology and Social Change Research Project</p>]]></description>
                <author><![CDATA[Shorenstein Center on Media, Politics and Public Policy]]></author>
                <pubDate>Sun, 06 Dec 2020 11:28:09 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Who determines the rules of public discourse on the Internet?]]></title>
                <link>https://www.beyond-eve.com/en/events/who-determines-the-rules-of-public-discourse-on-the-internet</link>
                <description><![CDATA[What rules structure our online world? How is it determined what works? And how do social actors behave?
 
Today, billions of people exchange not only holiday pictures, but also false reports, hostility and hate comments on the Internet. In this way, they also determine the themes and tone of public discourse. With the Network Enforcement Act (NetzDG), the state has attempted to create rules for the moderation of Internet content. However, since its introduction, the law has met with criticism because it has led platforms to delete content too rigidly, thus restricting Internet users' freedom of expression. For their part, social media platforms try to regulate content with their own community standards and general terms and conditions. So, who and what determines what we actually get to see online?]]></description>
                <author><![CDATA[The Leibniz Institute for Media Research │ Hans-Bredow-Institut (HBI) <info@hans-bredow-institut.de>]]></author>
                <pubDate>Tue, 01 Dec 2020 15:33:21 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Research Monitor Microtargeting]]></title>
                <link>https://www.beyond-eve.com/en/technicalarticles/ten-years-after-the-global-food-price-crisis-10</link>
                <description><![CDATA[<h3>The iRights.Lab think tank produces a regular Research Monitor on behalf of the State Media Authority of North Rhine-Westphalia.</h3><p><br></p><p>German and European researchers have thus far dealt only tentatively with the topic of microtargeting in election campaigns. Most of the research projects and scientific papers on the subject are from the USA. Since Barack Obama’s election campaign in 2008 at the latest, it has become clear that both Democrats and Republicans in the US are employing massively data-driven processes in their election campaigns. In the paper State of Research: Microtargeting in Germany and Europe, we summarize the current expert debate on microtargeting in political communication, point to gaps in the research and provide suggestions on where new work is needed.</p><p>The paper was commissioned by the <strong>Landesanstalt für Medien NRW</strong>. The publication can be <a href="https://www.medienanstalt-nrw.de/fileadmin/user_upload/lfm-nrw/Foerderung/Forschung/Dateien_Forschung/Forschungsmonitoring_Microtargeting_Deutschland_Europa.pdf" rel="noopener noreferrer" target="_blank">downloaded</a> (German) free of charge from the Media Authority’s website and from our own.</p>]]></description>
                <author><![CDATA[iRights.Lab GmbH <kontakt@irights-lab.de>]]></author>
                <pubDate>Sat, 12 Dec 2020 17:29:46 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[How Apps on Android Share Data with Facebook - Report]]></title>
                <link>https://www.beyond-eve.com/en/events/how-apps-on-android-share-data-with-facebook-report</link>
                <description><![CDATA[<p>Previous research has shown that 42.55 percent of free apps on the Google Play store could share data with Facebook, making Facebook the second most prevalent third-party tracker after Google’s parent company Alphabet. In this report, Privacy International illustrates what this data sharing looks like in practice, particularly for people who do not have a Facebook account.</p><p><br></p><p>This question of whether Facebook gathers information about users who are not signed in or do not have an account was raised in the aftermath of the Cambridge Analytica scandal by lawmakers in hearings in the United States and in Europe. Discussions, as well as previous fines by Data Protection Authorities about the tracking of non-users, however, often focus on the tracking that happens on websites. Much less is known about the data that the company receives from apps. For these reasons, in this report we raise questions about transparency and use of app data that we consider timely and important.</p><p><br></p><p>Facebook routinely tracks users, non-users and logged-out users outside its platform through Facebook Business Tools. App developers share data with Facebook through the Facebook Software Development Kit (SDK), a set of software development tools that help developers build apps for a specific operating system. Using the free and open source software tool called "mitmproxy", an interactive HTTPS proxy, Privacy International has analyzed the data that 34 apps on Android, each with an install base of 10 to 500 million, transmit to Facebook through the Facebook SDK.</p>]]></description>
                <author><![CDATA[Privacy International]]></author>
                <pubDate>Sat, 05 Dec 2020 19:44:19 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Comment on Mark Zuckerberg’s “Independent Governance and Oversight”]]></title>
                <link>https://www.beyond-eve.com/en/technicalarticles/comment-about-mark-zuckerbergs-independent-governance-and-oversight</link>
                <description><![CDATA[<p><strong>Why Zuckerberg’s “Independent Governance and Oversight” board is not gonna fly (but some of his other ideas are at least worth discussing)</strong></p><p>Mark Zuckerberg is all for regulation of social media all of a sudden. What’s wrong with that picture? In an almost 5,000-word “blog post”, Zuckerberg (plus we assume two dozen or so of the company’s public policy hacks and lawyers) has laid out Facebook’s idea of how to deal with the crisis the company is facing. The article is titled “A Blueprint for Content Governance and Enforcement” and structured in nine parts:</p><p><br></p><p>1. Community Standards</p><p>2. Proactively Identifying Harmful Content</p><p>3. Discouraging Borderline Content</p><p>4. Giving People Control and Allowing More Content</p><p>5. Addressing Algorithmic Bias</p><p>6. Building an Appeals Process</p><p>7. Independent Governance and Oversight</p><p>8. Creating Transparency and Enabling Research</p><p>9. Working Together on Regulation</p><p>...</p>]]></description>
                <author><![CDATA[AlgorithmWatch gGmbH <info@algorithmwatch.org>]]></author>
                <pubDate>Sat, 12 Dec 2020 21:21:10 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[The Time is Right for Europe to Take the Lead in Global Internet Governance]]></title>
                <link>https://www.beyond-eve.com/en/technicalarticles/the-time-is-right-for-europe-to-take-the-lead-in-global-internet-governance</link>
                <description><![CDATA[<p>Europe is a key normative power. Its legitimacy as a force for ensuring the rule of law in international relations is unparalleled. It also packs an economic punch. In data protection and the fight against cybercrime, European norms have been successfully globalized. The time is right to take the next step: Europe must now become the international normative leader for developing a new deal on internet governance. To that end, European powers should commit to rules that work for security, economic development and human rights on the internet, and implement them in a reinvigorated IGF.</p>]]></description>
                <author><![CDATA[Normative Orders <office@normativeorders.net>]]></author>
                <pubDate>Sat, 12 Dec 2020 21:29:11 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Conceptualizing the Future of Democracy: Combining Representation and Participatory Innovations]]></title>
                <link>https://www.beyond-eve.com/en/events/conceptualizing-the-future-of-democracy-combining-representation-and-participatory-innovations</link>
                <description><![CDATA[In light of declining voter turnout, party membership and trust in representative institutions, the democratic institutions developed in the 19th and 20th centuries seem somewhat out of touch with popular demands in current societies. This leads some authors to diagnose a crisis of democracy, or even the “death of democracy” (Keane 2009). At the same time, citizens strongly support the concept of democracy. Thus, rather than democracy itself being obsolete, we seem to witness a “process of transition from one type to another” (Schmitter 2015). Yet what should the future of democracy look like?
The debate on how to conceptualize hybrid systems of representative and participatory institutions is ongoing. Systemic approaches to designing mixed systems are scarce (Warren 2017), but en vogue. In the roundtable, we will follow this approach and discuss the future of democracy as innovative conceptions of purposeful combinations of representative and participatory institutions that fulfil democratic tasks and are in line with citizens’ preferences for participation.

Participants:
<strong>Rainer Forst</strong> (Goethe-Universität Frankfurt, Exzellenzcluster "Die Herausbildung normativer Ordnungen")
<strong>Jane Mansbridge</strong> (Harvard University, USA)
<strong>Anne Phillips</strong> (London School of Economics and Political Science, UK)
<strong>Mark Warren</strong> (University of British Columbia, Kanada)
Chair: <strong>Brigitte Geißel</strong> (Goethe-Universität Frankfurt)]]></description>
                <author><![CDATA[Normative Orders <office@normativeorders.net>]]></author>
                <pubDate>Tue, 29 Dec 2020 09:24:04 +0100</pubDate>
                            </item>
                    <item>
                <title><![CDATA[Shoshana Zuboff: Surveillance Capitalism and Democracy]]></title>
                <link>https://www.beyond-eve.com/en/events/shoshana-zuboff-surveillance-capitalism-and-democracy</link>
                <description><![CDATA[<p>The collection and analysis of data is changing the way economies operate. Are these changes so fundamental that they can be said to have led to the emergence of a new form of capitalism – surveillance capitalism? If people’s behaviour is made increasingly transparent, do we become a society in which trust is no longer necessary? Are individuals a mere appendage to the digital machine, objects of new mechanisms which reward and punish according to the determinations of private capital? How is social cohesion affected when people become dispensable as a labour force, while their data continues to function as a source of value in lucrative new markets that trade in predictions of human behaviour? How should we understand the new quality of power that arises from these unprecedented conditions? What kind of society does it aim to create? And what ramifications will these developments have for the principles of liberal democracy? Will privacy law and anti-trust law be enough? How can we tame what we do not yet understand?</p><p><strong>Shoshana Zuboff</strong> is a social scientist and author of three books, each of which has been recognised as the definitive signal of a new epoch in technological society. Her latest book, <em>The Age of Surveillance Capitalism</em>, reveals a world in which technology users are no longer customers but the raw material for an entirely new economic system. Zuboff is the Charles Edward Wilson Professor Emerita at Harvard Business School and was a Faculty Associate at the Berkman Klein Center for Internet and Society at Harvard Law School from 2014 until 2016.</p>]]></description>
                <author><![CDATA[Alexander von Humboldt Institute for Internet and Society (HIIG)  <info@hiig.de>]]></author>
                <pubDate>Fri, 18 Dec 2020 13:16:59 +0100</pubDate>
                            </item>
            </channel>
</rss>
