Abstract
The challenges posed by the attention economy have sparked calls for political regulation, as users’ struggles with self-regulation have become well-documented. However, there is a lack of research on the feasibility of political intervention as a solution. This article examines to what extent political regulation can effectively address the issues created by the attention economy, and seeks to identify potential regulatory barriers. Using a mixed-method approach, the study includes a survey of users’ opinions on regulating the attention economy, qualitative interviews with key stakeholder groups to provide context for the survey data, and a document analysis that covers both national and supranational levels. Theoretically, the analysis builds on disconnection studies, media policy research, and studies of the attention economy. The findings highlight that the main regulatory barriers include insufficient public support, low trust, and inadequate regulatory measures. Since political processes do not occur in isolation and require backing from the public and collaboration with key stakeholders, these barriers hold significant weight. The key theoretical implication of this study is that research on regulation of the attention economy should recalibrate its expectations regarding regulatory possibilities within liberal democracies. Future studies should further explore the underlying mechanisms that impede effective regulation.
Introduction
While digital media offers connectedness and access to information, it has also provoked a backlash against its use. Individuals and society are becoming increasingly aware of the disadvantages of digital media (Albris et al., 2024; Jansson and Adams, 2021; Kuntsman and Miyake, 2019; Morozov, 2014). As a result, our relationship with digital technology can be described as ambivalent, and we often struggle to manage intrusive media (Fast, 2021; Syvertsen, 2020). Although self-regulation practices are standard, they often prove ineffective, sometimes leading to a negative cycle of self-blame (Ytre-Arne et al., 2020). A growing body of research advocates for political regulation as the attention economy drives advanced mechanisms to keep users logged on (Bhargava and Velasquez, 2021; Zuboff, 2019). However, the research on the regulation of digital platforms and the attention economy lacks empirical studies of the realism of these suggestions. This article aims to address this research gap by asking: To what extent is political regulation of the attention economy feasible in liberal democratic countries, and what are the primary challenges and regulatory dilemmas in legitimating and implementing measures to reduce the harmful effects of the attention economy?
The study draws on a mixed-method approach based on three datasets: a national survey of users’ opinions about the regulation of the attention economy and digital media use, qualitative interviews with representatives of the main stakeholder groups to contextualize the quantitative findings, and document analysis of draft laws, legislation, and white papers, aiming to map the state of the art of political regulation to reduce the unwanted consequences of the attention economy.
Literature review
The attention economy and calls for regulation
The ambivalence toward media technology, digital connection, and the power of the attention economy has grown and become a prominent topic of academic research and public debate (Haidt, 2024; Lembke, 2021; Syvertsen, 2020; Williams, 2018). The advantages of digital connection are increasingly weighed against the disadvantages, such as “technological solutionism” (Morozov, 2014), “digital surveillance” (Zuboff, 2019), “social mediafication” (Kuntsman and Miyake, 2019), “smartphone addiction” (Ting and Chen, 2020), and “digital entanglement” (Jansson and Adams, 2021). The concept of an attention economy is founded on economic principles of scarcity of resources and demand versus supply (Davenport and Beck, 2001; Simon, 1971). In this economy, the currency is human attention: “With slot machines, we pay with our money. With technologies in the attention economy, we pay with our attention” (Williams, 2018). A major implication of the attention economy is heavy pressure on individuals to stay connected, caused by advanced techniques to increase screen time, such as surveillance and personalization (Zuboff, 2019). Accordingly, users struggle to balance their offline and online presence; even though users occasionally mark resistance to the attention economy through disconnection strategies such as digital detox, the lasting effects are limited and might even cause self-blame because of failed efforts (Beattie, 2020; Odell, 2019; Syvertsen, 2022a; Ytre-Arne et al., 2020).
From a historical perspective, self-regulation has replaced collective responsibility, as seen in broadcasting regulations, and media resistance is increasingly commodified, with individuals emphasizing the need and relevance of self-control regarding digital media use (Figueiras and Brites, 2022; Syvertsen, 2020). The users’ struggle against the pressure from tech giants has intensified discussions of the attention economy, which tend to call for political and regulatory interventions. Political regulation is requested in studies of social media addiction (Bhargava and Velasquez, 2021), digital disconnection (Kuntsman and Miyake, 2019), and the surveillance economy (Zuboff, 2019). Contributions in addiction research urge policymakers to treat social media addiction as a moral problem with “unique ethical concerns not raised by other, more familiar addictive products, such as alcohol and cigarettes” (Bhargava and Velasquez, 2021). Disconnection studies argue that individual responsibilization should be replaced with collective policies: “the decision about disengagement is never truly ours” in a world “where ‘smart things’ are everywhere, and always already networked, one’s possibility of escaping the digital, in shared private, as well as public spaces, in education and health care, in public transport and border controls, is diminishing rapidly” (Kuntsman and Miyake, 2019: 910). Kaun and Treré (2020) argue that disconnection acquires an even more vital political relevance, given that “our attention is systematically exploited and competed for by tech conglomerates through increasingly sophisticated techniques” (p. 711). From the surveillance capitalism perspective, the need to regulate is legitimized mainly by the need to protect users’ data and privacy. Zuboff (2019) argues that there is an absence of democratic checks on the power of surveillance capitalists.
Key takeaway points from the above discussion are that scholars in disconnection, social media addiction, and privacy law call for collective policy measures to reduce problems caused by the attention economy. The question is to what degree there is any realism in these calls, given the contemporary tech-friendly political agenda in Western liberal democracies.
Lobbyism and big-tech
Because politics is not developed in a vacuum but in dialog, it is key to investigate the role of stakeholders such as voters, activists, and lobbyists. In an environment of weighty critique of big tech, a politician might find it opportune to promise increased political intervention, such as Elizabeth Warren’s campaign to “break up big tech” in the 2020 US presidential election as a response to concerns about the power concentration created by the attention economy. However, such interventions are easier said than done; big tech has remained robust and seemingly unbreakable despite massive debates, lawsuits, and hearings in recent years.
Big tech is powerful partly because of the thriving attention economy, which provides steady streams of revenue and lobbying power. The tech sector, dominated by Google, Meta, Apple, and Amazon, spends more than 113 million euros on lobbying in Brussels annually because the investments pay off (Lobby Control, 2023). Lobbying has significant implications for media laws and policies, increasing the sector’s power to influence industry-friendly legislation. Lobbying involves pressuring public officials to support laws and policies that benefit lobbyists’ clients; large tech companies lobby the US government on antitrust, privacy, and surveillance regulation (Calabrese and Mihal, 2011; Popiel, 2018).
Digital capitalism replicates rather than disrupts analog capitalism, which is evident, for example, in the winner-takes-all logic and the tendencies toward monopoly status (Freedman, 2013). The justification of lobbying is founded on the ideological basis that “the centrality of the tech sector provides economic growth, the self-evident imperative of sustaining the internet economy, and regulation as the enemy of innovation, and by extension, social progress” (Popiel, 2018). Technological determinism, “internet-centrism” (Morozov, 2014), and libertarianism fuel the ideological machine that lobbies for the tech sector’s regulatory autonomy (Popiel, 2018). Companies insist on self-regulation, meaning they are responsible for developing rules and codes of ethics rather than being politically regulated. These lobbying efforts have additionally been fueled by big tech’s embeddedness in politics with growing centrality in political campaigning (Kreiss and McGregor, 2018) and by techno-libertarianism and digital solutionism, implying that technology will benefit society and solve social problems (Morozov, 2014; Popiel, 2018). From a techno-libertarian perspective, technology and regulation might seem like adversaries; “Technology symbolizes markets, enterprise, and growth, while regulation represents government, bureaucracy, and limits to growth” (Wiener, 2004: 1).
However, digital platforms are more than technology; they are media and communication companies, and in that regard, regulation is not unthinkable. The European tradition of policy-making and regulating media technologies is based on ideas of the public interest, a level playing field, securing national culture and language, and protecting vulnerable groups (Freedman, 2013). The Nordic region is characterized by a media regulation that supports positive free speech and protects the public against commercial and authoritarian interests, conceptualized as a “media welfare state” (Syvertsen et al., 2014). In contrast, the US media system is a product of deregulation policies and corporate influence (Pickard, 2014), and a typical pattern is that laws and regulations are implemented as a reactive process. For example, Facebook faced penalties and new privacy restrictions after the 2016 Cambridge Analytica scandal (Federal Trade Commission, 2019). The deregulated media market in the US explains why most big tech companies are based there and not in Europe, but the consequences are significant for European media regulation, because global big tech challenges both national and supranational regulatory regimes.
Key takeaway points from this discussion include that the attention economy has primarily been developed in a regulatory vacuum, meaning that companies have leeway to innovate, expand, and grow in a political culture where these values are dominant. Given that the national regulatory frameworks are challenged in the context of the attention economy, it is particularly relevant to study the role of Norwegian politicians, but also the role of the European Union as a supranational regulatory body.
Method and data collection
This section outlines the methodological design, detailing the data collection, sampling procedures, and analytical strategies for the selected methods. First, the decision to choose Norway as a case study is based on three key factors: its regulatory tradition as a “media welfare state,” as discussed above, the high level of digitalization in the Nordic region, and the widespread use of digital devices (Syvertsen, 2020). Additionally, there are reports of increasing overuse of these devices among the population. In 2019, half of the respondents indicated spending too much time online, a share that rose to two-thirds by 2023, according to a survey conducted in both years by the Digitox project. 1
Second, this study utilizes a mixed-methods approach, integrating survey data, qualitative interviews, and document analysis in a “parallel” design to investigate the regulation of the attention economy (Cohen et al., 2017; Gorard and Taylor, 2004). The survey of users was designed to gather quantitative data on public support for the regulation of the attention economy, as well as the public’s level of trust in politicians and industry stakeholders to take responsibility for such regulation. Conducted by the analysis company Kantar, the survey sampled participants from their online panel, Gallup Panel, specifically targeting the Norwegian population aged 18 years and older. Kantar ensured the accuracy of the data through weighting. 2 The survey contained various questions concerning digital overuse, disconnection, and regulation. The corpus analyzed in this study focuses on responses to six specific survey questions regarding regulation and accountability. 3 The analysis employed SPSS software for a multivariate approach. Key variables of interest included age, education level, and political orientation, as the study aimed to examine how these factors influenced attitudes toward the regulation of the attention economy. For assessing political orientation, the GAL-TAN scale was selected for its effectiveness in capturing the cultural and economic dimensions of political positioning, which is especially relevant given the complexity of the Norwegian party system.
Although survey data offer valuable quantitative insights, qualitative interviews were additionally conducted to explore motivations and perceptions in greater depth (Lindlof and Taylor, 2017). A purposive sampling approach was used to recruit informants, providing insight into how politicians, industry leaders, and users perceive digital overuse and the challenges of regulation. The interview corpus analyzed in this study includes six politicians with responsibilities related to digitalization, six industry leaders from companies that provide digital services and media platforms, and six users actively engaging in or organizing digital disconnection activities. Semi-structured interviews were conducted via a video service platform from 2019 to 2021, lasting between 30 and 60 minutes. These interviews were recorded with the participants’ consent, transcribed verbatim, and approved by the informants. The data was analyzed systematically and thematically, focusing on the informants’ views regarding who is responsible for regulating the attention economy.
As a third method, document analysis was employed to examine the feasibility of regulations reflected in policy documents related to digital media. Documents have several advantages over other sources. They are easily accessible, can be retrieved quickly, and provide firsthand insights into policy processes and the positions of various stakeholders. When analyzing documents, it is important to recognize that they are not neutral; rather, they are socially and contextually constructed (Karppinen and Moe, 2019: 251–252). The sampling strategy was designed to analyze documents and policy proposals related to the attention economy within the European Union (EU) and Norway, covering the period from 2022 to 2024. The final corpus includes five white papers, laws, and official recommendations. 4 The analysis examines the policies themselves, the reasoning behind the regulatory measures, and the dilemmas associated with regulating the attention economy.
In summary, the chosen methods are designed to triangulate findings, offering a comprehensive and nuanced exploration of the potential for political regulation in the attention economy. By integrating insights from users, politicians, and industry stakeholders, we can better understand this complex issue and, in turn, contribute to informed regulatory solutions.
Key findings: Regulatory dilemmas
This section examines the barriers to regulation of big tech and digital overload through the lens of three key stakeholders: first, the users, who engage as both consumers and citizens, balancing personal consumption preferences with civic responsibilities; second, the industry, which navigates the interplay between market-driven imperatives and its obligations to social responsibility; and third, the politicians, who must balance support from the electorate with collaboration with the industry.
Users: Lack of support for political responsibility
In the context of a pervasive “tech lash” and debates around addictive algorithms, along with reports that two-thirds of the population acknowledges excessive online engagement and difficulties in self-regulation, this study anticipated a public inclination toward policy initiatives to mitigate compulsive digital connectivity and foster environments conducive to disconnection. Contrary to expectations, the survey revealed a pronounced reluctance among users toward placing responsibility on the political level to initiate any governmental intervention in digital media consumption. When queried on their agreement that “politicians ought to assume greater responsibility in curtailing Internet and mobile phone overuse”, a mere 6% of the respondents agreed. Consequently, a substantial majority displayed strong to moderate resistance to the concept of political regulation of digital media usage.
A deeper examination of the survey data disclosed significant correlations between respondents’ support for political responsibilization and their gender, generational affiliation, education level, and political leaning. First, gender distinctions were evident, with male participants exhibiting greater resistance to political intervention than their female counterparts; only 5% of male respondents wholly endorsed the notion of increased political responsibility for reducing digital overconsumption, compared to 8% of female respondents. In a parallel pattern, 28% of males disagreed with the statement, whereas 20% of females did so. Potential rationales for these gender discrepancies may include women’s traditionally more extensive social obligations concerning family and children’s activities and a heightened entanglement in digital media that corresponds with the struggle for an online-offline balance (Lai, 2023).
Second, the results demonstrate a significant generation gap in how supportive users are of the idea of increased political responsibility for reducing digital overconsumption; the age group under 30 years is considerably more reluctant to support political responsibilization than the average population, with only 3% of this age group entirely agreeing with the statement (compared to 6% of the total population). Respondents above 60 years were less reluctant, given that only 14% partly disagree, against 20% of the total population. Even though it has not been specified what kind of responsibility the politicians should take, the youngest respondents are highly skeptical and regard it as unlikely that this responsibility could imply anything of positive value for them. In contrast, the oldest respondents are more open to the idea. A conceivable explanation is that users who grew up with the mobile phone as a companion or extended “body part” (Mason et al., 2022; Oksman and Rautiainen, 2002) are less comfortable with the idea of it being regulated than older people, who have lived analog lives and are more used to media regulation for the public interest.
A third factor of influence is socio-economic, relating most clearly to the respondents’ level of education, which corresponds with the degree of support for making the authorities responsible for disconnection. The higher the level of education, the more supportive the respondents are of the idea that politicians should be responsible. As many as 11% of respondents with more than 4 years of higher education support the statement that the authorities should take more responsibility, while none of the respondents (0%) with only elementary school endorsed this statement and just 2% of those with vocational college.
Lastly, the multivariate analysis connects the respondents’ views on political regulation with their general political leaning and party preference. On a generalized level, the traditional division of political parties into right- versus left-leaning is helpful in that right-leaning voters are significantly more reluctant toward government regulation than left-leaning voters, and the more right-wing, the more reluctant toward political regulation. The conservative-liberal Progress Party voters were most negative toward political responsibilization; only 12% agreed wholly or partly, less than half the share in the average population, while 46% strongly disagreed with the statement, nearly double the share in the total population.
The GAL-TAN political spectrum might be helpful because disconnection is an emerging political topic involving political ideologies based on social values rather than economic ones. The data shows that voters for the GAL (Green/Alternative/Libertarian) parties (the Green Party, the Red Party, the Labour Party, and the Socialist Left Party) are more supportive of political regulation than voters for some of the TAN (Traditional/Authoritarian/Nationalist) parties (the Conservative Party, the Progress Party, and the Centre Party). However, the Christian Democratic Party, also classified as TAN, is more in line with the GAL parties, indicating that the party sees digital overload as a threat to family values, its core issue.
Looking at the results and the demographic variables, what seems to be lacking is a clear message to the politicians. While some respondents, typically women over 30 years with higher education and Green Party sympathies, encourage politicians to take responsibility, the central tendency was reluctance. The survey results demonstrate that users are reluctant to support political regulation of disconnection. Considering the emerging discourse around the overuse of digital media and users’ struggle to self-regulate, we have unpacked a profound regulatory dilemma, which can be formulated as such: Despite a consensus on the invasive and problematic nature of digital media, there is no unified endorsement of legislative solutions and political intervention.
The qualitative interviews with users and activists similarly indicate a reluctance to embrace the idea of political regulation and skepticism about the realism of policies to curb the attention economy and its effects on users. When asked whether regulators should do more to address the challenges posed by intrusive media and technology, respondents expressed limited confidence in the regulators’ ability to make a difference. One informant remarked, “I do not think it is realistic; how would they even do it?” (see also Syvertsen, 2022b: 667). Overall, the qualitative and quantitative data suggest a lack of faith in the authorities’ capacity to mitigate the harmful effects of the attention economy, even within a highly regulated context such as the Nordic region.
Tech-industry: Lack of trust in industry responsibilization
Given that there is no unified endorsement of political intervention, we might have to look elsewhere for the answer to who is responsible. A key question in this section is to what degree the tech industry will likely take responsibility for reducing digital media overuse. The tech industry has been scrutinized for several matters, including mishandling privacy, microtargeting for commercial and political purposes, and spreading disinformation. In the context of this paper, the most relevant aspect is how behavior-based services and algorithms are designed to prevent users from disconnecting.
The European Union has several initiatives to regulate the tech industry, partly in response to national regulations in, for example, France and Germany, to prevent fragmentation in the EU’s internal market. The EU has warned against countries making their own laws without coordinating them with the EU. The most recent initiatives are the Digital Services Act (DSA), which was implemented as law in January 2024 in all member states, including the EEA, and the Artificial Intelligence (AI) Act, which the European Parliament officially approved on March 13, 2024, and the Council on May 21, 2024. The most important measures to protect consumers in the DSA concern privacy, targeted advertising, behavioral marketing, dark patterns, algorithm control, and illegal content. Some measures involve bans, others involve transparency, and some include actions that platform companies must take. Norway has proactively banned behavioral marketing (Datatilsynet, 2023). As a result of the DSA, behavioral marketing toward children is prohibited, as is behavioral marketing based on sensitive user data, such as ethnicity, sexual orientation, and political beliefs (European Commission, n.d.).
The AI Act is relevant for understanding how the EU seeks to regulate the attention economy by reducing the mechanisms used to capture and retain user attention, often through personalized content and advertising. First, the AI Act categorizes AI features into four risk categories, each with corresponding regulatory requirements. Most importantly, high-risk AI systems would be strictly regulated, including features used in the attention economy to manipulate behavior or influence decision-making. The Act prohibits AI systems from deploying subliminal techniques or exploiting vulnerabilities to manipulate behavior in ways that could harm individuals, which directly constrains how AI can be used to capture user attention. Second, to meet transparency requirements, AI systems should disclose their nature whenever they interact with humans or process user data.
In addition, members of the European Parliament warn against the addictive features of social media, digital games, streaming services, and online marketplaces and advocate for better safety for consumers. They ask the Commission to address legal loopholes and introduce new laws against addictive design (European Parliament, 2023). The report demands an assessment of, and a ban on, harmful addictive techniques not already covered by EU laws, such as the infinite news stream (infinite scroll), default automatic playback, and constant push notifications (European Parliament, 2023). The European Parliament claims that companies should be obliged to develop ethical and fair digital products and services “by design.” To reduce what they define as the addictive nature of platforms and give more power to consumers, the Parliament asks the Commission to present a digital “right not to be disturbed” as soon as possible. The Parliament also asks the Commission to create a list of good design practices, such as “think before you share,” having notifications turned off by default, chronological news streams, grayscale mode, automatic lock, and a summary of total screen time. The Parliament also suggests awareness campaigns for safer and healthier online habits (European Parliament, 2023). However, the industry is difficult to regulate, so responsibility often ends up being placed on the users. Hence, the circle of self-regulation seems closed.
But what about the users? Do they think the industry is ready to take responsibility? If we look at the survey data, most users doubt that there is much realism in the claim that the industry should take responsibility. When asked if they support the claim, “I trust that the media industry is doing what they can to enable users to regulate their digital media usage,” only 3% totally agreed, and 12% partly agreed. Taken together, only 15% have any degree of trust in the industry to solve the problem of digital overload. Regarding age, the group with the lowest trust in the industry was young people under 30 years, none of whom said they trust the industry to do what it can.
The level of education correlates negatively with the level of trust in the industry to do what it can to help users self-regulate. The higher the level of education, the lower the trust. Only 10% of users with more than 4 years of higher education say that they totally or partly trust the industry, while 33% of users with only elementary school do. Moreover, looking at income, high-income groups trust the industry more than lower-income groups, which can partly be linked to age, given that young people trust less and often have lower incomes. Looking at the respondents’ political views, those who vote for the conservative parties have more trust in the industry, while the socialist left has less trust in the industry to take responsibility.
Despite variations across socio-demographic groups, the data indicates a lack of trust in the industry’s commitment to reducing the pressure on users to remain constantly connected. This finding aligns closely with qualitative interviews conducted with users and disconnection activists. They argue that US tech companies aim to make people as addicted as possible, noting that “apps are built on psychological principles, functioning like slot machines to encourage continuous use, and their goal is to ensure we carry our phones with us at all times” (see also Syvertsen, 2022b: 677).
Although the global tech industry is lobbying to avoid regulation, we might expect higher support for regulatory measures in the “media welfare state” context, based on collaboration between the industry and the regulators. However, qualitative interviews with representatives of the Norwegian media industry tell a different story, as they unanimously argue against political regulation, as illustrated by these quotes: “I do not think regulation is the solution,” “I do not believe in regulation of the user-side; I am reluctant to limit the users’ freedom to decide,” “The problem is to identify adequate and proportionate measures. I do not think anyone wants a world where you, for example, are restricted to using your iPad 1 hour per day,” “I think users can self-regulate quite well. Parents should control what children are watching and how they use the screen,” and “Given that we want people to take responsibility in their life, I think regulation would reduce this. Children are more vulnerable, so we should take more responsibility to protect them.” Accordingly, the acceptance of political regulation is limited to children, while the rest of the population is supposed to self-regulate.
Politicians: Lack of legitimate means to take responsibility
Following the insight into the users’ reluctance and the industry’s rejection of regulation, we are left with the politicians and the question of feasibility and why they are not expected to regulate, even though there is broad agreement that the attention economy is problematic. A key barrier is that politicians lack legitimate and feasible regulatory means to reduce digital overload.
First, according to the survey, most of the Norwegian population believes that political authorities should avoid regulating digital media use because it is mainly an individual responsibility. In total, 63% agree with the statement that politicians should stay away from interventions, a share that is even higher among male respondents and younger age groups, while only 22% of the respondents disagree. The only group that seems to trust politicians to intervene is respondents with higher academic education; the more education they have, the more likely they are to disagree with the statement that the government should not intervene.
A large majority (83%) of the informants support the statement that the individual is responsible, while only a marginal group disagrees (6%). Political preferences influence attitudes toward individual responsibility. On this point, the left/right divide in Norwegian politics seems significant: right-leaning conservatives regard the individual as responsible, while more left-leaning voters seem reluctant to place the responsibility on the individual. The qualitative interviews with Norwegian politicians largely reflect the opinion of the population: belief in self-regulation is strongest among the informants on the right, while concern about the attention economy potentially increasing class differences is more pronounced among the informants on the left. Moreover, there is overall agreement that politicians should avoid regulating digital media usage: “I do not think politics can or should regulate how long you should watch a screen every day” (The Conservative Party), “I have little faith in bans, but I have faith in empowerment and that people get control” (The Liberal Party), and “It is not the politicians’ task to restrict mobile phone use through regulation. I think that is outside of what we should be doing” (The Labour Party). Although some politicians, in theory, support regulation, they seem overwhelmed by the practical challenges of identifying adequate regulatory measures (Enli and Fast, 2023). The only legitimate regulatory strategy aimed at adult users falls within the department of nudging: rather than implementing traditional regulations, politicians inform the population about the harmful sides of, for example, smartphone overuse (Moseley, 2020; Sunstein, 2014).
The politicians representing parties on the GAL side of the scale in particular supported the idea of official information: “If this becomes a real problem, we might consider health campaigns like the ones we have used to inform about the health damages of tobacco” (The Labour Party), “I think we should educate people to be more conscious in their mobile phone use, perhaps give courses and guide people to help themselves” (The Socialist Left Party), and “Someone should write a white paper on this matter to build knowledge about the potential damages” (The Green Party). Likewise, according to the survey data, information was the only political initiative addressing the harm caused by the attention economy that users found appealing. While over half the respondents (56%) support the statement “The authorities should provide more information about the disadvantages of digital media usage,” only 15% disagree. Support for such information was strongest among respondents with higher academic education (61%) and in the age group of pensioners, 67 years and above (64%). In terms of political preferences, the divide cuts across the traditional left/right divide: the idea of providing more information to the public is supported most strongly among the voters of the Green Party (78%), the Christian Democrats (69%), and the Socialist Left Party (67%). While users and politicians primarily rejected the idea of regulation, they agreed that the government could provide information and guidance to help users make healthy choices. This nudging also promotes self-regulation and individual responsibility rather than a collective and public responsibility in which the politicians take an active role.
Given that both the users and the industry, which has developed digital solutions to the problem of overuse such as Apple Screen Time, agree with the politicians that information is the most adequate solution, there is little incentive for the politicians to explore more proactive regulatory strategies.
When comparing the European tradition of media policy, which emphasizes public interest, creating an equal playing field, safeguarding national culture and language, and protecting vulnerable groups (Freedman, 2013), with the emerging regulations related to the attention economy, it is clear that the principle of “protecting vulnerable groups” has become more prominent in the new regulations, while the other principles have received less attention.
A significant finding from document analysis, alongside studies of Norwegian policy on the attention economy, indicates that politicians primarily rely on the European Union (EU) as a supranational body to develop measures for regulating digital media use, given the transnational nature of digital platforms (Enli, 2021; Enli and Fast, 2023). Protecting vulnerable groups remains the top priority at both EU and national levels.
So far, the most concrete initiatives at these regulatory levels include regulations concerning screen time and age restrictions. The EU and Norwegian health authorities have issued recommendations for appropriate screen time for children and youth, based on evidence of the potential physical and psychological health effects. International political discussions about these harmful effects have facilitated the enforcement of the existing age limit for accessing social media, currently set at 13 years, with the aim of reducing screen time and exposure to harmful content for children and adolescents. In February 2024, the Norwegian government introduced a new national recommendation to ban mobile phones in classrooms and on school grounds in primary and secondary schools.5 This recommendation is based on research indicating that mobile phones hinder concentration, negatively impact learning outcomes, and disrupt the social environment of schools.
Rules and recommendations directed at children are widely accepted as legitimate because of their vulnerable nature. In contrast, justifying equivalent guidelines for adults is a more complex task, as adults are assumed to be accountable for their own media choices. Consequently, politicians find themselves in a challenging environment where regulating the attention economy often conflicts with competing principles. This struggle makes it significantly more difficult to establish legitimacy for these regulations compared to conventional media policies.
Conclusion
In academic fields such as disconnection studies, media policy, and legal studies, there have been increasing calls for political regulation, urging policymakers to take more responsibility for reducing the harmful effects of the attention economy, such as digital overload (Bhargava and Velasquez, 2021; Kuntsman and Miyake, 2019; Zuboff, 2019). In the context of a “tech lash,” in which the harmful sides of digital media usage have become more visible, political regulation seems increasingly appealing to scholars and pundits. While a body of studies pinpoints the challenges of digital disconnection on an individual level (Jansson and Adams, 2021; Syvertsen, 2020; Ytre-Arne et al., 2020), what is missing are studies of the hindrances to political regulation. This article addresses this research gap by investigating the feasibility of political regulation of digital media in Norway, exploring the issue from the perspectives of three stakeholders and their internal relations: (a) the users: to what degree does the public support political regulation? (b) the industry: to what degree do users trust the tech industry to reduce digital overload, and to what degree does the industry support political regulation? and (c) the politicians: what regulatory measures are legitimate for politicians, with what rationales, and with what potential effect?
Three main findings deserve attention in this concluding part. First, public support for political intervention in digital media use is limited and marked by significant demographic divides. Even if parts of the population support regulation, there is no unified support for political intervention. Given that this result originates from a survey conducted in a highly regulated Nordic media market, it can be interpreted as an important correction to the calls for regulation in contemporary research on digital overload and the attention economy. Based on this finding, it seems somewhat naïve to point at the politicians, because the lack of popular support for regulatory initiatives makes intervention politically challenging, if not impossible. Expanding political regulation to restrict the attention economy, and thus also the users’ freedom to choose among available global platforms and services, would be possible only with a clear message from the voters and proactive pressure groups. Paradoxically, neither the users nor the activists believe that digital overload should be regulated on a structural and political level, because they conceptualize it as an individual problem.
A second finding is that the industry is neither trusted by the users to take responsibility nor in support of political regulation. This finding demonstrates that the tech industry has gained power outside the existing regulatory frameworks and, rather than engaging with politicians, has developed its own codes of ethics and claimed to self-regulate (Cusumano et al., 2021; Kokshagina et al., 2023). Accordingly, digital capitalism has weakened regulatory regimes such as the “media welfare state.” As media policy becomes increasingly blurred with policy fields such as telecommunications and information technology, the notion of what should be regulated by the state, and with what legitimacy, is increasingly contested. The survey shows that users lack trust in the industry to take responsibility for reducing the harmful overuse of digital media, which implies that they believe commercial interests have higher priority than ethical concerns. This lack of trust in the tech industry might be rational, given that digital platforms “pushed back regulation by claiming publicly that regulation was unworkable or destructive, and by threatening to leave the market” (Kokshagina et al., 2023: 175). This study confirms that tech companies successfully lobby for more industry-friendly legislation (Popiel, 2018). However, it also indicates that lobbying has convinced users that the attention economy is untouchable for media regulators.
The third finding is that politicians are reluctant to regulate the attention economy and struggle to find legitimate policy measures. Based on previous research (Enli, 2021; Enli and Fast, 2023) and empirical evidence from qualitative interviews with Norwegian politicians representing a broad spectrum of political parties, the politicians come across as both reluctant to regulate and unable to identify sufficient regulatory strategies. In the context of a “digital backlash,” several Norwegian politicians and political parties, at least in theory, support regulation and have initiated dialog with the digital platforms. However, except for regulation legitimated by protecting children, defined as vulnerable, they lack measures to intervene in the attention economy without risking harmful side effects for users, ecosystems, and the global digital economy (Petit and Teece, 2021).
This analysis concludes that there is limited support for political intervention, a lack of trust in the industry, and a restricted range of regulatory options. These factors create three interrelated regulatory dilemmas. The first dilemma is that users prefer to self-regulate but find it difficult to do so. The second dilemma is that the industry holds significant power yet is often unwilling to assume responsibility. The third dilemma is that politicians recognize the challenges but lack effective policy measures and regulatory tools.
A key policy implication is that traditional media regulation and its associated measures may not be applicable to the attention economy. Therefore, regulators need to collaborate with both the industry and users to develop effective solutions. The main theoretical implication is that research on disconnection, media policy, and the attention economy should recalibrate its expectations regarding what is feasible for regulators in liberal democracies. Additionally, it is important to explore the underlying mechanisms that hinder effective regulation rather than merely advocating for it.
Footnotes
Funding
The author disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was supported financially by The Norwegian Research Council, Grant no. 287563 (2019-23). Project title: Intrusive Media, Ambivalent Users and Digital Detox (Digitox).
