Abstract
This article aims to show the similarities between the financial and the tech sectors in their use of and reliance on information and algorithms, and how this dependency shapes their attitude towards regulation. Drawing on Pasquale’s recommendations for reform, it sets out a proposal for constant and independent scrutiny of internet service providers.
This article is a part of special theme on The Black Box Society. To see a full list of all articles in this special theme, please click here: https://journals.sagepub.com/page/bds/collections/revisitingtheblackboxsociety
Frank Pasquale’s The Black Box Society, published in 2015, addresses fundamental questions raised by the deployment of technology in our society and the ways in which it affects individuals in their daily life.
Crucially, Frank Pasquale reveals an analogy between the finance and tech sectors in ways that were not perceptible before. Chapters 4 and 5 in particular, respectively entitled “Finance’s Algorithms: The Emperor’s New Codes” and “Watching (And Improving) the Watchers”, explore and analyze the use of algorithms in the finance industry. That industry’s promises, methods, and disastrous effects find a clear echo in today’s tech sector.
Additionally, the book identifies the essential traits of any reform of the tech sector. The response must meet high standards of independence, rely on rigorous methodology, and address the challenge of access to information.
This commentary aims to show the similarities between the financial and the tech sectors in their use of and reliance on information and algorithms, and how this dependency shapes their attitude towards regulation. Then, drawing on Pasquale’s recommendations for reform, it sets out a proposal for constant and independent scrutiny of internet service providers.
Mirroring the finance sector
The main characteristics of the financial sector find their echo in the tech sector.
First, despite a difference in nature, “transactions between entities” (Pasquale, 2015: 169) define both sectors. In the former, transactions involve money; in the latter, information, exchanged between a company and a consumer for access to a service. The relationship creates wealth, and attention becomes a key factor in generating or preserving revenue.
Second, in both areas, “information advantage” is important and “arms race can get expensive” (Pasquale, 2015: 130). The competitive ascendancy of major firms rests on their capacity to collect and provide the most accurate information about their customers.
Originally, the use of algorithms promised to end bias and arbitrary decisions (Pasquale, 2015: 136), rationalize decision-making processes, instill expertise through the financial system (Pasquale, 2015: 102), and lead to unlimited prosperity. Yet methods were “biased toward reinforcing certain hierarchies of wealth and attention” (Pasquale, 2015: 188). In both instances, the black box, although complex and uncontrolled, was synonymous with wealth in a world of scarcity. Its apparent simplicity, through a disguised attractiveness and intuitiveness, justified unquestioned adoption.
Thus, prosperity rapidly became conditioned by secrecy, opacity, and uncertainty (Pasquale, 2015: 138), to the point of becoming part of the “culture” (Pasquale, 2015: 187) of these industries. Financial institutions as well as tech companies make a profit by “keeping [people] in the dark” (Pasquale, 2015: 187) regarding decision-making processes. Corporations have a propensity to conceal information about their practices from the public (Pasquale, 2015: 176). The crisis that erupted in 2008 rapidly revealed that firms were hiding their structure and methods from their customers and the public (Pasquale, 2015: 120). Similarly, the Cambridge Analytica scandal showed that Facebook was concealing its business model, its partnerships with third parties, and thus its recklessness with the confidentiality of its users’ information.
The “two black box dynamics” (2015: 111) that Pasquale describes regarding the financial sector apply accurately to the tech sector. When abuses are revealed, companies rapidly apologize and promise to reexamine their procedures to prevent future wrongdoing. Standard & Poor’s, when faced with a scandal, publicly stated its willingness to “updat[e] its models” (Pasquale, 2015: 111). The recent scandals over Facebook’s practices followed the same scenario. At first, the company’s CEO, Mark Zuckerberg, was quick to apologize and to acknowledge his responsibility. He then promised to review and update procedures to protect users’ interests.
Yet, in practice, improvements are limited in scope, disappointingly inefficient, or simply nonexistent.
Why? Because both industries are marked by volatility and potentially unlimited gains, which incentivize them to take risks. Companies assume that the benefits will outweigh the risks, a calculation that justifies little reflexive effort to limit negative effects.
Another troubling similarity between the two industries is their capacity to connect with governments in order to deter lawmakers from adopting strong regulation. Silicon Valley is often seen as ‘too big to fail’ and too powerful to be regulated—or even “‘too big for trial’” (Pasquale, 2015: 178)—especially given the intense international competition in the field. The argument runs that any attempt to regulate would undermine innovation by placing undue burdens on leading companies. The “laissez-faire movement” (Pasquale, 2015: 104) found another fitting application in the tech sector.
Indeed, like financial institutions, major tech companies have gained such power that they influence the entire economy. As Pasquale puts it, “Internet and finance firms ‘set the standards’ for our information economy” (2015: 187). This leverage plays out at both the macro and micro levels, because tech companies “set the standards by which businesses and people are judged” (Pasquale, 2015: 141). Reputation thus becomes a central metric of adjustment on the market, as it expands or restrains one’s capacity to conduct business or engage in interactions and transactions.
Yet Big Data remains “unexplained and unchallengeable” (Pasquale, 2015: 149). Control requires traceability of data. With regard to the finance sector, Pasquale asks two fundamental and provocative questions that apply perfectly to the information economy: “How many really know the ultimate destinations of their dollars?” (2015: 127–128) and “who would trust a pilot who ignored his own instruments?” (2015: 126). Similarly, users do not know “the ultimate destinations” of their data. As AI slowly slips out of control, the absence of means to determine who did what, and when, reduces one’s chances of challenging the process and protecting one’s rights and interests.
The now long list of scandals involving major tech corporations created a necessity to “rebuild public confidence” (Pasquale, 2015: 133). To address this challenge, one must remember that data is relational. This perspective allows us to understand the fiduciary duties of tech companies (Pasquale, 2015: 168), especially social media platforms, when it comes to collecting, analyzing, using, and sharing data.
The Cyborg Finance that Pasquale identified could well have a Cyborg Social Media sibling today. Its reform will take vast coordination and effort.
The reform
As many governments, experts, and leading scholars call on tech companies to assume their responsibility towards users and the public, Pasquale outlines and recommends key steps to make this happen.
First, just like financial institutions, major tech companies have reached a level of power that makes them hard to compete with. As a consequence, Pasquale invites regulators to “challeng[e] their rules, not tr[y] to keep ahead of them” (2015: 187).
What does this response look like? The title of Chapter 5 says it clearly: it consists in “watching the watchers”. In other words, constant scrutiny. “Someone needs to watch exactly how they are watching other people” (2015: 157), Pasquale writes. This rests on the assumption that auditing leads to better regulation (Pasquale, 2015: 151) and responsibility (Pasquale, 2015: 159). The recent calls by Silicon Valley leaders to be regulated confirm the relevance of this approach. They know that this “game [is] worth the candle” (Pasquale, 2015: 143).
Regulators in the United States and Europe should be auditing tech companies’ systems, but they lack the resources or the political will to do so. As with Wall Street in the past, governments now recognize “the need for fast, flexible ‘quick looks’ at suspect business practices” (Pasquale, 2015: 162). Pasquale recommends investing in “precrisis surveillance and enforcement” if governments want to avoid the critical consequences of “postcrisis litigation” (2015: 179).
Who, then, should carry out this scrutiny? Just as in finance, a strong voice is needed to regulate the tech sector (Pasquale, 2015: 135). Pasquale considers that an independent group of experts should be entrusted with this mission of “qualified transparency” (2015: 142). Only rigorous methodology and independence can restore the conditions of trust on the market and among the public.
An independent institution is needed to develop methods for identifying early signs of problematic practices and to “monitor complex Internet firms” (Pasquale, 2015: 168). This type of surveillance is not only useful but should also be welcomed by the tech sector: Pasquale presents “disclosure and auditing (…) as salutary extensions of current business practices” (2015: 162).
This important work raises many challenges. Access to information, and its quality, requires a fine-grained strategy to circumvent difficulties such as a lack of detail and standards, or confidentiality agreements (Pasquale, 2015: 169–170). Pasquale identifies these issues with regard to the work of the Office of Financial Research (OFR), whose mission is to “promote financial stability by looking across the financial system to measure and analyze risks, perform essential research, and collect and standardize financial data” (Office of Financial Research, 2010). To successfully apply this scheme to the tech sector, an organization will need “computer scientists, programmers, and other experts capable of understanding exactly how algorithms have changed over time, and how directives from top management might influence what is always portrayed as a scientific, technical, and neutral process” (Pasquale, 2015: 165).
The digitalization of the economy has made it possible to record every transaction. The principle of log records could be applied to the tech sector to allow regulators and investigators to examine processes and practices. On the companies’ side, this solution implies disclosures to selected experts.
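To make the idea of traceable log records concrete, the sketch below (a minimal illustration, not a description of any existing system; all field names are hypothetical) shows a tamper-evident, hash-chained audit log in Python. Each entry records who did what to whose data and includes the hash of the previous entry, so an auditor given access can recompute the chain and detect any later alteration.

```python
import hashlib
import json
import time

# Fields that are covered by each entry's hash.
_HASHED_FIELDS = ("ts", "actor", "action", "data_subject", "prev")

def _entry_hash(entry):
    """Canonical SHA-256 over the hashed fields (sorted keys for stability)."""
    payload = {k: entry[k] for k in _HASHED_FIELDS}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_entry(log, actor, action, data_subject):
    """Append a record of 'who did what to whose data', chained to the previous entry."""
    prev_hash = log[-1]["hash"] if log else "0" * 64
    entry = {
        "ts": time.time(),
        "actor": actor,                # e.g. an internal team or a third-party partner
        "action": action,              # e.g. "read", "share", "delete"
        "data_subject": data_subject,  # whose data was touched
        "prev": prev_hash,
    }
    entry["hash"] = _entry_hash(entry)
    log.append(entry)
    return entry

def verify_chain(log):
    """Recompute every hash; returns True only if no record was altered or reordered."""
    prev = "0" * 64
    for entry in log:
        if entry["prev"] != prev or entry["hash"] != _entry_hash(entry):
            return False
        prev = entry["hash"]
    return True
```

An auditor holding such a log can verify its integrity without trusting the company that produced it, which is precisely the kind of traceability that "who did what and when" questions require.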
This work could help lawmakers and regulators to adjust and adopt regulations based on constant and independent scrutiny. This new organization would make its reports public and help create new standards for the internet.
Frank Pasquale’s book inspired in me an essential project. After reading The Black Box Society, I founded 21 Mirrors, a nonprofit organization that analyzes, rates, and reports to the public on the policies and practices of social media, web browsers, and email services, and on their actual and potential consequences for freedom of expression, privacy, and due process. To reach this goal, we rely on qualitative criteria (such as the use of end-to-end encryption and face recognition techniques) grouped into six categories: identity management, freedom of expression, privacy protection, cybersecurity, due process, and policies. Our work uses the reputational risk of digital companies as leverage to prompt them to improve their data protection policies. 21 Mirrors will provide lawmakers, entrepreneurs, and the public with a reliable source of information for developing new policies or applications with an enhanced level of human rights protection.
Users mistrust both governments and private companies, and as a result, self-regulation by tech companies is not a viable option. It is also unclear that regulation at the national level would be better. Regulations of limited domestic reach could result in dozens of overlapping and contradictory regulatory regimes: 193 different internets for 193 different countries. This is why intervention by an independent third party like 21 Mirrors is all the more important to protect the benefits of the internet while preserving civil liberties. Our organization gathers interdisciplinary experts—including Frank Pasquale—from across continents and remains independent from the tech industry. Moreover, our procedures are made public and include conflict-of-interest and recusal rules to ensure our impartiality.
Implementing this ratings system will not require legislation at the national level. It can be done immediately and include a diversity of perspectives. Its purpose is not to disclose trade secrets or dictate which company should or should not exist but to translate the duties of care and loyalty into standards of good practice. 21 Mirrors can help inform this new regulatory push, ensure continued oversight and accountability, and provide a way to measure results.
One book can have groundbreaking consequences, and it takes a leading thinker like Frank Pasquale to inspire reform.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.
