Abstract
Workplaces are increasingly full of complex technologies embedded in dynamic infrastructures, demanding that workers assess and understand unanticipated problems. In order to comprehensively appraise the role of technological complexity and the uncertainties it affords in a complex, high-stakes setting, we interviewed and observed members of three interdisciplinary STEM laboratories. Findings revealed that organizational members navigated uncertainty by cultivating ignorant expertise (i.e., not knowing but figuring it out). This form of expertise emerged as a combination of two practices: the practice of emergent troubleshooting and the practice of negotiating new practices. In discussing these findings, we offer three key takeaways. We demonstrate that ignorant expertise: (a) operates as a dialectic of hesitancy and boldness and is mobilized through ignorant yet knowledgeable actions; (b) is communicatively performed through think-out-loud and storytelling techniques and through developing interpersonal rapport with organizational members; and (c) establishes technological complexity as a catalyst for organizing processes.
Work practices in most organizations have evolved to become technically complex as tools such as distributed computing, cloud infrastructure services, and deep learning become increasingly common (Bailey & Barley, 2020; Barbour et al., 2023). As workers try to integrate these technologies with day-to-day practices, they encounter situations that present technical issues they are not equipped to navigate (Heath & Luff, 2000; Suchman, 2006). Such a lack of relevant skills and expertise is particularly consequential when advanced technologies are embedded in work settings where dealing with technical complexity is only a means to an end (i.e., not necessarily considered the primary work goal). For example, scientific infrastructure used in STEM labs encompasses a variety of hardware and software equipment. Use of this infrastructure to answer scientific inquiries can be unanticipated and varied. That is, organizational members who deploy such infrastructure in their day-to-day scientific work are not necessarily adept at navigating its technical intricacies. While such workers may possess enough knowledge and expertise to consider engaging with these technologies, it is only by running into issues with them and being exposed to their technological complexity that they develop comprehensive skills to work with them. The goal of this research study is to understand how experts negotiate ignorance and navigate uncertainty while working with complex technologies in such work settings.
Complexity in technological infrastructure can be understood as rooted in multiple technological artifacts (Fleming & Sorenson, 2001; Maguire et al., 2006) assembled in modular, hierarchical, and interdependent ways (Nicolini et al., 2012). However, for the complexity of such technologically advanced settings to visibly and perceptibly manifest, these technologies must integrate with workers’ goals and work practices. We use the term practice to refer to the “doings and sayings” of organizational members working with complex technologies (Nicolini, 2009, p. 1400). Workers’ use of technologies, the financial and intellectual investment in such technologies, and the specialized goals those technologies help achieve make them tremendously critical, particularly for those conducting STEM- (science, technology, engineering, math) related work. Thus, the complexity of technology is not merely defined by the nature of its features and functionality, but should also be understood as rooted in work practices around development, deployment, and maintenance of such infrastructure.
In discussing the use of complex technologies, Hughes (1983) invoked the idea of incompatibilities and “reverse salients” that impact the progress of work and demand “remedial action” (p. 80). Hughes (1983) commented: “A reverse salient appears in an expanding system when a component of the system does not march along harmoniously with other components” (p. 79). In this context, the term remedial action refers to ensuring harmony in the way technologies embed in a socio-technical system. We argue that this remedial action is often executed and/or assisted by experts and demonstrates (and feeds into) the uncertainty of expert work practices. For example, when internet servers in a specific section of an office building are down, experts have to spend time isolating the point of failure, which could be associated with power lines, ethernet cables, or the internet service provider, among other possibilities. In other words, when experts grapple with the uncertainties of complex technologies, they have to invent new solutions to navigate incompatibilities among parts of a larger complex technological system. By using the term expert, we invoke the Latin origins of this term, i.e., experiri, which means “to try” (Online Etymology Dictionary, n.d.). Specifically, we elaborate upon Ullman’s (2012) use of the term “ignorant experts” (p. 110) and demonstrate that ignorant expertise is intertwined with the use, development, and maintenance of complex technologies.
We conducted interviews and observations over a period of 16 months across three interdisciplinary applied physics laboratories associated with different North American universities—these labs were in different stages of developing, maintaining, and using complex laser systems and software simulation platforms instrumental in advancing scientific research. We found that experts engaged in two distinct types of practices while working with complex technologies: the practice of emergent troubleshooting, and the practice of negotiating new practices under unexpected situations and unfamiliar conditions. We discuss these findings to extend the meaning of ignorant expertise (i.e., not knowing but figuring it out) and offer three key takeaways with regard to ignorant expertise as a work practice. First, we demonstrate that ignorant expertise operates as a dialectic of hesitancy and boldness wherein expertise is mobilized through ignorant yet knowledgeable actions. Second, we discuss how ignorant expertise is communicatively performed through think-out-loud and storytelling techniques, and through developing interpersonal rapport with organizational members. Finally, we argue that ignorant expertise establishes technological complexity as a catalyst for organizing processes.
Ignorant Expertise
Relational perspectives on expert work practice often center around performance, communication, and trust in expertise (Barley et al., 2022; Treem, 2012; Treem & Leonardi, 2016). Accomplishing specialized tasks, coordinating with other organizational members, and providing information and services inside and outside local organizational settings are a few of the many roles and characteristics of expert work practice (Stehr & Grundmann, 2011). On the other hand, cognitivist and object-oriented perspectives on expertise have argued that experts are identified in terms of the objective knowledge they possess in comparison to non-experts; i.e., expertise is something people have (Mieg, 2001). Mieg (2001) bridged cognitivist and relational views on expertise and positioned expertise as a “social form of interaction” (p. 43) where both objective and relational knowledge is negotiated. Collectively, the knowledge and interaction of experts with others are central to the theorization of expert practice; i.e., what do experts know, how do they communicate knowledge, and how do others establish trust in expert knowledge.
Similar to relational and cognitivist perspectives on expert work, emergent expertise literature has also conceptualized expert work in terms of knowledge application and generation aimed at minimizing knowledge gaps (Roberts, 2013). Explaining how work happens in complex organizational situations where there are multiple sources of ambiguity and “goals and technology are hazy,” Cohen et al. (1972) demonstrated that members invent new solutions addressing immediate and urgent necessities. Snowden and Boone (2007) referred to such complex work contexts as spaces where solutions emerge as experts utilize resources at hand. Faraj and Xiao (2006) identified such expertise among healthcare professionals in a medical trauma response center and argued that expertise emerges from dialogic coordination of distributed expert knowledge. Majchrzak et al. (2007) analyzed impromptu organizing of emergent disaster response groups and theorized their emergent expertise as driven by knowledge flexibility, moderate levels of trust, and simple knowledge coordination mechanisms.
However, Weick (1998) offered an alternative view. He decentered the importance of knowledge in expert work practices and interpreted organizational life as characterized by a lack of order requiring members to cope with “discontinuity” (p. 551). By doing so, Weick moved away from understanding expert work in terms of order or recognizable patterns. Weick (1993) analyzed the work of firefighters during the Mann Gulch wildfire and argued that wisdom rather than knowledge of particular facts was a source of resilience for firefighting crew members. The author invoked Meacham’s (1983) definition of wisdom that builds on the interrelatedness of “ignorance and knowledge [that] grow together” (p. 641). If expert work is situated on a spectrum of knowledge and ignorance, why must the scholarly theorization of these practices center knowledge? We argue that when expert work is theorized from an ignorance-based perspective, trust and legitimization in expert practice get problematized. In other words, when experts do not know what they are doing, trust in their work cannot be easily established.
This perspective shifts scholarly focus to a window of time in expert practice during which experts do not perform as they might be expected to. Zooming in on this time frame illuminates the kinds of failures experts run into, the ways experts navigate them, and the processes through which those failures inform their successful work. While Faraj and Xiao (2006), Majchrzak et al. (2007), and Weick (1993) closely analyzed situations of emergent expertise, their research settings were crisis-laden, time-sensitive, and life-threatening. Thus, such research does not provide an opportunity to analyze repetitive failures in expert work—an analysis which is necessary for developing an ignorance-based perspective on expert practice.
Ullman (2012) began to explain ignorance in expert work by referring to computer programmers as “ignorant experts” who have to navigate the fussy nature of complex and interdependent software and are not always able to predict the outcomes of their actions (p. 110). The term ignorant experts here indicates that programmers are experts in that they can perform their knowledge in ways that seem to work (to them and to others), and act in ways that would be difficult or impossible for non-experts. However, given the complexity of software engineering practices, programmers are often skeptical about the outcomes of their work. Ullman (2012) wrote, “If people really knew how software got written, I’m not sure if they’d give their money to a bank or get on an airplane ever again” (p. 2). However, software in flight control systems does work, and banks are largely secure. This comment is, therefore, a rhetorical admission of an ignorant way-of-working that ultimately leads to a successful outcome. It reflects programmers’ realization of the messiness and the complexity of their work; a realization that hits them when their modular, bounded work takes a tangible, visible form such as that of a flight control system. This example demonstrates that ignorance is an inevitable part of expert work, and, when analyzed, helps us understand the messiness of expert work more closely.
Science, engineering, and management literature have articulated that ignorance emerges both from known unknowns and unknown unknowns inherent to technical problem-solving practices (Attenberg et al., 2011). Known unknowns are “things we know we don’t know” and unknown unknowns are “things we do not know we don’t know” (Logan, 2009, p. 712). Roberts (2013) categorized this as “ignorance from lack of knowledge” and as different from “ignorance about existing knowledge.” He argued that ignorance about existing knowledge can be defined as: (a) knowable known unknowns; things we know we don’t know but can know, (b) unknown knowns; things we don’t know we know or tacit knowledge, and (c) errors; things we think we know but don’t. Ullman’s (2012) description of programmers as ignorant was motivated by such unknowns that are often triggered by continually evolving situations. In the context of the work of organizational members, Roberts (2013) spoke to the relevance of ignorance and argued that it may encourage creativity, innovation, and efficient use of cognitive resources by providing space for trial and error. Such views on ignorance reinforce that in unfamiliar situations, emergent expert work does not simply seek to minimize lack of knowledge (cf. Majchrzak et al., 2007) and eliminate ignorance, but rather to embrace it as a means of improving organizational performance.
Thus, ignorance, markedly different from being ignorant, operates as a “marker of greater expertise” and demonstrates “the ability to know that one does not know (or cannot know)” (Kominsky et al., 2016, p. 32). Parviainen et al. (2021) examined the Finnish government’s response to the COVID-19 pandemic and referred to scientists’ acknowledgement of their lack of knowledge as “epistemic humility” (p. 240) which they claimed was a primary force in effective decision making. Borges et al. (1999) demonstrated that ignorance helped simplify the unpredictability of the stock market and proved effective for certain investment decisions.
The current research study investigates ignorant expertise in technically complex scientific work environments where expert scientists design and construct the technologies and machines that facilitate scientific research (e.g., Traweek, 1992). As experimental tools, these technologies are not only complicated—in the sense that they are materially intricate—but also ambiguous in the sense that scientists often design them while using them to accomplish their work. Specifically, we focus on the use of technologies that (a) are constituted by multiple interdependent hardware and software modules critical for the technological ecosystem to be useful, (b) are highly customizable which, for practical purposes, blurs the boundary between design and use of those technologies (Jackson et al., 2023; Leonardi, 2009), and (c) have a hierarchically-nested structure that is largely invisible and not comprehensively understood, even as it is fully embedded in day-to-day expert work. By this phrase—hierarchically-nested structure—we refer to the Russian dolls-like architecture of technologies in which components are nested inside one another and are not necessarily visible at the same time (Star & Ruhleder, 1996).
We focus on these characteristics because interdependence between multiple and different kinds of constituting modules is likely to trigger unanticipated incompatibilities across scientists’ goals, because various forms of customization can generate uncertainties around outcomes of technology use across multiple organizational members, and because the invisibility and “embedded” (Nicolini et al., 2012, p. 622) structure of its nested design can create unpredictable failures obscured by a focus on work outcomes (i.e., scientific inquiry). Empirical work has supported the claim that complexity and ignorance are core aspects of such scientific work. Nicolini et al. (2012) studied collaboration around novel biomedical technologies and showed how intricate tools emerged from their intertwined social and technical roles in the production of scientific knowledge (Wray, 2002). Such technologies can serve as a focal point upon which researchers coordinate complex actions and operate as epistemic objects (cf. Rheinberger, 1997), thus making it difficult to determine what knowledge is applicable and therefore what forms of expertise emerge.
Interdisciplinary scholarship broadly defines “complex systems” as any macro/micro level physical, biological, or social system that has an evolving and dynamic hierarchical structure and multiple interdependent components interacting in unpredictable ways (Simon, 1996; Snowden & Boone, 2007). In organizational research, this meaning of complex has been used to explain a wide range of organizational phenomena such as innovation, knowledge, networks, and communication (Maguire et al., 2006; Monge & Contractor, 2003). For the purposes of this study, we situate the complexity of a technological ecosystem as embedded within organizational members’ work practices in scientific work environments.
In order to explore how ignorant expertise might operate in contexts that have more established standards of expertise and that involve interaction with complex technologies, we ask the following research question: RQ: What role does ignorant expertise, and associated practices, play in contexts characterized by complex technologies?
Methods
Our study took place over 16 months across three interdisciplinary applied physics laboratories in the Eastern and Midwestern United States. Labs were composed of 12, nine, and nine members; they were vertically integrated, including undergraduate and graduate students, postdoctoral researchers, staff scientists, and group leaders. We contacted six different labs that were active in exploring similar scientific domains, and three agreed to participate in this study. The setting of physics work is particularly meaningful for our research question not only because of the centrality of complex technologies, but also because the work necessitated interaction and interdependence with others to accomplish tasks (i.e., individuals can use a microscope independently, but a laser requires multiple people to operate). Lab A and lab C’s work consisted of “hands-on science” involving the creation and maintenance of scientific equipment on which lab members ran scientific experiments. Lab B’s work shared the same scientific focus, but lab B did not have scientific equipment in-house (though the lab was in the process of building equipment during the period of study). Instead, the members of this lab spent their time preparing for and conducting collaborations which involved traveling to other laboratories and using their scientific equipment to run their experiments. Lab C had the scientific equipment available in-house and hosted scientists and students from other labs so they could conduct experiments collaboratively. These labs provided a wide analytical frame with varying stages of creating, maintaining, and using complex technologies across different scientific research contexts; thus, any repetitions and patterns in expert work practices across these settings constitute key observations.
Data Collection
Originally, the research design called for in-person observations of work at the respective institutions of lab A and lab B. However, data collection took place during a period of time when travel and in-person work was restricted due to the COVID-19 pandemic. For health and safety concerns, members of lab A and lab B primarily conducted work from home; access to facilities, including offices and scientific equipment, was severely restricted. Given these limitations, data collection shifted to observations of weekly laboratory team meetings for both lab A and B. While we were able to conduct remote interviews with members of lab C, we could not gain site access for lab C. During the pandemic, all of these labs were able to continue work by arranging scheduled use of lab facilities and conducting collaborative brainstorming activities remotely. During interviews, when members reflected on their experiences working in these labs, they shared that although modes of interaction shifted during the pandemic, this shift did not significantly impact the nature of their work.
Over a four-month period, the second and third authors attended 26 team meetings (16 for lab A and 10 for lab B). These meetings were conducted online using the videoconferencing platform Zoom, and were hosted by each lab’s group leader. Attendees were aware of the researchers’ presence (and provided informed consent), but the researchers did not participate in the meetings beyond a brief introduction at the first meeting attended. During these meetings, the researchers each took notes; one researcher made an effort to capture the specific talk occurring, and the other researcher took notes on relevant aspects of non-verbal communication, tone, or activity. Often meetings involved participants sharing PowerPoint slides; in those instances, efforts were made to capture images from presentations when possible.
Following the period of remote observation, the authors conducted 27 interviews with group members (12 with members of lab A, eight with members of lab B, and seven with members of lab C). These members included senior scientists, senior and junior graduate students, undergraduate students, lab engineers, and lab leaders. Interviews lasted between 33 and 93 minutes and averaged around an hour. During the second year of the research study, we conducted two day-long in-person observations of three lab members from lab A. Out of a total of 27 interviews, eight were conducted in-person during this phase of the study. The in-person observations conducted during this time allowed us to focus on the use of complex technologies by lab members. We shadowed three lab members, observing them both deploy lab equipment to prepare targets for laser experiments and collectively analyze computer code and interpret meanings of its outcomes. We centered these technologies in our observation sessions with the goal of “re-presenting practice through foregrounding the active role of tools and materials” (Nicolini, 2009, p. 1402).
A semi-structured interview protocol was used to establish comparable responses from respondents on core issues and allow for follow-ups and probes based on emergent conversational topics. The interview protocol included sections with questions regarding what knowledge and resources individuals felt were most useful in their work, how they perceived their fellow lab members and group leader, what they perceived as challenges in their work, and how they approached difficult decisions they encountered. Following discussions, interviews were transcribed, any identifying information was removed, and pseudonyms were assigned to ensure confidentiality.
Data Analysis
Analysis of the data proceeded iteratively and in parallel with data collection (i.e., we engaged in periods of analysis while still collecting additional data). Researchers met regularly during the period of remote observation to discuss emergent patterns in the lab group discussion, and to identify anything unusual, noteworthy, or puzzling that was observed in the meetings. It was two months into observations that the central role of complex scientific equipment in both labs’ work emerged as a topic of interest. However, the only adjustment made to the research process at this point was to ensure that the interview protocol included questions related to the role and use of this equipment. Upon conclusion of the interviews with lab members, the first and second author engaged in a period of data immersion in which they reviewed all of the transcripts and meeting observations. Both researchers independently developed memos based on concepts of interest and patterns that appeared in the data. A comparison of these memos surfaced ignorant expertise of group members as a distinctive dynamic associated with the work in these labs.
Then, the first author engaged in a process of focused coding of the interview transcripts to identify passages that referred to either the use of complex scientific equipment or the processes through which lab members developed expertise. In parallel, the second author engaged in a similar process using the notes from the meeting observations. The two researchers then reviewed the collective comments from respondents in interviews and meetings to identify themes associated with the research question. We generated 3923 codes across 52 documents (interview transcripts and meeting observation notes). A subset of these open codes informed the findings presented in this manuscript (see Table 3 in Supplemental Material for the list of open codes and focused codes that inform our findings). Following Owen (1984), themes were determined by assessing the recurrence, repetition, and forcefulness of the communication of lab members. As such, the themes presented are neither exhaustive of all communication related to scientific equipment, nor reflective of only the most frequent communication topics. Rather, the themes reflect the aspects that, per the respondents, were most central and essential to their work. We took Owen’s approach to the generation of themes because the research question is focused on highlighting the practices around complex technologies. The recurrence, repetition, and forcefulness of themes were reflected through the communication of lab members during lab meetings, interviews, and in-person observations. Collectively, these indicated the embeddedness of themes in their actions and mindsets in day-to-day work. Also, repetition of themes across various lab members’ communication demonstrated the shared practices derived from intersubjective experiences. The goal in this analysis was to surface and reflect an emic view of the meaning of ignorant expertise and the role of complex technologies within these labs and among their organizational members.
Findings
In addressing our research question, we found two primary practices central to the work of ignorant experts: (a) the practice of emergent troubleshooting, i.e., running into technical failures, figuring them out, and fixing them when lab equipment worked in unexpected ways; and (b) the practice of negotiating new practices, i.e., being able to work with different kinds of lasers or with limited access to time and material resources. Lab members mentioned that succeeding in this work demanded explicit acknowledgement of technical ignorance followed by strategic troubleshooting. We discuss these practices in detail in the following sections and provide additional support of the themes in Table 1, Table 2, and Table 3 (Online Supplement).
The Practice of Emergent Troubleshooting: Figuring out How to Fail and Fix
While working with lasers and computer simulation programs, the process of failing, isolating the cause of failures, and fixing those failures was a prominent and repetitive work practice. In the interviews, lab meetings, and in-person troubleshooting sessions, when respondents talked about failure (or something “not working”), they were mostly not referring to overall experiments failing, but rather to routine daily failures. Such failures included instances when a resistor would burn out, when computer code or programs returned errors or uninterpretable results, or when they could not get focused images on a camera. The respondents said that on a day-to-day basis, these types of small failures were commonplace. For example, during a meeting of lab A, a graduate student shared an experience he had while conducting an experiment in the physical lab space, a technically complex and highly coveted setting: “We had lots of issues early in the week. Problems with [equipment], and getting the right [configuration]. We also had problems setting the [element] – like a wire disconnecting itself and us not finding it for a day.” When he shared this story—recounting how a single disconnected wire on a multi-ton, multi-million-dollar machine took a full day to diagnose—none of the eight other individuals in the meeting expressed any surprise, concern, or criticism. There was an understanding that when complex machinery is designed, built, and operated in these labs, its technological appendages are bound to break and fail—trial and error is a part of operating such complex machinery.
Furthermore, expert work across all levels of experience exhibited similar characteristics. A senior graduate student from lab B said: [N]o one comes in an expert on anything. Even the colleague I have who worked at [a renowned company] … were like, “I have all this technical ability and I know how to build these machines …, but I still don’t know what I’m doing.”… No matter what skills you have, because we’re doing this thing from scratch, you have to build new skills.
Speaking to the ignorant nature of expert technical work across all experience levels, interviewees rarely discussed meaningful differences in the technical expertise among lab members; in fact, on multiple occasions, interviewees were explicitly dismissive of the distinctiveness of technical knowledge. Instead, individuals mentioned how they valued the ability of individuals to adapt, see possibilities, and reformulate problems to facilitate work progress. Operationally, the meaning of expert work was constituted as the ability to keep progressing on work in the face of continued obstacles introduced by failed, unfamiliar, and/or inaccessible components of lab equipment.
It seemed, then, that the nature of problems was not of particular importance; what mattered was whether an individual could address an issue and continue to make progress in their work. In meetings of lab A and lab B, respondents would frequently mention, “playing around,” “fiddling with,” or “messing around” in an effort to find a solution to emergent problems. In an interview, an undergraduate student from lab A recalled observing a faculty member who developed a creative but temporary workaround for an issue: This one was a short-term solution that he just wanted to [run the experiment] and wanted to divert the problem for the time being … you do what works, and I think that thinking is definitely pretty pervasive … when we are first getting things to work.
In one of the weekly meetings of lab B, a graduate student said: “[We] are working with high-speed camera and struggling pretty hard. We just couldn’t seem to figure out … The settings on the camera are really weird. Hopefully we can mess around a bit more.” This comment, among others, is not indicative of a definitive solution or even a specific plan of action going forward. It is, rather, an explicit acknowledgement of confusion accompanied by a sense of direction.
During an observational session of lab A, an undergraduate student was trying to prepare targets for experiments. One step of this task was cutting target wires at precise lengths. As the student made adjustments for the cutting, he used a sticky note to see where the laser beam would point and then made sure that the beam track was aligned with that of the wire. Using a sticky note was an ingenious idea that allowed him to line up the beam track precisely. After the procedure was done, it left an iridescent color on the wire. When the lab engineer told him to use a device called an ultrasonic cleaner, he asked, “What are you talking about?”, not knowing that such a device existed. This story demonstrates that the student was expert enough to operate complex machinery autonomously and it was only by running into issues that he gained new knowledge. Failing and fixing was not only a way for organizational members to move forward but also unraveled the complexities of the technology infrastructure.
Thus, failure is intertwined with experimentation and technology in such a way that when everything runs smoothly, it limits opportunities for learning. This creates a paradox in which lab members are often not as intrigued by problems that have distinct and clear answers because they do not afford as much opportunity to develop and demonstrate problem-solving skills. At one lab A meeting, a former lab member who had moved to a corporate research setting elaborated on this idea. The member mentioned that his hands-on experience working on experiments in the lab led to an advantage over his peers who did not work directly with complex machines needed to run those experiments. He said: Running code is kind of a technical skill, but the underlying thing you need to succeed in that field is not making the computer run, but understanding all the underlying things, how the experiments work, and understanding the real life constraints. So, I could more easily work along with an experimental team and understand what they are doing.
Echoing the value of this emergent learning with technology, a senior graduate student from lab B indicated that they gained knowledge by repeatedly fixing and experimenting with complex machinery. He said: I’ve taken apart and put back together … the little pulser in the lab, hundreds of times. So, it’s like I know every single little curve on that machine inside and out … And so, when something goes wrong, I usually kind of have an idea of where it went wrong, which is nice.
Because these failures occurred in the context of specific aspects of complex machines, fixing problems exposed individuals to the machines’ hierarchically nested structure and served as a way to materially deconstruct the abstract complexity of scientific work.
Moreover, the perception was that anyone could develop technical skills, but the ability to adapt to—and thrive amidst—persistent and complex everyday problems was critical. Many lab members reflected on the respect and appreciation fellow lab members garnered when they were able to quickly solve complex problems and remove bottlenecks in running an experiment or a simulation—be it through an actual scientific device or a software package. When asked about what makes a good collaborator, a senior graduate student from lab C highlighted the importance of quick thinking: “[Y]ou should plan for base goals … stretch goals … and contingencies and be ready to solve problems as they come up. So, someone who’s kind of quick on their feet is pretty important for experimental work.” In discussing what it meant to be “good,” a senior scientist from lab C mentioned that one develops an intuition over time as they gain expertise in working with experimental set-ups. Recalling the distinctive abilities of senior members who had moved to other positions outside of the lab due to lack of funding, this senior member said: [T]he intuition of taking a quick look at a design and knowing, “Oh, you didn’t take into account thermal lensing. Your thing’s going to break. It’s going to drill through a mirror and you need to redesign it” … All of those little things that become intuition once you’ve been doing it long enough, we’ve lost that. I wish we had more of that now, because that means a whole lot to a group like ours.
The skills valued as expertise in the labs were described in a generic vocabulary rather than a domain-specific vocabulary tied to the outputs of scientific work (i.e., an academic paper that would be produced based on experimental results). For example, phrases like “quick thinking,” “being good with hands,” “smart hacks,” and “out of the box thinking” were often referenced. One member from lab A described his efforts to seal off an opening in a pipe this way: “The vacuum holds but electrically I am not sure if it works, so do we want to just go with that?” Lab members needed to fix things quickly even when they did not have a comprehensive understanding of how those fixes might hold up given that they were rudimentary and incomplete. However, ensuring that the fixes were usable enough demonstrated a deep understanding of the science being pursued.
This practice of failing and fixing was so pervasive that it also blurred disciplinary boundaries around designated tasks. Lab members did whatever it took to conduct the experiments and they were expected to acquire, develop, and demonstrate whatever situated knowledge was necessary to solve the immediate problem. As a result, lab members often drew on knowledge from different aspects of engineering (mechanical, electrical), chemistry, carpentry, welding, logistics, and plumbing. Multiple senior and junior lab members commented on the blurred boundaries of domain expert work in these labs. For example, a senior member from lab C said: So the most important skill is … troubleshooting …you have to use your brain and try to figure out what’s actually happening … in our group, we love to rebuild everything ourselves …we had a laser that has … [a] water leak and we have to assemble all the water cooling … repair it from beginning … So, you work as a plumber for some time.
Conducting experiments, writing code, or even analyzing data constituted a small fraction of the work of lab members, both in terms of time and effort. They framed their problem-solving work more generally as troubleshooting; not physics-related, programming-related, or plumbing-related troubleshooting, but simply troubleshooting.
In referring to the domain-agnostic nature of troubleshooting, a graduate student from lab C revealed that this emergent work influenced their learning and problem-solving approaches. She said: "I might have to set up something like this all by myself … I want to understand how each piece works … I'm going to be making a decision about these very basic things." This comment demonstrated how lab members recognized the breadth of knowledge they would need to rely upon in conducting lab work, as well as the diversity of problems they would need to address. This domain-agnostic nature of problems and expert work indicates that while niche areas of expertise mattered in the context of individual research projects, they took a backseat when it came to figuring out immediate day-to-day problems related to, for example, plumbing. Commenting on the situated, idiosyncratic, and unpredictable nature of troubleshooting, a graduate student from lab B said that "once the machine is running, I'm not totally sure what my day is going to look like." An undergraduate student from the same lab explained the meaning of troubleshooting: "It's just up to you to figure it out, it's up to you to sit down and bang your head against the wall until it suddenly makes sense." Talking about their troubleshooting practices around emergent failures, a laser technician from lab C explained: "that's kind of where we have to backtrack everything that we've learned up to a point … there's a lot of things that do go wrong."
These comments indicate several things about the inherent and inevitable challenges that emerge in this work. First, the possibilities of failures are many and unique, making it difficult to predict outcomes. Second, individuals often find themselves using unfamiliar equipment or procedures. Third, the unpredictability of work renders much of the potential negotiations of technical knowledge moot and introduces an element of ignorance in expert work practices.
The Practice of Negotiating New Practices: Figuring out How to Work under Material and Time Constraints
The size and cost of lasers along with the varied research agendas of scientists and graduate students made it inevitable that lab facilities would host different kinds of lasers in various phases of their life cycle—designing, building, operating, and maintaining. Consequently, members of all three labs visited other facilities to use specialized machines and equipment needed to address their niche research questions and gather corresponding data.
Because lab members moved frequently between facilities to collect data and to conduct experiments, the skills and expertise they had were almost always contextual—they could be rendered important or redundant the very moment they started operating in a different facility. When asked about the kinds of knowledge required to do their job well, a member from lab A said, "I think interpersonal skills are very important because almost all of our work is at some external facility." One lab member told his peers about the obstacles he faced working with unfamiliar equipment at another facility: It was pretty cool of them to let us do that [extra test] because of issues on actual [experiment] day. That got cool results. I found out that there is an offset … that isn't on any documentation and people neglected to tell me when I asked.
The important point to note here is that not being aware of the idiosyncrasies of equipment almost compromised this individual’s work. It demonstrates that the expertise of lab members did not move with them as they themselves physically moved across different facilities that were geographically distributed. In other words, expertise was constructed when knowledge-bearing individuals were physically present around the material that made that knowledge relevant.
Thus, not only were lab members unable to move their expertise with them, they also needed to figure out how to transfer their work practices from their existing lab to other labs. Work practices were different between big and small facilities—some facilities allowed additional support in setting up machinery such as a vacuum set-up and other specific scientific equipment, but in smaller facilities this support was not available. Lab members needed to develop interpersonal skills and a strong network of members from unfamiliar host facilities who would help them adopt associated new work practices. When lab members were already familiar with people in a lab they were visiting, it helped them avoid mistakes such as entering restricted spaces or inadvertently altering settings on equipment. During meetings, every time an individual mentioned an upcoming visit to another facility, the lab director would provide at least one name of someone they should contact prior to the visit. In this way, travel to other facilities became an important way for individuals to both broaden their work networks and get their work done.
As a consequence, the ability to operate out of different lab facilities was noted as a sought-after skill—it facilitated and signaled a wider array of skills in operating machinery and a higher likelihood of knowing other scientists in those labs. In meetings, when it was mentioned that a member of a lab was visiting another facility, there would usually be small cheers or expressions of excitement from other lab members. When one lab member was asked in an interview how other group members might perceive her expertise, she specifically mentioned her ability to conduct work at other facilities: “I have done experiments at three or four different facilities … [and] know some of the ins and outs of a lot of different facilities.” Being asked to work in other locations and with specialized machines signaled confidence in one’s ability to work beyond the confines of the home lab and without the direct supervision and support of the lab director. Moreover, a lab member’s range of options in engaging with research projects was dependent on the kind of facility/equipment those projects could access over an extended period of time. Such stakes in access to a facility and its equipment were especially high when the life cycle of experiments extended over years—designing and running experiments, collecting and analyzing data, and potentially repeating experiments if needed.
However, it was not just the need to operate out of a different lab that made lab members reinvent their practices. We observed a lab engineer from lab A working with an undergraduate student. While they both worked to properly assign the right length of targets to their corresponding categories, the lab engineer said, “all right, I don’t know what these people are doing … Let’s halt. Do the 1.5 and then start over.” After spending some more time trying to categorize the targets, the lab engineer said: “I am slightly overwhelmed.” This lab engineer was an experienced senior lab member and yet they were having a difficult time categorizing wires. The complexity associated with the work—in this case the differing settings and requirements of different experiments—necessitated ongoing, emergent learning even for experienced workers.
Another frequent occurrence was the need for individuals to adjust their work to account for material limitations within their own labs, especially when they encountered constraints acquiring specific materials for experiments or accessing computer servers that could run necessary simulations and process data. Such constraints were evident in conversations during lab meetings when individuals mentioned not being able to secure equipment with enough strength, cameras with enough resolution and frames per second, and materials with the right characteristics. In such cases, lab members had to adapt and redesign their approach in order to get needed results. Oftentimes, lab members had to abandon or adjust a plan because it was taking too long; material constraints led to time constraints. This was most apparent when lab members conducted simulation work in which the relevant programming language (Python) was incredibly slow. However, use of Python for simulation work was common across multiple labs and facilities. On multiple occasions individuals mentioned reducing the fidelity of their simulations because the results related to the original planned test were taking too long.
Given time and material constraints, lab members had the option to either persevere through time-consuming approaches to problem solving or abandon them in favor of quicker hacks and/or asking for help. Understanding when to practice which approach was a valued form of judgment they identified in peers. This philosophy manifested directly and indirectly in lab interactions where debates around reusing versus “reinventing the wheel” were common, and abandonment of projects was expected and allowed for in favor of time and research interests. A lab engineer from lab A recounted working with more junior group members who wanted to try to solve all of the problems that arose: I’ve had to intervene many times … There are five grad students that know exactly how to do this thing … [b]ut they won’t ask for help. And part of it is because I think that that’s just how they learn … They all want to reinvent the wheel. And they all can. But sometimes we don’t have time for that … [I]t’s not a matter of them not being able to … it’s a matter of we are on a time constraint.
This observation highlights the type of knowledge valued in this context. Individuals could have dedicated time to understanding each technical issue that arose in detail and worked on developing a reliable solution for that specific breakdown, but this would require halting pursuit of the main task. Instead, experienced lab members advocated an approach that focused on fixing situated problems to allow work to move forward. Senior members also expected lab equipment to break, because such possibilities could not be controlled for. Through encouraging lab members to try things out and ask questions, senior members inculcated the culture of not knowing but figuring it out.
Discussion
Our findings indicate that not-knowing and being uncertain defines the epistemic disposition of organizational members working with complex technologies, no matter their level of experience. However, experts' uncertainty in such settings does not equate to a lack of technical knowledge. We found that lab members were incredibly knowledgeable, by virtue of both hands-on experience and formalized instruction. However, the applicability of their technical knowledge was not always certain or guaranteed. Lab members' uncertainty was often rooted in their work with complex technologies, which (a) reduced the generalizability of their technical knowledge in different contexts, (b) demanded homogenization of interdisciplinary knowledge as physicists engaged with plumbing, electrical wiring, and software programming within the scope of a single technical problem or failure, and (c) imposed intense interpretive and isolating work when causes of technical failures needed to be pinpointed. In this technical work, the practices associated with ignorant expertise represent a paradox that made experts hesitant about the applicability of their situated technical knowledge but also encouraged them to be bold and take calculated risks to address emergent issues.
The value of this mode of work was particularly evident in members’ acknowledgement of their ignorance with regards to future uncertainties. These included uncertainty about what the day would look like when a machine runs or how to stop the resistors from blowing up in subsequent experiments, or how a lab visit would turn out for data collection. These uncertainties meaningfully took shape during the time when members acted on their “theoretical hunches” and developed “a sense of the kind of theories” that could explain “puzzling outcomes” (Schön, 1983, p. 176). This finding is reflective of both ignorance from lack of knowledge and ignorance about existing knowledge (Roberts, 2013). In the following sections, we compare ignorant expertise—as a practice—with existing literature on reflection-in-action (Schön, 1983) to explain why this ignorant way-of-working is effective for experts working with complex technologies.
Ignorant Expertise Operates as a Dialectic of Hesitancy and Boldness
Schön (1983) theorized this process of experts coping with uncertain situations as reflection-in-action, i.e., the practice of engaging in a reflective conversation with the situation. For example, he demonstrated how expert social workers deviated from learned approaches and adopted novel ideas while attempting to address clients' situated needs. While this idea of reflection-in-action broadly defines expert work in terms of practitioners' relationship with a situation, it does not explain practitioners' relationship with their physical and digital instruments, which can be marked by extensive trial and error under time and material constraints. Lévi-Strauss (1966) used the term bricolage to explain a similar process of making do. He explained that a bricoleur manages the work with whatever tools are available at hand and also differentiated this work from the work of an expert engineer who, per Lévi-Strauss (1966), is more methodical than a bricoleur. Our findings suggest that ignorant experts navigate their use of complex technologies through a combination of both methodical and make-shift approaches that make them both experts and ignorant in their work. In the following paragraphs, we build on Schön's (1983) concept of the reflective practitioner and discuss how ignorant expertise better explains the relationship between experts' work and complex technologies.
Yanow and Tsoukas (2009) developed the meaning of a "reflective practitioner" and demonstrated that when professionals are embedded in an unexpected situation, they practice in-the-moment reflectivity to navigate those situations. This concept assumes that practitioners engage with a continually unfolding social situation or a "flow of interaction" (p. 1340) as if practitioners are trying to catch a moving train. However, interactions with complex technologies are markedly different in that technologies do not react or interact unless the practitioner does. For example, a resistor needs to be soldered on a board before it can blow up, and a plug needs to be switched on before an experiment fails. Before ignorant experts act, they have a relatively larger window of time to consider a range of actions and anticipate how objects might behave in response to those actions in a certain complex technical setting. We use the term anticipate, rather than reflect, to emphasize the forward-looking nature of practitioners' engagement with objects and to highlight the uncertainty with regards to the future outcomes of those interactions.
Furthermore, the interactions of ignorant experts with technologies allow repetitive actions under similar conditions; this is not the case with social interactions. In Yanow and Tsoukas's (2009) reflection-in-action and Weick's (1998) organizational improvisation, the social settings continually change and do not allow experts to go back and forth between their choices and work through trial-and-error under the same settings. During this trial-and-error phase, experts encounter persistent failures that shape what they think they know and make them aware of their ignorance; it is a window of time that allows them to pause, wander, and get creative (Corballis, 2015). Thus, it is a sought-after experience. Moreover, the chance to engage in controlled tweaking of scientific equipment elicits a sense of curiosity and excitement; many participants referred to their work as being "fun" or "cool." Ignorant expertise is, therefore, set apart from satisficing (Simon, 1997), which is less often associated with joy in the process.
Collectively, ignorant experts have the time to plan and execute their actions, the ability to repeat them in controlled technical environments, and the option to seek failures that inform their ignorance. These characteristics explain how ignorant experts are able to navigate the paradox of practicing hesitancy and boldness in their work with complex technologies. This notion of paradox is similar to Pickering’s (1995) dance of agency, i.e., a dialectic between resistance and accommodation where resistance refers to repetitive technical failures and accommodations refer to novel approaches to navigate those failures. Resistance, per Pickering (1995), emerges from material agency, and accommodation emerges from human agency; these processes are mangled in a dialectic of scientific practice. In comparison to this macro-perspective on scientific practice, the dialectic of hesitancy and boldness provides a micro-perspective on how experts mobilize their agency (or accommodate against resistance) in technical work.
Ignorant Expertise Manifests as a Communicative Performance
Acknowledging that uncertainty in the work of ignorant experts germinates from a tight coupling of their needs with material artifacts also highlights that the outcomes of interactions are contingent upon intense interpretive and often isolating work. This is an intimate interaction that other organizational members are not a direct party to and only become part of when they are intentionally brought into the loop. Thus, the work of ignorant experts is not open to easy and direct interpretation by peers even when visible failures occur. When broken artifacts linger in a physical space, they don't reveal much about the expertise of those who broke them; the breakage could be either a mistake or a discovery. In other words, the relational aspects of ignorant expertise have to be communicative in nature: ignorant expertise is communicatively adjudicated, performed, and valued in professional work settings.
Communicative performance of ignorant expertise means doing the work itself by visibly engaging in emergent problem solving, asking questions, and acknowledging not having immediate solutions. In order to encourage members to ask questions, visibly fail and fix, and acknowledge not knowing, participants in our research who had more experience working with machinery in the lab facility actively worked towards making the lab environment kind, supportive, and inclusive. This influenced lab members’ willingness to engage in practices associated with ignorant expertise. Literature on the work of programmers in software organizations supports these findings and suggests that programmers—as ignorant experts—so intensely value interpersonal rapport that they tend to move with their teams when they switch jobs (Weinberg, 1988).
This finding contradicts the typical way that distinctions between technical and relational skills in engineering and scientific studies are often described, with technical skills and knowledge usually being privileged above and beyond what may be considered the “social” side (Bucciarelli, 2008). While negotiating “the meaning of ambiguous concepts” (p. 88) and communicating their expertise, planners and engineers collaborating on urban planning projects were motivated to use discursive techniques that minimized conflict (Woo et al., 2022, p. 90). Roberts (2013) asserted the value of developing organizational cultures that can recognize the resourcefulness of ignorance. Interpersonal rapport and relationship building thus become dominant components of communicating and cultivating ignorant expertise.
Other extrinsic components of performing ignorant expertise point to thinking out loud and storytelling as ways to visibly wrestle with tedious problems. Telling stories about how machine components were built, how computer code worked, and how materials were secured was a substantial part of emergent problem-solving and negotiating new practices. Research has shown that knowledge is often shared and demonstrated in communities of practice through stories, references to situated problem solving, and thinking out loud during real time problem solving (Olson et al., 1984; Orr, 1996). This understanding is challenged when time taken to solve a problem, and/or representations of work (i.e., reports and presentations) indicate expert performance. In the work of ignorant experts, indicators of expert performance can vary across situation and individuals; further, generalizable indicators such as experience, work outputs, and even educational background could become less relevant in communicating ignorant knowledgeability (Sharma et al., 2022). For example, sometimes quick hacks communicate the intuition deemed necessary to be an expert; at other times, long and stretched out battles (and stories about those battles) demonstrate the perseverance needed to work as an expert.
Ignorant Expertise Establishes Technological Complexity as a Catalyst for Organizing
As workplaces become technologically advanced and are dominated by tools (e.g., artificial intelligence, algorithmic management) that demand ignorant expertise, workers are more likely to experience uncertainties rather than frictionless use of technologies in work. This is not to say that such technologies become useless. However, at the moment workers decide to engage with them, it might not be entirely clear how the technologies will serve or challenge their immediate needs.
In appraising and theorizing the role of technologies in work settings, literature has often focused distinctly on existing knowledge about what activities are permitted or available to users in a particular environment and how they can be accomplished (DeSanctis & Poole, 1994). For example, the implications of tools such as Slack and Zoom for communication and organizing in professional work settings are understood in terms of their features and the existing knowledge that users have with regards to using those features. However, when technologies are complex enough to not present their features to users, their implications for work exist despite not being directly evident. In such cases, it becomes critical to acknowledge that technological complexity, as opposed to technological functionality, is consequential for how work happens.
In understanding the relationship between technologies and work, the problematic focus on technological functionality can be partly explained by the interchangeable use of the terms "function" and "material agency," along with the assumption that "what technology is does not change across space and time" (Leonardi, 2012, p. 37). As demonstrated in this research, when technology infrastructures used in organizational settings include multiple moving and interdependent parts, organizational members are unable to experience spatial and temporal stability. The term unstable technology does not mean that material develops intentionality of its own (cf. Pickering, 1995). However, materials can interact with other materials around them, often without human awareness or intentional intervention. For example, smartphones can automatically latch on to an internet connection and sync phone data on cloud servers, and online purchases made from a mobile device can inform advertisement algorithms running on multiple devices on that IP address. In other words, boundaries around stable and unstable technological infrastructure can begin to blur for organizational members' work practices.
As actions and properties associated with technologies become difficult to pin down, definitions of material agency "operationalized as the actions that a technology takes" and materiality "as properties of an object" (Leonardi, 2012, p. 36) become limiting. Instead, material agency then gets exercised through material's capacity to evolve spatially and temporally by reacting and integrating with its environment, and social agency is exercised through ignorant, yet knowledgeable, actions. That is, the complexity of technology creates inherent uncertainty about operational capabilities and outcomes, shifts organizational members' goals, and dissolves an active realization of the technological functionality that motivated those goals. Thus, ignorant expertise, as a work practice, implores us to investigate the varied ways in which technological complexity implicates communication and knowledge sharing in organizations.
This insight is particularly relevant as work becomes increasingly dependent on complex technologies that have relatively simpler and more usable interfaces. For example, complex AI technologies such as ChatGPT keep their complexity and instability hidden and invisible to their users. Organizational members could use ChatGPT in their day-to-day work and not be aware of how large language models technically work. Users can operate the technology to complete tasks while remaining ignorant of how the technical design of such models informs aspects of the privacy, security, and legality of work. With this gap in the perceived and real complexity of advanced technologies, uncertainties associated with working with a complex technological ecosystem become central to how the relationship between technologies and organizing ought to be theorized (Dougherty & Dunne, 2012). Our goal with this research study has been to begin this theorizing by centering technological complexity as a catalyst for work practices that have long held the attention of organizational and communication scholars.
Limitations
At the beginning of our research, we only had access to two labs: lab A and lab B. Because we were unable to conduct in-person observations during the pandemic, we recruited members for interviews from an additional lab (lab C) but were unable to obtain site access for this lab. Collectively, we remotely observed lab A and lab B and did in-person shadowing of lab A in Fall 2022. The interviews and online observations conducted early on in the study significantly shaped the interpretive lens with which we entered the field for in-person observations. While we would have preferred to engage with the work of these communities to develop a comprehensive and rich understanding of their day-to-day practices, our remote observations and in-person shadowing sessions confirmed the work practices we identified from interview data.
Conclusion
Within organizational contexts relying on complex technologies, uncertainty is inherent and inevitable to the work of ignorant experts. Technologies such as ChatGPT undermine the assumption that those who use—and even create—these technologies can reason through how they function or can predict how they might respond to certain prompts. This manuscript outlines the practices associated with ignorant expertise, highlights that organizational members could be ignorant experts no matter their level of experience, and centers the importance of interpersonal rapport and relationship building in the successful performance of ignorant expertise. Overall, the purpose of this research has been to demonstrate that technological advancements call for a refinement of theoretical arguments that equate functionality with material agency, and to invite organizational scholars to investigate organizational practices around recruitment, development, and maintenance of ignorant expertise.
Supplemental Material
Supplemental Material - Complex Technologies and Ignorant Expertise: The Communicative Value of Not Knowing but Figuring it Out
Supplemental Material for Complex Technologies and Ignorant Expertise: The Communicative Value of Not Knowing but Figuring it Out by Nandini Sharma, Jeffrey W. Treem, Megan Kenny Feister, and William C. Barley in Management Communication Quarterly.
Footnotes
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was supported by the National Science Foundation.
