Abstract
One of the most widely used and controversial qualitative research methods among researchers is Grounded Theory (GT). Although the basic concepts of GT are widely known by qualitative researchers, the practical application of the method is not always clear, and sometimes we encounter contradictions and uncertainties. The objective of this article is to provide a data analysis strategy based on GT. It is hoped that this knowledge will provide an aid or resource for both novice and experienced educational technology researchers. To this end, a conceptual overview of the field and the methodological route are presented. A broad and updated theoretical base is used to describe the conceptual and structural aspects of GT, to describe steps to follow when applying it, and to highlight the use of analytical tools that help the researcher make decisions and implement GT.
Keywords
Introduction
One of the most widely used and controversial qualitative research methods among researchers is Grounded Theory (GT). The defining element that differentiates GT from other types of qualitative methods of analysis is its focus on the generation of theories or theoretical models that explain, confirm and/or develop the social phenomena under study (Rodríguez et al., 1999). These theories are the result of systematically collecting and analysing data taken from real life, and subsequently interpreting the content of the data (Strauss & Corbin, 2002). In this way, GT allows us to truly understand “the meanings and actions of research participants, offer abstract interpretations of empirical relationships and generate conditional statements about the implications of their analysis” (Charmaz, 2005, p. 508), aspects that other methods of analysis do not allow us to do.
However, this interpretation of reality is not the result of a process of intuitive discovery, but rather GT providing a systematic process of theory building and decision making throughout the study (Charmaz, 2014). Thus, GT makes qualitative analysis procedures explicit and helps researchers to develop useful conceptualizations of the data (Cuesta, 2006). In this way, GT itself is both a theory-building and a theory-elaborating technique (Murphy et al., 2017).
Far from being a simple induction exercise, GT is based on four basic principles that make it possible to generate a theoretical model, understand the phenomenon in-depth, and obtain a meaningful guide for action (Charmaz, 2014; Murphy et al., 2017; Watling & Lingard, 2012): emergence, theoretical sampling, constant comparison, and theoretical saturation. The principle of emergence or data identification states that the researcher always remains open to new realities and phenomena that emerge in the course of data collection and analysis. This type of method is particularly suitable for studying unexplored, contingent, or dynamic phenomena (Charmaz, 2008). Theoretical sampling involves making decisions or defining criteria for selecting new sources of information (individuals, cases, organizations, situations, etc.), while simultaneously collecting and analysing data, either in terms of constructing, validating, extending, or correcting an emerging theoretical model (Prigol & Behrens, 2019). Constant comparison involves continuously analysing and examining the data collected, along with existing literature, in an effort to construct a theory of social reality (Strauss & Corbin, 1990). In the process of interpretation carried out in the constant comparative method, different procedures for dealing with a text can be distinguished. The data are categorized and grouped according to their properties (open coding), and then the researcher organizes the resulting categories into a pattern of interrelationships (axial coding), which represents the emergent theory and explains the process or phenomenon under study (selective coding) until theoretical saturation of the data is reached. Finally, theoretical saturation is a procedure of the constant comparative method that addresses the question "When do I stop data collection and analysis?".
Rather than using an arbitrary number of sources and data points, the logic of theoretical saturation is that the researcher continues to collect and analyse data until it is deemed no longer possible to find new information relevant to the study (Glaser & Strauss, 1967).
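As a rough illustration only, the stopping rule of theoretical saturation can be sketched as a loop in Python. The functions `collect_batch` and `extract_codes` are hypothetical stand-ins for the researcher's fieldwork and open coding; the `patience` threshold is likewise an invented parameter, since GT itself specifies no fixed criterion beyond "no new relevant information emerges":

```python
def analyse_until_saturated(collect_batch, extract_codes, patience=2):
    """Keep collecting and coding data until no new codes emerge.

    `collect_batch` and `extract_codes` are hypothetical placeholders
    for the researcher's data collection and open coding; this sketch
    only illustrates the saturation stopping rule, not GT analysis.
    """
    known_codes, stale_rounds = set(), 0
    while stale_rounds < patience:           # stop after `patience` rounds with nothing new
        batch = collect_batch()              # e.g. one more interview or document
        new_codes = set(extract_codes(batch)) - known_codes
        if new_codes:
            known_codes |= new_codes
            stale_rounds = 0                 # fresh concepts emerged: keep sampling
        else:
            stale_rounds += 1                # nothing new this round
    return known_codes
```

In practice the decision is an interpretive judgement, not a mechanical count; the sketch simply makes explicit that sampling continues as long as analysis keeps producing new concepts.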
Other key and fundamental elements of the model are: (a) Question formulation: it is necessary to formulate a question that is broad enough to accommodate the phenomenon in its various perspectives. The question should also take into account the type of qualitative approach that will be used and that will provide a more in-depth understanding of the phenomenon (Strauss & Corbin, 2002); (b) The iterative process: the GT strategy offers a flexible and recurrent approach, allowing the researcher to move backwards and forwards through the data at various points in time in order to constantly compare codes, categories, and concepts (Contreras et al., 2020). A spiral approach to all actions is articulated, enabling the identification of new realities and the refinement of the theory (Prigol & Behrens, 2019); (c) The technical literature review: a guide to the research process, which provides insights into key aspects of the subject matter that will help to make appropriate decisions in theoretical sampling. It also informs the researcher about the different properties and categories that characterize the phenomenon and that otherwise might not be noticed or identified at later stages of the process (Strauss & Corbin, 2002); (d) Emergent coding: allows information to be organized and meaningful units of analysis to be discovered as the collected data are studied (Belgrave & Seide, 2019). The researcher categorizes these units, interprets them, regroups them, assigns them sub-categories, and forms a logical structure endowed with meaning, which is the first theoretical approach (Matteucci & Gnoth, 2017); (e) The matrices: these are tables that allow the properties of data and categories to be compared. They enable similarities, differences, patterns, and regularities to be observed. Initial categories are regrouped, or new categories are defined in order to elaborate a richer and more explanatory theoretical construction (Conde, 2009). 
(f) The construction of concept maps: these are the main outcome of the analysis process, and they establish the relationships between the ideas that have emerged from the categories and dimensions. Conceptual maps are the basis of theoretical development and make it possible to represent the relationships identified between the data and, in this way, to theorize about the phenomenon based on the conceptual network shown by the mapping (Conde, 2009).
However, despite the fact that the basic concepts of GT are widely known by qualitative researchers, its practical use in studies presents some difficulties (Murphy et al., 2017). There are multiple ways of understanding and applying Grounded Theory, and within each approach, different types of interpretation are used. Since the publication of Glaser and Strauss' book in 1967, three main versions of Grounded Theory have emerged (Charmaz, 2011; 2014; Glaser, 1993; Kelle, 2019; Matteucci & Gnoth, 2017; Strauss & Corbin, 1990): On the one hand, Glaser, more orthodox, insists on the generation of theories from the primary data obtained by the researcher. On the other hand, Strauss adds new instruments of analysis, such as the interpretive description of data, axial coding, diagrams, the matrix, or the use of computer programs (Abela et al., 2007). Finally, Charmaz, with a constructivist theoretical basis, focuses on allowing data topics to emerge organically. Her proposal is a redesign of the model, through a systematic approach that encourages the integration, as a priority, of the subjective experience of the researcher and the social conditions specific to the subject of study (Bonilla-García & López-Suárez, 2016; Charmaz, 2011). Therefore, although there has been important development of this qualitative methodology, its complexity does not favour researchers' understanding of it in sufficient depth (Contreras et al., 2020) and, in addition, it exhibits some ambivalences or vagueness (Prigol & Behrens, 2019).
The aim of this article is to describe and model the process of applying GT in educational technology research by describing the different components of the procedure simultaneously with the methodological practices applied in an educational technology study. It also seeks to assess the applicability of GT in educational research and to provide a roadmap to help other researchers use GT.
This contribution is not developed from the orthodoxy of the method, as we consider GT to be an instrument of data analysis that leads to theory elaboration. Consequently, the strategy described is directed towards Strauss' interpretive approach to GT, which is more open and flexible. As such, those tools and strategies that have been deemed suitable have been selected and used, and it is hoped that this knowledge will provide an aid or resource for researchers using GT. However, the intention is not to provide a “recipe” for how to automatically “build theories”. In fact, one of the strengths of GT is its versatility, i.e. its adaptability to the most appropriate decisions for each study. For this reason, it is advised that the proposed strategies should be seen as useful tools rather than a single path for all GT studies.
Method
Context of the Research
Since this study is supported by the methodological process applied in a qualitative study on educational technology, entitled “An Analysis of the Education Policies for the Integration and Use of ICT in the Educational System of Extremadura (Spain) and their Effects on Educational Innovation” (hereafter, PolEx-ICT) (Sosa-Díaz & Valverde-Berrocoso, 2015; 2017; 2020; 2022; Valverde-Berrocoso & Sosa-Díaz, 2014), we present an explanatory outline of the phases of analysis development, with a description of the different procedures, tools, and strategies that were carried out. The study analysed the phenomena produced by the introduction of Information and Communication Technologies (ICT) in educational practice and was carried out in schools selected for the quality of digital technologies integration processes in their curricula. The analysis includes the organization of technological resources, coordination between the different educational agents, teacher training and assessment, and ICT integration in the different institutional documents and in the school’s management processes.
Objectives and Research Questions
The general purpose of GT research should be to analyse and understand processes, thus generating a formal model or theory of the phenomenon (Flick, 2019). The general objective of the PolEx-ICT study, in which GT was used, was to delve into the processes of ICT integration in primary and early childhood education schools, in such a way that the current state of the process of ICT integration in general, and the phenomena that are occurring around it, can be understood. A series of specific objectives were also defined: (a) to explore and analyse the process of ICT integration in primary schools, identifying the factors and agents that influence it; (b) to examine the educational practice through ICT developed in the schools; (c) to consider and analyse the perspectives, opinions, attitudes, and digital competences among the educational community; (d) to understand how the organizational dynamics of a school influence the process of ICT introduction and use; and (e) to study the policy model of ICT introduction in the Autonomous Community of Extremadura (Spain).
For GT, it is not enough to merely describe reality and facts; one must understand the phenomena in order to extract meaning from the observed processes. Researchers commonly preserve the open and emergent approach of GT by setting two or three specific research questions that initially guide the attention of the research process. Their purpose is to “explore” in order to understand the phenomenon rather than to “confirm” any preconceived ideas (Murphy et al., 2017). These questions are not only the point of reference during data collection and report writing, but they also provide a guide for making sense of previous results and clarify the usefulness of potential findings. Therefore, the questions elaborated in GT are usually along the lines of: “What is happening in this context?” and “What phenomena are taking place?”. As answers are obtained, decisions about theoretical sampling are made.
Research Questions by Grounded Theory Typologies in PolEx-ICT.
ICT = information and communication technologies
At the same time as analysing the data from the PolEx-ICT case studies and building a theory, the research questions were reformulated based on these results and new questions were posed that required different types of data. For this reason, the study required a period of three years (Abela et al., 2007).
Literature Review
It is recommended that research questions are categorized by dimensions or themes that are obtained from a literature review (Stake, 1998). In this way, the researcher has a conceptual structure to focus on the relevant aspects. This review can be undertaken via two approaches (Murphy et al., 2017): the Tabula Rasa approach, in which existing literature should be "left out" or partially ignored in the early stages (Gioia et al., 2012), and the Tabula Geminus approach, which allows for the early integration of existing literature and informant-generated information (Kreiner, 2015). It is naïve to think that it is possible to approach a phenomenon without any prior information or preconceptions (Kreiner, 2015); pre-existing theoretical and empirical knowledge plays a fundamental role. However, the researcher must keep constantly updating the scientific literature so that prior knowledge is integrated or challenged during continuous data analysis (Flick, 2019). Additionally, three phases are identified in the literature review (Thornberg & Dunne, 2019): (1) the initial phase: the main interest is becoming familiar with the topic and focusing on understanding the existing empirical findings. It is necessary to adopt a sceptical stance towards any theory so that we do not become prejudiced by the literature, and to keep its limitations in mind in order to maintain cognitive openness; (2) the ongoing phase: the researcher can look for empirical studies that relate to the first findings that emerge from data collection and analysis, which helps to identify existing theories and concepts that support or contradict the findings; and (3) the final phase: the researcher seeks to contextualize the constructed GT in relation to established theoretical ideas and to identify theoretical reference points against which to compare and contrast the data (Gioia et al., 2012).
According to these three phases, in the research design of the PolEx-ICT study, based on Grounded Theory, a literature review was conducted before the start of each theoretical sampling/research study (Figure 1). First, in order to become familiar with the topic, the researcher looked for the latest scientific manuals on educational technology and ICT integration in education. Second, the researcher searched databases (Eric, Scopus, and WOS) for scientific articles on empirical studies whose main objective was the study of organizational aspects within the integration of ICT in primary schools. And third, the researcher searched databases (Eric, Scopus, and WOS) for Systematic Literature Reviews (SLR) on the topic under study, which would allow the data and the emerging theory to be compared and verified. This provided the state of the art that contributed to the decision-making process in the selection of cases, the development of research instruments, and the formulation of new research questions during the process of analysis of the three research studies (Strauss & Corbin, 2002).
Figure 1. PolEx-ICT research design based on the theoretical samplings.
Theoretical Sampling
Theoretical sampling is the starting point for GT research. It establishes the criteria for selecting people, cases, or situations from which to collect and analyse data (Prigol & Behrens, 2019), as well as for making appropriate decisions about expanding the sample when more information is needed (Glaser & Strauss, 1967). Theoretical sampling is integrated and interconnected with the research process and occurs simultaneously with data analysis, making it the most complex and important component of the study (Morse & Clark, 2019). The data collection process generates the emergent theory from the moment that information collection, coding, and analysis determine what data will be needed next to answer the research questions and on which samples should be drawn. As key concepts emerge and categories and properties take shape, the researcher requires more data from both new subjects and different contexts in order to understand the phenomenon in-depth and continue theory development. Therefore, as the analysis matures, sample selection strategies and criteria will change accordingly (Morse & Clark, 2019).
There are several types of sampling depending on the timing of the research process and the objectives set (Urquhart, 2019). The basic theoretical sampling types are (Strauss & Corbin, 2002): (a) open sampling: the researcher does not set any limitations on data collection in relation to informants (individuals, groups), contexts (spaces, places), or time (moments, duration). They are initially guided by the questions that the study is based on and that allow them to make an initial sample and instrument selection for data collection. However, sampling remains flexible and responsive to the specific characteristics of the moment, the context, or the needs detected; (b) relational and variational sampling: the researcher identifies incidents that represent the various manifestations of a concept and the relationships with other concepts. In this way, the researcher can define the most relevant dimensions or categories for the study; and (c) discriminate sampling: the researcher selects contexts, groups, individuals, and documents that allow for an in-depth comparative analysis. The researcher often has to revisit sites, re-contact people, and re-read documents in order to obtain the data necessary to saturate the categories and complete the analysis. In this way, the developing theory can be validated, compared, and contrasted and any necessary modifications or additions can be made.
Theoretical sampling requires researchers to access the field repeatedly at different times. Prolonging the research process in this way is a limitation that must be taken into account as researchers, in order to exploit the full potential of theoretical sampling, must analyse the data before returning to the context of the study (Matteucci & Gnoth, 2017). Consequently, in the study design, each theoretical sampling should be associated with a time or phase of the process in order to establish a purpose for each sampling. However, we must not forget that GT implies a flexible design that is adapted to the evolution of concepts in the process of theory building and confirmation (Morse & Clark, 2019).
Therefore, taking into account the different phases of theoretical sampling, the PolEx-ICT GT model conceives study design as a spiral (Figure 1), through which the three types of sampling (open, relational and variational, and discriminate) are developed, and in which the type of groups, informants, or new settings to be explored in order to shape the theory are determined (Trinidad et al., 2006).
Decisions Concerning the Sample
Types of Case Studies Developed in PolEx-ICT.
Data Collection Instruments
In order to meet the objectives set out in each of the studies, according to each type of theoretical sampling, different data collection instruments were used. The first study, following open sampling, had the widest variety of data collection instruments (institutional documents, teaching materials, classroom observations, individual semi-structured interviews, and group discussions). The second and third studies, following relational and variational sampling and discriminate sampling respectively, focused on conducting individual semi-structured interviews with different educational agents, as described below.
Data Collection Instruments and Informants According to the Type of Sampling and Study in PolEx-ICT.
aFour primary schools participated in study 1.
bOne primary school participated in study 2.
cFour primary schools participated in study 3.
Informants
Informants are made up of a selection of people linked to the phenomenon to be studied who can offer relevant information from their own particular roles and experiences. Informants are also considered to be any type of textual, graphic, or audio-visual document with valuable information for the purposes of the study, as well as certain “settings” where interactions and exchanges between the subjects take place (e.g. a classroom).
Therefore, according to the theoretical sampling developed at each point in time, those people who meet certain criteria or requirements will be selected. In PolEx-ICT, teachers who frequently use ICT, teachers belonging to the school’s organizational structure (management team or ICT coordinators), students, families belonging to the Parents' Association (PA), and other external agents such as advisors from the Centre for Teachers and Resources (CTR), which is a body for the continuous training of teachers, were selected.
PolEx-ICT used a variety of data collection processes, including techniques and instruments used in the different phases of the study and its informants (people, documentation, and settings).
Results
Constant Comparative Method
GT is, in essence, a comparative method. Glaser and Strauss integrate coding and theory generation in a systematic way through an analytical procedure of constant comparison, developing categories and properties (Abela et al., 2007). Thus, the constant comparative method consists of the search for similarities and differences by analysing the incidents contained in the data, through the simultaneous processes of coding and analysis, with the purpose of systematically generating a theory (Trinidad et al., 2006).
It is difficult to separate coding from other elements of analysis in GT, let alone establish an order in the elements, especially as, in some cases, they are carried out simultaneously and iteratively (Belgrave & Seide, 2019). However, there must be a starting point, which is why it is useful to establish phases that define the method as steps in a process that is by no means linear. The constant comparative method is developed in four stages: (a) comparison of the data; (b) integration of each category with its properties; (c) delimitation of the developing theory; and (d) saturation of the incidents specific to each category and construction of the theory (Abela et al., 2007; Rodríguez et al., 1999; Strauss & Corbin, 2002; Trinidad et al., 2006).
Phase 1: Open Coding
As data are collected, researchers normally transcribe and code them using the principle of constant comparison. Coding is the phase in which each piece of text (a "quotation") is assigned an indicator or code specific to the category in which it is considered to fall, and is classified by concepts and meanings. Charmaz defines the process of coding as "attaching labels to data segments that represent what each segment is about" (2006, p. 3).
Thus, the analyst’s initial task is to code each event to form as many categories of analysis as emerge from the data (Abela et al., 2007; Charmaz, 2006). This procedure consists of fragmenting, examining, comparing, and conceptualizing information and concrete data from institutional documents, observations, interviews, and group discussions, which is carried out by reading each transcript and attaching “codes” (essentially just labels) to fragments of text to reveal meaning. Codes can be applied to text segments of any length, as long as the segment conveys a coherent idea. However, to begin with open coding, it is recommended to conduct a line-by-line analysis, or microanalysis, which involves a detailed and thorough study of the data, sentence-by-sentence and sometimes word-by-word (Prigol & Behrens, 2019; Strauss & Corbin, 2002).
This whole process of microanalysis and coding the different segments of meaning should be guided by analytical questions such as: “What does the data suggest?”, “What is the central idea of this paragraph or sentence?”, “What is the text talking about?”, “What is happening in the text?”, and “What concept does this text extract suggest?” (Charmaz, 2017). Following this, a name or code is assigned to it; as the codes begin to accumulate within the dimensions determined at the beginning of the study, the study proceeds to classification or categorization under more abstract explanatory terms, i.e. into categories (Strauss & Corbin, 2002). Subcategories then usually emerge from the categories, giving greater specificity to the category and forming what we can call a family of codes (Matteucci & Gnoth, 2017).
It should be noted that the researcher can not only code text fragments, but can study and understand emotional states, attitudes, and settings, as well as the participants' silence (Prigol & Behrens, 2019). Furthermore, several codes or labels can be assigned to the same text segment if the researcher considers that it can contribute information to different categories (Kelle, 2019). When using the Tabula Geminus approach (Kreiner, 2015), code names can be designated with vocabulary from existing literature. In contrast, if the researcher follows the Tabula Rasa approach (Gioia et al., 2012), literature-based terms are avoided in the coding (Murphy et al., 2017). Especially useful during the coding process are in vivo codes, or terms that are used by informants during the interview and become part of the coding structure (Belgrave & Seide, 2019; Murphy et al., 2017; Vegas, 2016).
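The bookkeeping of open coding described above (text segments carrying one or more codes, and codes rolled up into more abstract categories or code families) can be mirrored in a minimal data structure. This is an illustrative sketch only: the code names and the `code_to_category` mapping in the example are invented, not the PolEx-ICT coding scheme.

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class CodedSegment:
    """One fragment of transcript with the open codes attached to it.

    A segment may carry several codes if it contributes information
    to several categories.
    """
    text: str
    codes: list = field(default_factory=list)

def group_into_categories(segments, code_to_category):
    """Roll open codes up into categories (families of codes).

    `code_to_category` is the researcher's evolving mapping from
    concrete codes to more abstract explanatory categories.
    """
    families = defaultdict(list)
    for seg in segments:
        for code in seg.codes:
            category = code_to_category.get(code, "uncategorised")
            families[category].append(seg.text)
    return dict(families)
```

For instance, two segments coded "technical support" and "teacher training" could be grouped under hypothetical categories such as "School organisation" and "Professional development"; the point is only that one segment may feed several families, exactly as the text describes.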
Example of Content Analysis Memorandums (Memos).
ICT = information and communication technologies
Tools to Facilitate Microanalysis and Coding
A) Memos: Memorandums (memos) are informal analytical notes written throughout the study regarding the data collected. There are no standards for the production of memos; they are usually constructed using informal language for personal use, as the aim is to record insights, explorations, and discoveries of ideas about what was seen, heard, perceived, and coded that can guide and assist in theory building (Prigol & Behrens, 2019). The production of memos is an ongoing process that involves documenting and collecting thoughts and interpretations about the data, explanations of concepts and categories, and also the directions in which the analysis should proceed (Abela et al., 2007; Murphy et al., 2017; Trinidad et al., 2006). It can be argued that memos are the first stepping-stone for theory building, as they help to guide and redirect the researcher in this process (Charmaz, 2006; Strauss & Corbin, 2002; Trinidad et al., 2006).
Each researcher must find their own style of memo writing, as long as these notes are made in an orderly, progressive, systematic, and easily retrievable manner, thus providing the researcher with a bank of analytical ideas classified and grouped according to the evolving theoretical framework.
Example of Classification of Memos by Dimension and Research Study in PolEx-ICT.
Each memo encompasses the whole evolutionary process of the constant comparative method for one dimension and, therefore, the different coding procedures (open, axial, and selective). Accordingly, all the memos contain the following elements: (a) content analysis of the sources, i.e. interpreting each of the categorizations that have been made with the support of WebQDA, a qualitative, web-based data analysis software, classified by codes, and carrying out the microanalysis code by code, interview by interview, and categorization by categorization; (b) diagrams, matrices, and explanatory tables, which include the ideas that appear in the microanalysis organized into different representations, and that indicate the frequency of appearance of the same idea and the role played by the informant; (c) central ideas, i.e. a selection of the most relevant concepts and the relationships established between them; (d) concept maps, which are the graphic representation of the relationships between the concepts identified in the research process; and (e) questions, where those that have arisen from the data analysis and new questions that will guide the next phase of the study are formulated. This identifies those categories that are considered saturated and for which no further information needs to be collected.
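The five memo elements listed above could be mirrored in a minimal record structure, shown here purely as an illustration. The field names and the saturation check are invented for this sketch and are not the PolEx-ICT memo template:

```python
from dataclasses import dataclass, field

@dataclass
class Memo:
    """Minimal record mirroring the five memo elements described above.

    Field names are illustrative, not the PolEx-ICT template.
    """
    dimension: str
    content_analysis: list = field(default_factory=list)  # (a) code-by-code interpretive notes
    representations: list = field(default_factory=list)   # (b) diagrams, matrices, tables
    central_ideas: list = field(default_factory=list)     # (c) key concepts and relationships
    concept_map_refs: list = field(default_factory=list)  # (d) links to concept maps
    open_questions: list = field(default_factory=list)    # (e) questions guiding the next phase

    def is_saturated(self):
        # Hypothetical heuristic: treat the dimension as saturated
        # when no questions remain open for further data collection.
        return not self.open_questions
```

Keeping memos in such a structured, retrievable form is one way to satisfy the requirement that notes be "orderly, progressive, systematic, and easily retrievable", whatever the researcher's personal writing style.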
B) Use of software for qualitative data analysis: When coding each text segment of each of the transcribed documents, if the amount of information is abundant, software is called for that allows these codes to be assigned and grouped easily. Experienced classical grounded theorists continue to hope for software that can replicate the analytical skills needed to analyse and code texts. However, this is not possible: no software will do the researcher's GT analysis for them. Nevertheless, software can be useful for managing and organizing large amounts of information. It also facilitates the task of coding and allows for a consistent and rigorous analysis (Friese, 2019), making the process more transparent and data management easier (Flick, 2004). Thus, software is an aid in the process, but the analysis should not centre on the functions it provides, as software does not perform the interpretative work on the data, and its contribution to the construction of concept maps and the delimitation of theory is limited.
Phase 2: Axial Coding
Axial coding requires an in-depth analysis of a category in order to uncover interactions and relationships with other categories, subcategories, and properties (Strauss & Corbin, 2002). Axial coding enables data to be recomposed and gives coherence to the analysis and to the emerging theory, pointing out sub-dimensions and properties within a context, allowing for more precise explanations that respond to the phenomenon with questions such as: when, where, why, who, how, and with what consequences (Prigol & Behrens, 2019).
Procedurally, Strauss and Corbin (2002) point out four basic tasks in axial coding: (a) matching the properties of a category and its dimensions, a task that begins during open coding; (b) identifying the variety of conditions, actions, interactions, and consequences associated with a phenomenon; (c) relating a category to its subcategories by means of sentences that denote their relationships to one another; and (d) searching for clues in the data that denote how the central categories can be related to one another.
Example of an Organizational Table of Ideas in PolEx-ICT.
ICT = information and communication technologies
The construction of matrices is a very useful method for creating and validating hypotheses (Conde, 2009). In this procedure, the most common data, and from whom they originate, are made explicit. This makes it possible to answer the questions of axial coding: "Why does it happen?", "Where/when does it happen?", and "With what/whom?" (Strauss & Corbin, 2002).
Once conclusions have been drawn from these data, it is necessary to arrange and present them in an orderly fashion. Thus, axial coding concludes with the outline of a diagram or model called a coding paradigm, which shows the relationships between all the elements, concepts, and categories (Hernández et al., 2006). Defined by Strauss and Corbin as "graphic representations or visual images of the relationships between concepts" (1990, p. 198), this procedure has helped to articulate information by creating new relationships between concepts.
Concept maps can be used as the instrument to represent these diagrams, as they are well suited to visualizing the central concepts, illustrating the relationships established between them (Conde, 2009), and even showing the mobility process observed throughout the different stages of the research. These concept maps can display the concepts (inside “boxes”) that express the themes and categories found, the relationships (represented by lines) between themes and categories, the type of relationship (represented by the word-link or connector) established between themes and categories, as well as the time of analysis (research study) in which a theme or category appears, represented by the colours of the concepts (Figure 2).
Example of a Definitive Concept Map in PolEx-Information and Communication Technologies.
Thus, throughout the axial coding process, concept maps can be progressively constructed for each dimension. As the analyses and the different studies progress, the concept maps are modified, shaping the analysis, improving the understanding of the data, and facilitating the subsequent drawing of conclusions.
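As a complement to graphical tools, the structure of such a concept map — concepts, labelled links, and the study in which each concept appeared — can also be stored as a small labelled graph, which makes it easy to update as analyses progress. The following Python sketch uses invented concept names purely for illustration:

```python
# A concept map as (concept, word-link, concept) triples: the word-link is
# the connector that labels the relationship between two concepts.
concept_map = [
    ("digital competence", "is developed through", "teacher training"),
    ("teacher training", "is conditioned by", "institutional policy"),
]

# Colour coding: the research study in which each concept emerged (hypothetical).
appeared_in = {
    "digital competence": "study 1",
    "teacher training": "study 1",
    "institutional policy": "study 2",
}

def neighbours(concept: str):
    """Concepts directly linked from the given one, with their connector words."""
    return [(link, target) for source, link, target in concept_map
            if source == concept]

print(neighbours("teacher training"))
```

Because the map is just a list of triples, modifying it between studies — adding a concept, relabelling a connector, recolouring a node — is a matter of editing one entry, which parallels the progressive reshaping of the maps described above.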
Phase 3: Delimitation of the Emerging Theory That is Beginning to Develop
The analyst ends up with coded data, categories, memos, and a possible theoretical postulate shown in a concept map. As a result of the constant comparison of the categories, the emerging theory is modified and becomes more consistent. The ideas reflected in the memos concretize the meanings of the categories and their relationships. The annotations in the memos are the main support for theoretical or selective coding (Abela et al., 2007; Glaser et al., 2005; Strauss & Corbin, 2002), whose main objective is to “reweave the fractured story” (Glaser & Strauss, 1967).
Thus, once the theoretical scheme has been generated from the concept map and the interpretations made in the memos, the researcher returns to the units or segments and compares them with the emerging scheme in order to substantiate it, delimit the categories, and outline the emerging theory (Hernández et al., 2006). Selective coding then begins in order to integrate and refine the theory (Strauss & Corbin, 2002). To carry out the selective coding process, it is recommended to produce a theoretical writing, which describes the relationships between the categories around a central concept, as well as the process or phenomenon, in the light of the theory (Strauss & Corbin, 2002). It can be affirmed that the theory constructed thus far is of medium scope but has a high explanatory capacity for the whole of the data collected (Hernández et al., 2006).
The theoretical writing of PolEx-ICT was developed in several parts, each of which constitutes a research study and develops the different dimensions and categories that emerged throughout the research process. In each of these documents, it can be seen how the concepts emerge, how the data are developed, and how the central categories are generated. Finally, a report articulating all the results of the research is written.
Example of Central Ideas Extracted from an Organizational Table in PolEx-Information and Communication Technologies.
Phase 4: Saturation of the Incidents Specific to Each Category and Construction of the Theory
The researcher has to apply constant comparison at the end of the analysis process for each of the research units. It is therefore necessary to determine whether or not the set of categories has become saturated, which guides the researcher on how to proceed in the next steps of the study. As a result of the constant comparison, the researcher may see the need to initiate a new study, address what happens in certain settings, or identify which individuals can provide new information.
Theoretical saturation occurs when no new properties of the category emerge from the data, i.e. more information does not add anything new or relevant (Glaser & Strauss, 1967), and when no additional data are found to identify new dimensions, codings, actions/interactions, or consequences (Strauss & Corbin, 2002). As noted above, at the end of the memo for each of the categories, a process of formulating and identifying new research questions to guide further phases of the study is undertaken. The researchers involved brainstorm research questions on the concepts emerging from the concept map elaborated during the axial coding phase, pointing out those categories that are considered “saturated” and on which no further information needs to be collected (Figure 3). With this strategy, the researcher asks questions about the latest data analyses in order to determine whether or not they bring new meanings to the phenomenon. If it is assessed that there is theoretical saturation, then the study is completed.
Example of a Concept Map with Saturated Subcategory and Questions for Study 3 in PolEx-Information and Communication Technologies.
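Operationally, this saturation check can be approximated by tracking, for each category, whether the most recently coded units of analysis contribute any property not already recorded. A minimal sketch of this idea follows, with hypothetical category and property names; it is an aid to the researcher's judgement, not a replacement for it:

```python
def is_saturated(known_properties: set, new_batches: list) -> bool:
    """A category is provisionally saturated when successive batches of
    newly coded data add no property beyond those already recorded."""
    for batch in new_batches:
        if set(batch) - known_properties:  # something genuinely new appeared
            return False
    return True

# Hypothetical category: properties of "ICT use" found in earlier studies
known = {"frequency", "purpose", "perceived usefulness"}

# Properties coded in the two most recent interviews
latest = [["frequency", "purpose"], ["perceived usefulness"]]
print(is_saturated(known, latest))  # no new properties -> True

# A new property appearing would signal the need for further data collection
latest_with_novelty = [["frequency"], ["access barriers"]]
print(is_saturated(known, latest_with_novelty))  # new property -> False
```

A `False` result here corresponds to the situation described above in which the researcher decides to initiate a new study or seek out informants who can provide new information on the unsaturated category.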
Conclusion
As has been observed throughout this study, GT has a long history of theoretical and research development, and it is one of the most widely used qualitative research methods. However, it is one of the most controversial among researchers. In the practical application of the method, researchers encounter an approach that is not always clear, and that has some contradictions and can be vague (Contreras et al., 2020; Murphy et al., 2017; Prigol & Behrens, 2019). The objective of this article has been to develop a detailed study design and data analysis strategy based on the GT of a study on educational technology (PolEx-ICT), in such a way that this knowledge provides an aid or resource to both novice and experienced researchers. To this end, a conceptual overview of the field and the methodological route taken in the study have been presented, highlighting the use of analytical tools that help the researcher make decisions and implement GT. A broad and updated theoretical base has been used to describe the conceptual and structural aspects of GT and to describe the steps to follow when applying it, using its application in the PolEx-ICT study as an example.
If interpreting the method for use is neither easy nor always clear, teaching its application is even more complex. Moreover, there are currently no reference documents with a tangible, practical application and concrete examples that could serve as a guide or complement for teachers training novice researchers. This is why the present study may represent a good didactic guide to help teachers in their work with novice researchers.
In addition, the aim is to contribute to the field of educational research, especially educational technology, which requires rigorous qualitative research methods, and to highlight GT as a very appropriate approach for research in educational technology. In this field, GT is a largely unknown method, and some who are aware of it consider it unscientific (García-Yepes & Rodríguez-Roja, 2018), so studies in educational technology that use GT as the main method are scarce. Therefore, the process has been exemplified with the intention of showing the systematicity of the GT research process and giving it the scientific value that the research methodology deserves (Charmaz, 2014), as well as highlighting that it allows for a greater understanding of the phenomena and offers theoretical explanations of great value in the field of educational research that other methods, both qualitative and quantitative, do not allow for (Charmaz, 2005).
GT leads to the generation of new theories that, from the realities of the context and its participants, provide organizations with new paradigms and ways of perceiving phenomena that could not have been evidenced before (Contreras et al., 2020). It is undeniable that GT research has a creative component, which is why the tools presented here are flexible and open to modification to suit the needs and objectives of the researchers. In short, the work presented here is an application (interpretation) of the methodology in the educational technology field, and therefore it is a particular development adapted to the needs of the PolEx-ICT study, which can always be improved. The researcher must therefore be a creative person who interprets and analyses the data in their own way, provided this is done with sufficient data collection, order, and rigour until saturation is reached, which, in general terms, is what actually gives rise to new concepts.
Footnotes
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: The Consejería de Economía, Ciencia y Agenda Digital from Junta de Extremadura and by the European Regional Development Fund of the European Union (GR21137, IB18O88), Ministry of Science and Innovation. Directorate General for Programmes and Knowledge Transfer. National R&D&I Plan 2006–2009 (SEJ2006-12435-C05-05/EDUC).
