Abstract
This study examined the interplay between extensive reading, metacognitive experiences, and academic literacy development among international doctoral candidates, with a particular focus on the mediating role of critical thinking. Employing a sequential explanatory mixed-methods design, the research first analyzed quantitative data through Partial Least Squares Structural Equation Modeling (PLS-SEM) and subsequently enriched interpretation through qualitative semi-structured interviews. Quantitative findings demonstrated that critical thinking significantly mediates the associations of extensive reading and metacognitive experiences with academic writing performance, highlighting the pivotal role of higher-order reasoning in doctoral literacy development. Qualitative insights further revealed that critical engagement with academic texts and reflective self-regulation strategies were crucial in transforming reading input into coherent, original scholarly writing. Together, these results advance the understanding of academic writing not as an isolated linguistic output, but as a dynamic, self-regulated cognitive and metacognitive process. The study underscores the need for doctoral programs to integrate critical reading practices, metacognitive strategy instruction, and writing support frameworks rooted in cognitive development theories. Future studies should aim to validate and refine this model across varied disciplinary and cultural contexts, and to trace the longitudinal evolution of critical thinking and writing competencies over the course of doctoral education.
Plain Language Summary
Writing well in English is a major challenge for international doctoral students, yet it is essential for completing dissertations and publishing research. Success in academic writing requires more than language skills; it also involves critical thinking, reflection on one’s own learning, and wide reading. This study investigated how critical thinking, metacognitive experiences, and extensive reading shape the writing performance of 219 doctoral students in Malaysia. Follow-up interviews with 12 participants provided deeper insights. The results showed that critical thinking was the key factor: reading widely and reflecting on writing supported stronger critical thinking, which then improved writing quality. Interviewees explained that reading gave them models of how arguments are built, while reflection helped them plan and revise. Ultimately, critical thinking allowed them to transform reading and reflection into original contributions. The study highlights the need for universities to support students in developing critical thinking alongside language skills.
Keywords
Introduction
Academic writing plays a central role in shaping scholarly participation and identity in higher education, particularly for doctoral candidates who must communicate complex ideas with clarity and rigor. It has emerged as a critical competency for scholars and educators, as it facilitates the structured communication of ideas, theoretical perspectives, empirical findings, and scholarly interpretations. According to Teng and Yue (2022), academic English writing is a specialized skill set employed to document research procedures, describe findings, and articulate scholarly implications in English. Despite its importance, many doctoral candidates, particularly those from non-English-speaking backgrounds, struggle with mastering the rhetorical and linguistic demands of academic writing (Ma, 2020). These patterns suggest that academic writing difficulties are not isolated challenges but reflect deeper cognitive and pedagogical gaps that merit more systematic investigation.
Doctoral students’ writing challenges are shaped not only by language proficiency but also by limited instructional support and misalignment between writing demands and training. A growing body of research has documented the challenges these students face, including limited exposure to academic English input, underdeveloped writing strategies, and insufficient emphasis on writing instruction within doctoral curricula (e.g., Taye & Mengesha, 2024; Woodrow, 2011). Writing instruction often focuses on grammatical accuracy rather than process-oriented or critical writing development. This disconnect signals the need for a paradigm shift in how academic writing is supported in higher education, particularly for multilingual doctoral students (Teng & Yue, 2022). Doctoral-level writing differs significantly from undergraduate-level writing in its expectation for analytical depth, conceptual engagement, and disciplinary voice. Scholars argue that students must be equipped not only with linguistic precision but also with critical thinking skills that enable them to engage with complex arguments and generate original contributions (e.g., Graham, 2019; Teng & Yue, 2022). However, formal training in critical academic writing is often absent, especially for international students, leaving them to navigate genre conventions without sufficient scaffolding (Ebadi & Rahimi, 2018). Within the Asian EFL context, scholars have noted that learners are frequently characterized by a lack of critical voice and reflective engagement (Song, 2015; S. Wang & Seepho, 2017), potentially due to educational traditions that prioritize memorization over inquiry-based learning (Liang & Fung, 2021). Taken together, these issues highlight a persistent mismatch between doctoral writing expectations and the cognitive preparation students receive.
In recent years, researchers have turned their attention to metacognitive processes, including metacognitive knowledge, strategies, and experiences, as key cognitive resources in academic writing. Writing is a recursive and cognitively demanding activity that challenges both first-language and second-language writers (Grabe & Zhang, 2013). Metacognitive experiences, which refer to a writer’s reflective and affective awareness during task execution, are especially relevant to how writers plan, monitor, and evaluate their output (Flavell, 1979; Teng & Ma, 2024). Although previous studies have explored metacognitive knowledge and strategy use in writing contexts, the role of metacognitive experiences remains underexplored (Teng & Qin, 2024), particularly in connection to academic writing performance (Q. Sun & Zhang, 2022a). This gap signals the need to examine how doctoral candidates regulate their thinking during writing, rather than focusing solely on what strategies they know.
Extensive reading is another promising yet insufficiently integrated contributor to writing development. It has been widely recognized as a valuable pedagogical tool for enhancing writing fluency and genre awareness, and has demonstrated positive effects on vocabulary acquisition, grammar awareness, and overall writing competence (Cottrell, 2017; Jennifer & Ponniah, 2019). Reading, however, is not merely a cognitive-linguistic process; it involves reflective thinking and knowledge construction. Z. S. Sun et al. (2014) noted that reading supports deeper writing development by activating analytical and interpretive faculties. Nevertheless, the integrative influence of reading and metacognitive awareness on writing performance via critical thinking requires further empirical exploration (Phyo et al., 2024). This suggests that reading alone may not shape writing outcomes unless paired with cognitive processes that help learners analyze and transform textual information.
While existing research underscores the value of critical thinking, metacognition, and reading in academic success, most studies have examined these constructs in isolation (Teng et al., 2021). Few have explored their interconnected influence on doctoral students’ academic writing, and the mediating role of critical thinking remains largely speculative (Xiaolei et al., 2023). Moreover, empirical investigations rarely account for the lived experiences and contextual realities of multilingual doctoral writers, limiting our understanding of the mechanisms that support their writing development (Mochizuki, 2022). These gaps highlight the need for an integrated framework that explains how cognitive, reflective and literacy practices jointly shape doctoral writing performance.
To address this gap, the present study adopts a sequential explanatory mixed-methods design, combining Partial Least Squares Structural Equation Modeling (PLS-SEM) with qualitative inquiry. In the first phase, quantitative data are used to examine the hypothesized relationships among metacognitive experiences, extensive reading, critical thinking, and academic writing performance. In the second phase, semi-structured interviews with selected participants offer interpretive depth, helping to explain the dynamics revealed in the quantitative model. This mixed-methods approach allows for a holistic understanding of how doctoral candidates navigate the writing process cognitively and strategically.
The study aims to examine the interrelationships among metacognitive experiences, extensive reading, critical thinking, and academic writing performance, and to explore how doctoral students’ lived experiences help explain these relationships.
Literature Review
Critical Thinking
Critical thinking plays a central role in shaping doctoral students’ academic writing because it enables them to construct arguments, evaluate evidence, and synthesize ideas with intellectual rigor. Critical thinking is widely recognized as a foundational component of successful academic writing, particularly in postgraduate education, where argumentation, synthesis, and analysis are essential (Cruz et al., 2020). Doctoral students are expected to demonstrate critical thinking through their ability to evaluate evidence, construct coherent arguments, and engage with diverse viewpoints. These skills are critical to the production of well-reasoned, evidence-based texts (Yusuf et al., 2024). However, prior research suggests that many graduate students struggle to acquire the necessary critical thinking skills or fail to apply them consistently in academic writing (Borglin, 2012; Persky et al., 2018). To be more specific, Kellogg (2018) conceptualizes writing as a cognitively demanding task that simultaneously requires memory, language control, and analytical reasoning. From this perspective, writing becomes a mode of thinking in itself, and the quality of writing reflects the quality of thought. Yeh et al. (2022) reinforce this view by arguing that well-developed academic writing is an extension of well-developed critical thinking. These patterns signal that developing academic writing proficiency requires a deeper understanding of the cognitive demands associated with critical thinking.
Critical thinking disposition is an essential component of doctoral writing because it determines whether learners consistently apply higher-order reasoning when engaging with academic texts. Beyond cognitive skills, scholars have emphasized the importance of a critical thinking disposition, which refers to the consistent tendency to engage in reflective and reasoned judgment (Ennis, 1996). Facione’s (1990) Delphi report identified seven key dispositions: truth-seeking, open-mindedness, analyticity, systematicity, critical thinking self-confidence, inquisitiveness, and maturity of judgment, which collectively shape how individuals approach reasoning tasks. Similarly, Paul and Elder (2001) conceptualize critical thinking (CT) disposition as intellectual traits such as humility, courage, empathy, and fair-mindedness, which are indispensable for sustained critical engagement. These frameworks demonstrate that critical engagement depends not only on cognitive ability but also on learners’ habitual willingness to question, evaluate, and interpret information.
The systematic measurement of these dispositions further shows how they can be linked to academic writing outcomes in empirical research. These theoretical frameworks highlight that effective academic writing requires not only the cognitive skills of analysis and evaluation but also the dispositional inclination to apply these skills consistently. To operationalize these constructs, instruments such as the California Critical Thinking Disposition Inventory (CCTDI) have been developed and widely validated in higher education contexts. The CCTDI directly reflects the theoretical dimensions identified in the Delphi report and has been employed across diverse cultural settings, including studies of Asian graduate students (X. Wang et al., 2019). Its multidimensional structure provides a comprehensive assessment of learners’ willingness to engage in reflective judgment, making it particularly relevant for exploring the link between critical thinking and doctoral-level academic writing. This alignment between theory and measurement offers a robust foundation for examining CT disposition within the present study.
In this study, the examination of CT disposition is grounded in these established theories and supported by the empirical use of validated measurement tools. Previous studies have also identified instructional strategies that enhance critical thinking in writing. For instance, Wale and Bishaw (2020) demonstrated that inquiry-based learning can significantly enhance students’ ability to interpret, analyze, and evaluate information in argumentative writing. Similarly, Butcher (2021) emphasizes the importance of teaching self-regulation, inference, and reasoning as part of the writing curriculum. These findings reinforce the pedagogical importance of embedding critical thinking explicitly into doctoral writing instruction and further justify treating CT disposition as a central construct in the present study.
Metacognitive Experiences
Metacognition is a central cognitive mechanism in academic writing because it enables learners to regulate their thinking and make intentional decisions throughout the writing process. Metacognition, broadly defined as “thinking about thinking,” is critical in regulating one’s learning processes (Flavell, 1979, p. 906). In academic writing, metacognition manifests in how students plan, monitor, and evaluate their writing strategies, especially in recursive tasks like drafting and revision (Hyland & Hyland, 2019). Among EFL learners, metacognition is shown to enhance not only writing performance but also learner autonomy and language acquisition (Q. Sun & Zhang, 2022a; Zhang & Zhang, 2019). These findings indicate that metacognition supports both the strategic and self-regulated dimensions of effective writing.
A deeper understanding of metacognition requires distinguishing metacognitive experiences from other forms of metacognitive activity. Metacognitive experiences, distinct from metacognitive knowledge and strategies, refer to the real-time thoughts and emotions experienced during cognitive activity (Dindar et al., 2020). These experiences influence the writer’s awareness of task complexity, confidence in managing the writing process, and engagement in strategic thinking. Q. Sun and Zhang (2022b) provided empirical evidence linking metacognitive experiences to enhanced writing performance in EFL university contexts. Additionally, Q. Sun et al. (2021) found that such experiences prompt writers to re-evaluate their goals, activate strategy use, and engage in deeper reflection, which are crucial traits for second-language academic writing. In summary, these studies show that metacognitive experiences shape how writers respond moment-by-moment to the cognitive and emotional demands of writing.
The operationalization of metacognitive experiences in writing research further underscores their relevance for empirical analysis. To assess metacognitive experiences in EFL writing contexts, Q. Sun et al. (2021) developed the EFL Learners’ Writing Metacognitive Experiences Questionnaire (EFLLWMEQ). This instrument was designed on the basis of Efklides’ (2006) Metacognitive and Affective Model of Self-Regulated Learning (MASRL), which highlights real-time feelings of difficulty, effort, and confidence as core indicators of metacognitive experiences. Its validation confirmed both internal consistency (Cronbach’s α values above .80) and construct validity through factor analysis, making it a reliable tool for assessing metacognitive experiences in academic writing tasks. This strong theoretical and psychometric foundation supports its use in examining doctoral students’ metacognitive regulation in the present study.
Despite growing evidence, metacognitive experiences remain insufficiently examined in doctoral education, especially in the Asian context (Q. Sun & Zhang, 2022b), and particularly regarding their relationship to higher-order reasoning. Their connection to critical thinking is theoretically supported by literature suggesting that executive-level cognitive functions, such as goal setting, planning, and monitoring, are prerequisites for higher-order thinking (Abedini, 2021; Li et al., 2020). Building on this premise, the present study investigates how metacognitive experiences contribute to the development of critical thinking within doctoral-level academic writing.
Extensive Reading
Extensive reading plays a foundational role in supporting academic writing because it exposes learners to language patterns, discourse structures, and disciplinary conventions in meaningful and sustained ways. It is defined as the practice of reading large amounts of material for general understanding and language development, rather than detailed analysis (Suk, 2016). Harman (2013) highlighted that extensive reading not only enhances writing fluency but also serves as a vital source of input for acquiring genre conventions and discourse structures. For second-language learners, reading is a multi-faceted process involving prior knowledge, cognitive engagement, instructional context, and text complexity (Coiro, 2020). Empirical evidence shows that extensive reading contributes positively to vocabulary development, syntactic awareness, and overall academic performance (Chen, 2018; Duong & Nguyen, 2022). These findings suggest that extensive reading strengthens the linguistic and cognitive foundations essential for effective academic writing.
Despite these benefits, many learners are unable to leverage extensive reading effectively because, in many academic contexts, they receive limited guidance in integrating reading into their writing processes. Taye and Mengesha (2024) argue that without a structured approach to reading instruction across pre-reading, while-reading, and post-reading phases, students fail to develop the fluency and analytical capacity required for academic writing. Moreover, Lee and Lee (2023) demonstrated that structured extensive reading significantly improved writing proficiency in a study of Vietnamese EFL students. These studies indicate that instructional design must make the reading–writing connection explicit rather than assuming that exposure to texts automatically improves writing.
However, the cognitive mechanisms linking extensive reading to critical thinking and doctoral-level writing remain insufficiently explored. Although existing research highlights the linguistic and structural benefits of reading, how extensive reading contributes to critical thinking and academic writing within doctoral contexts has yet to be fully unpacked (Phyo et al., 2024). This gap suggests that reading should be examined not only as an input-rich activity but also as a potential trigger for higher-order reasoning relevant to doctoral writing.
The measurement of reading attitudes and engagement provides another avenue for analyzing how extensive reading contributes to academic writing. To capture learners’ engagement in extensive reading, validated scales have been developed that measure attitudes, motivation, and self-regulation in second-language reading contexts. Among these, the Second Language Reading Attitude Scale (L2RA; Derin et al., 2023) has been psychometrically validated through exploratory and confirmatory factor analyses, demonstrating five robust dimensions: cognitive, conative, self-assessment, anxiety, and negative attitude. By operationalizing both affective and cognitive components of reading attitudes, the L2RA provides a comprehensive tool to assess how learners approach extensive reading. Its multidimensional framework aligns with Day and Bamford’s (1998) extensive reading principles, making it especially suitable for studies exploring the link between extensive reading, critical thinking, and academic writing. Thus, the use of the L2RA in this study offers a theoretically informed way to examine how doctoral students’ reading engagement may relate to their cognitive and writing outcomes.
Integrating the Constructs: Conceptual Framework and Hypotheses
While each of these factors (critical thinking, metacognitive experiences, and extensive reading) has independently been linked to writing success, few studies have examined how they interact in shaping doctoral students’ academic writing performance (Taye & Teshome, 2025). Moreover, the potential mediating role of critical thinking in this relationship remains unclear.
To address these gaps, the current study proposes a conceptual framework (see Figure 1) that integrates these constructs into a predictive model. Specifically, the study posits that metacognitive experiences and extensive reading are positively associated with academic writing performance, both directly and indirectly through critical thinking. This model is tested using Partial Least Squares Structural Equation Modeling (PLS-SEM) and supported by qualitative data collected in the second phase of a sequential explanatory mixed-methods design.

Figure 1. Conceptual framework.
Based on the conceptual framework and the reviewed literature, the following hypotheses are proposed:
Methodology
Research Design
This study employed a sequential explanatory mixed-methods design, which integrates a quantitative phase followed by a qualitative phase to explain and contextualize statistical findings (Creswell & Creswell, 2022). This design was selected to explore both the predictive relationships among extensive reading, metacognitive experiences, critical thinking, and academic English writing, and the underlying processes shaping these interactions.
The quantitative phase involved survey data collection from 219 international doctoral students and was analyzed using Partial Least Squares Structural Equation Modeling (PLS-SEM) to test direct and mediated effects. Guided by the quantitative findings, the qualitative phase employed semi-structured interviews with a purposive sub-sample to gain deeper insight into students’ experiences with reading, thinking, and writing in English academic contexts.
Integration occurred at the interpretation stage, where qualitative findings were used to explain and elaborate on the quantitative results. This design ensured triangulation, enhanced validity, and provided a comprehensive understanding of the research problem, aligning with contemporary recommendations for methodological rigor in mixed-methods inquiry (Tang, 2025).
Participants and Research Setting
This study was conducted among Asian international doctoral candidates enrolled in English-medium programs at public and private universities in Malaysia. Malaysia’s role as a regional higher education hub makes it an ideal setting to explore academic English writing among multilingual doctoral students (Marginson, 2011). Written informed consent was obtained from all participants before participation in both the survey and interview phases. Participants were informed that their participation was voluntary and that they could withdraw at any time without penalty.
The demographic characteristics of the quantitative sample (N = 219) are presented in Table 1. Participants were selected through simple random sampling, ensuring representation across disciplines and institutions. The target population consisted of approximately 560 international doctoral candidates enrolled in social science disciplines across four Malaysian universities. Of these, 300 were invited to participate and 219 completed the survey, yielding a response rate of 73%. Eligibility criteria included: (a) enrollment in a PhD program in Malaysia, (b) at least second-year standing, and (c) active engagement in English academic writing. Participants represented various social science disciplines (e.g., education, psychology, business) and came from a range of Asian countries, with the largest subgroup from China (39.2%). The sample included 68% male and 32% female respondents, with a majority (51.6%) aged between 22 and 27. Over 74% were in their third year or later, ensuring meaningful experience with academic writing tasks. To address the adequacy of the sample for PLS-SEM, the study followed Hair et al.’s (2018) “10-times rule,” which requires at least 10 times the largest number of structural paths pointing to a latent construct. In this model, the maximum was three, indicating a minimum of 30 participants. In addition, an a priori power analysis using G*Power (Faul et al., 2009), with α = .05, power = 0.80, and a medium effect size (f2 = 0.15), suggested a minimum of 92 participants. Thus, the achieved sample of 219 exceeded both benchmarks and provided sufficient power for reliable estimation despite the complexity of the model.
Demographic Profile (n = 219).
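The two sample-size benchmarks described above can be reproduced computationally. The sketch below is illustrative only: it applies the 10-times rule directly and approximates the a priori power analysis with the noncentral-F formulation for a fixed-effects regression with three predictors, using the convention that the noncentrality parameter equals f² × N. Because the exact G*Power figure depends on the test family and predictor count selected, this approximation may not reproduce the reported minimum of 92 exactly.

```python
from scipy.stats import f as f_dist, ncf

def min_sample_for_power(f2=0.15, k=3, alpha=0.05, target=0.80):
    """Smallest N whose achieved power meets the target, for a
    fixed-effects regression with k predictors (noncentrality
    parameter lambda = f2 * N, as in G*Power's convention)."""
    n = k + 2
    while True:
        df1, df2 = k, n - k - 1
        f_crit = f_dist.ppf(1 - alpha, df1, df2)       # critical F at alpha
        power = 1 - ncf.cdf(f_crit, df1, df2, f2 * n)  # P(reject | effect f2)
        if power >= target:
            return n
        n += 1

# 10-times rule: three structural paths point to the most complex construct
ten_times_minimum = 10 * 3
print(ten_times_minimum, min_sample_for_power())
```

Both benchmarks fall well below the achieved sample of 219, which is consistent with the paper's conclusion that the sample provides adequate power.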
For the qualitative follow-up, 12 participants were purposively selected from the survey sample using maximum variation sampling (Patton, 2014). To operationalize this strategy, nationality was represented by students from at least four Asian countries (primarily China, India, Pakistan, and Indonesia); discipline was distributed across major social science fields (education, psychology, and business); year of study included both second-year and advanced (third year and above) doctoral candidates; and writing proficiency was based on self-reported confidence levels in English academic writing (low, medium, high). This ensured diversity across meaningful dimensions and facilitated analytical generalization (Yin, 2017). Of the 12 interviews, 7 were conducted via password-protected Zoom sessions and 5 face-to-face in private university offices, with the mode determined by participants’ preference and availability. Offering both modes enhanced inclusivity and respected participant autonomy. Ethical safeguards were applied consistently: online recordings were encrypted and in-person sessions were conducted in secure spaces to protect confidentiality. Each interview lasted 45 to 60 min and was audio-recorded with informed consent. This qualitative component allowed participants to reflect deeply on their experiences with metacognition, reading, and writing in a secure and supportive environment, enhancing the breadth and depth of the study and ensuring rigor through methodological triangulation.
Instruments
Academic Writing Performance
Academic writing performance was assessed through a 600- to 800-word essay-writing task on a discipline-related topic, evaluated with Jacobs et al.’s (1981) ESL Composition Profile. The rubric measures five dimensions: Content (30), Organization (20), Vocabulary (20), Language Use (25), and Mechanics (5), for a total of 100 points, generating quantitative scores rather than purely qualitative judgments. Two trained raters with expertise in L2 writing independently scored the essays after a calibration session; inter-rater reliability was strong (Cohen’s κ = .82; r = .87, p < .001). Discrepancies greater than 5 points were resolved through discussion or by a third rater. This procedure ensured reliable and rigorous measurement of participants’ academic writing performance.
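The scoring and agreement procedure above can be sketched as follows. This is an illustrative example with hypothetical rater data, not the study's actual scores; Cohen's κ is omitted here because it requires the banded (categorical) scores the raters used, which are not reported.

```python
import numpy as np
from scipy.stats import pearsonr

# Jacobs et al. (1981) ESL Composition Profile: maximum points per dimension
WEIGHTS = {"content": 30, "organization": 20, "vocabulary": 20,
           "language_use": 25, "mechanics": 5}

def total_score(ratings):
    """Sum the five dimension scores (total out of 100), checking bounds."""
    assert all(0 <= ratings[d] <= m for d, m in WEIGHTS.items())
    return sum(ratings[d] for d in WEIGHTS)

# Hypothetical totals from two independent raters for five essays
rater_a = np.array([78, 85, 62, 90, 71])
rater_b = np.array([80, 83, 65, 88, 70])

r, p = pearsonr(rater_a, rater_b)          # score-level consistency between raters
flagged = np.abs(rater_a - rater_b) > 5    # discrepancies > 5 points go to a third rater
print(round(r, 2), flagged.tolist())       # → 0.99 [False, False, False, False, False]
```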
Metacognitive Experiences
Metacognitive experiences were measured using the EFL Learners’ Writing Metacognitive Experiences Questionnaire (EFLLWMEQ) developed by Q. Sun et al. (2021). This 16-item instrument is grounded in Flavell’s (1979) metacognitive theory and was validated specifically for L2 writing contexts. It includes four subscales: metacognitive feelings, metacognitive estimates, online metacognitive knowledge, and metacognitive strategies, rated on a 6-point Likert scale. A sample item is: “I reflect on my thinking process while revising my draft.” In the present study, the instrument demonstrated satisfactory reliability, with Cronbach’s α values of .72 for metacognitive feelings, .75 for metacognitive estimates, .82 for online metacognitive knowledge, and .84 for metacognitive strategies. These results are consistent with the original validation study, which reported α values ranging from .70 to .85 across the four subscales (Q. Sun et al., 2021).
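The subscale reliabilities reported here (and for the instruments below) follow the standard Cronbach's alpha formula, which can be computed directly from raw item responses. The sketch below uses simulated Likert data for illustration; the variable names and data are hypothetical.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of Likert responses.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulated 6-point Likert responses for a four-item subscale:
# a shared "true score" per respondent plus small item-level noise
rng = np.random.default_rng(0)
base = rng.integers(1, 7, size=(50, 1))
noise = rng.integers(-1, 2, size=(50, 4))
subscale = np.clip(base + noise, 1, 6)
print(round(cronbach_alpha(subscale), 2))
```

Because the simulated items share a common component, the resulting alpha is high, mirroring the internally consistent subscales reported above.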
Critical Thinking
Critical thinking disposition was measured using the California Critical Thinking Disposition Inventory (CCTDI), developed by Facione and colleagues as part of the Delphi Report on critical thinking (Facione, 1995). This 75-item scale, rated on a 6-point Likert scale, captures seven key dimensions of critical thinking: truth-seeking, open-mindedness, analyticity, systematicity, critical thinking self-confidence, inquisitiveness, and cognitive maturity. It has been widely adopted across educational levels and cultures and demonstrates strong internal consistency (α ≈ .90 for the full scale; Facione, 1995). In the present study, reliability was similarly strong, with Cronbach’s α = .91 for the overall scale and subscale alphas ranging from .72 to .86, consistent with previous findings.
Extensive Reading Attitudes
Participants’ attitudes and engagement with extensive reading were assessed using the Second Language Reading Attitude Scale (L2RA; Derin et al., 2023). The L2RA is a validated 26-item instrument designed to capture five attitudinal dimensions toward L2 reading: Cognitive beliefs (6 items), Conative behavioral intentions (8 items), Self-assessment of reading ability (3 items), Anxiety about L2 reading, and Negative Attitudes (the remaining items). Items are rated on a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). The scale was validated through exploratory and confirmatory factor analyses (EFA and CFA) with a large Malaysian student sample (N = 465), demonstrating satisfactory construct validity, convergent and discriminant validity, and internal consistency for all subscales (Cronbach’s α ranging from .71 to .87). For the present study, scores across all five subscales were aggregated to form a composite latent variable representing participants’ extensive reading attitudes. This approach aligns with previous research operationalizing extensive reading attitudes as a multidimensional but unified construct in L2 academic literacy development.
Semi-Structured Interview
A semi-structured interview protocol was designed to explore doctoral candidates’ perspectives on metacognitive experiences, extensive reading, critical thinking, and academic English writing. The protocol was informed by the study’s conceptual framework and initial quantitative findings, consistent with explanatory sequential design principles.
The guide included eight open-ended questions with probes targeting how participants regulate their writing, engage in critical thinking, and draw from reading practices. For example, participants were asked, “How do your reading habits influence your academic writing?” and “What strategies do you use when facing writing challenges?”
To ensure clarity and relevance, the protocol was pilot-tested and reviewed by two experts in academic writing and qualitative research. Interviews were conducted in English, lasted 45 to 60 min, and were audio-recorded with consent. This instrument allowed for both structure and flexibility, supporting rich, contextual data collection aligned with the study’s mixed-methods approach (Zhang & Fathi, 2025).
Data Collection Procedure
Data were collected using a sequential explanatory mixed-methods design (Hirose & Creswell, 2022), starting with a quantitative survey followed by qualitative interviews. Ethical approval and informed consent were obtained before data collection began. First, 219 valid responses were collected via a Google Forms survey shared through institutional channels and social media. The instrument measured academic writing performance, metacognitive experiences, critical thinking, extensive reading, and demographics. Subsequently, 12 participants were purposively selected for semi-structured interviews based on nationality, academic year, writing scores, and availability. Conducted in English via Zoom or face-to-face, interviews lasted 45 to 60 min and were thematically guided by the quantitative findings. All sessions were transcribed, anonymized, and securely stored.
Data Analysis
Data analysis was conducted according to the sequential explanatory mixed-methods design, beginning with quantitative analysis followed by qualitative interpretation. For the quantitative phase, data were analyzed using Partial Least Squares Structural Equation Modeling (PLS-SEM) via SmartPLS 4 software. PLS-SEM was chosen due to its ability to handle complex models with multiple mediators and latent constructs, especially in exploratory studies with relatively small sample sizes (Hair et al., 2018).
The analysis proceeded in two stages. First, the measurement model was evaluated to ensure construct reliability and validity. Internal consistency was assessed through Cronbach’s alpha and composite reliability (CR), while convergent validity was examined via average variance extracted (AVE). Discriminant validity was confirmed using the Fornell–Larcker criterion and the heterotrait-monotrait (HTMT) ratio (Henseler et al., 2014). Next, the structural model was tested to examine hypothesized relationships among the variables. Key indicators such as path coefficients, R2 values, effect sizes (f2), and predictive relevance (Q2) were reported. The significance of the direct and indirect effects, including mediation through critical thinking, was assessed using bootstrapping with 5,000 resamples, which provides bias-corrected confidence intervals for hypothesis testing (Hair et al., 2016).
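The reliability and convergent-validity criteria described above reduce to simple formulas over standardized indicator loadings. A minimal sketch, using hypothetical loadings rather than the study's estimates:

```python
# Composite reliability (CR) and average variance extracted (AVE)
# computed from standardized loadings of one latent construct.

def composite_reliability(loadings):
    s = sum(loadings)
    err = sum(1 - l ** 2 for l in loadings)   # indicator error variances
    return s ** 2 / (s ** 2 + err)

def ave(loadings):
    return sum(l ** 2 for l in loadings) / len(loadings)

lam = [0.78, 0.81, 0.74, 0.85]                # hypothetical loadings
print(round(composite_reliability(lam), 3))   # ~0.873 (above the .70 threshold)
print(round(ave(lam), 3))                     # ~0.634 (above the .50 benchmark)
```

The same thresholds named in the text (CR > .70, AVE > .50) apply here: this hypothetical construct would pass both checks.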
Qualitative data were analyzed using Braun and Clarke’s (2006) thematic analysis. The approach was primarily deductive, with the three constructs from the quantitative model (extensive reading, metacognitive experiences and critical thinking) serving as a priori categories. Within these, inductive open coding was applied to identify subthemes and nuances beyond the predefined constructs. This hybrid strategy ensured alignment with the conceptual model while remaining open to participants’ perspectives. Coding was conducted manually, with two researchers cross-checking a subset of transcripts to ensure consistency. Finally, the integration of findings occurred at the interpretation stage, wherein qualitative themes were used to explain and contextualize quantitative patterns.
Results
Quantitative Results
Measurement Model Evaluation
Before examining the structural relationships, we first evaluated the measurement model to ensure the reliability and validity of all constructs, following recommended PLS-SEM criteria. As shown in Table 2, all constructs exhibit strong internal consistency: Cronbach’s alpha and composite reliability (CR) values for each latent variable exceed the recommended 0.70 thresholds (Hair et al., 2018), indicating high reliability. The average variance extracted (AVE) values range from 0.61 to 0.72, all well above the 0.50 benchmark, which confirms convergent validity by demonstrating that each construct captures over 50% of the variance in its indicators. In addition, all indicator loadings were statistically significant (p < .001) and generally above 0.70, further supporting item reliability and convergent validity.
Construct Reliability and Convergent Validity of the Measurement Model.
To assess discriminant validity, we applied the Fornell–Larcker criterion and examined the Heterotrait–Monotrait (HTMT) ratios. Table 3 presents the inter-construct correlation matrix with the square root of each construct’s AVE on the diagonal. In every case, the diagonal value is higher than the off-diagonal correlations, satisfying the Fornell and Larcker (1981) criterion for discriminant validity. For example, the square root of the AVE for Academic Writing Performance (0.82) is greater than its correlations with Critical Thinking (0.64), Extensive Reading (0.53), and Metacognitive Experiences (0.47), indicating that AWP shares more variance with its indicators than with any other construct. Furthermore, all pairwise HTMT ratios were below the conservative cutoff of 0.85 (Henseler et al., 2014), providing additional evidence that each construct is empirically distinct. Taken together, the measurement model demonstrates adequate reliability, convergent validity, and discriminant validity, allowing us to proceed confidently to the structural model analysis.
Fornell–Larcker Discriminant Validity Criterion.
Note. Boldfaced diagonal values indicate the square roots of the Average Variance Extracted (√AVE). Discriminant validity is supported when each √AVE exceeds the corresponding inter-construct correlations.
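Programmatically, the Fornell–Larcker criterion amounts to checking that each diagonal √AVE exceeds every correlation in its row and column. In the sketch below, the AWP row follows the reported values; the remaining entries are illustrative placeholders, not the study's full matrix:

```python
# Fornell–Larcker check: diagonal holds sqrt(AVE), off-diagonal holds
# inter-construct correlations; validity requires diag > all off-diag in row.

def fornell_larcker_ok(labels, matrix):
    n = len(labels)
    for i in range(n):
        for j in range(n):
            if i != j and matrix[i][i] <= matrix[i][j]:
                return False, (labels[i], labels[j])   # first violation found
    return True, None

labels = ["AWP", "CT", "ER", "ME"]
matrix = [
    [0.82, 0.64, 0.53, 0.47],   # AWP row: reported sqrt(AVE) and correlations
    [0.64, 0.80, 0.50, 0.45],   # remaining rows: illustrative values only
    [0.53, 0.50, 0.78, 0.40],
    [0.47, 0.45, 0.40, 0.79],
]
ok, violation = fornell_larcker_ok(labels, matrix)
print(ok)   # True: discriminant validity satisfied for this matrix
```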
Structural Model Evaluation
With a validated measurement model in place, the researchers assessed the structural model to test the hypothesized relationships among extensive reading (ER), metacognitive experiences (ME), critical thinking (CT), and academic writing performance (AWP). The PLS-SEM structural model showed good explanatory power for the outcome variable, yielding R2 = .624 for AWP: approximately 62.4% of the variance in academic writing performance is accounted for by the three predictors (ER, ME, and CT). This substantial R2 indicates a strong level of explanatory power for a behavioral model in an educational context. The variance explained in the mediating construct, CT, was also considerable, with ER and ME jointly accounting for nearly half of its variance, suggesting that the model captures a large portion of the determinants of students’ critical thinking. In terms of overall model fit, the standardized root mean square residual (SRMR) was 0.067, below the 0.08 threshold, indicating an acceptable model fit (Hair et al., 2018).
Structural path coefficients were estimated via bootstrapping with 5,000 resamples to assess their significance (Table 4). Critical Thinking emerged as a strong positive predictor of Academic Writing Performance (β = .58, t = 7.12, p < .001). This result supports the hypothesized mediating role of critical thinking, as higher levels of CT are associated with better writing performance. Meanwhile, both Extensive Reading and Metacognitive Experiences showed significant positive effects on Critical Thinking (ER → CT: β = .42, t = 5.83, p < .001; ME → CT: β = .39, t = 4.92, p < .001). These coefficients indicate that students who engage more in extensive reading and those who employ metacognitive strategies more frequently tend to have higher critical thinking skills. The direct paths from ER and ME to AWP were also tested; however, once critical thinking was included in the model, the direct effects of ER and ME on AWP were greatly reduced and became statistically non-significant. The results indicate that critical thinking statistically mediates the association between reading, metacognition, and academic writing performance. While this suggests an important explanatory role, the cross-sectional design prevents strong causal inference, and longitudinal studies are needed to confirm the directionality of these effects. The findings suggest that reading and metacognitive strategies are associated with better academic writing performance, primarily through their statistical relationship with critical thinking.
Structural Model Path Coefficients and Significance.
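The bootstrapping procedure used for these significance tests can be sketched in miniature. The example below uses synthetic data and simple OLS slopes in place of the full PLS estimates (and 2,000 resamples rather than the study's 5,000), checking whether the percentile confidence interval for the a*b indirect effect excludes zero:

```python
# Percentile-bootstrap test of an indirect effect (ER -> CT -> AWP),
# with synthetic data and plain bivariate slopes standing in for PLS paths.
import random

random.seed(1)

def slope(x, y):
    """OLS slope of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

n = 200
er = [random.gauss(0, 1) for _ in range(n)]
ct = [0.4 * e + random.gauss(0, 1) for e in er]    # true a path ~ .4
awp = [0.6 * c + random.gauss(0, 1) for c in ct]   # true b path ~ .6

boot = []
for _ in range(2000):                              # resample cases with replacement
    idx = [random.randrange(n) for _ in range(n)]
    xs = [er[i] for i in idx]
    ms = [ct[i] for i in idx]
    ys = [awp[i] for i in idx]
    boot.append(slope(xs, ms) * slope(ms, ys))     # indirect effect a*b

boot.sort()
lo = boot[int(0.025 * len(boot))]                  # 2.5th percentile
hi = boot[int(0.975 * len(boot)) - 1]              # 97.5th percentile
print(lo > 0)   # CI excluding zero -> significant indirect effect
```

This is a didactic simplification: SmartPLS re-estimates the full latent-variable model on each resample and applies bias correction, but the logic of resampling the a*b product is the same.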
The significance of the mediating pathways was confirmed by examining indirect effects. Both indirect effects were positive and significant: extensive reading had an indirect effect on writing performance through critical thinking (β = .23, t = 3.45, p < .01), as did metacognitive experiences (β = .21, t = 3.18, p < .01). The significant indirect coefficients, coupled with the non-significant direct effects, provide evidence of full mediation in the model. In practical terms, these results highlight critical thinking as the key mechanism by which reading and metacognitive engagement translate into improved writing outcomes. We also evaluated the effect sizes (f2) for the structural paths to gauge their substantive impact. The path from CT to AWP had an f2 of 0.35, which constitutes a large effect (indicating that removing CT from the model would greatly reduce the R2 for AWP). In comparison, the effects of ER on CT (f2 = 0.18) and ME on CT (f2 = 0.16) were of moderate magnitude. These effect size values reinforce the conclusion that critical thinking is the most influential predictor in the model, while extensive reading and metacognitive experiences still have meaningful, albeit more moderate, contributions to explaining students’ critical thinking development.
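The relationship between f2 and R2 noted above is easy to verify numerically. Given the reported R2 of .624 for AWP and f2 = 0.35 for the CT path, the R2 the model would retain without CT can be backed out (a check on the interpretation, not a new estimate):

```python
# Cohen's effect size in PLS-SEM:
# f2 = (R2_included - R2_excluded) / (1 - R2_included)

def f_squared(r2_incl, r2_excl):
    return (r2_incl - r2_excl) / (1 - r2_incl)

r2_incl = 0.624                                 # reported R2 for AWP
r2_excl_ct = r2_incl - 0.35 * (1 - r2_incl)     # solve f2 = .35 for R2_excluded
print(round(r2_excl_ct, 3))                     # ~0.492: R2 without the CT path
print(round(f_squared(r2_incl, r2_excl_ct), 2)) # recovers 0.35
```

The drop from .624 to roughly .49 makes concrete the claim that removing CT "would greatly reduce the R2 for AWP."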
Overall, the quantitative findings from the PLS-SEM analysis provide clear support for the hypothesized model. Figure 2 presents the structural model with standardized path coefficients. Metacognitive experiences (ME) had a significant effect on critical thinking (CT; β = .39), as did extensive reading (ER; β = .42). The direct paths from ME and ER to academic writing performance (AWP; β = .18 and β = .21, respectively) did not reach significance once CT was included in the model. In turn, CT was the strongest predictor of AWP (β = .58). These results suggest that CT plays a central mediating role, channeling the effects of ER and ME toward stronger academic writing outcomes. All hypothesized relationships were supported in the expected direction, with critical thinking functioning as an essential intermediary. This strong empirical backing sets the stage for the qualitative findings, which were collected to further explain and contextualize these statistical relationships in students’ real-world experiences.

Structural model with standardized path coefficients (PLS-SEM).
Qualitative Results
To complement the quantitative findings, 12 international doctoral candidates were interviewed to explore their experiences with extensive reading, metacognitive strategies, and critical thinking in academic writing. Participants were selected purposively to ensure variation in nationality, discipline, year of study, and writing proficiency. Data were analyzed using Braun and Clarke’s (2006) thematic analysis, following a deductive approach with the three quantitative constructs as a priori categories, supplemented by inductive open coding to capture subthemes. This ensured both alignment with the conceptual model and openness to participants’ perspectives. Three key themes were identified: (a) reading as academic scaffolding, (b) metacognitive strategies as self-regulatory tools, and (c) critical thinking as a writing enabler. Together, these themes demonstrate how reading, reflection, and analysis interact within the process approach to writing.
Reading as Academic Scaffolding
Students highlighted extensive reading as the foundation of their writing. By engaging with scholarly texts, they absorbed vocabulary, genre conventions, and argumentation strategies, which informed their own drafts. As P3 explained, “I read a lot before I write. It helps me see how arguments are built and how authors use evidence.” Similarly, P7 indicated that “Reading journal articles gives me examples of how to structure my paper. I follow the flow of introduction, discussion, and conclusion that I see in published work.” Reading was described not as an alternative to process writing, but as a preparatory stage that scaffolded planning and drafting. Critical engagement with texts, questioning claims and reasoning through evidence, was key to transforming reading into effective writing, supporting the mediating role of critical thinking identified quantitatively.
Metacognitive Strategies as Self-Regulatory Tools
Participants described using planning, monitoring, and revision to regulate their writing. These strategies were often self-taught, developed through feedback and reflection rather than formal instruction. P5 shared, “Nobody taught me how to revise effectively. I learned by reading feedback and thinking about what went wrong and why.” P2 described in-the-moment monitoring: “I often stop in the middle of writing to ask myself if I am still answering the research question. This helps me stay focused.” Emotional regulation, such as managing anxiety and sustaining motivation, was also central to participants’ accounts. These findings illustrate that metacognitive strategies underpin the recursive nature of the process approach, enabling students to refine their writing while simultaneously exercising critical thinking about their arguments and logic.
Critical Thinking as a Writing Enabler
Critical thinking was perceived as the catalyst that turned reading and reflection into scholarly writing. Students noted that doctoral-level writing required them to go beyond memorization and summary toward critique, synthesis and argumentation. As P1 reflected, “In my country, we were taught to memorize. Here, I have to critique the author and write my own views. That’s a big change.” P6 similarly emphasized the challenge: “At first, I just summarized papers, but later I realized I needed to compare authors and develop my own position. That made my writing stronger.” While reading provided knowledge and metacognitive strategies supported regulation, critical thinking was described as the enabling skill that allowed these elements to converge into original academic contributions. In this way, the qualitative findings reinforce the quantitative model: process-oriented writing is strengthened when reading serves as scaffolding, metacognition as regulation, and critical thinking as the mediator that transforms input into output.
Integration of Findings
Bringing together the quantitative and qualitative findings provides a richer, more in-depth understanding of how international doctoral candidates develop their academic writing skills. The statistical results confirmed the central role of critical thinking as a mediator between extensive reading, metacognitive experiences and writing performance. The interview insights added depth by illustrating what those statistical relationships mean in practice. Taken together, both strands of data paint a coherent picture in which extensive reading, metacognitive self-regulation and critical thinking operate in concert to enhance academic writing. Doctoral candidates described extensive reading not simply as an academic task, but as a way to immerse themselves in scholarly discourse, helping them internalize discipline-specific terminology, argument structures, and writing conventions. This immersion through reading provides content and exemplars that doctoral candidates can critically reflect on. Simultaneously, they used metacognitive strategies (planning their writing, monitoring their progress, revising thoughtfully, and even managing the emotional challenges of writing) as essential self-regulatory tools to keep themselves on track and continually improve their drafts. These behaviors reflect a high level of self-awareness and deliberate practice, which the doctoral candidates saw as indispensable for writing at the doctoral level.
Across both phases of the study, critical thinking consistently stood out as the skill that ties everything together. Quantitatively, it was the key mediator, and qualitatively, doctoral candidates explicitly credited critical thinking with enabling them to make sense of what they read and to transform their reflections into persuasive written arguments. For these doctoral candidates, critical thinking was the engine that drove their ability to synthesize information and generate original ideas, effectively bridging the gap between reading input and writing output. Notably, the integrated findings suggest that becoming proficient in academic writing is not merely about mastering language or grammar in isolation. It is fundamentally about developing a way of thinking. Clear reasoning, purposeful reading, and mindful self-reflection were recurrently mentioned as pillars of good writing. In other words, academic writing development is as much a cognitive and metacognitive journey as it is a linguistic one.
The convergence of evidence from the quantitative and qualitative components strengthens the study’s conclusions and carries important implications. The fact that both the survey data and the interviews highlight the same triad of influences (reading, metacognition, and critical thinking) underscores the validity of the findings through methodological triangulation. For educators and doctoral supervisors, this integration of results emphasizes that supporting doctoral candidates’ growth involves more than correcting grammar or structure. Instead, it requires fostering doctoral candidates’ development as critical, reflective thinkers. Helping doctoral candidates engage deeply with literature (to read proactively and critically), coaching them in metacognitive writing strategies (to plan, monitor, and revise their work), and explicitly encouraging critical analysis and original thought can create the conditions for significant improvements in academic writing performance. By aligning instructional practices with these interconnected facets, educational programs can better equip international doctoral candidates to succeed as scholarly writers. In summary, the mixed-methods integration provides a holistic view: extensive reading and metacognitive experiences lay the groundwork, critical thinking serves as the linchpin, and together these elements drive the advancement of academic writing skills. This comprehensive understanding is examined further in the Discussion that follows, which addresses the implications and recommendations arising from the study’s findings.
Discussion
This study examined how extensive reading and metacognitive experiences influence academic writing among international doctoral candidates, emphasizing the mediating role of critical thinking. Using a sequential explanatory mixed-methods design, both quantitative findings from PLS-SEM and qualitative interview data illuminated the central role of cognitive and reflective processes in shaping writing quality. The PLS-SEM analysis confirmed that critical thinking statistically mediates the association between reading, metacognition, and academic writing performance. This pattern suggests association rather than causation and indicates that cognitive engagement and evaluative reasoning are closely linked with stronger academic writing beyond linguistic proficiency.
Although extensive reading did not show a direct statistical relationship with writing performance, its indirect association through critical thinking was significant. This finding is consistent with research highlighting the close relationship between reading and writing development (Grabe & Zhang, 2013). Our study extends this work by showing that reading widely in one’s field is statistically associated with richer content knowledge and evaluative reasoning, which in turn support stronger writing performance. At the same time, the challenges doctoral candidates reported in transferring rhetorical patterns from reading to their own writing echo earlier findings that students often struggle to apply knowledge from texts without explicit instructional guidance (Rosenberg, 1987). This supports Grabe and Yamashita’s (2022) concept of the “reading–writing nexus,” which emphasizes the need for pedagogical support at the reading–writing interface. Practical strategies such as critical response papers, summary–analysis tasks, and literature review workshops may help learners connect reading input with writing output in more systematic ways. This implication resonates with Phyo et al. (2024), who demonstrated a strong correlation between doctoral students’ academic reading and writing abilities, highlighting the inseparable role of reading in meeting postgraduate writing demands. Our findings go further by suggesting why this link exists: it is the critical processing of reading, rather than reading per se, that elevates writing quality. This interpretation is supported by experimental research showing that structured pre-writing reading tasks are associated with improvements in essay content and structure. In short, our results reinforce the adage that good readers make good writers, while clarifying that it is critical engagement with reading that underpins high-quality doctoral writing.
Metacognitive experiences likewise emerged as a crucial ingredient in doctoral writing development. The SEM analysis showed that metacognitive self-regulation (planning, monitoring, evaluating one’s work, etc.) significantly influenced critical thinking (β = .39, p < .001) and, through it, improved writing performance. Qualitative data vividly illustrated what this statistical relationship means in practice. Doctoral candidates described an array of metacognitive strategies they use throughout the writing process: setting goals and timelines before writing, self-monitoring their progress, reflecting on adviser feedback, and revising drafts after critical self-evaluation. Notably, many of these strategies were developed informally; doctoral candidates mentioned learning to plan or revise ad hoc, often through trial and error or after experiencing setbacks. Few doctoral candidates reported receiving systematic instruction in managing the academic writing process, including planning, drafting, revising, and self-monitoring, a gap that has been repeatedly highlighted in doctoral education research (Aitchison & Lee, 2006; Paré, 2017). This points to an institutional gap: while doctoral programs demand high-level writing outputs (papers, proposals, dissertations), they seldom provide systematic training in the self-regulatory skills needed to produce those outputs.
Our findings suggest that higher metacognitive awareness is statistically associated with stronger academic writing. Doctoral candidates who actively reflected on their writing processes also reported greater use of critical thinking to address writing problems, a result consistent with research showing that metacognitive strategies such as planning, monitoring, and managing information improve writing performance (Huang & Zhang, 2022; Teng, 2021a). Advanced academic writing, therefore, is not merely about producing text but about regulating one’s own thinking while writing. Institutions can support this by offering seminars or coaching that explicitly teach metacognitive strategies, such as project planning, reflective journaling, and self-assessment. Evidence indicates that such interventions are effective; for instance, incorporating reflective prompts into assignments has been shown to improve structure and argumentation (Teng, 2021a, 2021b). Our participants echoed these findings: those who engaged in deliberate self-reflection felt more in control of their writing, while those without such strategies often reported anxiety and reliance on external feedback. Fostering metacognitive habits may support doctoral candidates in writing more independently and resiliently.
The mixed-methods nature of our study proved valuable in developing a holistic understanding of academic writing development. The quantitative model quantified the strong, mediating role of critical thinking between reading, metacognition, and writing. The qualitative interviews then unpacked how doctoral candidates experience and perceive these connections in real life. This integration revealed that academic writing skills at the doctoral level emerge from a confluence of literacy practices (extensive reading), cognitive processes (critical analysis), and self-regulatory behaviors (metacognitive strategies). Importantly, doctoral candidates themselves recognize this confluence: they spoke of writing as much more than a linguistic exercise, describing it instead as a thinking process that requires immersing oneself in literature, questioning and synthesizing ideas, and constantly evaluating one’s own understanding. This student perspective reinforces modern theoretical views of writing. Classic cognitive models of writing (e.g., Flower & Hayes, 1981) have long posited that writing is a form of problem-solving that involves planning, translating, and reviewing. Our findings add nuance to these models by emphasizing the role of extensive reading (as input for knowledge and genre awareness) and highlighting critical thinking as the catalyst that converts reading and reflection into original writing. In essence, doctoral writing development can be viewed as a recursive, dynamic cycle: reading is statistically associated with greater opportunities for critical engagement; thinking (with metacognitive oversight) is linked to effective writing; and writing, in turn, often exposes gaps that send the student back to read more or think more deeply.
This cycle aligns with the concept of academic literacy as not merely the ability to write in a vacuum, but the ability to engage in a continuous dialogue with texts, ideas, and one’s own understanding (Lea & Street, 1998; Negretti, 2012). For educators and supervisors, this means that supporting doctoral writers involves more than correcting grammar or improving sentence structure. It requires nurturing doctoral candidates’ development as critical, reflective thinkers and readers. As P8 eloquently put it, “Helping me with writing means helping me learn how to think,” a statement that encapsulates the intertwined nature of these skills.
Despite its contributions, this study has several limitations that should be acknowledged. First, the sample was restricted to international doctoral candidates enrolled in Malaysian universities. While this setting provided valuable insights into a multilingual academic environment, the findings may not be fully generalizable to local doctoral candidates or to other educational contexts. Second, the qualitative phase relied on self-reported interview data, which may be influenced by recall bias or participants’ perceptions of what constitutes good academic writing. Triangulating with additional sources, such as writing portfolios or supervisor feedback, could have strengthened the analysis. Third, although PLS-SEM allowed testing of complex relationships, the cross-sectional nature of the data limits causal interpretation. Mediation was assessed at a single time point, and thus should be understood as a statistical mediation rather than evidence of temporal or causal ordering. Future longitudinal or experimental designs are needed to establish whether reading and metacognitive strategies genuinely enhance critical thinking, which in turn strengthens academic writing performance.
Conclusion and Implications
In summary, this mixed-methods study explored the complex interplay between extensive reading, metacognitive experiences, critical thinking, and academic writing performance among international doctoral candidates. The findings converged on a clear message: neither reading abundance nor metacognitive skill directly translates into better writing without the intermediary of critical thinking. Instead, it is the doctoral student’s capacity for higher-order reasoning and critique that bridges language input and reflective practice with the production of coherent, high-quality academic writing. Quantitative analysis showed that when critical thinking was accounted for, the direct paths from extensive reading and metacognitive experience to writing performance became non-significant, indicating full mediation. Qualitative insights reinforced this mechanism: doctoral candidates described how reading extensively and regulating their writing process cultivated the cognitive space necessary for analytical writing. They emphasized that by critically engaging with what they read and by continually reflecting on their own thought processes, they were able to generate original arguments and clearer academic prose.
These integrated findings contribute to a more nuanced theoretical understanding of academic literacy development in multilingual, graduate-level contexts. Rather than viewing writing as a discrete skill or a simple matter of language proficiency, our study positions advanced academic writing as an evolving, self-regulated cognitive process. This process is grounded in critical engagement with texts and ideas, and it unfolds over time as doctoral candidates internalize scholarly norms through reading and practice. In essence, to write well at the doctoral level is to think well, to exercise judgment about sources, to see connections and gaps in the literature, and to organize one’s thoughts into a logical argument. The doctoral candidates in our study who made the greatest writing gains were those who actively read to inform their thinking and who used metacognitive strategies to plan and refine their work. Therefore, a significant conclusion is that academic writing expertise is a synthesis of literacy, cognition, and metacognition. This perspective aligns with contemporary views of doctoral education that stress the development of researchers who are not only competent in language but are independent thinkers and self-directed learners (Paré, 2017). By confirming empirically that reading and metacognitive reflection exert their influence on writing through critical thinking, our study underscores the idea that fostering better writers involves cultivating better critical thinkers and learners.
From a theoretical standpoint, this research supports an integrated model of academic literacy that bridges traditionally separate domains: reading, writing, and thinking. It provides empirical evidence that advanced academic writing should not be conceptualized purely as a language output skill, but rather as the culmination of multiple cognitive and metacognitive processes working in concert. In line with socio-cognitive theories of writing, we demonstrate that writing quality is linked to the writer’s engagement with existing knowledge (through reading) and the writer’s ability to regulate thought (through metacognition). Critical thinking emerged as the linchpin of this integration, a finding that enriches theoretical models of writing. Classic models focused on the individual writer’s cognitive stages (planning, drafting, revising); our results suggest these models can be extended by explicitly incorporating the role of reading and critical analysis as inputs to the writing process. In other words, the mental representation of the writing task for a doctoral student is heavily informed by that student’s prior reading of relevant literature and by an internal dialogue about what that literature means. Theoretical frameworks of L2 writing development (e.g., the concept of intertextual competence or writing-from-sources) also gain support from our study. We show that the ability to critically synthesize sources is paramount for successful writing, essentially confirming that doctoral writing is a form of knowledge transformation rather than mere knowledge telling.
Moreover, our findings reinforce the importance of metacognition in academic skill development, lending support to theories of self-regulated learning. The positive impact of metacognitive strategies on writing performance aligns with self-regulation models (Pintrich, 2002; Zimmerman, 2002), which argue that learners who plan, monitor, and reflect on their learning tend to achieve superior outcomes. We extend those models into the realm of doctoral writing by showing that metacognitive experiences specifically related to writing (such as reflecting on feedback or assessing one’s argument logic) play a significant role in enabling critical thinking and, by extension, writing quality. The implication is that any comprehensive theory of doctoral-level writing proficiency must account for how doctoral candidates manage their own writing process. Many academic literacy theories emphasize the social context of writing (enculturation into disciplinary communities, feedback from advisers, etc.). Our work suggests that the individual cognitive context is equally, if not more, important: how the writer processes readings, how the writer questions assumptions, and how the writer makes strategic decisions during writing. In summary, this study contributes to theory by empirically validating a holistic model of doctoral academic literacy: one that situates writing at the nexus of reading input, critical thinking throughput, and metacognitive self-regulation. Future theoretical work is encouraged to build on this integrative perspective, perhaps by developing conceptual frameworks or models that explicitly map these interdependencies. Such frameworks can guide researchers to further investigate under what conditions and for which learners these interdependencies are strongest, refining our understanding of advanced academic writing as both an intellectual and a learning process.
Practically, our findings suggest a need for rethinking how we support doctoral candidates in developing academic writing skills. Many doctoral programs traditionally focus on language mechanics or assume that doctoral candidates will “learn to write” by osmosis through writing their theses. Our study indicates that a more intentional, process-oriented approach is needed, one that cultivates critical thinking and metacognitive skills alongside writing practice. In concrete terms, this means that doctoral programs and writing centers should incorporate pedagogical strategies that engage doctoral candidates in critical reading, reflection, and discussion, as well as explicit training in self-regulated writing practices.
First, the strong link between extensive reading and writing (via critical thinking) implies that doctoral curricula should create more structured opportunities for reading-driven learning. For instance, departments might organize reading circles or seminars where doctoral candidates critically discuss recent journal articles in their field, analyzing authors’ arguments, methods, and writing styles. Such activities encourage doctoral candidates to read not just for content, but for form and reasoning, essentially training them to “read like a writer.” By articulating critiques and comparisons of what they read, doctoral candidates practice the kind of analysis they will later need to apply in writing their literature reviews or argumentative sections. This approach is supported by recent research emphasizing that reading and writing development go hand-in-hand in graduate education. Paltridge and Starfield (2019) specifically advocate targeted language support to bolster reading ability, which in turn can facilitate timely progress in dissertation writing. Our findings concur and further suggest that reading activities should be paired with follow-up writing tasks. For example, after a reading circle, doctoral candidates could be asked to write a one-page reflection connecting the discussed articles to their own research question, thereby practicing synthesis in writing. Instructors can then guide doctoral candidates on how to carry insights from source texts into the framing of their own arguments.
Second, our results highlight the importance of explicit critical thinking instruction within doctoral writing support. Critical thinking, often taken for granted as a generic graduate attribute, can be deliberately nurtured. Workshops on argumentation, logic, and evidence evaluation could be integrated into doctoral training. For instance, universities might offer short courses on “Thinking Critically in Your Literature Review” or “Developing and Supporting Your Argument,” where doctoral candidates learn strategies such as argument mapping, identifying assumptions, and evaluating conflicting findings in the literature. Empirical evidence suggests that direct instruction in critical thinking can indeed improve doctoral candidates’ writing organization and clarity. In our study, many doctoral candidates coming from educational backgrounds with less emphasis on critique (e.g., where rote learning was common) found it challenging to adopt a critical stance in writing. To address this, mentors and writing advisers can use techniques like Socratic questioning on doctoral candidates’ drafts (asking “why” and “how” to push deeper analysis) or encourage peer review exercises where doctoral candidates critique each other’s work constructively. Over time, these practices socialize doctoral candidates into a habit of mind where they automatically interrogate their own ideas and those of others. This is especially beneficial for learners who have not been previously trained to assert an academic voice or challenge sources. By building a culture of critical dialogue in doctoral programs, we help doctoral candidates internalize the value of critical thinking as part of the writing process.
Third, there are clear implications for enhancing metacognitive and self-regulatory support in doctoral writing development. Given that many doctoral candidates in our study developed strategies like planning and self-review only through personal struggle, institutions could save doctoral candidates time and frustration by teaching these skills proactively. One practical model is to embed metacognitive training into existing writing courses or doctoral research seminars. For example, instructors can introduce tools such as writing logs or reflective journals in which doctoral candidates document their writing progress and hurdles each week. These logs encourage doctoral candidates to pause and reflect on what is or is not working in their process. Another strategy is to incorporate metacognitive prompts into writing assignments. Research has shown that such prompts can increase doctoral candidates’ awareness of their writing strategies and lead to improvements in text quality. Additionally, workshops on time management, goal-setting, and coping with writing anxiety can equip doctoral candidates with techniques to plan large projects and monitor their progress. Our participants noted that learning to deal with emotional aspects (like writing anxiety or lack of motivation) was part of their metacognitive growth. Therefore, addressing the affective side of writing through discussions of writing habits and writing block remedies can be very useful. The overarching idea is to treat the development of a doctoral writer as akin to the development of a self-regulated scholar. University support programs could incorporate elements of self-regulated learning training: for example, helping doctoral candidates set specific writing goals (e.g., a number of words per day or sections per month), self-monitor through progress check-ins, and self-reflect by analyzing the feedback they receive. By making these processes explicit, we demystify academic writing.
Doctoral candidates become more aware that successful writing involves drafting and redrafting, seeking feedback, and continuously learning, and that this is normal, not a sign of personal inadequacy. Ultimately, such metacognitive skill-building can lead to more autonomy. As one outcome, we would expect to see doctoral candidates needing less last-minute intervention from supervisors to fix writing issues, because the doctoral candidates themselves catch and correct many issues early on.
Finally, our study suggests that doctoral programs should foster an environment that values process over product in writing. This does not mean lowering standards for the final dissertation or papers, but rather emphasizing the learning journey that leads to those final products. For example, departments could organize writing retreats or boot camps where doctoral candidates spend a week writing together, with scheduled periods for group discussions about challenges and strategies. Peer support mechanisms, such as writing groups or peer review pairings, can also be extremely beneficial. When doctoral candidates share their drafts and thought processes with peers, they gain insight into alternative approaches and can reflect on their own methods. Such community-based learning leverages the fact that often doctoral candidates can learn as much from each other’s experiences as from formal instruction. It also helps to break the isolation that many doctoral writers feel, providing moral support and external structure. In summary, the practical implication is a call to innovate doctoral writing pedagogy by integrating reading, critical thinking, and metacognitive training into the fabric of doctoral education. This holistic approach can better prepare doctoral candidates not only to produce a quality dissertation but to continue writing effectively as scholars, reviewers, and educators in their future careers.
Footnotes
Acknowledgements
I thank all the co-authors for their expertise and assistance throughout all aspects of our study and for their help in writing the manuscript.
Ethical Considerations
This study was conducted in accordance with ethical standards for research involving human participants. Ethical approval was obtained from the Institutional Review Board (IRB) of Universiti Putra Malaysia prior to data collection.
Consent to Participate
All participants were informed about the purpose of the study, the voluntary nature of their participation, and their right to withdraw at any time without penalty. Written informed consent was obtained from all participants before the commencement of data collection. The study ensured participants’ confidentiality, anonymity, and data protection in accordance with institutional guidelines and the Declaration of Helsinki. No identifiable personal information was collected or reported. All data were securely stored and used solely for research purposes. There were no foreseeable risks to participants, and no vulnerable populations were targeted. Participation was entirely voluntary, and no incentives or coercion were involved.
Funding
The authors received no financial support for the research, authorship, and/or publication of this article.
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Data Availability Statement
For ethical reasons, the data will not be publicly shared. The data are available upon reasonable request from the corresponding author.
