Abstract
The purpose of the present investigation was to examine the relationship between instructors’ defining characteristics (including conduct, personality traits, and past experiences of ingratitude) and student participation in online asynchronous course discussions. Data on traits and experiences were collected from 165 instructors in a variety of disciplines, and the conduct exhibited by instructors and students in mandatory discussion forums was analyzed. The amount of time an instructor spent in discussion forums was positively correlated with the amount of time students spent in the course. The instructor’s manner of relating to students was positively correlated with the number of student replies to the instructor, and the instructor’s openness to experience was related to students’ overall response rates. This distinctive pattern invites further inquiry into the unique aspects of the pedagogy of asynchronous online learning.
Introduction
According to Kuh (2001), students’ conduct in the classroom is a leading indicator of their satisfaction and achievement in school. Not surprisingly, students’ behavioral engagement in the classroom, defined as the extent to which they participate in learning activities (Fredricks, Blumenfeld, & Paris, 2004; Kuh, Kinzie, Buckley, Bridges, & Hayek, 2007), has been linked to positive academic outcomes, including higher performance, satisfaction, and perseverance (Krause & Coates, 2008; Trowler, 2010). It also has been linked to enhanced retention and persistence rates (Connell, Spencer, & Aber, 1994; Finn, 1989; Finn & Rock, 1997; Tinto, 2006). Although behavioral engagement appears to be a vital component of academic success in both online and traditional courses, an issue still being debated is the extent to which instructor conduct and characteristics may differentially contribute to students’ behavioral engagement in the multitude of online environments offered by synchronous (i.e., requiring students and instructors to be online at the same time), asynchronous (i.e., not requiring real-time interactions), and hybrid (i.e., involving both online and face-to-face interactions) course components.
As e-learning programs work toward discipline-specific accreditation and accountability, institutions have struggled to develop guidelines and standards to ensure both quality and consistency of curriculum across multiple sections of the same course, regardless of the instructor assigned to any specific section. Consequently, many institutions have often relied on course developers and subject-matter experts to create structured course elements that not only ensure quality and standardization of curriculum but also can promote engagement (Boston, Ice, & Gibson, 2011; Legon, 2006; Shelton, 2011). In such courses, the instructor is seen as an interaction facilitator and mediator of content whose primary responsibility is to ensure that the curriculum is absorbed so that key information can be acquired and selected skills developed and practiced (Heerema & Rogers, 2002; Morris, Xu, & Finnegan, 2005). An unintended consequence of this approach to online instruction is the tacit assumption that student engagement in online programs is mostly the result of the overall organization and prearranged presentation of course materials (Margaryan, Bianco, & Littlejohn, 2015), and that unique instructor behaviors and characteristics can be discounted as negligible factors. Yet, in conventional face-to-face classrooms, evidence exists that student–instructor interactions are related to student satisfaction and productivity (Astin, 1984). Specifically, student engagement in the face-to-face classroom has been found to be related to instructors’ in-classroom conduct, such as collaborative learning techniques and attitudes toward a higher level of academic rigor (Umbach & Wawrzynski, 2005). In the online classroom, evidence exists that real-time (i.e., synchronous) interactions with instructors in discussion forums are related to student satisfaction and participation (McBrien, Jones, & Cheng, 2009).
However, in the asynchronous online environment, where students can access class materials at any time, questions exist on the nature of student–instructor interactions and their potential effect on engagement (Hew, Cheung, & Ng, 2010). The reason behind these lingering questions is that the asynchronous environment relies heavily on delayed rather than instantaneous social exchanges. As such, online education departs from the conventional face-to-face classroom, forcing a redefinition of one of the main tenets of instruction, the instructor’s responsiveness, which traditionally has involved not only the content of interactions but also their timing (Comer & Lenaghan, 2013; Dennen, Darabi, & Smith, 2007).
It is important to note that different views exist of the roles of online and face-to-face instructors because online learning tends to be more self-directed (Rabe-Hemp, Woollen, & Humiston, 2009). Whereas the instructor’s primary responsibility in the face-to-face classroom is to deliver his or her knowledge to a receptive student learner (Kahu, Stephens, Leach, & Zepke, 2013), online instructors are largely seen as interaction facilitators and mediators of content who help students interpret and apply information that is acquired from available course materials (Hardin, 2004; Heerema & Rogers, 2002; Morris et al., 2005; Shi, 2010). Thus, online instruction resembles that of independent studies traditionally offered on campus, with the exceptions that the number of students is generally greater than one and that individuated instruction is unlikely. Baran, Correia, and Thompson (2011) have noted that there are unique roles and competencies which must be fulfilled by online instructors, but that the definition of these roles and competencies has yet to achieve consensus. It is also worth noting that online courses can be entirely asynchronous or can include a few synchronous elements, such as real-time discussions (i.e., live chats) or live lectures. Advantages of synchronous components are that they foster spontaneity, collective brainstorming, and community building (Johnson, 2006). One of the presumed barriers in an asynchronous online course is the lack of live interaction with an instructor or classmates. In fact, contrary to online courses with synchronous components, including live lectures, examinations, or discussions, in asynchronous courses, students complete work at their own convenience and post it to a classroom forum or assignment drop box by a required due date. As a result, the style of engagement demonstrated by both instructors and students in asynchronous course environments is rather unique.
It follows that participation is more difficult to measure in asynchronous courses, where students are not discussing a topic in real time and may not return to the classroom to provide immediate feedback (Johnson, 2006).
Duncan, Kenworthy, and McNamara (2012) compared synchronous chat boards with asynchronous discussion forums in an online MBA course by measuring the quantity (i.e., number of student posts) and quality (i.e., ratings of posts based on Bloom’s taxonomy; Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956) of student engagement. They found that the quality of student engagement in synchronous elements of an online course was more predictive of students’ exam and course grades than engagement in asynchronous course elements. The pattern uncovered by Duncan et al. may be reflective of the ability of instructors to encourage critical thinking in the synchronous delivery mode more so than in the asynchronous mode. While the results of this study demonstrate the importance of engagement in online forums as it relates to overall course grades, they do not specifically address the role of engagement between instructors and students in the asynchronous mode. Understanding the role of the individual instructor and assessing the quantity and quality of interactions between instructor and student (in addition to engagement with the rest of the class within discussion forums) in asynchronous environments may provide unique insights into the pedagogy of such environments.
A different sample of courses within the undergraduate curriculum could also help ascertain the generality of the findings uncovered by Duncan et al. (2012). In fact, evidence exists that asynchronous course elements, such as discussion forums that lack temporal concurrence, can produce higher student engagement than synchronous learning. Petty and Farinde (2013), for instance, analyzed different aspects of engagement within both synchronous and asynchronous elements of math course discussions based on their definition of engagement as students’ efforts to study a subject matter, practice, receive feedback, analyze, and solve problems (Kuh, 2003). Evidence from their analyses suggests that online asynchronous discussions enhance student engagement more effectively than face-to-face discussions. Although the limitations of this study (e.g., the small sample size and lack of diverse curricula) advise caution in generalizing these findings to the asynchronous learning mode as a whole, the advantage of asynchronous discussions may be the opportunity for active reflection and analysis that the asynchronous mode invites prior to the action of producing a response (Petty & Farinde, 2013). In fact, in a live format, there is little time to formulate a critical response, and learning is based on immediate reactions to the material and ongoing discussion, rather than on a more reflective consideration of the available information. Furthermore, because secondary school math students were the participants in Petty and Farinde (2013), the results may not adequately reflect the attitudes toward participation held by students taking other courses and adult learners who may face unique time constraints and have needs that are different from those of younger student learners (Karge, Phillips, Jessee, & McCabe, 2011).
For instance, engagement of adult learners is known to rely more heavily on participation in discussions and self-motivating activities, such as the sharing of personal and professional experiences that are both practical and relevant (Karge et al., 2011).
The task of assessing engagement in the asynchronous classroom can be made even more challenging by the use of instructor-personalized interactive technology. The study of Mandernach (2009) provides a useful illustration. In the study, students self-reported different aspects of their engagement, including behavioral engagement, and learning was assessed using final exam scores and cumulative course grades. While students reported that the interactive multimedia developed by the individual instructor promoted their engagement (e.g., they perceived in-class interactions more favorably), quantitative analyses demonstrated that student engagement remained focused on performance measures (i.e., involvement appraised by course grades). Yet, it is unclear whether these results depend on the types of courses assessed (e.g., many were general education courses) or reflect self-report biases.
The complexity of the assessment of engagement in the asynchronous mode is further illustrated by the inconsistencies of the available findings even within individual studies. For instance, Mazzolini and Maddison (2007) investigated the role of instructor participation (as measured by the number of postings made) and its relation to both student participation (as measured by the number and length of postings made) and course evaluation (as assessed by end-of-course surveys) in online discussion forums. They found that the number of instructor posts was negatively correlated with the length and number of student posts in discussion forums, and that the number of instructor posts was only moderately correlated with enthusiasm and satisfaction with the course as reported in student evaluations. Furthermore, the timing of instructor posts, either during or at the end of a forum, showed no significant correlation with the number of student posts in discussion forums. According to the authors, this pattern of results may illustrate the instructors’ ability to lead class discussions efficiently and prevent students from going astray. Yet, alternative explanations cannot be easily discounted. For instance, it is difficult to rule out self-justification biases not only in student self-report measures of quality and satisfaction (Keengwe, Diteeyont, & Lawson-Body, 2012) but also in instructors’ self-perceptions of conduct that does not align with the overall quality of the selected discussion forums (Mazzolini & Maddison, 2007). Indeed, while many instructors self-reported that they asked follow-up questions encouraging critical thinking in discussion forums, the analyses demonstrated that most posts by instructors were answers to questions rather than follow-up questions.
Of course, the results of this study may be merely a byproduct of its sample, which comprised a homogeneous group of instructors and students in a specialized educational program with high overall reports of satisfaction. Thus, more information is clearly needed regarding the relationship between instructors’ didactic actions and student conduct in courses from a variety of disciplines, perhaps with a more generalizable student sample, to better understand the role of the instructor as moderator.
Although the role of engagement in the online classroom, both synchronous and asynchronous, remains a hotly debated topic in the education literature (see Duncan et al., 2012; Mandernach, 2009; Petty & Farinde, 2013), a rather unknown element of engagement in asynchronous discussion forums of quality-assured curricula remains the contribution of the individual instructor. In a qualitative review of instructor conduct in online discussion forums, irrespective of mode of instruction (i.e., asynchronous vs. synchronous), Shepherd and Alpert (2011) noted that instructors who adopted a less formal tone of communication with students (e.g., using students’ first names and incorporating personal experiences and information in introductions and feedback) and were more active in discussion forums were rated as effective instructors in student evaluations. In addition to conduct, another important factor might be the instructor’s past teaching experiences and his or her personality characteristics. For instance, Conway and Peetz (2012) reported that past experiences of ingratitude can influence one’s willingness to further engage in helpful behavior, whereas Roth and Weinstock (2013) noted that a key aspect of teaching is the instructor’s willingness to assist students, thereby making past experiences of ingratitude a potentially critical component of an instructor’s interactions with students. Furthermore, Patrick (2011) reported a relationship between an instructor’s personality traits as measured by the five-factor model (i.e., openness, conscientiousness, extraversion, agreeableness, and neuroticism; McCrae & Costa, 2008) and student evaluations of the instructor’s teaching ability in traditional face-to-face college courses. 
However, these findings do not address whether personality characteristics matter as much within asynchronous learning environments where key aspects of the curriculum, presentation, and instruction not only are uniform across sections of the same course (Boston et al., 2011; Legon, 2006; Shelton, 2011) but also have been reviewed and judged by independent subject matter and education experts as promoting learning (i.e., quality-assured courses; Coffman & Klinger, 2013). Additionally, the literature on instructor characteristics has often heavily relied on measures of satisfaction and student outcomes, rather than on the objective assessment of interactions and student engagement within the confines of online courses.
Overall, the findings discussed earlier exemplify the conundrum faced by assessment analysts of online asynchronous courses. Namely, the role of the individual instructor in promoting engagement within asynchronous learning environments remains unclear. Research on instructor conduct has provided limited samples of data that may not generalize to adult learners who represent a large constituency of the online environment. Research has also heavily relied on self-report measures, as well as quantity and quality of all posts (i.e., exchanges between or among students and exchanges between students and the instructor), rather than posts which are directed to and stem from the instructor. Thus, the main purpose of the current study was to assess the extent to which an instructor’s conduct and characteristics in online asynchronous classrooms that have met quality assurance standards (Legon, 2006) may be related to student behavioral engagement. For this study, instructor characteristics included not only personality traits, such as extraversion, agreeableness, conscientiousness, neuroticism, and openness to experience (see the five-factor model; McCrae & Costa, 2008), but also self-reports of past experiences of ingratitude when helping students. Thus, besides dispositions such as personality traits traditionally used to describe individual differences, a particular type of antecedent experience was isolated from the multitude of instructors’ life experiences to determine the extent to which it could describe individual differences in behavior specifically observed within the online asynchronous classroom.
The selection of both personality traits and past experiences of ingratitude when helping students was justified by the absence of clear-cut evidence concerning their link to individual differences in behavior within the online, asynchronous, quality-assured classroom as well as evidence from other classroom environments predicting the presence of such a link. In fact, although instructors’ personality traits (e.g., openness to experience and conscientiousness) had been reported as related to participation and satisfaction in young students utilizing face-to-face or synchronous course elements (see Patrick, 2011), it was unclear whether they matter in an asynchronous, quality-assured environment of adult learners. Furthermore, although the effect of instructors’ past experiences of ingratitude on their conduct and on the conduct of their students had been neglected as a research topic, findings suggested that past experiences of ingratitude can influence a person’s willingness to engage in helpful behavior (Conway & Peetz, 2012; Emmons & McCullough, 2003) and that a key aspect of teaching is the instructor’s willingness to assist students (Bischoff, 2000; Roth & Weinstock, 2013). Thus, both personality characteristics and past experiences of ingratitude were expected to account for instructors’ interactions with students. Furthermore, varied evidence of a link between instructors’ presence (i.e., visibility) in both online and traditional classrooms and student engagement (see Gunawardena & Zittle, 1997; Picciano, 2002; Richardson & Swan, 2003; Schutt, Allen, & Laumakis, 2009) suggested that both instructor conduct and characteristics could predict behavioral engagement (i.e., participation) of students in the asynchronous, quality-assured classroom.
As a result, instructors’ time spent in discussion forums and frequency of their posts (i.e., indices of visibility) were expected to be related to behavioral measures of student engagement (e.g., time spent in class and frequency of discussion forum posts). Instructors’ conduct and characteristics were also expected to be correlated with each other if dispositions translate into measurable aspects of observable behaviors in the asynchronous, quality-assured classroom. Finally, time spent in class and frequency of discussion forum posts by students (both overall and directed to the instructor) were hypothesized to be correlated to one another as they both reflected behavioral engagement. The methodology described below tests these predictions in asynchronous online classes that encompass a variety of disciplines and involve mostly adult learners.
Method
Participants
One hundred sixty-five instructors (39 men, 126 women) participated in the study (M age = 41.5 years, SD = 10.3). Their mean overall teaching experience was 7.1 years (range = 1–35 years), whereas their mean online teaching experience was 4.3 years (range = 1–18 years). Fifty-six participants did not report this information. All participants taught e-courses in programs covering a variety of academic subjects, including biological, health, behavioral, and social sciences. Students were by and large adult learners (i.e., 25 years of age or older) from across the United States (as per demographic information provided by Institutional Research).
Procedure
Four hundred twenty-seven instructors who had completed at least one asynchronous course over a 6-month period at Ashford University received an invitation to participate in the study via e-mail. Participants were directed to a survey website where they were asked to review and sign an informed consent form. In it, each instructor agreed to make available to the researchers one archived course, to complete a personality questionnaire based on the five-factor model (McCrae & Costa, 2008), and to estimate the frequency of past experiences of ingratitude (i.e., being the recipient of a negative outcome following a helpful deed) related to teaching.
Materials and Measures
For each instructor, an archived, entirely asynchronous, e-College course was randomly selected for the study from all the asynchronous courses the instructor had taught during the designated time frame. The average size of selected classes was 19.1 students (SD = 6.7; range = 3–32). The standards specified by Quality Matters defined the curriculum, including materials, presentation, and activities (see Legon, 2006; Willis, 1994; Zygouris-Coe, Swan, & Ireland, 2009). Thus, the structural frame of each section of a course was designed to include the following weekly instructor responsibilities: offer a lecture in the form of a document or video, serve as an interactive facilitator in discussion forums, and provide feedback to student discussion posts and written assignments. An additional responsibility during the first week of each class was to respond to students’ personal introductions in a forum specifically devoted to this purpose.
Instructors’ conduct in the classroom was divided into two categories: visibility (a behavioral measure of engagement) and mode of engagement. Visibility was operationally defined as the frequency of the instructor’s responses in discussion forums (i.e., response rate) and as the amount of time spent in the classroom, including discussion forums and lecture sections. Mode of engagement was indexed by (a) the quality of the instructor’s feedback to student posts in discussion forums and (b) the manner of relating to students. Three criteria were utilized to appraise the quality of the feedback offered by an instructor in discussion forums: evaluation of work done, suggestions or instructions, and helpful tone. Each criterion was measured in the scholarly discussion forums regularly offered each week and evaluated on a scale from 0 to 100. The assessment of the instructor’s manner of relating to students was based on four criteria: instructor’s tone of greetings and closing statements, acknowledgment of specific comments made by students, addition of information or questions to students’ posts, and timely replies to student posts (i.e., generally acknowledged to be within 24 hours). Again, each criterion was evaluated on a scale from 0 to 100. For this measure, however, only the opening forum devoted to personal introductions, which was offered during the first week of every class, was utilized, whereas for the other measures discussed earlier, scholarly discussion forums were used. Because introductions are one of the few instances of students’ work not based entirely on scholarly information but rather on more personal details, introductory posts were thought to reveal more clearly the instructor’s unique approach to students than posts in all scholarly discussion forums. Furthermore, to ensure validity and reliability, the assessment of manner of relating to students and quality of feedback was performed by independent raters. 
Prior to assessment, content validity was established by asking raters to develop an agreed-upon operational definition of each measurement based on descriptions of engagement available in the peer-reviewed literature. Raters also completed a brief calibration practice session which included discussions of the selected operational definitions and their application to concrete examples to achieve interrater agreement scores (i.e., Spearman rank correlation coefficients) ≥ .90. For each instructor, a composite score describing either quality of feedback or manner of relating to students was created as the average of the values obtained by the instructor on the constituent criteria.
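As a concrete illustration of the interrater-agreement criterion described above, the following Python sketch computes a Spearman rank correlation between two raters’ scores on a 0–100 criterion. The rater labels and score values are invented for illustration; they are not the study’s data.

```python
# Invented example: two raters score six instructors on one 0-100 criterion.
# Agreement is appraised with a Spearman rank correlation (acceptable >= .90).

def ranks(xs):
    """Average ranks (1-based); tied values share the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def pearson(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman rho: Pearson correlation computed on the ranks."""
    return pearson(ranks(x), ranks(y))

rater_a = [85, 70, 90, 60, 75, 95]  # invented scores
rater_b = [80, 72, 88, 65, 70, 97]
print(round(spearman(rater_a, rater_b), 2))  # prints 0.94, above the .90 bar
```

In practice a statistical package would be used (e.g., `scipy.stats.spearmanr`); the hand-rolled version above simply makes the rank-then-correlate logic explicit.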
Instructors’ characteristics included personality traits (i.e., extraversion, agreeableness, conscientiousness, neuroticism, and openness to experience), all measured on a 5-point Likert scale through the Big-Five Inventory (see John & Srivastava, 1999; McCrae & Costa, 2008; mean coefficient alpha reliability = .83), and a self-reported estimate of past experiences of ingratitude related to helping students (i.e., being the recipient of a negative action following a helpful deed), measured on a 7-point Likert scale reflecting frequency of occurrence.
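For readers unfamiliar with the coefficient alpha reliability reported above, the sketch below computes Cronbach’s alpha from raw item responses. The five respondents and four Likert items are invented for illustration and are not the study’s data.

```python
# Invented example: five respondents answer four 5-point Likert items of one
# scale. Cronbach's alpha = k/(k-1) * (1 - sum(item variances)/total variance).

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(responses):
    """responses: list of respondents, each a list of k item scores."""
    k = len(responses[0])
    items = list(zip(*responses))         # transpose to per-item columns
    totals = [sum(r) for r in responses]  # each respondent's scale total
    item_var = sum(variance(list(col)) for col in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

data = [  # invented respondent-by-item matrix
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
]
print(round(cronbach_alpha(data), 2))  # prints 0.91 for these invented data
```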
In addition to information about instructors’ characteristics and conduct, quantitative measures of students’ participation, serving as indices of behavioral engagement, were collected in the selected archived classes. These measures consisted of students’ frequency of response (i.e., response rate), time spent in the classroom (including discussion forums and weekly guidance documents), and frequency of replies to instructors’ comments in discussion forums (serving as a measure of engagement with instructor). No information identifying individual students was obtained during the data collection. All measures referred to student conduct per week.
Results and Discussion
Descriptive Statistics.
Note. Measures marked with an asterisk (*) are operationally defined as follows: Weekly response rates refer to the average number of posts per student in forums. Weekly engagement with instructor refers to the average percentage of times students responded to the instructor’s comments in discussion forums. Weekly time spent in class refers to the average number of minutes spent in class per student. However, time spent in class for faculty includes discussion forums, whereas for students it includes discussion forums and lecture sections. Other sections of the online classroom were not included because the online time was either negligible or highly unreliable as evidence of presence in the classroom. For instance, the section devoted to submission and review of written assignments was excluded because paper reviews were performed online by some students and instructors and offline by others, making time spent reviewing papers an unreliable measure of both instructors’ visibility and student participation.
Significant Pearson’s Correlation Coefficients.
Note. All correlations are significant at the .05 level (two-tailed).
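As background on the .05 significance criterion stated in the note above, the significance of a Pearson r is typically evaluated by converting it to a t statistic with n − 2 degrees of freedom. The values of r and n below are invented for illustration, not the study’s results.

```python
# Invented example: test H0 that the population correlation is zero.
# t = r * sqrt((n - 2) / (1 - r^2)), with df = n - 2.

import math

def t_from_r(r, n):
    """t statistic for testing a Pearson r against zero (df = n - 2)."""
    return r * math.sqrt((n - 2) / (1 - r * r))

# e.g., a correlation of r = .30 across 45 hypothetical classes
print(round(t_from_r(0.30, 45), 2))  # prints 2.06
```

For df = 43, the two-tailed .05 critical value of t is roughly 2.02, so an r of .30 in a sample of 45 would be judged significant under this criterion.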
The results of the present study can be summarized in three main points. First, a pattern of relationships was uncovered between specific measures of an instructor’s conduct and characteristics and student participation. As expected, the more time the instructor spent in the discussion forums, the more time students spent in class. Similarly, the instructor’s manner of relating to students was positively associated with how frequently students responded directly to that instructor in discussion forums. Yet, relationships between the characteristics of an instructor and student participation were limited to the instructor’s ratings of openness to experience, which were accompanied by increased rates of posting in discussion forums by students.
Second, aspects of the conduct or characteristics of instructors exhibited a revealing pattern of intrapersonal relationships. For instance, the higher an instructor’s rating on manner of relating to students, the more time the instructor spent in discussion forums and the greater the instructor’s openness to experience. As expected, the more time the instructor spent in the classroom, the higher the instructor’s response rate. Important to note is that the instructor’s weekly response rate and time spent in the classroom (i.e., conduct) were negatively correlated with class size, indicating that the more students in a class, the less visible the instructor was. Thus, visibility can be considered a diminishing resource as the number of students increases rather than a resource that instructors replenish to counteract growing class size. Yet, some instructor characteristics were negatively correlated with each other. For example, as estimates of past experiences of ingratitude connected with helping students increased, instructors’ openness to experience and extraversion decreased.
Third, a pattern of relationships between measures of student participation was found. Specifically, both frequency of response (i.e., response rate) and time spent in the classroom were positively correlated with frequency of replies to instructor comments in discussion forums.
What is the relevance of these three patterns of correlations for instruction in the online asynchronous classroom? Consider, for instance, that a specific facet of the many characteristics that an instructor may exhibit (i.e., openness to experience) was found to be positively associated with students’ response rates. This relationship may be clarified by another key finding, concerning the positive association between the instructor’s openness to experience and his or her manner of relating to students. Compared to other personality characteristics, openness to experience may be more visible to students because it reflects the instructor’s intellectual curiosity, imagination, and attentiveness to others’ feelings (Costa & McCrae, 1992). It is reasonable to assume that this disposition is likely to lead to inquisitive, encouraging, and creative comments in discussion forums, as demonstrated by the positive correlation between manner of relating to students and openness to experience. The style of the instructor’s comments may foster a learning environment whereby students seek to interact with the instructor, as demonstrated by the positive correlation between manner of relating to students and frequency of direct replies to the instructor. The instructor’s manner of relating to students may be salient in introductory discussion forums because of the unique and more personal nature of the content of these forums. As a result, his or her individual characteristics and tone can not only be noticed but also receive measurable reactions from the students. A pattern of mutual influences may involve the instructor being encouraged by student responses to his or her posts to maintain interactions of high quality. 
The disposition of the instructor to be open to experience may also foster a more global change in a learning environment whereby students are active in discussion forums, responding frequently to other class members, as demonstrated by the positive correlation between openness to experience and student response rates.
The link between manner of relating to students and frequency of direct replies to the instructor (i.e., weekly engagement with the instructor) is consistent with social agency theory (Mayer, 2009; McLaren, DeLeeuw, & Mayer, 2011; Yu, Tian, Vogel, & Kwok, 2010), according to which cues given by the instructor in the online classroom may prime students to view learning as an engaged partnership between them and the instructor. It is important to note, however, that the theory implies not only emotional or relational engagement, a type of student engagement that we did not directly examine, but also a cause-effect relationship that we could not directly test. Of course, conjectures involving cause-effect relationships are tempting, and consistency with the existing literature (including findings and theories) makes them viable hypotheses for further research. For instance, the fact that an instructor’s past experiences of ingratitude when helping students were negatively correlated with two personality characteristics (i.e., openness to experience and extraversion) may merely reflect a cluster of traits that defines a disposition toward cautiousness. Indeed, if such experiences are conceptualized by the person who has suffered them as warnings against potential threats, with the passage of time they may have discouraged expressions of extraversion as well as curiosity, experimentation, and willingness to deviate from the status quo (i.e., openness to experience; see Schaller & Murray, 2008). Yet, the fact that past experiences of ingratitude did not predict conduct in the classroom may be due to the norms of the profession, including specific mentoring roles attributed to the online instructor, that dictate unselfishness and empathy in pedagogy (Rowley, 1999).
If so, it is reasonable to expect an instructor’s past experiences to translate into observable actions in other settings where social constraints related to one’s professional identity are less influential.
Another set of key findings involves relationships between aspects of student participation. Not surprisingly, students’ weekly engagement with the instructor (i.e., responses directed to the instructor in discussion forums) was significantly correlated with their time spent in class discussions. If one considers that students’ direct replies to the instructor were also positively correlated with his or her manner of relating to students, and that time spent in the classroom by students increased with the time spent in the classroom by the instructor, the existence of a pattern of mutual positive influences seems to be further corroborated. Namely, students may be increasingly motivated to participate in course elements when the instructor’s engagement with students in the course is substantial, and the instructor may be increasingly engaged with his or her students when students are active in the classroom. Alternatively, the data may reflect a pattern whereby students merely model their behavior after the conduct of the instructor; but in a correlational study such as ours, no cause-effect relationships can be asserted with certainty.
The visibility of an instructor is not a novel construct. Short, Williams, and Christie (1976) defined visibility as the salience of a person (e.g., the instructor) and the resulting salience of the interpersonal relationship with others (e.g., students). Relying on this definition, Schutt et al. (2009) found that instructors’ immediacy behaviors were related to students’ perceptions of an instructor’s social presence in the online synchronous mode. Immediacy behaviors are those that reduce the distance between instructors and students, which in the present study encompassed the dimension manner of relating to students (i.e., the instructor’s tone of greetings and closing statements, acknowledgment of specific comments made by students, addition of information or questions to student posts, and timely replies to student posts). Yet, it is important to note that in this study, visibility was indexed by weekly response rate and weekly time in discussion forums. Consistent with the notion that the mere number of an instructor’s responses may not matter much when student engagement is measured (Mazzolini & Madison, 2007), we found that not only the instructor’s availability (as measured by the amount of time spent in class) but also the style of his or her responses (as defined by manner of relating to students), rather than the sheer quantity of those responses, was relevant when specific aspects of student participation were measured (i.e., time in class and frequency of responses to the instructor).
Although qualitative assessments of instructor conduct have been performed in other studies (see Nandi, Hamilton, & Harland, 2012), such studies have often relied heavily on students’ self-reports and have focused on measures of learning rather than on engagement observed through participation (Wu & Hiltz, 2004). However, when both visibility and mode of engagement are considered, as in the current study, it becomes clear that what the instructor does in the classroom matters, even in the asynchronous mode. The current study, which involved a diverse population of adult learners across a variety of disciplines, supported the generality of this conclusion. Indeed, the students in the sample were not only diverse in their pursuits of learning but also representative of the geographically varied groups of adult learners who typically attend predominantly online learning institutions (as per institutional research data). The need to use courses from multiple disciplines for an accurate assessment of the relationship between a specific instructor’s features and student participation has been noted in research relying on self-report measures (see Young & Bruce, 2011). Yet, the measures of participation utilized in the current study, along with the sample of courses and students selected, addressed unique features of this relationship not yet scrutinized. Of course, such measures are coarse indices whose operational definitions may require retooling to acknowledge the complexities of engagement (including its behavioral, cognitive, and affective-relational dimensions) in asynchronous, quality-assured classrooms, as well as the corresponding demand for reliable and valid assessment tools for all of these dimensions. For instance, there is a need to better understand other facets of student engagement in the online classroom, such as affective-relational engagement.
Although student performance (e.g., grades) is the likely product of effective instructional practices, performance is supported by social exchanges. Thus, attention to such exchanges in the context of quality-assured, asynchronous classrooms is essential to understanding the optimal conditions for learning in the ever-expanding world of distance education.
Although a pattern of unique relationships was detected between specific characteristics and actions of instructors and measures of student participation, the magnitude of these relationships was small. It is reasonable to assume that the constraints of quality-assured curricula and the norms of the profession dictate standards of conduct that conceal unique features of an instructor’s character and conduct from students. Namely, because the curriculum and certain aspects of instructors’ roles are set by quality assurance standards, instructors tend to follow these prescriptions closely, regardless of their preferences and dispositions. Despite these constraints, significant relationships were found, indicating that a personal touch can be visible to students even in quality-assured, asynchronous online courses where responses are unlikely to be immediate and conduct by both instructors and students is highly structured.
The weak correlation between time spent in class by students and time spent in class by instructors, both measures of behavioral engagement, may be due not only to the accuracy with which computed time reflects actual time on task (i.e., engaged time) in the online classroom but also to the complexities of the learning process. In fact, it is important to recognize that although brief moments of inactivity stop the time counter in the online asynchronous classroom, measurements of presence in the classroom for either students or instructors are mere approximations of time on task (i.e., engaged time). According to Carroll (1963), each student’s time on task is related to the extent to which the student possesses the requisite knowledge and skills to learn the curriculum of a given class, to the quality of the instruction in the class, and to the time available. It is reasonable to assume that an instructor’s time on task is likewise related to available time, to his or her preparation (including knowledge and pedagogical skills), and to the requirements of the instruction to be delivered. Thus, the weak correlation between students’ and instructors’ time in class may be the byproduct of factors such as individual differences in available time or in the time needed to learn (for individual students) or teach (for individual instructors), which define each person’s time on task in the online classroom and can thus modulate the relationships between measures of visibility (such as time spent in the online classroom) collected from students and instructors.
The relationships uncovered in this study, seen in the context of previous research findings, highlight the limitations of the current study and thus the need for further targeted research on engagement in asynchronous, quality-controlled courses. For instance, an instructor’s conduct and characteristics are just some of the many factors related to students’ participation in the classroom. Global and specific characteristics of the asynchronous classroom environment may also modulate the relationship between students and instructor, and thus the level and quality of engagement on each side. Of course, the relationships uncovered in the present study were produced by a convenience sample of instructors who volunteered to participate. Convenience sampling raises the question of whether respondents might have been a biased sample of mostly good instructors or of instructors with some yet-to-be-identified unique characteristics. Although we were unable to conduct a systematic comparison of respondents and nonrespondents on a variety of potentially relevant factors, three performance criteria (i.e., fostering critical thinking, instructive feedback to students, and instructor expertise), each measured on a 5-point scale, were applied to a randomly selected subsample of 25 instructors. Mean ratings on each criterion ranged from 2.5 to 4.5 (interrater reliability: ≥.90), thereby illustrating that the sample, albeit one of convenience, was diverse on performance measures.
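The study does not specify which statistic was used to index interrater reliability on the 5-point performance criteria. As an illustration only, with hypothetical ratings from two raters (the data below are invented, not the study’s), one common choice is the Pearson product-moment correlation between the raters’ scores:

```python
# Illustrative sketch: Pearson correlation as one possible interrater
# reliability index. The rater scores below are hypothetical (1-5 scale,
# one score per instructor in a small rated subsample).
rater_a = [4, 5, 3, 4, 2, 5, 4, 3, 5, 4]
rater_b = [4, 4, 3, 4, 2, 5, 4, 3, 5, 4]

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

r = pearson_r(rater_a, rater_b)
print(round(r, 2))  # prints 0.95
```

Other indices (e.g., an intraclass correlation or percentage agreement) would be equally plausible readings of the reported “≥.90” figure; the sketch simply shows how such a coefficient is computed.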
Furthermore, there is a need to better understand the extent to which the goodness of fit between the characteristics of students and instructors is relevant to both the asynchronous and synchronous online educational modes. In fact, several researchers (see Connor, Morrison, & Petrella, 2004) have noted that only an individualized approach that fits the quality and quantity of instruction to the unique characteristics and needs of the learner can be expected to be effective. Thus, if the success of each individual learner is the goal of instruction, the absence of an examination of the interaction between instructional practices and specific student characteristics is another limitation of the current study that future research can rectify. Similarly, the present study highlights the need for agreed-upon operational definitions of the constructs relevant to the activities performed by students and instructors in the online classroom, so that models of the factors that define effective instruction in asynchronous and synchronous classrooms can be developed to inform targeted interventions before students experience lack of commitment and performance failure.
Notwithstanding these limitations, the findings of the present study add to the body of research on asynchronous learning environments, as they set a direction for research that may lead to administrative changes. For instance, a notable finding was the significant negative correlation between class size and measures of the instructor’s visibility. While most instructors are likely to find this outcome unsurprising, it lends support to a reversal of administrative policies that have increased class sizes in order to cut costs and reach more students.
Footnotes
Acknowledgment
All authors have contributed equally to the research.
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was supported by a grant to the first author by Ashford University’s University Fellows Program.
