About the Author(s)


Matumelo D. Kola
Department of Human Resource Management, College of Economic and Management Sciences, University of South Africa, Pretoria, South Africa

Patrick Ngulube
School of Interdisciplinary Research and Graduate Studies, College of Graduate Studies, University of South Africa, Pretoria, South Africa

Tebogo K. Molotsi
Department of Human Resource Management, College of Economic and Management Sciences, University of South Africa, Pretoria, South Africa

Citation


Kola, M.D., Ngulube, P., & Molotsi, T.K. (2025). Validating the job structure of online facilitators in higher education: Task identity and feedback challenges. SA Journal of Human Resource Management/SA Tydskrif vir Menslikehulpbronbestuur, 23(0), a3112. https://doi.org/10.4102/sajhrm.v23i0.3112

Original Research

Validating the job structure of online facilitators in higher education: Task identity and feedback challenges

Matumelo D. Kola, Patrick Ngulube, Tebogo K. Molotsi

Received: 07 May 2025; Accepted: 08 July 2025; Published: 30 Sept. 2025

Copyright: © 2025. The Author(s). Licensee: AOSIS.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Orientation: Technology has transformed the design of job structures and work practices in the higher education sector. However, transforming a job structure is complicated, and little is understood about the related job dimensions.

Research purpose: This study examined the job structure of online facilitators working in higher education institutions using the learning management systems as their virtual workspace.

Motivation for the study: The study was conducted at a South African higher learning institution employing a comprehensive open distance and e-learning approach to validate the job structure.

Research approach/design and method: A convergent mixed-method approach was used to validate job dimensions within the job structure. Data were collected from 236 participants for quantitative analysis, and a selection of 30 open-ended responses was used for qualitative purposes. Confirmatory factor analysis was conducted to analyse and confirm the validity of the job structure. A thematic analysis was used to corroborate and integrate the results for convergence.

Main findings: The integrated results validated the core dimensions within the job structure; however, task identity and feedback challenges were identified, which led to online facilitators having limited knowledge about their complete work output and limited opportunity to improve job performance.

Practical/managerial implications: The challenges hindered effective knowledge transfer and suggested that the online facilitators’ motivation might be at stake. Their job structure needs to be redesigned to achieve job enrichment to enable them to facilitate effectively.

Contribution/value-add: This study closes the knowledge gap by validating online facilitators’ job structure. Practical insights about effective job performance feedback are laid out with clear interventions to preserve task identity in virtual work settings within higher education.

Keywords: online facilitation; learning management systems; job design; higher education; convergent mixed-method research.

Introduction

Technology has transformed the educational landscape and the structure of jobs within the sector (Pandit & Agrawal, 2022). Learning management systems (LMSs) were introduced as a web-based technology used to facilitate learning virtually to support students in distance education and improve successful course completion (Camilleri & Camilleri, 2022; Simmons et al. 2023). The LMS, therefore, enables online facilitators in higher education institutions (HEIs) to create and deliver educational content, monitor student participation and assess student performance virtually (Martin et al., 2019). This educational transformation allows students to access a borderless education and overcome geographical boundaries (Camilleri & Camilleri, 2022; Kulikowski et al., 2022). Despite this significant milestone, less is known about the concomitant job dimensions of online facilitators, whose job structure is embedded within the LMS. Traditionally, the job structure of the facilitators transpired in a physical learning environment filled with bodies to engage and convey knowledge (Pandit & Agrawal, 2022; Peschke, 2014). The physical settings positively influenced how facilitators’ jobs were structured, and their significant contribution was recognised. Moreover, their task identity was visible and had clear work outputs and performance measures (Kulikowski et al., 2022). Furthermore, gaining job performance feedback was less challenging because of physical engagement with the students, who could instantly share their critical thinking as they interrogate knowledge consumption within their field of study. Facilitators could easily gauge student understanding, and they had greater autonomy to improve their facilitation skills to meet the diverse needs of the students (Maré & Mutezo, 2020). 
These aforementioned dimensions are core in creating job enrichment for the facilitators; however, the use of the LMS as both a facilitation site and a platform to convey knowledge transfer raises complex challenges when it comes to having a clear task identity, applying autonomy and gaining job performance feedback (Kadhim et al., 2023; Simmons et al., 2023). The challenges could potentially weaken the job structure of online facilitators, and the magnitude thereof is yet to be known. The challenges compel organisational development (OD) specialists in HEIs to take a step forward by reviewing dimensions that form a meaningful job structure within the online facilitation role and tackle the complexities that come with facilitating using the LMS (Christensen et al., 2022; Knight & Parker, 2021).

This study advanced knowledge about virtual work settings in the higher education field and significantly contributed to the job design literature in four ways: Firstly, the study validates the core job dimensions of the online facilitators and sheds light on the challenges that come with their virtual job structure within higher education institutions; secondly, the methodological integration mitigates the risk of over-reliance on statistical data to validate job structures, and both objective and subjective measures were maintained in the methodological application (Skamagki et al., 2024); thirdly, the study shares practical insights on how to provide effective job performance feedback to maximise job enrichment; and lastly, it provides interventions to preserve task identity in virtual work settings within the higher education context.

Context and research purpose

The purpose of this study was to examine the job structure of online facilitators and validate the core dimensions within the role. Contextually, the job structure of the online facilitators is influenced by the LMS as a major component of online education in HEIs (Pandit & Agrawal, 2022). Playing their role, online facilitators must structure and lead content using the LMS and provide educational support services to virtual students who are geographically dispersed (Berge, 2008; Govender, 2018; Martin et al., 2019). In offering their support services, they are the first point of contact to socialise and conscientise students within their online learning community (Dennen & Arslan, 2022; Grant, 2021). The role requires them to implement the interplay between pedagogy and technology (Derakhshan et al., 2023); hence, their job role is heavily embedded within the institutional LMS. However, several studies that have evaluated the use of the LMS have reported challenges in accurately measuring work outputs and contributions to student success with online education (Kipp, 2018; Mehta, 2022; Rhode et al., 2017). Simmons et al. (2023) also pointed out that web-based structures present difficulties in gaining real-time feedback and thus place users’ motivation at risk. The challenges extend to limited knowledge about the core job dimensions within the online facilitation role and could render the job structure fluid. The predicament creates a job design dilemma that lends itself to scrutiny; hence, the main objective of this study is to validate the job dimensions within the job structure of online facilitators.

Literature review

In recent years, there has been a decline in research studies examining the nature of job design and the changing job dimensions within a role (Morgeson & Humphrey, 2006, 2008; Oldham & Fried, 2016). Hackman and Oldham (1975) and Oldham and Hackman (2010) first noticed that the paucity of knowledge in the area of job design may be attributed to the limited capacity to measure the magnitude of the work changes. Decades later, job design has evolved exponentially; yet research in this area is still limited (Knight & Parker, 2021; Parker & Grote, 2022). To date, most of these studies emphasise that the way tasks are performed is largely influenced by theoretical models. However, knowledge about the empirical aspects of job design in virtual work settings is limited (Parker et al., 2017; Parker & Grote, 2022). There is also little emphasis on the models, methods and fundamental principles guiding the structuring of job roles and the assigned virtual tasks (Knight & Parker, 2021; Pandit & Agrawal, 2022; Rodés et al., 2021), particularly within HEIs. Generally, much of the research highlights how the teaching role has been adapted to facilitate inside virtual learning environments (De Metz & Bezuidenhout, 2018; Govender, 2018; Rodés et al., 2021), while a significant number of research studies have reported on the role and competencies of teaching and facilitating online (Martin et al., 2019; Rodés et al., 2021). However, there is less emphasis on how the job structure is designed to transpire within the institutional LMS. These limitations create a research gap on how job design models have been theorised and adopted in HEIs (Pandit & Agrawal, 2022). Consequently, the core job dimensions of online facilitators remain poorly understood.
The knowledge deficiency requires great attention as it raises challenges regarding accurate measures of job performance and work outputs (Kozusznik, 2021; Oldham & Fried, 2016), particularly where the job structure is embedded within the LMS (Rhode et al., 2017; Wegman et al., 2016). The knowledge gap implies that higher education institutions, as the employer, might have experienced difficulties in designing and implementing intervention strategies to improve throughput and, therefore, require job design interventions (Knight & Parker, 2021).

Given the number of resources, the individual jobholder’s interaction with tasks performed at work during various phases within the work cycles represents job design (Oldham & Fried, 2016). The construct embodies the content and the organisation of one’s work tasks into a complete, meaningful job structure (Knight & Parker, 2021).

Job design is therefore defined as the study, creation, and modification of the composition, content, structure, and environment within which jobs and roles are enacted. (Morgeson & Humphrey, 2008, p. 47)

The process encapsulates outcomes of how work is structured, organised and experienced within a particular work context. Thus, the adoption in this study encompasses the job content in terms of the structure of tasks and activities, relationships, role and responsibilities of facilitating using the LMS (Camilleri & Camilleri, 2022; Kulikowski et al., 2022; Parker et al., 2017). The focus is on the task characteristics depicting job enrichment and internal work motivation based on the core dimensions of autonomy, task variety, task significance, task identity and feedback from the job. The emphasis is on how these core job dimensions have been impacted by the Internet and web-based technologies, which force online facilitators to move between digital domains and offline reality to connect and manage their work virtually (Dennen & Arslan, 2022; Peschke, 2014). In the next sections, we present the theoretical model of the research based on the five core job dimensions and then describe the research design, which is followed by the data collection procedure. Thereafter, we present the data analysis, results and findings from both strands of data. A discussion of the results and implications for theory and practice is also presented, and we conclude with recommendations for future research.

Theoretical model: Core job dimensions

The theoretical model of job design represents comprehensive measures of work based on a four-component structure, namely, task characteristics, knowledge characteristics, social characteristics and work context. Specifically, the task characteristics embody the core dimensions within a particular job structure (García-Izquierdo & Castaño, 2022). Over the years, the five core job dimensions have provided a stable model for studying the essential nature of work (Oldham & Fried 2016), and it is widely used to assess the scope of jobs in diverse work contexts (Khandan et al., 2018). The dimension of autonomy showcases online facilitators’ freedom to decide how to perform tasks and activities. The dimension reflects the extent to which using the LMS allows discretion to schedule work, make decisions and choose the methods used to perform a variety of tasks irrespective of the work context (Oldham & Fried, 2016). Task variety, therefore, captures wide-ranging activities that online facilitators need to perform to facilitate effectively (Hidayah et al., 2017). This includes performing instructional design using diverse learning materials, teaching resources, interactive technologies and rich media to deliver online lessons and tutorials (Govender, 2018). Task significance, on the contrary, showcases the response to the diverse needs of service recipients (Parker et al. 2017). Therefore, the dimension represents how well the jobholder’s efforts have a positive impact on the recipients of the respective services offered within the job (Wegman et al., 2016). Hence, the facilitation role is significantly geared towards making a positive impact by providing student support services within online education. The impact should also be reflected in the dimension of task identity, which requires online facilitators to produce an identifiable piece of work showing interrelated activities performed from start to finish (Christensen et al., 2022). 
Accordingly, online facilitators have the opportunity to complete the whole piece of work, starting from the planning of the online delivery of module content, facilitating using various web-based tools, providing adequate technical support to virtual students and ending with assessment for successful module completion (Martin et al., 2019). Naturally, constructive criticism of facilitators’ performance is necessary, and part of such criticism involves gaining feedback about the outcomes of the actual facilitation of learning (Maré & Mutezo, 2020). The presence of these core job dimensions creates job enrichment, leading to meaningful work experiences, feeling responsible for the work outcomes and acquiring knowledge for work improvement (Simmons et al., 2023). Therefore, the core job dimensions provided a solid theoretical model and motivation used in this study to validate the job structure of online facilitators. The contextual application of the model, therefore, captured the essence of online facilitators’ work based on autonomy, task variety, task identity, task significance and job feedback within the virtual work setting in higher education. Mainly, the goal was to gain insight into the scope and nature of their work.

Research design

Research approach

A convergent mixed-method approach was used to gain strong credibility and rigorous application using both quantitative and qualitative methods. The research questions raised in this study prompted the convergence design (Creswell & Creswell, 2022). The aim was to gain objectivity and subjectivity in the methodological application as suggested by Creamer (2018). To achieve this objective, converging lines of inquiry were implemented to test the hypothesis and to answer the parallel research questions under the qualitative strand. The confirmatory factor analysis (CFA) under statistical tests was conducted for validity measures and model fitness. The following hypotheses were posed to test the validity and fitness of the model:

Ha1: The hypothesised measurement model will fit the data.

Ha2: The five-factor model (autonomy, task identity, task variety, task significance and feedback) validates the job structure of online facilitators.

The statistical results were corroborated by the thematic analysis procedure as per the guidelines shared by Skamagki et al. (2024). The objective was to find out to what extent qualitative findings help to explain quantitative results. The parallel open-ended questions were posed and translated into sub-objectives to corroborate and support the statistical results while maintaining rigour (Maxwell, 2020). The sub-objectives were:

  • To describe the job structure of online facilitators in terms of the wide-ranging activities (task variety), work outputs (task identity) and freedom (autonomy) when performing tasks.
  • To explain how the online facilitation role influences others inside or outside the HEI (task significance).
  • To explain how online facilitators gain information about the effectiveness of their job performance (feedback).
Research methods
Research participants and sampling method

The estimation of the number of participants was guided by the objective of achieving converging results from quantitative and qualitative data under rigorous application; therefore, we implemented a stratified, purposeful and parallel sampling procedure. The mixed-method sampling strategy was guided by Onwuegbuzie and Collins (2017). In the quantitative sample, 236 responses were drawn from a population of online facilitators offering student support services on a part-time basis within the comprehensive open distance and e-learning (CODeL) institution; 58.5% of participants were female and 41.5% were male. The largest proportion, 44.1% (standard deviation [SD] = 0.77), holds a master's degree, and 39.0% of them are between the ages of 33 and 42 years, while 37.3% (SD = 1.34) have a job tenure of 9–10 years.

A parallel qualitative sample was selected from the open-ended responses. Only 30 participants who provided rich qualitative data among the 236 open-ended responses were purposefully selected to gain rigour under thematic analysis (Creswell & Creswell, 2022). Representative samples of both strands of data were therefore drawn from six different colleges, namely: the Colleges of Agriculture and Environmental Sciences (CAES), Accounting Sciences (CAS), Economic and Management Sciences (CEMS), Education (CEDU), Human Sciences (CHS) and, lastly, Science, Engineering and Technology (CSET). To keep the merging process balanced, each college was represented by 5 participants. The majority of the participants were between the ages of 33 and 42 years; 10 were male and 20 were female. Eleven of the female participants completed a master's degree, while 4 male participants completed the same qualification; 4 male participants completed an honours degree, whilst 6 female participants completed the same qualification. Four participants hold a doctoral degree, split equally between males and females, and the remaining female participant has an undergraduate degree. The majority of them were selected from those having either 9 or 10 years of work experience within the institution.

Data collection and measuring instruments

A parallel semi-structured questionnaire was developed to collect mixed data from both closed-ended and open-ended responses to examine facets of the same phenomenon using two sets of independent results (Creamer, 2018). A survey strategy was chosen for data collection, which was conducted in a single phase (Creswell & Creswell, 2022). The observed variables under examination within the job structure of online facilitators included dimensions of autonomy, task variety, task significance, task identity and feedback from the job. The work design questionnaire (WDQ) designed by Morgeson and Humphrey (2006) incorporates these dimensions under measures of task characteristics. The component structure is used as a job diagnostic tool that measures job enrichment and internal work motivation based on the aforementioned job dimensions that were instrumental in this study (Hackman & Oldham 1976, 1980). The job structure of online facilitators was therefore tested and validated using task characteristics within the WDQ (Hackman & Oldham, 1980; Morgeson & Humphrey, 2008).

Closed-ended questions were measured on a 7-point Likert scale ranging from 1 (strongly disagree) to 7 (strongly agree). The scale includes items like:

[T]he job allows me to plan how I do my work, the job is arranged so that I can do an entire piece of work from beginning to end, the results of my work are likely to significantly affect the lives of other people, and the work activities themselves provide direct and clear information about the effectiveness (e.g., quality and quantity) of my job performance. (Morgeson & Humphrey, 2006, p. 1337)

Parallel open-ended questions were also asked to allow online facilitators to describe their job structure in terms of the wide-ranging activities (task variety), work outputs (task identity) and freedom (autonomy) when performing their tasks. They were also asked to explain how their role influences others inside or outside the HEI (task significance) as well as how they gain information about the effectiveness of their job performance (feedback). The semi-structured questionnaire was uploaded to a self-administered web survey to eliminate the researcher’s bias (Creswell & Creswell, 2022). Therefore, participants were able to answer questions without the behaviour and influence of the researchers (Creamer, 2018). Prior to administering the questionnaire, data collection was simulated through pilot testing. According to the guidelines shared by Hair et al. (2014), we selected five online facilitators to participate in the pilot study, and their responses were reviewed, and the questions that still needed clarity were refined for content validity. The objective was to collect data that are appropriate for the level of accuracy required in the analysis, and we were also able to maintain reliability and trustworthiness through the refined research instrument (Maxwell, 2020).

Data analysis

The data were then analysed separately using quantitative and qualitative converging lines of inquiry (Creswell & Creswell, 2022). Firstly, the quantitative data were analysed using CFA to confirm the job structure based on the five-factor model of autonomy, task variety, task significance, task identity and feedback from the job. The CFA procedure was conducted to assess model fit using IBM SPSS Statistics (Version 29) and AMOS (Analysis of Moment Structures, Version 29). These five factors represent the variables under examination, and their descriptive statistics were accounted for by mean scores (M) for average values and SD measures showing distance from mean scores (Hair et al., 2014). Omega-H values were used to assess reliability. The CFA procedure also assessed the content validity of the model, that is, the extent to which a test covers the concept it intends to measure (Cheung et al., 2023).
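As a minimal sketch of the descriptive step, the per-dimension mean (M) and standard deviation (SD) can be computed from item responses. The simulated 7-point Likert responses below are hypothetical stand-ins, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical 7-point Likert responses: 236 participants x 4 items
# for one dimension (e.g. autonomy); not the study's actual data.
responses = rng.integers(low=1, high=8, size=(236, 4))

scale_scores = responses.mean(axis=1)  # per-participant dimension score
M = scale_scores.mean()                # mean score for the dimension
SD = scale_scores.std(ddof=1)          # sample standard deviation

print(f"M = {M:.2f}, SD = {SD:.2f}")
```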

Secondly, a parallel thematic analysis was employed to describe and explain how these five factors representing the core job dimensions were experienced in the role of online facilitation. The procedure allowed the researchers to use the emerging themes to support the quantitative results (Creswell & Creswell, 2022; Yin, 2016). Therefore, a deductive approach was used by developing a codebook template that guided data analysis to gain convergence (Braun & Clarke, 2023). The methodological flexibility brought evidence from both objective and subjective measures without eroding the meaning of the context (Creamer, 2018). The process entailed visual exploration and reading through text segments to familiarise ourselves with the dataset derived from the open-ended responses (Braun et al., 2016). This was done to gain a general understanding of the database and identify quotations that had similar views, trends and patterns relevant to the research objectives, followed by systematically allocating codes using a codebook template to maintain consistency and build the credibility of the analysis. The extracted data were then loaded into qualitative coding software, ATLAS.ti version 24. The process was repeated to conduct a midpoint analysis and permit the emergence of additional codes, which were revised, refined and merged into categories to gain confirmability of the emerging themes (Yin, 2016). The analysis software produced both narrative reports and intuitive visuals that demonstrated transparency and increased the rigour and credibility of the thematic analysis procedure, thereby strengthening the qualitative findings (Maxwell, 2020).
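The deductive, codebook-driven coding step can be sketched as follows. The dimension cues in this codebook are illustrative assumptions, not the study's actual coding template:

```python
# Minimal sketch of deductive codebook coding. The cue phrases below are
# hypothetical examples, not the study's actual codebook entries.
codebook = {
    "task_identity": ["complete", "output", "start to finish"],
    "feedback": ["performance", "response", "evaluation"],
    "autonomy": ["freedom", "decide", "schedule"],
}

def assign_codes(segment: str, codebook: dict) -> list:
    """Return the codes whose cue phrases appear in a response segment."""
    segment = segment.lower()
    return [code for code, cues in codebook.items()
            if any(cue in segment for cue in cues)]

# A hypothetical open-ended response segment:
print(assign_codes("I rarely see the complete output of my work", codebook))
# → ['task_identity']
```

In practice the codebook lived in ATLAS.ti rather than code, but the principle is the same: predefined codes are applied consistently across segments, with new codes allowed to emerge on a second pass.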

Thirdly, the data were integrated to corroborate findings and validate the job structure of online facilitators in virtual work settings within HEIs (Skamagki et al., 2024). The integration was achieved through comparison of the results and findings from both strands of data. The procedure gave us the opportunity to look for common concepts across the results and establish ways in which the qualitative findings converge, diverge, complement or expand the quantitative results (Creswell & Creswell, 2022). The integration procedures enabled us to use multiple sources to gain confirmability and enhance the validity of the findings. Moreover, the implementation of multiple methods increased the reliability and trustworthiness of the research (Creamer, 2018).

Research procedure

The quantitative and qualitative findings from the two strands of data are discussed with reference to the research methods and rigorous analysis, and the researchers were transparent in describing the research procedures (Braun & Clarke, 2023; Creswell & Creswell, 2022).

Ethical considerations

Ethical approval to conduct this study was obtained from the University of South Africa Ethics Review Committee (ERC), No. 2020_CRERC_028 (FA), and the study was conducted according to the university’s research ethics policy and procedures. Respondents gave written consent to participate in the study. Participation was completely voluntary, and privacy was maintained at all times. The data used in this study have been anonymised. No information on the dissemination of the research findings can be traced back to the participants.

Results

The observed latent variables of the model were assessed using the standardised regression weight (SRW), with a recommended score of > 0.50. The SRW measures factor loadings of the item indicators within the observed variables (Cheung et al., 2023). Most of the indicator items achieved standardised factor loadings > 0.50; however, a few achieved low scores. Therefore, the model required modification. The most appropriate modification was to covary error terms that are part of the same factor (Chen, 2007). Modification indices (MI) above 10 were used to assess improvement. Therefore, the error terms belonging to the same factors of autonomy, task variety and task identity were correlated to improve the measurement model, which is presented under the statistical results. The SRW scores for item indicators within the examined factors showed an improvement in the model. In particular, the lowest SRW, for item JD19-TaskID, was 0.16. However, the standardised residual covariances for item indicator JD19-TaskID were insignificant compared to the overall scores in the improved model. The item indicator was retained, and the factor loadings of the observed latent variables confirm the content validity of the improved model (Hair et al., 2014; Jöreskog, 1970). The improved measurement model was further assessed for construct validity, that is, how well the model fits the data and accounts for correlations among the observed latent variables. The goodness of fit for the model used the maximum likelihood estimation method. The results were based on the following recommended statistical fit indices: Chi-square/degrees of freedom ratio (CMIN/df) < 3; P-value for the model > 0.05; comparative fit index (CFI) > 0.95; Tucker-Lewis Index (TLI) > 0.95; adjusted goodness of fit index (AGFI) > 0.80; root mean square error of approximation (RMSEA) < 0.05; and PCLOSE value > 0.05 (Bentler & Bonett, 1980; Goretzko et al., 2023; Hu & Bentler, 1999).
Figure 1 depicts the correlations of the modified measurement model.

FIGURE 1: Improved measurement model.

The improved model achieved good fit statistics, including CMIN/df = 1.561; CFI = 0.965; TLI = 0.960; AGFI = 0.849; RMSEA = 0.049; and PCLOSE = 0.576. The five-factor model therefore accounts for the correlations among the observed variables in the dataset, and the results show that the hypothesised measurement model fits the data based on the acceptable thresholds (RMSEA < 0.05, AGFI > 0.80, CFI > 0.90).
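As a plausibility check, the reported RMSEA can be approximated from the chi-square/df ratio and the sample size alone, reading the reported ratio as 1.561 (consistent with the < 3 threshold):

```python
import math

n = 236          # quantitative sample size reported in the study
cmin_df = 1.561  # reported chi-square/degrees-of-freedom ratio

# Standard approximation: RMSEA = sqrt(max(chi2 - df, 0) / (df * (n - 1))),
# which reduces to sqrt((cmin_df - 1) / (n - 1)) when cmin_df > 1.
rmsea = math.sqrt((cmin_df - 1) / (n - 1))
print(round(rmsea, 3))  # ≈ 0.049, matching the reported RMSEA
```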

Composite reliability, convergent and discriminant validity

The composite reliability (CR), convergent validity and discriminant validity of the model were also established. Convergent validity indicates how well item indicators correlate with each other within the observed factor. If items correlate more highly with variables outside the observed factor than with variables inside it, the model has discriminant validity issues. The results in Table 1 show that the model achieved good CR (> 0.70) for autonomy, task variety, task significance and feedback from the job. This means that the total error variance of these latent variables makes up less than 30% of the variance, in line with the recommended value (Cheung et al., 2023). However, the Omega-H did not achieve a good reliability score for task identity, that is, 0.69, which is less than the recommended value of 0.70. The average variance extracted (AVE) score of 0.49 also did not confirm strong convergent validity for task identity, that is, the latent factor is not well explained by its observed variables, particularly item JD19-TaskID, which achieved the lowest score under factor loadings. Table 1 lists the relevant measures based on the Fornell and Larcker (1981) criterion.

TABLE 1: Descriptives (mean, latent standard deviation, composite reliability, average variance extracted, maximum shared variance, latent correlations) of the job dimensions (N = 236).

The other observed variables in Table 1 show that the model achieved good AVE scores, that is, > 0.50, and CR was > AVE; therefore, CR and convergent validity were achieved for autonomy, task variety, task significance and feedback from the job. Discriminant validity was also assessed with maximum shared variance (MSV) (Fornell & Larcker, 1981). All variables achieved good discriminant validity scores, including task identity, that is, AVE > MSV for all observed variables, which is an indication of good discriminant validity. Further evidence of discriminant validity is provided by the diagonal values in brackets, which represent the square root of the AVE and are greater than the inter-construct correlations. This indicates that no latent variable is better explained by other variables than by its own observed variables (Cheung et al., 2023). Overall, the five-factor model confirms and validates the dimensions of autonomy, task variety, task significance and feedback within the job structure of online facilitators; for the task identity dimension, however, lower reliability scores were achieved in the Omega-H tests and the AVE.
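The Fornell–Larcker quantities can be reproduced directly from standardised loadings. The loadings and inter-factor correlation below are illustrative assumptions, not the study's estimates:

```python
import numpy as np

# Illustrative standardised loadings for one five-item factor;
# these are hypothetical values, not the study's estimates.
loadings = np.array([0.75, 0.70, 0.78, 0.66, 0.72])
error_var = 1 - loadings**2           # assuming standardised indicators

# Composite reliability: (sum of loadings)^2 over total variance.
cr = loadings.sum()**2 / (loadings.sum()**2 + error_var.sum())

# Average variance extracted: mean squared loading.
ave = np.mean(loadings**2)

# Fornell-Larcker checks: CR > 0.70, AVE > 0.50, and sqrt(AVE)
# greater than the factor's correlation with any other factor.
inter_factor_corr = 0.45              # hypothetical correlation
passes = cr > 0.70 and ave > 0.50 and np.sqrt(ave) > inter_factor_corr
print(f"CR = {cr:.2f}, AVE = {ave:.2f}, passes = {bool(passes)}")
```

Lowering the loadings (as for task identity, where item JD19-TaskID loaded at 0.16) pulls the AVE below 0.50 and the reliability below 0.70, reproducing the pattern reported in Table 1.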

Qualitative findings

The statistical results provided limited depth, particularly regarding the challenges encountered with task identity. The results were therefore explored further using thematic analysis to describe the five dimensions from the participants’ points of view and help explain the statistical results. The analysis was conducted in parallel with the quantitative results, with the objective of gaining a deeper understanding of how online facilitators in HEIs experience the dimensions of their job structure within the virtual work setting. We first looked into task identity challenges. The rigorous analytic procedure revealed that task identity challenges also had a negative impact on job performance feedback.

Task identity and feedback challenges

In sharing their experiences about gaining work outputs, online facilitators expressed that being able to complete a piece of work with an obvious beginning and end is challenging for various reasons. To start with, their work transpires in both online and offline modalities; however, more emphasis is placed on online activities and less on what transpires offline (Martin et al., 2019). Despite this greater emphasis on work outputs from online activities, gaining timely access to the LMS that serves as their facilitation site is problematic. They work on a contingent work arrangement; therefore, many encounter late allocation of student groups and late contract renewals. In real time, work output in the online mode can be seen in the number of weekly tutorial activities assigned to students, together with the level of participation in lessons, activities and threaded discussions; this is statistically captured on the LMS dashboard (Kipp, 2018). However, late student groupings and late activation on the LMS leave insufficient time to complete all facilitation tasks within the academic semester. Moreover, facilitators receive little technical assistance when faced with information and communication technology (ICT) difficulties:

‘The start is mostly delayed due to the late allocation of student groups. I feel like I do not fully take students to the finish line of my module; this is because I do not have access to their performance on assignments or the final exam.’ Participant_CSET1, male, has a Master’s degree, and 9–10 years of job tenure.

‘I try to complete the work at hand; however, it will all depend on how early I have been allocated the students as well as not having challenges in ICT related problems or lack of participation from students which can make one to slow down the pace especially if they are encountering administrative related problems.’ Participant_CEMS3, male, has a Master’s degree, and 13–14 years of job tenure.

The challenges encountered hindered effective knowledge transfer within the LMS and the much-needed student support. The analysis further revealed that, although convergent and discriminant validity were achieved for the feedback dimension, this job dimension also presented challenges. Figure 2 illustrates the challenges.

FIGURE 2: Illustration of task identity and feedback challenges.

Task identity challenges had an adverse impact on gaining feedback from the job. The minimal work output also meant that online facilitators had limited information about their job performance. In their reality, there is no regular communication about their job performance, and this created challenges in identifying areas for growth and improvement. This is worsened by a lack of engagement in complete phases of facilitation, which is attributed to late access to their facilitation site and limitations in assessing student work:

‘There has been no feedback on my performance or the performance of the students that I support/supported.’ Participant_CEDU4, female, has a Master’s degree, and 3–4 years of job tenure.

‘It is hard to tell as we never get feedback from questionnaires which students complete regarding feedback on the module. I’ve been requesting feedback since 2015 to renew and reinvent myself in areas where I lack or should amend my teaching methodology.’ Participant_CEMS1, female, has a Doctorate degree, and 8–9 years of job tenure.

‘I know that there are some facilitators that also work as markers for the module they tutor, and I assume they have more insight into students’ performance and where to focus their attention on future tutorials. I am, however, not a marker and have very little insight into where students go wrong and where I should ideally focus my attention on tutorials, etc.’ Participant_CHS5, female, has a Master’s degree, and 9–10 years of job tenure.

These findings reveal that the statistical report generated by the LMS is limited, particularly regarding the actual time spent on the job to produce online activities. The imbalanced emphasis on online tasks disregards work done offline (Finne et al., 2022; Mehta, 2022). Time spent on lesson planning and development, as well as research that occurs offline to enhance learning material, is neither explicitly documented nor recorded in any system. As such, these offline job activities remain invisible:

‘There is no instrument provided to measure the pieces of work begun and finished. No time frames for topics. A facilitator should see to it to finish guiding students a few days before the due dates for assignments.’ Participant_CEMS5, male, has a Doctorate degree, and 9–10 years of job tenure.

Notably, the statistical report does not accurately represent the entire piece of work produced by online facilitators, making it difficult to identify the tasks performed offline or to use the report as a tool for effective and constructive job performance feedback (Knight & Parker, 2021). As a result, the task identity and feedback dimensions were not fully supported. For the other dimensions, the thematic analysis supported the statistical results for autonomy, task variety and task significance. Taken as a whole, the findings for all the dimensions were integrated to determine the extent to which the qualitative findings help to explain the quantitative results (Creswell & Creswell, 2022).

Data integration

The results from the two methods were integrated, looking for convergence (Creswell & Creswell, 2022; Skamagki et al., 2024). This was done to corroborate and validate the job structure of online facilitators based on autonomy, task variety, task significance, task identity and feedback. Table 2 presents the integrated results for all dimensions examined within the job structure of online facilitators.

TABLE 2: Integrated results validating the job structure.

The data integration procedure allowed us to gain a certain level of confirmability for the dimensions of task identity and feedback from the job. Given the challenges encountered within these dimensions, the integrated results showed that convergence was only partly achieved. The integration of the quantitative and qualitative findings expanded knowledge about the adverse conditions online facilitators face when performing their tasks on the LMS. In the main, limited access to facilitate the entire curriculum meant that online facilitators could not engage fully with students and support them adequately in understanding the educational content. Their work output was also adversely affected by the lack of performance measures for offline activities. The statistical report generated from the LMS does not fully account for their job performance and therefore understates their total work output. As a result, online facilitators found it difficult to identify a complete piece of work generated from both online and offline tasks. The difficulties extended to receiving feedback about their own job performance.

Convergence for autonomy, task variety and task significance

Convergence was also examined to gain confirmability for the other dimensions. The corroboration of the statistical results and the emergent themes was concordant for autonomy, task variety and task significance; convergence for these dimensions was fully supported, with common concepts across the results. The description of how online facilitators experienced autonomy confirmed the validity of the dimension. Their job gives them the freedom to present educational content within a specified framework and guidelines, although these are not set in stone. They have the freedom to plan online classes and schedule facilitation time within the allocated course timeline. The freedom to schedule work was closely interlinked with decision-making. They have the latitude to choose the types of resources they make available to their students, and these can be as engaging as they choose. Most have industry experience and were able to incorporate their knowledge and skills to broaden students’ understanding of the course content. Many aimed to use multiple methods to meet the diverse learning needs of their students. For instance, applying Gardner’s multiple intelligences model, they use audio material, hold live group chats, present case studies with probing questions, post lesson plans or slideshows, and give students opportunities to interact with each other, which helps to shape their learning (Camilleri & Camilleri, 2022).

The validity measures for the task variety dimension were supported by the emergent themes; that is, both online and offline activities were evident in the data patterns. The description of wide-ranging activities performed online reflected the pedagogical role enacted in the LMS, which further required facilitators to maintain an online social presence to support and engage students in threaded group discussions (Khurshid, 2020; Kulikowski et al., 2022). The provision of additional resources, such as open textbooks, educational blogs and links to other recommended open educational resources (OERs), enriches the process. The live presentation of lessons using interactive platforms such as Microsoft Teams amplifies the facilitation process and allows online facilitators to deliver simplified educational content that meets the learning needs of diverse students. Fundamentally, efficiency in these activities requires significant research and review of learning material and activities for better planning and development of educational content. Most of these activities are administrative, time-consuming and occur offline. Therefore, the converging line of enquiry supported the dimensions of autonomy and task variety gained by performing online and offline activities.

In essence, the findings show the significant role played by online facilitators and thus demonstrate task significance. Both the statistical measures and the expressions of meaningful experiences within the role support this dimension. Facilitators see themselves as having a significant influence by assisting course leaders in facilitating knowledge transfer and providing student support, thereby growing their skills. They also bridge the educational distance by supporting students who may feel isolated in their virtual learning environment while balancing other work–life demands (Grant, 2021; Simmons et al., 2023). Overall, the integration of quantitative and qualitative data and the thematic findings strengthened the validity measures within the job structure, with strong support for the dimensions of autonomy, task variety and task significance.

Discussion

This study examined the job structure of online facilitators working in HEIs using the LMS as their virtual work setting, thus shedding light on the changing nature of job design. The study implemented a convergent mixed-method approach to validate the job structure based on the five core dimensions of autonomy, task identity, task variety, task significance and feedback. Ideally, the presence of these core dimensions within a job structure would demonstrate the positive impact of the online facilitation role within higher education (Foreman-Brown et al., 2023; Rhode et al., 2017). Generally, performing a variety of tasks with clear work outputs would showcase task identity and lead online facilitators to experience greater meaningfulness in transferring knowledge using the LMS (Mbati & Minnaar, 2015; Simmons et al., 2023). However, only three job dimensions were validated without challenges, namely autonomy, task variety and task significance. The results of these three core job dimensions converged under data integration. Task identity and feedback from the job did not fully converge because of the challenges online facilitators experienced: they had limited time to facilitate the complete curriculum within an academic semester and had difficulty gaining feedback about their own job performance, leaving them little room to improve. These challenges are neither unique nor surprising. Similar studies have examined these core job dimensions using the job characteristics model in the digital work context, and the internal job structure did not show support for all the dimensions, particularly task identity (Hidayah et al., 2017; Kulikowski et al., 2022; Wegman et al., 2016). Virtual work settings have been reported to have difficulty producing clear work outputs (Finne et al., 2022; Mehta, 2022; Wegman et al., 2016). However, these studies did not provide insight into the attributes of these challenges.
In this study, the meta-inferences gained from multiple sources and measures, rigorous analysis and methodological flexibility validated the job structure of online facilitators. The validity measures and the corroboration of results shed light on the attributes of the task identity and feedback challenges within the online facilitation role. The findings showed that the discord in knowledge and data inferences is attributable to limitations in producing clear work output on the LMS. In the main, late access to the facilitation site led to an inability to complete all phases of facilitation. What is recorded electronically in the statistical report does not represent the total work output across the dual modes of productivity. The report is generated under immense time constraints and does not fully capture what transpires in the facilitation role in a given academic semester. This makes it difficult for online facilitators to identify a complete piece of work produced from the wide-ranging activities they perform online and offline. As a result, the report is also less constructive when it comes to recording actual job performance within a full academic semester, and thus has practical implications for the online facilitation role.

Practical implications and recommendations

The web-based tools used to capture the variety of tasks performed by facilitators during an academic semester have practical drawbacks. In particular, the statistical record of activities on the LMS does not accurately represent the amount of time spent on the job or the work done offline to produce work outputs. The difficulty in capturing a complete piece of work understates total work outputs, which leads to a lack of recognition (Camilleri & Camilleri, 2022; Kulikowski et al., 2022). This lack of recognition is detrimental to internal work motivation and could have a negative influence on the critical psychological states of online facilitators. In the long run, this might hamper their ability to offer effective facilitation services and support students accordingly. To overcome this predicament, there is a need to review the job dimensions of online facilitators to improve their overall performance and the virtual working conditions that come with using the LMS as a facilitation site. The endeavour requires collaboration between human resource management (HRM) practitioners, such as organisational development (OD) specialists, and top management to redesign the online facilitation role with the goal of creating job enrichment and internal work motivation (Derakhshan et al., 2023; Van Beurden et al., 2024). One possible way of attaining this is to design and implement job performance standards with accurate measures to improve work outputs transpiring in both online and offline modalities. This could be done in four ways. The first way is to allow progression into a complete work cycle to preserve the task identity of the online facilitators (Christensen et al., 2022; Foreman-Brown et al., 2023). A complete work cycle of the facilitation role has four phases, which begins with learning analytics (Martin et al., 2019; Rodés et al., 2021).
This phase requires constant access to the LMS to critically review learning tools and materials to improve student learning experiences. Access to this type of data will enable facilitators to judge whether students need extra support and motivation to achieve learning outcomes. However, the effective planning and implementation of the facilitation strategies require constant presence to connect and engage with the students (Martin et al., 2019; Rodés et al., 2021). Therefore, orientation is another critical phase that allows online facilitators to make the necessary connections and establish an online social presence to support students effectively. This phase will allow facilitators to create an online learning community where students feel valued and respected for their ideas (Derakhshan et al., 2023; Foreman-Brown et al., 2023; Pandit & Agrawal, 2022). Engagement in these phases leads to an effective use of the LMS, provided student groupings and early access are granted to begin the actual facilitation of learning. The effective facilitation can be depicted through communicating key principles in the subject matter and impactful presentation of online lessons amplified by rich media and visuals (Dennen & Arslan, 2022). Following a progression through these three phases, a final phase is reached where assessment measures are used to execute a systematic evaluation of knowledge transfer. The completion of these critical phases in a work cycle will showcase the facilitators’ best practices, performance standards, competence and dedication to help students master the learning outcomes and thus lead to job enrichment (Dennen & Arslan, 2022; Grant 2021; Kipp, 2018).

The second way is to implement an inclusive system that records the total work output emanating from a completed work cycle. Recording total work outputs depicts the knowledge, skills and competencies of facilitators through the quality and richness of learning materials developed offline and shared online with students. Similarly, from a job design perspective, a role without a clear communication and support structure is futile: it lacks clear constructive feedback and thus demonstrates a communication breakdown (Knight & Parker, 2021; Oldham & Fried, 2016). Constructive feedback is, therefore, the third way needed to expand and restore communication to improve job performance. The information must be direct and clear to online facilitators (Khurshid, 2020; Oldham & Hackman, 2010). Feedback providers, most likely line managers, must have a clear intention: the goal must be to support online facilitators in becoming more proficient at using the LMS to transfer knowledge (Maré & Mutezo, 2020). This will also evoke positive affective responses from online facilitators, heightening their motivation to exercise autonomy in performing a variety of facilitation tasks effectively. In this case, the expansion of job performance feedback should include regular communication in all phases of facilitation. The fourth way is to showcase organisational support through tailored training to improve job mastery. Other studies have shown that job mastery in the delivery of online education has a significant influence on knowledge transfer and successful course completion in higher education (Hidayah et al., 2017; Kadhim et al., 2023; Simmons et al., 2023).
By and large, provision of organisational support will improve the virtual working conditions in higher education, thereby mitigating feedback and task identity challenges within the online facilitation role (Foreman-Brown et al., 2023; Rhode et al., 2017).

Limitations and future research

The researchers encountered limitations in choosing equal sample sizes for merging the quantitative and qualitative strands. However, rigorous procedures in the thematic analysis corroborated the statistical results and strengthened the credibility of the research findings. Future research should focus on the knowledge, skills and competencies required to facilitate through web-based technologies in the higher education context.

Conclusion

This study validated the job structure of online facilitators, an area in which few studies have been conducted. The findings shed light on job design matters in virtual work settings within the higher education context. The integrated results demonstrated the vital role online facilitators play in HEIs, namely offering support services that help students complete courses in their respective fields of study. Furthermore, online facilitators operate as the first point of contact with students using the LMS; their performance therefore has a great impact on student satisfaction with courses offered online in higher education. However, experiences of limited job performance feedback and challenges of task identity within their role necessitate a modification of their job structure to capture all salient aspects of facilitating knowledge transfer in HEIs (Foreman-Brown et al., 2023; Wegman et al., 2016).

Acknowledgements

This article is partially based on the author M.D.K.’s doctoral thesis entitled ‘Job design effects on the psychological contract and job engagement among online facilitators working virtually at a South African higher learning institution’ towards the degree of Doctor of Philosophy in Management Studies, University of South Africa, South Africa, with supervisors P.N. and T.K.M., received in December 2024.

Competing interests

The authors declare that they have no financial or personal relationships that may have inappropriately influenced them in writing this article.

Authors’ contributions

The contents of this article are an equal contribution of M.D.K., P.N. and T.K.M., who all drafted, revised and finalised the article.

Funding information

This study received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.

Data availability

The data that support the findings of this study are available from the corresponding author, M.D.K., upon reasonable request.

Disclaimer

The views and opinions expressed in this article are those of the authors and are the product of professional research. They do not necessarily reflect the official policy or position of any affiliated institution, funder, agency or publisher. The authors are responsible for this study’s results, findings and content.

References

Bentler, P.M., & Bonett, D.G. (1980). Significance tests and goodness of fit in the analysis of covariance structures. Psychological Bulletin, 88(3), 588–606. https://doi.org/10.1037/0033-2909.88.3.588

Berge, Z.L. (2008). Changing instructor’s roles in virtual worlds. Quarterly Review of Distance Education, 9(4), 407–414. Retrieved from http://eric.ed.gov/?id=EJ875111

Braun, V., & Clarke, V. (2023). Toward good practice in thematic analysis: Avoiding common problems and be(com)ing a knowing researcher. International Journal of Transgender Health, 24(1), 1–6. https://doi.org/10.1080/26895269.2022.2129597

Braun, V., Clarke, V., & Weate, P. (2016). Using thematic analysis in sport and exercise research. In B. Smith & A.C. Sparkes (Eds.), Routledge handbook of qualitative research in sport and exercise (pp. 191–205). Routledge.

Camilleri, M.A., & Camilleri, A.C. (2022). The acceptance of learning management systems and video conferencing technologies: Lessons learned from COVID-19. Technology, Knowledge and Learning, 27(4), 1311–1333. https://doi.org/10.1007/s10758-021-09561-y

Chen, F.F. (2007). Sensitivity of goodness of fit indexes to lack of measurement invariance. Structural Equation Modeling: A Multidisciplinary Journal, 14(3), 464–504. https://doi.org/10.1080/10705510701301834

Cheung, G.W., Cooper-Thomas, H.D., Lau, R.S., & Wang, L.C. (2023). Reporting reliability, convergent and discriminant validity with structural equation modeling: A review and best-practice recommendations. Asia Pacific Journal of Management, 41, 745–783. https://doi.org/10.1007/s10490-023-09871-y

Christensen, M.K., Nielsen, K.J.S., & O’Neill, L.D. (2022). Embodied teacher identity: A qualitative study on ‘practical sense’ as a basic pedagogical condition in times of Covid-19. Advances in Health Sciences Education, 27(3), 577–603. https://doi.org/10.1007/s10459-022-10102-0

Creamer, E.G. (2018). Striving for methodological integrity in mixed methods research: The difference between mixed methods and mixed-up methods. Journal of Engineering Education, 107(4), 526–530. https://doi.org/10.1002/jee.20240

Creswell, J.W., & Creswell, J.D. (2022). Research design: Qualitative, quantitative, and mixed methods approaches (6th ed.). Sage.

De Metz, N., & Bezuidenhout, A. (2018). An importance-competence analysis of the roles and competencies of e-tutors at an open distance learning institution. Australasian Journal of Educational Technology, 34(5), 27–43. https://doi.org/10.14742/ajet.3364

Dennen, V.P., & Arslan, Ö. (2022). The visual performance of online identity: Instructor presence and persona across tools and settings. Studies in Technology Enhanced Learning, 2(1), 13–30. https://doi.org/10.21428/8c225f6e.9e975efc

Derakhshan, A., Greenier, V., & Fathi, J. (2023). Exploring the interplay between a loving pedagogy, creativity, and work engagement among EFL/ESL teachers: A multinational study. Current Psychology, 42(26), 22803–22822. https://doi.org/10.1007/s12144-022-03371-w

Finne, L.T., Gammelgaard, B., & Christiansen, F.V. (2022). When the lab work disappears: Students’ perception of laboratory teaching for quality learning. Journal of Chemical Education, 99(4), 1766–1774. https://doi.org/10.1021/acs.jchemed.1c01113

Foreman-Brown, G., Fitzpatrick, E., & Twyford, K. (2023). Reimagining teacher identity in the post-Covid-19 university: Becoming digitally savvy, reflective in practice, collaborative, and relational. Educational and Developmental Psychologist, 40(1), 18–26. https://doi.org/10.1080/20590776.2022.2079406

Fornell, C., & Larcker, D.F. (1981). Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research, 18(1), 39–50. https://doi.org/10.1177/002224378101800104

García-Izquierdo, A.L., & Castaño, A.M. (2022). Work characteristics and occupational health: Validation and proposal of a shortened version of the work design questionnaire. Anales de Psicologia, 38(1), 149–162. https://doi.org/10.6018/ANALESPS.480481

Goretzko, D., Siemund, K., & Sterner, P. (2023). Evaluating model fit of measurement models in confirmatory factor analysis. Educational and Psychological Measurement, 84(1), 123–144. https://doi.org/10.1177/00131644231163813

Govender, P. (2018). E-tutors’ pedagogical practices in a selected open and distance learning university in South Africa. Progressio: South African Journal for Open and Distance Learning Practice, 40(1), 1–16. https://doi.org/10.25159/0256-8853/4706

Grant, M.M. (2021). Asynchronous online course designs: Articulating theory, best practices, and techniques for everyday doctoral education. Impacting Education: Journal on Transforming Professional Practice, 6(3), 35–46. https://doi.org/10.5195/ie.2021.191

Hackman, J.R., & Oldham, G.R. (1975). Development of the job diagnostic survey. Journal of Applied Psychology, 60(2), 159–170. https://doi.org/10.1037/h0076546

Hackman, J.R., & Oldham, G.R. (1976). Motivation through the design of work: Test of a theory. Organizational Behavior and Human Performance, 16(2), 250–279. https://doi.org/10.1016/0030-5073(76)90016-7

Hackman, J.R., & Oldham, G.R. (1980). Work redesign. Addison-Wesley.

Hair, J.F., Jr., Sarstedt, M., Hopkins, L., & Kuppelwieser, V.G. (2014). Partial least squares structural equation modeling (PLS-SEM). European Business Review, 26(2), 106–121. https://doi.org/10.1108/EBR-10-2013-0128

Hidayah, N., Nadhir, M., & Puteh, F. (2017). Impact assessment of job characteristics model on employee engagement. E-Academia Journal, 6(1), 2289–6589. Retrieved from http://myjms.mohe.gov.my/index.php/JeA/article/view/2222

Hu, L.T., & Bentler, P.M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55. https://doi.org/10.1080/10705519909540118

IBM Corp. (2020). IBM SPSS Statistics for Windows (Version 29.0) [Computer software]. IBM Corp. Retrieved from https://www.ibm.com/products/spss-statistics

Jöreskog, K.G. (1970). Simultaneous factor analysis in several populations. ETS Research Bulletin Series, 1970(2), i–31. https://doi.org/10.1002/j.2333-8504.1970.tb00790.x

Kadhim, J.Q., Aljazaery, I.A., & AL-Rikabi, H.T.S. (2023). Enhancement of online education in engineering college based on mobile wireless communication networks and IOT. International Journal of Emerging Technologies in Learning, 18(1), 176–200. https://doi.org/10.3991/ijet.v18i01.35987

Khandan, M., Momenyan, S., Javadi, F., Allahdadi, Z., Koohpaei, A., & Tabar, H.H. (2018). Assessing reliability and validity of the work design questionnaire as a tool for macro ergonomics surveys: A case study in an Iranian worker population in 2016. Journal of Occupational Health and Epidemiology, 7(3), 145–152. https://doi.org/10.29252/johe.7.3.145

Khurshid, F. (2020). E-pedagogical skills of online instructors: An exploratory study. Bulletin of Education & Research, 42(2), 235–250. Retrieved from https://login.ezproxy.lib.purdue.edu/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=eue&AN=147917174&site=ehost-live

Kipp, K. (2018). Exploring the future of the learning management system. International Journal on Innovations in Online Education, 2(2), 1–11. https://doi.org/10.1615/intjinnovonlineedu.2018028353

Knight, C., & Parker, S.K. (2021). How work redesign interventions affect performance: An evidence-based model from a systematic review. Human Relations, 74(1), 69–104. https://doi.org/10.1177/0018726719865604

Kola, M.D. (2024). Job design effects on the psychological contract and job engagement among online facilitators working virtually at a South African higher learning institution [Unpublished doctoral thesis]. University of South Africa, Pretoria, South Africa.

Kulikowski, K., Przytuła, S., & Sułkowski, Ł. (2022). E-learning? Never again! On the unintended consequences of COVID-19-forced e-learning on academic teacher motivational job characteristics. Higher Education Quarterly, 76(1), 174–189. https://doi.org/10.1111/hequ.12314

Maré, S., & Mutezo, A.T. (2020). The effectiveness of e-tutoring in an open and distance e-learning environment: Evidence from the University of South Africa. Open Learning: The Journal of Open, Distance and E-Learning, 36(2), 164–180. https://doi.org/10.1080/02680513.2020.1717941

Martin, F., Ritzhaupt, A., Kumar, S., & Budhrani, K. (2019). Award-winning faculty online teaching practices: Course design, assessment and evaluation, and facilitation. Internet and Higher Education, 42(April), 34–43. https://doi.org/10.1016/j.iheduc.2019.04.001

Maxwell, J.A. (2020). Why qualitative methods are necessary for generalization. Qualitative Psychology, 8(1), 111–118. https://doi.org/10.1037/qup0000173

Mbati, L., & Minnaar, A. (2015). Guidelines towards the facilitation of interactive online learning programmes in higher education. International Review of Research in Open and Distance Learning, 16(2), 272–287. https://doi.org/10.19173/irrodl.v16i2.2019

Mehta, P. (2022). Work alienation as a mediator between work from home-related isolation, loss of task identity and job insecurity amid the COVID-19 pandemic. International Journal of Workplace Health Management, 15(3), 287–306. https://doi.org/10.1108/IJWHM-03-2021-0070

Morgeson, F.P., & Humphrey, S.E. (2006). The Work Design Questionnaire (WDQ): Developing and validating a comprehensive measure for assessing job design and the nature of work. Journal of Applied Psychology, 91(6), 1321–1339. https://doi.org/10.1037/0021-9010.91.6.1321

Morgeson, F.P., & Humphrey, S.E. (2008). Job and team design: Toward a more integrative conceptualization of work design. Research in Personnel and Human Resources Management, 27, 39–91. https://doi.org/10.1016/S0742-7301(08)27002-7

Oldham, G.R., & Fried, Y. (2016). Job design research and theory: Past, present and future. Organizational Behavior and Human Decision Processes, 136, 20–35. https://doi.org/10.1016/j.obhdp.2016.05.002

Oldham, G.R., & Hackman, J.R. (2010). Not what it was and not what it will be: The future of job design research. Journal of Organizational Behavior, 31(2–3), 463–479. https://doi.org/10.1002/job.678

Onwuegbuzie, A.J., & Collins, K.M.T. (2017). The role of sampling in mixed methods research: Enhancing inference quality. Kölner Zeitschrift für Soziologie und Sozialpsychologie, 69(Suppl 2), 133–156. https://doi.org/10.1007/s11577-017-0455-0

Pandit, D., & Agrawal, S. (2022). Exploring challenges of online education in COVID times. FIIB Business Review, 11(3), 263–270. https://doi.org/10.1177/2319714520986254

Parker, S.K., & Grote, G. (2022). Automation, algorithms, and beyond: Why work design matters more than ever in a digital world. Applied Psychology, 71(4), 1171–1204. https://doi.org/10.1111/apps.12241

Parker, S.K., Morgeson, F.P., & Johns, G. (2017). One hundred years of work design research: Looking back and looking forward. Journal of Applied Psychology, 102(3), 403–420. https://doi.org/10.1037/apl0000106

Peschke, J. (2014). Where is the teacher in online learning: Centre stage or cameo appearance? In European Distance and E-Learning Network 2014 Research Workshop (EDEN) Conference Proceedings (No. 2, pp. 47–56). 27–28 October 2014, Oxford, United Kingdom, European Distance and E-Learning Network. http://www.eden-online.org

Rhode, J., Richter, S., Gowen, P., Miller, T., & Wills, C. (2017). Understanding faculty use of the learning management system. Online Learning Journal, 21(3), 68–86. https://doi.org/10.24059/olj.v21i3.1217

Rodés, V., Porta, M., Garófalo, L., & Enríquez, C.R. (2021). Teacher education in the emergency: A MOOC-inspired teacher professional development strategy grounded in critical digital pedagogy and pedagogy of care. Journal of Interactive Media in Education, 2021(1), 1–14. https://doi.org/10.5334/jime.657

Simmons, D.E., Gravina, N., Sleiman, A., & Kronfli, F.R. (2023). Using web-based behavioral skills training to teach online interview skills to college students. Journal of Organizational Behavior Management, 44(2), 88–112. https://doi.org/10.1080/01608061.2023.2219466

Skamagki, G., King, A., Carpenter, C., & Wåhlin, C. (2024). The concept of integration in mixed methods research: A step-by-step guide using an example study in physiotherapy. Physiotherapy Theory and Practice, 40(2), 197–204. https://doi.org/10.1080/09593985.2022.2120375

Van Beurden, J., Borghouts, I., Van den Groenendaal, S.M., & Freese, C. (2024). How Dutch higher HRM education prepares future HR professionals for the impact of technological developments. International Journal of Management Education, 22(1), 100916. https://doi.org/10.1016/j.ijme.2023.100916

Wegman, L.A., Hoffman, B.J., Carter, N.T., Twenge, J.M., & Guenole, N. (2016). Placing job characteristics in context: Cross-temporal meta-analysis of changes in job characteristics since 1975. Journal of Management, 44(1), 1–35. https://doi.org/10.1177/0149206316654545

Yin, R.K. (2016). Qualitative research from start to finish (2nd ed.). The Guilford Press.