title: An Online Management System for Streamlining and Enhancing the Quality of Learning Outcomes Assessment
authors: Ibrahim, Walid; Ibrahim, Wissam; Zoubeidi, Taoufik; Marzouk, Sayed; Sweedan, Amr; Amer, Hoda
date: 2022-05-06
journal: Educ Inf Technol (Dordr)
DOI: 10.1007/s10639-022-10918-8

Learning outcomes assessment is an effective academic quality assurance tool that enables educators to review and enhance the alignment between planned, delivered, and experienced curricula. Accurately assessing what students know and are able to do after completing a learning module is the first step in deciding on the strategies to implement and the proper actions to take to ensure the continuous improvement of the student learning experience. Nonetheless, learning outcomes assessment processes in higher education still face major challenges that affect their proper and effective implementation. Hence, faculty do not usually experience noticeable improvement in students' performance over several assessment cycles, which causes frustration and reluctance to continue participating in the assessment process. This paper discusses the main issues that affect the implementation of the assessment process and prevent the closure of the assessment loop. It also introduces a unified assessment process and an online management system that were developed recently to address the discussed issues. The online management system streamlines the assessment process while providing administrators and quality assurance officers with valuable infographics and reports to effectively oversee the implementation of the assessment process. The system has been deployed at the United Arab Emirates University since fall 2018 and has been successfully used by faculty to assess the learning outcomes of more than 1000 courses each semester. Moreover, collected statistics show that the online features provided by the system allowed faculty to continue their assessment tasks seamlessly during the COVID-19 pandemic.

The last few decades witnessed a paradigm shift in higher education quality assurance, as academic programs are increasingly adopting the outcome-based education (OBE) model instead of the traditional curriculum-based one (Harden, 2007). The traditional curriculum-based model is instructor centric: it focuses on the topics that instructors will cover in each learning module and the associated pedagogy they will use in the classroom. In the OBE model, by contrast, students are the focal point; the model relies on articulating a set of learning outcomes (LOs) that defines what students are expected to know and be able to demonstrate after the completion of a learning module (Kennedy et al., 2007). The defined LOs are then used by instructors to guide content development, classroom pedagogies, and student evaluation. They are also used to define the type and depth of the learning students are expected to achieve, and they provide a point of reference for assessing the effectiveness of the learning experience. Moreover, they clearly communicate expectations to potential learners and prospective employers. The adoption of the OBE model in higher education received a significant boost in 1999, when 29 European countries signed the Bologna Declaration (Wende, 2000).
The number of European countries that joined the declaration increased steadily after 1999, reaching 48 by 2010. The Bologna process aims to ensure comparability in the standards and quality of higher-education qualifications across Europe (Crosier & Parveva, 2013; Wächter, 2004; Kettunen & Kantola, 2006). Although the Bologna Declaration did not mention the OBE model explicitly, the model is a central element of the European Credit Transfer and Accumulation System (ECTS) adopted by the Bologna process (European Commission, Directorate General for Education and Culture, 2009). A pilot project conducted on the ECTS showed that study programs at different institutions are much easier to compare when they are described in terms of LOs instead of the offered curriculum (Taurina, 2015). Thus, the ECTS uses LOs as the basis for estimating students' workload and credit allocation, which facilitates student mobility and academic recognition. Starting in 2010, all modules and programs in post-secondary institutions throughout the European Union were required to be redesigned to reflect a set of defined LOs (Kennedy et al., 2007). The adoption of the OBE model is also supported by several higher education organizations, qualification frameworks, and regional and international accrediting agencies. These include the USA National Institute for Learning Outcomes Assessment (NILOA) (Kuh & Ewell, 2010), the Australian Tertiary Education Quality and Standards Agency (Hay, 2012), the European Qualification Framework (Méhaut & Winch, 2012), the National Qualifications Frameworks in the United Kingdom, the Australian Qualifications Framework (McInnis, 2010), the Qualification Framework Emirates (National Qualification Authority, 2012), ABET (Felder & Brent, 2003), AACSB (Kelley et al., 2010; Shaftel & Shaftel, 2007), and the WASC Senior College and University Commission (Heyman et al., 2017), to mention a few. A 2013 national survey conducted by NILOA revealed that 84% of all USA colleges and universities had adopted stated learning outcomes for all their undergraduate programs (Kuh et al., 2014).

To ensure continuous improvement, the OBE model relies on assessing the defined LOs through a periodic assessment cycle (Huba & Freed, 1999; Kuh & Ewell, 2010), as shown in Fig. 1. During each cycle, appropriate assessment tools are used to support students in their progress (formative assessment) and to measure their attainment of the intended LOs at the end of the learning module (summative assessment). The collected assessment data are then analyzed and compared against predefined targets to determine which LOs have been attained by students and which ones need improvement. Remedial actions are recommended to address any revealed deficiencies, and the assessment loop is closed after the recommended remedial actions are implemented and their impact is measured (Banta & Blaich, 2011). Learning outcomes assessment (LOA) is an evidence-based process that aims at increasing teaching effectiveness and efficiency. It offers a valuable means for colleges and universities to evaluate their current programs and policies, to innovate where necessary, and to assure the attainment of their mission (Volkwein, 2003). LOA enables educators to review and enhance the alignment between the planned, delivered, and experienced curriculum, as it can be used to answer the following important questions:

• Are students achieving the intended outcomes?
• Are they gaining the required skills to succeed in their field of profession?
• Is the program continuously improving the student learning experience?
• Should the offered curriculum or the teaching pedagogies be modified?
• Are there other techniques or additional resources that would help students learn more effectively?

Answering the above questions would help educators decide on the proper actions to take and the strategies to implement to ensure the continuous improvement of their students' learning experience and the attainment of the intended LOs (the course learning outcomes assessment process is depicted in Fig. 1). The 2013 survey conducted by NILOA (Kuh et al., 2014) revealed that the main driver for LOA is institutions' drive to satisfy the expectations of regional and program accrediting agencies. Nonetheless, using LOA for program review and continuous improvement has been getting more attention from higher education institutions recently (Mince, 2019). Despite the promising features of the OBE model and its integrated assessment and continuous improvement cycle, its effective implementation in higher education still faces major challenges that affect the closure of the assessment loop (Alruwais, 2018; Friedlander & Serban, 2004; Liu, 2011; Serban, 2004; Zlatkin-Troitschanskaia et al., 2016). Friedlander & Serban (2004) identified multiple challenges to the design and implementation of a sustainable approach to LOs assessment. The first challenge is faculty reluctance to participate in LOA processes. The second is the absence of an infrastructure that streamlines the assessment processes and provides faculty with the required technical knowledge and support. The third is the shortage of designated quality assurance staff with the time, knowledge, and skills to oversee the implementation of the assessment processes. The fourth is the difficulty of gaining consensus among faculty on what they are trying to achieve at the course, program, and college levels. Friedlander & Serban (2004) also mentioned that faculty do not usually have sufficient training or experience in articulating learning outcomes, developing the processes to assess them, or determining the target attainment thresholds. In addition to the above challenges, we believe that other main challenges facing the successful design and implementation of LOA processes include:

- the vague definition of responsibility and accountability for the different assessment tasks,
- the difficulty of retrieving and comparing attainment results over a sequence of offerings, and
- the lack of follow-up on the implementation of the recommended remedial actions.

We also believe that failing to close the assessment loop is one of the main reasons for faculty reluctance to participate effectively in the assessment process. When the assessment loop is not closed properly, faculty do not usually realize any improvement in the students' performance over several assessment cycles, which causes frustration and discourages them from participating effectively in the assessment process.
In the 2013 survey conducted by NILOA (Kuh et al., 2014), 516 provosts revealed that the most important and prevalent assessment supports provided by their institutions were:

- institutional policy/statements about assessing undergraduate learning,
- faculty engagement and involvement in assessment,
- existence of an assessment committee,
- institutional research and/or assessment office capacity for assessment work, and
- availability of professional staff dedicated to assessment.

Moreover, public and for-profit institutions that participated in the survey indicated that assessment management systems and recognition or rewards for faculty are two major incentives for faculty participation. This paper discusses in detail the main challenges that affect the implementation of effective assessment processes in higher education. It also introduces a unified assessment process and a Learning Outcomes Assessment Management System (LOAMS) that was recently developed and deployed at the United Arab Emirates University (UAEU) with the aim of alleviating the aforementioned LOA challenges. LOAMS streamlines the assessment process and has several embedded quality assurance measures that provide administrators and quality assurance officers with infographics and reports to manage the implementation of the assessment process effectively. The rest of the paper is organized as follows: Section 2 discusses the main challenges that could disrupt the implementation of the assessment process. Section 3 introduces the assessment process and defines the entities responsible for conducting it and their specific roles. The LOAMS implementation and features are detailed in Section 4, followed by discussion and concluding remarks in Sections 5 and 6.

Successful implementation of assessment processes requires collective contribution from multiple entities throughout the assessment process. Therefore, without clearly defining the responsibility of each entity, there is a considerable chance that the entire process might collapse. For instance, consider the case where the assessment process does not clearly specify who is responsible for analyzing the collected assessment data and documenting the analysis remarks. In such a case, it is most probable that no one will voluntarily assume this extra responsibility, causing the assessment process to terminate prematurely after the data collection step. Even if someone eventually volunteered to complete the analysis task, analysis remarks produced on a voluntary basis would lack the accountability necessary for quality assurance. Other potential challenges that might occur at different stages during the implementation of the assessment process and lead to its failure are discussed below.

The course assessment process starts by articulating the intended Course Learning Outcomes (CLOs) and aligning them with the Program Learning Outcomes (PLOs) and course objectives. Articulating appropriate CLOs is essential for successful implementation of the assessment process, as they define the breadth and depth of the learning that students are expected to achieve and serve as guidelines for content development, instruction, and student evaluation. Each CLO should start with an appropriate action verb that defines the expected cognitive level (Anderson et al., 2000). As such, the CLOs articulated for freshman-level courses should not have the same depth and breadth as the CLOs defined for senior-level courses.
Similarly, the articulated CLOs should clearly differentiate between Bachelor, Master, and Doctorate level courses. For instance, the outcomes articulated for undergraduate courses should take into consideration that students at that level deal with basic concepts and knowledge and work with accepted theories and approaches. The outcomes for PhD courses, on the other hand, should require students to display considerable depth of knowledge, deal with complex material, and work with uncertain and incomplete data. Failing to articulate the CLOs at the proper cognitive level usually leads to a lack of covering material and activities, which renders the CLO immeasurable. Starting the CLO with a measurable action verb is also essential for collecting direct evidence that gauges the attainment of the outcome. It is usually hard to collect assessment data for a CLO that starts with an immeasurable action verb such as "know", "appreciate", "understand", or "be familiar with", as such a verb does not explicitly define the depth of knowledge the students are expected to acquire (see the screening sketch below). A measurable CLO should also focus on a specific category of student learning and should be attainable by students given the provided learning experience and the course's time constraints.

The second task of the course assessment process is to define the course topics needed to cover each of the articulated CLOs. A topical outline table is usually used at this stage to ensure that the offered topics provide students with the opportunity to develop and attain the intended outcomes, as shown in Table 1. The table is also used to ensure that the offered topics are aligned with the cognitive levels specified by the CLOs. Moreover, the table is necessary for closing the assessment loop effectively, as it accurately pinpoints the topics associated with unattained CLOs. Failing to align the course topics and activities with the articulated CLOs is another cause of immeasurable CLOs. In such cases, faculty usually recognize at the end of the semester that they did not collect any assessment data for one or more CLOs. This could happen because the CLO was not covered at all, or because it was covered but not assessed by any of the course activities (e.g., assignment, quiz, test, etc.). For instance, a course may define a CLO for students to communicate effectively in both oral and written format, yet fail to define any topic or activity that covers communication skills. Failing to align course topics and activities with the cognitive level of the articulated CLOs renders the CLOs immeasurable as well. For example, a course could have a CLO articulated at the "create" cognitive level (the highest level of Bloom's cognitive taxonomy) while the topics covered by the course stop at the "apply" level.

The third task of the assessment process is to define the direct and indirect assessment tools that will be used throughout the semester to collect assessment data for each CLO. The aim is to select assessment tools that require little extra time and effort and do not overwhelm course instructors and students with new tasks. Assessment data can easily be collected from regular learning activities (e.g., exams, portfolios, capstone projects, lab assignments, etc.) rather than additional tasks for students. It is important to emphasize that the better the integration of the assessment tools into existing student work, the greater the probability that the developed assessment plans will succeed.
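As an illustration of the verb screening mentioned above, the following minimal Python sketch flags CLO statements that open with commonly cited immeasurable verbs. It is not part of LOAMS, and the verb list and function names are illustrative assumptions:

```python
# Minimal sketch of screening CLO statements for immeasurable opening verbs.
# The verb list below is illustrative, not an official taxonomy.

IMMEASURABLE_VERBS = {"know", "appreciate", "understand", "be familiar with"}

def opens_immeasurably(clo: str) -> bool:
    """True if the CLO opens with a verb that is hard to assess directly."""
    text = clo.strip().lower()
    return any(text.startswith(verb + " ") for verb in IMMEASURABLE_VERBS)

def screen_clos(clos: list[str]) -> list[str]:
    """Return the CLOs that should be reworded with a measurable action verb."""
    return [clo for clo in clos if opens_immeasurably(clo)]

clos = [
    "Understand the fundamentals of relational database design",  # flagged
    "Design a relational schema in third normal form",            # passes
]
print(screen_clos(clos))
```

A fuller screening pass would also check the opening verb against the cognitive level expected for the course level, which requires a mapping from verbs to Bloom's taxonomy levels.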
Utilizing faulty or inappropriate assessment tools would lead to collecting assessment data that do not truly represent the students' attainment level, which would consequently lead to wrong analysis remarks and ineffective remedial actions. For instance, a final exam question is an inappropriate tool to assess a student's ability to implement a design. Similarly, MCQs are not appropriate tools to measure students' communication or information literacy skills. An appropriate assessment tool should be able to measure the competency addressed by the outcome effectively and accurately across multiple sections of the same course. Effective rubrics can also increase the consistency and accuracy of the assessment data collected from multiple faculty teaching the same course.

Closing the assessment loop by analyzing the collected assessment data and addressing discovered deficiencies with appropriate remedial actions is the core of the OBE assessment process, as it provides the mechanism for continuously monitoring and improving the students' learning experience. However, it is typically where the assessment process is disrupted (Kinzie et al., 2015). For instance, course instructors may terminate the assessment process after reporting the collected assessment results without providing any analysis remarks or recommending remedial actions. This could happen because they do not fully understand their role in the assessment process and believe their contribution ends after reporting the assessment data. It could also occur because they do not have the training, experience, and/or tools required to analyze students' performance effectively across multiple assessment cycles. Even when analysis remarks are provided and remedial actions are recommended, there is usually a need for a follow-up plan to ensure that the recommended actions are fully implemented during subsequent course offerings. This is especially needed when the remedial actions are documented by course instructors in hard course portfolios (paper format) or in a soft format that is difficult to look up and search. Without closing the assessment loop properly, the continuous improvement cycle is interrupted, and no noticeable improvement will usually be detected in the students' performance during subsequent offerings. This usually leads to faculty frustration and reluctance to participate in the assessment process, as they do not realize the benefits of the extra effort and time they spent collecting and reporting assessment data. Therefore, having an assessment process that defines the responsibility and accountability of each assessment task is essential for its successful implementation.

Up until 2013, CLO assessment at UAEU was a common practice for the internationally accredited programs, but not for many other programs. Therefore, in 2013 the University initiated a strategic project to ensure that learning outcomes are uniformly defined and assessed across all offered programs and courses. The first initiative towards achieving this target was the establishment of the LOA Unit under the Office of Institutional Effectiveness (OIE). The OIE oversees all academic quality assurance in the University and helps the University realize its commitment to provide students with the finest education and a supportive learning environment.
In a process that involves many contributors, splitting the process into well-defined tasks and defining the responsibility for each task is essential for accountability and quality assurance. Therefore, to organize and manage the anticipated new LOA activities across the University, the LOA Unit established a hierarchy of assessment committees and personnel. The aim of the hierarchy is to foster faculty engagement and participation in the development and implementation of the assessment process, and to formalize the responsibility and accountability of the process implementation and quality assurance. In the case of CLO assessment, the process should clearly define the entities responsible for collecting the assessment data, analyzing the assessment results, and recommending remedial actions to address discovered shortcomings. It is also crucial to specify the entity responsible for overseeing the implementation of the recommended remedial actions, especially when the implementation spans multiple semesters and requires a series of approvals at the department, college, and university levels. Therefore, the unified CLO assessment process developed by the LOA Unit clearly defines the roles of the Course Coordinator, Course Committee, Department Assessment Committee (DAC), and Department Curriculum Committee (DCC) as follows.

A course coordinator is appointed by the department chair for each course offered by the department. The course coordinator is the main instructor of the course and has the following responsibilities:

1. Maintaining the official version of the course material (e.g., syllabus, lecture notes, assessment reports, etc.).
2. Chairing the course committee if the course is offered in multiple sections by multiple faculty members.
3. Updating the LOA management system with the summative assessment tools approved by the course committee.
4. Setting up periodic follow-up meetings with the course committee to ensure homogeneous and synchronized progress of the course delivery across all offered sections (lectures and labs).
5. Coordinating the preparation of the unified midterm and final exams, and ensuring their alignment with the approved summative assessment tools.
6. Overseeing the implementation of the remedial actions approved by the course committee.
7. Uploading the course analysis remarks and recommended actions to the LOA management system, along with progress reports on the implementation and impact of the previously recommended remedial actions implemented during the semester.

In the case of specialized courses that are offered only by the course coordinator, the course coordinator assumes the responsibilities of the course committee, including developing the assessment plan and selecting the appropriate assessment tools. In both cases, the developed assessment plans are reviewed and approved by the Department Assessment Committee as discussed below.

A course committee is routinely formed at the beginning of the semester for each course offered in multiple sections by multiple faculty members. The committee is chaired by the course coordinator and comprises all faculty teaching the course that semester, as well as faculty who have taught the course recently (if required). The course committee is responsible for developing the course assessment plan by selecting the assessment tools that will be used throughout the semester to collect direct and indirect assessment data. The committee should document the selected summative tools using the template provided in Table 2.
The template is used to ensure that each outcome is covered by at least one direct tool that is well aligned with the outcome's competency and cognitive level. If multiple direct tools are used to assess the same outcome, the committee can assign a weight to each tool that reflects its importance relative to the other tools, as discussed in Section 5.3. In addition to the assessment tools, the assessment plan should also define an appropriate and realistic target attainment level for each outcome. The continuous improvement cycle is very sensitive to the setting of the target attainment level. Setting the target attainment level too low could halt the improvement cycle, as the outcome will always be attained and no remedial actions will be recommended. If no previous assessment data exist to guide the setting of the target attainment levels, the course committee usually starts with a baseline value commonly used by other courses at the same level. It is expected that the target attainment levels will be revised and adjusted by the course committee after a few assessment cycles to ensure the continuity of the improvement cycle, which could eventually lead to setting different target attainment levels for different CLOs of the same course. The course committee shall meet at the beginning of the semester to:

- discuss and approve any modifications to the official course material proposed by the course coordinator,
- review and revise the course assessment plan, if needed,
- review the teaching materials and decide on the delivery timeline and milestones, including common midterm and final exams, and
- discuss the remedial actions recommended from the previous offering and decide on their implementation plans.

During the semester, members of the course committee are responsible for executing the assessment plan and collecting the assessment data by applying the selected assessment tools in their offered sections. They are also responsible for implementing the approved remedial actions and collecting evidence of their impact. By the end of the semester, the course committee meets again to analyze the CLO attainment results, discuss the impact of the implemented remedial actions, and decide on the remedial actions to recommend for the subsequent offering.

The Department Assessment Committee (DAC) is a standing committee appointed by the department chair at the beginning of each academic year. The DAC oversees the implementation of the course assessment process and ensures that the assessment data are collected and analyzed as per the approved assessment timeline. It is also responsible for reviewing and approving the assessment plans developed by the course committees. The committee reports to the department chair regularly on the progress of the assessment plan and the implementation of the remedial actions. The DAC also plays a crucial role in closing the assessment loop. The scope of a course remedial action can vary from a simple action that only requires the approval of the course committee to more complex actions that might require the approval of the department, college, and university curriculum and assessment committees. Some actions, such as changing the course modality (e.g., face-to-face, blended, online), might even require substantive change approval by the accrediting agency. Therefore, the role of the DAC is instrumental in following up with the involved entities to ensure that the remedial actions are implemented and the assessment loop is closed effectively.
The department curriculum committee (DCC) is another standing committee appointed by the department chair at the beginning of each academic year. The curriculum committee is responsible for reviewing and approving any modification to the offered courses proposed by the course committee. The modifications may include the course description, topics, modality, and LOs.

To streamline the implementation of the assessment process while fostering quality assurance, an online learning outcomes assessment management system (LOAMS) has been developed and deployed across the UAEU campus starting the 2018-2019 academic year. The system implements the approved assessment process and provides users with dashboards and infographics to track the status and progress of the process implementation. LOAMS defines the following types of users and grants them escalating privileges:

- Course Instructors
- Course Coordinators
- Department Leads (Dept. Chairs, Chairs of DAC)
- College Leads (Deans, College Academic Quality Assurance Officers)
- University Leads (University Administrators, OIE)

The LOAMS database is populated with data extracted regularly from the University's Student Information System, Learning Management System, and Curriculum Management System. The data uploaded to the system fall into three categories:

1. Data uploaded once and revised as needed, such as data related to users' information and privileges, colleges, departments, programs, courses, LOs, and their interrelationships.
2. Data uploaded to the system at the beginning of each semester. This includes information regarding offered sections, such as the course reference number, section number, instructor ID, and list of enrolled students. The student data (ID, gender, nationality, major, minor, etc.) are also updated each semester to add newly admitted students and revise the major and minor information of current students. The students' major information is used to automate the segregation of the assessment data when a course is offered to students from multiple majors.
3. Assessment data uploaded by course instructors and coordinators throughout the semester, which include:
   - assessment tool information (description, direct/indirect, summative/formative, weight, etc.),
   - collected assessment data,
   - analysis remarks,
   - recommended remedial actions, and
   - progress reports on implemented remedial actions.

The system defines two internal workflows to manage the submission of assessment data at the course and section levels, as shown in Fig. 2. The workflows ensure that specific conditions are satisfied at each state before the process can progress to the following state, and Figure 2 shows the entity responsible for each step in the workflow. Although the course committee is responsible for course assessment, the system delegates this responsibility to the course coordinator in order to facilitate the submission of the course assessment data. Both workflows start at the beginning of the semester with the creation of the courses and their associated sections, and progress through their states as the assessment tasks described below are completed (a simplified sketch of the section-level workflow follows).

The course assessment cycle starts at the beginning of the semester by updating LOAMS with the information regarding the offered sections and their associated courses. This initial step is done by the system administrator, who is responsible for extracting the required information from the student information system and uploading it to LOAMS.
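The section-level workflow described above can be pictured as a small guarded state machine: each transition fires only when its quality condition holds. The sketch below is our simplified reading of the paper's description, not LOAMS code; the class, field names, and guard predicates are assumptions:

```python
# Simplified sketch of the section-level workflow as a guarded state machine.
# State names follow the paper; the guard predicates are illustrative
# stand-ins for LOAMS's internal checks.

from dataclasses import dataclass, field

STATES = [
    "Create Sections",
    "Define Assessment Tools",
    "Collect Assessment Data",
    "Calculate Section Assessment Results",
    "Provide Analysis Remarks",
    "Recommend Remedial Actions",
    "Generate Assessment Report",
]

@dataclass
class SectionWorkflow:
    clos: set[str]                                   # CLO identifiers
    summative_tools: dict[str, set[str]] = field(default_factory=dict)  # tool -> CLOs covered
    submitted_tools: set[str] = field(default_factory=set)              # tools with scores
    state: int = 0

    def _guard_ok(self) -> bool:
        nxt = STATES[self.state + 1]
        if nxt == "Collect Assessment Data":
            # every CLO needs at least one summative tool before data collection
            covered = (set().union(*self.summative_tools.values())
                       if self.summative_tools else set())
            return self.clos <= covered
        if nxt == "Calculate Section Assessment Results":
            # scores for all approved summative tools must be submitted
            return self.submitted_tools >= set(self.summative_tools)
        return True  # remaining transitions are gated by remarks and actions

    def advance(self) -> str:
        """Move to the next state only if the guard condition is satisfied."""
        if self.state + 1 < len(STATES) and self._guard_ok():
            self.state += 1
        return STATES[self.state]
```

A parallel course-level workflow would aggregate the section workflows, advancing only once all sections reach the corresponding state.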
For each created section, the system administrator defines the course code, the section identification number, the assigned instructor, the semester ID, and the list of enrolled students. Similarly, for each created course, the course code, the course identification number, the course coordinator, and the semester ID are defined. Once the sections and the courses are created, LOAMS advances the workflow to the "Define Assessment Tools" status.

The second step of the workflow is to define the summative and formative assessment tools that will be used by the course instructors to collect assessment data throughout the semester, as shown in Fig. 3. Course coordinators are required to update the system with the information regarding the summative assessment tools (title, description, max score, weight, etc.) as approved by the course committee and documented in the assessment tools template (see Table 1). Summative assessment tools are defined at the course level and are hence applicable to all offered sections. Formative assessment tools, on the other hand, are section specific and can be uploaded as needed by the course instructors throughout the semester. In addition to the tool information defined in Table 1, the course coordinator may click on the "Set Rubric" button to attach a predefined rubric to the tool or define a custom rubric. The system has several predefined rubrics, including the widely used Association of American Colleges and Universities (AAC&U) VALUE rubrics (Finley, 2011; Greenhoot & Bernstein, 2011).

LOAMS enforces several measures to ensure the quality of the created assessment tools. The first measure is preventing course instructors from creating any summative assessment tools in addition to the ones approved by the course committee. LOAMS allows course instructors to create as many formative assessment tools as needed to provide students with timely feedback regarding their attainment of the CLOs. However, to ensure consistent data collection across multiple sections and to simplify the aggregation of the assessment data, the system only allows course coordinators to create the approved summative assessment tools. Once created, the summative assessment tools become available to all offered sections. The second measure is to ensure that at least one summative assessment tool is created for each CLO. The system does this by checking the status of the summative assessment tools each time a new tool is defined by the course coordinator, and it advances the workflow to the "Collect Assessment Data" status only if at least one summative tool is defined for each CLO.

Using major classroom activities such as projects, term papers, and final exams is a common practice for collecting data regarding the attainment of CLOs. A major activity such as a term project can be used to assess multiple CLOs, such as information literacy, writing, and critical thinking skills. Similarly, final exams are excellent tools for assessing multiple CLOs, as they usually contain several questions that are well aligned with the CLOs. However, a common mistake when defining the assessment tools is to use the final score of an activity multiple times to assess different outcomes. For instance, using the final score of a term project to assess both critical thinking and writing skills does not pinpoint the students' attainment in either competency. Instead, the course instructor should use a specific rubric to score each competency individually.
To prevent this common mistake, LOAMS checks each submitted tool and rejects the submission if the tool's score is already used to assess another outcome. Moreover, LOAMS auto-validates the data submitted for each tool and notifies the user if the submitted data are invalid. For example, the maximum score, tool weight, and students' scores must be positive numerical values. Drop-down menus are also used whenever possible to simplify data submission and ensure the validity of the tool information. The final measure is the documentation of the tool description. The system requires the course coordinator to provide a detailed description of the tool during the tool definition. This allows the DAC and the office of quality assurance to audit the assessment tools selected by the course committees and ensure that they are valid and well aligned with the corresponding outcomes.

Course instructors are responsible for collecting CLO assessment data using the assessment tools defined in the previous step. For each tool, the course instructor submits the score received by each student as per the list of student IDs uploaded by the system administrator, as shown in Fig. 4. Student performance must be measured using explicit criteria (e.g., rubrics, grading schemes) connected to the learning experience. To ensure consistent assessment across multiple sections, course committees are strongly encouraged to attach a scoring rubric to each selected tool. Rubrics improve consistency, as all students are subject to the same grading standard. The system allows instructors to type in the students' scores or to copy and paste them from an external source (e.g., Excel files). However, the students' order in the system and in the external source must be identical. In the case of a discrepancy, the system allows course instructors to reorder the student IDs to match the data captured from the external source. The system also allows them to delete the IDs of students who dropped out or add the IDs of new students who joined the section after the student data were initially populated by the system administrator. This allows the system to segregate the attainment results per students' majors, as shown in Fig. 5. Segregating the attainment results is important, as it allows the course committee to understand the performance of students coming from different majors with different backgrounds, which could lead to revising the course prerequisites for a particular major. It is also essential for using the course assessment results further to assess the attainment of program LOs, as each program would be interested only in the assessment results of its own students.

Course instructors can submit the collected assessment data throughout the semester. LOAMS automatically calculates the attainment level for each tool once the data are submitted. This gives course instructors the chance to closely monitor the students' performance and provide them with timely feedback. However, if multiple assessment tools are used to assess a CLO, the system has to wait until the assessment data for all tools are submitted before aggregating the submitted data and calculating the overall attainment level of the CLO. Accordingly, LOAMS does not advance the section workflow to the "Calculate Section Assessment Results" status until the assessment data for all the created summative assessment tools are submitted by the course instructor.
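A short sketch summarizes the two steps just described: validating submitted scores and segregating attainment results by major. The record layout (student IDs mapped to scores and majors) is an assumption made for illustration, not LOAMS's actual schema:

```python
# Sketch of the score-validation and per-major segregation steps described
# above. The data layout (id -> score, id -> major) is an illustrative
# assumption.

from collections import defaultdict

def validate_scores(scores: dict[str, float], max_score: float) -> list[str]:
    """Return the student IDs whose submitted scores are invalid."""
    if max_score <= 0:
        raise ValueError("Tool maximum score must be a positive number")
    return [sid for sid, s in scores.items()
            if not isinstance(s, (int, float)) or s < 0 or s > max_score]

def attainment_by_major(scores, majors, max_score, threshold):
    """Percentage of students reaching the threshold, segregated by major."""
    hits, totals = defaultdict(int), defaultdict(int)
    for sid, s in scores.items():
        major = majors.get(sid, "Unknown")
        totals[major] += 1
        hits[major] += (s / max_score) >= threshold
    return {m: 100.0 * hits[m] / totals[m] for m in totals}

scores = {"u1": 18, "u2": 9, "u3": 15}
majors = {"u1": "CS", "u2": "CS", "u3": "IT"}
print(validate_scores(scores, max_score=20))          # -> []
print(attainment_by_major(scores, majors, 20, 0.6))   # CS: 50.0, IT: 100.0
```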
The system provides course instructors with a detailed status of the assessment tools submitted for each CLO, including the number of direct, indirect, and formative assessment tools, as shown in Fig. 6. To ensure the assessment data are submitted on time, the system sends three reminders to course instructors at the beginning of the last three weeks before the submission deadline.

LOAMS automatically calculates the attainment result of an assessment tool once the assessment data are submitted by the course instructor. For each CLO $c$, the course committee defines an attainment threshold $th_c$. The attainment level of CLO $c$ when tool $t$ is applied in section $s$, denoted $A_{t,c,s}$, is calculated as the percentage of students who achieved the attainment threshold $th_c$ of the tool's maximum score (i.e., $\frac{std(i,t)}{\max(t)} \ge th_c$), where $std(i,t)$ is the score of student $i$ in tool $t$, and $\max(t)$ is the maximum score of tool $t$. If multiple tools are used to assess the same outcome, the system allows the course committee to define the weight $w_{t,c}$ associated with tool $t$ when used to assess CLO $c$ (see Figure 3). By the end of the semester, the system calculates the attainment result for each CLO $c$ in section $s$ as the weighted average

$$A_{c,s} = \frac{\sum_{t} w_{t,c} \, A_{t,c,s}}{\sum_{t} w_{t,c}}$$

The tool weight is used to calibrate the influence of each tool on the attainment score of the CLO, such that the weighted average reflects the actual attainment of the outcome. For example, assume that a CLO was assessed in a section using three different tools (e.g., a quiz, a question in the midterm exam, and a question in the final exam), and that 16, 14, and 12 of the 20 students enrolled in the section achieved the attainment threshold in the three assessment tools, respectively. The attainment scores of the three tools ($A_{t,c,s}$) are hence 80%, 70%, and 60%, respectively. If the course committee decides to assign the same weight to the three tools, the attainment score of the CLO is calculated as

$$A_{c,s} = \frac{80 + 70 + 60}{3} = 70\%$$

The course committee could also give more emphasis to the midterm and final exam questions by assigning weights of 0.5, 1, and 2 to the three tools, respectively. Accordingly, the attainment score of the CLO is calculated as

$$A_{c,s} = \frac{80 \times 0.5 + 70 \times 1 + 60 \times 2}{0.5 + 1 + 2} \approx 65.7\%$$

If multiple sections are offered for a given course, then after calculating the attainment score of each section, the system advances the course workflow to the "Calculate the Course Assessment Score" status and calculates the overall attainment score of the CLO by aggregating the section-level results. Figure 7 shows the attainment results for a course with six CLOs as presented by the system. It shows the overall attainment result of each CLO, as well as the attainment result for each of the three offered sections. Detailed attainment results for each tool are also provided, as shown in Fig. 5.

After completing the submission of the assessment data, course instructors can browse the attainment result of each outcome as well as the attainment result for each submitted tool. Course instructors can also browse and compare the attainment results with the results from previous offerings. This is important for identifying any positive or negative developing trends and for investigating the impact of the remedial actions introduced during the semester. The system features two levels of analysis remarks.
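The calculation defined above is straightforward to express in code. The sketch below reproduces the paper's worked example; the function names are ours rather than LOAMS's:

```python
# Sketch of the CLO attainment calculation defined above. Function names
# are ours; the formulas follow the paper's definitions.

def tool_attainment(scores: list[float], max_score: float, threshold: float) -> float:
    """A_{t,c,s}: percentage of students with std(i,t)/max(t) >= th_c."""
    return 100.0 * sum(s / max_score >= threshold for s in scores) / len(scores)

def clo_attainment(tool_results: list[float], weights: list[float]) -> float:
    """A_{c,s}: weighted average of the tool-level attainment results."""
    return sum(a * w for a, w in zip(tool_results, weights)) / sum(weights)

# Worked example from the text: 16, 14, and 12 of 20 students reached the
# threshold in three tools, giving tool-level results of 80%, 70%, and 60%.
tools = [80.0, 70.0, 60.0]
print(clo_attainment(tools, [1, 1, 1]))    # equal weights -> 70.0
print(clo_attainment(tools, [0.5, 1, 2]))  # exam-heavy    -> ~65.71
```

With equal weights the example yields 70%, while the exam-heavy weighting pulls the result down to roughly 65.7%, matching the figures above.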
At the section level, course instructors are responsible for providing the analysis remarks for their sections, which include:

• the attainment result for each CLO,
• the extent to which the syllabus was covered,
• the impact of the implemented instructor-level remedial actions (cf. Section 5.5), and
• general comments on any problems encountered during the course offering.

At the course level, the course coordinator calls the course committee for the last meeting of the semester to discuss the aggregated attainment results for each CLO. The committee should take into consideration the consistency or discrepancy between the attainment results of the sections and comment on the reasons behind any developing negative or positive trends. Providing accurate analysis remarks is key to recommending effective remedial actions that rectify discovered deficiencies. Therefore, LOAMS provides the course committee with details regarding the attainment level for each tool used to assess the outcomes in each offered section. The system also provides the committee with the ability to browse the assessment results of previous offerings. This helps the committee decide whether a discovered shortcoming is persistent or an outlier related to a specific tool, section, or cohort. If the analysis remarks suggest that the shortcoming has persisted over several offerings, the course committee may decide to intervene with remedial actions or to collect more assessment data during the next offering. Similarly, detecting a shortcoming in a specific section or across all offered sections would affect the decision to recommend remedial actions at the instructor or the course level, as discussed later. In addition to their remarks regarding the attainment levels of the outcomes, the course committee should also comment on the appropriateness of:

• the textbook and other learning resources,
• the utilized summative assessment tools,
• the course prerequisites, and
• the impact of the implemented course-level remedial actions (cf. Section 5.5).

For instance, the committee may decide that one of the summative tools selected during the first meeting is inappropriate, based on the feedback received after the tool's implementation during the semester. Moreover, based on the offered material, the course committee might decide that an outcome is immeasurable or has an inappropriate cognitive level. It could also decide that the official textbook is obsolete or inappropriate because it does not substantially cover the offered material. As mentioned earlier, course instructors often terminate the assessment process after collecting the assessment data without providing the analysis remarks. Therefore, the system provides a user-friendly interface to simplify the submission process and encourage course instructors to provide their analysis remarks. A course instructor may select an analysis remark from the predefined drop-down list, as shown in Fig. 8, or select "Other" from the list and provide their own remarks. To advance the workflow to the "Recommend Remedial Actions" status, LOAMS ensures that the required remarks are submitted successfully. If one of the required remarks is missing, LOAMS rejects the submission and highlights the missing remark(s). After discussing and deciding on the analysis remarks, the course committee is responsible for recommending remedial actions to rectify discovered shortcomings.
Based on the analysis remarks, the course committee can recommend remedial actions at either the course level or the instructor level. Instructor-level remedial actions are recommended if a shortcoming is consistently detected in the sections offered by one course instructor. These remedial actions affect the instructor's pedagogy related to the identified shortcomings and have no effect on the pedagogy in the other offered sections. For instance, a course instructor may recommend a remedial action to add a new assignment or homework, or to change the teaching material for a specific unit. No further approval is needed for instructor-level remedial actions once they are discussed and approved by the course committee. Individual course instructors are responsible for submitting the approved remedial actions to the system and for providing periodic progress reports until the actions are closed successfully. Course-level remedial actions, on the other hand, are recommended by the course committee if the analysis remarks show that the same shortcoming is consistent across all offered sections. Remedial actions at this level could include changing an outcome, updating the official textbook, or revising the course prerequisites. The course coordinator is responsible for uploading the course-level actions to the system and for overseeing their implementation. Depending on the scope of the remedial action, further approval by the department, college, or university curriculum committees might be required. Therefore, the course coordinator is responsible for providing periodic progress reports until the action is implemented and closed. Since implementing the remedial actions and assessing their impact are essential for closing the assessment loop, the system assigns each created remedial action one of the following statuses:

• New: a new remedial action has been created and submitted to the system.
• In Progress: the implementation of the action is in progress.
• Implemented: the action has been implemented successfully, but its impact has not been measured yet.
• Closed: the action has been implemented, and its impact has been measured.

To update the status of a remedial action, a progress report showing the implementation details should be submitted by the course instructor (in the case of section-level actions) or the course coordinator (in the case of course-level actions). To close a remedial action, the impact of the action (positive, neutral, or negative) and an analysis of the implementation must also be provided. To ensure the closure of the assessment loop, the system does not allow course instructors or course coordinators to advance the workflow to the "Generate Assessment Report" status until at least one course-level or instructor-level remedial action is submitted for each unattained outcome. If the identified shortcoming is not persistent, a remedial action to collect more assessment data during subsequent offerings can be recommended. In addition to submitting new remedial actions, LOAMS provides members of the course committee with the ability to browse the status of previously submitted remedial actions. At the beginning of the semester, LOAMS sends an automated email to course instructors encouraging them to check the remedial actions successfully closed by the course committee. The course instructor can use the system to review the implementation details and analysis remarks of the closed actions.
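The status lifecycle above maps naturally onto a small state model. The sketch below is an illustrative rendering of the rules as the paper states them (an action is closed only after implementation, a measured impact, and an implementation analysis); the class and method names are assumptions:

```python
# Illustrative sketch of the remedial-action lifecycle described above.
# Status names follow the paper; the class and method names are ours.

from enum import Enum

class Status(Enum):
    NEW = "New"
    IN_PROGRESS = "In Progress"
    IMPLEMENTED = "Implemented"
    CLOSED = "Closed"

class RemedialAction:
    def __init__(self, description: str):
        self.description = description
        self.status = Status.NEW
        self.progress_reports: list[str] = []
        self.impact: str | None = None  # "positive" / "neutral" / "negative"

    def report_progress(self, report: str, implemented: bool = False) -> None:
        """A progress report moves the action forward in the lifecycle."""
        self.progress_reports.append(report)
        self.status = Status.IMPLEMENTED if implemented else Status.IN_PROGRESS

    def close(self, impact: str, analysis: str) -> None:
        """Closing requires the measured impact and implementation analysis."""
        if self.status is not Status.IMPLEMENTED:
            raise ValueError("Action must be implemented before it can be closed")
        self.impact = impact
        self.progress_reports.append(analysis)
        self.status = Status.CLOSED
```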
This gives course instructors the opportunity to re-implement actions with positive impact in their own classrooms and to avoid remedial actions with negative or neutral impact. Sharing the remedial actions together with their implementation details and analysis helps spread good teaching practices within the course committee. As mentioned above, course coordinators and course instructors are responsible for providing periodic progress reports to update the status of the remedial actions until the actions are closed. The system provides the department chair and the chair of the DAC with infographics regarding the status of the remedial actions, as shown in Fig. 9. The system also provides a list of all uploaded progress reports related to a selected action, which allows the DAC to follow up on the implementation of the action, especially when it is pending approval by another entity.

After the recommended actions are submitted successfully, the system updates the workflow to the "Generate Assessment Report" status, which allows the course instructors to generate the assessment reports for their sections. The section assessment report consolidates the section's assessment information, including the attainment results, analysis remarks, and remedial actions. Course coordinators can also generate a course assessment report that aggregates the assessment results from all offered sections. In addition to the information provided for each section, the course assessment report presents the overall attainment result for each outcome and the overall attainment result per major, as shown in Fig. 5. It also includes the course-level remedial actions recommended during the current offering, as well as the progress reports and implementation details of previously recommended remedial actions.

The adoption of the OBE model at UAEU started more than two decades ago with several programs in the Business and Economics, Engineering, and Information Technology Colleges. This adoption was mainly driven by the commitment of these programs to continuously improve the learning experience provided to their students, as well as to satisfy the requirements for international programmatic accreditation. Until recently, however, this adoption was not uniformly implemented by the other programs offered by the University. Therefore, in the 2013-2014 academic year, the University approved a new policy requiring that each offered academic program and course have a set of LOs that are periodically assessed to measure the students' attainment level. The University Program and Curriculum Office led the implementation of the approved policy and organized several workshops to increase the awareness of the academic community regarding the articulation, alignment, and assessment of LOs. After the successful articulation and alignment of LOs, the implementation of the assessment process went through several improvement phases until the recent deployment of LOAMS. During the first phase, course instructors were requested to prepare course assessment reports using an MS Word template developed by the assessment committee. They were also requested to save hardcopies of the assessment reports in the course portfolios they submitted at the end of each semester. Three main quality issues were identified during this initial phase. First, course instructors were required to apply the instructions provided in the approved assessment policy to manually calculate the attainment level for each outcome.
This complicated the assessment process and caused significant discrepancies in the calculated results due to inconsistent interpretation and application of the calculation instructions. Second, it was difficult for course instructors to retrieve the attainment results and recommended actions from previous cycles, or to compare the assessment results of multiple offered sections. Third, no restrictions or guidance were imposed on the selection of the summative assessment tools, which gave course instructors the freedom to select the summative assessment tools used for collecting the assessment data in their sections. This was another source of inconsistency in the calculated attainment results, and it complicated the aggregation of the assessment results from different sections when calculating the overall attainment result of the CLOs.

To overcome the first issue, the MS Word template was replaced in the second phase by a unified Excel worksheet developed by the LOA Unit. The worksheet utilized embedded macros to automate the verification of the submitted data and the calculation of the attainment results. Course instructors were still required to fill the worksheet with information regarding each offered section (course code, reference number, semester, enrolment, etc.), as well as the scores received by the students in each selected assessment tool. The worksheet also included a segment for course instructors to provide the section-level analysis remarks discussed in Section 4.5. The second phase coincided with the University's adoption of a centralized online course e-portfolio repository system in place of the traditional hardcopy portfolios, which significantly simplified the submission and retrieval of the completed worksheets. Using the embedded macros improved the accuracy of the calculated attainment levels. However, a few course instructors argued that filling the worksheet was somewhat problematic and required technical competencies they did not have. It was also still complicated to aggregate the attainment results from different sections and compare them with the results from previous offerings. Moreover, the LOA Unit noticed several copy-and-paste errors in the uploaded worksheets, as course instructors used already filled worksheets as initial versions for developing new ones. These types of errors were difficult to detect by the verification macros, as the data provided by the course instructors were valid but used for the wrong sections.

During the third phase, LOAMS was designed and developed in-house. All the issues discovered during the first two phases were addressed during the system design. The system was very well received by the University community, as it streamlines the assessment process and allows administrators and quality assurance officers to effectively manage the implementation of the assessment process. The system was used by 574 course instructors during the Fall semester of the 2019-2020 academic year (pre COVID-19 pandemic) to successfully assess 921 of the 1107 offered courses (83%), as shown in Table 3. Table 3 also shows that the 1107 courses were offered in 2489 sections, of which 2002 (80.4%) were assessed, and that the assessed courses have a total of 5467 CLOs, of which 4732 (86.6%) were assessed and 3987 were attained.
With respect to closing the assessment loop, a total of 1599 remedial actions were recommended by the course committees, of which 1121 were still new, 37 in progress, 63 implemented, and 378 already closed. Due to the COVID-19 pandemic, all course activities moved online starting from the second half of the Spring 2020 semester. These included final exams, presentations, and final reports, which were administered online through the Learning Management System (LMS). Fortunately, this disruption in the delivery mode had no effect on the implementation of the assessment process. Having LOAMS in place allowed course instructors to complete the assessment process seamlessly. The system has two features that allow course instructors to automatically extract the assessment data for the assessment tools administered through the LMS. The assessment statistics for the Spring 2020 semester are presented in Table 3. Compared with the assessment statistics of the previous (pre-pandemic) semester, it is evident that the disruption in course offering due to the COVID-19 pandemic had no negative effect on the implementation of the assessment process. On the contrary, automatically extracting the assessment data from the assessment activities administered online simplified the assessment process further and encouraged more course instructors to participate. Collected statistics show that 612 course instructors participated in the Spring 2020 assessment cycle, compared with 574 instructors in the previous semester. Table 3 shows that 994 of the 1137 offered courses were successfully assessed (87%), up from 83% in Fall 2019. The number of assessed sections also increased by 6%, from 2002 to 2129, and the number of assessed outcomes increased by 7%, from 4732 to 5071.

The adoption of the OBE model has received significant attention from the higher education community over the last two decades, as it promises a new paradigm for enhancing the teaching and learning process. The successful implementation of the OBE model relies on the accuracy and effectiveness of the assessment processes utilized to ensure that students are achieving the intended learning outcomes after the completion of each learning module. However, learning outcomes assessment processes still face several challenges that affect their implementation. We believe that the vague definition of responsibility and accountability for assessment tasks, the difficulty of retrieving and comparing collected assessment data over a sequence of offerings, and the lack of effective mechanisms to follow up on the implementation of the recommended remedial actions are among the main challenges. Moreover, assessment processes depend heavily on the contribution of faculty members to the collection of assessment data and the analysis of attainment results. Nevertheless, having a defective assessment process in place is the main reason for faculty reluctance to participate effectively in the process, as they do not usually notice any improvement in the students' performance over several cycles of the assessment process. To overcome these challenges, this paper presented a unified course learning outcomes assessment process that has been recently defined and implemented by UAEU. The process carefully articulates the different assessment tasks while defining the entities responsible for carrying out each task and the associated timeline.
The process defines the terms of reference for a hierarchy of assessment committees and entities, which fosters faculty engagement and participation in the development and implementation of the assessment process and formalizes the responsibility and accountability of the process implementation and quality assurance. The paper also presented the online management system (LOAMS) that has been designed and deployed in-house to streamline the implementation of the unified assessment process and assure its effectiveness and accuracy. The system has an embedded workflow and enforces several measures to ensure the quality of each step of the assessment process. It also provides faculty, assessment committees, and quality assurance officers with infographics and reports to manage the implementation of the assessment process effectively. LOAMS was successfully deployed at the beginning of the 2018-2019 academic year and has been used since then to streamline the assessment process for more than 1000 courses every semester. The ability of LOAMS to provide attainment results segregated by major, track the implementation of recommended actions and their impact on the continuous improvement of student learning, and generate detailed assessment reports, among many other features, has contributed immensely to the successful national and international accreditation of the offered programs. Moreover, the new features provided by the system during the COVID-19 pandemic have been instrumental, as they enabled a seamless implementation of the assessment process despite the move of all learning activities online. The results of a recently conducted survey indicated that more than 82% of the participants agreed or strongly agreed that LOAMS has reduced the complexity of submitting the assessment data and generating the required assessment reports, while 72% agreed or strongly agreed that the QA measures embedded in LOAMS help department/college administrators track the status of the assessment process effectively.

We wish to confirm that there are no conflicts of interest associated with this publication and no influence from the funders of the study.

Ethics approval: This research did not include human or animal participants; hence, formal consent is not required.

References

Advantages and Challenges of Using E-assessment
A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives
Closing the Assessment Loop. Change: The Magazine of Higher Learning
The Bologna Process: Its Impact in Europe and Beyond. UNESCO
European Commission, Directorate General for Education and Culture
Designing and Teaching Courses to Satisfy the ABET Engineering Criteria
How Reliable Are the VALUE Rubrics?
Meeting the Challenges of Assessing Student Learning Outcomes
Using VALUE Rubrics to Evaluate Collaborative Course Design
Outcome-Based Education: The Future Is Today
Over the Threshold: Setting Minimum Learning Outcomes (Benchmarks) for Undergraduate Geography Majors in Australian Universities
WSCUC's Community of Practice for Advancing Visibility of Learning Outcomes Assessment. Presented at the Association for the Assessment of Learning in Higher Education (AALHE)
Learner-Centered Assessment on College Campuses: Shifting the Focus from Teaching to Learning
Evidence of Student Learning: What Counts and What Matters for Improvement. In: Using Evidence of Student Learning to Improve Higher Education
A Review of Assessment of Student Learning Programs at AACSB Schools: A Dean's Perspective. The Journal of Education for Business
Writing and Using Learning Outcomes: A Practical Guide
Fostering Greater Use of Assessment Results: Principles for Effective Practice. In: Using Evidence of Student Learning to Improve Higher Education
The State of Learning Outcomes Assessment in the United States. Higher Education Management and Policy
Knowing What Students Know and Can Do: The Current State of Student Learning Outcomes Assessment in US Colleges and Universities
Outcomes Assessment in Higher Education: Challenges and Future Research in the Context of Voluntary System of Accountability. Educational Measurement: Issues and Practice
The Australian Qualifications Framework
The European Qualification Framework: Skills, Competences or Knowledge?
Striving for Excellence in Program Outcomes Assessment
QFEmirates Handbook [WWW Document]
Assessment of Student Learning Outcomes at the Institutional Level
Educational Assessment and the AACSB
Students' Motivation and Learning Outcomes: Significant Factors in Internal Study Quality Assurance System
Implementing Outcomes Assessment on Your Campus
The Bologna Process: Developments and Prospects
The Bologna Declaration: Enhancing the Transparency and Competitiveness of European Higher Education
Assessing Student Learning Outcomes in Higher Education: Challenges and International Perspectives. Assessment & Evaluation in Higher Education