key: cord-0051414-n9jom0ih
authors: Jahnke, Isa; Meinke-Kroll, Michele; Todd, Michelle; Nolte, Alexander
title: Exploring Artifact-Generated Learning with Digital Technologies: Advancing Active Learning with Co-design in Higher Education Across Disciplines
date: 2020-10-09
journal: Tech Know Learn
DOI: 10.1007/s10758-020-09473-3
sha: 525efe9fb762a3bfe00f9545205a0f37ddb416de
doc_id: 51414
cord_uid: n9jom0ih

Active learning strategies increase student engagement and performance; however, students often resist such instructional interventions. Groupwork can be useful for overcoming this resistance. In addition, digital technology can be used to redesign courses to add active learning techniques and to support learning with, rather than from, technologies. We developed a set of active learning strategies for a digital environment that we call artifact-generated learning (AGL). The aim of this exploratory research was to study student engagement in an AGL setting, focusing on how students work together and on student satisfaction, motivation, and roles. We conducted an open course with students from various disciplines. We applied Chi's conceptual framework of the three active learning levels of active, constructive, and interactive (interactive is the highest level) to study the AGL intervention in student groups. Focus groups, observations, and online questionnaires were used to analyze group interaction. The results, presented for four student groups, indicate that two groups were active-constructive, one group was interactive, and one group reached beyond the highest level to what we call the co-design level. The implications raise awareness of the need to distinguish between active learners and co-designers: an interactive group is not necessarily a group of co-designers. A co-designer is an active student who also acts beyond the given course design, constructively searching for and utilizing other resources in order to accomplish set goals. To overcome student resistance, a new process-based assessment format may help students become co-designers at the group level.

Research shows the importance of active learning strategies in higher education. Active learning increases student performance and improves student outcomes (Ruiz-Primo et al. 2011; Freeman et al. 2014). While passive learning means students are consumers of information (e.g., in lectures), active learning leads students to engage with the content in various forms, such as participating in discussions or learning by doing (Bonwell and Eison 1991). However, with web-enabled technologies in 1:1 classrooms (one device per student), students are also able to search for information online. Floridi's (2014) concept of being constantly online, before and particularly during Covid-19, pressures instructors to rethink active learning strategies in online or hybrid formats.

Learning with digital technologies can mean that the instructor wants the students to find the right answer to a predetermined problem. This becomes problematic when students have access to online resources: students may find the correct information but might not always understand why, or whether, the information they found is correct. New or enhanced forms of active learning for the digital classroom may therefore ask students to search for the correct information online but to add an explanation of why they think an answer is correct.
Or they may create novel solutions and showcase their learning growth in digital ways, such as screencasting discussions about different problem-solving strategies (Jahnke and Kumar 2014). In this study, the goal was to explore a digital version of active learning strategies in which students developed digital artifacts; we call this artifact-generated learning (AGL). AGL emphasizes that students showcase their learning progress through the iterative creation and assessment of digital artifacts or products. Expanding the concept of project-based learning, students in AGL create first drafts of their learning artifacts, receive formative assessment, and improve their work during a second or third iteration. We offered an open course for students across disciplines. In the following, we describe the course design, show how the students used and translated the active learning strategies for their needs, and describe how they developed the artifacts. The learners' experience is also presented.

Active learning consists of a group of instructional strategies or pedagogies that provide various levels of support for students to be active as learners and to engage them actively, in contrast to traditional learning settings with teacher-centered instruction (e.g., lectures). Studies show that active learning is critical because students tend to earn higher grades and to retain, integrate, and transfer information at higher rates compared to traditional learning settings (Laird et al. 2005). Studies by Freeman et al. (2014) and Akyol et al. (2011) demonstrate that active learning increases student performance and leads to improved learner outcomes of higher quality. Active learning is effective, improves student outcomes, and narrows achievement gaps in undergraduate science, technology, engineering, and math (Vetter et al. 2020; Theobald et al. 2020).

In active learning, students integrate and synthesize information with their prior knowledge. This becomes part of students' thinking as they learn to approach new phenomena and see things from different perspectives (Ramsden 2003; Tagg 2003). Active learning focuses on the meaning of information and how students comprehend the meaning of material. Marton and Säljö (1976) raised awareness of the differences between surface and deep processing (Beattie et al. 1997). Surface or passive learning can be described as memorizing information and is characterized as reproductive and unreflective (e.g., rote and routine learning), whereas active learning strategies focus on skills beyond recalling facts by involving students in activities such as analyzing, evaluating, being critical, creating new ideas, or applying multiple perspectives.

Recent work emphasizes how instructors apply active learning strategies. Fullan et al. (2018) describe new pedagogies for active learning; they provide examples from seven countries of how to engage students, situate learners in their contexts, and work with others. More digital tools for active learning are presented in Dexter et al. (2020). Hakami (2020) shows that student engagement improved with Nearpod, a digital tool that combines formative assessment and active learning in digital environments by using quizzes, polls, drawing, and open-ended questions. Dahdal (2020) shows how the use of WhatsApp promotes active learning and student engagement. Mørch et al. (2019) show how digital tools help students engage in collaborative knowledge construction.
Active learning is based on learning with others, e.g., group learning (Hodges 2018), project-based learning, or team-based learning (Michaelsen and Sweet 2008). However, students often resist active learning (Silverthorn 2020). A study by Owens et al. (2020) in science education shows different reasons for student resistance: for example, students struggle when authoritative information is absent, and students are unfamiliar with active learning as compared to traditional teacher-centered instruction. Similar results have been reported in nutrition education from the student perspective, where students who were not familiar with active learning as a format found it useful after being exposed to it during the course (Santos et al. 2020). Smith and Kennedy (2020) applied active learning strategies in an undergraduate nursing program, and their results point to students' decreased stress levels in this new form of active learning when working in groups, receiving support from faculty and classmates, and when faculty used templates or rubrics for grading the student assignments.

To summarize, research shows that active learning strategies are effective and lead to increased student performance. Students are nonetheless resistant to active learning, especially when they are exposed to it for the first time. Groupwork and instructor support are useful for overcoming such challenges. Existing work, however, does not sufficiently explore emerging forms of active learning that use digital technology in higher education. Our work aims to close this gap.

To explore active learning with digital technologies, we created an open course in which we applied strategies of digital artifact-generated learning (AGL) (see Sect. 2.1) and applied formative research to study the course design and potential improvements. For this research, we focused on student groupwork, particularly co-design and active roles (see Sect. 2.2). The research questions are:

1. How do students work together (and co-design) in an AGL context?
2. What kind of active roles does the course promote (and how can it be improved)?
3. What is the learner's perception and experience in the AGL context?

A prominent framework for studying active learning is provided by Chi (2009) and Chi and Wylie (2014). They differentiate three modes of active learning: active, constructive, and interactive. According to their model, active means doing something physically and includes searching for existing information. Constructive means producing outputs that contain ideas going beyond the presented information. Interactive means a dialogue-creating process that incorporates a classmate's contributions. Chi (2009) argues that interactive is better than constructive, which is better than active. According to the model by Chi and Wylie (2014), as reported by Hodges (2018), "both passive and active forms of engagement may help students store information, but only constructive and interactive modes promote students' abilities to infer and transfer ideas, leading to deeper, more robust learning" (p. 4). In our work we enhance Chi's work with digital technology use, inspired by David Jonassen (1996), who distinguishes between learning from technologies (through content available online and video lectures) and learning with technologies (through students' use of technologies to create new artifacts).
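To make the ordering of Chi's modes concrete, the following minimal Python sketch (our illustration, not part of Chi's or this study's instruments; the behavior examples are hypothetical) encodes the modes and classifies observed group behaviors:

```python
from enum import IntEnum

class EngagementMode(IntEnum):
    """Chi's modes of engagement, ordered by depth of learning
    (interactive > constructive > active > passive)."""
    PASSIVE = 0       # receiving information (e.g., listening to a lecture)
    ACTIVE = 1        # physically doing something, e.g., searching existing information
    CONSTRUCTIVE = 2  # producing output with ideas beyond the presented information
    INTERACTIVE = 3   # dialogue that incorporates a classmate's contributions

# Hypothetical observation notes mapped to modes, for illustration only.
observations = {
    "highlighted the assigned reading": EngagementMode.ACTIVE,
    "drew a concept map with new connections": EngagementMode.CONSTRUCTIVE,
    "debated design trade-offs, building on a peer's idea": EngagementMode.INTERACTIVE,
}

# The ordering lets engagement levels be compared directly.
deepest = max(observations.values())
assert deepest == EngagementMode.INTERACTIVE
for note, mode in observations.items():
    print(f"{mode.name:12} <- {note}")
```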
While Chi (2009) argues that interactivity is the highest level of active learning, Jonassen (1996) argues that technology can be used as a mindtool and intellectual partner. In the mid-90s, Jonassen (1996) described different types of learning with and from digital technologies. We take up his approach by asserting that existing web-based technologies can be used by students to generate artifacts or products in groups and to showcase their own learning processes in new ways. AGL is a specific form of Chi's (2009) interactive learning type enhanced with digital technology use in groups. The key constructs of AGL, a form of active learning, are (a) students develop artifacts and receive formative assessment in several iterative steps (active-constructive-interactive), (b) students work in groups (cooperative), and (c) groups showcase their learning progress by connecting classroom learning with materials outside the classroom (reflective, authentic). AGL follows the five learner-centered pedagogical principles of active, authentic, reflective, cooperative, and goal-directed learning (Howland et al. 2012).

(a) Artifact-generated learning enhances active learning in that students co-create or co-design artifacts in iterative processes of receiving feedback and developing their work further (Jahnke 2015). AGL is a group co-design learning strategy and follows an activity-based model of instruction that starts from the premise that the student does not learn through the teacher's activity, but rather through her/his own activities. This means that the artifacts do not exist yet and are not presented by the teacher. Instead, student groups develop these artifacts with digital technologies in an iterative process and improve them by receiving feedback from the teacher, from peers, or from themselves by way of guided self-reflection.
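The create-feedback-revise cycle in (a) can be sketched as a simple loop. This is a minimal illustration of the iteration idea only; the artifact, feedback sources, and number of cycles are our hypothetical assumptions, not the study's procedure:

```python
from dataclasses import dataclass, field

@dataclass
class Artifact:
    """A student group's digital artifact and its revision history."""
    description: str
    drafts: list[str] = field(default_factory=list)
    feedback_log: list[str] = field(default_factory=list)

def agl_iteration(artifact: Artifact, draft: str, feedback: list[str]) -> None:
    """One AGL cycle: the group submits a draft, collects formative
    feedback (teacher, peers, or guided self-reflection), and uses it
    to shape the next draft."""
    artifact.drafts.append(draft)
    artifact.feedback_log.extend(feedback)

app = Artifact("location-based gamified AR activity")
agl_iteration(app, "storyboard v1", ["peer: clarify the game goal", "teacher: add locations"])
agl_iteration(app, "prototype v1", ["usability test: wording too long"])
agl_iteration(app, "prototype v2", ["expert panel: ready to play"])

print(f"{len(app.drafts)} iterations, {len(app.feedback_log)} feedback items")
```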
(b) AGL is based on the premise that students become active agents (Ramstedt et al. 2016). However, research suggests students need support to take, and even to go beyond, an active agent role and thus to become critical co-designers of their own learning (inspired by Dohn and Hansen 2014). The term co-design originates from cooperative design, the act of creating together with stakeholders (business partners or customers). In this study, we adapted the term 'co-design' to the context of learning in higher education, using the principle for teaching that Martin et al. (2018) adopted from Gee (2004). Gee (2004) stated that co-design "requires that learners feel like active agents (producers) not just passive recipients (consumers)" (p. 17). Critical co-design in learning refers to actively shaping the learning process beyond the boundaries of the initial course design. In other words, students shape their own learning processes by adding new ideas or filling in missing structures beyond what is set or given by the instructor. The concept of the critical co-designer expands Chi's (2009) framework: students interact and connect to other resources by using technologies, make choices and decisions, and explain why they are doing what they are doing, why they did not choose another way, and why this is meaningful for their learning growth. Table 1 shows the differences between a passive student role, active roles, and students becoming critical co-designers: co-designing learners connect course concepts outside the classroom to places, people, or affinities, and turn those connections into tangible learning objects.

(c) Martin et al. (2018) added that co-design strategies in teaching require students to connect theoretical concepts to places, people, or affinities. Students research and develop those connections into tangible learning objects that they can share with their classmates and future students. Following Martin et al. (2018), the third key construct of AGL focuses on student groups showcasing their learning progress by connecting their classroom learning to the material world outside the classroom, in order to show that they are capable of transferring the learned content or skills to real-world practice. This approach is commonly known as authentic learning (Herrington et al. 2014).

Our study was exploratory in nature. We did not conduct a controlled experiment with defined variables and hypotheses because our aim was to understand how students work together (RQ1), which kinds of roles the study setup promotes (RQ2), and how the students perceived their experience (RQ3). Table 2 shows the broader framework that served as a basis for our study. Our study applied the general principles of a design-based research approach (Huang et al. 2019): we designed and deployed the AGL learning intervention (in an open-course digital environment) and then studied it in practice with a formative research methodology to learn how to improve the instructional strategies and how to support student groups in becoming active-constructive-interactive or even co-designers of learning.

We conducted a 16-week course in Fall 2017 at a university in the Midwestern United States. The main learning activity was for students to develop location-based gamified applications (apps). The learning objective was for students to be able to design, develop, and test gamified apps that should be novel and functional (Sternberg 1999). After the course, students should be able to make design decisions, develop apps, test apps using common usability methods, and defend their products in front of experts. We offered an award for the best app to simulate the competition of real-world projects in the Information Technology (IT) sector. The course was open and hybrid. It was open in that we invited students from all study programs and all levels. The course included five in-person (face-to-face) meetings of students and instructor, while instruction in between took place online. The student groups coordinated their own activities between the aforementioned meetings.

(Table 2, excerpt: data were collected mainly via online questionnaires; the quality of student outcomes was assessed through analysis of the groups' products, where Novel = the group product is novel or valuable in some or all parts, does not exist in its current form, and exceeds teacher expectations (Sternberg 1999), and Functionable = the product is usable and works, i.e., the gamified AR app can be used by others.)

We created the course by applying the technology development cycle, consisting of iterations of designing, developing, testing, and refining a digital prototype (Gibbons 2016; Buchanan 1992). We also used the course development framework of digital didactical design (Jahnke 2015), which focuses on the constructive alignment of teaching/learning objectives, active learning strategies, formative assessment, support of social structures, and use of technology. Instruction was scaffolded along five face-to-face workshops (Fig. 1: five face-to-face meetings, one meeting per month from August to December). Each of these workshops lasted two hours, and they took place from August to December. The five workshops had two purposes.
First, students received input or inspiration from the instructor team on specific topics such as design thinking and usability evaluation methods. Second, students presented their stage of development, reflected on past actions and future action items, and received feedback from peers and the instructor. Feedback is a crucial element in AGL, aims to support learner growth, and as such was part of all five meetings.

Workshop 1: Teams and Idea Formation. During the first week, Workshop 1 started with the introduction of instructor and students. The instructor presented the overall goal of the course and the requirements for the app prototype development. They also shared and discussed the three basic requirements for the app development: (a) create a digital gamified Augmented Reality activity with interactive elements; (b) make it location-based; and (c) ensure it has something to do with learning. Finally, the instructor suggested templates for group work, project management, and storyboarding. These supporting materials were provided, but their use was optional. The course platform, a Google Drive folder, and Slack, a text messaging tool for instant communication, were introduced. Groups were encouraged to use the provided materials and communication channels to facilitate their group work. In the second half of the workshop, groups were formed. They brainstormed ideas and discussed conceptual frameworks.

Workshop 2: Storyboard and Development Platforms/Software. Each group shared, in the Google folder, the conceptual frameworks and storyboards they had developed between the first and second workshops. During this second workshop, groups reported on their conceptual gamified app designs. All groups received feedback from peers and the instructor and discussed the strengths and weaknesses of their designs. In the second half of the workshop, the instructor introduced ARIS and GuidiGo as tools for mobile interface development. Both tools were offered as options for groups to develop their gamified app for learning. GuidiGo is an easy-to-use platform for people who have no experience with coding or software projects. ARIS is an open-access platform that can be used without any coding experience, but it also provides the option to add code. The groups were told that they also had the option to use alternative platforms. After this workshop, groups were asked to develop their first prototype for the next workshop, which would take place after 7 weeks. During this time, groups decided to use ARIS, GuidiGo, or an alternative software.

Workshop 3: Prototype Presentation and Usability Methods. Groups presented their first prototypes and received critical-constructive feedback from the instructor and peers on how to improve them. The second half of the workshop was used to introduce usability testing methods. Two expert teams from an accessibility and usability lab were invited for this presentation, which combined slides and active small-group work. Groups had time during this workshop to start creating plans to test their apps using the previously introduced usability methods. The groups were asked to share their usability study designs in the project Google folder before the next workshop.

Before the fourth workshop, groups conducted usability or user experience studies. During the workshop, groups presented their study results.
Presentations included their conclusions regarding design recommendations for their prototypes. Groups also received feedback from peers and the instructor on improving their apps based on the usability results. Groups had around 3 weeks to improve the apps for the last meeting.

Workshop 5: Game Day. In the final meeting, groups presented their final gamified apps. In addition to the groups, we also invited members of the larger public to play the gamified activities for learning. We sent out flyers to the entire campus and promoted the event via social media channels. Prior to the event, people were invited to play the games on campus. The event had three parts: first, a short introduction and group presentations of the four apps; second, time to play; and third, announcing the winners. The apps were assessed by an expert panel ("novel and it works") and through a public vote ("I like best…"). Certificates for the best prototypes were awarded to stimulate the competition of a real-world project.

Between the Workshops. The groups used Google Drive to share information and receive documents from the instructor. Each group had separate sub-folders, and all groups had access to the other groups' folders to be able to learn from each other. Each group had the task of coordinating their work between the meetings on their own. They were encouraged to use Slack, a text messaging tool, for communication. Slack allowed them to attach files connected to the Google Drive folder. Slack was also used for communication between the instructor and the students.

We used an exploratory study design with mixed methods (Creswell 2009). Data collection included focus group interviews (3-5 students per group) and pre- and post-questionnaires. Instead of individual interviews, we applied focus groups, conducted at the beginning and the end of the course in Fall 2017. Focus groups and student learning groups were the same; this method was selected to capture the student group learning process as well as individual and group perceptions. The focus groups were audiotaped and transcribed. In addition, photos and student group documentation, such as design documents, storyboards, usability results, and prototypes, were gathered. During the five workshops, up to three observers took field notes and photos with notes. These were used in the open coding process (see the data analysis below).

Students were recruited in Spring 2017 through an open call for participation and with the snowball method, meaning interested students invited other students to join them. All students were invited to an orientation meeting in Spring 2017. Open invitations were sent to undergraduate and graduate programs in all disciplines, such as architecture studies, business, education, engineering, journalism, library science, medicine, and nursing. We wanted to reach students across campus and beyond classical computer science programs to address computational thinking skills for students in all disciplines. The main idea was to invite students who had not worked in the IT sector, were not familiar with IT development in general, or had no practical experience with app design and development. Incentives were outlined at orientation and available upon completion of the course (e.g., credit points or gift cards). Informed consent was obtained from all participants. Two rounds of focus groups were conducted, at the beginning and the end of the course, with the same student groups.
We applied a focus group interview protocol that covered five topics. The first part included four questions and aimed to get an overview of the group activities. Questions such as "Describe your group work" and "What has gone well and what was not so good?" were included in this part. The second part asked about the design and development process. Questions in this part included "How has your group approached the design and development process?" and "Have you used the course templates? Why or why not?" The third part of the focus group was about roles and responsibilities. We asked, "What is your role(s)?" and "What tasks do you feel responsible for?" The fourth part focused on communication tools. We asked the students, "Does your group use tools for interaction, communication, and document sharing? Which ones and how?" The final part was the wrap-up. Students were asked whether the work was what they expected or different and whether they had anything else they wanted to share. At the end of the course, we held a second round of focus groups with the same questions, but we started with two additional ones: "How is your team doing? What has your team done so far?" and "Are there any highlights or challenges that surprised you?"

We conducted three online questionnaires with Qualtrics: an entry questionnaire (beginning of the course, September), a follow-up (mid-term, November), and a post-questionnaire (end of the course, December).

Entry Questionnaire. The entry questionnaire consisted of four parts with a total of 12 questions. The first part asked about gender, age, and study program (Q1-3). The second part focused on student motivations. We applied an instrument inspired by Filippova et al. (2017). It allowed the students to state their level of agreement with ten options such as "have fun", "get to know new people", "interest in usability", "sharing my experiences", and "others" (Q4). We also asked how familiar students were with web development platforms (Q5), with five possible answers ranging from "never heard about it" to "know it well". The students also had the opportunity to state whether they had prepared anything for the course (Q6). The third part asked about teamwork experience and communication tools (Q7-9). The fourth part focused on responsibilities (Q10-12). One question asked, "What do you consider to be your responsibilities as part of your team?" Multiple answers were possible, such as leading the project, programming, designing content, and testing.

Follow-Up Questionnaire. The follow-up questionnaire consisted of six parts with a total of 11 questions. It focused on expectations, student experience, the team coordinator role, and satisfaction with the group work, including the individual contribution (see Table 3).

Post-questionnaire. The post-questionnaire had 11 main questions in eight parts. Similar to the follow-up questionnaire, it focused on student experience and satisfaction with the individual contribution and group performance. In addition, it asked about student satisfaction with the group outcome and added follow-up questions (see Table 4).

We did not conduct intensive quantitative analyses but rather treated the questionnaires as additional qualitative data points that provide background about the students' perceptions and experience. For the open-ended questions and qualitative data (transcripts from observation data and focus groups), we applied the thematic analysis method (Ely 1991) with open coding.
The emerging bottom-up semantic codes became the labels of categories. The data analysis was iterative and recursive and was guided by the constitutive elements of the key constructs presented previously in Table 2. To capture learner experience, group dynamics, and interaction, we adopted the role theory by Jahnke (2010) as a lens to uncover how students experienced and perceived the course and their groups' collaboration. Four main aspects guided the analysis: formal and informal roles, tasks and responsibilities, formal and implicit expectations, and interaction. We analyzed the data with the student groups as the unit of analysis. We combined and analyzed the observations, focus groups, and surveys for each person and group. We then identified crucial situations in which students experienced problems or dilemmas during groupwork that affected or hindered the group's progress. We compared the student groups according to their active learning level, group interaction, and roles. To ensure reliability, interrater agreement was established by means of a content validation and peer-review procedure in which three researchers went through the data and coded them in an iterative process (Bauer and Gaskell 2000).

(Table 3: follow-up questionnaire after 8 weeks of the 16-week course; Likert items have five answer options, from strongly disagree (1) to strongly agree (5). Table 4: post-questionnaire at the end of the course; the work happiness scale has seven answer options: 1 = Never, 2 = Almost never, 3 = Rarely, 4 = Sometimes, 5 = Often, 6 = Very often, and 7 = Always. The # items column refers to the total number of items included in the scale (Schaufeli et al. 2006).)

Fifteen students signed up for the course. One student dropped out after the first workshop. The remaining 14 students came from undergraduate (7), graduate (2), and doctoral (5) programs. They represented different disciplines such as engineering, business, digital storytelling, architecture studies, journalism, learning technology, library science, nutrition and fitness, and marketing (see Table 5). None of them had significant experience with prototype or IT development. Nine students were between 19 and 29 years old, four were older than 30, and one did not reveal her/his age.

In the entry questionnaire, all 12 respondents reported networking, career opportunities, and getting to know new people as motivations to participate. Ten wanted to learn something new and were interested in augmented reality and game technology. Eight participants were interested in usability. Seven participated for fun. Six said they wanted to "do something that I would not do normally." Five stated they wanted to share their experience and expertise. Students perceived their responsibilities differently. Nine participants said they perceived their responsibility as "designing content," while eight said "creating storyboards." Six were interested in leading a team; however, only four teams were formed. Participants reported they wanted to do "programming, coding, fixing bugs" (n = 6) or be responsible for "testing and reporting errors" (n = 5).

In the first workshop, students formed four groups according to interests or already established relationships. We asked the groups to give themselves a group name and to present the name and the idea for their gamified AR activity in the second workshop. Table 6 provides an overview of the groups and their ideas.

Table 6: The four groups and their apps
Group 1 - Word Scramble (getting to know campus): find campus locations and receive a letter tile to unscramble the final word at the end of the game.
Group 2 - Why Me? (learning campus history): each location has a clue for the user to understand campus culture and history.
Group 3 - Libway (getting around the library): a digital robot helps students find their way around the library and locate books using a wayfinder map.
Group 4 - Ethical Dilemma (putting ethical theories into practice): Artificial Intelligence (AI) has taken over campus; your decisions affect everyone on campus in this story-based game.
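To make concrete what such a location-based gamified activity involves, here is a minimal sketch of a scavenger-hunt data model in the spirit of Group 1's Word Scramble game. It is our hypothetical illustration only; it does not use or represent the ARIS or GuidiGo platforms, and all names and values are invented:

```python
from dataclasses import dataclass

@dataclass
class Station:
    """One location-based stop: a clue leads players to a campus spot,
    where completing a task earns a letter tile for the final puzzle."""
    location: str        # campus location name
    clue: str            # hint that leads players to the location
    task: str            # e.g., answer a quiz question, take a photo
    letter_reward: str   # scrabble-style tile earned on completion

stations = [
    Station("Library steps", "Where knowledge is shelved", "Take a photo of the entrance", "C"),
    Station("Student union", "Where the clubs meet", "Answer: in what year was it founded?", "A"),
    Station("Bell tower", "It rings on the hour", "Count the bells", "T"),
]

def solve(collected: list[str], final_word: str) -> bool:
    """Players win if their collected tiles unscramble to the final word."""
    return sorted(collected) == sorted(final_word)

tiles = [s.letter_reward for s in stations]
print("Unscrambled correctly:", solve(tiles, "CAT"))  # True
```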
Table 7 compares the groups: their different roles, actions, coordination strategies, chosen software, and the skills the students reported learning. The comparison is based on our analysis of observations, workshops, focus groups, and questionnaires.

Group 1: Word Scramble. Three students were in this group. The group developed a digital scavenger hunt for certain sites on campus. In the second workshop, the group shared their storyboard. The location-based app leads players to different areas on campus and asks them to find the answer to a clue, take a photo, and receive a virtual scrabble letter toward completing the final puzzle. The group decided on the locations, took photos of them, and developed quizzes. In the third workshop, the group presented their first prototype. In the fourth workshop, the group presented usability study results that helped them refine the prototype. On game day, the app was ready to use. Overall, the group met the goal of designing, developing, testing, and refining the application. The group won the Experts' Choice Award for the best functionable prototype.

The focus groups revealed that each group member independently completed a portion of the assigned work by the due dates. They focused on the task, and group members expressed a positive experience. With regard to the technical platform GuidiGo, Member 2 mentioned, "It's very user friendly and anybody can learn how to use it." However, the group members also emphasized that with GuidiGo there was no opportunity to learn programming skills. Member 1 said, "Yeah. I share the same idea with Member 3 and Member 2. So, we don't improve our IT skills in this project. But we learned other skills, like interaction skills, group work skills." Member 2 reported, "I personally learned usability testing. That was something really interesting that I'd never done before so. I thought that was good to know how to do." With programming being important to the group members, it would have been better for the group to use ARIS over GuidiGo if their goal was to learn how to program, but the group did not change their original decision. From a research perspective, the question is why they did not change to ARIS. The data do not give a sufficient answer, except that they stuck to their original decision; it is unclear why they were not able to make a change. ARIS provided more programming opportunities, and both tools were mentioned during the first workshop and introduced in detail in the second. One theory is that the three members noticed this problem late in the course, and it then felt too late for them to change to ARIS.

Group 2: Why Me?. Five students were in this group. In the first workshop, the group said they wanted to create a campus tour. They did not change this throughout the course and developed a functional prototype. In the second workshop, the group presented the storyboard, the physical flow of the campus tour, and some media files (images, video, and audio). In the third workshop, the group introduced their first prototype and received feedback with design suggestions from the other groups and the instructor. They made only minor changes to the game design. The group leader presented the usability results in the fourth workshop. They reported issues with the game's wording and the number of locations, as their users said it was too long and included too many locations. The group decided to revise the prototype.
They made changes to the game flow. On game day, the group presented their ready-to-play prototype. The team won the People's Choice Award, decided by public vote, for the best functionable prototype.

In the focus groups, the group reflected on their experience, what they learned, and what they might do differently. They did not believe they would have made any major changes to their project management approach, leadership, or design concept. While they enjoyed the time working together, they were dissatisfied with the GuidiGo software due to its limited capabilities to support the ideas they wanted to implement. They also stated they would not participate in this type of event again due to the time involved and the concern that the skills they developed were not very relevant for their future work.

Group 3: Libway. Four students were in this group. They had the idea of supporting students in libraries in better understanding how to find certain books. They developed a wayfinding game that allows students to locate a book title on a specific shelf and row. In the second workshop, the group coordinator presented the storyboard. The group added ideas to support student navigation through the college library system using QR codes. The group created the game character, Rob the Library Robot, who virtually helps students find a book. The group demonstrated the prototype, including the wayfinding mapping link, in the third workshop. Member 4 volunteered to conduct the usability testing. S/he organized the usability evaluation in the math library. In the fourth workshop, s/he reported that the users pointed to several design recommendations. The group improved their prototype. They showcased the app on game day, although there were some software problems such as breakdowns. The group had a partial prototype ready to play.

In the focus groups, the group described their motivation for remaining persistent to the end despite their struggles with coding and software programming. Member 1 stated, "To create programming plus thinking like a programmer was sort of the most challenging but also the most rewarding." The group reported that they learned new JavaScript skills and how to develop mobile apps while applying project management skills. Member 2 added, "Yeah, I still feel like I could definitely improve in that area (programming skills)". There were a few times when members felt like giving up because the development environment crashed, resulting in the loss of large parts of their work up to that point. The group reported that they remained motivated to finish because of their personal work ethic and their commitment to the project and the group. The group reported that overall their conceptual framework did not change, but the big picture did. Being involved in this project influenced their thinking about how a digital design and development project works.
They reported that they gained a lot of new knowledge by working together. Member 2 said, "Since there are so many moving parts, it just shows how having different roles it can be really helpful."

Group 4: Ethical Dilemma. This group began with three students. However, one left after the first workshop, so only two students remained until the end of the course. The group came up with the idea of transforming ethical dilemma theory into a digital game. The game would place players in situations at different places on campus in which they would need to make ethical decisions. More specifically, the game would present students with ethically difficult situations that happened at those locations in the past and ask the users to make decisions. In the second workshop, the group shared a form of storyboard. The group based the game choices on researched theories from the ethics and philosophy disciplines. The group showed the first prototype in the third workshop. Member 2 spoke about the prototype in particular because it was originally her idea and research area. When Member 1 started explaining the context of the design, Member 2 politely interrupted and took over. Member 1 said later in the focus group that they "are both strong headed and have opinions. Even though it is a good thing, it gets in the way." The group shared their usability results in the fourth workshop. The group struggled with usability testing because of distance and scheduling problems, so each member found a user near their location and tested the app with them separately. Both members agreed they needed more time to finish the game. On game day, the group was excited to present their prototype, although they were disappointed that the game was not completed as they had hoped. They presented a visual mockup instead of a functionable prototype.

The focus group indicates the group had a good experience overall. However, they felt the development platform limited them in creating the final prototype as initially envisioned. They also said it should be a yearlong project rather than a semester. Both members were satisfied with the computational skills gained from the experience. Member 2 would have liked a group of coders available to help her succeed more quickly. She perceived herself as an idea creator and wanted people around to make those ideas happen. Member 1 was pleased with the group and especially proud of the work Member 2 had accomplished. After the course finished, this group continued working on improving their digital game.

Follow-Up Questionnaire. Table 8 provides an overview of the results from the follow-up questionnaire conducted after the second workshop. It indicates that the course confirmed most students' expectations (mean = 3.90); only two students expressed disagreement. Students agreed that the tools, templates, and materials were useful overall, including the design framework and prototyping templates (mean = 4.13). Students were satisfied overall with their projects (mean = 4.14). All students agreed they could contribute meaningfully to their group, had a voice, and participated actively (mean = 4.57). The majority of students perceived their leader to be a good coordinator. (Table 8: follow-up questionnaire after 8 weeks of the 16-week course; Likert answer options ranged from "strongly disagree" (1) to "strongly agree" (5).)
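Descriptive statistics of this kind reduce to means and standard deviations over Likert responses. The following minimal sketch (our illustration; the response values are invented, not the study's raw data) shows how a large standard deviation, like the one reported below for work happiness, signals polarized rather than uniform answers:

```python
from statistics import mean, stdev

# Hypothetical 7-point work-happiness responses (1 = Never ... 7 = Always).
# A polarized group: some students almost never happy, others very often.
responses = [1, 2, 2, 6, 6, 7, 5]

m, sd = mean(responses), stdev(responses)
print(f"mean = {m:.2f}, sd = {sd:.2f}")
# A mean near the scale midpoint combined with a large sd means students
# experienced the course very differently, not uniformly "sometimes" happy.
```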
Post-questionnaire. Table 9 shows the results from the post-questionnaire conducted at the end of the course. Student satisfaction with the individual contribution (mean = 4.11) remained almost identical to the follow-up questionnaire. The majority of the participants were satisfied with the outcome (mean = 4.14). The participants said that the groups' final output reflected the students' individual input (mean = 4.56). However, the project outcome did not meet student expectations in comparison to when they joined the project at the beginning: expectation confirmation had a mean of 3.90 in the middle of the course compared to a mean of 3.33 at the end. The work happiness results indicate a mean of 4.14 on a scale with seven answer options, meaning that the average student was only "sometimes" happy to work in the course. The standard deviation, however, is high at 1.65, which means students experienced the course very differently. In other words, they were either in favor and very often happy or, at the other end of the scale, almost never or rarely happy. The post-questionnaire shows that students agreed they were able to contribute meaningfully to their group, had a voice, or participated actively (mean = 4.7), similar to the result obtained in the follow-up questionnaire. However, students were rather uncertain as to whether they intended to continue working on the app afterwards, stating they "might or might not" (mean = 2.9). When asked what they would do differently (open-ended), students mostly referred to the technology development process and said they would "spend more time on developing and less time to justify the product with evidence from literature," "try to fix the technical part earlier," "use a more complicated game platform," or "use a more challenging development platform to work with." (Table 9: post-questionnaire at the end of the 16-week course; Likert items with five answer options from "strongly disagree" (1) to "strongly agree" (5).)

We also asked students what the course could do better to support them. Students mostly answered that they wanted more guidance or information from the instructor and said, "give us a programming expert," "give us requirements," "provide us more options to choose from," "provide more face-to-face training," "show us what has been done previously," "provide more support," "teach us each platform," and "give more time."

All student groups were engaged in making their app prototypes work. Group 4 had more trouble than the others adjusting their original goals and content; they ended up not presenting a complete prototype. Instead, they had a mockup combining a visual paper prototype with some elements in the digital app. Group 3 had a partial prototype ready to use. Groups 1 and 2 presented ready-to-play functionable prototypes. All groups were provided with design tools and templates (e.g., storyboarding). Some groups used them; others used them as a starting outline and later created their own or used tools they thought fit their needs better.

All students demonstrated active agent roles. However, the group members shaped their roles and actions differently. On the group level, observable in the group members' interaction (Stahl 2006), the four groups showed different levels of active, constructive, interactive, and co-designing learning, or a mix (see Table 10). Group 3 showed evidence of moving from an interactive to a co-designing learning level with a novel idea, and they ended with a partially functionable prototype.
Groups 1 and 2 showed active and constructive levels with functionable prototypes but no novel concepts; they created digital campus tours, which already exist. Group 4 showed constructive and interactive learning levels; while they had a novel idea, they did not have a functionable prototype. Three crucial situations, described below, support these results. In such situations, students experienced a dilemma between what they wanted to do (i.e., their idea or vision) and the course reality, or what is actually possible and doable when developing technology.

First, only two groups (1 and 2) had functionable prototypes, but they were disappointed that they did not learn more advanced programming skills. Both groups decided to stick with GuidiGo, where no programming skills are required. Group 1 reported that they missed programming opportunities. Although the ARIS tool provided these options, the group kept GuidiGo. The students in the group stuck with their first decision, which was not a choice that aligned with their stated desire to learn software programming. The group made an early decision and was not open to changing to the other tool later. Even when they discovered that more programming features were needed, they did not switch to ARIS to review its capabilities. Group 2 experienced similar issues. They reported that GuidiGo's technical capabilities did not offer the desired functionality. Throughout the course, both Group 1 and Group 2 faced software limitations, and the group members were dissatisfied with the software due to its limited capabilities to support the ideas they wanted to implement. Groups 3 and 4 used ARIS, where programming skills were useful. Group 4 had difficulties but did not reach out to an external resource or to the instructor. Group 3 approached the problem differently: they also stuck with the platform they chose originally, but they asked a computer science student to help them create a solution for their game. This is an indicator that Group 3 members became co-designers.

Second, Group 3 reported that they wanted more structure and scaffolding. Member 3 said, "I feel it's more visual components and less structure… So, I guess maybe in the future we get … more structure for templates or more tools so we can see more examples." As Group 3 faced the challenge of missing structure, the group started to create that structure themselves. For example, the group had coding issues but solved them by asking an outside person for help. Member 1 explained, "So, to create programming plus thinking like a programmer was sort of the most challenging but also the most rewarding." Group 2 experienced similar problems. They wanted more hands-on training from the instructor in programming and coding and said that the platform they chose did not offer any additional coding work. They did not change the platform, and they did not include external resources.

Third, all groups, especially Group 4, had difficulties with time management, which is not surprising because time issues are a commonly reported problem in courses (Hodges 2018). To address this issue in advance, the course had five milestones at which the groups delivered first results. Three groups noticed time management issues and changed their plans, either by making the app smaller than originally anticipated or by cutting other functionality. Groups 1, 2, and 3 had a ready-to-play app for the final presentation day. However, Group 4 was not able to present a functionable prototype.
This group perceived the course as better suited to a yearlong project than a semester. Groups 1 and 2 had functionable prototypes, but they were not novel (campus tours). Group 4 had a novel idea but no functionable prototype (ethics dilemma game). Only Group 3 pursued a novel idea and had a partially functionable prototype ready to use (Libway). Both Groups 3 and 4 faced time management and complexity issues. While Group 3 asked an outside person to help them with programming, Group 4 did not try to find outside resources to solve their problems.

Our study provides indications that one group (Group 2) resisted active learning (Silverthorn 2020). It confirms Owens et al. (2020) in that this group struggled when authoritative information was absent and when students were unfamiliar with active learning as compared to traditional teacher-centered instruction. As one student even directly stated, "but the instructor didn't tell me". Student resistance does not mean that students are aware of the resistance; being resistant is not a choice they make. Rather, they are struggling with the absence of clear step-by-step processes or formative assessment strategies.

Group 1 was active; it did not create a novel prototype, but the prototype worked and other users were able to use it. Group 4 showed an interactive level (almost the highest form of the active-constructive-interactive group levels) and had a novel prototype, but they did not manage to make it work. Only Group 3, which faced several challenges (e.g., a lack of programming skills), managed to fill in the missing resources, which is an indicator of co-design. This group came closest to the co-design level; their prototype was novel and partially functionable.

Following Chi and Wylie (2014), and also Freeman et al. (2014), there is a positive correlation between student engagement and learning outcomes. However, to truly understand the quality of a group's active learning level, the digital artifacts (e.g., gamified AR apps) of the student groups need to be considered too. This study indicates a correlation between the active learning level and student product quality: the higher the learning level, the better the student product (novel and functionable). Further research is required to test this correlation with more student groups.

Several aspects may hinder students in becoming active learners or co-designers. Such issues may be tied to social or cultural differences, expectations, or cognitive overload (Hodges 2018). Roles are a critical factor that affected the groups (Jahnke 2010). Group leadership roles formed throughout the design process; however, the groups acted differently. Group 1 and Group 2 followed their group coordinators and were organized in a clear hierarchy; one member even asked during a workshop, "Are the group leaders doing the work or the group?" Groups 3 and 4 showed more equality among the members, with Group 4 working more collaboratively than Group 3.

The second aspect is that some students felt they were not allowed to break out of the given course design. They felt that they were not allowed to change software platforms. Our analysis did not provide a compelling reason for this student perception: in Workshop 1 it was clearly stated, and emphasized several times, that they could freely choose their approach and tools.
One reason for this behavior might be that students are so deeply embedded in an existing, rather closed culture in higher education that they never considered what the instructor really meant by the option to act freely. Expanded meta-cognition methods could be one option to help students clarify the options they have as well as the instructor's and their own expectations. We conclude that the course design needs improvement, for example, by adding an element that helps students talk, express, and reflect about the choices they have and the decisions they make. This can be supported by coaching, for instance. Additional research is needed to explore support structures for the co-designer level in AGL.

The study provides indications that artifact-generated learning is a useful strategy to foster co-design learning with, not from, technologies. The student groups we studied developed new products and showcased their cognitive connections and learning growth through the artifacts they created. They all learned about design, development, and testing, as shown in the presentations across the five workshops and the products in the end. However, not all groups reached the level of critical group co-design. As defined previously, the role of a critical co-designer requires groups to go beyond the given course framework, add their own ideas, and fill in missing structures that the instructor could not have foreseen or deliberately does not want to fill, for example, to give groups a balance between guidance and freedom for their own ideas.

When we offered the course, we applied the artifact-generated learning strategy, which in this context means shifting the thinking from the instructor onto the students. This implies that students do not just follow the course design or the ideas of the instructor; rather, they break out of receptive habits and co-design their learning processes in the moment of need. While we thought our course provided a suitable structure for the students (we made sure to apply the AGL approach and cross-checked the quality with the five principles of active, authentic, constructive, cooperative, and goal-directed learning; Howland et al. 2012), the results show that three of four groups did not take the role of a critical group co-designer.

Besides obvious suggestions such as clear instruction and clear communication (which we thought we had provided), we learned that more specific guidance is required for students to become critical co-designers in groups. First, what might be clear to instructors might not be clear to students. It is crucial to communicate expectations several times, not only at the beginning of the course. In addition, it might be useful to add a specific question that helps students express and reflect on their choices and decisions and prompts them to reverse decisions or make other ones. Creativity methods can be deployed to support students; for example, students can be asked to think outside the box or to find many possible ways before making a decision. Looking back, it becomes apparent that we did not explicitly ask students in the five meetings, "Why did you do it the way you did?" After this course experience, we believe it is also important to ask, "Why didn't you do it another way?" Embedding reflective coaching elements into the learning process several times, as an iterative process rather than only at the end, may help students understand how to make necessary changes during the learning process.
As we learned from this course, to encourage students to apply co-design at the group level, we suggest a process-based assessment procedure that we developed based on our findings from studying the course. We suggest making the four-field matrix in Fig. 2 and the procedure available to students at the beginning of the course and as a reminder in each workshop. It is inspired by Herrmann (2012) and based on Amabile's Consensual Assessment Technique (1982). The process-based assessment procedure for AGL applies two steps: first, students create something; second, students (peers in the classroom) and domain experts evaluate the student-generated artifacts. The artifacts are not rated against an absolute standard but compared with each other (Said-Metwaly et al. 2017). This formative assessment format can be applied after the student group's idea pitch, first prototype, and final prototype. This procedure may foster students to become co-designers. (Fig. 2: process-based formative assessment form for supporting group co-design in an AGL context. The correlation of student products and co-design level needs further study.)
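As an illustration of the two assessment dimensions behind the matrix (novel and functionable), the following sketch classifies artifacts into the four fields. It is our simplified reading of the idea, with the four groups' outcomes as reported in this study; the rating inputs are illustrative and do not reproduce the actual assessment form:

```python
def assessment_field(novel: bool, functionable: bool) -> str:
    """Place an artifact into the four-field matrix (novel x functionable)."""
    if novel and functionable:
        return "novel and works: strongest evidence of group co-design"
    if novel:
        return "novel but does not (fully) work yet"
    if functionable:
        return "works but is not novel"
    return "neither novel nor functionable yet"

# The four groups' outcomes as reported above.
groups = {
    "Group 1 (Word Scramble)":   dict(novel=False, functionable=True),
    "Group 2 (Why Me?)":         dict(novel=False, functionable=True),
    "Group 3 (Libway)":          dict(novel=True,  functionable=True),   # partially functionable
    "Group 4 (Ethical Dilemma)": dict(novel=True,  functionable=False),
}

for name, rating in groups.items():
    print(f"{name}: {assessment_field(**rating)}")
```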
Although the instructor encouraged all groups to break out of the given frame right from the start, only one of the four groups actually did so. Instead, in the focus groups and questionnaires, the students of the other three groups expressed a desire for a more formalized structure, more content on design theory and tools, and more options to choose from, including more time to finish the product. As this small exploratory study suggests, a critical group co-design approach is needed in AGL; without it, group artifacts lack either novelty or functionality. With this research, we contribute to Chi and Wylie's (2014) framework of active-constructive-interactive and expand their original three levels with a new level 4 of co-design, in which the learner or the group adds new ideas and fills missing structures beyond what is set or given by the course framework or the instructor. Further research is needed to investigate support structures for students becoming co-designers on the group level in artifact-generated learning with technologies.

References

The impact of course duration on the development of a community of inquiry
Social psychology of creativity: A consensual assessment technique
Qualitative researching with text, image and sound
Deep and surface learning: A simple or simplistic dichotomy? Accounting Education
Understanding information systems continuance: An expectation-confirmation model
Active learning: Creating excitement in the classroom
Wicked problems in design thinking
Active-constructive-interactive: A conceptual framework for differentiating learning activities
The ICAP framework: Linking cognitive engagement to active learning outcomes
Research design: Qualitative, quantitative, and mixed methods approaches
Using the WhatsApp social media application for active learning
(Inter)Active learning tools and pedagogical strategies in educational leadership preparation
Is learning designs something you think, do, live with, react to, or conceptualize with?
Doing qualitative research: Circles within circles
From diversity by numbers to diversity as process: Supporting inclusiveness in software development teams with brainstorming
The fourth revolution: How the infosphere is reshaping human reality
Active learning increases student performance in science, engineering, and mathematics
Deep learning: Engage the world change the world
Learning by design: Games as learning machines. Interactive Educational Multimedia
Design thinking 101
Using Nearpod as a tool to promote active learning in higher education in a BYOD learning environment
Authentic learning environments
Kreatives Prozessdesign
Contemporary issues in group learning in undergraduate science classrooms: A perspective from student engagement
Meaningful learning with technology
Design-based research
Dynamics of social roles in a knowledge management community
Digital didactical designs: Teaching and learning in cross action spaces
Digital didactical designs: Teachers' integration of iPads for learning-centered processes
Computers in the classroom: Mindtools for critical thinking
Measuring deep approaches to learning using the national survey of student engagement
13 Principles of good learning in games - applied to teaching
On qualitative differences in learning: I - Outcome and process
The essential elements of team-based learning
Emergent practices and material conditions in learning and teaching with technologies
Student motivation from and resistance to active learning rooted in essential science practices
Going farther together: The impact of social capital on sustained participation in open source
Learning to teach in higher education
Rethinking chemistry in higher education towards technology-enhanced problem-based learning
Impact of undergraduate science course innovations on learning
Approaches to measuring creativity: A systemic literature review
Students' perspective on active learning in nutrition education
The measurement of work engagement with a short questionnaire: A cross-national study
Active learning in college science: The case for evidence-based practice
Authentic teaching to promote active learning: Redesign of an online RN to BSN evidence-based practice nursing course
Group cognition: Computer support for building collaborative knowledge
The learning paradigm college
Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math
Effectiveness of active learning that combines physical activity and math in schoolchildren: A systematic review

Acknowledgements We thank Shann Bossaller very much for contributing to the project with student support and data coding. We are grateful to the University of Missouri (MU) Interdisciplinary Innovations Fund, which supported the project. We thank the Information Experience Lab students who helped by providing resources for the usability study and the MU Adaptive Computing Center, which provided accessibility evaluation resources. We also thank the students who participated in this project. Thank you very much to all of you!

Funding Funding was provided by the MU Interdisciplinary Innovations Fund (Grant No. 2016).

Ethical Approval All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.