1 THE CHALLENGES OF MIGRATING AN ACTIVE LEARNING CLASSROOM ONLINE IN A CRISIS
The coronavirus disease of 2019 (COVID‐19) has caused significant harm to our civilization. The economic damage and organizational chaos that ensued have shown that many organizations could not function as before. They needed to evolve and hastily adopt the technological advances of the digital age. In education, institutions had to replace traditional classrooms with online courses to support distance learning. Luckily, there were examples of e‐learning technology use before the crisis, as universities had adopted learning management systems (LMS) and utilized multimedia educational materials.
However, transitioning from an LMS‐supported traditional classroom to a fully online classroom is no small feat. Students face many challenges that hamper their motivation and devotion to a course, and without face‐to‐face communication, their engagement declines [12, 22]. Furthermore, instructors lack awareness and training for developing engaging digital educational content, which can exacerbate the problem [12, 22].
In this paper, we analyze the success of transforming three mandatory undergraduate software engineering courses at our university from a traditional classroom to a fully online classroom. Before we needed to transition online, we searched the background literature for the best practices to develop effective digital educational content and run a fully online active learning classroom.
After researching the best practices, we conducted an empirical study to evaluate our success and determine (1) which factors influence students’ engagement with the course content and (2) how to create effective digital educational content. Our empirical study encompassed:
observing student interactions with course materials, course staff, and other students;
conducting focus group discussions with students;
administering two surveys, one mid‐semester, at the peak of the COVID‐19 crisis in Europe, and one at the end of the semester. Around 90 students volunteered to complete the surveys.
Based on the collected data, we formulate recommendations for creating and conducting online courses that are both effective and engaging.
The contributions of this paper are as follows:
We provide a catalog of the digital elements used in our online active learning classrooms and discuss their use and development. This catalog can aid educators in developing their own digital educational content.
We offer a detailed explanation of how we designed our study to assess factors influencing student engagement and the quality of digital learning elements. Our study design may aid researchers in designing evaluative methods for similar studies.
We provide insights into the factors that influence student engagement with online courses. Based on these insights, we offer recommendations that can help teachers motivate their students in an online classroom.
We provide insights into the type of digital educational content most appreciated by contemporary students. The recommendations based on these insights can help educators create more effective digital educational materials.
As our study includes both first‐ and third‐year software engineering students with different backgrounds, our findings are generalizable to most software engineering students. As most of our work relies on widespread digital technologies (e.g., video), it should also be relevant for most contemporary students. Finally, while our study focuses on fully online learning during a crisis, most of the findings apply to any digitally supported classroom. Therefore, we believe that our findings can help educators in many fields develop engaging digital educational content and run a better online or blended learning course.
We organize the rest of the paper as follows. Section 2 explains our course design and the background work that guided it. Section 3 describes our study design. Section 4 presents the results we obtained by analyzing and interpreting the collected data. In Section 5, we discuss our findings: first, we point out limitations that might affect the generalizability of our results; then, we list our recommendations based on the obtained conclusions. Next, in Section 6, we analyze published papers closely related to our study. Finally, Section 7 concludes the paper and discusses areas for further research and plans for improving our online classroom.
2 COURSE DESIGN
This section describes the course design we developed by searching the literature for best practices to develop effective digital educational content and run a fully online active learning classroom. Section 2.1 describes our initially planned traditional classroom course and emphasizes the background work that guided our choice of teaching model. In Section 2.2, we describe the set of digital elements we used to support the planned teaching strategies in a fully online setting, emphasizing the literature that guided the development of the elements of our digital classroom.
2.1 Course context
All three courses are part of different computer science and software engineering study programs at our university. The first two courses are instances of the course Software Specification and Modeling (SSaM), conducted in two different study programs. Though these are two instances of the same course, the course content differs slightly to account for the differences in study program backgrounds. They are attended by roughly 220 third‐year undergraduate students and focus on the requirements engineering and design specification aspects of software development. The third course is called Introduction to Software Engineering (ISE). The ISE course introduces about 70 first‐year students to the field of software engineering by covering the generalized software development lifecycle and its constituent practices and tools.
The core teaching model for all three courses is project‐based learning. The model’s general idea is for students to work in teams of three to four members on a semester‐long project, incorporating new knowledge into their work as it is introduced. We chose this model as it produces good learning outcomes and helps students develop cross‐cutting skills like planning, critical thinking, decision making, and collaboration [3, 35].
We supplement project‐based learning with several active learning teaching strategies that we utilize in our lectures and lab exercises. We utilize the most widespread form of active learning: asking students questions and discussing their answers (i.e., the Socratic method). We supplement this activity with Wait Time to increase overall student involvement. We encapsulate challenging questions into tasks, where students form virtual teams of two to four members to complete them. Through these tasks, we facilitate small‐group learning, promoting higher knowledge retention, better learning outcomes, and favorable attitudes towards the material.
2.2 Online course design
From the start of the semester, we utilized an LMS as a repository for educational presentations and text, and as a medium for delivering announcements regarding the course. When the state of emergency was declared due to COVID‐19, our LMS became the foundation of our online classrooms. Here, we describe the different elements of our online classrooms:
Text and Image (Section 2.2.1), which we use as a basic tool for capturing and disseminating information regarding a topic;
Video (Section 2.2.2), which closely maps to the traditional lecture where the speaker presents information on a topic;
Interactive Video (Section 2.2.3), where the basic video is augmented by prompts and questions thereby supporting interactivity and active learning;
Task (Section 2.2.4), which presents challenges that students complete to enhance their learning;
Online Discussion (Section 2.2.5), where the students communicate with the teacher to discuss a topic, a task, or their semester project.
Finally, in Section 2.2.6, we describe how we integrated all the elements into a cohesive structure called the Online Lecture, which supports an active online classroom.
2.2.1 Text and image
The simplest type of element includes text and images. We developed these following the guidelines of Jereb and Šmitek, who elaborate on good design practices for multimedia educational content used in university courses. We follow the best practices for developing clear and understandable textual paragraphs and utilize the advantages of the Web environment (i.e., by enhancing plain text through the HyperText Markup Language [HTML]). Where appropriate, we supplement the text with images or graphs to better explain or highlight an important point. We utilize textual paragraphs and images to:
Provide a summary of an earlier topic, usually from a previous lecture, on which we are building upon in the current topic;
Introduce a new topic, by explaining the purpose and nature of the content that follows;
Offer new information for the current topic, often placed between videos;
Conclude the current topic by summarizing the content and tying it into the broader context.
Of all the types of elements, textual paragraphs take the least effort to develop. Due to this flexibility, we could quickly insert new textual elements or rework existing ones when we discovered gaps in the students’ understanding. Furthermore, as we developed videos, we found different or even better ways to express a point. We could communicate this understanding with far less investment by utilizing text or images than it would take to design, record, edit, and publish a reworked video.
2.2.2 Video
The next type of element is the video, which combines audio and visual elements to deliver educational content. We utilized videos in the same way we used text and images—to introduce, expand, or conclude a topic. Videos facilitate the learning of both verbal and visual learners, as defined by the Felder‐Silverman model of learning styles.
We followed best practices from the literature to construct our video lessons, including video segmenting and mixed perspectives. Adhering to segmenting, we avoided long, uninterrupted videos and favored those that last 5–15 min and cover a single, meaningful concept. Through segmenting, students benefit from the time between segments, where they can process information.
Regarding mixed perspectives, the goal is to combine different styles of video, including first‐person (i.e., a view of the speaker’s screen) and third‐person (i.e., a view of the speaker). Throughout the state of emergency, we experimented with several video styles, as illustrated in Figure 1. Following the nomenclature defined in a recent survey on instructional video styles, these are (numbers correspond to numbers in Figure 1):
Talking head, where the speaker talks to the viewer, covers a large frame area and is not accompanied by slides;
Head and slides, where a smaller talking head is accompanied by slides that follow the talk;
Slides with heavy use of animation, where the speaker is not present in the video and the voiceover is accompanied by animations that direct the viewer’s attention to an area of the slide;
Slides with use of a marker, where the speaker is not present in the video and the voiceover is accompanied by a marker pen that directs the viewer’s attention to a section of the slide;
Screencast, where the frame focuses on the speaker’s screen as they work through an example (in our case, a software design or coding challenge) and talk through their thinking process;
Virtual whiteboard, where the speaker uses a whiteboard application to draw and explain certain concepts.
Understandably, developing videos requires far greater effort than creating text. The speaker needs to prepare the visual content (e.g., slides, code) and the speaking notes, perform a rehearsal, and record the complete package. Furthermore, recording is error‐prone. We ran into issues such as the recording software capturing the wrong application, microphone misconfiguration, and crashes during video editing.
We found several benefits of segmenting related to video development. In the event of a significant error during recording (e.g., the microphone did not work), the cost of rerecording a video was reduced compared to longer videos. Furthermore, the rehearsal was less straining, as the speaker only needed to prepare 5–15 min of content. Finally, we found that editing shorter videos was significantly faster and less prone to errors.
2.2.3 Interactive video
The most sophisticated element we used in our course is the interactive video: a video augmented with reading prompts or questions (e.g., multiple‐choice or open‐ended). Figure 2 presents an interactive video. The left segment shows the paused video; the right displays an answered multiple‐choice question, with feedback provided for each choice to explain why it is correct or incorrect.
Adding interactivity to a video has been shown to increase student engagement and retention [18, 37]. If a video is a digital version of a passive lecture, the interactive video enables active learning through questions and feedback on the answers. Adaptive feedback, which states the correctness of the solution and provides additional information to help the student understand the result, is invaluable for enhancing learning [11, 24]. This feature enables the simplest active learning technique: the question (aimed at both retrieval and generation). Wait Time is inherently supported, as the viewer can take all the time they need to formulate an answer.
Out of all the elements of our digital classroom, interactive videos require the most effort to develop. In addition to the effort needed to record a video, it takes time to annotate it with questions and formulate adaptive feedback for the answers.
We found two significant benefits of the interactive video. First, by monitoring the students’ answers, we can identify confusing content that requires more explanation. Second, it is simple to add additional questions and textual prompts to clarify complicated content and expand the educational value of the video.
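The annotation structure described above—a question that pauses the video and returns per‐choice adaptive feedback—can be sketched as follows. This is a minimal illustrative model, not the authors’ implementation; all names and the example content are hypothetical:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Choice:
    text: str
    correct: bool
    feedback: str  # adaptive feedback shown after the student answers

@dataclass
class VideoQuestion:
    timestamp_s: int  # point at which the video pauses
    prompt: str
    choices: List[Choice] = field(default_factory=list)

    def answer(self, index: int) -> str:
        """Return the adaptive feedback for the chosen option."""
        choice = self.choices[index]
        verdict = "Correct." if choice.correct else "Incorrect."
        return f"{verdict} {choice.feedback}"

# Hypothetical annotation for a requirements-engineering video.
q = VideoQuestion(
    timestamp_s=310,
    prompt="Which artifact captures a functional requirement?",
    choices=[
        Choice("A use case", True, "Use cases describe required system behavior."),
        Choice("A class diagram", False, "Class diagrams capture structure, not required behavior."),
    ],
)
print(q.answer(1))
```

Monitoring which choices students select (Benefit 1 above) then amounts to aggregating the indices passed to `answer` across viewers.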
2.2.4 Task
The next type of element is the task, which describes a problem for which the student can present and discuss solutions in a discussion forum related to the topic. We defined several categories of tasks:
Discussion point, where the student writes an opinion on a nontrivial open‐ended question using knowledge from previous lectures or different courses;
Coding or design challenge, a hands‐on exercise where the student needs to write or refactor some code or draw a unified modeling language (UML) diagram;
Investigation task, where the student searches the Web for an answer to a question (e.g., by finding a few articles and synthesizing an answer);
Reading task, where the student reads a selected chapter from the course’s related literature.
These digitalized tasks represent a transformation of the active learning technique related to small‐group learning. A benefit this transformation brings is a broader range of possible tasks in terms of diversity and complexity. Furthermore, online discussion forums facilitate collaboration and a greater depth of discussion, as participants have time to elaborate their answers and reflect on what is being posted.
Like text, tasks require relatively little effort to create. However, significant time can be required to moderate and keep up with an active discussion forum. Without monitoring and directing the discussion, students might agree on a suboptimal solution. Furthermore, feedback on a solution is the most valuable part of the task, and without teacher motivation, some solutions might fail to receive feedback. In our setup, there were no clear external incentives, such as points towards a grade, to complete a task. The only tangible reward was the feedback a student would get for their solution.
2.2.5 Online discussion
Online discussion is an event where one or two teachers join multiple students or student groups in a discussion using a voice communication application. By organizing online discussions, we work towards building a blended e‐learning course model, swapping direct face‐to‐face communication with technology‐supported communication.
Typically, we would organize an online discussion several days after publishing a new online lecture. There, we discussed interesting tasks and analyzed the progress on the semester projects the teams were developing. A team would present its progress on the semester project, and the teachers would evaluate its quality through inspection and targeted questions. As most sessions included many teams, the other teams would benefit from this evaluation by examining their own work for the identified deficiencies. We also analyzed how the students understood previous topics, offered corrective advice, and noted improvements for our educational content.
Finally, online discussions helped us perform control checkpoints, where the teachers would go over the teams’ progress on their semester projects offline, and then discuss their findings with the teams online.
2.2.6 Online lecture
We summarize the presented elements of our digital classroom in Figure 3 and cluster them around their containing element—the Online Lecture. An Online Lecture shares its title with the set of topics it covers. Each topic is a collection of multimedia elements, such as text, images, videos, and tasks. The physical form of an Online Lecture is a Web page developed on our LMS, which ties many instances of the elements mentioned above into a cohesive whole.
We combined different types of multimedia to facilitate different learning styles and increase student engagement, following the recommendations from the Felder–Silverman model of learning styles and the seven‐principle model of Ou et al. for developing lessons for online learning.
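The containment structure described in this section—an Online Lecture grouping topics, each of which groups multimedia elements—can be sketched as a small data model. This is an illustrative sketch only; the class and element names are our own, not part of any LMS API:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Element:
    kind: str   # e.g., "text", "image", "video", "interactive_video", "task"
    title: str

@dataclass
class Topic:
    name: str
    elements: List[Element] = field(default_factory=list)

@dataclass
class OnlineLecture:
    title: str
    topics: List[Topic] = field(default_factory=list)

    def element_kinds(self) -> List[str]:
        # Which media types does this lecture mix? Useful for checking
        # that a lecture combines several styles, per the models cited above.
        return sorted({e.kind for t in self.topics for e in t.elements})

# Hypothetical lecture assembled from the element types of Sections 2.2.1-2.2.4.
lecture = OnlineLecture("Requirements Engineering", [
    Topic("Elicitation", [
        Element("text", "Recap of the previous lecture"),
        Element("interactive_video", "Interview techniques"),
        Element("task", "Discussion point: stakeholder conflicts"),
    ]),
])
print(lecture.element_kinds())
```

On the LMS, each `OnlineLecture` corresponds to one Web page, with the topics rendered in order as sections of that page.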
3 STUDY DESIGN
We developed a strategy for monitoring and observing student behavior during the COVID‐19 crisis to ensure we were providing our students with the best learning experience. To this aim, we formulated the following research questions:
RQ1: What are the factors that influence student engagement with a mandatory online course during a state of emergency?
RQ2: What type of educational content do contemporary students most appreciate in an online classroom?
By answering the first research question, we analyze how to positively influence student engagement and motivation in a fully online classroom. By answering the second question, we guide our efforts for developing effective educational content.
We describe the high‐level overview of our methodology for answering the research questions in Section 3.1. Sections 3.2 and 3.3 correspond to RQ1 and RQ2, respectively. In each of these sections, we provide the specifics of the methodology used to formulate and answer that research question.
3.1 Methodology overview
Figure 4 presents the timeline of events and activities for answering our research questions:
1. In preparation for COVID‐19, we researched best practices for organizing and running fully online classrooms, as described in the previous section. When the pandemic hit, we applied our findings to our courses. We spent the next month researching further, trying out ideas, and adapting the recommendations to better suit our specific context.
2. Once the state of emergency was declared and we went fully online, we continuously monitored and observed the students’ behavior and interaction patterns, both with the teachers and with the educational content.
3. After the first week of online lectures, we began holding informal interviews with students at the end of online discussions, to understand the challenges they were dealing with and to obtain feedback on our work. These interviews gradually evolved into focus group discussions.
4. The activities conducted during Steps 2 and 3 allowed us to formulate our research questions. To both answer and refine them, we formulated and administered a mid‐semester questionnaire (i.e., the first survey). The goal of the survey was to help us understand the issues students were facing, evaluate the quality of our work, and, most importantly, help us adjust our approach to online teaching. The survey was voluntary and anonymous. Out of 290 students attending the three courses, 80 chose to participate—53 students from the two instances of SSaM and 27 from ISE.
5. Finally, near the conclusion of the course, we conducted a second survey. This survey was more extensive and strictly focused on answering our refined research questions. It was also voluntary and anonymous; out of roughly 290 students attending the three courses, 94 chose to participate—60 students from the two instances of SSaM and 34 from ISE.
In Section 3.1.1, we describe how we observed student progress and conducted focus group discussions. In Section 3.1.2, we explain how we constructed the survey questions, and in Section 3.1.3, we describe how we analyzed the obtained responses.
3.1.1 Observations and focus group discussions
Starting from the beginning of the online lectures, we monitored and observed student behavior and interaction patterns. For example, we utilized tools offered by our LMS and video hosting platforms to analyze how many students engaged with different educational elements and for how long. This helped us get a sense of how many students were engaging with online lectures and their semester project as the coronavirus crisis was developing.
We held focus group discussions at the end of online discussions to understand the issues the students were dealing with, learn how well our materials were received, and gain additional insight regarding our observations. Focus group discussions provide opportunities to collect data through group interaction on a topic of interest. Such a setting enables students to build on the responses of other participants, thereby enhancing the quality of the information gained. Although the feedback was useful, only a handful of students actively participated in these discussions.
3.1.2 Survey construction
Guided by our observations and focus group discussions, we developed the first survey to gather feedback from a broader set of students. We formulated the questions of the first survey to be open‐ended. Students were free to write the answers in their own words. As these responses are unstructured, they are harder to analyze. However, we intended to avoid restricting or influencing students’ answers.
Guided by our observations, the focus group discussions, and the findings of the first survey, we constructed more structured Likert item questions for the second survey. To construct the questions, we also consulted similar papers that aim to understand student reactions through surveys, and tailored their questions to our context [2, 7, 12, 16, 21, 22]. We also added open‐ended questions to allow the students to elaborate on their Likert item answers. We developed both surveys following the recommendations of Punter et al. for conducting effective online surveys.
3.1.3 Survey analysis
To mitigate our subjectivity when examining the answers to unstructured open‐ended questions, we performed the analysis in two steps:
Factor categorization: The first two authors of this paper independently read the students’ answers and noted all the unique factors they recognized. Then, they discussed the identified factors to develop a unified categorization by grouping similar factors into the same category. We based the grouping on semantic similarity (e.g., “work habits” and “continuous studying” fell into the same category, as they referred to the same studying pattern). We merged categories with few answers into semantically similar broader categories.
Answer mapping: Once the categorization was established, each author independently reread each answer and noted its mapping to the defined categories. Finally, they discussed and resolved any disagreements.
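The answer‐mapping step above can be made concrete with a small sketch that compares two annotators’ independent mappings and surfaces the disagreements to discuss. The data and answer identifiers are hypothetical, and this simple percent‐agreement check is our illustration, not the paper’s reported procedure:

```python
# Hypothetical independent mappings of survey answers to categories
# by two annotators (answer id -> assigned category).
annotator_a = {"a1": "work habits", "a2": "staff support", "a3": "workload", "a4": "work habits"}
annotator_b = {"a1": "work habits", "a2": "staff support", "a3": "peer support", "a4": "work habits"}

# Answers mapped identically, and those needing a resolution discussion.
agreements = [k for k in annotator_a if annotator_a[k] == annotator_b[k]]
disagreements = [k for k in annotator_a if annotator_a[k] != annotator_b[k]]
percent_agreement = len(agreements) / len(annotator_a)

print(f"agreement: {percent_agreement:.0%}")
print("to discuss:", disagreements)
```

In practice, a chance‐corrected statistic such as Cohen’s kappa could replace raw percent agreement; we use the simpler measure here only to illustrate the workflow.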
As descriptive statistics for Likert item responses, we report the median and the frequency. To measure associations among Likert item answers, we used the nonparametric Spearman correlation, which is considered the most appropriate way to analyze Likert item questions. In the further text, we denote the value of the Spearman correlation as rs. We use a significance level α of 0.05; that is, if the p value is at most .05 (p ≤ .05), we consider the correlation significant.
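The analysis above can be sketched in a few lines. The sample responses below are invented for illustration, and the pure‐Python Spearman implementation (Pearson correlation computed on fractional ranks) mirrors what a library routine such as scipy.stats.spearmanr computes, though without the accompanying p value:

```python
from statistics import median

def rank(values):
    # Fractional ranking: tied values share the average of their ranks.
    sorted_vals = sorted(values)
    ranks = {}
    i = 0
    while i < len(sorted_vals):
        j = i
        while j + 1 < len(sorted_vals) and sorted_vals[j + 1] == sorted_vals[i]:
            j += 1
        ranks[sorted_vals[i]] = (i + j) / 2 + 1  # ranks are 1-based
        i = j + 1
    return [ranks[v] for v in values]

def spearman(x, y):
    # Spearman's r_s is Pearson's r computed on the ranks.
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical 5-point Likert responses for two survey items.
devotion   = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]
motivation = [4, 4, 5, 3, 5, 2, 3, 5, 2, 4]

print("median devotion:", median(devotion))
print("frequencies:", {k: devotion.count(k) for k in sorted(set(devotion))})
print("r_s = %.2f" % spearman(devotion, motivation))
```

The significance of rs is then judged against the p value (obtained from a statistics package) at α = 0.05, as described above.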
3.2 RQ1 What are the factors that influence student engagement with a mandatory online course during a state of emergency?
We wanted to understand the challenges our students faced and direct our activities to support them and help them stay engaged and motivated in their studies. We analyzed the students’ perspectives on our actions and efforts, as well as those of our colleagues on other courses, to identify the factors that positively and negatively influence student engagement.
Regarding RQ1, we formulated the following open‐ended questions for the first survey:
() “What is the one thing that the course staff currently does that you appreciate and would like to see more frequently?”
() “What is the one thing that the course staff currently does that you DO NOT appreciate and would like to see less frequently (or not at all)?”
() “In your opinion, which factors influence students (both positively or negatively) to engage with the course materials and solve tasks continuously? This question refers to all official courses you participate in during the pandemic.”
Later, we conceptualized the second survey questions listed in Table 1, aimed at answering RQ1. Here, Likert item questions use a 5‐point response scale, ranging from 1 = strongly disagree to 5 = strongly agree. Open‐ended questions can be answered in free‐form text.
|Category||Question||Type|
|Devotion and motivation regarding the course||() According to my circumstances, I have devoted myself significantly to this course.||Likert item|
|||() What is the reason for your (lack of) dedication? Please emphasize both internal causes (e.g., I lack interest in the subject) and external causes which the course staff may influence (e.g., not enough examples or exercises).||Open‐ended|
|||() Considering all semester courses, which factors influenced you to dedicate yourself to some courses more than others?||Open‐ended|
|||() I was motivated to attend online semester courses and do the tasks regularly.||Likert item|
|||() Which factors positively influenced your motivation during this period?||Open‐ended|
|||() Which factors negatively influenced your motivation during this period?||Open‐ended|
|Course staff dedication and organization||() What is your level of satisfaction with the commitment of the course staff?||Likert item|
|||() What is your level of satisfaction with course organization, goals, and tasks?||Likert item|
|||() In addition to the above, is there any other aspect of this course you find (un)satisfying?||Open‐ended|
|Working environment and conditions||() I had adequate working conditions.||Likert item|
|||() What circumstances caused you to have (in)adequate working conditions (e.g., internet connection, shared space, work equipment)?||Open‐ended|
|Self‐organizing skills||() I have organized my time well.||Likert item|
|||() What did you find challenging when organizing your learning schedule? If starting over, how would you improve your schedule organization?||Open‐ended|
|External incentives||() Control checkpoints have incentivized me to work regularly.||Likert item|
|||() A larger number of control checkpoints (that demand smaller scopes of work) would have a positive effect on me to work regularly.||Likert item|
|Perceived workload||() My subjective feeling is that the workload we had was more substantial than it would have been if we had regular classes.||Likert item|
|||() Why do you feel that the workload was more/less substantial than it would have been if we had regular classes? Where do you gain/lose time?||Open‐ended|
|Peer support||() We have successfully organized the labor division for the project task in our team.||Likert item|
|||() I regularly communicated with my peers (both teammates and other colleagues) to share experiences with new lectures and tasks.||Likert item|
|||() What were the challenges of working in a team? If starting over, how would you overcome them?||Open‐ended|
|||() Can you suggest a way to improve teamwork in an online classroom? How can the course staff help promote student teamwork?||Open‐ended|
We consider the two Likert items under “Devotion and motivation regarding the course” as variables indicating genuine student engagement with the subject. These two variables have a moderate‐to‐strong positive correlation, which is statistically significant. To understand whether any significant factors influence student motivation and engagement, we analyze the correlation of these two variables with the other Likert‐item‐measured factors.
3.3 RQ2 What type of educational content do contemporary students most appreciate in an online classroom?
We list the part of the second survey aimed at answering RQ2 in Table 2.
|Course material quality and clarity||(S2Q1) What is your level of satisfaction with the quality of the educational materials presented in this course?||5‐point Likert item (1 = very unsatisfied, 5 = very satisfied)|
|(S2Q2) What is your level of satisfaction with the clarity of this course’s lectures?|
|Types of digital educational materials||(S2Q3–S2Q9) Please rate the type of digital educational material according to how well it replaces regular lectures.||Slides (no video)||5‐point Likert item (1 = it is a terrible alternative, 5 = it is an excellent alternative)|
|Text interleaved with (interactive) Videos|
|Lecture video styles||(S2Q10–S2Q15) Please rate how much you liked each video style (Section 2.2.2).||Talking head||5‐point Likert item (1 = I don’t like it at all, 5 = I like it very much)|
|Head and slides|
|Slides with heavy use of animation|
|Slides with use of a marker|
|Task types||(S2Q16–S2Q21) Please rate how much you liked different task types.||Discussion points||5‐point Likert item (1 = strongly disagree, 5 = strongly agree)|
|Coding and design challenges|
|Short questions in interactive videos|
|Blended classroom||(S2Q22) Are there any aspects of the regular classroom that you consider impossible to replace by a digital substitute?||Open‐ended (free‐form text)|
|(S2Q23) In which aspects (if any) is the digital classroom better than a regular classroom?|
4 RESULTS
We analyzed the answers to our first and second surveys and processed the notes related to the observations and focus group discussions we conducted throughout the semester. In this section, we present the results of our study directed at answering RQ1 and RQ2.
4.1 RQ1 results
We identified seven factors that influenced student engagement with our online course, which we present along with interesting correlations between the factors (Table 3).
- Note: Statistically significant correlations (significance level α = 0.05) are bolded.
- Abbreviations: ext. inc., external incentives; workload, perceived workload; peers, peer support; self‐org., self‐organizing skills; staff’s devotion, course staff dedication and organization; students’ devotion, devotion and motivation regarding the course; work env., working environment and conditions.
- *Correlations greater than 0.29 indicate a medium or high relationship between the items.
4.1.1 Course staff dedication and organization
Students praised three aspects: (1) accessibility, as the course staff created an atmosphere where the students were not afraid to ask questions; (2) timeliness, as the course staff regularly uploaded the materials, promptly answered student questions, and provided timely feedback; students also appreciated how swiftly the staff transferred the course online; (3) commitment, as the students felt the course staff both cared about their personal needs in the current situation and were enthusiastic about teaching the subject.
In both surveys, respondents stated that the staff attitude had positively influenced their engagement. This fact was emphasized by 49% of the first survey respondents (answers to and ) and 52% of the second survey respondents (answers to , , , and ). Second survey respondents explained that the more they communicated with the course staff (e.g., through online discussions, announcements, and course materials), the easier it was for them to remain motivated. Likewise, many second survey respondents stated that they had neglected the courses that did not have devoted teaching staff, and which lacked regular updates. These results are further supported by a moderate‐to‐strong positive correlation between how students perceive course staff commitment and their motivation regarding the course, which is statistically significant ().
Additionally, the first survey revealed that course staff attitude is more important to first‐year students, who emphasized this factor in 74% of their answers. We attribute this to first‐year students being generally unfamiliar with higher education and, consequently, in greater need of guidance and reassurance.
Regarding the course organization, 21% of the first survey respondents emphasized that the materials were arranged in a manner easy to understand and follow, and students appreciated the notifications the course staff posted (answers to ). These respondents stated that a straightforward organization prompted them to work regularly. On the other hand, 11% of respondents highlighted that course navigation was complicated (answers to ). These students had a hard time navigating the course, as it had multiple links to pages containing lecture material, online discussions, and tasks. This, coupled with the many notifications they received, contributed to their sense of being overwhelmed and discouraged them from engaging with the content.
Our results indicate that the course organization is vital for student motivation. There is a moderate‐to‐strong positive correlation between a student’s level of satisfaction with the course organization and that student’s motivation, which is statistically significant ().
We also note that there is a moderate‐to‐strong statistically significant correlation between students being satisfied with course organization and their perception of course staff commitment ().
4.1.2 Working environment and conditions
The need to adjust to a new working environment and conditions negatively influenced students’ motivation. Many had to relocate and adapt to a new routine, while some had unsatisfactory working conditions. Furthermore, some respondents were from areas more heavily struck by COVID‐19. According to their responses (answers to ), this uneasiness negatively impacted their concentration and focus. Further evidence that the working environment and conditions influence a student’s motivation to engage with the course is reflected in a small positive correlation between and , which is statistically significant ().
Forty‐three percent of first survey respondents (answers to ) stated that the change of working environment and conditions contributed to their lack of engagement. Interestingly, only 21% of respondents to the second survey claimed that working from home negatively impacted their devotion and motivation regarding the course (answers to , , , and ). Furthermore, 76% of respondents to the second survey agreed that their working conditions were adequate (answers to ). Only 16% disagreed with this claim, listing a lack of space as the main problem with their working environment (answers to ). We believe the reason behind the significant difference between the first and second surveys is the students’ adaptation to the situation. The first survey was conducted 2 weeks after the state of emergency was declared, when students were still settling into their new routine. The second survey was conducted near the end of the semester, almost 3 months after the routine shift.
4.1.3 Self‐organizing skills
Without the usual lecture timetable, students receive fewer incentives to work regularly. From our observations and the first survey responses (answers to ), we identified three patterns of studying:
continuous studying, where students prefer to work continuously, as suggested by the course schedule. They work on all courses during the week;
studying in chunks, where every few weeks the students dedicate the whole workweek to a single course to avoid context‐switching. This group of students lacked the motivation to engage in weekly tasks, especially online discussions, as old tasks were already solved and thoroughly discussed;
last‐minute studying, where the students would wait until the exams to engage with the course content.
The ability to self‐organize is crucial for continued engagement with the course and semester workload. This factor was emphasized by 16% of respondents to the first survey (answers to ). Likewise, 34% of second survey respondents noted that their ability to self‐organize significantly impacted their continued devotion and motivation regarding the course.
Forty‐six percent of respondents to the second survey agreed that they successfully organized their studying (answers to ). In contrast, 35% noted that they were not able to organize their time effectively, and most of them did not know how to improve their self‐organization (answers to ). Interestingly, this percentage was evenly spread across student groups, with first‐year students (attending ISE) reporting only slightly more difficulty in self‐organizing than their senior colleagues. This lack of insight into how to improve their organization might be partially attributed to inadequate working conditions, as there is a small, statistically significant relationship between self‐organization success and having adequate working conditions ().
We found that the student’s ability to organize studying has a moderate‐to‐strong, statistically significant correlation with both student’s motivation () and student’s devotion to the course ().
4.1.4 External incentives
A significant factor for student engagement with the course materials was the regular control checkpoints, where the teaching staff examined the students’ progress on their semester projects. We organized these checkpoints three times during the semester. Importantly, the course staff’s assessments of each team’s progress influenced the students’ semester project grades.
During focus group discussions, we discovered that students were far more likely to engage with a course if an exam or checkpoint was coming soon, which could affect their grade. We also observed that educational elements focused on a topic related to a checkpoint had the most views during the week before a checkpoint.
As part of the second survey, 81% of respondents stated that these checkpoints incentivized them to work regularly, while only 12% disagreed with this statement (answers to ). Furthermore, 71% of respondents agreed that additional checkpoints would further incentivize them to work continuously (answers to ). Finally, 28% of respondents stated that the checkpoints’ presence heavily influenced their devotion and motivation regarding the course (answers to , , , and ). These findings agree with the findings of Andrew Sobel , who replaced two big tests in his class (a midterm and a final) with nine smaller quizzes. As a result, Sobel’s students studied more frequently, and their overall grade, knowledge, and satisfaction with the course went up.
There is a moderate‐to‐strong correlation between students’ motivation and students’ perception of how vital the checkpoints are for their regular engagement with the course materials, which is statistically significant (, ). This correlation further supports the evidence that regular checkpoints can positively influence both student’s engagement and motivation regarding the course.
There is a small but statistically significant correlation between student’s perception of the importance of checkpoints and student’s perception of course staff commitment (, ), as well as the student’s comprehension of how good the course organization was (, ).
Students who did not organize their time well did not think that additional checkpoints would have incentivized them to work continuously, as shown by a small but statistically significant correlation between and (, ).
Finally, we should note that the students who believed the checkpoints incentivized them to work regularly also thought more frequent checkpoints would be beneficial, as shown by a moderate‐to‐strong, statistically significant correlation between and (, ).
4.1.5 Perceived workload
Many students felt that the online materials and tasks overburdened them. They complained about: (1) this course’s workload, that is, that the duties in this course were plentiful and thus hard to complete by the suggested deadlines; (2) the semester workload, that is, that the overall workload from all the courses in the semester was overwhelming. Some courses exacerbated this problem by taking several weeks to transfer online and consequently posting several weeks’ worth of material at once; (3) the previous semester’s workload, that is, that some students had trouble following the current semester’s courses due to unfinished tasks from the prior semester.
In the first survey, 36% of respondents stated that some overwhelming workload aspect was inhibiting their motivation to engage with the course (answers to ). Respondents restated this problem in the second survey, where 28% identified an overwhelming workload as a chief reason behind their lack of devotion and motivation regarding the course (, , , ).
When asked if they felt they had more responsibilities during the state of emergency, 55% of second survey respondents agreed (answers to ). In contrast, 31% stated that they had less workload during the state of emergency, citing saved commuting time, studying during their peak concentration hours, and increased video playback speed. The first‐year students (attending ISE) perceived themselves to have a heavier workload, as 64% agreed with this statement, while 21% disagreed. Interestingly, we did not find a significant correlation between students’ perception that the workload was more substantial during the state of emergency () and their devotion () or motivation () regarding the subject.
4.1.6 Peer support
Fourteen percent of second survey participants noted that their team and colleagues influenced their devotion and motivation for the course (answers to , , , and ). Students with a well‐organized team found it easier to engage with the course, while students with disorganized teams described the negative impact this had on their work. Interestingly, we did not find a significant correlation between students’ perception that the course project tasks were fairly distributed among their team members () and students’ devotion () or motivation () regarding the course.
Students with a well‐organized team do not think that additional checkpoints would have incentivized them to work continuously, as shown by a small but statistically significant negative correlation between and (, ).
Although small course project teams may not influence students’ engagement with the subject, peer support from classmates seems to be an essential factor. There was a moderate‐to‐strong, statistically significant positive relationship between students reporting regular communication with classmates and their dedication () and motivation () regarding the subject. Unsurprisingly, students with well‐organized teams reported substantial peer support, as can be seen from a strong, statistically significant positive correlation between the answers to and ().
Finally, students who reported having substantial peer support also said that they successfully organized their study time. We can see this from a small, statistically significant positive correlation between and ().
4.1.7 General interests and ambition
Many students emphasized that the devotion and motivation regarding a course were closely related to the individual’s interests and ambition. Although some students focused solely on obtaining a passing grade, others strived to acquire knowledge regarding topics they were interested in or perceived as valuable.
Forty‐four percent of respondents to the first survey held that engagement stems from personal ambition and interests (answers to ). Interestingly, 44% of the respondents to the second survey shared this view, stating that devotion and motivation regarding a course are tied to the individual’s preferences.
4.2 RQ2 results
We organize the presentation of the results according to categories in Table 2 (column 1).
4.2.1 Course material quality and clarity
Questions and asked the students to assess different aspects of the educational materials and state their preferences, allowing us to evaluate whether the materials we created were good enough. Figure 5 shows the frequencies of student responses to these questions. We may conclude that the students were generally satisfied with the quality and clarity of the course materials, as the answers to both questions have a median value of 4.
As a side note related to RQ1, we observed a moderate‐to‐strong significant positive relationship between students’ motivation () and their perception of material quality (), as well as their perception of material clarity ().
4.2.2 Types of digital educational materials
We show the descriptive statistics for the answers to S2Q3–S2Q9 in Table 4. According to the median of the responses, the best online material types are on‐demand video presentations and online discussions. According to the focus group discussions, students appreciated the short, segmented videos we offered. This aligns with the findings of Fiorella and Mayer .
|Material type||Median||Mode||Frequency of response|
|Video||5||5||1 (1%)||2 (2%)||11 (12%)||26 (28%)||54 (57%)|
|Online discussions||5||5||3 (3%)||4 (4%)||6 (6%)||22 (23%)||59 (63%)|
|Text interleaved with (interactive) Videos||4||5||2 (2%)||5 (5%)||11 (12%)||32 (34%)||44 (47%)|
|Interactive video||4||5||11 (12%)||15 (16%)||19 (20%)||21 (22%)||28 (30%)|
|Live streaming||4||4||7 (7%)||13 (14%)||24 (26%)||29 (31%)||21 (22%)|
|Text/Book chapter||2||2||23 (24%)||30 (32%)||20 (21%)||10 (11%)||11 (12%)|
|Slides (no video)||1.5||1||47 (50%)||23 (24%)||17 (18%)||2 (2%)||5 (5%)|
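The per‐row statistics in Table 4 (and the analogous Tables 5 and 6) can be derived from raw Likert responses as sketched below. The raw answer list is reconstructed from the per‐rating counts of the Video row purely for illustration; the actual survey data is not available here.

```python
# Sketch: descriptive statistics for one Likert item, as in Table 4.
# The response list is hypothetical, rebuilt from the Video row counts.
from collections import Counter
from statistics import median, mode

# 94 responses matching the Video row: 1x"1", 2x"2", 11x"3", 26x"4", 54x"5"
responses = [1] * 1 + [2] * 2 + [3] * 11 + [4] * 26 + [5] * 54

counts = Counter(responses)
n = len(responses)
# Frequency column: "count (percent)" per rating, 1 through 5
freqs = {r: f"{counts.get(r, 0)} ({counts.get(r, 0) / n:.0%})" for r in range(1, 6)}

print(median(responses), mode(responses), freqs)
# Note: statistics.median averages the two middle values for even n,
# which is how a row like "Slides (no video)" gets a median of 1.5.
```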
The second best types of materials were live streaming, interactive videos, and text interleaved with (interactive) videos. In the focus group discussions, students identified several problems with live streaming: it allows less freedom in self‐organization, they may have trouble concentrating throughout a lengthy session, and teachers were unaccustomed to lecturing without seeing their students. Several respondents did not like interactive videos because they could not be watched at an increased speed. This restriction was due to the platform we used and can be avoided by installing a browser plugin.
The respondents generally did not like to study from textual descriptions or book chapters and preferred them only to lecture slides unaccompanied by video presentations.
4.2.3 Lecture video styles
We show the descriptive statistics for the answers to S2Q10–S2Q15 in Table 5. According to the median of the responses, students generally liked all the video styles we offered, except for the Talking head style, for which most of the respondents were neutral. The most preferred video style is Screencast.
|Video style||Median||Mode||Frequency of response|
|Screencast||5||5||3 (3%)||3 (3%)||19 (20%)||15 (16%)||54 (57%)|
|Slides with heavy use of animation||4||5||2 (2%)||4 (4%)||12 (13%)||37 (40%)||39 (42%)|
|Virtual whiteboard||4||5||1 (1%)||7 (7%)||18 (19%)||27 (29%)||41 (44%)|
|Slides with the use of a marker||4||5||3 (3%)||5 (5%)||17 (18%)||33 (35%)||36 (38%)|
|Head and slides||4||5||4 (4%)||10 (11%)||21 (22%)||25 (27%)||34 (36%)|
|Talking head||3||3||10 (11%)||25 (27%)||29 (31%)||17 (18%)||13 (14%)|
4.2.4 Task types
We show the descriptive statistics for the answers to S2Q16–S2Q21 in Table 6. Students generally liked the coding and design challenges and the course project, while they were generally neutral towards other types of tasks.
|Task type||Median||Mode||Frequency of response|
|Course project||4||5||5 (5%)||10 (11%)||17 (18%)||25 (27%)||37 (39%)|
|Coding and design challenges||4||4||8 (8%)||12 (13%)||24 (26%)||29 (31%)||21 (22%)|
|Short questions in interactive videos||3||4||12 (13%)||22 (23%)||16 (17%)||23 (24%)||21 (22%)|
|Investigation tasks||3||3||10 (11%)||21 (22%)||24 (26%)||22 (23%)||17 (18%)|
|Discussion points||3||3||17 (18%)||19 (20%)||29 (31%)||19 (20%)||10 (11%)|
|Reading tasks||3||2 and 3||20 (21%)||24 (26%)||24 (26%)||9 (10%)||17 (18%)|
When it comes to tasks completed through the online discussion forum, we observed a general lack of collaboration between the students. Even though we encouraged students to contribute to the answers of other students, we recorded only a handful of such posts, which made up about 5% of the total. Most students instead chose to submit their answers without examining the solutions of other students. Furthermore, only about a quarter of the students tried to solve one or more tasks (72 out of 290), while only 15 students (i.e., 5%) addressed five or more of the 10 task sets.
Interestingly, third‐year students (attending SSaM) preferred to solve their tasks in teams (teams submitted 70% of solutions on these courses). In contrast, first‐year students exclusively submitted individual work. We note that the solutions created by the student teams were more mature than those submitted by individuals. However, no solution offered by the students was complete, and the teacher always had some corrections, which were often major.
We also observed that students mostly offered solutions for coding and design challenges, even though most task sets contained one or two coding and design challenges, discussion points, a reading task, and an investigation task. Out of all the task activity, 67% of posts were related to a coding and design challenge, 26% addressed a discussion point, and the remaining 7% were related to investigation tasks. No post addressed a reading task.
4.2.5 Blended classroom
As we developed the digital content, we wanted to understand how best to utilize it for future generations of students, when we return to the traditional format. To this end, questions and aim to assess the best way to create a future blended classroom. We have identified the following disadvantages of a digital classroom compared to a regular one:
Real‐time communication: 46% of the respondents complained about not being able to get answers immediately, as they would during a live lecture, instead having to send emails and possibly wait a couple of days for a reply. Furthermore, emailed responses may be hard to understand and may require further clarification.
Human contact: 30% of the respondents desired live interaction with their teachers and colleagues. They missed the feeling of being a part of a community and the opportunity to discuss matters and exchange experiences unrelated to the course subject.
Lack of socializing: 30% of the respondents missed external stimuli to prompt them to work regularly. They emphasized both the importance of timetables and being surrounded by their peers and teachers in the working atmosphere. They stated it is easier for them to remain motivated in a real class as it presents a stimulating working environment.
We have identified the following advantages of a digital classroom compared to a regular one:
Replayable lectures: 51% of the respondents appreciated that they could repeatedly listen to the same lecture to understand and refresh their memory of the presented concepts. They enjoyed the possibility of pausing and rewinding videos if they lost focus. Furthermore, 79% of respondents stated that videos are a better alternative to the traditional passive lecture.
Self‐organization: 47% of the respondents were happy that, unrestricted by the rigid timetable, they could organize their own study time. Their studying was more productive as they could sync it to the time of the day when they have the most concentration. They were able to learn at their own pace and avoid context‐switching.
Saving time: 15% of the respondents were happy that they saved time traveling to and from classes.
Increased comfort: 4% of the respondents reported that their home was a much more comfortable learning environment than a crowded classroom.
By examining the literature and the results of our empirical research, we were able to adapt our approach and offer a satisfying learning experience to most students. From this study, we derive a set of recommendations for addressing both the human and technological challenges of running a fully online classroom. With these recommendations, we aim to help both ourselves and other teachers run an effective online classroom, particularly during a pandemic or similar crisis.
In Section 5.1, we outline the limitations of our study and its findings, which may affect the generalizability of our recommendations. Next, in Section 5.2, we provide recommendations on how teachers may facilitate student engagement with online learning through activities and tools. In Section 5.3, we discuss how well‐received the digital elements (defined in Section 2.2) of our online courses were. Based on these findings, we provide recommendations on how to create online learning material.
5.1 Limitations of the study
We applied multiple precautions to minimize risks that may affect the validity of our study. First, we relied on published related work to scope our study, define research questions, and design our research procedure to answer each posed research question (Section 3). Our study participants were not acquainted with our research questions, to ensure their impartiality and avoid possible biases. All participants are students of the same educational institution, enrolled in courses conducted by the same teaching staff. We mitigate the threats regarding the participant set’s diversity and representativeness by including three different courses with students of diverse backgrounds in both the first and third years of study.
We minimized the risks regarding data analysis in the following way. We collected all the data before analyzing it. We consulted related work when designing our data analysis procedure to avoid our own biases and to ensure that the collected data would be useful for interpretation. Based on the published work, we carefully selected descriptive statistics to present our results appropriately. However, some limitations that may impact our findings remain.
The first limitation of our study is the relatively small scale of our empirical research. The surveys are the primary source of our results, and almost 100 respondents completed them. Although this is not an insignificant number and includes both first‐ and third‐year students, the real limitation stems from the fact that all respondents were students of computer science study programs. Such students are familiar with information and communication technologies, and their careers require them to continuously learn the latest tools and technologies. Therefore, our students could more easily adapt to working online, even during a state of emergency. However, we should note that our online classrooms rely on widespread technologies, such as videos and instant messaging applications, which are intuitive and easy to use and do not require high technical skills from their users. Therefore, we believe that students from other fields are also familiar with these technologies. Any possible issues should be easy to solve by providing clear instructions (e.g., through an instructional video) for installing and using a specific tool required to consume the digital educational content.
Another limitation of our study is the heavy reliance on survey results. While surveys help collect data from a large population, they can produce inaccurate results if improperly implemented .
First, a survey question might be misunderstood by the respondent, leading them to provide an inaccurate answer. To combat this problem, we followed different guidelines for effective survey construction, described in Section 3.1.2. Furthermore, one author developed the initial set of questions and sent them to the other authors for review. Each author wrote their understanding of the question and expected answers, and we redesigned ambiguous questions until all authors agreed on their meaning.
The second problem with surveys is the interpretation of the answers to the open‐ended questions. The answers can be misinterpreted and might hold a different meaning to different people. To overcome this issue, we followed the methodology for the survey analysis described in Section 3.1.3. We further tackled this issue through focus group discussions after the first survey, where we asked for clarification regarding the answers provided by a significant number of students. Although we tried to address these issues, it is still possible that some students failed to understand the question and that we misinterpreted some answers. A better solution would be to supplement the questionnaire with follow‐up interviews with each respondent, but this was not practically feasible for us.
Finally, the collected data would ideally be extracted directly from students’ actions in the online learning platform. However, our study relies on a self‐report questionnaire that was voluntary. Even though a substantial portion of participants completed the survey (33.6%), we must be aware that voluntary activities are generally performed by the most devoted participants.
5.2 Facilitating student engagement with an online course
Here, we discuss the results of our empirical research related to RQ1. We define recommendations for facilitating student engagement with a fully online course, particularly during a crisis such as COVID‐19. We group these recommendations around discovered categories of factors that influence student devotion and motivation regarding a course (Section 3.2). We summarize these recommendations in Table 7 to highlight the key takeaways.
|Course staff dedication and organization||
|Working environment and self‐organizing skills||
5.2.1 Course staff dedication and organization
In the responses we collected, the teaching staff attitude proved to be one of the most influential factors for student engagement. Our results show that students take cues from their teachers and that the course staff’s devotion translates into the students’ devotion. Importantly, this factor can have both a positive and negative effect. Many students reported that it was difficult to engage with courses that had little staff commitment and were poorly organized.
We found that continuous communication is critical for successfully running an online course, especially during a state of emergency. We note two aspects of this communication: communication among the teachers, and communication between teachers and students. First, the teaching staff needs to regularly exchange information, observations, and ideas among themselves to ensure they can quickly address any arising problems. We found that frequent use of instant messaging applications helped us coordinate our efforts and promptly react to new information. Second, our results show that students remain motivated to engage with the course when they have opportunities to interact with the teaching staff (and not only consume videos and other educational content). Therefore, we recommend setting up a timeslot for online discussions or integrating them into the course content, as described in Section 2.2.5.
5.2.2 Working environment and self‐organizing skills
Through the surveys, we saw that most of our students had adequate working conditions. Those who did not mostly lacked adequate working space, something the teaching staff cannot influence.
However, while the majority had adequate working conditions, many students had trouble organizing their workload and maintaining motivation in their new working environment. The research of Lee et al.  also confirms that a technologically based learning environment is especially challenging for students when it comes to learning self‐regulation. Luckily, teaching staff may address this problem, as self‐organizing skills can be enhanced .
First, we recommend that instructors define clear expectations and deadlines regarding their course and the students’ engagement. Our results show that, without clear expectations, students have trouble organizing their schedules; they are more likely to engage with courses that have defined these important milestones.
Second, we advise teachers to assist their students in organizing their workload and developing a working timetable for the semester. Without the timetable imposed by traditional lectures and lab exercises, many students had trouble developing a working schedule for consuming the digital educational content of various courses. This schedule should include transparent estimations of the time needed to complete each objective and precise specifications of mandatory and optional tasks. Over time students can help improve the estimates by reporting the actual time they needed to finish the learning goal. Teachers should be aware of the self‐organization issue and take time (e.g., at the end of online discussions) to examine how the students organize their studies.
Finally, a dedicated short workshop on goal and time management might be helpful for interested students. Such a workshop could help students define realistic and relevant study goals and teach them how to decompose and plan their realization in unfavorable working conditions. Furthermore, it could introduce them to task and time management tools to support them in this process. As this workshop falls outside the course teacher’s scope, the faculty management is responsible for providing such a service to the students (e.g., through a set of educational videos).
5.2.3 External incentives
Our students praised control checkpoints as a significant factor for their continuous studying and completion of the semester project. We recommend teachers decompose their semester projects and end‐semester exams into smaller, more manageable chunks that grade the student’s understanding and incentivize them to work more frequently. Such recommendations are well‐established in the literature, as frequent testing promotes better knowledge retention than large exams that promote studying in chunks .
We also note that incentives such as bonus points towards a grade can promote engagement with tasks and other additional assignments. Without such incentives, many students commented that they avoided tackling the tasks to focus on the semester project that contributed to their grade. Therefore, teachers should offer some tangible rewards for any student engagement they wish to see more of.
5.2.4 Perceived workload
Many students felt overburdened by the overall semester workload. A significant factor responsible for this state was the lack of organization on some courses, resulting in unclear expectations and chaotic release of educational content. Furthermore, many students had trouble self‐organizing, which contributed to feeling overwhelmed. We already discussed both topics.
Importantly, several students felt that some teachers introduced more tasks and assignments than they would have in the traditional setting. Although we did not verify these claims, we recommend that teachers resist the temptation to introduce additional mandatory assignments. Students do save some time by not commuting and by increasing video playback speed; nevertheless, they can easily be overwhelmed if all teachers introduce additional compulsory tasks.
5.2.5 Peer support
We observed how a team’s cohesiveness could influence the engagement of its members with the course. Our results even show that well‐organized teams require fewer external incentives in the form of control checkpoints. As project‐based learning often involves teamwork, we recommend that teachers discuss the challenges they have seen teams face when working on their course. This guidance can help students run more efficient teams, resulting in higher engagement with the course and better learning outcomes.
Considering the coronavirus and similar crises, students could benefit from a dedicated workshop that teaches them how to collaborate remotely, communicate, and organize work without physical contact. Such a workshop could teach students how to use contemporary tools to collaborate effectively. As these topics fall outside the curriculum of most courses, the faculty is responsible for providing such a workshop to interested students.
Finally, our results show that general peer support helps students maintain their devotion and motivation regarding a course. As students learn a lot from their peers, we recommend finding a digital twin for the chatter that happens in the hallways during breaks. By developing an online community of practice related to the course and the broader topics it covers, students can engage with colleagues to ask questions, share articles, and support each other. However, we should note that effectively running an online community of practice is no small feat and is a separate research topic that we will explore as part of our future work.
5.3 Building effective digital educational content
Here we discuss the results of our research related to the development of effective digital educational content. We explore how the different digital elements (described in Section 2.2) were received and why this was the case. We group our insights and recommendations around sets of digital elements and summarize them in Table 8.
5.3.1 Text, image, and slides
Students generally disliked learning from static educational elements, such as text and presentation slides containing text and images. Furthermore, our results show that students dislike learning from books, as they poorly rated both reading tasks and book chapters as sources of learning. Part of the reason might be the language barrier, as most authoritative books on the subject lack a good translation from English. Another reason might stem from the fact that millennials prefer to consume video and audio materials instead of text. Based on these findings, we recommend that teachers carefully consider whether they can offer an effective online course by relying solely on static text and images.
In contrast, students appreciated when the text was interleaved with video materials, as was the case for most of our educational content. We recommend this approach as it provides much flexibility to the instructor to correct and supplement the video materials with additional information, ideas, and tasks.
5.3.2 Video, interactive videos, live streaming, and online discussions
All forms of video received generally favorable reviews. Students appreciated prerecorded videos the most, as these let them watch lectures at a time of their choosing, replay parts of a video, and pause when they needed a break or wanted to try out a tool. Furthermore, they praised the ability to increase playback speed and even criticized interactive videos for lacking this feature. However, many students appreciated interactive videos, stating that the question prompts helped them focus and improved their knowledge retention. Although we did not test the validity of these claims, several papers document this finding [18, 37]. While live streaming offered none of these advantages, students liked that they could interrupt the lecturer to ask for clarification. This aligns with our finding that students greatly appreciated online discussions and stated that such communication significantly affects their engagement with and dedication to a course.
Based on these findings, we recommend that teachers develop interactive videos and host them on a platform that allows students to increase video playback speed. Alternatively, teachers should investigate if a browser plugin exists that offers this feature and notify their students. Finally, the interactive videos should be supplemented with online discussions that allow students to ask questions and enable teachers to assess the student’s understanding of the lecture.
Regarding the video styles that students prefer, we found that the talking head, when not accompanied by any slides or visual aids, is the least well‐received. Students commented that it was hard for them to focus on the lecturer’s talk, especially if the talking head segments were longer. On the other hand, students enjoyed watching screencasts, stating that live coding and tool use help them maintain focus, even with longer segments. Therefore, we recommend that instructors avoid the talking head and favor practical problem‐solving exercises through screencasts.
We observed that only a small number of students provided solutions for our online tasks. We believe there are two reasons behind this inactivity. First, some students might not be interested in learning more about the related topic. As there is no incentive to complete a task other than receiving guidance from the instructor, students might not be motivated to participate. As discussed before, some students would be more willing to complete the tasks if this work contributed to their grades. Second, some students prefer to study in chunks, collecting a few weeks of coursework to complete in a single batch. As we often commented on task submissions before the next lecture, students learning in chunks had already seen the final solution and were dissuaded from contributing further. We recommend giving students more time (e.g., 2 weeks) to post their submissions before commenting on the solutions.
Importantly, students avoided commenting on each other's solutions, despite our recommendations to do so. Without this collaboration, students miss the opportunity to learn from one another, build a better solution, and receive all the benefits of working together on a small‐scale problem. We will research and design a better system for online tasks and their collaborative completion. As a first step, we recommend emphasizing the importance of collaborative task solving and stimulating this behavior with external incentives.
Regarding the different task types, students preferred the design and coding challenges, while investigation and reading tasks were less well‐received. From the focus group discussions, we conclude that part of the reason stems from the general disinterest our students have in reading. However, we discovered that some students were unaware of the skills they could hone by solving these tasks and how those skills apply to their future jobs. They argued that coding and design challenges were more useful as they map to everyday programming tasks. Therefore, we recommend that teachers articulate the long‐term value of solving each task type.
6 RELATED WORK
This section surveys work closely related to our goal of providing students with the best online learning experience during a crisis. We review several studies that used questionnaires to understand the impact of COVID‐19 on higher education institutions, as well as studies on student engagement in online higher education classes.
Nenko et al. investigated the distance learning process in Ukrainian higher educational institutions: its effectiveness, its negative and positive aspects, and future perspectives. They conducted a survey encompassing 540 participants from three major higher educational institutions. They concluded that online education has a positive impact on student satisfaction and report that the amount of time for students' independent learning increased compared to the traditional classroom. They identified several obstacles to effective distance learning, some of which stem from students lacking adequate equipment or self‐management skills. These findings agree with ours for RQ1: while our software engineering students were generally well‐equipped, they lacked self‐organization skills, which influenced their motivation.
Demuyakor posed several research questions similar to our RQ2: "What is the student's Perception of the effectiveness and credibility of Course Content for online learning?", "How much satisfied are the students with the 'Learning Resources' available?", and "What are the expected challenges that students are likely to encounter during the online teaching and learning?". They surveyed 330 Ghanaian international students at various institutions in China. They found that the students appreciated the transfer to the online classroom and praised its effectiveness. Students were generally satisfied with the provided learning resources. They stated that a significant challenge of online learning is imparting a sense of community in an online environment. All of these findings agree with ours.
Lall and Singh aimed to understand the student perspective, attitude, and readiness regarding university‐level online classes. They surveyed 200 students from various departments of Graphic Era Hill University, Dehradun. Their questionnaire contained questions similar to our RQ2: they asked students about their level of satisfaction with online learning, their reasons for liking and disliking online classes, and their preferred choice of digital educational material. They found that students generally liked the online courses, the most common reasons being the possibility of self‐organization and the flexibility of study location. The main issues with online learning were the missing human contact (cocurricular activities and meeting friends) and the lack of two‐way communication. These findings agree with ours for the questions aimed at assessing the viability of the blended classroom. As for the type of digital educational material, Lall and Singh found that the most preferred option was PPT (Microsoft PowerPoint presentation) with an audio recording, followed by videos, webinars, and video conferencing, while the least favorite option was PPT without audio. These findings also agree with ours. Compared to their study, we performed more extensive research on this topic, as we also asked the students to rate different video styles.
Hammond et al. composed a very extensive survey aimed at determining the effects of the forced transition to online learning during the COVID‐19 pandemic. Their goal is to identify the factors that cause individuals to respond positively or negatively. Responses are still being collected, but the researchers have published the following recommendations:
Students are struggling to establish effective collaboration. Instructors are advised to help form study groups and sessions and make participation mandatory for a few weeks. In our course, we aimed to address this issue by organizing compulsory student teams to complete the course project. We also tried to facilitate communication outside of these small teams through various discussion and research tasks and online discussions. Our findings indicate that peer support is an essential factor for students’ engagement.
Internet connectivity might be an issue for some students. The recommendation is to avoid using real‐time interactions exclusively and provide recorded material. We followed this recommendation, and our results show that students highly appreciated it. Even without connectivity issues, students enjoy the flexibility offered by the prerecorded lectures (learning at their own pace and the possibility of self‐organization).
Students expressed several aspects of online learning that caused them anxiety or stress. Instructors are advised to provide support for students and use more scheduling than previously to help them stay on track. These findings agree with our own.
Students missed campus life and expressed stress due to the disruption that might impact their financial stability. Hammond et al. recommend that instructors facilitate communication about these sources of stress. Similarly, many students in our study craved human contact in the online classroom. Our students did not express economic anxiety but felt the disruption of their daily routines and working conditions, which negatively impacted their motivation. Thus, we also recommend that the faculty facilitate online student discussions unrelated to the course materials to help students cope with these issues.
Andone and Mihaescu present a case study of blending Massive Open Online Courses (MOOCs) into higher education courses. They report that most of their students would like the theoretical part of the class to be online, which agrees with our findings. When asked which Open Educational Resources (OER) they liked the most, most respondents voted for video materials, which also agrees with our findings. Interestingly, the next most popular OER option was slide presentations, which the participants in our study preferred the least. We hypothesize this might be due to different styles of creating slides (e.g., less or more text on a slide). However, we cannot confirm this hypothesis, as the authors did not provide a detailed description of the slide presentations used in their study.
Mozelius and Hettiarachchi conducted a literature review to analyze the aspects that need to be considered when implementing blended learning in higher education. Their findings agree with ours, as they identify the following categories as essential factors for successful learning, applicable to our fully online mandatory course context: teaching staff attitude, responsiveness, and teaching quality; students' goals and ambitions; collaborative learning; course organization; and supplying interactive media formats (e.g., interactive videos). They also state that social presence and learner satisfaction can be stimulated by increasing the use of media technology and addressing students' concerns and queries, which we strived to implement throughout our course.
El Firdoussi et al. investigated the distance learning initiatives in Morocco during the COVID‐19 pandemic. The authors surveyed 3037 students and 231 professors enrolled in different stages of higher education programs to collect their perspectives on the advantages and pitfalls of transitioning to an online learning environment. Most respondents found online learning less interesting than the face‐to‐face alternative. Related to our work, the authors examined the students' preferred educational content format: 56% of students preferred recorded video lectures, 28% liked live video conferences, while only 16% were content with static documents, such as slideshows or PDF files.
Similar findings were presented in a study of a building automation engineering course at a technical university, whose author proposes, implements, and evaluates a modified blended learning method. The author surveyed students to determine which types of educational content they preferred during distance learning, finding that 78% liked recorded videos, while 21% enjoyed live online discussions.
Kanij and Grundy describe their experience of transferring a data structures and networking course at an Australian university to a distance learning model during the COVID‐19 pandemic. They list the challenges and key lessons learned during this transition. We examine three of their eight recommendations, as they are closely related to our findings. The first and second recommendations highlight the need for clear communication of expectations and new information between the teaching staff and the student body. In line with our findings, frequent reminders regarding the course content and students' assignments helped students overcome organizational challenges worsened by the crisis. In contrast to our findings, the authors found that prerecorded videos harmed students' motivation and that live interactive sessions provide a better alternative.
Sankar et al. surveyed 784 higher education students from multiple universities to collect their perspectives on the COVID‐19 pandemic and the quality of their new distance learning environment. The authors study the relationship between education quality and seven independent factors: administrative support, technical support, course content, course design, instructor characteristics, learner characteristics, and social support. Related to our work, their findings reveal a positive relationship between the learning experience and the course design's suitability to the e‐learning environment. Furthermore, they highlight a strong correlation between the quality of learning and the teaching staff's support in organizing and managing the students' workload.
7 CONCLUSION
This study aimed to determine a strategy for providing the best possible online learning experience to students during a crisis such as COVID‐19. During this pandemic, we were forced to transfer three mandatory undergraduate software engineering courses at our university online. We strived to ensure that our students' learning outcomes were not impaired compared to what they would have acquired in the traditional classroom model we initially planned. To do so, we needed to (1) help our students engage with our digital learning materials actively and efficiently, and (2) create effective digital learning content.
At the start of the crisis, we researched and implemented the best practices and constructed an entirely online active learning classroom. As one of the contributions of this paper, we presented a detailed catalog of the digital elements used in our transformed active learning classroom. We hope it will inspire and guide educators striving to create an engaging and effective e‐learning course.
To evaluate our approach and provide further recommendations, we formulated two research questions: (1) "What are the factors that influence student engagement with a mandatory online course during a state of emergency?" and (2) "What type of educational content do contemporary students most appreciate in an online classroom?". To answer them, we designed and conducted an empirical study based on surveys, focus group discussions, and instructors' observations. We consider this study design a contribution that can assist researchers in conducting similar empirical studies.
Finally, we analyzed the collected data to identify the main challenges students face when switching to a fully online classroom and to gauge their opinion of the various digital elements we developed for our courses. We discuss these findings and provide recommendations, which we consider a significant contribution of this paper. We hope our recommendations can help educators build and run a more successful online course.
Enabling learners to study online is essential during a crisis such as COVID‐19. However, knowing how to develop useful and engaging online courses transcends this context. These findings can be used for designing blended learning classrooms, which allow for a more student‐centric approach in a traditional face‐to‐face setting. Moreover, learning to study online effectively is a vital skill for students. In today's quickly evolving workplace, highly valued workers are those who can master new skills quickly and independently. Therefore, we need to prepare our students for the future of life‐long learning, not just by teaching them the subject at hand, but also the skills required for learning in a classroom‐less environment. The findings of this study may help educators understand which of these self‐regulating skills students may need help exercising.
Our empirical study participants considerably valued staff commitment and appreciated the effort to take a personalized approach to helping students understand the subject. However, it was challenging for us to adopt a student‐centered approach with our large student groups. This problem was magnified in the online setting due to slowed communication via email, and students recognized the delayed feedback on their work as one of the primary deficiencies of the online classroom. Luckily, digital technologies can help alleviate this problem. With this goal in mind, we plan to develop a digital assistant for student‐centered tutoring specialized in teaching software engineering. This tool will integrate with the student's development environment and offer real‐time feedback on their programming. This feedback will include one or more educational digital elements, where we will use a recommender system to offer a personalized learning experience. We expect this tool will be of great use in the three undergraduate software engineering courses featured in this paper. We hope it will also serve other educators striving to adopt a large‐scale student‐centered approach in software engineering education and inspire researchers to expand on it or create similar tools for other fields.
This study was supported by the Science Fund of the Republic of Serbia, Grant No 6521051, AI‐Clean CaDET.
Nikola Luburić has been an assistant professor at the Faculty of Technical Sciences, Novi Sad, Serbia, since 2020. Mr. Luburić received his master's degree (2015) and PhD degree (2020), both in Computer Science, from the University of Novi Sad, Faculty of Technical Sciences. He has been with the Faculty of Technical Sciences in Novi Sad since 2014. His research areas include software engineering, computing education, and security and privacy.
Jelena Slivka has been an associate professor at the Faculty of Technical Sciences, Novi Sad, Serbia, since 2020. Ms. Slivka received her master's degree (2008) and PhD degree (2014), both in Computer Science, from the University of Novi Sad, Faculty of Technical Sciences. She has been with the Faculty of Technical Sciences in Novi Sad since 2009. Her research interests include machine learning, data mining, and computing education.
Goran Sladić has been an associate professor at the Faculty of Technical Sciences, Novi Sad, Serbia, since 2016. Mr. Sladić received his master's degree (2006) and PhD degree (2011), both in Computer Science, from the University of Novi Sad, Faculty of Technical Sciences. His research interests include information security and privacy, document management systems, context‐aware computing, and workflow systems.
Gordana Milosavljević has been a full professor at the Faculty of Technical Sciences, Novi Sad, Serbia, since 2020. Mrs. Milosavljević received her master's degree (2001) and PhD degree (2010), both in Computer Science, from the University of Novi Sad, Faculty of Technical Sciences. Her research interests include model‐driven engineering, agile development methodologies, business process modeling, and computing education.
Data available on request from the authors.
- 1, , , , and , Understanding learners’ motivation and learning strategies in MOOCs, Int. Rev. Res. Open Distributed Learn. 18 (2017), no. 3. 119– 137.
- 2 and , Blending MOOCs into higher education courses—a case study, 2018 Learning With MOOCS (LWMOOCS), IEEE, 2018, pp. 134– 136.
- 3, Project‐based learning for the 21st century: Skills for the future, The Clearing House. 83 (2010), no. 2, 39– 43.
- 4, , and , Make it stick, Harvard University Press, Cambridge, MA, 2014.
- 5, Statistical power analysis for the behavioral sciences, Routledge Academic, New York, NY, 1988.
- 6, , , and , What to expect, and how to improve online discussion forums: The instructors’ perspective, J. Internet Services Appl. 10 (2019), no. 1, 22.
- 7, Coronavirus (COVID‐19) and online learning in higher institutions of education: A survey of the perceptions of ghanaian international students in China, Online J. Commun. Media Technol. 10 (2020), no. 3, e202018.
- 8, , , , , and (2020). Assessing distance learning in higher education during the COVID‐19 pandemic. Education Research International, 2020.
- 9 and , Active learning in the college classroom, J. Excellence College Teach. 9 (1998), no. 2, 3– 24.
- 10 and , Learning and teaching styles in engineering education, Eng. Educ. 78 (1988), no. 7, 674– 681.
- 11 and (2018). What works and doesn’t work with instructional video.
- 12, , , , , , and (2020). A survey to measure the effects of forced transition to 100% online learning on community sharing, Feelings of Social Isolation, Equity, Resilience, and Learning Content During the COVID‐19 Pandemic.
- 13 (2009). Visible learning: A synthesis of over 800 meta‐analyses relating to achievement.
- 14 and , Applying multimedia instruction in e‐learning, Innov. Educ. Teach. Int. 43 (2006), no. 1, 15– 27.
- 15 and , Adapting teaching of a software engineering service course due to COVID‐19. 2020 IEEE 32nd Conference on Software Engineering Education and Training (CSEE&T), IEEE, 2020, pp. 1– 6.
- 16 and , CoVid‐19: Unmasking the new face of Education, Int. J. Res. Pharmaceut. Sci. 11 (2020), no. SPL1, 48– 53.
- 17, , and , Generative learning strategies and metacognitive feedback to facilitate comprehension of complex science topics and self‐regulation, J. Educ. Multimedia Hypermedia 18 (2009), no. 1, 5– 25.
- 18, , , and (2014). LIVE: An integrated interactive video‐based learning environment, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2014, pp. 3399–3402.
- 19, Focus groups, Annu. Rev. Sociol. 22 (1996), no. 1, 129– 152.
- 20 and , Critical factors for implementing blended learning in higher education, Int. J. Inform. Commun. Technol. Educ. 6 (2017), no. 2, 37– 51.
- 21, , , , , and , Chronicling engagement: students’ experience of online learning over time, Distance Educ. 40 (2019), no. 2, 262– 277.
- 22, , and , The COVID‐19 Distance Learning: Insight from Ukrainian students, Revista Brasileira de Educação do Campo 5 (2020), e8925.
- 23, Focus groups supporting effective product development Joe Langford and Deana Mcdonagh (Editors), Design J. 6 (2003), no. 1, 61– 62.
- 24, , and , Designing and developing video lessons for online learning: A seven‐principle model, Online Learn. 23 (2019), no. 2, 82– 104.
- 25, Modified blended learning in engineering higher education during the COVID‐19 lockdown—building automation courses case study, Educ. Sci. 10 (2020), no. 10, 292.
- 26, , , and , Conducting on‐line surveys in software engineering, 2003 International Symposium on Empirical Software Engineering, ISESE 2003 Proceedings, IEEE, 2003, pp. 80– 88.
- 27, Pausing principles and their effects on reasoning in science, New Directions Community Colleges. 31 (1980), 27– 34.
- 28, , , , , , and , Factors affecting the quality of e‐learning during the COVID‐19 pandemic from the perspective of higher education students, J. Inform. Technol. Educ. Res. 19 (2020), no. 1, 731– 753.
- 29, , and , Speakers and boards: A survey of instructional video styles in MOOCs, Tech. Commun. 63 (2016), no. 2, 101– 115.
- 30 F. Shull, J. Singer, and D. I. Sjøberg (eds.) Guide to advanced empirical software engineering, Springer Science & Business Media, London, 2007.
- 31, , and , Effects of small‐group learning on undergraduates in science, mathematics, engineering, and technology: A meta‐analysis, Rev. Educ. Res. 69 (1999), no. 1, 21– 51.
- 32, Using Likert type data in social science research: Confusion, issues and challenges, Int. J. Contemporary Appl. Sci. 3 (2016), no. 2, 36– 49.
- 33, and , Introduction to qualitative research methods: The search for meanings, Wiley‐Interscience, New York, NY, 1984.
- 34, The focus group: A strategic guide to organizing, conducting and analyzing the focus group interview, Probus Publishing Company, Cambridge, 1994.
- 35, A review of research on project‐based learning, The Autodesk Foundation, 2000.
- 36, , and , A study of student satisfaction in a blended e‐learning system environment, Comput. Educ. 55 (2010), no. 1, 155– 164.
- 37, , , and , Instructional video in e‐learning: Assessing the impact of interactive video on learning effectiveness, Inform. Manage. 43 (2006), no. 1, 15– 27.