Aiding engagement and interactivity in online classrooms.

nimisha jain
14 min read · May 30, 2021

Edoore: Student Engagement Score

Introduction

In recent years, online education has exploded, with the industry projected to reach $370 billion by 2026. The rise of online learning is not limited to post-secondary education: in the 2017–2018 school year, 59% of public high schools offered at least one online course. But online education suffers from a lack of engagement and interactivity. Many students have sat through an online class zoning out, multi-tasking, and browsing social media. This issue was compounded by the pandemic, which forced over 4 million K-12 classes online.

In March of 2020, schools around the country and the world made the decision to transition to remote learning in light of the dangers Covid-19 posed to in-person learning. While necessary to protect the health and well-being of both teachers and students, the abrupt transition left school systems struggling to adapt to online education. Traditional methods of interacting with students and gauging classroom engagement fell short given the difficulties inherent in distance learning. Teachers across the country made do as best they could with the tools they found online, but there is a shortage of resources designed specifically for K-12 online learning.

Understanding the real challenge

We partnered with a high school (the name cannot be disclosed) to conduct user research and testing with teachers. The school sits in a wealthy, single-high-school town near Boston, MA. It was selected both for convenience, as one of the researchers had a pre-existing relationship with the principal and the school, and because there were relatively few confounding factors when trying to understand the relationship between in-class and online learning. All of the school's teachers taught traditional, in-person classes before March 2020. During the pandemic, the school transitioned to asynchronous learning in the early spring, followed by synchronous online learning and hybrid learning starting in the fall of 2020.

Statistics on the growth of online education

The research was conducted to understand where teachers and students were facing the largest challenges in online education, especially with regard to measuring and increasing engagement in the classroom. Using a human-centered design process organized into four phases (user research, concept design, detailed design, and testing), we designed a product called Edoore.

Team and Timeline

As a graduate student, I teamed up with another student, and we collaborated closely (remotely) throughout the entire process.

My role: User Interviews, Ideation and Concept Design, Detailed Design and Prototyping, Think-Aloud Usability Testing, Expert Review, Final Design.

Timeline: As the capstone project of the Master’s degree, we spent around 6 months (2 semesters) in total on this project.

The Process

User Research

Background Research

Background research was conducted to understand the definition and importance of engagement and interactivity in online high school classrooms, as these are two key factors affecting student performance.

It was found that student engagement can be demonstrated by how the student interacts with others and their motivation, with five key benchmarks:

  1. Active and collaborative learning
  2. Student effort
  3. Academic challenge
  4. Student-teacher interaction
  5. Support for learners.

Teacher engagement, in turn, can be described as the ability of teachers to positively impact student engagement by facilitating interaction, organizing and designing course materials and timelines, and instructing students.

Two factors proved vital for promoting engagement:

  1. Supporting Discourse
  2. Setting Climate

Benchmarking

Benchmarking was conducted to understand what tools were available in the online education and conferencing space.

  1. Learning Management Systems for managing student data
  2. Video Conferencing platforms for conducting classes
  3. Poll/quizzes for quick assessments or feedback
  4. Engagement tools for keeping class engaged and fun
  5. Collaboration tools for working live in teams and tracking work

To conduct actual classes, most teachers turn to Zoom or Google Meet. While both are excellent at handling basic meetings, added functionalities such as whiteboards, polling, and hand-raising are clunky and largely ineffective. Furthermore, Zoom had well-publicized security issues at the time: meetings were not end-to-end encrypted, and anyone could join a meeting that was not password protected.

User Interview

User interviews with teachers were determined to be the best way to gain in-depth insight into the daily challenges teachers face, what they have discovered, and what they need.

  • 6 teachers from one high school
  • 60–90 minute video conference
  • 28–60 years of age
  • Backgrounds: math, science, foreign language, and English literature
  • Taught in-person before the pandemic
  • Probed areas like teaching background, current online teaching, preparation, and differences between in-person and online teaching.

Teachers provided pictures of their teaching setups, example planning and lesson documents, and the assessments they were using to gauge student learning. These additional materials gave us a greater understanding of their physical environments, teaching styles, and behaviors.

Interview Questions and Online teaching set-up shared by a chemistry teacher

Below are some words from the teachers…

It would be great if I could listen and see what’s happening in a session secretly or walk around and check each session for 10 sec

It’s hard to know how disengaged they are.

I don’t need a hundred apps. I need 5 that work really well for what I do

Because I can’t be in every breakout room at once and there’s nobody actually either monitoring them or, you know, supporting their, their conversation

Student Survey

The student survey helped us triangulate teacher statements about student activity, solicit student opinions on key areas identified in the teacher interviews, and ensure the voice of our secondary users was heard.

  • 56 responses
  • 25 questions, 5–10 min survey, via Google form
  • Shared to students through teachers
  • Triangulating teacher statements
  • Open responses on engagement, assessments, interaction (verbal or non-verbal), and overall opinions on online learning

Survey questions with categories

Below are the findings from the survey…

My social skills are getting worse and worse and it’s getting to harder to talk to people since I haven’t in such a long time.

During the important parts I am definitely listening but during times when I feel like “oh this isn’t important” i’ll be on my phone and just moving around.

Being able to do anything you want and having no consequences literally

Students missed socializing the most

Empathy and Analysis

Affinity Diagram

The findings from the interviews and surveys were categorized and sub-categorized into the following:

  1. Teaching style: despondency, training, teacher life, teacher learnings, teacher flexibility, lifestyle benefits of online teaching
  2. Technology: technology time waste, the technology used, good with technology, technological desires
  3. Academic issues: student responsibility, safety & liability, special education IEP, rigor, fear & failure, testing
  4. Interaction and Engagement: feedback, communication, creating relationships, collaboration, tracking engagement, classroom behavior, efforts to engage.

Below are snapshots and quotes from teachers and students, organized by category.

  1. Teaching lifestyle:

Its eliminated all the best parts of teaching

Teachers are used to “running our kingdom the way we want to”

District preparation: “our school, they tried” (seemed unhappy)

Category 1: Teaching lifestyle

2. Technology:

Lots of stuff already exists, but have to make usable

We’re in bad shape if I’m the one leading the understanding of technology

Category 2: Technology

3. Academic issues

They are young and not that motivated to be honest

I realized, I need to “focus on the minimum”

Category 3: Academic issues

4. Interaction and Engagement

I try to pull in bells and whistles to track them, make it interactive with quizzes and discussions.

They can’t chit chat and you can’t chit chat

something about being on video camera inhibits them

Category 4: Interaction and Engagement

Personas

The primary persona is a teacher, Lizz. Teaching is Lizz’s passion, and she has 6 years of experience. Since moving to online teaching, she misses one-on-one interactions and making meaningful connections. She is concerned her students are forgetting how to discuss and engage with others. To combat this, she tries to use tools to create more interactive lessons but does not know whether it is working, as all she sees are muted faces on the screen.

Primary Persona: High school teacher

The secondary persona is that of a student. Mike is a sophomore in high school who is technologically savvy and misses having fun interacting with his friends at school. He is easily distracted by all the technology around him during class such as his phone. He wants to do well in school but does not learn well when just listening to a lecture on Zoom.

Secondary Persona: High school student

Ideation and Concept Design

Brainstorming Ideas

  • How Might We (HMW)
    Each statement was drawn directly from the affinity diagrams, rephrasing statements from teachers into questions to be answered. Over 60 How Might We statements were created.
    Using these statements as a guide, brainstorming was done individually, with ideas categorized under the corresponding How Might We question. The ideas were then discussed and the vision clarified, which prompted new ideas.
How Might We and idea generation

From there, a selection round was conducted to narrow the ideas down to those that were feasible with the available resources and showed promise. These ideas were further refined and sparked even more ideas. At the end of the brainstorming process, there were over 150 individual ideas.

Idea Pugh chart, narrowing down the ideas based on the factors (column heads).
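To make the selection mechanics concrete, here is a minimal Pugh-style scoring sketch in Python; the ideas, factors, and weights below are hypothetical placeholders, not the actual values from the project chart.

```python
# Hypothetical Pugh-style concept screening: each idea is scored against
# weighted selection factors and ranked by its weighted total.
criteria = {"feasibility": 3, "engagement impact": 2, "teacher effort saved": 2}

ideas = {
    "in-class gamification":     {"feasibility": 1, "engagement impact": 2, "teacher effort saved": -1},
    "live engagement dashboard": {"feasibility": 2, "engagement impact": 2, "teacher effort saved": 1},
    "post-class reward modules": {"feasibility": 1, "engagement impact": 1, "teacher effort saved": 0},
}

def weighted_total(scores):
    # Each factor score (-2 to +2) is multiplied by that factor's weight.
    return sum(criteria[factor] * score for factor, score in scores.items())

for name, scores in sorted(ideas.items(), key=lambda kv: weighted_total(kv[1]), reverse=True):
    print(f"{name}: {weighted_total(scores)}")
```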

The gamification ideas in the chart were conceptualized in two ways:

  1. In-class gamification of lessons: To gamify lessons and engage students in activities during class, the content was the major factor; teaching style, grade, and subject also needed to be considered. The teacher would either create games from the content using pre-set templates or use pre-existing games. These games would allow students to play, learn, and be assessed.
  2. Post-class gamification: This involved cascading the existing lesson content into modules, assigning rewards to each assignment, and motivating students to learn, finish their homework, and unlock the next level or module. The idea was to keep students motivated and engaged in the class, while the teacher would keep adding content and quizzes for each module, primarily reusing existing content.
  • Sketching and storyboarding
    Storyboards were created to further refine the ideas and share them with 4 teachers in a follow-up meeting, where qualitative, in-depth feedback was collected.
    The feedback exposed a major issue with gamifying lessons: teachers do not want to add to their workload by re-creating content as games. With an already increased workload, individual teachers lacked the time and expertise to do so. To make pre-existing content work for every teacher, a massive library of content would be needed, a project both out of scope for the class and infeasible without at least state-level buy-in. One teacher explained that each student is different and teachers know their students; they just wanted some help from the technology to conduct activities and keep track of each student's engagement.
Storyboards, sketching, and doodling for idea generation and sharing within the team.

After eliminating the unwanted ideas, the relevant ones were kept and new features were added to the concept. This led to the storyboard below, which marked the final idea.

Final idea storyboard
  • Wireframing
Initial Wireframes

Iterations and Detailed Design

User flow and features

The user requirements were updated a final time, and feature ideas for the system were iterated on and mapped to the requirements to ensure the prototype met them. These mappings were documented along with their sources, as shown below, to keep the team on the same page.

Sample list of issues and requirements mapped with features of Edoore, along with the sources of the issue.

Once the detailed feature set was finalized and categorized, it looked as below:

The categorized issue to final feature mapping

Visual Design

The visual design focused primarily on e-learning requirements and color psychology, based on the research:

  • A white background was chosen to keep the interface lively and the users alert while conducting and taking classes.
  • A shade of turquoise was used as the primary color, bringing in a sense of natural freshness.
  • Popping colors help grab attention, so bright colors were used according to their status psychology (green for good and red for areas that needed attention).
  • Icons came from Material.io, the design language developed by Google for web and mobile applications. These icons were chosen because Material is currently the most popular open design system in the digital world, and since Edoore was expected to integrate with existing Learning Management Systems, the design needed to align with those systems.
  • Rounded, filled icon styles tend to work better than sharp ones: they are easier on the eye, which makes information processing easier, and they are easily recognizable on light backgrounds.

Once the designs were developed, they looked as below:

The screen for teachers during the live class
Teachers Dashboard, for post-class analysis
The screen for the student during the live class

Usability Testing and Feedback

Think Aloud

The prototype was evaluated through remote user testing, a combination of interviews and think-aloud usability testing. More qualitative methods were chosen because their higher level of detail allows for a smaller sample size, which was desirable given the difficulties of recruiting, scheduling, and conducting remote usability testing.

The teachers were given a list of sixteen tasks (eight in-class and eight dashboard tasks) and asked to complete them on the prototype while sharing their screens. The student group had six in-lecture and three dashboard tasks, nine in total.

Mistakes, hesitations, confusion, and moments of excitement were noted and probed to uncover the details of the users' thinking.

Positive (green) and negative (red) comments from users during think-aloud usability testing. Grey comments were considered out of scope given the available resources and other research data.

The user comments were collated and combined with the observations, and each issue was assigned a severity and a complexity.

Severity: ranged from low, meaning the issue caused no confusion or loss of functionality, to high, meaning there was confusion or loss of functionality that prevented the user from seamlessly completing the desired action.

Complexity ranged from low, or issues with a known easy fix, to high complexity, meaning the solution was unknown or involved research and development of additional functionality.

High severity issues were addressed first, with low complexity issues following.

Categorized issues based on the level of severity and complexity
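A minimal sketch of that triage order in Python; the issue names and labels below are hypothetical examples, not the actual items from the study.

```python
# Hypothetical triage of usability issues: high-severity issues come first,
# and within the same severity, lower-complexity (easier) fixes come first.
SEVERITY_RANK = {"high": 0, "medium": 1, "low": 2}
COMPLEXITY_RANK = {"low": 0, "medium": 1, "high": 2}

issues = [
    {"issue": "inactive icon looks active", "severity": "high", "complexity": "low"},
    {"issue": "engagement metric unclear",  "severity": "high", "complexity": "high"},
    {"issue": "button label wording",       "severity": "low",  "complexity": "low"},
]

for item in sorted(issues, key=lambda i: (SEVERITY_RANK[i["severity"]],
                                          COMPLEXITY_RANK[i["complexity"]])):
    print(f'{item["issue"]}: {item["severity"]} severity, {item["complexity"]} complexity')
```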

System Usability Scale (SUS)

After completing the assigned tasks, users were asked to fill out a System Usability Scale (SUS) survey for the dashboard and the in-class functionality separately. The SUS was chosen for its ease of administration, its robustness with small sample sizes, and because it works well with prototypes even without full functionality, still revealing insights about the design.

A question from SUS Survey

To calculate the scores, the responses to each question were averaged across respondents and then adjusted: for the odd (positively worded) questions, 1 was subtracted from the average; for the even (negatively worded) questions, the average was subtracted from 5. This normalized each question to a score out of 4, with 4 being the most positive response. The individual question scores were then summed and multiplied by 2.5 to produce the final SUS score out of 100.

SUS score of Edoore and where it lies as per the SUS standards
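For reference, here is a minimal sketch of that calculation in Python for a single respondent, following the standard SUS convention; the response values are made up for illustration.

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100) from one respondent's
    answers to the 10 SUS items, each rated on a 1-5 scale."""
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd (positively worded) items contribute (response - 1);
        # even (negatively worded) items contribute (5 - response).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 sum to 0-100

# Made-up example responses for the 10 SUS questions
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```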

Market Expert Review

For a review of the idea, approach, and market fit of Edoore, the team reached out to the Global Head of Education at a renowned company. Meeting virtually, we walked through the idea, the prototype, the research, and the thought process behind the solutions.

The feedback was that Edoore has great potential to stand out in the market, as it targets very specific problem areas that customers face and the market is not addressing. In fact, with the growth of online education over the past year due to Covid-19, some companies are currently developing similar platforms and steering their research toward similar areas.

Edoore holds a distinct advantage in that it was developed from user needs, allowing it to provide targeted solutions to very specific problem areas.

Final Designs

Design changes

Icons fixed
  • Fixed unclear icon usage; the inactive state looked like the active state.
  • The copy was fixed to match what users were familiar with.
Copy fixed
  • Hints/tips and statuses were added for error prevention and to reveal the system status to users.
Added tips and system statuses

Interactive Prototype

Prototype video with functionalities elaborated for teachers (primary persona)
Prototype video with functionalities elaborated for students (secondary persona)

Conclusion

Learnings

The project was a fantastic opportunity to learn and grow both as a designer and as a researcher. Applying the Human-Centered Design process learned in earlier semesters validated that knowledge and helped us work through the challenges we encountered during the capstone project.

Specifically, I learned to gather data, create surveys for broader studies, triangulate findings, and empathize with real users facing real problems. I also learned the importance of early feedback and iteration, which helped us get to the real problem, clear up misunderstandings, and fill gaps while ideating and brainstorming solutions.

Challenges and Future

Due to a lack of resources and the pandemic, we were unable to reach a broader user base diverse in school demographics, location, and modality. Expanding the research and the team would help us learn more, iterate on the design, and make Edoore usable for the majority of schools nationally or globally.

An initial hard-coded build would help improve interactivity, allow testing in real class environments, surface technical requirements, and let us test the nuances of interactive events such as hand-raises and real-time responses from the teacher.

The usability study of Edoore also showed that the engagement metrics need to be defined more precisely, giving teachers more data and the flexibility to use it for tracking each student's progress in various areas.

Limitations mapped with future plans for Edoore

Thanks!

Thank you for reading the case study; please comment or reach out to me with any suggestions. Feedback is greatly appreciated.

Special thank you to Aaron Lytle, Dr. Sang Hwan Kim, and everyone who supported and helped me while working on this project.

Return to my portfolio for other work samples.
