How AI is Shaping Online Discussion Quality

Adam Fein
4 min read · Nov 5, 2019

Discussion forums have been a key feature of online and technology-enhanced courses for over 20 years. There is a breadth of information and best-practice guidelines on ways to use discussion forums in online learning environments, and online discussion is an important area of research in both the scholarship of teaching and learning and educational technology. Unfortunately, it presents several hurdles to successful implementation. To spark mutual communication among students, the instructor must be present but not intrusive, craft protocols that require dialogue, and provide a schedule for posting; even then, successful engagement requires that students regularly access the discussion platform.

Artificial intelligence (AI)-based technologies, such as the AI-driven discussion platform Packback, have been deployed in high-enrollment classes to reduce instructor and grading workload while maintaining the perception of instructor-student interaction, which is a key ingredient for successful student participation and learning.

At the University of North Texas (UNT), we are conducting research to investigate the influence of an artificial intelligence (AI)-driven discussion platform (Packback) on student learning outcomes and instructor workflow. Packback's approach to online discussion implements a protocol based on the Socratic method of questioning, assigns each student a 'curiosity score,' and creates a newsletter of featured posts. It also includes automated moderation of posts, which allows instructors to focus on higher-level coaching. These and other features mimic a social networking site rather than the traditional discussion forums of a learning management system (LMS).
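
To make "automated moderation" concrete, here is a toy sketch of the kind of checks such a system might run before escalating a post to a human. This is purely illustrative and not Packback's actual algorithm; the rules and thresholds are our own assumptions.

```python
# Toy illustration of automated post moderation (NOT Packback's actual logic).
# It flags posts that are very short, pose no question, or cite no source,
# so instructors only need to review the flagged posts by hand.
import re

def moderate(post: str) -> list[str]:
    """Return a list of issues found in a discussion post (empty list = passes)."""
    issues = []
    if len(post.split()) < 50:                # arbitrary minimum length
        issues.append("post is shorter than 50 words")
    if "?" not in post:                       # Socratic-style posts should ask a question
        issues.append("no open-ended question posed")
    if not re.search(r"https?://\S+", post):  # crude check for a cited source
        issues.append("no source cited (no URL found)")
    return issues

print(moderate("Why do discussion forums improve learning? See https://example.edu/study"))
# -> ['post is shorter than 50 words']
```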

As part of our research project at UNT, we are comparing discussion experiences in the same courses with the same instructors but with different platforms for online discussion: the Canvas LMS for the control condition and the AI-driven Packback for the experimental condition. We also administered preliminary surveys of student perceptions and developed measures of post quality and mechanisms to assess instructor workflow. A pilot study conducted in the summer of 2019 served as proof of concept for the main study we are currently conducting in the fall semester. A total of 248 students were enrolled in the three pilot courses: 129 used Packback and 119 used Canvas. The three online courses were an undergraduate political science course taught by Dr. Jackie DeMeritt, an undergraduate contemporary biology course taught by Dr. Rudi Thompson, and a graduate learning technologies course taught by Dr. Lin Lin.

The prompts and expectations for posting differed in each of the courses, but in all three courses students cited significantly more sources when using Packback than in Canvas discussion forums (44% of all Packback posts included a cited source, compared to 14% of Canvas posts). While a post is being edited, Packback offers guidance on supporting one's claims, such as adding an image, video, or text-based resource that gives the reader what they need to respond to the original poster, along with a space to cite the source of the resources used. Our summer pilot indicates that this guidance is effective at producing posts whose claims rest on evidence beyond the poster's opinion, and at eliciting citations for the resources used.
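
As a rough illustration of how such a difference in citation rates could be tested for statistical significance, the sketch below runs a two-proportion z-test with statsmodels. The post counts are invented to match the reported percentages; the study reported only the rates (44% vs. 14%), not raw counts, and this is not necessarily the analysis we used.

```python
# Hypothetical two-proportion z-test for the Packback vs. Canvas citation rates.
# The counts below are made up to match the reported percentages (44% vs. 14%);
# they are NOT the study's actual post counts.
from statsmodels.stats.proportion import proportions_ztest

cited_posts = [440, 140]    # posts with a cited source: [Packback, Canvas] (hypothetical)
total_posts = [1000, 1000]  # total posts per platform (hypothetical)

z_stat, p_value = proportions_ztest(count=cited_posts, nobs=total_posts)
print(f"z = {z_stat:.2f}, p = {p_value:.4g}")  # a small p-value indicates a reliable difference
```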

Further research is ongoing at the University of North Texas NetDragon Digital Research Center, within our Division of Digital Strategy & Innovation and the Center for Learning Experimentation, Application and Research (CLEAR). We have added survey questions on listening (i.e., reading posts) and learner types, as well as focus groups, to examine how instructor and teaching assistant interventions affect the quality of student posts and learning outcomes. Further opportunities to study student performance and engagement include supporting different reading strategies to improve learning outcomes, providing a dashboard view of participation to encourage students, validating the resources cited in posts, decreasing group sizes for online discussion, and collecting data on listening behaviors.

— —

July 2020 Update:

The Fall 2019 findings covered above highlighted improvements in submission quality:

  • Quality of submission. The prompts and expectations for posting differed in each of the courses, but in all three courses students cited significantly more sources when using Packback compared to students in Canvas discussion forums (44% of all posts in Packback included a cited source, compared to 14% of posts in Canvas).

Follow-up research in Spring 2020 elucidated an additional benefit of the Packback AI-based discussion board:

  • Prioritization of high-utility tasks. Data show that AI can help remove repetitive administrative tasks (particularly daunting in high-enrollment courses), including complete/incomplete grading and board moderation. Delegating grading and moderation to the machine gives instructors (humans) more time to interact with students, coaching and praising them. Preliminarily, instructor coaching seems to work: discussion posts improve significantly after a student has been privately coached by an instructor or TA, a pattern evident in all three courses (political science, biology, and learning technologies). A minimal sketch of such a before-and-after comparison appears below.
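
The sketch below assumes each coached student's posts are scored on a numeric quality rubric and compares scores before and after coaching with a paired t-test; the rubric and the scores are fabricated for illustration and are not the study's data or analysis.

```python
# Hypothetical paired comparison of post-quality scores before and after
# private coaching. The rubric scale and scores are invented for illustration.
import numpy as np
from scipy.stats import ttest_rel

before = np.array([2, 3, 2, 1, 3, 2, 4, 3, 2, 3])  # score of each student's post before coaching
after  = np.array([3, 4, 3, 3, 4, 3, 4, 4, 3, 4])  # score of the same student's next post

t_stat, p_value = ttest_rel(after, before)  # paired t-test across students
print(f"t = {t_stat:.2f}, p = {p_value:.4g}")
```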

— —

We would love to partner with other institutions! Researchers interested in discussing or replicating this study can visit our digital research center site at netdragon.unt.edu or contact Cassie Hudson (cassie.hudson@unt.edu) or Dr. Tania Heap (tania.heap@unt.edu) at the University of North Texas for more information.

Adam Fein

Adam D. Fein (PhD, Illinois) is the VP of Digital Strategy & Innovation at the University of North Texas. His research examines multimedia learning performance.