Honest review of Packback: The classroom AI discussion board

An honest review of the learning tool and whether I would use it again.

Full disclosure: I am not being paid, and this review was not requested by Packback.

Scenario: I used the Packback application for a gen-ed class of 100+ students in an in-person class in Fall 2021. I also had a TA moderating discussion board content weekly. I wanted to provide other educators with a short summary of my experience to help them decide whether to use the tool in their classrooms.

Summary: I would use it again, but only for large classes. Because of its cost ($28 per student at my university), I would only use it in classes where I have no textbook. I would also reduce responses to every other week instead of every week. It's a superior tool to an LMS's built-in discussion boards, but there were issues with syncing, usability, and content moderation that will frustrate instructors who are not prepared for them.

In my ideal world, I'm against AI providing feedback to students. In that world, classes are small, well-paid TAs aid us in the fine craft of teaching, and students write thoughtful essays and receive and implement copious feedback on their writing. In today's higher-ed environment, that's just not going to happen. This tool has a significant learning curve, and if you don't like discussion boards, Packback is not going to change your mind. Still, I believe there are contexts where it may be useful.

Why did I use it? I disliked Canvas's built-in discussion boards and tools like FlipGrid, which uses video recordings for responses. I did not have exams in the class, and I wanted weekly responses to the readings to be more robust and substantial. I wanted my students to write regularly to reinforce what they were learning in class.

What is Packback? Packback is essentially a rolling discussion board feed. It's a single feed, similar to Facebook or Twitter, as opposed to a forum-style board where posts are sorted by topic. Students receive a post score based on the length and complexity of their submission and whether or not they cite a source. When I surveyed my students after 8 weeks of use, 59% (51 students) said they liked Packback, 25% neither liked nor disliked it, and 16% disliked it.

Pros: This tool gives students instant feedback on aspects of their writing and prompts them to write more. I set it so that students needed a minimum response score of 50 (out of 100) to get credit for the assignment.

Students reported liking the feedback and the points. As long as the score was above 50, the assignment was auto-graded based on completion, and my grader moderated a subsection of responses each week. Students liked trying to beat their previous scores. Every week, Packback sent me an auto-generated report with the top scorers and the most improved, and there was also a leaderboard based on the points. In class, I would announce the students who had boosted their scores from the previous week (most improved).

In terms of support, Packback had excellent tech support and guided me through every step of the process, from setup to finishing the course. Through follow-up meetings with their team, I saw more examples of how other instructors used the app, one of which resulted in my students creating memes related to the weekly readings. The app is structured to promote open-ended questions, which can be difficult for everyone to get used to, but it did help students structure arguments or make interesting opening salvos on the discussion board.

If a student response met the minimum, but was off-topic, my grader would moderate it. There was a way for students to re-do their submissions, but few students took that opportunity.

Cons: The AI is remarkably easy to manipulate. I didn't have this problem much in my class, but in other classes students could copy and paste from previous students' responses, and the AI could not detect the plagiarism. Because of the scrolling format, it was hard for my grader to spot it as well. The only way to moderate a post is to give the student 0 points for it, which is problematic because if students wrote something, I normally wouldn't give a 0.

Students receive a higher score for including a source. The tool, however, does not vet the sources, so students can post any old website and it counts as a source. Navigation is another issue: it is difficult to view all posts by prompt. This was especially annoying for my grader, who initially found the tool difficult to navigate for grading and moderating, but eventually got used to it. In terms of costs, it cost each student $28 to use this service, and students also had to pay for a $15 subscription to iClicker that semester. I did not have a textbook for the course. A total of $43 is not terrible for a class, but I would prefer to keep my students' costs below $25. Also, something feels inherently wrong about students paying not for content (i.e., textbooks) but for learning apps.

Finally, there was an issue with syncing with Canvas. If students didn't click through from the Canvas site first, their response wouldn't be linked to their account. This was extremely confusing for me and for the students: they would sign into the website directly, submit their response, and then it wouldn't show up in their grades. I was able to fix this issue by posting the prompt only in Canvas.

Where would I use this again? Only for large online courses where there is no textbook. The auto-grading is really nice, and the students like the AI feedback and leaderboards. It's not perfect, though, and discussions need active content moderation. For large in-person courses, the audience response system subscription is more important for attendance and participation, and I would prioritize that when balancing student costs.

For more information about the product: packback.co
