Last year, when I was just beginning to think about authorship verification, I was a student at the University of Maryland’s Smith School of Business. As a student, a few things were clear:
- pen and paper played almost no role in the actual submissions we made for grades (everything was submitted through the learning management system).
- the academic integrity policies in my classes varied, even within our tightly coordinated, cohort-based program
- artificial intelligence had risen so quickly and so broadly (in October/November ’22) that the world had changed, but institutions and classrooms were struggling to react.
While designing a new authorship tool, we’ve made understanding the student perspective one of our top priorities. We’ve actively sought feedback (and contributions) from students through internships (thank you Lance, Grayson, and MJ!), customer discovery interviews, surveys on campuses, and demos with current students. Over those hundreds of interactions, we’ve learned a lot.
Here’s a quick summary of what we’ve learned about students, their thoughts on writing, and their experience using Cursive.
In a recent Cursive survey of over 100 university students, we asked students to describe the best possible writing app. While I can’t share all of the dozens of ideas here, what I can say is that students expressed an almost universal interest in features that would help them become better writers. They were aware that they might have deficiencies and were eager to remediate them.
Many students focused on how AI might help them improve, such as offering feedback on audience, tone, and clarity.
On that same survey, we listed six features of a hypothetical “Writing Improvement App.” 84% of students ranked “Feedback (either from a person or AI-generated)” as important in a student-facing tool.
While many teachers provide great feedback, we also know that many struggle to find the time to give students feedback on their written assignments.
We know that personal data is sensitive and that students today are tracked across their digital lives. Our goal is to capture only the minimum necessary data and to do it as innocuously as possible. When we talk with students about keystroke log data, it has surprised us that students’ reactions are some variation of “wow, that’s cool” rather than “ew, that’s creepy.”
In user testing, we always ask students how they feel about biometric verification and their interactions with the app. We’re excited that student reception is positive; students have also shared (anecdotally) their distrust of being on camera or sharing location data.
Using the keyboard just feels less “invasive.”
Asked whether the data capture bothered them: “I didn’t think about it at all. It was no different from how I usually write in class.”
Asked whether being able to see their Authorship status mattered: “It’s good to see the green check. It just helps give confidence that the system and process are working to verify writing.”
At Cursive, our goal is to create something that’s not only student-friendly but genuinely beneficial to students.
Our work in user testing and gathering feedback from students and teachers will never end, but we’re already excited by what we’re learning. We look forward to each and every conversation. We’re actively testing and talking with students, and we continue to build transparency directly into the authorship tool we’ve created.