Research In Action
It’s not just what you say, but how you say it, that matters. This may be especially true for written feedback, which someone can review repeatedly. Even so, there has been little research into the design of performance feedback reports for young drivers.
With some driving simulation systems, drivers can get an automated performance feedback report. This is particularly important for novice drivers, who are at their highest lifetime crash risk. If these reports are effective, they could be used to supplement driver education and training and improve novice driver safety.
How Can We Make These Feedback Reports Effective?
This led us to wonder what design features resulted in a feedback report being easy to use and understand while motivating for improvement. Given the lack of research in this area, our team of researchers conducted an online study of report design features.
Working with Dr. Ellen Peters, an academic expert in science communication, we knew we had to keep participants’ “number literacy” in mind, because it can affect how information is perceived. So, we developed variations of a feedback report that tested the impacts of:
- Performance summary (present or absent)
- Action plan length (short 4-item plan or long 8-item plan)
- Action plan grading method (numeric grade, letter grade, or combination)
- Action plan order (best-to-worst performance, worst-to-best performance, or by item importance)
- Peer comparison (present or absent)
We tested these report variations with over 500 young adults (ages 18 to 25) and asked how easy each report was to use and understand, and whether it motivated them to improve their driving skills. Overall, we found:
- A summary motivated them to improve their driving skills.
- A short focused action plan increased their understanding of the report.
- A short action plan on its own made the report harder to use, but pairing it with a summary made it the easiest to use.
- Action plan items ranked from worst to best performance were the easiest to understand.
- Reporting grades as numbers only made the report easiest to use.
- Peer comparison did not significantly affect participants’ motivation to improve.
- Participants with greater number literacy found the reports easier to use.
A Design Plan for the Future
Taken together, the findings recommend a design that pairs a performance summary with a short action plan, producing an easy-to-use, easy-to-understand feedback report that motivates young drivers to improve their skills. Listing the main areas for improvement first and grading with numbers also appeared most useful.
The study did have some limitations, including a small sample size, limited age range, and use of reports that were not personalized. Additionally, future replications could use standardized measures, such as the System Usability Scale, instead of ones created specifically for the study.
We hope this research helps others create effective feedback reports, and that testing additional features and age groups will continue to improve them.