If It’s Safe, Don’t Ask: Decreasing Frustration through User Involvement for Risky Robot Behaviors

Leusmann, Jan and Schömbs, Sarah and Diedrich, Maximilian and Müller, Florian

Abstract: Human–robot collaboration faces a speed–accuracy trade-off (SAT): higher speed lowers latency but increases errors, while lower speed improves accuracy but extends waiting time. Both pathways can frustrate users in research and real-world deployments. Despite this importance, the impact of the SAT on frustration, and how to mitigate it, remains underexplored. We conducted a user study (N = 24) in which participants collaborated with a robot in an assembly task. We investigated three levels of SAT (conservative, moderate, risky) and examined how uncertainty communication and offering decision autonomy affect user frustration. Our results show that user frustration is highest for risky robot behavior. User involvement decreased frustration for risky behaviors but increased it for conservative ones, while verbal uncertainty communication had no effect. We further found that the perceived transparency, agency, intelligence, and utility of the robot increase with conservative SAT, while user workload decreases. We propose that user involvement is advisable in higher-risk settings to mitigate user frustration, whereas autonomous operation is preferable in lower-risk scenarios.