-
Quoting the meta-analysis from Sischka et al. (2020), page 14:
Kmetty & Stefkovics (2022) report that there is a difference between an explicit non-responsive answer option and a skip option (p. 667):
This suggests that there is a need to consider the design of available experiments with regard to the replication of human refusal and dropout behavior. If it is desirable to build such a feature, then I reckon that a good place to start would be:
It would then be necessary to test whether real-world refusal rates are more closely mirrored by making include_comment mandatory or optional, across a variety of models and agent demographic types, and whether inserting an instruction that the mandatory/optional comment will not be shared with the researchers makes aggregate response rates more or less human-aligned (see the sketch below). This would be similar to the <thinking> tag that was used to such extraordinary effect in Meinke et al. (2024, preprint).
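For concreteness, here is a minimal sketch of how such a comparison might be set up. It assumes an EDSL-style QuestionMultipleChoice class and the include_comment parameter referenced above; since a mandatory comment is not (per this discussion) currently available, the sketch varies include_comment on/off plus a privacy note as a rough proxy. Class names, defaults, and run mechanics are illustrative assumptions, not the library's confirmed API.

```python
# Hedged sketch: three conditions for a refusal-rate comparison, assuming an
# EDSL-style QuestionMultipleChoice with the include_comment parameter
# discussed above. Names and defaults are illustrative, not confirmed API.
from edsl import QuestionMultipleChoice

BASE_TEXT = "How satisfied are you with your current job?"
OPTIONS = [
    "Very satisfied",
    "Somewhat satisfied",
    "Somewhat dissatisfied",
    "Very dissatisfied",
    "Prefer not to answer",  # explicit refusal option used to measure refusal rates
]

def build_condition(name: str, include_comment: bool, private_note: bool):
    """Build one experimental condition: toggle the comment field and,
    optionally, tell the agent the comment is not shared with researchers."""
    text = BASE_TEXT
    if private_note:
        text += " (Any comment you provide will not be shared with the researchers.)"
    return QuestionMultipleChoice(
        question_name=f"job_satisfaction_{name}",
        question_text=text,
        question_options=OPTIONS,
        include_comment=include_comment,
    )

# Each condition would be administered across a grid of models and agent
# personas, and the 'Prefer not to answer' rates compared against published
# human refusal rates.
conditions = {
    "comment_enabled": build_condition("comment", include_comment=True, private_note=False),
    "comment_disabled": build_condition("no_comment", include_comment=False, private_note=False),
    "comment_private": build_condition("private", include_comment=True, private_note=True),
}
```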
-
Thank you @iwr-redmond! This is really interesting and helpful. I'd love to include some guidance in the docs around any options we add, and call out these areas for investigation.
-
For example, a user may want to allow a non-responsive answer such as "I don't know" to a numerical question.
Currently, you can use the optional question parameter answering_instructions (or the question text directly) to instruct the agent to return a specific response that will be validated for the question type (e.g., "Return '9999' if unknown."), but this is not ideal because the format is constrained; a sketch follows below. This feature is probably unnecessary for multiple choice, checkbox, or linear scale (labeled option) questions. But it could also be useful for free text questions to avoid any extraction steps.
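For concreteness, here is a minimal sketch of that current workaround, assuming an EDSL-style QuestionNumerical class that accepts the answering_instructions parameter mentioned above; the sentinel value and the cleanup helper are hypothetical and only illustrate why encoding non-response inside the validated numeric format is awkward.

```python
# Hedged sketch of the workaround described above: a numerical question that
# tolerates non-response via a reserved sentinel value. Class and parameter
# names follow the discussion above; exact API details may differ.
from edsl import QuestionNumerical

age_question = QuestionNumerical(
    question_name="respondent_age",
    question_text="How old are you, in years?",
    answering_instructions=(
        "If you do not know or prefer not to say, return '9999' instead of a real age."
    ),
)

# Downstream, the sentinel must be filtered out before analysis, which is the
# constraint noted above: the refusal is hidden inside valid numeric data
# rather than recorded as a first-class skip/non-response.
SENTINEL = 9999

def clean_ages(raw_answers):
    """Drop sentinel non-responses and return the usable numeric answers."""
    return [a for a in raw_answers if a != SENTINEL]
```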