A “failed” experiment (Ross & Vallée-Tourangeau, 2021) tried to reveal the role played by materiality in solving an insight problem that made reference to embodied action, leading to valuable insights about the nature of cognition and the experimental method. In this commentary, we argue that this study reveals various forms of interactivity and brings new evidence against the idea that “pure” cognition can be isolated from either materiality or sociality. The question becomes, then, not whether the use of objects helps or hinders problem solving, but how objects, bodies, and other people participate in it, even in controlled lab settings, and to what effect. Reflections are offered on why and how cognition stays wild (i.e., embodied, dialogical, and surprising) and what this means for experimental work.
Keywords: cognition, problem-solving, interactivity, materiality, sociality, experiments
Ross and Vallée-Tourangeau (2021) distinguish between first-order (hands-on and interactive) and second-order (cognitive and abstract) problem solving and set out to experimentally demonstrate the neglected superiority of first-order problem solving for certain creative tasks. To this end, they use the socks problem: “If you have a drawer with brown socks and black socks mixed in a ratio of 4:5, how many would you have to pull out in order to guarantee a pair?” They expected, with good reason, that participants who interacted with socks would be better at solving the problem than participants who only imagined interacting with socks. However, their results revealed an inconclusively small difference between the conditions in terms of the number of participants who solved the problem and the time taken to do so.
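For readers unfamiliar with the problem, the answer follows from the pigeonhole principle rather than from the 4:5 ratio, which is a distractor: with only two colours in the drawer, any three socks must include a matching pair. A minimal brute-force check (our own illustration, not part of the original study) makes the logic explicit:

```python
from itertools import product

def guarantees_pair(n_pulls, colors=("brown", "black")):
    """True if every possible colour sequence of n_pulls draws
    contains at least one repeated colour (i.e., a matching pair)."""
    return all(len(set(seq)) < len(seq)
               for seq in product(colors, repeat=n_pulls))

print(guarantees_pair(2))  # False: one brown plus one black is possible
print(guarantees_pair(3))  # True: three pulls must repeat a colour
```

Two pulls can still yield one sock of each colour, so three pulls is the smallest number that guarantees a pair, whatever the ratio of colours in the drawer.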
The finding of this ostensibly failed experiment is insightful and richly developed by Ross and Vallée-Tourangeau. They note, with the benefit of hindsight, that they had been overly focused on interactivity with the socks while failing to consider the multiple other dimensions of interactivity within the research design: interactivity with the instructions, internal dialogical interactivity, interactivity with the researcher, and interactivity with any available resources. Indeed, even the interactivity of participants who had socks was uncontrollable, with some upending the sock bag and getting distracted by counting the socks. They conclude that interactivity permeated both conditions, and this realization has broad implications for experimental research that tries to control interactivity. Interactivity, it seems, is so fundamental to humans that it cannot be experimentally isolated.
To further illustrate the point, there is yet another layer of interactivity that was not considered by Ross and Vallée-Tourangeau, but which is evident in their data, namely the larger dialogical context of what it means for participants to be in a “psychology experiment.” Few contemporary participants are taking part in their first experiment, and few have not heard about experiments with confederates, deceptions, and creative twists. Thus, another dimension of interactivity is that participants know that the experiment is a game and that the experimenter may have hidden motives. They expect a trick, and this makes even simple instructions puzzling. We see this in their data. Participant 41, five minutes into the task, asks: “so I have to pull out a pair?” (5:21). Participant 26, nearly two minutes into the task, states: “I’m so confused by the question” (1:45). Yet, the instructions given to participants were ostensibly straightforward. We suggest that what these participants are confused about is what they are “meant” to do (i.e., what the experimental context around the text of the instruction is). Participants are thus in a mental dialogue with the genre of the experiment. They are unsure what the experimenters are looking for and whether the question itself is a trick. In short, the experimental situation has layers of interactivity that go far beyond the situation of waking up in the morning and trying to find a matching pair of socks in a sock drawer.
This pervasive interactivity within the experiment brings us to the “rewilding” metaphor used in the article. The authors’ choice to introduce materiality (intentionally) and sociality (unintentionally) into the experimental design is the equivalent of opening the door to the “wilder,” harder-to-measure-and-control elements of cognition. Reminiscent of older notions of “cognition in the wild” (Hutchins, 1995), where sanitized laboratories are replaced by messy, interactive settings of action and interaction, the use of “wild” here and elsewhere raises a legitimate question: did we ever manage to domesticate cognition in the first place? Did we achieve experimental designs that can, at will, cut off the mind from body, others, institutions, and culture (to name but a few “unruly” elements) in order to study “pure” thought (or what Vallée-Tourangeau and March, 2020, called second-order problem solving)? Decisively, no. We can certainly create methodologies that foreground abstract thought and place bodily movement, object manipulation, and dialogue with others into the background. Still, the influence of the latter can never be discounted. The fact that participants don’t visibly use their hands to tinker with objects doesn’t mean that they are any less embodied or that material forms of engagement have no part to play in their ongoing mental operations. Similarly, being left alone to solve a task doesn’t make that particular moment or context asocial. Inner forms of dialogicality are ever-present; the other is always there, just as bodies are. The aim, then, is not to compare “wild” and “tame” forms of cognition, “muddied” and “pure,” distributed and internal, in terms of participants’ performance in an experiment. A more interesting question is the one Ross and Vallée-Tourangeau raise: how exactly do materiality and sociality participate in cognition (here, problem solving) in ways that either accelerate or hinder performance?
Sociality, we suggest, contributes to creativity by expanding possibility through the introduction of alternative perspectives (Glăveanu, 2020; Zittoun & Gillespie, 2015). While creativity tasks are often based on divergent thinking, insight problems call for a better balance between divergent and convergent thinking. They also depend on social-psychological processes like perspective-taking (Glăveanu, 2015) that connect “creators” to their audiences. In the context of the study, the most direct audience was the experimenter, and as such it is not surprising that participants tried to propose, clarify, or specify their perspectives directly to her. This reaching out to the other not only serves the purpose of arriving at the most suitable outcome; it is also an engine for creative ideation. Creative insight is not the personal or internal moment most imagine it to be. It is, in fact, the result of past and present dialogical experiences in which the perspectives of the self are placed in dialogue with those of others (and the more diverse these others are, the more likely participants are to reach creative solutions; Gassmann, 2001). The experiment reported by Ross and Vallée-Tourangeau was not designed to allow for creative outcomes but to end with a numeric answer, the number of pulls needed to reach a pair. It is interesting to imagine whether allowing participants a more playful and social approach to the problem (e.g., being in dialogue with other participants or the experimenter) might itself lead to more creatively productive outcomes.
The pervasive interactivity across both experimental conditions does not mean that Ross and Vallée-Tourangeau’s initial guiding research question was mistaken. There is considerable evidence for humans having two types of cognition, as reflected in the many “dual-process” models of cognition. It is evident in Vygotsky and Luria’s (1994) distinction between the mental functions humans share with other primates and the symbolically mediated mental functions that seem peculiar to humans. And, more recently, it is evident in the distinction between thinking fast and slow (Kahneman, 2011). These two modes of thought seem pervasive in human cognition, not just in creative problem solving. However, this does not mean they can be experimentally separated with ease. A mind capable of both first- and second-order problem solving is likely to leverage both within any real problem-solving episode (why limit oneself to only one cognitive process?). The creative mind is characterized by movement within thought and between styles of thought (Gillespie & Zittoun, 2013). In the first-order condition, participants may close their eyes to conceptualize the problem abstractly. In the second-order condition, participants may simulate the socks using pen and paper or their mind’s eye. In short, in cognition as it naturally occurs in the wild, humans likely oscillate between these two modes and perhaps even engage in both simultaneously; it is this overlap of cognitive modes that holds the key to creative cognition.
Interestingly, this idea, that creative cognition stems from moving between concrete and abstract modes of thought, is evidenced by Ross and Vallée-Tourangeau’s own methodology. They begin with an abstract quantitative analysis, with conditions differentiated by numbers. However, in the qualitative analysis, they engage with the concrete particulars: just like participants who had access to the bag of socks, they opened up the experiment to have a look inside and see what was actually going on. This leads them to observe interactivity in the so-called non-interactive condition. Participants talked to the experimenter, asked questions about the question, and pitched answers to see the response. These concrete observations were then integrated back into a more abstract understanding of rewilding cognition, not just in their own experiment but in experiments more generally. Thus, Ross and Vallée-Tourangeau arguably engaged in both first- and second-order problem solving in their own creative analysis.
One of the innovative aspects of the article, to which Ross and Vallée-Tourangeau only make passing reference, is the embedding of qualitative analysis within an experimental design. It enables them to be concrete about what went on in the experiment (analyzing videos) while also being more abstract (assessing statistical differences). In the nomenclature of mixed methods research, they used a peculiar variant of an explanatory sequential design (Creswell & Creswell, 2018). Such designs start with a surprising quantitative finding and then use qualitative research to generate plausible explanations. Ross and Vallée-Tourangeau’s variant of this design is peculiar because the quantitative and qualitative parts of the analysis pertain to the same events (participant behavior in the experiment), whereas mixed methods sequential designs normally use separate data sources (e.g., a survey followed by an interview). Combining qualitative methods within the quantitative experiment provides two different lenses on the same behaviors, enabling each to elucidate the other. This qualitative-quantitative analysis of behaviors within the experiment enabled them to zoom out to identify a surprising lack of difference between conditions and then zoom in to describe what participants were actually doing.
What the present study and several others (e.g., Glăveanu et al., 2019; Hawlina et al., 2019) show us is that rethinking experiments in psychology, and in other disciplines, remains an urgent task. Video-recording participants in experimental studies, or conducting post-experiment interviews or focus groups, should be the rule in this kind of research, not the exception. The possibility of collecting qualitative data within research designs that are meant to be quantitative and focused on measurement should be seen as a valuable opportunity by experimentalists. Detailed data about the actions, interactions, and beliefs of one’s participants should not be analyzed only when the statistical analysis fails to offer the desired finding; in most cases, they should become a source of deeper reflection about the phenomenon under study and about the participants being studied. Of course, there are pragmatic reasons for not overloading researchers with collecting, and especially analyzing, datasets that on the surface seem highly distinct. But, as Ross and Vallée-Tourangeau’s research shows, these are not even necessarily different datasets; they can be part of a single account of the course of action prompted by the experiment, the origin of both quantification and qualitative forms of interpretation. Adding cameras or making room for interviews within a traditional experiment need not disturb standardization or reduce control. On the contrary, these additions can offer precious insights into how and what actually happened, above and beyond the narrow measurement of changes in a few pre-selected dependent variables (see also Glăveanu et al., 2019).
Moscovici (1991) lamented the separation between experiments and experience, even though experiments in psychology necessarily entail manipulations of participants’ experience. Too rarely have experimenters “opened the black box” of the experiment to examine what is actually going on (Corti et al., 2015; Psaltis & Duveen, 2007). Ross and Vallée-Tourangeau’s use of qualitative methods within the experiment is an exciting illustration of the insights that can be obtained by studying what goes on in experiments, by moving between the abstract, statistical type of analysis and the more concrete, grounded qualitative type of analysis. Ironically, their delving into the concrete particulars of their own experiment to examine what happened demonstrates the very point they failed to establish experimentally, namely, that first-order concrete thinking can fuel insight and problem solving. It is also a vivid reminder that lab-based cognition doesn’t become “wild” whenever the door is opened to the physical manipulation of objects or the possibility of dialogue with the experimenter. It stays as wildly embodied, dialogical, and surprising as it ever was.
Corti, K., Reddy, G., Choi, E., & Gillespie, A. (2015). The researcher as experimental subject: Using self-experimentation to access experiences, understand social phenomena, and stimulate reflexivity. Integrative Psychological and Behavioral Science, 49(2), 288–308. https://doi.org/10.1007/s12124-015-9294-6
Creswell, J. W., & Creswell, J. D. (2018). Research design: Qualitative, quantitative, and mixed methods approaches. Sage.
Gassmann, O. (2001). Multicultural teams: Increasing creativity and innovation by diversity. Creativity and Innovation Management, 10(2), 88–95. https://doi.org/10.1111/1467-8691.00206
Gillespie, A., & Zittoun, T. (2013). Meaning making in motion: Bodies and minds moving through institutional and semiotic structures. Culture & Psychology, 19(4), 518–532. https://doi.org/10.1177/1354067x13500325
Glăveanu, V. P. (2015). Creativity as a sociocultural act. Journal of Creative Behavior, 49(3), 165–180. https://doi.org/10.1002/jocb.94
Glăveanu, V. P. (2020). The possible: A sociocultural theory. Oxford University Press.
Glăveanu, V. P., Gillespie, A., & Karwowski, M. (2019). Are people working together inclined towards practicality? A process analysis of creative ideation in individuals and dyads. Psychology of Aesthetics, Creativity, and the Arts, 13(4), 388–401. https://doi.org/10.1037/aca0000171
Hawlina, H., Gillespie, A., & Zittoun, T. (2019). Difficult differences: A socio-cultural analysis of how diversity can enable and inhibit creativity. The Journal of Creative Behavior, 53(2), 133–144. https://doi.org/10.1002/jocb.182
Hutchins, E. (1995). Cognition in the wild. MIT Press.
Kahneman, D. (2011). Thinking, fast and slow. Macmillan.
Moscovici, S. (1991). Experiment and experience: An intermediate step from Sherif to Asch. Journal for the Theory of Social Behaviour, 21(3), 253–268. https://doi.org/10.1111/j.1468-5914.1991.tb00197.x
Psaltis, C., & Duveen, G. (2007). Conservation and conversation types: Forms of recognition and cognitive development. British Journal of Developmental Psychology, 25(1), 79–102. https://doi.org/10.1348/026151005x91415
Ross, W., & Vallée-Tourangeau, F. (2021). Rewilding cognition: Complex dynamics in open experimental systems. Journal of Trial and Error. https://doi.org/10.36850/e4
Vallée-Tourangeau, F., & March, P. L. (2020). Insight out: Making creativity visible. The Journal of Creative Behavior, 54(4), 824–842. https://doi.org/10.1002/jocb.409
Vygotsky, L. S., & Luria, A. (1994). Tool and symbol in child development. In R. van der Veer & J. Valsiner (Eds.), The Vygotsky Reader (pp. 99–174). Blackwell.
Zittoun, T., & Gillespie, A. (2015). Imagination in human and cultural development. Routledge.