TL;DR: Beginners in user need research struggle with the openness of possible research outcomes.
User need research is used to discover the motivations, activities, and problems of potential users. For this, the researcher often has a conversation (an »interview«) with participants and asks them questions about their experiences:
Researcher: »Can you describe how you use your smartphone when you are on the go?« or even: »Can you describe how you spend your time in transit?«
asking closed questions
Asking such questions seems to be tough for beginners. The designers and programmers I taught usually struggled with this way of asking questions. A typical first attempt sounds more like »Do you play games when you use your smartphone on the go?«. This question is closed: its answer is predictably either »yes« or »no«. In contrast, an open question like the initial ones can lead to an almost infinite variety of answers.
grouping by surface similarity
When the gathered data is analyzed, other problems emerge: beginners seem to prefer grouping data by surface similarity, that is, by the same words being mentioned or the same objects being referred to. It is much harder to group by possible concepts and meanings, but doing so leads to principles like »Being on the go is a time-out for me«, which are very useful in design.
embracing openness is hard
In general, it seems to be hard for beginners to embrace situations in which the outcome is open or ambiguous: the answer to an open question may be almost anything, and the results of a concept analysis may be in flux, changing as additional data is acquired. The result is not right or wrong but »just« improving.
Why is embracing the openness in user research hard? Two possible reasons:
1. Not knowing what will happen
If the outcome of asking questions and analyzing data is open, it is hard to (mentally) prepare for what might happen.
This may be particularly tough for designers and developers. The result they aim for is an implementation. What they find out in research may or may not require (big) changes to what they already have in mind, and throwing away work does not feel great. As a proponent of Human-Centered Design you may argue that it is simply wrong to think of implementations while still researching needs. But it is a known behavior (Sharp et al., 2013; see this post), and overcoming it is probably hard.
2. The one right information
In quantitative (statistical) research you put up a hypothesis and show that you tried to falsify it. If you do not succeed, the hypothesis is corroborated. There ought to be a clear-cut criterion, like p < 0.05: if the value is above the threshold you reject your hypothesis; if it is below, you accept it. You, as a researcher, should not influence the outcome.
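The quantitative workflow described above can be sketched in a few lines of code. This is an illustrative example only, not part of the original argument: the task-completion times and the permutation test are assumptions chosen to show the fixed decision rule at p < 0.05.

```python
# A minimal sketch of the quantitative, falsification-style workflow:
# a fixed hypothesis, one test, and a hard cut-off at p < 0.05.
# Data and test choice are illustrative assumptions.
import random

random.seed(0)

def permutation_test_p(group_a, group_b, n_permutations=10_000):
    """Two-sided permutation test for a difference in means."""
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    count = 0
    for _ in range(n_permutations):
        random.shuffle(pooled)
        a, b = pooled[:len(group_a)], pooled[len(group_a):]
        diff = abs(sum(a) / len(a) - sum(b) / len(b))
        if diff >= observed:
            count += 1
    return count / n_permutations

# Hypothetical task-completion times (seconds) for two interface variants.
variant_a = [12.1, 11.4, 13.0, 12.6, 11.9, 12.3]
variant_b = [14.2, 13.8, 14.9, 13.5, 14.1, 14.6]

p = permutation_test_p(variant_a, variant_b)
decision = "reject H0" if p < 0.05 else "retain H0"
```

Note how the researcher's judgment is confined to choices made before the data arrives (the hypothesis, the test, the threshold); once the p-value is computed, the decision is mechanical. This is exactly the kind of closed, one-shot outcome that the qualitative approach described next does not share.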
In qualitative, need-finding research, this is different. Instead of avoiding any individual influence by the researcher, his or her interpretations play a vital role. And instead of a one-shot approve-or-reject, you may iterate, refining ideas, hypotheses, and their rejection or corroboration over the course of your studies. It is fine to be surprised: in a quantitative study, wondering why something happened can be a sign that you are in trouble; in a qualitative study, it is a sign that you are onto something.
The idea of a research method that is empirical but not deductive is hard to grasp, since it runs contrary to the idea of research many beginners bring along: research ought to be precise, »right«, and quantifiable. Integrating another paradigm into this view of research is not easy.
Human-Centered need-finding research poses difficulties for beginners. One could blame a »wrong« attitude, education, or prejudices, but doing so will not help improve future designs. Understanding the problems makes it possible to address them more productively.
- Hein, Serge F. "'I Don't Like Ambiguity': An Exploration of Students' Experiences During a Qualitative Methods Course." Alberta Journal of Educational Research 50.1 (2004): 22-38.
- Sharp, Helen, et al. "A protocol study of novice interaction design behaviour in Botswana: solution-driven interaction design." Proceedings of the 27th International BCS Human Computer Interaction Conference. British Computer Society, 2013.