This method is a more traditional marketing approach to gathering evidence. You bring customers or people from your target audience into a room and get their opinions on existing or new products. It promises the chance to easily understand what your customers want in one session, and gives you a chance to probe them on the areas you are interested in.
Ideally a focus group will provide you with feedback and reasoning, so you can go away and address the concerns. They may even give you quotes that describe how much they like your product, which you can use to drive marketing.
However, when used poorly, a focus group can become the justification for sweeping assumptions and overconfidence, based on a few throwaway comments. Focus groups can be a good place to start your research and help direct it, but they shouldn’t represent the only evidence you find.
This is another method where recruiting the right participants matters. You want people who are either your actual audience or who match them.
If you already have personas defined then it’s worth trying to get representatives from each of those groupings, rather than populating your group with just one type. If everyone is too similar you’ll potentially hear only a chorus of identical feedback.
Just like when interviewing or user testing, it’s important to write some kind of script or discussion guide, which captures the questions you want to ask. You don’t need to stick rigidly to it—part of the benefit of focus groups is letting the group evolve the discussion to areas you hadn’t thought of—but it’s there as a structure to fall back on.
Like interviews or user tests it’s a good idea to have a couple of people facilitate: one can talk and engage people while another takes notes and records. If you’ve not run any focus groups yourself before it can be worth getting an independent agency with a good track record to do so (they’ll help you avoid the mistakes outlined below).
The feedback you get from a focus group shouldn’t represent the end of your evidence-gathering, as it’s easy for them to come to skewed conclusions. It is better to take the outcomes (particularly any insightful comments) and use these as starting points for further research.
One reason to be wary of focus groups is that the results can be so easily manipulated, either by biased facilitators—who have a vested interest in the product being successful—or by the loudest voice in the room. The problem of biased facilitators can appear in other forms of research too, but it’s made worse in focus groups by the power of ‘groupthink’: once an idea is suggested to the group it can spread quickly, so the group ends up repeating back what they’ve been told.
The biased-facilitator phenomenon is shown well on the TV show The Apprentice, where focus groups are the only user research method used in product-creation tasks (and the teams often proceed to ignore what people say anyway).
Groupthink also means people don’t always behave honestly. If one member of the group loudly declares that she dislikes something, quieter members who think the opposite may agree or stay silent to avoid conflict, or for fear of looking silly.
The group can also get sidetracked by one or two people’s opinions taking up all the time, and the session can run out before everyone has had a say. You can miss out on the nuanced thoughts of some people—thoughts you would be able to dig into when interviewing individually.
Another big problem with focus groups is that you’re placing a lot of weight on what people say rather than what they do—two things that are often quite different. This is illustrated well by the classic yellow Sony Walkman story, which is worth reading up on if you haven’t heard it.
In customer interviews you should try to ask people about actual behaviour but this can be harder to do in a group setting, where people can say things to impress others or to match the group consensus.
The focus group is potentially a dangerous beast, and not something I come across much today in tech product decision-making. It used to be the preserve of big corporations, and occasionally a client would present focus group feedback as their main body of research.
The downsides can of course be avoided, but doing so requires careful moderation and careful analysis afterwards, and there just aren’t many people in the digital/startup space with that experience.
This isn’t a method that requires lots of tools and software—just a way of recording the session and reporting on it afterwards. You can run remote focus groups in online chat software like Slack; I’ve done this for new idea development.
The same rules apply online as in person. You tend to need to marshal participants more to keep them on topic, but the format gives less vocal people a chance to be heard, and you have the benefit of a written transcript to analyse at the end.
A focus group itself should last 1–2 hours to keep it, er, focussed. Set-up and analysis require a few days.