I hope I do rather better on the note taking here than I did capturing the discussion at Cloud 3 in May!
John Griffiths (that's me) led off with a presentation you will find in the Scriptorium - Communities of Interpretation. In summary it holds that analysis and interpretation are the last bastion of added value that market researchers can bring to the party, and that this value can be amplified by building networks of researchers at the analysis and interpretation stage, so that each node of the network is informed by the perspectives of the other nodes.
Discussion afterwards focused on how such a network could be balanced. Wasn't it the case that it would reflect every bias? And couldn't a senior researcher or project leader influence the network out of shape? Other concerns were about the need for every project to have an editor with the final say. John Griffiths argued that qual research has been constrained in scale because the level of complexity could not exceed the ability of a single mind to contain it. A community of interpretation might break that barrier - which is necessary, since online projects involve collecting so much more data from so many more sources.
Joanna Chrzanowska wanted to know how emergent ideas would be captured and recognised by the network if each researcher was preoccupied with an existing theme or audience they had been briefed to represent.
We couldn't continue this discussion because we needed to make room for Annelies Verhaeghe of Insites, who took us through a project on crowdsourced analysis which is still being analysed. I hope the presentation will be posted very soon in the Scriptorium. In summary, bloggers were asked to provide images of what they perceived to be cool at a music festival they were attending. Researchers, marketing experts and four different types of crowd were then given the task of evaluating these images and providing perceptions of their own. The bloggers then graded these contributions in terms of the insights they generated. The four types of crowd included those who were at the festival and those who were not, and those who knew the bloggers and those who did not. The results showed that crowds appeared to be a better source of insight, and that the most fruitful crowd was one familiar with the context (ie present at the festival) but unfamiliar with the blogger (at several degrees of separation). A fascinating paper which has given Insites a way to increase insight generation using crowds by (they claim) 200%!
Discussion after this paper focused on how applicable the method would be to different products - not all as involving as music festivals! It also covered the different role of researchers as facilitators rather than insight generators themselves, and the power of using outsiders rather than insiders to evaluate ideas. Ultimately it still depends on the ability of the client organisation to implement these insights and monetise them.
That's as good a summary as I can muster at the moment. But I hope it shows that Cloud of Knowing is great at introducing original research ideas, and even better at providing a small-scale environment where those ideas can be explored and challenged - in a way conferences just aren't able to do at the moment.