Customer experience and energy product agile insights

GroupQuality was contracted by a data-driven creative agency to use its agile online customer experience community discussion to track the on-boarding, usage and effectiveness of an energy prepayment product. The research project aimed to obtain direct feedback from a sample of customers and walk in their shoes as they experienced the product for the first time, before the market roll-out. The project needed to identify the technical and user experience issues and process risks with the roll-out of the new mobile-enabled product (risks could include app issues, communication issues and hardware roll-out issues). Critically, the project needed to monitor the customer journey and experience to understand the resource demands for product on-boarding, communication and ongoing support. Any insights captured from the feedback and structured questioning would be used to improve the product features, the process and day-to-day usage.

Target market

Based on the initial brief, we recommended one online, over-time, seven-to-eight-week GroupQuality agile community discussion board across the following targets:

- Location of the customer group: Tasmania, Australia.
- Current customers who already have the old energy product installed in their home and are being transitioned to the new prepayment energy product.
- A targeted representative sample of the target audience.
- Age and gender split across the sample.
- Total customer engagement: 12.

Methodology

To meet your objectives for the project, we proposed the following:

- An eight-week discussion to track the customer experience from pre-invitation to day-to-day product usage.
- A GroupQuality-moderated online discussion to track, monitor and listen to the customers’ experiences.

At each stage of the customer journey, many topics were...

7 recruitment tips for online qualitative research

Free participant recruitment guide! The key to successful participant recruitment for online qualitative research projects begins with the first contact: being able to clearly articulate what you expect of participants and how you will reward them. Recruiting for online research is different from traditional recruitment methods and requires an agile, proven online approach, as well as an understanding of how to engage participants once they enter the group. Recruited participants need to be able to jump on board and learn how to use the online technology while participating, without having to be trained in the software before the group begins. A requirement for participant training only adds to the cost of recruitment and creates unwanted friction before the project has even started.

7 tips for online research recruitment:

1. Don’t be afraid to inform – include a summary of the purpose of the online focus group or discussion. To do this you need to understand what you are asking participants to do and what value their contribution will make. The incentive reward is vital, but you will get improved buy-in if you can help participants identify with the subject matter. We all like to be valued, and the same goes for individuals’ opinions. If people feel their views are appreciated, they will be more likely to log into your group after a long day’s work.

2. Set clear expectations and confirm – include a clear list summarising what people can expect when agreeing to participate in an online focus or community discussion group. A confirmation email should outline what a participant can expect on...

Ten tips for online qualitative research discussions

OK, so you think you are ready for your online market or social research community discussion? You’re full of nervous anticipation, and you are excited about what’s coming next. Your expectations are high, and you have relied on your knowledge and experience to get you this far. There is only one problem: you have left everything to the last minute, assuming the online technology will automate everything for you. But you quickly discover there are new processes to learn, and participant responses are being influenced by the way you have visually presented your online discussion. You hear yourself say, “If only I had known these things before…!”

There is no doubt that advances in technology have improved the process of managing online qualitative market and social research projects, but it’s important to remember there is an inverse relationship between the time you invest in preparing for an online project and the effort required to capture the data. It is true that online qualitative research projects demand learning new skills, and even relearning some old ones, but the time you spend preparing is rewarded ten-fold by the insights you gather at the end of the process. Your online market or social research project will only be as good as the information you put into it. To repeat a cliché: garbage in, garbage out! Whether you are conducting online market research or evaluating the success of a national program, a clear understanding that you need to invest the time to learn the tools and become familiar with the process will ensure your online project is a success and delivers...

How to invite customers to an Agile Insights Micro Community

We often get asked: “What is the best way to invite a particular audience to a three-day agile online insights community discussion?” It depends on the kind of micro community group you are running, but here is one process that works pretty much all the time. When you invite participants to take part in an agile online micro community discussion, it pays to keep it simple! A micro insights community is a small community of 15 to 30 people who come together to talk about particular topics over days or weeks.

From the very beginning, assume your targeted participants don’t know anything about the platform or process you are using. This doesn’t mean sending them a very long and laborious email detailing every aspect of the online research method. A long-winded explanation only serves to create negative expectations, and it ultimately affects the community context and discussion participation rate.

To get an online community discussion up and running, follow this three-step process (I am assuming here that you have already screened participants through some kind of recruitment process – a topic for another day):

1. Make contact with participants before sending out the actual invitation to the discussion; this can be done by email or by good old-fashioned telephone. I can’t tell you how many times we have seen discussion boards started where participants have no idea why they are receiving an invitation email to log into a forum. It only leads to confusion, frustration and, in some cases, spam complaints for your organisation.

2. Send participants the invitation email from the agile...
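As a rough illustration of the invitation step, here is a minimal Python sketch of a helper that composes a short, plain invitation email with the participant’s login details. The field names, URL and wording are hypothetical examples, not the GroupQuality platform’s actual template – the point is simply how brief a “keep it simple” invitation can be:

```python
def build_invitation(name, discussion_url, username, password, moderator):
    """Compose a short invitation email for a micro community participant.

    Assumes the participant was already contacted (step 1), so the email
    only needs a reminder of context plus login details - nothing more.
    All names and the URL below are illustrative placeholders.
    """
    subject = f"Your invitation to our online discussion, {name}"
    body = (
        f"Hi {name},\n\n"
        "Thanks for agreeing to take part in our online discussion, "
        "as mentioned when we contacted you earlier.\n\n"
        f"Log in here: {discussion_url}\n"
        f"Username: {username}\n"
        f"Password: {password}\n\n"
        "No training or preparation is needed - just log in and "
        "follow the prompts.\n\n"
        f"Kind regards,\n{moderator}"
    )
    return subject, body


# Example usage with placeholder details:
subject, body = build_invitation(
    "Sam", "https://example.com/discussion/123", "sam01", "w1nter-sky", "Alex"
)
print(subject)
```

Keeping the body to a greeting, a one-line reminder of context and the login details mirrors the advice above: anything longer starts to look like the laborious explanation that puts participants off.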

Case Study: Energy industry agile community

The client: “TasNetworks commenced operations on 1 July 2014. It has been formed by a merger between Aurora Energy’s distribution network (the poles and wires) and Transend Networks (the big towers and lines). TasNetworks is a Tasmanian state-owned corporation that supplies power from the generation source to homes and businesses through a network of transmission towers, substations and powerlines.”

Objectives of the discussion focus group research project: To use an agile online discussion focus group to capture customer attitudes and preferences towards new product ideas and communication, in the context of an electricity network employee needing to access a customer’s property and home.

Identified target market: Sixteen electricity-metered residential customers were recruited from a variety of rural and urban locations. A proportion of those customers had solar installed, and all customers were on either dial or digital tariff electricity meters. There was a mix of ages and sexes, from the early 20s through to 60+ years of age.

Method: A secure GroupQuality online insights community discussion board (an over-time focus group) professionally moderated by us. Respondents were recruited and screened, invited to participate, and provided with a unique username and password. The discussion occurred over a five-day period, and activity took place in a structured online discussion using an approved, scheduled discussion guide. The guide included questions, images and documents to be reviewed and analysed. The discussion guide was carefully structured for the online environment to ensure the most efficient use of the participants’ time. Participants logged in at a time convenient to them each day – early in the morning, late afternoon or early evening. Each day...

Online surveys & discussions work hand in hand!

An online survey doesn’t always give you the depth of insight you might get from sitting down and having a two-way conversation with a person from your target audience. There is no doubt that online surveys are an easy way of facilitating a one-way question-and-answer session with many people at the same time, but they do require you first to craft one side of the conversation. There is a degree of guesswork involved, because each question requires you to anticipate a response. It may seem counter-intuitive, but it requires you to have a good grasp of the intended audience and, more importantly, the objectives of your survey.

Online surveys offer a one-way, asynchronous method of insight collection. Surveys are simply not built, or intended, to facilitate two-way communication and engagement. Online surveys capture immediate responses and reactions to structured questions, but they do not promote the same level of discovery generated by a two-way (synchronous) post-and-answer discussion.

Website and mobile survey environments train people to respond to questions according to a predefined response pattern – that is, how the questions are structured and how they flow from one question to the next. In most cases, this tends to mean short, sharp answer options framed by the question format. Surveys can also include open-ended text-based questions, where survey respondents type out a verbatim answer based on their interpretation of the question. People will often answer open-text questions based on what they deem a ‘reasonable’ and ‘expected’...