Abstract
While survey research design tends to prioritise closed questions with predetermined responses, many surveys conclude with an open-ended ‘anything else you would like to tell us?’ question. This question, designed to elicit feedback or create opportunities for respondents to share additional information, offers significant potential for insight into respondents’ experiences. Yet, the extent to which these open-ended data are listened to remains opaque. In this article, we outline and reflect on our approach to a reflexive thematic analysis of responses to an open-ended survey question, to foreground listening. Drawing on responses (n = 1746) from a national survey of informal carers in Australia, we explore how the final ‘anything else?’ style question of a survey offers scope for fostering ongoing engagement (via survey feedback and context), recognising less visible experiences (through detailed personal accounts), and attending to respondents’ needs (via calls for attention and action). We discuss our approach to sociological listening, and the practicalities therein, including experiences of discomfort, and the challenges of responding to responses. In doing so, we argue that engagement with the data derived from open-ended survey responses is not only fruitful for generating feedback and contextualisation within self-administered survey design, but that such data also hold considerable opportunities for meaningful listening, particularly in contexts where respondents lack recognition and visibility.
Introduction
Across sectors and contexts, the survey has sustained its position as the go-to instrument for systematically and efficiently collecting large-scale data to better understand clients, consumers, and citizens (Boynton and Greenhalgh, 2004; Bryman, 2012). Designed to generate data suitable for statistical analysis and permit description of patterns, most surveys are dominated by closed questions with predetermined responses. Many surveys also include a small number of open-ended questions which are included for four distinct purposes: extension (to provide an answer not offered in the closed question options); substitution (to promote accuracy in questions about socially undesirable behaviour); expansion (to offer scope for more detail through ‘why’ or ‘how’); and general (to permit elaboration and overall comments in relation to the survey) (O’Cathain and Thomas, 2004). Open-ended questions might be placed in different positions within the broader survey in accordance with varied objectives; for example, in the concluding, general open-ended question, respondents are commonly invited to offer ‘any other comments?’ (Bryman, 2012; Decorte et al., 2019; Garcia et al., 2004; Krosnick, 2018; O’Cathain and Thomas, 2004). Despite the ubiquity of this final survey item, it is often unclear whether or how data are (or should be) interpreted (Garcia et al., 2004; Rouder et al., 2021). Indeed, the question of how to analyse general open-ended responses continues to be debated, as ‘a crucial yet challenging task for social scientists, non-profit organisations, and educational institutions, as they often face the trade-off between obtaining rich data and the burden of reading and coding textual responses’ (Beeferman and Gillani, 2023: 1; see also Behar-Horenstein and Feng, 2018; Etz et al., 2018; Feng and Behar-Horenstein, 2019; Fielding et al., 2013; Popping, 2015).
This has prompted some to argue that ‘if you do not have the time or expertise to analyse open-ended responses, do not invite any’ (Boynton and Greenhalgh, 2004: 1314).
In this article, we aim to contribute to the ongoing discussion about open-ended survey responses through our reflexive account of thematic analysis of a final ‘is there anything else you would like to tell us?’ survey question. Drawing on 1746 responses from a national survey of carers in Australia, we foreground an orientation towards sociological listening, in an attempt to complement ongoing discussions of how to meaningfully analyse, and respond to, open-ended survey data.
Background
The possibilities and problems of open-ended questions in survey design
The potential benefits of including open-ended questions in survey design are well-established. Open-ended questions allow respondents to clarify or qualify responses to other (closed) survey questions, offer feedback, and include opinions or anecdotes (Abraham et al., 2020; Bryman, 2012; Garcia et al., 2004). Such data can be collected in ways that permit anonymity and may minimise power differences between survey creator and respondent, in that the opportunity to answer in one’s own words and in one’s own time can offer respondents a sense of agency and ownership over their contributions (Braun and Clarke, 2021; Decorte et al., 2019; O’Cathain and Thomas, 2004; Singer and Couper, 2017). Various factors have been shown to shape how respondents engage with open-ended survey questions. Research has found that the size of the answer box, follow-up probes, and error messages used can all influence whether, and the extent to which, a respondent completes an open-ended question (Emde and Fuchs, 2012). While some respondents may gain satisfaction from offering personalised responses, such questions require additional effort and thus are often subject to item nonresponse (Keusch, 2014). Despite the risk of lower engagement with this style of question, researchers have argued that open-ended questions should be used more often, as survey designers cannot be sure of all possible response options (Krosnick, 2018). Yet, the analytic approaches for interpreting these data are often not straightforward, with some researchers arguing that ‘methodological approaches to open-ended coding in survey research have been anything but consistent and credible’ (Lupia, 2018: 121; see also Garcia et al., 2004). Put simply, the place of open-ended responses amid otherwise quantitative data remains somewhat fuzzy.
A range of tools and methods for analysing open-ended responses have been posited in existing studies, spanning academic disciplines (see, for example, Fielding et al., 2013; Popping, 2015; Rich et al., 2013). For example, responses may be coded using statistical or machine learning methods (Gweon and Schonlau, 2024), manually by classifying responses into predefined categories through numeric or alphanumeric code numbers (Biemer and Lyberg, 2003; Lupia, 2018), through statistical analysis (Popping, 2015), or using qualitative, more conceptually and thematically driven approaches (O’Cathain and Thomas, 2004). Yet, the appropriateness of these data for analysis continues to be debated, sometimes critiqued for a lack of richness, sometimes heralded for the potential to elicit rich and succinct responses (Abraham et al., 2020; Braun et al., 2021; LaDonna et al., 2018; Meitinger et al., 2021; O’Cathain and Thomas, 2004; Silber et al., 2020). As Rouder et al. (2021) have noted, ‘their place as a qualitative metric in an otherwise quantitative survey often makes them not quite substantial enough for a fully qualitative study, but not quite measurable enough to be useful for quantitative analysis’ (p. 2). As such, it is at times unclear whether or how researchers engage with or analyse the diverse data collected via open-ended questions, which analytical frameworks should be used to interpret open-ended responses, and whether or how engagement with these responses is communicated back to respondents (Garcia et al., 2004). Indeed, previous research has highlighted how the processes for open-ended response analysis and integration with quantitative survey findings can be unsatisfactory or frustrating (Etz et al., 2018; Fielding et al., 2013; Rich et al., 2013). Thus, it has been argued that in order to provide meaningful, interpretable data, open-ended questions must be specific (Popping, 2015).
In this way, as a sort of ‘catch all’, the final ‘anything else?’ question presents somewhat of a methodological dilemma for researchers, given the multiplicity of options for engagement and analysis, and the rarity of discussion about this survey component (Decorte et al., 2019; O’Cathain and Thomas, 2004).
Listening to open-ended survey responses
Rather than adding to the varied list of techniques for analysing open-ended survey responses, here we instead consider the imperative to respond to and listen to open-ended responses. As Rouder et al. (2021) argue, ‘since respondents have spent time and effort to provide this data, there is often a desire on the part of study team to ‘do something’ with it’ (p. 8). Of course, this imperative spans beyond the domain of academic research to organisational contexts in which survey data are relied upon and leveraged to drive improvements in support and service provision, allowing informed insights into the lives of clients or employees (see, for example, People with Disability Australia, 2021; Ruck et al., 2017). Responses to a final ‘anything else?’ question may also be sought to establish credibility in policy submissions or organisational reports or create vignettes or case studies to appeal in lobbying and advocacy work (Devine and Robinson, 2014; Mayne et al., 2018). As such, the very process of collecting open-ended survey responses presents an additional channel of communication between respondents and organisations. And while engagement with open-ended responses may not be visible to those who complete the survey, these data often hold considerable value for organisations looking to facilitate a meaningful and ongoing relationship with respondents, and may be followed up on in different ways. There is great potential, within and outside academic scholarship, then, for open-ended data to tell a powerful story, and enable consideration of social phenomena across different scales (Mason, 2006).
We situate the ‘anything else’ style open-ended survey question as an opportunity for sociological listening. In recent years, scholars have emphasised that it is a choice to seek information from respondents, one which comes with significant power, privilege, and responsibility ‘to listen to what they say or read what they write’ (Bassel, 2017; Singer and Couper, 2017: 127). Indeed, it is the listener who enables or constrains change through (not) listening and/or (not) acting (Back, 2007; Dreher, 2009; Waller et al., 2015). Such listening, as a form of inclusive practice, goes beyond the opportunity to participate; although there is evidence that research participation is of direct benefit, providing catharsis and empowerment, or feelings of recognition and solidarity (Bellamy et al., 2011). Whereas the lens of participation has tended to focus on empowering marginalised voices, the theoretical turn to listening places the onus on powerful actors who shape what and how voices are heard (Dreher, 2009; Waller et al., 2015). This includes the ‘labour of listening’ that involves a commitment to attention and valuation (Crawford, 2009), as well as the need for ‘competence in managing the vast amount of data that we collect through various listening exercises and activities and the capacity to make sense of these articulations, to hear what they have to communicate’ (Nolas et al., 2019: 396). Studies from human relations management have offered insights into the potential for strengthening two-way listening processes between individuals and governments or organisations, arguing that listening requires recognition of the speaker’s right to a voice, attention to that voice, and responding to what is requested (Cardon et al., 2019; Macnamara, 2018; Ruck et al., 2017). Indeed, research has demonstrated that listening can strengthen existing relationships, as well as satisfaction with and trust towards an organisation (Hastie, 2021). 
Within the social sciences, scholars have argued for the ‘right to be understood’ which creates an obligation of active engagement, ‘serious and sustained listening’, and commitment to understand the ‘Other’ (Downing, 2007: 12–13; Dreher, 2009). An important component of listening, then, is communication that demonstrates that action has been taken based on listening processes, so that those who contribute feel acknowledged and valued by the listener. In this article, we consider how, as well as why, we listen (see also Bassel, 2017). In order to do so, below we explore a series of dilemmas and tensions we faced in thinking through what is heard, how it is heard, and how to respond in meaningful ways.
Method
Survey exemplar: study and data
Our study draws on responses to the final, open-ended question of a national survey of informal carers in Australia. The term ‘informal carer’ refers to individuals who provide ‘unpaid care and support to a family member or friend who has a disability, mental illness, drug/alcohol dependency, chronic condition, terminal illness or who is frail’ (Carers NSW, 2020). There are now more than 2.6 million informal carers in Australia (ABS, 2020); given the ageing population and growing reliance on carers, understanding their experiences is critical. The survey was conducted in 2020 by a non-governmental organisation and guided by a Reference Committee comprising carer representatives, university-based researchers, and stakeholder representatives. Ethics approval for the survey was obtained via the Human Research Ethics Committee of Macquarie University (Reference Number: 6233). The 73-question survey included sections related to the caring relationship, services and supports, employment and finances, health, wellbeing and social connectedness, and took approximately 25 minutes to complete. Questions were designed to capture the current caring context, reflect best practice approaches to research with carers, gain insights relevant to developments in the service and policy landscape, and generate information to address identified gaps in the research literature. The survey could be completed online or in paper form. In the online survey, responses could be typed into a free-text box (without a character limit); in the hard copy survey, respondents were given five lines on which to write, with additional free space below if required. The survey was completed by 7735 carers, of whom 80% were female, the median age was 58 years, and 95% identified as ‘Australian’ and spoke only English (Carers NSW, 2020). Our analysis below focuses on the final survey question: ‘Optional: Is there anything else you would like to tell us about your experiences as a carer?’
Analysis
We approached the open-ended responses as a qualitative data set, influenced by interpretive traditions in sociology, and drawing on the reflexive thematic analytic method developed by Braun and Clarke (2021). Analysis proceeded abductively, informed by knowledge from survey design and analysis (for example, the inclusion of an ‘any other comments’ question to conclude a survey as a means to generate feedback on the survey itself, or to draw attention to topics overlooked within the survey). For systematic coding, the large MS Excel file with 1746 responses (including both online responses, and manual entries from paper-based surveys, transcribed verbatim) was imported into MS Word and then into NVivo 12. G.N. and E.K. first familiarised themselves with the open-ended response data set and then independently coded the data, developing thematic domains in an iterative process. Initially, data were coded into themes related to experiences, requests, and feedback. Then, data were separately coded as responses that were inward facing (towards the organisation) or outward facing (towards advocacy and policy responses). Through this process the thematic domains were further developed and refined. The author team also met on several occasions to discuss the data, and it was during these meetings that we became attuned to listening, initially as a substantive thematic finding. That is, we found over 200 responses that we interpreted as highlighting respondents’ desire to be heard. As such, we turned our attention within our analysis, and as the focus of this article, towards listening.
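For teams who export coded responses from qualitative software in order to audit or report on a coding process like the one above, the tallying step can be sketched in a few lines. This is purely illustrative: the study used NVivo 12 rather than custom code, and the CSV columns, code labels, and function name below are our own assumptions.

```python
import csv
import io
from collections import Counter

# Hypothetical CSV export of coded open-ended responses. The study itself
# coded in NVivo 12; column names and code labels here are illustrative only.
SAMPLE_CSV = """response_id,codes
1,feedback;gratitude
2,experience;desire_to_be_heard
3,request;desire_to_be_heard
"""

def tally_codes(csv_text):
    """Count how many responses were tagged with each thematic code."""
    counts = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        for code in row["codes"].split(";"):
            counts[code] += 1
    return counts

print(tally_codes(SAMPLE_CSV)["desire_to_be_heard"])  # 2
```

A tally of this kind can support (but not replace) interpretive judgements, for example by surfacing that a code such as the hypothetical ‘desire_to_be_heard’ recurs often enough to warrant analytic attention.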
Following three rounds of coding and discussion, we reflected on our process and our interpretation(s), to think about listening to silences. This aspect of our method was critical in that it allowed us to sit in the discomfort of our own listening processes, to think about what we might have missed, sidelined, or privileged in our coding and why. Also important were reflexive discussions about the organisational challenges and logistics associated with managing open-ended responses that provided opportunities to consider formal and informal organisational responses to and uses of data generated (Ryan and Golden, 2006). We were interested in improving our approach, and in thinking about how the population of Australian carers (both those who completed the survey, and those who did not) could be better served and supported, in and through research and practice. Below, we explore three distinct domains of insights garnered from the ‘anything else?’ question, to consider how these responses offered opportunities for listening.
Findings
What (and who) is heard? The possibilities and limitations of open-ended responses
We begin by reflecting on the survey sample, to acknowledge that what can be listened to is only ever partial. Within the total survey sample, 22.6% of respondents completed the final optional open-ended question, the subject of our analysis (n = 1746). We were cognisant that respondents who answered this question may reflect particular demographics and/or experiences, so we conducted a tandem analysis of the survey responses overall to compare the characteristics of those who responded to the final question with those who did not. Most characteristics were similar, but there were differences suggesting that carers who responded to the final open-ended question were in more demanding or committed caring roles. Compared with the overall survey sample, those carers who responded to the final question provided more average hours of caring per week (78.1 compared with 72.8 hours) and had cared for longer on average (14.2 compared with 12.4 years). 1 These respondents also experienced psychological distress 2 and social isolation 3 at higher rates. As such, we were less likely to receive open-ended responses from newer carers, or those who arguably felt their roles were less challenging.
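The tandem analysis described above amounts to computing subgroup means (responders versus non-responders to the final question) on variables such as weekly caring hours. A minimal sketch, using invented toy records rather than the survey’s actual data; all field names and values are assumptions for illustration.

```python
from statistics import mean

# Toy records standing in for survey data; values are invented and do not
# reproduce the study's reported figures (e.g. 78.1 vs 72.8 mean hours).
respondents = [
    {"answered_final": True, "care_hours": 80, "years_caring": 15},
    {"answered_final": True, "care_hours": 76, "years_caring": 13},
    {"answered_final": False, "care_hours": 70, "years_caring": 12},
    {"answered_final": False, "care_hours": 75, "years_caring": 12},
]

def group_mean(rows, key, answered):
    """Mean of `key` among rows that did (or did not) answer the final question."""
    return mean(r[key] for r in rows if r["answered_final"] == answered)

print(group_mean(respondents, "care_hours", True))   # mean hours, responders
print(group_mean(respondents, "care_hours", False))  # mean hours, non-responders
```

In practice such comparisons would be run on the full survey file, and differences would be assessed with appropriate significance tests rather than read off raw means.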
In our analytic process, we were also mindful of the representation of survey respondents relative to the overall population of Australian carers. Given the survey was facilitated through a carer not-for-profit network, our sample generally over-represented carers with higher-intensity caring roles, who were in contact with support and services, and who self-identified as carers (Carers NSW, 2020). Compared with the average carer in Australia (see ABS, 2020), survey respondents were more likely to be primary carers, female, older, have lower educational attainment, have a disability, and be unemployed (Carers NSW, 2020). The survey was only available in English, and carers for whom English was not their first language were under-represented. As such, some perspectives were less visible from the outset in our analysis and discussion. While this is a study limitation, focusing on what is present and absent within research data is critical to reflexivity (Guillemin and Gillam, 2004). Practically speaking, this process also prompted strategies for more targeted sampling and recruitment in future survey iterations.
Notwithstanding the partial representation via open-ended responses, our analysis prompted reflections on our analytic process in terms of listening. For example, discussion emerged around responses that, for many reasons, we found more challenging to listen to. Some responses had been sidelined because we found them difficult to interpret (e.g. due to spelling or grammar, or seemingly sarcastic or ironic responses that we could not confirm), or because we perceived them as too difficult to display in research outputs (e.g. responses that used explicit or offensive language). Furthermore, the length of responses also shaped how we might listen to the data. There were approximately 100 long responses (500 words or more) that revealed some of the manifold and distressing challenges of everyday life as a carer. Thematically coding these accounts was difficult, as they often lacked coherence in terms of narrative and structure. Other responses (~350) were very short (only a few words) and, as qualitative data, lacked richness or context. While responses such as ‘I’m drowning’ are indicative of carers’ struggles, there are limits to what they can tell us about lived experiences.
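The length-based sorting above (roughly 100 responses of 500 words or more, roughly 350 of only a few words) can be expressed as a simple banding function. A sketch under stated assumptions: the 500-word threshold mirrors our description, while the short-response cut-off and all names are illustrative, not part of the study’s actual procedure.

```python
# Band a free-text response by word count. The 500-word threshold mirrors the
# description in the text; the 5-word cut-off is an illustrative assumption.
def length_band(response, long_words=500, short_words=5):
    """Classify a response as 'short', 'mid', or 'long' by word count."""
    n_words = len(response.split())
    if n_words >= long_words:
        return "long"
    if n_words <= short_words:
        return "short"
    return "mid"

print(length_band("I'm drowning"))  # 'short': evocative, but offering little context
```

Banding of this kind is only a triage step; as we note above, both very long and very short responses demanded different (and more interpretive) forms of attention.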
As well as recognising the challenges of interpreting open-ended responses or using responses in research outputs, the considerable potential for open-ended responses to shed light on emerging or unfolding issues that could not be gleaned through other (closed) survey questions became clear through our analysis (see also Decorte et al., 2019; Garcia et al., 2004). Respondents had a propensity to provide what might be described as ‘contextualising’ details on their specific circumstances, to emphasise the rationale behind earlier responses. For example, the final question attracted a significant number of responses articulating pandemic-related challenges and changes to caring circumstances:

I have filled this in, using information prior to the isolation due to the corona virus. Now caring for my husband 24/7.
In this way, the open-ended responses captured the impacts of Covid, including social, political, and/or legal shifts that occurred between the time the survey was designed and distributed (see also Fielding et al., 2013). Several responses also situated the transience of carers’ circumstances, reminding us of the limits of cross-sectional data:

Since mental illness can be episodic, my experience currently is atypical of my total experience, as my son is currently able to live well enough independently.

My experience as a carer was over 2 years ago. If I had filled this form out then my answers would have been completely different. I was stressed, exhausted and receiving Centrelink payments [social support payments].
Our analysis also revealed the potential for open-ended responses to draw our attention to topics absent, or not sufficiently covered, in the survey, which warranted future research attention. For example, several responses alerted us to the desire for more visibility for certain subsets of carers in need of specialist attention in policy and practice:

There needs to be a specific Young carers advocacy and service, they don’t have the maturity or experience to say no to responsibility they can’t handle and get sick far more often than older carers. I really think they are most overlooked group.

As a now ex-carer, I feel like I have been cut off from many support services as they focus solely on carers who are still actively caring. I’ve spoken with other ex carers and they feel the same. The time that carers have the most pressing need for support is when they are no longer carers.
Responses like those above reveal the potential within open-ended responses for us to listen to voices and experiences that might be lost in larger-scale statistical analysis. For example, while the broader survey analysis found that former carers had, on average, similar indicator outcomes compared with the overall cohort, the above response points to a further need to attend to former carers’ transitions out of caring (see also Kirby et al., 2022). We note here that the open-ended accounts, at times, were not contextualised enough to develop our understanding of experience in ways that in-depth interviews, for example, might (Garcia et al., 2004; LaDonna et al., 2018). We were able to contextualise some open-ended responses by mapping the demographic characteristics and/or caring experiences from data elsewhere in the survey; but in practice, this gave limited additional insight relative to the time and resources required. As such, we encountered what might be described as a paradox of context: open-ended responses at once offered more nuance than responses to closed questions, but not enough richness to enable confidence in our interpretation of a respondent’s lived experience. Irrespective of issues related to depth and richness of data, we found many responses that offered new insights into carer experiences, which we explore in the following section.
Visibility and recognition for carers: the open-ended response as voice
Within our analytic process, what helped attune us to listening was the sheer number of responses that highlighted the lack of voice felt by carers. One important example lay in their accounts of loneliness and isolation. Descriptive statistics showed that 31% of survey respondents were highly socially isolated and another 25% experienced only low social support. Listening to open-ended responses helped us think about why and how such experiences occurred, and gave insight into how carers felt about such isolation:

It is soul destroying, lonely and sometimes thankless work. People often say chirpily ‘you must look after yourself’ not realising there are no options to do that when your loved one is still at home.

Family/spouse carers become invisible in time. People don’t want to hear the challenges you face so you learn to not talk about issues to avoid people’s eyes glazing over with compassion fatigue.

Fullfilling [sic], but exhausting. Your lifestyle puts you into another category which is not understood by a lot of people.
Our analysis also revealed how respondents felt that their caring role lacked visibility and recognition on the part of governments, communities, and families:

Government needs to listen & really HEAR what carers (and Carer Associations) are telling them about their needs

Carers need other people & services to listen and give them time to explain their information to help put things in place for when THEY die. Concerns.
Dozens of responses showed how carers felt misunderstood or misrecognised by health and social care services. For example, one respondent explained how they felt dismissed by health professionals due to their literacy:

Because of my vocabulary level much is presumed about my knowledge of medical terms and strategies/processes to support my father as he becomes more frail. I have often felt unheard or listened to or dismissed by professionals in the industry.
Other respondents felt invisible due to the normative focus on the needs of the person being cared for:

I frequently feel invisible as every interaction or focus seems to be about my husband, everything I do is about his disability.

recognise me as both carer and consumer.
Elsewhere, responses articulated a lack of recognition for carers as advocates for the person they cared for. For example:

As a carer for a person that can’t speak, I am her voice, But no one is listening

I feel I am the only voice who speaks up for him, so he can have quality of care and when and if things go wrong the buck stops with me . . .
In a similar vein, our analysis also revealed other forms of carer (in)visibility. Some respondents, for example, problematised the very use of the term ‘carer’ in the survey:

I hate the word carer. I am a mother and a daughter.

Sometimes I would tell people that I was his Carer rather than his wife. Occasionally people would challenge me about using the term Carer as they didn’t understand what it meant.
These and other excerpts highlight respondents’ ambivalence to the term ‘carer’ and the terminological blurriness that this category creates (Kirby et al., 2022). Moreover, responses such as these prompted us to reflect on how labels may include or exclude, embrace or alienate particular audiences in survey participation. We noted that respondents necessarily had to self-identify as a carer to be able to resist and/or amplify the meanings ascribed to the term, and in order to be counted, and listened to.
As the above and below excerpts illustrate, many carers had limited opportunities to discuss the lack of voice or visibility they experienced while caring (see also Kirby et al., 2022). An open text box in an anonymous survey may have thus offered a rare opportunity for carers to express the emotions they felt towards caring. However, as the following response, typed in all caps – to perhaps indicate the extent of despair – shows, the social pressure to be positive about one’s role as a carer was felt, including when completing a survey:

WHEN I DO THIS SURVEY I FEEL GUILTY TO SHARE MY THOUGHTS ALMOST AS IF IM COMPLAINING WHICH IM NOT
Such logics of guilt and gratitude permeated the open-ended responses, and respondents used the final text box/lines to send messages of appreciation. Indeed, many responses made the link between the need to listen to carers, and the survey as a tool for listening, foregrounding their gratitude towards the organisation:

Thank you for this survey. It’s great to know that someone is ‘caring for carers’.

Just that it’s been good to complete this survey to know that someone out there is concerned.

Thank you for asking & caring about us!
Our analysis underscored the importance of engaging with carers as a population lacking visibility and recognition. In this way, the survey, and the room allocated for ‘any other comments?’ offered a form of solidarity in and of itself. Asking for open-ended responses, in carers’ own words, might be considered a communicative and symbolic mechanism for – to use the above respondent’s words – caring for carers. In this way the survey offered a line of engagement, helping to recognise and acknowledge the need for improved support for carers. As such, we saw the ‘data’ produced by the survey as more meaningful than merely ‘research findings’ for academic articles; the survey was an engagement tool which enabled a sustained dialogue between the organisation and those it served. But opening up this channel of dialogue, as we found, also required attention to and engagement with several responses that were perhaps less comfortable to hear, as we turn to in the section that follows.
‘Do something about it’: listening, discomfort, and responding to open-ended responses
Through our analysis we became attuned to several feedback-style responses which raised questions for us about how we should in turn respond. The challenge in listening to these responses lay in their variety, and often seemingly contradictory nature. For example, several respondents lamented that the survey was too long and too complicated, while others insisted that there were not enough questions to adequately reflect their caring context or experience:

What AWFUL HORRIBLE complicated questions. Please simplify the next survey !

This survey took much longer than 20 minutes. No more time to comment further.

I think you should ask more questions about alcohol and drug use in carers. It’s important I feel that thus [sic] be addressed. . . .
Our analysis also alerted us to how the wording of questions could frustrate respondents:

Total lack of recognition and support of the time and effort which goes into caring role and this is reflected in your survey question. The question perpetuates a perception that caring is segmented and that a carer only does one of the many things daily or weekly. The peril of the perception needs to be changed.
The above response was indicative of those we discussed as part of listening to (missed) opportunities for recognition, prompting reflections on how to improve question wording and overall flow, to show solidarity for carers’ tasks, roles, and relationships over time. Listening, then, required sustained engagement with how the survey might be adapted to capture evolving domains of experience. Moreover, it prompted discussion of how to communicate this engagement with the audience of respondents (including the practical challenges and costs of amending the survey, as well as difficulties of representing all carer experiences).
Another (typically contentious) example of feedback-style responses related to the ways that the survey treated questions about finances and household income:

You didn’t ask about my salary? Only partners?

I don’t want to be treated benevolently I would like to be treated as holding an important role in society My money isn’t any of your concern.
These excerpts demonstrate how questions could be variously interpreted. Indeed, designing a survey for a large and diverse cohort with varying values and expectations has been shown to be complex (Meitinger et al., 2021). These examples did, however, serve as an important reminder that respondents were often not aware of the rationale behind the specific wording of questions (e.g. to align with internationally recognised scales or measures, for population-level comparisons). It was through our discussions that we were alerted to these potentials for discomfort, and to the benefits of greater transparency within the survey.
Delving deeper into considerations of communication and messaging between researcher and respondent, our analysis revealed a desire among respondents to know what was to be done with their responses. These responses drew attention to the urgent need for the survey, its results, and the organisations behind it to transform responses into ‘something constructive’:

I would like [organisation] to raise the fact that carers are invisible in this society and DO SOMETHING CONSTRUCTIVE ABOUT IT . . . Our job is NOT TAKEN SERIOUSLY and OUR NEEDS ARE CONSTANTLY IGNORED!

Several questions do not make sense and or do not provide suitable options so I gave up . . .

Being a Carer for a family member is not easy and I don’t feel confident that completing this will get the answers you require to get the answers you need to assist us.
Attending to respondents’ complaints and frustration helped orient us towards sociological listening through ‘dwelling in discomfort’ (cf. De Souza and Dreher, 2021). Such responses, which demand organisational reflection or action, tend not to be prioritised in research reports. Yet, an increasing corpus of sociological and methodological research engages with issues of complaint (e.g. Ahmed, 2021), as well as more traditional debates around positionality, power, and privilege in social research and policy (De Souza and Dreher, 2021; Newton, 2022; Shaw et al., 2020). Pathways for responding to remarks like those above were often unclear; nonetheless, we wanted to sensitise ourselves to variations in meaning so as to iteratively improve how data and knowledge were collected and used in the future.
As well as submitting survey feedback, some respondents sought attention or action, ranging from requests for support with day-to-day tasks to broad advocacy on behalf of carers. For example, there were many responses that contained specific requests for follow-up, despite the survey being national in scope and anonymous:

I’d like someone to meet me and talk/help for planning my son’s future please. Thanks.
We also saw frequent appeals for support in specific locales:

Please provide computer course for us in [suburb]. It is urgent, we need that. Please somebody help me, is not only me, many of us my age and over. Please please please. It is compulsory to [know]. I need computer course. From beginning. This is very important PLEASE. THANKS.
These examples suggest that respondents had a particular reader – or listener – in mind: a member of staff within the organisation who could and would follow up to provide care and support. We discussed the ethical and practical issues involved in acting on these responses, which was sometimes contingent on whether respondents volunteered their contact details within the free-text box, as some did:

My name is Peter and my phone number is [04XXXXXXXX].
Some responses more closely aligned with the initial survey instruction: ‘by sharing your opinions and experiences, you will help [us] advocate for greater recognition and support of carers across Australia’. Responses here revealed the domains of advocacy in which carers wanted to see improvements:

Please lobby hard on NDIS [National Disability Insurance Scheme] – there is still an incredibly long way to go and my recent experience of a poor plan review has left me feeling very concerned for more vulnerable carers and their ability to advocate.

Please advocate for a higher carer allowance and pension for us.
These excerpts indicate how respondents, implicitly or otherwise, responded in ways that foregrounded an imperative for (our) action, from individual requests for help to pleas to improve recognition of carers. A complex challenge for us as listeners was what to do in response. Our research team discussions have been wide-ranging and are ongoing, as we strive to foster and sustain dialogue with respondents that foregrounds transparency about the rationale for survey design, and improvements to the survey. More broadly, we continue to consider how to meaningfully communicate survey ‘results’ and demonstrate to respondents – and carers more broadly – how our listening can be translated into research and advocacy that foregrounds recognition and support.
Discussion
In this article, we have reflected on our experience of analysing an ‘anything else’ style open-ended survey question, drawing on conceptual and methodological approaches to sociological listening. Building on previous scholarship on ‘any other comments?’ style questions (Decorte et al., 2019; Garcia et al., 2004; O’Cathain and Thomas, 2004), and applying a reflexive thematic analysis approach developed by Braun and Clarke (2021), our analysis revealed the potential of this broad style of concluding survey question in eliciting considerable feedback on the survey instrument, insights into the less visible aspects of (caring) lived experience, and in generating calls for attention and action. Through our reflections, we also hope to contribute to the burgeoning literature focused on sociological listening.
For Back (2007), sociological listening requires critical focus on the details of individual accounts, coupled with an attentiveness to the broader public issues (made) visible through such processes of listening. As we have examined above, carers’ open-ended responses gestured to experiences of precarity and loneliness within their communities. These responses offer important evidence to complement quantitative findings that indicate the social, economic, and political structures contributing to carers’ lack of visibility and marginalisation (Mason, 2006). In this way, orientating ourselves to listening by attending to responses to ‘anything else’ style questions added richness to the patterns we observed elsewhere in the survey analysis. In shedding light on sudden social, legal, and political shifts (as we saw with the COVID-19 pandemic), the open-ended responses also served to highlight issues that might be included in future iterations of the survey (Fielding et al., 2013; Silber et al., 2020). Our analysis also demonstrated further challenges and opportunities to improve the survey instrument, through engaging with respondents’ contextualisation, feedback, and/or complaints (see also Decorte et al., 2019).
In making visible some of the opportunities and challenges presented through analysis of an open-ended survey question, we aim to foreground sociological listening as both a practice and a sensibility (cf. Back, 2007; Waller et al., 2015). It was in this pursuit of attunement, of listening to the mundane or the uncomfortable, that we found genres of gratitude and complaint, narrative, and suggestion. Yet, we found the question of what to do with/about the responses generated from the ‘anything else?’ style question to be complex and replete with methodological and practical challenges; our discussion has barely scratched the surface of the myriad issues researchers may face as they try to respond by listening. Scholars who have focused on the politics of listening have emphasised that ‘listening lies in the how as much as the why’ (Bassel, 2017: 91). We uncovered among the respondent carers an intense sense of ‘I need you to know this’ (Back, 2007: 12). And to take this one step further, we saw in our analysis several calls for action, for us as listeners to ‘do something constructive’. A key challenge for us thus lay in how to listen to carers, not only as those to be cared for, but as people capable of speaking for themselves in their own ways (see also Bassel, 2017; Nolas et al., 2019). Part of this listening for us involved dwelling in discomfort (De Souza and Dreher, 2021; Nolas et al., 2019), to reflect on how our research processes (including norms that circulate around appropriate data for research outputs) can include and exclude, and (de)legitimise particular experiences and forms of care and caring (Bellamy et al., 2011; Braun and Clarke, 2021; Kirby et al., 2022).
While through the final question carers could write directly to those administering the survey, listening to written responses is difficult and partial. We do not hear the tone of what is said, nor do we have channels available for direct and immediate dialogue with respondents. Despite this, we found much value in discussing ideas and opportunities for how to respond to responses. An important aspect of listening for us was our (ongoing) consideration of who did not, cannot, or will not speak (in responding to the open-ended survey question, or in completing or even hearing about the survey). In this way, sociological listening involves not only responsiveness to the insights generated, but also reflection on the blind spots and gaps, on what may have been missed (see also Back, 2007). These gaps might be addressed through dedicated personnel roles or advisory panels for peer-to-peer recruitment, as well as through remuneration for respondents (Coupe and Mathieson, 2020). For future iterations of this carers survey, there is ongoing discussion among advisory groups about improving representation through targeted sampling and recruitment, as well as continued engagement with carer representatives to improve the ‘user friendliness’ of survey completion.
In conclusion, we have aimed here to show how approaching open-ended survey responses qualitatively and thematically can afford a fruitful (albeit at times limited) channel for sociological listening. A crucial part of this listening, we found, lay in developing ways to convey the impacts of survey responses and participation through ongoing and reflexive dialogue. Given that survey respondents often have something else to say, and dedicate their time and effort to sharing their experiential knowledge and feedback, we see great importance in listening, and in responding, to open-ended responses.
Declaration of conflicting interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
