I analyzed 3,527,492 survey responses captured over the last year. Here's what the data shows...

1. Don't ask hard questions first
↳ Great surveys start with a VERY easy question
↳ Harder questions come later – once someone has "bought in" to your survey
↳ Consider starting with a Yes/No question
↳ The best surveys created on our platform have an 85%+ first-answer completion rate.

2. "Choose one of the following" > freeform inputs
↳ Freeform inputs are great for getting raw voice-of-customer language
↳ ...But they take effort to complete, and our monkey brains would rather just push buttons
↳ Freeform questions work best as contextual follow-ups to specific one-of-many questions, e.g. "Do you have a podcast? Yes/No" -> IF NO: "In a sentence or two, what's held you back from starting a podcast?"

3. Write conditional, "conversational" surveys
↳ Don't set up a survey that's just a flat list of one-size-fits-all questions
↳ The questions you ask should change based on previous answers
↳ ...And the question text itself should also change

4. Don't make it about you
↳ This is probably the most important point
↳ You're asking someone to give you time + personal data
↳ ...What's in it for them?
↳ Poor-performing surveys don't make this obvious
↳ Great surveys make it clear that the data captured will help deliver better information, better recommendations, better everything – the questions are to help *them*, not *you*

5. No more than 4-5 answer options
↳ For choose-one-of-the-following questions, limit your options to 4-5
↳ If you need more options, show the top 4 first with a "Maybe something else?" option. If that option is selected, show the other options.
↳ More options = more thinking = fewer completions

6. Short, punchy copy
↳ Poor-performing surveys often have lengthy answer options
↳ Questions with high completion rates have simple, 1-2 word answer options
↳ More text = more thinking = fewer completions

7. How many questions you ask generally doesn't matter
↳ Question #2 tends to have a 95% completion rate. Question #3 has 96%. Everything beyond that has a 97%+ completion rate.
↳ If you're asking useful questions, people will keep answering
↳ Ideally use a survey tool, like RightMessage, that will capture data incrementally (rather than requiring the full survey to be completed)

8. Only ask what you really need
↳ Don't ask someone's gender unless it will help you give them better content
↳ Don't ask for someone's income unless this will help you qualify them or push them to the right offer
↳ Every question you ask should be framed as something that enables you to give them exactly what they need from you

Which of these takeaways resonates most with you? Let me know in the comments 👇

And if you want to learn how to set up, write, and optimize great surveys, check out Segment With Surveys: https://lnkd.in/e9jdwfjn
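The conditional flow described in points 2 and 3 can be sketched as a small branching data structure. This is a hypothetical illustration in plain Python: the question IDs, wording, and dict layout are invented for the example, not taken from RightMessage or any other survey tool.

```python
# A minimal sketch of a conditional ("conversational") survey:
# each question names the next question to show based on the answer,
# and a freeform follow-up appears only for one branch.

SURVEY = {
    "start": {
        "text": "Do you have a podcast?",  # easy Yes/No opener (point 1)
        "options": ["Yes", "No"],
        "next": {"Yes": "podcast_topic", "No": "podcast_blocker"},
    },
    "podcast_blocker": {
        # Freeform follow-up shown only when the answer was "No" (point 2)
        "text": "In a sentence or two, what's held you back from starting a podcast?",
        "options": None,  # None = freeform text input
        "next": {},
    },
    "podcast_topic": {
        # 4 options max, with an escape hatch (point 5)
        "text": "What's your podcast about?",
        "options": ["Business", "Tech", "Lifestyle", "Maybe something else?"],
        "next": {},
    },
}

def next_question(current_id: str, answer: str):
    """Return the next question ID given the previous answer, or None if done."""
    return SURVEY[current_id]["next"].get(answer)
```

The point of the structure: the flat list of questions lives in data, but the *path* through it is computed per respondent, so nobody sees a one-size-fits-all questionnaire.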
How to Address Low Response Rates in Surveys
Summary
Low response rates in surveys occur when fewer people participate or complete a survey than expected, which can impact the quality and usefulness of the data collected. Addressing this challenge means making surveys more inviting, relevant, and meaningful so participants feel their feedback matters and their time is respected.
- Keep it short: Limit survey questions and answer options to avoid overwhelming participants and focus on what you truly need to know.
- Close the loop: Always communicate back to participants about what was learned from their responses and how their feedback led to real changes.
- Build trust: Involve stakeholders early in the process, show respect for cultural differences, and make surveys feel personal and safe for everyone.
Increasing the Response Rate of the Post-Audit Exit Survey

Issuing and reporting on a post-audit exit survey is a great way to receive feedback to improve IA's performance. However, even with significant effort invested in building and issuing the survey, audit customers may sometimes not complete it. When this occurs, the CAE should consider the following factors to diagnose and resolve the issue:

1. Survey Length
The survey might be too long.
Recommendation: Keep the survey to no more than 6–8 questions. Use closed-ended questions for most items and reserve free-text responses for the final question. Consider making the free-text portion optional.

2. Timing of the Request
The request for survey completion might be poorly timed. Some teams wait until after the final report is issued—sometimes weeks or months after fieldwork—to ask for feedback.
Recommendation: Introduce and reinforce the survey request early and often:
- The VP should mention the post-audit exit survey to C-suite or VP-level executives during pre-planning.
- The Director should raise the topic with VP- or Director-level audit customers at the end of the initial planning meeting.
- The Manager should emphasize the survey's importance during the fieldwork kickoff.
- The audit senior or supervisor should send the survey along with the draft audit report in preparation for the audit exit meeting.

3. Limited or No Follow-Up
The audit leadership team may have made the survey request only once.
Recommendation: Follow up multiple times. Audit customers may be juggling several projects, so a reminder can significantly boost response rates.

4. Relevance of Survey Content
The survey questions might focus solely on Internal Audit, which may not resonate with the audit customer. If the survey only asks about the audit team's performance (e.g., team knowledge or punctuality in deliverables), it might overlook the audit team's impact on the customer's operations.
Recommendation: Include questions that evaluate both the internal team's performance and the relevance of the audit team's output. This balanced approach makes the survey more engaging and pertinent to the audit customer.

5. The Audit Customer Is Annoyed or Dissatisfied with IA
If the audit ran too long or if the team's performance was subpar, the customer may want to move on from all audit-related matters.
Recommendation: Give the executive or team a couple of weeks of space. The CAE (not anyone else) should then follow up directly to obtain their feedback. And they should be prepared to commit to following up individually with audit customers once improvements are implemented, to highlight that their time was well spent providing feedback and that the team took action on it.
-
Recently, I was giving a talk on people analytics and employee listening when someone asked, "How do you overcome survey fatigue?"

Here's my hot take: survey fatigue is a myth. Think about it — if you approach someone with genuine curiosity and ask them about their experiences or opinions, how often do they say, "No thanks, I'm tired of sharing my thoughts"? Probably never. People love to be heard. What they don't love is wasting their time. Or as Morgan Williams put it: It's not survey fatigue, it's "inaction fatigue." Employees don't get tired of surveys; they get tired of giving feedback that goes nowhere.

So, if you're seeing low participation, ask yourself:
📊 Have we shared the results? (Yes, even when scores are low!)
➡️ Have we shown employees ways their feedback led to real change?

The best way to increase survey participation isn't to send fewer surveys — it's to *close the loop*. Share what you've learned. Take action. Make it clear that feedback actually matters.
-
I've said it before, but it's worth repeating. "Survey fatigue" isn't what you think it is. It's not about too many surveys, it's about too little action.

At YMCA WorkWell, I often hear: "Our employees have survey fatigue, I don't think this is the right time to collect their feedback". But here's the thing. Employees aren't tired of providing feedback, and they aren't tired of speaking up to try and make their work better. They're tired of nothing changing when they do. A survey isn't the problem; it's feeling like your voice isn't going to be heard. That's what makes another survey feel pointless and exhausting.

So if you want to do a survey right, start by asking:
✅ Have we closed the loop on the last one?
✅ Did we communicate what we learned and how we would respond?
✅ Have we made tangible changes based on the feedback?
✅ Have we communicated those changes and clearly tied them back to the feedback provided?
✅ Do we have a process in place to communicate back what we hear this time quickly and clearly?
✅ Are we really committed to acting decisively on what we hear?

If you're viewing a survey as just a round of data collection or a box to check, you're going to fall short. Instead, view it as an opportunity to signal to everyone in your organization that leadership is listening, learning, and responding.

Because if employees stop responding and start complaining about surveys, it's not because they're tired of a 5-minute survey twice a year; it's because they don't think their voice will matter. So if you really want to address survey fatigue, removing employees' opportunities to speak up is not the answer. It's acting on their feedback when they do.

#SurveyFatigue #EmployeeExperience #EmployeeSurveys
-
You hired external evaluators. Paid for rigorous methods. Got a 60-page report. Then you realised:
→ The findings don't resonate with stakeholders
→ The recommendations aren't feasible in the real context
→ The data missed what actually matters to the community

That's not rigor. That's expensive irrelevance.

And the signs were there from the start: Your survey had a 23% response rate. The focus groups sat in silence. The interviews felt like interrogations. You blamed "hard-to-reach populations." But the real problem? Your evaluation design signaled: "We don't understand you. We don't trust you. Your perspective doesn't matter." The community responded exactly as expected.

Here's how to prevent this, drawn directly from best practice in culturally competent evaluation:

✅ Involve the right stakeholders in Step 1, not Step 6. Map the community, involve gatekeepers early, and let them help shape the evaluation questions, not just react to the findings.

✅ Check your own cultural assumptions before designing tools. Most "validated" instruments are validated for dominant groups. Pilot-test tools with community members, translate them properly, and remove questions that feel intrusive, irrelevant, or disrespectful.

✅ Match methods to the community, not just the donor. Some groups prefer storytelling to surveys. Some trust talking circles more than focus groups. Some won't speak openly unless the facilitator mirrors their language, identity, or gender.

✅ Build trust before collecting data. Silence is often a signal of fear, not apathy. Spend time with community members, use local intermediaries, and make the process feel safe and human, not extractive.

✅ Co-interpret the findings. Don't analyse alone and present conclusions as if they're neutral truths. Bring participants into sense-making sessions. They'll tell you what you got right and what you misread.

When culture is ignored, rigor collapses. When culture is central, rigor improves and your evaluation finally becomes useful.

------------
👉 If you want more practical guidance like this on culturally responsive evaluation, follow me on LinkedIn. #CulturallyResponsiveEvaluation
-
Writing your own survey? Stop making these survey mistakes…

I've reviewed dozens of surveys from brands and consultants who are taking a DIY approach to survey-based research. While I love seeing more companies using data and original insights in their content, there are some common pitfalls with surveys that can undermine your efforts. Here are the biggest mistakes I see—and how to avoid them:

1️⃣ Too many open-ended questions
While open-ended questions can be valuable, overusing them can overwhelm respondents and make it harder to extract actionable insights. Many of these could easily be reworked as multi-select options, which are quicker to answer and easier to analyze.

2️⃣ Not tailoring questions to respondents
Failing to properly segment your audience or filter questions (e.g., asking irrelevant questions to people outside a specific group) frustrates respondents and skews your data. Make sure your survey flows logically and adapts based on responses.

3️⃣ Using jargon or acronyms
Don't assume your audience speaks the same language as your internal team. Spell out acronyms and avoid industry jargon—it ensures clarity and a better response rate.

4️⃣ Combining ideas in one question or response option
Questions or responses like "Do you think A and B?" are problematic because a respondent might agree with one but not the other. Keep questions and responses focused on one idea at a time to get accurate answers.

5️⃣ Making surveys too long
Long surveys lead to drop-offs or rushed responses. Respect your respondents' time—focus on what you really need to know and keep it concise.

6️⃣ No narrative structure—just a dump of internal questions
One of the most common mistakes I see is surveys that lack a clear story arc. Instead of building around a strong theme or hypothesis, it's just a long list of random questions from different stakeholders. The result? Disconnected data that's hard to turn into compelling content.

When designing your survey, think about the story you want to tell. Build your questions to support that narrative.

Key takeaway: Thoughtful design makes a huge difference in the quality of your insights—and ultimately, the impact of your content.

Have you seen any survey mistakes that drive you nuts? Or tips for improving them? #SurveyTips #OriginalResearch #ContentStrategy

Hi, I'm Becky. 👋 My clients have garnered 80+ media mentions, 2-3X the leads, and over 250K in free advertising from branded research 💰 Interested in branded original research to boost your marketing KPIs? DM me and we'll talk. 🙂
-
Survey fatigue is a myth. People aren't tired of being asked questions; they're tired of leaders not listening to the answers.

When companies blame "survey fatigue" for declining participation rates, what they're really admitting is "our employees don't believe we'll act on their feedback."

The data is clear: organizations that demonstrate consistent follow-through on feedback achieve much higher response rates (85-90%), while organizations with a track record of ignoring feedback see 30% or less. The problem isn't your frequency. It's a follow-through deficit.

At organizations with high survey participation, leaders do three things differently:
- They set expectations on how feedback will be acted on and why it's important
- They communicate specific actions taken based on feedback
- They close the loop with transparent timelines

When you follow through with meaningful action, employees trust their voice matters. When you don't, employees view surveys as pointless exercises.
-
My partners at OPINATOR believe customer feedback should be more than a form—it should be a brand experience.

In today's crowded digital landscape, customers are constantly bombarded with requests for feedback. Most surveys feel generic, cold, and disconnected from the brand. The result? Low response rates, incomplete data, and missed opportunities to strengthen customer relationships.

That's where OPINATOR is different. Graphic design, brand identity, and copywriting transform surveys into emotionally engaging, visually branded, high-performing digital experiences. Every touchpoint—even a survey—is an opportunity to build loyalty, trust, and a deeper connection. Customers don't respond to plain, transactional forms—they react to experiences that feel human and on-brand. Engagement skyrockets when a survey feels like a natural continuation of their journey. When it feels like a disconnected afterthought, it gets ignored.

OPINATOR clients routinely see response rates increase by 2–5x compared to traditional surveys. But beyond just more feedback, they get better feedback—richer insights, clearer emotional drivers, and more actionable data. Their platform is built around Emotional Feedback, which helps brands go beyond surface-level responses and understand how customers truly feel.

Here's how they bring that to life:

Branded Visual Design – Every survey is styled to match your exact brand guidelines—colors, logos, imagery, and typography—so it feels like part of your digital ecosystem.
Conversational Copywriting – Approachable, brand-aligned language that invites a real conversation, not a checklist.
Smart Placement – Embed surveys in key moments across the customer journey—post-purchase, after a chat, inside your app—so they're contextually relevant and easy to respond to.
Gamification and Interactivity – Sliders, emojis, avatars, and more make surveys fast, fun, and intuitive, especially on mobile.
Personalization – Dynamically adapt survey flows based on behavior or response history, making every question feel tailored.

OPINATOR isn't just a prettier survey tool—it's a strategic platform for turning feedback into higher-quality insights and strengthening brand perception. You unlock the ability to reduce churn by identifying emotional pain points, improve customer lifetime value through service design, and create experiences customers love—and talk about.

When brands win or lose based on customer experience, the quality of your feedback tools matters more than ever. OPINATOR transforms surveys from a cost center into a loyalty-building asset.

If you're using other platforms and suspect you're not getting the engagement, emotion, or impact you need, we'd love to show you what's possible with OPINATOR. Whether you're exploring alternatives or just curious, they'd be happy to give you a walkthrough and share how they've helped others transform their Voice of the Customer programs into brand-building powerhouses.
-
Save months of guesswork and copy my 3-step process for creating effective donor surveys:

1. Define clear objectives
• What do you want to learn?
• How will you use the information?
Example: Measure donor satisfaction, gather feedback on programs, understand communication preferences

2. Keep it short and mix question types
• 5-10 questions max
• Use a combination of:
 - Multiple choice (easy to answer)
 - Rating scales (quantifiable data)
 - Open-ended (rich insights)
Example:
- How satisfied are you with our communication frequency? (1-5 scale)
- Which program interests you most? (Multiple choice)
- How can we improve your giving experience? (Open-ended)

3. Always follow up with respondents personally
• Thank them for their time
• Share key findings and action steps
• Explain how their input will shape your work
Example: "Your feedback on our youth program led us to extend its hours. Thank you for helping us serve more children!"

I've increased response rates by 40% using this method.

Pro tips:
• Test your survey internally before sending
• Offer an incentive for completion (e.g., entry into a drawing)
• Send reminders, but don't overwhelm
• Consider the timing (avoid holiday seasons)

Remember: The goal isn't just to collect data, but to deepen relationships and improve your work. Save this post for your next donor survey!
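The short, mixed-type survey described above can be captured in a simple data sketch. This is a hypothetical illustration in plain Python: the question texts come from the post's examples, but the option list and the `validate_survey` helper are invented for the sketch, not any platform's real schema.

```python
# Hypothetical sketch of the 5-10 question, mixed-type donor survey
# recommended above. Each question records its type so you can check
# the survey follows the guidelines before sending it.

donor_survey = [
    {"type": "rating", "scale": (1, 5),
     "text": "How satisfied are you with our communication frequency?"},
    {"type": "multiple_choice",
     "text": "Which program interests you most?",
     # Options invented for illustration:
     "options": ["Youth programs", "Community outreach", "Education", "Other"]},
    {"type": "open_ended",
     "text": "How can we improve your giving experience?"},
]

def validate_survey(questions, max_questions=10):
    """Check the guidelines: at most 10 questions, with a mix of types."""
    types = {q["type"] for q in questions}
    return len(questions) <= max_questions and len(types) >= 2
```

Keeping the survey as data like this also makes the personal follow-up easier later: you can iterate over the same structure when summarizing key findings back to respondents.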
-
Survey response rates: the most common challenge I hear in demos. But honestly, it's not the customer's fault. It's yours. Let me explain.

Example 1: You want to survey people who took a test drive but didn't buy. You send them an email or SMS. Response rate? Zero. Of course, they're not your customers yet. Why would they bother?
Try this: Don't send a survey. Call them. Have a real conversation.

Example 2: You're a new bank. Your customers are retired professionals aged 60+. You send them a feedback email. Will they reply? Unlikely.
Try this: Use WhatsApp. Then call. (We've seen surprising response rates on WhatsApp in this segment.)

Example 3: You're an NBFC, and your customers are in Tier 3/Tier 4 cities. And you send… an email?
Try this: WhatsApp. Then call.

Example 4: You're an airline, and you send a survey 2 weeks after the flight. Do you think they even remember the experience?

If you want better response rates:
- Be in your customer's shoes
- Choose the right channel for your audience
- Ask at the right time
- Most importantly, don't let feedback sit in a dashboard. Act on it. And let the customer know.

That's how you earn feedback. Not with reminders, but with respect.

#VoiceOfCustomer #ResponseRates #CustomerFeedback #CustomerExperience
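The channel-matching advice in Examples 1–3 amounts to a per-segment lookup. Here is a toy sketch of that idea in Python; the segment names, channel orderings, and the email-first default are all invented for illustration and would come from your own customer data in practice.

```python
# Toy sketch of "choose the right channel for your audience".
# Each segment maps to an ordered list of outreach channels to try.

CHANNEL_PLAYBOOK = {
    # Example 1: not yet a customer, so skip the survey and call
    "test_drive_no_purchase": ["phone_call"],
    # Example 2: retired professionals aged 60+
    "retired_60_plus": ["whatsapp", "phone_call"],
    # Example 3: NBFC customers in Tier 3/Tier 4 cities
    "tier3_tier4_city": ["whatsapp", "phone_call"],
}

def outreach_channels(segment):
    """Return preferred channels for a segment; fall back to email, then call."""
    return CHANNEL_PLAYBOOK.get(segment, ["email", "phone_call"])
```

The design choice worth noting: the fallback is explicit, so an unmapped segment gets a deliberate default rather than silently receiving the wrong channel.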