Balancing scale and quality in a global research marketplace

Finding the right balance between growth and high standards is an ongoing negotiation that requires curiosity, humility and industry collaboration, says Katy Mallios.

For as long as market research has existed, the industry has wrestled with a tension: the drive to scale versus the need for quality. The tighter we control inputs to safeguard quality, the more we limit growth.

Today, that tension is sharper than ever. The globalisation of sample supply, automation of data collection and proliferation of marketplaces have made it easier to access respondents, but harder to ensure that what we collect truly reflects human insight. The challenge is not simply balancing scale and quality, but redefining what “responsible growth” looks like in a research economy built on volume.

Growth requires risk, but not recklessness

Every supplier, panel, or exchange faces a version of the same dilemma: growth demands experimentation. New recruitment sources, incentive models, and regional expansions are essential for reaching underrepresented audiences and meeting demand at speed.

When a supplier tests a new traffic source or onboarding mechanism, they are essentially making a bet, one that could unlock new, diverse respondents or introduce fraudulent behaviour.

In a healthy ecosystem, scaling supply doesn’t mean lowering standards. It means empowering suppliers to take calculated risks: expanding thoughtfully, monitoring quality in real time, and acting quickly when something goes wrong. That balance, between experimentation and control, is what separates sustainable growth from short-term gain.

Quality as shared accountability

Quality has too often been treated as a one-sided responsibility, owned by whoever purchases or fields the research. But in a connected marketplace, quality must be distributed across the chain: buyers, suppliers, technologists and researchers all play a role.

Fraud and data contamination aren’t isolated to bad actors; they emerge when oversight weakens or when incentives reward speed over signal integrity. A shared-accountability model acknowledges that quality management is an ecosystem function, not a single checkpoint. That collaboration can take many forms: shared dashboards, open feedback loops, security protocols that detect suspicious patterns, and early warning systems for anomalies. What matters is not the specific tool but the cultural shift it represents when quality becomes everyone’s concern, not just a back-end audit step.

Transparency as currency

Trust between buyers and suppliers is built on transparency: knowing where data originates, how respondents are treated, and what processes govern inclusion and exclusion.

Buyers often expect visibility into source composition or reconciliation rates, but suppliers rarely see equal visibility into study design or respondent experience. Both perspectives matter. A respondent’s willingness to participate in a survey is shaped not only by recruitment quality but also by what happens once they enter the survey environment.

Transparency, in this sense, is a two-way street. Suppliers deserve to know that surveys are fair, engaging and not overly repetitive. Buyers deserve to know that respondents are genuine, informed and have given consent.



Rethinking respondent experience

If the industry wants to preserve data integrity at scale, it must start with the respondent experience. Too often, participants are asked to answer the same demographic questions multiple times across systems. They face long surveys, redundant logic, or poorly optimised mobile interfaces, all of which erode attention and invite shortcuts.

Reducing that friction is not a small cosmetic fix; it’s foundational to quality. The next frontier of improvement may not be more verification but simplification. Imagine an industry where researchers design with empathy for time and cognitive load, and where validation data can flow securely across systems, eliminating repetitive tasks. That’s not an efficiency play; it’s a quality one.

Technology can’t replace judgement

Machine learning and predictive scoring now underpin many quality-assurance systems, flagging anomalies and blocking low-integrity respondents before they enter surveys. These tools are critical, but not infallible.

Algorithms are only as good as their inputs, and even the best models can produce false positives, excluding legitimate respondents and biasing datasets. The future of data quality will depend on combining automation with human oversight, using machines for speed and scale, and people for context and correction. Balancing those forces is how the industry can evolve toward proactive, not punitive, quality management.

Toward a healthier marketplace

The push for growth isn’t going away. Demand for faster insights, broader representation, and continuous data will only accelerate, especially as the industry enters the synthetic-data era. The question is whether the infrastructure of research can evolve with equal sophistication.

A more resilient marketplace will rest on a few core principles:

  •  Shared responsibility for data integrity across buyers and suppliers
  •  Transparency that flows in both directions, illuminating processes and expectations
  •  Empathy for the respondent journey, reducing friction and redundancy
  •  Balanced automation, where people and machine intelligence complement each other

Balancing scale and quality isn’t a zero-sum trade. It’s an ongoing negotiation, one that requires curiosity, humility and collaboration across the industry. Rather than being about avoiding risk, responsible growth is about managing it wisely, with transparency as the connective tissue.

Katy Mallios is senior vice-president of supply and data partnerships at Cint

Research Live is published by MRS.
