By Debrah Harding, The Market Research Society (MRS)

At a time when the sector is struggling against a tide of fake data, fraudulent participants and bot technologies, all of which undermine research data quality and integrity, the need to protect and encourage real participants has never been more important. The success of research relies on the sector's ability to engage members of the public and encourage them to become participants. Without trust, the sector will struggle to attract participants and the future of research becomes very bleak indeed.

Whilst it is encouraging that the value market research brings to business is recognised in the UK, there is still more to be done to improve recognition of that value among individuals. Part of this will be about improving the participant experience when contributing to research, but other factors, such as assurances about data protection and data security, are equally important.

The UK research sector is renowned for its innovation, which is without doubt one of its key strengths.

However, the profession must also be cognisant of the impact that some of these new technologies may have on the perception of its activities. The GRBN Trust Survey results show that in the UK there are low levels of trust in data analytics, social media and AI, and all of these techniques are part of the researcher's toolkit. There is also evidence from other research undertaken in the UK that when these tools are used for purposes perceived as beneficial to the public, such as using AI for cancer detection, the public becomes more comfortable, with the caveat that the majority would like to see laws and regulations guiding the use of such technologies.

For the research sector, if AI and related technologies are used without consideration for ethics, their output can amplify and entrench human biases, which could result in harm to individuals, business and society, and could ultimately damage the research sector itself. Conversely, if AI and related technologies are positively managed by researchers, with ethical principles that protect and inform participants at their core, the technology's potential can be maximised.

In the UK, MRS has developed comprehensive guidance setting out how practitioners can act legally and ethically when using AI and related technologies. Within the MRS Code of Conduct there are 12 ethical principles which underpin all rules and requirements within the Code. In the new MRS guidance, Using AI and Related Technologies, these 12 Code principles have been applied to the use of AI and related technologies in all their forms.

In the UK this is the first step to keep abreast of this rapidly evolving policy area, particularly as legal and regulatory frameworks develop around the world.

The research sector needs to adopt AI and related technologies in a measured, legal, ethical and privacy-first way, so that it harnesses the benefits whilst mitigating the risks. As a sector, research relies upon the confidence of clients, users and participants in the value of what we deliver. To retain that confidence, we need to put participants first and safeguard the trust that sustains our activities.
