The rapid growth of global online fieldwork has provided a range of opportunities to researchers. Online fieldwork enables fast data collection, opens up access to hard-to-reach respondents and provides a cost-effective fieldwork solution.
However, this method of data collection also raises a number of ethical and technical issues.
This document serves as an official response to ESOMAR’s “37 questions to help research buyers of online samples” and provides a detailed insight into the operational integrity and panel management practices employed by Panelbase. ESOMAR’s guidelines on conducting market and opinion research using the internet are designed to provide advice on these issues.
In order to ensure a credible and robust research panel, Panelbase has embraced these guidelines alongside strong data management principles, sound business ethics, and an overall integrity that underpins the development of its research community.
What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research?
Panelbase is a division of Dipsticks Research Limited – a full-service research agency that has been operating since 1997. Since 2004, the company has provided online research services and conducted fieldwork in the UK, Europe and further afield. The Panelbase research community has evolved since 2004 and provided sample for thousands of online and offline projects (CATI, mobile surveys and focus groups).
Prior to 2008, Panelbase was used primarily as an internal resource supplying sample to each research division within Dipsticks Research Limited. Since 2008, and due to the rapid growth of Panelbase, we have provided sample to external clients, including research agencies, other panels, PR companies and end clients seeking highly targeted and responsive UK sample.
Panelbase currently supplies sample for market research purposes only, and has no plans to change this position.
Do you have staff with responsibility for developing and monitoring the performance of the sampling algorithms and related automated functions who also have knowledge and experience in this area? What sort of training in sampling techniques do you provide to your frontline staff?
Panelbase does not use sampling algorithms; all sampling is done manually by a team of project managers. When new members of staff join the sample management team, a detailed, documented three-month training schedule ensures that each new member is familiar with our practices.
In addition to the initial training programme, staff undergo frequent re-training and refresher courses to ensure that best practices are maintained.
What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services?
Due to the connection between Panelbase and Dipsticks Research Limited, Panelbase is able to offer a range of analysis services. In addition to providing sample to existing projects, Panelbase offers a field-and-tab service as well as advanced data analysis using statistical techniques such as cluster analysis and segmentation, regression analysis and multi-factor analysis, together with full analysis of the results and a final report produced and presented.
From what sources of online sample do you derive participants?
Panelbase sources respondents mainly from its actively managed panel. To ensure that we can provide a full solution when a project's requirements fall beyond the scope of Panelbase's capabilities, we occasionally source sample from trusted partners. We also conduct many projects using client-supplied data lists, for projects such as employee surveys or customer satisfaction surveys.
Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a buyer?
The size and responsiveness of the Panelbase membership means that we are almost self-sufficient when providing sample for online projects in the UK. The members of the Panelbase database are proprietary and exclusively owned by us. In most cases we do not require sample contributions from external partners, which simplifies the research process and aids overall data integrity. On occasion, we call upon external partners to provide sample for international projects or extremely low-incidence UK-based projects where the reach of Panelbase is insufficient to support a project's requirements. We have vetted many external partners over the years and have established solid relationships with carefully selected partners who are available to support our requirements using their own proprietary panels.
What recruitment channels are you using for each of the sources you have described? Is the recruitment process ‘open to all’ or by invitation only? Are you using probabilistic methods? Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography?
Panelbase is built around two primary recruitment methods: a network of affiliate partners, and a referral programme through which current members can refer friends and family. The particular make-up of the affiliate network changes regularly to minimise any bias that can naturally arise from any single recruitment source. The recruitment process is ‘open to all’ and we do not use probabilistic methods. As Panelbase is a UK-only panel, these channels apply to a single geographic region.
What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are?
All new members are subject to stringent quality checks, both automated and manual. The automated checks cross-reference metrics such as IP addresses, digital fingerprints and the details provided against the current database and known bad actors. In addition, manual checks compare the details provided against data lists, voter registries and similar sources, to ensure that they match a real person across a variety of independent records.
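The automated side of these checks can be pictured as a simple cross-reference at registration time. The sketch below is purely illustrative: the field names (`ip_address`, `fingerprint`) and the blocklist structure are assumptions, not Panelbase's actual schema.

```python
# Illustrative sketch of an automated registration check.
# Field names and the blocklist are hypothetical, not a real schema.
def is_suspect_registration(applicant, existing_members, blocked_fingerprints):
    """Flag an applicant whose IP or device fingerprint matches an
    existing account or a known bad actor."""
    for member in existing_members:
        if applicant["ip_address"] == member["ip_address"]:
            return True  # possible duplicate account from the same address
        if applicant["fingerprint"] == member["fingerprint"]:
            return True  # same device already registered
    # finally, check the applicant's device against known bad actors
    return applicant["fingerprint"] in blocked_fingerprints

members = [{"ip_address": "203.0.113.7", "fingerprint": "abc123"}]
print(is_suspect_registration(
    {"ip_address": "203.0.113.7", "fingerprint": "zzz999"},
    members, set()))  # → True (IP already seen)
```

In practice such flags would feed the manual review described above rather than reject an applicant outright.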
What brand (domain) and/or app are you using with proprietary sources?
Panelbase is accessed via our member website, www.panelbase.net, and email invites.
Which model(s) do you offer to deliver sample? Managed service, self-serve, or API integration?
Customer service is important to us, so we prefer to deliver our sample as a managed service only. This allows us to ensure that every survey is sampled with the care and attention it requires.
If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend). Do you let buyers control which sources of sample to include in their projects, and if so how? Do you have any integration mechanisms with third-party sources offered?
Sources are blended only where an external source is acquired to fill a specific shortfall or niche requirement. Where a study comprises multiple waves or is longitudinal by design, sample planning at the start of the project maps our partner contributions to ensure a consistent sample composition across all waves. We do not let buyers control their sources of sample; only sample providers who have been vetted through the Panelbase supplier onboarding system will be used. Once third-party sample is onboarded to a project, the respondents are automatically filtered through the same quality processes that our proprietary sample undergoes, including proprietary scripts to identify and filter potential overlap between sources.
Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, Is there sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop only questionnaires? Is it suitable to recruit for communities? For online focus groups?
Each of our members is able to opt in or out of specific research types, which allows us to ensure that only respondents who are interested in taking part in particular studies are invited. We allow clients to collect PII from respondents for the purposes of recall studies, and our management system can identify respondents who have taken part in past studies if an unforeseen recontact is required. Our members are accustomed to taking part in a variety of survey types and lengths, including recruitment to online and offline qualitative research such as communities, focus groups and depth interviews. Most of the surveys we run are device-agnostic; however, we can detect the device on which a respondent tries to enter a survey and either advise them if it is not suitable, or block them if the survey requires a particular device.
Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that “looks like” the target population? What demographic quota controls, if any, do you recommend?
Sample selection is driven by the profile requirements of each individual survey, taking into account additional factors such as available time for fieldwork and likely response rates. Feasibility and incidence are always established during the conception of a project in order to determine the most effective means of achieving sample selection. Exclusions can be implemented based on survey subject matter, frequency of participation, or any other criteria, as required.
When selecting or strategically excluding sample, our systems automatically extract at random those members who meet all profiling requirements. This process is subjected to quality assurance checks in order to verify that the correct sampling requirements and expectations are met before allowing our systems to engage in the mass deployment of survey invitations.
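The "random extraction of members who meet all profiling requirements" step can be sketched as a filter followed by a uniform random draw. This is an assumption-laden illustration: the profile fields (`age`, `region`) and panel structure are invented for the example.

```python
import random

# Hypothetical sketch of random member extraction: filter on profile
# requirements, then draw invitees uniformly at random.
def draw_sample(members, predicate, n, seed=None):
    """Return up to n members, drawn at random from those matching the profile."""
    eligible = [m for m in members if predicate(m)]
    rng = random.Random(seed)  # seeding makes the draw reproducible for QA
    return rng.sample(eligible, min(n, len(eligible)))

# Example: invite 100 members aged 30-40 from a toy panel of 1,000.
panel = [{"id": i, "age": 20 + i % 50, "region": "North"} for i in range(1000)]
invitees = draw_sample(panel, lambda m: 30 <= m["age"] <= 40, n=100, seed=42)
print(len(invitees))  # → 100
```

A quality-assurance pass over `invitees` (checking counts against quotas) would then sit between this draw and the mass deployment of invitations, as described above.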
Survey invitations are typically deployed via email as well as being added dynamically to each invitee’s home page within the member website. We also have the capability to deploy SMS alerts to members who have provided a mobile number and triple opted in.
What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party?
All members are invited to complete 18 profile sections to tell us more about themselves and to assist with pre-selecting them for surveys that are relevant to their personal profile and interests. In total, each member can provide information on more than 800 fields, although this is not mandatory. All members are prompted to complete their profiles, or any sections that are more than 6 months out of date. This helps to ensure maximum accuracy at all times and assists with feasibility and sampling. Where no relevant profiling is held, we make use of our MiniPoll™ tool, which allows questions to be launched within a matter of seconds and can help to gauge incidence or act as a pre-screening mechanism for targeting niche sample profiles. We also regularly script bespoke pre-screening projects to accommodate niche or complex sample specifications which cannot easily be serviced using core profiling alone. We are able to append non-special-category, non-PII information to final data sets or embed it in the client survey URL.
What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates?
In order to provide a quote we require: the number of completes required, target specification and incidence rate, desired fieldwork timeline, survey quotas, length of survey, and any technical detail that may impact fieldwork. Where the incidence is unknown, we can use our MiniPoll™ system to gather this within 24 hours of the initial quote.
What do you do if the project proves impossible for you to complete in field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third party sources/sub-contractors?
In rare cases where Panelbase is unable to complete a project in field based on the original project specification, third-party suppliers are typically used to bolster the completes. The identity of third-party suppliers is not disclosed; however, if a client has a preference, or suppliers that they do not want to be used, this will be accommodated in the plan. As with third parties included in the original quote, the suppliers used are automatically filtered through the same quality processes that our proprietary sample undergoes, including proprietary scripts to identify and filter potential overlap between sources.
Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study? Please specify how this is done for each of the sources you offer.
We do not employ a survey router.
Do you set limits on the amount of time a participant can be in the router before they qualify for a survey?
Not applicable; Panelbase does not employ a survey router.
What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer?
Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice?
Our members are invited to each survey on an individual basis, depending on their profiled information and suitability. An invitation to the survey is emailed to the member individually; however, if a member would like to see all the surveys they have been invited to in one place, they can do so on the website, where each survey is listed alongside the information included in the email.
What ability do you have to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during the course of a survey? If so, can this be flagged at the participant level in the dataset?
We are able to change incentives offered in field and we are able to determine the incentive that a participant was shown in the final dataset. This is typically only used in cases where infield metrics differ greatly from the initial brief, or where feasibility needs to be boosted by increasing the incentive offered.
Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)?
Feedback is collected at the end of every survey, and from this we build a database of average scores based on a variety of factors such as length of survey, research type and subject matter.
Do you provide a debrief report about a project after it has completed? If yes, can you provide an example?
Typical feedback includes incidence rates, quota-full and screen-out volumes, and any anecdotal feedback from respondents, which we receive for most surveys. The report is not routinely produced for every project but is available upon request and is delivered by email.
How often can the same individual participate in a survey? How does this vary across your sample sources? What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this?
We do not enforce a hard limit on the volume of surveys within a given period; however, we are also conscious of not over-inviting our members. For longitudinal studies, we usually implement a lockout mechanism of 3 months so that a respondent cannot take part in multiple waves of a study within this period. Some projects require shorter or longer lockout periods, in which case these are implemented on a case-by-case basis. Similarly, lockouts can be applied based on topic or exposure to certain content within a certain time frame.
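A topic-based lockout of this kind reduces to a date-window check over a participation log. The sketch below assumes a hypothetical log of `(member_id, topic, date)` tuples; the 90-day default mirrors the 3-month window described above.

```python
from datetime import date, timedelta

# Illustrative lockout check over a hypothetical participation log.
def is_locked_out(member_id, topic, history, today, lockout_days=90):
    """True if the member took part in a survey on this topic
    within the lockout window ending today."""
    cutoff = today - timedelta(days=lockout_days)
    return any(mid == member_id and t == topic and d >= cutoff
               for mid, t, d in history)

history = [("m1", "banking", date(2024, 3, 1))]
print(is_locked_out("m1", "banking", history, date(2024, 4, 1)))  # → True
print(is_locked_out("m1", "banking", history, date(2024, 9, 1)))  # → False
```

Per-project overrides (shorter or longer windows) fall out naturally from the `lockout_days` parameter.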
What data do you maintain on individual participants such as recent participation history, date(s) of entry, source/channel, etc? Are you able to supply buyers with a project analysis of such individual level data? Are you able to append such data points to your participant records?
We store all survey participation information at an individual panel member level. Every email sent, click-through to survey, entry into survey, exit from survey, and source of panellist is recorded so that this can be used for quality assurance and reporting purposes.
If clients require reporting of this information in relation to their projects we are able to supply this on demand.
Please describe your procedures for confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router.
Every online survey that we host, whether or not we also provide the sample, is subjected to stringent data integrity processes. All survey data that falls outside of our data integrity requirements is eliminated from the survey results. Furthermore, where panellists are found to provide unusable data on multiple occasions, their accounts are automatically removed so that they are not invited to future surveys. The payment of rewards to such respondents is rejected and they are unable to redeem any rewards held in their account. We also engage multiple anti-fraud detection systems at the point of registration, at survey entry and in other areas to actively identify potential rogue respondents and to remove them immediately from the panel. Sense-checking profile data against survey responses is an additional measure that helps to filter out suspicious activity.
How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records?
All third-party sources that are used for additional respondents come from our vetted list of suppliers who have their own proprietary sample and follow the same ethos as ourselves. Once their members enter our system they undergo the same masking and quality control systems that our own members do although we do tag each respondent with their original source. This tag can be used to enforce strict quotas to ensure a specifically defined sample composition from each source. Typically we do not disclose the name of the sources to clients and the sources are not appended to final data, however clients can request specific sources not to be used.
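Using the origin tag to enforce a defined blend can be illustrated as a per-source quota gate. The tag values and quota numbers below are hypothetical, not an actual project configuration.

```python
from collections import Counter

# Sketch of enforcing a per-source quota using an origin tag.
# Source names and quota figures are illustrative assumptions.
def admit(respondent, quotas, counts):
    """Admit a respondent only while their source is under quota."""
    src = respondent["source_tag"]
    if counts[src] >= quotas.get(src, 0):
        return False  # quota full (or source not in the plan)
    counts[src] += 1
    return True

quotas = {"panelbase": 800, "partner_a": 200}
counts = Counter()
print(admit({"source_tag": "panelbase"}, quotas, counts))  # → True
print(admit({"source_tag": "partner_b"}, quotas, counts))  # → False (unplanned source)
```

Because the default quota for an unlisted tag is zero, any respondent from an unvetted source is rejected by construction.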
Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses?
We have proprietary scripts which monitor the quality of the responses that each of our members provides. The scripts analyse the consistency of responses on internally hosted surveys, both against answers given in previous surveys and against profile data, as well as instances where a member has been excluded for data quality reasons on externally hosted surveys. Once a respondent has shown repeated quality issues, they are removed from our panel.
For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviours, such as (a) random responding, (b) Illogical or inconsistent responding, (c) overuse of item nonresponse (e.g., “Don’t Know”) (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too rapid survey completion?
All data checks are performed by our data team at the end of fieldwork using a mixture of manual and semi-automated checks. An automated script calculates the probability of bad data based on responses to grid or rotating-statement questions, the length of time taken to complete the survey compared to the average, and inconsistent responses both within the survey and against answers given in previous surveys and profile data. Highlighted responses are scrutinised manually, alongside other checks that are more difficult to automate, such as illogical responses. The results are fed back into the wider quality control system, which may then flag a respondent for removal from the panel if a pattern of bad responses emerges.
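Two of the automated checks described above — unusually fast completion and identical answers across a grid — can be sketched as simple flags. The thresholds and field names here are illustrative assumptions, not Panelbase's actual scoring model.

```python
from statistics import median

# Illustrative data-quality flags: speeding relative to the median
# completion time, and straightlining on grid questions.
def quality_flags(response, all_durations, speed_ratio=0.4):
    flags = []
    # flag respondents far faster than the typical completion time
    if response["duration_secs"] < speed_ratio * median(all_durations):
        flags.append("speeder")
    # flag identical answers across every row of a grid question
    grid = response["grid_answers"]
    if len(grid) > 3 and len(set(grid)) == 1:
        flags.append("straightliner")
    return flags

durations = [300, 320, 280, 310, 90]
resp = {"duration_secs": 90, "grid_answers": [3, 3, 3, 3, 3]}
print(quality_flags(resp, durations))  # → ['speeder', 'straightliner']
```

In the process described above, flagged responses go to manual review rather than being dropped automatically.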
How do you comply with key data protection laws and regulations that apply in the various jurisdictions in which you operate? How do you address requirements regarding consent or other legal bases for the processing personal data? How do you address requirements for data breach response, cross-border transfer, and data retention? Have you appointed a data protection officer?
Panelbase employs an external Data Protection Officer who informs and advises on data protection laws in different markets and helps us ensure that we comply with all of them. Monthly meetings are held between the DPO and members of senior staff to keep everything up to date. Any member can email either the Panelbase support team or the DPO to exercise their right to be forgotten, submit a subject access request, or have their responses removed from a particular survey. All such requests are acted upon within the statutory timeframes.
How can participants provide, manage and revise consent for the processing of their personal data? What support channels do you provide for participants?
Members are able to manage all of the data that they provide to us through their profile section on our website. In addition to this they are able to get in touch with our member support team via email for any assistance.
How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants?
To ensure that we always comply with the latest laws and regulations we are members of the MRS and ESOMAR and keep up to date with any publications or recommendations that they make.
What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations?
We have successfully conducted hundreds of online and offline projects with children aged between 6 and 15, drawing from our sub-panel of over 50,000 children under the age of 16. In the case of offline projects, we only engage DBS-checked personnel on such projects and parental supervision of the research process is mandatory.
Online surveys with children are also subject to parental consent, and at no point do we communicate with children directly. Surveys for children are designed in accordance with ICC/ESOMAR and MRS guidelines. In the same way that we protect the identity of our panel members and the responses they provide, all data provided by children is handled with the same levels of care and integrity.
Do you implement “data protection by design” (sometimes referred to as “privacy by design”) in your systems and processes? If so, please describe how.
All of Panelbase’s systems implement privacy by design and are compliant with the latest GDPR and data privacy and protection regulations. Employees within Panelbase only have access to the information required to carry out their daily duties, with member PII and sensitive information restricted.
All members are anonymised using an ID, which is then encrypted and masked for each survey that the member takes part in, both internally and externally, to prevent any profile being built up from survey data or from the surveys that a member has taken part in.
When a project is internally hosted, members of the data team who work with the live survey data only have access to the encrypted survey-specific ID and cannot link it to the member’s ID, ensuring that they are unable to tie any survey responses to a specific member, or to responses given in any other survey.
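One common way to derive an unlinkable, survey-specific ID from a member ID is a keyed hash with a per-survey key. This is a generic sketch of that technique, not Panelbase's actual implementation; the key handling shown is deliberately simplified.

```python
import hashlib
import hmac

# Illustrative per-survey pseudonymisation: a survey-specific key plus an
# HMAC means IDs from different surveys cannot be linked to each other
# (or back to the member ID) without the keys.
def survey_alias(member_id: str, survey_key: bytes) -> str:
    """Derive a survey-specific pseudonym for a member."""
    digest = hmac.new(survey_key, member_id.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

a = survey_alias("member-0001", b"key-for-survey-A")
b = survey_alias("member-0001", b"key-for-survey-B")
print(a != b)  # → True: the same member gets unlinkable IDs per survey
```

The mapping is deterministic within one survey (the same member always gets the same alias, so duplicates can still be detected), while remaining unlinkable across surveys.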
What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process?
Panelbase is ISO 27001:2013 certified and follows both the digital and physical security recommendations laid out in the certification.
All data and project materials provided by Panelbase members and clients are stored on secure servers to which only authorised personnel have access, and only for the purpose of administering Panelbase member accounts and surveys. All data submitted by members via the Panelbase website is transmitted using Extended Validation SSL technology, which encrypts the contents of the browser session and ensures the integrity of the data transaction between their internet browser and our systems.
The premises within which Panelbase servers are located are secured from public or unauthorised access, both physically and electronically, using the latest technologies and security systems, including but not limited to: firewalls, data encryption, IP-based permissions, CCTV, and swipe-entry access control. Data back-ups are subject to the same levels of physical security and authorised access.
Do you certify to or comply with a quality framework such as ISO 20252?
Our company is ISO 9001:2015 certified and has been every year since 2002. In addition, we have held ISO 27001:2013 certification for Information Security Management since 2021. We are also working towards the ISO 20252:2006 Market Research Quality Standard.
Our ISO-approved quality management systems are built on the principles of effective data storage, security and management. These systems are subject to continual review and change, and therefore maintain compliance with external auditing requirements at all times.
Which of the following are you able to provide to buyers, in aggregate and by country and source?
Information on this is available in our Panelbook: https://drg.global/divisions/panelbase/panels/adults/
Any questions regarding the content of this document should be addressed to:
Paul Wealleans (Managing Director)
Hexham Business Park
Alternatively, contact can be made using the following methods:
Telephone: 01434 611164
Email: [email protected]
“We have used Panelbase for many years and find their panel to be very well maintained, resulting in higher quality responses. The team are great to work with and always easy to contact. They always take great care with each and every project, resulting in high quality data.”
Taylor McKenzie | Research Director