The 37 questions created by ESOMAR are intended to give buyers a standard set of questions they can ask to determine whether a sample provider’s processes and samples align with their research aims.

Please find our answers to the questions below.

Our Work

Company Profile

Q1 What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research?

Internet Research Bureau was established in May 2011 with the goal of collecting, researching, and disseminating information for B2B companies. The company has more than 12 years of experience serving market research agencies. Its management team and senior executives are industry veterans, each with 15+ years in the MR industry. We do not provide direct marketing services; our work is limited to market research-related services.

We have staff responsible for developing and monitoring the performance of the sampling algorithms used to manage our B2B data collection projects. Their training in sampling techniques equips them to deliver B2B projects with detailed target and quota specifications. Each project manager receives this training over an initial three-month period; it is thereafter ongoing, with regular guideline updates as required, to maintain best practice.


Our methodologies include stratified and targeted sampling for optimal delivery per project. More detail on this can be provided if needed.


As far as possible, delivery and collection controls are system-controlled and automated, providing consistent sampling quality. Our QA team works with our software engineers to refine the tool and to monitor and enhance the automated sampling functions and algorithms.


Our sampling management tool, known as PASMT (Project and Survey Management Tool), has been developed internally for over a decade and is fully owned and operated exclusively by IRB. It is designed to reduce duplication and increase data quality.

Q3 What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services?

In addition to online sampling, IRB also supports clients with survey programming, hosting, translation services, and data processing. IRB recruits for IDIs, FGDs, and CATI surveys. We also provide data analysis services on a case-by-case basis.


In addition, IRB subcontracts further research services where required including quality moderation and analysis.

Sample Sources and Recruitment

Q4 Using the broad classifications above, from what sources of online sample do you derive participants?

Internet Research Bureau owns and maintains an online survey panel website named ‘Opinion Bureau’. We source users from our panel, Opinion Bureau, and other panel companies as and when required. We recruit users for Opinion Bureau from various sources such as search engines, affiliate marketing, referral programs, banner advertising, social media, etc. We have approximately 2.4M registered users in Opinion Bureau who participate in our surveys depending on their profile and the topic of the survey.

Q5 Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a buyer? (Assume proprietary to mean that the sample provider owns the asset. Assume exclusive to mean that the sample provider has an exclusive agreement to manage/provide access to sample originally collected by another entity.)

Internet Research Bureau provides about 50% of the samples from our proprietary panel, Opinion Bureau – a panel community that we manage ourselves.

Q6 What recruitment channels are you using for each of the sources you have described? Is the recruitment process ‘open to all’ or by invitation only? Are you using probabilistic methods? Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography?

We recruit users for ‘Opinion Bureau’ from various sources such as search engines, affiliate marketing, referral programs, banner advertising, social media, etc. There is a good mix, with most coming from affiliate marketing. Recruitment is open to everyone within the given market aged 15+ or 16+, depending on the market. The usage of these channels is similar across all markets.

Q7 What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are? Describe this both in terms of the practical steps you take within your own organisation and the technologies you are using. Please try to be as specific and quantify as much as you can.

Panelists must pass our quality checks, which include digital fingerprinting, geolocation, proxy detection, and behavior analysis. We also carry out manual checks for each panelist. Registered members are checked periodically based on their in-survey behavior, their responses to our internal surveys, and QC feedback from clients; if they are found to be fraudulent, their membership rights are revoked. We conduct quality-check surveys once a month to ensure that we do not send surveys to members who respond inappropriately.
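The duplicate-detection part of these checks can be sketched as follows. This is a minimal illustration, not IRB's actual implementation: the signal fields combined into the fingerprint, and the hashing approach, are assumptions for the example.

```python
import hashlib

def device_fingerprint(user_agent, screen, timezone, ip):
    """Combine browser/device signals into a stable fingerprint hash.

    The choice of signals here is illustrative; a production system would
    use many more, plus geolocation and proxy checks.
    """
    raw = "|".join([user_agent, screen, timezone, ip])
    return hashlib.sha256(raw.encode()).hexdigest()

def is_duplicate(fingerprint, seen):
    """Flag a registration whose fingerprint was already recorded."""
    if fingerprint in seen:
        return True
    seen.add(fingerprint)
    return False
```

A second registration attempt from the same device would then be flagged for manual review rather than admitted automatically.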

Q8 What brand (domain) and/or app are you using with proprietary sources? Summarise, by source, the proportion of sample accessing surveys by mobile app, email or other specified means.



Opinion Bureau members can access our surveys by three means: Email (40-50%), Portal (50-60%), and App (less than 1%).

Q9 Which model(s) do you offer to deliver sample? Managed service, self-serve, or API integration?

Managed Services

Q10 If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend). Do you let buyers control which sources of sample to include in their projects, and if so how? Do you have any integration mechanisms with third-party sources offered?

We don’t provide direct access to clients, as we operate a managed service. We control the sources at our end; if any source needs to be restricted, the client must inform us in advance. We do not provide API services to clients, so control remains with us and is managed manually on a case-by-case basis.

Q11 Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, Is there sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop only questionnaires? Is it suitable to recruit for communities? For online focus groups?

We undertake detailed profiling of our samples and keep it up to date. Based on the research criteria we pull the sample, and we indicate the length of the interview and a compatible device to the respondent, so they are prepared for a shorter or longer questionnaire and will participate from the specified device only. Because we own our sample and hold comprehensive records of panelist survey history, we can pull samples appropriate to the type of research, device type, or questionnaire length. As the question indicates, this helps maximize quality, for example for very long questionnaires, or minimize dropout from, say, focus groups.


Our panel is primarily suitable for:

  • B2B, B2C, and patient surveys
  • All devices: mobile, tablet, and desktop
  • Video surveys
  • IDI recruitment
  • Short- to medium-length online surveys with limited open-end questions

However, our panel is also available for panel building, community recruitment, focus groups, and
other long-term engagement activities.

Sampling and Project Management

Q12 Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that “looks like” the target population? What demographic quota controls, if any, do you recommend?

Panelists are selected by our survey management application (PASMT) based on their demographic information such as age, gender, income, education, employment status, location, and profile information as defined in the questionnaire/screener provided by our client. Once the panelists are shortlisted, they receive an invitation from our system. Surveys matching the panelist’s profile are also visible in their portal which they access through their account. Standard email templates are used in all markets in local languages.


We do on occasion recommend quota controls where these are lacking in the client’s targeting and quota instructions but may be appropriate to the research requirement. For example, if the client instructs us simply to sample ‘gen pop’ for a consumer project, or ‘executives’ for a B2B project, we will discuss with them to ensure that the dataset will support the required analysis with demographic and/or firmographic targeting and/or quotas.

Q13 What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party?

IRB does not source profile information from any third party. Once a panelist is recruited to our panel, we collect their demographics and other profiling information in different steps.


We collect the following information when members register to our panel: First Name, Last Name, Email ID, Date of Birth, Gender, Household Income, Highest Level of Education, Employment Status, State/Province, Physical Address, Zip Code/ Postcode, Phone Number (Optional).


We ask our members to keep their information updated and encourage them to do so by directing them to their profiling page after each completed survey. We also conduct continuous, extensive profiling of our panel members: in addition to the information respondents provide at the time of registration, other profile information is collected after joining, based on the following categories: Personal and Family, Education, Occupation, Health and Fitness, Finance, Leisure and Activity, Food and Beverage, Household Appliances, Automotive, Travel, Media, Games.


We also have a system to use pre-screening in the surveys to continuously update profile information.


We hold basic demographic, firmographic, and health-condition information on more than 80% of our panelists.


We can append answers for demographic or firmographic data points, generally supplying these free of charge for up to 2-3 data points.

Q14 What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates?

We need the following information to provide an estimate of feasibility:

  • Geography
  • TG Definition
  • Sample description
  • Project description
  • Data collection methodology
  • Any de-dupe criteria from past surveys
  • PII collection if there is any
  • Device compatibility
  • Screeners/ Questionnaire
  • Quota details
  • Incidence rates
  • Length of interview
  • Field work timeline
  • Sample distribution across fieldwork time
Q15 What do you do if the project proves impossible for you to complete in field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third party sources/sub-contractors?

If we are unable to complete a project, we would take all necessary steps to ensure the client understands the field difficulties and knows the best way to complete the fieldwork. The steps include:

  1. Field work extension
  2. Quota relaxation
  3. Screener relaxations
  4. Price increase for better incentivization
  5. Inclusion of all possible third parties/sub-contractors. We don’t necessarily inform the sample buyer about individual sub-contractors unless they have specified that we should do so.

Q16 Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study? Please specify how this is done for each of the sources you offer.

Our survey router randomly assigns a survey to users who land on the router page, irrespective of their source. A survey is assigned if the user’s demographics and profile information match its screening criteria. A user can choose to enter a routed survey or decline, and can route to as many surveys as are available. We can also control and limit routing at the survey level as well as the source level.
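The allocation logic described above can be sketched as follows. This is a simplified illustration under stated assumptions: the profile fields, criteria representation, and survey structure are hypothetical, not the actual router's data model.

```python
import random

def matches(profile, criteria):
    """A survey matches when every screening criterion is satisfied.

    `criteria` maps a profile field to the set of allowed values,
    e.g. {"gender": {"F"}, "age_band": {"25-34", "35-44"}}.
    """
    return all(profile.get(k) in allowed for k, allowed in criteria.items())

def route(profile, surveys, rng=random):
    """Pick a random survey whose screening criteria match the profile."""
    eligible = [s for s in surveys if matches(profile, s["criteria"])]
    return rng.choice(eligible) if eligible else None
```

Source-level limits would then be applied by filtering the `surveys` list before routing, mirroring the survey-level and source-level controls mentioned above.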

Q17 Do you set limits on the amount of time a participant can be in the router before they qualify for a survey?

There is no limit on participation through the router but there are time limits on qualifying for a survey. The qualification limit varies by country and source.

Q18 What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer?

We give the following information to our proprietary panel participants: survey category/topic, length of interview, reward amount, survey instructions, survey link, unsubscribe link, help link, terms and conditions link, privacy link, and a welcome message with a declaration of voluntary participation, honesty, and integrity. If the user is coming from a third-party source, we show them the help link, terms and conditions link, privacy link, and the welcome message with the same declaration.

Q19 Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice?

Yes, we do allow users to choose from a list of available surveys if they log in to the portal or app. They choose surveys based on the topic, reward amount, and length of the interview. They can see this information and decide which survey to take.

Q20 What ability do you have to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during the course of a survey? If so, can this be flagged at the participant level in the dataset?

We start with the standard incentive and can increase or decrease it whenever needed; however, we ideally don’t change the incentive more than three times during fieldwork. Incentive changes are recorded at the panelist level, so we can flag who qualified at which incentive amount.

Q21 Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)?

No, we don’t measure participant satisfaction at the project level. But we provide continuous helpdesk support for our members and encourage panelists to send their feedback and queries, which helps us to improve the user experience and satisfaction levels of our members. We closely monitor their feedback and answer their queries to maintain high satisfaction levels. Some of our clients do measure parameters like the topic of the survey, length of the survey, ease of design, user experience and questionnaire understanding, relevancy of questions, etc.

Q22 Do you provide a debrief report about a project after it has completed? If yes, can you provide an example?

No, we don’t provide such a report unless asked by the client. Clients sometimes ask for a debrief report covering invitations sent, channel report, response rate, qualified, terminates, dropouts, quota-full, field report, conversion rate, incidence rates, rejection rate, median length of interview, etc. We can provide this information on request.

Data Quality and Validation

Q23 How often can the same individual participate in a survey? How does this vary across your sample sources? What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this?

A participant can’t take the same survey twice, irrespective of their source. Once a participant has completed a survey, they can’t enter it again because their member ID is recorded against that survey. If the user comes from a third-party source, we track their device ID, sub-ID, and cookies to ensure they don’t enter the same survey again.


We also keep track of the survey behavior of all our members, and as a quality control measure we don’t let a user participate in more than 30 surveys per month (one survey invitation a day). When we observe unusually high qualifying rates for a user, we scrutinize their responses and remove them if they are flagged as a multiple survey taker.
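The monthly participation cap described above can be sketched as a simple counter keyed by member and month. This is an illustrative sketch only; the class and method names are invented for the example.

```python
from collections import defaultdict

MONTHLY_CAP = 30  # one invitation a day, as described above

class ParticipationTracker:
    """Track completes per member per calendar month and enforce the cap."""

    def __init__(self):
        self.counts = defaultdict(int)

    def may_invite(self, member_id, month):
        """True while the member is under the monthly participation cap."""
        return self.counts[(member_id, month)] < MONTHLY_CAP

    def record(self, member_id, month):
        """Record one survey participation for the member in that month."""
        self.counts[(member_id, month)] += 1
```

Because the counter is keyed by `(member, month)`, the cap resets naturally at each new month.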

Q24 What data do you maintain on individual participants such as recent participation history, date(s) of entry, source/channel, etc.? Are you able to supply buyers with a project analysis of such individual-level data? Are you able to append such data points to your participant records?

Yes, we maintain all the information about recent participation history, date of entry, source, screener response if applied, IP-related information, device information, etc. of the survey participants and this can be shared with the client if requested.

Q25 Please describe your procedures for the confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router.

A panelist’s identity in a survey is verified by their panelist ID, a unique ID assigned when they sign up with our panel, which enables us to track them across all their survey activity. When a survey invitation is sent, the survey link is unique to that participant and contains the panelist ID in encrypted form. If anyone tries to manipulate the link or substitute another panelist, the link becomes invalid and blocks entry to the survey. Accounts are also protected by email ID and password, so to participate via the portal or mobile app, panelists must first log in securely with their email address and password.

Q26 How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records?

We mirror the sampling methodology across tracker waves to ensure a consistent sample blend and source mix. Our sampling tool can set a limit for each source at the survey level as well as the quota level, so we can replicate the sample distribution by source for tracking studies and set a good sample mix for ad-hoc surveys to ensure a decent blend. We keep track of sources, and source data can be appended on request.

Q27 Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses?

We have developed a quality score on a scale of 1 to 5 for each panelist who joins our panel. If a panelist’s score drops below 3.5, they are restricted from participating in B2B surveys; if it drops below 2.5, they are removed from the survey pool and blocked, with a notice to the panel member. The score is derived from parameters such as security terminates, client rejections, profile completion, screener responses, etc.


Also, if a panelist’s response is rejected in 3 consecutive surveys by our clients, they are immediately removed from the panel.
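The threshold logic above can be sketched as follows. The 3.5 and 2.5 cutoffs come from the description; the scoring formula and parameter weights are invented for illustration and are not IRB's actual model.

```python
def quality_score(security_terms, client_rejections,
                  profile_completion, total_surveys):
    """Score a panelist on a 1-5 scale (hypothetical weighting).

    Starts at 5 and deducts for quality incidents per survey taken;
    `profile_completion` is a 0.0-1.0 fraction rewarding complete profiles.
    """
    if total_surveys == 0:
        return 5.0
    penalty = 2.0 * (security_terms + client_rejections) / total_surveys
    return max(1.0, min(5.0, 5.0 - penalty - (1.0 - profile_completion)))

def status(score):
    """Apply the thresholds described above: <2.5 blocked, <3.5 no B2B."""
    if score < 2.5:
        return "blocked"
    if score < 3.5:
        return "no_b2b"
    return "active"
```

The three-consecutive-rejections rule mentioned above would be a separate, immediate removal check applied independently of this rolling score.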

Q28 For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviors, such as (a) random responding, (b) Illogical or inconsistent responding, (c) overuse of item nonresponse (e.g.,“ Don’t Know”) (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too rapid survey completion?

We check and clean the data before sharing the raw data with the client. Every record is checked individually and flagged on the following parameters:

  1. Timestamp
  2. Speeders
  3. Open-end questions
  4. Illogical or irrelevant response patterns
  5. Missing data
  6. Contradictory responses within surveys
  7. Outliers

Our project managers flag respondents on the above parameters and remove and replace those responses with new data before delivery to the client. Any questionable or suspicious response is removed manually before sharing.
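A few of the checks above (speeders, straight-lining, missing data) can be sketched as a per-record flagging pass. The thresholds here (40% of median LOI, 20% missing, 5-item grids) are hypothetical values chosen for the example, not IRB's actual cutoffs.

```python
def flag_record(duration, median_loi, grid_answers, missing_rate):
    """Return QC flags for one respondent record (illustrative thresholds)."""
    flags = []
    if duration < 0.4 * median_loi:               # speeder: well under median LOI
        flags.append("speeder")
    if len(grid_answers) >= 5 and len(set(grid_answers)) == 1:
        flags.append("straightliner")             # identical answers across a grid
    if missing_rate > 0.2:
        flags.append("missing_data")              # over 20% of items unanswered
    return flags
```

Records that accumulate flags would then go to a project manager for the manual review and replacement step described above.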

Policies and Compliance

Here is the link to the participant privacy policy: https://www.opinionbureau.com/privacy-policy

IRB’s privacy policy broadly addresses:

  1. Compliance to protect the privacy of survey participants in accordance with international and local regulations, including but not limited to the EU-U.S. and Swiss-U.S. Privacy Shield Frameworks, GDPR, and ISO
  2. Cookie policy at length
  3. Types of PII collected, along with their usage
  4. How data is processed, stored, retained, and shared
  5. Cross-border data transfer policy
  6. Rights of panelists and legal bases
  7. Access and control of PII by panel members

We have designed our data protection compliance program to meet the data protection and privacy laws and regulations. This is in accordance with international and local regulations including but not limited to EU-U.S. and Swiss-U.S. Privacy Shield Frameworks, GDPR, and ISO standards of privacy.


We ensure that the panelists agree to and accept our privacy policy to join the Opinion Bureau panel. Their consent is taken and logged whenever there is a need to share their data with any third-party clients. By giving their consent they understand how their data will be processed and used for research purposes. We keep the log of consent from each user as legal proof of their acceptance to participate in our panel and surveys.


On the technology front, we keep our panel members’ data on a secure server, which can only be accessed by authorized team members. We do not share any personally identifiable information relating to our panel members with our clients or any other third party without the panelists’ consent. Our websites and server are also secured with a DigiCert SSL certificate.

Panel members can log in to their accounts and manage consent through the settings and preferences section of the website, and they can withdraw their consent to our policies and terms. If they withdraw consent, our system stops contacting them for surveys. In addition to membership consent, consent is also taken at the survey level if any PII needs to be shared with the client: panelists are informed in advance that the client survey will require them to share their PII, and they can give their consent at their convenience after reading the client’s privacy policy. If any survey participant wants to update their consent for a client survey in which they recently participated, they can email us directly and we will have the information updated by our client.


Our primary owned source of survey participants is opinionbureau.com. We don’t provide any third-party database access to our clients. We also don’t provide direct access to our database to any of our clients, and we generally don’t share any PII of our panelists with our clients. If a client wants to collect PII such as email or phone number, we sign an NDA with the client, present a consent form to the panelist before they enter the client survey, and instruct them to give their consent in the client survey after reading the client’s privacy policy. Their consent is given on a single-survey basis and never applies to all surveys or all surveys with that client.

Q32 How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants?

We have outsourced our privacy policy and compliance audit to TrustE, which audits our declarations, processes, and communications every year to ensure that we comply with relevant laws and regulations.


In addition, our internal DPO monitors the privacy policy, data handling and sharing processes, and data retention and protection measures to ensure compliance. Processes, policy updates, and new developments are monitored in a timely manner.

Q33 What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations?

Children, as defined in law in each territory, are not eligible to join our panel or participate in our surveys. If we must contact minors, as defined by local or national law, we seek parental consent before letting them enter the survey.


We comply with national and regional regulations including GDPR and COPPA, and with ESOMAR, MRS, and other national and international MR association guidelines.

Q34 Do you implement “data protection by design” (sometimes referred to as “privacy by design”) in your systems and processes? If so, please describe how.

IRB implements data protection by design by:

  • Outlining technical and security measures
  • Defining organizational processes and protocols for data handling
  • Defining data handling and data sharing practices
  • Defining dos and don’ts for employees and anyone managing data in any form
  • Defining and limiting access to relevant people, through:
    • An internal approval system for access to data
    • IT systems, services, and contacts
    • Organizational and employment policies

Q35 What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process?

We have contracted TrustE to audit and guide us to ensure compliance with data protection and privacy policy. An annual audit covering privacy and data protection compliance is performed by TrustE. IRB’s managed hosting service providers and cloud service providers are ISO 27001 certified.

Q36 Do you certify to or comply with a quality framework such as ISO 20252?

No, however, we are applying for this in 2023.

Q37 Which of the following are you able to provide to buyers, in aggregate and by country and source?

  1. Average qualifying or completion rate, trended by month
  2. Percent of paid completes rejected per month/project, trended by month
  3. Percent of members/accounts removed/quarantined, trended by month
  4. Percent of paid completes from 0-3 months tenure, trended by month
  5. Percent of paid completes from smartphones, trended by month
  6. Percent of paid completes from owned/branded member relationships versus intercept participants, trended by month
  7. Average number of dispositions (survey attempts, screenouts, and completes) per member, trended by month (potentially by cohort)
  8. Average number of paid completes per member, trended by month (potentially by cohort)
  9. Active unique participants in the last 30 days
  10. Active unique 18-24 male participants in the last 30 days
  11. Maximum feasibility in a specific country with nat rep quotas, seven days in field, 100% incidence, 10-minute interview
  12. Percent of quotas that reached full quota at time of delivery, trended by month


We have metrics available for a variety of measures, including completions, conversion rates, and device usage to name a few. We can provide the required metrics upon request since these are dynamic in nature.


Here is the link to our panel book that describes our panel composition and other profiling information in detail –