Procedural Bias

Another type of methodological bias is procedural bias, sometimes referred to as administration bias. This type of bias arises from the conditions of the study, including the setting and how instruments are administered across cultures (He, 2010). The interaction between the research participant and the interviewer is another source of procedural bias that can interfere with cultural comparisons.

Setting

Where a study is conducted can have a major influence on how the data is collected, analyzed, and later interpreted. Settings can be small (e.g., a home or community center) or large (e.g., countries or regions) and can influence how a survey is administered or how participants respond. In a large cross-cultural health study, Steels and colleagues (2014) found that the postal system in Vietnam was unreliable, which demanded a major, and unexpected, change in survey methodology. As a result of these challenges, the researchers were forced to recruit more participants from urban areas than from rural areas. Harzing and Reiche (2013) found that their online survey was blocked in China because of the internet censorship practices of the Chinese government, but with minor changes it was later made available for administration.


A courier is getting on his motorcycle. The materials being delivered are in a large container at the back of the motorcycle.
Problems with infrastructure and services may limit research participation. [Image by tomcatgeorge (Mail Delivery Taiwan) CC BY 2.0 https://commons.wikimedia.org/wiki/File:Taiwan_Mail_Delivery.jpg]

Instrument Administration

In addition to the setting, how the data is collected (e.g., paper-and-pencil versus online survey) may influence levels of socially desirable responding and response rates. Dwight and Feigelson (2000) completed a meta-analysis of computerized testing and socially desirable responding and found that impression management (one dimension of social desirability) was lower in online assessment. The impact was small, but it has broad implications for how results are interpreted and compared across cultural groups when testing occurs online.


A male student is sitting in front of two computers. There are books and papers in front of the student.
How you take a survey or test may influence how you respond to certain types of questions. [Image by Mr Stein (Open Cheat) CC-NC-SA 2.0 https://www.flickr.com/photos/5tein/2348649408]

Harzing and Reiche (2013) found that paper-and-pencil surveys were overwhelmingly preferred by their participants, a sample of international human resource managers, and had much higher response rates than the online survey. It is important to note that online survey response rates were likely higher in Japan and Korea largely because of difficulties in photocopying and mailing paper versions of the survey.

Interviewer and Interviewee Issues

The interviewer effect can easily occur when there are communication problems between interviewers and interviewees, especially when they have different first languages and cultural backgrounds (van de Vijver and Tanzer, 2003). Interviewers who are not familiar with cultural norms and values may unintentionally offend participants or colleagues, or compromise the integrity of the study.

An example of the interviewer effect was summarized by Davis and Silver (2003). The researchers found that, when answering questions about political knowledge, African American respondents answered fewer questions correctly when interviewed by a European American interviewer than by an African American interviewer. Administration conditions that can lead to bias should be taken into consideration before beginning the research, and researchers should exercise caution when interpreting and generalizing results.


A female sign language interpreter is signing to an audience.
A translator or interpreter can unintentionally change a question in ways that alter how a participant responds. [Image by daveynin CC-BY 2.0]

Using a translator is no guarantee that interviewer bias will be reduced. Translators may unintentionally change the intent of a question or item by omitting, revising, or reducing content. These language changes can alter the intent or nuance of a survey item (Berman, 2011), which in turn changes the answer provided by the participant.

License


Culture and Psychology Copyright © 2020 by L D Worthy; T Lavigne; and F Romero is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License, except where otherwise noted.
