Types of Bias
  • Gender bias - as its name suggests, gender bias favors one gender over another. It affects the composition of data analytics teams and the results they derive, driving skewed analytics and insights that can be "blind" to half of society.
  • Application bias - results from applying analytical models to production data sets whose distributions and representation do not match those of the training set. When real-life production data contain populations with little or no representation in the training set, the model can produce some very skewed insights.
  • Confirmation bias - originates from selecting only data and information that reinforce or confirm something you already believe, rather than selecting information that might contradict preconceived ideas.
  • Algorithm bias - the way you design your artificial intelligence and machine learning models is also crucial to achieving responsible and fair AI. Algorithmic bias refers to systematic and repeatable errors that lead to unfair results, such as privileging one arbitrary group of users over others. Biased algorithms can originate from unrepresentative or incomplete training data, or from data that incorporates historical inequalities.
  • Sampling bias - a common error in data collection is a lack of representativeness; as a result, some items may be over-sampled relative to reality. Take the example of a company that wants to predict breakdowns of its machines: if it collects mostly error information, the algorithm will not accurately identify the normal operation of the equipment.
  • Exclusion bias - like sampling bias, exclusion bias comes from inappropriately removing data from a data source. For example, when you have petabytes of data it is tempting to select a small sample for training, but doing so may inadvertently exclude some data, resulting in a biased data set. Exclusion bias can also result from deduplicating data when the data elements are truly distinct.
  • Survivor bias - is a category of selection bias that overstates the likelihood of success of an initiative by focusing attention on successful subjects who are statistical exceptions rather than representative cases.
  • Reporting bias - is a distortion of presented research data due to selective disclosure or withholding of information by those involved in the study's topic selection, design, conduct, analysis, or dissemination of methods, findings, or both.
  • Group attribution bias - group attribution error is believing that an individual's characteristics always follow the beliefs of a group they belong to, or that a group's decisions reflect the feelings of all its members. This type of bias drives individuals to prefer and collaborate with those who share similar characteristics or backgrounds among analytics professionals, thus excluding other team members.
  • Racial bias - originates from considering factors that disproportionately impact a race or ethnic group as a proxy or key variable in data analysis or algorithms.
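
A simple check can surface the sampling and application biases described above: compare how classes are represented in the training set versus the production data the model actually sees. The sketch below (the `representation_gap` helper and its threshold are illustrative assumptions, not from the original text) flags classes that are far rarer in training than in production, echoing the machine-breakdown example.

```python
from collections import Counter

def representation_gap(train_labels, prod_labels, threshold=0.5):
    """Flag classes whose share in production data greatly exceeds their
    share in the training data (a sampling/application bias signal).

    `threshold` is a hypothetical cutoff: a class is flagged when its
    training share is below `threshold` times its production share.
    """
    train_share = {k: v / len(train_labels)
                   for k, v in Counter(train_labels).items()}
    prod_share = {k: v / len(prod_labels)
                  for k, v in Counter(prod_labels).items()}
    flagged = []
    for label, share in prod_share.items():
        # A class absent from training, or far rarer there than in
        # production, is likely to yield skewed predictions.
        if train_share.get(label, 0.0) < share * threshold:
            flagged.append(label)
    return sorted(flagged)

# Example: "failure" events were under-sampled during data collection.
train = ["normal"] * 95 + ["failure"] * 5   # 5% failures in training
prod = ["normal"] * 70 + ["failure"] * 30   # 30% failures in production
print(representation_gap(train, prod))      # → ['failure']
```

Here the model was trained on data dominated by normal operation, so the "failure" class it must predict in production is under-represented, exactly the distribution mismatch the application-bias card warns about.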