Post:

    The first issue contributing to race often being overlooked in studies is the source problem: because of racial bias in favor of whites, the achievements of other races received less recognition and were less likely to be preserved in the record. According to De la Peña (2010), "in order to write the histories on race and technology that are missing, historians must ask about what is missing from the record and archives" (p. 926). Another issue is that a greater emphasis on gender inclusion contributed to the absence of race in research; efforts to promote women's participation in technology often failed to consider existing ethnic disparities.

    Race is also erased from algorithmic computations because of developers' prejudices. Algorithmic computations are not objective; they depend on the preconceptions of the developers who build them. As such, algorithmic computations reveal those preconceptions and systemic disparities. Marijan (2018) found that "new algorithms are further embedding biases about the poor and putting these vulnerable populations in an ever more precarious position" (p. 6). These biases arise from the views of the people who design these systems.

    Carolyn de la Peña believes that the answer to racial prejudice in the history of technology is for historians to be more willing to look beyond the readily accessible archives. There is a source problem resulting from earlier historians' failure to account for minorities. The proposed approach can give due recognition to the significant contributions of people of various ethnicities throughout the history of technology. However, it may lose effectiveness if historians overreach in their research or misinterpret the available evidence.

    The secrecy surrounding algorithmic computations is a significant impediment to addressing race in them. Algorithmic computations are highly confidential, as are the data sets used to generate conclusions. The public's lack of access to this information may negatively affect society's most vulnerable members.

    References:

    De la Peña, C. (2010). The history of technology, the resistance of archives, and the whiteness of race. Technology and Culture, 51(4), 919–937. doi:10.1353/tech.2010.0064

    Marijan, B. (2018). Algorithms are not impartial.

    Classmate's response:

    Thank you for your thoughtful analysis of the impact of racially biased algorithms. You highlight examples of the origins of racial prejudice in technology: the preconceived notions of developers and the lack of historical materials stemming from historians' neglect, as you mentioned, of minority perspectives. You also make an important point that the confidentiality of algorithms and the anonymity of their developers can contribute to algorithmic bias through a lack of transparency. What steps do you think could be taken to address these contributing factors in the hopes of limiting algorithmic bias and its impact?

    WHAT TO DO:

    Please read my classmate's response and answer the question they asked.
