Reflecting on Digital Humanities and Embedded Flaws

Well, I found some of the articles incredibly dry, while others sent me down rabbit holes! So much interesting material.

There were a few ideas that really stood out to me. First, how much we come to trust and rely on digital tools such as search engines. Though, if I stop and think about it, I am aware that search engines often return biased and flawed results, it is not something I think about in my daily searching. I forget that it is not just me influencing my search algorithm. Trying the auto-complete for a bunch of different search terms was a fascinating exercise. For some terms the suggestions were incredibly disturbing, while others showed a marked improvement since the 2014 article on “Algorithms of Oppression” was written. It has really made me wonder how skewed my search results are because of oppressive algorithms. It is disturbing both that the algorithms have been trained by users to come up with these suggestions and that the companies allow them to keep producing these damaging ideas.

“Marketable histories” is another concept that struck me. It connects to what I have written before about how peoples’ histories and cultures are represented, and how often those stories are either suppressed or misrepresented. It is also about which versions of history are most acceptable to the masses. Marketability suggests that a price or value is placed on different histories, which is problematic in so many ways. Commodifying people’s stories does not do justice to their rich, diverse, and complex identities and experiences; it leads to certain versions or storytellers being promoted while those deemed less worthy by privileged society or by faulty algorithms have their voices muted by the system.

Math may be unbiased, but algorithms are trained on humanity’s deeply flawed histories and social currents. Not only are these histories flawed, but there are already so many discrepancies in the data available for a program to consume. I am thinking specifically of how certain stories and histories are simply not as present online. Algorithms are trapped within the system in which they learn and operate; the internet is a reflection of social constructs in the world, not the reality of the world itself. Algorithms are only as objective as the world in which they exist. The stories and histories of oppressed people are often not as well documented as those of privileged demographics, and the information that is available for an algorithm to consume can be based on colonialist perspectives and deeply problematic ideas about different identities.

Through the readings I saw shared threads with two of my other classes: public engagement, and gender and environment. In my public engagement class I just finished a discussion paper on ethics, which talks a lot about identifying and controlling for biases, and in gender and environment we have been discussing equity and oppressed identities all semester. As with those classes, this reflection on digital humanities raises more questions than it answers about personal biases and how they interact with biased systems. Though it is disheartening to see how problematic the tools used in digital humanities work are, I believe that the discipline (it is far more complicated than that, but that is what I am calling it for now) is playing, and will continue to play, an important role in addressing inequity, oppression, and systemic biases.

