
How smart is “smart” technology?

written by Andrea Wojcik

In December I attended a workshop organized by the Netherlands Graduate Research School of Science, Technology and Modern Culture (WTMC) on “smart” technologies. It was a stimulating three days that allowed for surprising connections and interesting questions related to the ‘Making Clinical Sense’ research project. In particular, the lecture given by Merel Noorman, from the Tilburg Institute for Law, resonated with something I came across at the University for Development Studies in Tamale regarding learning physical examination skills—the potential bias of medical education material found online.

Noorman spoke about the role of science and technology studies (STS) amidst the current fascination with, and application of, smart technologies. She drew on a number of examples to build her talk, but one that struck a chord with me was a Google image search for a three-letter acronym: CEO. This is no longer the case, but back in 2015, the first woman to appear in that image search was Barbie. Gender bias had clearly made its way into Google’s image search algorithms.

Jumping over to medical education, Google is often a good friend to medical students. Need a quick physiological explanation? Google it on your smartphone. Don’t know how a condition may present clinically? Google it on your smartphone. Surely, having information at their fingertips has benefited medical students all over the world, but the story above does raise some questions about the limitations of immediate information.

For example, students were learning about cyanosis in class. According to a quick Google search: ‘Cyanosis refers to a bluish cast to the skin and mucous membranes…It’s usually caused by low oxygen levels in the red blood cells or problems getting oxygenated blood to your body’. So it is recognizable via a bluish tint. The class tutors explained that the tint is usually visible on the hands and feet and/or the lips, tongue and gums. The following image is from a Google image search for ‘cyanosis’, zoomed out to 50%. Anything striking?

Google image search for cyanosis, zoomed out 50%

In class, tutors informed students that a bluish tint would be more difficult to recognize in people with dark skin, who make up the majority of their patient population. They also pointed out that a Google image search would not be helpful in generating images of how cyanosis presents in dark-skinned people. Indeed, the image search above, zoomed out to 50%, came up with only a few presentations of cyanosis in a dark-skinned person’s hands and feet and one image of a dark-skinned person’s lips, tongue or gums. (What might appear to be an extreme case on the bottom right is actually artwork.)

One of the biggest takeaways from the WTMC workshop was that “smart” technologies are sometimes quite dumb. Perhaps it comes as no surprise, but Google clearly doesn’t have all the information one click away. For the medical students in Tamale, this isn’t necessarily a problem; they will come across presentations of cyanosis in dark-skinned patients when they enter the clinic.

But as Noorman pointed out in her lecture, the role of STS is to ask the big-picture questions. Beyond raising questions about who benefits from smart technology in medicine, then, we have to question the smart technology itself. Noorman made it clear that how to eliminate bias in smart technology isn’t the only question we can ask. Besides, many STS scholars would argue that such a question is futile, given that we, human beings, make the tech and feed the algorithms. Instead, we have to decide how to use smart technology. We need to learn what it can and can’t do. In medical education, this might mean emphasizing that Google can’t stand in for other important sources of learning, such as the clinical experience of the students themselves or their teachers.