I have spent much time writing about AI bias recently. Chatbot bias reflects the "whole of human knowledge" that formed the training data, and it continues to reflect human perspectives that, unfortunately, contain all kinds of innate beliefs with the potential to cause harm when placed in the hands of artificial intelligence without empathy or a caring context.
Even with human conscience, we fail our neighbors. The story below is a horrific example of dehumanization, and it happens all too frequently. As a woman with a neuromuscular disease, I cannot imagine the horror of being left lying there as people walked by with the odd mix of compassion and indifference that has come to define Seattle. Yet I have experienced this indifference in the last few years, so I know it is real.
In my study of Contemplative Theology, those who live outside have long been my teachers; it isn't the other way around. I try to imagine waking every day in their shoes, arising with the "Courage to Be" that Paul Tillich described, and I know I still have much to learn about strength and humility. Reading this story tonight, I see that we all have a lot to learn about how to treat fellow humans in need. We must do better.