AI and the subcontracting of our thinking
Every new advance in technology has positive and negative aspects to it. We gain something, and we also lose something. That’s the reality. A new technology doesn’t mean that everything about the world is now better than it was before.

This applies to all kinds of technological advances. Think about transport technologies as an example. When we move from walking to riding a bicycle, we can go much further in the same time. Yet at that speed we appreciate a little less of what we are travelling through, and we need paths and roads, whereas walking works on almost any surface. Likewise, when we move from a bicycle to a car, we can travel further still in the same time, and in comfort even when the weather is bad. Yet as the speed increases, so do pollution, fuel use, expense, and the risk of losing your life. There is no such thing as a perfect technological advance. You gain something, and you lose something. You need to decide whether the gains are enough to counter the losses.

Some of the gains and losses that come from a new technology are completely predictable. For example, when the motorcar became a popular means of transport, anyone could tell that roads would need to be better and that places to refuel would need to be built. Yet the danger proved greater than expected. In fact, so many people were dying in road accidents in the early years that some wondered whether the cost was too high to pursue the technology further. The invention of the seatbelt changed this, saving many lives, and many more safety enhancements have followed since. There is still significant danger, but we have ways to limit it as much as we can.

Let’s apply this kind of thinking to AI. This new technology has revolutionised a variety of fields. Medical imaging has improved dramatically thanks to advanced pattern recognition. Routine computer code can now be written far faster than any human could manage. Tasks can be automated and done much more quickly than before. Yet there are negatives too. The job losses that come from AI will be very large, with an associated societal cost. The environmental impact of large-scale generative AI is immense and remains an unsolved problem, especially its water and power use. And the increasing use of AI in social media makes it harder to tell what is real, while this powerful technology is often applied to worthless casual entertainment.

There is much more that could be said on this by people far more expert than I. Yet I want to explore the impact of AI and associated technologies on various aspects of our Christian lives over the next few blog posts, starting with how it affects the way we think.

We were made to be thinkers. God gave us all the capacity to observe the world and think about it. We are expected to draw conclusions about what we see, like David in Psalm 8 who thought about the place of man in the universe, and Paul in Romans 1 who expected all people to know there is a God from observing creation. The wisdom books of the Bible expect us to consider things like suffering (Job), love (Song of Songs), and the meaning of life (Ecclesiastes) and draw conclusions from them. Jesus himself asked people to look at the birds and learn about God’s provision, and to think about what it means for God to be a shepherd or a Father or a master. We are supposed to see, and think, and draw conclusions. The Christian faith is not just remembering facts. It is knowing God’s word and observing God’s world, and making wise decisions with what comes before us.

One danger of AI is that we can subcontract all our thinking to it. There is evidence that many people are coming to rely on AI for topics as diverse as relationship advice, investment decisions, and legal questions. These are admittedly complex areas where we would previously have spent much time thinking, perhaps with the help of a professional; now we can simply ask AI instead. Do you see the danger here? In subcontracting an important part of being human, we are losing the ability to think for ourselves. We pull out the phone too quickly to see what AI thinks, before we have spent any time considering what we ourselves should think.

As Christians, this can lead us to decisions different from those God’s word would have us make. We can blindly rely on AI advice that is unethical or deeply unwise.

Don’t subcontract all your thinking. AI can be useful in certain applications. Yet beware, especially in areas of ethics and relationships. Life is complex. We grow as we grapple prayerfully with difficult decisions and situations. Don’t bypass this by getting a computer to do it for you. You might well lose more than you imagine.