Guest Writer / A Prolific Historical Warning

Arrogant Individualism & Truth Decay by Timothy Brinkley

Part 1: The advent of generative artificial intelligence has been compared to the advent of the nuclear bomb (Nuttle, 2023). The bomb was a new technology that changed the course of history. When used against Hiroshima and Nagasaki in August 1945, it killed “between 129,000 and 226,000 people” (Wikipedia contributors, 2023).

After Hiroshima, President Truman warned Japan: "If they do not now accept our terms, they may expect a rain of ruin from the air, the like of which has never been seen on this earth. Behind this air attack will follow sea and land forces in such numbers and power as they have not yet seen and with the fighting skill of which they are already well aware." (Truman Statement on Hiroshima – Nuclear Museum, 2022)

After Hiroshima, the primary target of the second mission was Kokura, but smoke and haze obscured the aiming point and Japanese anti-aircraft fire was intensifying, so mission commander Major Charles Sweeney diverted to the secondary target, Nagasaki. Observers in Nagasaki assumed the aircraft overhead was just another reconnaissance plane.

After Hiroshima and Nagasaki, Hirohito, emperor of Japan at the time, issued a rescript of capitulation and surrendered, despite an attempted coup d’état to stop him from doing so. In it he wrote:

Moreover, the enemy has begun to employ a new and most cruel bomb, the power of which to do damage is, indeed, incalculable, taking the toll of many innocent lives. Should we continue to fight, not only would it result in an ultimate collapse and obliteration of the Japanese nation, but also it would lead to the total extinction of human civilization. (Yosha Research, 2013)

As the memory of the bombings has faded from society’s consciousness, newer generations have forgotten the very real effects of nuclear weapons. The victims who survived those bombings inspired today’s pop-culture image of the zombie apocalypse: burn victims struggling to walk, eyes falling out, searching for someone to help them.

Nuclear power is still with us, though it has not been used militarily since 1945. Nuclear reactors are now used “beyond providing electricity for homes and businesses. They can also be used to power desalination plants, provide heat for metal refining, and even generate hydrogen as a clean burning alternative fuel for vehicles” (3 Surprising Ways to Use Nuclear Energy, 2022).

When we compare the new nuclear technology of yesteryear with today’s generative artificial intelligence, we see some commonalities. Both have been called the end of civilization as we know it (The End of the Beginning of the End of Civilization as We Know It – Part 1, 2023). Both have changed the way we behave: schoolchildren once practiced duck-and-cover drills for a nuclear attack, and today many people turn to AI to answer their questions. Both are changing our societies.

OpenAI’s tools are being used for image generation, image restoration, filmmaking, scripting text for auto-generated voices, writing poems, search engine optimization, scripts, essays, project planning, grant writing, and designing apps, websites, and even robots. They are powerful, widely accessible resources that anyone can put to their own purposes.

Using AI is not bad in and of itself, just as harnessing nuclear power is not bad in and of itself; what matters is how you use it. The Internet, for example, has been compared to a knife (NetSmartz, 2014): it can be used to prepare amazing meals, or it can be used to kill people.

The big difference is that the nuclear bomb was accessible only to a few wealthy nations with teams of brilliant scientists. OpenAI’s ChatGPT, Google Bard, and knockoffs like WormGPT are accessible to anyone in the world with an Internet connection and a device. More fundamentally, how is this technology affecting our relationship with God?

I personally am excited about generative artificial intelligence, but I also have many concerns. Do I think it is going to take over the world? No. But I do believe it will distort truth and the image of God. Here are some examples of how it is already misleading many people:

· A New York lawyer accidentally cited non-existent cases because he used ChatGPT to research a case he was defending. He had to apologize to the judge and submit an official filing (Mata v. Avianca, Inc., No. 22-cv-1461 (PKC) (S.D.N.Y. June 22, 2023)). ChatGPT had even claimed that the cases it was citing were real; only when asked directly for its sources did it apologize and admit they were fake.

· An agriculture professor at Texas A&M University–Commerce did not understand how ChatGPT works and used it incorrectly. He pasted the full text of each student’s paper into it and asked ChatGPT whether it had written them. When ChatGPT claimed that it had, the professor gave the students failing grades, and their graduation was put on hold (Heinz, 2023).

· In one case, “ChatGPT invented a sexual harassment scandal and named a real law prof as the accused” (Verma & Oremus, 2023). Although the events in the made-up story never happened and the allegation was dismissed, it is a good example of how wrong these AIs can be.

· In another case, AI facial-recognition software in Detroit, Michigan, falsely matched a Black man named Robert Williams to a retail-fraud surveillance video, and he was arrested (Allen, 2020). Only after roughly 30 hours of incarceration was he released; his case was thrown out of court for lack of evidence, and he is eligible to have the arrest expunged from his record (Allen, 2020).

Let me share some positive personal experiences with A.I. Recently I have been using the paid version of ChatGPT (both GPT-3.5 and GPT-4) to help me write grants so my high school students can go on field trips. The responses have been helpful: suggesting educational places near the beach to take students, identifying lessons I could use to connect the field trip to the subjects I teach, estimating costs, finding vehicles to rent and hotels to reserve, and quickly creating budgets based on different variables. In this sense, it has been a great administrative assistant.

I am also using it to help me redesign and update a mobile app that I previously created for school. I asked it to rewrite the app in Dart, a language I am still learning, which will let me more easily manage future updates and low-level changes, and release the app on Apple iOS, Google’s Android devices, Windows, and Apple desktops. Again, it has been super helpful, and it is helping me learn a new programming language.
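
To give a concrete sense of what such a cross-platform Dart rewrite might look like, here is a minimal sketch. It assumes the Flutter toolkit, which the article does not name but which is the usual way a single Dart codebase targets iOS, Android, Windows, and macOS; the app title and widget names below are placeholders, not the author’s actual app.

```dart
// Minimal Flutter sketch in Dart (assumes the Flutter SDK is installed).
// "School Trips" and the widget names are placeholders, not the author's app.
import 'package:flutter/material.dart';

void main() => runApp(const SchoolApp());

class SchoolApp extends StatelessWidget {
  const SchoolApp({super.key});

  @override
  Widget build(BuildContext context) {
    // The same Dart codebase can be built for each platform, e.g.
    // `flutter build apk`, `flutter build windows`, `flutter build macos`.
    return MaterialApp(
      title: 'School Trips',
      home: Scaffold(
        appBar: AppBar(title: const Text('School Trips')),
        body: const Center(child: Text('Hello from one Dart codebase')),
      ),
    );
  }
}
```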

I also pasted in the weekly meal plan one of my sons had developed and asked it to make me a shopping list of the things I would need at the store. It gave me a list of grocery items broken down into categories such as Vegetables and Fruits, Meat and Alternatives, Dairy and Alternatives, Pantry Items, Baking Needs, and Spices, with the individual items listed under each heading. Again, this was super helpful.

However, in other areas it is raising red flags. I asked it to help me write a persuasive essay on why GitHub should be allowed at my high school. It initially gave me a great outline and even recommended some real scholarly sources I could use. But as I developed the essay further, asking it to write more and more of it, it started giving me bogus sources. When I asked about the sources for a specific paragraph, it said they were example sources that I would need to replace with actual ones; if I had not asked, I would not have known. I would have been deceived had I not actually searched for the sources it was referencing.

Even more fundamentally troubling, I started to notice that my prayers to God used very similar wording to my ChatGPT prompts. “Could you please give me a …” and “Could you please suggest …” are requests I make of both God and ChatGPT. This led me to question many things. Will this affect the way I pray? Will it affect my relationship with God? Could A.I. become an idol? Exodus 34:12–14 says:

Watch yourself that you make no covenant with the inhabitants of the land into which you are going, lest it become a snare in your midst. But rather, you are to tear down their altars and smash their sacred pillars and cut down their Asherim—for you shall not worship any other god, for the Lord, whose name is Jealous, is a jealous God.

This reminds me of a recent newsletter from Francis Frangipane (2023), in which he writes:

The world has taken its bloodlust out of the ancient Roman arenas and put it into violent movies. They have taken the goddesses of fertility from the Greek hillsides only to idolize sex in our theaters and televisions. What mankind has done is move the pagan temples from the high places of the countryside to the hidden places of the human heart. (2023)

The Internet has made possible a kind of idolatry far more widespread than anything in ancient times, and ChatGPT is now perfectly positioned to become another idol that pulls the masses farther away from Jesus.

When OpenAI’s ChatGPT was asked whether Christ is from God, the final paragraph of its response was:

It’s essential to recognize that religious beliefs vary widely, and people’s views on the nature and significance of Jesus will differ depending on their faith traditions. Respect for different beliefs and open dialogue are crucial when discussing matters of religious significance. (ChatGPT, Jesus: Son of God, and Scripture, 2023)

