Friday, February 17, 2023

Digital Ghosts

Today my question for ChatGPT was:

The other day I asked you to write something in the style of Yoda and you did a good job. If everything some person wrote over a several year period was used for reinforcement learning, how closely do you think you would be able to mimic that person? And would that mimicry be a sort of "digital ghost" of that person, especially if they were dead?

And ChatGPT's answer is:

As an AI language model, I am capable of learning from large amounts of text data through a process called "training", which involves adjusting the parameters of my neural network to maximize the likelihood of generating coherent and fluent text. If a person's written work over a several-year period were used as the training data for me, I could potentially learn to mimic their writing style, including their choice of words, sentence structures, and other linguistic features.

The degree of similarity between my output and the original author's writing would depend on various factors, such as the size and quality of the training data, the specific techniques used for training and fine-tuning, and the complexity and diversity of the writing style. However, with enough high-quality training data and optimization, it's possible that I could produce text that is indistinguishable from the original author's writing, at least to some extent.

Regarding the question of whether my output would constitute a "digital ghost" of the original author, that is a philosophical and ethical question that has been debated by scholars and experts in the field of AI ethics. While I can generate text that closely resembles the original author's style and content, I don't have consciousness or agency, and my output is determined solely by the input data and the parameters of my neural network. Therefore, it's up to human judgment to determine whether such text constitutes a form of "digital resurrection" or a mere mechanical replication of the original author's work.

To me, it would indeed be a "mere mechanical replication," but so what? Let's say I had, I dunno, a favorite grandfather or something like that. And let's say that a great deal of his life over several years was recorded: what he said, what he wrote, etc. Furthermore, let's say there was a lot of video of him speaking, catching his tone of voice, his facial expressions, etc.

While ChatGPT is text-based, future variants will likely have avatars and be video-based. There's no big trick to making an AI bot sound extremely similar to someone (it's already being done), and it's not that hard to make an avatar look like someone, facial expressions included (also already being done, and it will only get better).

Given this hypothetical training on everything my grandfather said and wrote, some future variant of ChatGPT would enable me to sit down and have a conversation with his digital ghost! Basically forever, even though he's been dead for decades!
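Just as a thought experiment, here's a minimal sketch of what the text side of that could look like today: fine-tuning a small off-the-shelf language model on one person's collected writings using the Hugging Face transformers library. The file "grandpa.txt", the choice of gpt2, and the hyperparameters are all made-up placeholders, not something anyone has actually run on my grandfather.

# Hypothetical sketch: fine-tune a small causal language model on one
# person's collected writings so it imitates their style.
# "grandpa.txt" is a placeholder file with one piece of writing per line.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "gpt2"  # any small causal language model would do for the sketch
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained(model_name)

# Load the person's writings as a plain-text dataset.
dataset = load_dataset("text", data_files={"train": "grandpa.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="grandpa-model",
        num_train_epochs=3,
        per_device_train_batch_size=2,
    ),
    train_dataset=tokenized,
    # mlm=False means ordinary next-word prediction, the same objective GPT models use
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
)
trainer.train()

# "Talk" to the result: prompt it and sample a continuation in his style.
prompt = "Dear family,"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))

Of course that only covers the words. The voice, the face, and the video would be separate models layered on top, but none of those pieces are science fiction anymore.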

To me, the concept is both super cool and ultra creepy at the same time.

That reflection of my grandfather is lost for all time, because no such record of his life was ever made. But our children (grandchildren, etc.) already have a great deal of their lives recorded. AI avatars of them will enable their descendants (and anyone else who is interested) to interact with these digital ghosts for eternity!
