Generative Angst

Seth Clark
7 min read · Nov 28, 2023

Since the launch of ChatGPT I’ve had a pit in my stomach that has been hard to put into words. I’ve been building AI products for over a decade, so this leap in technology was not a revelation, per se. I’ve eagerly followed the milestones along the way: IBM Watson’s defeat of Jeopardy mega-champions Ken Jennings and Brad Rutter; DeepMind’s complete mastery of Go; and GPT-2 cold rolling believable Shakespearean verse. All of these competition-based milestones — pitting new AI models against world-class human competitors — had a distinctive John Henry feel; you’re rooting for humanity, but if the machines win, no big deal.

Still, using ChatGPT early in its initial release left me unnerved and unexpectedly worried about the future. Unfortunately, that nagging feeling has stuck around.

Our brains work best when they’re under load

I find writing to be one of the most cognitively taxing activities that I take part in on a regular basis. As I’m writing this article, for instance, I have to contend with competing thoughts and emotions, filter out the ideas that appear to hold the most distilled value, and then place them in some semblance of order. I also need to clearly synthesize in my own mind what I want readers to take away from my words. (For instance, I’m attempting to use narrative empathy to simulate how I think you, the reader, may interpret my words.) I need to shape this final concatenation with my own voice so that it feels authentic. I also need to jump randomly from paragraph to paragraph — adding or editing new ideas — so I don’t lose the thread that pops into my mind. It is exciting and exhausting!

Now, given the fact that I could have outsourced all of this mental effort to a server located somewhere in Ashburn, Virginia, does my labor provide any benefit to me beyond the usefulness of the work I’ve composed? Or, to put it another way, why shouldn’t I have just offered this prompt to ChatGPT?

“Imagine you didn’t have the internet until you were 12 and you’re currently listening to Phoebe Bridgers on repeat. Write a 1000 word medium post on why generative AI isn’t very punk rock”¹

It turns out that the research on the benefits of writing lags behind that of reading, but what limited publications exist appear to be pretty clear: writing gives more than it takes away. Writing improves our ability to concretize abstract ideas, improves memory, enhances analytical skills, and can boost both emotional and (bizarrely) even physical health.²

It is human nature to choose the easiest path available (it is certainly my nature), so one of my worries is that, because of their ease of use and the quality of their outputs, large language models will be too attractive to resist for anyone who needs to communicate in writing for school, business, or even personal correspondence. The prolific use of navigation software (e.g. Google Maps, Waze) has already been demonstrated to result in a smaller hippocampus and reduced spatial memory.³ It’s not a leap to imagine generative AI having a similar impact. Knowing how important writing is to learning, reasoning, and empathizing, if we let that skill atrophy, we may lose something significant.

My concern about the impact of LLMs on human biology isn’t novel. Most new technological advancements have been met with well-meaning naysayers fearing the worst for the development of humanity (for instance, much to the chagrin of the Royal Bavarian Medical College, it turns out that steam locomotives do not cause mental unrest*) so I’ll try not to overdo it here. Still, one of the reasons why I believe this is a real concern is the fact that writing is tightly linked to one of the most important aspects of human culture and evolution: communication.

Outsourcing communication will have a cost

The premise that humans may end up over-relying on Generative AI for communication is not hyperbole. From cover letters to social media posts to emails, we now must assume that anyone’s digital communications may have been generated by a large language model. Not only does this risk some amount of mental atrophy, but it also creates the risk that we’ll all get worse at talking to each other and working collaboratively.

Human cooperation is one of the key evolutionary differences that moved our pack of primates to the top of the food chain, with communication developing in tandem.** If GenAI has a measurable impact on our communication skills, this would result in worse collaboration, more conflict, and further social isolation. It would be hard to predict how long it may take to measure the impact of GenAI on our socialization, but noticeable changes in social skills have already been tied to the isolation we all experienced during the COVID-19 pandemic. These adverse impacts surfaced within only a couple of years, depending on who’s measuring.***

Conversely, one interesting outcome of a future that over-relies on LLMs is that in-person conversations will become the only true test of another person’s aptitude, intelligence, and wit. Perhaps a silver lining of generative AI bots all talking to each other on our behalf will be that humans are inspired to spend more time together in person.

AI wasn’t supposed to do the fun stuff

It’s fun to create. Creation begets new ideas, it inspires us, it grounds us. I don’t think we need to consider brain chemistry to appreciate this fact. This is an entirely aesthetic argument, but is generative AI going to make creation more fun?

My first experience with what you might call data science was in a field called computational fluid dynamics. In a nutshell it’s the art of using computers to solve complicated math problems that explain how water or air behaves. It’s an incredibly important field in aerospace and naval architecture and it quite literally could not be done without a computer.

This formed the basis for the type of problems I thought AI & ML were made for: challenges that are too hard, complex, or time-consuming for people to accomplish on their own. I used this mindset over the past decade to build software that used AI to automate painful tasks, leaving more time for creative pursuits. In all that time I never found a need to automate creative processes. That was never a pain point, yet these activities are now being replicated with Generative AI. Given that many of us get a great deal of meaning from our work (for better or worse), what does that leave for humans?

What’s going to happen to those left behind?

Working in the field of artificial intelligence requires me to stay up to date with emerging trends that aren’t typically known to the average person. I’m constantly learning about new models, open source projects, frameworks, startups, prompt engineering guides…it can get overwhelming. If it’s this dizzying for me, how could the average person possibly prepare for the coming changes to work, media, and economics that are all but inevitable, and then successfully adapt to them? For the vast majority of people, it will be a roll of the dice as to whether or not they are able to avoid being in a field that ceases to exist in 5 or 10 years.

Without thoughtful adoption, and honest conversations about what’s going to happen to millions of displaced workers, the speed at which generative AI upends the economy will give us all whiplash.

It’s not all bad

Before concluding, I wouldn’t feel right without admitting that generative AI is not a plague. It’s a technical marvel, and it has the potential to do a lot of good. However, it is currently being applied in some concerning ways and, perhaps, targeting the wrong domains. You know what could use some generative AI? Circuit design. Protein folding. Any of the Millennium Prize Problems (we’ve only solved one so far…). So why are we using it to steal comics from starving artists, rip off pop songs, or as a crutch for real, authentic human communication?

HumanBrain™: The Champagne of Content Creation?

I don’t yet have an answer for all of this angst except to put on a good shoegaze album while I ponder what we’re doing to ourselves, but I do have a seed of an idea.

It’s hard to imagine that the generative genie is going back into its synthetic bottle. This is a new age that we’ve entered and, aside from picking up a hammer and joining a Luddite revolution, it will be hard to escape the impacts of these new AI models. But what if we decided to put a premium on authentic human thought? What if we created pathways for people to contribute new ideas through art, literature, music, theater, film, and commerce that are devoid of any AI shortcuts? What if we celebrated human creativity more than that fractional GDP boost generated by increased productivity?

I find myself much more excited at the notion of watching a play conceived entirely by other people who have their own stories to share with me and their fellow humans than I am by the notion of watching a series written, produced, and rendered by GenAI. AI-generated art cannot, by definition, be anything other than regressive.

Over the past decade there’s been a resurgence in support for craftsmanship and handmade goods. Many nations have even given some of their most cherished cultural products support, protection, and promotion abroad. France is perhaps the poster child for protecting its most prized cultural institutions, setting aside special protections for things like alpinism, Fest-Noz, and Champagne. Before too much time has passed, can we achieve something similar for human thought? Through some combination of human action, regulations, and technology, I firmly believe that we need to demonstrate that people are worth more to us than technology.

This article was written without the assistance of generative AI. Edits were provided by the author’s human friends and human family.

1 I have not submitted this prompt through an LLM, so I’ll leave this as an exercise for the reader.

2 https://www.niu.edu/language-literacy/_pdf/the-benefits-of-writing.pdf
3 https://www.nature.com/articles/s41598-020-62877-0

* https://jalopnik.com/there-was-once-an-actual-name-for-the-idea-that-traveli-1845362095

** https://www.sciencedirect.com/science/article/abs/pii/S1090513810000280

*** https://www.frontiersin.org/articles/10.3389/fpsyt.2022.942692/full


Seth Clark

Co-founder and Head of Product at Modzy, product enthusiast, and serial hobbyist.