Joy Interrupted: AI Can Distract From Opportunities For Learning And Human Connection.

An image of a poster promoting the Ross Gay event on a college campus.

This is the fifth post in a series of five on AI. My last post discussed why we need not just prompt engineers but also subject matter experts. This post discusses the danger of losing that learning if AI is not used in the best ways.

Last spring, I went to a campus reading by New York Times best-selling author and poet Ross Gay. It was well attended by faculty, administrators, employees, community members, and students. In front of me, three students sat, two on laptops and one on a phone. My first thought was that a professor had required attendance and they were taking notes for an assignment.

This author’s reading and book signing was a great opportunity on campus.


Ross began drawing me in with engaging stories of happiness and sorrow and the simple delights found in life if we pay attention. He’s a master observer of joy in everyday moments. His message and delivery were powerful, yet my attention was soon distracted by the busy screens in front of me.

Glancing down, it was obvious the students weren’t taking notes. They didn’t look up at the author at all. The student in the middle was watching a video on the phone. The two on laptops were jumping back and forth between different websites, documents, emails, and social media.

Despite Ross’s dramatic reading, I had trouble focusing with three screens flitting around in front of me. I imagined what it’s like to be a student in the back of a lecture hall, or even a small classroom, with dozens of students’ multitasking screens in front of them.

AI promises to free us from busy work.

Between the author’s readings, I glanced down again, hoping to find evidence of something related to the event and this author. Instead, I noticed the ChatGPT screen. Maybe the student was using it to supplement an assignment or get help with a difficult task. Perhaps I’d see how a professor integrated AI into a class.

Instead, I saw quiz questions from the university learning management system. Each question and answer was quickly copied back and forth between ChatGPT and a course quiz. Twenty questions were answered in less than a minute. I saw no effort to answer questions first or even read them. Did the student not know this was wrong? Or were they so engrossed in the screen that they forgot their surroundings? Perhaps the student views quizzes as busy work, not a learning tool to ensure reading and internalizing information.

AI promises to free us from drudgery to explore human creativity and imagination. The article “How AI can save you time: 5 skills you no longer need to learn” tells us we can now skip learning skills like writing because AI will write reports and news articles for us. I wonder what creativity the journalist will explore when Euro News outsources articles to AI. I don’t want to be freed from writing. My creativity and imagination are explored through writing, and evidence tells us writing is how we learn.

If the student in front of me was using AI to save time to explore human creativity, they missed one of the best opportunities of that semester. While they focused on their screen using AI, a poet expressed the joy of being human, moving some in the room to tears.

Sometimes there is no shortcut to learning.

Much of AI is being marketed to us and to students as a shortcut: the easy way to complete a task, assignment, paper, or degree. In “AI’s Promise To Pay Attention For You,” Marc Watkins of the Mississippi AI Institute says, “Many third-party app developers are building off of OpenAI’s API to create apps that promise an end-to-end user experience where a machine listens for you, compiles information, and then creates bespoke summaries all so you don’t feel burdened by listening or thinking about the content.”

TikTok is full of student videos promoting these apps as the easy way to an easy grade. I’m all for removing friction to make banking, car buying, and hotel booking easy. But is easy the best way to learn? What if friction and struggle are how we learn? In an op-ed, Jane Rosenzweig of the Harvard College Writing Center says, “Our students are not products to be moved down a frictionless assembly line, and the hard work of reading, writing, and thinking is not a problem to be solved.”

If not used properly, AI can get in the way of learning. This summer I received an email marketing assignment in which a student “wrote” bland, generic email copy. Then a paragraph explained how the email “fosters a deep emotional connection with the audience” and “reflects a deep understanding of the target audience’s needs.” But it didn’t! It sounded like the correct but unfeeling, generic copy LLMs tend to generate.

The LLM knew what good email copy should do, but couldn’t write it. My student needed to be the human in the loop. I can teach how to write copy that forms an emotional connection with the audience based on human insight, but not if a student uses AI to write the entire assignment. Why would an employer hire them if AI could complete the entire project on its own?

Is liberal arts education the answer to AI job losses?

If AI takes away skills, many say the way to remain relevant is through the liberal arts. Business Insider says AI startup founders hire liberal arts grads to get an edge. In Bloomberg, a Goldman Sachs tech guru says AI is spurring a “Revenge of the Liberal Arts.” WIRED proclaims, “To Own the Future, Read Shakespeare.”

IBM’s AI chief advises students who want a tech job to learn the language and creative thinking skills you get in the liberal arts. Because AI speaks our language, not computer code, we need prompt engineers “to train it up in human behavior and thinking.”

Marketing AI Institute’s Paul Roetzer believes the next generation entering the workforce will remain relevant with a broad-based liberal arts education. Literature, philosophy, history, and art are what make us human and teach critical thinking, analysis, creativity, communication, collaboration, integrity, understanding, and nuance. What AI can’t do.

But what if AI can also get in the way of learning liberal arts? Using AI to skip the reading and skip the writing skips the learning to save time for what? Doing the reading, processing the information, committing it to memory, and explaining it through writing is how you learn critical thinking and creativity. To have an imagination you need knowledge.

Is AI the answer to our loneliness epidemic?

In 2023, the Surgeon General released an advisory warning of a crisis of loneliness, isolation, and lack of connection. Nearly half of adults in the U.S. experience loneliness, which can increase the risk of premature death to a degree comparable to smoking. Rates of anxiety and depression on college campuses have never been higher, and more than 60% of college students report at least one mental health problem.

Some are promising AI can solve this problem with artificial intelligence friends. I sat in a room full of people with an author talking about human connection while the students in front of me focused on their screens. Are AI friends really the best solution? What I often see in the classroom is students not talking to each other because they are focused on their screens.

After Ross finished there was a Q&A. A mental health professional behind me wanted to thank him. She gives her patients struggling with anxiety and depression Ross’s books as homework. Many report back that the books make the difference in being able to get out of bed some days.

I did not miss the irony. Many students struggle with anxiety and depression, and many feel screen time is a part of it. Yet here I was, sitting between a mental health professional, students, and one of her solutions: a real human in the room. I’m sad for the students who missed out on the joy of the evening, just past the screens taking their attention. In reading Jonathan Haidt’s The Anxious Generation, my understanding of and empathy for Gen Z have grown.

The exception rather than the rule.

It’s important to note this was a couple of students. I’m grateful for the larger group of enthusiastically engaged students at the event. In my experience, most students are not looking for the easy way out; they want to learn their disciplines by integrating AI in beneficial ways. But they need our guidance.

Of the more than 200 million writing assignments reviewed by Turnitin’s AI detection tool last year, some AI use was detected in just 1 out of 10. Only 3 out of every 100 were generated mostly by AI.

Education experts warn that focusing too much on AI cheating can cause distrust between instructors and students. We should frame the conversation around the ways AI can both support and detract from learning. Our role is AI literacy: providing specific guidance on when and when not to use AI.

I hope to educate students on the role of AI in their lives and how to make intentional choices about what to outsource to AI, what to keep for themselves, and how to prepare for careers with AI that keep humans in the loop.

This Was Human Created Content!

More Than Prompt Engineers: Careers with AI Require Subject Matter Expertise.

This graphic shows the stages of learning: attention, encoding, storage, and retrieval. You need your brain, not just AI, to carry out this process.

This is the fourth post in a series of five on AI. In my last post, I proposed a framework for AI prompt writing. But before you can follow a prompt framework, you need to know what to ask and how to evaluate the response. This is where subject matter expertise and critical thinking skills come in, and it’s a reason we need to keep humans in the loop when working with large language models (LLMs) like ChatGPT (Copilot), Gemini, Claude, and Llama.

Photo by Shopify Partners from Burst

Will we all be prompt engineers?

Prompt engineering is promoted as the hot, new high-paying career. Learning AI prompt techniques is important, but it doesn’t replace being a subject matter expert. The key to a good prompt is more than format. As I described in my post on AI prompts, you must know how to describe the situation, perspective, audience, and what data to use. The way a marketer or manager will use AI is different from the way an accountant or engineer will.

You also must know enough to judge AI output, whether it’s information, analysis, writing, or a visual. If prompt engineers don’t have subject knowledge, they won’t know what AI got right, what it got wrong, and what is too generic. AI is not good at every task, and it mixes generic and wrong responses in with the right ones. With hallucination rates of 15% to 20% for ChatGPT, former marketing manager Maryna Bilan says AI integration is a significant challenge for professionals that risks a company’s reputation.

AI expert Christopher S. Penn says, “Subject matter expertise and human review still matter a great deal. To the untrained eye, … responses might look fine, but for anyone in the field, they would recognize responses as deeply deficient.” Marc Watkins of the Mississippi AI Institute says AI is best with “trained subject matter experts using a tool to augment their existing skills.” And Marketing AI Institute’s Paul Roetzer says, “AI can’t shortcut becoming an expert at something.”

Prompt engineering skills are not enough.

As a college professor, this means my students still need to do the hard work of learning the subject and discipline on their own. But their social feeds are full of AI influencers promising learning shortcuts and easy A’s without listening to a lecture or writing an essay. Yet skipping the reading, having GPT take lecture notes, answer quiz questions, and write your report is not the way to get knowledge into your memory.

Some argue that ChatGPT is like a calculator. Yes and no. This author explains, “Calculators automate a . . . mundane task for people who understand the principle of how that task works. With Generative AI I don’t need to understand how it works, or even the subject I’m pretending to have studied, to create an impression of knowledge.”

My major assignments are applied business strategies. I tell students if they enter my assignment prompt into ChatGPT and it writes the report for them then they’ve written themselves out of a job. Why would a company hire them when they could enter the prompt themselves? That doesn’t mean AI has no place. I’ve written about outsourcing specific tasks to AI in a professional field, but you can’t outsource the base discipline knowledge learning.

AI can assist learning or get in the way.

I know how to keep humans in the loop in my discipline, but I can’t teach students if they outsource all their learning to AI. Old-fashioned reading, annotating, summarizing, writing, in-person discussion, and testing remain important. Once students have the base knowledge, we can explore ways to use generative AI to supplement and shortcut tasks, not skip learning altogether. We learn through memory, and scientists have studied how memory works. Used the wrong way, AI can skip all stages of learning.

Click the image for a downloadable PDF of this graphic.

I remember what it was like being a student. It’s very tempting to take the second path in the graphic above, the easiest path to an A and a degree. But that can lead to an over-reliance on technology, no real discipline knowledge, and a lack of critical thinking skills. The tool becomes a crutch for something I never learned how to do on my own. My performance is dependent on AI’s performance, and I lack the discernment to know how well it performed.

Research skills in searching databases, evaluating information, citing sources, and avoiding plagiarism are needed to discern AI output. The LLM-powered search engine Perplexity promised reliable answers with complete sources and citations, but a recent article in WIRED finds that it makes things up, and Forbes accuses it of plagiarizing its content.

A pitch from OpenAI selling ChatGPT Edu, says, “Undergraduates and MBA students in Professor Ethan Mollick’s courses at Wharton completed their final reflection assignments through discussions with a GPT trained on course materials, reporting that ChatGPT got them to think more deeply about what they’ve learned.”  This only works if the students do the reading and reflection assignments themselves first.

Outsourcing an entire assignment to AI doesn’t work.

A skill I teach is situation analysis. It’s a foundation for any marketing strategy or marketing communications (traditional, digital, or social) plan. Effective marketing recommendations aren’t possible without understanding the business context and objective. The result of that situation analysis is writing a relevant marketing objective.

As a test, I asked ChatGPT (via Copilot) to write a marketing objective for Saucony that follows SMART (Specific, Measurable, Achievable, Relevant, Time-bound) guidelines. It recommended boosting online sales by targeting fitness enthusiasts with social media influencers. I asked again, and it suggested increasing online sales of trail running shoes among outdoor enthusiasts 18-35 using social media and email.

Then I asked it to write 20 more and it did. Options varied: focusing on eco-friendly running shoes for Millennials and Gen Z, increasing customer retention with a loyalty program, expanding into Europe, increasing retail locations, developing a new line of women’s running shoes, and increasing Saucony’s share of voice with a PR campaign highlighting the brand’s unique selling propositions (USP). It didn’t tell me what those USPs were.

Which one is the right answer? The human in the loop would know based on their expertise and knowledge of the specific situation. Generated with AI (Copilot) ∙ July 2, 2024 at 3:30 PM

I asked Copilot which is best. It said, “The best objectives would depend on Saucony’s specific business goals, resources, and market conditions. It’s always important to tailor the objectives to the specific context of the business. As an AI, I don’t have personal opinions. I recommend discussing these objectives with your team to determine which one is most suitable for your current needs.” If students outsource all learning to LLMs how could they have the conversation?

To get a more relevant objective, I could upload proprietary data like market reports and client data and then have AI summarize it. But uploading Mintel reports into LLMs violates licensing agreements, and many companies restrict this as well. Even if I work for a company that has built an internal AI system on proprietary data, its output can’t be fully trusted. Ethan Mollick has warned that many companies building talk-to-your-documents RAG systems with AI need to test the final LLM output, as it can produce many errors.

I need to be an expert to test LLM output in open and closed systems. Even then, I’m not confident I could come up with truly unique solutions based on human insight if I didn’t engage with the information on my own. Could I answer client questions in an in-person meeting after only a brief review of AI-generated summaries and recommendations?

AI as an assistant to complete assignments can work.

For the situation analysis assignment, I want students to know the business context and form their own opinions. That’s the only way they’ll learn to become subject matter experts. Instead of outsourcing the entire assignment, AI can act as a tutor. Students often struggle with the concept of a SMART marketing objective. I get a lot of wrong formats no matter how I explain it.

I asked GPT whether statements were marketing objectives that followed SMART guidelines, feeding it both right and wrong statements. It got them all correct, and it did an excellent job of explaining why each statement did or did not adhere to SMART guidelines. Penn suggests “explain it to me” prompts: tell the LLM it is an expert in a specific topic you don’t understand and ask it to explain the topic in terms of something you do understand. This is using AI to help you become an expert versus outsourcing your expertise to AI.
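To make the tutor idea concrete, here is a minimal sketch of how that kind of SMART-check prompt could be scripted. Everything here is an assumption for illustration: the prompt wording, the `build_smart_check_prompt` helper, and the model name in the commented-out API call are hypothetical, not my actual classroom workflow.

```python
# Sketch: using an LLM as a tutor to check whether a statement follows
# SMART (Specific, Measurable, Achievable, Relevant, Time-bound) guidelines.
# The helper only assembles the tutor-style prompt; sending it to a model
# is shown in comments so the sketch stays self-contained.

def build_smart_check_prompt(statement: str) -> str:
    """Assemble a tutor-style prompt asking the model to grade one statement."""
    return (
        "You are an expert marketing instructor. Evaluate the statement below "
        "against SMART guidelines (Specific, Measurable, Achievable, Relevant, "
        "Time-bound). Say whether it qualifies as a SMART marketing objective, "
        "then explain, criterion by criterion, why or why not.\n\n"
        f"Statement: {statement}"
    )

if __name__ == "__main__":
    prompt = build_smart_check_prompt(
        "Increase online sales of trail running shoes by 10% among runners "
        "aged 18-35 by the end of Q4 via social media and email campaigns."
    )
    print(prompt)
    # To actually query a model (requires the openai package and an API key;
    # the model name here is an illustrative assumption):
    # from openai import OpenAI
    # client = OpenAI()
    # reply = client.chat.completions.create(
    #     model="gpt-4o-mini",
    #     messages=[{"role": "user", "content": prompt}],
    # )
    # print(reply.choices[0].message.content)
```

The point of the sketch is the division of labor: the student still writes the objective, and the model only critiques it against criteria the student must understand to act on the feedback.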

ChatGPT can talk but can it network?

Last spring I attended a professional business event. We have a new American Marketing Association chapter in our area, and they had a mixer. It was a great networking opportunity. Several students from our marketing club were there mingling with the professionals. Afterward, a couple of the professionals told me how impressed they were with our students.

These were seniors and juniors. They had a lot of learning under their belts before ChatGPT came along. I worry about the younger students. If they see AI as a way to outsource the hard work of learning, how would they do? Could they talk extemporaneously at a networking event, interview, or meeting?

Will students learn with the new AI tools that summarize reading, transcribe lectures, answer quiz questions, and write assignments? Or will they learn to be subject matter experts who have discerned, via AI Task Frameworks and AI Prompt Frameworks, the beneficial uses of AI that make them an asset to hire? In my next post, the final in this 5-part AI series, I share the story that inspired this AI research and explore how AI can distract from opportunities for learning and human connection.

This Was Human Created Content!