The Big Story About the Big Game: For Super Bowl Ads, It's Brand Storytelling.

(Updated January 31, 2025)

For advertisers paying $8 million for a 30-second TV ad in the NFL Championship game, the big story isn't the Philadelphia Eagles versus the Kansas City Chiefs, Jalen Hurts versus Patrick Mahomes, or even the odds on a Travis Kelce-Taylor Swift Super Bowl proposal.

Advertisers need to please a lot of eyeballs.

For them, Super Bowl LIX is the 2025 Super Bowl of advertising: which brand ads will garner the most votes in the Super Bowl ad polls (winners get lots of press) and the most views on social media before, during, and after Sunday's game. That's a lot of pressure on marketing managers, ad agencies, and creative teams.

Nielsen reports 123 million people watched last year's Super Bowl LVIII, with 120 million in the U.S. – roughly 34% of the country. The most popular TV shows, like Yellowstone, reach only 11.5 million. How do you write a hit Super Bowl ad for TV and social media?

How are this year’s brand advertisers trying to please?

Adweek reports that 2025’s Super Bowl ad trends include nostalgia, celebrities, animals, Americana imagery, bro culture, and crowd-sourced commercials. Reports say there will be more ads for AI, not ads created by AI.

As an ad copywriter, I felt pressure with regular TV ads. I never had a national Super Bowl ad, but I did create one that ran locally during the Super Bowl. I also worked on Spot Bowl for years – our ad agency's national Super Bowl ad ratings poll. I gave each ad a title and description as the spots ran so we could get them up on the website for voting.

Our research of Super Bowl ads found the best way to please is story.

So, what makes one ad likable enough to finish in the top ten of USA Today's Ad Meter and Spot Bowl while another lands in the bottom ten? When I became a professor, my colleague Michael Coolsen and I asked that very question. Was it humor or emotion? Sex appeal or cute animals? This year, will it be nostalgia or TikTok influencers?

Our two-year analysis of 108 Super Bowl commercials, published in the Journal of Marketing Theory and Practice, found the key to popularity was telling a story. It didn't matter whether an ad had animals or celebrities, humor or sex appeal; the underlying factor in likability was a plot. Super Bowl ad poll ratings were higher for ads that followed a full five-act story arc, and the more acts a commercial had, the higher its ratings.

The key is a five-act dramatic story structure.

Why five acts? Remember studying five-act Shakespearean plays in high school? There's a reason Shakespeare was so popular and why he told stories in five acts: it's a powerful formula that has drawn people's attention for hundreds of years.

The classical drama framework we used was conceived by Aristotle, followed by Shakespeare, and depicted by German novelist and playwright Gustav Freytag as a pyramid. Freytag's theory of drama advanced Aristotle's to a more precise five-act structure, as seen below.

Five-act stories also draw views and shares in social media.

Ad rating polls of TV ads are one thing, but how does a story perform in social media? We wanted to find out, so we conducted another research study published in the Journal of Interactive Marketing. We analyzed 155 viral advertising YouTube videos from randomly selected brands in different industries over a year.

Videos that told a more developed or complete story had significantly higher shares and views. We coded the videos using the same five-act dramatic structure in Freytag's Pyramid: introduction, rising action, climax, falling action, and resolution.

Analyze this year’s Super Bowl ads for story with this template.

Try doing a little storytelling analysis yourself using the downloadable template below. The left column describes what needs to happen in each act. As you watch the Super Bowl ads, fill in the right column with your description of what happened.

Some will have all five acts. Some will have only three, two, one, or even zero. In our viral ad study, only 25% of the sample were five-act stories; in fact, there were more zero-act ads, at 31%. After coding for the number of acts, compare your results to see how the ads fare in the two ad polls (Ad Meter, Spot Bowl) and in YouTube views.
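If you'd rather tally your template coding with a script than by hand, a few lines of Python will do it. This is a minimal sketch; the ad names and 0/1 coding values below are made up purely to illustrate the idea:

```python
# Tally Freytag's five acts for each coded ad.
# The ads and 0/1 codings below are hypothetical examples.
FREYTAG_ACTS = ["introduction", "rising action", "climax",
                "falling action", "resolution"]

coded_ads = {  # 1 = act present in the ad, 0 = absent
    "Ad A": [1, 1, 1, 1, 1],  # full five-act story
    "Ad B": [1, 1, 1, 0, 0],  # three acts
    "Ad C": [0, 0, 0, 0, 0],  # zero-act ad (e.g., a straight product demo)
}

for ad, flags in coded_ads.items():
    present = [act for act, flag in zip(FREYTAG_ACTS, flags) if flag]
    print(f"{ad}: {len(present)} act(s) - {', '.join(present) or 'none'}")
```

Once each ad has an act count, line the counts up against the Ad Meter and Spot Bowl finishes to see whether the pattern our studies found holds this year.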

Budweiser’s Clydesdales are back this year. How will they do?

Budweiser is bringing back its storied Clydesdale ads for a second year after abandoning them in 2015. The Clydesdale ads were storied because they told full five-act stories and finished in the top five of USA Today's Ad Meter eight times in ten years.

In 2014, I successfully predicted that Bud's Clydesdale ad "Puppy Love" would be the winner because it was a full five-act story, and it did finish first in the ad polls.

In 2016, I successfully predicted their first non-Clydesdale ad, "Don't Back Down," would not finish in the top 10 because it did not tell a complete story – it finished 28th. I recently found this article from iSpot.tv, whose data confirms our academic research findings.

If you're interested in applying story to all forms of marketing communications, our book Brand Storytelling explains how to follow this five-act dramatic form for TV, online video, and all IMC touchpoints, such as print ads, banner ads, direct, radio, and PR.

This Was Human Created Content!

AI’s Multimodal Future Is Here. Integrating New AI Capabilities In The Classroom.

AI image generated using Google ImageFX from a prompt “Create an image of a professor training an AI computer chip as if it was a dog in a university classroom.” https://labs.google/fx/tools/image-fx

In my last post, I needed a pep talk. In teaching digital and social media marketing, I'm used to scrambling to keep up with innovations. But AI moves at a whole other pace. It's as if I'm trying to keep up with Usain Bolt when I'm used to running marathons.

It's like the marathon I signed up for in July: November comes quickly. No matter how training goes, the start time arrives, the horn goes off, and you run. Here comes the Spring semester. No matter how many AI updates dropped in December, I need to show up ready to go in early January.

If I want to make a difference and have an influence on how AI impacts my discipline and teaching, I don't have a choice. I can relate to what AI expert Ethan Mollick said in his latest Substack,

“This isn’t steady progress – we’re watching AI take uneven leaps past our ability to easily gauge its implications. And this suggests that the opportunity to shape how these technologies transform your field exists now when the situation is fluid, and not after the transformation is complete.”

The other morning, when I should've been finishing Fall grades, I spent a couple of hours exploring AI updates and planning how I'll advance AI integration for Spring. Instead of AI bans (illustrated by the Fahrenheit 451-inspired image of my last post), I'm going deeper into how we can train AI to be our teaching friend, not foe.


NotebookLM opens up teaching possibilities.

A lot of new AI updates came this Fall. One that caught my eye was Google's NotebookLM. In a NotebookLM post, I explained how I was blown away by its Audio Overview, which turned my academic research into an engaging podcast of two hosts explaining the implications for social media managers.

I see potential to integrate it into my Spring Digital Marketing course. NotebookLM is described as a virtual research assistant – an AI tool to help you explore and take notes about a source or sources that you upload. Each project you work on is saved in a Notebook that you title.

The various notebooks I've used so far for research and for my Digital Marketing class.

Whatever reference you upload or link, NotebookLM becomes an expert on that information. It uses your sources to answer questions and complete requests. Responses include clickable citations that take you to where the information came from in your sources.

For Google Workspace for Education users like me, uploads, queries, and responses are not reviewed by humans or used to train AI models. If you use a personal Google account and choose to provide feedback, human reviewers may see what you submit. To learn more, click here.

Source files can be Google Docs, Google Slides, PDFs, text files, web URLs, copy-pasted text, public YouTube video URLs, and audio files. Each source can contain up to 500,000 words or 200MB. Each notebook can contain up to 50 sources. Added up, NotebookLM's context window is large compared to other models; ChatGPT-4o's is roughly 96,000 words.
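To make those limits concrete, here's a small illustrative Python check of whether a planned set of sources fits in one notebook. The source names and word counts are invented for the example; only the caps come from the limits listed above:

```python
# Illustrative check of NotebookLM's stated per-notebook limits:
# up to 50 sources, each up to 500,000 words (or 200MB).
MAX_SOURCES = 50
MAX_WORDS_PER_SOURCE = 500_000

sources = {  # hypothetical source -> approximate word count
    "open-source textbook PDF": 180_000,
    "my blog": 60_000,
    **{f"industry site {i}": 20_000 for i in range(1, 7)},
}

assert len(sources) <= MAX_SOURCES, "too many sources for one notebook"
for name, words in sources.items():
    assert words <= MAX_WORDS_PER_SOURCE, f"{name} exceeds the per-source word cap"

print(f"{len(sources)} sources, {sum(sources.values()):,} words total - fits in one notebook")
```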

When you upload to NotebookLM, it creates an overview summarizing sources, key topics, and suggested questions. It can also generate a set of standard documents: an FAQ, Study Guide, Table of Contents, Timeline, or Briefing Doc. An impressive feature is the Audio Overview, which generates an audio file of two podcast hosts explaining your source or sources.

NotebookLM as an AI tutor.

I plan on using NotebookLM as an AI tutor for students in my Spring Digital Marketing course. I like the open-source text I've been using for years, but the author has stopped updating it. The strategic process and concepts are sound, so I supplement the content with outside readings and in-class instruction.

I tested NotebookLM by creating a notebook for my Digital Marketing course resources. First, I uploaded the PDF of the text. Then, I added links to six digital marketing websites that I use for assigned readings and in-class teaching. Finally, I added my blog. I plan to show students how to create their own at the beginning of the semester.

This is my notebook for Digital Marketing. I was impressed with the answers it gave to questions I often get from students about assignments.

AI may not be accurate 100% of the time, but controlling the sources seems to help and puts less pressure on crafting a perfect prompt. My discipline knowledge helps me catch when it gets something wrong. I tested my Digital Marketing notebook by asking questions on how to complete main course assignments such as personal branding blogs, email, SEO, and content audits. I haven't noticed any wrong answers so far.

Important note about copyright.

I’m testing NotebookLM in this class because my main text is open source and all the websites I link to are publicly published sites (not behind paywalls). Google is clear about its copyright policy,

“Do not share copyrighted content without authorization or provide links to sites where people can obtain unauthorized downloads of copyrighted content.”

We should set a good example and educate students by not uploading copyrighted books or information only accessible through subscriptions or library databases. Below is my general AI policy for the course.

The policy carves out acceptable and helpful uses of AI while explaining the ways AI should not be used.

In completing final reports, students will access information behind paywalls, such as Mintel reports. They'll add and cite that information as they've done in the past. The goal isn't for NotebookLM to complete their assignments for them; it's to give them a resource to better understand how to complete their assignments.

NotebookLM as a study tool.

I see NotebookLM as a positive tool for student learning if used as a study guide, reinforcement, or tutor. It would have a negative impact if used simply to replace reading and listening. What's missed when you use AI the wrong way is depicted in an infographic I created for a previous blog post on the importance of subject matter expertise when using AI.

For a website assignment, my course NotebookLM gave a nice summary of the process and best practices to follow. That’s something students often struggle to find in the text and other sources. The assignment requires pulling from multiple chapters and resources. The notebook summary included direct links to the information from various text chapters and digital marketing blogs. I also tested its accuracy with questions about an email assignment and had it create a useful study guide.

Answering questions will be helpful for assignments where students often miss steps and best practices that draw from multiple parts of the text and readings.

Students can create Audio Overviews of podcast hosts talking about a topic, drawing from the sources. Impressively, when I asked for an Audio Overview explaining the value of the personal professional blog assignment, it anticipated the student perspective that blogs are outdated. It began, "As a student, I know you're thinking blogs are outdated, but personal professional blogs are a great …" The Audio Overview also adjusted the text's process for businesses and applied it to a personal branding perspective.

Going beyond Copilot in other areas.

I also plan to have students leverage new AI capabilities in Adobe Express and Google's ImageFX in multiple classes. Our students have free access to Adobe Creative Cloud, where new AI capabilities go beyond Firefly-generated images. In Express, you can give it text prompts to create mockups of Instagram and Facebook posts, Instagram Stories, YouTube thumbnails, and more.

Students' ideas can be expressed even better with the text-to-create AI interface in Adobe Express along with the image creation capabilities of Firefly.

AI’s multimodal future is here.

That same morning, I also dove deeper into new AI multimodal capabilities. They were so remarkable that I recorded videos of my experience. I explored new live audio interactions in NotebookLM and created a demonstration of what's possible with Google's Gemini 2.0 multimodal live video.

I was blown away when testing the new ability to "Join" the conversation of the podcast hosts in NotebookLM's Audio Overview. While the hosts explained the value of a personal professional blog, I interrupted, asking questions with my voice.


Near the beginning, the hosts tell students to write about their unique skills. I clicked the "Join" button, and a host said something like, "Looks like someone wants to talk." I asked, "How do you know your unique skills?" They said, "Good question," gave good tips, and continued with the main subject. Later I interrupted and asked, "Can you summarize what you have covered so far?" They said sure, gave a nice summary, and then picked back up where they left off.

Finally, I interrupted to ask a common student question: "What if I'm nervous about publishing a public blog?" The hosts reassured me, saying people value honesty and personality, not perfection. What really impressed me was the hosts answering questions about things not specifically in the sources. They could apply concepts from the sources to the unique perspective of a given audience.

Multimodal AI as a live co-worker.

This last demonstration of the new multimodal capabilities of AI is for my own use. With Gemini 2.0 in my Google AI Studio account, I could interact in real time using text, voice, video, or screen sharing.

The video below demonstrates what's possible in live video conversations with Gemini 2.0 as it "sees" what's on my screen. I had a conversation with it to get feedback on the outline of the new five-part AI integration workshop I'm planning this Spring for faculty on campus.
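For anyone who wants to try this outside the AI Studio interface, Google also exposes Gemini 2.0's live interactions through the Multimodal Live API in its google-genai Python SDK. Here's a minimal text-only sketch based on the quickstart as documented at Gemini 2.0's launch; the API key and prompt are placeholders, and the model name and method signatures may have changed since:

```python
# Minimal text-only sketch of the Gemini 2.0 Multimodal Live API
# (pip install google-genai). Based on the quickstart documented at
# Gemini 2.0's launch; details may have changed since.
import asyncio
from google import genai

# Placeholder API key; the Live API required the v1alpha API version at launch.
client = genai.Client(api_key="YOUR_API_KEY",
                      http_options={"api_version": "v1alpha"})
config = {"response_modalities": ["TEXT"]}  # could also be ["AUDIO"]

async def main():
    async with client.aio.live.connect(
        model="gemini-2.0-flash-exp", config=config
    ) as session:
        # One conversational turn; the live session keeps context across
        # turns, like the screen-sharing conversation described above.
        await session.send(
            input="Give me feedback on this workshop outline: ...",
            end_of_turn=True,
        )
        async for response in session.receive():
            if response.text:
                print(response.text, end="")

asyncio.run(main())
```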

Writing the last two blog posts was time well spent.

Planning what I'll do in the Spring and writing these last two blog posts took me two to three days. Because it was 100% human created, there was struggle and a time commitment. But that is how I learn. This knowledge is now in my memory, so I can explain it, apply it, or answer questions.

Talking to Gemini was helpful, but it doesn't compare to the conversations I've had with colleagues. AI doesn't know what it feels like to be a professor, a professional, or a human in this unprecedented moment. Let me know how you're moving beyond AI bans and where you're exercising caution.

I have a lot of work to do to implement these ideas. That starting horn for the new semester is approaching fast.

100% Human Created!