Will Artificial Intelligence eventually replace chefs?
That's the question I'd like you to ask yourself as you read what follows.
For this post, I'm going to offer a short version and an extended version, and you can decide for yourself how much you'd like to read.
The short version of this story is that last week I asked an artificial intelligence program for help with my dinner... and the result was delicious.
The exact prompt I gave ChatGPT (the large language model chatbot developed by OpenAI that responds to questions and tasks with surprisingly human-like answers) and the system's response are in the screenshot below.
If you want to read more about the implications behind this, you'll need to read the longer version of the story that appears after the recipe. Otherwise, see you next time!
Hey, you're still here? Thanks for sticking around!
The extended version of this story starts off by addressing an obvious question: "Why would you ask AI for a dinner recipe?"
And the simple answer is "because I wanted to know what would happen".
Here's the backstory: earlier last week, I purchased a few gigantic pork shoulders because they were on sale for a ridiculously low price, and two things I often find too difficult to resist are meat and great deals.
The problem is these slabs of meat were too large to fit into our family's slow cooker, and we haven't had a functional oven for over a year.* I could have cut the slabs into smaller portions but decided instead to try using my barbecue.
I've never barbecued a giant pork shoulder before and didn't know where to start.
At this point, most people would Google "BBQ Pork Shoulder recipes" and go from there.
But I'm becoming increasingly fascinated with how Artificial Intelligence can be applied to our everyday lives, and the impact it might have on not only the marketing profession but also the world at large. I've been testing various A.I. tools whenever I can gain access to them to try and learn about their general capabilities and limitations, and creating a recipe seemed like a great use case.
So I decided I would prompt ChatGPT to give me "a detailed recipe for a pork shoulder cooked on a BBQ with a coffee rub."
Why with a coffee rub? Two reasons. First, a former colleague once prepared a BBQ dinner for our team with a coffee rub, and it was delicious. And second, I didn't want to make it too easy for the A.I. to complete my request. It was a test, after all.
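(An aside for the curious: I used the ChatGPT website, but the same prompt could be sent programmatically. The minimal sketch below assumes you have an OpenAI API key set in the OPENAI_API_KEY environment variable; the model name is an assumption, not what produced my recipe.)

```python
# Minimal sketch: send the same prompt through OpenAI's Python library
# instead of the ChatGPT web interface. Assumes OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumed model; use whichever you have access to
    messages=[
        {
            "role": "user",
            "content": (
                "Give me a detailed recipe for a pork shoulder "
                "cooked on a BBQ with a coffee rub."
            ),
        }
    ],
)

print(response.choices[0].message.content)  # the generated recipe
```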
But it was a test the A.I. passed with flying colours: you can see the recipe it offered above.
If you're curious, I prepared it exactly to the instructions I was given with three exceptions:
I didn't have Kosher salt, so I used regular salt instead. When I posted the A.I.'s recipe on LinkedIn, a Jewish friend of mine joked, "I thought if you put kosher salt on pork, the universe collapses into itself!" Hmmm... perhaps A.I. is out to get us after all.
A friend who's much more proficient with a BBQ than I am told me that two tablespoons of salt wouldn't be enough for all that meat, so I increased the quantity to 1/4 cup. (That might have been a mistake; the rub was definitely on the salty side.)
I didn't have "at least 4 hours" to let the pork sit in the refrigerator if it needed "about 6-7 hours" to cook because I began preparing this meal at 9:30 am and didn't want to eat dinner at eight o clock. So it only sat in the fridge for 90 minutes and cooked for six solid hours at 350F.
Now let's recap:
I asked ChatGPT for a recipe;
It provided me with one;
I followed most of the instructions I was given...
... and the result was delicious.
The $29 billion question is this: was ChatGPT intelligent enough to "create" this recipe or did it simply find it on the internet and claim it as its own?
When I shared on LinkedIn yesterday morning that I intended to let AI help with my dinner, I received a flood of comments.
One connection suggested, "It just googled a recipe for you."
I don't think that's entirely accurate, though. Because if that's what it did, then I should have been able to take a segment of the recipe, put it into quotes, search for it on Google, and have Google return an exact match.
But when I tried that, Google failed to return anything:
If you understand how Google works, this is significant. When you search for something on Google in quotes, you're telling the search engine you want an exact match to your query. And so Google goes off and searches the internet for exactly what you typed.
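(If you want to try this check yourself, here's a rough sketch of what I did: take a distinctive sentence from the recipe, wrap it in quotes, and hand it to Google as an exact-phrase query. The excerpt in the code below is a placeholder, not the actual line I searched.)

```python
# Rough sketch of the exact-match check: wrap a distinctive excerpt in
# quotes and build a Google search URL for it. Zero results for a quoted
# query means Google found no indexed page containing that exact phrase.
import urllib.parse
import webbrowser

# Placeholder excerpt -- not the actual line from my ChatGPT recipe.
excerpt = "rub the coffee mixture evenly over the entire pork shoulder"

# Quoting the phrase tells Google to match it exactly, word for word.
query = f'"{excerpt}"'
url = "https://www.google.com/search?q=" + urllib.parse.quote(query)

print(url)
webbrowser.open(url)  # opens the exact-phrase search in your browser
```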
But in this case, Google couldn't find exactly what I typed (i.e. the excerpt of the ChatGPT recipe) anywhere on the world wide web, which suggests one of two things:
The exact recipe DOES exist on the internet but Google couldn't find it.
The exact recipe DOES NOT exist on the internet; ChatGPT created it.
If the first scenario is true, and Google wasn't able to find the exact recipe that exists somewhere on the internet, then the obvious question becomes: how was ChatGPT able to find it if Google couldn't? The implication here is that ChatGPT's search capabilities are better than Google's, and at least for the moment, that probably isn't true.
(On a related note, if you ever thought Google is so big and powerful that it can never be disrupted, well, this might be how it happens. All companies can be disrupted over time: there are good reasons the Fortune 500 list from 1955 looks so different from today's.)
If the second scenario is true, and the exact recipe I was given doesn't exist somewhere on the internet, that would mean ChatGPT couldn't "steal" it and serve it up to me. Instead, it had to "create" a recipe from whatever information about BBQ pork shoulder recipes it could find, or find a recipe that closely met my parameters and modify it.
This brings us to a second interesting comment posted on my LinkedIn thread.
A friend remarked, "Ok, but really this was a recipe that someone else made that somehow the AI read, understood, maybe tweaked slightly--- but it doesn't *know* how to cook. It knows how to learn and steal."
Perhaps that's true. But doesn't all learning involve an element of stealing established ideas? To use the example I gave my friend in reply: my university students "steal" the ideas I present to them each class and then either reject them (and form their own contrary opinions) or adopt and build upon them to create ideas of their own.
So if ChatGPT found an existing recipe, realized it didn't perfectly meet my needs, and then modified it until it did... isn't that exactly how a human would tackle the problem?
Isn't what ChatGPT did the same as a human taking their existing knowledge and building upon it to end up with a better outcome?
I'm not an expert in A.I. but it sure seems that way to me.
The Globe and Mail recently suggested 2023 could be the year of artificial intelligence. TechCrunch wrote about what we might expect next year from the rapidly progressing technology. And if you have a LinkedIn account, it's not difficult to find people from all backgrounds testing the limits of tools like ChatGPT.
As I was scrolling through my social media feed last week, I stumbled across a great insight: "A.I. won't replace your job. Someone who knows how to use A.I. will replace your job."
Taking the time to start learning how to use A.I. tools is time well spent.
And perhaps asking for some help with dinner is one easy way to begin.
* It's a long story that involves the need to replace a defective Power Selector Timer for our Ancona oven and a stressed global supply chain that's prevented us from securing one... don't ask. But if you happen to know a part supplier, please message me!
P.S. In keeping with the theme of this post, the image used above was also created via Artificial Intelligence; I prompted Bria.ai to give me a picture of "a robot making a meat dish in a nice kitchen", and voila! You can sign up to Bria.ai for free and then use the site to find and edit visuals that meet your exact requirements. A special thanks to Omri Perahia for giving me "Ultimate" access so I could play with the tool and share my findings with you.