Have you heard someone say "SEO is dead in 2024" or "SEO is not going to be part of marketing in the future"? If so, you'll probably want to give this episode a listen!
Marie Haynes is a leading authority in the SEO industry, especially recognized for her expertise in Google algorithm updates and website quality assessment. She's also the founder of her own community, The Search Bar.
Watch the episode below or listen on Spotify to learn more about how AI will affect search, and why we need people who understand it if we want to have a chance to rank in the future:
Listen on Spotify
Takeaways from this episode
- Google's primary goal is to help users find information.
- AI is increasingly integral to Google's search algorithms.
- Content quality and user intent are crucial for SEO success.
- The mixture of experts model enhances Google's AI capabilities.
- User behavior significantly influences search rankings.
- Marketers must adapt to the evolving landscape of SEO.
- Creating helpful content is essential for ranking well.
- AI tools can aid in understanding and improving content strategies.
- The future of SEO will require a focus on user engagement.
- Continuous learning and adaptation are key in the AI era.
Full video transcript
Marie Haynes (00:00.248)
Google's goal is not to make websites rank. Their goal is to help people find the information that they're looking for.
Paul Mortimer (00:19.95)
Welcome to Avidly Talks, a mostly marketing, sales and HubSpot focused podcast. Each week we're going to dive into topics that help you do better in your work and try to make you smile along the way. This week we're joined by Marie Haynes, a true icon in the SEO world: consultant, public speaker, podcast host, CEO, and leader of her own community, The Search Bar. Hi, Marie. Hi, thanks for having me. Morning where you are, isn't it? Over in Canada? It is, yes.
Paul Mortimer (00:48.974)
Cool, right, so I want to know what your take is on how fast things have sort of changed and accelerated, particularly in the past one or two months. Yeah, interesting. I mean, I think in the last couple of years we've seen, well, obviously, ChatGPT only came on the scene at the end of '22. But in the last few months, I think it's not really evident to a lot of people how big the change has been.
Marie Haynes (01:16.974)
So I'm sure we'll talk about this, about how Google's search algorithms are so strongly driven by AI, by different machine learning systems. We learned in '22 that the helpful content system was a machine learning system, so machine learning is a subset of AI. And then with the March core update, I really believe that Google set the stage for significantly
Marie Haynes (01:44.448)
more AI being involved in making the decisions on what ranks. So the March core update happened, it rolled out early March of 2024. And just before that in February, I think it was like February 15th, Google put out a blog post about Gemini 1.5. Now you've heard of Gemini, you know that Gemini is the chatbot that is very similar to ChatGPT. But Gemini is also the name of
Marie Haynes (02:11.372)
this era of language models, of machine learning systems. Are you familiar at all with what the transformer architecture is? Have you heard the word? OK, well, you've heard the word transformer because that's the T in ChatGPT: generative pre-trained transformer. Right. So the transformer architecture was something that Google freely gave to the world. And it is the basis of most of what we call AI today:
Marie Haynes (02:39.222)
language models, chatbots, things like that. It's basically just math; everything in AI is just math that works to make predictions. So with ChatGPT, when I type a sentence and I ask a question, the words that come back have been predicted by math as the likely answer to what I want to see.
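To make "predicted by math" concrete, here is a toy Python sketch. The words and probabilities are made up; real language models compute these probabilities with billions of learned parameters rather than a hand-written table.

```python
# Toy sketch of "the words that come back have been predicted by math":
# given a probability distribution over candidate next words, return the
# most likely one. Purely illustrative -- not how any production model works.
next_word_probs = {
    "Paris": 0.62,    # hypothetical probabilities for the prompt
    "London": 0.21,   # "The capital of France is ..."
    "Madrid": 0.09,
    "pizza": 0.08,
}

predicted = max(next_word_probs, key=next_word_probs.get)
print(predicted)  # -> "Paris": the reply is simply the top-scoring prediction
```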
In February of 2024, Google made this announcement about Gemini 1.5, and it was not to do with the chatbot. It was about a new architecture that they have called a mixture of experts architecture. And my understanding, because I mean, I'm just learning about all of these things so I'll do my best, but I did talk to Jeff Dean, the head of AI at Google. He randomly sat next to me at the I/O conference, which was just wild, and we had a 20-minute conversation about these things. And I didn't know at the time that he was one of the authors of this mixture of experts paper. So here I am sharing my theory with the person who actually designed the architecture. And I said to him, am I right in thinking that this is a bigger deal than the transformer? And he agreed with me that this is a really, really big deal. So.
Marie Haynes
So instead of Google having AI that is based just on the transformer network, which is doing math to figure out what to predict, now the mixture of experts model has a gating neural network, which then decides what information to send to thousands of other neural networks, potentially millions. I don't know if it's thousands or millions, but it's a mixture of many neural networks.
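A minimal Python sketch of that gating idea, assuming a toy setup with random weights; it only illustrates the routing, and is not Google's Gemini architecture or anything close to it.

```python
import numpy as np

# Toy mixture-of-experts: a gating network scores every expert for an input,
# and only the top-scoring experts actually process it. The efficiency comes
# from most experts sitting idle for any given input.
rng = np.random.default_rng(0)
n_experts, dim = 8, 16

gate_weights = rng.normal(size=(dim, n_experts))                    # gating network
experts = [rng.normal(size=(dim, dim)) for _ in range(n_experts)]   # the "experts"

def moe_forward(x, top_k=2):
    scores = x @ gate_weights                          # how relevant is each expert?
    chosen = np.argsort(scores)[-top_k:]               # route to only the top-k experts
    weights = np.exp(scores[chosen])
    weights /= weights.sum()                           # softmax over the chosen experts
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

output = moe_forward(rng.normal(size=dim))
print(output.shape)  # (16,) -- same output shape as a dense layer, far less compute
```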
Marie Haynes (04:25.686)
So Google said that this mixture of experts architecture is the foundation of all of their AI models. Now, in order to understand why that's important to search, we have to understand how AI is used in search. For this part of the conversation, you'll have to believe me that a huge component of search relies on machine learning systems,
Marie Haynes (04:51.694)
and many of them work together along with some of the traditional attributes that we as SEOs have optimized for. And over the last seven or so years, Google's been more and more putting these AI systems into the core of search. So with the mixture of experts model, what it means is that Google systems now can use way more information.
Marie Haynes (05:18.606)
And they're way more efficient at doing these calculations. So again, when I was at the I/O conference, Jeff Dean gave a fireside talk, and it was mostly just influencers there. They had me there as an influencer. And it was just to a very small group of people that he shared this: the thing that always hinders Google, the thing that has hindered their progress, is computational power. Because ranking has always just been math.
Marie Haynes (05:47.978)
Over the years, Google has done more and more with AI. And the Gemini 1.5 announcement that said that they had much more efficiency and they could use much more data kind of tells us that now, all of a sudden, they can do more calculations for search. So they just get more and more specific in accomplishing the goal that has always been their goal, which is to organize the world's information and make it accessible and useful
Paul Mortimer (06:17.39)
to everyone. Yeah, no, definitely. I think the bit you're talking about is quite deep into some language to do with AI, no pun intended. What's the impact of everything you've just said for your general marketers like myself? So I'm more of a content marketer. We do SEO, but we've got out-and-out SEO specialists, we've got AI specialists listening and we'll probably...
Paul Mortimer (06:47.456)
have understood the things you said and been on that journey with you. But for perhaps your marketing managers, your CMOs, or hands-off people, or more content people like me, what's that journey been that you've talked about? Is it going from, you know, how it's improved guessing based on your location, based on your previous searches? What's going into those mathematical equations? If I wanted to really simplify everything that I just said.
Marie Haynes (07:16.854)
And it takes like a whole book to explain it. But if I wanted to simplify it all, when an update happens, what does Google always say? Like they say, well, the key is to create great content, create helpful content. And Google gave us this documentation on the helpful content questions that you could ask yourself, you know, is the content original, insightful? Does it appear to come from a place of expertise? That type of thing. Well, when those questions first came out,
Marie Haynes (07:46.934)
And that was really one of the things that drew me into studying Google's algorithms: Google gave us the quality rater guidelines, which were similar. You know, really the helpful content questions are kind of like a condensed version of what's in the guidelines on how to determine what is high quality. When those came out, anybody who knew anything about SEO would say, well, there's no way an algorithm could determine that type of thing. Anybody who understood PageRank
Marie Haynes (08:15.81)
would say, like, there's no way Google can say this page is helpful and this page is not helpful. It just doesn't make sense. Like, how would you do that math? And so all of this stuff that I've been babbling on about is what makes it possible for Google to understand two things. First, AI helps Google understand the intent behind a query. And that's really important. Like, one of the papers that I talked about in my book
Marie Haynes (08:45.332)
says that if I said the phrase, the car left the house and turned down the driveway, your brain sees way more than is actually in those words. Like, you know that somebody got in the car, put a key in the ignition, turned the key. It wasn't just that the car went down the driveway and turned; a lot of things were implied in that one sentence that I just said. There's a
Marie Haynes (09:13.09)
video from Frédéric Dubut from, I think it was 2020, when he was in search at Bing, and he said that SEOs are going to need to change from keyword research to intent research. And the reason is that whether it's Bing or Google, they're using AI to better understand the intent behind a query. What's important to know is that it used to be about keywords. It used to be that if I typed, you know,
Marie Haynes (09:41.356)
best accountant in my city, something like that. The pages that would rank would be pages that were relevant to those keywords, that had the words best, had the word accountant, had the city in the title tag and on the page. And to some extent, that's still important. Google's got to start somewhere in figuring out which pages are likely to be relevant. But there may be a lot more behind that intent. And some of it can be personalized. Like Google might know, well,
Marie Haynes (10:10.89)
I own a small business, and so really I'm looking for the best small business accountant. And then Google might infer that somebody who searches that query maybe doesn't just want a business whose content says they're the best; they're really looking for who other people think is the best.
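To see the difference being described here, a toy Python sketch of the traditional keyword-relevance step only, with made-up pages; the intent layer that systems like RankBrain add on top is exactly the part a few lines of code can't reproduce.

```python
# Toy keyword-relevance scoring: count how many query terms appear in each page.
# This is the "Google's got to start somewhere" layer; the intent and
# personalization signals would then re-order these candidates.
query = "best accountant toronto"

pages = {
    "local-firm":  "Best accountant in Toronto for small businesses",
    "directory":   "Reviews of the best accountants in Toronto",
    "recipe-blog": "The best banana bread recipe you will ever bake",
}

def keyword_score(query: str, text: str) -> int:
    terms = set(query.lower().split())
    words = set(text.lower().split())
    return len(terms & words)

ranked = sorted(pages, key=lambda p: keyword_score(query, pages[p]), reverse=True)
print(ranked)  # pages sharing the query's words float to the top
```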
Marie Haynes (10:37.74)
The simple explanation is that when Google gives us these helpful content criteria, what they're saying is that these are the types of things their systems are built to reward. And whether we can understand how they do it or not, they're getting closer and closer to being able to do that. So as a content creator, you should really be paying attention to those documents. The very first line asks: is the information original? And is it insightful? And so much content
Marie Haynes (11:07.116)
that has done well historically on the web has done well because we understand Google, we understand how search engines work, but maybe it isn't the most original and insightful. So tell us about the August core update then. What have been your findings? What have been your takeaways? There are a number of sites that were impacted by the September helpful content update, which was devastating to many sites, that are now seeing some very significant improvements. You can see these beautiful charts
Marie Haynes (11:35.822)
where traffic is improving, keyword rankings are improving. Although just this morning I'm analyzing a number of these sites, and I think the improvements that they're seeing are related to further drops that happened with March core and not necessarily helpful content, which is a bit confusing. There's one site that I reviewed. This is a really interesting case actually. This site saw huge improvements with the March core update. I can't share with you who it is. It's a medical site.
Marie Haynes (12:05.056)
And one of the things I noticed when I was looking at this site, because what I'll do when I'm analyzing sites impacted either positively or negatively by core updates is look at what keywords changed and look at who Google elevated or didn't, you know, who did Google start preferring? And then just kind of look at the content and see, like, can you tell? Is it more helpful? In this situation, this one particular site,
Marie Haynes (12:30.902)
the pages that were elevated, every one of them had a really helpful video on it that explained the medical condition with like 3D graphics and, you know, it was really, really, I thought something that would be helpful. And my theory, and it still is my theory, is that Gemini 1.5, which I believe the new architecture, the mixture of experts architecture was launched in, along with the March core update, which
Marie Haynes (12:59.82)
There's an article on Search Engine Land where Elizabeth Tucker from Google says that Google introduced architecture changes and introduced new signals into the core algorithm. The March core update allowed Google to also understand video, because Gemini is multimodal, meaning that it can understand video and audio and a number of other things.
Marie Haynes (13:25.044)
So it makes sense that the predictions that are trying to say, well, is this likely to be helpful? Could say, well, this is something that contributes towards helpfulness. And it was also original. We didn't see those videos on other sites. And so this site saw this really, really big increase with March Core. Then so far, we're only a few days into August Core. They're seeing almost all of those gains be clawed back. So what's happening?
Marie Haynes (13:52.992)
In order to understand what's happening, we need to talk about the DOJ versus Google trial, because I believe it's about NavBoost. So this court case, I didn't pay much attention to it at first because it seemed to be mostly about Google's deals, and it is mostly about Google's deals with Apple and with the mobile phone carriers. But in the case, they go into great detail about the ranking systems. And if you're an SEO and you have not read Pandu Nayak's testimony, like, that's
Marie Haynes (14:22.69)
You should stop, I'm sorry, Paul, you should stop listening to this podcast like right now and go read, maybe finish the podcast and then go read this document. So Pandu Nayak gave testimony and they talk about the three systems that are the main ranking systems. One is RankBrain, RankEmbed BERT is another, and then DeepRank; those are the three systems that they talk about. And RankBrain is essentially Google's AI brain for ranking. It helps.
Marie Haynes (14:51.912)
They talk in the trial about how Google will... It's actually pretty amazing how if I put a query into Google, out of the trillions of pieces of content that are out there, it can figure out which few hundred are likely to be relevant to my query. So Google uses some traditional systems to find those few hundred cases, which ones have keywords that are relevant and...
Marie Haynes (15:21.59)
even PageRank is likely a part of that. And then RankBrain re-ranks the top 20 to 30 results. That's pretty wild. When I saw that, like my jaw just dropped because essentially once you're into the top few pages of search, then it's AI that decides what ranks next. Now I have a bunch of theories on how they do that, but Google has told us repeatedly that it's the results that
Marie Haynes (15:50.046)
are most likely to be helpful that get ranked in that way. So those three systems are fascinating. RankEmbed BERT is something that is absolutely fascinating. BERT is also, have you heard of BERT? Do you know what I'm talking about? I don't know about it, no. Okay. So there's an article that Google wrote in 2018 talking about BERT. And it also uses transformers. It's AI that understands intent.
It's trying to figure out, in this big long query that you type, what words are important. What did you actually mean when you typed that? But also in the DOJ trial, they talked about a system called NavBoost. Had you heard of NavBoost before the last few months? Yeah, I've heard of all these things that you're talking about, but not really considered them; as you're talking, I'm thinking, yeah, I remember us talking about that,
reading about that. This is over the past almost 10 years. Yes, NavBoost is not new, but it's new to us as SEOs. Google has not said very much about it. So this NavBoost system stores every single query that is performed on Google search. I believe you can go into your settings and turn it off, not NavBoost specifically, but you can tell Google, I don't want you to track my
my queries, but nobody does that. So every single query that's searched is stored by NavBoost, and not just the query, the actions you took in Google Search. And then we learned more when the API files came out that were supposedly leaked but had been on GitHub for two years that showed all of these different attributes that potentially could be used in ranking, and some of them relate to NavBoost.
So NavBoost can determine what the user clicked on first. So what was their first click? The next is the longest click. So something that captivates a user. So say I'm doing research and I click on your site and I scroll through and I read a bunch of stuff. Now, can Google see that I scrolled through your site? It's debatable. The privacy policy says that they can see our actions in Chrome. They can see what we purchase.
(18:15.598)
I think they do use that, I don't have any evidence for that. But what we do have evidence for is that they know if you spend a lot of time on a site and then you went back to the search results and continued your search, then that's a signal that might indicate that I found your site to be helpful. The third one is whether you're the last click. And that, I think, is probably the most important thing because...
(18:40.566)
That means that that was the page that satisfied my search. The reason why I picked up my phone or opened up my computer was that you satisfied my search and I didn't go back to the search results. Now, clearly it's not black and white. Like, it's possible that I closed my computer because I got frustrated or I got distracted and went on to something else. But it's not hard to imagine that you could take all those signals, especially for every single search that's ever been done without privacy controls turned on,
(19:10.718)
that you could put all that together to create a system that tries to predict whether somebody is going to find content helpful. So NavBoost was discussed in great detail in the trial. Part of the trial was talking about how much Google relies on user behavior. So every time you do a search, you're training Google as to what was relevant to that search, what you found helpful for that search.
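Here is a toy Python sketch of how click signals like the ones listed above (first click, long click, last click) could be rolled up into an engagement score per result. It is purely illustrative: the real signals, weights, and math inside NavBoost are not public, and the field names below are invented.

```python
from dataclasses import dataclass

@dataclass
class Interaction:
    url: str
    dwell_seconds: float    # how long before the user returned to the results, if at all
    returned_to_serp: bool  # did they come back and keep searching?

def engagement_scores(session: list[Interaction]) -> dict[str, float]:
    """Aggregate toy click signals into a per-URL score (invented weights)."""
    scores: dict[str, float] = {}
    for position, hit in enumerate(session):
        score = scores.get(hit.url, 0.0)
        if position == 0:
            score += 0.5    # first click: the result looked promising
        if hit.dwell_seconds > 60:
            score += 1.0    # "long click": it held the user's attention
        if not hit.returned_to_serp:
            score += 2.0    # last click: this page seems to have satisfied the search
        scores[hit.url] = score
    return scores

session = [
    Interaction("site-with-video.example", dwell_seconds=20, returned_to_serp=True),
    Interaction("mayoclinic.example", dwell_seconds=180, returned_to_serp=False),
]
print(engagement_scores(session))
# A system could use scores like these to adjust what it predicted would be helpful.
```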
(19:39.852)
So let's go back to this case that I was telling you about where, with the March core update, they saw big improvements and I thought it was because, there's a video on the page and Google's new Gemini technology allows them to, you know, that's more signals that they can use. And so they predicted that these pages with videos were likely to be helpful and that gave them a boost, which is why they saw the increase. Well, then why did they decrease with the August core update, assuming this decrease sticks?
(20:08.684)
And I think it's because of NavBoost. So I think Google predicted that these pages would be helpful. And then when they looked at actual user engagement, I only looked at a couple of queries so far, but the pages that are outranking them are like the Cleveland Clinic, Mayo Clinic, and also scientific articles above these pages that had videos. So Google predicted what was likely to be helpful, and then they adjust their predictions
(20:36.908)
based on what the NavBoost signals actually tell them. So that, to me, to sum it up, because, like, well, that all sounds great, and I probably made a whole bunch of people's dopamine go up because, like, wow, I never knew this, right? But what do you do with that information, like, as an SEO? And it all comes down to what the Google employees tell us, which everybody makes fun of them for, which is: create content
(21:04.654)
that your audience is actually going to find helpful. So those sort of AI models you started the podcast with, it all comes back to it ranking those. Did you say the first 30 results? RankBrain, yeah. Which, we don't know exactly how RankBrain works, but yeah. I remember there being a headline, I think it was this year, perhaps it was last year, about Google getting rid of a lot of quality raters. How would Google use sort of human feedback
(21:32.994)
beyond those interacting with generative answers? And how heavy do you see the generative answers being, and how does SEO content as an industry respond? Where do you see those things heading? So, quality raters: a couple of years ago, Google changed some wording. Every time the wording changes in the QRG, like, you know, I jump at it to see what did they change, because it's not random; paying attention to the wording in the QRG matters. Well, a couple of years ago, they changed the wording under
Paul Mortimer (22:01.804)
one of the beginning parts where it talks about the purpose of the quality raters. And they said that, I'm not saying it exactly, that their answers helped provide Google with helpful, no, with examples of helpful and unhelpful results. And so it made sense to me that if the quality raters were rating content as helpful or not, those are examples that could be used for training machine learning systems. Now it turns out that
(22:31.726)
That was, like, partly right, and I'm trying to see if I can explain this without getting too confused. In the DOJ versus Google trial they talk about this thing called IS data, which they don't say what it stands for, but it seems that it's information satisfaction. You know, when Google talks about AI overviews and they're like, users are more satisfied with it, or it's shown a 40% reduction in
(23:01.402)
or improvement in satisfaction, like, those are really based on the quality raters' ratings. There's a video from Larry Page, co-founder of Google, in the year 2000, where he talks about how essentially the perfect version of search would be that you could ask any question and it would provide you with the answer. And then he goes on to say that would be artificial intelligence. So Google's goal is not
(23:29.666)
to make websites rank. Like, their goal is to help people find the information that they're looking for. And it is frustrating, because they played a game with us: they gave us this opportunity to create content, put it on the internet, make money from it, whether it's ads or affiliate sales or leads or whatever it is that you make money from. And in return, we have more and more incentive to just keep creating content, right?
(23:58.156)
This is why the helpful content system, when Google first came out with it, they called it, they talked about people first content, that it was important to have an audience. And it's really hard, and I feel like I have a really hard time describing this because if you are a content creator, it's hard to see the difference. Like, it's hard to see that my content might be more helpful, like as a content creator, but why are you ranking this person who like,
Paul Mortimer (24:28.014)
has content that seems less helpful, but yet they have, like, a real-life store. And it's because of the intent that's behind it. This is AI's goal: to connect people with what they wanted to find. So if that is information, AI is getting better and better at providing me with information. Like, if I want to know about a medical condition, I can get a pretty good answer from ChatGPT or from the Gemini chatbot.
Paul Mortimer (24:57.642)
Now there are certain things that I'm going to want to go to somebody with authority on. Like I'm not going to make a decision on taking a medication, you know, from ChatGPT, but if ChatGPT recommends the Mayo Clinic and the Mayo Clinic says, you know, this medication is good and here's all the research behind it, like, okay, that's something good. So I think that we're always going to need this combination of language models backed up by authoritative
Paul Mortimer (25:27.384)
content. The question that always comes up then is, well, why would anybody publish content if the language model is just going to absorb it and use it? How do you benefit from that? We'll go back to me again. Sometimes if you ask ChatGPT, who's a good SEO to understand Google's algorithms, it will say it'll recommend me. Why does it recommend me? It's because I've produced a wealth of content on this.
Paul Mortimer (25:57.32)
And even though ChatGPT doesn't specifically use E-E-A-T, it's the same concept. It knows who in the world is known. When I'm just doing this podcast with you, that puts more evidence into the world to say, you know, people actually want to talk to Marie about Google algorithms; therefore she must be respected as somebody who knows what she's talking about. There are clearly situations where you could try to, and people do try to, manipulate
Paul Mortimer (26:26.274)
E-E-A-T in that way. You know, that's why you see on Twitter that some of the spam comments are, like, these crypto bots, and, you know, they're just trying to put more and more information into the world to say that people trust this. And so, that actually, let's talk about Twitter. Why do you think Elon Musk spent $44 billion on Twitter? Not broadcastable. It's not broadcastable. Okay. It seems ludicrous, doesn't it? Like, 44 billion is
Paul Mortimer (26:55.03)
an insane amount of money for a social media platform. Twitter is probably the leading source of breaking news information. If you ask Musk why he bought Twitter, he wants X, Twitter now X, to be the source of truth. They're working towards it. And I do believe, I think we're in an era right now where people still know how to manipulate algorithms, but algorithms are continually learning.
Paul Mortimer (27:24.514)
what it is that is likely to be true. And so this is why Twitter started paying people who could produce viral content, like why you could get payments from Twitter, is because they need the experts, they need the world's experts. So when I go on Twitter and I share, I saw these sites make improvement, here's my thoughts on that.
Paul Mortimer (27:52.512)
And Twitter knows that there's enough evidence that people who are verified as people, that's debatable as well, but like a good number of people who the X algorithm tends to trust pay attention to what I write, then that's a signal that something's good. So Twitter amplifies my tweets because they need my tweets. They need Glenn Gabe and Lily Ray and all the, and Barry Schwartz, like people who are sharing the news.
Paul Mortimer (28:22.03)
And in this quest towards what's actually true. So if you're this manufacturing company, you need to put more evidence into the world that you are good. Now, some of that could be content, but it's not just like content that looks good to search engines, it's content that people actually seek out and engage. So I would be writing content that people can't find on the web about your products.
Paul Mortimer (28:51.596)
I'd be sharing stuff that, you know, if there's 100 articles written about this one particular type of product, you gotta figure out, how could I produce something that people would actually wanna see alongside those 100 articles? It's not easy, right? And especially as an SEO, it's like our bag of tricks that we have to make something look like it's better than it is, is getting smaller and smaller. So, you know, links, yeah, links are important, but not...
Paul Mortimer (29:21.012)
all links. And there was a time where we could trick algorithms into thinking that we were better by getting links. But if that customer got mentioned in authoritative news sites, that puts more evidence into the world that people were talking about them. However, again, when your brain starts thinking about manipulation: I know how to get into Forbes. I know people I can pay to get into Forbes. But even Gary Illyes from Google said years ago that
Paul Mortimer (29:49.57)
Google knows which parts of Forbes to trust and which journalists they can pay attention to. So what do you do? You become the business that is known for making that product, which, I know, is not a great answer, because what does that mean for SEO? It's hard because our knowledge of how search works got us really far.
Paul Mortimer (30:19.314)
And today what search is trying to do is to actually say who is the best. Yeah, so, okay, so let's swing it back around to the AI overviews then. The AI overviews got a lot of really bad publicity and some of it for good reason. So you probably heard the stories of like one person asked for what to do about depression and the answer the AI overview gave was
Paul Mortimer (30:47.712)
According to one Reddit poster, you should jump off a bridge, which is clearly not what Google would want to have ranking in their search results. That actual answer was manufactured. It was a fake screenshot. It made it to the New York Times. The New York Times had to retract that part of their article, saying it was manufactured. Now, some of them were real. I don't know, I haven't tried it recently, but if you ask for, like,
Paul Mortimer (31:15.746)
how to keep things from sliding off your pizza, it said, according to one Reddit user, again, you should try non-toxic glue. And now I'm pretty sure that if you ask that now, it won't give that answer. People don't go to social media to talk about the great answers, like how they were helped by the AI overviews. And there are many people that are. So.
Paul Mortimer (31:40.674)
Google, in their earnings call and in some documentation, have said, remember we talked about those information satisfaction scores, that when people use AI overviews, they are generally satisfied and they come back and they use them more. So at first, when this all happened, where there were, like, some AI overviews that were questionable, Google really cut back on how many queries they answered with those. And now they're starting to creep up again. Google would not be putting AI in
Paul Mortimer (32:10.732)
these overview answers and in Gemini unless they're starting to feel comfortable that they're helping people. And it's not just a gut feeling, you know; they have actual data to show that people do go back to these, they ask more follow-up questions. And if you get listed in the link cards in an AI overview, according to their data, a lot of the time you get more clicks. So, that's just like the rich snippets. I remember when they first launched rich snippets and everybody thought,
Paul Mortimer (32:40.77)
Yeah. You're not going to get clicks, but you get so many more clicks for being in a rich snippet. Being in one of those cards, it sounds pathetic, but when you get it on your phone, it's so easy. You can just click instead of having to scroll. There's an interview, well, not an interview. Sergey Brin, a few months ago, was at a hackathon for Gemini.
Paul Mortimer (33:04.702)
And if you saw anything of it in the news, the only thing that people reported on was some guy in the audience had this shirt that made him look like he had boobs. And that's what everybody focused on. Instead, like there's a 45 minute video where near the end of the video, somebody asked Sergey about Google's business model, because this is the question that always comes up is if Google needs content and content creators can't make any money, like
Paul Mortimer (33:33.816)
How does that work, if they can't show ads, and ads are Google's biggest source of revenue? So, like, are they in trouble? And he said something to the effect of, we've done search for 25 years and for 25 years we've had, like, a pretty good deal, you know? And he talked about content, and so Google needed that content. I believe that all of search, from the beginning to today, was to train their, what will eventually be their, AI assistant. So.
Paul Mortimer (34:03.63)
But what he said at the end that was really interesting, regarding the business model, was that if you provide people with a good enough product, the business model will just work out. And he spoke about how people pay for ChatGPT Plus, and at that time people were starting to pay for Gemini Advanced. He really made it sound like it will be so good that people will pay to use Google. And then...
Paul Mortimer (34:31.958)
That's just talking about search. We haven't even talked about what happens when businesses start to use this technology. And I think that Google has made it purposely difficult to use their APIs, because we're not quite at the point where it is good enough for a business to be super accurate with it. But we'll get there. It's continuing to learn. So on that theme, last question to wrap up.
Paul Mortimer (34:59.616)
literally do a full series with you, Marie, on this. It's fascinating. I think, well, it's fair to say we've not even scratched the surface, but you're obviously, like you say, it's your job to be on top of these things. You talk about businesses waking up and realizing the power of AI. We were talking about, with the changes this summer and the pace that things are moving at, revisiting what our client strategies should be for next year
Paul Mortimer (35:28.646)
and moving forward, where we thought the medium-term plan was. What's your advice for the 80% of the audience which is not, like, on this train yet? We still meet people who are, to be honest, sort of wanting to go more into SEO as though it was five, seven, eight years ago. So what's your advice to the 80% of the audience who is not attuned to this yet?
Paul Mortimer (35:55.938)
How do you prepare? Is there a chance to get ahead? What do people do? The short answer is that nobody knows. I really believe that. So Sundar Pichai, the CEO of Google, has said many times that this technology will be more profound for society than fire or electricity. Now that sounds like hyperbole that's, you know, said for marketing purposes, but I actually believe him. I think that
Paul Mortimer (36:25.518)
SEOs are like lamp lighters in the turn of the century where people would be employed to take their oil and go down the street and light all the lamps. It's like asking those lamp lighters how to write a computer program. There would be no comprehension there. So the short-term answer is I don't think anybody knows how to prepare, which is frustrating.
Paul Mortimer (36:51.49)
But that doesn't help you, because you need to be doing something and you have businesses that are looking to you. And the businesses are not looking for, like, I think this could happen in the future. They want to prepare now. So if you're an SEO, let's start with that first. The first thing I would say is use language models every day. Use ChatGPT and use Gemini. Even if you're not paying for the advanced version, use the free version, because
Paul Mortimer (37:21.262)
it will be a skill that will be in high demand, people who understand this. And if you believe the media, everything in the media that's written about AI is like, it's garbage, it's junk. Well, there's a reason for that, because AI is going to make a lot of the media not necessary. So use it, and if you have any interest in developing, try to create something with it.
Paul Mortimer (37:50.926)
When I was at Google I/O, I heard about the Google developer contest to create something with Gemini. And I just dabble with programming. I don't really know very much. And I made an app that used this theory that I have about pain, and it uses Gemini to make visualizations for people, and it has actually helped a couple of people already to not have pain, which is really cool, right? But in order to do that, I needed to learn a lot of...
Paul Mortimer (38:20.564)
I didn't actually program. I gave it all to the language models and they gave me the Python for the program. We will get to the point where we don't program. We'll just talk with language and say, my business needs more customers, how do you think we could use AI? And then AI will just figure out how to do it for you. That doesn't help you right now though, does it? So one thing I would say is, when you're playing with the language models, start asking about
Paul Mortimer (38:49.59)
your clients. Let me tell you an experience, and this is getting into theory here, but I asked Gemini a while back something like, does Marie Haynes have a community? And I do. I have The Search Bar, a little community. And Gemini said, well, first it said it didn't know who I was, and that's because Gemini is really careful about saying things about people, because
Paul Mortimer (39:13.71)
it wants to be accurate. If you push it about people, if you're like, no, you know, Marie Haynes, the SEO, it will often come back and say, yes, actually. I think it apologized to me. And it said, no, she does not have a community. And I said, I do actually, it's this, and I gave it a link to The Search Bar. Then I went to another account on a different computer,
Paul Mortimer (39:42.078)
nothing connected, like, I have a couple of different Google accounts, and I asked, does Marie Haynes have a community? And it said, yes it does, yes she does, here it is, and it gave me the community. Now, one possibility is that you get different results, you know, depending on, you know, maybe I would have gotten that the first time on my first account. But I think it's possible that you can teach the language models about your business by giving it information.
Paul Mortimer (40:09.718)
I think that's one of the ways that they learn. So for your client that is a manufacturer, you could be doing research that starts off as trying to figure out, like, does Gemini actually recommend your business, and then ask, well, have you heard of this company? And give a link to their website. And Gemini will often say things like, I did not know that they did this, and that looks like a really helpful page, or something.
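As a rough sketch of that research loop, assuming nothing about how you actually talk to the model: the `ask()` helper below is a hypothetical placeholder for whatever chat interface or API you use, and the company name and URL are made up.

```python
# Hypothetical sketch of the research loop described above: first see whether
# the model already recommends the client, then introduce the client with a link.
# ask() is a placeholder, not a real Gemini or ChatGPT client.

def ask(prompt: str) -> str:
    """Send `prompt` to the language model you use and return its reply."""
    raise NotImplementedError("wire this up to your own chat interface or API")

company = "Acme Widgets"          # hypothetical client
website = "https://example.com"   # hypothetical URL

prompts = [
    "Who are the best manufacturers of industrial widgets?",  # is the client mentioned?
    f"Have you heard of {company}?",                          # probe current knowledge
    f"{company} makes industrial widgets; here is their site: {website}. "
    "What do you think of this page?",                        # introduce the client
]

for prompt in prompts:
    print("PROMPT:", prompt)
    # print("REPLY:", ask(prompt))  # uncomment once ask() is implemented
```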
Paul Mortimer (40:36.362)
And so by conversing about your company with the language models, you can help to get it known in the language models. So that's one thing that I would recommend doing. For a business owner, I really don't know. And it's one of the struggles that I have. Google put out a paper a while back on what they think is going to be the economic impact of AI. And there is concern for job loss.
Paul Mortimer (41:06.19)
But on the flip side, Google says that there's going to be a massive need for re-skilling as people learn to use these models and that the economy will benefit greatly from use of AI. So I would say as a business owner to just learn what you can, to find people that you trust, like you and like me, and to find people who...
Paul Mortimer (41:34.722)
don't have an agenda, you know? I mean, I guess I have an agenda. I want people to read my stuff, but like I'm not, you know what I'm trying to say, right? Like- Yeah, the message I take from that, yeah, keep learning and the fact that you're switching onto it is the first most important step. That's what I take from that advice. Now you talked about getting people to put things on the internet, get people to read your stuff. You've got a new book. I do.
Paul Mortimer (42:04.29)
that we obviously need to plug: SEO in the Gemini Era. You can clearly talk about this with such great levels of understanding and depth, Marie. We'll put the links in the show notes. SEO in the Gemini Era: the story of how AI changed Google search. I think you've touched on a few of the chapters today. So yeah, how's that gone? How has that process been for you? Yeah, well, I'll tell you, the best reason to write a book is to learn a subject.
Paul Mortimer (42:34.602)
You know, you think you might, I thought I knew Search and then when I started, it started with just before the helpful content update rolled out, Danny Sullivan from Google reached out to me and to a couple other SEOs and said, could we talk to you about this upcoming update? And I was like, yes, like, you know, how often does Google actually do that? And I realized, like, he didn't really tell me all that much, but we knew that the helpful content system, which now is no longer, it's a part of Google's core updates,
Paul Mortimer (43:03.49)
was a machine learning system. And I realized that like I knew very little about machine learning. And I went off and did Google's course on machine learning and Python and all these things. And realized that if I don't know about AI systems, like very few people do. I mean, there are AI researchers that clearly know more than I do, but.
Marie Haynes (43:30.286)
In my circles, when it comes to other SEOs and to business owners, there's so little that we know. So my goal was to just try to simplify it all. And I didn't realize it would take me two years to write this book. And in the meantime, the DOJ versus Google trial came out and then these API files that told us a lot about the attributes that, you know, I think a lot of those attributes.
Marie Haynes (43:59.116)
I think Google's okay with us knowing what they are, because they're used in a completely different way now with machine learning systems. It's too much to, like, comprehend how it's all used. So yeah, so I wrote the book and it's, you know, it's selling okay. It hasn't sold a ton of copies. I wouldn't recommend writing a book to get rich. But I was finding at the end of, it was around the time that ChatGPT came out, the end of 2022,
Marie Haynes (44:28.686)
I found that I wasn't happy with the advice that I was giving. Like, I could understand that, like, your traffic has dropped because Google's algorithms have changed. But I couldn't really explain why. So now I feel like I understand search way better. And I feel like doing podcasts like this is the start of getting, you know, you excited about learning more. And I actually think that we're in, like, this really exciting time in search
Paul Mortimer (44:58.498)
because there are very few people who understand what's happening. And so there's so much opportunity to learn, especially with the AI overviews and now with Gemini providing people with answers. Like, we have to make whole new systems and whole new standards to follow for SEO. You're right. Remark that clip. I think the way to sum this up is...
Paul Mortimer (45:23.35)
Don't just check out the book, check out Marie's newsletter. That's how we met. It's absolutely fantastic for people who, like you say, are business owners, or marketing managers, or general marketers, who can get some expert insights. And it's one of the ones that we really look forward to coming into our inbox. So thank you, Marie. It's been an absolute pleasure. I think we've got about three episodes' worth. Yeah. Really, I think we could bring another three easily. I really appreciate your time.
Marie Haynes (45:52.29)
Thank you. Thanks for letting me ramble on. It was great. We'll see you next time. Thank you. Bye bye. Bye bye. Hey, did you enjoy this episode? Then be sure to leave us a comment, a review and hit that follow button. See you next episode.