Interview with Andrew from MetaMonster: The Future of SEO in the Era of Generative AI

Andrew, you’re both building MetaMonster and actively consulting in SEO—how did your hands‑on experience with agencies and clients lead you to focus specifically on the future of SEO in a world dominated by generative AI?

The majority of my agency experience was actually building and running a product design agency, and doing SEO as part of our own marketing efforts (although I now do SEO consulting through my co-founder's agency). Content marketing was always our biggest growth driver, but we didn't have the budget to hire an SEO expert, or the time or knowledge to optimize things ourselves. And I didn't like the existing content optimization tools; they encouraged you to create bad content stuffed to the brim with keywords. So our SEO always suffered. Then, as I saw generative AI getting better and better, I thought there could be an opportunity to create a tool that works differently. By leveraging AI you could help people optimize their content while keeping the quality high, and that's what we're doing today.

From your vantage point inside MetaMonster, how is generative AI most concretely changing day‑to‑day SEO work today—beyond the hype—and which parts of the traditional workflow (keyword research, on‑page, content briefs, audits) do you think will be fundamentally redefined rather than just automated?

I love the distinction between fundamental change and automation, because so much automation still follows the same processes; it just speeds up the work. I think it's the search platforms' use of AI that is creating the fundamental changes. Query fan out changes what you should optimize for, from simple keywords to a more comprehensive set of longer queries (although there's still a lot of juice to squeeze from traditional SEO). And the platforms are all getting better at evaluating content for quality, making it harder to game the system and more important than ever to take a holistic approach to optimization. It's no longer just "does this page use these keywords 10 times each?" It matters that the content includes original ideas, social proof, and authority signals, and is readable and engaging.
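Query fan out can be pictured as a toy expansion step: one seed keyword becomes a set of longer, intent-specific queries that an AI search experience might evaluate behind the scenes. The templates below are my own hypothetical illustrations, not any platform's actual behavior:

```python
# Toy illustration of query fan out: a single seed keyword expands into
# a set of longer, intent-specific queries. These templates are invented
# for illustration; real platforms generate fan-out queries with LLMs.
def fan_out(seed: str) -> list[str]:
    templates = [
        "what is {q}",
        "best {q} for beginners",
        "{q} vs alternatives",
        "how to choose {q}",
        "is {q} worth it",
    ]
    return [t.format(q=seed) for t in templates]

for query in fan_out("chewable vitamins for women"):
    print(query)
# first line printed: what is chewable vitamins for women
```

The practical upshot is that a page optimized only for the seed keyword may never be surfaced for the longer queries the platform actually runs.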

MetaMonster relies on embeddings, structured workflows and Search Console data to avoid the ‘generic AI content’ trap. Can you walk us through a real example where this approach produced SEO results that a standard prompt‑and‑chat workflow would likely have missed or messed up?

I was giving a demo of MetaMonster once, and we were generating page titles for an ecommerce site selling chewable vitamins for women. The page titles kept mentioning that one of the founders was a doctor, along with other details from her bio. Our prospect was confused; she asked, "Where is this coming from?" So we pulled up the site, and sure enough, there it was in a small section at the bottom of the page. To achieve that same level of specificity in a standard prompt-and-chat workflow, you need to paste in the content for every page you're working on. Another example is the content optimization tool. A common recommendation we see is something like, "these three pages ranking for your keyword include this information, would you like to add that?" Getting ChatGPT or Claude to find that info out of the box, for the right keywords, requires a lot of setup and tuning of your prompts and tools.
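That "these three pages include this information" recommendation can be sketched as a content-gap check. MetaMonster's actual pipeline uses embeddings; this toy version substitutes simple bag-of-words term counts just to make the idea concrete (the function and threshold names are my own):

```python
from collections import Counter
import re

def terms(text: str) -> Counter:
    """Lowercased word counts, ignoring very short tokens."""
    return Counter(w for w in re.findall(r"[a-z]+", text.lower()) if len(w) > 3)

def content_gaps(your_page: str, competitor_pages: list[str], min_pages: int = 2) -> list[str]:
    """Terms that appear on at least `min_pages` competitor pages
    but are missing from your page entirely."""
    yours = terms(your_page)
    seen = Counter()
    for page in competitor_pages:
        for term in set(terms(page)):  # count each page once per term
            seen[term] += 1
    return sorted(t for t, n in seen.items() if n >= min_pages and t not in yours)

gaps = content_gaps(
    "Our chewable vitamins taste great and ship fast.",
    [
        "Formulated by a doctor, these vitamins are third-party tested.",
        "Doctor approved and tested for purity in independent labs.",
        "Great taste and fast shipping on all vitamin orders.",
    ],
)
print(gaps)  # → ['doctor', 'tested']
```

An embedding-based version would compare meaning rather than exact tokens, so "physician-formulated" and "created by a doctor" would register as the same gap, which is roughly why the real tool goes beyond keyword matching.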

Many SEOs fear that AI‑generated over‑optimization (mass titles, meta descriptions, H1s) will trigger quality or spam signals in Google and in AI‑driven search experiences. How do you think about ‘safe’ scale—both technically and strategically—when you’re helping agencies roll out MetaMonster across hundreds of pages?

I think there are two keys to "safe scale." First, you need to use AI for the right jobs. Asking AI to write something from scratch that needs direct, human experience is going to result in slop. This is why experience is a core part of EEAT. Second, you need to provide the right context. I really wish prompt engineering had been called context engineering instead, because the context you provide to the LLM matters more than the specific instructions. So for example, using AI to optimize page titles while providing the existing page content, the keyword to optimize for, a voice and tone guide, and GSC data is a pretty safe approach. You're not asking the AI to create anything novel that requires human experience; you're just doing something tedious a little bit faster than normal. That said, you should always note when you make changes and monitor your results. And review the output of the AI as much as possible (which is hard to do at truly large scales).
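A minimal sketch of that context-first approach: assemble one prompt per page from the existing content, the target keyword, a tone guide, and GSC data. Every field name and prompt line here is my own illustration, not MetaMonster's actual format:

```python
def build_title_prompt(page: dict, tone_guide: str) -> str:
    """Assemble the context an LLM needs to rewrite one page's title tag.
    All field names are hypothetical illustrations."""
    gsc = page["gsc"]  # e.g. {"top_queries": [...], "ctr": 0.018}
    return "\n".join([
        "Rewrite this page's title tag. Keep it under 60 characters.",
        f"Target keyword: {page['keyword']}",
        f"Voice and tone guide: {tone_guide}",
        f"Top Search Console queries: {', '.join(gsc['top_queries'])}",
        f"Current CTR: {gsc['ctr']:.1%}",
        "Page content:",
        page["content"],
    ])

prompt = build_title_prompt(
    {
        "keyword": "chewable vitamins for women",
        "content": "Founded by Dr. Jane Doe (a made-up name)...",
        "gsc": {"top_queries": ["womens chewable vitamins", "gummy vitamins"], "ctr": 0.018},
    },
    tone_guide="Friendly, direct, no hype.",
)
print(prompt.splitlines()[0])
```

The point of the structure is that the instruction line stays tiny and stable while the per-page context does the heavy lifting, which is the "context engineering" framing in practice.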

We're constantly monitoring experiments in the industry to keep up with what's working well in AI search, and shipping new workflows in the Sandbox, the space in MetaMonster for creating and testing custom workflows. Because the Sandbox is so flexible, we can test and ship workflows super fast. So if the industry is talking about FAQs, summaries, or query fan out this week, we have workflows to help automate all of those. And the thing we keep seeing over and over again is that one of the most reliable signals for being cited across all platforms is recency, and our core content optimization workflow makes it significantly easier to keep content up to date.

Looking ahead 3–5 years, imagine MetaMonster in a mature era of AI search: what does a ‘modern’ SEO strategy look like to you, and what kinds of signals and datasets do you think tools like yours will need to optimize for, beyond classic rankings and click‑through rates?

This is a great question, and my hot take is that it's not actually going to be that different from what has always worked. Sure, a modern strategy will probably incorporate query fan out analysis, and more off-page signals that look closer to brand building than a lot of what we traditionally consider the domain of SEOs. But the core is still going to come down to creating good, comprehensive content that draws from real-world experiences and offers some sort of unique insight. Tools like MetaMonster will help to make sure the content is well-structured and readable, and will be able to draw from your company's notes and data to suggest what you could add. But everything will still hinge on people putting in the time to follow their curiosity, develop expertise, do interesting things, and share them with the world.

For SEO agencies and in‑house teams feeling overwhelmed by generative AI, what is your no‑BS advice: what should they stop doing immediately, what one or two AI‑powered practices should they start this quarter, and how can they future‑proof their skills in a landscape that’s changing this fast?

Yes, the landscape is changing fast, but it has always changed. SEOs have constantly been in this game of having to learn new skills, reverse engineer the algorithms, run experiments, and evolve their approach. I would say if you aren't already a deeply curious person, what are you doing working in SEO? And if you are deeply curious, great: follow that curiosity! Try new tools, and run experiments to find out for yourself what works and what doesn't. Don't let the constant stream of AI content freak you out; you have the skills to adapt to this new world. This is an exciting time: things are accessible to you that were out of reach before. You can build your own tools, run experiments at new levels of scale, and tackle tasks that just weren't worth the time before. I do think every SEO should try to build something for themselves, just to see how much more they are capable of, and to understand the limits of these tools. And I know it can be tough to find the time, but carve out one day a month to experiment. You don't need to do everything the unemployed Twitter bros are doing to learn and develop your skillset.

Learn more: https://metamonster.ai
