
Picture of tablet on Google homepage, Cherry Blossom Tech logo, and title: "AIO & SEO in 2026"

Search engine optimization (SEO) is an old term that we’ve all heard before. Most of us have a general understanding of the idea: high rank on Google = good! The methodology, however… that’s open to some interpretation.

Unfortunately, the path to organic exposure isn’t getting any less complicated with the introduction of AI optimization (AIO). SEO has long been the realm of “trust me, bro” tricks and trends, often espoused by people who still don’t properly understand how organic ranking works.

There’s a whole industry devoted to making optimization seem so incredibly complicated that you’re willing to hand gobs of money over for a process you don’t actually understand. Luckily, no one really has to understand how either SEO or AIO works to make top-ranking content, so let’s talk about how to cut out the snake oil salesmen.

The Nebulous World of Optimization

There aren’t clear, proven guidelines for optimizing content, a fact that anyone who has ever worked in marketing is painfully aware of. Search engines don’t reveal their algorithms’ specifics. There are, however, tons of marketers with anecdotal accounts of what has worked for them before. The typical SEO treatment is to stack a lot of these possible winning strategies into a double-decker maybe-burger, feast on the possible ranking deliciousness… then hope you don’t end up sick to your stomach.

Of course, there are things that we do know for certain.

- Any LLM or search engine is looking for keywords that match the query it’s been given.

- A site’s structure needs to be intuitive and navigable so that the bot can figure out how to find the information it’s after.

- Content needs to be original and authoritative, so that Google or an LLM recognizes it as an important primary source of information.

But… how many keywords does the site need? And how many repetitions of each one? What is “intuitive structure” when you’re talking about an automated crawler? Should URLs represent the topic or the category more prominently? What would a search engine indexing your site think is authoritative? Does that mean that people are linking back to your page, or that you’re linking to valuable and reliable sources within your page?

Many of us do agree about what contributes to quality SEO and AIO in a general sense. The specific execution of those, however, is an entirely different story. And with all the myriad factors that can affect ranking… we’re back to “trust me, bro”.

Keywords All Grown Up

Keywords have been the central support beam of SEO since the beginning. Search the web for a specific set of words, and it finds that set of words: simple. But before long the baby internet grew up, and there were more web pages than grains of sand at the beach (not really though… 5 square meters of beach probably contains about one internet’s worth of sand grains, depending on depth and porosity!). With the queried keywords now appearing across trillions of web pages, search engines had to get cleverer about which occurrences mattered more than others. Ranking was born.

Picture of a man walking down a road, covered in sand

There are an estimated 7.5×10^18 grains of sand on Earth. Web pages, by contrast, theoretically number in the mere trillions, though frankly, we’re not sure. It’s hard to count objects that don’t take up physical space.

While search engines have evolved more or less as a black box, we’ve seen some things clearly change over time. They’ve adapted to interpret keywords more intuitively. Google started leaning hard into “semantic search” around 2012. This technique lets the algorithm infer the user’s intent and cluster similar keywords together. “Kleenex box” and “tissue package”, for instance, are treated the same: Google interprets the searcher intent behind those phrases as nearly identical, even though the words themselves are different.
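
Under the hood, semantic matching boils down to comparing “meaning vectors” (embeddings) instead of raw strings. Here’s a minimal Python sketch of the idea, with tiny invented vectors standing in for the real learned ones:

```python
from math import sqrt

# Toy 4-dimensional "embeddings". Real models learn vectors with
# hundreds or thousands of dimensions; these are invented purely
# to illustrate the idea.
embeddings = {
    "kleenex box":    [0.81, 0.12, 0.70, 0.05],
    "tissue package": [0.78, 0.15, 0.66, 0.09],
    "cardboard box":  [0.10, 0.88, 0.72, 0.40],
}

def cosine_similarity(a, b):
    """Angle-based similarity: 1.0 means pointing the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

query = embeddings["kleenex box"]
for phrase, vector in embeddings.items():
    print(f"{phrase!r}: {cosine_similarity(query, vector):.3f}")
# "kleenex box" and "tissue package" score close to 1.0, so the
# engine treats them as the same intent despite sharing no words.
```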

In the age of LLMs, this context-informed ranking is even more prevalent. LLMs are using this technique regularly to seek out information that can provide a satisfactory answer to a query.

LLMs are based on the transformer architecture, pioneered by Google research scientists in the landmark 2017 paper “Attention Is All You Need”. The idea behind transformer tech (to severely water down the nuance) is that words and phrases can be represented as numbers, which lets a machine learning model score and compare them the way it would values in a series.
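
To make that concrete, here’s a minimal Python sketch of the scaled dot-product attention step that paper describes, with random vectors standing in for learned embeddings (this illustrates the math, not anyone’s production code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V: every token scores its relevance
    to every other token, and those scores weight the output."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # pairwise relevance scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                       # weighted mix of the values

# Three "tokens" as 4-dimensional vectors (random stand-ins).
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V))
```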

Sound familiar? This is a more sophisticated version of exactly what search algorithms have been doing to keywords for decades.

Google has always ranked keywords, but previous search indexing methodologies were based mostly on recursion. Google search used to rely heavily on the PageRank system, which defines the importance of each page by how often other pages link back to it. We call this method recursive because a page’s importance depends on the importance of the pages linking to it, which in turn depends on the pages linking to them.
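
Here’s a toy Python version of that recursive scoring over a hypothetical four-page site; the real PageRank ran over billions of pages, but the shape of the computation is the same:

```python
# Each page's score is repeatedly recomputed from the scores of the
# pages linking to it, until the numbers settle (power iteration).
links = {  # hypothetical site: page -> pages it links out to
    "home":    ["about", "blog"],
    "about":   ["home"],
    "blog":    ["home", "about", "contact"],
    "contact": ["home"],
}
damping = 0.85  # the classic PageRank damping factor
rank = {page: 1 / len(links) for page in links}

for _ in range(50):
    rank = {
        page: (1 - damping) / len(links) + damping * sum(
            rank[src] / len(outs)
            for src, outs in links.items() if page in outs
        )
        for page in links
    }

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")  # "home" wins: every page links to it
```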

When we talk about recursion in sorting methods like PageRank, that shouldn’t be confused with holistic marketing practices like recursive keyword research. This is a practice in which you start with a straightforward keyword and use Google’s recommended and related keywords to build long-tail keywords that are well-suited to your niche and offering.
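
In code form, the research loop looks something like this sketch, where a hypothetical `related` table stands in for Google’s live recommended-and-related suggestions:

```python
# Hypothetical stand-in for Google's related/suggested keywords.
related = {
    "static site": ["static site generator", "static site hosting"],
    "static site generator": ["static site generator for blogs"],
    "static site hosting": ["cheap static site hosting"],
}

def expand(seed, depth=2):
    """Follow related suggestions a few levels deep, feeding each
    result back in as a new seed -- hence "recursive" research."""
    found = {seed}
    if depth > 0:
        for phrase in related.get(seed, []):
            found |= expand(phrase, depth - 1)
    return found

print(sorted(expand("static site")))
# The deepest phrases are the long-tail keywords suited to a niche.
```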

This, in our humble opinion, is neither dated nor snake oil like some of SEO. It’s a logical, thorough research methodology, even if its success is sometimes anecdotal and situational.

PageRank was incredibly important for its time, but now it’s just one of many signals search indexes use to serve results. These days, most rely far more heavily on machine learning and the same transformer tech that LLMs are built on.

Transformer models make precise calculations about words in a way that has never been possible before. This means that LLMs are starting to change the way search engines treat keywords too. The more input the LLMs receive from users, the more potential test cases for keywords and recommendations these models have.

So… how does that all affect keywords in SEO and AIO?

Keywords are just plain better now. They do what was always intended, but with better results.

Stuffing isn’t needed: no more endless variations on the same phrase to make sure you hit every possibility. Users aren’t typing strings of keywords anymore; they’re typing questions. Search engines respond to that context now, which lets content use keywords and phrases naturally and organically… and still RANK!

Accessibility for Human and Botkind

If you’ve ever worked with an SEO firm, they probably handed you a list of required changes that seemed absolutely ludicrous: cramming 50+ internal links into every page, stuffing metadata with nonsense keywords, URL taxonomy that makes no sense, massive walls of text on every page.

Obviously, these aren’t “rules”. Click into the top 10 results of any search query and you’ll find examples that don’t meet these criteria. Yet this kind of bold claim is standard in the SEO industry. There are tons of people out there insisting that their way is the best… trust me, bro. It’s almost like they’re trying to sell you something.

Accessibility is a bit of a loaded term in this regard. There are definitely guidelines to make a site accessible for humans, but that’s not the entire story here. We’re looking specifically to make the site accessible to LLMs and search engines when we’re talking about SEO/AIO. What does that actually mean?

According to Google, search works in three distinct phases:

1. Crawling: Google downloads content from your site with bots called “crawlers”. Google runs tons of specialized bots for this purpose, such as Googlebot-News, AdsBot, and Googlebot-Image. Nearly 30% of bot hits on the web, however, come from plain old Googlebot.

2. Indexing: Google analyzes the content and stores it in the Google database, allowing it to be recalled later by queries.

3. Serving Search Results: The database returns relevant information from the indexed content based on a user’s query.

LLM crawlers like GPTBot work much the same way, though they index intermittently instead of constantly the way Googlebot does.
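
To make those three phases concrete, here’s a toy Python pipeline. Canned page text stands in for real HTTP fetches, and a simple inverted index stands in for Google’s database; the URLs are hypothetical:

```python
# Phase 1 -- "crawling": canned pages instead of real fetches.
pages = {
    "/blog/seo-basics": "keywords ranking crawlers indexing",
    "/blog/aio-2026":   "llm crawlers ranking keywords context",
    "/about":           "cherry blossom tech web design",
}

# Phase 2 -- indexing: an inverted index, word -> pages containing it.
index: dict[str, set[str]] = {}
for url, text in pages.items():
    for word in text.split():
        index.setdefault(word, set()).add(url)

# Phase 3 -- serving: return pages that match every word in the query.
def serve(query: str) -> set[str]:
    hits = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*hits) if hits else set()

print(serve("ranking keywords"))  # both blog posts match
```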

Looking objectively at the process, it becomes pretty clear what accessibility means. Crawlers need to be able to reach your pages and gather enough information about them to index them with enough context to be useful to search. So the SEO gurus have always been right about the general idea; it’s the application that’s often misunderstood.
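
At the most literal level, that access starts with your robots.txt file, which tells crawlers where they’re welcome. A minimal hypothetical example that lets in both Googlebot and GPTBot while fencing off an admin area:

```
# robots.txt (hypothetical example)
User-agent: Googlebot
Allow: /

User-agent: GPTBot
Allow: /

User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```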

You’ll hear people claim that there are certain ways a site must be structured for it to be accessible to crawlers. People get hung up on these sorts of maxims for good reason. Back in the 90s, Google started as “BackRub”. Its main function was analyzing backlinks. It measured where and how much a site was referenced via backlinks, which then helped the algorithm establish how “useful” and “authoritative” that particular URL was. Thus, SEO was born.

Backlinks still matter. They still affect how the crawlers see your page and the context they index along with it. That said, Google’s database is far better at sussing out context now than it was in the early days.

The bots have gotten clever. They don’t need you to follow a rigid formula that’s exactly like everyone else’s. The same things that tend to be “accessible” to a human are now “accessible” to a bot. Google’s bots have been designed for decades to emulate the preferences of humans, with roughly 8.5 billion anecdotal data points coming in daily to refine that.

At this point, designing accessible sites for LLMs and search engines is less about tailoring your site map to some odd and obscure alchemy and more about just making a site that makes sense. Make it intuitive. Have the URL taxonomy follow logically. Create common-sense links that go where they seem like they should and are anchored to relevant text. Keep your page structure reasonably uniform across different URLs.
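
For instance, a URL taxonomy that “follows logically” might look something like this (hypothetical paths, of course):

```
example.com/
├── blog/
│   ├── aio-and-seo-in-2026
│   └── what-is-a-static-site
├── services/
│   ├── web-design
│   └── seo-audits
└── about
```

A crawler (or a human) landing anywhere in that tree can guess how the rest of the site is laid out, which is exactly the point.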

That’s it.

The bots don’t need it to be perfect. They’ll get it.

Good Content is King

In many ways, content optimization sensibility is still stuck in the past. We want to make content that ranks better, but many of the “trust me, bro” methodologies are entirely outdated. Keyword stuffing and designing for inhuman bots are things of the past. Optimizing for human consumption and optimizing for search/LLMs have grown together.

In 2024, thousands of pages of internal Google documentation were accidentally published to a public GitHub repository (The Verge, May 2024). While these documents didn’t reveal everything about the search algorithm, they certainly shed a lot of light on Google’s ranking factors. People lost their minds. Surely this would change everything about SEO and AIO going forward… right?

It didn’t. It confirmed things people already suspected about domain authority, content quality ratings, and keyword weighting. More than anything, the leak showed that most of the noise around SEO is entirely overblown. The best way to rank high is to make high-quality, useful, original content. See what’s working, then iterate to make it work better.

We wanted to embed a really great, unbiased article about the 2024 leak here… but there really isn’t one. Nearly every article out there is trying to sell you SEO wizardry or sponsored by someone. So instead, have this link to the Google Search Central Blog. It’s awesome. You’re welcome.

Google Blog

That take isn’t the most sellable for an SEO or AIO company, so you may not hear it that often. No matter how you dress it up though, that’s reality. Today’s search results understand searcher intent far better than in the past — there’s no reason to trick the algorithm anymore. It’s trying really hard to value what an actual person would theoretically value!

In 2026, nothing will get you to the top of search faster than simply being the best content for your query and your niche. Organize your pages for enjoyable browsing. Keep your site safe and secure. Make all the copy on your site original, descriptive, and compelling. There’s no reason to overthink it in 2026; we’ve finally reached the point we’ve always been hoping for. The algorithms are after the exact same thing your audience is after!

The best thing is, you don’t have to trust me, bro. Test this. It’s so simple now you can easily iterate and see results. If it doesn’t work, fine. Go back to the old ways… no loss from your end.

No one’s talking over your head about some arcane hidden keyphrase loading on the backend that you don’t understand here… just make it nice. Both bots and people love great content.

For bots to properly index and serve your site, they have to see it, and humans have to enjoy it… so one thing that still does, and always will, affect your ranking is your site’s speed, responsiveness, and design. At Cherry Blossom Tech, we have experienced experts from every aspect of web design ready to build you a quality site with all the bones it needs to rank high.

Want to start your site right? Let us help build you a custom, static site that’s accessible and lightning fast!