Everyone's freaking out about GEO, LLMO, and AEO.

After 7 months of running tests across tons of sites, I can tell you this: it's all built on SEO fundamentals. The same principles that rank you on Google also get you cited in ChatGPT, Claude, and Perplexity.

So before you buy into shiny new tactics that promise "AI visibility", here's what actually moves the needle:

1. Trust Signals
AI tools pull from review platforms to assess business credibility and expertise. Build trust signals in the right places:
- Local businesses: prioritize Google Business Profile reviews and responses
- SaaS companies: maintain strong G2 and Capterra profiles
- Ecommerce: focus on Trustpilot or industry-specific review platforms
- Respond to reviews professionally and keep profiles updated

2. Document Structure
LLMs love well-structured documents. Instead of optimizing just for human readers, structure content for AI platforms too:
- Add company context throughout documents. Instead of "our latest update," write "Acme Corp's Q4 2024 update"
- Use clear headings and comprehensive sections that can stand alone
- Include key facts in multiple formats (inline text, bulleted lists, data tables)

3. Link Building for Relevance
Quality and topical relevance matter more than quantity for AI visibility. Focus your link building efforts:
- Target industry-relevant sites where your brand mention makes logical sense
- Pursue guest posts and collaborations within your industry
- Don't ignore nofollow links from high-authority sites in your niche
- Seek brand mentions even without direct links (the mention itself carries weight)
- Avoid completely unrelated sites

4. Topical Authority Still Rules
LLMs are trained on the same web content that Google indexes. The more deep, high-quality content you publish around your niche, the more AI systems recognize you as the go-to source, and the more you get mentioned.
Take out the trash: delete random blog posts about topics unrelated to your business. They're actually hurting your AI visibility.

5. Be Everywhere LLMs Crawl
Repurpose your content across Reddit, Medium, LinkedIn, and YouTube. These platforms get crawled heavily by AI, and showing up on them regularly builds brand visibility. LLMs love patterns: the more places they see you, the more they assume you're an authority.

6. Technical Setup
- Use HTML-driven pages
- Add schema markup
- Keep site architecture clean (no page more than 3 clicks from the homepage)
- Ensure your critical content loads server-side (most AI crawlers don't render JavaScript)

7. Traditional Search Feeds AI
Most AI tools use Bing or Google's index for real-time data. Better search rankings directly improve AI visibility.
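The schema-markup tip in the technical setup list can be made concrete. Below is a minimal sketch in Python that builds a JSON-LD Organization block for a page's head; the company name, URL, profile links, and rating figures are invented placeholders, not values from the post.

```python
import json

# Hypothetical business details -- every value here is a placeholder.
schema = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Acme Corp",
    "url": "https://www.example.com",
    # sameAs links tie the entity to profiles AI crawlers also see.
    "sameAs": [
        "https://www.linkedin.com/company/acme-corp",
        "https://www.youtube.com/@acmecorp",
    ],
    # Review signals surfaced in structured form.
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "212",
    },
}

# Wrap as a script tag ready to paste into the page head.
payload = json.dumps(schema, indent=2)
snippet = f'<script type="application/ld+json">\n{payload}\n</script>'
print(snippet)
```

The same pattern extends to LocalBusiness, Product, or SoftwareApplication types depending on which review platform matters for your niche.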
AI in SEO
-
How does Google use AI internally? What measurable impact are we seeing? Today we're sharing our playbook.

At Google, we are our own most demanding customer. In my blog post, I dive into our "Google AI at Google" initiative, where we use our own AI to improve operations, from a 14% increase in lead conversions to a 96% reduction in threat intelligence response times.

Beyond the data, there are three key takeaways for every leader navigating this shift:

- Focus is key: Stop "letting a thousand flowers bloom." Scattered experiments don't drive ROI. Instead, focus on a "cultivated bouquet": 5–7 high-impact use cases that define your core business. Saying "no" to marginal projects is the only way to say "yes" to the ones that will redefine your industry.
- Culture over code: This is a human transformation. When your experts shift from executing tasks to "teaching the machine," you've won.
- Platform choice is destiny: You need a partner who understands the full stack, from the silicon to the software to the security.

Read the full playbook: https://lnkd.in/gUyusBsq

#GoogleCloud #AI #Leadership #DigitalTransformation
-
In the last 3 months at Ahrefs, we analyzed over 1 billion data points across 11 studies*. Here's what we learned about AI search optimization:

1. YouTube mentions are the single strongest predictor of AI visibility (correlation: 0.737), stronger than Domain Rating, backlinks, or any traditional SEO factor. YouTube is heavily cited in AI responses, and both Google and OpenAI train on YouTube content.

2. For a given query, AI Mode and AI Overviews reach the same conclusions 86% of the time, but cite almost entirely different sources (only 13.7% citation overlap). AI Mode responses are 4x longer and mention 3x more entities.

3. Content length has essentially zero correlation with AI citations (0.04). 53% of all AI Overview citations go to pages under 1,000 words. Writing ultra-long content isn't necessary for AI visibility.

4. Google still sends 345x more traffic than ChatGPT, Gemini, and Perplexity combined, but ChatGPT accounts for 80%+ of all AI-driven website traffic.

5. AI Overviews have a 70% chance of changing from one observation to the next, with content lasting an average of just 2.15 days. But semantic meaning stays remarkably consistent (0.95 cosine similarity).

6. "Best X" blog lists make up 43.8% of all page types cited in ChatGPT responses. 35% of those lists come from low-authority domains.

7. 79% of blog lists cited by ChatGPT were updated in 2025, and 76% of top-cited pages were refreshed within the last 30 days. Freshness matters more than ever.

8. When asked questions without valid answers, AI systems choose fabricated content with specific numbers almost every time. ChatGPT resisted best (84% accuracy), but Grok and Copilot were fully manipulated.

9. Domain Rating correlates weakly with AI visibility (just 0.266-0.326 across platforms). Number of site pages is even weaker at 0.194.

10. 67% of ChatGPT's top 1,000 citations are essentially off-limits to marketers: Wikipedia alone accounts for 29.7%, followed by homepages (23.8%) and educational content (19.4%).

*I'll share all the study links in a comment!
-
SEO isn't "dead" or broken. (The definition of search changed.)

SEO isn't just about keywords and backlinks anymore. Search now happens across multiple engines, AI systems, and user behaviours.

Traditional SEO still matters: Google alone processes 8.5B searches/day. But if you're only optimizing for traditional search, you're missing out on a real visibility (aka revenue) opportunity.

Because in 2025, search lives across 4 layers:

Layer 1: SXO (Search Experience Optimisation)
↳ Turn clicks into conversions with UX that actually works.
Best practices:
- Optimise page speed for sub-2 second loads
- Design mobile-first experiences (60% of searches are mobile)
- Align your content with user intent
- Test your conversion funnels ruthlessly
- Track dwell time and scroll depth, not just rankings

Layer 2: AIO (AI Optimisation)
↳ Scale visibility with AI-powered content systems.
Best practices:
- Automate internal linking strategies
- Use AI for content drafting and optimisation at scale
- Create content templates that maintain quality across volume
- Implement AI-powered site audits for continuous optimisation
- Build systems for multi-format content repurposing

Layer 3: GEO (Generative Engine Optimisation)
↳ Be cited by AI in summaries and retrieval-based answers.
Best practices:
- Publish factual, data-backed content that AI systems trust
- Create comprehensive industry studies and original research
- Build topical authority through consistent, expert-level content
- Structure information in formats AI can easily extract and cite
- Focus on being the definitive source on specific topics

Layer 4: AEO (Answer Engine Optimisation)
↳ Get chosen by AI overviews and zero-click answers.
Best practices:
- Structure content with clear questions as headers
- Lead every section with one-sentence answers AI can extract
- Implement schema markup (FAQ, How-To, Q&A structured data)
- Target question-based keywords people ask voice assistants
- Create content that directly matches how people prompt AI systems

If you've noticed your SEO stalling, the answer might be in what you're lacking, not in what you're doing wrong.

If you want your site analysed, fixed, and optimised 24/7, Searchable automates all four layers (end to end). It's designed to:
- Show where your brand is visible (Google + AI search)
- Automatically optimise for all search environments
- Turn visibility into clicks and revenue

So you can prepare for the new era of search, without burning yourself out.

Learn more: https://lnkd.in/djHykzsY

♻️ Repost to help your network prepare for AI search
Follow me Emilia Möller for AI search tips and frameworks.
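The AEO best practices (question headers, one-sentence lead answers, FAQ schema) translate directly into structured data. Here is a minimal sketch in Python that emits FAQPage JSON-LD; the questions and answers are made up for illustration.

```python
import json

# Hypothetical Q&A pairs; in practice these mirror the question-style
# headers and one-sentence lead answers already on the page.
faqs = [
    ("What is AEO?",
     "Answer Engine Optimisation structures content so AI answers can quote it."),
    ("Does schema markup help AI visibility?",
     "FAQ and Q&A structured data make answers easier for engines to extract."),
]

faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": question,
            "acceptedAnswer": {"@type": "Answer", "text": answer},
        }
        for question, answer in faqs
    ],
}

print(json.dumps(faq_schema, indent=2))
```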
-
WTH is a vector database and how does it work? If you're stepping into the world of AI engineering, this is one of the first systems you need to deeply understand 👇

🧩 Why traditional databases fall short for GenAI
Traditional databases (like PostgreSQL or MySQL) were built for structured, scalar data:
→ Numbers, strings, timestamps
→ Organized in rows and columns
→ Optimized for transactions and exact lookups using SQL
They work great for business logic and operational systems. But when it comes to unstructured data, like natural language, code, images, or audio, they struggle. These databases can't search for meaning or handle high-dimensional semantic queries.

🔢 What are vector databases?
Vector databases are designed for storing and querying embeddings: high-dimensional numerical representations generated by models. Instead of asking, "Is this field equal to X?", you're asking, "What's semantically similar to this example?"
They're essential for powering:
→ Semantic search
→ Retrieval-Augmented Generation (RAG)
→ Recommendation engines
→ Agent memory and long-term context
→ Multi-modal reasoning (text, image, audio, video)

♟️ How vector databases actually work
→ Embedding: Raw input (text/image/code) is passed through a model to get a vector (e.g., a 1536-dimensional float array)
→ Indexing: Vectors are organized using Approximate Nearest Neighbor (ANN) algorithms like HNSW, IVF, or PQ
→ Querying: A new input is embedded, and the system finds the closest vectors based on similarity metrics (cosine, dot product, L2)
This allows fast and scalable semantic retrieval across millions or billions of entries.

🛠️ Where to get started
Purpose-built tools:
→ Pinecone, Weaviate, Milvus, Qdrant, Chroma
Embedded options:
→ pgvector for PostgreSQL
→ MongoDB Atlas Vector Search
→ OpenSearch, Elasticsearch (vector-native support)
Most modern stacks combine vector search with keyword filtering and metadata, a hybrid retrieval approach that balances speed, accuracy, and relevance.

🤔 Do you really need one?
It depends on your use case:
→ For small-scale projects, pgvector inside your Postgres DB is often enough
→ For high-scale, real-time systems or multi-modal data, dedicated vector DBs offer better indexing, throughput, and scaling
→ Your real goal should be building smart retrieval pipelines, not just storing vectors

📈📉 Rise & fall of vector DBs
Back in 2023–2024, vector databases were everywhere. But in 2025, they've matured into quiet infrastructure: no longer the star of the show, but still powering many GenAI applications behind the scenes.
The real focus now is:
→ Building smarter retrieval systems
→ Combining vector + keyword + filter search
→ Using re-ranking and hybrid logic for precision

〰️〰️〰️〰️
♻️ Share this with your network
🔔 Follow me (Aishwarya Srinivasan) for data & AI insights, and subscribe to my Substack to find more in-depth blogs and weekly updates in AI: https://lnkd.in/dpBNr6Jg
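The embed → index → query loop described above can be sketched in a few lines. This is a brute-force exact search (no ANN index) over hand-made 4-dimensional vectors standing in for real embeddings, which would come from a model and have hundreds or thousands of dimensions; the document names are invented.

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product over the product of magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "index": doc id -> embedding vector (all values made up).
docs = {
    "intro-to-vector-dbs": [0.9, 0.1, 0.0, 0.3],
    "kubernetes-networking": [0.1, 0.8, 0.4, 0.0],
    "semantic-search-guide": [0.8, 0.2, 0.1, 0.4],
}

def search(query_vec, k=2):
    # Brute-force scan; real systems use HNSW/IVF/PQ to approximate this.
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

# A query embedded near the vector-DB/semantic-search docs.
print(search([0.85, 0.15, 0.05, 0.35]))
```

At toy scale a linear scan is fine; the ANN algorithms mentioned above exist because this scan becomes too slow across millions of vectors.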
-
If you search for "How to lower my bill" in a standard SQL database, you might get zero results if the document is titled "AWS Cost Optimization Guide." Why? Because the keywords don't match.

This is the fundamental problem vector databases solve. They allow computers to understand that "lowering bills" and "cost optimization" are semantically identical, even if they share no common words.

Here is the end-to-end flow of how we move from raw data to semantic search (as illustrated in the sketch):

1. The Transformation (Vectorization)
Everything starts with embeddings. We take raw text, images, or code and pass them through an embedding model (like OpenAI or Cohere).
Input: "Reduce AWS cloud costs"
Output: [0.12, -0.83, 0.44...]
We turn meaning into numbers.

2. The Heart (Vector Store)
We don't just store the text; we store the vector.
- Vector index: used for the semantic search (finding the "nearest neighbor" mathematically).
- Metadata index: used for filtering (e.g., "Only show docs from 2024").

3. The Query Flow
When a user asks, "How can I lower my AWS bill?" we don't scan for keywords:
- We convert the user's question into a vector.
- We look for other vectors in the database that are mathematically close to it.
- We retrieve the "AWS Cost Optimization Guide" because it is close in meaning, not just spelling.

Why does this matter for GenAI? This is the backbone of RAG (Retrieval-Augmented Generation). LLMs can be confident but wrong (hallucinations). Vector DBs provide the "relevant context" (the ground truth) so the LLM can answer accurately based on your proprietary data.

The future of search isn't about matching characters; it's about matching intent.
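The query flow above (embed the question, filter on metadata, find the nearest vector) can be sketched with toy numbers. The 3-dimensional vectors, titles, and years below are invented stand-ins; a real pipeline would embed the question with a model before searching.

```python
import math

# Toy vector store: each entry holds a vector plus filterable metadata.
store = [
    {"title": "AWS Cost Optimization Guide", "year": 2024, "vec": [0.9, 0.1, 0.2]},
    {"title": "Intro to Terraform",          "year": 2025, "vec": [0.2, 0.9, 0.1]},
    {"title": "Old Billing Tips",            "year": 2019, "vec": [0.7, 0.3, 0.3]},
]

def l2(a, b):
    # Euclidean (L2) distance between two vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def query(query_vec, min_year):
    # Step 1: the metadata index filters candidates ("only docs from min_year on").
    candidates = [d for d in store if d["year"] >= min_year]
    # Step 2: the vector index finds the nearest neighbor among survivors.
    return min(candidates, key=lambda d: l2(query_vec, d["vec"]))["title"]

# "How can I lower my AWS bill?" embedded as a vector near the cost guide.
print(query([0.85, 0.15, 0.25], min_year=2024))
```

Note the keyword mismatch never matters: nothing in the query flow compares strings, only vectors and metadata.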
-
💡 Vector databases have become one of the most important infrastructure layers in modern AI systems. Most of us use LLMs every day without realizing that vectors and similarity search are doing the heavy lifting underneath. Let's find out why vector databases matter and how they power real-world AI applications.

🔸 What a Vector Database Really Is
A storage and retrieval engine for high-dimensional embeddings that allows models to search by meaning instead of keywords.

🔸 Why AI Converts Everything to Vectors
Embeddings capture semantic intent, structure, tone, and relationships between concepts in a way that machines can measure mathematically. This is what enables AI to interpret meaning the way humans do.

🔸 How Vector Databases Work
Embed → Index → Similarity Search → Rank → Reason. This pipeline is the foundation of retrieval-augmented generation systems and intelligent search workloads.

🔸 What Similarity Search Enables
The engine can find items that are conceptually aligned even when they use different words or formats. This is semantic retrieval instead of lexical matching.

🔸 Why Traditional Databases Fall Short
Relational stores and document stores are optimized for structured data and exact-match queries. They are not built for embeddings, cosine similarity computations, or efficient navigation of high-dimensional spaces.

🔸 Why Vector Databases Matter for AI
They enable long-term memory, reduce hallucinations, and create stable grounding for reasoning. This is critical when deploying LLMs in production use cases that require accuracy.

🔸 How They Power RAG Systems
Before a model generates an answer, the system pulls factual context from internal knowledge sources. This makes responses more reliable and aligned with a company's domain knowledge.

🔸 How Chatbots Use Them
They maintain conversational context, retrieve business-specific data, and interpret intent across multiple interactions.

🔸 How Search Engines Benefit
They support semantic, multimodal, and concept-driven search that goes beyond simple keyword matching.

🔸 Recommendations Powered by Vectors
Embeddings map user behavior and item characteristics into a shared semantic space, which allows for highly personalized and context-aware recommendations.

🔸 Popular Vector Databases in 2025
Pinecone, Weaviate, ChromaDB, FAISS, Milvus, Qdrant.

🔸 Key Technical Features to Know
Approximate nearest neighbor search, hybrid search with BM25 or dense retrieval, distributed indexing, sharded vector stores, real-time embedding refresh, and LLM-based re-ranking.

🔹 The Technical Reality
Vector databases are now a foundational layer in the AI stack that enables multimodal understanding, agent memory, semantic reasoning, and enterprise-grade reliability. I think that understanding how embedding architectures, similarity metrics, and vector stores work will give you a strong technical advantage as a developer.

Save this doc for future reference.

#VectorDatabases
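One of the features listed above, hybrid search with re-ranking, can be sketched as a weighted blend of a lexical score and a cosine score. The documents, 2-dimensional vectors, and the 0.5 weight are all illustrative assumptions, and the token-overlap score is a crude stand-in for BM25.

```python
import math

# Toy corpus: each doc has text (for the sparse score) and a made-up vector.
docs = [
    {"text": "reduce cloud spend and lower bills", "vec": [0.9, 0.1]},
    {"text": "kubernetes networking deep dive",    "vec": [0.1, 0.9]},
]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def lexical(query, text):
    # Crude keyword score: fraction of query tokens present in the text.
    q_tokens, t_tokens = set(query.lower().split()), set(text.lower().split())
    return len(q_tokens & t_tokens) / max(len(q_tokens), 1)

def hybrid_rank(query_text, query_vec, alpha=0.5):
    # Blend dense (vector) and sparse (keyword) evidence, then sort:
    # a toy version of hybrid retrieval with re-ranking.
    scored = [
        (alpha * cosine(query_vec, d["vec"])
         + (1 - alpha) * lexical(query_text, d["text"]), d["text"])
        for d in docs
    ]
    return [text for _, text in sorted(scored, reverse=True)]

print(hybrid_rank("lower my cloud bills", [0.8, 0.2])[0])
```

Tuning alpha trades off semantic recall against exact-keyword precision, which is why production systems expose it as a configurable weight.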
-
Search is no longer about ranking. It is about being selected in the answer.

During a recent test, an AI tool was asked for the "best platforms to improve customer retention." The response was simple: a concise summary with a few brands mentioned. No links to compare. No second page to explore.

That shift is hard to ignore. The competition is no longer for clicks. It is for inclusion inside the answer itself.

For years, teams optimized for keywords, backlinks, and traffic spikes. Now the focus is shifting toward understanding. AI systems do not just crawl pages. They interpret meaning, connect entities, and surface brands they recognize as credible. When messaging is inconsistent, data is fragmented, or expertise is unclear, visibility quietly drops.

This is not just an SEO shift. It is a visibility shift. Clarity outperforms volume. Structure outperforms density. And clearly expressed expertise becomes a signal machines can trust.

This week's newsletter breaks down what AI search really means, why entity authority is gaining importance, and what teams need to change now. For those thinking about how to be part of the answer, not left out of it, this is worth the read.
-
Over the past two years, we've been fed a steady diet of "AI will change SEO." And sure, it will. It already has. But that's not the full picture.

The AI shift happening on Google's homepage, through AI Overviews, isn't just changing organic performance. It's changing the entire search experience. Paid included.

Let me spell it out:
• Google's AI Overviews are giving users full answers before they even think of clicking.
• The format takes up prime real estate above the fold, pushing both paid and organic listings down.
• In some cases, the overview takes the entire page.

This isn't a theoretical future. This is a visible, structural change to how results are served and seen, and as of a few days ago it surpassed the 20% mark of all searches.

I pitched a major client this week who said, "AI isn't a focus for us right now. We're prioritizing Paid." That's the blind spot. Because whatever you're running in Paid will be affected, through impressions, CTRs, and conversions, all downstream from this new AI layer.

If your strategy assumes that SEO is under attack while Paid stays safe, you're not seeing the whole board.

Look below: do you see a paid ad?