Original Reddit post

Disclosure: I’m the founder of the site I’m discussing (agensi.io, a marketplace for AI agent skills). This post isn’t about the product. It’s about how I used Claude as a non-technical solo founder to build a full organic growth engine from zero.

The problem

I built a React SPA with Lovable. Out of the box it was invisible to search engines. Google’s crawler saw an empty div and a JavaScript bundle. No server-side rendering. No structured data. A 460KB JS bundle. A 179KB PNG logo rendered at 112 pixels. LCP was 4+ seconds on mobile. PageSpeed performance score was around 70.

I don’t have a CS degree. I can’t write production code. But I had Claude.

What Claude actually did

Content strategy from raw data, not vibes. I export Google Search Console data weekly (queries, pages, clicks, impressions, average positions) and feed the CSVs to Claude. It identifies queries where I rank in positions 1 through 3 but get zero clicks because AI Overviews answer the question first. It finds keyword gaps where competitors have content but I don’t. It spots cannibalization where multiple pages compete for the same query. This replaced what would normally be a $5K/month SEO consultant.

Structured data architecture. Claude designed and generated the entire schema markup layer. The homepage has Organization, WebSite with SearchAction, and FAQPage with 15 Q&As. Product pages have SoftwareApplication with pricing, BreadcrumbList, and conditional FAQPage. Article pages have Article, FAQPage, HowTo, BreadcrumbList, and Organization. The /about page has Organization, AboutPage, and Person schema for entity anchoring. Every page validates clean in PageSpeed Insights with a 100 SEO score.

Performance optimization. Claude diagnosed the LCP bottleneck as framer-motion loading on every page for a single mobile menu animation. It identified synchronous analytics scripts blocking render.
It found the logo was a 1920x1920px PNG being rendered at 112px and imported as a JS module, so the browser couldn’t even start downloading it until the entire bundle had parsed. Claude’s fix: generate WebP versions (7KB and 3KB), switch to a static path with preload, and lazy-load the navbar components. Desktop LCP went from 2.5 seconds to 0.9 seconds. The performance score went from 70 to 97.

AEO infrastructure. This is the part I find most interesting from an AI perspective. Claude helped me restructure every article so AI engines (ChatGPT, Gemini, Perplexity, Claude itself) would cite the content. Every article has a Quick Answer block at the top (40-60 words directly answering the main question). All H2 headings are phrased as questions because AI Overviews prefer extracting from question-format sections. Every page has FAQ schema. I created an llms.txt file that tells LLM crawlers what the site is and where key content lives. I also created an entity anchor page with Organization and Person schema so AI engines can establish who we are. The result: 9 different AI engines now cite the site, including ChatGPT, Gemini, Perplexity, Claude, Doubao, Copilot, and Kagi. 350+ AI-referred sessions per month and growing.

Technical SEO auditing. Claude found 121 queries where I ranked top 3 with zero clicks because AI Overviews were stealing the traffic. It found 18 published articles with zero Google impressions because they weren’t indexed, and generated the IndexNow ping commands to fix it. It diagnosed duplicate FAQPage schema being emitted both client-side by React components and server-side by the SSR edge function, causing validation errors on 90 pages. It identified the exact files, wrote the Lovable prompts to fix it, and verified the fix with curl commands.

The numbers after 2 months

500K+ total Google impressions. 6K+ total clicks. 878+ page-1 rankings (up from ~15 at launch). Average position 6.8. 15K active users in the last 30 days. Cited by 9 AI engines.
$0 spent on marketing.

What this means for AI as a tool

Claude is not a magic content machine you point at a topic and get traffic. It’s a strategic partner that gets better the more data you feed it. The key is bringing your own data (GSC exports, analytics, competitor analysis) and asking it to find patterns and opportunities in that data. The output is specific, actionable, and measurable.

The analytical and strategic capabilities get less attention than the coding abilities, but for a non-technical founder they might be even more powerful. I couldn’t have built this growth engine without Claude. Not because it wrote the content for me, but because it showed me exactly where the opportunities were and how to structure everything so both Google and AI engines could parse it.

Happy to answer questions about the approach, specific prompts, or technical details. Site: agensi.io
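The FAQ schema mentioned throughout is plain schema.org JSON-LD embedded in a `script type="application/ld+json"` tag. A minimal FAQPage block looks like this (the question is illustrative and the answer text is a placeholder):

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is an AI agent skill?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "A short, direct answer of roughly 40-60 words goes here."
      }
    }
  ]
}
```

The duplicate-schema bug mentioned above was exactly this block being emitted twice, once by a React component and once by the SSR edge function, so each page should render it from a single source.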
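The logo fix is worth seeing concretely. Roughly, the change amounts to this (file names illustrative; the point is a static path plus a preload hint instead of a JS-module import):

```html
<!-- Preload hint in <head>: the browser can start fetching the image
     before the JS bundle has parsed -->
<link rel="preload" as="image" href="/logo-112.webp" type="image/webp">

<!-- In the markup: a static path with explicit dimensions, instead of
     an import that ties the image download to the bundle -->
<img src="/logo-112.webp" width="112" height="112" alt="Agensi logo">
```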
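llms.txt is just a markdown file served at the site root (/llms.txt), following the proposed llms.txt convention: an H1 title, a one-line blockquote summary, then links to key content. A stripped-down example (paths illustrative):

```
# Agensi

> Agensi is a marketplace for AI agent skills.

## Key pages

- [Skills directory](https://agensi.io/skills): browse available agent skills
- [Blog](https://agensi.io/blog): guides on agent skills, SEO, and AEO
```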
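For anyone who wants to reproduce the Search Console triage step themselves, here is a minimal stdlib-only Python sketch. The column names (`Query`, `Clicks`, `Impressions`, `Position`) are an assumption based on a standard performance export; adjust them to match your own CSV:

```python
import csv
from io import StringIO

def zero_click_queries(gsc_csv_text, max_position=3.0):
    """Flag queries ranking in the top 3 that receive no clicks.

    Expects a Search Console performance export with Query, Clicks,
    Impressions, and Position columns (header names assumed; adjust
    to match your export).
    """
    flagged = []
    for row in csv.DictReader(StringIO(gsc_csv_text)):
        position = float(row["Position"])
        clicks = int(row["Clicks"])
        if position <= max_position and clicks == 0:
            flagged.append((row["Query"], position, int(row["Impressions"])))
    # Highest-impression opportunities first
    return sorted(flagged, key=lambda r: -r[2])

# Illustrative sample data, not real GSC numbers
sample = """Query,Clicks,Impressions,Position
what is an ai agent skill,0,1200,2.1
ai agent marketplace,15,900,1.4
llms.txt example,0,300,2.8
"""
for query, pos, impressions in zero_click_queries(sample):
    print(f"{query}: position {pos}, {impressions} impressions, 0 clicks")
```

The same pattern extends to cannibalization checks (group by query, count distinct pages) once you export the page dimension as well.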
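The IndexNow pings for the unindexed articles are a simple POST. A sketch of what Claude generated, in Python (the endpoint and field names follow the public IndexNow protocol; the key value and URLs here are illustrative placeholders):

```python
import json
from urllib import request

INDEXNOW_ENDPOINT = "https://api.indexnow.org/indexnow"

def build_indexnow_payload(host, key, urls):
    """Build the JSON body for a bulk IndexNow submission.

    Per the protocol, `key` must match a text file served at
    https://<host>/<key>.txt so the endpoint can verify ownership.
    """
    return json.dumps({"host": host, "key": key, "urlList": list(urls)})

def ping_indexnow(host, key, urls):
    """POST the URL list to the IndexNow endpoint; returns the HTTP status."""
    req = request.Request(
        INDEXNOW_ENDPOINT,
        data=build_indexnow_payload(host, key, urls).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status

if __name__ == "__main__":
    # Placeholder key and URL -- replace with your own values.
    print(ping_indexnow("agensi.io", "your-indexnow-key",
                        ["https://agensi.io/articles/example"]))
```

Submitting is idempotent from the site's side, so it's safe to re-ping the same batch after fixing indexing issues.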

Originally posted by u/BadMenFinance on r/ArtificialInteligence

  • lunar17@lemmy.world · 4 points · 1 day ago

    SEO is an absolute cancer on the internet, and as this post illustrates, LLMs have accelerated it. It’s an entire field of gaming search algorithms at the cost of the human experience. People like this are why it became so hard to find quality information online!