Tags: build-in-public, seo, google-indexing, debugging, ai-agent

Day 6: Google Can't See 172 of My Pages — And My AI Was Replying to Nobody

Nemo · 8 min read
## The Dashboard Looks Fine. The Reality Doesn't.

69 users. 12 visitors today. Zero signups. The dashboard says the system is running. Celery tasks are green. Posts are publishing. But when I actually looked at the data, two things were silently broken.

## Problem 1: Google Can't See 172 Pages

I finally checked Google Search Console properly. Not the vanity metrics — the indexing report.

**36 pages indexed. 172 pages "Discovered - currently not indexed."**

That means Google knows my pages exist (from the sitemap) but hasn't bothered to crawl them. For a new site with low authority, Google gives you a tiny crawl budget. My 172 pages were sitting in a queue that might take months to process.

The weird part: my sitemap had 595 URLs, blog posts were in the database, and the API returned them correctly. But the *homepage* — the highest-authority page on the entire site — had zero links to any blog post. Google's crawler would land on the homepage, see no links to blog content, and leave. The blog posts were orphaned.

### The Fix

Three changes:

1. **Added 30 blog links to the homepage.** Not just a "latest posts" widget — 3 featured cards + 27 compact links. Every homepage visit now gives Google 30 paths to discover blog content.
2. **Wrote 19 new SEO blog posts.** Each targets a specific long-tail keyword: "how to grow telegram channel 2026", "bluesky vs twitter for business", "indie hacker marketing strategy." These aren't AI slop — each is 1,000-2,000 words of actual advice I'd give to a founder friend.
3. **Integrated the Google Indexing API.** Instead of waiting for Googlebot to discover pages on its own schedule, I now push URLs directly to Google. Set up a service account, got it authorized in Search Console, and pushed 142 URLs in one batch. Google should start crawling within 24-48 hours instead of weeks. Also ran IndexNow to notify Bing/Yandex — 596 URLs submitted.

The blog posts needed translations too.
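For reference, the Indexing API push is one small HTTP call per URL. Here's a minimal sketch using only the standard library; the endpoint and payload shape come from Google's Indexing API docs, while the function names are mine and obtaining the access token (service account with the `https://www.googleapis.com/auth/indexing` scope) is assumed to happen elsewhere:

```python
# Sketch of pushing URLs to Google's Indexing API, standard library only.
# Token acquisition (service account) is assumed to happen elsewhere.
import json
import urllib.request

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url: str) -> dict:
    # URL_UPDATED covers both new and changed pages; it asks Google to (re)crawl.
    return {"url": url, "type": "URL_UPDATED"}

def push_url(url: str, access_token: str) -> int:
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(build_notification(url)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {access_token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

def push_batch(urls: list[str], access_token: str) -> dict[str, int]:
    # One request per URL; production code would add retries and respect
    # the API's daily publish quota (200 requests by default).
    return {url: push_url(url, access_token) for url in urls}
```

A batch of 142 URLs fits comfortably inside the default daily quota, which is why one push could cover the whole backlog.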
Ran the translation pipeline: 115 posts translated into Chinese, Spanish, and Portuguese. Total blog count went from 396 to 539.

**Before:** 36 pages indexed, 0 homepage-to-blog links, no proactive indexing
**After:** 595 sitemap URLs, 30 homepage links, 142 URLs pushed to Google, 539 total blog posts

## Problem 2: My AI Was Liking Tweets But Never Replying

The engagement system is supposed to find relevant tweets, generate thoughtful replies, and post them. It has been running for days. I assumed it was working because I could see "engagement events" in the database. Then I looked at the actual data:

**173 total engagements. 109 likes. 0 replies.**

Zero. Not one reply. Every single "engagement" was just a like.

### Why

The code generates a reply using Gemini, then runs it through a safety filter that checks for "AI-sounding" phrases:

```python
ai_tells = [
    "great post", "love this", "interesting take", "this resonates",
    "totally agree", "couldn't agree more", "well said", "nailed it",
]
for tell in ai_tells:
    if tell in reply_lower:
        return {"reply": "", "style": style["id"]}
```

The filter checks whether the reply *contains* any of these phrases *anywhere* in the text. Not just at the beginning — anywhere. So a perfectly good reply like "Honestly couldn't agree more about building alone, I spent 3 months doing exactly that" gets rejected because it contains "couldn't agree more", even though it doesn't open with it. Gemini generates natural-sounding text, and natural text sometimes includes phrases like "totally agree" or "this resonates". The filter was killing 100% of generated replies. Every single one fell through to the fallback behavior: just like the tweet and move on.

I had been burning engagement budget on likes for days. Likes are basically worthless for growth. Replies are what drive profile visits and follows.

### The Fix

Changed the filter from "contains anywhere" to "starts with":

```python
for tell in ai_start_tells:
    if reply_lower.startswith(tell):
        return {"reply": "", "style": style["id"]}
```

Tested immediately.
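To see the difference concretely, here is a small self-contained demo of the two checks. The phrase list is the one from the snippet above; the function names and the example reply are mine, chosen so a tell appears mid-sentence rather than at the start:

```python
AI_TELLS = [
    "great post", "love this", "interesting take", "this resonates",
    "totally agree", "couldn't agree more", "well said", "nailed it",
]

def rejected_contains(reply: str) -> bool:
    # Old filter: reject if any tell appears anywhere in the reply.
    reply_lower = reply.lower()
    return any(tell in reply_lower for tell in AI_TELLS)

def rejected_startswith(reply: str) -> bool:
    # New filter: reject only if the reply opens with a tell.
    reply_lower = reply.lower()
    return any(reply_lower.startswith(tell) for tell in AI_TELLS)

reply = "Honestly couldn't agree more about building alone, the first months are brutal"
print(rejected_contains(reply))    # True: the old filter kills a natural reply
print(rejected_startswith(reply))  # False: the new filter lets it through
```

The start-of-reply check still catches the formulaic openers the model tends to produce ("Great post! ...", "Totally agree, ...") without banning ordinary English phrases from the middle of a sentence.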
Reply generation now works:

- Input: "Building alone is the FASTEST way to learn everything"
- Generated reply: "That split between building for 16 hours and marketing is so real. Of the marketing stuff you listed, which one feels the most impossible?"

Natural, asks a question, references the original post. Exactly what we want.

## Bonus Fix: OAuth 2.0 Media Upload

While debugging, I found another issue. User 63 (MysticStage, an AI pet portrait product) had been failing to post for days. Every post: "Failed to upload media: Max retries exceeded."

The cause: Twitter's media upload API (v1.1) only supports OAuth 1.0a authentication. User 63 connected via OAuth 2.0. Every time the system tried to upload an image, it sent a Bearer token to an endpoint that only accepts OAuth 1.0a signatures. Instant 401.

The fix: detect OAuth 2.0 users and skip image upload — post text-only instead. A text-only tweet that actually publishes is infinitely better than a tweet with images that fails silently.

## Traffic and Users

This week's numbers tell the story of a product that's growing but hasn't found its growth channel yet:

| Day | Visitors | Page Views |
|-----|----------|------------|
| Mar 9 (Sun) | 18 | 55 |
| Mar 10 (Mon) | 49 | 120 |
| Mar 11 (Tue) | 45 | 293 |
| Mar 12 (Wed) | 71 | 161 |
| Mar 13 (Thu) | 15 | 31 |
| Mar 14 (Fri) | 31 | 127 |
| Mar 15 (Sat) | 46 | 142 |
| Mar 16 (Sun) | 12 | 43 |

69 total users. 16 new signups in the last 7 days. Top traffic sources: Twitter/X (164 visits), GitHub (125), Google Search (56), Hacker News (42), ChatGPT (13).

The Google Search number is what matters most. 56 visits from organic search is almost nothing. That's why I spent today on SEO. If even 10% of those 539 blog posts start ranking for long-tail keywords, that 56 could become 500+ per month — and search traffic converts at 3-5x the rate of social media traffic because people are actively looking for a solution.

One interesting signal: ChatGPT sent us 13 visits.
That means our GEO (Generative Engine Optimization) is working — AI search engines are recommending BlogBurst in responses. That's a channel most competitors aren't even thinking about yet.

## The Numbers After Day 6

| Metric | Before | After |
|--------|--------|-------|
| Blog posts | 396 | 539 (+143) |
| Sitemap URLs | 519 | 595 |
| Homepage → blog links | 0 | 30 |
| Google Indexing API | Not integrated | 142 URLs pushed |
| Engagement replies | 0% (all likes) | Working (tested) |
| OAuth 2.0 posting | Broken (media fail) | Fixed (text fallback) |

## What I'm Watching Tomorrow

1. **Did Google start crawling?** The Indexing API should trigger crawls within 24-48 hours. I'll check Search Console.
2. **Did real replies go out?** The engagement cycle runs every few hours. Tomorrow's data should show actual replies, not just likes.
3. **Did User 63 post successfully?** First real test of the OAuth 2.0 text-only fallback.

## The Uncomfortable Truth

Two systems were "running" but not working. The dashboard was green. No errors in the logs (likes don't fail, they just don't help). Without manually inspecting the actual database records, I would have assumed everything was fine for weeks.

The lesson: "the system is running" and "the system is working" are not the same thing. Always check the output, not just the process.

*Day 6 of building BlogBurst. 69 users, $15 MRR, 539 blog posts, and two silent failures that were burning budget without producing results. Fixed both. Now we wait.*

