In 2012, to add a site to Google, you filled out a form at add-url.html.
In 2026, if you rely on Google's passive crawling, your content can take weeks to appear.
For News, Jobs, and Real Estate sites especially, passive crawling is dead: you need to push data to Google through APIs.
1. The Google Indexing API (not just for jobs)
Officially, Google restricts the Indexing API to pages containing JobPosting or BroadcastEvent structured data.
Unofficially, developers have used it for regular content for years, with varying success.
Either way, the strategy is clear: use it for time-sensitive updates.
How to implement it in WordPress:
Don't hand-write the cURL request. Use a library such as googleapis/google-api-php-client.
// Example: notifying Google that a URL was added or updated
$client = new Google_Client();
$client->setAuthConfig('service_account.json'); // service account JSON key with Indexing API access
$client->addScope(Google_Service_Indexing::INDEXING);
$service = new Google_Service_Indexing($client);
$postBody = new Google_Service_Indexing_UrlNotification();
$postBody->setUrl('https://wppoland.com/pl/new-post'); // must be an absolute URL
$postBody->setType('URL_UPDATED'); // or URL_DELETED for removals
$service->urlNotifications->publish($postBody);
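Under the hood, the library is just sending a small JSON POST to the API's publish endpoint. A minimal sketch in Python for illustration, building and validating that payload (the endpoint and field names come from Google's Indexing API docs; the `build_notification` helper is hypothetical):

```python
import json
from urllib.parse import urlparse

ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url: str, kind: str = "URL_UPDATED") -> str:
    """Build the JSON body for a publish request.

    The Indexing API rejects relative URLs, so validate that
    the URL is absolute before serializing.
    """
    parsed = urlparse(url)
    if not (parsed.scheme and parsed.netloc):
        raise ValueError(f"Indexing API requires an absolute URL, got: {url!r}")
    if kind not in ("URL_UPDATED", "URL_DELETED"):
        raise ValueError(f"Unknown notification type: {kind!r}")
    return json.dumps({"url": url, "type": kind})

body = build_notification("https://wppoland.com/pl/new-post")
# POST `body` to ENDPOINT with an OAuth2 bearer token from the service account.
```

Validating the URL locally saves you a round trip: a relative path would simply come back as a 400 from the API.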
2. Debugging "Discovered - currently not indexed"
This is the most common indexing status in GSC in 2026. It means Google knows your page exists but decided it wasn't worth the crawl budget.
The fix is usually NOT technical. It's quality-related.
- Duplicate Content: Is this page 90% similar to another?
- Thin Content: Does the page rely on JS to render text that Googlebot isn’t executing?
- Internal Linking: Is it an "orphan page", with no internal links pointing to it?
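The orphan-page check in particular is easy to automate: compare the URLs in your sitemap against the set of URLs that receive at least one internal link. A minimal sketch in Python for illustration (the URLs and the `internal_links` crawl data are made up):

```python
def find_orphans(sitemap_urls: set[str], internal_links: dict[str, set[str]]) -> set[str]:
    """Return sitemap URLs that no other page links to."""
    linked_to = set().union(*internal_links.values()) if internal_links else set()
    return sitemap_urls - linked_to

# Hypothetical crawl data: page -> set of URLs it links out to.
internal_links = {
    "https://wppoland.com/": {"https://wppoland.com/pl/new-post"},
    "https://wppoland.com/pl/new-post": {"https://wppoland.com/"},
}
sitemap_urls = {
    "https://wppoland.com/",
    "https://wppoland.com/pl/new-post",
    "https://wppoland.com/pl/forgotten-post",  # no inbound links -> orphan
}

print(find_orphans(sitemap_urls, internal_links))
# -> {'https://wppoland.com/pl/forgotten-post'}
```

In practice you would feed this from your sitemap.xml and a crawl of your own site; the set difference itself is the whole trick.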
3. XML sitemaps: The modern way
Yoast and Rank Math generate sitemaps automatically, but as a developer you should know that image sitemaps are critical for AVIF/WebP indexing.
Ensure your sitemap.xml declares the image namespace and includes <image:image> tags:
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://wppoland.com/pl/my-post</loc>
    <image:image>
      <image:loc>https://wppoland.com/wp-content/uploads/image.avif</image:loc>
    </image:image>
  </url>
</urlset>
If your sitemap lacks images, you lose traffic from Google Images/Lens.
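Writing these entries by hand is error-prone, and the namespace declaration is easy to forget. A short sketch in Python's standard library, for illustration, that emits a correctly namespaced entry (the URLs are the example ones from this article):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMAGE_NS = "http://www.google.com/schemas/sitemap-image/1.1"
ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("image", IMAGE_NS)

def url_entry(page_url: str, image_urls: list[str]) -> ET.Element:
    """Build one <url> element with <image:image> children."""
    url_el = ET.Element(f"{{{SITEMAP_NS}}}url")
    ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = page_url
    for img in image_urls:
        image_el = ET.SubElement(url_el, f"{{{IMAGE_NS}}}image")
        ET.SubElement(image_el, f"{{{IMAGE_NS}}}loc").text = img
    return url_el

urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
urlset.append(url_entry(
    "https://wppoland.com/pl/my-post",
    ["https://wppoland.com/wp-content/uploads/image.avif"],
))
xml = ET.tostring(urlset, encoding="unicode")
```

ElementTree collects the registered namespaces and declares them on the root element, so the `xmlns:image` declaration can't be forgotten.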
4. Robots.txt in 2026
With AI bots scraping everything, your robots.txt is your first line of defense.
Block the AI scrapers that don't send you traffic, but allow the search engines:
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Googlebot
Allow: /
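It's worth verifying that the file actually behaves as intended before deploying it. Python's standard `urllib.robotparser` can parse the rules directly; a quick sketch, for illustration, checking the policy above:

```python
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Googlebot
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

page = "https://wppoland.com/pl/new-post"
print(parser.can_fetch("GPTBot", page))     # False: AI scraper blocked
print(parser.can_fetch("Googlebot", page))  # True: search engine allowed
```

Keep in mind robots.txt is advisory: well-behaved bots honor it, but it is not an access-control mechanism.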
Summary
SEO is no longer just "keywords". It's "infrastructure".
- Push: Use APIs to notify Google.
- Monitor: Check GSC API for coverage errors programmatically.
- Defend: Block useless AI bots via robots.txt.
Don’t wait for Google. Invite them.



