fix(seo): noindex tag pages and remove from sitemap

Google Search Console shows 0/10,519 pages indexed. Root cause: 5,000+
thin tag pages are diluting site quality signals and consuming crawl budget.

Changes:
- Add noindex,follow meta tag to blog tag pages (ListByTag.ejs)
- Remove tag sitemaps from sitemap index (Sitemap.ts)

This tells Google to:
1. Stop trying to index tag pages (they're thin content)
2. Still follow links on those pages to discover real content
3. Focus crawl budget on valuable pages (blog posts, product pages)
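The decision above boils down to emitting one meta tag per page class. A minimal TypeScript sketch of that decision (hypothetical helper and URL pattern, not part of this commit; the real change hardcodes the tag in ListByTag.ejs):

```typescript
// Hypothetical helper (not in this commit): choose a robots directive per
// route. Tag pages get "noindex, follow" so crawlers skip indexing them
// but still follow their links to discover real content.
function robotsDirectiveFor(path: string): string {
  // Assumed URL pattern for blog tag listing pages.
  if (/^\/blog\/tag\//.test(path)) {
    return "noindex, follow";
  }
  return "index, follow";
}
```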

Expected impact:
- Improved crawl budget efficiency
- Better quality signals for the domain
- Gradual improvement in indexing of valuable pages
Author: Jamie Mallers
Date:   2026-02-07 15:49:51 +00:00
Parent: 7f2192206f
Commit: 8348bf6897

2 changed files with 3 additions and 8 deletions

Sitemap.ts

@@ -327,14 +327,8 @@ export async function generateSitemapIndexXml(): Promise<string> {
     lastmod: timestamp,
   });
-  // Blog tags sitemaps (paginated)
-  const tagsPageCount: number = await getTagsSitemapPageCount();
-  for (let i: number = 1; i <= tagsPageCount; i++) {
-    sitemaps.push({
-      loc: `${baseUrlString}/sitemap-tags-${i}.xml`,
-      lastmod: timestamp,
-    });
-  }
+  // Note: Blog tag sitemaps removed - tag pages are noindex to improve
+  // site quality signals and crawl budget efficiency
   // Blog post sitemaps (paginated)
   const blogPageCount: number = await getBlogSitemapPageCount();
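After this change, tag sitemap entries never enter the `sitemaps` array, so the generated index contains only the remaining sections. A simplified sketch of the assembly step (the entry shape and function below are assumptions, condensed from the real generateSitemapIndexXml):

```typescript
// Assumed shape of a sitemap index entry, matching the pushed objects above.
interface SitemapEntry {
  loc: string;
  lastmod: string;
}

// Simplified sketch: serialize collected entries into a sitemap index
// document. The real function also gathers page counts per section.
function buildSitemapIndexXml(sitemaps: SitemapEntry[]): string {
  const body: string = sitemaps
    .map(
      (s: SitemapEntry) =>
        `  <sitemap>\n    <loc>${s.loc}</loc>\n    <lastmod>${s.lastmod}</lastmod>\n  </sitemap>`,
    )
    .join("\n");
  return `<?xml version="1.0" encoding="UTF-8"?>\n<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${body}\n</sitemapindex>`;
}
```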

ListByTag.ejs

@@ -6,6 +6,7 @@
 <head>
     <title>Latest Posts on <%= tagName %> - OneUptime Blog</title>
     <meta name="description" content="Read our latest posts on <%= tagName %>.">
+    <meta name="robots" content="noindex, follow">
     <%- include('../head') -%>
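The rendered tag page should now carry the directive in its `<head>`. A small TypeScript check (hypothetical helper, not part of this commit) that could verify the rendered HTML:

```typescript
// Hypothetical verification helper (not in this commit): confirm that a
// rendered page's HTML carries a noindex robots directive.
function hasNoindex(html: string): boolean {
  const match: RegExpMatchArray | null = html.match(
    /<meta\s+name="robots"\s+content="([^"]*)"/i,
  );
  return match !== null && match[1].toLowerCase().includes("noindex");
}
```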