
Commit bb9dcd1

perf: reduce web requests with caching, bot control, and deferred JS
Add robots.txt to block aggressive AI scrapers (GPTBot, Bytespider, SemrushBot, etc.) and set crawl delays for legitimate search engines.

Add Cache-Control headers for HTML pages (1-hour TTL with stale-while-revalidate) so repeat visits are served from Netlify's CDN edge cache instead of the origin.

Defer below-fold video components from client:load to client:visible so their JS chunks only load when scrolled into view; bots and users who don't scroll never trigger those requests.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
1 parent a91cfcb commit bb9dcd1

File tree

3 files changed: +93 −2 lines


public/_headers

Lines changed: 25 additions & 0 deletions
@@ -0,0 +1,25 @@
+# HTML pages — 1 hour cache, serve stale while revalidating
+/*
+  Cache-Control: public, max-age=3600, stale-while-revalidate=86400
+
+# Hashed build assets — immutable forever
+/_astro/*
+  Cache-Control: public, max-age=31536000, immutable
+
+/fonts/*
+  Cache-Control: public, max-age=31536000, immutable
+
+/favicon.ico
+  Cache-Control: public, max-age=604800
+
+/favicon.svg
+  Cache-Control: public, max-age=604800
+
+/*.svg
+  Cache-Control: public, max-age=86400
+
+/*.pdf
+  Cache-Control: public, max-age=604800
+
+/sitemap-*.xml
+  Cache-Control: public, max-age=3600
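The `max-age=3600, stale-while-revalidate=86400` pair means a cached page is fresh for an hour, and for up to a further day the edge may keep serving the stale copy while it refetches from the origin in the background. A minimal sketch of that freshness logic (general RFC 5861 semantics, not Netlify's actual implementation; the function name is illustrative):

```python
# Classify a cached response by its age, per RFC 5861 semantics.
# This is an illustrative sketch, not Netlify's internal logic.
def cache_state(age_seconds, max_age=3600, swr=86400):
    if age_seconds <= max_age:
        return "fresh"                   # serve from cache, no origin hit
    if age_seconds <= max_age + swr:
        return "stale-while-revalidate"  # serve stale, refresh in background
    return "expired"                     # must go back to the origin

print(cache_state(60))       # fresh
print(cache_state(7200))     # stale-while-revalidate
print(cache_state(100_000))  # expired
```

The practical effect for this site: only the first visitor after each hourly window pays the origin round-trip; everyone else in the following 24 hours is served instantly from the edge.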

public/robots.txt

Lines changed: 66 additions & 0 deletions
@@ -0,0 +1,66 @@
+# Audacity Website - robots.txt
+
+# Allow legitimate search engines with a crawl delay
+User-agent: Googlebot
+Crawl-delay: 2
+
+User-agent: Bingbot
+Crawl-delay: 2
+
+User-agent: DuckDuckBot
+Crawl-delay: 5
+
+User-agent: Yandex
+Crawl-delay: 10
+
+# Block AI scrapers and aggressive bots
+User-agent: GPTBot
+Disallow: /
+
+User-agent: ChatGPT-User
+Disallow: /
+
+User-agent: Google-Extended
+Disallow: /
+
+User-agent: CCBot
+Disallow: /
+
+User-agent: anthropic-ai
+Disallow: /
+
+User-agent: Claude-Web
+Disallow: /
+
+User-agent: Bytespider
+Disallow: /
+
+User-agent: PetalBot
+Disallow: /
+
+User-agent: Sogou
+Disallow: /
+
+User-agent: SemrushBot
+Disallow: /
+
+User-agent: AhrefsBot
+Disallow: /
+
+User-agent: MJ12bot
+Disallow: /
+
+User-agent: DotBot
+Disallow: /
+
+User-agent: BLEXBot
+Disallow: /
+
+User-agent: DataForSeoBot
+Disallow: /
+
+# Default: allow everything, but slow down
+User-agent: *
+Crawl-delay: 5
+
+Sitemap: https://www.audacityteam.org/sitemap-index.xml
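Rules like these can be sanity-checked with Python's stdlib `urllib.robotparser`. This sketch feeds the parser an abridged copy of the file (just three of the groups above) and queries the resulting policy:

```python
# Verify robots.txt rules with the stdlib parser.
from urllib.robotparser import RobotFileParser

# Abridged excerpt of the robots.txt added in this commit.
ROBOTS = """\
User-agent: Googlebot
Crawl-delay: 2

User-agent: GPTBot
Disallow: /

User-agent: *
Crawl-delay: 5
"""

rp = RobotFileParser()
rp.parse(ROBOTS.splitlines())

print(rp.can_fetch("GPTBot", "https://www.audacityteam.org/"))     # False
print(rp.can_fetch("Googlebot", "https://www.audacityteam.org/"))  # True
print(rp.crawl_delay("Googlebot"))                                 # 2
print(rp.crawl_delay("SomeOtherBot"))                              # 5 (falls through to *)
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but the Cache-Control headers in this same commit are what actually limit the cost of crawlers that ignore it.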

src/components/homepage/ReleaseVideo.astro

Lines changed: 2 additions & 2 deletions
@@ -13,11 +13,11 @@ const videoPromos = getFilteredPromos(Object.values(promoData), {
 >
   <div class="flex flex-col md:flex-row gap-12 md:gap-16">
     <div class="w-full md:w-1/2">
-      <SplitFeaturedVideo client:load promos={videoPromos} slot="1" />
+      <SplitFeaturedVideo client:visible promos={videoPromos} slot="1" />
     </div>

     <div class="w-full md:w-1/2">
-      <SplitFeaturedVideo client:load promos={videoPromos} slot="2" />
+      <SplitFeaturedVideo client:visible promos={videoPromos} slot="2" />
     </div>
   </div>
 </div>
