Update AIOStream latest #440
Conversation
Individual indexer request so timeout can work per indexer
Update to respect timeout and return what has been found after search torrent timeout
Make requests to each indexer individually, instead of sending all. This way we can return the indexers with results until the timeout.
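The approach described above can be sketched as a small, self-contained helper: fire each indexer query independently, collect results as they arrive, and return whatever has been found once a deadline passes. Names here are hypothetical; the real implementation lives in the addon code under packages/core.

```typescript
type SearchResult = { indexer: string; titles: string[] };

// Query each indexer individually so a slow indexer cannot block the others;
// return whatever results arrived before the deadline.
async function searchWithDeadline(
  indexers: string[],
  queryOne: (indexer: string) => Promise<string[]>,
  deadlineMs: number
): Promise<SearchResult[]> {
  const results: SearchResult[] = [];
  // Each indexer appends its results as soon as it responds.
  const all = Promise.allSettled(
    indexers.map(async (indexer) => {
      const titles = await queryOne(indexer);
      results.push({ indexer, titles });
    })
  );
  let timer: ReturnType<typeof setTimeout> | undefined;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error('deadline')), deadlineMs);
  });
  try {
    await Promise.race([all, deadline]);
  } catch {
    // Deadline hit: fall through and return the partial results.
  } finally {
    if (timer) clearTimeout(timer);
  }
  return results;
}
```

A fast indexer's results are returned even when a slow one never responds, which is the behaviour the PR description aims for.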
Fix code bugs
bug fix
config timeout
torznob request fix
json
json api call
json v2
parallel
New indexer
Walkthrough
This pull request refactors search orchestration across Prowlarr and Torznab addons to introduce deadline-based timeouts, parallel multi-indexer searching, and incremental result processing. The Jackett preset gains optional user-configurable indexer selection to enable targeted searches.
Sequence Diagram
sequenceDiagram
actor User
participant Addon as Search Addon
participant Indexer1 as Indexer 1
participant Indexer2 as Indexer 2
participant Deadline as Deadline Timer
User->>Addon: Search request
activate Addon
Addon->>Deadline: Start deadline (e.g., 30s)
Note over Addon: Guard: Check queries & indexers exist
par Parallel Search
Addon->>Indexer1: Query (with per-request timeout)
activate Indexer1
Addon->>Indexer2: Query (with per-request timeout)
activate Indexer2
and Deadline Wait
Deadline->>Deadline: Wait for deadline or all results
end
Note over Addon: Incremental Processing Phase
opt Indexer1 responds first
Indexer1-->>Addon: Results
deactivate Indexer1
Addon->>Addon: Validate, deduplicate, append
Addon->>User: (Partial results available)
end
opt Indexer2 responds
Indexer2-->>Addon: Results
deactivate Indexer2
Addon->>Addon: Validate, deduplicate, append
end
alt Deadline reached
Addon->>Addon: Log timeout event
alt Results found
Addon-->>User: Return partial results
else No results
Addon-->>User: Throw timeout error
end
else All complete before deadline
Addon-->>User: Return full results
end
deactivate Addon
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
This diff introduces significant architectural changes across multiple files: new schema types and configuration, deadline-based timeout racing logic, parallel search orchestration, and incremental result processing with deduplication. The changes are heterogeneous: each file requires distinct reasoning (Prowlarr timeout logic differs from Torznab's multi-indexer orchestration, which differs from Jackett's configuration propagation).
Pre-merge checks and finishing touches
❌ Failed checks (1 inconclusive)
✅ Passed checks (2 passed)
Actionable comments posted: 3
🧹 Nitpick comments (8)
packages/core/src/presets/jackett.ts (2)
67-74: New jackettIndexers option looks good; consider basic input validation.
Accepting raw IDs invites typos or invalid characters. Validate to a conservative slug pattern (e.g., [a-z0-9_-]) when parsing. This prevents malformed URLs downstream.
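The conservative parsing the reviewer suggests can be illustrated in isolation. This is a hypothetical standalone helper, not the preset's actual code:

```typescript
// Parse a comma-separated list of indexer IDs, keeping only conservative
// slugs so nothing unsafe is embedded into per-indexer URLs downstream.
function parseIndexerIds(raw: string | undefined): string[] {
  if (!raw) return [];
  return raw
    .split(',')
    .map((s) => s.trim())
    .filter((s) => /^[a-z0-9_-]+$/i.test(s));
}
```

Entries containing path characters or whitespace are simply dropped rather than rejected, which matches the filter-based approach in the suggested diff.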
152-154: Sanitise indexer IDs before embedding into the manifest config.
Filter out invalid entries to avoid path issues when rewriting /all/ to per-indexer URLs.
Apply:
```diff
-      timeout: options.timeout,
-      indexers: options.jackettIndexers ? options.jackettIndexers.split(',').map((s: string) => s.trim()).filter(Boolean) : [],
+      timeout: options.timeout,
+      indexers: options.jackettIndexers
+        ? options.jackettIndexers
+            .split(',')
+            .map((s: string) => s.trim())
+            .filter((s: string) => /^[a-z0-9_-]+$/i.test(s))
+        : [],
```
packages/core/src/builtins/prowlarr/addon.ts (3)
35-38: Avoid hard‑coded 10s deadline; derive from env/config.
Make the deadline a function of the outer timeout (with a small safety margin) to prevent mismatches across environments.
For example:
```diff
-const SEARCH_DEADLINE_MS = 10000; // 10 seconds
+const SEARCH_DEADLINE_MS =
+  Math.max(1000, (Env.DEFAULT_TIMEOUT ?? 15000) - 500);
```
182-226: Prefer infoHash over downloadUrl; normalise seeders; reduce broken links.
Align with the Torznab path to avoid flaky HTTP download URLs when a valid hash exists and to stabilise dedup.
Apply:
```diff
-      for (const result of data) {
-        const magnetUrl = result.guid.includes('magnet:') ? result.guid : undefined;
-        const downloadUrl = result.magnetUrl?.startsWith('http') ? result.magnetUrl : result.downloadUrl;
-        const infoHash = validateInfoHash(result.infoHash || (magnetUrl ? extractInfoHashFromMagnet(magnetUrl) : undefined));
-        if (!infoHash && !downloadUrl) continue;
-        if (seenTorrents.has(infoHash ?? downloadUrl!)) continue;
-        seenTorrents.add(infoHash ?? downloadUrl!);
-
-        torrents.push({
-          hash: infoHash,
-          downloadUrl: downloadUrl,
-          sources: magnetUrl ? extractTrackersFromMagnet(magnetUrl) : [],
-          seeders: result.seeders,
+      for (const result of data) {
+        const magnetUrl =
+          typeof result.guid === 'string' && result.guid.includes('magnet:')
+            ? result.guid
+            : undefined;
+        const infoHash = validateInfoHash(
+          result.infoHash ||
+            (magnetUrl ? extractInfoHashFromMagnet(magnetUrl) : undefined)
+        );
+        // If we have a reliable hash, omit downloadUrl to avoid broken links.
+        const downloadUrl = infoHash
+          ? undefined
+          : (typeof result.downloadUrl === 'string' && result.downloadUrl.startsWith('http')
+              ? result.downloadUrl
+              : (typeof result.magnetUrl === 'string' && result.magnetUrl.startsWith('http')
+                  ? result.magnetUrl
+                  : undefined));
+        if (!infoHash && !downloadUrl) continue;
+        const dedupKey = infoHash ?? downloadUrl!;
+        if (seenTorrents.has(dedupKey)) continue;
+        seenTorrents.add(dedupKey);
+
+        torrents.push({
+          hash: infoHash,
+          downloadUrl,
+          sources: magnetUrl ? extractTrackersFromMagnet(magnetUrl) : [],
+          seeders:
+            typeof result.seeders === 'number' && result.seeders >= 0
+              ? result.seeders
+              : undefined,
           title: result.title,
           size: result.size,
           indexer: result.indexer,
           type: 'torrent',
         });
       }
```
239-249: Race timeout: clear the timer to avoid dangling timeouts.
Timers persist after Promise.race unless cleared.
Apply:
```diff
-    const timeoutPromise = new Promise((_, reject) =>
-      setTimeout(() => reject(new Error('Search deadline reached')), SEARCH_DEADLINE_MS)
-    );
+    let timer: ReturnType<typeof setTimeout>;
+    const timeoutPromise = new Promise((_, reject) => {
+      timer = setTimeout(
+        () => reject(new Error('Search deadline reached')),
+        SEARCH_DEADLINE_MS
+      );
+    });
 ...
-    } catch (error) {
+    } catch (error) {
       // This catch block will be triggered if the timeout wins the race
       this.logger.info(`Search deadline of ${SEARCH_DEADLINE_MS}ms reached. Returning ${torrents.length} results found so far.`);
-    }
+    } finally {
+      if (timer) clearTimeout(timer);
+    }
```
packages/core/src/builtins/torznab/addon.ts (3)
85-107: Bound concurrency to protect Jackett and avoid bursts.
Running queries × indexers concurrently can flood the endpoint. Use the existing limiter used in Prowlarr.
Apply within this block:
```diff
-    const searchPromises = searchTasks.map(({ query, indexerId }) => async () => {
+    // Reuse the standard concurrency limiter for network calls
+    const { createQueryLimit } = await import('../utils/general.js');
+    const limit = createQueryLimit();
+    const searchPromises = searchTasks.map(({ query, indexerId }) => () =>
+      limit(async () => {
         const start = Date.now();
         try {
           const params: Record<string, string | number | boolean> = { q: query, o: 'json' };
           if (parsedId.season) params.season = parsedId.season;
           if (parsedId.episode) params.ep = parsedId.episode;
           const results = await this.api.searchIndexer(indexerId, 'search', params);
           this.processResults(results, torrents, seenTorrents, indexerId);
         } catch (error) {
           this.logger.warn(
             `Jackett search for "${query}" on [${indexerId}] failed after ${getTimeTakenSincePoint(start)}: ${error instanceof Error ? error.message : String(error)}`
           );
         }
-      });
+      })
+    );
```
109-121: Also bound concurrency in the /all/ fallback.
Apply:
```diff
-    const searchPromises = queries.map((query) => async () => {
+    const { createQueryLimit } = await import('../utils/general.js');
+    const limit = createQueryLimit();
+    const searchPromises = queries.map((query) => () =>
+      limit(async () => {
         try {
           const params: Record<string, string | number | boolean> = { q: query, o: 'json' };
           if (parsedId.season) params.season = parsedId.season;
           if (parsedId.episode) params.ep = parsedId.episode;
           const results = await this.api.search('search', params);
           this.processResults(results, torrents, seenTorrents);
         } catch (error) {
           this.logger.warn(`Jackett /all/ search for "${query}" failed: ${error instanceof Error ? error.message : String(error)}`);
         }
-      });
+      })
+    );
```
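The limiter both suggestions rely on can be illustrated with a minimal stand-in. This is a hypothetical sketch; the codebase's createQueryLimit is assumed to behave similarly:

```typescript
// Minimal concurrency limiter: at most `max` tasks run at once,
// the rest queue and start as running tasks finish.
function createLimiter(max: number) {
  let active = 0;
  const queue: Array<() => void> = [];
  const release = () => {
    active--;
    queue.shift()?.(); // wake the next queued task, if any
  };
  return async <T>(task: () => Promise<T>): Promise<T> => {
    if (active >= max) {
      await new Promise<void>((resolve) => queue.push(resolve));
    }
    active++;
    try {
      return await task();
    } finally {
      release();
    }
  };
}
```

Wrapping each indexer query in such a limiter keeps the burst against Jackett bounded while still running queries in parallel.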
169-179: Race timeout: clear the timer to avoid dangling timeouts.
Same pattern as in Prowlarr; free the timer in a finally block.
Apply:
```diff
-    const timeoutPromise = new Promise((_, reject) =>
-      setTimeout(() => reject(new Error('Search deadline reached')), deadline)
-    );
+    let timer: ReturnType<typeof setTimeout>;
+    const timeoutPromise = new Promise((_, reject) => {
+      timer = setTimeout(
+        () => reject(new Error('Search deadline reached')),
+        deadline
+      );
+    });
     try {
       await Promise.race([allSearchesPromise, timeoutPromise]);
     } catch (error) {
       this.logger.info(`Search deadline of ${deadline}ms reached. Returning results found so far.`);
-    }
+    } finally {
+      if (timer) clearTimeout(timer);
+    }
```
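The race-plus-cleanup pattern recommended for both addons can be shown as a runnable sketch (the function name is hypothetical):

```typescript
// Race some work against a deadline, and always clear the timer in a
// finally block so the timeout cannot keep the event loop alive.
async function raceWithDeadline<T>(
  work: Promise<T>,
  ms: number
): Promise<T | undefined> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error('Search deadline reached')), ms);
  });
  try {
    return await Promise.race([work, deadline]);
  } catch {
    // Deadline (or the work) rejected; caller falls back to partial results.
    return undefined;
  } finally {
    if (timer) clearTimeout(timer);
  }
}
```

Without the finally block, a won race would leave the timer pending for the full deadline, which is exactly the dangling-timeout issue the review flags.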
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (3)
packages/core/src/builtins/prowlarr/addon.ts (3 hunks)
packages/core/src/builtins/torznab/addon.ts (3 hunks)
packages/core/src/presets/jackett.ts (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
packages/core/src/builtins/prowlarr/addon.ts (4)
  packages/core/src/debrid/utils.ts (1)
    UnprocessedTorrent (51-56)
  packages/core/src/builtins/prowlarr/api.ts (1)
    ProwlarrApiIndexer (60-60)
  packages/core/src/utils/logger.ts (1)
    getTimeTakenSincePoint (149-156)
  packages/core/src/builtins/utils/debrid.ts (3)
    validateInfoHash (38-42)
    extractInfoHashFromMagnet (48-52)
    extractTrackersFromMagnet (44-46)
packages/core/src/builtins/torznab/addon.ts (4)
  packages/core/src/builtins/base/nab/addon.ts (1)
    NabAddonConfigSchema (13-18)
  packages/core/src/builtins/base/nab/api.ts (1)
    BaseNabApi (194-359)
  packages/core/src/debrid/utils.ts (1)
    UnprocessedTorrent (51-56)
  packages/core/src/builtins/utils/debrid.ts (2)
    extractTrackersFromMagnet (44-46)
    validateInfoHash (38-42)
🔇 Additional comments (4)
packages/core/src/builtins/prowlarr/addon.ts (2)
62-63: Per‑request timeout comment is clear.
The distinction between per‑request timeout and overall deadline is helpful.
178-181: Early return guard is fine.
Avoids wasted work when no queries or indexers.
packages/core/src/builtins/torznab/addon.ts (2)
123-126: Throw‑on‑empty behaviour matches Prowlarr; LGTM.
55-69: Constructor/schema evolution looks consistent.
Please confirm UnprocessedTorrent includes optional fields used here (title, size, seeders, indexer). If not, I can adjust typing or narrow to the declared shape.
```diff
-import { createLogger } from '../../utils/index.js';
+import { createLogger, getTimeTakenSincePoint } from '../../utils/index.js';
 import { Torrent, NZB, UnprocessedTorrent } from '../../debrid/index.js';
 import { SearchMetadata } from '../base/debrid';
```
🧩 Analysis chain
ESM import likely to break at runtime (.js extension missing).
All other ESM imports use the .js suffix; this one does not and will fail under NodeNext/ESM.
Apply:
```diff
-import { SearchMetadata } from '../base/debrid';
+import { SearchMetadata } from '../base/debrid.js';
```
🏁 Script executed:
```shell
#!/bin/bash
# Verify inconsistent ESM imports in this package
rg -nP --type=ts -C1 "from '../base/debrid(?!\.js)'" packages/core/src/builtins/torznab
```
Length of output: 269
Add .js extension to ESM import for Node.js compatibility.
Relative import paths require explicit file extensions in ECMAScript imports when '--moduleResolution' is 'node16' or 'nodenext'. Line 4 of this file correctly uses '../../debrid/index.js', but line 5 is missing the extension.
Apply:
```diff
-import { SearchMetadata } from '../base/debrid';
+import { SearchMetadata } from '../base/debrid.js';
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
import { SearchMetadata } from '../base/debrid.js';
```
🤖 Prompt for AI Agents
packages/core/src/builtins/torznab/addon.ts around line 5: the ESM import on
line 5 uses a relative path without the .js extension which breaks Node.js with
'node16'/'nodenext' resolution; update the import to include the explicit .js
extension (e.g., change '../base/debrid' to '../base/debrid.js') so the file
path matches ESM requirements and Node can resolve the module.
```typescript
  private processResults(results: any[], torrents: UnprocessedTorrent[], seenTorrents: Set<string>, indexerId?: string) {
    for (const result of results) {
      const infoHash = this.extractInfoHash(result);

      // **THE FIX: Prioritize the reliable infoHash over the unreliable downloadUrl.**
      // If an infoHash exists, we set downloadUrl to undefined. This forces AIOStreams
      // to use its more robust metadata fetching method and avoids getting stuck on broken links.
      const downloadUrl = infoHash
        ? undefined
        : result.enclosure.find(
            (e: any) =>
              e.type === 'application/x-bittorrent' && !e.url.includes('magnet:')
          )?.url;

      if (!infoHash && !downloadUrl) continue;
      if (seenTorrents.has(infoHash ?? downloadUrl!)) continue;
      seenTorrents.add(infoHash ?? downloadUrl!);

      torrents.push({
        hash: infoHash,
        downloadUrl: downloadUrl,
        sources: result.torznab?.magneturl?.toString()
          ? extractTrackersFromMagnet(result.torznab.magneturl.toString())
          : [],
        seeders:
          typeof result.torznab?.seeders === 'number' &&
          ![-1, 999].includes(result.torznab.seeders)
            ? result.torznab.seeders
            : undefined,
        indexer: result.jackettindexer?.name ?? indexerId ?? 'unknown',
        title: result.title,
        size:
          result.size ??
          (result.torznab?.size ? Number(result.torznab.size) : 0),
        type: 'torrent',
      });
    }
  }
```
Guard against missing enclosure arrays.
Some feeds omit enclosure; the direct .find() access can throw.
Apply:
```diff
-      const downloadUrl = infoHash
-        ? undefined
-        : result.enclosure.find(
+      const downloadUrl = infoHash
+        ? undefined
+        : result.enclosure?.find?.(
             (e: any) =>
               e.type === 'application/x-bittorrent' && !e.url.includes('magnet:')
           )?.url;
```
🤖 Prompt for AI Agents
In packages/core/src/builtins/torznab/addon.ts around lines 130 to 167, the code
assumes result.enclosure is always an array and calls .find() directly which can
throw when enclosure is missing or not an array; change the downloadUrl
assignment to first check that result.enclosure is an array (or coerce to an
empty array) before calling .find(), and ensure the .url and
.includes('magnet:') checks are only executed when the enclosure item exists;
keep the existing behavior of preferring infoHash (set downloadUrl undefined
when infoHash exists) and continue early when both infoHash and downloadUrl are
absent.
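The guarded access this suggestion describes can be shown as a standalone sketch (the type and helper name are hypothetical):

```typescript
type Enclosure = { type?: string; url?: string };

// Pick a .torrent download URL from a feed item whose `enclosure` field
// may be missing entirely, treating an absent array as empty.
function pickTorrentUrl(result: { enclosure?: Enclosure[] }): string | undefined {
  return (result.enclosure ?? []).find(
    (e) => e?.type === 'application/x-bittorrent' && !e?.url?.includes('magnet:')
  )?.url;
}
```

With the `?? []` fallback and optional chaining, feeds that omit enclosure simply yield no download URL instead of throwing.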
```typescript
  private extractInfoHash(result: any): string | undefined {
    return validateInfoHash(
      result.torznab?.infohash?.toString() ||
        (
          result.torznab?.magneturl ||
          result.enclosure.find(
            (e: any) =>
              e.type === 'application/x-bittorrent' && e.url.includes('magnet:')
          )?.url
        )
          ?.toString()
          // **THE FIX: Corrected a subtle regex typo from a previous version.**
          ?.match(/(?:urn(?::|%3A)btih(?::|%3A))([a-f0-9]{40})/i)?.[1]
          ?.toLowerCase()
    );
  }
```
Also make the hash extraction null‑safe when scanning enclosure.
Apply:
```diff
-          result.enclosure.find(
+          result.enclosure?.find?.(
             (e: any) =>
               e.type === 'application/x-bittorrent' && e.url.includes('magnet:')
           )?.url
```
📝 Committable suggestion
```typescript
  private extractInfoHash(result: any): string | undefined {
    return validateInfoHash(
      result.torznab?.infohash?.toString() ||
        (
          result.torznab?.magneturl ||
          result.enclosure?.find?.(
            (e: any) =>
              e.type === 'application/x-bittorrent' && e.url.includes('magnet:')
          )?.url
        )
          ?.toString()
          // **THE FIX: Corrected a subtle regex typo from a previous version.**
          ?.match(/(?:urn(?::|%3A)btih(?::|%3A))([a-f0-9]{40})/i)?.[1]
          ?.toLowerCase()
    );
  }
```
🤖 Prompt for AI Agents
In packages/core/src/builtins/torznab/addon.ts around lines 188–203, the
enclosure scanning can throw if result.enclosure is undefined or not an array
and individual enclosure items may be null; make the extraction null-safe by
treating result.enclosure as an array fallback (e.g. use result.enclosure ??
[]), use optional chaining on enclosure item properties (e?.type, e?.url) when
calling find, and ensure the chosen URL string is passed through the existing
regex and toLowerCase chain only after confirming it's a non-null string so
validateInfoHash receives either a string or undefined.
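The btih regex used in the extraction chain can be exercised in isolation. The helper name is hypothetical, and the real code additionally passes the value through validateInfoHash:

```typescript
// Pull a 40-character hex btih infohash out of a magnet URI,
// accepting either ':' or its percent-encoded form '%3A' as separators.
function infoHashFromMagnet(magnet: string): string | undefined {
  return magnet
    .match(/(?:urn(?::|%3A)btih(?::|%3A))([a-f0-9]{40})/i)?.[1]
    ?.toLowerCase();
}
```

Magnets without a well-formed 40-hex hash fall through every optional-chaining step and yield undefined, which lets the caller skip the entry.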
Summary by CodeRabbit
Release Notes
New Features
Improvements