
Conversation


@lucamaia9 lucamaia9 commented Oct 18, 2025

Summary by CodeRabbit

Release Notes

  • New Features

    • Configure parallel search across multiple indexers for faster results
    • New Jackett indexer ID configuration option supporting comma-separated values
  • Improvements

    • Global search deadline ensures results are returned within acceptable timeframes
    • Incremental result processing delivers partial results when timeout occurs
    • Enhanced error handling logs individual search failures without stopping other searches

Send an individual request to each indexer so the timeout can apply per indexer.
Respect the timeout and return what has been found once the torrent-search deadline is reached.
Query each indexer individually instead of sending a single request for all of them, so results from the indexers that responded before the timeout can be returned.
Fix code bugs
config timeout
torznab request fix
json api call
New indexer
Contributor

coderabbitai bot commented Oct 18, 2025

Walkthrough

This pull request refactors search orchestration across Prowlarr and Torznab addons to introduce deadline-based timeouts, parallel multi-indexer searching, and incremental result processing. The Jackett preset gains optional user-configurable indexer selection to enable targeted searches.

Changes

Search Deadline & Timeout Mechanism (packages/core/src/builtins/prowlarr/addon.ts, packages/core/src/builtins/torznab/addon.ts)
Introduces a SEARCH_DEADLINE_MS constant and deadline-based timeout racing via Promise.race. Guards are added for empty queries and indexers. Upon timeout, the partial results gathered so far are returned, or an error is thrown if none were found.

Multi-Indexer Parallel Search (packages/core/src/builtins/torznab/addon.ts)
Adds a TorznabAddonConfig schema and type extending NabAddonConfig with timeout and indexers fields; a minimal configuration sketch follows this list. Implements parallel search across user-defined indexers, with fallback to the Jackett /all/ endpoint when no indexers are configured. A new searchIndexer method in TorznabApi queries specific indexers.

Incremental Result Processing (packages/core/src/builtins/prowlarr/addon.ts, packages/core/src/builtins/torznab/addon.ts)
Results are validated, de-duplicated, and appended to the torrents list as each indexer returns data, rather than after all promises resolve. A new processResults helper coalesces results, prioritising infoHash over magnet-derived URLs. Per-request try/catch error handling logs individual failures.

Jackett Indexer Configuration (packages/core/src/presets/jackett.ts)
Adds an optional jackettIndexers field to JackettPreset.METADATA OPTIONS for comma-separated Jackett indexer IDs. generateManifestUrl propagates the timeout and parsed indexers array to the runtime configuration.
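
As a rough illustration of the new configuration surface, the sketch below shows how a TorznabAddonConfig of this shape could be declared and how comma-separated Jackett indexer IDs might be parsed. It assumes the zod-based NabAddonConfigSchema referenced elsewhere in this review; the parseIndexerIds helper, the import path, and the exact validation rules are illustrative assumptions, not the PR's actual code.

```ts
import { z } from 'zod';
// NabAddonConfigSchema is the base schema this review references
// (packages/core/src/builtins/base/nab/addon.ts); its exact shape is assumed here.
import { NabAddonConfigSchema } from '../base/nab/addon.js';

// Sketch of the extended config: the field names match the PR summary,
// but the validation rules are illustrative only.
export const TorznabAddonConfigSchema = NabAddonConfigSchema.extend({
  // Overall search timeout in milliseconds (optional).
  timeout: z.number().int().positive().optional(),
  // Specific Jackett indexer IDs to query; an empty array means "use the /all/ endpoint".
  indexers: z.array(z.string()).default([]),
});
export type TorznabAddonConfig = z.infer<typeof TorznabAddonConfigSchema>;

// Hypothetical helper: turn the comma-separated jackettIndexers option into an array,
// keeping only conservative slug-like IDs as suggested later in this review.
export function parseIndexerIds(raw: string | undefined): string[] {
  return (raw ?? '')
    .split(',')
    .map((s) => s.trim())
    .filter((s) => /^[a-z0-9_-]+$/i.test(s));
}
```

With this shape, a jackettIndexers value such as "eztv, yts" would yield indexers: ['eztv', 'yts'], while leaving the option blank keeps the /all/ fallback described above.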

Sequence Diagram (a minimal TypeScript sketch of this flow follows the diagram)

sequenceDiagram
    actor User
    participant Addon as Search Addon
    participant Indexer1 as Indexer 1
    participant Indexer2 as Indexer 2
    participant Deadline as Deadline Timer

    User->>Addon: Search request
    activate Addon
    
    Addon->>Deadline: Start deadline (e.g., 30s)
    Note over Addon: Guard: Check queries & indexers exist
    
    par Parallel Search
        Addon->>Indexer1: Query (with per-request timeout)
        activate Indexer1
        Addon->>Indexer2: Query (with per-request timeout)
        activate Indexer2
    and Deadline Wait
        Deadline->>Deadline: Wait for deadline or all results
    end
    
    Note over Addon: Incremental Processing Phase
    opt Indexer1 responds first
        Indexer1-->>Addon: Results
        deactivate Indexer1
        Addon->>Addon: Validate, deduplicate, append
        Addon->>User: (Partial results available)
    end
    
    opt Indexer2 responds
        Indexer2-->>Addon: Results
        deactivate Indexer2
        Addon->>Addon: Validate, deduplicate, append
    end
    
    alt Deadline reached
        Addon->>Addon: Log timeout event
        alt Results found
            Addon-->>User: Return partial results
        else No results
            Addon-->>User: Throw timeout error
        end
    else All complete before deadline
        Addon-->>User: Return full results
    end
    
    deactivate Addon
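
The flow above maps onto a fairly standard deadline-race pattern. Below is a minimal, self-contained TypeScript sketch of that pattern rather than the PR's actual implementation: each indexer is queried independently, results are de-duplicated and appended as they arrive, and a Promise.race against a timer decides whether full or partial results are returned. The searchOneIndexer parameter, the TorrentResult shape, and the 10-second default are placeholders.

```ts
type TorrentResult = { hash?: string; downloadUrl?: string; title: string };

const SEARCH_DEADLINE_MS = 10_000; // placeholder default; the real value is configurable

async function searchWithDeadline(
  indexerIds: string[],
  // Placeholder for the per-indexer query (e.g. a Torznab/Prowlarr API call).
  searchOneIndexer: (id: string) => Promise<TorrentResult[]>
): Promise<TorrentResult[]> {
  const torrents: TorrentResult[] = [];
  const seen = new Set<string>();

  // Query every indexer independently; append results as soon as each one resolves.
  const searches = indexerIds.map(async (id) => {
    try {
      for (const t of await searchOneIndexer(id)) {
        const key = t.hash ?? t.downloadUrl;
        if (!key || seen.has(key)) continue; // de-duplicate across indexers
        seen.add(key);
        torrents.push(t);
      }
    } catch (err) {
      // An individual failure is logged and does not abort the other searches.
      console.warn(`search on [${id}] failed:`, err);
    }
  });

  let timer: ReturnType<typeof setTimeout> | undefined;
  const deadline = new Promise<never>((_, reject) => {
    timer = setTimeout(
      () => reject(new Error('Search deadline reached')),
      SEARCH_DEADLINE_MS
    );
  });

  try {
    // Either all searches settle first, or the deadline rejects first.
    await Promise.race([Promise.allSettled(searches), deadline]);
  } catch {
    // Deadline won the race: fall through and return the partial results gathered so far.
  } finally {
    if (timer) clearTimeout(timer); // avoid a dangling timer, as the review recommends
  }

  if (torrents.length === 0) {
    throw new Error('No results found before the search deadline');
  }
  return torrents;
}
```

The real implementation differs in detail (it uses the project's API clients, logger, and configurable deadline), but the race-and-collect shape is the same as in the diagram above.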

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~60 minutes

This diff introduces significant architectural changes across multiple files: new schema types and configuration, deadline-based timeout racing logic, parallel search orchestration, and incremental result processing with deduplication. The changes are heterogeneous—each file requires distinct reasoning (Prowlarr timeout logic differs from Torznab's multi-indexer orchestration, which differs from Jackett's configuration propagation). The async control flow with Promise.race and error handling patterns demand careful verification of edge cases (timeout scenarios, partial results, error propagation).

Poem

🐰 Hoppy searches now race the clock,
Multiple warrens searched in parallel lock—
Results trickle in as indexers reply,
No more waiting for the slowest to fly!
Deadlines enforced, partial bounties we keep, ✨🏃‍♂️

Pre-merge checks and finishing touches

❌ Failed checks (1 inconclusive)
Check name: Title Check
Status: ❓ Inconclusive
Explanation: The pull request title "Update AIOStream latest" is vague and generic, failing to convey meaningful information about the changeset. The actual changes involve significant refactoring across three files: implementing deadline-based timeout mechanisms and incremental result processing in Prowlarr and Torznab addons, introducing parallel search orchestration across multiple indexers, adding new configuration schemas, and enhancing error handling. None of these substantial changes are reflected in the title. A teammate scanning pull request history would not understand what this PR accomplishes based solely on the provided title, as it contains no descriptive or specific details about the core functionality being modified.
Resolution: Consider revising the title to be more specific and descriptive of the main changes, such as "Implement deadline-based timeout and parallel indexer search orchestration" or "Add timeout mechanisms and multi-indexer search support for Prowlarr and Torznab addons". This would clearly communicate the primary objective to reviewers and make the commit history more navigable.
✅ Passed checks (2 passed)
Description Check: ✅ Passed. Check skipped - CodeRabbit’s high-level summary is enabled.
Docstring Coverage: ✅ Passed. No functions found in the changes. Docstring coverage check skipped.

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.


Comment @coderabbitai help to get the list of available commands and usage tips.

Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

🧹 Nitpick comments (8)
packages/core/src/presets/jackett.ts (2)

67-74: New jackettIndexers option looks good; consider basic input validation.

Accepting raw IDs invites typos or invalid characters. Validate to a conservative slug pattern (e.g., [a-z0-9_-]) when parsing. This prevents malformed URLs downstream.


152-154: Sanitise indexer IDs before embedding into the manifest config.

Filter out invalid entries to avoid path issues when rewriting /all/ to per-indexer URLs.

Apply:

-      timeout: options.timeout,
-      indexers: options.jackettIndexers ? options.jackettIndexers.split(',').map((s: string) => s.trim()).filter(Boolean) : [],
+      timeout: options.timeout,
+      indexers: options.jackettIndexers
+        ? options.jackettIndexers
+            .split(',')
+            .map((s: string) => s.trim())
+            .filter((s: string) => /^[a-z0-9_-]+$/i.test(s))
+        : [],
packages/core/src/builtins/prowlarr/addon.ts (3)

35-38: Avoid hard‑coded 10s deadline; derive from env/config.

Make the deadline a function of the outer timeout (with a small safety margin) to prevent mismatches across environments.

For example:

-const SEARCH_DEADLINE_MS = 10000; // 10 seconds
+const SEARCH_DEADLINE_MS =
+  Math.max(1000, (Env.DEFAULT_TIMEOUT ?? 15000) - 500);

182-226: Prefer infoHash over downloadUrl; normalise seeders; reduce broken links.

Align with the Torznab path to avoid flaky HTTP download URLs when a valid hash exists and to stabilise dedup.

Apply:

-          for (const result of data) {
-            const magnetUrl = result.guid.includes('magnet:') ? result.guid : undefined;
-            const downloadUrl = result.magnetUrl?.startsWith('http') ? result.magnetUrl : result.downloadUrl;
-            const infoHash = validateInfoHash(result.infoHash || (magnetUrl ? extractInfoHashFromMagnet(magnetUrl) : undefined));
-            if (!infoHash && !downloadUrl) continue;
-            if (seenTorrents.has(infoHash ?? downloadUrl!)) continue;
-            seenTorrents.add(infoHash ?? downloadUrl!);
-
-            torrents.push({
-              hash: infoHash,
-              downloadUrl: downloadUrl,
-              sources: magnetUrl ? extractTrackersFromMagnet(magnetUrl) : [],
-              seeders: result.seeders,
+          for (const result of data) {
+            const magnetUrl =
+              typeof result.guid === 'string' && result.guid.includes('magnet:')
+                ? result.guid
+                : undefined;
+            const infoHash = validateInfoHash(
+              result.infoHash ||
+                (magnetUrl ? extractInfoHashFromMagnet(magnetUrl) : undefined)
+            );
+            // If we have a reliable hash, omit downloadUrl to avoid broken links.
+            const downloadUrl = infoHash
+              ? undefined
+              : (typeof result.downloadUrl === 'string' && result.downloadUrl.startsWith('http')
+                  ? result.downloadUrl
+                  : (typeof result.magnetUrl === 'string' && result.magnetUrl.startsWith('http')
+                      ? result.magnetUrl
+                      : undefined));
+            if (!infoHash && !downloadUrl) continue;
+            const dedupKey = infoHash ?? downloadUrl!;
+            if (seenTorrents.has(dedupKey)) continue;
+            seenTorrents.add(dedupKey);
+
+            torrents.push({
+              hash: infoHash,
+              downloadUrl,
+              sources: magnetUrl ? extractTrackersFromMagnet(magnetUrl) : [],
+              seeders:
+                typeof result.seeders === 'number' && result.seeders >= 0
+                  ? result.seeders
+                  : undefined,
               title: result.title,
               size: result.size,
               indexer: result.indexer,
               type: 'torrent',
             });
           }

239-249: Race timeout: clear the timer to avoid dangling timeouts.

Timers persist after Promise.race unless cleared.

Apply:

-    const timeoutPromise = new Promise((_, reject) =>
-      setTimeout(() => reject(new Error('Search deadline reached')), SEARCH_DEADLINE_MS)
-    );
+    let timer: ReturnType<typeof setTimeout>;
+    const timeoutPromise = new Promise((_, reject) => {
+      timer = setTimeout(
+        () => reject(new Error('Search deadline reached')),
+        SEARCH_DEADLINE_MS
+      );
+    });
...
-    } catch (error) {
+    } catch (error) {
       // This catch block will be triggered if the timeout wins the race
       this.logger.info(`Search deadline of ${SEARCH_DEADLINE_MS}ms reached. Returning ${torrents.length} results found so far.`);
-    }
+    } finally {
+      if (timer) clearTimeout(timer);
+    }
packages/core/src/builtins/torznab/addon.ts (3)

85-107: Bound concurrency to protect Jackett and avoid bursts.

Running queries × indexers concurrently can flood the endpoint. Reuse the limiter that Prowlarr already uses; a generic sketch of the limiter pattern follows the two diffs below.

Apply within this block:

-      const searchPromises = searchTasks.map(({ query, indexerId }) => async () => {
+      // Reuse the standard concurrency limiter for network calls
+      const { createQueryLimit } = await import('../utils/general.js');
+      const limit = createQueryLimit();
+      const searchPromises = searchTasks.map(({ query, indexerId }) => () =>
+        limit(async () => {
           const start = Date.now();
           try {
             const params: Record<string, string | number | boolean> = { q: query, o: 'json' };
             if (parsedId.season) params.season = parsedId.season;
             if (parsedId.episode) params.ep = parsedId.episode;
 
             const results = await this.api.searchIndexer(indexerId, 'search', params);
             this.processResults(results, torrents, seenTorrents, indexerId);
           } catch (error) {
             this.logger.warn(
               `Jackett search for "${query}" on [${indexerId}] failed after ${getTimeTakenSincePoint(start)}: ${error instanceof Error ? error.message : String(error)}`
             );
           }
-      });
+        })
+      );

109-121: Also bound concurrency in the /all/ fallback.

Apply:

-      const searchPromises = queries.map((query) => async () => {
+      const { createQueryLimit } = await import('../utils/general.js');
+      const limit = createQueryLimit();
+      const searchPromises = queries.map((query) => () =>
+        limit(async () => {
           try {
             const params: Record<string, string | number | boolean> = { q: query, o: 'json' };
             if (parsedId.season) params.season = parsedId.season;
             if (parsedId.episode) params.ep = parsedId.episode;
             const results = await this.api.search('search', params);
             this.processResults(results, torrents, seenTorrents);
           } catch (error) {
              this.logger.warn(`Jackett /all/ search for "${query}" failed: ${error instanceof Error ? error.message : String(error)}`);
           }
-      });
+        })
+      );
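
For reference, the limiter pattern recommended in these two comments generally looks like the following hand-rolled sketch. createQueryLimit is the project's own helper according to this review; the code below only illustrates the underlying idea (a p-limit-style bound on concurrent tasks) and is not that helper's actual implementation.

```ts
// Minimal concurrency limiter: at most `max` tasks run at the same time;
// additional tasks wait in a FIFO queue until a slot frees up.
function createLimiter(max: number) {
  let active = 0;
  const queue: Array<() => void> = [];

  const release = () => {
    active--;
    queue.shift()?.(); // start the next queued task, if any
  };

  return function limit<T>(task: () => Promise<T>): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      const run = () => {
        active++;
        Promise.resolve()
          .then(task)
          .then(resolve, reject)
          .finally(release);
      };
      if (active < max) {
        run();
      } else {
        queue.push(run);
      }
    });
  };
}

// Usage sketch (hypothetical names): bound the parallel Jackett queries to 4 at a time.
// const limit = createLimiter(4);
// const searchPromises = searchTasks.map(({ query, indexerId }) => () =>
//   limit(() => runSearch(query, indexerId)));
```

Wrapping each per-indexer task in such a limit() keeps the number of simultaneous requests to Jackett bounded regardless of how many query × indexer combinations are generated.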

169-179: Race timeout: clear the timer to avoid dangling timeouts.

Same pattern as in Prowlarr; free the timer in a finally block.

Apply:

-    const timeoutPromise = new Promise((_, reject) =>
-      setTimeout(() => reject(new Error('Search deadline reached')), deadline)
-    );
+    let timer: ReturnType<typeof setTimeout>;
+    const timeoutPromise = new Promise((_, reject) => {
+      timer = setTimeout(
+        () => reject(new Error('Search deadline reached')),
+        deadline
+      );
+    });
     try {
       await Promise.race([allSearchesPromise, timeoutPromise]);
     } catch (error) {
       this.logger.info(`Search deadline of ${deadline}ms reached. Returning results found so far.`);
-    }
+    } finally {
+      if (timer) clearTimeout(timer);
+    }
📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 5ddd111 and 24fe682.

📒 Files selected for processing (3)
  • packages/core/src/builtins/prowlarr/addon.ts (3 hunks)
  • packages/core/src/builtins/torznab/addon.ts (3 hunks)
  • packages/core/src/presets/jackett.ts (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
packages/core/src/builtins/prowlarr/addon.ts (4)
packages/core/src/debrid/utils.ts (1)
  • UnprocessedTorrent (51-56)
packages/core/src/builtins/prowlarr/api.ts (1)
  • ProwlarrApiIndexer (60-60)
packages/core/src/utils/logger.ts (1)
  • getTimeTakenSincePoint (149-156)
packages/core/src/builtins/utils/debrid.ts (3)
  • validateInfoHash (38-42)
  • extractInfoHashFromMagnet (48-52)
  • extractTrackersFromMagnet (44-46)
packages/core/src/builtins/torznab/addon.ts (4)
packages/core/src/builtins/base/nab/addon.ts (1)
  • NabAddonConfigSchema (13-18)
packages/core/src/builtins/base/nab/api.ts (1)
  • BaseNabApi (194-359)
packages/core/src/debrid/utils.ts (1)
  • UnprocessedTorrent (51-56)
packages/core/src/builtins/utils/debrid.ts (2)
  • extractTrackersFromMagnet (44-46)
  • validateInfoHash (38-42)
🔇 Additional comments (4)
packages/core/src/builtins/prowlarr/addon.ts (2)

62-63: Per‑request timeout comment is clear.

The distinction between per‑request timeout and overall deadline is helpful.


178-181: Early return guard is fine.

Avoids wasted work when no queries or indexers.

packages/core/src/builtins/torznab/addon.ts (2)

123-126: Throw‑on‑empty behaviour matches Prowlarr; LGTM.


55-69: Constructor/schema evolution looks consistent.

Please confirm UnprocessedTorrent includes optional fields used here (title, size, seeders, indexer). If not, I can adjust typing or narrow to the declared shape.

-import { createLogger } from '../../utils/index.js';
+import { createLogger, getTimeTakenSincePoint } from '../../utils/index.js';
 import { Torrent, NZB, UnprocessedTorrent } from '../../debrid/index.js';
 import { SearchMetadata } from '../base/debrid';

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

ESM import likely to break at runtime (.js extension missing).

All other ESM imports use the .js suffix; this one does not and will fail under NodeNext/ESM.

Apply:

-import { SearchMetadata } from '../base/debrid';
+import { SearchMetadata } from '../base/debrid.js';

🏁 Script executed:

#!/bin/bash
# Verify inconsistent ESM imports in this package
rg -nP --type=ts -C1 "from '../base/debrid(?!\.js)'" packages/core/src/builtins/torznab

Length of output: 269


Add .js extension to ESM import for Node.js compatibility.

Relative import paths require explicit file extensions in ECMAScript imports when '--moduleResolution' is 'node16' or 'nodenext'. Line 4 of this file correctly uses '../../debrid/index.js', but line 5 is missing the extension.

Apply:

-import { SearchMetadata } from '../base/debrid';
+import { SearchMetadata } from '../base/debrid.js';
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change (before, then after):
import { SearchMetadata } from '../base/debrid';
import { SearchMetadata } from '../base/debrid.js';
🤖 Prompt for AI Agents
packages/core/src/builtins/torznab/addon.ts around line 5: the ESM import on
line 5 uses a relative path without the .js extension which breaks Node.js with
'node16'/'nodenext' resolution; update the import to include the explicit .js
extension (e.g., change '../base/debrid' to '../base/debrid.js') so the file
path matches ESM requirements and Node can resolve the module.

Comment on lines +130 to +167
  private processResults(results: any[], torrents: UnprocessedTorrent[], seenTorrents: Set<string>, indexerId?: string) {
    for (const result of results) {
      const infoHash = this.extractInfoHash(result);

      // **THE FIX: Prioritize the reliable infoHash over the unreliable downloadUrl.**
      // If an infoHash exists, we set downloadUrl to undefined. This forces AIOStreams
      // to use its more robust metadata fetching method and avoids getting stuck on broken links.
      const downloadUrl = infoHash
        ? undefined
        : result.enclosure.find(
            (e: any) =>
              e.type === 'application/x-bittorrent' && !e.url.includes('magnet:')
          )?.url;

      if (!infoHash && !downloadUrl) continue;
      if (seenTorrents.has(infoHash ?? downloadUrl!)) continue;
      seenTorrents.add(infoHash ?? downloadUrl!);

      torrents.push({
        hash: infoHash,
        downloadUrl: downloadUrl,
        sources: result.torznab?.magneturl?.toString()
          ? extractTrackersFromMagnet(result.torznab.magneturl.toString())
          : [],
        seeders:
          typeof result.torznab?.seeders === 'number' &&
          ![-1, 999].includes(result.torznab.seeders)
            ? result.torznab.seeders
            : undefined,
        indexer: result.jackettindexer?.name ?? indexerId ?? 'unknown',
        title: result.title,
        size:
          result.size ??
          (result.torznab?.size ? Number(result.torznab.size) : 0),
        type: 'torrent',
      });
    }
  }

⚠️ Potential issue | 🟠 Major

Guard against missing enclosure arrays.

Some feeds omit enclosure; the direct .find() access can throw.

Apply:

-        const downloadUrl = infoHash 
-          ? undefined 
-          : result.enclosure.find(
+        const downloadUrl = infoHash
+          ? undefined
+          : result.enclosure?.find?.(
               (e: any) =>
                 e.type === 'application/x-bittorrent' && !e.url.includes('magnet:')
             )?.url;
🤖 Prompt for AI Agents
In packages/core/src/builtins/torznab/addon.ts around lines 130 to 167, the code
assumes result.enclosure is always an array and calls .find() directly which can
throw when enclosure is missing or not an array; change the downloadUrl
assignment to first check that result.enclosure is an array (or coerce to an
empty array) before calling .find(), and ensure the .url and
.includes('magnet:') checks are only executed when the enclosure item exists;
keep the existing behavior of preferring infoHash (set downloadUrl undefined
when infoHash exists) and continue early when both infoHash and downloadUrl are
absent.

Comment on lines 188 to 203
private extractInfoHash(result: any): string | undefined {
return validateInfoHash(
result.torznab?.infohash?.toString() ||
(
result.torznab?.magneturl ||
result.enclosure.find(
(e: any) =>
e.type === 'application/x-bittorrent' && e.url.includes('magnet:')
)?.url
)
?.toString()
?.match(/(?:urn(?::|%3A)btih(?::|%3A))([a-f0-9]{40})/i)?.[1]
?.toLowerCase()
(
result.torznab?.magneturl ||
result.enclosure.find(
(e: any) =>
e.type === 'application/x-bittorrent' && e.url.includes('magnet:')
)?.url
)
?.toString()
// **THE FIX: Corrected a subtle regex typo from a previous version.**
?.match(/(?:urn(?::|%3A)btih(?::|%3A))([a-f0-9]{40})/i)?.[1]
?.toLowerCase()
);
}

⚠️ Potential issue | 🟠 Major

Also null‑safe in hash extraction when scanning enclosure.

Apply:

-        result.enclosure.find(
+        result.enclosure?.find?.(
           (e: any) =>
             e.type === 'application/x-bittorrent' && e.url.includes('magnet:')
         )?.url
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change (before, then after):
private extractInfoHash(result: any): string | undefined {
return validateInfoHash(
result.torznab?.infohash?.toString() ||
(
result.torznab?.magneturl ||
result.enclosure.find(
(e: any) =>
e.type === 'application/x-bittorrent' && e.url.includes('magnet:')
)?.url
)
?.toString()
?.match(/(?:urn(?::|%3A)btih(?::|%3A))([a-f0-9]{40})/i)?.[1]
?.toLowerCase()
(
result.torznab?.magneturl ||
result.enclosure.find(
(e: any) =>
e.type === 'application/x-bittorrent' && e.url.includes('magnet:')
)?.url
)
?.toString()
// **THE FIX: Corrected a subtle regex typo from a previous version.**
?.match(/(?:urn(?::|%3A)btih(?::|%3A))([a-f0-9]{40})/i)?.[1]
?.toLowerCase()
);
}
private extractInfoHash(result: any): string | undefined {
return validateInfoHash(
result.torznab?.infohash?.toString() ||
(
result.torznab?.magneturl ||
result.enclosure?.find?.(
(e: any) =>
e.type === 'application/x-bittorrent' && e.url.includes('magnet:')
)?.url
)
?.toString()
// **THE FIX: Corrected a subtle regex typo from a previous version.**
?.match(/(?:urn(?::|%3A)btih(?::|%3A))([a-f0-9]{40})/i)?.[1]
?.toLowerCase()
);
}
🤖 Prompt for AI Agents
In packages/core/src/builtins/torznab/addon.ts around lines 188–203, the
enclosure scanning can throw if result.enclosure is undefined or not an array
and individual enclosure items may be null; make the extraction null-safe by
treating result.enclosure as an array fallback (e.g. use result.enclosure ??
[]), use optional chaining on enclosure item properties (e?.type, e?.url) when
calling find, and ensure the chosen URL string is passed through the existing
regex and toLowerCase chain only after confirming it's a non-null string so
validateInfoHash receives either a string or undefined.

