
refactor: realign update flow with Moder.Update incremental chain model#323

Merged
ModerRAS merged 1 commit into master from refactor/moder-update-incremental-chain on Apr 25, 2026

Conversation

@ModerRAS
Owner

@ModerRAS ModerRAS commented Apr 25, 2026

Summary by CodeRabbit

  • New Features

    • Enhanced update delivery system to support multi-step update chains, allowing updates to be installed in sequential steps rather than as single packages.
  • Chores

    • Refactored internal update infrastructure and build pipeline for improved catalog management and artifact handling.

- Extract shared UpdateCatalog/UpdateCatalogEntry/UpdateManifest/UpdateFile
  models to TelegramSearchBot.Common/Model/Update/
- Rewrite UpdateBuilder to generate step packages (changed files only) and
  cumulative/anchor packages, merge with existing catalog
- Rewrite SelfUpdateBootstrap runtime with PlanUpdatePath for chain-based
  upgrade path planning, cumulative preference, and sequential multi-package
  application
- Update push.yml to fetch existing catalog for merge, download previous
  release standalone for step package comparison, and replace file-based
  prune with catalog-reachability-based prune
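The chain-based upgrade planning described above can be modeled roughly as follows. This is an illustrative Python sketch, not the actual C# PlanUpdatePath; the `Entry` fields and the greedy cumulative-first strategy are assumptions inferred from this description (step packages apply only from their exact source version, cumulative packages from any version at or above their minimum):

```python
# Hypothetical model of chain-based upgrade path planning. Versions are tuples.
from dataclasses import dataclass

@dataclass
class Entry:
    min_source: tuple   # lowest version this package applies from
    target: tuple       # version the package produces
    cumulative: bool    # cumulative packages accept any source >= min_source

def plan_update_path(current, latest, entries):
    """Return an ordered list of entries upgrading current -> latest, or None."""
    path = []
    while current < latest:
        # Applicable: cumulative packages covering `current`, or the exact step.
        candidates = [e for e in entries
                      if e.min_source <= current and e.target > current
                      and (e.cumulative or e.min_source == current)]
        if not candidates:
            return None  # NoPathFound
        # Prefer cumulative, then the furthest-reaching target (deterministic).
        best = max(candidates, key=lambda e: (e.cumulative, e.target))
        path.append(best)
        current = best.target
    return path
```

For example, with a step chain 1.0→2.0→3.0 plus a cumulative 1.0→3.0 package, a client on 1.0 takes the single cumulative hop, while a client below every package's minimum gets `None` and must fall back to a full install.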
@coderabbitai

coderabbitai Bot commented Apr 25, 2026

📝 Walkthrough

Walkthrough

This PR refactors the update system to implement a multi-step incremental chain model based on issue #322. Changes include: pre-fetching prior catalog and version metadata in the GitHub workflow, introducing new model classes for structured catalog/manifest/entry/file definitions, completely reworking UpdateBuilder to generate multiple packages (step, cumulative, anchor) with multi-entry catalogs, switching B2 pruning from local file enumeration to catalog-driven reachability, and refactoring SelfUpdateBootstrap runtime to plan and sequentially apply multi-step update paths instead of selecting a single package.

Changes

Cohort / File(s) Summary
GitHub Workflow
.github/workflows/push.yml
Pre-fetches prior catalog.json and GitHub release metadata before build; exports version/directory info; dynamically constructs UpdateBuilder arguments with conditional --existing-catalog, --prev-source-dir, --prev-version, --anchor-source-dir, --anchor-version flags; replaces local-file-based keep-path logic with catalog-driven reachability pruning that only retains packages referenced by uploaded catalog entries.
Update Model Types
TelegramSearchBot.Common/Model/Update/UpdateCatalog.cs, UpdateCatalogEntry.cs, UpdateFile.cs, UpdateManifest.cs
Adds four new sealed model classes defining structured update metadata: UpdateCatalog (latest version + entries list + optional min required/updater checksum), UpdateCatalogEntry (package path, version bounds, cumulative/anchor flags, chain depth, checksums, file/size counts), UpdateFile (relative path + new checksum), and UpdateManifest (target/source version bounds, flags, files list, checksums, timestamps).
UpdateBuilder Refactor
TelegramSearchBot.UpdateBuilder/Program.cs
Restructures to generate multiple packages and merged multi-entry catalogs instead of single entry. Adds optional loading/merging of existing catalogs with case-insensitive deserialization; conditionally generates step package from --prev-source-dir/--prev-version, cumulative package from --anchor-source-dir/--anchor-version, and fallback full anchor package; replaces single manifest/package logic with helper methods (BuildPackageFile, ComputeChangedFiles, PruneCatalogEntries); outputs to packages/ subdirectory; migrates to common update model types.
UpdateBuilder Project
TelegramSearchBot.UpdateBuilder/TelegramSearchBot.UpdateBuilder.csproj
Adds project reference to TelegramSearchBot.Common for shared model access; removes BOM prefix from opening tag.
Runtime Update Logic
TelegramSearchBot/Service/Update/SelfUpdateBootstrap.Windows.cs
Refactors from single-entry selection to multi-step path planning and sequential application. Replaces "best single entry" logic with PlanUpdatePath returning ordered UpdatePath of multiple steps or NoPathFound; renames PrepareUpdateAsync to PrepareUpdateChainAsync to download/verify/extract each package sequentially into shared staging directory; derives final target version from last step; updates gating/messaging accordingly; removes local type definitions in favor of common model types.

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs

  • PR #294: Directly modifies the same update subsystem (UpdateBuilder, Common.Model.Update types, SelfUpdateBootstrap) with shared logic for multi-step update integration.
  • PR #313: Modifies push.yml and B2 package pruning strategy, sharing workflow/cleanup logic refinements.
  • PR #312: Modifies push.yml and B2 cleanup/versioning, overlapping in release automation and storage management.

Poem

🐰 A rabbit hops through catalog entries so neat,
Packages bundled in chains, step by step beat!
No longer just one, but a path long and true,
Incremental updates from old versions to new! 🚀

🚥 Pre-merge checks | ✅ 4 | ❌ 1

❌ Failed checks (1 warning)

Check name Status Explanation Resolution
Docstring Coverage ⚠️ Warning Docstring coverage is 0.00% which is insufficient. The required threshold is 80.00%. Write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (4 passed)
Check name Status Explanation
Description Check ✅ Passed Check skipped - CodeRabbit’s high-level summary is enabled.
Title check ✅ Passed The PR title 'refactor: realign update flow with Moder.Update incremental chain model' accurately summarizes the main objective: refactoring the update system to implement incremental chain-based semantics.
Linked Issues check ✅ Passed The PR comprehensively implements the linked issue #322 requirements: extracts shared update models, rewrites UpdateBuilder for step/cumulative packages with multi-entry catalogs, refactors runtime for chain-based planning with sequential package application, and updates workflow for catalog-driven pruning.
Out of Scope Changes check ✅ Passed All changes directly support the incremental chain model implementation: model definitions, builder refactoring for step packages, runtime path planning, and workflow catalog management. No unrelated modifications detected.


@github-actions
Contributor

🔍 PR Check Report

📋 Check Overview

🧪 Test Results

Platform Status Details
Ubuntu 🟢 Success Tests passed, artifacts uploaded
Windows 🟢 Success Tests passed, artifacts uploaded

📊 Code Quality

  • ✅ Code formatting check
  • ✅ Security vulnerability scan
  • ✅ Dependency analysis
  • ✅ Code coverage collection

📁 Test Artifacts

  • Test result artifacts: 2
  • Code coverage uploaded to Codecov

🔗 Related Links


This report was generated automatically by GitHub Actions.


@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 6

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/push.yml:
- Around line 248-254: The try/catch around the b2 call won't catch
non-terminating failures; instead run the b2 download-file-by-name command, then
check $LASTEXITCODE to determine success, log "Downloaded existing catalog.json
for merge." only on success and log "No existing catalog found on B2, will
create fresh catalog." on failure; use Test-Path
'.\artifacts\existing-catalog.json' only after confirming $LASTEXITCODE was 0
(or explicitly remove any stale/partial file on failure) and then call b2
account clear regardless. Ensure you reference the b2 download-file-by-name
invocation, $LASTEXITCODE, Test-Path '.\artifacts\existing-catalog.json', and b2
account clear when making the changes.
- Around line 270-271: The current steps use Write-Host piped to Out-File which
emits nothing to the pipeline, so PREV_VERSION and PREV_STANDALONE_DIR are not
written to $env:GITHUB_ENV; replace those lines to write directly to the GitHub
env file (for example use Add-Content -Path $env:GITHUB_ENV -Value
"PREV_VERSION=$prevVersion" and Add-Content -Path $env:GITHUB_ENV -Value
"PREV_STANDALONE_DIR=artifacts\prev-standalone") so the variables PREV_VERSION
and PREV_STANDALONE_DIR are actually exported into the environment for later
steps.

In `@TelegramSearchBot.UpdateBuilder/Program.cs`:
- Around line 228-276: BuildPackageFile currently builds the package twice and
computes the checksum over the first build (tempBytes) while returning the
second build (finalBytes), so the stored PackageChecksum will never match the
bytes written to disk; also minSourceVersion is unused. Fix by building the
package only once: create the manifest with Checksum = string.Empty (use
minSourceVersion for MinSourceVersion instead of fromVersion), call
CreatePackage once to produce packageBytes (drop the second
CreatePackage/finalManifest), compute finalChecksum =
ComputeSha512(packageBytes), and return (packageBytes, finalChecksum); update
any call sites to remove the now-unused minSourceVersion parameter if you change
the signature, and keep references to
tempManifest/tempBytes/finalChecksum/finalBytes to locate the changes.
- Around line 325-346: ComputeChangedFiles currently ignores deletions, causing
stale files to persist; modify ComputeChangedFiles to detect any path present in
prevFiles but missing from currentFiles and signal a fallback to cumulative
packages by returning null (or another sentinel) instead of a partial changed
list. Implement by building prevMap as now, also build a HashSet<string> of
current RelativePath values, and after computing changed add a check: if any key
in prevMap is not in the current set then return null; otherwise return the
changed list as before; keep the function name ComputeChangedFiles and update
any callers to handle the null sentinel as "deletion detected -> use cumulative
package" behavior.

In `@TelegramSearchBot/Service/Update/SelfUpdateBootstrap.Windows.cs`:
- Around line 141-155: The direct-to-target selection currently picks the first
matching element from candidates (via directToTarget) which can be
nondeterministic and ignores the "prefer cumulative" rule; change the selection
to filter candidates that parse to >= targetVersion and then OrderByDescending(e
=> e.IsCumulative) (and then OrderByDescending by parsed TargetVersion as a
tie-breaker) before taking FirstOrDefault so cumulative entries are preferred
deterministically; update the directToTarget assignment (and keep the
Version.TryParse logic used elsewhere) to use this ordered LINQ query and then
assign bestPick accordingly.
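The suggested ordering rule can be sketched compactly. This is a hypothetical Python model of the LINQ query, not the actual SelfUpdateBootstrap code; the dict keys are illustrative stand-ins for the catalog entry fields:

```python
# Among entries that reach at least the target version, prefer cumulative
# entries, then the highest target version, so selection is deterministic.
def pick_direct_to_target(candidates, target_version):
    reaching = [e for e in candidates if e["target"] >= target_version]
    # Equivalent of OrderByDescending(IsCumulative).ThenByDescending(TargetVersion).
    reaching.sort(key=lambda e: (e["cumulative"], e["target"]), reverse=True)
    return reaching[0] if reaching else None
```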
ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: b68368a0-0496-4cbd-84fb-aacbf6c38cf9

📥 Commits

Reviewing files that changed from the base of the PR and between a584537 and 392576e.

📒 Files selected for processing (8)
  • .github/workflows/push.yml
  • TelegramSearchBot.Common/Model/Update/UpdateCatalog.cs
  • TelegramSearchBot.Common/Model/Update/UpdateCatalogEntry.cs
  • TelegramSearchBot.Common/Model/Update/UpdateFile.cs
  • TelegramSearchBot.Common/Model/Update/UpdateManifest.cs
  • TelegramSearchBot.UpdateBuilder/Program.cs
  • TelegramSearchBot.UpdateBuilder/TelegramSearchBot.UpdateBuilder.csproj
  • TelegramSearchBot/Service/Update/SelfUpdateBootstrap.Windows.cs

Comment on lines +248 to +254
try {
  b2 download-file-by-name $env:B2_BUCKET TelegramSearchBot/catalog.json .\artifacts\existing-catalog.json --quiet
  Write-Host "Downloaded existing catalog.json for merge."
} catch {
  Write-Host "No existing catalog found on B2, will create fresh catalog."
}
b2 account clear


⚠️ Potential issue | 🟡 Minor

try/catch won't trap b2 failures; native commands return exit codes, not terminating errors.

If b2 download-file-by-name fails (e.g., 404 because the catalog doesn't exist yet), the catch block won't run; the step continues, and the later Test-Path '.\artifacts\existing-catalog.json' may pass against a stale or partial file (b2 sometimes touches the path before failing). Gate on $LASTEXITCODE instead so the "no catalog" log message accurately reflects what happened.

🛠️ Suggested fix
-        try {
-          b2 download-file-by-name $env:B2_BUCKET TelegramSearchBot/catalog.json .\artifacts\existing-catalog.json --quiet
-          Write-Host "Downloaded existing catalog.json for merge."
-        } catch {
-          Write-Host "No existing catalog found on B2, will create fresh catalog."
-        }
+        b2 download-file-by-name $env:B2_BUCKET TelegramSearchBot/catalog.json .\artifacts\existing-catalog.json --quiet 2>$null
+        if ($LASTEXITCODE -eq 0 -and (Test-Path .\artifacts\existing-catalog.json)) {
+          Write-Host "Downloaded existing catalog.json for merge."
+        } else {
+          if (Test-Path .\artifacts\existing-catalog.json) { Remove-Item .\artifacts\existing-catalog.json -Force }
+          Write-Host "No existing catalog found on B2, will create fresh catalog."
+        }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/push.yml around lines 248 - 254, The try/catch around the
b2 call won't catch non-terminating failures; instead run the b2
download-file-by-name command, then check $LASTEXITCODE to determine success,
log "Downloaded existing catalog.json for merge." only on success and log "No
existing catalog found on B2, will create fresh catalog." on failure; use
Test-Path '.\artifacts\existing-catalog.json' only after confirming
$LASTEXITCODE was 0 (or explicitly remove any stale/partial file on failure) and
then call b2 account clear regardless. Ensure you reference the b2
download-file-by-name invocation, $LASTEXITCODE, Test-Path
'.\artifacts\existing-catalog.json', and b2 account clear when making the
changes.

Comment on lines +270 to +271
Write-Host "PREV_VERSION=$prevVersion" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
Write-Host "PREV_STANDALONE_DIR=artifacts\prev-standalone" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append


⚠️ Potential issue | 🔴 Critical

Critical: Write-Host piped to Out-File writes nothing to $GITHUB_ENV.

Write-Host writes to the information stream and emits no objects to the success pipeline, so Out-File receives nothing and PREV_VERSION / PREV_STANDALONE_DIR are never exported. Subsequent steps will see $env:PREV_VERSION as empty, the if ($env:PREV_VERSION -and ...) guard at line 294 fails, and step packages will silently never be generated — the entire chain-update feature regresses to fallback-only. The same idiom is used correctly elsewhere in this file (e.g., line 44).

🛠️ Proposed fix
-            Write-Host "PREV_VERSION=$prevVersion" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
-            Write-Host "PREV_STANDALONE_DIR=artifacts\prev-standalone" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
+            "PREV_VERSION=$prevVersion" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
+            "PREV_STANDALONE_DIR=artifacts\prev-standalone" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
+            Write-Host "Exported PREV_VERSION=$prevVersion and PREV_STANDALONE_DIR."
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
-Write-Host "PREV_VERSION=$prevVersion" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
-Write-Host "PREV_STANDALONE_DIR=artifacts\prev-standalone" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
+"PREV_VERSION=$prevVersion" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
+"PREV_STANDALONE_DIR=artifacts\prev-standalone" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
+Write-Host "Exported PREV_VERSION=$prevVersion and PREV_STANDALONE_DIR."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/push.yml around lines 270 - 271, The current steps use
Write-Host piped to Out-File which emits nothing to the pipeline, so
PREV_VERSION and PREV_STANDALONE_DIR are not written to $env:GITHUB_ENV; replace
those lines to write directly to the GitHub env file (for example use
Add-Content -Path $env:GITHUB_ENV -Value "PREV_VERSION=$prevVersion" and
Add-Content -Path $env:GITHUB_ENV -Value
"PREV_STANDALONE_DIR=artifacts\prev-standalone") so the variables PREV_VERSION
and PREV_STANDALONE_DIR are actually exported into the environment for later
steps.
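The underlying mechanism is the same in any shell: GitHub Actions exports variables by appending plain `KEY=VALUE` lines to the file whose path is in `GITHUB_ENV`. A minimal bash sketch, where the temp file stands in for the runner-provided path and the values are made up:

```shell
# GitHub Actions reads this file after the step and exports each line as an
# environment variable for subsequent steps; nothing else is required.
GITHUB_ENV=$(mktemp)   # stand-in for the path the runner injects
echo "PREV_VERSION=1.2.3" >> "$GITHUB_ENV"
echo "PREV_STANDALONE_DIR=artifacts/prev-standalone" >> "$GITHUB_ENV"
cat "$GITHUB_ENV"
```

The PowerShell bug above is exactly a violation of this contract: `Write-Host` prints to the console instead of producing pipeline output, so nothing reaches the file.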

Comment on lines +203 to +226
static List<UpdateCatalogEntry> PruneCatalogEntries(
    List<UpdateCatalogEntry> entries,
    string latestVersion,
    string? minRequiredVersion,
    string? prevVersion,
    string? anchorVersion)
{
    const int MaxEntries = 20;

    var latest = entries.Where(e =>
            string.Equals(e.TargetVersion, latestVersion, StringComparison.OrdinalIgnoreCase))
        .ToList();

    var rest = entries.Where(e =>
            !string.Equals(e.TargetVersion, latestVersion, StringComparison.OrdinalIgnoreCase))
        .OrderByDescending(e => e.IsCumulative)
        .ThenByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
        .ToList();

    var result = latest;
    result.AddRange(rest.Take(MaxEntries - latest.Count));

    return result;
}


⚠️ Potential issue | 🟠 Major

PruneCatalogEntries ignores three of its parameters and doesn't enforce the documented retention policy.

minRequiredVersion, prevVersion, and anchorVersion are accepted but never referenced in the body; the actual policy is just "keep all entries targeting latestVersion, then top-up to 20 by (cumulative-first, then version)". That doesn't match the PR objective:

Define retention/prune policy: keep current and previous anchor windows, generate cumulative packages after thresholds (e.g., 5 steps or size ratio), and only delete packages not on any supported path to latestVersion.

Concrete risks of the current implementation:

  • An entry with MinSourceVersion < minRequiredVersion may still be retained, contradicting the catalog's own MinRequiredVersion invariant.
  • Old anchor windows aren't preferentially kept; once 17 newer non-cumulative entries accumulate, the previous anchor can be evicted, breaking longer chains.
  • Step entries that bridge two anchors can be evicted while the anchors survive, creating gaps in PlanUpdatePath.

Either implement reachability/anchor-aware retention using the parameters you've already declared, or remove the dead parameters and document the simplified policy explicitly so reviewers don't expect more.

🛠️ Sketch — anchor-aware retention
 static List<UpdateCatalogEntry> PruneCatalogEntries(
     List<UpdateCatalogEntry> entries,
     string latestVersion,
     string? minRequiredVersion,
     string? prevVersion,
     string? anchorVersion)
 {
     const int MaxEntries = 20;

+    Version? minVer = null;
+    if (!string.IsNullOrWhiteSpace(minRequiredVersion))
+    {
+        Version.TryParse(minRequiredVersion, out minVer);
+    }
+
+    // Drop entries whose source range is entirely below minRequiredVersion.
+    if (minVer is not null)
+    {
+        entries = entries
+            .Where(e => !Version.TryParse(e.MaxSourceVersion ?? e.MinSourceVersion, out var max) || max >= minVer)
+            .ToList();
+    }
+
     var latest = entries.Where(e =>
             string.Equals(e.TargetVersion, latestVersion, StringComparison.OrdinalIgnoreCase))
         .ToList();

     var rest = entries.Where(e =>
             !string.Equals(e.TargetVersion, latestVersion, StringComparison.OrdinalIgnoreCase))
-        .OrderByDescending(e => e.IsCumulative)
+        .OrderByDescending(e => e.IsAnchor)             // keep anchors first
+        .ThenByDescending(e => e.IsCumulative)
         .ThenByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
         .ToList();

     var result = latest;
     result.AddRange(rest.Take(MaxEntries - latest.Count));
     return result;
 }

Comment on lines +228 to 276
static (byte[] PackageBytes, string Checksum) BuildPackageFile(
    IReadOnlyList<UpdateFileContent> files,
    string fromVersion,
    string toVersion,
    string? minSourceVersion,
    bool isCumulative,
    bool isAnchor,
    int chainDepth)
{
    var manifestFiles = files
        .Select(file => new UpdateFile
        {
            RelativePath = file.RelativePath,
            NewChecksum = ComputeSha512(file.Content)
        })
        .ToList();

    var tempManifest = new UpdateManifest
    {
        TargetVersion = toVersion,
        MinSourceVersion = fromVersion,
        MaxSourceVersion = isCumulative ? null : fromVersion,
        IsAnchor = isAnchor,
        IsCumulative = isCumulative,
        ChainDepth = chainDepth,
        Files = manifestFiles,
        Checksum = string.Empty,
        CreatedAt = DateTime.UtcNow
    };

    var tempBytes = CreatePackage(tempManifest, files);
    var finalChecksum = ComputeSha512(tempBytes);

    var finalManifest = new UpdateManifest
    {
        TargetVersion = toVersion,
        MinSourceVersion = fromVersion,
        MaxSourceVersion = isCumulative ? null : fromVersion,
        IsAnchor = isAnchor,
        IsCumulative = isCumulative,
        ChainDepth = chainDepth,
        Files = manifestFiles,
        Checksum = finalChecksum,
        CreatedAt = DateTime.UtcNow
    };

    var finalBytes = CreatePackage(finalManifest, files);
    return (finalBytes, finalChecksum);
}


⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Confirm runtime never reads manifest.Checksum (manifest.json is skipped during extraction).
rg -nP -C3 'manifest\.json|manifest\.Checksum|UpdateManifest' --type=cs

Repository: ModerRAS/TelegramSearchBot

Length of output: 3383


🏁 Script executed:

rg -nP 'BuildPackageFile' --type=cs -A2 -B2

Repository: ModerRAS/TelegramSearchBot

Length of output: 2179


Critical: stored PackageChecksum will never match the bytes written to disk — runtime verification will always fail.

BuildPackageFile builds the package twice:

  1. tempBytes = CreatePackage(tempManifest, files) with tempManifest.Checksum = string.Empty.
  2. finalChecksum = ComputeSha512(tempBytes).
  3. finalManifest.Checksum = finalChecksum, then finalBytes = CreatePackage(finalManifest, files).
  4. Returns (finalBytes, finalChecksum).

The returned finalChecksum is SHA512(tempBytes), but the bytes written to disk are finalBytes. Because manifest.json is embedded inside the tar package, and the manifest's Checksum field changed from "" to a 128-char hex string, finalBytes ≠ tempBytes and therefore SHA512(finalBytes) ≠ finalChecksum.

At runtime, SelfUpdateBootstrap.Windows.cs::VerifyPackageChecksum hashes the downloaded bytes (= finalBytes) and compares against the stored PackageChecksum (= finalChecksum). The check will fail for every package, breaking auto-updates entirely.

Furthermore, ExtractPackageToDirectory explicitly skips manifest.json at line 300, so embedding a self-hash in the manifest has no consumer — the second build pass is wasted work. Also, the minSourceVersion parameter is never used inside the method; the manifest uses fromVersion instead.

🛠️ Suggested fix — single-build, hash the on-disk bytes
-static (byte[] PackageBytes, string Checksum) BuildPackageFile(
-    IReadOnlyList<UpdateFileContent> files,
-    string fromVersion,
-    string toVersion,
-    string? minSourceVersion,
-    bool isCumulative,
-    bool isAnchor,
-    int chainDepth)
-{
-    var manifestFiles = files
-        .Select(file => new UpdateFile
-        {
-            RelativePath = file.RelativePath,
-            NewChecksum = ComputeSha512(file.Content)
-        })
-        .ToList();
-
-    var tempManifest = new UpdateManifest
-    {
-        TargetVersion = toVersion,
-        MinSourceVersion = fromVersion,
-        MaxSourceVersion = isCumulative ? null : fromVersion,
-        IsAnchor = isAnchor,
-        IsCumulative = isCumulative,
-        ChainDepth = chainDepth,
-        Files = manifestFiles,
-        Checksum = string.Empty,
-        CreatedAt = DateTime.UtcNow
-    };
-
-    var tempBytes = CreatePackage(tempManifest, files);
-    var finalChecksum = ComputeSha512(tempBytes);
-
-    var finalManifest = new UpdateManifest
-    {
-        TargetVersion = toVersion,
-        MinSourceVersion = fromVersion,
-        MaxSourceVersion = isCumulative ? null : fromVersion,
-        IsAnchor = isAnchor,
-        IsCumulative = isCumulative,
-        ChainDepth = chainDepth,
-        Files = manifestFiles,
-        Checksum = finalChecksum,
-        CreatedAt = DateTime.UtcNow
-    };
-
-    var finalBytes = CreatePackage(finalManifest, files);
-    return (finalBytes, finalChecksum);
-}
+static (byte[] PackageBytes, string Checksum) BuildPackageFile(
+    IReadOnlyList<UpdateFileContent> files,
+    string fromVersion,
+    string toVersion,
+    bool isCumulative,
+    bool isAnchor,
+    int chainDepth)
+{
+    var manifestFiles = files
+        .Select(file => new UpdateFile
+        {
+            RelativePath = file.RelativePath,
+            NewChecksum = ComputeSha512(file.Content)
+        })
+        .ToList();
+
+    var manifest = new UpdateManifest
+    {
+        TargetVersion = toVersion,
+        MinSourceVersion = fromVersion,
+        MaxSourceVersion = isCumulative ? null : fromVersion,
+        IsAnchor = isAnchor,
+        IsCumulative = isCumulative,
+        ChainDepth = chainDepth,
+        Files = manifestFiles,
+        // Per-package self-hash isn't readable by the runtime (manifest.json is skipped on extract);
+        // the catalog's PackageChecksum is the source of truth.
+        Checksum = string.Empty,
+        CreatedAt = DateTime.UtcNow
+    };
+
+    var bytes = CreatePackage(manifest, files);
+    var checksum = ComputeSha512(bytes);
+    return (bytes, checksum);
+}

(Update the three call sites to drop the now-unused minSourceVersion argument.)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@TelegramSearchBot.UpdateBuilder/Program.cs` around lines 228 - 276,
BuildPackageFile currently builds the package twice and computes the checksum
over the first build (tempBytes) while returning the second build (finalBytes),
so the stored PackageChecksum will never match the bytes written to disk; also
minSourceVersion is unused. Fix by building the package only once: create the
manifest with Checksum = string.Empty (use minSourceVersion for MinSourceVersion
instead of fromVersion), call CreatePackage once to produce packageBytes (drop
the second CreatePackage/finalManifest), compute finalChecksum =
ComputeSha512(packageBytes), and return (packageBytes, finalChecksum); update
any call sites to remove the now-unused minSourceVersion parameter if you change
the signature, and keep references to
tempManifest/tempBytes/finalChecksum/finalBytes to locate the changes.
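The failure mode generalizes beyond this codebase: any archive that embeds its own checksum cannot have that checksum computed from an earlier build of the archive. A hedged Python toy (where `build_package` merely stands in for CreatePackage) shows why the hash taken before embedding can never match the final bytes:

```python
import hashlib
import json

def build_package(manifest: dict) -> bytes:
    # Stand-in for CreatePackage: serializing the manifest is part of the package,
    # so any change to the manifest changes the package bytes.
    return json.dumps(manifest, sort_keys=True).encode()

# First build with an empty checksum, hash it (this is what gets stored).
temp = build_package({"version": "2.0", "checksum": ""})
stored = hashlib.sha512(temp).hexdigest()

# Second build embeds the hash, producing different bytes (these go to disk).
final = build_package({"version": "2.0", "checksum": stored})

# The stored checksum describes `temp`, not `final` — verification must fail.
mismatch = hashlib.sha512(final).hexdigest() != stored
```

Hashing the bytes actually written (the single-build approach in the suggested fix) removes the self-reference entirely.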

Comment on lines +325 to +346
static List<UpdateFileContent> ComputeChangedFiles(
    List<UpdateFileContent> prevFiles,
    List<UpdateFileContent> currentFiles)
{
    var prevMap = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
    foreach (var f in prevFiles)
    {
        prevMap[f.RelativePath] = ComputeSha512(f.Content);
    }

    var changed = new List<UpdateFileContent>();
    foreach (var f in currentFiles)
    {
        var currentHash = ComputeSha512(f.Content);
        if (!prevMap.TryGetValue(f.RelativePath, out var prevHash) || prevHash != currentHash)
        {
            changed.Add(f);
        }
    }

    return changed;
}


⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🌐 Web query:

Does the Moder.Update updater (https://github.com/ModerRAS/Moder.Update) merge staging files into the target directory or replace it wholesale, and does it support file deletions in incremental packages?

💡 Result:

The Moder.Update updater replaces files in the target directory atomically using the Windows ReplaceFile API rather than merging staging files. There is no evidence it supports file deletions in incremental packages; packages contain complete application files for replacement ("完整替换文件(非差分)", i.e., full files rather than diffs), and there is no mention of deletion handling in manifests or FileReplacementService.



🏁 Script executed:

# Search for PrepareUpdateChainAsync and staging directory references
rg "PrepareUpdateChainAsync|staging" --type cs -B 2 -A 5

Repository: ModerRAS/TelegramSearchBot

Length of output: 6109


🏁 Script executed:

# Check if UpdateManifest has any removal/deletion semantics
rg "UpdateManifest|RemovedPaths|DeletedFiles" --type cs -B 2 -A 3

Repository: ModerRAS/TelegramSearchBot

Length of output: 1923


🏁 Script executed:

# Verify how ComputeChangedFiles is used and whether step packages are actually used
rg "ComputeChangedFiles" --type cs -B 3 -A 3

Repository: ModerRAS/TelegramSearchBot

Length of output: 1723


🏁 Script executed:

# Check for incremental vs cumulative package logic
rg "step.*package|incremental|cumulative" --type cs -i

Repository: ModerRAS/TelegramSearchBot

Length of output: 3837


🏁 Script executed:

cat -n TelegramSearchBot.Common/Model/Update/UpdateManifest.cs

Repository: ModerRAS/TelegramSearchBot

Length of output: 702


🏁 Script executed:

cat -n TelegramSearchBot/Service/Update/SelfUpdateBootstrap.Windows.cs | sed -n '/ExtractPackageToDirectory/,/^}/p' | head -30

Repository: ModerRAS/TelegramSearchBot

Length of output: 1615


🏁 Script executed:

# Check if there's any manifest loading or manifest file in the package
rg "manifest|DeletedFiles|RemovedPaths" TelegramSearchBot/Service/Update/ --type cs -B 2 -A 2

Repository: ModerRAS/TelegramSearchBot

Length of output: 651


🏁 Script executed:

# Check ResetDirectory implementation
rg "ResetDirectory" --type cs -B 2 -A 5

Repository: ModerRAS/TelegramSearchBot

Length of output: 1645


🏁 Script executed:

# Verify cumulative package includes all files from anchor
sed -n '75,130p' TelegramSearchBot.UpdateBuilder/Program.cs | grep -A 10 "changedFromAnchor = ComputeChangedFiles"

Repository: ModerRAS/TelegramSearchBot

Length of output: 839


Step packages cannot represent file deletions and will leave stale binaries in incremental update chains.

ComputeChangedFiles only emits files that are new or whose content changed in currentFiles. Files that existed in prevFiles but were removed are silently dropped. Since tar extraction is additive (files not in the archive remain on disk), a release that removes any file will cause step-applied installs to retain orphaned binaries alongside the new manifest.

Example: if v1.0→v2.0 adds fileA, and v2.0→v3.0 removes it, extracting both step packages into the shared staging directory leaves fileA present in the final install even though v3.0 should not contain it.

Implement the proposed guard to detect deletions and fall back to cumulative packages, which correctly represent the final state:

Deletion detection fix

```diff
 static List<UpdateFileContent> ComputeChangedFiles(
     List<UpdateFileContent> prevFiles,
     List<UpdateFileContent> currentFiles)
 {
     var prevMap = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
     foreach (var f in prevFiles)
     {
         prevMap[f.RelativePath] = ComputeSha512(f.Content);
     }
+
+    var currentPaths = new HashSet<string>(
+        currentFiles.Select(f => f.RelativePath), StringComparer.OrdinalIgnoreCase);
+    var hasDeletions = prevMap.Keys.Any(p => !currentPaths.Contains(p));
+    if (hasDeletions)
+    {
+        // Step packages can't represent removals; signal caller to fall back to cumulative.
+        return [];
+    }

     var changed = new List<UpdateFileContent>();
     foreach (var f in currentFiles)
     {
         var currentHash = ComputeSha512(f.Content);
         if (!prevMap.TryGetValue(f.RelativePath, out var prevHash) || prevHash != currentHash)
         {
             changed.Add(f);
         }
     }

     return changed;
 }
```
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change

Before:

```csharp
static List<UpdateFileContent> ComputeChangedFiles(
    List<UpdateFileContent> prevFiles,
    List<UpdateFileContent> currentFiles)
{
    var prevMap = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
    foreach (var f in prevFiles)
    {
        prevMap[f.RelativePath] = ComputeSha512(f.Content);
    }
    var changed = new List<UpdateFileContent>();
    foreach (var f in currentFiles)
    {
        var currentHash = ComputeSha512(f.Content);
        if (!prevMap.TryGetValue(f.RelativePath, out var prevHash) || prevHash != currentHash)
        {
            changed.Add(f);
        }
    }
    return changed;
}
```

After:

```csharp
static List<UpdateFileContent> ComputeChangedFiles(
    List<UpdateFileContent> prevFiles,
    List<UpdateFileContent> currentFiles)
{
    var prevMap = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
    foreach (var f in prevFiles)
    {
        prevMap[f.RelativePath] = ComputeSha512(f.Content);
    }
    var currentPaths = new HashSet<string>(
        currentFiles.Select(f => f.RelativePath), StringComparer.OrdinalIgnoreCase);
    var hasDeletions = prevMap.Keys.Any(p => !currentPaths.Contains(p));
    if (hasDeletions)
    {
        // Step packages can't represent removals; signal caller to fall back to cumulative.
        return [];
    }
    var changed = new List<UpdateFileContent>();
    foreach (var f in currentFiles)
    {
        var currentHash = ComputeSha512(f.Content);
        if (!prevMap.TryGetValue(f.RelativePath, out var prevHash) || prevHash != currentHash)
        {
            changed.Add(f);
        }
    }
    return changed;
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@TelegramSearchBot.UpdateBuilder/Program.cs` around lines 325 - 346,
ComputeChangedFiles currently ignores deletions, causing stale files to persist;
modify ComputeChangedFiles to detect any path present in prevFiles but missing
from currentFiles and signal a fallback to cumulative packages by returning null
(or another sentinel) instead of a partial changed list. Implement by building
prevMap as now, also build a HashSet<string> of current RelativePath values, and
after computing changed add a check: if any key in prevMap is not in the current
set then return null; otherwise return the changed list as before; keep the
function name ComputeChangedFiles and update any callers to handle the null
sentinel as "deletion detected -> use cumulative package" behavior.
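Whichever sentinel is chosen (empty list or null), the caller has to branch on it explicitly. A minimal sketch of that fallback, in Python for brevity (function and variable names are hypothetical, not the repository's API):

```python
def compute_changed_files(prev_files: dict, current_files: dict):
    """Return changed/new files, or None when any file was deleted
    (a step package cannot express removals)."""
    deleted = set(prev_files) - set(current_files)
    if deleted:
        return None  # sentinel: caller must fall back to a cumulative package
    return {path: data for path, data in current_files.items()
            if prev_files.get(path) != data}

def build_package(prev_files: dict, current_files: dict) -> tuple[str, dict]:
    changed = compute_changed_files(prev_files, current_files)
    if changed is None:
        # Cumulative package carries the full final state; deletions are
        # represented implicitly by absence after a clean directory reset.
        return ("cumulative", dict(current_files))
    return ("step", changed)

kind, files = build_package({"a": 1, "b": 2}, {"a": 1})  # "b" was deleted
print(kind)  # cumulative
```

The key design point is that the sentinel must be unambiguous: an empty step package (no changes) and "deletions detected" are different outcomes, which is why `None` (or null in C#) is safer than an empty list.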

Comment on lines +141 to +155

```csharp
var directToTarget = candidates.FirstOrDefault(e =>
    Version.TryParse(e.TargetVersion, out var tv) && tv >= targetVersion);

if (directToTarget is not null) {
    bestPick = directToTarget;
} else {
    var cumulative = candidates
        .Where(e => e.IsCumulative)
        .OrderByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
        .FirstOrDefault();

    bestPick = cumulative ?? candidates
        .OrderByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
        .First();
}
```
⚠️ Potential issue | 🟡 Minor

Direct-to-target selection ignores the "prefer cumulative" rule.

When multiple candidates can reach targetVersion in one hop (e.g., a step entry and a cumulative entry both directly reach targetVersion), FirstOrDefault picks whichever the HashSet enumerates first — order is not guaranteed. This contradicts the PR objective of preferring cumulative packages and can yield non-deterministic plans across runs. Apply the same cumulative-first ordering to direct-to-target candidates.
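The fix amounts to a single deterministic sort key: direct-to-target first, cumulative second, highest version last. A Python stand-in for the LINQ ordering (entries and version tuples are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Entry:
    target: tuple      # parsed TargetVersion, e.g. (3, 0)
    cumulative: bool   # IsCumulative

def pick(candidates, target_version):
    # Deterministic composite key: reaches target, then cumulative, then version.
    return max(candidates, key=lambda e: (e.target >= target_version,
                                          e.cumulative,
                                          e.target))

# A step entry and a cumulative entry both reach the target directly.
cands = [Entry((3, 0), False), Entry((3, 0), True), Entry((2, 5), True)]
best = pick(cands, (3, 0))
print(best.cumulative)  # True: the cumulative entry wins among direct hits
```

Because the key is total over the relevant fields, the result no longer depends on the enumeration order of the candidate set.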

🛠️ Proposed fix

```diff
-            UpdateCatalogEntry bestPick;
-            var directToTarget = candidates.FirstOrDefault(e =>
-                Version.TryParse(e.TargetVersion, out var tv) && tv >= targetVersion);
-
-            if (directToTarget is not null) {
-                bestPick = directToTarget;
-            } else {
-                var cumulative = candidates
-                    .Where(e => e.IsCumulative)
-                    .OrderByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
-                    .FirstOrDefault();
-
-                bestPick = cumulative ?? candidates
-                    .OrderByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
-                    .First();
-            }
+            var ordered = candidates
+                .OrderByDescending(e =>
+                    Version.TryParse(e.TargetVersion, out var tv) && tv >= targetVersion)
+                .ThenByDescending(e => e.IsCumulative)
+                .ThenByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
+                .ToList();
+            var bestPick = ordered.First();
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@TelegramSearchBot/Service/Update/SelfUpdateBootstrap.Windows.cs` around lines
141 - 155, The direct-to-target selection currently picks the first matching
element from candidates (via directToTarget) which can be nondeterministic and
ignores the "prefer cumulative" rule; change the selection to filter candidates
that parse to >= targetVersion and then OrderByDescending(e => e.IsCumulative)
(and then OrderByDescending by parsed TargetVersion as a tie-breaker) before
taking FirstOrDefault so cumulative entries are preferred deterministically;
update the directToTarget assignment (and keep the Version.TryParse logic used
elsewhere) to use this ordered LINQ query and then assign bestPick accordingly.

@ModerRAS ModerRAS merged commit 20acf24 into master Apr 25, 2026
18 checks passed
@ModerRAS ModerRAS deleted the refactor/moder-update-incremental-chain branch April 25, 2026 03:34


Development

Successfully merging this pull request may close these issues.

refactor: realign update flow with Moder.Update incremental chain model

1 participant