Conversation
- Extract shared UpdateCatalog/UpdateCatalogEntry/UpdateManifest/UpdateFile models to TelegramSearchBot.Common/Model/Update/
- Rewrite UpdateBuilder to generate step packages (changed files only) and cumulative/anchor packages, and merge with the existing catalog
- Rewrite the SelfUpdateBootstrap runtime with PlanUpdatePath for chain-based upgrade path planning, cumulative preference, and sequential multi-package application
- Update push.yml to fetch the existing catalog for merge, download the previous release standalone for step-package comparison, and replace the file-based prune with a catalog-reachability-based prune
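The PlanUpdatePath chain planning described above can be sketched in miniature. This is a hypothetical Python model of the idea; the field names, version tuples, and the greedy cumulative-first rule are illustrative assumptions, not the actual C# implementation:

```python
# Hypothetical model of chain-based upgrade path planning: from the
# installed version, repeatedly apply the catalog entry that reaches the
# highest target (preferring cumulative packages) until latest is reached.
def plan_update_path(catalog, installed, latest):
    """Return entries to apply in order, or None when the chain is broken."""
    path, current = [], installed
    while current < latest:
        applicable = [e for e in catalog
                      if e["min_source"] <= current
                      and (e["max_source"] is None or current <= e["max_source"])
                      and e["target"] > current]
        if not applicable:
            return None  # no path; caller falls back to a full reinstall
        # Prefer cumulative entries, then the highest reachable target.
        best = max(applicable, key=lambda e: (e["cumulative"], e["target"]))
        path.append(best)
        current = best["target"]
    return path

catalog = [
    {"min_source": (1, 0), "max_source": (1, 0), "target": (1, 1), "cumulative": False},
    {"min_source": (1, 1), "max_source": (1, 1), "target": (1, 2), "cumulative": False},
    {"min_source": (1, 0), "max_source": None,   "target": (1, 2), "cumulative": True},
]
print([e["target"] for e in plan_update_path(catalog, (1, 0), (1, 2))])  # → [(1, 2)]
```

From 1.0 the planner takes the cumulative package in a single hop rather than chaining two step packages, matching the cumulative-preference rule above.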
📝 Walkthrough
This PR refactors the update system to implement a multi-step incremental chain model based on the referenced issue.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: 4 passed, 1 failed (1 warning)
🔍 PR Check Report
📋 Overview
🧪 Test results
📊 Code quality
📁 Test artifacts
🔗 Related links
This report was generated automatically by GitHub Actions.
Actionable comments posted: 6
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/workflows/push.yml:
- Around line 248-254: The try/catch around the b2 call won't catch
non-terminating failures; instead run the b2 download-file-by-name command, then
check $LASTEXITCODE to determine success, log "Downloaded existing catalog.json
for merge." only on success and log "No existing catalog found on B2, will
create fresh catalog." on failure; use Test-Path
'.\artifacts\existing-catalog.json' only after confirming $LASTEXITCODE was 0
(or explicitly remove any stale/partial file on failure) and then call b2
account clear regardless. Ensure you reference the b2 download-file-by-name
invocation, $LASTEXITCODE, Test-Path '.\artifacts\existing-catalog.json', and b2
account clear when making the changes.
- Around line 270-271: The current steps use Write-Host piped to Out-File which
emits nothing to the pipeline, so PREV_VERSION and PREV_STANDALONE_DIR are not
written to $env:GITHUB_ENV; replace those lines to write directly to the GitHub
env file (for example use Add-Content -Path $env:GITHUB_ENV -Value
"PREV_VERSION=$prevVersion" and Add-Content -Path $env:GITHUB_ENV -Value
"PREV_STANDALONE_DIR=artifacts\prev-standalone") so the variables PREV_VERSION
and PREV_STANDALONE_DIR are actually exported into the environment for later
steps.
In `@TelegramSearchBot.UpdateBuilder/Program.cs`:
- Around line 228-276: BuildPackageFile currently builds the package twice and
computes the checksum over the first build (tempBytes) while returning the
second build (finalBytes), so the stored PackageChecksum will never match the
bytes written to disk; also minSourceVersion is unused. Fix by building the
package only once: create the manifest with Checksum = string.Empty (use
minSourceVersion for MinSourceVersion instead of fromVersion), call
CreatePackage once to produce packageBytes (drop the second
CreatePackage/finalManifest), compute finalChecksum =
ComputeSha512(packageBytes), and return (packageBytes, finalChecksum); update
any call sites to remove the now-unused minSourceVersion parameter if you change
the signature, and keep references to
tempManifest/tempBytes/finalChecksum/finalBytes to locate the changes.
- Around line 325-346: ComputeChangedFiles currently ignores deletions, causing
stale files to persist; modify ComputeChangedFiles to detect any path present in
prevFiles but missing from currentFiles and signal a fallback to cumulative
packages by returning null (or another sentinel) instead of a partial changed
list. Implement by building prevMap as now, also build a HashSet<string> of
current RelativePath values, and after computing changed add a check: if any key
in prevMap is not in the current set then return null; otherwise return the
changed list as before; keep the function name ComputeChangedFiles and update
any callers to handle the null sentinel as "deletion detected -> use cumulative
package" behavior.
In `@TelegramSearchBot/Service/Update/SelfUpdateBootstrap.Windows.cs`:
- Around line 141-155: The direct-to-target selection currently picks the first
matching element from candidates (via directToTarget) which can be
nondeterministic and ignores the "prefer cumulative" rule; change the selection
to filter candidates that parse to >= targetVersion and then OrderByDescending(e
=> e.IsCumulative) (and then OrderByDescending by parsed TargetVersion as a
tie-breaker) before taking FirstOrDefault so cumulative entries are preferred
deterministically; update the directToTarget assignment (and keep the
Version.TryParse logic used elsewhere) to use this ordered LINQ query and then
assign bestPick accordingly.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: b68368a0-0496-4cbd-84fb-aacbf6c38cf9
📒 Files selected for processing (8)
- .github/workflows/push.yml
- TelegramSearchBot.Common/Model/Update/UpdateCatalog.cs
- TelegramSearchBot.Common/Model/Update/UpdateCatalogEntry.cs
- TelegramSearchBot.Common/Model/Update/UpdateFile.cs
- TelegramSearchBot.Common/Model/Update/UpdateManifest.cs
- TelegramSearchBot.UpdateBuilder/Program.cs
- TelegramSearchBot.UpdateBuilder/TelegramSearchBot.UpdateBuilder.csproj
- TelegramSearchBot/Service/Update/SelfUpdateBootstrap.Windows.cs
try {
  b2 download-file-by-name $env:B2_BUCKET TelegramSearchBot/catalog.json .\artifacts\existing-catalog.json --quiet
  Write-Host "Downloaded existing catalog.json for merge."
} catch {
  Write-Host "No existing catalog found on B2, will create fresh catalog."
}
b2 account clear
try/catch won't trap b2 failures; native commands return exit codes, not terminating errors.
If b2 download-file-by-name fails (e.g., 404 because the catalog doesn't exist yet), the catch block won't run; the step continues, and the later Test-Path '.\artifacts\existing-catalog.json' may pass against a stale or partial file (b2 sometimes touches the path before failing). Gate on $LASTEXITCODE instead so the "no catalog" log message accurately reflects what happened.
🛠️ Suggested fix
- try {
- b2 download-file-by-name $env:B2_BUCKET TelegramSearchBot/catalog.json .\artifacts\existing-catalog.json --quiet
- Write-Host "Downloaded existing catalog.json for merge."
- } catch {
- Write-Host "No existing catalog found on B2, will create fresh catalog."
- }
+ b2 download-file-by-name $env:B2_BUCKET TelegramSearchBot/catalog.json .\artifacts\existing-catalog.json --quiet 2>$null
+ if ($LASTEXITCODE -eq 0 -and (Test-Path .\artifacts\existing-catalog.json)) {
+ Write-Host "Downloaded existing catalog.json for merge."
+ } else {
+ if (Test-Path .\artifacts\existing-catalog.json) { Remove-Item .\artifacts\existing-catalog.json -Force }
+ Write-Host "No existing catalog found on B2, will create fresh catalog."
+ }
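The pitfall above is not PowerShell-specific: external commands report failure through exit codes, not exceptions. A minimal Python analogue of the exit-code-gated pattern; the command and log messages are illustrative stand-ins for the real b2 step:

```python
import subprocess

def try_download(cmd):
    # External tools signal failure via an exit code; subprocess.run()
    # returns normally even when the command fails, just as a native
    # command inside a PowerShell try/catch does not throw.
    return subprocess.run(cmd).returncode == 0

# Simulate a failing download (exit code 1):
if try_download(["sh", "-c", "exit 1"]):
    print("Downloaded existing catalog.json for merge.")
else:
    print("No existing catalog found, will create fresh catalog.")
```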
Write-Host "PREV_VERSION=$prevVersion" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
Write-Host "PREV_STANDALONE_DIR=artifacts\prev-standalone" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
Critical: Write-Host piped to Out-File writes nothing to $GITHUB_ENV.
Write-Host writes to the information stream and emits no objects to the success pipeline, so Out-File receives nothing and PREV_VERSION / PREV_STANDALONE_DIR are never exported. Subsequent steps will see $env:PREV_VERSION as empty, the if ($env:PREV_VERSION -and ...) guard at line 294 fails, and step packages will silently never be generated — the entire chain-update feature regresses to fallback-only. The same idiom is used correctly elsewhere in this file (e.g., line 44).
🛠️ Proposed fix
- Write-Host "PREV_VERSION=$prevVersion" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
- Write-Host "PREV_STANDALONE_DIR=artifacts\prev-standalone" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
+ "PREV_VERSION=$prevVersion" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
+ "PREV_STANDALONE_DIR=artifacts\prev-standalone" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
+ Write-Host "Exported PREV_VERSION=$prevVersion and PREV_STANDALONE_DIR."

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
"PREV_VERSION=$prevVersion" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
"PREV_STANDALONE_DIR=artifacts\prev-standalone" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append
Write-Host "Exported PREV_VERSION=$prevVersion and PREV_STANDALONE_DIR."
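As background on why the fix works: $GITHUB_ENV names a plain text file, and appending NAME=value lines to that file is what exports variables to later workflow steps. A small Python simulation of the append; the temp file stands in for the real $GITHUB_ENV path:

```python
import os
import tempfile

# Stand-in for the file that $GITHUB_ENV points at.
fd, github_env = tempfile.mkstemp(suffix=".env")
os.close(fd)

prev_version = "1.2.3"  # illustrative value

# Appending NAME=value lines is the whole export mechanism.
with open(github_env, "a", encoding="utf-8") as f:
    f.write(f"PREV_VERSION={prev_version}\n")
    f.write("PREV_STANDALONE_DIR=artifacts\\prev-standalone\n")

with open(github_env, encoding="utf-8") as f:
    lines = f.read().splitlines()
print(lines)
```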
static List<UpdateCatalogEntry> PruneCatalogEntries(
    List<UpdateCatalogEntry> entries,
    string latestVersion,
    string? minRequiredVersion,
    string? prevVersion,
    string? anchorVersion)
{
    const int MaxEntries = 20;

    var latest = entries.Where(e =>
            string.Equals(e.TargetVersion, latestVersion, StringComparison.OrdinalIgnoreCase))
        .ToList();

    var rest = entries.Where(e =>
            !string.Equals(e.TargetVersion, latestVersion, StringComparison.OrdinalIgnoreCase))
        .OrderByDescending(e => e.IsCumulative)
        .ThenByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
        .ToList();

    var result = latest;
    result.AddRange(rest.Take(MaxEntries - latest.Count));

    return result;
}
PruneCatalogEntries ignores three of its parameters and doesn't enforce the documented retention policy.
minRequiredVersion, prevVersion, and anchorVersion are accepted but never referenced in the body; the actual policy is just "keep all entries targeting latestVersion, then top-up to 20 by (cumulative-first, then version)". That doesn't match the PR objective:
Define retention/prune policy: keep current and previous anchor windows, generate cumulative packages after thresholds (e.g., 5 steps or size ratio), and only delete packages not on any supported path to latestVersion.
Concrete risks of the current implementation:
- An entry with MinSourceVersion < minRequiredVersion may still be retained, contradicting the catalog's own MinRequiredVersion invariant.
- Old anchor windows aren't preferentially kept; once 17 newer non-cumulative entries accumulate, the previous anchor can be evicted, breaking longer chains.
- Step entries that bridge two anchors can be evicted while the anchors survive, creating gaps in PlanUpdatePath.
Either implement reachability/anchor-aware retention using the parameters you've already declared, or remove the dead parameters and document the simplified policy explicitly so reviewers don't expect more.
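The "only delete packages not on any supported path to latestVersion" objective can be expressed as a reverse-reachability pass over the catalog. A hypothetical Python sketch; entry fields and version tuples are illustrative, not the real UpdateCatalogEntry shape:

```python
# Keep an entry only if some chain of entries starting from it reaches
# latest: seed with entries that land on latest, then repeatedly add
# entries whose target can serve as the source of an already-kept hop.
def reachable_entries(entries, latest):
    kept = {i for i, e in enumerate(entries) if e["target"] == latest}
    changed = True
    while changed:
        changed = False
        for i, e in enumerate(entries):
            if i in kept:
                continue
            feeds_kept = any(
                entries[j]["min_source"] <= e["target"]
                and (entries[j]["max_source"] is None
                     or e["target"] <= entries[j]["max_source"])
                for j in kept)
            if feeds_kept:
                kept.add(i)
                changed = True
    return [entries[i] for i in sorted(kept)]

entries = [
    {"min_source": (1, 0), "max_source": (1, 0), "target": (1, 1)},  # on a path
    {"min_source": (1, 1), "max_source": (1, 1), "target": (2, 0)},  # on a path
    {"min_source": (0, 9), "max_source": (0, 9), "target": (1, 0)},  # feeds the chain
    {"min_source": (0, 1), "max_source": (0, 1), "target": (0, 2)},  # dead end: prunable
]
print(len(reachable_entries(entries, (2, 0))))  # → 3
```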
🛠️ Sketch — anchor-aware retention
static List<UpdateCatalogEntry> PruneCatalogEntries(
    List<UpdateCatalogEntry> entries,
    string latestVersion,
    string? minRequiredVersion,
    string? prevVersion,
    string? anchorVersion)
{
    const int MaxEntries = 20;
+   Version? minVer = null;
+   if (!string.IsNullOrWhiteSpace(minRequiredVersion))
+   {
+       Version.TryParse(minRequiredVersion, out minVer);
+   }
+
+   // Drop entries whose source range is entirely below minRequiredVersion.
+   if (minVer is not null)
+   {
+       entries = entries
+           .Where(e => !Version.TryParse(e.MaxSourceVersion ?? e.MinSourceVersion, out var max) || max >= minVer)
+           .ToList();
+   }
+
    var latest = entries.Where(e =>
            string.Equals(e.TargetVersion, latestVersion, StringComparison.OrdinalIgnoreCase))
        .ToList();
    var rest = entries.Where(e =>
            !string.Equals(e.TargetVersion, latestVersion, StringComparison.OrdinalIgnoreCase))
-       .OrderByDescending(e => e.IsCumulative)
+       .OrderByDescending(e => e.IsAnchor) // keep anchors first
+       .ThenByDescending(e => e.IsCumulative)
        .ThenByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
        .ToList();
    var result = latest;
    result.AddRange(rest.Take(MaxEntries - latest.Count));
    return result;
}

static (byte[] PackageBytes, string Checksum) BuildPackageFile(
    IReadOnlyList<UpdateFileContent> files,
    string fromVersion,
    string toVersion,
    string? minSourceVersion,
    bool isCumulative,
    bool isAnchor,
    int chainDepth)
{
    var manifestFiles = files
        .Select(file => new UpdateFile
        {
            RelativePath = file.RelativePath,
            NewChecksum = ComputeSha512(file.Content)
        })
        .ToList();

    var tempManifest = new UpdateManifest
    {
        TargetVersion = toVersion,
        MinSourceVersion = fromVersion,
        MaxSourceVersion = isCumulative ? null : fromVersion,
        IsAnchor = isAnchor,
        IsCumulative = isCumulative,
        ChainDepth = chainDepth,
        Files = manifestFiles,
        Checksum = string.Empty,
        CreatedAt = DateTime.UtcNow
    };

    var tempBytes = CreatePackage(tempManifest, files);
    var finalChecksum = ComputeSha512(tempBytes);

    var finalManifest = new UpdateManifest
    {
        TargetVersion = toVersion,
        MinSourceVersion = fromVersion,
        MaxSourceVersion = isCumulative ? null : fromVersion,
        IsAnchor = isAnchor,
        IsCumulative = isCumulative,
        ChainDepth = chainDepth,
        Files = manifestFiles,
        Checksum = finalChecksum,
        CreatedAt = DateTime.UtcNow
    };

    var finalBytes = CreatePackage(finalManifest, files);
    return (finalBytes, finalChecksum);
}
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Confirm runtime never reads manifest.Checksum (manifest.json is skipped during extraction).
rg -nP -C3 'manifest\.json|manifest\.Checksum|UpdateManifest' --type=cs

Repository: ModerRAS/TelegramSearchBot
Length of output: 3383

🏁 Script executed:
rg -nP 'BuildPackageFile' --type=cs -A2 -B2

Repository: ModerRAS/TelegramSearchBot
Length of output: 2179
Critical: stored PackageChecksum will never match the bytes written to disk — runtime verification will always fail.
BuildPackageFile builds the package twice:
1. tempBytes = CreatePackage(tempManifest, files) with tempManifest.Checksum = string.Empty.
2. finalChecksum = ComputeSha512(tempBytes).
3. finalManifest.Checksum = finalChecksum, then finalBytes = CreatePackage(finalManifest, files).
4. Returns (finalBytes, finalChecksum).
The returned finalChecksum is SHA512(tempBytes), but the bytes written to disk are finalBytes. Because manifest.json is embedded inside the tar package, and the manifest's Checksum field changed from "" to a 128-char hex string, finalBytes ≠ tempBytes and therefore SHA512(finalBytes) ≠ finalChecksum.
At runtime, SelfUpdateBootstrap.Windows.cs::VerifyPackageChecksum hashes the downloaded bytes (= finalBytes) and compares against the stored PackageChecksum (= finalChecksum). The check will fail for every package, breaking auto-updates entirely.
Furthermore, ExtractPackageToDirectory explicitly skips manifest.json at line 300, so embedding a self-hash in the manifest has no consumer — the second build pass is wasted work. Also, the minSourceVersion parameter is never used inside the method; the manifest uses fromVersion instead.
🛠️ Suggested fix — single-build, hash the on-disk bytes
-static (byte[] PackageBytes, string Checksum) BuildPackageFile(
- IReadOnlyList<UpdateFileContent> files,
- string fromVersion,
- string toVersion,
- string? minSourceVersion,
- bool isCumulative,
- bool isAnchor,
- int chainDepth)
-{
- var manifestFiles = files
- .Select(file => new UpdateFile
- {
- RelativePath = file.RelativePath,
- NewChecksum = ComputeSha512(file.Content)
- })
- .ToList();
-
- var tempManifest = new UpdateManifest
- {
- TargetVersion = toVersion,
- MinSourceVersion = fromVersion,
- MaxSourceVersion = isCumulative ? null : fromVersion,
- IsAnchor = isAnchor,
- IsCumulative = isCumulative,
- ChainDepth = chainDepth,
- Files = manifestFiles,
- Checksum = string.Empty,
- CreatedAt = DateTime.UtcNow
- };
-
- var tempBytes = CreatePackage(tempManifest, files);
- var finalChecksum = ComputeSha512(tempBytes);
-
- var finalManifest = new UpdateManifest
- {
- TargetVersion = toVersion,
- MinSourceVersion = fromVersion,
- MaxSourceVersion = isCumulative ? null : fromVersion,
- IsAnchor = isAnchor,
- IsCumulative = isCumulative,
- ChainDepth = chainDepth,
- Files = manifestFiles,
- Checksum = finalChecksum,
- CreatedAt = DateTime.UtcNow
- };
-
- var finalBytes = CreatePackage(finalManifest, files);
- return (finalBytes, finalChecksum);
-}
+static (byte[] PackageBytes, string Checksum) BuildPackageFile(
+ IReadOnlyList<UpdateFileContent> files,
+ string fromVersion,
+ string toVersion,
+ bool isCumulative,
+ bool isAnchor,
+ int chainDepth)
+{
+ var manifestFiles = files
+ .Select(file => new UpdateFile
+ {
+ RelativePath = file.RelativePath,
+ NewChecksum = ComputeSha512(file.Content)
+ })
+ .ToList();
+
+ var manifest = new UpdateManifest
+ {
+ TargetVersion = toVersion,
+ MinSourceVersion = fromVersion,
+ MaxSourceVersion = isCumulative ? null : fromVersion,
+ IsAnchor = isAnchor,
+ IsCumulative = isCumulative,
+ ChainDepth = chainDepth,
+ Files = manifestFiles,
+ // Per-package self-hash isn't readable by the runtime (manifest.json is skipped on extract);
+ // the catalog's PackageChecksum is the source of truth.
+ Checksum = string.Empty,
+ CreatedAt = DateTime.UtcNow
+ };
+
+ var bytes = CreatePackage(manifest, files);
+ var checksum = ComputeSha512(bytes);
+ return (bytes, checksum);
+}

(Update the three call sites to drop the now-unused minSourceVersion argument.)
static List<UpdateFileContent> ComputeChangedFiles(
    List<UpdateFileContent> prevFiles,
    List<UpdateFileContent> currentFiles)
{
    var prevMap = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
    foreach (var f in prevFiles)
    {
        prevMap[f.RelativePath] = ComputeSha512(f.Content);
    }

    var changed = new List<UpdateFileContent>();
    foreach (var f in currentFiles)
    {
        var currentHash = ComputeSha512(f.Content);
        if (!prevMap.TryGetValue(f.RelativePath, out var prevHash) || prevHash != currentHash)
        {
            changed.Add(f);
        }
    }

    return changed;
}
🧩 Analysis chain
🌐 Web query:
Does the Moder.Update updater (https://github.com/ModerRAS/Moder.Update) merge staging files into the target directory or replace it wholesale, and does it support file deletions in incremental packages?
💡 Result:
The Moder.Update updater replaces files in the target directory atomically using Windows ReplaceFile API rather than merging staging files. There is no evidence it supports file deletions in incremental packages; packages contain complete application files for replacement ("完整替换文件(非差分)"), and no mention of deletion handling in manifests or FileReplacementService.
🏁 Script executed:
# Search for PrepareUpdateChainAsync and staging directory references
rg "PrepareUpdateChainAsync|staging" --type cs -B 2 -A 5

Repository: ModerRAS/TelegramSearchBot
Length of output: 6109

🏁 Script executed:
# Check if UpdateManifest has any removal/deletion semantics
rg "UpdateManifest|RemovedPaths|DeletedFiles" --type cs -B 2 -A 3

Repository: ModerRAS/TelegramSearchBot
Length of output: 1923

🏁 Script executed:
# Verify how ComputeChangedFiles is used and whether step packages are actually used
rg "ComputeChangedFiles" --type cs -B 3 -A 3

Repository: ModerRAS/TelegramSearchBot
Length of output: 1723

🏁 Script executed:
# Check for incremental vs cumulative package logic
rg "step.*package|incremental|cumulative" --type cs -i

Repository: ModerRAS/TelegramSearchBot
Length of output: 3837

🏁 Script executed:
cat -n TelegramSearchBot.Common/Model/Update/UpdateManifest.cs

Repository: ModerRAS/TelegramSearchBot
Length of output: 702

🏁 Script executed:
cat -n TelegramSearchBot/Service/Update/SelfUpdateBootstrap.Windows.cs | sed -n '/ExtractPackageToDirectory/,/^}/p' | head -30

Repository: ModerRAS/TelegramSearchBot
Length of output: 1615

🏁 Script executed:
# Check if there's any manifest loading or manifest file in the package
rg "manifest|DeletedFiles|RemovedPaths" TelegramSearchBot/Service/Update/ --type cs -B 2 -A 2

Repository: ModerRAS/TelegramSearchBot
Length of output: 651

🏁 Script executed:
# Check ResetDirectory implementation
rg "ResetDirectory" --type cs -B 2 -A 5

Repository: ModerRAS/TelegramSearchBot
Length of output: 1645

🏁 Script executed:
# Verify cumulative package includes all files from anchor
sed -n '75,130p' TelegramSearchBot.UpdateBuilder/Program.cs | grep -A 10 "changedFromAnchor = ComputeChangedFiles"

Repository: ModerRAS/TelegramSearchBot
Length of output: 839
Step packages cannot represent file deletions and will leave stale binaries in incremental update chains.
ComputeChangedFiles only emits files that are new or whose content changed in currentFiles. Files that existed in prevFiles but were removed are silently dropped. Since tar extraction is additive (files not in the archive remain on disk), a release that removes any file will cause step-applied installs to retain orphaned binaries alongside the new manifest.
Example: if v1.0→v2.0 adds fileA, and v2.0→v3.0 removes it, extracting both step packages into the shared staging directory leaves fileA present in the final install even though v3.0 should not contain it.
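That scenario is easy to simulate: when package application only merges files in, an upstream deletion never takes effect. A tiny Python model of the fileA example, with dicts standing in for the staging directory:

```python
# Each step package is a map of path -> content merged onto the install
# tree; nothing ever deletes, mirroring additive tar extraction.
install = {"app.exe": "v1"}

step_v1_to_v2 = {"app.exe": "v2", "fileA.dll": "new"}  # v2.0 adds fileA
step_v2_to_v3 = {"app.exe": "v3"}                      # v3.0 removed fileA

for package in (step_v1_to_v2, step_v2_to_v3):
    install.update(package)  # extraction over the existing tree

print(sorted(install))  # → ['app.exe', 'fileA.dll'] (fileA is stale)
```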
Implement the proposed guard to detect deletions and fall back to cumulative packages, which correctly represent the final state:
Deletion detection fix
static List<UpdateFileContent> ComputeChangedFiles(
List<UpdateFileContent> prevFiles,
List<UpdateFileContent> currentFiles)
{
var prevMap = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
foreach (var f in prevFiles)
{
prevMap[f.RelativePath] = ComputeSha512(f.Content);
}
+
+ var currentPaths = new HashSet<string>(
+ currentFiles.Select(f => f.RelativePath), StringComparer.OrdinalIgnoreCase);
+ var hasDeletions = prevMap.Keys.Any(p => !currentPaths.Contains(p));
+ if (hasDeletions)
+ {
+ // Step packages can't represent removals; signal caller to fall back to cumulative.
+ return [];
+ }
var changed = new List<UpdateFileContent>();
foreach (var f in currentFiles)
{
var currentHash = ComputeSha512(f.Content);
if (!prevMap.TryGetValue(f.RelativePath, out var prevHash) || prevHash != currentHash)
{
changed.Add(f);
}
}
return changed;
}

📝 Committable suggestion
static List<UpdateFileContent> ComputeChangedFiles(
    List<UpdateFileContent> prevFiles,
    List<UpdateFileContent> currentFiles)
{
    var prevMap = new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);
    foreach (var f in prevFiles)
    {
        prevMap[f.RelativePath] = ComputeSha512(f.Content);
    }

    var currentPaths = new HashSet<string>(
        currentFiles.Select(f => f.RelativePath), StringComparer.OrdinalIgnoreCase);
    var hasDeletions = prevMap.Keys.Any(p => !currentPaths.Contains(p));
    if (hasDeletions)
    {
        // Step packages can't represent removals; signal caller to fall back to cumulative.
        return [];
    }

    var changed = new List<UpdateFileContent>();
    foreach (var f in currentFiles)
    {
        var currentHash = ComputeSha512(f.Content);
        if (!prevMap.TryGetValue(f.RelativePath, out var prevHash) || prevHash != currentHash)
        {
            changed.Add(f);
        }
    }

    return changed;
}
var directToTarget = candidates.FirstOrDefault(e =>
    Version.TryParse(e.TargetVersion, out var tv) && tv >= targetVersion);

if (directToTarget is not null) {
    bestPick = directToTarget;
} else {
    var cumulative = candidates
        .Where(e => e.IsCumulative)
        .OrderByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
        .FirstOrDefault();

    bestPick = cumulative ?? candidates
        .OrderByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
        .First();
}
Direct-to-target selection ignores the "prefer cumulative" rule.
When multiple candidates can reach targetVersion in one hop (e.g., a step entry and a cumulative entry both directly reach targetVersion), FirstOrDefault picks whichever the HashSet enumerates first — order is not guaranteed. This contradicts the PR objective of preferring cumulative packages and can yield non-deterministic plans across runs. Apply the same cumulative-first ordering to direct-to-target candidates.
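The determinism point generalizes: never take the "first" element of an unordered collection; rank candidates by an explicit key tuple instead. A hypothetical Python sketch with illustrative field names and version tuples:

```python
# Rank by (reaches target directly, is cumulative, highest target): the
# key tuple makes the preference order explicit and reproducible.
def pick_best(candidates, target):
    return max(candidates, key=lambda e: (e["target"] >= target,
                                          e["cumulative"],
                                          e["target"]))

candidates = [
    {"target": (2, 0), "cumulative": False},
    {"target": (2, 0), "cumulative": True},   # same hop, cumulative preferred
    {"target": (1, 5), "cumulative": True},
]
print(pick_best(candidates, (2, 0)))  # → {'target': (2, 0), 'cumulative': True}
```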
🛠️ Proposed fix
- UpdateCatalogEntry bestPick;
- var directToTarget = candidates.FirstOrDefault(e =>
- Version.TryParse(e.TargetVersion, out var tv) && tv >= targetVersion);
-
- if (directToTarget is not null) {
- bestPick = directToTarget;
- } else {
- var cumulative = candidates
- .Where(e => e.IsCumulative)
- .OrderByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
- .FirstOrDefault();
-
- bestPick = cumulative ?? candidates
- .OrderByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
- .First();
- }
+ var ordered = candidates
+ .OrderByDescending(e =>
+ Version.TryParse(e.TargetVersion, out var tv) && tv >= targetVersion)
+ .ThenByDescending(e => e.IsCumulative)
+ .ThenByDescending(e => Version.TryParse(e.TargetVersion, out var v) ? v : new Version(0, 0))
+ .ToList();
+ var bestPick = ordered.First();
Summary by CodeRabbit
New Features
Chores