
[bug]: “InvokeAI 6.4 and above shuts down after generating about 250 images.” #8543

@baradoc76

Description

Is there an existing issue for this problem?

  • I have searched the existing issues

Install method

Invoke's Launcher

Operating system

macOS

GPU vendor

Apple Silicon (MPS)

GPU model

Mac Studio M2 Max

GPU VRAM

64

Version number

6.5

Browser

No response

System Information

No response

What happened

After generating approximately 200–300 images (via either the Web UI or the desktop app), InvokeAI crashes suddenly with no error dialog, reporting: Process exited with code: 0
In some cases, the following stack trace appears:
BrokenPipeError: [Errno 32] Broken pipe
...
self.sp = self.status_printer(self.fp)
...
File "/invokeai/backend/stable_diffusion/diffusers_pipeline.py", line 395, in latents_from_embeddings
for i, t in enumerate(self.progress_bar(timesteps)):
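For context, [Errno 32] is EPIPE: the progress bar (tqdm's status_printer in the trace) writes to stdout/stderr, and if the process consuming that pipe has gone away, the write raises BrokenPipeError. A minimal sketch (not InvokeAI code) that reproduces the same error:

```python
import errno
import os

# Minimal reproduction of [Errno 32]: create a pipe, close the read
# end to simulate the pipe's consumer (e.g. the launcher) going away,
# then write to it the way a progress bar writes to stderr.
read_fd, write_fd = os.pipe()
os.close(read_fd)  # the reader disappears

try:
    os.write(write_fd, b"generating image 251/300\r")
except BrokenPipeError as exc:
    # Python surfaces EPIPE as BrokenPipeError: [Errno 32] Broken pipe
    assert exc.errno == errno.EPIPE
    print(f"caught: {exc}")
finally:
    os.close(write_fd)
```

CPython ignores SIGPIPE at startup, so the failed write surfaces as a catchable exception rather than killing the process outright; a progress bar that does not catch it propagates the error into the generation loop, matching the trace above.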
Additionally, in the UI, this exception occurs:
{
"name": "TypeError",
"message": "e.map is not a function",
"stack": "... App-BAHkxqK1.js"
}
• The crash does not correlate with any specific prompt or model.
• It occurs regardless of the frontend used (desktop app or Web UI).
• The issue already existed in version 6.4 and is still present in 6.5.
• Memory usage is monitored and stable before the crash.
• Logs indicate full VRAM use (VRAM: 100%) just before termination.
• It happens both with and without the --max_loaded_models=1 setting.
• It is reproducible after consistently long runs, whether using the scheduler queue or direct prompts.
• Model cache logs show MPS device memory filling progressively, reaching VRAM: 100% just before the crash.
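If the broken progress pipe is indeed the trigger, one possible mitigation (a sketch under that assumption, not InvokeAI code) is to wrap the output stream so EPIPE on progress writes is swallowed instead of propagating into the generation loop:

```python
import sys


class PipeSafeStream:
    """Wrap a text stream so BrokenPipeError on write/flush is swallowed.

    Hypothetical mitigation sketch: progress bars keep writing to
    stderr after the pipe's consumer exits; ignoring EPIPE here lets
    image generation continue instead of crashing the session.
    """

    def __init__(self, stream):
        self._stream = stream

    def write(self, data):
        try:
            return self._stream.write(data)
        except BrokenPipeError:
            return len(data)  # pretend the write succeeded

    def flush(self):
        try:
            self._stream.flush()
        except BrokenPipeError:
            pass

    def __getattr__(self, name):
        # Delegate everything else (encoding, isatty, ...) unchanged.
        return getattr(self._stream, name)


# Usage: install early at startup, before any progress bar is created.
sys.stderr = PipeSafeStream(sys.stderr)
```

This only masks the symptom; if the launcher really is closing the pipe after long runs, the underlying disconnect would still need fixing.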

What you expected to happen

not to crash :-)

How to reproduce the problem

Steps to Reproduce
1. Launch InvokeAI (UI or Web)
2. Queue or manually trigger 200+ image generations in a session
3. Wait for process crash or silent termination

Additional context

• The system is stable with ample resources, and the issue is not linked to any specific model or prompt.
• No memory-leak or OOM errors are observed.
• Precision was set to float32 for MPS compatibility, but the bug persists.

Discord username

No response

Metadata

Labels

bug: Something isn't working