mirror of
https://github.com/louislam/dockge.git
synced 2026-03-03 02:06:55 -05:00
stack marked as "exited" when containers are running #40
Originally created by @zx900930 on GitHub (Nov 24, 2023).
⚠️ Please verify that this bug has NOT been reported before.
🛡️ Security Policy
Description
I'm trying to deploy this project using dockge

https://github.com/makeplane/plane
but it was marked as exited even though those containers are running.
👟 Reproduction steps
👀 Expected behavior
Stack will be marked as "active" when containers start running.
😓 Actual Behavior
Stack is marked as "exited" when containers start running.
Dockge Version
1.1.1
💻 Operating System and Arch
Debian GNU/Linux 12 (bookworm) x86_64
🌐 Browser
Google Chrome 119.0.6045.160
🐋 Docker Version
Docker CE 24.0.7
🟩 NodeJS Version
No response
📝 Relevant log output
No response
@louislam commented on GitHub (Nov 25, 2023):
I tried this stack, and I saw that the `minio` container is not started. I think that is the reason, because the stack is active only if all containers are up.

@zx900930 commented on GitHub (Nov 25, 2023):
That `createbuckets` (minio) container is an init task, like the Jobs in k8s; once the needed bucket is created, it will exit. Can we add a filter (using labels, for example `dockge.container.status.enable=false`) to exclude containers from being checked by Dockge?

@Yann-J commented on GitHub (Nov 29, 2023):
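A compose snippet illustrating that proposed label (this is a hypothetical label suggested in the comment above, not an existing Dockge feature; service and image names are taken from the Plane example) might look like:

```yaml
services:
  createbuckets:
    image: minio/mc
    labels:
      # Hypothetical label from the proposal above -- not implemented
      - "dockge.container.status.enable=false"
```

Under this idea, Dockge would skip any service carrying the label when computing the stack's overall status.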
I have a similar issue with a Plex compose file, which contains an init container that installs/updates some plugins, then exits. There is a `depends_on: { plex_plugins: { condition: service_completed_successfully } }` condition on the main plex container.

@Yann-J commented on GitHub (Nov 29, 2023):
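Expanded into long-form compose syntax, that condition looks like this (service names are from the comment above; the images are placeholders for illustration):

```yaml
services:
  plex_plugins:
    image: alpine  # init container: installs/updates plugins, then exits
  plex:
    image: plexinc/pms-docker
    depends_on:
      plex_plugins:
        # Start plex only after the init container exits with code 0
        condition: service_completed_successfully
```

Because `plex_plugins` is expected to exit, a stack like this always contains at least one "exited" container once it is healthy.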
Looking a bit into the code, I fear this might be tricky to implement, as right now the status is computed by parsing the response from `docker compose ls`, which will return something like `exited(1), running(2)`, without further details...

@thefrana commented on GitHub (Dec 1, 2023):
I also encountered this issue, and all responses from `docker compose ls` have the status `running`. It still shows exited in the UI.

@nzprog commented on GitHub (Dec 6, 2023):
I'm also having this problem.
@queeup commented on GitHub (Dec 22, 2023):
Same here. I am using bash container to do some jobs before services start like init task.
@golgor commented on GitHub (Jan 2, 2024):
I also bumped into this problem. I haven't looked into the code at all, but isn't it possible to somehow specify which services to include in the status? I.e. no real changes to how everything is executed/managed and no changes needed in the docker-compose file, just an update to the GUI, like a setting "Ignore these services when tracking status".
Sorry if that is a stupid idea... I'm fairly new to Docker as a whole, and especially Dockge.
@carelinus commented on GitHub (Jan 4, 2024):
Same issue here.
`docker compose ls` shows the correct status; Dockge shows the stack as exited.

@akshara-tg commented on GitHub (Jan 10, 2024):
I also have the same issue. I have 50 containers in total.
Around 40 containers are showing as running. The remaining 10 are shown as inactive.
Below is one of the stacks showing as inactive while it is actually up and running; `docker compose ls` also shows it as running.
@tippfehlr commented on GitHub (Jan 10, 2024):
I would propose to just show how many of the containers are running, like the output of `docker compose ls`, e.g. "4/5 running".
If one of the containers crashes/exits before the others, there is currently no indicator that some containers might still be running.
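As a sketch of that proposal, assuming the `Status` field from `docker compose ls` keeps its current `running(2), exited(1)` shape (this is not Dockge's actual code, just an illustration of the counting idea):

```typescript
// Parse a `docker compose ls` Status field such as "running(4), exited(1)"
// into per-state counts, then render a "4/5 running" style summary.
function parseStatus(status: string): Record<string, number> {
    const counts: Record<string, number> = {};
    for (const match of status.matchAll(/(\w+)\((\d+)\)/g)) {
        counts[match[1]] = parseInt(match[2], 10);
    }
    return counts;
}

function summarize(status: string): string {
    const counts = parseStatus(status);
    const total = Object.values(counts).reduce((a, b) => a + b, 0);
    const running = counts["running"] ?? 0;
    return `${running}/${total} running`;
}

console.log(summarize("running(4), exited(1)")); // prints "4/5 running"
```

A display like this sidesteps the binary active/exited decision entirely: an init container that has exited simply shows up as "4/5 running" instead of flipping the whole stack to "exited".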
@arminus commented on GitHub (Jan 24, 2024):
Here's another perfectly valid example where an init container is stopped by default:
Appreciate the work on this regardless!
@ChrisB85 commented on GitHub (Jan 24, 2024):
Same issue here with just one container only.

@Triskae commented on GitHub (Mar 6, 2024):
Me too. Is the stack missing a config, or something like that?
Great job on Dockge, it saves me a lot of time!
@bwcummings1 commented on GitHub (Jun 3, 2024):
Has anyone found a resolution to this bug yet? I have a project that is running in the browser port, but showing as exited in the UI.
@x1ao4 commented on GitHub (Jun 9, 2024):
You're right, when a stack has one or more containers that have exited but still has other running containers, the stack is shown as "exited." To me, this seems like a bug because if there are still running containers, the stack's status should be "running" rather than "exited." In Docker Desktop or Orbstack, such a state would be shown as "running," which better meets user expectations. I hope the logic can be modified so that in this situation, the status is displayed as "running." Only when all containers have exited should it show as "exited."
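The rule described above could be sketched like this (a simplification for illustration only; the state names and the "inactive" fallback are assumptions, not Dockge's actual implementation):

```typescript
type ContainerState = "running" | "exited" | "created" | "paused";

// Report "running" as long as at least one container runs;
// report "exited" only once every container has stopped.
function stackStatus(states: ContainerState[]): "running" | "exited" | "inactive" {
    if (states.length === 0) {
        return "inactive"; // no containers created for this stack
    }
    if (states.some((s) => s === "running")) {
        return "running";
    }
    return "exited";
}
```

Under this rule, a stack with one exited init container and four running services would be reported as "running", matching the Docker Desktop and OrbStack behavior described above.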
@Shponzo commented on GitHub (Jul 16, 2024):
I'm experiencing the same problem. Is there any update on this bug?
@Preclowski commented on GitHub (Aug 31, 2024):
`cd /opt/stacks/yourstack && docker compose up -d --remove-orphans` should help most of you guys :)

@Handrail9 commented on GitHub (Sep 2, 2024):
Perhaps another solution to this bug could be for Dockge to automatically run this before marking a container as exited.
@chaun14 commented on GitHub (Oct 30, 2024):
Same weird issue here, all containers are working, but the status is shown as exited which doesn't really make sense.
I tried the above solutions of clearing orphans but no real effect.
@GilDev commented on GitHub (Dec 12, 2024):
Same issue with default Paperless-ngx and InvenTree stacks:
@possiblyanowl commented on GitHub (Jan 12, 2025):
I also am running into this issue. I use a container to do some init scripts, and the other containers in my compose file use:

depends_on:
  init:
    condition: service_completed_successfully
I would love to have a solution to exempt a container from the stacks overall "status".
@justin13888 commented on GitHub (Mar 22, 2025):
Wanted to add that any docker compose setup with some sort of init task that is supposed to exit early (e.g. all the Zitadel docker compose examples, in my case) would have this symptom.
Perhaps some sort of Dockge-specific flag to indicate that this is intended would be a potential solution.
@blackshroud commented on GitHub (Jul 22, 2025):
This is a great idea. A simple checkbox on each item to exclude it from the overall status, or something similar?
@GabeDuarteM commented on GitHub (Dec 31, 2025):
@louislam Is anyone working on this? If not, I'd like to tackle it!
From the discussions here, I see a few options, one being a label such as `dockge.container.status.enable=false` to exclude containers from status checks. Personally, I think something like the third option would work pretty nicely: it provides better status visibility without requiring users to add labels or config, and naturally handles init containers that are expected to exit.
Do you have a preference, or another approach in mind?
@major-mayer commented on GitHub (Jan 21, 2026):
@GabeDuarteM I think option 3 is a good way to tackle this issue.
In my case, one stack is constantly showing as inactive because I use docker profiles to prevent containers from running on default stack startup: https://docs.docker.com/compose/how-tos/profiles/
So this stack would currently never show as active, even though the relevant containers are running.
`docker compose ls` shows the status correctly.
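For reference, a minimal profiles setup like the one described might look like this (service and profile names are made up for illustration):

```yaml
services:
  web:
    image: nginx
  debug-tools:
    image: busybox
    profiles:
      - debug   # only started via `docker compose --profile debug up -d`
```

A plain `docker compose up -d` skips `debug-tools` entirely, so any stack-status logic that expects every declared service to be running would report such a stack as inactive even when all default services are up.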