mirror of
https://github.com/louislam/uptime-kuma.git
synced 2026-03-02 22:57:00 -05:00
Select monitors for maintenance windows by AND-ing Tags #2157
Originally created by @gardium90 on GitHub (May 3, 2023).
🏷️ Feature Request Type
UI Feature
🔖 Feature description
Enable the use of tags in maintenance window configurations, in addition to selecting specific monitors.
Allow a combination of tags for flexibility (e.g., platform/service and environment).
When selecting multiple tags, use an 'all-inclusive match' (all selected tags must be present on a monitor) to determine which monitors fall under a maintenance window.
If a new monitor is created with the appropriate set of tags, it would then automatically be picked up by the maintenance window configuration.
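The 'all-inclusive match' described above boils down to a subset check: a monitor is covered only if it carries every tag selected on the window. A minimal sketch (hypothetical helper, not Uptime Kuma's actual code; names are illustrative):

```javascript
// Hypothetical sketch of "all-inclusive" (AND) tag matching: the monitor
// must carry every tag selected on the maintenance window.
function monitorMatchesAllTags(monitorTags, selectedTags) {
    const tagSet = new Set(monitorTags);
    return selectedTags.every((tag) => tagSet.has(tag));
}

// A monitor tagged Foo + Dev is picked up by a Foo+Dev window;
// a monitor tagged Foo + Prod is not.
console.log(monitorMatchesAllTags(["Foo", "Dev"], ["Foo", "Dev"]));  // true
console.log(monitorMatchesAllTags(["Foo", "Prod"], ["Foo", "Dev"])); // false
```

Extra tags on the monitor do not prevent a match; only the selected tags have to be present.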
✔️ Solution
Implement a tagging system where users can group monitors with relevant tags, e.g. services Foo, Bar and environments Dev, Prod. Depending on the type of maintenance/issue, it may affect any combination of these services and environments. By grouping together the necessary tags, e.g. A => A's environments, or Prod => Prod systems, or a combination, say Foo+Dev, the maintenance window knows which monitors to deactivate during the maintenance window, without having to select specific monitors.
❓ Alternatives
It would already help to see the tags assigned to a monitor in the 'affected monitors' dropdown, but I'm not sure whether building a good UI for that would be more cumbersome.
📝 Additional Context
The issue was reworked to be clearer/less wordy by @CommanderStorm. Please refer to the issue history for the original post.
@CommanderStorm commented on GitHub (Jun 1, 2023):
Is this a duplicate of #2457?
If no, please edit your feature request to make it more distinct and/or how the change you want builds upon the linked issue.
If yes, please close this issue (duplicate issues just make managing issues harder)
@gardium90 commented on GitHub (Jun 1, 2023):
Apologies. I didn't find this feature request while searching keywords in the Issue Tracker.
However, after reading that feature request, there is a major difference in my opinion, which I already address in my request; in terms of functionality and description of what I want to achieve, my feature request is more specific.
The major difference I see is that they request "[...] all monitors with any of the tags selected, will be added to the maintenance windows". In this request I specify that I'd like an 'all-inclusive match', due to multi-instance tagging AND service tagging.
I hope this clarifies, but if you disagree with my point then please let me know and I'll close the request.
Thank you, and have a nice day.
@CommanderStorm commented on GitHub (Jun 1, 2023):
The reason I like the other issue more is that it is more concise, readable, older and more liked (we use 👍🏻 to prioritise work).
I don't understand what you mean by "all-inclusive match". Could you clarify?
Could this not be integrated into the issue @peschmae created?
@peschmae commented on GitHub (Jun 1, 2023):
I think adding a toggle to switch between and/or matching for the selected tags shouldn't be too much of an issue. I can easily add this to my proposal.
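The and/or toggle proposed here could reduce to a single predicate switched by a mode flag. A minimal sketch under that assumption (hypothetical names, not code from the actual proposal):

```javascript
// Hypothetical and/or matching toggle: "and" requires every selected tag
// to be present on the monitor, "or" requires at least one of them.
function monitorMatchesTags(monitorTags, selectedTags, mode) {
    const tagSet = new Set(monitorTags);
    return mode === "and"
        ? selectedTags.every((tag) => tagSet.has(tag))
        : selectedTags.some((tag) => tagSet.has(tag));
}

// For a Foo+Dev window, an "and" match excludes a Foo/Prod monitor,
// while an "or" match still includes it.
console.log(monitorMatchesTags(["Foo", "Prod"], ["Foo", "Dev"], "and")); // false
console.log(monitorMatchesTags(["Foo", "Prod"], ["Foo", "Dev"], "or"));  // true
```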
@CommanderStorm commented on GitHub (Dec 8, 2023):
@gardium90
I have reworded the issue to make it clearer what you are requesting.
@gardium90 commented on GitHub (Dec 11, 2023):
@CommanderStorm Hello, sorry I didn't have time to respond to your comment from 2 days ago. I thought this request had been marked inactive as part of being consolidated with @peschmae's proposal.
Upon reviewing your changes to the request, I believe you have understood what I meant. Service and instance seem to be reflected correctly, and my wish was the appropriate "AND-ing of tags", as you have now clarified. I still see a reason to have "OR-ing" as well, so however you choose to consolidate the requests, I'll just be happy if this feature makes it into an upstream release.
Thank you again for the consideration, have a merry festive season!
@sevmonster commented on GitHub (Dec 16, 2023):
Yes, as someone who uses tags for tons of services, sometimes they intersect. For example, if I will be performing maintenance on Docker on a specific host, then I would want monitors with the host1 AND docker tags to be included. But if I am taking down two different hosts for maintenance, I would want monitors with host1 OR host2 to be included.
The fact that there is no capability of using tags for maintenance makes the feature excessively cumbersome for users with a lot of monitors (like me), to the point where I do not use it due to the extra administrative overhead.
Specifically, when adding or modifying a new service that should be affected by update windows, you must find and update said maintenance configurations one at a time. But when using tags, all you have to do is add/remove tags and have the maintenance config query the tags dynamically.
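The dynamic-query behaviour described above could look roughly like this: the maintenance window stores only its tag selection, and the affected monitors are resolved from current tags when the window activates. A sketch with illustrative data shapes (not Uptime Kuma's actual schema), reusing the host1/docker example:

```javascript
// Hypothetical sketch: resolve affected monitors from tags at activation
// time instead of storing a fixed monitor list on the maintenance window.
function resolveAffectedMonitors(monitors, selectedTags, mode) {
    return monitors.filter((monitor) => {
        const tagSet = new Set(monitor.tags);
        return mode === "and"
            ? selectedTags.every((tag) => tagSet.has(tag))
            : selectedTags.some((tag) => tagSet.has(tag));
    });
}

const monitors = [
    { name: "web-1", tags: ["host1", "docker"] },
    { name: "db-1", tags: ["host1"] },
    { name: "web-2", tags: ["host2", "docker"] },
];

// Docker maintenance on host1: the AND match keeps only web-1.
console.log(resolveAffectedMonitors(monitors, ["host1", "docker"], "and")
    .map((m) => m.name)); // ["web-1"]
// Taking down host1 and host2: the OR match covers all three monitors.
console.log(resolveAffectedMonitors(monitors, ["host1", "host2"], "or")
    .map((m) => m.name)); // ["web-1", "db-1", "web-2"]
```

Because the filter runs against the monitors' current tags, adding or removing a tag on a monitor is enough to change which windows affect it, with no per-window edits.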
@bmbvenom commented on GitHub (May 21, 2025):
Hello, it's a year and a half later and I was just wondering if there has been any progress on this. If not in v1, maybe in v2?
Thank you for all your work on this great project!
@CommanderStorm commented on GitHub (May 21, 2025):
Nobody has done work on this to our knowledge.
It is also not on the list of issues I want to implement => feel free to implement it.
If the UX is close to the rest and does not incur massive overhead for the workflow, this would be something I would merge.
@CommanderStorm commented on GitHub (May 21, 2025):
Also, this issue does not need to be in v2.0; I don't see any possibility for breakage.