The application is difficult to scale (K8S, Docker Swarm, OSH etc.) (TBD) #4516
Originally created by @ammnt on GitHub (Jun 10, 2023).
Prerequisites
I have checked the Wiki and Discussions and found no answer
I have searched other issues and found no duplicates
I want to request a feature or enhancement and not ask a question
Description
What problem are you trying to solve?
For example, I want to deploy a Kubernetes cluster with two nodes and eight replicas on any cloud provider. But the application is difficult to scale because it stores its database (query logs etc.) and configuration in local files. This forces the use of an external CSI driver and persistent volumes, and sharing the query database and the configuration file between replicas/pods does not work well🫣
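To make the pain point concrete, here is a minimal sketch of the kind of manifest involved, assuming the stock `adguard/adguardhome` image and its documented `/opt/adguardhome/work` and `/opt/adguardhome/conf` data paths; the resource names and claim names are placeholders. Once `replicas` is greater than 1, every pod mounts the same files, and the writers clash:

```yaml
# Hypothetical manifest illustrating the problem; the names (adguard-home,
# adguard-work, adguard-conf) are placeholders, not a recommendation.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: adguard-home
spec:
  replicas: 8                  # scaling out is the goal...
  selector:
    matchLabels:
      app: adguard-home
  template:
    metadata:
      labels:
        app: adguard-home
    spec:
      containers:
        - name: adguard-home
          image: adguard/adguardhome
          volumeMounts:
            - name: work       # query log, statistics, sessions
              mountPath: /opt/adguardhome/work
            - name: conf       # AdGuardHome.yaml
              mountPath: /opt/adguardhome/conf
      volumes:
        - name: work           # ...but both claims must be ReadWriteMany,
          persistentVolumeClaim:
            claimName: adguard-work
        - name: conf           # and all replicas then write the same files
          persistentVolumeClaim:
            claimName: adguard-conf
```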
Proposed solution
Use an external NoSQL database, or otherwise change how the application works with its database and configuration file. We could also consider other approaches🤔
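As a purely hypothetical illustration of that direction (none of these configuration keys exist in AdGuard Home today), state could be delegated to a shared backend along these lines:

```yaml
# Hypothetical AdGuardHome.yaml fragment -- these keys do NOT exist today
# and only sketch the proposed direction of an external, shared backend.
storage:
  backend: redis                      # instead of local files
  dsn: redis://adguard-redis:6379/0   # one store shared by all replicas
  querylog:
    external: true                    # query log written to the backend
  stats:
    external: true                    # statistics aggregated centrally
```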
Additional information
I can provide my k8s manifest in a PM so that you can reproduce the inconvenience I encountered while trying to cluster the app😀
@ainar-g commented on GitHub (Jun 13, 2023):
Merging into #573.