
VPA Not honoring maxAllowed Memory Limit #6996

Open
kmsarabu opened this issue Jul 2, 2024 · 1 comment
Labels
area/vertical-pod-autoscaler, kind/bug

Comments


kmsarabu commented Jul 2, 2024

I am encountering an issue with the Vertical Pod Autoscaler (VPA) where it does not honor the maxAllowed resource limits for memory. Below is the VPA definition I am using:

apiVersion: "autoscaling.k8s.io/v1"
kind: VerticalPodAutoscaler
metadata:
  name: test-vpa
spec:
  targetRef:
    apiVersion: "apps/v1"
    kind: Deployment
    name: test-deployment
  updatePolicy:
    updateMode: "Auto"
  resourcePolicy:
    containerPolicies:
      - containerName: '*'
        controlledResources: ["cpu", "memory"]
        minAllowed:
          cpu: 500m
          memory: 4Gi
        maxAllowed:
          cpu: 4
          memory: 16Gi

After running a CPU stress test, the resulting resource limits observed on the pods are:

Limits:
  cpu:     4
  memory:  32Gi   <- this exceeds the VPA object's maxAllowed memory (16Gi)
Requests:
  cpu:      500m
  memory:   4Gi

Despite setting the maxAllowed memory limit to 16Gi, the VPA scaled the memory up to 32Gi.
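For anyone reproducing this, the applied pod resources and the VPA recommendation can be compared roughly as follows (the pod name is a placeholder):

# Applied resources on one of the autoscaled pods (pod name is a placeholder)
kubectl get pod test-deployment-xxxxx -o jsonpath='{.spec.containers[0].resources}'

# VPA recommendation (Target / Uncapped Target) for comparison
kubectl describe vpa test-vpa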

Steps to Reproduce:

  1. Deploy a VPA with the provided configuration.
  2. Apply a CPU stress test to the target deployment (a sketch of a stress deployment is shown after this list).
  3. Observe the memory and CPU limits on the autoscaled pods.
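
For step 2, a busy-loop container along the following lines can generate the CPU load; the image, command, and request values here are placeholders rather than the exact spec of the real deployment:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: test-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: test-deployment
  template:
    metadata:
      labels:
        app: test-deployment
    spec:
      containers:
        - name: cpu-stress
          image: busybox                 # placeholder image
          # busy loop to generate sustained CPU load
          command: ["/bin/sh", "-c", "while true; do :; done"]
          resources:
            requests:                    # placeholder values; the real deployment sets its own requests/limits
              cpu: 500m
              memory: 4Gi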

Expected Behavior: The memory limit should not exceed the maxAllowed value of 16Gi.
Actual Behavior: The memory limit scales up to 32Gi, exceeding the maxAllowed value.

Could there be any known issues or configurations that might lead to this behavior? Thank you in advance for your help!

kmsarabu added the kind/bug label on Jul 2, 2024
adrianmoisey (Contributor) commented:

/area vertical-pod-autoscaler
