update gcs storage solution design #3410

Open
wants to merge 1 commit into base: develop

Conversation

@ighosh98 (Contributor) commented Dec 16, 2024

Add support to allow customers to use an already existing GCS bucket by passing existing_gcs_bucket_name, which is then linked to a Persistent Volume. If no existing bucket is given as an input, we create a new one for the customer.
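
In other words, the existing bucket is surfaced to GKE as a PersistentVolume backed by the Cloud Storage FUSE CSI driver. A minimal sketch of what such a PV looks like, with values taken from the kubectl describe output in the testing section below (the manifest the module actually renders may differ in detail):

apiVersion: v1
kind: PersistentVolume
metadata:
  name: cluster-toolkit-gcs-test-pv
spec:
  accessModes:
    - ReadWriteMany
  capacity:
    storage: 5000Gi
  persistentVolumeReclaimPolicy: Retain
  csi:
    driver: gcsfuse.csi.storage.gke.io
    volumeHandle: cluster-toolkit-gcs-test   # name of the pre-existing GCS bucket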

Testing Details

  • Created a GCS bucket (cluster-toolkit-gcs-test)
  • Updated storage-gke.yaml to include the existing bucket details:
  - id: data-bucket
    source: community/modules/file-system/cloud-storage-bucket
    settings:
      local_mount: /data
      random_suffix: true
      force_destroy: true
      existing_gcs_bucket_name: "cluster-toolkit-gcs-test"
  • Deployed the storage-gke blueprint
  • Ran kubectl get pv. Output:
NAME                          CAPACITY   ACCESS MODES   RECLAIM POLICY   STATUS   CLAIM                                  STORAGECLASS   VOLUMEATTRIBUTESCLASS   REASON   AGE
cluster-toolkit-gcs-test-pv   5000Gi     RWX            Retain           Bound    default/cluster-toolkit-gcs-test-pvc                  <unset>                          20m
storage-gke-01-b32ed75c-pv    1Ti        RWX            Retain           Bound    default/storage-gke-01-b32ed75c-pvc                   <unset>                          20m
  • Ran kubectl describe pv cluster-toolkit-gcs-test-pv. Output:
Name:            cluster-toolkit-gcs-test-pv
Labels:          ghpc_blueprint=storage-gke
                 ghpc_deployment=storage-gke-01
                 ghpc_module=gke-persistent-volume
                 ghpc_role=file-system
Annotations:     pv.kubernetes.io/bound-by-controller: yes
Finalizers:      [kubernetes.io/pv-protection]
StorageClass:    
Status:          Bound
Claim:           default/cluster-toolkit-gcs-test-pvc
Reclaim Policy:  Retain
Access Modes:    RWX
VolumeMode:      Filesystem
Capacity:        5000Gi
Node Affinity:   <none>
Message:         
Source:
    Type:              CSI (a Container Storage Interface (CSI) volume source)
    Driver:            gcsfuse.csi.storage.gke.io
    FSType:            
    VolumeHandle:      cluster-toolkit-gcs-test
    ReadOnly:          false
    VolumeAttributes:  <none>
Events:                <none>

When provisioning without existing_gcs_bucket_name, a new bucket gets created (the corresponding blueprint snippet is shown after the output below). Output of kubectl describe pv:

Name:            storage-gke-01-6960df25-pv
Labels:          ghpc_blueprint=storage-gke
                 ghpc_deployment=storage-gke-01
                 ghpc_module=gke-persistent-volume
                 ghpc_role=file-system
Annotations:     pv.kubernetes.io/bound-by-controller: yes
Finalizers:      [kubernetes.io/pv-protection]
StorageClass:    
Status:          Bound
Claim:           default/storage-gke-01-6960df25-pvc
Reclaim Policy:  Retain
Access Modes:    RWX
VolumeMode:      Filesystem
Capacity:        5000Gi
Node Affinity:   <none>
Message:         
Source:
    Type:              CSI (a Container Storage Interface (CSI) volume source)
    Driver:            gcsfuse.csi.storage.gke.io
    FSType:            
    VolumeHandle:      storage-gke-01-6960df25
    ReadOnly:          false
    VolumeAttributes:  <none>
Events:                <none>
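
For comparison, the blueprint entry used for this default path is the same module block without the existing_gcs_bucket_name setting, roughly:

  - id: data-bucket
    source: community/modules/file-system/cloud-storage-bucket
    settings:
      local_mount: /data
      random_suffix: true
      force_destroy: true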

@ighosh98 added the release-improvements label (added to release notes under the "Improvements" heading) Dec 16, 2024
@ankitkinra (Contributor) left a comment


Still debating whether this is the right strategy, or whether we should instead have a separate module whose only option is to import an existing GCS bucket, rather than reusing this module.
