Cost Analysis, S3 Storage Analysis and Cleanup for Prod, UAT, and TI #965

Open · 4 of 15 tasks
derekgiardino opened this issue Nov 14, 2024 · 3 comments
Labels: Priority - High, Task 1 GAMA1
Milestone: V2.1.8
derekgiardino (Collaborator) commented Nov 14, 2024

We are accumulating files in S3 and the cost continues to escalate. To bring these costs under control, the following items need to be investigated and completed:

  • DEV - S3 object counts and metrics (Storage Lens; see the metrics sketch after this list)
  • UAT
  • Prod
  • Analyze what S3 storage, if any, is required for TI Apocalyptic Testing
  • Remove unnecessary CloudTrail trails, as tracked in ticket 1004
  • Turn off near-real-time pipelines in Dev unless testing is being performed
  • Write a quick script to delete the region 1 S3 bucket in UAT to avoid CLI issues (see the deletion sketch after this list)
  • Remove anything in S3 buckets that is unneeded but routinely written and stored. Any data needed for development should not be impacted
  • Stop any processes that accumulate unused data in TI; if the data is required, limit retention to the past 30 days
  • Analyze usage of S3 data in UAT and what it is used for
  • Remove any unnecessary files in UAT S3 older than 30 days
  • Turn off UAT pipelines unless we are actively testing services in UAT
  • Set up a rolling barrel deletion of files in UAT S3 older than 30 days (see the lifecycle sketch after this list)
  • The production archive needs to be cut back eventually. What would the storage savings be if everything from 2022 were deleted? Do not delete anything from production at this time; simply analyze the amount of storage from 2022, the cost savings if it were deleted, and the effect of a 2-year rolling barrel deletion
  • Investigate cost savings from S3 storage classes and S3 Intelligent-Tiering (the lifecycle sketch below also covers tiering transitions)
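For the object-count and metrics items above, a minimal sketch of pulling the daily storage metrics that S3 publishes to CloudWatch, assuming boto3 credentials; the bucket name is a hypothetical placeholder:

```python
import boto3
from datetime import datetime, timedelta, timezone

# A minimal sketch, assuming boto3 credentials. BUCKET is a hypothetical
# placeholder; substitute each Dev/UAT/Prod bucket to be measured.
BUCKET = "example-bucket"
cw = boto3.client("cloudwatch", region_name="us-east-1")

def daily_metric(metric_name, storage_type):
    """Fetch the most recent daily S3 storage datapoint from CloudWatch."""
    resp = cw.get_metric_statistics(
        Namespace="AWS/S3",
        MetricName=metric_name,
        Dimensions=[
            {"Name": "BucketName", "Value": BUCKET},
            {"Name": "StorageType", "Value": storage_type},
        ],
        StartTime=datetime.now(timezone.utc) - timedelta(days=2),
        EndTime=datetime.now(timezone.utc),
        Period=86400,
        Statistics=["Average"],
    )
    points = sorted(resp["Datapoints"], key=lambda p: p["Timestamp"])
    return points[-1]["Average"] if points else None

size = daily_metric("BucketSizeBytes", "StandardStorage")
count = daily_metric("NumberOfObjects", "AllStorageTypes")
if size is not None and count is not None:
    print(f"{BUCKET}: {size / 1024**3:.1f} GiB, {count:.0f} objects")
```

Storage Lens shows the same numbers (plus prefix-level breakdowns) in the console; this is just a scriptable cross-check.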
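For the UAT region 1 bucket deletion item, a minimal sketch assuming boto3 credentials with delete permissions; the bucket name below is a hypothetical placeholder, not the real UAT bucket:

```python
import boto3

# A minimal sketch, assuming boto3 credentials with delete permissions.
# BUCKET is a hypothetical placeholder; substitute the real UAT region 1
# bucket name before running.
BUCKET = "example-uat-region-1-bucket"

bucket = boto3.resource("s3").Bucket(BUCKET)

# Delete every object version in batches (this also covers versioned
# buckets, a common CLI pain point since `aws s3 rm --recursive` skips
# old versions and delete markers), then remove the now-empty bucket.
bucket.object_versions.delete()
bucket.delete()
```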
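For the rolling 30-day deletion and the tiering investigation, a minimal sketch of an S3 lifecycle configuration, again assuming boto3 and a placeholder bucket name; the same rule shape with `Days: 730` would implement the 2-year production rolling barrel discussed above:

```python
import boto3

# A minimal sketch, assuming boto3 credentials. BUCKET is a hypothetical
# placeholder for the UAT bucket; a 730-day variant would suit the
# production 2-year rolling barrel once that analysis is approved.
BUCKET = "example-uat-bucket"

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "rolling-30-day-expiration",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to the whole bucket
                "Expiration": {"Days": 30},
                # Clean up abandoned multipart uploads as well.
                "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
            }
        ]
    },
)
```

For the tiering item, adding a `"Transitions": [{"Days": 0, "StorageClass": "INTELLIGENT_TIERING"}]` entry to a rule like this would move objects into S3 Intelligent-Tiering, which automatically shifts infrequently accessed data to cheaper access tiers.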
derekgiardino added the Task 1 GAMA1 label on Nov 14, 2024
derekgiardino added this to the V2.1.8 milestone on Nov 14, 2024
derekgiardino changed the title from "S3 Storage Analysis and Cleanup for Prod, UAT, and TI" to "Cost Analysis, S3 Storage Analysis and Cleanup for Prod, UAT, and TI" on Nov 19, 2024
RobHanna-NOAA (Contributor) commented

I am well down the path of looking at what we can cut back on the fim-dev and ras2fim side too. That said, you all will know this stuff a lot better than me, so keep me in the loop if there is something we can check that is related to our environments. Also, when Fernando and I were looking at our fim-dev costs more closely, we found out we had backups running against a handful of our EC2s that we didn't know about. I shut most of that off; two of my images would not let me turn off backups, but it should still help quite a bit.

EdisonOrellana-NOAA (Contributor) commented

Thank you @RobHanna-NOAA for noticing this:

s3://hydrovis-dev-fim-us-east-1/processing_outputs. It is 2.5 TiB of data from the period of approximately mid-December 2022 to January 17th, 2023. The rest of the folder appears to be data and tools for loading data. Back then, I believe it was called the "dev" area, but now we just use "ti".

I am placing this here as a note to revisit when I can dedicate more time to this ticket.
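For reference, a minimal sketch (assuming boto3 credentials with read access to the bucket) for re-totaling the size of that prefix when revisiting this:

```python
import boto3

# A minimal sketch, assuming boto3 credentials with read access.
# Sums the size of every object under the processing_outputs prefix.
s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

total_bytes = 0
for page in paginator.paginate(
    Bucket="hydrovis-dev-fim-us-east-1", Prefix="processing_outputs/"
):
    for obj in page.get("Contents", []):
        total_bytes += obj["Size"]

print(f"{total_bytes / 1024**4:.2f} TiB")
```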

shawncrawley (Collaborator) commented

As of now, the storage required for TI Apocalyptic Testing is 81.9 GB. If I did the cost estimate correctly, this should translate to no more than about $2/month.
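As a rough sanity check on that number, assuming S3 Standard pricing in us-east-1 (about $0.023 per GB-month in the first tier; verify against current AWS pricing):

```python
# Rough cost check, assuming S3 Standard in us-east-1 at ~$0.023/GB-month
# (an assumed rate; verify against current AWS pricing).
storage_gb = 81.9
price_per_gb_month = 0.023
print(f"${storage_gb * price_per_gb_month:.2f}/month")  # -> $1.88/month
```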
