Fix memory stats collection during submission running #4807

Merged
merged 1 commit into main from fix/memory-collection on Jul 4, 2023

Conversation

@chvp (Member) commented on Jul 4, 2023

This pull request fixes the collection of memory stats while a submission is running. There were two problems:

  • Collecting stats took 1.8 seconds, which meant that many submissions had already finished by the time the first measurements came in.
  • memory_stats.max_usage is only available when Docker uses cgroups v1 under the hood, and we use cgroups v2 on all our hosts, so we now fall back to usage and track the maximum ourselves (sketched below).
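
For context, the sketch below shows one way to read a peak memory figure from a Docker stats payload under both cgroup versions. It is an illustration only, not the merged code; the method name peak_memory_mib is made up.

    # Illustrative sketch (not the merged code): convert a Docker stats payload
    # into a peak memory value in MiB. 'max_usage' only exists under cgroups v1;
    # under cgroups v2 only the current 'usage' is reported, so the caller has
    # to carry the running maximum across polls.
    def peak_memory_mib(stats, previous_peak = 0.0)
      mem = stats['memory_stats'] || {}
      bytes = mem['max_usage'] || mem['usage']
      return previous_peak if bytes.nil?

      [bytes / (1024.0 * 1024.0), previous_peak].max
    end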

@chvp added the bug label on Jul 4, 2023
@chvp requested a review from a team as a code owner on Jul 4, 2023 at 12:14
@chvp requested reviews from bmesuere and niknetniko and removed the request for a team on Jul 4, 2023 at 12:14
memory = stats['memory_stats']['max_usage'] / (1024.0 * 1024.0) if stats['memory_stats']&.fetch('max_usage', nil)
before_stats = Time.zone.now
# Check if container is still running
if !Rails.env.test? && (Docker::Container.all.select { |c| c.id.starts_with?(container.id) || container.id.starts_with?(c.id) }.any? && container.refresh!.info['State']['Running'])
Member:

I prefer the old guard style here.

Member Author (@chvp):

The old guard style is hard to combine with the conditional sleep below.

memory = [stats['memory_stats']['usage'] / (1024.0 * 1024.0), memory].max if stats['memory_stats']&.fetch('usage', nil)
end

# Gathering stats still takes a long time, so if we spent enough time on that, skip the sleep
Member:

If it takes a long time, is there a performance impact (since we now do it 5 times more often)?

Member Author (@chvp):

Not noticeably on my machine. The runtime comes from Docker sleeping internally while it gathers the data, IIUC.
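
To make the loop shape concrete, here is a rough sketch with the stats collection wrapped in an if (rather than a guard clause) so the conditional sleep still runs on every iteration. The 0.2 s interval and the helpers container_still_running? and peak_memory_mib are assumptions for illustration, not the merged code; container is a Docker::Container handle and Time.zone comes from the surrounding Rails app, as in the diff above.

    # Illustrative sketch (not the merged code): poll container stats in a loop
    # and only sleep when gathering the stats itself was quick, since Docker
    # already blocks for a while internally while it samples the data.
    memory = 0.0
    while container_still_running?               # hypothetical helper
      before_stats = Time.zone.now
      stats = container.stats                    # stats read via the docker-api gem
      memory = peak_memory_mib(stats, memory)    # hypothetical helper from the sketch above
      # Skip the sleep if collecting the stats already took long enough.
      sleep(0.2) if Time.zone.now - before_stats < 0.2
    end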

@chvp merged commit 28383aa into main on Jul 4, 2023
12 of 13 checks passed
@chvp deleted the fix/memory-collection branch on Jul 4, 2023 at 12:53
@chvp temporarily deployed to naos on Jul 4, 2023 at 12:53 with GitHub Actions (inactive)
@chvp temporarily deployed to production on Jul 4, 2023 at 12:57 with GitHub Actions (inactive)
Labels: bug
3 participants