I have been testing an HTCondor cluster using containers. When a run finishes, I would like to transfer the results to a storage bucket so that I can access them on my local machine. However, the access point node does not seem to have write access to the storage bucket, and `gsutil cp` fails (it actually seems to crash). Although the documentation suggests that I should be able to grant access, I have not discovered how. Any help appreciated. Carl
Replies: 1 comment
@carlkross let's start with the job submit file that you are using. You can omit any private information, such as the precise URL of your non-public Docker container. HTCondor does have built-in support for staging data to and from Cloud Storage. It does require downloading a key to your directory and specifying it in the job submission. If you follow the link in the above docs to generating HMAC credentials for your bucket, you can add this information to your job and HTCondor will handle the transfer for you.
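As a rough illustration (not a drop-in solution — the bucket name, image, and key-file paths below are placeholders, and you should check the HTCondor manual's file-transfer section for the exact commands supported by your version), a submit file using HMAC credentials and a `gs://` output destination might look like this:

```
# Hypothetical HTCondor submit file sketch; all names/paths are placeholders.
universe                  = docker
docker_image              = example.io/myorg/myimage:latest
executable                = run.sh

should_transfer_files     = YES
when_to_transfer_output   = ON_EXIT

# Files containing the HMAC access key ID and secret generated for the bucket
gs_access_key_id_file     = $ENV(HOME)/.condor/gs_access_key_id
gs_secret_access_key_file = $ENV(HOME)/.condor/gs_secret_access_key

# Upload the job's output files to the bucket instead of the access point
output_destination        = gs://my-results-bucket/run-01/

log    = job.log
output = job.out
error  = job.err
queue
```

With something like this in place, the transfer happens as part of normal HTCondor file transfer, so the access point itself never needs `gsutil` or direct bucket permissions.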