
Testsuite hangs forever after test 232 #147

Closed
glaubitz opened this issue Dec 9, 2021 · 8 comments
Labels: bug (This issue is a bug.), p3 (This is a minor priority issue)

Comments


glaubitz commented Dec 9, 2021

I just packaged aws-c-auth for openSUSE and noticed that the testsuite hangs forever after test 232:

[   14s] 228/234 Test #234: sigv4_get_space_unnormalized_test .............................................   Passed    0.01 sec
[   15s] 229/234 Test  #36: credentials_provider_ecs_real_new_destroy .....................................   Passed    1.01 sec
[   15s] 230/234 Test  #43: credentials_provider_x509_real_new_destroy ....................................   Passed    1.01 sec
[   15s] 231/234 Test  #77: imds_client_multiple_resource_requests_random_responses_finally_all_success ...   Passed    1.06 sec
[   20s] 232/234 Test  #28: credentials_provider_imds_real_new_destroy ....................................   Passed    6.01 sec

anatol commented Jan 6, 2022

I see the same issue on Arch Linux:

 33/185 Test  #33: credentials_provider_ecs_basic_success ........................................   Passed    0.01 sec
        Start  34: credentials_provider_ecs_no_auth_token_success
 34/185 Test  #34: credentials_provider_ecs_no_auth_token_success ................................   Passed    0.01 sec
        Start  35: credentials_provider_ecs_success_multi_part_doc
 35/185 Test  #35: credentials_provider_ecs_success_multi_part_doc ...............................   Passed    0.00 sec
        Start  36: credentials_provider_ecs_real_new_destroy
 36/185 Test  #36: credentials_provider_ecs_real_new_destroy .....................................   Passed    1.20 sec
        Start  37: credentials_provider_x509_new_destroy
 37/185 Test  #37: credentials_provider_x509_new_destroy .........................................   Passed    0.06 sec
        Start  38: credentials_provider_x509_connect_failure
 38/185 Test  #38: credentials_provider_x509_connect_failure .....................................   Passed    0.02 sec
        Start  39: credentials_provider_x509_request_failure
 39/185 Test  #39: credentials_provider_x509_request_failure .....................................   Passed    0.02 sec
        Start  40: credentials_provider_x509_bad_document_failure
 40/185 Test  #40: credentials_provider_x509_bad_document_failure ................................   Passed    0.02 sec
        Start  41: credentials_provider_x509_basic_success
 41/185 Test  #41: credentials_provider_x509_basic_success .......................................   Passed    0.02 sec
        Start  42: credentials_provider_x509_success_multi_part_doc
 42/185 Test  #42: credentials_provider_x509_success_multi_part_doc ..............................   Passed    0.02 sec
        Start  43: credentials_provider_x509_real_new_destroy
 43/185 Test  #43: credentials_provider_x509_real_new_destroy ....................................***Failed    0.14 sec
        Start  44: credentials_provider_sts_web_identity_new_destroy_from_env

bretambrose (Contributor) commented:

That doesn't look like the same thing. What happens when you run the failing test individually and get log output?
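
A minimal sketch of doing that with CTest, using the test name from the log above (CTest's -R, -VV, and --output-on-failure options; adjust the name to whichever test fails for you):

```sh
# Re-run only the failing test, with verbose output and logs shown on failure.
ctest -R credentials_provider_x509_real_new_destroy --output-on-failure -VV
```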


anatol commented Jan 6, 2022

> That doesn't look like the same thing. What happens when you run the failing test individually and get log output?

Interesting: after I updated all aws-c-* dependencies to the latest versions, this test problem went away for me.

bretambrose (Contributor) commented:

At the moment, the aws-c-* libraries are not guaranteed to be ABI-compatible unless taken together as a snapshot. It feels like we keep hitting packaging/distribution issues here and there, and we need to lock that down, but I don't have any kind of ETA.

glaubitz (Author) commented Jan 6, 2022

@bretambrose I think the main issue that needs to be sorted out with the aws-c-* libraries is the non-standard CMake paths.

The proper path for cmake files is /usr/lib{,64}/cmake/${AWS_MODULE_PATH}, yet all aws-c-* packages install into /usr/lib{,64}/${AWS_MODULE_PATH}/cmake.

See awslabs/aws-c-event-stream#15 and awslabs/aws-c-common#844.
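
For illustration, a minimal sketch of the layout described above, assuming a GNUInstallDirs-based CMake install; the export, target, and file names below are placeholders, not the actual aws-c-* build files:

```cmake
# Hypothetical sketch: install the package's CMake config files under
# <libdir>/cmake/<package> (the standard layout) rather than
# <libdir>/<package>/cmake.
include(GNUInstallDirs)  # provides CMAKE_INSTALL_LIBDIR (lib or lib64)

# Assumes an earlier install(TARGETS ... EXPORT my-aws-lib-targets ...) call.
install(EXPORT my-aws-lib-targets                      # placeholder export name
        NAMESPACE AWS::
        DESTINATION "${CMAKE_INSTALL_LIBDIR}/cmake/my-aws-lib")

install(FILES "${CMAKE_CURRENT_BINARY_DIR}/my-aws-lib-config.cmake"  # placeholder config file
        DESTINATION "${CMAKE_INSTALL_LIBDIR}/cmake/my-aws-lib")
```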

bretambrose (Contributor) commented:

Understandable. I've skimmed all of those path/packaging issues, but I don't feel personally comfortable tweaking them, due to a lack of expertise and the $*%#-storm that will happen if the path changes break anything in how the libraries are built and consumed internally within Amazon's build systems and platforms. @JonathanHenson might be able to provide more knowledgeable support there.

JonathanHenson (Contributor) commented:

As long as the changes are all made at once, our build system won't care. The problem, of course, is synchronizing that and making sure all consumers rebuild everything when they build their dependency closure. Maybe we could bootstrap the find_package scripts to check both paths until the next minor version revision?
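
A rough, consumer-side sketch of that idea (illustrative only; aws-c-common and the /usr/lib64 libdir are just example values):

```cmake
# Hypothetical fallback: look for the package config in the standard location
# first, then in the current non-standard one, so either install layout is
# found during the transition period.
set(_aws_libdir "/usr/lib64")  # example libdir; adjust per distribution

find_package(aws-c-common CONFIG QUIET
    PATHS "${_aws_libdir}/cmake/aws-c-common")

if(NOT aws-c-common_FOUND)
    find_package(aws-c-common CONFIG REQUIRED
        PATHS "${_aws_libdir}/aws-c-common/cmake")
endif()
```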

yasminetalby added the bug label Jun 23, 2023
jmklix added the p3 label Oct 11, 2023
jmklix (Member) commented Feb 12, 2024

Please make sure that you are using the latest versions of all of the repos. If you still run into any testsuite problems, please open a new issue.

jmklix closed this as completed Feb 12, 2024