Getting panic on otherwise successful reconnect #256

Open
thosaa opened this issue Mar 16, 2023 · 6 comments

thosaa commented Mar 16, 2023

I am using the current master branch and I have a problem very similar to #243.

I get a panic: 'main' panicked at 'MessageQueue::send_message should never be called before make_request_channel'
This is due to hitting the expect at: https://github.com/locka99/opcua/blob/master/lib/src/client/message_queue.rs#L57
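For context, the panic comes from the expect on the queue's sender: the sender only exists once make_request_channel has been called for the current connection. A minimal sketch of that pattern (simplified, with assumed types and names; not the exact library code):

// Simplified sketch (assumed types/names) of the pattern behind the panic.
use tokio::sync::mpsc::{unbounded_channel, UnboundedReceiver, UnboundedSender};

type Message = String; // placeholder payload type for this sketch

struct MessageQueue {
    sender: Option<UnboundedSender<Message>>,
}

impl MessageQueue {
    fn make_request_channel(&mut self) -> UnboundedReceiver<Message> {
        // The sender is only set here, i.e. when a new connection is wired up.
        let (tx, rx) = unbounded_channel();
        self.sender = Some(tx);
        rx
    }

    fn send_message(&mut self, message: Message) {
        // Panics with the message quoted above if the channel was never made
        // (or was made on a different MessageQueue instance).
        let sender = self
            .sender
            .as_ref()
            .expect("MessageQueue::send_message should never be called before make_request_channel");
        let _ = sender.send(message);
    }
}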

When I unplug our real PLC from the network, wait a bit and then reconnect it, I get:

2023-03-16 14:46:41.744Z DEBUG opcua::client::message_queue - Request 21 was processed by the server
2023-03-16 14:46:53.743Z DEBUG opcua::client::session::session_state - async_publish, request sent with handle 22
2023-03-16 14:46:53.743Z DEBUG opcua::client::message_queue - Request 22 was processed by the server
2023-03-16 14:47:05.743Z DEBUG opcua::client::session::session_state - async_publish, request sent with handle 23
2023-03-16 14:47:05.743Z DEBUG opcua::client::message_queue - Request 23 was processed by the server
2023-03-16 14:47:14.593Z ERROR opcua::client::comms::tcp_transport - Read loop error Os { code: 104, kind: ConnectionReset, message: "Connection reset by peer" }
2023-03-16 14:47:14.593Z INFO  opcua::client::comms::tcp_transport - ReadState has dropped
2023-03-16 14:47:14.593Z INFO  opcua::client::comms::tcp_transport - WriteState has dropped
2023-03-16 14:47:14.593Z DEBUG opcua::client::comms::tcp_transport - Closing connection because the read loop terminated
2023-03-16 14:47:14.593Z DEBUG opcua::client::session::session_state - Session was closed with status = BadConnectionClosed
2023-03-16 14:47:14.603Z INFO  opcua::client::session_retry_policy - Retry retriggered by policy
2023-03-16 14:47:14.603Z INFO  opcua::client::session::session - Retrying to reconnect to server...
2023-03-16 14:47:14.603Z INFO  opcua::client::session::session - Connect
2023-03-16 14:47:14.604Z INFO  opcua::client::session::session - Security policy = Basic256Sha256
2023-03-16 14:47:14.604Z INFO  opcua::client::session::session - Security mode = SignAndEncrypt
2023-03-16 14:47:14.604Z DEBUG opcua::client::comms::tcp_transport - Creating a connection task to connect to 192.168.1.77:4840 with url opc.tcp://CX-61AEAE:4840/
2023-03-16 14:47:14.609Z DEBUG opcua::client::session::session - session:3 open_secure_channel
2023-03-16 14:47:14.609Z INFO  opcua::client::session::session_state - Making secure channel request
2023-03-16 14:47:14.609Z INFO  opcua::client::session::session_state - security_mode = SignAndEncrypt
2023-03-16 14:47:14.609Z INFO  opcua::client::session::session_state - security_policy = Basic256Sha256
thread 'main' panicked at 'MessageQueue::send_message should never be called before make_request_channel', /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/message_queue.rs:57:14
stack backtrace:
   0: rust_begin_unwind
             at /rustc/2c8cc343237b8f7d5a3c3703e3a87f2eb2c54a74/library/std/src/panicking.rs:575:5
   1: core::panicking::panic_fmt
             at /rustc/2c8cc343237b8f7d5a3c3703e3a87f2eb2c54a74/library/core/src/panicking.rs:64:14
   2: core::panicking::panic_display
             at /rustc/2c8cc343237b8f7d5a3c3703e3a87f2eb2c54a74/library/core/src/panicking.rs:147:5
   3: core::panicking::panic_str
             at /rustc/2c8cc343237b8f7d5a3c3703e3a87f2eb2c54a74/library/core/src/panicking.rs:131:5
   4: core::option::expect_failed
             at /rustc/2c8cc343237b8f7d5a3c3703e3a87f2eb2c54a74/library/core/src/option.rs:1924:5
   5: core::option::Option<T>::expect
             at /rustc/2c8cc343237b8f7d5a3c3703e3a87f2eb2c54a74/library/core/src/option.rs:786:21
   6: opcua::client::message_queue::MessageQueue::send_message
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/message_queue.rs:56:22
   7: opcua::client::message_queue::MessageQueue::add_request
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/message_queue.rs:80:17
   8: opcua::client::session::session_state::SessionState::add_request
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session_state.rs:408:9
   9: opcua::client::session::session_state::SessionState::async_send_request
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session_state.rs:364:9
  10: opcua::client::session::session_state::SessionState::send_request
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session_state.rs:321:30
  11: opcua::client::session::session_state::SessionState::issue_or_renew_secure_channel
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session_state.rs:456:24
  12: <opcua::client::session::session::Session as opcua::client::session::services::SecureChannelService>::open_secure_channel
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session.rs:1305:9
  13: opcua::client::session::session::Session::connect_no_retry
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session.rs:531:13
  14: opcua::client::session::session::Session::reconnect_and_activate
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session.rs:304:13
  15: opcua::client::session::session::Session::poll::{{closure}}
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session.rs:729:24
  16: opcua::client::session::session::Session::session_task::{{closure}}::{{closure}}
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session.rs:632:39
  17: opcua::client::session::session::Session::session_task::{{closure}}::{{closure}}
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/macros/select.rs:505:49
  18: <tokio::future::poll_fn::PollFn<F> as core::future::future::Future>::poll
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/future/poll_fn.rs:38:9
  19: opcua::client::session::session::Session::session_task::{{closure}}
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session.rs:625:9
  20: opcua::client::session::session::Session::run_loop::{{closure}}
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session.rs:678:64
  21: <core::pin::Pin<P> as core::future::future::Future>::poll
             at /rustc/2c8cc343237b8f7d5a3c3703e3a87f2eb2c54a74/library/core/src/future/future.rs:125:9
  22: tokio::runtime::basic_scheduler::Inner<P>::block_on::{{closure}}::{{closure}}
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/runtime/basic_scheduler.rs:213:62
  23: tokio::coop::with_budget::{{closure}}
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/coop.rs:106:9
  24: std::thread::local::LocalKey<T>::try_with
             at /rustc/2c8cc343237b8f7d5a3c3703e3a87f2eb2c54a74/library/std/src/thread/local.rs:446:16
  25: std::thread::local::LocalKey<T>::with
             at /rustc/2c8cc343237b8f7d5a3c3703e3a87f2eb2c54a74/library/std/src/thread/local.rs:422:9
  26: tokio::coop::with_budget
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/coop.rs:99:5
  27: tokio::coop::budget
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/coop.rs:76:5
  28: tokio::runtime::basic_scheduler::Inner<P>::block_on::{{closure}}
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/runtime/basic_scheduler.rs:213:39
  29: tokio::runtime::basic_scheduler::enter::{{closure}}
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/runtime/basic_scheduler.rs:323:29
  30: tokio::macros::scoped_tls::ScopedKey<T>::set
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/macros/scoped_tls.rs:61:9
  31: tokio::runtime::basic_scheduler::enter
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/runtime/basic_scheduler.rs:323:5
  32: tokio::runtime::basic_scheduler::Inner<P>::block_on
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/runtime/basic_scheduler.rs:202:9
  33: tokio::runtime::basic_scheduler::InnerGuard<P>::block_on
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/runtime/basic_scheduler.rs:516:9
  34: tokio::runtime::basic_scheduler::BasicScheduler<P>::block_on
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/runtime/basic_scheduler.rs:162:24
  35: tokio::runtime::Runtime::block_on
             at /home/ts/.cargo/registry/src/github.com-1ecc6299db9ec823/tokio-1.8.4/src/runtime/mod.rs:450:46
  36: opcua::client::session::session::Session::run_loop
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session.rs:687:9
  37: opcua::client::session::session::Session::run
             at /home/ts/.cargo/git/checkouts/opcua-49af4c4898b3317f/ff2d148/lib/src/client/session/session.rs:584:9
 ...

thosaa commented Mar 24, 2023

Just a follow-up.

It seems that there are some session issues: MessageQueue::make_request_channel is called, but on the old session instead of the new one...

I have added a UUID to the MessageQueue, for debugging purposes only, to be able to track the instances.
I also added debug logs starting with ###.
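Roughly, the instrumentation looks like this (a sketch with assumed names, not the exact patch):

// Sketch of the debug instrumentation described above (assumed names, not the exact patch).
// A UUID is attached to each MessageQueue so individual instances can be tracked in the logs.
use uuid::Uuid;

#[derive(Debug)]
struct MessageQueue {
    mq_id: Uuid,
    // ... the real struct also holds inflight_requests, responses, sender, etc.
}

impl MessageQueue {
    fn new() -> Self {
        log::debug!("### MessageQueue new");
        MessageQueue { mq_id: Uuid::new_v4() }
    }

    fn send_message(&self) {
        // Dump the whole queue, including mq_id, before the existing send logic runs.
        log::debug!("### send_message self = {:#?}", self);
        // ... existing send logic ...
    }
}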

2023-03-24 08:46:44.260Z DEBUG opcua::client::session::session_state - async_publish is acknowledging subscription acknowledgements with sequence nrs [3]
2023-03-24 08:46:44.260Z DEBUG opcua::client::message_queue - ### send_message self = MessageQueue {
    inflight_requests: {
        13: None,
    },
    responses: {},
    sender: Some(
        UnboundedSender {
            chan: Tx {
                inner: Chan {
                    tx: Tx {
                        block_tail: 0x00007f53240008e0,
                        tail_position: 12,
                    },
                    semaphore: 0,
                    rx_waker: AtomicWaker,
                    tx_count: 1,
                    rx_fields: "...",
                },
            },
        },
    ),
    mq_id: 8bbeb5b0-adfe-4c8a-ab82-9d268076334a,
}
2023-03-24 08:46:44.260Z DEBUG opcua::client::session::session_state - async_publish, request sent with handle 13
2023-03-24 08:46:44.260Z DEBUG opcua::client::message_queue - Request 13 was processed by the server

///////// Here the connection was interrupted (Ethernet disconnected from PLC)  /////////

2023-03-24 08:46:56.469Z DEBUG opcua::client::message_queue - ### send_message self = MessageQueue {
    inflight_requests: {
        13: None,
        14: None,
    },
    responses: {},
    sender: Some(
        UnboundedSender {
            chan: Tx {
                inner: Chan {
                    tx: Tx {
                        block_tail: 0x00007f53240008e0,
                        tail_position: 13,
                    },
                    semaphore: 0,
                    rx_waker: AtomicWaker,
                    tx_count: 1,
                    rx_fields: "...",
                },
            },
        },
    ),
    mq_id: 8bbeb5b0-adfe-4c8a-ab82-9d268076334a,
}
2023-03-24 08:46:56.469Z DEBUG opcua::client::session::session_state - async_publish, request sent with handle 14
2023-03-24 08:46:56.470Z DEBUG opcua::client::message_queue - Request 14 was processed by the server

///////// Here the connection was re-established (Ethernet cable reconnected)  /////////

2023-03-24 08:47:04.157Z ERROR opcua::client::comms::tcp_transport - Read loop error Os { code: 104, kind: ConnectionReset, message: "Connection reset by peer" }
2023-03-24 08:47:04.157Z INFO  opcua::client::comms::tcp_transport - ReadState has dropped
2023-03-24 08:47:04.157Z INFO  opcua::client::comms::tcp_transport - WriteState has dropped
2023-03-24 08:47:04.157Z DEBUG opcua::client::comms::tcp_transport - Closing connection because the read loop terminated
2023-03-24 08:47:04.158Z DEBUG opcua::client::session::session_state - Session was closed with status = BadConnectionClosed
2023-03-24 08:47:04.159Z INFO  opcua::client::session_retry_policy - Retry retriggered by policy
2023-03-24 08:47:04.159Z INFO  opcua::client::session::session - Retrying to reconnect to server...
2023-03-24 08:47:04.159Z DEBUG opcua::client::message_queue - ### MessageQueue new
2023-03-24 08:47:04.159Z DEBUG opcua::client::session::session_state - ### SessionState new has id: 3, and messag_queue id: e3ee6e23-bb8c-433d-a8c5-cdf72f695f49
2023-03-24 08:47:04.159Z INFO  opcua::client::session::session - Connect
2023-03-24 08:47:04.159Z INFO  opcua::client::session::session - Security policy = Basic256Sha256
2023-03-24 08:47:04.159Z INFO  opcua::client::session::session - Security mode = SignAndEncrypt
2023-03-24 08:47:04.160Z DEBUG opcua::client::comms::tcp_transport - Creating a connection task to connect to 192.168.1.77:4840 with url opc.tcp://CX-61AEAE:4840/
2023-03-24 08:47:04.161Z DEBUG opcua::client::comms::tcp_transport - ### WriteState new was hit, message_queue id: 8bbeb5b0-adfe-4c8a-ab82-9d268076334a
2023-03-24 08:47:04.161Z DEBUG opcua::client::message_queue - ### make_request_channel self = MessageQueue {
    inflight_requests: {},
    responses: {},
    sender: Some(
        UnboundedSender {
            chan: Tx {
                inner: Chan {
                    tx: Tx {
                        block_tail: 0x00007f5324019ad0,
                        tail_position: 0,
                    },
                    semaphore: 0,
                    rx_waker: AtomicWaker,
                    tx_count: 2,
                    rx_fields: "...",
                },
            },
        },
    ),
    mq_id: 8bbeb5b0-adfe-4c8a-ab82-9d268076334a,
}
2023-03-24 08:47:04.165Z DEBUG opcua::client::session::session - session:3 open_secure_channel
2023-03-24 08:47:04.165Z INFO  opcua::client::session::session_state - Making secure channel request
2023-03-24 08:47:04.165Z INFO  opcua::client::session::session_state - security_mode = SignAndEncrypt
2023-03-24 08:47:04.165Z INFO  opcua::client::session::session_state - security_policy = Basic256Sha256
2023-03-24 08:47:04.165Z DEBUG opcua::client::message_queue - ### send_message self = MessageQueue {
    inflight_requests: {
        1: Some(
            SyncSender { .. },
        ),
    },
    responses: {},
    sender: None,
    mq_id: e3ee6e23-bb8c-433d-a8c5-cdf72f695f49,
}
thread 'main' panicked at 'MessageQueue::send_message should never be called before make_request_channel', .../src/client/message_queue.rs:66:14

As the ### make_request_channel log shows, the function is called and it holds a sender, but the mq_id is the same as before the connection was interrupted. In other words, the request channel is made on the old MessageQueue instance, while the new session's MessageQueue (the one send_message panics on) never gets its sender set.

I will try to get a better understanding of the sessions and get back.


thosaa commented Mar 24, 2023

I fixed the panic with master...thosaa:opcua:dont-panic-on-reconnect
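The idea of the change is roughly the following (a sketch only, not the exact diff; the StatusCode variant is an assumption):

// Sketch of the dont-panic-on-reconnect idea: return an error instead of panicking
// when the sender is missing, so the caller can fail gracefully and retry.
use tokio::sync::mpsc::UnboundedSender;

type Message = String; // placeholder for the real request type

#[derive(Debug)]
enum StatusCode { BadCommunicationError } // stand-in for opcua's StatusCode

struct MessageQueue {
    sender: Option<UnboundedSender<Message>>,
}

impl MessageQueue {
    fn send_message(&self, message: Message) -> Result<(), StatusCode> {
        match self.sender.as_ref() {
            Some(sender) => {
                let _ = sender.send(message);
                Ok(())
            }
            None => {
                log::error!("send_message called before make_request_channel");
                Err(StatusCode::BadCommunicationError)
            }
        }
    }
}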
But the code just stalls afterwards...

2023-03-24 10:48:46.679Z DEBUG opcua::client::session::session_state - ns=1;g=78589b75-5023-7741-b207-90801cb0ed85 Processing 1 async messages                                         
2023-03-24 10:48:46.679Z DEBUG opcua::client::session::session_state - ns=1;g=78589b75-5023-7741-b207-90801cb0ed85 handle_async_response                                               
2023-03-24 10:48:46.679Z DEBUG opcua::client::session::session_state - ns=1;g=78589b75-5023-7741-b207-90801cb0ed85 PublishResponse                                                     
2023-03-24 10:48:46.679Z DEBUG opcua::client::session::session_state - ns=1;g=78589b75-5023-7741-b207-90801cb0ed85 Received notifications, data changes = 1, events = 0                

///////// Storing the data from the PLC to the database /////////

2023-03-24 10:48:46.679Z DEBUG opcua::client::session::session_state - async_publish is acknowledging subscription acknowledgements with sequence nrs [7]                              
2023-03-24 10:48:46.679Z DEBUG opcua::client::session::session_state - async_publish, request sent with handle 17                                                                      
2023-03-24 10:48:46.680Z DEBUG opcua::client::message_queue - Request 17 was processed by the server  

////////// Disconnecting the PLC /////////

2023-03-24 10:48:59.118Z DEBUG opcua::client::session::session_state - async_publish, request sent with handle 18
2023-03-24 10:48:59.119Z DEBUG opcua::client::message_queue - Request 18 was processed by the server

////////// Reconnecting the PLC /////////

2023-03-24 10:49:06.579Z ERROR opcua::client::comms::tcp_transport - Read loop error Os { code: 104, kind: ConnectionReset, message: "Connection reset by peer" }
2023-03-24 10:49:06.579Z INFO  opcua::client::comms::tcp_transport - ReadState has dropped
2023-03-24 10:49:06.579Z INFO  opcua::client::comms::tcp_transport - WriteState has dropped
2023-03-24 10:49:06.579Z DEBUG opcua::client::comms::tcp_transport - Closing connection because the read loop terminated
2023-03-24 10:49:06.580Z DEBUG opcua::client::session::session_state - Session was closed with status = BadConnectionClosed
2023-03-24 10:49:06.589Z INFO  opcua::client::session_retry_policy - Retry retriggered by policy
2023-03-24 10:49:06.589Z INFO  opcua::client::session::session - Retrying to reconnect to server...
2023-03-24 10:49:06.589Z INFO  opcua::client::session::session - Connect
2023-03-24 10:49:06.590Z INFO  opcua::client::session::session - Security policy = Basic256Sha256
2023-03-24 10:49:06.590Z INFO  opcua::client::session::session - Security mode = SignAndEncrypt
2023-03-24 10:49:06.590Z DEBUG opcua::client::comms::tcp_transport - Creating a connection task to connect to 192.168.1.77:4840 with url opc.tcp://CX-61AEAE:4840/
2023-03-24 10:49:06.595Z DEBUG opcua::client::session::session - session:3 open_secure_channel
2023-03-24 10:49:06.595Z INFO  opcua::client::session::session_state - Making secure channel request
2023-03-24 10:49:06.595Z INFO  opcua::client::session::session_state - security_mode = SignAndEncrypt
2023-03-24 10:49:06.595Z INFO  opcua::client::session::session_state - security_policy = Basic256Sha256
2023-03-24 10:49:06.595Z DEBUG opcua::core::comms::secure_channel - AsymmetricSecurityHeader = AsymmetricSecurityHeader { security_policy_uri: UAString { value: Some("http://opcfoundation.org/UA/SecurityPolicy#Basic256Sha256") }, sender_certificate: ByteString { value: Some([48, 130, 3, 199, ...]) }, receiver_certificate_thumbprint: ByteString { value: Some([223, 223, 227, 199, 30, 19, 120, 214, 53, 251, 134, 113, 11, 116, 43, 32, 74, 164, 125, 96]) } }
2023-03-24 10:49:06.596Z DEBUG opcua::client::message_queue - Request 1 was processed by the server
2023-03-24 10:49:07.757Z DEBUG opcua::client::message_queue - Response to Request 1 has been stored
2023-03-24 10:49:07.757Z DEBUG opcua::client::session::session_state - Setting transport's security token
2023-03-24 10:49:07.758Z DEBUG opcua::client::session::session - session:3 Endpoint policy = Some(UserTokenPolicy { policy_id: UAString { value: Some("Anonymous-[0]-SignAndEncrypt-Basic256Sha256") }, token_type: Anonymous, issued_token_type: UAString { value: None }, issuer_endpoint_url: UAString { value: None }, security_policy_uri: UAString { value: None } })
2023-03-24 10:49:07.760Z DEBUG opcua::client::message_queue - Request 2 was processed by the server
2023-03-24 10:49:07.762Z DEBUG opcua::client::message_queue - Response to Request 2 has been stored
2023-03-24 10:49:07.762Z ERROR opcua::client - Received a service fault of BadSessionIdInvalid for the request
2023-03-24 10:49:07.762Z INFO  opcua::client::session::session - Session activation failed on reconnect, error = BadSessionIdInvalid, so creating a new session
2023-03-24 10:49:07.762Z DEBUG opcua::client::session::session - session:3 create_session
2023-03-24 10:49:07.763Z DEBUG opcua::client::session::session - session:3 CreateSessionRequest = CreateSessionRequest { request_header: RequestHeader { authentication_token: NodeId { namespace: 0, identifier: Numeric(0) }, timestamp: DateTime { date_time: 2023-03-24T10:49:07.763119Z }, request_handle: 1, return_diagnostics: (empty), audit_entry_id: UAString { value: None }, timeout_hint: 10000, additional_header: ExtensionObject { node_id: NodeId { namespace: 0, identifier: Numeric(0) }, body: None } }, client_description: ApplicationDescription { application_uri: UAString { value: Some("urn:Discovery") }, product_uri: UAString { value: Some("urn:CdgsOpcuaClient") }, application_name: LocalizedText { locale: UAString { value: Some("") }, text: UAString { value: Some("CDGs OPC UA Client") } }, application_type: Client, gateway_server_uri: UAString { value: None }, discovery_profile_uri: UAString { value: None }, discovery_urls: None }, server_uri: UAString { value: None }, endpoint_url: UAString { value: Some("opc.tcp://CX-61AEAE:4840/") }, session_name: UAString { value: Some("Rust OPC UA Client") }, client_nonce: ByteString { value: Some([46, 222, 26, ...) }, requested_session_timeout: 1.7976931348623157e308, max_response_message_size: 0 }
2023-03-24 10:49:07.763Z DEBUG opcua::client::message_queue - Request 1 was processed by the server
2023-03-24 10:49:08.336Z DEBUG opcua::client::message_queue - Response to Request 1 has been stored
2023-03-24 10:49:08.336Z DEBUG opcua::crypto::certificate_store - Validating cert with name on disk TcOpcUaServer@CX-61AEAE [dfdfe3c71e1378d635fb86710b742b204aa47d60].der
2023-03-24 10:49:08.336Z INFO  opcua::crypto::x509 - Certificate is valid for this time
2023-03-24 10:49:08.336Z INFO  opcua::crypto::x509 - Certificate host name CX-61AEAE is good
2023-03-24 10:49:08.336Z INFO  opcua::crypto::x509 - Certificate application uri urn:BeckhoffAutomation:TcOpcUaServer is good
2023-03-24 10:49:08.336Z DEBUG opcua::client::session::session - session:3 Revised session timeout is 3600000
2023-03-24 10:49:08.337Z DEBUG opcua::client::session::session - session:3 spawn_session_activity_task(3600000)
2023-03-24 10:49:08.337Z DEBUG opcua::client::session::session - session:3 session timeout is 3600000, activity timer is 2700000

2023-03-24 10:55:21.231Z DEBUG opcua::client::comms::tcp_transport - Read loop finished, connection state = Processing
2023-03-24 10:55:21.231Z INFO  opcua::client::comms::tcp_transport - ReadState has dropped
2023-03-24 10:55:21.231Z INFO  opcua::client::comms::tcp_transport - WriteState has dropped
2023-03-24 10:55:21.231Z DEBUG opcua::client::comms::tcp_transport - Closing connection because the read loop terminated
2023-03-24 10:55:21.231Z DEBUG opcua::client::session::session_state - Session was closed with status = Good

I was expecting the code to resume the previous subscription; what am I missing?


sgoll commented Jun 27, 2023

FWIW: I am experiencing exactly the same problem after a reconnect (after a restart of the server).


abdelhaksam commented Jul 13, 2023

@thosaa it looks like the code stalls because of a mutex problem in the spawn_session_activity_task method (line 791): the lock cannot be acquired.
Below are my debug logs for your code, with more verbose logging.

2023-07-13T13:06:50.164390100+01:00 INFO opcua::client::session::session - Session activation failed on reconnect, error = BadSessionIdInvalid, so creating a new session
Thread ThreadId(12), self.session_state write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 314
Thread ThreadId(12), self.session_state write lock completed
Thread ThreadId(12), self.message_queue write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session_state.rs, line 366
Thread ThreadId(12), self.message_queue write lock completed
2023-07-13T13:06:50.166402600+01:00 DEBUG opcua::client::session::session - session:3 create_session
Creating session
Thread ThreadId(12), self.secure_channel read locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 1335
Thread ThreadId(12), self.secure_channel read lock completed
Thread ThreadId(12), self.certificate_store write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 1343
Thread ThreadId(12), self.certificate_store write lock completed
Thread ThreadId(12), self.session_retry_policy locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 1356
Thread ThreadId(12), self.session_retry_policy lock completed
Thread ThreadId(12), self.session_state write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 1214
Thread ThreadId(12), self.session_state write lock completed
2023-07-13T13:06:50.228234200+01:00 DEBUG opcua::client::session::session - session:3 CreateSessionRequest = CreateSessionRequest { request_header: RequestHeader { authentication_token: NodeId { namespace: 0, identifier: Numeric(0) }, timestamp: DateTime { date_time: 2023-07-13T12:06:50.227976800Z }, request_handle: 1, return_diagnostics: (empty), audit_entry_id: UAString { value: None }, timeout_hint: 10000, additional_header: ExtensionObject { node_id: NodeId { namespace: 0, identifier: Numeric(0) }, body: None } }, client_description: ApplicationDescription { application_uri: UAString { value: Some("urn:digicrop") }, product_uri: UAString { value: Some("urn:digicrop") }, application_name: LocalizedText { locale: UAString { value: Some("") }, text: UAString { value: Some("DigiCrop") } }, application_type: Client, gateway_server_uri: UAString { value: None }, discovery_profile_uri: UAString { value: None }, discovery_urls: None }, server_uri: UAString { value: None }, endpoint_url: UAString { value: Some("opc.tcp://localhost:48040/") }, session_name: UAString { value: Some("Rust OPC UA Client") }, client_nonce: ByteString { value: None }, client_certificate: ByteString { value: Some([48, 130, 3, 168, 48, 130, 2, 144, 160, 3, 2, 1, 2, 2, 16, 42, 127, 142, 9, 147, 150, 13, 182, 9, 117, 101, 99, 240, 51, 94, 227, 48, 13, 6, 9, 42, 134, 72, 134, 247, 13, 1, 1, 11, 5, 0, 48, 87, 49, 17, 48, 15, 6, 3, 85, 4, 3, 12, 8, 68, 105, 103, 105, 67, 114, 111, 112, 49, 17, 48, 15, 6, 3, 85, 4, 10, 12, 8, 68, 105, 103, 105, 67, 114, 111, 112, 49, 17, 48, 15, 6, 3, 85, 4, 11, 12, 8, 68, 105, 103, 105, 67, 114, 111, 112, 49, 11, 48, 9, 6, 3, 85, 
4, 6, 19, 2, 73, 69, 49, 15, 48, 13, 6, 3, 85, 4, 8, 12, 6, 68, 117, 98, 108, 105, 110, 48, 30, 23, 13, 50, 51, 48, 55, 48, 56, 48, 56, 49, 54, 48, 56, 90, 23, 13, 50, 52, 48, 55, 48, 55, 48, 56, 49, 54, 48, 56, 90, 48, 87, 49, 17, 48, 15, 6, 3, 85, 4, 3, 12, 8, 68, 105, 103, 105, 67, 114, 111, 112, 49, 17, 48, 15, 6, 3, 85, 4, 10, 12, 8, 68, 105, 103, 105, 67, 114, 111, 112, 49, 17, 48, 15, 6, 3, 85, 4, 11, 12, 8, 68, 105, 103, 105, 67, 114, 111, 112, 49, 11, 48, 9, 6, 3, 85, 4, 6, 19, 2, 73, 69, 49, 15, 48, 13, 6, 3, 85, 4, 8, 12, 6, 68, 117, 98, 108, 105, 110, 48, 130, 1, 34, 48, 13, 6, 9, 42, 134, 72, 134, 247, 13, 1, 1, 1, 5, 0, 3, 130, 1, 15, 0, 48, 130, 1, 10, 2, 130, 1, 1, 0, 192, 44, 241, 95, 188, 77, 106, 146, 78, 218, 213, 178, 17, 161, 103, 221, 198, 180, 10, 224, 13, 37, 45, 208, 23, 108, 79, 207, 120, 170, 201, 142, 195, 63, 182, 189, 240, 51, 209, 12, 12, 107, 64, 67, 151, 10, 221, 125, 90, 247, 47, 33, 4, 7, 52, 13, 60, 145, 253, 184, 195, 125, 22, 228, 46, 96, 11, 209, 180, 50, 176, 137, 245, 179, 222, 
178, 141, 104, 91, 187, 114, 255, 197, 42, 255, 169, 61, 123, 92, 228, 249, 126, 82, 130, 14, 8, 211, 39, 200, 126, 180, 9, 47, 238, 52, 11, 162, 32, 182, 29, 219, 249, 89, 162, 208, 17, 163, 21, 73, 238, 128, 200, 135, 105, 218, 121, 61, 184, 228, 137, 178, 93, 34, 98, 105, 182, 216, 20, 117, 255, 103, 233, 166, 132, 52, 65, 72, 30, 145, 172, 113, 197, 198, 97, 243, 37, 116, 154, 27, 192, 51, 253, 57, 207, 110, 112, 224, 154, 90, 129, 170, 161, 33, 183, 215, 101, 99, 45, 44, 89, 155, 0, 32, 2, 101, 169, 188, 85, 98, 154, 183, 252, 86, 54, 127, 195, 228, 22, 130, 239, 227, 174, 231, 230, 251, 132, 184, 247, 196, 193, 182, 173, 174, 97, 100, 175, 238, 74, 116, 164, 45, 194, 29, 120, 36, 153, 133, 217, 144, 68, 17, 203, 157, 124, 69, 53, 37, 189, 156, 147, 183, 79, 174, 155, 161, 118, 242, 72, 233, 152, 59, 141, 86, 214, 245, 245, 2, 3, 1, 0, 1, 163, 112, 48, 110, 48, 11, 6, 3, 85, 29, 15, 4, 4, 3, 2, 2, 244, 48, 29, 6, 3, 85, 29, 37, 4, 22, 48, 20, 6, 8, 43, 6, 1, 5, 5, 7, 3, 2, 6, 8, 43, 6, 1, 5, 5, 7, 3, 1, 48, 64, 6, 
3, 85, 29, 17, 4, 57, 48, 55, 134, 12, 117, 114, 110, 58, 100, 105, 103, 105, 99, 114, 111, 112, 130, 15, 68, 69, 83, 75, 84, 79, 80, 45, 73, 78, 56, 73, 50, 78, 78, 135, 4, 192, 168, 31, 178, 135, 16, 254, 
128, 0, 0, 0, 0, 0, 0, 154, 194, 188, 188, 202, 137, 33, 147, 48, 13, 6, 9, 42, 134, 72, 134, 247, 13, 1, 1, 11, 5, 0, 3, 130, 1, 1, 0, 189, 201, 50, 160, 127, 233, 187, 237, 178, 16, 183, 72, 21, 220, 214, 
169, 231, 208, 9, 170, 140, 47, 199, 124, 24, 81, 235, 86, 41, 154, 199, 231, 135, 49, 185, 83, 249, 134, 148, 84, 24, 197, 103, 158, 39, 156, 122, 89, 4, 129, 224, 215, 32, 141, 198, 8, 85, 92, 138, 235, 70, 216, 247, 137, 53, 140, 57, 199, 204, 107, 125, 93, 217, 158, 133, 49, 118, 112, 44, 178, 82, 230, 243, 67, 182, 187, 205, 18, 252, 162, 38, 117, 79, 201, 171, 9, 230, 250, 178, 221, 98, 206, 135, 25, 91, 
125, 217, 210, 218, 70, 218, 87, 24, 138, 78, 107, 89, 87, 85, 60, 87, 8, 122, 161, 224, 203, 59, 235, 224, 153, 154, 174, 124, 201, 183, 199, 177, 100, 172, 93, 80, 172, 99, 11, 84, 254, 154, 240, 218, 119, 203, 80, 217, 83, 121, 93, 19, 148, 54, 165, 38, 78, 41, 205, 209, 210, 135, 21, 50, 49, 129, 147, 95, 79, 5, 108, 143, 155, 50, 7, 29, 195, 175, 66, 78, 133, 114, 243, 26, 211, 184, 194, 251, 10, 251, 109, 103, 62, 188, 15, 112, 190, 94, 57, 148, 182, 65, 59, 248, 149, 54, 146, 166, 218, 241, 98, 26, 220, 225, 173, 106, 104, 253, 195, 180, 21, 53, 15, 93, 85, 54, 231, 127, 48, 70, 207, 208, 34, 202, 224, 244, 75, 161, 224, 138, 140, 181, 135, 155, 90, 180, 195, 91, 223, 138, 184]) }, requested_session_timeout: 1.7976931348623157e308, max_response_message_size: 0 }
Thread ThreadId(12), self.session_state write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 1223
Thread ThreadId(12), self.session_state write lock completed
Thread ThreadId(12), self.secure_channel read locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session_state.rs, line 444
Thread ThreadId(12), self.secure_channel read lock completed
Thread ThreadId(12), self.message_queue write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session_state.rs, line 437
Thread ThreadId(12), self.message_queue write lock completed
Thread ThreadId(19), self.secure_channel read locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\comms\tcp_transport.rs, line 245
Thread ThreadId(19), self.secure_channel read lock completed
Thread ThreadId(19), write_state.message_queue write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\comms\tcp_transport.rs, line 558
Thread ThreadId(19), write_state.message_queue write lock completed
2023-07-13T13:06:50.250679+01:00 DEBUG opcua::client::message_queue - Request 1 was processed by the server
Thread ThreadId(19), self.secure_channel write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\comms\tcp_transport.rs, line 111
Thread ThreadId(19), self.secure_channel write lock completed
Thread ThreadId(19), self.secure_channel read locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\comms\tcp_transport.rs, line 115
Thread ThreadId(19), self.secure_channel read lock completed
Thread ThreadId(19), self.secure_channel read locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\comms\tcp_transport.rs, line 95
Thread ThreadId(19), self.secure_channel read lock completed
Thread ThreadId(19), depth_gauge locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\types\encoding.rs, line 48
Thread ThreadId(19), depth_gauge lock completed
Thread ThreadId(19), self.depth_gauge locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\types\encoding.rs, line 33
Thread ThreadId(19), self.depth_gauge lock completed
Thread ThreadId(19), read_state.message_queue write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\comms\tcp_transport.rs, line 496
Thread ThreadId(19), read_state.message_queue write lock completed
2023-07-13T13:06:50.262159600+01:00 DEBUG opcua::client::message_queue - Response to Request 1 has been stored
Thread ThreadId(12), self.session_state write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 1379
Thread ThreadId(12), self.session_state write lock completed
Thread ThreadId(12), self.secure_channel write locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 1383
Thread ThreadId(12), self.secure_channel write lock completed
Thread ThreadId(12), self.secure_channel read locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 1055
Thread ThreadId(12), self.secure_channel read lock completed
2023-07-13T13:06:50.269882700+01:00 DEBUG opcua::client::session::session - session:3 Revised session timeout is 3600000
Before spawn session activity task
2023-07-13T13:06:50.270979700+01:00 DEBUG opcua::client::session::session - session:3 spawn_session_activity_task(3600000)
Thread ThreadId(12), self.session_state read locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 771
Thread ThreadId(12), self.session_state read lock completed
2023-07-13T13:06:50.272491900+01:00 DEBUG opcua::client::session::session - session:3 session timeout is 3600000, activity timer is 2700000
After session timeout debug statement
session-activity-thread-ThreadId(12)
Thread ThreadId(12), self.runtime locking at C:\Users\FUJITSU-RMB\.cargo\git\checkouts\_empty-31ce7918aa1c8808\8425fbc\lib\src\client\session\session.rs, line 791
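For context, the stall these lock traces point at is essentially a re-entrant lock attempt: run_loop is still holding the runtime lock while driving the session task, and the activity task tries to take the same lock again. A minimal illustration of that pattern (assumed structure, not the actual opcua code):

// Minimal illustration of the stall: the outer loop holds a mutex around the runtime
// while block_on() drives the task, and code inside that task tries to lock the same
// mutex again. A std::sync::Mutex is not re-entrant, so the second lock() blocks
// forever (or panics).
use std::sync::{Arc, Mutex};
use tokio::runtime::Runtime;

fn main() {
    let runtime = Arc::new(Mutex::new(Runtime::new().unwrap()));

    let guard = runtime.lock().unwrap(); // lock taken by the outer run_loop
    let runtime2 = Arc::clone(&runtime);
    guard.block_on(async move {
        // ... session work, reconnect, then the activity task tries:
        let _again = runtime2.lock().unwrap(); // never succeeds, the program stalls here
    });
}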

@abdelhaksam

The run_loop method didn't release the lock on the session runtime. I made a patch by running the session task on a fresh tokio runtime instead here. It works now, but still requires more testing.
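The patched approach is roughly the following (a sketch under assumptions, not the exact patch): drive the task from a fresh, dedicated runtime on its own thread, so the Session's locked runtime is never needed.

// Sketch of the workaround described above (assumed shape, not the exact patch):
// run the task on a dedicated tokio runtime in its own thread instead of the
// runtime stored (and still locked) inside the Session.
use std::thread;
use tokio::runtime::Builder;

fn run_on_fresh_runtime<F>(task: F)
where
    F: std::future::Future<Output = ()> + Send + 'static,
{
    thread::spawn(move || {
        // A current-thread runtime that is independent of the Session's runtime mutex.
        let rt = Builder::new_current_thread()
            .enable_all()
            .build()
            .expect("failed to build runtime");
        rt.block_on(task);
    });
}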

@djanderson

@locka99 is this planned to be resolved in 0.13? There seem to be several issues related to this that are fairly well hashed out, and even some PRs with potential fixes.

It seems like a pretty big issue, since I imagine most people are trying to use opcua within some other application (a GUI, a web server, or whatever), and there's a reasonable chance that the existing application already has its own async runtime.

Just trying to gauge whether the presented fixes are acceptable and will be rolled in soon, or if I should look to run a patched fork, or look for alternatives, etc. Thanks!
