* Core creates a protobufs Hello with counter set to the uint32 represented by the most significant 4 bytes of the IV, encrypts the protobufs Hello with AES, and sends the ciphertext to Server.
* Server reads protobufs Hello from socket, taking note of counter. Each subsequent message received from Core must have the counter incremented by 1. After the max uint32, the next message should set the counter to zero.
* Server creates protobufs Hello with counter set to a random uint32, encrypts the protobufs Hello with AES, and sends the ciphertext to Core.
* Core reads protobufs Hello from socket, taking note of counter. Each subsequent message received from Server must have the counter incremented by 1. After the max uint32, the next message should set the counter to zero.
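The counter scheme those comments describe could be sketched roughly like this (the function names are illustrative, not the actual spark-protocol API):

```javascript
// Per the comments: seed the counter from the most significant
// 4 bytes of the 16-byte IV, read as a big-endian uint32.
const MAX_UINT32 = 0xFFFFFFFF;

function counterFromIV(iv) {
  return iv.readUInt32BE(0);
}

// Each subsequent message must carry the counter incremented by 1,
// wrapping back to zero after the max uint32.
function nextCounter(counter) {
  return counter === MAX_UINT32 ? 0 : counter + 1;
}

const iv = Buffer.from('000102030405060708090a0b0c0d0e0f', 'hex');
console.log(counterFromIV(iv));       // 0x00010203 = 66051
console.log(nextCounter(MAX_UINT32)); // 0
```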
In trying to understand the protocol I noticed the comments above, from around https://github.com/spark/spark-protocol/blob/master/js/lib/Handshake.js#L62

However, as far as I can tell, the counter is a uint16 and the firmware grabs the top 2 bytes of the SALT, not the top 4 bytes of the IV. It would be great if these comments could be updated, since they could throw someone else off otherwise. (If I'm wrong, please let me know, of course.)
What led me to believe this:
https://github.com/spark/firmware/blob/release/0.4.3/communication/src/spark_protocol.cpp#L1636
https://github.com/spark/spark-protocol/blob/master/js/settings.js#L39
https://github.com/spark/spark-protocol/blob/master/js/clients/SparkCore.js#L373
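In other words, my reading of the linked code is closer to the following sketch (again illustrative only, with made-up function names): a uint16 counter taken from the top 2 bytes of the salt, wrapping after the max uint16 rather than the max uint32.

```javascript
// What the linked firmware/server code appears to do instead:
// take the top 2 bytes of the salt as a big-endian uint16 counter.
const MAX_UINT16 = 0xFFFF;

function counterFromSalt(salt) {
  return salt.readUInt16BE(0);
}

// Increment with wraparound at 16 bits, not 32.
function nextCounter(counter) {
  return (counter + 1) & MAX_UINT16;
}

const salt = Buffer.from('a1b2000000000000', 'hex');
console.log(counterFromSalt(salt)); // 0xa1b2 = 41394
console.log(nextCounter(MAX_UINT16)); // wraps to 0
```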