
Investigate hanging tlsn/crates/tls/mpc/tests/test.rs #559

Closed

th4s opened this issue Aug 6, 2024 · 4 comments

Comments

@th4s
Member

th4s commented Aug 6, 2024

When running the tls-mpc-test with a tracing configuration that emits NEW and CLOSE FmtSpan events, we were able to capture the following log: trace_hang.txt.
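
For reference, a tracing setup along these lines can emit the span NEW and CLOSE events. This is a minimal sketch using tracing-subscriber; the exact filter and writer used to produce trace_hang.txt are assumptions.

```rust
// Minimal sketch of a tracing configuration that emits span NEW and CLOSE
// events via tracing-subscriber's fmt layer. The exact configuration used
// to produce trace_hang.txt may differ.
use tracing_subscriber::fmt::format::FmtSpan;

fn init_tracing() {
    tracing_subscriber::fmt()
        .with_span_events(FmtSpan::NEW | FmtSpan::CLOSE)
        .init();
}
```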

After filtering for lines that contain decode_key_private and writing these Leader events into decode_key_private.txt, and doing the same for decode_key_blind with the Follower events written into decode_key_blind.txt, I noticed that decode_key_private is called twice on the Leader side (lines 1 and 21). This is not the case for the Follower, where it is called only once (line 1). This is what causes the Leader side to hang in ot_receive_active_encodings (line 14389 in trace_hang.txt).

The cause might be that commit, which calls decode_key_private, is itself called twice.

@themighty1
Member

Most likely decode_key_private is called twice due to a race.
It happens under these conditions:

  • the Leader calls commit, which causes the notifier to be set via self.notifier.set()

  • we would expect select_biased! in the async-client to start processing the notify future

  • but at the same time the server closes the connection and rx_tls_fut is processed instead, which causes client.server_closed() to be called; this calls commit again while self.buffer is not empty (see the sketch below)
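
Here is a minimal, self-contained sketch of that interleaving. The Client, commit(), and server_closed() below are hypothetical stand-ins, not the actual tlsn async-client code; it only illustrates how two ready futures plus a buffer-draining server_closed() lead to a second commit.

```rust
// Toy reproduction of the suspected race (futures = "0.3").
// Client, commit() and server_closed() are hypothetical stand-ins.
use futures::{channel::oneshot, executor::block_on, select_biased, FutureExt};

struct Client {
    buffer: Vec<u8>,
    commits: usize,
}

impl Client {
    // Stand-in for the real commit(), which on the Leader side ends up
    // calling decode_key_private.
    fn commit(&mut self) {
        self.commits += 1;
        println!("commit called ({} time(s))", self.commits);
    }

    // Stand-in for client.server_closed(): commits again if the buffer
    // is not empty.
    fn server_closed(&mut self) {
        if !self.buffer.is_empty() {
            self.commit();
        }
    }
}

fn main() {
    let mut client = Client { buffer: vec![0u8], commits: 0 };

    // The Leader has already called commit() and set the notifier ...
    client.commit();
    let (notify_tx, notify_rx) = oneshot::channel::<()>();
    notify_tx.send(()).unwrap();

    // ... but at the same time the server closes the connection,
    // so the TLS receive future is also ready.
    let (close_tx, close_rx) = oneshot::channel::<()>();
    close_tx.send(()).unwrap();

    let mut notify_fut = notify_rx.fuse();
    let mut rx_tls_fut = close_rx.fuse();

    block_on(async {
        // If rx_tls_fut is polled first, server_closed() runs and commit()
        // fires a second time while the buffer is non-empty, so the Leader
        // calls decode_key_private twice while the Follower does not.
        select_biased! {
            _ = rx_tls_fut => client.server_closed(),
            _ = notify_fut => println!("notify processed first"),
        }
    });

    assert_eq!(client.commits, 2);
}
```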

The fix is to not allow commit to be called more than once, unless there is a reason to do so. @sinui0
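
A minimal sketch of such a guard (hypothetical field and method names, not the actual tlsn API) could look like this:

```rust
// Sketch of making commit() idempotent with a hypothetical `committed` flag;
// field and method names are illustrative, not the actual tlsn API.
struct Client {
    committed: bool,
    // ... other state ...
}

impl Client {
    fn commit(&mut self) {
        if self.committed {
            // A second call (e.g. from server_closed()) becomes a no-op,
            // so decode_key_private runs only once on the Leader side.
            return;
        }
        self.committed = true;
        // ... the actual commit logic (decode_key_private, etc.) goes here ...
    }
}
```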

@th4s
Member Author

th4s commented Aug 14, 2024

While working on #539, I encountered the same test hanging. I isolated it into a new investigation-branch.

@th4s
Member Author

th4s commented Aug 15, 2024

> While working on #539, I encountered the same test hanging. I isolated it into a new investigation-branch.

This is fixed now and was caused by the changes in #539. But the original bug from trace_hang.txt still needs to be addressed.

@th4s
Member Author

th4s commented Aug 16, 2024

Closed by #568

@th4s th4s closed this as completed Aug 16, 2024