Hi all,
I have two questions I am hoping someone might have some information on.
1. I remember reading somewhere on this forum that you can set the ReadSocket timeout to 0 to reduce latency and make the MSG instruction behave differently (supposedly better). The timeout I am referring to is the one passed in the MSG instruction's source element when the service type is ReadSocket.
2. The main issue I am trying to solve: I have a TCP client connection that receives variable-length data from a TCP server, and ideally it would read until it hits a delimiter, in my case carriage return + line feed ($n$l). However, I realize this might not be straightforward, because TCP is a byte stream with no built-in message boundaries, so reads are framed by byte count rather than by delimiter. (See the sketch below for the pattern I mean.)
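For reference, here is roughly the pattern I have in mind, sketched as PC-side Python rather than PLC code (the host/port are placeholders): buffer whatever each read happens to return, and only treat the data as a complete message once the delimiter shows up. I'm essentially looking for the equivalent buffering logic wrapped around the ReadSocket MSG.

```python
import socket

DELIM = b"\r\n"  # CR + LF delimiter

def read_messages(host: str, port: int):
    """Connect as a TCP client and yield complete delimiter-terminated
    messages, buffering partial reads until the delimiter arrives."""
    buf = bytearray()
    with socket.create_connection((host, port)) as sock:
        while True:
            chunk = sock.recv(4096)   # however many bytes the stream delivers
            if not chunk:             # server closed the connection
                break
            buf.extend(chunk)
            # One read may contain zero, one, or several messages,
            # so keep splitting until no complete delimiter remains.
            while (idx := buf.find(DELIM)) != -1:
                yield bytes(buf[:idx])
                del buf[:idx + len(DELIM)]

# Placeholder address just for illustration
for msg in read_messages("192.168.1.10", 5000):
    print(msg.decode("ascii", errors="replace"))
```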
Thank you!