[Libwebsockets] Question re: packet buffering/packet loss
jon at mobilefs.com
Mon Mar 9 23:58:29 CET 2015
I'm new to this list and to libwebsockets. I have just begun integrating
libwebsockets into my application which has a Windows server that pushes a
potentially bursty stream of data to browser clients. I'm integrating the
libwebsockets code into my MFC based server dialog - so one key thing is
that I do not have a 'main' function loop. The data is all text based (i.e.
stock market data). I'm using pretty much the 'dumb_increment' callback
routine to send packets to the clients (right now I'm testing with three
clients).
For the most part, the data is getting from server to client. However, I'm
seeing inconsistent results on the client side that look like packet loss.
The clients are staggered in displaying the data - in other words, one
client may display the latest data while the others may or may not. It does
appear as though the clients are not getting all the packets sent by the
server. I'm not entirely sure where the source of this issue is, but I
figured the most obvious place to start is with how I'm sending the data
from the server.
Two other things - I commented out the gettimeofday call and can easily put
it back if inducing a slight delay is something I should do. Also, I
dropped the timer interval for the libwebsocket_service function down to 5
ms. I originally had the timer at the default 50 ms, and dropping it
actually seemed to help a bit, although it's hard to tell at a glance.
I read that by default libwebsockets will buffer data if it sees congestion,
but I'm not sure if this is something I need to turn on or tweak with some
additional functions or parameters.
Any help would be much appreciated. Thanks!