<div dir="ltr">Hello all,<div><br></div><div>I've been playing around with ibrowse, trying to get chunked responses from a server and the behavior I'm seeing isn't at all what I expect given the documentation.</div>
<div><br></div><div style>When I send the following request:</div><div style><br></div><div style><div>ibrowse:send_req(URL, [], get, [], [{stream_to, self()}]).</div><div><br></div><div style>I get the headers back from the server immediately, and then no further messages until the timeout message arrives. From what the documentation says, I'd expect to keep receiving messages "as data arrives on the socket", but this apparently isn't the case.</div>
<div style><br></div><div style>If I then call ibrowse:stream_next(Request_id), nothing happens at all. However, if I instead change the call to:</div><div style><br></div><div style>ibrowse:send_req(URL, [], get, [], [{stream_to, {self(), once}}]).<br>
</div><div style><br></div><div style>I get the headers, and then when I call ibrowse:stream_next(Request_id) I start receiving the packets the server is sending.</div><div style><br></div><div style>Can anyone explain why I'm seeing this behavior?</div>
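<div style><br></div><div style>For reference, the pattern that works for me looks roughly like this (a minimal sketch; the ibrowse_async_* message shapes are the ones documented by ibrowse, and I'm assuming the ibrowse application is already started):</div>

```erlang
%% Minimal sketch of the {stream_to, {self(), once}} flow.
%% send_req/5 with a stream_to option returns {ibrowse_req_id, ReqId}.
fetch(Url) ->
    {ibrowse_req_id, ReqId} =
        ibrowse:send_req(Url, [], get, [],
                         [{stream_to, {self(), once}}]),
    receive
        {ibrowse_async_headers, ReqId, Status, Headers} ->
            io:format("got ~s ~p~n", [Status, Headers]),
            %% in once mode, ask for the next message before looping
            ok = ibrowse:stream_next(ReqId),
            loop(ReqId)
    end.

loop(ReqId) ->
    receive
        {ibrowse_async_response, ReqId, Body} ->
            io:format("chunk: ~p~n", [Body]),
            ok = ibrowse:stream_next(ReqId),
            loop(ReqId);
        {ibrowse_async_response_end, ReqId} ->
            done
    end.
```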
<div style><br></div><div style>The use case I'm trying to fulfill is this: I connect to a server and get back a set of data describing the server's state as a JSON object. On any change, which might not happen for a very long time (days or months), a new set of data is streamed to the client. I'd like to have a single process receiving multiple streams from multiple servers and handling these status changes. Is this possible, or will I need to spin up a worker for each stream that waits for new data and forwards it to that central process?</div>
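<div style><br></div><div style>What I have in mind is something like the sketch below: one process starts a streaming request per server and demultiplexes the incoming messages on the request id (handle_status_change/2 is a hypothetical callback of mine, not part of ibrowse; inactivity_timeout is the ibrowse option I'd use to keep long-idle connections from timing out):</div>

```erlang
%% Sketch: one process watching several servers, keyed by request id.
%% Urls is a list of server URLs; State maps ReqId -> Url.
watch(Urls) ->
    State = maps:from_list(
              [begin
                   {ibrowse_req_id, ReqId} =
                       ibrowse:send_req(Url, [], get, [],
                                        [{stream_to, {self(), once}},
                                         {inactivity_timeout, infinity}]),
                   {ReqId, Url}
               end || Url <- Urls]),
    watch_loop(State).

watch_loop(State) ->
    receive
        {ibrowse_async_headers, ReqId, _Status, _Headers} ->
            ok = ibrowse:stream_next(ReqId),
            watch_loop(State);
        {ibrowse_async_response, ReqId, Body} ->
            %% handle_status_change/2 is a placeholder for my own logic
            handle_status_change(maps:get(ReqId, State), Body),
            ok = ibrowse:stream_next(ReqId),
            watch_loop(State);
        {ibrowse_async_response_end, ReqId} ->
            watch_loop(maps:remove(ReqId, State))
    end.
```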
</div></div>