<div dir="ltr"><img src="http://t.sidekickopen65.com/e1t/o/5/f18dQhb0S7ks8dDMPbW2n0x6l2B9gXrN7sKj6v4dY_WW8r56Rz643_PgW2B8dD01pctGFW5PXYkH1k1H6H0?si=4827575526621184&pi=b3c9749c-910a-4203-d5bf-52998a31f227" style="display:none!important" height="1" width="1"><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div class="gmail_extra">In principle, protocols should come with two things:</div><div class="gmail_extra"><br></div><div class="gmail_extra">* A set of parser combinators which fit into a framework such as Meredith Patterson's Hammer. This would make it next to impossible to accidentally parse the protocol wrong and establish syntactically valid input. In an Erlang world, it would produce proper Erlang terms.</div><div class="gmail_extra"><br></div><div class="gmail_extra">* A QuickCheck specification. This would allow implementors to verify their protocol against a quasi-formal model which would greatly improve quality of software. Every implementation would be at least as precise as the QC model. A more ambitious solution would include a model closer to e.g. TLA+. Then one writes a compiler from TLA+ to a QC system and you can check real-world programs.</div><div class="gmail_extra"><br></div><div class="gmail_extra">I think the reason it is not done has to do with effort. Doing the right thing(tm) often takes an order of magnitude more time compared to the quick and dirty hack. Since software is built primarily on the basis of speed (time), people don't do the necessary work up front. Rather, it is baked on later, when the damage has been done. Up front modeling often changes the spec in ways which removes ambiguity and room for error.</div><div class="gmail_extra"><br></div><div class="gmail_extra">If we should start, *the* place to start is TLS 1.3 :P</div></div></blockquote><div><br></div><div>This all sounds amazing and like a thing I'd like to encourage the world to adopt. Any thoughts on a very simple protocol one could get started in? If we could get a few similarly-written examples in place and popularized perhaps it could encourage adoption? Bonus if we can define a service that checks an implementation, provide 'badges' that link to the service, and convince some implementations of the initial test protocols to link to them in their READMEs/sites :)</div><div><br></div><div>I just started doing math on this sort of thing recently and when I consider the amount of time wasted (at least 4 hours of my time just to prove that slack failed to adhere to the IRC RFC) times the number of engineers that have inevitably run into the same thing, it really sickens me. As software continues to eat the world, I expect there to be a short asymptote to how much we can feasibly do unless we start handling things more correctly as a society. There's...just no acceptable reason to have the situation we presently have, at this stage in our field :(</div><div><br></div><div>I'd be glad to do a few episodes of my screencast on this sort of thing, which (a) would justify my spending the time on such a project, and (b) might help me popularize the idea / get other people up in arms about it. I'd need some help though because I'm a bit out of my depth here :) </div><div> </div></div>-- <br><div class="gmail_signature" data-smartmail="gmail_signature">Josh Adams<br></div>
--
Josh Adams