As someone working on standards development in this area, I would point out that the proliferation of proprietary APIs and protocols is not, in many cases, caused by a desire to create lock-in. More often than not, the interfaces are proprietary because good standards simply don't exist to do what is required.
A large part of the problem, often overlooked by the tech community, is that none of the existing web standards are well suited to the Internet of Things. IoT needs a new set of standards designed for the purpose, so that applications can support the necessary operations at the throughput and operation rates it requires.
For example, something that comes up a lot at the standards meetings is the representation of data. Formats like JSON and XML were fine for the web, but when you have to continuously parse a billion records every second, the implied computational cost of just parsing most existing standard formats is extraordinary. It shifts the economics such that computational efficiency becomes well worth the implementation cost. Unfortunately, there are not a lot of existing standards that reflect the tradeoffs you see in IoT implementations.
What would be a good data representation at that scale? Is a typed/schema-driven approach preferable?
For ingesting such massive amounts of data, it seems a sensible serialization would have minimal divergence between the wire and in-memory representations, as Cap'n Proto does.
Handling composite types seems tricky too -- e.g., when prefixing lengths to strings or lists, it appears the extra CPU time for variable-sized integers would be preferable to the I/O overhead of billions of wasted bytes that comes with a fixed-size prefix. Assuming explicit begin/end delimiters aren't even an option here.
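To make the tradeoff concrete, here's a sketch of an unsigned LEB128-style varint (the scheme Protocol Buffers uses for length prefixes): 7 payload bits per byte, with the high bit marking continuation. Short payloads, which dominate IoT traffic, take one byte where a fixed prefix would take four.

```python
def encode_varint(n: int) -> bytes:
    """Encode a non-negative integer as an LEB128-style varint:
    7 payload bits per byte, high bit set on every byte but the last."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)  # continuation bit: more bytes follow
        else:
            out.append(b)
            return bytes(out)

def decode_varint(buf: bytes, pos: int = 0) -> tuple[int, int]:
    """Decode a varint from buf starting at pos; return (value, next_pos)."""
    result = shift = 0
    while True:
        b = buf[pos]
        pos += 1
        result |= (b & 0x7F) << shift
        if not b & 0x80:
            return result, pos
        shift += 7

# A short string needs a 1-byte prefix instead of a fixed 4-byte one.
assert encode_varint(11) == b'\x0b'           # 1 byte vs 4 fixed
assert encode_varint(300) == b'\xac\x02'      # 2 bytes vs 4 fixed
assert decode_varint(b'\xac\x02') == (300, 2)
```

The CPU cost is the per-byte loop and shift; at billions of records, that loop is exactly the kind of thing the branchless word-at-a-time techniques mentioned downthread try to amortize.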
Cap'n Proto is actually a pretty good example of a modern, high-performance serialization. I've been using it as a template for IoT wire representation discussions.
Fast wire encodings are almost universally TLV ("tag-length-value") style serializations. Delimiter scanning is inefficient and also means the parser has little ability to predict what will be coming over the wire so that it can optimize processing.
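A minimal sketch of why TLV parsing is cheap (tag numbers and field layout here are hypothetical, for illustration only): the length field lets the parser jump over any record, including ones it doesn't understand, in constant time, with no scanning for delimiters.

```python
import struct

# Hypothetical tag numbers for this example.
TAG_TEMP, TAG_ID = 1, 2

def parse_tlv(buf: bytes) -> list:
    """Walk a TLV stream: each record is a 1-byte tag, a 1-byte length,
    then the payload. Known tags are decoded; unknown tags are skipped
    in O(1) using the length -- no delimiter scan required."""
    pos, records = 0, []
    while pos < len(buf):
        tag, length = buf[pos], buf[pos + 1]
        value = buf[pos + 2 : pos + 2 + length]
        pos += 2 + length
        if tag == TAG_TEMP:
            records.append(("temp", struct.unpack("<f", value)[0]))
        elif tag == TAG_ID:
            records.append(("id", value.decode()))
        # any other tag: the length already told us how far to jump

    return records

msg = (bytes([TAG_ID, 4]) + b"n-42"            # known field
       + bytes([9, 2]) + b"xx"                 # unknown tag, skipped
       + bytes([TAG_TEMP, 4]) + struct.pack("<f", 21.5))
assert parse_tlv(msg) == [("id", "n-42"), ("temp", 21.5)]
```

The tag also gives the parser the prediction ability mentioned above: after reading two bytes, it already knows the type and extent of what follows, so it can dispatch or prefetch before touching the payload.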
While older serializations tend to be byte oriented, newer formats use word-sized "frames" (even if not aligned) to enable nearly branchless, parallel processing of the bytes in the stream using bit-twiddling techniques or vector instructions.
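One classic bit-twiddling building block behind word-at-a-time parsing is the SWAR "haszero" test: a branchless check for a zero byte anywhere in a 64-bit word, letting a scanner examine 8 bytes per step instead of branching on each one. A sketch (Python big integers emulate the 64-bit arithmetic here; in C this is three ALU ops on a register):

```python
def word_has_zero_byte(w: int) -> bool:
    """Branchless SWAR test: the expression sets the high bit of exactly
    those bytes of w that are zero, so a nonzero result means a zero
    byte exists somewhere in the 64-bit word."""
    return ((w - 0x0101010101010101) & ~w & 0x8080808080808080) != 0

def word_from(b: bytes) -> int:
    """Load 8 bytes as a little-endian 64-bit word."""
    return int.from_bytes(b, "little")

# Test 8 bytes at once, no per-byte branch.
assert not word_has_zero_byte(word_from(b"ABCDEFGH"))
assert word_has_zero_byte(word_from(b"ABC\x00EFGH"))
```

The same shape of trick generalizes to finding any sentinel byte (XOR the word with a repeated pattern first) or to validating ranges across a whole word, which is how word-framed formats keep the hot parsing loop nearly branchless.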