Trillions from MAYAnMAYA on Vimeo.
A section of it resonated with me greatly: we are not going to create a network of trillions of devices using the technology we have today. We need to climb a new mountain of ideas that will enable trillions of devices to work together.
Looking a little further past the clip, you can find the white paper by Peter Lucas, a founder of MAYA Design, on the challenges of creating a Trillion node network. An interesting point he makes is that, in terms of information exchange, there have only ever been two universally accepted units of data representation: the bit and the byte. Forty years have passed with no improvement on this. The white paper also discusses the architectural requirements any solution must meet to make the Trillion node network a reality; these include scalability, tractability and comprehensiveness.
Argot/XPL is our attempt at climbing the right mountain. Argot is designed around the architectural requirements identified by MAYA Design; however, at a more practical level I believe any solution must also meet a number of other technical requirements. These are:
- Strict Internal Data Models – Each node/file in a Trillion node network must have an internally consistent data model of the types of data it contains or can communicate, describing the structure of every element it is able to exchange. Most distributed systems already have a strict data model; however, that model is usually referenced externally rather than held by the node itself. XML Schema, for instance, offers a shared data model, which makes it difficult to split schemas or implement only part of one. A strict internal data model describes exactly the data types that the individual node is capable of communicating.
- Element Versioning – The data model must be versioned at the level of each individual element or structure. This is one area where XML Schema and most other data modelling tools currently fail to provide the functionality a Trillion node network needs. As functionality improves and changes, the data model must be updated, yet XML Schema allows only very limited changes that retain backward compatibility. Sooner or later a schema requires changes that break backward compatibility, and at that point every node in the network must be updated to support the new schema. This already happens in large corporate systems that use XML Schemas heavily.
- Partial Comparison – It must be possible to perform a partial comparison of data models. The data model of a Trillion node network is likely to be made up of many different schemas, and devices will implement different amounts of functionality from different shared data models. Two nodes must be able to directly discover whether they can communicate and, if so, how. A minimal sketch of these three requirements follows this list.
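
To make these three requirements a little more concrete, here is a minimal sketch in Java of what a node-local, versioned data model might look like. This is not the Argot implementation; the class, method and element names are mine, and a real model would hold a proper structure description rather than an opaque string.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch only; not the Argot/XPL API.
public class NodeDataModel {

    // A versioned element: the element name plus the version of its structure.
    public record ElementKey(String name, int version) {}

    // The node's strict internal model: every element it can communicate,
    // keyed by name and version, mapped to a definition of its structure
    // (kept as an opaque string here for brevity).
    private final Map<ElementKey, String> elements = new HashMap<>();

    public void define(String name, int version, String definition) {
        elements.put(new ElementKey(name, version), definition);
    }

    // Partial comparison: the versioned elements that both nodes define
    // identically, i.e. the vocabulary the two nodes can safely exchange.
    public Set<ElementKey> commonElements(NodeDataModel other) {
        Set<ElementKey> common = new HashSet<>();
        for (Map.Entry<ElementKey, String> e : elements.entrySet()) {
            if (e.getValue().equals(other.elements.get(e.getKey()))) {
                common.add(e.getKey());
            }
        }
        return common;
    }
}
```

With something like this, a node that has moved a hypothetical temperature.reading element to version 2 can still keep its version 1 definition alongside it; a partial comparison against an older node then yields version 1, so the two can keep talking while newer nodes use version 2 between themselves, and no central schema ever has to change in lock-step.
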
An important aspect of these requirements for building a Trillion node network of heterogeneous devices is that agreement on the data format itself is not a core requirement. The format could be Argot, or any other format that supports these requirements. What the Trillion node network requires of data models and nodes is the ability to handle change through versioning, partial data model implementation, partial data model comparison and, finally, discovery mechanisms through which agreement on data formats can be established.
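
As a rough illustration of that last point, format agreement can be as simple as each node advertising the encodings it supports and picking the first one in common once they know they share part of a data model. The format names below are placeholders, not a real registry.

```java
import java.util.List;
import java.util.Optional;

// Hypothetical format negotiation: each side lists the encodings it
// supports in preference order; the first one both support wins.
public class FormatNegotiation {

    public static Optional<String> agree(List<String> localPreferred,
                                         List<String> remoteSupported) {
        return localPreferred.stream()
                .filter(remoteSupported::contains)
                .findFirst();
    }

    public static void main(String[] args) {
        Optional<String> format = agree(
                List.of("argot-binary", "xml", "json"),
                List.of("json", "argot-binary"));
        System.out.println(format.orElse("no common format")); // argot-binary
    }
}
```
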
Peter Lucas finishes his white paper by suggesting that attribute/value pairs might be the next universally agreed unit of data representation. This is where I disagree: the next universal unit of data representation needs to be the data model. From this point of view, data modelling to support change is very immature; I've seen very few practical implementations that deal with change acceptably in a heterogeneous network.
In the end, I agree with MAYA Design. We need to start climbing the right mountain to create suitable solutions for the Internet of Things. I don't believe we will get there using REST and XML, as they don't support change, and change is one of the few requirements I can be truly certain of as the Internet of Things becomes a reality.
Thanks for linking to trillions!
You might find this interesting...
http://www.wired.com/dangerroom/2009/12/command-posts-of-the-future-readied-for-afghan-surge/
CPOF is based on u-forms and layered semantics, allows for mutability over time as well as versioning, and is all based on the thinking noted in Dr Lucas's paper. The system has been widely deployed since 2004 and uses u-forms as its unit of currency (UUIDs with attribute/value pairs as an abstract data type). DARPA considers it one of its most significant successes this decade.
Cheers,
mick