Anyone who has ever tried to negotiate a standard for data storage or communication will confirm that it is difficult to get agreement and even harder to gain adoption. Decades of debate over both analogue and digital communications standards for radio, television and telecommunications have been used as evidence by the information technology sector that there must be a better way.
In the 1990s there was a great desire to avoid having the internet break into the kind of divides that split the television world into NTSC and PAL camps. If the newly commercial internet was to be open and connected, it was assumed that everyone needed to speak the same technical standard language. One of the first examples of this was HTML, and the creation of the W3C to govern it as a standard.
Moving beyond the rendering of web pages, it was argued that communication of information needed to be open and subject to specifications and standards. XML was born of this push to an open, systematic approach to data specifications. A quick review of XML-based standards shows how many groups have taken this ambition to heart.
The big question is whether standards-based specifications work, or at least whether they are as important as they once were.
In an era when technologies were deployed and then static, such as traditional televisions, telephones or radios, the approach to information decoding was fixed at the time of manufacture. A decision on information formats needed to be made before large numbers of devices were sent out to the public; otherwise no two telephones could talk, and television stations would have had to broadcast in a myriad of formats (assuming they could find enough spectrum to do so). In such a world it was up to governments and major industry bodies to decide on standards, which is why standards tended to differ by country. Today, the last remaining bastion of this era is the division of spectrum, which continues to lead to frustrating differences in the deployment of mobile technology between countries.
At the very time that the internet was pushing for adoption of standards, the user community seemed to be voting with their feet (or at least fingers) by adopting some of the least standardised formats in the market. A good example is the rapid adoption of Adobe’s PDF format for rendered documents. The PDF format breaks all of the rules that W3C hoped to set with XML and yet it met a very real business and consumer need – an accurate and efficient onscreen rendering of paper documents.
As much as Adobe used the PDF format to its commercial advantage, it was ultimately only able to sustain its position by handing it over as an open standard. In this case, standards have followed commercial evolution rather than the other way around.
Despite appearances, HTML has followed a similar pattern, with the dominant browser of the time effectively defining extensions to the format which have then ultimately been adopted as part of the standard. Attempts to drive HTML in the opposite direction, through initiatives such as the semantic web, seem to fail on both agreement and, even more importantly, adoption.
The internet is fundamentally different as a vehicle for communication from anything that has preceded it. Very few devices are locked down, with even connected televisions receiving constant software updates. Browsers, word processors, spreadsheets, reporting tools and a myriad of other products used for reading and authoring files support “add-ins” that allow new file formats to be handled.
The ability of products to rapidly adopt formats, allowing relatively seamless information interchange, has been highly evident in the take-up of mobile devices, which has spawned a wide range of software products supporting traditional office documents. It is unthinkable today that you wouldn’t be able to read and update your MS Word document on your iPad using your tool of choice and then send it on to someone else’s Android tablet.
Clearly standards are still important, but our assumptions about their drivers and sequencing might need to change. Standard formats need to benefit all parties involved in a way that those parties can immediately see. The only alternative is for them to be mandated by the party that benefits (as with government or regulatory data submissions). If a standard is to develop and leverage innovation across a sector, it cannot rely on regulation alone.
At the same time, there is less need to be afraid of having several different approaches to communication in play at the same time. Because the internet is “always on”, the market will ensure that those that benefit from translators will have access to them. The market will also naturally encourage convergence over time.
The future of information standards is perhaps to encourage the right markets and economic motivations rather than rely on regulation and expert committees.