
An Antic Disposition


Standards

The Anatomy of Interoperability

2007/02/20 By Rob 8 Comments

I’d like to talk a little about interoperability. Although I’ll concentrate on document formats, the concepts here are of wider applicability. Wherever you have standards and implementations, these are some of the issues that you will want to consider.

Rather than attempting the reckless frontal assault of defining interoperability, I’ll take an oblique approach and discuss the forces that tend to increase or decrease interoperability. By understanding these forces, and how they interact, you can get a good sense of how interoperability works, and have a better idea how the decisions that are being made today will determine the interoperability consumers will experience in the years to come.

I’ll start with the specification itself, the text of the standard. What characteristics might a specification have that hinder interoperability?

  1. Ambiguities — The specification may describe a feature in a way that is open to more than one interpretation. This may be caused by imprecise language, or by an incomplete description of the feature. For example, if a specification defines sine and cosine functions but fails to say whether their inputs are in degrees or radians, then those functions are ambiguous (see the sketch following this list).
  2. Out of scope features — The specification contains no description of a feature at all, making it out of scope for the standard. For example, neither ODF nor OOXML specifies the storage model, the syntax or the semantics of embedded scripts. If a feature is out of scope, then there is no expectation of interoperability with that feature.
  3. Undefined behaviors — These may be intentional or accidental. A specification may explicitly call out some behaviors as “undefined”, “implementation-dependent” or “implementation-defined”. This is often done in order to allow an implementation to implement the feature in the best-performing way. For example, the sizes of integers are implementation-defined in the C and C++ programming languages, so implementations are free to take advantage of the capabilities of different machine architectures. Even a language like Java, which goes much further than many to ensure interoperability, has undefined behaviors in the area of multi-threading, for performance reasons. There is a trade-off here. A specification that specifies everything and leaves nothing to the discretion of the implementation will be unable to take advantage of the features of a particular platform. But a specification that leaves too much to the whim of the implementation will hinder interoperability.
  4. Errors — These may range from typographical errors, to incorrect use of control language like “shall” or “shall not”, to missing pages or sections in the specification, to inconsistency in provisions. If one part of the specification says X is required, and another says it is not, then implementations may vary in how feature X is treated.
  5. Non-thematic provisions — If a standard is claimed to be platform-neutral or application-neutral, then anything that is tied to a particular platform or application will hinder that type of interoperability.
  6. Feature Creep — A standard can collapse under its own weight. There is often a trade-off between the expressiveness of a standard (what features it can describe) and the ease of implementation. The ideal is to be very expressive as well as easy to implement. If a standard attempts to do everything that everyone could possibly want, and does so indiscriminately, then the unwieldy complexity of the standard will make it more difficult to implement, and this will hinder interoperability.
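
To make the first of these concrete, here is a minimal sketch (in Python, with hypothetical function names) of how two implementations could each satisfy an underspecified sine function and still disagree on every document they exchange:

```python
import math

# Hypothetical spec wording: "SIN(x) returns the sine of x" -- no unit is given.

def sin_impl_a(x):
    """Implementation A assumes the argument is in radians."""
    return math.sin(x)

def sin_impl_b(x):
    """Implementation B assumes the argument is in degrees."""
    return math.sin(math.radians(x))

# Both readings are defensible, yet a spreadsheet cell containing SIN(90)
# silently changes value when the document moves between the two programs.
print(sin_impl_a(90))  # ~0.894 (radians interpretation)
print(sin_impl_b(90))  # 1.0    (degrees interpretation)
```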

The forces that help counteract these problems are:

  1. Public participation — The public gets involved early, reading drafts of the standard before it is approved, as well as meeting minutes and discussion list traffic. The public provides comments to the technical committee and reads the comments submitted by others. Note: the mere theoretical ability of the public to participate provides no benefit to the standard, just as a beta program that exists only in theory will uncover zero bugs. What counts is the actual participation. Openness is a means to participation, but is insufficient on its own.
  2. Expert review — The feature set for a modern word processor or spreadsheet is large and requires interdisciplinary expertise in such diverse areas as vector graphics, text layout, mathematical layout, accessibility, internationalization, schema design, statistical and financial calculations, etc. So the design of a file format will similarly require a broad range of expertise. Tapping into outside experts (for example, from member companies or other technical committees) when designing the standard, or as reviewers, is a way to reduce the types of problems that hinder interoperability.
  3. Multi-vendor participation — It is good to have multiple active participants in the standards development process, representing implementors both proprietary and open source, both traditional implementations as well as web-based. You might consider other constituencies as well, such as end-users, accessibility advocates, academic and government representatives, consumer advocates, etc. There is a saying familiar to programmers, “Every class of users finds a new class of bugs”. This is as true with standards as it is with code. More eyeballs, especially with different perspectives, are key. It is better to have 5 active participants with different perspectives than to have 100 participants from a single vendor. Again, I’m talking reality here, not just perception. It is easy to have a technical committee that is large on paper, but still have all of the ideas come from a single company. This does nothing for interoperability. You want real multi-vendor participation. This is why having public read-only access to the mailing list of the technical committee is so valuable. That is a great way to see what the real participation level is.
  4. Multiple implementations — If a standard is intended to be platform-neutral and vendor-neutral, then the participation of multiple implementors working on multiple platforms is essential. They are the ones who truly drive interoperability. Especially if they are implementing while the specification is under development or review, they will find, report and resolve many interoperability issues.
  5. Reuse of existing standards — When writing a program, the availability of reusable code or design patterns can improve quality and shorten schedules. In manufacturing, the use of off-the-shelf components can be faster to tool and less expensive than custom components. Similarly, in standards development, when you reuse another standard you reuse the domain expertise, the standards development effort and the review effort that went into the development of that standard. If you are lucky and you reuse a widely-implemented standard, implementors will likely also be able to reuse knowledge and even code when implementing it.
  6. Existence of a reference implementation and test suite — These work, together with the standard itself, as a triad of verifications that reinforce each other. The test suite is written to test the provisions of the specification. The reference implementation is written to implement the provisions of the specification. The test suite is executed against the reference implementation. Any errors indicated are thus errors in the reference implementation, the specification, or the test suite. Fix the bug and repeat until the reference implementation passes the test suite (a minimal sketch of such a loop follows this list). Having a reference implementation that passes a test suite with high or complete coverage of the provisions of the specification is ideal for interoperability.
  7. Good engineering — In the end, a technical standard is technical, and it will be influenced by the same kinds of engineering criteria that influence how we architect and design software applications. If you can make a standard that is modular, exhibits internal consistency in its parts, is cohesive, has reasonable dependency chains among its parts, does what it set out to do with minimal complexity, but allows for uses beyond that, then you’ve made a good start. But if you find yourself shaking your head, slapping your forehead or even holding your nose when reading a standard, then you are heading for interoperability hell. The standards that have worked well were as simple as possible, but no simpler. Think XML 1.0 or XSLT 1.0. Engineers know how to build complexity on top of simplicity. But you don’t build anything on top of crap.
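
As a rough illustration of the triad in item 6 (this is illustrative Python, not any particular project's tooling), the basic loop is: run every case in the test suite against the reference implementation, and triage each failure as a defect in the implementation, in the test case, or in the specification's own wording.

```python
def reference_impl(markup: str) -> str:
    """Hypothetical reference implementation of a tiny provision:
    collapse runs of whitespace into single spaces."""
    return " ".join(markup.split())

# Each test case pairs an input with the output the specification requires.
test_suite = [
    ("hello   world", "hello world"),
    ("  leading and trailing  ", "leading and trailing"),
]

failures = []
for sample, expected in test_suite:
    actual = reference_impl(sample)
    if actual != expected:
        # Triage: a bug in the reference implementation, in the test case,
        # or in the specification text itself. Fix it and rerun.
        failures.append((sample, expected, actual))

print(f"{len(test_suite) - len(failures)} of {len(test_suite)} cases pass")
for sample, expected, actual in failures:
    print(f"FAIL: {sample!r}: expected {expected!r}, got {actual!r}")
```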

Of course, the standard is only one half of the equation. The implementations are the other half, and they have their own list of ways they can hinder or help interoperability.

The factors that hinder interoperability include:

  1. Implementation bugs — Conformance to a standard, like any other product feature, gets weighed against a long list of priorities for any given product release. There is always more work to do than time to do it. Whether a high-quality implementation of a standard becomes a priority will depend on factors such as user-demand, competition, and for open source projects, the level of interest of developers contributing to the community.
  2. Functional subsets — Even in heavily funded commercial ventures, standards support can be partial. Look at Microsoft’s Internet Explorer, for example. How many years did it take to get reasonable CSS2 support? When an application supports only a subset of a standard, interoperability with applications that implement the full feature set of the standard, or a different subset of the standard, will suffer.
  3. Functional supersets — Similarly, an application can extend the standard, often using mechanisms allowed and defined by the standard, to create functional supersets that, if poorly designed, can cause interoperability issues.
  4. Varying conceptual models — For example, a traditional WYSIWYG word processor has a page layout that is determined by the metrics of the printer the document will eventually print to. But a web-based editor is free from those constraints. In fact, if the eventual target of the document is a web page, these constraints are irrelevant. So we have here a conceptual difference, where one implementation sees the printed page as a constraint on layout, and another application is in an environment where page width is more flexible. Document exchange between two editors with different conceptual models of page size will require extra effort to ensure interoperability.
  5. Different code bases — The more distinct implementations, the more paths of exchange. If you have only a single implementation of your standard, then there will be no interoperability issues, at least not with others using the same version of your application on the same platform. But if there are two implementations, then you have 2 paths of exchange to worry about. If there are 3 implementations then you have 6 paths, and so on, with N*(N-1) possible paths for exchanging files among N applications. Consider the ODF format for word processing documents, where we will soon have implementations in Office (at least two different plugins), OpenOffice, Lotus Notes, KOffice, Google Docs and AbiWord. This is 42 exchange paths (or more if you consider platform variations), though obviously not all of these paths will be common (a quick check of this count follows the list).
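
A quick check of that count, using the same N*(N-1) directed-pair formula and the implementation list given above (the two Office plugins counted separately):

```python
# Directed exchange paths among ODF word-processor implementations.
implementations = [
    "Office plugin 1", "Office plugin 2", "OpenOffice",
    "Lotus Notes", "KOffice", "Google Docs", "AbiWord",
]

n = len(implementations)
paths = n * (n - 1)  # every ordered (sender, receiver) pair
print(f"{n} implementations -> {paths} exchange paths")  # 7 -> 42
```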

And the forces that counteract these and lead to improved interoperability are:

  1. Test suites and reference implementations — As described earlier, these are key. During the development of the standard they improve the standard, and after the standard is approved they continue paying dividends by helping improve the quality of additional implementations.
  2. Conformance certification — This can take several forms, from self-testing and self-certifying, to more formal tests by 3rd party labs. The result is a compliance report, listing which features of the standard were correctly implemented, and which were not. Having this information available to consumers can help interoperability to the extent it allows them to narrow their consideration to the applications that most fully and correctly implement the standard.
  3. Profiles — This is a formal mechanism for defining an interoperable subset of a standard. For example, XHTML Basic is a W3C-defined subset of XHTML for use on mobile devices. By defining such a subset, interoperability is enhanced in several ways. First, a single web service can produce XHTML Basic that is understood by mobile devices from several vendors. Second, XHTML Basic is a proper subset of XHTML, so it is also interoperable with all the tools that accept XHTML.
  4. Adaptive extension mechanisms — It is good to have a formal mechanism for extending the specification while at the same time allowing a graceful recovery in the case where the document is loaded in an application that doesn’t understand the extension (a rough sketch follows this list).
  5. Inter-vendor cooperation — Make no mistake. Once an interoperability problem is found, where application A and application B are not rendering the same document in an acceptably similar fashion, then we have a bug. Such a bug may be reported by the customers of application A or the customers of application B. Ideally, A and B will cooperate and share information on interoperability bugs found. The customer that reports A’s bug to company B may tomorrow be matched by another that reports B’s bug to company A. So some sort of interoperability bug clearinghouse or other protocols can help here.
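
To sketch the “graceful recovery” idea from item 4 above (illustrative Python with a made-up document and namespaces, not the extension mechanism of ODF or OOXML): a consumer renders the elements whose namespace it knows and skips, rather than rejects, foreign-namespace extensions.

```python
import xml.etree.ElementTree as ET

# Hypothetical document: a core namespace plus one vendor extension namespace.
doc = """
<doc xmlns="urn:example:core" xmlns:v="urn:example:vendor-ext">
  <p>Plain paragraph</p>
  <v:sparkline data="1,5,3"/>
  <p>Another paragraph</p>
</doc>
"""

CORE_NS = "{urn:example:core}"

root = ET.fromstring(doc)
for child in root:
    if child.tag.startswith(CORE_NS):
        print("render:", child.text)
    else:
        # Unknown extension: ignore it (or preserve it for round-tripping)
        # instead of refusing to load the document.
        print("skipping extension:", child.tag)
```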

I may have missed some important factors that help or hinder interoperability. Please share your ideas.


3/29/07 — Corrected an error sent in by an anonymous reader. The number of interoperability concerns scales as N(N-1), not as N!.

Filed Under: Standards

Washing Machines are not Lamps

2007/02/14 By Rob 18 Comments

Microsoft standards attorney David Rudin has posted his thoughts on my How Standards Bring Consumers Choice, in a post titled Floor Lamps are not Software.

David correctly points out that some appliances with higher power requirements, like washing machines or electric dryers, have a different plug design. “Clearly, a one size standard does not fit all”, as he says. However, this is an intentional design decision made for safety reasons. If things cannot safely be interchanged, then good industrial design makes them impossible to interchange. These plugs are incompatible and non-interoperable on purpose.

No one would intentionally do that with a file format, would they?

David then suggests that a single standard is insufficient because it would stifle competition and innovation:

Electricity is a largely mature and stable technology and there is not much room for innovation in the socket and receptacle space. Document formats, on the other hand, are constantly evolving to keep pace with changing technology. Competition is vital to ensure that those formats continue to meet those ever changing needs. Imagine if a single document format was adopted 15 years ago. How would that format deal with things that we take for granted today like including links to web pages, adding digital photos, or even embedding video in our documents? Unlike electricity, document formats are evolving at a rapid pace and competition will help drive that innovation.

I see it differently. Has a single HTML standard held back competition and innovation on the web? Has the move from vendor-specific network protocols to TCP/IP deprived consumers of innovation? Has the standardization of SQL held back the database industry? Have standardized programming languages like C and C++ prevented others from innovating? I see no evidence of this. On the contrary, standardized HTML, TCP/IP, SQL and C/C++ have been fundamental to the modern IT economy and have been responsible for many billions of dollars of value.

I’d also challenge the assertion that standardization equates with lack of innovation. If this were true, how does Microsoft reconcile their work standardizing OOXML, .NET, C++/CLI, C#, etc., with their needs for continuing innovation? Are these areas “largely mature and stable”?

Or is this really just a belief that standardization is good when Microsoft originates and controls the standard, but it is bad otherwise?

Back to the examples of HTML, TCP/IP, SQL and C/C++. These standards continued to evolve, and innovations were brought to consumers, but this was done in a multi-vendor standards process where participants reconciled their different perspectives and needs. Is that such a bad model to follow?

In the end, where does innovation come from? Does it require absolute control? Or does it come from having bright people? I’d suggest the latter, and point out that Microsoft employs several, but not all, of the bright people in the area of file formats. Microsoft and Microsoft’s customers would benefit greatly if Microsoft would join with their competitors who are already innovating and competing in the evolution of the ISO ODF standard.

Remember the “X” in XML stands for Extensible. Making a single file format that meets Microsoft’s needs, as well as IBM’s, Sun’s, Corel’s, Novell’s, Google’s, etc., is not only technically possible, it is the best approach for the consumer. This does not mean that competition ends, or that all office applications will have identical features, or that we can only have lowest-common-denominator functionality. It just means that we should agree on a common representation for the features that we already have in common, and then have a framework for layering on vendor-specific innovations to express the areas where we differ.

Filed Under: OOXML, Standards

How Standards Bring Consumers Choice

2007/02/10 By Rob 10 Comments

This is an essay on the choices that standards enable. By laying out a framework for ensuring interoperable, interchangeable and substitutable components, standards make it easier for you, the consumer, to shop with confidence and take full advantage of the choices offered in the marketplace.

Let’s take the humble floor lamp as an example, and look at some of the standards that govern its design, and the choices this enables for the consumer. In the United States, many of the parameters for electrical fixtures are governed by standards promulgated by the National Electrical Manufacturers Association (NEMA). Some of these standards are approved by ANSI or the IEC as well.

Let’s start at the wall. We see here a NEMA 5-15 Type B 3-Pin duplex socket. As you may notice, the left slot is slightly taller than the right one. This is the neutral. The live line is on the right, and the ground is on the bottom.

This socket is rated for 15 Amps at 125 Volts. The definition and calibration of the Amp and Volt and other key values in the Metric or SI system of measurement are standardized by the General Conference on Weights and Measures (CGPM).

From the lamp we have the male end, an AC power plug. On my lamp the connector is a polarized, non-grounding NEMA 1-15P plug.

This is an electrical cord for the lamp. The codes indicate that this is an SPT-2, #18 gauge flexible cord. SPT (Service Parallel Thermoplastic) is a standard defined by the American Society for Testing and Materials (ASTM) and approved by Underwriters Laboratories (UL). The gauge is measured in American Wire Gauge (AWG) units, also known as the Brown & Sharpe gauge, a standard dating back to 1855.


A critical interface is the one between the bulb and the lamp. The most common connection in the United States is called the MES or Single Contact Medium Edison Screw (E26). This is an ANSI Standard, C78.20-2003.

The bulb itself is a NEMA A21-style bulb, with an E26d-style base. It will be 134.9 mm long and 28.2 mm wide at the base. The height of the conductor screw will be 24.4 mm. As indicated, this is a three-way bulb, rated at 50W, 100W and 150W.

As you can see, the inputs and outputs of the lamp are heavily-constrained by a number of standards. Is this a bad thing? Is the consumer deprived for not having to worry about different plug and outlet types, or what gauge cord to use for their lamp, or what thread connection to use? On the contrary, it is a blessing for the consumer that such pieces are interchangeable commodities.

If one bulb burns out, you can replace it with whatever brand is cheapest. Or you can get a lower wattage one to save electricity. Or even get a fluorescent one that fits in the same Medium Edison Screw socket. You can get clear glass, soft white, red glass or black light. You have these choices because bulb standards make bulbs interchangeable.

If the dog chews on the cord and you need to replace it, there is no need to return the lamp to the manufacturer at great expense. SPT-2 cords are standard and available at any hardware store. You can simply replace it yourself.

If you move to another house, will your lamp stop working and need to be thrown out? No, of course not. NEMA 5-15 power outlets are in every home in North America. You can use your lamp wherever you go.

What if you want to buy a new lamp next year, will you need to change the wiring or the outlets in your house to work with it? Certainly not. The power outlets are standard and will work with any lamp: the ones you have today, the old ones you buy at a flea market, or the ones you buy 10 years from now. The standards ensure interoperability, interchangeability and substitutability.

In fact, far from constraining choice, standards enable greater choice. Because the basic plugs, receptacles and connectors are governed by standards, these core components have become commodities and are produced off-shore at low cost to you, the consumer. This causes lighting designers and manufacturers to compete on the basis of style, elegance, utility and features. So standards result in lower cost, greater competition and greater choice for the consumer.

These are the choices enabled by standards, and the choices that consumers want.

It is the same thing with document formats. Consumers don’t want to worry about document formats. They don’t want to even think about document formats. They just want them to work, invisibly, without problems. Of course consumers want choice, but it is the choice of applications, choice of features, choice of vendors, choice of support options, choice of open source versus proprietary source, choice of heavyweight versus web-based, a choice of buying a single application versus buying a suite, etc. A single universal file format is what makes these other choices possible, just as the choice of the Medium Edison Screw bulb leads to an affordable choice in lamp designs.

To perpetuate vendor-specific file formats is like having a lamp that requires a special plug and a special light bulb and even special electricity. It is to go through life with a bag full of adapters and transformers that you will need to apply whenever you, or someone else, needs to use your document in another application. The cost of multiple document standards is that the industry will invest in a wide variety of converters between the formats, and this cost will be passed on to the consumer. These converters will work sometimes, but will inevitably be slower, buggier, have less fidelity, and their availability will lag the office applications by months or years.

The choice is yours.


Updates:

2/14/2007: A discussion of one critic’s response to this essay can be read in this follow-up.

Filed Under: Standards

Declaring Bankruptcy

2007/02/04 By Rob 10 Comments

Lawrence Lessig called it email bankruptcy: when you have so many unanswered emails in your inbox that you decide to make a clean start and just admit to yourself, and to those who wrote, that you are not going to respond.

I have a related problem: interesting links I’ve collected and have been meaning to blog about. But my links have accumulated far faster than I have been able to write about them. So I am declaring “link bankruptcy”. Here is my fire sale, a set of interesting topics for only pennies on the dollar:

  1. Glyn Moody has the story about how platform dependencies have impacted one notable British institution.
  2. Even more startling results in Korea, as reported in The Cost of Monoculture and the Korean Saga.
  3. It is mainly in Polish, but some in English. More coverage of Open Standards in a new blog from Jacek Łęgiewicz.
  4. In case you missed it the first time around, here is a wonderful essay by Dan Bricklin on “Software that Lasts 200 Years“. It made me think of what ramifications this has for file formats that aspire to longevity as well.
  5. This looks interesting. A free OpenOffice Calc add-in for doing “fuzzy math” in OpenOffice.
  6. Sweave adds ODF support to the open source R statistical analysis and graphing platform.
  7. Docvert, an online REST service for converting Microsoft Word documents into ODF format.
  8. I know someone was asking for this a few months ago — A Microsoft Works import filter for OpenOffice.
  9. Office Migration Planning Manager (OMPM) allows bulk conversions of legacy Office binary documents to OOXML. Does anyone have something similar for ODF? Not just bulk conversion, but detection and reporting of possible conversion problems as well.
  10. The eXtensibility Manifesto has some good schema design advice, including: #3 “Design of a data model focuses on all stakeholders’ requirements for the data.” #6 “Designs or components are not reinvented, but rather are leveraged where possible.”
  11. “[Expert Witness] Alepin…alleged that the company [Microsoft] had subverted developers who used Microsoft’s version of Java ‘thinking they were developing multi-platform applications, but were actually developing Windows-specific applications’ “. From PC Pro News.
  12. The Case For ODF — a recent presentation from OpenOffice Community Manager Louis Suarez-Potts.
  13. “Office 2007 lacks some features of earlier versions of Office, and so it can’t fully support some Office files created in earlier versions. For example, Word 2007 cannot open Word files that contain multiple document versions, a feature supported by Word prior to Word 2007”. Anyone know what else is missing? From Directions on Microsoft.
  14. A few months old — European Cities Do Away with Traffic Signs. Does anyone know how this has turned out?
  15. Dashed Lines and their uses.
  16. David Berlind over at ZDNet: “To me, Ecma is not a standards body. As evidenced by the DVD situation (which is ridiculous if you ask me), it’s little more than a puppet with a pipeline through which vendors can pump their proprietary technologies into the ISO standardization process (avoiding the rigor that should normally be applied to anything up for consideration as an ISO standard). As such, the ISO is sort of a joke too.”
  17. “One trouble spot we encountered using Vista’s Explorer metadata organization tools was the lack of support for some of the file types we commonly use. For instance, JPEG files happily take attributes under Vista, but PNG files do not. Along similar lines, Vista would not apply metadata to files we had created in the OpenOffice.org format. And, strangely, our attempts to apply metadata to documents created in OpenOffice.org—in Microsoft Office format—were greeted with an error message.” From eWeek.
  18. What is a standard, according to David Rudin, Microsoft’s official Standards Attorney? “A technical specification that enables interoperability between different products and services and is either 1) intended for widespread industry adoption or 2) has achieved widespread industry adoption.” This is a nice write-up.

Filed Under: ODF, OOXML, Standards

Defining Deviancy Down

2007/01/30 By Rob 10 Comments

Kai Erikson, in his classic study of deviant behavior in early New England, Wayward Puritans, made the important observation that:

…the amount of deviation a community encounters is apt to remain fairly constant over time. To start at the beginning, it is a simple logistic fact that the number of deviancies which come to a community’s attention are limited by the kinds of equipment it uses to detect and handle them, and to that extent the rate of deviation found in a community is at least in part a function of the size and complexity of its social control apparatus. A community’s capacity for handling deviance, let us say, can be roughly estimated by counting its prison cells and hospital beds, its policemen and psychiatrists, its courts and clinics.

In other words, a community’s perception of social deviation is conditioned and limited by its capacity for controlling it. With an equal number of punishment cells, equal-sized communities of cloistered monks and bloodthirsty pirates would perceive the same rate of deviancy. Of course the actual deviations would be different: Brother Maynard isn’t praying earnestly enough versus Greybeard slit a crewmate’s throat in the night, without warning the bunkmate below.

The late Senator from New York, Daniel Patrick Moynihan, took this idea and applied it to the social ills that America has increasingly faced since the 1960’s: mental illness, illegitimacy and violent crime. How does society react when the level of deviancy rises unexpectedly and rapidly above accepted norms? He observed, in an essay entitled, “Defining Deviancy Down”:

[…T]he amount of deviant behavior in American society has increased beyond the levels the community can “afford to recognize” and that, accordingly, we have been re-defining deviancy so as to exempt much conduct previously stigmatized, and also quietly raising the “normal” level in categories where behavior is now abnormal by any earlier standard.

I look at the current situation with Office Open XML (OOXML) in a similar way. There is a clearly defined community — JTC1 member National Bodies — with the responsibility for reviewing submitted standards. However, their capacity for exercising control is finite. The JTC1 Directives allow them a fixed period of time to review any submission. They also have a fixed number of volunteers to perform the review, and a fixed (or at least highly constrained) number of meetings to discuss and agree on review comments. So, when presented with a specification of unprecedented length (over 6,000 pages), and rather low quality, what are they to do? Spend hundreds of hours reading the specification? Write up and report thousands of errors? No, the capacity in JTC1 to deal with this level of deviancy does not exist, so the natural way for the community to cope is to define deviancy down.

How deviant is OOXML? The 6,000+ page length is one aspect. Another is the rate at which it raced through its Ecma review, 20 times the speed of comparable specifications. Certainly, a longer specification will tend to have more problems than a shorter one, and a rushed review will find fewer problems than a thorough one. But that is speaking in generalities. Is there anything we can say about OOXML defect rates?

The Groklaw review, which occurred over a few days, found a large number of serious problems. But I think we can quantify this a bit more. I tried an experiment. I used a random-number generator to generate a sample of 20 page numbers in the OOXML specification. I then read each of these pages, looking for technical errors, platform dependencies, lack of extensibility, drafting errors, etc. I did not bother noting spelling, grammatical or usage errors. I recorded how many reportable errors I found on each page. Some pages had zero problems, others had 1, 2 or even 3 problems. I even found one particularly bad error that could send OOXML back to Ecma once reported — more on that another day — but the average errors per page was 1.0. So, projecting out to a 6,039-page specification, this leads to a prediction of 6,000 +/- 1,000 errors. Reviewing a larger number of pages would reduce the error bars on that prediction, but we seem to be dealing with defects numbering in the thousands.
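
For what it’s worth, the arithmetic behind that projection looks like this (the per-page counts below are hypothetical, chosen only to match the reported average of 1.0 errors per page and the 0-to-3 range; the real sample is not reproduced here):

```python
import math

# Hypothetical defect counts for the 20 randomly sampled pages (mean = 1.0).
counts = [0, 1, 1, 2, 1, 0, 1, 3, 1, 1, 0, 1, 2, 1, 1, 0, 1, 1, 1, 1]
pages_total = 6039

n = len(counts)
mean = sum(counts) / n                                # 1.0 errors per page
var = sum((c - mean) ** 2 for c in counts) / (n - 1)  # sample variance
se_mean = math.sqrt(var / n)                          # standard error of the mean

estimate = mean * pages_total
margin = se_mean * pages_total
print(f"estimated defects: {estimate:.0f} +/- {margin:.0f}")  # roughly 6,000 +/- 1,000
```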

Are NB’s able to deal with a level of deviancy this great? Do they possibly have the resources to detect and report this number of errors and then verify that they are addressed? If not, the natural reaction is to define deviancy down.

For example, OOXML is currently in a 30-day review period where “contradictions” with existing ISO or IEC standards can be alleged by National Bodies (NB’s). Although the word “contradiction” is not defined in JTC1 Directives, its meaning can be seen from a resolution unanimously adopted at a JTC1 Plenary in 2000:

Resolution 27 – Consistency of JTC 1 Products

JTC 1 stresses the strong need for consistency of its products (ISs and TRs) irrespective of the route through which they were developed. Any inconsistency will confuse users of JTC 1 standards and, hence, jeopardize JTC 1’s reputation. Therefore, referring to clauses 13.2 (Fast Track) and 18.4.3.2 (PAS) of its Directives, JTC 1 reminds ITTF of its obligation to ascertain that a proposed DIS contains no evident contradiction with other ISO/IEC standards. JTC 1 offers any help to ITTF in such undertaking. However, should an inconsistency be detected at any point in the ratification process, JTC 1 together with ITTF will take immediate action to cure the problem.

The clear meaning of this is that contradictions are to be avoided, and that some of the defining characteristics of standards with contradictions are that they are not consistent, that they confuse users, and that they jeopardize JTC1’s reputation.

Further, we have precedents of other contradictions raised within JTC1, such as just last year, when the NB’s of the UK and Germany both alleged contradictions against Microsoft’s C++/CLI specification, then submitted for Fast Track processing from Ecma. The contradiction raised by the German NB (DIN) in that case said in part:

On a technical level, there are some rather different approaches between C++ and C++/CLI which can easily cause considerable confusion when both languages are considered to be “C++” or add unnecessary overhead when trying to write C++ code usable with C++ and C++/CLI. Below are a few example although if there were sufficient time to to thorough analysis of the C++/CLI document more could probably be found.

This is simple, easy to understand, and well within the spirit of the JTC1 Resolution quoted earlier.

But in a notable case of defining deviancy down, we’re starting to see the word “contradiction” defined very narrowly. For example, Microsoft’s Brian Jones suggests contradictions should be looked at this way:

[T]his is where you want to make sure that the approval of this ISO spec won’t cause another ISO standard to break. In the case of OpenXML, there really can’t be a contradiction because it’s always possible to implement OpenXML alongside other technologies. For instance, OpenOffice will soon have support for ODF and OpenXML.

An example of a contradiction would be if there was a standard for wireless technology that required the use of a certain frequency. If by using that frequency you would interfere with folks using another standard that also leverages that frequency, then there may be a contradiction.

To be quite fair, the Chinese WAPI defeat in ISO is also a precedent, but when searching for a definition of “contradiction” all precedents should be considered, not just one. Arguing exclusively from a wireless protocol precedent when dealing with an XML markup standard is dubious, when contradictions were alleged just last year against a programming language, a technology much closer to OOXML than a wireless protocol is. Surely, since C++/CLI is Microsoft’s technology, they would be aware of this precedent? But still they didn’t mention it.

I ask you to consider the impact of taking Microsoft’s definition of “contradiction” and applying it to virtual technologies, like document formats, image formats, presentation formats, programming languages, operating system interfaces, APIs, security protocols, anything in the realm of software rather than hardware. None of these can ever conflict by Microsoft’s definition. Never. Therefore there are never grounds for a contradiction, and JTC1’s own Directives, which adopted the contradiction clause only a few years ago, become a procedural nullity, a no-op, meaningless, a waste of time for a large part of the technologies JTC1 has standards authority over. This is a clear example of defining deviancy down.

Let’s go back in time, 750 years ago to Thomas Aquinas and his Summa Theologica, the 13th century’s God: The Missing Manual. Aquinas had some apt words on contradictions, when discussing whether the powers of God were infinite and omnipotent (Question 25, Article 3):

Therefore, everything that does not imply a contradiction in terms, is numbered amongst those possible things, in respect of which God is called omnipotent: whereas whatever implies contradiction does not come within the scope of divine omnipotence, because it cannot have the aspect of possibility… For whatever implies a contradiction cannot be a word, because no intellect can possibly conceive such a thing.

Aquinas here allows that God can do all things that are possible, but cannot do something which is a contradiction in terms. Going back to Microsoft’s proposed definition of a contradiction, it seems that they are only willing to acknowledge a contradiction if it amounts to a co-existence problem so severe that even God could not resolve it. This seems to be a rather high hurdle to clear, and is clearly not what JTC1 intended. This is defining deviancy down, way down.

This is the essential problem JTC1 has with the OOXML submission. It is too large and has too many problems for the control mechanisms available to JTC1 (in particular, review time and volunteers) to handle the presented level of deviancy. The only recourse available to them is to define deviancy down to a level where only a much smaller number of problems needs to be handled. Of course, this will lead to a much lower-quality ISO Standard than we are accustomed to, but what other choice is there?

This lesson has clear ramifications for Microsoft. The bigger the specification, the less thoroughly it will be reviewed. If you make it large enough it will barely be reviewed at all. The plan for 2007 should be to combine the .NET, OPC, XPS, JScript, J#, C#, XAML, WPF, HD Photo and whatever other specifications you have handy, put them all into one 50,000-page document, call it the “Open Microsoft Specification”, rush it through Ecma and then Fast Track it into ISO. No one can really stop you. JTC1 Fast Track is broken.

Filed Under: OOXML, Standards

