
An Antic Disposition


OOXML

Compatibility According to Humpty Dumpty

2007/03/04 By Rob 15 Comments

‘I don’t know what you mean by “glory,” ’ Alice said.

Humpty Dumpty smiled contemptuously. ‘Of course you don’t — till I tell you. I meant “there’s a nice knock-down argument for you!” ’

‘But “glory” doesn’t mean “a nice knock-down argument,” ’ Alice objected.

‘When I use a word,’ Humpty Dumpty said, in a rather scornful tone, ‘it means just what I choose it to mean, neither more nor less.’

‘The question is,’ said Alice, ‘whether you can make words mean so many different things.’

‘The question is,’ said Humpty Dumpty, ‘which is to be master – that’s all.’

— Lewis Carroll from Through the Looking-Glass (1871)

I have written about Microsoft’s language games previously. These games continue and it appears to be time for yet another inoculation. Words such as “open”, “choice”, “interoperability”, “standard”, “innovation” and “freedom” have been bandied about like patriotic slogans, but with meanings that are often distorted from their normal uses.

The aggrieved word I want to examine today is “compatibility”. Let’s see how it is being used, with some illustrative examples, the ipsissima verba, Microsoft’s own words:

From an open letter “Interoperability, Choice and Open XML” by Jean Paoli and Tom Robertson:

The specification enables implementation of the standard on multiple operating systems and in heterogeneous environments, and it provides backward compatibility with billions of existing documents.

From another open letter, Chris Capossela’s “A Foundation for the New World of Documents”:

… all the features and functions of Office can be represented in XML and all your older Office documents can be moved from their binary formats into XML with 100 percent compatibility. We see our investment in XML support as the best way for us to meet customers’ interoperability needs while at the same time being compatible with the billions of documents that customers create every year.

From Doug Mahugh: “The new Open XML file formats offer true compatibility with all of the billions of Office documents that already exist.”

And from Craig Kitterman: “Is backward compatibility for documents important to you? How about choice?”

Those are just a handful of examples. Feel free to leave a comment suggesting additional ones.

Compatibility. Better yet, True Compatibility. What is that? And what do you think the average user, or even the average CTO, thinks when hearing these claims from Microsoft about 100% compatibility?

Let’s explore some scenarios and try to reverse-engineer Microsoft’s meaning of “true compatibility”.

Suppose you get a new, more powerful PC with more memory and an upgraded graphics card, and you upgrade to Vista and Office 2007. You create a new presentation in PowerPoint 2007 and save it in the new OOXML format. What can you do with it?

Can you exchange it with someone using Office on the Mac? Sorry, no. OOXML is not supported there. They will not be able to read your document.

Is this 100% compatibility?

What about Windows Mobile? Can I read my document there? Sorry, OOXML is not supported there either.

Is this 100% compatibility?

What about sending the file to your friends using SmartSuite, WordPerfect Office, OpenOffice or KOffice? They are all able to read the legacy Microsoft formats, so surely a new format that is 100% compatible with the legacy formats should work here as well? Sorry, you are out of luck. None of these applications can read your OOXML presentation.

Is this 100% compatibility?

What about legacy versions of Microsoft Office? Can I simply send my OOXML file to a person using an old version of Office and have it load automatically? Sorry, older versions of Office do not understand OOXML. They must either upgrade to Office 2007 or download and install a converter.

Is this 100% compatibility?

I have Microsoft Access XP and an application built on it that reads data ranges from Excel files and imports them into data tables. Will it work with OOXML spreadsheets? Sorry, it will not. You need to upgrade to Access 2007 for this support.

Is this 100% compatibility?

What about other third-party applications that take Office files as input: statistical analysis, spreadsheet compilers, search engines, document viewers, etc.? Will they work with OOXML files? No. Until they update their software, your OOXML documents will not work with software that expects the legacy binary formats.

Is this 100% compatibility?

Suppose I, as a software developer, take the 6,039-page OOXML specification and write an application that can read and display OOXML perfectly. It will be hard work, but imagine I do it. Will I then be able to read the billions of legacy Office documents? Sorry, the answer is no. The ability to read and write OOXML does not give you the ability to read and write the legacy formats.

Is this 100% compatibility?

So, there it is. I don’t know if we’re any closer to finding out what “100% compatibility” means to Microsoft. But we certainly have found a lot of things it doesn’t mean.

A quick analogy. Suppose I designed a new DVD format, standardized it, and said it was 100% compatible with the existing DVD standard. What would consumers think this means? Would they think that DVDs in the new format could play in legacy DVD players? Yes, I believe that would be the expectation based on the normal meaning of “100% compatible”.

But what if I created a new DVD player and said it supported a new DVD format, but also that the player was 100% compatible with the legacy format. What would consumers think then? Would they expect that the new DVDs would play in older players? No, that is not implied. Would they expect that older DVDs could be played in the new player? Yes, that is implied.

This is the essence of Microsoft’s language game. They are confusing the format with the application. This is easy to do when your format is just a DNA sequence of your application. However, although Microsoft Office 2007, the application, may be able to read both OOXML and the legacy formats, the OOXML format itself is not compatible with any legacy application. None. The only way to get something to work with OOXML is to write new code for it.

This is not what people expect when they hear these claims of OOXML being 100% compatible with legacy formats.

Filed Under: OOXML

Essential and Accidental in Standards

2007/02/25 By Rob 16 Comments

The earliest standards were created to support the administration of the government, which in antiquity primarily consisted of religion, justice, taxation and warfare. Crucial standards included the calendar, units of length, area, volume and weight, and uniform coinage.

Uniform coinage in particular was a significant advance. Previously, financial transactions occurred only by barter or by exchanging lumps of metal of irregular purity, size and shape, called “aes rude”. With the mass production of coins of uniform purity and weight imprinted with the emperor’s portrait, money could now be exchanged by simply counting, a vast improvement over having to figure out the purity and weight of an arbitrary lump of metal. Standards reduced the friction of commercial transactions.

Cosmas Indicopleustes, a widely traveled merchant and later a monk, writing in the 6th century, said:

The second sign of the sovereignty which God has granted to the Romans is that all nations trade in their currency, and in every place from one end of the world to the other it is acceptable and envied by every other man and every kingdom

“You can see a lot just by observing,” as Yogi Berra once said. A coin can be read much like a book. So, what can you see by reading a coin, and what does this tell us about standards?

To the left are examples from my collection of a single type of coin. The first picture shows the obverse of one instance, followed by the reverse of eight copies of the same type.

The legend on the obverse is “FLIVLCONSTANTIVSNOBC”. The text is highly abbreviated and there are no breaks between the words, as is typical of classical inscriptions, whether on coins or monuments. Brass or marble was expensive, so space was not wasted. We can expand this inscription to “Flavius Julius Constantius Nobilissimus Caesar”, which translates to “Flavius Julius Constantius, Most Noble Caesar”.

So this is a coin of Constantius II (317-361 A.D.), the middle son of Constantine the Great. The fact that he is styled “Caesar” rather than “Augustus” indicates that this coin dates from his days as heir-designate (324-337), prior to his father’s death. We know from other evidence that this type of coin was current around 330-337 A.D.

There is not much else interesting on the obverse. Imperial portraits had become stylized so much by this period that you cannot really tell one from the other purely by the portrait.

The reverse is a bit more interesting. When you consider that such coins were produced by the millions and circulated to the far corners of the empire, it is clear that they could have propaganda as well as monetary value. In this case, the message is plain. The legend reads “Gloria Exercitus”, or “The Glory of the Army”. Since the army’s favor was usually the deciding factor in determining succession, a young Caesar could never praise the army too much. Not coincidentally, Constantius’s brothers, also named as Caesars, likewise produced coins praising the army before their father’s death.

At the bottom of the reverse, in what is called the “exergue”, we find the mint marks, telling where the coin was minted, and even which group or “officina” within the mint produced it. From the mint marks, we see that these particular coins were minted in Siscia (now Sisak, Croatia), Antioch (Antakya, Turkey), Cyzicus (Kapu-Dagh, Turkey), Thessalonica (Thessaloníki, Greece) and Constantinople (Istanbul, Turkey).

The image on the reverse shows two soldiers, each holding a spear and shield, with two standards, or “signa militaria”, between them. The standard was of vital importance on the battlefield, providing a common point by which the troops could orient themselves in their maneuvers. These standards appear to be of the type used by the Centuries (units of 100 men) rather than the legionary standard, which would have the imperial eagle on top. You can see a modern recreation of a standard here.

If you look closely, you’ll notice that the soldiers on these coins are not holding the standards (they already have a spear in one hand and a shield in the other), and they lack the animal-skin headpiece traditional to a standard bearer or “signifer”. This tells us that these are merely rank-and-file soldiers encamped, with the standards stuck into the ground.

If you compare the coins carefully you will note some differences, for example:

  • Where the breaks in the legend occur. Some break the inscription into “GLOR-IAEXERC-ITVS”, while others have “GLORI-AEXER-CITVS”. Note that neither matches word boundaries.
  • The uniforms of the soldiers differ, in particular the helmets. Also the 2nd coin has the soldiers wearing a sash that the other coins lack. This may reflect legionary or regional differences.
  • The standards themselves have differences, in the number of disks and in the shape of the topmost ornament.
  • The stance of the soldiers varies. Compare the orientation of their spear arm, the forward foot and their vertical alignment.

There are also differences in quality. Consider the mechanics of coin manufacture. These were struck, not cast. The dies were hand engraved in reverse (intaglio) and hand struck with hammers into bronze planchets. The engravers, called “celators”, varied in their skills. Some were clearly better at portraits. Others were better at inscriptions. (Note the serifs in the ‘A’ and ‘X’ of the 4th coin.) Some made sharp, deep designs that would last many strikes. Others had details that were too fine and wore away quickly. Since these coins are a little under 20 mm in diameter, and the dies were engraved by hand, without optical aids, there is considerable skill demonstrated here, even though this period is pretty much the artistic nadir of classical numismatics.

Despite the above differences, to an ancient Roman, all of these coins were equivalent. They all would have been interchangeable, substitutable, all of the same value. Although they differed slightly in design, they matched the requirements of the type, which we can surmise to have been:

  • obverse portrait of the emperor with the prescribed legend
  • reverse two soldiers with standards, spears and shields with the prescribed legend
  • mint mark to indicate which mint and officina made the coin
  • specified purity and weight

I’d like to borrow two terms from metaphysics: “essential” and “accidental”. The essential properties are those which an object must necessarily have in order to belong to a particular category. Other properties, those which are not necessary to belong to that category, are termed “accidental” properties. These coins are all interchangeable because they all share the same essential properties, even though they differ in many accidental properties.

As another example, take corrective eyeglasses. A definition in terms of the essential properties might be, “Corrective eyeglasses consist of two transparent lenses, held in a frame, to be worn on the face to correct the wearer’s vision.” Take away any essential property and they are no longer eyeglasses. Accidental properties might include the material used to make the lenses, the shape and color of the frame, whether the lenses are single-vision or bifocal, the exact curvature of the lenses, etc.

The distinction between the essential and the accidental is common wherever precision in words is required: in legislation, in regulations, in contracts, in patent applications, in specifications and in standards. There are risks in failing to specify an essential property, as well as in specifying an accidental one. Under-specification leads to lower interoperability, while over-specification leads to increased implementation cost with no additional benefit.

Technical standards have dealt with this issue in several ways. One is through the use of tolerances. I talked about light bulbs in a previous post and the Medium Edison Screw. The one I showed, type A21/E26, has an allowed length range of 125.4–134.9 mm. An eccentricity of up to 3 degrees is allowed along the base axis. The actual wattage may be as much as 4% plus 0.5 watts greater than specified. Is this just an example of a sloppy standard? Why would anyone use it if it allows 4% errors? Why not have a standard that tells exactly how large the bulb should be?

The point is that bulb sockets are already designed to accept this level of tolerance. Making the standard more precise would do nothing but increase manufacturing costs, while providing zero increase in interoperability. The reason why we have cheap and abundant lamps and light bulbs is that their interconnections are standardized to the degree necessary, but no more so.
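To make the tolerance arithmetic concrete, here is a minimal sketch in Python of the kind of conformance check the quoted figures imply. The function and its names are my own illustration, not text from the actual lamp standard, which specifies many more dimensions and tolerances than shown here.

```python
# Sketch of a tolerance check using the A21/E26 figures quoted above.
# Illustrative only; the real standard specifies many more dimensions.

def bulb_conforms(length_mm: float, rated_watts: float, actual_watts: float) -> bool:
    """Check a bulb against the quoted A21/E26 tolerances."""
    length_ok = 125.4 <= length_mm <= 134.9   # allowed overall length, in mm
    max_watts = rated_watts * 1.04 + 0.5      # up to 4% plus 0.5 W over rating
    return length_ok and actual_watts <= max_watts

# A bulb rated at 60 W may actually draw up to 60 * 1.04 + 0.5 = 62.9 W.
print(bulb_conforms(length_mm=130.0, rated_watts=60, actual_watts=62.5))  # True
print(bulb_conforms(length_mm=130.0, rated_watts=60, actual_watts=63.5))  # False
```

Any bulb that passes such a check is, as far as the socket is concerned, interchangeable with any other.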

There is often a sweet spot in standardization that gives optimal interoperability at minimal cost. Specify less than this and interoperability suffers. Specify more than this and implementation costs increase, with diminishing interoperability returns.

An allowance for implementation-dependent behaviors is another technique a standard can use to find that sweet spot. A standard can define some constraints, but explicitly state that others are implementation-dependent, or even implementation-defined. (Implementation-defined goes beyond implementation-dependent in that an implementation may not only choose its own behavior in this area, but must also document what behavior it has implemented.) For example, in the C and C++ programming languages the size of an integer is not specified. It is declared to be implementation-defined. Because of this, C/C++ programs are not as interoperable as, say, Java or Python programs, but they are better able to adapt to a particular machine architecture, and in that way achieve better performance. And even Java specifies that some threading behavior is implementation-dependent, knowing that runtime performance would be significantly enhanced if implementations could directly use native OS threads. Even with these implementation-dependent behaviors, C, C++ and Java have been extremely successful.
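You can observe this latitude directly. Here is a minimal sketch using Python’s ctypes module to report the sizes the platform’s C toolchain chose for the basic integer types; the numbers printed will vary by machine and compiler, which is exactly the point.

```python
# Print the platform's C integer sizes. The C standard guarantees only
# minimum ranges; the exact sizes are implementation-defined and vary
# across architectures and compilers.
import ctypes

for name, ctype in [("short", ctypes.c_short),
                    ("int", ctypes.c_int),
                    ("long", ctypes.c_long),
                    ("long long", ctypes.c_longlong)]:
    print(f"sizeof({name}) = {ctypes.sizeof(ctype)} bytes")
```

On a typical 64-bit Linux machine this prints 2, 4, 8 and 8; on 64-bit Windows, 2, 4, 4 and 8. Both are conforming.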

Let’s apply this line of reasoning to document file formats. Whether you are talking about ODF or OOXML, there are pieces left undefined, important things. For example, neither format specifies the exact pixel-perfect positioning of text. There is a common acceptance that issues of text kerning and font rasterization do not belong in the file format, but instead are decisions best deferred to the application and operating environment, so that the decision can be based on factors such as the availability of fonts and the desired output device. Similarly, a color can be specified, in RGB or another color space, but these are not exact spectral values. Red may appear one way on a CRT under fluorescent light, another way on an LCD monitor in darkness, and another way on color laser output read under tungsten lighting. An office document does not specify this level of detail.

In the end, a standard is defined as much by what it does not specify as by what it does specify. A standard that specifies everything can easily end up being merely the DNA sequence of a single application.

A standard provides interoperability within certain tolerances and with allowances for implementation-dependent behaviors. A standard can be evaluated based on how well it handles such concerns. Microsoft’s Brian Jones has criticized ODF for having a flexible framework for storing application-specific settings. He lists a range of settings that the OpenOffice application stores in its documents, and compares that to OOXML, where such settings are part of the standardized schema. But this makes me wonder: where, then, does one store application-dependent settings in OOXML? For example, when Novell completes support of OOXML in OpenOffice, where would OpenOffice store its application-dependent settings? The Microsoft-sponsored ODF Add-in for Word project has made a nice list of ODF features that cannot be expressed in OOXML. These will all need to be stored someplace, or else information will be lost when down-converting from ODF to OOXML. So how should OpenOffice store these when saving to OOXML format?

There are other places where OOXML seems to have regarded the needs of Microsoft Office, but not those of other implementors. For example, section 2.15.2.32 of the WordprocessingML Reference defines an “optimizeForBrowser” element, which allows the notation of optimization for Internet Explorer, but no provision is made for Firefox, Opera or Safari.

Section 2.15.1.28 of the same reference specifies a “documentProtection” element:

This element specifies the set of document protection restrictions which have been applied to the contents of a WordprocessingML document. These restrictions shall be enforced by applications editing this document when the enforcement attribute is turned on, and should be ignored (but persisted) otherwise.

This “protection” relies on storing a hashed password in the XML and comparing it to the hash of the password the user enters, a familiar technique. But rather than using a secure hash algorithm, SHA-256 for example, or any other FIPS-compliant algorithm, OOXML specifies a legacy algorithm of unknown strength. Now, I appreciate the need for Microsoft to have legacy compatibility. They fully acknowledge that the protection scheme they provide here is not secure and is only there for compatibility purposes. But why isn’t the standard flexible enough to allow an implementation to utilize a different algorithm, one that is secure? Where is the allowance for innovation and flexibility?
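For contrast, here is a minimal sketch of the hash-and-compare technique the element relies on, built instead on salted SHA-256 as suggested above. This is my illustration of what a more flexible standard could permit, not the legacy algorithm OOXML actually specifies.

```python
# Hash-and-compare document protection, sketched with a secure algorithm
# (salted SHA-256) rather than the legacy algorithm OOXML mandates.
import hashlib
import secrets

def protect(password: str) -> tuple[str, str]:
    """Return (salt, digest) that could be persisted in the document XML."""
    salt = secrets.token_hex(16)
    digest = hashlib.sha256((salt + password).encode("utf-8")).hexdigest()
    return salt, digest

def verify(password: str, salt: str, stored_digest: str) -> bool:
    """Re-hash the entered password and compare against the stored digest."""
    candidate = hashlib.sha256((salt + password).encode("utf-8")).hexdigest()
    return secrets.compare_digest(candidate, stored_digest)

salt, digest = protect("open sesame")
print(verify("open sesame", salt, digest))  # True
print(verify("wrong guess", salt, digest))  # False
```

A schema that recorded the algorithm name alongside the hash could accept this today and something stronger tomorrow.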

What makes this worse is that Microsoft’s DRM-based approach to document protection, from Office 2003 and Office 2007, is entirely undocumented in the OOXML specification. So we are left with a standard with a broken protection feature that we cannot replace, while the protection that really works is in Microsoft’s proprietary extensions to OOXML that we are not standardizing. How is this a good thing for anyone other than Microsoft?

Section 2.15.3.54 defines an element called “uiCompat97To2003” which is specified simply as, “Disable UI functionality that is not compatible with Word97-2003”. But what use is this if I am using OOXML in OpenOffice or WordPerfect Office? What if I want to disable UI functionality that is not compatible with OpenOffice 1.5? Or WordPerfect 8? Or any other application? Where is the ability for other implementations to specify their preferences?

It seems to me that OOXML does in fact have application-dependent behaviors, but only for Microsoft Office, and that Microsoft has hard-coded these application-dependent behaviors into the XML schema, without tolerance or allowance for any other implementation’s settings.

Something does not cease to be application-dependent just because you write it down. It ceases to be application-dependent only when you generalize it and accommodate the needs of more than one application.

Certainly, any application that stores document content or styles or layout as application-dependent settings, rather than in the defined XML standard, should be faulted for doing so. But I don’t think anyone has really demonstrated that OpenOffice does this. It would be easy enough to demonstrate if it were true. Delete the settings.xml from an ODF document (and the reference to it from the manifest) and show that the document renders differently without it. If it does, then submit a bug report against OpenOffice or (since this is open source) submit a patch to fix it. A misuse of application settings is that easy to fix.
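The experiment is easy to script. Here is a minimal sketch, assuming an ODF document named test.odt (a hypothetical file name); it removes settings.xml and the manifest entry that references it, so you can open the result and compare renderings.

```python
# Strip settings.xml from an ODF package to test whether any document
# content or layout was improperly stored there. Assumes test.odt exists;
# writes test-stripped.odt for a side-by-side comparison.
import re
import zipfile

SRC, DST = "test.odt", "test-stripped.odt"

with zipfile.ZipFile(SRC) as src, zipfile.ZipFile(DST, "w") as dst:
    for item in src.infolist():
        if item.filename == "settings.xml":
            continue  # drop the application-settings stream entirely
        data = src.read(item.filename)
        if item.filename == "META-INF/manifest.xml":
            # Remove the manifest entry that references settings.xml.
            data = re.sub(
                rb'<manifest:file-entry[^>]*"settings\.xml"[^>]*/>', b"", data)
        dst.writestr(item, data)  # keeps each entry's compression settings
```

If test-stripped.odt renders identically to the original, the settings really were application-dependent and nothing essential was hiding in them.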

But a standard that mistakes the accidental, application-dependent properties of a single vendor’s application for essential properties that everyone should implement, and does so without tolerance or allowance for other implementations, is certainly a step back for choice and for the freedom of applications to innovate in the marketplace. Better to use ODF, the file format that has multiple implementations, acknowledges the propriety of sensible application-dependent behaviors, and provides a framework for applications to record such settings.


3/16/2007 — added Cosmas Indicopleustes quote

Filed Under: Numismatics, OOXML, Standards

Standards and Enablement

2007/02/24 By Rob 8 Comments

I’d like to synthesize some thoughts I’ve been having in recent weeks. But before I do that, let’s have a joke:

A Harvard Divinity School student reviews a proposed dissertation topic with his advisor. The professor looks over the abstract for a minute and gives his initial appraisal.

“You are proposing an interesting theory here, but it isn’t new. It was first expressed by a 4th Century Syrian monk. But he made the argument better than you. And he was wrong.”

So it is with some trepidation that I make an observation which may not be novel, well-stated, or even correct, but here goes:

There is (or should be) an important relationship between patents and standards, or more precisely, between patent quality and standards quality.

As we all know, a patent is an exclusive property right, granted by the state for a limited period of time to an inventor in return for publicly disclosing the workings of his invention. In fact, the meaning of “to patent” was originally “to make open”. We have a lingering sense of this in phrases like “that is patently absurd”. So some public good ensues from the patent disclosure, and the inventor gets a short-term monopoly on the use of that invention in return. It is a win-win situation.

To ensure that the public gets their half of the bargain, a patent may be held invalid if there is not sufficient disclosure, if a “person having ordinary skill in the art” cannot “make and use” the invention without “undue experimentation”. The legal term for this is “enablement”. If a patent application has insufficient enablement then it can be rejected.

For example, take the patent application US 20060168937, “Magnetic Monopole Spacecraft”, in which it is claimed that a spacecraft of a specified shape can be powered by AC current and thereby induce a field of wormholes and magnetic monopoles. Once you’ve done that, the spacecraft practically flies itself.

The author claims that in one experiment he personally was teleported through hyperspace over 100 meters, and that in another he blew smoke into a wormhole, where it disappeared and came out another wormhole. However, although the inventor takes us carefully through the details of how the hull of his spacecraft was machined, the most critical aspect, the propulsion mechanism, is alluded to but not really detailed.

(Granted, I may not be counted as a person skilled in this particular art. I studied astrophysics at Harvard, not M.I.T. Our program did not cover the practical applications of hyperspace wormhole travel.)

But one thing is certain — the existence of the magnetic monopole is still hypothetical. No one has shown conclusively that they exist. The first person who detects one will no doubt win the Nobel Prize in Physics. This is clearly a case of requiring “undue experimentation” to make and use this invention, and I would not be surprised if it is rejected for lack of enablement.

I’d suggest that a similar criterion be used for evaluating a standard. When a company proposes that one of its proprietary technologies be standardized, it is making a similar deal with the public. In return for specifying the details of the technology and enabling interoperability, the company gets a significant head start in implementing the standard, and will initially have the best and fullest implementation of it. The benefits to the company are clear. But to ensure that the public gets their half of the bargain, we should ask: is there sufficient disclosure to enable a “person having ordinary skill in the art” to “make and use” an interoperable implementation of the standard without “undue experimentation”? If a standard does not enable others to do this, then it should be rejected. The public, and the standards organizations that represent them, should demand this.

Simple enough? Let’s look at the new Ecma Office Open XML (OOXML) standard from this perspective. Microsoft claims that this standard is 100% compatible with billions of legacy Office documents. But is anyone actually able to use this specification to achieve this claimed benefit without undue experimentation? I don’t think so. For example, macros and scripts are not specified at all in OOXML. The standard is silent on these features. So how can anyone practice the claimed 100% backwards compatibility?

Similarly, there are a number of backwards-compatibility “features” which are specified in the following style:

2.15.3.26 footnoteLayoutLikeWW8 (Emulate Word 6.x/95/97 Footnote Placement)

This element specifies that applications shall emulate the behavior of a previously existing word processing application (Microsoft Word 6.x/95/97) when determining the placement of the contents of footnotes relative to the page on which the footnote reference occurs. This emulation typically involves some and/or all of the footnote being inappropriately placed on the page following the footnote reference.

[Guidance: To faithfully replicate this behavior, applications must imitate the behavior of that application, which involves many possible behaviors and cannot be faithfully placed into narrative for this Office Open XML Standard. If applications wish to match this behavior, they must utilize and duplicate the output of those applications. It is recommended that applications not intentionally replicate this behavior as it was deprecated due to issues with its output, and is maintained only for compatibility with existing documents from that application. end guidance]

This sounds oddly like Fermat’s “I have a truly marvelous proof of this proposition which this margin is too narrow to contain”, but we don’t give Fermat credit for proving his Last Theorem, and we shouldn’t give Microsoft credit for enabling backwards compatibility. How is this description any different from the patent application claiming magnetic monopoles to drive hyperspace travel? The OOXML standard simply does not enable the functionality that Microsoft claims it contains.

Similarly, Digital Rights Management (DRM) has been an increasingly prominent part of Microsoft’s strategy since Office 2003. As one analyst put it:

The new rights management tools splinter to some extent the long-standing interoperability of Office formats. Until now, PC users have been able to count on opening and manipulating any document saved in Microsoft Word’s “.doc” format or Excel’s “.xls” in any compatible program, including older versions of Office and competing packages such as Sun Microsystems’ StarOffice and the open-source OpenOffice. But rights-protected documents created in Office 2003 can be manipulated only in Office 2003.

This has the potential to make any other file format disclosure by Microsoft irrelevant. If they hold the keys to the DRM, then they own your data. The OOXML specification is silent on DRM. So how can Microsoft say that OOXML is 100% compatible with Office 2007, let alone legacy DRM’ed documents from Office 2003? The OOXML standard simply does not enable anyone else to practice interoperable DRM.

It should also be noted that the legacy Office binary formats are not publicly available. They have been licensed by Microsoft under various restrictive schemes over the years, for example, only for use on Windows, only for use if you are not competing against Office, etc., but they have never been simply made available for download. And they’ve certainly never been released under the Open Specification Promise. So lacking a non-discriminatory, royalty-free license for the binary file format specification, how can anyone actually practice the claimed 100% compatibility? Isn’t it rather unorthodox to have a “standard” whose main benefit is claimed to be 100% compatibility with another specification that is treated as a trade secret? Doesn’t compatibility require that you disclose both formats?

Now, what is probably true is that Microsoft Office 2007, the application, is compatible with legacy documents. But that is something else entirely. That fact would be true even if OOXML were not approved as an ISO standard, or even if it were not an Ecma standard. In fact, Microsoft could have stuck with proprietary binary formats in Office 2007 and this would still be true. But by the criterion of whether a person having ordinary skill in the art can practice the claimed compatibility with legacy documents, this claim falls flat on its face. By accepting this standard without sufficient enablement in the specification, the public risks giving away its standards imprimatur to Microsoft without getting fair disclosure or the expectation of interoperability in return.

Filed Under: OOXML, Standards

Washing Machines are not Lamps

2007/02/14 By Rob 18 Comments

Microsoft standards attorney David Rudin has posted his thoughts on my How Standards Bring Consumers Choice, in a post titled Floor Lamps are not Software.

David correctly points out that some appliances with higher power requirements, like washing machines or electric dryers, have a different plug design. “Clearly, a one size standard does not fit all”, as he says. However, this is an intentional design decision made for safety reasons. If things cannot safely be interchanged, then good industrial design makes them impossible to interchange. These plugs are incompatible and non-interoperable on purpose.

No one would intentionally do that with a file format, would they?

David then suggests that a single standard is insufficient because it would stifle competition and innovation:

Electricity is a largely mature and stable technology and there is not much room for innovation in the socket and receptacle space. Document formats, on the other hand, are constantly evolving to keep pace with changing technology. Competition is vital to ensure that those formats continue to meet those ever changing needs. Imagine if a single document format was adopted 15 years ago. How would that format deal with things that we take for granted today like including links to web pages, adding digital photos, or even embedding video in our documents? Unlike electricity, document formats are evolving at a rapid pace and competition will help drive that innovation.

I see it differently. Has a single HTML standard held back competition and innovation on the web? Has the move from vendor-specific network protocols to TCP/IP deprived consumers of innovation? Has the standardization of SQL held back the database industry? Have standardized programming languages like C and C++ prevented others from innovating? I see no evidence of this. On the contrary, standardized HTML, TCP/IP, SQL and C/C++ have been fundamental to the modern IT economy and have been responsible for many billions of dollars of value.

I’d also challenge the assertion that standardization equates with lack of innovation. If this were true, how does Microsoft reconcile their work standardizing OOXML, .NET, C++/CLI, C#, etc., with their needs for continuing innovation? Are these areas, “largely mature and stable”?

Or is this really just a belief that standardization is good when Microsoft originates and controls the standard, but it is bad otherwise?

Back to the examples of HTML, TCP/IP, SQL and C/C++. These standards continued to evolve, and innovations were brought to consumers, but this happened in a multi-vendor standards process that reconciled the vendors’ differing perspectives and needs. Is that such a bad model to follow?

In the end, where does innovation come from? Does it require absolute control? Or does it come from having bright people? I’d suggest the latter, and point out that Microsoft employs several, but not all, of the bright people in the area of file formats. Microsoft and Microsoft’s customers would benefit greatly if Microsoft would join with its competitors who are already innovating and competing in the evolution of the ISO ODF standard.

Remember, the “X” in XML stands for Extensible. Making a single file format that meets Microsoft’s needs, as well as IBM’s, Sun’s, Corel’s, Novell’s, Google’s, etc., is not only technically possible, it is the best approach for the consumer. This does not mean that competition ends, or that all office applications will have identical features, or that we can only have lowest-common-denominator functionality. It just means that we should agree on a common representation for the features that we already have in common, and then have a framework for layering on vendor-specific innovations to express the areas where we differ.
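To illustrate that layering, here is a minimal sketch using Python’s xml.etree; the namespaces and element names are invented for the example, not taken from ODF or OOXML.

```python
# Common content lives in a (hypothetical) shared standard namespace;
# a vendor-specific innovation rides alongside in its own namespace.
# A conforming reader uses what it understands and preserves the rest.
import xml.etree.ElementTree as ET

STD = "urn:example:common-doc"       # hypothetical standard namespace
ACME = "urn:example:acme-extension"  # hypothetical vendor namespace

ET.register_namespace("", STD)
ET.register_namespace("acme", ACME)

doc = ET.Element(f"{{{STD}}}document")
para = ET.SubElement(doc, f"{{{STD}}}p")
para.text = "Hello, world"

# The vendor-specific feature, layered on without touching the common schema:
anim = ET.SubElement(doc, f"{{{ACME}}}animation")
anim.set("effect", "sparkle")

print(ET.tostring(doc, encoding="unicode"))
```

Every implementation reads the common “document” and “p” elements; only the extension’s owner acts on the “animation” element, and everyone else can simply preserve it.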

Filed Under: OOXML, Standards

Once More unto the Breach

2007/02/09 By Rob Leave a Comment

Stephen Walli has a blog, Once More unto the Breach. He writes mainly about open standards/open source, with a solid business/legal angle. He also has hands-on experience with standards development in the IEEE and ISO with POSIX, and an interesting perspective from his broad experience in the industry, including working with standards and community development issues at Microsoft.

Since his blog’s title, like mine, is from Shakespeare, and he occasionally writes about ODF, I am doubly obliged to give a mention.

Looking at the posts tagged ODF, there is some good stuff. In Vendor-Speak: Microsoft and OOXML, he gives a close reading of some recent statements from Microsoft on standards and choice, a long-standing confusion of terms that I had previously called a language game. Walli points out:

…Standards happen when a technology space matures to the point that customers are over-served and want choice to encourage competition. Customers complaining about price is the market signal. Competitors know they can collectively chase the incumbent vendor with a standard at this point, if they pick the right level of collective abstraction to standardize. This is how standards work in the marketplace. (I would hope the GM for standards at Microsoft knew this.)

A sympathy play isn’t going to work here. Customers WANT the standard that encourages multiple implementations. True Microsoft support for ODF in their Office product suite would have been listening to customers. Complaining that the marketplace is competitive while shoving your own product specification through a standards forum is naive at best and arrogant in the extreme at worst.

Another good post is How Microsoft Should Have Played the ODF Standards Game:

The interesting thing is to look back on the number of times a vendor with a single implementation tried to win playing an overlapping standards game. Looking at the UNIX wars I remember three occasions off the top of my head where this was tried over a long period.

  • “tar wars” over archiving formats.
  • The GUI Wars where OSF/Motif and Sun’s GUI toolkit battled it out.
  • The sockets versus streams debate.

In each case, we ended up with two standards being forced upon us. In each case, the dominant technology that won in the marketplace was the one with the most implementations, with the other withering on the vine. Even when both specifications became required for an implementation to claim conformance to the single standard that included them, customers in the market used the specification which was most widely implemented every time.

Microsoft chose the wrong strategy here on multiple levels, betting against customers and the market in general. It may buy a bit of time, but will ultimately cost them more in the long run.

This is a theme that Walli repeats in several of his blog posts — the standard with the most implementations wins.

So a hearty recommendation for Once More unto the Breach, a blog that deserves a slot in your feed aggregator.

Filed Under: ODF, OOXML

