
An Antic Disposition


Archives for 2007

Standards and Enablement

2007/02/24 By Rob 8 Comments

I’d like to synthesize some thoughts I’ve been having in recent weeks. But before I do that, let’s have a joke:

A Harvard Divinity School student reviews a proposed dissertation topic with his advisor. The professor looks over the abstract for a minute and gives his initial appraisal.

“You are proposing an interesting theory here, but it isn’t new. It was first expressed by a 4th Century Syrian monk. But he made the argument better than you. And he was wrong.”

So it is with some trepidation that I make an observation which may not be novel, well-stated, or even correct, but here goes:

There is (or should be) an important relationship between patents and standards, or more precisely, between patent quality and standards quality.

As we all know, a patent is an exclusive property right, granted by the state for a limited period of time to an inventor in return for publicly disclosing the workings of his invention. In fact the meaning of “to patent” was originally, “to make open”. We have a lingering sense of this in phrases like, “that is patently absurd”. So, some public good ensues for the patent disclosure, and the inventor gets a short-term monopoly in the use of that invention in return. It is a win-win situation.

To ensure that the public gets their half of the bargain, a patent may be held invalid if there is not sufficient disclosure, if a “person having ordinary skill in the art” cannot “make and use” the invention without “undue experimentation”. The legal term for this is “enablement”. If a patent application has insufficient enablement then it can be rejected.

For example, take the patent application US 20060168937, “Magnetic Monopole Spacecraft”, in which it is claimed that a spacecraft of a specified shape can be powered by AC current and thereby induce a field of wormholes and magnetic monopoles. Once you’ve done that, the spacecraft practically flies itself.

The inventor reports that in one experiment he was personally teleported more than 100 meters through hyperspace, and in another he blew smoke into a wormhole, where it disappeared and emerged from another wormhole. However, although the inventor takes us carefully through the details of how the hull of his spacecraft was machined, the most critical aspect, the propulsion mechanism, is alluded to but never really detailed.

(Granted, I may not be counted as a person skilled in this particular art. I studied astrophysics at Harvard, not M.I.T. Our program did not cover the practical applications of hyperspace wormhole travel.)

But one thing is certain — the existence of the magnetic monopole is still hypothetical. No one has shown conclusively that they exist. The first person who detects one will no doubt win the Nobel Prize in Physics. This is clearly a case of requiring “undue experimentation” to make and use this invention, and I would not be surprised if it is rejected for lack of enablement.

I’d suggest that a similar criterion be used for evaluating a standard. When a company proposes that one of its proprietary technologies be standardized, it is making a similar deal with the public. In return for specifying the details of its technology and enabling interoperability, it gets a significant head start in implementing that standard, and will initially have the best and fullest implementation of it. The benefits to the company are clear. But to ensure that the public gets their half of the bargain, we should ask: is there sufficient disclosure to enable a “person having ordinary skill in the art” to “make and use” an interoperable implementation of the standard without “undue experimentation”? If a standard does not enable others to do this, then it should be rejected. The public, and the standards organizations that represent them, should demand this.

Simple enough? Let’s look at the new Ecma Office Open XML (OOXML) standard from this perspective. Microsoft claims that this standard is 100% compatible with billions of legacy Office documents. But is anyone actually able to use this specification to achieve this claimed benefit without undue experimentation? I don’t think so. For example, macros and scripts are not specified at all in OOXML. The standard is silent on these features. So how can anyone practice the claimed 100% backwards compatibility?

Similarly, there are a number of backwards-compatibility “features” which are specified in the following style:

2.15.3.26 footnoteLayoutLikeWW8 (Emulate Word 6.x/95/97 Footnote Placement)

This element specifies that applications shall emulate the behavior of a previously existing word processing application (Microsoft Word 6.x/95/97) when determining the placement of the contents of footnotes relative to the page on which the footnote reference occurs. This emulation typically involves some and/or all of the footnote being inappropriately placed on the page following the footnote reference.

[Guidance: To faithfully replicate this behavior, applications must imitate the behavior of that application, which involves many possible behaviors and cannot be faithfully placed into narrative for this Office Open XML Standard. If applications wish to match this behavior, they must utilize and duplicate the output of those applications. It is recommended that applications not intentionally replicate this behavior as it was deprecated due to issues with its output, and is maintained only for compatibility with existing documents from that application. end guidance]

This sounds oddly like Fermat’s “I have a truly marvelous proof of this proposition which this margin is too narrow to contain”, but we don’t give Fermat credit for proving his Last Theorem, and we shouldn’t give Microsoft credit for enabling backwards compatibility. How is this description any different from the patent application’s claim that magnetic monopoles can drive hyperspace travel? The OOXML standard simply does not enable the functionality that Microsoft claims it contains.

Similarly, Digital Rights Management (DRM) has been an increasingly prominent part of Microsoft’s strategy since Office 2003. As one analyst put it:

The new rights management tools splinter to some extent the long-standing interoperability of Office formats. Until now, PC users have been able to count on opening and manipulating any document saved in Microsoft Word’s “.doc” format or Excel’s “.xls” in any compatible program, including older versions of Office and competing packages such as Sun Microsystems’ StarOffice and the open-source OpenOffice. But rights-protected documents created in Office 2003 can be manipulated only in Office 2003.

This has the potential to make any other file format disclosure by Microsoft irrelevant. If they hold the keys to the DRM, then they own your data. The OOXML specification is silent on DRM. So how can Microsoft say that OOXML is 100% compatible with Office 2007, let alone legacy DRM’ed documents from Office 2003? The OOXML standard simply does not enable anyone else to practice interoperable DRM.

It should also be noted that the legacy Office binary formats are not publicly available. They have been licensed by Microsoft under various restrictive schemes over the years, for example, only for use on Windows, only for use if you are not competing against Office, etc., but they have never been simply made available for download. And they’ve certainly never been released under the Open Specification Promise. So lacking a non-discriminatory, royalty-free license for the binary file format specification, how can anyone actually practice the claimed 100% compatibility? Isn’t it rather unorthodox to have a “standard” whose main benefit is claimed to be 100% compatibility with another specification that is treated as a trade secret? Doesn’t compatibility require that you disclose both formats?

Now what is probably true is that Microsoft Office 2007, the application, is compatible with legacy documents. But that is something else entirely. That would be true even if OOXML were not approved as an ISO standard, or even if it were not an Ecma standard. In fact, Microsoft could have stuck with proprietary binary formats in Office 2007 and it would still be true. But by the criterion of whether a person having ordinary skill in the art can practice the claimed compatibility with legacy documents, this claim falls flat on its face. By accepting this standard without sufficient enablement in the specification, the public risks giving away its standards imprimatur to Microsoft without getting fair disclosure or the expectation of interoperability in return.

Filed Under: OOXML, Standards

The Anatomy of Interoperability

2007/02/20 By Rob 8 Comments

I’d like to talk a little about interoperability. Although I’ll concentrate on document formats, the concepts here are of wider applicability. Wherever you have standards and implementations, these are some of the issues you will want to consider.

Rather than attempting the reckless frontal assault of defining interoperability, I’ll take an oblique approach and discuss the forces that tend to increase or decrease interoperability. By understanding these forces, and how they interact, you can get a good sense of how interoperability works, and have a better idea how the decisions that are being made today will determine the interoperability consumers will experience in the years to come.

I’ll start with the specification itself, the text of the standard. What characteristics might a specification have that hinder interoperability?

  1. Ambiguities — The specification may describe a feature in a way that is open to more than one interpretation. This may be caused by imprecise language, or by incomplete description of the feature. For example, if a specification defines a sine and cosine function, but fails to say whether their inputs are in degrees or radians, then this function is ambiguous.
  2. Out of scope features — The specification entirely lacks any description of a feature, making it out of scope for the standard. For example, neither ODF nor OOXML specifies the storage model, the syntax or the semantics of embedded scripts. If a feature is out of scope, then there is no expectation of interoperability with that feature.
  3. Undefined behaviors — These may be intentional or accidental. A specification may explicitly call out some behaviors as “undefined”, “implementation-dependent” or “implementation-defined”. This is often done in order to allow an implementation to implement the feature in the best performing way. For example, the sizes of integers are implementation-defined in the C and C++ programming languages, so implementations are free to take advantage of the capabilities of different machine architectures. Even a language like Java, which goes much further than many to ensure interoperability, has undefined behaviors in the area of multi-threading, for performance reasons. There is a trade-off here. A specification that specifies everything and leaves nothing to the discretion of the implementation will be unable to take advantage of the features of a particular platform. But a specification that leaves too much to the whim of the implementation will hinder interoperability.
  4. Errors — These may range from typographical errors, to incorrect use of control language like “shall” or “shall not”, to missing pages or sections in the specification, to inconsistency in provisions. If one part of the specification says X is required, and another says it is not, then implementations may vary in how feature X is treated.
  5. Non-thematic provisions — If a standard is claimed to be platform-neutral or application-neutral, then anything that is tied to a particular platform or application will hinder that type of interoperability.
  6. Feature Creep — A standard can collapse under its own weight. There is often a trade-off between expressiveness of a standard (what features it can describe) and the ease of implementation. The ideal is to be very expressive as well as easy to implement. If a standard attempts to do everything that everyone could possibly want, and does so indiscriminately, then the unwieldy complexity of the standard will make it more difficult for implementations to implement, and this will hinder interoperability.
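To make the first point concrete, here is a minimal sketch in Python. The SIN provision and both implementations are invented for illustration; the point is only how an unstated unit convention splits two otherwise conforming implementations:

```python
import math

# Hypothetical ambiguous provision: "SIN(x) returns the sine of x."
# Nothing says whether x is in degrees or radians, so two vendors
# can read it differently and both claim conformance.

def sin_impl_a(x):
    """Implementation A: interprets x as radians."""
    return math.sin(x)

def sin_impl_b(x):
    """Implementation B: interprets x as degrees."""
    return math.sin(math.radians(x))

# The same spreadsheet formula now yields different results:
print(sin_impl_a(90))  # about 0.894
print(sin_impl_b(90))  # 1.0
```

Until the specification pins down the unit, a document that computes SIN(90) is simply not portable between the two.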

The forces that help counteract these problems are:

  1. Public participation — The public gets involved early, reading drafts of the standard before it is approved, as well as meeting minutes and discussion list traffic. The public provides comments to the technical committee and reads the comments submitted by others. Note: the merely theoretical ability of the public to participate yields absolutely no benefit to the standard, just as a beta program that exists only in theory will uncover zero bugs. What counts is actual participation. Openness is a means to participation, but is insufficient on its own.
  2. Expert review — The feature set for a modern word processor or spreadsheet is large and requires interdisciplinary expertise in such diverse areas as vector graphics, text layout, mathematical layout, accessibility, internationalization, schema design, statistical and financial calculations, etc. So the design of a file format will similarly require a broad range of expertise. Tapping into outside experts (for example, from member companies or other technical committees) when designing the standard, or as reviewers, is a way to reduce the types of problems that hinder interoperability.
  3. Multi-vendor participation — It is good to have multiple active participants in the standards development process, representing implementors both proprietary and open source, both traditional implementations and web-based ones. You might consider other constituencies as well, such as end users, accessibility advocates, academic and government representatives, consumer advocates, etc. There is a saying familiar to programmers: “Every class of users finds a new class of bugs”. This is as true of standards as it is of code. More eyeballs, especially with different perspectives, are key. It is better to have 5 active participants with different perspectives than 100 participants from a single vendor. Again, I’m talking reality here, not just perception. It is easy to have a technical committee that is large on paper but still has all of its ideas come from a single company. This does nothing for interoperability. You want real multi-vendor participation. This is why having public read-only access to the mailing list of the technical committee is so valuable: it is a great way to see what the real participation level is.
  4. Multiple implementations — If a standard is intended to be platform-neutral and vendor-neutral, then the participation of multiple implementors working on multiple platforms is essential. They are the ones who truly drive interoperability. Especially if they are implementing while the specification is under development or review, they will find, report and resolve many interoperability issues.
  5. Reuse of existing standards — When writing a program, the availability of reusable code or design patterns can improve quality and shorten schedules. In manufacturing, the use of off-the-shelf components can be faster to tool and be less expensive than custom components. Similarly, in standards development, when you reuse another standard you reuse the domain expertise, the standards development effort and the review effort that went into the development of that standard. If you are lucky and you reuse a widely-implemented standard, implementors will likely also be able to reuse knowledge and even code when implementing it.
  6. Existence of a reference implementation and test suite — These work, together with the standard itself, as a triad of verifications that reinforce each other. The test suite is written to test the provisions of the specification. The reference implementation is written to implement the provisions of the specification. The test suite is executed against the reference implementation. Any errors indicated are thus either errors in the reference implementation, the specification, or the test suite. Fix the bug and repeat until the reference implementation passes the test suite. Having a reference implementation that passes a test suite that has high or complete coverage of the provisions of the specification is ideal for interoperability.
  7. Good engineering — In the end, a technical standard is technical, and it will be influenced by the same kinds of engineering criteria that influence how we architect and design software applications. If you can make a standard that is modular, exhibits internal consistency among its parts, is cohesive, has reasonable dependency chains among its parts, does what it set out to do with minimal complexity, but allows uses beyond that, then you’ve made a good start. But if you find yourself shaking your head, slapping your forehead or even holding your nose when reading a standard, then you are heading for interoperability hell. The standards that have worked well were as simple as possible, but no simpler. Think XML 1.0 or XSLT 1.0. Engineers know how to build complexity on top of simplicity. But you don’t build anything on top of crap.
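The spec/reference-implementation/test-suite triad of point 6 can be sketched with a toy example. Everything here (the one-line provision, the function, the test cases) is invented for illustration and not taken from any real standard:

```python
import datetime

# Toy spec provision: "A conforming writer shall serialize calendar
# dates as ISO 8601, in the form YYYY-MM-DD."

def serialize_date(d):
    """Reference implementation of the toy provision."""
    return d.isoformat()  # date.isoformat() emits YYYY-MM-DD

# Test suite: each case exercises one aspect of the provision,
# e.g. zero-padding of single-digit months and days.
CASES = [
    (datetime.date(2007, 2, 24), "2007-02-24"),
    (datetime.date(2010, 12, 1), "2010-12-01"),
]

def run_suite():
    """Return the failing cases; an empty list means the triad agrees."""
    return [(d, want, serialize_date(d))
            for d, want in CASES if serialize_date(d) != want]

# A non-empty result points at a bug in the implementation, the test
# suite, or an ambiguity in the provision itself; fix whichever is
# wrong and repeat until the suite passes.
print(run_suite())  # []
```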

Of course, the standard is only one half of the equation. The implementations are the other half, and they have their own list of ways they can hinder or help interoperability.

The factors that hinder interoperability include:

  1. Implementation bugs — Conformance to a standard, like any other product feature, gets weighed against a long list of priorities for any given product release. There is always more work to do than time to do it. Whether a high-quality implementation of a standard becomes a priority will depend on factors such as user-demand, competition, and for open source projects, the level of interest of developers contributing to the community.
  2. Functional subsets — Even in heavily funded commercial ventures, standards support can be partial. Look at Microsoft’s Internet Explorer, for example. How many years did it take to get reasonable CSS2 support? When an application supports only a subset of a standard, interoperability with applications that allow the full feature set of the standard, or a different subset of the standard, will suffer.
  3. Functional supersets — Similarly, an application can extend the standard, often using mechanisms allowed and defined by the standard, to create functional supersets that, if poorly designed, can cause interoperability issues.
  4. Varying conceptual models — For example, a traditional WYSIWYG word processor has a page layout that is determined by the metrics of the printer the document will eventually print to. But a web-based editor is free from those constraints. In fact, if the eventual target of the document is a web page, these constraints are irrelevant. So we have here a conceptual difference, where one implementation sees the printed page as a constraint on layout, and another application is in an environment where page width is more flexible. Document exchange between two editors with different conceptual models of page size will require extra effort to ensure interoperability.
  5. Different code bases — The more distinct implementations, the more paths of exchange. If you have only a single implementation of your standard, then there will be no interoperability issues, at least not among others using the same version of your application on the same platform. But if there are two implementations, then you have 2 paths of exchange to worry about. If there are 3 implementations, then you have 6 paths, and so on, with N*(N-1) possible paths for exchanging files among N applications. Consider the ODF format for word processing documents, where we will soon have implementations in Office (at least two different plugins), OpenOffice, Lotus Notes, KOffice, Google Docs and AbiWord. That is 42 exchange paths (or more if you count platform variations), though obviously not all of these paths will be common.
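The path count above is easy to check with a few lines of Python; the application list is just the one named in the text:

```python
from itertools import permutations

def exchange_paths(apps):
    """All ordered (writer, reader) pairs: N*(N-1) directed paths."""
    return list(permutations(apps, 2))

apps = ["Office plugin 1", "Office plugin 2", "OpenOffice",
        "Lotus Notes", "KOffice", "Google Docs", "AbiWord"]

print(len(exchange_paths(apps)))  # 42, i.e. 7 * 6
```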

And the forces that counteract these and lead to improved interoperability are:

  1. Test suites and reference implementations — As described earlier, these are key. During the development of the standard they improve the standard, and after the standard is approved they continue paying dividends by helping improve the quality of additional implementations.
  2. Conformance certification — This can take several forms, from self-testing and self-certifying, to more formal tests by 3rd party labs. The result is a compliance report, listing which features of the standard were correctly implemented, and which were not. Having this information available to consumers can help interoperability to the extent it allows them to narrow their consideration to the applications that most fully and correctly implement the standard.
  3. Profiles — This is a formal mechanism for defining an interoperable subset of a standard. For example, XHTML Basic is a W3C-defined subset of XHTML for use on mobile devices. By defining such a subset, interoperability is enhanced in several ways. First, a single web service can produce XHTML Basic that is understood by mobile devices from several vendors. Second, XHTML Basic is a proper subset of XHTML, so it is also interoperable with all the tools that accept XHTML.
  4. Adaptive extension mechanisms — It is good to have a formal mechanism for extending the specification while at the same time allowing a graceful recovery in the case where the document is loaded in an application that doesn’t understand the extension.
  5. Inter-vendor cooperation — Make no mistake: once an interoperability problem is found, where application A and application B do not render the same document in an acceptably similar fashion, we have a bug. Such a bug may be reported by the customers of application A or the customers of application B. Ideally, A and B will cooperate and share information on the interoperability bugs found. The customer that reports A’s bug to company B may tomorrow be matched by another that reports B’s bug to company A. So some sort of interoperability bug clearinghouse or other protocol can help here.

I may have missed some important factors that help or hinder interoperability. Please share your ideas.


3/29/07 — Corrected an error sent in by an anonymous reader. The number of interoperability concerns scales as N(N-1), not as N!.

Filed Under: Standards

Washing Machines are not Lamps

2007/02/14 By Rob 18 Comments

Microsoft standards attorney David Rudin has posted his thoughts on my How Standards Bring Consumers Choice, in a post titled Floor Lamps are not Software.

David correctly points out that some appliances, like washing machines or electric dryers, with higher power requirements, have a different plug design. “Clearly, a one size standard does not fit all”, as he says. However, this is an intentional design decision made for safety reasons. If things cannot safely be interchanged, then good industrial design is to make them impossible to be interchanged. These plugs are incompatible and non-interoperable on purpose.

No one would intentionally do that with a file format, would they?

David then suggests that a single standard is insufficient because it would stifle competition and innovation:

Electricity is a largely mature and stable technology and there is not much room for innovation in the socket and receptacle space. Document formats, on the other hand, are constantly evolving to keep pace with changing technology. Competition is vital to ensure that those formats continue to meet those ever changing needs. Imagine if a single document format was adopted 15 years ago. How would that format deal with things that we take for granted today like including links to web pages, adding digital photos, or even embedding video in our documents? Unlike electricity, document formats are evolving at a rapid pace and competition will help drive that innovation.

I see it differently. Has a single HTML standard held back competition and innovation on the web? Has the move from vendor-specific network protocols to TCP/IP deprived consumers of innovation? Has the standardization of SQL held back the database industry? Have standardized programming languages like C and C++ prevented others from innovating? I see no evidence of this. On the contrary, standardized HTML, TCP/IP, SQL and C/C++ have been fundamental to the modern IT economy and have been responsible for many billions of dollars of value.

I’d also challenge the assertion that standardization equates with lack of innovation. If this were true, how does Microsoft reconcile their work standardizing OOXML, .NET, C++/CLI, C#, etc., with their needs for continuing innovation? Are these areas, “largely mature and stable”?

Or is this really just a belief that standardization is good when Microsoft originates and controls the standard, but it is bad otherwise?

Back to the examples of HTML, TCP/IP, SQL, C/C++. These standards continued to evolve, and innovations were brought to consumers, but they were done in a multi-vendor standards process where they reconciled their multiple perspectives and needs. Is that such a bad model to follow?

In the end, where does innovation come from? Does it require absolute control? Or does it come from having bright people? I’d suggest the latter, and point out that Microsoft employs several, but not all, of the bright people in the area of file formats. Microsoft and Microsoft’s customers would benefit greatly if Microsoft would join with their competitors who are already innovating and competing in the evolution of the ISO ODF standard.

Remember the “X” in XML stands for Extensible. Making a single file format that meets Microsoft’s needs, as well as IBM’s, Sun’s, Corel’s, Novell’s, Google’s, etc., is not only technically possible, it is the best approach for the consumer. This does not mean that competition ends, or that all office applications will have identical features, or that we can only have lowest-common-denominator functionality. It just means that we should agree on a common representation for the features that we already have in common, and then have a framework for layering on vendor-specific innovations to express the areas where we differ.

Filed Under: OOXML, Standards

The World Ends on May 1st, 2010

2007/02/13 By Rob 5 Comments

Actually, at 6:45AM by my calculations.

According to ZDNet’s Dan Farber, quoting an IBM whitepaper, by 2010, “the world’s information base will be doubling in size every 11 hours.”

Every 11 hours? That’s quite a statement. Let’s see what this means. The largest storage system in the universe is the universe. (Let that sink in for a moment). When I grew up, I was taught that there were approximately 10^79 electrons in the universe. Let’s use them all! 10^79 bits of storage, stored using the spin state of the electrons, in a giant quantum computer.

I have no idea how much data we will have on January 1st, 2010, so let’s assume, for the sake of argument, that a virus wipes out all the data in the world on New Year’s Eve, and we start the year with only 1 bit of data, doubling every 11 hours. So after 11 hours we have 2 bits of data, after 22 hours 4 bits, and in less than a day and a half we get our first byte (8 bits). This isn’t too bad, is it?

The equation is 2^x = 10^79. Solving for x, a simple exercise in logarithms, gives x ≈ 262.43. We can only double that many times before hitting the universal limit: we exhaust all of the storage in the entire universe on May 1st at 6:45 AM. Of course, maybe we’ll just Zip it all up and last until dinner time?
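The arithmetic is easy to reproduce. Here is a quick sketch, assuming the 10^79-electron figure and the 11-hour doubling period from the text:

```python
import math
import datetime

BITS_IN_UNIVERSE = 1e79  # one bit per electron, as in the text
DOUBLING_HOURS = 11

# Number of doublings before we run out of electrons: 2**x = 10**79
doublings = math.log2(BITS_IN_UNIVERSE)

start = datetime.datetime(2010, 1, 1)
end = start + datetime.timedelta(hours=doublings * DOUBLING_HOURS)

print(round(doublings, 2))              # 262.43
print(end.strftime("%Y-%m-%d %H:%M"))   # 2010-05-01 06:45
```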

I think I’ll call in sick that day.

But seriously, I wonder if this “every 11 hours” figure is a typo? Doubling “every 11 months” would be easier to imagine and would get us to around the year 2250.

Filed Under: Uncategorized

How Standards Bring Consumers Choice

2007/02/10 By Rob 10 Comments

This is an essay on the choices that standards enable. By laying out a framework for ensuring interoperable, interchangeable and substitutable components, standards make it easier for you, the consumer, to shop with confidence and take full advantage of the choices offered in the marketplace.

Let’s take the humble floor lamp as an example, and look at some of the standards that govern its design, and the choices this enables for the consumer. In the United States, many of the parameters for electrical fixtures are governed by standards promulgated by the National Electrical Manufacturers Association (NEMA). Some of these standards are approved by ANSI or the IEC as well.

Let’s start at the wall. We see here a NEMA 5-15 Type B 3-Pin duplex socket. As you may notice, the left slot is slightly taller than the right one. This is the neutral. The live line is on the right, and the ground is on the bottom.

This socket is rated for 15 Amps at 125 Volts. The definition and calibration of the Amp, the Volt and other key quantities in the metric (SI) system of measurement are standardized by the General Conference on Weights and Measures (CGPM).

From the lamp we have the male end, an AC power plug. On my lamp the connector is a polarized, non-grounding NEMA 1-15P plug.

This is the electrical cord for the lamp. The codes indicate that this is an SPT-2, #18 gauge flexible cord. SPT (Service Parallel Thermoplastic) is a standard defined by the American Society for Testing and Materials (ASTM) and approved by Underwriters Laboratories (UL). The gauge is measured in American Wire Gauge (AWG) units, also known as the Brown & Sharpe gauge, a standard dating back to 1855.


A critical interface is the one between the bulb and the lamp. The most common connection in the United States is the MES, or Single Contact Medium Edison Screw (E26). This is an ANSI standard, C78.20-2003.

The bulb itself is a NEMA A21-style bulb with an E26d-style base. It is 134.9 mm long and 28.2 mm wide at the base, and the height of the conductor screw is 24.4 mm. As indicated, this is a three-way bulb, rated at 50W, 100W and 150W.

As you can see, the inputs and outputs of the lamp are heavily-constrained by a number of standards. Is this a bad thing? Is the consumer deprived for not having to worry about different plug and outlet types, or what gauge cord to use for their lamp, or what thread connection to use? On the contrary, it is a blessing for the consumer that such pieces are interchangeable commodities.

If one bulb burns out, you can replace it with whatever brand is cheapest. Or you can get a lower-wattage one to save electricity. Or even get a fluorescent one that fits the same Medium Edison Screw socket. You can get clear glass, soft white, red glass or black light. You have these choices because bulb standards make bulbs interchangeable.

If the dog chews on the cord and you need to replace it, there is no need to return the lamp to the manufacturer at great expense. SPT-2 cords are standard and available at any hardware store. You can simply replace it yourself.

If you move to another house, will your lamp stop working and need to be thrown out? No, of course not. NEMA 5-15 power outlets are in every home in North America. You can use your lamp wherever you go.

What if you want to buy a new lamp next year? Will you need to change the wiring or the outlets in your house to work with it? Certainly not. The power outlets are standard and will work with any lamp: the ones you have today, the old ones you buy at a flea market, or the ones you buy 10 years from now. The standards ensure interoperability, interchangeability and substitutability.

In fact, far from constraining choice, standards enable greater choice. Because the basic plugs, receptacles and connectors are governed by standards, these core components have become commodities and are produced offshore at low cost to you, the consumer. This pushes lighting designers and manufacturers to compete on the basis of style, elegance, utility and features. So standards result in lower cost, greater competition and greater choice for the consumer.

These are the choices enabled by standards, and the choices that consumers want.

It is the same thing with document formats. Consumers don’t want to worry about document formats. They don’t want to even think about document formats. They just want them to work, invisibly, without problems. Of course consumers want choice, but it is the choice of applications, choice of features, choice of vendors, choice of support options, choice of open source versus proprietary source, choice of heavy weight versus web-based, a choice of buying a single application versus buying a suite, etc. A single universal file format is what makes these other choices possible, just like a choice of the Medium Edison Screw bulb leads to an affordable choice in lamp designs.

To perpetuate vendor-specific file formats is like having a lamp that requires a special plug, a special light bulb and even special electricity. It is to go through life with a bag full of adapters and transformers that you will need whenever you, or someone else, needs to use your document in another application. The cost of multiple document standards is that the industry will invest in a wide variety of converters between the formats, and this cost will be passed on to the consumer. These converters will work sometimes, but will inevitably be slower and buggier, have less fidelity, and lag the office applications by months or years.

The choice is yours.


Updates:

2/14/2007: A discussion of one critic’s response to this essay can be read in this follow-up.

Filed Under: Standards


Copyright © 2006-2026 Rob Weir · Site Policies