
An Antic Disposition


ODF Freely Available

2007/03/19 By Rob 1 Comment

Another step forward for ODF. After gaining ISO approval in May, and Publication status in December, ISO/IEC 26300 is now counted among ISO’s “Freely Available Standards”. What is the significance of this? The text is identical to what it was in May, but you no longer need to pay 342 Swiss Francs to ISO to download an official copy. It is now free. Enjoy!

Filed Under: ODF

The Case for a Single Document Format: Part I

2007/03/18 By Rob 15 Comments

This will be a multi-part post, mixing in a little economics, a little history and a little technology — an intellectual smörgåsbord — attempting to make the argument that a single document format is the inevitable and desired outcome.

In Part I we’ll take a survey of a number of different problem domains, some that resulted in a single standard, some that resulted in multiple standards.

In Part II we’ll try to explain the forces that tend to unify or divide standards and hopefully make sense of what we saw in Part I.

In Part III we’ll look at the document formats in particular, how we got to the present point, and how and why historically there has always been but a single document format.

In Part IV, if needed, we’ll tie it all together and show why there should be, and will be, only a single open digital document format.

Let’s get started!

Standards — in some domains there is a single standard, while in other domains there are multiple standards. What is the logic of this? What domains encourage, or even demand a single standard? And where do multiple standards coexist without problems?

Let’s take a look at some familiar examples and see if we can figure out how this works. We’ll start with some examples where a single standard dominates.

Single Standards

The story of the standard rail gauge is probably familiar to you. At first each rail company laid down its own tracks to its own specifications. In the United States there were different gauges used in the North (4′ 9″) and the South (5′). This was not a major issue so long as rail travel remained local or regional. However, as the reach of commerce increased, so did the pain of dealing with the “break of gauge” between adjacent gauge systems. Passengers and goods needed to be offloaded and transferred to a different train, causing time delays and inefficient utilization of equipment. The decision was made to adopt a Standard Gauge of 4′ 9″, and an ambitious migration project took place on May 31st, 1886, when thousands of workers in the South adjusted the west rail and moved it 3″ to the east, lining up with the Northern gauge. Eleven thousand miles of track were converted in thirty-six hours.

It should be noted that this unification was not universally celebrated. In particular, riots occurred at some of the junction points, like Erie, Pennsylvania, where local workers stood to lose the high-paying jobs they had unloading and loading cargo onto new trains. Efficiency is often opposed by those who profited from inefficiency.

Another standard prompted by the railroad was the adoption of standard time. In earlier days each town and city had its own local time, roughly based on local mean solar time. When it was noon in Chicago, it was 12:09 in Cincinnati, and 11:50 in St. Louis. The instant of local noon would be communicated to residents by a cannon shot or by dropping a ball from a tower, allowing all to synchronize their clocks. The ball drop could be observed by ships in the harbor by telescope and so was much more accurate than the cannon, since the signal was not delayed by the non-negligible travel time of sound. Some memory of this tradition continues to this day with the New Year’s Eve ball drop in Times Square.

When it took days by coach to travel from Chicago to Cincinnati, it did not matter that your watch was 9 minutes slow. Your watch probably wasn’t accurate enough to tell the difference in any case. When noon came in Cincinnati you would synchronize your watch, knowing that some of the correction was caused by the change in longitude, and some was caused by the imperfections in the watch. But the average person did not care, because they did not travel all that much.

However, with the coming of the railroad and then the telegraph, everything changed. People, goods and information could be transferred at far greater speeds. The difference of 9 minutes was now significant.

Initially, each rail company defined its own time, based on the local time of its main office. Timetables would be printed up based on this time. So a large train station, which may serve six different lines, would display six different clocks, all set to different times, some 12 minutes ahead, some 15 minutes behind, etc. At one point, trains in Wisconsin were operating on 38 different times! This was not only an inconvenience to travelers, it was also increasingly a safety concern, since the use of different time systems at the same station increased the chance of collisions.

This was addressed by the adoption of Standard Time in the United States on November 18th, 1883, the so-called “Day of Two Noons”. This was the day that the Eastern, Central, Mountain, and Pacific time zones took effect, and on this day every town adjusted its local time to the Standard Time of its new time zone. If you were in the eastern half of your time zone, then when local noon came you would set your clocks back a specified number of minutes, and would thus observe noon twice. If you were in the western half of your time zone, you would advance your clocks at local noon a specified number of minutes. The contemporary coverage of this event in The New York Times is worth a read.

Over the years, the ever-increasing rate of commerce and information flow has led to greater and greater precision in timekeeping, so that today, with atomic clocks and UTC, we can account for the slowing of the Earth’s rotation and the insertion of occasional leap seconds.

The International Civil Aviation Organization (ICAO) is a UN agency that maintains various aeronautical standards, such as airport codes, aircraft codes, etc. They are also responsible for making English the required language for air-to-ground communications. So when an Italian plane, with an Italian crew on an Italian domestic flight contacts the approach tower at an Italian airport, manned by Italian personnel, they will contact the tower in English. Why do you think this is so?

The diameter of beverage cans has but little variation. A can of Coca-Cola and a can of Pepsi will both fit in my car’s cup holder. They also fit fine in the cup holders in my beach chair or riding lawnmower. This works with beer cans as well, with innovative holders such as the novelty beer hat. Vending machines seem to take advantage of this standard as well, since it simplifies their design. The whole beverage can ecosystem works because of standards around beverage can sizes. How is this standard maintained? Was it planned this way?

It is interesting to note that, from the beverage company’s perspective, this is non-optimal. A can has minimum surface area for a given volume when its height equals its diameter. But we never see beverage cans of that shape. Why not?
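
A quick check of that claim, sketched here for readers who want to verify it: for a cylinder of radius $r$ and height $h$ holding a fixed volume $V = \pi r^2 h$, the surface area is

$$A = 2\pi r^2 + 2\pi r h = 2\pi r^2 + \frac{2V}{r}.$$

Setting $\frac{dA}{dr} = 4\pi r - \frac{2V}{r^2} = 0$ gives $V = 2\pi r^3$, hence $h = V/(\pi r^2) = 2r$: the surface area, and so the material cost, is minimized exactly when the height equals the diameter.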

In the United States, our television signals are encoded in the NTSC system. PAL is used in most of Western Europe and Asia, and SECAM is used in France and Eastern Europe. The United States is moving to a new standard, High Definition (HDTV), by February 17th, 2009. This move to a new television standard is the law, as enacted by Congress, imposing expenses on broadcasters and consumers, as well as generating a lot of revenue for electronics manufacturers. Why did this require a law? If it was good for consumers and for manufacturers, wouldn’t the free market make this move on its own?

The Great Baltimore Fire of 1904 quickly grew beyond the control of local fire companies. As the fire spread to encompass the entire central business district, the unprecedented call went out by telegraph for assistance from fire companies from Washington, DC and Annapolis, and from as far away as Philadelphia, Atlantic City and New York. But when these companies arrived, with their own equipment, they found that their hose couplings were incompatible. This was a large contributing factor to the fire’s duration and destructive power. Over 1,500 buildings were destroyed in 30 hours. Within a year there was a national standard for fire hose couplings.

To these can be added the hundreds of standardized items that we work with every day, such as standardized electrical connectors, light bulbs, food nutritional labels, gasoline nozzles, network addresses, batteries, staples, toilet paper holders, telephone networks, remote control infrared signals, envelopes, paper sizes and weights, currency, plumbing fixtures, light switch face plates, radio frequencies and modulations, screws, nails and other fasteners, etc.

Multiple Standards

Now let’s switch to some examples of domains where multiple standards have flourished.

The textbook example is the safety razor. When Gillette invented the safety razor, the blades were interchangeable and disposable, made of carbon steel. As such they rusted and needed to be replaced frequently. Wilkinson Sword, later owner of the Schick brand, started making compatible stainless steel blades, which Gillette then copied. So there was a good amount of competition going on.

In the early 1970’s Gillette moved to embed the blades into disposable cartridges which, due to their patent protection, could not be copied by other manufacturers. This led to our present situation of having multiple, incompatible razor systems. Competition remains fierce, with a battle to see who can put the most blades in a cartridge, from the Gillette Trac II with two blades and the Mach 3 with three blades, to Schick’s Quattro with four blades, to Gillette’s Fusion with five blades. Any guesses on what is next?

Video game consoles are in a similar position. In fact, they are often called a “razor and razor blade” business, since the makers sell the consoles at less than cost and later make their profit selling game cartridges in proprietary formats. There is little interest in, and seemingly little demand for, a universal game cartridge standard.

Another example is the realm of SLR camera lens mounts. Each camera manufacturer has their own system of incompatible lens mounts. Is one clearly better than another? Have the multiple standards encouraged innovation in the area of lens mounts over the past 40 years? Good question. All I know is I have a bag full of Minolta lenses that I can’t use anymore since I moved to a Pentax camera.

We’ve all seen the many optical storage formats in recent years. Just in the realm of writable DVD disk standards, we’ve seen DVD-R, DVD-RW, DVD+RW and DVD-RAM, many of them in single and double-sided variations.

In the past 5 years we’ve seen perhaps a dozen or more varieties and variations of memory card formats, all of them proprietary and incompatible with each other. It makes the state of optical disk formats seem regular and peaceful in comparison.

To these can be added the hundreds of daily items that have managed to avoid a single standard, such as vacuum cleaner bags, coffee filters, laptop power supplies, cell phone chargers, high definition video disc formats, surround sound audio disc formats, etc.

That is all for Part I. Some questions to ask yourself:

  1. In the examples given of domains where there is a single standard, most of them did not start off that way. Most started with many competing approaches. What forces led them to a single standard?
  2. Who won and who lost in moving to a single standard? Who decided to make the move?
  3. In the cases where there are multiple, incompatible standards, is there a market demand for unified standards? Why or why not?
  4. If a government decree came down today and mandated a single standard in those areas, what would be gained? What would be lost?

I hope you will continue on with reading Part II.

Filed Under: Standards

Fast Track. Wrong Direction.

2007/03/13 By Rob 26 Comments

The idea was to make the C++ programming language work better in Microsoft’s .NET framework. It started off as the Managed Extensions for C++, first available in 2000, and later in Visual Studio .NET 2003. Managed Extensions were reformulated in Visual Studio 2005 where they were called C++/CLI, referring to the Common Language Infrastructure, the runtime abstraction in .NET.

CLI itself had earlier been standardized in Ecma (approved in 2000) and Fast Tracked through ISO (approved in 2001). So, it was not much of a surprise when the C++ variant for Microsoft’s .NET Framework, C++/CLI, was proposed for standardization as well. Ecma TC39/TG5 started work on C++/CLI in December 2003 and Ecma approved the specification as Ecma-372 in December 2005. Two years in committee, resulting in a 304-page specification. This used to be considered a fast pace.

After approval by Ecma, C++/CLI was submitted for Fast Track processing to ISO/IEC JTC1/SC22 as DIS 26926. Like any other Fast Track in JTC1, this process started with a 30-day contradiction period. Contradiction submissions were made by both Germany[pdf] and the UK[pdf].

The UK’s position was that calling the standard “C++/CLI” would cause, and in fact was already causing, confusion among users with the already existing C++ programming language. The name of the standard was unacceptable:

We consider that C++/CLI is a new language with idioms and usage distinct from C++. Confusion between C++ and C++/CLI is already occurring and is damaging to both vendors and consumers.

A new language needs a new name. We therefore request that Ecma withdraw this document from fast-track voting and if they must re-submit it, do so under a name which will not conflict with Standard C++.

Similar views were expressed by Germany:

With reference to §13.4 of the JTC1 Directives, 4th edition, DIN brings to the attention of the JTC1 secretariat that we perceive a contradiction between document JTC 1 N 8037 “30 Day Review for Fast Track Ballot ECMA-372 1st edition C++/CLI Language Specification”and the JTC1/C++ standard ISO/IEC 14882:2004 “Programming language C++” and related technical reports.

We propose that the document is input into SC22 as a regular New Work Item Proposal and assigned to WG21 for further processing.

Ecma responded[pdf] to these objections in a 5-page letter on 29 January 2006 that refused to make even the most basic concession, such as changing the name to remove the C++ reference.

So the objections were ignored, and the process moved on to the 5-month ballot period, starting March 9th, 2006. When the ballot closed in August and the votes were counted, C++/CLI had received 11 out of 20 P-Member votes (55%) and a total of 9 negative votes out of 26 total votes cast, or 34.6%. So it failed both to get the required 2/3 approval of P-Members and to keep the negative votes below 25%.
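
As a quick sanity check of that arithmetic, here is a minimal sketch of the two approval tests as described above (at least two-thirds of P-Member votes in favor, and fewer than one quarter of all votes cast negative). The function name and structure are illustrative only, not part of any official tooling.

```python
# Minimal sketch of the two JTC1 Fast Track approval tests described above.

def fast_track_passes(p_yes: int, p_total: int,
                      negative: int, total_cast: int) -> bool:
    """True only if both criteria are met: at least 2/3 of P-Member
    votes approve, and negative votes are below 25% of votes cast."""
    p_member_ok = p_yes / p_total >= 2 / 3
    negatives_ok = negative / total_cast < 0.25
    return p_member_ok and negatives_ok

# The DIS 26926 (C++/CLI) ballot as reported above:
# 11 of 20 P-Member votes in favor (55%), 9 negative of 26 cast (34.6%).
print(fast_track_passes(11, 20, 9, 26))  # False: it fails both tests
```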

Germany and the UK voted disapproval. No surprise there, since they had objected early in the process, and their objections were ignored. In fact one of Germany’s comments in the ballot was:

DIN has commented before, as well as BSI did, that allowing fast-track standardization of the “C++/CLI Language” under this name clearly conflicts with an existing and actively maintained standard: ISO 14882 – the C++ Programming Language. The document under review spells out under “NOTE FROM ITTF”, bullet 2.2, that ITTF will ascertain that this proposed standard does not conflict with any other International Standard but such a conflict was pointed out. No reason has been given why this objection was overridden. Thus, DIN wants to express its surprise that standardization of this proposal went forward.

The US comments included:

The proposed standard is not market driven, nor is it the product of an industry consensus.

We are unimpressed with the very low level of C++ community participation mustered in the design and refinement of the current document, and feel, quite frankly, that the current state of this document is not at a high enough level of technical excellence to merit the ISO imprimatur.

France said:

This document should be withdrawn from the fasttrack approval process pending re-drafting and a more adequate review prior to voting. Better yet, retain it as an Ecma standard only until a clear market consensus develops that a JTC1 standard in this area is needed.

And so on, down the list.

It should be noted that a failing vote in the 5-month ballot is not necessarily fatal. The Fast Track submitter, in this case Ecma, can call on the SC Secretariat to convene a Ballot Resolution Meeting (BRM), where the issues can be discussed and resolved, possibly leading to a positive vote after a further ballot. This is Ecma’s right as a Fast Track submitter. However, C++/CLI did not see a ballot resolution meeting. The JTC1 Secretariat recently notified SC22 members:

We have been advised that the comments accompanying the Fast Track ballot for DIS 26926 are not resolvable and that holding a Ballot Resolution Meeting (BRM) would not be productive or result in a document that would be acceptable to the JTC 1 National Bodies. Therefore, our proposal is to not hold the BRM and to cancel the project.

So, the BRM, which had been scheduled for April 2007, has been canceled, and that’s where it stands today, with the attempted Fast Track of C++/CLI dead from seemingly easily preventable flaws.

Lessons, anyone?

Don’t ignore NB members. If they take the time and make the effort to point out your flaws early in the process, then you should count yourself lucky. This is like the school teacher walking around the classroom during a quiz and pointing to one of your answers and saying, “You might want to take another look at that problem”. If you ignore her advice and just turn in your paper, then you deserve the grade you get.

It is instructive as well that although only two NBs objected in the C++/CLI contradiction period, this grew to a far larger number by the time the 5-month ballot had ended. Ignoring problems doesn’t make them go away.

One last thing. Any guesses on how long those contradiction arguments stay online before they are taken down to preserve the shrouded secrecy of ISO process? I advise you to make a copy now. I certainly have.

Filed Under: OOXML

Document Migrations

2007/03/06 By Rob 11 Comments

If you’ve been around this business for a while, you’ve seen your share of migrations. New operating systems, new networks, new hardware, even new document formats. I’d like to share some recollections of one such migration, and then suggest a solution.

In 1995 I was working at Lotus on Freelance Graphics, along with many others, getting SmartSuite ready for Windows 95. One day, as I walked to work and rounded the corner of Binney Street, I saw something unusual, even more unusual than the usual unusual one sees in Cambridge. Something was up. There were news vans parked in front of LDB, camera crews and reporters looking for comments, Lotus security videotaping the reporters asking for comments, and me standing there, clueless.

This was how I first heard of IBM’s take-over offer. It was hard to concentrate on porting to Windows 95 with all that news going on downstairs, but we managed.

In the weeks and months that followed there were many changes. At Lotus we were 100% SmartSuite users. No surprise there. Most of us did not even have a copy of Microsoft Office on our machines, unless we worked on file compatibility. Not only did we use SmartSuite for our collaborative work, creating and reviewing specifications, giving presentations, etc., we also ran some of our business processes on it. In particular we used an expense report application, done in 1-2-3 with LotusScript.

But IBM used Microsoft Office. So when IBM took over, we needed to migrate. Sure, there was whining and moaning and gnashing of teeth on our end about having to move to an inferior product. And it did take a little while to get accustomed to the different conventions of Office, typing AVERAGE() in Excel rather than @AVG() in 1-2-3, and so on. But we did it. We moved to Office. It was clear to all that the benefits of having a single file format outweighed the short-term pain of migration.

It is interesting what we did not do:

  1. We did not go and convert all existing legacy SmartSuite documents into Office format. What would have been the point? Most old documents are never touched again. Let them rest in peace.
  2. We did not delete SmartSuite from our hard drives. We kept the application there for cases where we needed to access old documents.
  3. We did not simply continue using SmartSuite and tell it to save in Office format. We knew that both fidelity-wise and performance-wise it is far better to use an application that supports a format natively than to rely on conversion software for interoperability.
  4. We did not translate 1-2-3 macro-based applications into Excel macro-based applications. We took the opportunity to move straight to web based applications. Aside from some standard presentation templates and similar boiler-plate templates we did not do a lot of conversion work.

In retrospect, the migration of file formats was one of the least contentious changes that accompanied the IBM takeover. We can handle file format changes, but eliminating the traditional Friday Beer Cart, now that was something to complain about…

I’m not much of one for committing unprovoked acts of methodology, but if I had to summarize what little wisdom I have in this area, I’d say that for a migration you want to evaluate your existing documents by three criteria: stability, complexity and business criticality, and develop a migration plan based on that.

In the first case you classify documents by how stable (unchanging) they are:

  1. Hot documents — the documents that are being heavily changed and edited today, works-in-progress, in active collaborations
  2. Cold documents — the documents which are no longer edited, though perhaps they are still read. Many of these documents may have zero value and are just taking up space. Others may be valuable records, but hidden away on someone’s hard-drive.
  3. Warm documents — These are the ones that are in the middle, not seeing heavy activity, but they aren’t quite frozen either.

From the perspective of complexity we have:

  1. Low complexity — simple text and graphics
  2. Medium complexity — using more advanced features, created by power users
  3. High complexity — “engineered documents”, using scripting and macros to create applications.

Finally you can also look at these documents from the perspective of business criticality. Of course, this will vary according to your business. It might be relevance to ongoing litigation, it might be according to a records retention policy, it might be whether it concerns currently open projects, etc. But for sake of argument, let’s take client or public exposure as a proxy for criticality, so we get this:

  1. Internal use documents — internal presentations and reports
  2. Customer facing documents — engagement reports, proposals, etc.
  3. Publication ready documents — white papers, journal articles, etc.

These three dimensions — stability, complexity and criticality — can be combined, creating 27 different document classes. For example, our old expense report based on 1-2-3 macros would be classified as a hot, high complexity, internal use document.

So you are transitioning from Office legacy binary formats to ODF. What do you do with each of these document classes? You have four main strategies to consider:

  1. Do nothing and preserve the document in the legacy format, maintaining, as needed, access to the legacy application.
  2. Convert the document to a portable, high-fidelity static representation, like PDF.
  3. Convert directly to ODF.
  4. Reengineer as something other than a document.

So one migration policy might look like this:

Stability | Complexity | Exposure          | Strategy
Cold      | Low        | Internal Use      | Do nothing
Cold      | Low        | Customer Facing   | Do nothing
Cold      | Low        | Publication Ready | Do nothing
Cold      | Medium     | Internal Use      | Do nothing
Cold      | Medium     | Customer Facing   | Do nothing
Cold      | Medium     | Publication Ready | Do nothing
Cold      | High       | Internal Use      | Do nothing
Cold      | High       | Customer Facing   | Convert to PDF
Cold      | High       | Publication Ready | Convert to PDF
Warm      | Low        | Internal Use      | Convert to ODF
Warm      | Low        | Customer Facing   | Convert to ODF
Warm      | Low        | Publication Ready | Convert to ODF
Warm      | Medium     | Internal Use      | Convert to ODF
Warm      | Medium     | Customer Facing   | Convert to ODF
Warm      | Medium     | Publication Ready | Convert to ODF
Warm      | High       | Internal Use      | Convert to ODF
Warm      | High       | Customer Facing   | Publish as PDF
Warm      | High       | Publication Ready | Publish as PDF
Hot       | Low        | Internal Use      | Convert to ODF
Hot       | Low        | Customer Facing   | Convert to ODF
Hot       | Low        | Publication Ready | Convert to ODF
Hot       | Medium     | Internal Use      | Convert to ODF
Hot       | Medium     | Customer Facing   | Convert to ODF
Hot       | Medium     | Publication Ready | Convert to ODF
Hot       | High       | Internal Use      | Reengineer
Hot       | High       | Customer Facing   | Reengineer
Hot       | High       | Publication Ready | Reengineer

There may be a better way of expressing this above (Karnaugh maps, anyone?), but that gives the idea. Also, I’m not suggesting that this is the “one true answer”, but merely that this may be a useful way of framing the problem.
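
To make this concrete, here is a minimal sketch in Python of how such a policy might be encoded and queried. The rules simply mirror the example table above; the function and category names are illustrative, not a recommendation for any particular organization.

```python
# A minimal sketch of the example migration policy above. The rules
# mirror the table; they are illustrative only.

def strategy(stability: str, complexity: str, exposure: str) -> str:
    """stability: 'Cold' | 'Warm' | 'Hot'
    complexity: 'Low' | 'Medium' | 'High'
    exposure: 'Internal Use' | 'Customer Facing' | 'Publication Ready'"""
    if stability == "Cold":
        # Cold documents are mostly left alone; complex ones with
        # outside exposure are frozen as PDF instead.
        if complexity == "High" and exposure != "Internal Use":
            return "Convert to PDF"
        return "Do nothing"
    if complexity == "High":
        if stability == "Hot":
            return "Reengineer"  # e.g. as a web application
        # Warm, high-complexity documents
        if exposure == "Internal Use":
            return "Convert to ODF"
        return "Publish as PDF"
    # Everything else still warm or hot moves to ODF.
    return "Convert to ODF"

# Example: the old 1-2-3 expense report, a hot, high-complexity,
# internal-use document, would be reengineered.
print(strategy("Hot", "High", "Internal Use"))  # -> Reengineer
```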

Variations might include:

  • Have a default policy of doing no conversions, but create all new documents in ODF format.
  • By default, ignore all legacy documents. But the first time any legacy document is read or written, put it into a queue for evaluation and possible conversion.

Much of this lends itself to automation. For example:

  1. First you need to find all of the documents in an organization. This could be done by an ActiveX control on a page everyone in the company visits, an agent that spiders the intranet web pages and file servers, etc.
  2. Each document is then scored.
  3. Finding the stability of a document could be done by looking at the last-read and last-write timestamps on the file. You could also look at web server logs, or even at metadata in the document that records how many times it has been edited.
  4. Complexity could be determined by scanning the document to see what features it uses. Some features, like script, would weight heavily for complexity. Think of it as a “goodness of fit” metric for how well the features used in the document fit within the ODF model.
  5. Business criticality is harder to automate, but could be done based on owner of the document, metadata in the document, location of the document (public web page versus intranet), etc.
  6. Calculate the scores, suggest actions to take, and then automate the action. This could lead to a nice automated migration solution, as sketched below.
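
As a rough illustration of how such scoring might be automated, here is a small Python sketch. The heuristics (file modification times for stability, a crude scan of the ZIP-based document package for complexity, a path-based placeholder for criticality) and all thresholds are assumptions made for the example, not a finished tool.

```python
# A rough sketch of automated document scoring along the lines described
# above. All thresholds and heuristics are illustrative assumptions.
import os
import time
import zipfile

def stability(path: str, hot_days: int = 30, warm_days: int = 365) -> str:
    """Classify by the file's last-modified time, one of the signals above."""
    age_days = (time.time() - os.path.getmtime(path)) / 86400
    if age_days <= hot_days:
        return "Hot"
    return "Warm" if age_days <= warm_days else "Cold"

def complexity(path: str) -> str:
    """Crude feature scan of a ZIP-based package (ODF or OOXML):
    scripts and macros weigh heavily, embedded objects moderately."""
    try:
        with zipfile.ZipFile(path) as pkg:
            names = [n.lower() for n in pkg.namelist()]
    except zipfile.BadZipFile:
        return "High"  # not a ZIP package (e.g. legacy binary); flag for review
    if any("macro" in n or n.startswith("basic/") for n in names):
        return "High"
    if any(n.startswith(("objectreplacements/", "word/embeddings/")) for n in names):
        return "Medium"
    return "Low"

def exposure(path: str) -> str:
    """Placeholder: in practice this would come from document location,
    ownership or metadata, not the file path alone."""
    return "Customer Facing" if "/public/" in path else "Internal Use"

# Tie the scores together with the policy function sketched earlier:
# print(strategy(stability(p), complexity(p), exposure(p)))
```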

In summary, it probably is not worthwhile simply to go out and convert all of your legacy documents in a giant cathartic orgy of document transformations. Not all documents are worth that effort. In any organization you probably have many, many documents that will never be read again, ever. You also likely have some very complex documents that probably should be reengineered as web applications on your intranet. The other documents, the ones in the middle, that is where you focus your migration effort.

Filed Under: ODF

Compatibility According to Humpty Dumpty

2007/03/04 By Rob 15 Comments

‘I don’t know what you mean by “glory,” ’ Alice said.

Humpty Dumpty smiled contemptuously. ‘Of course you don’t — till I tell you. I meant “there’s a nice knock-down argument for you!” ’

‘But “glory” doesn’t mean “a nice knock-down argument,” ’ Alice objected.

‘When I use a word,’ Humpty Dumpty said, in a rather scornful tone, ‘it means just what I choose it to mean, neither more nor less.’

‘The question is,’ said Alice, ‘whether you can make words mean so many different things.’

‘The question is,’ said Humpty Dumpty, ‘which is to be master – that’s all.’

— Lewis Carroll from Through the Looking-Glass (1871)

I have written about Microsoft’s language games previously. These games continue and it appears to be time for yet another inoculation. Words such as “open”, “choice”, “interoperability”, “standard”, “innovation” and “freedom” have been bandied about like patriotic slogans, but with meanings that are often distorted from their normal uses.

The aggrieved word I want to examine today is “compatibility”. Let’s see how it is being used, with some illustrative examples, the ipsissima verba, Microsoft’s own words:

From an open letter “Interoperability, Choice and Open XML” by Jean Paoli and Tom Robertson:

The specification enables implementation of the standard on multiple operating systems and in heterogeneous environments, and it provides backward compatibility with billions of existing documents.

From another open letter, Chris Capossela’s “A Foundation for the New World of Documents”:

… all the features and functions of Office can be represented in XML and all your older Office documents can be moved from their binary formats into XML with 100 percent compatibility. We see our investment in XML support as the best way for us to meet customers’ interoperability needs while at the same time being compatible with the billions of documents that customers create every year.

From Doug Mahugh: “The new Open XML file formats offer true compatibility with all of the billions of Office documents that already exist.”

And from Craig Kitterman: “Is backward compatibility for documents important to you? How about choice?”

Those are just a handful of examples. Feel free to leave a comment suggesting additional ones.

Compatibility. Better yet, True Compatibility. What is that? And what do you think the average user, or even the average CTO, thinks, when hearing these claims from Microsoft about 100% compatibility?

Let’s explore some scenarios and try to reverse-engineer Microsoft’s meaning of “True compatibility”.

Suppose you get a new, more powerful PC with more memory and upgraded graphics card and you upgrade to Vista and Office 2007. You create a new presentation in PowerPoint 2007 and save it in the new OOXML format. What can you do with it?

Can you exchange it with someone using Office on the Mac? Sorry, no. OOXML is not supported there. They will not be able to read your document.

Is this 100% compatibility?

What about Windows Mobile? Can I read my document there? Sorry, OOXML is not supported there either.

Is this 100% compatibility?

What about sending the file to your friends using SmartSuite, WordPerfect Office or OpenOffice, or KOffice? They all are able to read the legacy Microsoft formats, so surely a new format that is 100% compatible with the legacy formats should work here as well? Sorry, you are out of luck. None of these applications can read your OOXML presentation.

Is this 100% compatibility?

What about legacy versions of Microsoft Office? Can I simply send my OOXML file to a person using an old version of Office and have it load automatically? Sorry, older versions of Office do not understand OOXML. They must either upgrade to Office 2007 or download and install a convertor.

Is this 100% compatibility?

I have Microsoft Access XP and an application built on it that imports data ranges from Excel files into data tables. Will it work with OOXML spreadsheets? Sorry, it will not. You need to upgrade to Access 2007 for this support.

Is this 100% compatibility?

What about other third-party applications that take Office files as input: statistical analysis packages, spreadsheet compilers, search engines, document viewers, etc.? Will they work with OOXML files? No. Until they update their software, your OOXML documents will not work with software that expects the legacy binary formats.

Is this 100% compatibility?

Suppose I, as a software developer, take the 6,039-page OOXML specification and write an application that can read and display OOXML perfectly. It will be hard work, but imagine I do it. Will I then be able to read the billions of legacy Office documents? Sorry, the answer is no. The ability to read and write OOXML does not give you the ability to read and write the legacy formats.

Is this 100% compatibility?

So, there it is. I don’t know if we’re any closer to finding out what “100% compatibility” means to Microsoft. But we certainly have found a lot of things it doesn’t mean.

A quick analogy. Suppose I designed a new DVD format, standardized it, and said it was 100% compatible with the existing DVD standard. What would consumers think this means? Would they think that DVDs in the new format could play in legacy DVD players? Yes, I believe that would be the expectation, based on the normal meaning of “100% compatible”.

But what if I created a new DVD player and said it supported a new DVD format, and also that the player was 100% compatible with the legacy format. What would consumers think then? Would they expect that the new DVDs would play in older players? No, that is not implied. Would they expect that older DVDs could be played in the new player? Yes, that is implied.

This is the essence of Microsoft’s language game. They are confusing the format with the application. This is easy to do when your format is just a DNA sequence of your application. However, although Microsoft Office 2007, the application, may be able to read both OOXML and the legacy formats, the OOXML format itself is not compatible with any legacy application. None. The only way to get something to work with OOXML is to write new code for it.

This is not what people expect when they hear these claims of OOXML being 100% compatible with legacy formats.

Filed Under: OOXML



Copyright © 2006-2026 Rob Weir · Site Policies