
An Antic Disposition


Broken Windows and the Ghost of Keynes

2007/01/03 By Rob 11 Comments

Bad ideas never die.

The latest relapse is:

For every dollar of Microsoft revenue from Windows Vista in 2007 in the U.S., the ecosystem beyond Microsoft will reap $18 in revenues. In 2007 this ecosystem should sell about $70 billion in products and services revolving around Windows Vista.

The source for this rosy forecast is the recent IDC whitepaper, The Economic Impact of Microsoft Windows in the United States.

They summarize this boon as:

The IDC research shows that the launch of Windows Vista will precipitate cascading economic benefits, from increased employment in the region to a stronger economic base for those 200,000 or so local firms that will be selling and servicing products that run on Windows Vista. Nearly two million IT professionals and industry employees will be working with Windows Vista in 2007.

These direct benefits —157,000 new jobs and $70 billion in revenues to companies in the US IT Industry — will help local economies grow, improve the labor force, and support the formation of new companies. The indirect benefits of using newer software will help boost productivity, increase competitiveness, and support local innovation.

In the history of economic thought, this is the Multiplier Effect: the belief that an increase in spending leads to more spending, and then to still more spending, in a feedback loop that ultimately grows the entire economy. This bootstrap theory was popularized by John Maynard Keynes and became influential in some circles as a way to reduce underutilization in the economy. In other words, if unemployment is high and industrial capacity is underused, then it is worthwhile to have the government make work for people or spend money. Work, any work, will get the money flowing again. This led to the various “alphabet agencies” of F.D.R.’s New Deal program.
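The mechanics of the multiplier are simple geometric-series arithmetic. A quick illustrative sketch (the marginal propensity to consume `c` and the initial outlay are assumed numbers, not figures from the post or the IDC report):

```python
# Textbook multiplier arithmetic (illustrative numbers only).
c = 0.75                  # of each dollar received, 75 cents is re-spent
initial_spending = 100.0  # the initial outlay, in dollars

# Each round of re-spending adds c times the previous round:
#   100 + 100*c + 100*c**2 + ... = 100 / (1 - c)
total = initial_spending / (1 - c)
print(total)  # 400.0 — a "multiplier" of 4 on the initial outlay
```

The broken-window fallacy below is precisely the objection that this series counts only the spending that happens, never the spending that was displaced.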

Keynes, at his boldest, illustrated the magical properties of his multiplier effect like this:

If the Treasury were to fill old bottles with banknotes, bury them at suitable depths in disused coal mines which are then filled up to the surface with town rubbish, and leave it to private enterprise on well-tried principles of laissez-faire to dig the notes up again (the right to do so being obtained, of course by tendering for leases of the note-bearing territory), there need be no more unemployment and with the help of the repercussions, the real income of the community, and its capital wealth also, would probably become a good deal greater than it actually is.

— from The General Theory of Employment, Interest and Money

The most cogent criticism of the Magic Multiplier goes back 50 years to Henry Hazlitt’s Economics in One Lesson, where he tells the tale of “The fallacy of the broken window”. It goes something like this:

Imagine the town baker’s shop window is broken by an errant baseball throw. An unfortunate expense for the baker, one might say. But that is a narrow, parochial view. Look instead at the benefit to the whole community. The window will cost $300 to replace. That money will go to the glazier, who will then use his profits to buy a new sofa from the furniture store, whose owner will then use his profits to buy a new bicycle for his child from the toy store, and so on. The money will continue to circulate in ever-widening circles, bringing joy to all. The original loss of $300 by the baker will be more than made up for by the aggregate increase in the amount of goods and services exchanged in the town. Instead of punishing the little boy who broke the window, he should be raised up and praised as a Universal Benefactor and Economic Sage of the First Order.

The problem with that argument is that it fails to look at the poor baker and what he might have done with the $300 if his window had not broken. Maybe he would have bought a new suit with that money. The tailor then might have bought a new sofa with his profits, and so on. The interconnectedness of the economy was not precipitated by the broken window. It was always there. The only things changed by the broken window are that the baker has no new suit, and the glazier has his money. Since you can never see the suit that was never made, it is easy to forget that the benefits to the glazier did not come from nothing.

So back to the IDC report, and this forecast of $70 billion in Vista-related spending. The question to ask is: where is all this money coming from? And what might it have been used for if not spent on Vista-related purchases? Obviously this money does not materialize out of a vacuum. Is it coming from profits? From shareholders? From deferring other investments? Cutting back on training? Moving more jobs off-shore? Reducing quality? What companies and sectors of the economy are going to suffer for this shift in investment? What innovations will not occur because people are allocating resources to this upgrade?

In the end is $70 billion of new value really being produced? Or are we merely fixing broken Windows?

Filed Under: Economics

How to hire Guillaume Portes

2007/01/03 By Rob 65 Comments

You want to hire a new programmer and you have the perfect candidate in mind, your old college roommate, Guillaume Portes. Unfortunately you can’t just go out and offer him the job. That would get you in trouble with your corporate HR policies which require that you first create a job description, advertise the position, interview and rate candidates and choose the most qualified person. So much paperwork! But you really want Guillaume and only Guillaume.

So what can you do?

The solution is simple. Create a job description that is written specifically to your friend’s background and skills. The more specific and longer you make the job description, the fewer candidates will be eligible. Ideally you would write a job description that no one else in the world could possibly match. Don’t describe the job requirements. Describe the person you want. That’s the trick.

So you end up with something like this:

  • 5 years experience with Java, J2EE and web development, PHP, XSLT
  • Fluency in French and Corsican
  • Experience with the Llama farming industry
  • Mole on left shoulder
  • Sister named Bridgette

Although this technique may be familiar, in practice it is usually not taken to this extreme. Corporate policies, employment law and common sense usually prevent one from making entirely irrational hiring decisions or discriminating against other applicants for things unrelated to the legitimate requirements of the job.

But evidently in the realm of standards there are no practical limits to the application of this technique. It is quite possible to write a standard that allows only a single implementation. By focusing entirely on the capabilities of a single application and documenting it in infuriatingly useless detail, you can easily create a “Standard of One”.

Of course, this raises the question of what is essential and what is not. This really needs to be determined by domain analysis, requirements gathering and consensus building. Let’s just say that anyone who claims a single existing implementation is all one needs to look at is missing the point. The art of specification is to generalize and simplify. Generalizing allows you to do more with less, meeting more needs with fewer constraints.

Let’s take a simplified example. You are writing a specification for a file format for a very simple drawing program, ShapeMaster 2007. It can draw circles and squares, and they can have solid or dashed lines. That’s all it does. Let’s consider two different ways of specifying a file format.

In the first case, we’ll simply dump out what ShapeMaster does in the most literal way possible. Since it allows only two possible shapes and only two possible line styles, and we’re not considering any other use, the file format will look like this:

<document>
<shape iscircle="true" isdotted="false"/>
<shape iscircle="false" isdotted="true"/>
</document>

Although this format is very specific and very accurate, it lacks generality, extensibility and flexibility. It may be useful for ShapeMaster 2007, but it will hardly be useful for anyone else, unless they merely want to create data for ShapeMaster 2007. It is not a portable, cross-application, open format. It is a narrowly defined, single-application format. It may be in XML. It may even be reviewed by a standards committee. But it is, by its nature, closed and inflexible.

How could this have been done in a way that works for ShapeMaster 2007 but is also more flexible, extensible and considerate of the needs of different applications? One possibility is to generalize and simplify:

<document>
<shape type="circle" lineStyle="solid"/>
<shape type="square" lineStyle="dotted"/>
</document>

Rather than hard-code the specific behavior of ShapeMaster, generalize it. Make the required specific behavior be a special case of something more general. In this way we solve the requirements of ShapeMaster 2007, but also accommodate the needs of other applications, such as OpenShape, ShapePerfect and others. For example, it can easily accommodate additional shapes and line styles:

<document>
<shape type="circle" lineStyle="solid"/>
<shape type="square" lineStyle="dotted"/>
<shape type="triangle" lineStyle="dashed"/>
</document>
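To see what the generalization buys a consumer, here is a minimal Python sketch (hypothetical code, not from any real implementation) that reads the extended document above. Notice that the triangle required no change to the reader, whereas the boolean-attribute format could not even represent it:

```python
import xml.etree.ElementTree as ET

doc = """<document>
<shape type="circle" lineStyle="solid"/>
<shape type="square" lineStyle="dotted"/>
<shape type="triangle" lineStyle="dashed"/>
</document>"""

# A reader written against the generalized format needs no built-in
# knowledge of the full set of shape types; it just reads attributes.
shapes = [(s.get("type"), s.get("lineStyle"))
          for s in ET.fromstring(doc).iter("shape")]

for kind, style in shapes:
    print(f"{kind}: {style}")
```

Had the vocabulary been `iscircle="true/false"`, every new shape would have forced a new boolean attribute and a change to every consumer.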

This is a running criticism I have of Microsoft’s Office Open XML (OOXML). It has been narrowly crafted to accommodate a single vendor’s applications. Its extreme length (over 6,000 pages) stems from it having detailed every wart of MS Office in an inextensible, inflexible manner. This is not a specification; this is a DNA sequence.

The ShapeMaster example given above is very similar to how OOXML handles “Art Page Borders” in a tedious, inflexible way, where a more general solution would have been not only more flexible, but also far easier to specify and implement. I’ve written on this in more detail elsewhere.

Here are some other examples of where the OOXML “Standard” has bloated its specification with features that no one but Microsoft will be able to interpret:

2.15.3.6 autoSpaceLikeWord95 (Emulate Word 95 Full-Width Character Spacing)

This element specifies that applications shall emulate the behavior of a previously existing word processing application (Microsoft Word 95) when determining the spacing between full-width East Asian characters in a document’s content.

[Guidance: To faithfully replicate this behavior, applications must imitate the behavior of that application, which involves many possible behaviors and cannot be faithfully placed into narrative for this Office Open XML Standard. If applications wish to match this behavior, they must utilize and duplicate the output of those applications. It is recommended that applications not intentionally replicate this behavior as it was deprecated due to issues with its output, and is maintained only for compatibility with existing documents from that application. end guidance]

(This example and the following examples brought to my attention by this post from Ben at Genii.)

What should we make of that? Not only must an interoperable OOXML application support Word 12’s style of spacing, but it must also support a different way of doing it in Word 95. And by the way, Microsoft is not going to tell you how it was done in Word 95, even though they are the only ones in a position to do so.

Similarly, we have:

2.15.3.26 footnoteLayoutLikeWW8 (Emulate Word 6.x/95/97 Footnote Placement)

This element specifies that applications shall emulate the behavior of a previously existing word processing application (Microsoft Word 6.x/95/97) when determining the placement of the contents of footnotes relative to the page on which the footnote reference occurs. This emulation typically involves some and/or all of the footnote being inappropriately placed on the page following the footnote reference.

[Guidance: To faithfully replicate this behavior, applications must imitate the behavior of that application, which involves many possible behaviors and cannot be faithfully placed into narrative for this Office Open XML Standard. If applications wish to match this behavior, they must utilize and duplicate the output of those applications. It is recommended that applications not intentionally replicate this behavior as it was deprecated due to issues with its output, and is maintained only for compatibility with existing documents from that application. end guidance]

Again, in order to support OOXML fully, and provide support for all those legacy documents, we need to divine exactly how Word 6.x “inappropriately” placed footnotes. The “Standard” is no help in telling us how to do this. In fact it recommends that we don’t even try. However, Microsoft continues to claim that the benefit of OOXML, and the reason why it deserves ISO approval, is that it is the only format that is 100% backwards compatible with the billions of legacy documents. But how can this be true if the specification merely enumerates compatibility attributes like this without defining them? Does the specification really specify what it claims to specify?

The fact that this and other legacy features are dismissed in the specification as “deprecated” is no defense. If a document contains this element, what is a consuming application to do? If you ignore it, the document will not be formatted correctly. It is that simple. Deprecated doesn’t mean “not important” or “ignorable”. It just means that new documents authored in Office 2007 will not have it. But billions of legacy documents, when converted to OOXML format, may very well have them. How well will a competing word processor do in the market if it cannot handle these legacy tags?

So I’d argue that these legacy tags are some of the most important ones in the specification. But they remain undefined, and by this ruse Microsoft has arranged things so that their lock on legacy documents extends to even when those legacy documents are converted to OOXML. We are ruled by the dead hand of the past.
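The dilemma a consuming application faces can be sketched in a few lines of Python (a simplified, hypothetical fragment; in a real OOXML package these flags live in the document's settings part under a namespace):

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified compatibility-settings fragment.
settings = """<compat>
<autoSpaceLikeWord95/>
<footnoteLayoutLikeWW8/>
</compat>"""

# Flags whose behavior the spec describes only as "emulate the behavior
# of that application" -- no algorithm is given, so a non-Microsoft
# consumer has nothing to implement.
UNDEFINED_EMULATIONS = {"autoSpaceLikeWord95", "footnoteLayoutLikeWW8",
                        "mwSmallCaps", "suppressTopSpacingWP"}

unimplementable = [el.tag for el in ET.fromstring(settings)
                   if el.tag in UNDEFINED_EMULATIONS]

for flag in unimplementable:
    # The only choices: ignore the flag (and render the document
    # incorrectly) or reverse-engineer a decades-old application.
    print(f"warning: cannot honor compatibility flag '{flag}'")
```

Every branch of that loop ends in a warning, which is exactly the point: the specification names the flags but withholds their semantics.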

Let’s go back even further in time to Word 5.0:

2.15.3.32 mwSmallCaps (Emulate Word 5.x for the Macintosh Small Caps Formatting)

This element specifies that applications shall emulate the behavior of a previously existing word processing application (Microsoft Word 5.x for the Macintosh) when determining the resulting formatting when the smallCaps element (§2.3.2.31) is applied to runs of text within this WordprocessingML document. This emulation typically results in small caps which are smaller than typical small caps at most font sizes.

[Guidance: To faithfully replicate this behavior, applications must imitate the behavior of that application, which involves many possible behaviors and cannot be faithfully placed into narrative for this Office Open XML Standard. If applications wish to match this behavior, they must utilize and duplicate the output of those applications. It is recommended that applications not intentionally replicate this behavior as it was deprecated due to issues with its output, and is maintained only for compatibility with existing documents from that application. end guidance]

You’ll need to take my word for it that “This emulation typically results in small caps which are smaller than typical small caps at most font sizes” falls well short of the level of specificity and determinism that is typical of ISO specifications.

Further:

2.15.3.51 suppressTopSpacingWP (Emulate WordPerfect 5.x Line Spacing)

This element specifies that applications shall emulate the behavior of a previously existing word processing application (WordPerfect 5.x) when determining the resulting spacing between lines in a paragraph using the spacing element (§2.3.1.33). This emulation typically results in line spacing which is reduced from its normal size.

[Guidance: To faithfully replicate this behavior, applications must imitate the behavior of that application, which involves many possible behaviors and cannot be faithfully placed into narrative for this Office Open XML Standard. If applications wish to match this behavior, they must utilize and duplicate the output of those applications. It is recommended that applications not intentionally replicate this behavior as it was deprecated due to issues with its output, and is maintained only for compatibility with existing documents from that application. end guidance]

So not only must an interoperable OOXML implementation first acquire and reverse-engineer a 14-year-old version of Microsoft Word, it must also do the same thing with a 16-year-old version of WordPerfect. Good luck.

My tolerance for cutting and pasting examples goes only so far, so suffice it for me to merely list some other examples of this pattern:

  • lineWrapLikeWord6 (Emulate Word 6.0 Line Wrapping for East Asian Text)
  • mwSmallCaps (Emulate Word 5.x for Macintosh Small Caps Formatting)
  • shapeLayoutLikeWW8 (Emulate Word 97 Text Wrapping Around Floating Objects)
  • truncateFontHeightsLikeWP6 (Emulate WordPerfect 6.x Font Height Calculation)
  • useWord2002TableStyleRules (Emulate Word 2002 Table Style Rules)
  • useWord97LineBreakRules (Emulate Word 97 East Asian Line Breaking)
  • wpJustification (Emulate WordPerfect 6.x Paragraph Justification)

This is the way to craft a job description so you hire only the person you earmarked in advance. With requirements like the above, no others need apply.

As I’ve stated before, if this were just a Microsoft specification that they put up on MSDN for their customers to use, this would be par for the course, and not worth my attention. But this is different. Microsoft has started calling this a Standard, and has submitted this format to ISO for approval as an International Standard. It must be judged by those greater expectations.


Update:

1/14/2007 — This post was featured on Slashdot on 1/4/07 where you can go for additional comments and debate. I’ve summarized the comments and provided some additional analysis here.

2/16/2007 — fixed some typos, tightened up some of the phrases.

Filed Under: OOXML, Popular Posts

And then there were three…

2006/12/27 By Rob Leave a Comment

ODF, OOXML and now, UOF. This story broke back in November, with some good coverage including:

  • Andy Updegrove: Another Open Document Format – From China and More on China’s Uniform Office Format (and much more)
  • Jeff Kaplan: Is China Pulling a Bill Gates on ODF?
  • David Berlind: China’s own document standard: A clear message to US IT vendors?
  • Rick Jelliffe: Why China’s UOF is good
  • Stephen Walli: Open Standards, IPR and Innovation Conference, Beijing (2006)
  • Neil McAllister: China aims to set a new office doc standard
  • Luyi Chen: China’s Own Office Document Format Aiming to Harmonize with ODF
  • Evan Leibovitch: Debate over document formats not just academic

There is not much commentary I can add to what the above authors have already stated. Let’s just say that this is an important and exciting development.

On the technical side there is some important progress on harmonization, stemming from preparatory work done in a joint research program between Peking University and IBM. The results of this year-long effort are now available:

  • A 150-page report (in English and Chinese) called “A Comparison Between ODF and UOF”. This document compares the two standards feature-by-feature and explains how to map data between the two.
  • A UOF-ODF Convertor, an open source Java-based tool, licensed under the LGPL, which provides bi-directional conversions of the three office document types (word processor, spreadsheet and presentation).

The report, the tool, the source code and a lot more information are available at the project’s page on SourceForge. I hope this both addresses the immediate-term interoperability needs between ODF and UOF and lays the foundation for a deeper technical discussion of additional harmonization steps that can be taken to improve interoperability even further.

Filed Under: ODF

Got ODF?

2006/12/21 By Rob 1 Comment

Are you writing code that works with ODF? Open source, commercial, in-house, internationally distributed, written in Python, Java, Ruby, C++ or Haskell, a one person project or with a team of dozens — if it is done with ODF and is interesting, then I want to hear about it and help share your story.

Here’s the deal. Drop me a line, via email or the comment form, and I’ll arrange to either call you or send you some interview questions via email, things like:

  • What did you do?
  • How did you do it?
  • What tools did you use?
  • Why ODF?
  • What worked well and what didn’t?
  • What next?

If the software is available for me to run and review, then so much the better.

I’ll then write up a story and feature it on my blog and on OpenDocument.xml.org. You’ll get some free publicity, and you’ll help me tell the continuing story of innovation with ODF. It’s a win-win, community thing. My intent is to have this be an ongoing feature in 2007.

I’m especially seeking anyone who is going beyond the traditional heavy-weight editor paradigm and is starting to look at other modes of use for ODF.

Feel free to share this invite with others who may be interested.

Filed Under: ODF

A notable achievement

2006/12/09 By Rob 8 Comments

I believe congratulations are in order to Microsoft and Ecma’s TC45 for what appears to be a new world record for creating a standard. Their recently-approved Office Open XML (OOXML) standard weighed in at 6,456 pages yet took only 357 days to be reviewed, edited and approved, making it not only the largest markup specification, but possibly also the fastest to complete its standardization cycle.

To put the magnitude of this accomplishment into perspective, I looked at a variety of other successful standards from various standards bodies, such as:

  • OASIS OpenDocument Format (ODF)
  • OASIS Darwin Information Typing Architecture (DITA)
  • W3C Extensible Stylesheet Language (XSL)
  • W3C XHTML
  • W3C Scalable Vector Graphics (SVG)
  • W3C Simple Object Access Protocol (SOAP)
  • IETF MIME
  • Ecma C#
  • Ecma C++/CLI

In all cases I looked at how long the specification took to be standardized, from when the initial draft was made available (whether developed within the technical committee, or submitted by a vendor at committee formation) to the time when the standard was approved. So we’re looking at the complete editing/review/approval time, not including the time to author the initial draft. I also looked at the length of the resulting standard.

[Chart: specification length (pages) plotted against time to standardize (days) for the standards listed above]

As you can see, there is a noticeable trend with previous standards, where longer specifications took longer to edit, review and approve than shorter ones. This was the received wisdom, that standardization was a slow process, and this deliberate pace was necessary not only to achieve technical excellence, but also to socialize the specification and build industry consensus.

Also, previous specifications seemed to top out at around 1,000 pages. Larger than that and they tended to be broken into individual sub-standards which were reviewed and approved individually.

The general practice, as shown in this data, has been for standards to take from 0.1 – 1.2 pages per day, for a complete review/edit/approval cycle. Even other Microsoft specifications in Ecma have fit within these parameters, such as C# (1.2 pages/day) and C++/CLI (0.7 pages/day).

Thus the remarkable achievement of Microsoft and Ecma TC45, who not only managed to create a standard an order of magnitude larger than any other markup standard I’ve seen, but at the same time managed to complete the review/edit/approve cycle faster than any other markup standard I’ve seen. They have achieved an unprecedented review/edit/approval rate of 18.1 pages/day (6,456 pages in 357 days), roughly 20 times faster than industry practice, a record which will likely stand unchallenged for the ages.
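The throughput arithmetic is easy to check, using the page count and duration quoted above:

```python
# Review/edit/approval throughput, in pages per day.
def pages_per_day(pages: int, days: int) -> float:
    return pages / days

rate = pages_per_day(6456, 357)        # OOXML: 6,456 pages in 357 days
print(f"OOXML: {rate:.1f} pages/day")  # prints "OOXML: 18.1 pages/day"

# Even against the fastest prior rate in the data set
# (Ecma C#, at 1.2 pages/day), OOXML is an order of magnitude quicker.
print(f"vs. Ecma C#: {rate / 1.2:.0f}x faster")  # prints "vs. Ecma C#: 15x faster"
```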

I think we would all like to know how they did it. High-altitude training? Performance enhancing drugs? Time travel? A pact with the Devil? I believe you will all share with me an earnest plea for them to share the secret of their productivity and efficiency with the world and especially with ISO, who will surely need similar performance enhancements in order for them to review this behemoth of a standard within their “Fast Track” process.

I am optimistic that once the secret of OOXML’s achievement gets out, the way we make standards will never be the same again.


Change Log

1/26/07 — corrected two typographical errors pointed out by a reader

Filed Under: OOXML, Standards


Copyright © 2006-2026 Rob Weir · Site Policies