A History of MS' Standards 'Dirty Tricks'
Please feel free to clean this up and add to it as much as you like. Also, it would be nice if we could add a short summary below each point. Currently, links are added to whatever relevant material is found, including many Groklaw and other blog posts (this page started out as a way to collect oral history from Groklaw). If you think your particular post should not be referenced or your words should be removed, please remove them and/or leave a message on the talk page.
This post is licensed under the GNU GPL, version 2 or later. Distribution as a delimited part of other works falls under the mere aggregation clause. Please do not refer to "names" or "handles" when including (GrokLaw) posts, but instead, add a link to the original comment.
It seems to me that we can help those who fight MS standards traps by compiling a list of dirty tricks. Here is an attempt. For overviews, look at The Rise of Proprietary Formats, msversus and netaction. A very good (but long) read is the PDF of the 2004 final decision of the EU commission regarding MS. Standards were also part of the first antitrust case; see On the Remedy Phase of the Microsoft Antitrust Trial. The following quote from 2003 is quite revealing:
Standards Bodies and Windows APIs
MS does know how to advocate and implement standards (see Microsoft Discusses Open Standards (versus Open Source Software)). It has rightly criticized AOL for blocking interoperability between different Instant Messaging systems and worked to set a global, unencumbered standard for all (see here, here and here).
On the other hand, MS has been paying out large amounts of money (reportedly around $9 billion as of 2004) in court settlements. A prime example is the way MS killed DR-DOS, but this is, strictly speaking, not really an international standards question; neither are the dirty Windows vs. OS/2 and MS vs. BeOS fights (which included standards as "weapons"). Also, MS is rather fond of trademarking common words like Windows, Office, SQL Server, Word, and Office Open XML. The last of these seems targeted at confusing potential users of "OpenOffice", and the rest are confusing to users of X-Windows, office software, SQL, and word processing. A more recent illustration of how MS approaches standards was seen after MS acquired Virtual Earth with Seadragon in 2006. Note that the following highlights consistent inaccuracies in statements about standards by MS marketing (quote from Roblimo of Linux.com):
Then we met Blaise Aguera y Arcas, who joined Microsoft earlier this year when they acquired his startup company, Seadragon. He first spoke up when I asked why Microsoft's Virtual Earth had been made totally dependent on DirectX instead of using OpenGL or another cross-platform alternative, and was therefore useless to anyone not running Windows (and, as it turned out, Explorer as well).
Alex Daley, a marketing type associated with Virtual Earth, started to say only DirectX had the necessary features when Blaise broke in and said that OpenGL could have done just as well.
Blaise was obviously not a fully-assimilated Microsoftie. He spoke openly about how he and the other Seadragon team members had to "scrub" GPLed code (and JPEG2000) from its work after the acquisition. Of all the presenters, he was the only one whose brain I wanted to pick more. He is a very smart guy. Check his work on PhotoSynth. Amazing stuff. Too bad it's only available to Windows/Explorer users.
The best characterization of Microsoft's attitude to standardization was given in their own words, in the first Halloween document (read them all!):
By folding extended functionality (e.g. Storage+ in file systems, DAV/POD for networking) into today's commodity services, we raise the bar & change the rules of the game.
Eric Raymond has an excellent analysis of this standards behavior (the same Halloween document):
What the author is driving at is nothing less than trying to subvert the entire "commodity network and server" infrastructure (featuring TCP/IP, SMTP, HTTP, POP3, IMAP, NFS, and other open standards) into using protocols which, though they might have the same names, have actually been subverted into customer- and market-control devices for Microsoft (this is what the author really means when he exhorts Microserfs to raise the bar & change the rules of the game).
The `folding extended functionality' here is a euphemism for introducing nonstandard extensions (or entire alternative protocols) which are then saturation-marketed as standards, even though they're closed, undocumented or just specified enough to create an illusion of openness. The objective is to make the new protocols a checklist item for gullible corporate buyers, while simultaneously making the writing of third-party symbiotes for Microsoft programs next to impossible. (And anyone who succeeds gets bought out.)
Arnaud Le Hors posted some personal experiences "straight from the horse's mouth" in What Microsoft’s track record tells us about OOXML’s future. I think MS' employees should know best.
This kept me intrigued until one day my peer at Microsoft told me: "For us, the submission to W3C is the end of the road. What happens after doesn’t really matter."
I didn’t need to ask why. The explanation was obvious. Once the specification is submitted to W3C Microsoft can tell its customers that it is a standard. Technically it’s not, but if any customer ever cares to ask it’s easy enough to put their fear to rest by explaining that the process is started and it’s simply a matter of time. By the time the standard eventually comes out, customers are already using Microsoft’s technology and no longer have much choice if they find out that Microsoft doesn’t even bother adhering to the actual standard. They are locked-in.
In April of 2009, the European Committee for Interoperable Systems (ECIS) provided a paper, A History of Microsoft's Anticompetitive Behavior and Consumer Harm, which includes several examples of Microsoft's behavior regarding standards.
Here is a list of the standards currently covered. Each entry corresponds to a section of this article; the Table of Contents at the top provides direct links:
- MS OfficeOpenXML
- HTML and WWW
- Kerberos standard with proprietary extension
- The Sender ID flap, a failed attempt to make all email depend on MS licenses
- RTF, MS' previous interoperability standard that wasn't.
- LDAP, the standard MS broke to close Active Directory
- CIFS, (Samba) a standard poisoned by MS
- OpenGL, a standard MS wants to get killed at ANY cost.
- C#/CLI, an ECMA standard to prevent alternative implementations
- .NET, an ECMA standard that only MS can use legally.
- Java, MS lost the fight
- ODBC API, how to prevent database API standardization
- ActiveX standardization.
- NTP, the Network Time Protocol
- RIFF (WAV) is just a convoluted and subverted AIFC with byte order switched
- VFAT, the file system "standard" that MS suddenly seems to have patented
- C++ Microsoft C++ compiler discourages writing standard and portable code
- RC4 encryption
- Rdesktop, to connect from *nix to Windows
- Boot sector pain.
- PNG, a web graphics standard that might, at last, be supported in IE 7.
(this section is becoming much too long, if someone feels like reworking it to a separate page, please feel free)
MS' answer to the ISO Open Document Format. OfficeOpenXML is an ECMA standard, Ecma-376, that describes MS Office 2007 (note: MS Office does not follow or implement the standard; it is the other way round). MS has played, some say, as dirty as they could to fight ODF. Just browse Consortium Info, Walt Hucks' blog, or Groklaw's ODF/XML page and follow the links for the story of ODF in Massachusetts and EOOXML in general. A quote from James Love to participants of the "Internet Governance Forum" on open standards:
Many people are nervous about these issues, because Microsoft is investing millions to defeat them, and to attack personally government officials who Microsoft sees as too friendly to open standards, and to reward politicians and government officials who back Microsoft.
This is no empty threat. Look at the story of Peter Quinn, a government official who dared to back ODF.
It is rather common to encounter cynicism with respect to Microsoft's intentions in the standardization efforts around EOOXML. EOOXML needs ISO approval to be eligible for government bids because "Governments worldwide mandate that only internationally agreed open standards are acceptable when creating documents". A good example of the common feeling is found in Six thousand pages, one month, no chance...:
Its preferred answer is to create its own open document format - OOXML - which can then be recognised as such by the international standards bodies. However, it does not want this to be something that its competitors can adopt freely.
The answer is to game the system. As part of this, the company has created (by itself, unlike Open Doc) a proposal for OOXML that is six thousand pages long, and then put it into the fast-track approval system with very minimal time for discussion and objection. ....
GrokDoc has a page EOOXML objections on the ISO fast track procedure and there are several Groklaw articles explaining it. Two examples: Searching for Openness in Microsoft's OOXML and Finding Contradictions and Deadline Looms to Express Concerns about ECMA 376 Office Open XML, with the usual plethora of comments. There is a permanent ODF/OOXML page on Groklaw, including a chronology going back to January of 2005, when the first government body, in the Commonwealth of Massachusetts, announced it was considering ODF.
For a quick comparison of ODF and OOXML, see Rob Weir's OpenOffice.org Conference 2006 presentation. This also goes into performance questions.
OOXML was not developed as a standard in the traditional way, but rather as a description of MS Office 2007 and presumably future versions (see this historical comment on the mindset of MS developers).
MS claims OOXML is needed because ODF cannot represent all legacy MS Office formats and is inefficient. The OpenDocument Foundation, now dissolved, created the Da Vinci converter plugin and claims 100% round-trip fidelity between ODF and the last MS Office binary formats. Furthermore, this plugin also proves the efficiency of the ODF format. Note that the final release of MS Office 2007 mysteriously broke the plug-ins as discussed by Dana Blankenhorn in Microsoft playing three card monte with XML conversion.
We learn the real reason MS couldn't standardize on ODF from Bill Hilf of MS himself, as recorded by Dinesh Nair in the Open Malaysia blog:
What got really interesting was when Yusseri raised the issue of OOXML and why didn't Microsoft just work on ODF in collaboration instead of creating a new, bloated standard. Bill's answer was quite surprising, as he clarified that the file format (OOXML) was a part of the software and that OOXML and the software (MS Office) are quite inseparable. Ergo, OOXML is an integral and inseparable part of MS Office. That's why they could not adopt ODF as the file format for subsequent versions of MS Office.
So all the rumors that OOXML is nothing but a dump of Office2007 and not an international, free and open standard are now confirmed by a MS spokesman.
In the run-up to the decision whether or not OOXML should take the ISO fast track, some tried to convince the National Bodies that they were not allowed to vote against OOXML because OOXML could physically exist next to ODF (succeeding with the US NB); see Rick Jelliffe's and Brian Jones' blogs.
The campaign even became comical, see Open Malaysia:Microsoft's Definition of Contradictions and The Art of Rewriting History. It didn't work that well, given that many countries filed objections, see Andy Updegrove's blog post and his comments on the actual responses.
However, there followed allegations that Microsoft was stacking the deck, filling the committees with voting members who were its employees or business partners.
(On a personal, opinionated note: what is the worth of a voting system where the majority of the voters get sacked if they vote against a single entity?)
MS OOXML is advertised as an "Open" standard by Microsoft. So it is expected that the license for its use will at least match the openness of ODF and PDF. Two questions will be asked by any prospective user:
- Can someone stop me if I write and distribute an application that uses this standard?
- Under what conditions can I use this standard?
Microsoft has written the standard and currently has the only working implementation of EOOXML. Therefore, MS are the only ones who know what copyrights and patents must be licensed from MS and third parties to implement EOOXML. But as of January 2007, Microsoft had consistently refused to identify the copyrights and patents that must be licensed to implement EOOXML. The language used by MS in the EOOXML licenses and "covenants not to sue" is so intricate (and peculiar) that many people still doubt whether it really allows anyone but Microsoft to implement EOOXML. Microsoft has changed the licenses for EOOXML over the years. The initial offer for licensing was seriously flawed as of March 2005. In December 2005, there were still major worries about Microsoft's pledges not to sue and the ECMA terms of reference. And as late as the start of 2007 there were still doubts whether it would be legal to implement the standard in a meaningful way. To quote the conclusion of Analyzing the Microsoft Office Open XML License:
In this way, the License is stated and is to be perceived as a threat rather than a promise. In effect it says 'Stay away! Microsoft is the only entity which may implement the Microsoft Office Open XML formats!' This is a fundamental contradiction of the purpose of an open standard.
As Andy Updegrove puts it:
And taking that concern a step further, consider the fact that OOXML also apparently violates section 2.14 of the ISO/IEC Directives, Part 1, in that not all of what it takes to implement OOXML appears to be covered by Microsoft's patent pledge, in two respects.
First, the pledge does not explicitly cover material that is referenced, but not included in the specification, and second, the Microsoft patent commitment does not cover optional features. Sections of OOXML that are not fully described include those that require compliant implementations to mimic the behavior of Microsoft products, such as those products and capabilities referred to above (OLE, etc.) Microsoft will need to clarify whether its patent commitment will in fact extend to these requirements. Potentially, these concerns would involve large portions of OOXML, in contradiction of the ISO/IEC requirement that more than a bare-bones implementation must be permitted without fear of infringement.
Most of this is also discussed in Searching for Openness in Microsoft's OOXML and Finding Contradictions and the GL EOOXML objections project page. There is also a Microsoft-commissioned evaluation of the license by Baker & McKenzie from 2006. See, eg, these discussions on GL and the Grokdoc Talk:Objections page. Basically, the Baker & McKenzie report does not address any of the criticisms leveled against the two licenses: the Covenant Not to Sue (CNS) and the Open Specification Promise (OSP). The report has some careful choice of words, but nowhere does it state that a blanket patent license is given for implementing the EOOXML specifications in full.
This all can lead to rather comical dialogs with MS legal representatives as was exemplified in this OpenMalaysia blog Billions of Documents. At a SIRIM TC4 meeting on ECMA 376, Microsoft was represented by Stephen Mutkoski (a lawyer). A member of the technical committee asks a simple question:
They sent a lawyer to a Technical Committee briefing. Okay, perhaps we could get more legal information then... “So if Macros are not included in Ecma 376, will developers who develop it independently get sued?” “I will have to get back to you ...”
To summarize the licensing problems, the answers to the above questions for your future EOOXML-compliant software can provisionally be given as:
- Microsoft can stop any attempt to distribute your application
- You need permission from Microsoft for every application of EOOXML on a computer
The OSP covers any of the Covered Specifications, and Microsoft's promise applies to “full or partial implementation,” according to its FAQ, but Microsoft also states: The OSP does not apply to any work that you do beyond the scope of the covered specification(s). This statement clarifies the qualification in the very first sentence of the OSP that the promise applies only “to the extent it conforms to a Covered Specification.”
The OSP will apply to implementations of the specifications, but only to the extent that such code is used to implement the specification. Any code that implements the specification may also do other things in other contexts, so in effect the OSP does not cover any actual code, only some uses of code.
Free software is software that all users have a right to copy, modify and redistribute, and as Microsoft points out in the OSP, there is no sublicensing of rights under it. So any code written in reliance on the OSP is covered by the promise only so long as it is not copied into a program that does something other than implement the specification. This is true even if the code has not otherwise been modified, and code that conforms to the specification cannot be modified if the resulting modified code does not conform. Therefore the OSP doesn't permit free software implementation: it permits implementation under free software licenses so long as the resulting code isn't used freely.
In July of 2008, Microsoft published a FAQ that stated a change in the OSP, with the GPL specifically mentioned as being covered, even in commercial code.
(In-)Compatibility with other standards
MS wanted its own XML format to be an ISO standard. However, MS' format does not conform to many other ISO or W3C standards. You might even think MS doesn't like the W3C ;-): See eg, Wikipedia, OpenDocument Fellowship, Open Malaysia: MSOOXML's disregard for existing standards and, of course, Bob Sutor's blog. See also this GL discussion:
Again from the newspick: a description of why ODF and OOXML don't interoperate. Office Open XML unquestionably duplicates or at least significantly overlaps with the ODF specification; moreover, unlike Office Open XML, OpenDocument incorporates still other standards such as XPath, XLinks, SVG, XForms, and MathML. Office Open XML reinvents the wheel at every turn rather than relying on existing open standards. The failure to implement XPath in Office Open XML is particularly problematic; it makes full fidelity in automated XSL transformations to and from other XML formats next to impossible. That problem creates a contradiction in the ISO sense; full interoperability between ODF and EOOXML applications is infeasible.
According to Wikipedia, XPath is the following: XPath (XML Path Language) is an expression language for addressing portions of an XML document, or for computing values (strings, numbers, or boolean values) based on the content of an XML document.
The XPath language is based on a tree representation of the XML document, and provides the ability to navigate around the tree, selecting nodes by a variety of criteria. In popular use (though not in the official specification), an XPath expression is often referred to simply as an XPath.
Originally motivated by a desire to provide a common syntax and behavior model between XPointer and XSLT, XPath has rapidly been adopted by developers as a small query language.
I gather that if OOXML does not implement XPath, then documents in this format cannot be queried using normal XML tools. For example, Wikipedia says the following about XSLT: XSLT is a specific kind of template processor primarily designed to "transform" XML documents into other XML documents. The original document is not changed; rather, a new document is created based on the content of an existing one. The new document may be serialized (output) by the processor in standard XML syntax or in another format, such as HTML or plain text. XSLT is most often used to convert data between different XML schemas or to convert XML data into HTML or XHTML documents for web pages, or into an intermediate XML format that can be converted to PDF documents.
XSLT relies upon the W3C's XPath language for identifying subsets of the source document tree, as well as for performing calculations. XPath also provides a range of functions, which XSLT itself further augments. This reliance upon XPath adds a great deal of power and flexibility to XSLT.
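To make the quoted descriptions concrete, here is a minimal sketch of XPath-style addressing using Python's standard library, whose `ElementTree` module supports a small subset of XPath. The document is invented for the example; the point is that one expression can select nodes anywhere in the tree, which is exactly the capability XML tooling loses when a format avoids XPath.

```python
# Minimal illustration of XPath-style addressing over an XML document,
# using Python's standard library (xml.etree supports a limited XPath subset).
import xml.etree.ElementTree as ET

doc = """
<office>
  <spreadsheet>
    <row><cell>10</cell><cell>20</cell></row>
    <row><cell>30</cell><cell>40</cell></row>
  </spreadsheet>
</office>
"""
root = ET.fromstring(doc)

# Select every <cell> anywhere under the root with a single expression.
cells = root.findall(".//cell")
values = [int(c.text) for c in cells]
print(values)        # [10, 20, 30, 40]
print(sum(values))   # 100
```

A full XPath 1.0 processor (as used by XSLT) offers far more than this subset, such as predicates and computed values, but even this fragment shows why generic XML tools depend on path expressions rather than format-specific code.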
What strikes me about such tactics is that they border on misrepresentation. When a supplier says he is using XML, most IT managers will understand that a reasonably complete suite of XML-related standards will be used to generate the full benefits of XML. By silently picking and choosing which standards in the XML suite they implement, Microsoft breaks this assumption without informing its customers.
Open Malaysia has a very nice writeup on the VML (non-)deprecation in MS EOOXML: Is VML in or out now, or was that a typo?
The specification makes these claims:
- 1. VML is included for backwards compatibility reasons only.
- 2. DrawingML is newer and richer and designed to replace VML.
- 3. VML is deprecated and only included for legacy reasons.
- 4. New applications are strongly encouraged to use DrawingML instead.
This simply begs these questions:
- 1. Why then is Section 2.3.1 explicitly using VML?
- 2. Where are the instructions for using DrawingML instead?
- 3. If DrawingML is richer and designed to replace VML, why have VML in the first place?
- 4. I would take it that Office 2007 is a "new application" and should use DrawingML instead of VML. Does it?
What is extra interesting is the long first comment from Stephane Rodriguez, who explains that VML was implemented to lock the internet (via IE) to MS Office.
So the Office team at Microsoft could not manage to get rid of VML. It is possible, unless they intentionally use it as a lock-in tool, that next version of Office, codenamed Office 14, will be the first one to have an Office-wide DrawingML implementation. Until then, Office 2007 is at odds with the specs that Microsoft is pushing and as a result, it is not a reference implementation of the specs! In other words, since there cannot be a non-Microsoft implementation of the specs (it will take at least ten years for a company starting now, and it will never achieve 100% fidelity since many things are missing in the specs), it follows that there is simply NO reference implementation out there! Quite amazing...
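To see why carrying two drawing vocabularies is a burden on implementers, here is a hedged sketch. The fragments below are simplified illustrations written for this page, not verbatim spec excerpts, but they use the real VML and DrawingML namespace URIs: the same notion (a rectangle) lives in two unrelated vocabularies, so a conforming consumer must implement both.

```python
# Two simplified, illustrative fragments (invented for this example, not taken
# verbatim from Ecma-376) expressing the same idea -- a rectangle -- once in
# the legacy VML vocabulary and once in the newer DrawingML vocabulary.
import xml.etree.ElementTree as ET

vml = '<v:rect xmlns:v="urn:schemas-microsoft-com:vml" style="width:100pt;height:50pt"/>'
drawingml = '<a:prstGeom xmlns:a="http://schemas.openxmlformats.org/drawingml/2006/main" prst="rect"/>'

# The parsed tags live in two entirely different namespaces, so a consumer
# needs two separate code paths to render what is conceptually one shape.
print(ET.fromstring(vml).tag)        # {urn:schemas-microsoft-com:vml}rect
print(ET.fromstring(drawingml).tag)  # {http://schemas.openxmlformats.org/drawingml/2006/main}prstGeom
```

Since a file may contain either vocabulary depending on how the content was produced, neither code path can be dropped without losing documents.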
- Codes for the Representation of Names and Languages ISO 639
- Computer Graphics Metafile ISO/IEC 8632
- XLink W3C
- XForms W3C
- XPath W3C
- SVG W3C
- MathML W3C
- SMIL W3C
- Gregorian Calendar ISO 8601
(EOOXML redefines leap years and uses two start dates for counting, 1900 and 1904; see also Brian Jones' apology, blaming the bug on a programming shortcut in Lotus 1-2-3.)
The 1900 leap year bug is an excellent illustration of how Microsoft treats standards as documentation of their code binaries, including all legacy bugs.
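The discrepancy is easy to check. In the sketch below, `excel_1900_is_leap` is a hypothetical helper written only to mimic the legacy behavior described above; `calendar.isleap` implements the actual Gregorian rule used by ISO 8601.

```python
# The Gregorian leap-year rule vs. the Lotus 1-2-3 bug that Excel (and hence
# the OOXML date system) preserves: 1900 is treated as a leap year even though
# it is not one under ISO 8601's Gregorian calendar.
import calendar

def excel_1900_is_leap(year):
    # Hypothetical helper mimicking the legacy behavior: identical to the
    # Gregorian rule, except that 1900 is wrongly counted as a leap year.
    return year == 1900 or calendar.isleap(year)

print(calendar.isleap(1900))      # False -- 1900 is not a leap year (divisible by 100, not 400)
print(excel_1900_is_leap(1900))   # True  -- but the legacy date system says it is
print(calendar.isleap(2000))      # True  -- both rules agree on 2000
```

Because serial date numbers count days from the start date, every date after February 1900 is offset by the phantom leap day unless an implementation reproduces the bug exactly.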
- The Future of Lock-in
- Game Time for OpenDocument
- Searching for Openness in Microsoft's OOXML and Finding Contradictions
- Deadline Looms to Express Concerns about ECMA 376 Office Open XML
- Standardizing away the world's languages
- Cum mortuis in lingua mortua (SVG/VML)
- OOXML has poor XML Element names
At the start of 2007, only Office 2007 implemented EOOXML. However, it is reported that Office 2007 prefers a binary format after all, allegedly for performance reasons.
Checking the Rob Weir blog, I notice that MS Office 2007 is not using OOXML after all. There is yet another binary format that is silently introduced and touted as the preferred option. Rob Weir says: It is also interesting that Microsoft is positioning this format as the preferred one for performance and interoperability. The online help for Excel 2007 says: In addition to the new XML-based file formats, Office Excel 2007 also introduces a binary version of the segmented compressed file format for large or complex workbooks. This file format, the Office Excel 2007 Binary (or BIFF12) file format (.xls), can be used for optimal performance and backward compatibility.
The Gary Edwards blog (from the newspick) posted another finding of Rob Weir that I wasn't able to locate on his blog. Another interesting point, courtesy of the inexhaustible Rob Weir, is that MSOffice 2007 has the unique ability to produce two kinds of EooXML. I've never seen an application do this before, and one has to wonder why?
What Rob discovered is that if you import a legacy BoB into MSOffice 2007, the application will convert it to EooXML fully preserving the originating application binary encodings – even doing so within laughably and descriptively colorful named XML tags. Fine. We can easily do that with ODF using the infamous tag model. No inadequacy to be found here. (Damn, if only we had patented that technique. Phil Boutros must be lapping this up :) One of the examples Rob pointed out is the use of the long since deprecated VRM encoding. Good work MSOffice 2007!
Next Rob recreated that same legacy document “natively” in MSOffice 2007. Exactly the same! Saved it as EooXML. Then examined the XML, comparing the two EooXML files. Well well well. They are substantially different! Same application. Same file format. Same document content and presentation. Different EooXML! Interestingly, for one thing there is no VRM encoding. It's been replaced by the proprietary application/platform dependent but forward Vista ready DrawingML.
Some will argue that this is the only way to preserve backward compatibility. I would argue that this will result in an information nightmare. Only one of the EooXML files is backwards compatible. The other is ready for the Vista bound information processing chain centered on the Exchange/SharePoint Hub. How are organizations going to keep things straight?
How can someone interoperate with Office 2007? People will have to sort out which of the ever-changing file formats is being produced. ISO standardisation looks pretty much like a decoy. It can be used (if at all) to send information into the MS Office suite, but it will never get out. Governments that want standards for sovereignty purposes will get a "standard" but not the sovereignty.
Here is Bob Sutor's opinion from We have proprietary extensions to Microsoft Office Open XML already.
Here's an idea of how things are already reverting to the same old behavior, right when Microsoft is trying to convince the ISO how wonderful for interoperability their spec is supposed to be.
One point being made in several places, e.g., in MS Office 2007 is not using OOXML, is that Visual Basic macros in Excel (VBA) are stored in a binary (unreadable) form in EOOXML (they are not described in the standard). That is, the ASCII VB macros cannot be exported to another application using EOOXML. An anonymous comment in this blog claims that VBA macros are Excel internal and have no use outside. But other applications are implementing VB too, so that argument is incorrect.
To summarize, MS warn users not to use EOOXML to store a large spreadsheet because of bad performance, and EOOXML cannot represent all information in Excel 2007. These posts also suggest that a lot of the attacks on ODF, e.g., bad performance on large spreadsheets (and here) and inability to represent all of MS Office's features, were pre-emptive strikes to divert from the shortcomings of EOOXML itself (and some find OO.o is faster than MS Office).
As far as I know, this might be the first ISO standard that fully implements just a single product of such horrible complexity (running over 6000 pages). It cost MS themselves an estimated 150 work-years to implement their own standard. However, this was later qualified by Andrew Shebanow after a response from Rick Schaut with a different set of statistics. But even after all the number juggling, it is still not clear whether other groups can implement the exact same product, and if they can, how much time this will take. According to How to hire Guillaume Portes by Rob Weir, it is clear that Microsoft withholds essential information needed to fully implement EOOXML. For a more comprehensive comparison between EOOXML and ODF, see the ODF/OOXML technical white paper.
'Rumours' have it that MS EOOXML is just a serialized, packaged memory dump of the internal Office data structures. As Rob Weir puts it:
The astute reader will notice that this is pretty much a bit-for-bit dump of the Windows SDK memory structure. In this case the file format specification provides no abstraction or generalization. It merely is a memory dump of a Windows data structure.
Bill Hilf has meanwhile confirmed these rumours as reported by the Open Malaysia blog.
A good standard does contain features that help in converting documents from legacy products. But it does not require a reimplementation within the standard of all legacy formats. A good illustration of how far removed EOOXML is from a real standard is given in the above-mentioned How to hire Guillaume Portes by Rob Weir. A choice quote from Rob Weir, after reproducing the requirements for Word 95's spacing compatibility "between full-width East Asian characters in a document's content":
What should we make of that? Not only must an interoperable OOXML application support Word 12's style of spacing, but it must also support a different way of doing it in Word 95. And by the way, Microsoft is not going to tell you how it was done in Word 95, even though they are the only ones in a position to do so.
This aspect of EOOXML is consistent with the fact that MS EOOXML seems to be a memory dump of Office's internal data structures. It can be imagined that MS Office stores its own formatting information in internal data structures, but uses hooks into legacy code, bugs included, to generate older formats; these are legacy formats Microsoft might not know how to recreate themselves anymore. So when dumping its internal data to file, it inserts these hooks into the XML to refer to these legacy code fragments. How anyone without these legacy code fragments could re-implement their behavior is beyond my comprehension. Maybe someone can enlighten me about this.
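To illustrate the general idea only (the field names and values below are entirely made up, not actual Office internals), here is a sketch contrasting a field-for-field memory dump with an abstract markup representation. The dump ties the file layout to one implementation's in-memory layout; the abstract form describes meaning, so any implementation can interpret it.

```python
# A sketch (hypothetical data, not an actual Office structure) of the
# difference between dumping an in-memory structure field-for-field and
# defining an abstract format.
import struct
import xml.etree.ElementTree as ET

# Hypothetical in-memory state of an application: packed C-style fields.
flags, twips_spacing, legacy_mode = 0x0004, 240, 1

# Style 1: bit-for-bit dump -- meaningless without the producing
# application's source code to explain each field.
blob = struct.pack("<HhB", flags, twips_spacing, legacy_mode)
print(blob.hex())  # 0400f00001

# Style 2: abstract markup -- the same information expressed in terms of
# document semantics rather than one program's memory layout.
elem = ET.Element("paragraph", {"line-spacing": "1.0", "unit": "lines"})
print(ET.tostring(elem).decode())  # <paragraph line-spacing="1.0" unit="lines" />
```

A format defined the second way can be implemented from the specification alone; a format defined the first way can only be reimplemented by whoever owns the original code.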
In his blog, Brian Jones answers some of the points of Rob Weir and accuses ODF of having unspecified elements too. But read the comments on his blog about what questions Brian does not answer (and never answers, see this GL post). Brian Jones seems to confuse unspecified formatting elements in EOOXML that affect the layout of the document with unspecified application-internal settings in ODF, eg, menu bar placement. It is clear that the latter are not intended to be portable. Moreover, Brian Jones seems to confuse OpenOffice.org with the ODF standard, for instance, when he complains that OpenOffice.org has a config setting indicating legacy line-spacing. But this config setting is definitely not part of the ODF standard. Rob Weir discusses the lack of distance between OOXML and MS Office 2007 in Essential and Accidental in Standards, where he concludes:
It seems to me that OOXML in fact does have application-dependent behaviors, but only for Microsoft Office, and that Microsoft has hard-coded these application-dependent behaviors into the XML schema, without tolerance or allowance for any other implementation's settings.
Something does not cease to be application-dependent just because you write it down. It ceases to be application-dependent only when you generalize it and accommodate the needs of more than one application.
The inclusion of application config settings in the document standard on an equal footing strengthens the picture of EOOXML as a semi-automatic dump of the internal data structures of MS Office 2007.
(On a personal note, I find it revealing that a very experienced MS developer like Brian Jones confuses a standard with the config options of an application.)
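The kind of legacy hook at issue can be made concrete. Below is a minimal sketch (the XML fragment is invented, though `w:compat` and the three flag names are real ECMA-376 "compatibility settings"; a real settings.xml part contains far more). A consumer can find the flags easily enough; what "lay out like Word 95" actually means is nowhere in the specification.

```python
import xml.etree.ElementTree as ET

# Invented settings fragment; element and flag names follow ECMA-376.
SETTINGS = """\
<w:settings xmlns:w="http://schemas.openxmlformats.org/wordprocessingml/2006/main">
  <w:compat>
    <w:autoSpaceLikeWord95/>
    <w:footnoteLayoutLikeWW8/>
    <w:useWord97LineBreakRules/>
  </w:compat>
</w:settings>
"""

W = "{http://schemas.openxmlformats.org/wordprocessingml/2006/main}"

def legacy_flags(xml_text):
    """List the legacy-behaviour flags found under <w:compat>."""
    root = ET.fromstring(xml_text)
    return [child.tag[len(W):] for child in root.find(W + "compat")]

# Any consumer can *find* the flags; reproducing the Word 95 line
# layout they demand requires Word 95's code, not the spec.
print(legacy_flags(SETTINGS))
```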
In short, EOOXML contains a lot of distinct legacy formats. This point is made very clearly in another of Rob Weir's blog posts Calling Captain Kirk:
The OOXML specification, at 6,000+ pages has now just sucked in the complexity of one or more versions of HTML, MHTML, RTF and WordProcessingML. It requires that a conformant application understand these formats, but forbids a conformant application from producing them.
This is another example of how you never know what you're getting when you get an OOXML file. To support OOXML is not to support a single format, or even a single family of formats. To fully support OOXML requires that you support OOXML plus a motley hodgepodge of various other formats, deprecated, abandoned and proprietary. The cost of compatibility with billions of legacy Microsoft documents is that you must support their legacy of years of false starts and restarts in the file format arena.
Also Andy Updegrove's blog post The Contradictory State of OOXML refers to these proprietary legacy components in EOOXML:
Other parts of OOXML refer to OLE, macros/scripts, encryption and DRM – none of which are fully described. Nor has Microsoft stated whether necessary information will be supplied on a non-discriminatory basis to all (or at all).
Stephane Rodriguez explains in a comment to Open Malaysia how the OLE in MS EOOXML is platform bound:
There is a reason why Microsoft does not document this stuff. They would have to give the documentation of OLE, which in turn would show clearly as in water that this stuff is platform-dependent : Windows.
At some point, MS started to drum up "Representation" as the basic value of MS EOOXML, see Rob Weir's No representation without specification. This representation thing can be summarized as: EOOXML is like a street directory without a map. It represents each address, but won't help you to find it.
At this point, I am starting to seriously doubt whether even Microsoft could reimplement EOOXML from scratch.
Summary and epilogue
During the spring and summer of 2007, the developments around MS EOOXML became so heated that a small page like this could never do justice to them. Groklaw has its own ODF/MSOOXML page with a selection of links. You could write a book about the ODF/MS EOOXML war, as is currently being done by Andy Updegrove with ODF vs. OOXML: War of the Words (an eBook in Process). It seems best to let all these other efforts do a better job than can be done here.
A summary of the MS EOOXML fracas can easily be given. MS EOOXML is by all standards (pun intended) Incoherent, Incomplete, Incorrect, and Incomprehensible. It is basically a badly designed XML wrapper around MS Office's internal object models. Or as written on OOXML Hoo-Hah:
...this is just a six-thousand-page data dump describing a particular XML serialization of a particular commercial application’s object model, completely oblivious to the universe of publishing-related standards that have been hammered out and put to work while MSOffice was being tended in Redmond.
The binary and other Office heritage of MS EOOXML is also discussed by Stephane Rodriguez in a long post OOXML is defective by design.
Some question why Microsoft would try to standardize what many viewed as unfinished work and ask whether this standard was ever intended to be implemented by anyone, or at least anyone but Microsoft. The standard follows the MS Office application like a city map follows the streets of that city (and not vice versa). Perhaps the answer could be that after ODF became an ISO standard, MS needed something, ANYthing to offer instead.
Several of the national bodies that tried unsuccessfully to appeal the adoption by ISO of OOXML as a standard expressed that it brought ISO into disrepute. Many agree. The rules that (fast-track) standards applications have to abide by seem to be unknowable and changeable, even on the fly, so much so that it calls into question the very word "standard" in such a context.
In the spring of 2010, even the BRM convenor Alex Brown came to the conclusion that OOXML was heading for a standards failure, in his blog post Microsoft Fails the Standards Test:
On this count Microsoft seems set for failure. In its pre-release form Office™ 2010 supports not the approved Strict variant of OOXML, but the very format the global community rejected in September 2007, and subsequently marked as not for use in new documents – the Transitional variant. Microsoft are behaving as if the JTC 1 standardisation process never happened, and using technologies (like VML) in a new product which even the text of the Standard itself describes as “deprecated” and “included […] for legacy reasons only” (see ISO/IEC 29500-1:2008, clause M.5.1).
In short, we find ourselves at a crossroads, and it seems to me that without a change of direction the entire OOXML project is now surely heading for failure.
This has all been predicted by Tim Bray, as Alex Brown seems to realize only now.
Indeed standards and XML veteran Tim Bray, writing shortly after the standard’s approval, made a prediction which could now seem impressively prophetic:
“It’s Kind of Sad: The coverage suggests that future enhancements to 29500, as worked through by a subcommittee of a subcommittee of a standards committee, are actually going to have some influence on Microsoft. Um, maybe there’s an alternate universe in which Redmond-based program managers and developers are interested in the opinions of a subgroup of ISO/IEC JTC 1/SC 34, but this isn’t it.
I suppose they’ll probably show up to the meetings and try to act interested, but it’s going to be a sideline and nobody important will be there. What Microsoft really wanted was that ISO stamp of approval to use as a marketing tool. And just like your mother told you, when they get what they want and have their way with you, they’re probably not gonna call you in the morning.”
The following GL comment is reproduced in full as it tells of the perils of Open Standards meeting Microsoft:
Advanced Streaming Format (ASF) is the proprietary, patented Microsoft media container format (.wmv, .wma, .asf). Microsoft vigorously attacks anyone who attempts to write software compatible with this format unless they sign a licensing agreement. This is why there are so few ASF streaming clients, and no streaming servers to my knowledge.
As for the codecs, there is a huge number of them and Microsoft is always careful to push its own proprietary codecs -- most recently Windows Media 9 -- before MPEG standards (MPEG-1 which includes MP3, MPEG-2, MPEG-4 SP and ASP, MPEG-4 AAC, MPEG-4 AVC/H.264), third-party codecs, and excellent FOSS alternatives (Ogg Vorbis, Theora, &c). Microsoft often quietly drops previously distributed third-party codecs from later versions of Media Player, leaving users confused as to why their films or audio don't work after an "upgrade". Voxware Metasound is an example.
Of particular interest is Microsoft's reaction to the success of the excellent MPEG-4 AVC/H.264 codec, embraced by the audiovisual, broadcast, and telecoms industries but absent from any Microsoft product. Microsoft decided to find a standards organization ready to rubberstamp Windows Media 9; they chose the US Society of Motion Picture and Television Engineers and made a $100,000 "donation" to the SMPTE Foundation. Unfortunately for them, longtime SMPTE engineers did not like being asked to rubberstamp an incomplete spec and the process took two years; however, Microsoft did succeed in getting that "standard" into the official specs of both HD-DVD and Blu-Ray, alongside MPEG-2 and MPEG-4 AVC/H.264. Microsoft claims they will support both HD-DVD and Blu-Ray, but until they add full MPEG support they are not doing so. Meanwhile, the Chinese are promoting their own standard, AVS...
OpenDoc
OpenDoc was an early attempt to achieve the standardization on the application level that ODF achieves at the document level (there was no XML then). With predictable reactions from Microsoft.
What better source than GL's Novell v. Microsoft coverage:
84. First, as discussed above, Microsoft excluded from the markets the "OpenDoc" technology for sharing information among applications, by using its monopoly power to force a different standard upon the industry. . . .
85. Microsoft responded to this competitive threat by preventing CIL from making OpenDoc compatible with Windows 95. For example, Microsoft routinely required all ISVs to execute nondisclosure agreements as a condition of receiving the information they needed to develop their applications. These agreements, however, contained terms that uniquely targeted ISVs who were members of CIL, by preventing their employees who worked on OpenDoc from receiving Windows 95 betas or specifications, which effectively prevented CIL from initially developing OpenDoc for Windows 95. In addition, Microsoft required ISVs working with a Windows 95 beta to agree that they would not work on OpenDoc for two years. While Microsoft eventually dropped this requirement, its impact had immediate anticompetitive effects on OpenDoc's development.
86. Further, Microsoft unilaterally announced that OLE would be incorporated directly into Windows, instead of existing independently of the operating system as a technology to be adopted or rejected by ISVs, depending on their assessments of its technical merit. Microsoft then required OLE-compatibility as a condition of Microsoft's certification of an application's compatibility with Windows 95. This certification requirement was a significant barrier to entry into the applications markets, because Microsoft represented to the industry that any application lacking the certification could not be trusted to run on Windows 95. By exploiting this barrier to entry, Microsoft forced ISVs to make their applications OLE-compatible. Furthermore, Microsoft ensured that only applications using its tools, and not those of its competitors, would reach customers. This anticompetitive behavior by Microsoft is similar to the behavior described in the Government Suit with respect to Microsoft's efforts to force ISVs to use Microsoft's implementation of Java. "Specifically, in the First Wave agreements that it signed with dozens of ISVs in 1997 and 1998, Microsoft conditioned early Windows 98 and Windows NT betas, other technical information, and the right to use certain Microsoft seals of approval on the agreement of those ISVs to use Microsoft's version of the Windows [Java virtual machines] as the 'default.'" Findings of Fact ¶401.
HTML and WWW
We remember when Microsoft joined the World Wide Web Consortium (W3C), an international consortium which develops Web standards and guidelines, to build consensus around Web technologies such as HTML. Despite being a member, Microsoft developed their own proprietary HTML extensions anyway, their own little tweaks and twists on the specification, which degrade the display of a good deal of the web in other browsers, simply because Microsoft won't follow the standard themselves. And they hold a dominant position in terms of numbers of users (you know, that old monopoly thing), so it has a meaningful negative effect. Just joining a standards group doesn't necessarily mean Microsoft will play fair. Remember this little incident in 2001? Last week, people who tried to visit MSN.com with a non-Microsoft browser found themselves locked out. Although Microsoft's own Internet Explorer easily accessed the popular site, other browsers--such as Opera, Mozilla, Amaya and some versions of Netscape--received error messages and recommended that people "upgrade" to Internet Explorer.
Kerberos standard with proprietary extension
This quote from Jeremy Allison discussing Microsoft's proprietary "extended" Kerberos specs says it all:
This of course is a very clever way to pretend to distribute the spec, whilst making it completely impossible to implement in Open Source Kerberos servers. If you did, of course, the full weight of US anti-reverse-engineering laws would descend upon you.
The EU Commission has a section on MS' handling of Kerberos in their (long) final decision against MS (PDF, too long to quote, search for "Kerberos"). A snippet:
Already in its reply to the Commission’s first Statement of Objections, Microsoft stated that it in fact published "on 26 April 2000 […] details concerning its use of the Authorization Data field […]". Microsoft thereby referred to a specification called Microsoft Authorization Data Specification v. 1.0 for Microsoft Windows 2000 Operating Systems.
However, this specification only described the structure of the authorization_data field and does not describe in detail the meaning of the various fields. Furthermore, the text of the document provided that "the Specification is provided to you solely for informational purposes […] and pursuant to this Agreement, Microsoft does not grant you any right to implement this Specification". Thus, the specification could not be used by competitors to adapt their work group server operating systems so that they could participate in Microsoft’s Kerberos-based security architecture.
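The gap between "describing the structure" and "describing the meaning" of the authorization_data field is easy to illustrate. A toy sketch follows; the layout below (type, length, opaque payload) is invented, and Microsoft's real structures (the PAC) differ, but the point survives any layout: framing without semantics.

```python
import struct

# Invented authorization_data entry: ad_type, length, opaque payload.
blob = struct.pack("<II8s", 1, 8, b"\x01" + b"\x00" * 7)

ad_type, length = struct.unpack_from("<II", blob)
payload = blob[8:8 + length]

# We can walk the container perfectly well...
print(ad_type, length, payload.hex())
# ...but "which privilege does payload bit 0 grant?" is exactly the
# question the informational spec declined to answer -- and the
# licence forbade implementing it anyway.
```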
The Sender ID flap
A failed attempt to make all email dependent on MS licenses. PJ wrote an extended commentary with links.
Sender ID, as you no doubt know, is a proposed IETF email standard that combines Microsoft's Caller ID with Meng Weng Wong's popular SPF. The combo is designed to make it difficult if not impossible to spoof an email sender address. It's also an anti-phishing technique. The problem is that Microsoft evidently -- at least so far -- believes that because it is offering to contribute a portion of the standard, and it has applied for a patent on the PRA algorithm, it should be able to control all of the standard by attaching restrictive licensing terms to their contribution, terms that would exclude GPL software from being able to use the standard.
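The SPF half of the proposal is simple enough to sketch. Below is a drastically simplified evaluator (real SPF defines many more mechanisms such as mx, a and include, plus macros; the record here is a hypothetical example). It is this kind of freely implementable check that the restrictive PRA licensing terms threatened to wall off from GPL software.

```python
import ipaddress

def check_spf(record, sender_ip):
    """Evaluate a drastically simplified SPF record against a sender IP."""
    ip = ipaddress.ip_address(sender_ip)
    for term in record.split()[1:]:          # skip the "v=spf1" version tag
        if term.startswith("ip4:"):
            if ip in ipaddress.ip_network(term[4:]):
                return "pass"
        elif term in ("-all", "~all"):       # default at end of record
            return "fail" if term == "-all" else "softfail"
    return "neutral"

# Hypothetical record: only 192.0.2.0/24 may send for this domain.
print(check_spf("v=spf1 ip4:192.0.2.0/24 -all", "192.0.2.17"))
print(check_spf("v=spf1 ip4:192.0.2.0/24 -all", "203.0.113.5"))
```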
ECMAScript, DOM and CSS
However, the compatibility problems are not so much in the definition of the language and libraries (although these can make you weep; try coding AJAX for both IE and Firefox). The problem is in the underlying DOM and CSS definitions, needed to get ECMAScript to do anything useful (note the many Microsoft extensions):
As usual Microsoft has extended the standard somewhat. Though sometimes its extensions are brilliant (innerHTML springs to mind), in the case of the DOM Core they aren't.
Note the difference between W3C and Microsoft methods. The W3C methods are owned by the parent element of the node you want to adjust, the Microsoft methods by the node itself.
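Python's xml.dom.minidom happens to implement the W3C DOM Core, so it can illustrate the W3C side of that difference; the IE-only variant is shown in a comment. (A toy sketch, not browser code.)

```python
from xml.dom import minidom

doc = minidom.parseString("<ul><li>a</li><li>b</li></ul>")
ul = doc.documentElement
first = ul.getElementsByTagName("li")[0]

# W3C DOM Core: the *parent* owns the mutation method.
ul.removeChild(first)

# IE's proprietary equivalent hangs the method off the node itself:
#   first.removeNode(true);   // no reference to the parent needed
# Script written against one style simply breaks on the other.

print(len(ul.getElementsByTagName("li")))
```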
You can feel the frustration of the comment posters to this blog where MS does profess standards compliance in IE.
By "fully supported CSS 1", do you mean "mostly passed the W3C's basic test suite for CSS1", or do you mean "had some sort of detectable behaviour for every feature of CSS1"?
Because I _know_ you don't mean "correctly supported all of CSS 1"...
See Carrying Water for Microsoft for what this all means in practise.
Rich Text Format, MS' previous interoperability standard that wasn't.
RTF is awful, it changes with every version of Word.
See my post Native RTF support in Office "12"? on Brian Jones' blog. Sadly, RTF is not a reliable standard. Why? Because its main purpose seems to be to provide seamless compatibility between versions and OS platforms of Microsoft Word itself. It is routinely updated with every new version of Microsoft Word. It has never been proposed to ANSI or the ISO or any other standards group for peer review. It is 100% Microsoft.
Quoted from the linked post:
As it happens, the specification documentation is a minefield for anyone outside Microsoft who dares implement RTF support:
- "1.4 License Grant for Documentation. The documentation that accompanies the Software is licensed for internal, non-commercial reference purposes only."
- The latest version of the specification documentation, 1.8 of April 2004, is only available at www.microsoft.com as an executable download. It is not available online in any other format, including HTML. It will not run on a non-x86, non-Windows computer.
- The executable presents an obligatory EULA, which contains the following text:
- "3. RESERVATION OF RIGHTS AND OWNERSHIP. ...Microsoft or its suppliers own the title, copyright, and other intellectual property rights in this Software..."
- "17. ENTIRE AGREEMENT; SEVERABILITY. ...To the extent the terms of any Microsoft policies or programs for support services conflict with the terms of this EULA, the terms of this EULA shall control..."
- The document itself, in Word format, contains the following text:
"No part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Microsoft Corporation."
From this, you might get the impression that MS doesn't want competitors to implement RTF functionality.
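The mechanism by which RTF "changes with every version of Word" is worth seeing. The spec's forward-compatibility rule says a reader must skip any group opened with the \* control symbol whose destination it does not recognise, and every new Word version adds private data through exactly that door. A minimal sketch (simplified: this toy skips all \* groups, where a real reader skips only unrecognised ones, and real RTF parsing has more token rules):

```python
def strip_ignorable_groups(rtf):
    """Drop {\\* ...} groups -- RTF's escape hatch for newer-Word-only data."""
    out, depth, skip_depth, i = [], 0, None, 0
    while i < len(rtf):
        c = rtf[i]
        if c == "{":
            depth += 1
            if rtf[i + 1:i + 3] == "\\*" and skip_depth is None:
                skip_depth = depth          # start skipping this group
        elif c == "}":
            if depth == skip_depth:         # ignorable group ends here
                skip_depth = None
                depth -= 1
                i += 1
                continue
            depth -= 1
        if skip_depth is None:
            out.append(c)
        i += 1
    return "".join(out)

sample = r"{\rtf1\ansi Hello{\*\newdest secret Word-only data} world}"
print(strip_ignorable_groups(sample))
```
An older Word quietly discards what a newer Word wrote, so "round-tripping" an RTF file through two Word versions is lossy by design.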
LDAP
LDAP (Lightweight Directory Access Protocol) is the basis of MS Active Directory. But read about its implementation in the (long) 2004 final decision of the EU Commission against MS (pages 69-71, or search for "LDAP"):
Microsoft explains that “Active Directory supports versions 2 and 3 of the Lightweight Directory Access Protocol (‘LDAP’), which is the industry standard directory access protocol”.
However, Microsoft has extended LDAP in a proprietary fashion and failed to disclose the extensions that it has made. In the second Statement of Objections, referring to a submission by Sun which Microsoft had the opportunity to access, the Commission pointed out such an extension in the field of “LDAP binding”.
When a piece of software needs to access a directory service, it has first to set up a communication. This process is referred to as “binding” to the directory service. For security reasons, the binding process often implies inter alia the negotiation of an authentication protocol. In LDAP version 3, this negotiation may use the “SASL authentication method”, which is an IETF standard. The submission by Sun to which the Commission referred in its second Statement of Objections claimed that Microsoft’s products use undisclosed extensions to the SASL standard in binding to Active Directory. Microsoft has not rebutted this argument. Sun also pointed out several functions of the Microsoft LDAP library that are not documented. Microsoft has not rebutted this argument either.
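What an undisclosed extension to the SASL negotiation means for interoperability can be modelled in a few lines. This is a toy handshake, not real LDAP: GSSAPI, DIGEST-MD5 and GSS-SPNEGO are real IANA-registered SASL mechanism names, and the "extended" server stands in for an Active Directory leaning on variants that were under-documented at the time.

```python
# Toy model of an LDAPv3 SASL bind: client offers mechanisms in
# preference order, server accepts the first one it also supports.
STANDARD_SERVER = {"GSSAPI", "DIGEST-MD5"}
EXTENDED_SERVER = {"GSS-SPNEGO"}     # undisclosed/extended variant only

def sasl_bind(client_mechs, server_mechs):
    for mech in client_mechs:
        if mech in server_mechs:
            return f"bind via {mech}"
    return "bind failed: no common mechanism"

# A competitor implementing only the documented, standard mechanisms:
interop_client = ["DIGEST-MD5", "GSSAPI"]
print(sasl_bind(interop_client, STANDARD_SERVER))
print(sasl_bind(interop_client, EXTENDED_SERVER))
```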
SMB/Samba, a standard poisoned by MS (see here)
Quotes from Mr. Jeremy Allison in the Free Software Foundation's letter to DG Competition Commissioner Mario Monti:
As part of the settlement with the US Dept. of Justice, Microsoft was required to make descriptions of some of their proprietary protocols available, those used to communicate between Windows clients and servers, under "reasonable and non-discriminatory terms". It did not require the publication of the proprietary data formats used by Microsoft Office or other productivity software or stipulate what these terms should be.
Even with these revisions the license on these protocols was too restrictive to allow true competing products to be created from these documents. The protocols used to communicate server to server were excluded and there were restrictions on how the information in these documents could be used to create products and shared as code in implementations.
I have examined one of these documents, the CIFS file access protocol specification publicly available from Microsoft without the requirement of an NDA.
In my professional opinion as an implementor of this protocol, this document is incomplete and inaccurate and is completely insufficient to create an interoperable implementation of the CIFS file access protocol (the protocol that Samba already implements, which has been written without any knowledge of or reference to information in this document). The description of the protocol was significantly worse than the knowledge Microsoft had already released in other publications and does not add any information that would be of use in writing Samba code. In addition, it does not give any information on the auxiliary protocols such as the printing protocols or authentication protocols which are required to create a competing workgroup server.
Here is a link to an interview with Jeremy Allison of the Samba team, posted 2007-01-09, where he explains that Microsoft employees were given orders to disrupt Samba (to put it politely) when implementing SMB 2.
The best reading on MS' handling of the CIFS protocols can be found in the 2004 final decision of the EU Commission (too long to quote here; look at pages 68/69, 81/83 or search for "CIFS" and "Samba").
OpenGL
A standard MS wants to get killed at ANY cost. E.g., MS is buying up all patents on OpenGL, to prevent a viable implementation (see extensive links here, here and here). There are rumours that nVidia lost their lucrative Xbox contract to ATI because nVidia insisted on supporting OpenGL. A Groklaw comment:
MS has done the same thing with Silicon Graphics (SGI) and the OpenGL Architecture Review Board (ARB). 'Fahrenheit' was supposed to have been a successor of both OpenGL and Direct3D that combined the "best" of each. SGI and other members of the OpenGL ARB spent a lot of time and money working on it. Microsoft decided to not support it after all and released Direct3D 9.0 with many of the discussed improvements. SGI soon after began teetering on the edge of bankruptcy. Microsoft has never fully supported OpenGL in Windows; there would be NO real support for it without video card manufacturers like nVidia and 3dfx including support via their drivers. In addition, MS Windows Vista (Longhorn) has further crippled support for OpenGL by completely relegating it to a video-card-driver-supplied interface. OpenGL won't be supported by MS natively if you choose to run any of the "nifty" 3D themes or animations promised for that OS. So users wanting their "eye candy" will need to either choose between it and any application using OpenGL or pay for a video card that adds that support for them. The big problem is that MS isn't telling the video card manufacturers how to do that. Thus MS is telling ISVs to port all their OpenGL applications to Direct3D (of course) or risk losing customers because they can't have a "full rich 3D experience". Luckily, again, nVidia drivers implement OpenGL well enough for Vista -- at this time. Many in the industry see that the reason that MS went with ATI for the XBox 360 instead of nVidia (who manufactured the video for the original XBox) was to punish nVidia for their support for OpenGL and Open Source software, however limited some might see it.
Another, more extensive comment is reproduced in full:
Start of quote
Here is the proper link to the official OpenGL.org site.
Here is a 1996 Microsoft "presspass" extolling that Microsoft was the first OpenGL vendor to support OpenGL 1.1 in MS Windows 95. Here is a 1997 presspass announcing the formal partnership between MS and Silicon Graphics (SGI) to define the "future" graphics API code-named "Fahrenheit".
Note the announcement:
During the development of the Fahrenheit project, Microsoft and Silicon Graphics have also agreed to work together in support of the development of Windows-based graphics applications for professionals through the OpenGL APIs and the development of Windows-based graphics applications for consumers through the Direct3D API. It further explains their long-term plans:
Fahrenheit: Common Architecture for Innovation
The Fahrenheit project will produce the following three components:
Fahrenheit low-level API will become the primary graphics API for both consumer and professional applications on Windows. The Fahrenheit low-level API will evolve from Direct3D, DirectDraw and OpenGL while providing full backward compatibility with applications and hardware device drivers written for Microsoft Direct3D and functional compatibility with Silicon Graphics' OpenGL technologies.
Fahrenheit Scene Graph API will provide a higher level of programming abstraction for developers creating consumer and professional applications on both Windows and Silicon Graphics IRIX operating systems. This API will evolve from Silicon Graphics' current Scene Graph API. The Fahrenheit Scene Graph API provides high-level data structures and algorithms that increase overall graphics performance and assist the development of sophisticated graphics-rich applications.
Fahrenheit Large Model Visualization Extensions will be based on the Silicon Graphics OpenGL Optimizer API and complementary DirectModel API from Hewlett-Packard Co. and Microsoft. They will operate in conjunction with the Scene Graph API. The large model visualization extensions add functionality that will allow the interactive manipulation of large 3-D models such as an entire automobile. The Large Model Visualization API adds functionality such as multiresolution simplification to the Scene Graph API so developers can easily write applications that will interact with extremely large visual databases. This technology will also be designed to enhance legacy applications with new large model visualization capabilities.
The presspass continues:
Microsoft and Silicon Graphics engineers will begin development on Fahrenheit APIs and extensions immediately. They will deliver new APIs, DDKs and Software Development Kits (SDKs) in phases over the next two and a half years. Phase One will be the delivery of the Fahrenheit Scene Graph and Large Model Visualization in the first half of calendar year 1999 for Microsoft Windows and Silicon Graphics IRIX. Phase Two will be the delivery of the Fahrenheit low-level API in the first half of calendar year 2000 on Microsoft Windows only. For the Windows platform, Microsoft will be the direct source for licensing, certifying and distributing the SDKs and DDKs. For the Silicon Graphics IRIX platform, Silicon Graphics will be the direct source for licensing, certifying and distributing the SDKs and DDKs.
Note that Fahrenheit never materialized in any form except as extensions and then additions to Direct3D. SGI started their decline soon after 2000. Here is one page on the OpenGL.org site listing links to documentation produced by or about Microsoft and their support of OpenGL since 1990.
Here is an OpenGL.org page listing references to the Fahrenheit project, including ARB minutes. Note that they are all positive and forward-looking. And that they are all dated before the year 2000 (note the dates the minutes were taken, not when they were posted). Let's take a short look at the OpenGL versions history.
- 1.0 -- First official release. OpenGL is an Open API based upon SGI's older IrisGL. It is managed by an Architecture Review Board whose members include SGI, HP, IBM, and others.
- 1.1 -- Released 1 July 1992. Included additions and improvements in texture mapping, geometry operations, and fragment operations.
- 1.2 -- Released on March 16, 1998. Included additions and improvements in texture mapping and the pixel processing pipeline.
- 1.2.1 -- Released October 14, 1998. Introduced ARB extensions. First ARB extension is Multitexture, based on the SGIS multitexture extension. It handles application of multiple textures to a fragment in one rendering pass. A new corollary discussing display list and immediate mode invariance was added to Appendix B on April 1, 1999.
- 1.3 -- Released August 14, 2001. Included additions and improvements in texture mapping capabilities previously defined by ARB extensions.
- 1.4 -- Released July 24, 2002. Includes additions to the classical fixed-function GL pipeline, plus programmable vertex processing.
- 1.5 -- Released July 29, 2003. Includes additions to the classical fixed-function GL pipeline, plus ARB extensions. This version also introduced the OpenGL Shading Language specification, ARB shader objects, ARB vertex shader, and ARB fragment shader extensions through which high-level shading language programs can be loaded and used in place of the fixed function pipeline. Much of the work on the Shader was done by SGI and 3DLabs.
- 2.0 -- Released September 7, 2004. Supports high-level programmable shaders. This is a culmination of the work begun in the 1.5 version.
Note the extended time between the 1.2.1 and 1.3 versions. IIRC, this can be directly attributed to the Fahrenheit fiasco.
So, you might ask, what is the current version of OpenGL natively supported by the latest versions of MS Windows (i.e. XPsp2)?
Version 1.2.1 if you're lucky. Any other version supported is done through third-party add-ons (such as those written by game manufacturers) or through an ICD (Installable Client Driver) -- usually through video card drivers. This is the reason so many games developers write for Direct3D on Windows. They have no guarantees that the proper support for advanced OpenGL features they may wish to use exists (especially capabilities in the 1.5 and 2.0 versions) unless they write their own ICD specific to their application.
This, however, doesn't stop Id Software, makers of the Doom, Hexen, Quake, etc. series of games, from writing game rendering engines based upon OpenGL. Doing so allows them to quickly port their games to Linux and other platforms. OpenGL is currently supported on Windows, Linux, MacOS, and most of the Unixes.
Here is an archived copy of a comparison John Carmack, Technical Director of id Software, made between OpenGL and Direct3D in December of 1996. He has since commented that implementing a rendering engine in Direct3D has gotten easier. But you'll notice that Id still uses OpenGL:
One of the primary reasons is that John has so far preferred OpenGL as a development API. Also a very good reason to stay with OpenGL is that as we make the game cross-platformed, you know we have a full OpenGL driver under Linux and Macintosh and Direct3D is not available under either of those. As we are moving forward with technology however the Direct3D API has matured really well and it is effectively equivalent to OpenGL at this point. There are probably some compelling reasons for us to take a good look at Direct3D as an option (Note: Xbox). But at this point there isn't a real reason or desire to move from OpenGL.
That's the promise of Open APIs -- they can be supported on multiple platforms, meaning vendors have fewer hoops to leap through to port between platforms. Open APIs don't allow Microsoft to achieve platform lock-in, however, so they only give lip service to their support, unless it's required for them to interoperate with someone else.
The problem of support for OpenGL affects more than games, though. Many scientific, CAD, modeling, and engineering applications that were once written for UNIX systems were ported to MS Windows NT and 2000 because they supposedly supported OpenGL (many times the only "full" support such applications could depend upon was through an ICD). PCs were also much less expensive than UNIX workstations, if you didn't need their extra power. Microsoft made promises to those developers; promises they broke.
"Microsoft and 3D Graphics: A Case Study in Suppressing Innovation and Competition" by Alan Akin is a good resource for those looking for the overall view on the dirty tricks Microsoft has used to try to kill OpenGL in favor of Direct3D.
End of quote
Microsoft isn't through trying to kill OpenGL, however. Here is a discussion thread on OpenGL.org discussing the level of support MS has included for OpenGL on Windows Vista's "Aeroglass" GUI. The problem is that Aero uses Direct3D to produce a 3D Desktop GUI, including transparent windows and non-Orthogonal views.
Preliminary tests using the beta versions of Vista showed that applications requiring OpenGL could see a 50% or more reduction in performance when executed within the Aero environment. This is because all application calls to the GDI (Graphics Device Interface) are now routed through Direct3D, including "normal" OpenGL hardware requests.
The Microsoft solution to this problem is for users to "give up" the advanced Aero interface or install an ICD that handles the OpenGL to Direct3D translations in hardware. This means that, once again, OpenGL must be supported by hardware manufacturers through an ICD to work properly on a Windows platform -- and not natively within the OS itself.
This Call to Action is on the OpenGL.org home page: Microsoft's first technical beta of Vista layers OpenGL over Direct3D in order to use OpenGL with a composited desktop to obtain the Aeroglass experience. If an application runs using a high-performance OpenGL ICD, the desktop compositor will switch off, significantly degrading the user experience. Write to your preferred software developer, hardware developer and video card manufacturer and tell them to make sure Microsoft solves this problem before release and fully supports OpenGL ICDs within Aero. Hardware and software vendors listen to developers. Don't be passive -- send those emails and keep the topic in the foreground. Hopefully this clarifies the history of Microsoft's attempts to embrace, extend, and extinguish OpenGL in favor of Direct3D on MS Windows platforms.
Yet another comment on GL:
Windows supports OpenGL up to version 1.1, IIRC. The drivers for your video card support whatever the latest version was when the driver was made (at least for the big ones: Nvidia and ATI both support 2.0). In a Windows program you can only use whatever is provided by BOTH the Win32 library (OGL 1.1) and the video driver. Here's a schematic:
Application -> Win32 OpenGL lib -> Video driver -> Hardware
Windows programs can therefore only reliably use OpenGL up to version 1.1. MS refuses to create a new library for the newer versions, and because Windows' internal APIs are undocumented/encumbered/etcetera, no one else can update it to 2.0.
There are ways around it, which involve bypassing the Win32 DLL and querying the video driver directly for whether a certain function is present or not (look up function pointers in a C/C++ reference). As you can imagine, this is quite burdensome for the developer. Not only does he have to check manually for each and every OGL 1.2-and-up function, he also has to code all the workarounds and fallbacks in case a function is not present -- fallbacks that should be provided by the Win32 DLL (and which are indeed present for OGL 1.1 and 1.0 functions).
Here is a Groklaw comment on the acceptance of OpenGL in the professional 3D CAD market (2006).
More links from yet another Groklaw comment (from 2009):
Here's an updated link to "Microsoft and 3D Graphics: A Case Study in Suppressing Innovation and Competition" by Alan Akin for those looking for the overall view on the dirty tricks Microsoft has used to try to kill OpenGL in favor of Direct3D.
Also see the following:
- Wikipedia: Fahrenheit API
- Michael's OpenGL page
- The Register: MS and OpenGL: supporting it to death?
C#, ISO/IEC 23270, and CLI, ISO/IEC 23271, are central parts of the .NET framework, which implements a Java lookalike without the "run anywhere" part. That is, C#/CLI are intended to be tied to MS Windows. Although it was set up as a common development platform with bindings to every common programming language, problems keep turning up for those who try to interface with C#/CLI on MS Windows from other OSes (see below).
These standards are burdened with so many patents, MS claims, that only MS can legally distribute an implementation of the .NET framework. However, the Mono developers are adamant that they do not know of any patent that they infringe.
Apart from the legality of reimplementing C#/CLI, there is the fact that MS has done "embrace, extend, extinguish" backwards. As seems to be usual for MS (see the final decision of the EU commission), the published standard is only a subset of MS' implementation, as is discussed here on GL. Mono implements only the officially published standards, so MS software will be able to use applications developed on Mono, but not the other way round.
A case report on two competing accounting software vendors that suddenly require Windows servers where they used to work with Linux illustrates this point.
The reader believes that Microsoft is behind this. "I think this is a calculated move by Microsoft to stop Linux's increasing market share in the server market, and help increase their own," the reader wrote. "I think the developers are enabling this behavior, and in fact may be called co-conspirators in assisting Microsoft in their attacks against non-Windows server systems. I find this outrageous behavior by the developers and have already informed them of my and my clients' displeasure in forcing them to make outlays for something they didn't need, for server software they didn't want, and for the additional outlays that lay for my clients in the future."
However, most criticism is targeted at the legal problems of implementing C#/CLI.
Quotes from David Berlind's article Will C# benefit Microsoft, or the industry?:
But the first question that those third parties must ask is whether another commercial deployment of the CLI standard--say, for Linux or Unix--could infringe on a Microsoft patent. According to Microsoft's director of intellectual property Michele Herman, who I interviewed earlier this year, the answer is a qualified yes. "If someone implemented a product that conforms to the specification, we believe we have a patent or one pending that's essential to implementing the specification."
MS will require a RAND license which, among other things, contains the following restriction:
Sub-licensing prohibition: "This means someone else can't come along and license the patent or transfer the license we issued to them to someone else," Herman said.
Which we know MS uses because:
According to Herman, "the field of use (you get a license only to implement the standard for the purpose of conformance) and the prohibition on sub-licensing are inconsistent with the requirements of Sec. 7 of the GPL. Sec. 7 of the GPL says that if you do not have the rights to distribute the code as required under the GPL then you do not have the right to distribute at all. The GPL says you must have the rights to sublicense and to freely modify outside the field of use limitation."
In short, you are allowed to implement it, sort of. However, your "customers" are not allowed to use or distribute it. Each customer has to get his or her own license, which is another way of saying that you are not allowed to compete with MS.
For other analyses that come to the same conclusions, see:
- David Berlind: Will C# benefit Microsoft, or the industry?
- SO Grady: What’s the Problem? On Microsoft, Mono and Patents
- msversus.org: Microsoft's Response to .NET Patents
- Cnet: .Net patent could stifle standards effort
- The Register: MS patents .Everything
Basically, .NET is a Java lookalike without the "run anywhere" part. Only MS is legally allowed to freely distribute an implementation, or so they claim. Users of other implementations must individually obtain licenses to use the framework.
Java was intended to deliver "write once, run anywhere" over the internet. Microsoft saw this as a direct attack on their platform monopoly.
A great history lesson from Expert Testimony of Ronald Alepin in Comes v. Microsoft - Embrace, Extend, Extinguish
Some choice quotes about Microsoft's view of Java as a cross platform standard. A discussion in court about one of Bill Gates's emails, from 1996, about Java:
MR. LAMB: Can you highlight the first sentence, Darin, some additional thoughts?
Q. Okay. Can you read that for us?
A. Some additional thoughts. These are all based on my conclusion that Java is already here, and we need to move down the embrace/extend path.
Q. Okay. And now, again, for the Jury, what does embrace mean in this context as used by Microsoft employees?
A. It's used to indicate a strategy where Microsoft will embrace the standards or the specifications and interfaces of another company's software.
Q. Okay. And what does extend refer to?
A. Once the specifications have been embraced, then Microsoft will extend them and add additional interfaces proprietary to Microsoft.
Q. Okay. When you say add additional proprietary interfaces that are Microsoft's, what impact does that have technologically to other ISVs and OEMs?
A. Well, the result is or the impact is that what was once sort of community development property, the work of the industry and industry participants is appropriated essentially, is taken over by Microsoft. And then Microsoft takes it and with its proprietary extensions, makes it essentially unavailable on a going-forward basis to the industry participants who were responsible for first developing the specifications and the standards.
The rest is history. Sun sued MS for breaking the standard. MS lost the fight and dropped all support for Java as soon as legally possible. MS' own Java killer (without the "run anywhere" part) was the .NET branding effort.
See the following GL comment
In the early 1990s, I served as an alternate member for my company in a standards body called SQL Access Group (SAG). Many large companies had members in this group, including Microsoft. At that time, it was difficult to write database programs, because each of the major vendors had their own proprietary APIs for database connections, issuing queries, getting result sets, and so on. SAG was charged with coming up with a specification which would allow a programmer to write a single database program that would work with Sybase, Oracle, DB2, etc. After many meetings, the members came up with version 1.0 of the spec. But there were lots of problems with that version, and all the member companies agreed to wait for the next version of the spec before implementing it. Including Microsoft. Imagine our surprise when Microsoft announced ODBC, which was an implementation of the SAG specification, version 1.0. This had two negative effects. (1) The world embraced a lousy implementation as the answer to their database access problems. (2) SAG's work was totally undermined, and version 2.0 of the spec never materialized. Needless to say, I'm rather leery of Microsoft's promises.
According to this GL comment by roadfrisbee (sorry, I can't get the link right):
Didn't something similar happen around 1999 with ActiveX? I remember that a group was established with much fanfare by Microsoft, to make this an industry 'standard'. They met exactly once, then were never heard from again. But during that time, many implemented this 'soon to be a standard', which accomplished exactly what Microsoft was hoping for. This latest 'standard' from Microsoft has a real familiar aroma to it.....
And the answer in this GL comment.
Yes, they have been down this road before. There are some remnants of this project in The Open Group library, but I am not aware of anything useful.
NTP, the Network Time Protocol
Which seems not to work with MS NTP servers. See this GL comment
And a link explaining the problem: ntp questions Re: Howto synchronize a ntp client (Linux) with a server running w32time (Win 2000)?
Our organization was hit by this earlier this year.
Of course, you could always just buy one of these:
The NTS-3000-S, which is a Windows NTP server that provides "High security due to proprietary operating system". In all fairness, they don't claim the OS is Windows, and it is an embedded product, so it may or may not be.
And from the link ntp questions ...?:
> How to synchronize a NTP client (Linux) with a server running w32time (Win 2000)?
Not legally. NTP requires an NTP server to work to specification. If enough fields are set appropriately in the reply, you might be able to in practice, but you really should make Windows the leaf node and *nix the server, as Windows is about the worst common platform, even when running the reference implementation.
The WAV format is just a convoluted and subverted AIFC with the byte order switched. See Principles of Digital Audio.
.wav or Microsoft WAVE (Designed for PCs and Windows, but now usable with most audio programs, Mac or PC. Similar to AIFF for bit depth and sample rates. As mentioned above, it uses MSBs and LSBs in reverse order of AIFF files, so Microsoft developed RIFF, the Resource Interchange File Format, to support the "Little Endian" scheme. Click here for specs)
The file system "standard" that MS suddenly seems to have patented (see here). The same seems to hold for NTFS (no links).
From the link:
::std::equal(m.begin(), m.end(), p)
raises the following warning:
C:\Program Files\Microsoft Visual Studio 8\VC\include\xutility(2674) : warning C4996: 'std::_Equal' was declared deprecated
C:\Program Files\Microsoft Visual Studio 8\VC\include\xutility(2661) : see declaration of 'std::_Equal'
Message: 'You have used a std:: construct that is not safe. See documentation on how to use the Safe Standard C++ Library'
Wow. Thanks a lot. The same thing happens for a whole raft of other standard functions, such as copy, set_difference, replace, remove, merge, etc.
To connect from *nix to Windows, and not vice versa.
Boot sector pain
MS seems to change its boot-sector format often, imperiling dual-boot machines. I know that Windows will zap my boot sector with every upgrade, and MS did use the bootloader to kill off BeOS. However, boot loader problems might not be entirely MS's fault, as the linked GL discussion shows. And if you like Windows installation rumours (and wondered why MS's domains don't suffer from the common problems), you can hardly beat this one.
After 7 years of flaming, cursing, moaning and petitioning, MS now finally supports PNG alpha? ROFLMAO. And only on XP SP2, 2k3 SP1 and XP64; Win2k users are shafted and given the raw deal as usual.
So IE 7 will fully support PNG? Seems only competition can get MS moving.