Dynamic-Link Libraries

Paul Lutus

Copyright © 1997, P. Lutus. All rights reserved.


There once was a shiny kingdom on a hill …

Okay, okay. I know you came here to read about DLLs (Dynamic-Link Libraries) and didn't expect to be confronted with a fairy tale, but my point is that there is a connection between fairy tales and the DLL system.

The original idea for the DLL system was that there would be a central repository of code. Here are the advantages:

  1. Applications would link to this code library, thus saving greatly on duplication of effort and storage space.
  2. Applications that used the DLL system would behave exactly the same as all other applications that used it.
  3. If a problem arose, or a new feature was desired, it could be written once and all applications would benefit. In this sense, the DLL system is a weak version of the object-oriented programming paradigm.

Naturally enough, along with these advantages came some responsibilities. An application should place a DLL in the central repository only if:

  1. The DLL was newer and/or better than any copy already there.
  2. The DLL was uniquely named, i.e. did not conflict with a DLL for another purpose with the same name.
  3. If the DLL replaced another with the same name, its code had been exhaustively tested, so that after the replacement other applications could use it exactly as they had used its predecessor.

In time, all these rules have been broken, even by Microsoft itself, the originator of the idea.

  1. On several occasions Microsoft has created and distributed DLL files that instantaneously broke every Windows application in the world.
  2. End users regularly install applications that ship a DLL bearing the same name as a "system" DLL, mysteriously bringing down the system until an expert can sort it out.
  3. Over time, the "synchronization" problem becomes more severe. In this scenario, a DLL is replaced with a version that conflicts with the other DLLs it must work with.
  4. The service pack problem is becoming severe. In this scenario, Microsoft releases a service pack that updates all key system DLLs. All the elements of the service pack must simultaneously be present in their most recent form or the system will crash. Then the user installs an application that blithely replaces one or more of the DLLs from the service pack. Result — system failure, even on Windows NT 4.0, which, notwithstanding its reputation for stability and resilience, will fail utterly and completely.

This is an example of a collision between an idea and reality, a key element in the human drama. The idea was sound, but it failed to take into account the imperfections in the human character, in particular those imperfections that influence the creation and operation of computer programs.

The reality is that Microsoft and any number of software vendors regularly risk the stability and security of the end user's machine by writing DLL code as though it were normal programming. It isn't. To write a DLL, you must imagine the effect of your changes and additions on every computer program that uses it. This is obviously impossible.


The solution to these problems is to go back to the system that preceded the DLL system. Every application should place as many of its DLL files as possible in its own directory (some DLLs are part of Windows itself and must be accessed in common). No application should assume that it may copy DLLs into the system directory, or that its newer version of a system DLL is safe to install merely because it is newer. Many applications (including Microsoft's own) have rendered systems unstable or unusable through exactly this reasoning.

My personal experiences confirm this seemingly skeptical appraisal. In the course of writing and distributing Windows applications, I found the majority of customer problems were related to DLLs. If I created a program that assumed that standard Windows system DLLs would be present, those DLLs would not be present or would not be current, and the application would fail. If I took it upon myself to follow the DLL guidelines and copy DLLs into the system directory after ascertaining that my DLL was newer than the current one, other applications would fail.

So, after years of mysterious application failures, I simply deliver my applications with all required DLLs and install them in my program's directory and nowhere else. DLL-related e-mail has dwindled almost to nothing.

I say "almost" nothing. Several of my recent programs are not packaged with DLLs because the programs are tiny (usually under 40K). On each program's page there is a link leading to my DLL library. If visitors have difficulties with the application, they are expected to download the DLL files before reporting the failure as a bug in my program.

But this is mystifying to some: why should a copy of mfc42.dll with version number 4.2.6256 cause my application to fail, when a version released just four months later (4.21.7022) runs it perfectly? Aren't these DLL versions just cosmetic improvements, bug fixes and so forth? No, they are not. The newer DLL version contains routines that (1) are required by my program, and (2) do not exist in the earlier versions.


So, to cut to the chase —

  1. Always download the required current DLLs. Do not assume that, because your system has a copy of a DLL with the same name, it is the same DLL.
  2. Never copy a DLL into the system directory as so many applications do — unless you also smoke cigarettes, drink heavily, eat peas with a knife, maltreat your dogs and believe they speak Latin in Latin America.
  3. Place the downloaded DLL files only in the same directory as the application that needs them. This allows the application to "see" the DLLs but hides them from the rest of the system.

These Pages Created and Maintained using Arachnophilia.

