When I buy a new computer, I am happy that it runs much faster than the
previous one. But after a year or two it gradually becomes slower and
slower, until I finally have to buy a new and faster computer and the story
starts all over again. But why?
I remember the good old 1980s, when a PC was just a PC. The
computing power of the original IBM PC was ten thousand times less than that of
a computer today, and so were the RAM and hard disk capacities. Yet those
machines often appeared to be faster than computers are today. In those days,
you could press a key (there was no mouse) and expect an immediate response on
the screen. Today, you sometimes get an immediate response, sometimes not. The
other day I had to wait more than half a minute from the moment I clicked on a
drop-down menu until the menu actually appeared on my screen. The computer was
out of RAM and busy installing an update I never asked for. Come on, computer
industry, you can do better than that! My computer is ten thousand times faster
than the one I had back then, and yet I waste more time waiting in front of my
screen than I did back in the old DOS days.
As you know, I am doing a lot of research on computer performance, and here
is my list of the main culprits:
- Platform-independent programming. Programming languages like Java,
C# and Visual Basic produce byte code or intermediate code that is
interpreted or compiled on the end user's computer. Interpreting the byte
code takes at least ten times as long as executing compiled code. Some
systems use a just-in-time compiler that translates the byte code to machine
code, but the just-in-time compilation takes time too, of course, and the
two-step compilation process produces less optimal machine code in the end
(a short sketch after this list illustrates the time spent on just-in-time
compilation).
- Higher level of programming. The trend in programming tools goes
toward ever higher levels of abstraction. This enables the programmer to
build more complex functionality in less time. But the data and
instructions have to pass through the multiple layers, tiers or domains of an
ever more complex software engineering model.
- Runtime frameworks. Higher-level programming tools often require various
frameworks to be running on the end user's machine. These include the Java
virtual machine, the .NET runtime, Flash, Adobe AIR, Microsoft
Silverlight, various script interpreters, and runtime libraries. These
frameworks often use far more resources than the application
itself.
- Databases. Some applications use a database for storing simple
user data where a plain old data file would suffice (a short sketch after
this list shows what such a plain-file alternative can look like).
- System database. The Windows system database (the registry) is such a big
resource-thief that it deserves a separate point. The system database grows
every time you install a new application, and it is not always cleaned up
when you uninstall something. Most applications store their configuration
data in the system database, and it can take several seconds to retrieve
these data.
- Virus scanners. Some virus scanners check every file each
time it is accessed. Other scanners check files only when they are
created, or scan them in a background process. Some of the most common virus scanners slow everything down so much that they become a real nuisance to the user.
- Graphical user interface. User interfaces tend to become more and more fancy. Many resources are used on graphical elements that have only
aesthetic value and tend to distract the user's attention away from his or
her work.
- Background processes. A typical computer may have fifty or more
processes running in the background. Many of these processes are
unnecessary. Some of them belong to a specific application, but
they are started every time the computer starts up, even if the application
they belong to is never used. A short sketch after this list counts the
running processes and the memory they occupy.
- Network. Some applications rely on remote data, assuming that the
user has a fast internet connection. This makes response times more
unpredictable than when data are stored locally.
- Automatic updates. It is becoming more and more common for
applications to update themselves automatically. This is not always good
from the user's point of view. The program may suddenly behave differently
from what the user is used to - and new features also mean a potential for
new bugs and incompatibilities. Automatic updates are necessary when there is a security reason,
such as updating a virus scanner or closing a security hole in the operating
system. Other updates should be made only if the user wants them. The
automatic checking for updates consumes network resources, and once an
update has been downloaded it has to be installed, which can take a lot of
time. Some applications search for updates every time the computer is
started, even if the application itself is never used.
- Installation. Even though many software developers pay attention to
the performance of their programs, they pay less attention to the
installation process. Some installation procedures are written in script
languages that run incredibly slowly. The user may want to go away and do
something else while the program is being installed, but it keeps popping up
questions that the user has to answer.
- Boot up. The time from when the user turns on the
computer until he or she can actually start to work is often several minutes.
This time is spent downloading and installing updates, starting background
processes, checking for plug-and-play hardware, reading the system database,
and loading system components and drivers.
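To make the just-in-time compilation point from the first bullet concrete, here is a minimal sketch in C# (picked only because it is one of the languages mentioned above; the class and method names are made up). It times the same method twice with the Stopwatch class: the first call includes the time the runtime spends translating the byte code to machine code, so it is typically several times slower than the second call.

    // Minimal sketch: measure the warm-up cost of just-in-time compilation.
    // The first call of a method includes compiling its byte code to machine
    // code; the second call reuses the machine code that already exists.
    using System;
    using System.Diagnostics;

    class JitWarmupDemo
    {
        // A small, arbitrary piece of work.
        static double SumOfSquareRoots(int n)
        {
            double sum = 0;
            for (int i = 1; i <= n; i++)
                sum += Math.Sqrt(i);
            return sum;
        }

        static void Main()
        {
            var timer = Stopwatch.StartNew();
            SumOfSquareRoots(1000);   // byte code is compiled here
            timer.Stop();
            Console.WriteLine("First call:  " + timer.Elapsed.TotalMilliseconds + " ms");

            timer.Restart();
            SumOfSquareRoots(1000);   // compiled machine code is reused
            timer.Stop();
            Console.WriteLine("Second call: " + timer.Elapsed.TotalMilliseconds + " ms");
        }
    }

The exact numbers depend on the machine, but the difference between the two calls is compilation work that a natively compiled program does once, before it ships, while a byte-code program does it on the end user's computer, typically on every run.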
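The database and system database bullets argue that a plain old data file is enough for simple user data. Here is a minimal sketch of that alternative, again in C#; the file name and the settings are hypothetical, and a real program would add error handling and a sensible location for the file.

    // Minimal sketch: keep a handful of user settings as key=value lines in a
    // small local text file instead of a database or the system database.
    using System;
    using System.Collections.Generic;
    using System.IO;

    class PlainFileSettings
    {
        const string SettingsFile = "settings.txt";   // hypothetical file name

        static void Save(Dictionary<string, string> settings)
        {
            var lines = new List<string>();
            foreach (var pair in settings)
                lines.Add(pair.Key + "=" + pair.Value);
            File.WriteAllLines(SettingsFile, lines);   // one small, fast file write
        }

        static Dictionary<string, string> Load()
        {
            var settings = new Dictionary<string, string>();
            if (!File.Exists(SettingsFile))
                return settings;
            foreach (var line in File.ReadAllLines(SettingsFile))
            {
                int eq = line.IndexOf('=');
                if (eq > 0)
                    settings[line.Substring(0, eq)] = line.Substring(eq + 1);
            }
            return settings;
        }

        static void Main()
        {
            Save(new Dictionary<string, string>
            {
                ["WindowWidth"] = "800",
                ["WindowHeight"] = "600"
            });
            foreach (var pair in Load())
                Console.WriteLine(pair.Key + " = " + pair.Value);
        }
    }

Reading and writing a file like this takes a fraction of a second, and when the application is uninstalled, deleting the file removes every trace of it; nothing is left behind in the system database.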
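The number of background processes is easy to check for yourself. This is a minimal sketch, again in C#, that counts the processes currently running and adds up the physical memory (working set) they occupy, using the Process class from System.Diagnostics.

    // Minimal sketch: count running processes and sum their working sets.
    using System;
    using System.Diagnostics;

    class ProcessCount
    {
        static void Main()
        {
            Process[] all = Process.GetProcesses();   // all processes visible to this user
            long totalBytes = 0;
            foreach (Process p in all)
            {
                try
                {
                    totalBytes += p.WorkingSet64;      // physical memory in use, in bytes
                }
                catch (InvalidOperationException)
                {
                    // the process may have exited since the snapshot was taken
                }
            }
            Console.WriteLine("Running processes: " + all.Length);
            Console.WriteLine("Total working set: " + totalBytes / (1024 * 1024) + " MB");
        }
    }

On a typical Windows machine this will usually confirm the figure of fifty or more processes, most of which the user never started deliberately.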
Now we are getting closer to answering the question of why a computer gets
slower after a couple of years. Every time a new piece of software is installed,
we are also installing a potential new resource-thief. Newer software is
likely to be programmed with more resource-hungry high-level tools and frameworks than older software. We keep updating software all the time: sometimes it updates itself automatically, sometimes we update it because we want new features, and sometimes we have to update because the old version is incompatible with something else we have installed on the computer. The growing number of
background processes uses more RAM. The system database keeps growing. And the
hard disk becomes fragmented.
There are many incentives to install new software: A website that we visit
requires a new version of Java, Flash, QuickTime, or whatever. A document we
have received requires a new version of the word processor. A software
package that we install tells us that it needs the latest version of .NET. And
the next gadget we buy tells us to update to a newer version of Windows.
A lot of the software we are running and the websites we are visiting have
built-in incentives to make us update and buy more. The software industry keeps
making products that require bigger and faster computers. And the
hardware industry follows Moore's law and produces ever more powerful computers
that make these developments in the software industry possible. This is what
keeps the whole business going.
I don't think there is an evil conspiracy between the hardware and software
industries to make us buy more and more. It's simply market mechanisms at work. The
software industry focuses on producing more advanced features in a shorter
development time for competitive reasons, and this pushes it toward the highest-level development tools.
As consumers, we would be more satisfied if software developers focused more on optimizing
performance and minimizing resource use. New software products are typically
tested under the most favorable conditions. The software developer may test his
product on a fast, state-of-the-art computer that is running nothing else.
Network features are tested against an isolated test server, not against an overloaded
server in a crowded network. It is time that software developers begin to test
their products under worst-case conditions rather than best-case conditions.