EESI Energy and Engineering Solutions, Inc.



George Owens is a past president of the Association of Energy Engineers, an "Energy Manager of the Year," and an "Energy Managers Hall of Fame" inductee.


by George R. Owens, PE, CEM

Since the first cave dwellers banked down the fire with ashes at night, we have striven (sometimes more, sometimes less) to reduce energy costs. Their motivation may have been higher than ours, with a Tyrannosaurus rex chomping around while they were out collecting firewood.

Even during the 1800's and early 1900's when utility prices were decreasing (remember utility prices did not start rising until the 1970's), the control of energy was important. Automatic controls were developed in the late 1800's and a workable night thermostat was developed in the very early 1900's.

Not until the 1970's, with multiple energy crises and rising costs, did the idea of conserving energy become truly important to building owners. A profession was born (the Energy Manager), and equipment was developed to reduce costs. Out of all this, the most important new technology was the advent of the computerized Energy Management System. These early systems provided centralized, unattended control with electronic accuracy. However, they were bulky, not user friendly, unreliable, and very expensive.

During the 1980's, technology changed the face of Energy Management. The personal computer (PC) and the increasing computer literacy of the general population were responsible for improving Energy Management Systems, increasing their sophistication while, paradoxically, easing their use by the operator. However, the systems stubbornly refused to yield major price reductions or truly complete user friendliness.

Now it is 2001, and looking back at the history of Energy Management Systems in the 1990's, who could believe that you had to punch keys to get data in and out of the machine -- how archaic!

Let's take a look at just some of the more mundane features of today's systems that, except for a select few brilliant and farsighted individuals (such as the author), were not even being considered just ten years ago.


Introduction
This paper is an attempt to predict what an Energy Management System in the year 2001 might look like. The author has used a combination of fact, speculation, and a sprinkle of science fiction to take today's Energy Management System through ten years of evolution to the year 2001. Although I consulted with the Research and Development Departments of the major Energy Management System suppliers and several smaller companies' principals, the predictions contained herein are my own, based upon that research. The predictions are also based upon the desires of an owner and specifier of Energy Management Systems (the author) to solve some of the shortcomings of today's offerings. I want to thank all who suffered through my incessant questioning in this quest for the future.


Computing Cost/Performance

The single most important reason that the Energy Management System of today is even possible is the invention of the computer, with its dramatic drop in price and corresponding increase in performance.

A summary of the author's approximation of price and performance is tabulated below:

                     ENERGY MANAGEMENT SYSTEM
                           COMPUTER COST

    Year    Computer Cost    Memory (bytes)    Cost per Megabyte    Computer Type
    1970    $50,000          16,000            $3,125,000           Mainframe
    1975    $20,000          32,000            $625,000             Minicomputer
    1980    $10,000          64,000            $156,250             1st PC's
    1985    $5,000           512,000           $9,766               IBM PC
    1990    $1,000           1,000,000         $1,000               Compatibles
    1995    $100             4,000,000         $25                  Micro
    2001    $1               1,000,000         $1                   1-Chip PC
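The cost-per-megabyte column is simple arithmetic on the two preceding data columns, and can be checked with a few lines of code. (A verification sketch; interpreting the third column as memory size in bytes is an assumption based on the surrounding text.)

```python
# Recompute the cost-per-megabyte figures from the tabulated computer
# cost and memory size.  The year and dollar figures are taken directly
# from the table above.
table = [
    (1970, 50_000, 16_000),
    (1975, 20_000, 32_000),
    (1980, 10_000, 64_000),
    (1985, 5_000, 512_000),
    (1990, 1_000, 1_000_000),
    (1995, 100, 4_000_000),
    (2001, 1, 1_000_000),
]

results = {}
for year, cost_dollars, memory_bytes in table:
    # cost per megabyte = computer cost / memory expressed in megabytes
    results[year] = cost_dollars / (memory_bytes / 1_000_000)
    print(f"{year}: ${results[year]:,.0f} per megabyte")
```

Running this reproduces the tabulated column: $3,125,000 per megabyte for the 1970 mainframe down to $1 for the projected 2001 one-chip PC.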
The earliest Energy Management Systems were installed in the early 70's and utilized small "mainframe" style computers that were costly, bulky and hard to program. The term "user friendly" had yet to be invented. The next generation (late 70's) graduated to a minicomputer or a vendor's proprietary CPU design.

Finally, someone got around to inventing the PC (Personal Computer), and the real revolution of computing power for the masses began. My first PC in 1979 consisted of an 8-bit CPU, 4k of memory, 4 megahertz, a tape recorder, and a monochrome monitor. When I expanded to 64k, a 5 1/4" disc drive, and a printer, I thought I was on top of the world and at the pinnacle of technology development. The cost was $3,600. Today, about one half of that cost will buy a tenfold increase in performance. Sixteen- and 32-bit CPU's at 25 to 35 megahertz, utilizing one to four megabytes of RAM, hard drives, laser discs, and color monitors, are being offered by dozens of vendors.

In about five years, high-performance CPU prices should drop to at least $100, and may approach $10. As these prices drop, you will see more computing power decentralized from the head end of the Energy Management System. Smart controllers will then be the norm, not the new kid on the block.

Now take technology another five-year leap -- and imagine a $5.00 or even a $1.00 full-function CPU taking up 25% of one large-scale-integration chip. At that price, the cost of the box is more than the computer itself, and everything -- motors, HVAC units, VAV boxes, relays, light fixtures, receptacles, appliances, and maybe even each and every light bulb -- will contain its own uniquely programmed CPU for control.

With all the greatly expanded computing power available at a very low cost, the next area that must evolve rapidly may also, in some ways, be the biggest obstacle to overcome.


Software
Software development has not kept up with the quantum leaps of computer hardware evolution. Yes, I know that when compared to the systems of the 70's, the software of today has improved. Today we have computer graphics and many more software routines for controlling HVAC equipment. However, the basic control strategies that I wrote about and presented in a paper at the Association of Energy Engineers World Energy Congress in 1980 are essentially the same today as then. These strategies are/were:

  • Scheduled start/stop
  • Optimized start/stop
  • Temperature compensated duty cycle
  • Temperature compensated demand shed
  • Temperature reset:
    1. chilled water
    2. hot water
    3. condenser water
    4. supply temperature
  • Night setback
The programs of today still primarily rely on the Energy Management System supplier's programmers to implement the above strategies, based upon their expertise after reviewing the building. It is rare (if it even exists) that major changes are made to the software, once installed, to improve the operations or to provide further energy cost reductions. The reason for this reluctance to make major changes is the number of hours and cost for the programmer to set up the system in the first place. Software cost (both factory and field) for a major Energy Management System may run from 20-30% of the total system installed cost.
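To show how little code the first and last strategies on the list actually require, here is a minimal sketch of scheduled start/stop combined with night setback. The schedule times and setpoints are invented for the example and do not come from any particular system.

```python
from datetime import time

# Hypothetical schedule and setpoints for illustration only; a real
# Energy Management System would load these per zone and per day type.
OCCUPIED_START = time(7, 0)
OCCUPIED_END = time(18, 0)
OCCUPIED_SETPOINT_F = 72.0
SETBACK_SETPOINT_F = 60.0

def heating_setpoint(now: time) -> float:
    """Scheduled start/stop with night setback: hold the comfort
    setpoint during occupied hours, relax it overnight."""
    if OCCUPIED_START <= now < OCCUPIED_END:
        return OCCUPIED_SETPOINT_F
    return SETBACK_SETPOINT_F

def fan_command(now: time, space_temp_f: float) -> bool:
    """Run the air handler when the space is below the current
    setpoint (a one-sided heating example; cooling is symmetric)."""
    return space_temp_f < heating_setpoint(now)
```

A production system layers holiday calendars, per-zone schedules, and the optimized-start calculation on top of this skeleton, which is where the programming hours accumulate.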

The major breakthrough in both software cost reductions and improvements in the sophistication of control will be wrought by the computer itself and not the programmers. Artificial intelligence is a term that has been around for a few years, but by 2001 it will be firmly entrenched in our computer culture and Energy Management Systems. More specifically, expert systems will be developed, possibly utilizing some form of fuzzy logic, that will self-develop and continuously optimize control strategies. An expert system is one that has a comprehensive set of rules in its base operating system. These rules generally define how outputs should react to certain input signals. These signals might come from sensors, the operator, or other outputs. But this is not enough. A self-optimization subroutine would take over and modify the expert system as the computer itself learns how to uniquely control each building.
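A toy version of that rule-plus-learning idea pairs one fixed rule with a subroutine that nudges the rule's threshold based on observed results. Every name and number below is invented for illustration; it is a sketch of the concept, not any vendor's implementation.

```python
# Toy expert-system sketch: one rule maps a sensor input to an output,
# and a self-optimization step adjusts the rule's threshold whenever
# the observed result (comfort complaints) says the rule misfired.

class ExpertRule:
    def __init__(self, threshold_f: float):
        self.threshold_f = threshold_f  # chilled-water reset trigger

    def output(self, outside_air_f: float) -> str:
        # Base rule: reset chilled water warmer on mild days.
        if outside_air_f < self.threshold_f:
            return "reset chilled water up"
        return "hold chilled water"

    def self_optimize(self, complaints_today: int) -> None:
        # Learning step: if occupants complained, the reset was too
        # aggressive -- lower the threshold so it fires less often.
        if complaints_today > 0:
            self.threshold_f -= 1.0
        else:
            self.threshold_f += 0.25  # cautiously expand savings

rule = ExpertRule(threshold_f=65.0)
print(rule.output(60.0))   # mild day: the rule fires
rule.self_optimize(complaints_today=2)
print(rule.threshold_f)    # threshold backed off to 64.0
```

Over many days the threshold settles wherever this particular building tolerates it, which is the sense in which the system "learns how to uniquely control each building."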

Even expert systems can only react to what is happening after it occurs. And, although these actions occur very soon after the event, that is often not the optimum situation. Imagine if the system could fully simulate the operation of the building and accurately predict exactly what would happen prior to its occurrence. The level and accuracy of control and optimization of energy use would be phenomenal. By 2001, all of these scenarios may be commonplace.
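A crude flavor of such prediction, using a one-node thermal model invented entirely for this example, is to simulate the space temperature a step ahead for each candidate action and pick the cheaper one that still meets comfort:

```python
# One-node thermal model, invented for illustration: simulate the next
# hour's space temperature for each candidate action, then pick the
# cheapest action whose predicted result stays comfortable.

def simulate_next_temp(temp_f: float, outside_f: float, cooling_on: bool) -> float:
    drift = 0.1 * (outside_f - temp_f)      # pull toward outdoor temperature
    cooling = -3.0 if cooling_on else 0.0   # mechanical cooling effect
    return temp_f + drift + cooling

def choose_action(temp_f: float, outside_f: float, comfort_max_f: float = 75.0) -> str:
    # Prefer the free action (cooling off) if the prediction says the
    # space will still be comfortable an hour from now.
    if simulate_next_temp(temp_f, outside_f, cooling_on=False) <= comfort_max_f:
        return "cooling off"
    return "cooling on"
```

The real promise lies in far richer models -- solar gain, occupancy, thermal mass -- but the decision structure stays the same: simulate first, act second.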


Communications
Again, going back to the earliest Energy Management Systems, all external devices were hard wired back to the computer. Then a distributed format evolved utilizing multiplexed signals over a common wire or even over the electrical distribution system (power line carrier).

The multiplex system did a lot to reduce the cost of wiring input/output devices. Power line carrier systems have not captured a large share of the market due to some early reliability problems. Both of these systems have a common drawback in that the response time increases as the system gets larger. The system response time will probably be improved in the next five years by a combination of putting more and more computing power out in the remote multiplex boxes and even in the input/output devices themselves. Also, the throughput of information can be increased by switching from twisted-pair wire signal carriers to coaxial cables or fiber optics. Still, these improvements offer no solution to high installation cost.
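The scaling problem can be made concrete with rough, assumed numbers: when a single shared trunk is polled sequentially, the worst-case update interval grows linearly with the point count.

```python
def worst_case_scan_seconds(num_points: int, poll_ms_per_point: float = 50.0) -> float:
    """Worst-case time to refresh every point once when a single trunk
    is polled sequentially.  The 50 ms per-point figure is an assumed
    round-trip time, not a measurement of any vendor's system."""
    return num_points * poll_ms_per_point / 1000.0

print(worst_case_scan_seconds(100))   # small system: 5 seconds
print(worst_case_scan_seconds(2000))  # large system: 100 seconds
```

Pushing intelligence out to the remote boxes breaks this linear growth: each box scans its own points locally and reports only changes of value upstream.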

By the year 2001, communication between sensors and multiplex boxes and the head end system will be by a combination of technologies:

  • Traditional means such as twisted wire, coaxial, etc.
  • Non-traditional methods such as:
    1. infrared
    2. radio wave.

Operator/Machine Interface

First there were pilot lights, then single-line light-emitting diode displays. Next came CRT's with text only in monochrome. Today's technology includes graphics and color with live data updated once a second. But again, the programming of this data format takes a considerable amount of time, effort, and cost.

In the future, due in a large part to the increase in computing power and expert system software, the building's complete set of plans will be loaded into the computer. On new buildings this will be a simple matter of popping in a disc from the CAD (Computer Aided Design) system that designed the building. For existing buildings, the drawings will be scanned into a CAD System and then be utilized in the Energy Management System. The system will then start to automatically synthesize this information into formats that ease the operator's ability to interact with and direct the system.

Multi-media computers will combine video, CD-ROM, audio, text, pictures and sound to enhance communication and understanding between computer and operator. The system will fully customize this interface automatically for each and every operator, anticipating what the operator will desire to know and then providing a display/sound environment before being queried.

Supplanting today's operator interfaces may be a concept called "virtual reality". This is currently being explored by major universities and research corporations. In virtual reality, you are no longer merely told or shown what is happening -- you become part of it. By donning special headsets, gloves, and even a full body suit, the operator will fully experience a building in operation. After receiving a complaint of a hot temperature, the operator would zoom immediately to the space to feel and measure the temperature. With a twist of the wrist, the operator would then be inside the VAV box, looking at damper positions and seeing readouts of air volume and temperatures. The thermostat would be readjusted while observing the whole system's operation. After the problem is solved, the operator then moseys over to the machine room and checks the operation of fans, boilers and chillers. All this without having to leave the control console.

I can envision the following scenario by 2001:

  • An operator wants to add a sensor to a previously unmonitored room.
  • The operator goes to the storeroom and picks up the $10 sensor.
  • When the operator peels off the self-adhesive backing and sticks it on the wall, several things simultaneously occur:
    1. Power is supplied to the unit by a built-in solar cell with battery backup.
    2. The sensor starts broadcasting that it is now alive. This broadcast may be infrared, radio wave, or microwave.
    3. The computer recognizes the sensor and assigns a point number, both internally and within the sensor.
    4. The system will know the location by triangulation of the signal and its internal map of the building.
    5. The system will then start a self-optimization routine to discover the appropriate control strategies to utilize this new sensor.
  • By the time the operator returns, the data for this sensor will be fully integrated with the man/machine interface. Information will be presented to the operator in the format that the operator prefers without ever once issuing a request.
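The registration steps above can be sketched as a handshake at the head end. The broadcast payload, point numbering, and triangulation below are invented stand-ins for whatever a real 2001-era protocol might carry.

```python
# Sketch of the self-announcing sensor scenario: the head end hears a
# new sensor's broadcast, assigns it a point number, and locates it.

class HeadEnd:
    def __init__(self):
        self.points = {}       # point number -> sensor record
        self.next_point = 1

    def on_broadcast(self, sensor_id: str, receiver_fixes: tuple) -> int:
        # Steps 3 and 4: assign a point number and locate the sensor
        # from the position fixes reported by several receivers.
        point = self.next_point
        self.next_point += 1
        location = self.triangulate(receiver_fixes)
        self.points[point] = {"id": sensor_id, "location": location}
        return point  # step 3 also writes this number back to the sensor

    def triangulate(self, fixes: tuple) -> tuple:
        # Placeholder triangulation: average the receivers' (x, y)
        # estimates into one fix against the internal building map.
        xs, ys = zip(*fixes)
        return (sum(xs) / len(xs), sum(ys) / len(ys))

head_end = HeadEnd()
point = head_end.on_broadcast("temp-sensor-0042", ((0.0, 4.0), (8.0, 4.0)))
print(point, head_end.points[point]["location"])  # 1 (4.0, 4.0)
```

Step 5, the self-optimization routine, would then run against the newly registered point exactly as described in the Software discussion.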

Conclusion
The history of controls and Energy Management Systems indicates that the performance has increased more than tenfold each decade. By 2001, wireless communications, self-optimizing software and improved operator interfaces will far surpass any of our current expectations. The system will be easier to use, provide better comfort, reduce energy costs and be less expensive to install. The term Energy Management System will no longer exist. All of this will be just "business as usual".

© 1991 George R. Owens.


