One of my colleagues strongly believes that it is less expensive to run his A/C the whole day while he's at work, than to turn it off when he leaves, then back on again.
Because, he says, it uses more energy to start the cooling process over and cool the room back down than to maintain a given (cold) room temperature. I'm pretty sure that's ignoring very basic principles of thermodynamics. I'd appreciate it if any of you could point me to a concise yet authoritative site on the topic.
If it helps, you can think of an A/C as a heat-removal machine. The more heat it removes from your home, the higher your bill. The other side of the equation is the things that put heat into your home: appliances, computers, stoves, outside air, and of course, the sun. Let's ignore the man-made stuff and the sun, and focus on the outside air. The amount of heat that enters your home through the walls is proportional to the temperature difference between inside and outside. Roughly speaking, twice as much energy enters your 75-degree home when it's 95 degrees outside (a 20-degree difference) as when it's 85 (a 10-degree difference).
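In code, that proportionality is just Newton's law of cooling. Here's a minimal sketch; the conductance value `u` is a made-up number for illustration, not a figure from the question:

```python
# Heat flow into the house is proportional to the indoor/outdoor temperature
# difference. "u" is a hypothetical lumped conductance for the whole house
# (walls, windows, roof), in BTU per hour per degree F.
def heat_gain_btu_per_hour(t_outside, t_inside, u=300.0):
    """Rate of heat entering the house, in BTU/hour (negative = heat leaving)."""
    return u * (t_outside - t_inside)

# Twice the temperature difference means twice the heat gain:
print(heat_gain_btu_per_hour(95, 75))  # 6000.0 BTU/hour at a 20-degree difference
print(heat_gain_btu_per_hour(85, 75))  # 3000.0 BTU/hour at a 10-degree difference
```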
Let's do a case study. For simplicity's sake, let's say that during the 10 hours you're away from home for work, the average temperature outside is 85 degrees, and your thermostat is set at 75. We'll compare two houses: one where the A/C is left on during the day, and one where the A/C is turned off when you leave and turned on when you return. Let's say that the houses are insulated so that when it's 10 degrees warmer outside, enough energy enters the home to raise the interior temperature by 1 degree per hour. We'll call that amount of energy X btu/hour. Whatever total energy enters each house over the 10 hours must eventually be removed by the A/C.
For the A/C-always-on house, the calculation is simple: 10 hours * X btu/hour.
For the A/C-off-during-the-day house, the calculation is a bit more complex (done exactly, it would involve calculus: the temperature decays exponentially toward the outside temperature). Roughly speaking, during the first hour, the house gains X btus, which causes the temperature to rise by 1 degree. During the second hour, however, the temperature difference is only 9 degrees, so the energy gained is only 0.9X btus, and the temperature rises 0.9 degrees. During the third hour, the delta-T is 8.1 degrees, so 0.81X btus are added, and so on. By the end of the day, the temperature has risen about 6.5 degrees, and the total energy gained is only about 6.5X btus, roughly two-thirds of what the always-on house takes in.
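If you'd like to check that arithmetic, here's a quick hour-by-hour simulation of both houses under the same assumptions, with X left symbolic as 1.0 energy unit per hour at a 10-degree delta-T:

```python
# Hour-by-hour simulation of the two houses. Assumptions from the text:
# outside is 85 F, thermostat is 75 F, and a 10-degree difference lets in
# X btu/hour, which raises the interior temperature 1 degree per hour.
X = 1.0          # energy per hour at a 10-degree delta-T (kept symbolic)
HOURS = 10
T_OUTSIDE = 85.0
T_SET = 75.0

# House 1: A/C holds 75 all day, so the delta-T stays at 10 degrees.
always_on_energy = HOURS * X

# House 2: A/C is off; the house warms up, shrinking the delta-T each hour.
t_inside = T_SET
off_energy = 0.0
for hour in range(HOURS):
    gain = X * (T_OUTSIDE - t_inside) / 10.0  # btus gained this hour
    off_energy += gain
    t_inside += gain / X                      # each X btus raises temp 1 degree

print(f"Always on:   A/C must remove {always_on_energy:.1f}X btus")
print(f"Off all day: A/C must remove {off_energy:.1f}X btus "
      f"(house warmed to {t_inside:.1f} F)")
# Prints roughly 10.0X vs 6.5X, matching the estimate above.
```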
Now, this ignores other heat sources (sun, appliances, etc.), but that doesn't change the conclusion: as those sources heat the house up faster, the delta-T between inside and outside shrinks, and the house gains even less energy from the outside air.
It's also worth pointing out that the cooler the outside air, the more efficient your A/C will be. Running the A/C in the middle of the day when it's 95 degrees requires more energy to expel X btus from the house than when it's only 85. So by waiting until late afternoon to cool the house, you save even more.
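One way to see why: the ideal (Carnot) coefficient of performance, which caps how much heat can be moved per unit of work, shrinks as the indoor/outdoor gap grows. Real A/Cs achieve only a fraction of this ideal, but the trend is the same. A quick sketch:

```python
# Ideal (Carnot) coefficient of performance for cooling: heat moved per unit
# of work, computed from absolute temperatures. Real units fall well short
# of this bound, but the dependence on outside temperature is the same.
def carnot_cop(t_inside_f, t_outside_f):
    t_cold = (t_inside_f - 32) * 5 / 9 + 273.15   # indoor temp in kelvin
    t_hot = (t_outside_f - 32) * 5 / 9 + 273.15   # outdoor temp in kelvin
    return t_cold / (t_hot - t_cold)

print(carnot_cop(75, 95))  # ~26.7: each unit of work can move ~27 units of heat
print(carnot_cop(75, 85))  # ~53.4: twice as efficient at the smaller delta-T
```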
I'll grant that it's nice to come home to a climate-controlled house. But that's why you get a programmable thermostat. Have it maintain 75 degrees (or whatever) starting an hour before you arrive, and switch to 95 degrees (so it basically turns off) an hour before you leave in the morning. Now you get the savings from turning off your A/C during the day and still come home to a cool house.