You may be doing something wrong with your math, because your numbers seem pretty high. 90 W in a laptop? That's a fireball.
Unless it was one of those modified Alienware laptops that use desktop chips and enormous fans, and it was playing games while rendering videos, I have no idea how a laptop could get that high.
I have a honker of a laptop: a quad-core Intel i7 with a discrete graphics card. It uses about 20 W when on and doing light tasks, a bit over 40 W during processor-intensive tasks like encoding or gaming, about 9 W when it's on with the screen off, and less than 3 W in standby. Even if it's on and plugged in at home 24/7, that's only around $12/year. It's a Sandy Bridge, which runs a lot cooler than older models.
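If you want to sanity-check numbers like that yourself, the math is just watts times hours divided by 1000 to get kWh, times your electricity rate. Here's a quick sketch; the $0.12/kWh rate is a made-up example, so plug in whatever your utility actually charges (the exact dollar figure obviously shifts with your rate and how much of the day the machine spends at each power level):

```python
# Rough annual electricity cost for a device drawing roughly constant power.
# The $0.12/kWh rate is an example -- substitute your local rate.
def annual_cost(watts, rate_per_kwh=0.12, hours_per_year=24 * 365):
    kwh = watts * hours_per_year / 1000  # watt-hours -> kilowatt-hours
    return kwh * rate_per_kwh

print(annual_cost(20))  # 20 W around the clock: ~$21/year at this rate
print(annual_cost(9))   # 9 W (screen off) around the clock: ~$9.50/year
```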
You can estimate your laptop's power draw in Windows with Joulemeter: http://research.microsoft.com/en-us/downloads/fe9e10c5-5c5b-450c-a674-daf55565f794/. It's not perfectly accurate, since the program itself uses power while it measures, but it should get you in the ballpark.
I did buy a Kill A Watt meter. Basically what I learned was that a lot of the devices people get really concerned about for phantom power aren't worth worrying about. Many devices like phone chargers and power strips wouldn't even register (under 0.5 W) when plugged into the wall but not into another device. It's a lot of fun from an academic standpoint, but I don't think it's going to save people much versus just guesstimating.
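To put "under 0.5 W" in perspective, here's the same back-of-the-envelope math (again assuming an example $0.12/kWh rate, not your actual bill):

```python
# Annual cost of a worst-case 0.5 W phantom load, at an example $0.12/kWh.
watts = 0.5
kwh_per_year = watts * 24 * 365 / 1000  # 4.38 kWh over a full year
cost = kwh_per_year * 0.12              # about $0.53/year
print(f"${cost:.2f} per year")
```

Even a dozen such chargers left plugged in all year would run you only a few dollars, which is why chasing phantom loads rarely pays off.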