It's interesting how some utilities are simultaneously saying "we can't support solar and wind because it's intermittent" yet also refuse to spend money on needed infrastructure upgrades that have the side benefit of making power distribution easier to manage. By interesting, I mean stupid.
Oh, good. You've got the ready-made solution for integrating solar/wind/etc and keeping a stable grid? Don't keep it to yourself, share that research! Everything I've read, including fairly recent stuff, is clear that the problem is still quite unsolved - unless the plan is to just starve the power companies out and let the grid collapse in 10-15 years, which quite a few people seem to want (or, at least, are perfectly OK with - the "The power company owes me retail rates for everything I generate, instead of their energy cost, and I should get paid to use their grid" folks).
Between them and the anti-solar, pro-desert crowd, it'll be hard to significantly expand utility-scale power.
Except for all those utility-scale buildouts doing exactly that. Rooftop solar is a rogue generator that's hard to deal with; utility-scale solar can do things like bid into the energy markets with accurate forecasts and operate curtailed - if you can't maintain max output because of clouds, maintain 50 or 60% of rated output so you can keep the slew rates low. A bit of energy storage can let the plant ride through dips in the clouds as well. But there's a huge, huge difference in how a utility solar plant is operated and how rooftop inverters operate ("All the power they can produce at every point in time, with no external controls on them").
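To make the curtailment point concrete, here's a toy dispatch loop - every number is invented, not from any real plant: the plant holds output at 60% of rated so changes stay within a slew limit, and a bit of storage covers the gap when a cloud cuts available power faster than the plant is allowed to ramp down.

```python
# Minimal sketch of curtailed operation with a ramp-rate (slew) limit.
# All numbers are illustrative - no real plant's parameters are implied.

RATED_MW = 100.0           # nameplate capacity
CURTAIL_FRACTION = 0.6     # run at 60% of rated so there's headroom
MAX_SLEW_MW_PER_MIN = 2.0  # maximum allowed change in output per minute

def dispatch(prev_mw, available_mw):
    """One-minute step: chase the curtailment target, slew-limited.

    Returns (output, storage_mw), where storage_mw is the power the
    plant's storage must supply to honor the slew limit when clouds cut
    available power faster than the plant may ramp down.
    """
    target = min(available_mw, CURTAIL_FRACTION * RATED_MW)
    out = max(prev_mw - MAX_SLEW_MW_PER_MIN,
              min(prev_mw + MAX_SLEW_MW_PER_MIN, target))
    storage_mw = max(0.0, out - available_mw)  # shortfall from storage
    return out, storage_mw

# A passing cloud: available power dips from 90 MW to 40 MW and recovers.
out = 60.0
for avail in [90, 90, 70, 40, 40, 70, 90, 90]:
    out, store = dispatch(out, avail)
    print(f"available {avail:3d} MW -> output {out:5.1f} MW"
          f" ({store:4.1f} MW from storage)")
```

The output only moves 2 MW/min through the whole cloud transit, which is exactly the low-slew behavior a grid operator can plan around.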
I do think it'll be more economical for most people to switch to rooftop solar in the near future, especially if it can be amortized over 30 years like housing itself.
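For a rough sense of what "amortized over 30 years like housing" looks like, here's a sketch with entirely assumed numbers - the system cost, interest rate, production, and retail price are all hypothetical, and the comparison is very sensitive to them (and to the grid-cost issue raised below).

```python
# Roll a hypothetical rooftop-system cost into a 30-year loan and compare
# the payment to the bill it offsets. All values are assumptions.
SYSTEM_COST = 20_000   # installed rooftop system, $ (assumption)
ANNUAL_RATE = 0.06     # mortgage-ish interest rate (assumption)
YEARS = 30
MONTHLY_KWH = 700      # assumed production from ~6 kW of panels
RETAIL_RATE = 0.15     # $/kWh the production would offset (assumption)

r, n = ANNUAL_RATE / 12, YEARS * 12
payment = SYSTEM_COST * r / (1 - (1 + r) ** -n)  # standard amortization
print(f"loan payment: ${payment:6.2f}/month")
print(f"offset value: ${MONTHLY_KWH * RETAIL_RATE:6.2f}/month")
```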
Are you referring to off-grid solar, or grid-tied with no backup? What variant of "solar", in particular, do you expect will be economical for people to switch to?
If it's grid-tied solar, you have to deal with the problem that less than half of a typical residential power rate pays for the kWh of energy - the rest is grid maintenance. That's woven into the per-kWh rate since it makes for a simpler power bill, and the ramping tiers tend to approximate demand-charge behavior (if you're using a ton of energy, the connection to your house/neighborhood has to be able to handle more than someone barely using anything).
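A quick sketch of why that split matters - the rates here are assumed for illustration, not any real utility's tariff:

```python
# Illustrative split of a residential rate (assumed numbers). If less than
# half the retail price is the energy itself, crediting exports at retail
# shifts the grid-maintenance share onto everyone else.
RETAIL_RATE = 0.15   # $/kWh billed to the customer (assumption)
ENERGY_COST = 0.06   # $/kWh for the energy itself (assumption)
GRID_SHARE = RETAIL_RATE - ENERGY_COST   # wires, transformers, upkeep

exported_kwh = 500   # a month of rooftop exports (assumption)
print(f"credited at retail:      ${exported_kwh * RETAIL_RATE:6.2f}")
print(f"energy actually avoided: ${exported_kwh * ENERGY_COST:6.2f}")
print(f"grid costs left unpaid:  ${exported_kwh * GRID_SHARE:6.2f}")
```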
If it's off-grid solar, well... you run into the problem that it's comically impractical to do with a typical house, even if you handwave at expensive lithium batteries. If you're cool with blacking out entirely for a couple weeks a winter, you can do it, but to design a system that can run a typical house off grid for the winter, you've got an awful lot of extra expense involved, as well as a backup generator that is far, far less thermally efficient than a power plant (a typical small-scale generator runs around 10-12% thermal efficiency, maybe 15% for a small diesel, but they're a royal pain to start in the winter when you need the power). I deal with the realities of off-grid power in my office, and am designing a system that can run my house substantially off grid, but I intend to remain grid-tied, and don't expect my system to "pay off" in financial terms just about ever. Even with me doing all the work myself (because nobody will do what I want, and I'm stubborn like that).
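Back-of-envelope on what those generator efficiencies mean in fuel cost per kWh - the fuel energy content and price are assumptions (roughly 33.4 kWh of thermal energy per gallon of gasoline, $3.50/gal):

```python
# Small-generator fuel cost per kWh at the efficiencies mentioned above.
THERMAL_KWH_PER_GAL = 33.4   # thermal energy per gallon of gasoline
FUEL_PRICE = 3.50            # $/gallon (assumption)

for eff in (0.10, 0.12, 0.15):   # small gas genset ... small diesel
    kwh_per_gal = THERMAL_KWH_PER_GAL * eff
    print(f"{eff:.0%} efficient: {kwh_per_gal:4.1f} kWh/gal -> "
          f"${FUEL_PRICE / kwh_per_gal:.2f}/kWh in fuel alone")
```

That lands around $0.70-1.05/kWh in fuel before you count the generator itself - several times a typical grid rate.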
If you can get an off-grid system to generate power at 4x the grid rate over the long term, you're doing well. Two years in, my cost per kWh delivered for my office power system is around $1.50/kWh. In 10 years, assuming similar loads, I'll be around $0.30/kWh - but then need battery replacement. In 30 years, I might be able to get my cost down to close to what the grid offers, but that's assuming zero other system failures, not counting generator fuel (which, admittedly, is a small amount), replacement panels, etc. And I'm a battery/energy geek who is willing to do all sorts of interesting things to maintain my system - I'm not a set-it-and-forget-it type, which is what most people want of their electricity. It would have been far, far cheaper to trench power, but, to me, far less interesting. I'm weird. I accept this.
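The arithmetic behind those figures is just cumulative spend over cumulative kWh. A sketch - the annual load and battery cost are assumptions picked to match the $1.50/kWh-at-two-years number, not the actual system's specs:

```python
# Cost per delivered kWh over time, with a battery replacement after
# year 10. Assumed values, chosen to reproduce the figures above.
ANNUAL_KWH = 2_000                    # delivered per year (assumption)
SYSTEM_COST = 1.50 * 2 * ANNUAL_KWH   # $6,000 -> $1.50/kWh at year 2
BATTERY_REPLACEMENT = 2_000           # new pack after year 10 (assumption)

def cost_per_kwh(years):
    spend = SYSTEM_COST + (BATTERY_REPLACEMENT if years > 10 else 0)
    return spend / (ANNUAL_KWH * years)

for yr in (2, 10, 20, 30):
    print(f"year {yr:2d}: ${cost_per_kwh(yr):.2f}/kWh")
```

That gives $1.50 at year 2, $0.30 at year 10, and roughly $0.13 at year 30 - close to grid rates only if nothing else fails along the way.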
Syonyk - have you looked into small desalination systems (~250-500 gallons a day capacity)? I've read into them but am unsure about their practicality vs. buying a giant tank and storing freshwater in it.
... no, it's not like I've got a ton of salt water to deal with. I live in high desert. We get 8-10" of precipitation a year, which I intend to substantially collect in a 10k gallon tank or so (I estimate 6-12k gallons/yr of collection, depending on the efficiency of my collectors and what I can route where). That'll be for irrigation, firefighting, and backup house use, but it's intended primarily for outdoor use.
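For anyone checking the rain math: an inch of rain on a square foot is about 0.623 gallons. The collection area and efficiency below are assumptions, not the actual numbers from the post, but with ~2,000 sqft of collection they bracket the 6-12k gallons/yr estimate:

```python
# Rough rainwater-harvest arithmetic. Area and efficiency are assumptions.
GAL_PER_SQFT_INCH = 0.623

def annual_gallons(area_sqft, rain_inches, efficiency):
    return area_sqft * rain_inches * GAL_PER_SQFT_INCH * efficiency

for rain in (8, 10):           # the stated 8-10"/yr range
    for eff in (0.6, 0.9):     # first-flush, overflow, routing losses
        print(f'{rain}" rain at {eff:.0%} efficiency: '
              f'{annual_gallons(2000, rain, eff):6,.0f} gal/yr')
```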
Aren't all long distance transmission lines already high voltage DC? That's why we have transformers and substations, to convert distant power to AC for local use. Are you suggesting we put in more transformers in more places, to minimize the distance AC has to go? I assure you that power utilities have already run the math on the most efficient way to solve this problem.
No. It's almost entirely high voltage AC, which gets interesting over long distances. And we don't have anything resembling a national grid that can pump energy in and out of various places - there's no way for wind in Texas to... well, really power anything but Texas (ERCOT), and there's no way for western AZ solar to power the east coast during their evening peak (western vs eastern interconnect, and a total lack of capacity across that distance). Building a national high voltage DC grid opens up a lot of options, without requiring any sort of frequency sync across the country - keeping sync is hard enough on the current grids we have. HVDC drops out to the local waveform where you need it, and you can skip the inverters on solar farms - just have a DC-DC boost converter coming off the solar arrays. There's enough capacitance in the system that it's quite stable, and you can just have current sources and sinks, letting the voltage float over a fairly wide range.
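The "current sources and sinks, let the voltage float" idea boils down to dV/dt = I_net / C on a capacitive bus. A toy version - the bus capacitance and current imbalances here are invented purely to show the relationship, not to model any real HVDC system:

```python
# On a capacitive DC bus, net injected current moves the voltage at
# dV/dt = I_net / C, so sources and sinks can respond as it drifts.
# All numbers are made up for illustration.
BUS_CAPACITANCE_F = 1.0    # effective bus capacitance (assumption)
DT_S = 1.0                 # one-second steps

voltage = 500e3            # nominal 500 kV bus
# Net injected current each second: generation minus load (amps).
for i_net in (0, 500, 500, -200, -800, -800, 200, 600):
    voltage += (i_net / BUS_CAPACITANCE_F) * DT_S   # dV = (I / C) * dt
    print(f"net {i_net:+5d} A -> bus at {voltage / 1e3:7.2f} kV")
```

With these made-up values the voltage just drifts within a band around nominal as generation and load trade off - no frequency to synchronize, only a voltage to keep in range.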
Agreed. Although the Fukushima thing wasn't a design problem, it was a location problem. It's hard to ever get on board with building reactors in unstable earthquake or tsunami zones (i.e. all of Japan). A gazillion tons of water, or the ground jumping up and down a few meters, is tough on any building.
Fukushima held up to the tsunami and earthquake just fine. What it didn't hold up to, and what it wasn't designed for, was a total station blackout. The plant required power to maintain reactor safety, and the people managing it didn't listen to the recommendation to retrofit steam-driven coolant pumps that would have allowed them to cool the reactors in the event of the total blackout that happened. It was a "Hey, this design was flawed, you should fix it this way..." situation, followed by a "Nah, we're not going to." And, as has been noted, newer designs are radically better in terms of walk-away safety and such - they're literally designed to maintain themselves for some period of time without power or operators (typically 48 hours), and then to melt down along a designed path to a non-critical, stable, safe configuration.