There is an odd truth to that...
It's on my list of states I'm considering living in again. I've lived there before. The west coast is great, but that whole "California sinking into the ocean if a shitty enough earthquake happens" clause makes me a bit wary. I wonder if that's just a myth... I should really look into that.
I've lived all over the States and have visited them all. What are your plans?