Californians are known for metro-city living and a warm-weather lifestyle. They hate cold weather, rain, and snow, and they aren't nature lovers. They hate places that aren't California, like Arizona, Colorado, Oregon, Kansas, Florida, or New Orleans, and they'd never even consider living in any of them. California is too expensive and overhyped. If you already moved away, why would you even consider moving back? There's a ton of things to do in states that aren't California, and you can still do music, writing, film, and all that. What's the point?
Honestly, I once met an old guy who told me, "If you don't move back to California before you die, you are nothing but an idiot." I was like, what the fuck?! What does that have to do with California? I figure the guy was a hardcore Californian, and it just left me confused and awkward as fuck.
EDIT: I'm glad I don't live in California anymore because of the cost of living, traffic, and overpopulation, and I happily live somewhere else on the West Coast.