United States of America: California

California is a state in the Western United States. What is now California was first settled by various Native Californian tribes before being explored by a number of Europeans during the 16th and 17th centuries. The Spanish Empire then claimed and colonized it. In 1804, it was included in Alta California, a province of New Spain.