United States of America: Hawaii

Hawaii is a U.S. state in the Western United States, composed entirely of islands, and the only state in the tropics. Hawaii is also one of the few U.S. states to have once been an independent nation. Settled by Polynesians sometime between 124 and 1120, Hawaii was home to…
