West Florida

West Florida was a North American colony, held first by Britain and then by Spain, that declared itself an independent republic in 1810 and was annexed by the United States later that same year. The annexed region became part of the state of Louisiana when Louisiana was admitted to the Union in 1812.
