West Florida

West Florida was a Spanish colony in North America. In 1810 its settlers declared the independent Republic of West Florida, which the United States annexed later that year; the annexed territory became part of the state of Louisiana when it was admitted to the Union in 1812.
