gungadin09
Posts: 3232
Joined: 3/19/2010 Status: offline
Basically, a state or group of states (a confederacy) declares independence from the republic to form its own country. In the Civil War, the Southern states seceded from the United States. The North (in other words, the federal government) declared war on the South. When the North won the war, it meant that the Southern states remained part of the "United States". When people talk about the South "rising again", what they mean is that it would again try to win its independence from the rest of the country. I thought that, apart from some patriotic windbags, no one was seriously considering that as a possibility. But I could be wrong.

pam
< Message edited by gungadin09 -- 1/12/2011 11:28:04 AM >