Cole Austin
1B
Sep 11, 2013
In my opinion, war is in a sense politics put into literal action; in other words, it is politics in action. Politics play a big part in why a war is started in the first place (oil, terrorist threats, nuclear threats, etc.) and in shaping how it should be fought. Take the war in Iraq, for example: the U.S. saw a threat from Iraq along with the possibility of access to the oil on its soil. Politically, it was a win-win. The U.S. went to war saying it was a necessary response to the attacks made against it. The truth, though, is that at the time we had no sure evidence as to who had actually committed those attacks against us. We simply needed to show the world that we would not be bullied and would not let something so catastrophic go unanswered; we needed to strike back. Therefore, as mentioned earlier, the U.S. had leads pointing to Iraq (along with its large supply of oil) and decided to attack. It was only later on that the U.S. public was informed of the oil consumption/extraction that was taking place.
In another sense, war is based on politics. We always need a reason to go to war, be it to put on a display of might for the world or simply to take what we want from another nation. No matter what, there is always a hidden reason behind it. Such political action is evident in that it is the U.S. government that declares war, or unofficially involves itself in one. The government puts the country into these wars, but it is up to the citizens to fight them while it stands on the sideline coaching. To put it simply, we have not necessarily needed to go to war since World War II; rather, we felt it was necessary for the benefit of the U.S. to enter the quarrels that followed, and that once they were finished we would be a better nation for having done so (although many times that was not the outcome). In other words, the idea is to come out better, or with some advantage we didn't have before.