
Monday, May 13, 2019

How World War II Transformed the United States Society and Economy Essay

How World War II Transformed the United States Society and Economy - Essay Example

The nation's survival of one of the most dreadful episodes in world history made America a powerful nation that continues to influence the rest of the world to this day. WWII was indeed a defining moment for a nation suffering from the agonies of the Great Depression (Kennedy xi). After the Second World War, the U.S. became the standard of power and economic prosperity. Even now, many nations seek diplomatic relations with the country in order to maintain a connection with the world's most influential nation. How the U.S. attained this coveted position can be traced to its significant participation in the Second World War.

In the beginning, the U.S. did not want to compromise anything just to prove something to the world. The American government was seriously dealing with the dilemmas of the Great Depression (Kennedy xi) and its own national problems. There was no intention to engage in an all-out war against Germany, Japan, or Italy until the Japanese military attacked the U.S. naval base at Pearl Harbor, Hawaii. The country was thus provoked to defend itself because it had a sensible reason to resort to such a courageous action. However, instead of plotting revenge against Japan, the U.S. wanted to begin by defeating the root of the problem, then Nazi-dominated Europe (Wartime Strategy). Uncle Sam was more worried about the possibility of German scientists inventing weapons of mass destruction than about further attacks by the Japanese military (Wartime Strategy). Hence, it was a decision the U.S. had to make for the sake of its future and its people.

World War II ended after six years, leaving the world traumatized. However, this ending marked the start of a renewed America and the momentum of monumental change for the one-time isolationist nation. Americans realized the value of life, probably because they had seen the vast impact of the war on their country. Women began to realize how they could help society by becoming part of the country's workforce. This is one reason why the U.S. is open-minded about sharing responsibilities between the male and female members of the household. Freedom is apparent because Americans themselves recognize its importance, allowing members of society to take part in making their communities a better place to live. WWII therefore changed the perception of American society regarding the equal roles of men and women. This change, in turn, led American society to become more liberal in making the crucial decisions necessary for its survival.

Slavery was also a crucial part of United States history. African Americans had been forced into slavery by White Americans and were not given equal rights by the government. The end of slavery was something African Americans had yearned for over many years, alongside the end of extreme discrimination. African Americans longed for freedom, and greater freedoms were extended by the American government at the onset of WWII. Many Black Americans were given the chance to serve in the army, which in some ways lifted their standing in society during the Second World War. In addition, WWII in some ways diminished racial inequality in a country inhabited by varying ethnicities, like America.
Even though they suffered from discrimination after their return from the war, opportunities for African Americans increased as legislation such as the Civil Rights Act of 1964 and the Fair Housing Act of 1968 was enacted (Frazier and Tettey-Fio 85). Eventually, the enforcement of this legislation enhanced the economic status of Black Americans, which led to
