Sunday, December 20, 2020


Lincoln's Proclamation of April 15, 1861

The President Called for Volunteers to Force Southern Compliance


May 3, 2009 Michael Streich

The April 1861 Proclamation was not a declaration of war, although the Southern states would respond to it as if it were, setting the stage for a long and bloody conflict.

On April 15th, 1861, President Abraham Lincoln issued a Proclamation that called for 75,000 men from the various states “in order to suppress said combinations…” The Proclamation followed the surrender of Fort Sumter in Charleston, South Carolina after P.G.T. Beauregard’s bombardment. The South had fired the first shot, outraging the North. Excepting the Border States, Lincoln’s Proclamation was well received, but it would be weeks before the mostly untrained militia arrived in the nation’s capital.

Scope of the Proclamation

Lincoln opened the Proclamation by stating the reasons for his actions. The Laws of the United States were opposed and their execution “obstructed.” Lincoln listed the offending states: South Carolina, Georgia, Alabama, Florida, Mississippi, Louisiana, and Texas. These “combinations” represented a force too powerful to be dealt with through ordinary channels of compliance such as the judiciary and the federal marshals.


The language of the Proclamation indicates that Lincoln viewed his response as a “police action” designed to “repossess” Federal property, i.e., forts, armories, and other assets. This was not a “Civil War” but an “insurrection.” There was to be no “…devastation…destruction…or interference with property, or any disturbance of peaceful citizens…” Lincoln was well aware that pro-Union sentiment still existed in the South.


Lincoln, in calling a special July session of Congress, referred to unfolding events as “an extraordinary occasion.” As Commander-in-Chief, Lincoln believed that the Constitution supported his call for state militia volunteers to serve in the armies being planned for the defense of Washington and the securing of Border States like Kentucky and Missouri.


While the Proclamation resulted in an outpouring of support and unity in the North, it extinguished lingering pro-Union sympathies in the South. Governor Jackson of Missouri replied to Lincoln that his request for men was “illegal” and “unconstitutional.” Both sides rushed to enlist men, dooming Lincoln’s “police action.” The April 27th, 1861 Harper’s Weekly commented that “Nobody – outside of lunatic asylums – doubts that civil war is an enormous calamity.”



Results of the Proclamation

Initial plans by the end of April suggested three separate army groups. The first, under the command of General Scott, would defend Washington with 50,000 men. A “New York Army” commanded by General Wool would be held in reserve, while General Sumner was to encamp around the Cincinnati area with 75,000 men to protect the river systems, ultimately enabling Union troops to control the Mississippi. (“The War,” Harper’s Weekly, April 27, 1861)


Most of the soldiers arriving from the various Northern states were ill trained, and it took the troops several weeks to reach Washington, facing hostile crowds in Baltimore along the way. Additionally, some of the army’s best officers, most notably Robert E. Lee, resigned their commissions and returned to the South. The navy, it was predicted, would ensure that all Southern ports would be “hermetically sealed.”


In the South in the weeks following the Proclamation, war plans were also being refined. William T. Sherman, who had visited Virginia months before these events, had already reported that the South was preparing for war. General Beauregard wanted to attack Washington with 32,000 men but was overruled by Jefferson Davis on the advice of Robert E. Lee. Although the advice was given on military grounds, Davis also did not want to be the aggressor, falling back on his oft-repeated phrase, “I hope they leave us alone.”

The Proclamation Not a Call to War

Lincoln’s purposes were very clear: the Proclamation was neither a call to war nor a declaration of war – only Congress can declare war. Any such declaration would have legitimized the Confederate States of America. For Lincoln, the Proclamation was a carefully worded document aimed at recovering Federal property and forcing insurrectionists to comply with Federal law. It was the South that construed the Proclamation as a war declaration and responded accordingly.


Sources:


  • Harper’s Weekly, April 27, 1861 (President Lincoln’s Proclamation reprinted, commentary, and daily news)
  • Shelby Foote, The Civil War: Fort Sumter to Perryville (Vintage Books – Random House, 1986)





Civil War Political Generals

Promoting Men Through Patronage Helped Serve a Purpose

May 4, 2009 Michael Streich

The appointment of political generals by Abraham Lincoln was part of an effort to help unify the North and guarantee support from ethnic and political constituencies.

When the Civil War began, Northern troop strength was low and it would take some weeks to begin the process of building an army. Exacerbating this situation was an acute shortage of officers. The majority of “West Pointers” on General Winfield Scott’s staff came from the South, and most resigned their commissions to return to their home states. Other seasoned officers, like Grant and Sherman, had left the army to pursue private ventures. Filling this gap, President Lincoln turned to political generals, a practice that frequently resulted in leadership incompetence.

The Benefits of Promoting Political Generals

Both the North and the South employed political generals, although the practice was, by far, more common in the North. Jefferson Davis was fortunate to have a large number of West Point graduates; 306 such men served in Southern armies. Yet even in the South, political expediency caused the promotion of such unskilled men as Robert Toombs and Henry Wise.


Abraham Lincoln welcomed the opportunity to appoint political generals, but for reasons related to national unity. Civil War historian James McPherson writes that “Each of the political generals represented an important ethnic, regional, or political constituency in the North.” Lincoln’s appointments were acts of patronage, an important element in producing loyalty, whether from ethnic groups like the Germans or from northern Democrats.


The classic example cited by many Civil War historians involved Lincoln’s choice of Alexander Schimmelfennig for the position of brigadier general, over the protests of Secretary of War Stanton. Lincoln, however, wanted a truly “Dutch” sounding name (the term “Dutch” was often employed for German). Lincoln reasoned that ethnic Germans would be more apt to support his efforts if their local newspapers, often printed in the native language, highlighted the exploits of their own.

Contributions of Political Generals

Newspaper stories did, in fact, bolster the feats of political generals even if the reality of their combat contributions was far different. Schimmelfennig, fighting at Chancellorsville, was made the scapegoat by General Joseph Hooker for the large Union losses and the resulting setback. Others fared no better. Commenting on Benjamin Butler, a Massachusetts corporate attorney who managed to secure a command position, the young John Hay observed that Butler was “perfectly useless and incapable for campaigning.”



Politicians turned generals rarely had the experience necessary for leading men. Massachusetts Senator Henry Wilson, enduring the rigors of military life for the first time, took ill and resigned after spending days inspecting the defenses of Washington. New York Congressman Daniel Sickles, a Democrat, commanded no respect from the men under him and was frequently described as “brutal.”


Ulysses S. Grant and William T. Sherman also owed their commissions to political connections. Grant was championed by an influential Illinois congressman; Sherman’s younger brother was a United States Senator from Ohio and a founding member of the Republican Party. Unlike other appointments resulting from political considerations, however, both Grant and Sherman were West Pointers.


Some politicians given a rank distinguished themselves, such as Rutherford B. Hayes and William McKinley, both from Ohio and both future presidents. It didn’t hurt James Garfield’s presidential election bid in 1880 to remind voters that he had served, albeit briefly, as a brigadier general.


Lincoln’s Strategy in Promoting Political Generals


Lincoln believed that the North would ultimately triumph even though some command positions were given to men without training or experience. These appointments filled a political necessity that was deemed important in the early months of the war. Lincoln needed a unified North behind him; political generals helped in that process.

Inevitably, this gave the South, initially at least, a distinct advantage, demonstrated by their ability to thwart many of the early advances of the North, beginning at Bull Run in July 1861.


Sources:


  • James M. McPherson, “Lincoln and the Strategy of Unconditional Surrender,” Lincoln the War President, Gabor S. Boritt, ed. (Oxford University Press, 1992)
  • Page Smith, Trial by Fire: a People’s History of the Civil War and Reconstruction (McGraw-Hill Book Company, 1982)
  • T. Harry Williams, Lincoln and His Generals (Alfred A. Knopf, 1952)


Copyright Michael Streich. Republishing requires written permission.

Saturday, December 19, 2020


Presidential Leadership in American History

Theodore Roosevelt's example of presidential leadership was based on his view of inherent executive prerogative power.

Strong presidential leadership is not necessarily confined to periods of conflict or controversy in American History. Too often, poor leadership at critical moments changed the nation’s course. James Madison was reviled for his lack of leadership in 1812 and the subsequent war with Great Britain. William McKinley had to be politically cajoled into submitting a declaration of war against Spain in 1898. While some national emergencies, like the Civil War and the Great Depression, witnessed strong leadership by “the right men at the right time,” this was not always true, nor is it true in the 21st Century.

Presidential Greatness Defined by Character, Vision, and Strength

America’s great presidents may have suffered personal doubts during times of important decisions, but they never wavered publicly from what they believed to be the only way to solve a crisis. Political parties compromise; presidents lead. One such president was Theodore Roosevelt. Professors Kelly and Harbison state that “He had a dynamic and powerful personality that captured the popular imagination and inspired large numbers of people to follow him in whatever ideals he proclaimed.”

Roosevelt became president in 1901 upon the assassination of William McKinley. McKinley, as historian Lewis L. Gould notes, was effective because of his own Congressional background and the political tact he employed with powerful senators. But he was not in the mold of Roosevelt, who thought nothing of going to the American people through the press when he ran into an impasse with the Congress, especially after his 1904 reelection.

Historians note that Roosevelt believed in the “stewardship theory.” As such, Roosevelt “revived the old Hamiltonian doctrine of inherent executive prerogative power.” (Kelly & Harbison) As long as the Constitution did not prohibit an action, Roosevelt felt free to pursue it. This included taking on the oligarchs in the House and Senate, captains of industry such as J.P. Morgan, and the San Francisco School Board.

Presidents Must Lead from a Position of Strength and Decision

Roosevelt was a man of action and one who spoke his mind. In 1915, he wrote, “The policy of watchful waiting, a policy popular among government chiefs of a certain type ever since the days of Ethelred the Unready…has failed, as of course it always does…in the presence of serious difficulty…” Although Roosevelt was directing his criticism against Woodrow Wilson, the same can be said of other chief executives in American History.

Were Americans better off in 1980 than they had been in 1976? This question, posed by Ronald Reagan, helped him defeat Jimmy Carter. Americans were not better off, and neither was the world. Flawed leadership under Carter produced economic recession at home and cost the United States prestige and respect abroad. President Carter was a great humanitarian and negotiator for peace, but a poor leader in the post-Watergate world of uncertainty and mistrust.

President Reagan, dubbed the “Great Communicator,” went directly to the American people when Congress failed to enact laws his administration wanted. Reagan may have learned leadership as governor of California, but his economic policies exacerbated conditions leading to the long and slow financial decline that still impacts the United States in 2011. Reagan’s example demonstrates that leadership is not defined solely by words and rousing speeches. His own vice president, George H. W. Bush, had criticized those financial policies during the 1980 primaries, calling them “voodoo economics.”

Has Leadership Changed in the 21st Century?

Writing about the U.S. Senate in 1900, Gould observes that, “…the upper house had little sympathy for those Americans whom the rise of large corporations had disadvantaged.” Over one hundred years later, critics cite similar observations: Americans mistrust the Congress and note little leadership in the White House.

When George W. Bush achieved reelection in 2004, he boasted that he had earned “political capital.” Yet his legacy was a series of Middle East conflicts and the start of the so-called Great Recession in 2008. None of the warning signals were addressed. Historians of the future may equate Bush with Herbert Hoover rather than Teddy Roosevelt. Bush, like Hoover, ignored the “bubbles” that ultimately unraveled the American economy.

President Obama’s leadership is also questioned by critics. Like Woodrow Wilson, Obama was a product of the lecture hall; both men attended Ivy League schools and both were most at home behind a classroom podium. Leadership, however, is not learned in books but is more the product of risk and action. Wilson himself wrote that “Leadership does not always wear the harness of compromise.” Wilson, however, viewed himself more as a prime minister than a president.

Analyzing Theodore Roosevelt’s successful 1904 reelection, Charles A. Gardiner referred to “absolute executive sovereignty” of the presidency. The occasion was the annual meeting of the New York Bar Association. According to Gardiner, the president, under the Constitution, executes the laws passed by Congress with “absolute discretion” as long as he acts faithfully pursuant to his oath of office (New York Times, January 19, 1905). This was Roosevelt’s contribution to the “era of executive expansion.”

Executive leadership doesn’t come with on-the-job training. There are no “entry level positions.” Decisive leadership is tied to character, honor, and the Constitutional oath to serve the American people well and honestly. Those presidents who truly believed this are deemed great. Too many others lost the power of conviction, and the nation suffered.

Sources:

  • Lewis L. Gould, The Most Exclusive Club: A History Of The Modern United States Senate (Basic Books, 2005)
  • Alfred H. Kelly and Winfred A. Harbison, The American Constitution: Its Origins & Development, Fifth Edition (W. W. Norton & Company, 1976)
  • James Ford Rhodes, The McKinley And Roosevelt Administrations 1897-1909 (The Macmillan Company, 1922)
  • Theodore Roosevelt, Fear God And Take Your Own Part (George H. Doran Company, 1916)




Protecting the Constitution and Curbing Presidential Powers

Senator Sam Ervin's knowledge of and respect for the Constitution made him the logical choice to head the Senate Watergate Committee in May 1973.

Posted on Oct 8, 2008

On July 5, 1973, North Carolina Senator Sam J. Ervin, Jr. wrote a constituent, stating, “I will always be grateful to you for your complimentary remarks concerning the manner in which the Senate Select Committee on Presidential Campaign Activities is endeavoring to perform its duty to ascertain the truth in respect to the charges relating to the Watergate affair and its various ramifications.” The Watergate hearings represented the final and certainly the most far-reaching action of the “country lawyer” from Morganton, North Carolina; when his term ended in January 1975, “Senator Sam” retired.

Sam Ervin Challenges the Nixon White House

Sam Ervin was a conservative Democrat who held a Jeffersonian passion for the U.S. Constitution and is referred to as the “last of the Founding Fathers.” That passion led him to oppose civil rights legislation, the Equal Rights Amendment, school busing, and desegregation – although he eventually changed and embraced a “color blind” society. His love of the Constitution and of the privacy rights of Americans in the face of government surveillance of suspected “enemies” of freedom, however, never changed.

The Ervin hearings of May 1973, frequently called the Watergate Committee hearings, were a clear demonstration of Senator Ervin’s quest to expose significant criminality associated with the presidency of Richard M. Nixon, including a cover-up that led to the Oval Office. Sam Ervin took down an administration that saw itself above the law, rationalizing its Constitutional side-steps by hiding behind dubious claims of national security and executive privilege.

Privacy Issues and the Army’s Enemies List

Before there was Watergate there was a secret domestic surveillance program conducted first by the U.S. Army and then transferred to the Nixon Justice Department. It was a time of intense social reaction to the Vietnam War, to events like the Kent State massacre, and to the president’s escalation of U.S. bombing beyond Vietnam’s borders into Cambodia. Presidential lackeys, men like William H. Rehnquist (later Chief Justice of the Supreme Court), publicly supported First Amendment violations in the name of national security.

The Select Committee on Presidential Campaign Activities

The dust from Ervin’s Army hearings had barely settled when events conspired that would ultimately bring down a president. On June 17, 1972, the so-called White House “Plumbers” were caught breaking into the Democratic National Committee headquarters in the Watergate office complex.

The cover-up began almost immediately. Those directly involved received large cash payments to ensure their silence and the promise of lucrative jobs. Two courageous journalists, Bob Woodward and Carl Bernstein of the Washington Post, fed their readers salient stories gleaned from tips and leaks. By the spring of 1973, Senator Ervin seemed the most logical choice to chair the select committee conceived by Majority Leader Mike Mansfield to hold hearings on the expanding scandal.

The Nixon Tapes and Start of the White House Siege Mentality

At a time when many Americans were glued to their televisions, watching the hearings and awaiting Ervin’s “one-two” punch, President Nixon fought back, refusing to submit tapes disclosing Oval Office conversations on the basis of national security and executive privilege. The existence of the tapes had been disclosed during committee hearings. The taped conversations were eventually released after a judicial battle.

Ervin had nothing to lose. In the Senate, his reputation was solid. Personally, he had no further political ambitions, having decided to retire when his term expired in 1975. The White House, however, dispatched henchmen to North Carolina in a vain attempt to find a shred of scandal that could be used against Ervin.

Ervin, who loved to pepper his speeches and comments with biblical allusions, was a strict Presbyterian and took his religion seriously. Nixon’s conservative lackeys, however, found nothing to embarrass the Senator.

Special Prosecutors and Impeachment

Watergate was still not “open and shut.” When did Nixon first learn of the DNC break-in? Where did the money come from to cover up the crime? Who was part of the conspiracy, and what was the ultimate motive? As Ervin’s committee sought answers, Richard Nixon played hardball, shutting the doors even to members of his own party.

Legacy of Sam Ervin’s Senate Hearings

Sam Ervin’s hearings, well planned for maximum impact, provided the blueprint for the Justice Department’s special prosecutor and for other Congressional committees. In the end, the hearings showed that the Nixon White House had made a mockery of the separation of powers, engaged in illegal actions, and claimed for the chief executive an element of autocratic leverage incompatible with Constitutional restraints.

Ervin’s last official act was to guide the Privacy Act through the Senate. The Act, “represented the fulfillment of Ervin’s long attempt to protect individuals from excessive government snooping,” according to historian Karl E. Campbell.

When Sam Ervin died in April 1985, he could not have foreseen an era when the privacy rights of American citizens would be gravely tested through new technologies tied to the internet. One wonders what his reactions would be to the new Imperial Presidency that ostensibly monitors all telephone calls, e-mails, and computer transactions in the name of national security.

References:

  • Karl E. Campbell, Senator Sam Ervin, Last Of The Founding Fathers (University of North Carolina Press, 2007)
  • Sam J. Ervin, Jr., Humor of a Country Lawyer (University of North Carolina Press, 1983), Chapter 14
  • Sam Ervin, Letter dated July 5, 1973
  • Barry Sussman, The Great Coverup: Nixon and the Scandal of Watergate (Catapulter Books, 2010)
Copyright Michael Streich. Written permission to republish required.

Friday, December 18, 2020

 Issues and Non-Issues in the Presidential Election of 1840

Was Martin Van Buren a Dandy and a Protégé of Andrew Jackson While the Nation's Economy Tanked?

Michael Streich April 12, 2012

The Presidential Election of 1840 is often reduced to the campaign slogan “Tippecanoe and Tyler, too.” The newly emerged Whig Party, led by popular figures such as Henry Clay and Daniel Webster, fought for the opportunity to send Martin Van Buren back to New York. A series of events and non-issues, however, conspired to make General William Henry Harrison the Whig frontrunner.


Portraying the Democrats as Insensitive Spenders

While Whigs portrayed Democrats as “pampered office holders,” they reintroduced their Midwestern hero as a hard-cider-drinking Indian fighter of modest means who lived in a log cabin. Building on the Jackson legend precisely to deny his heir apparent Van Buren, the Whigs managed to set aside real campaign issues. They focused instead on the supposedly wealthy and aristocratic Van Buren, who had the audacity to ask Congress for an appropriation of $3,665 for White House expenditures while millions of Americans were struggling to find work and put food on the table.

Martin Van Buren became President in 1837 after serving as Andrew Jackson’s Vice President. Van Buren inherited the Panic of 1837, the most significant economic downturn the new nation had faced since the economic crisis that preceded the Constitutional Convention of 1787, as well as the enmity of John C. Calhoun and Henry Clay. Clay, who desperately wanted the 1840 nomination, reminded his listeners of “…the heart rending wretchedness of thousands of the working class out of employment.”


To Southerners, Van Buren was an untrustworthy eastern establishment type whose lifestyle was compared to the monarchs of Europe. If Jackson was seen as “King Andrew” in political cartoons of the day, Van Buren was the crown prince who had turned the White House into a palace featuring sumptuous state dinners the average American could only dream about.

Van Buren’s pretensions were dramatically elucidated by Congressman Charles Ogle, whose congressional speech of April 14, 1840 was titled “The Regal Splendor of the President’s Palace.” Van Buren, however, was a self-made man whose father had been a farmer and tavern keeper in Kinderhook, New York. His ancestors had arrived from Europe in the early seventeenth century as indentured servants.


Ignoring More Important Issues

Van Buren made his mark as a skillful orator and a superb organizer; there was a reason he was dubbed the “Little Magician.” His support of Andrew Jackson earned him the Vice Presidency in 1833 and Jackson’s nod in 1836, when Van Buren won the presidential election against a number of sectional candidates from the new Whig Party who were attempting to duplicate the 1824 presidential election by throwing the results into the House of Representatives.

In 1840, the chief issues were the national economy and Van Buren’s supposed indifference to the plight of unemployed and impoverished Americans. At the same time, the Whig Party championed the frontier roots of Harrison. Real issues such as growing abolitionist concerns, the annexation of Texas, and tariffs remained in the background. By 1840, more Americans than ever before had the franchise, and most of them voted on the basis of rallies, parades, and stories like Congressman Ogle’s displeasure with Van Buren’s request for $3,665 for White House expenses.

Van Buren lost the 1840 election, but Harrison died a month after his inauguration, allowing John Tyler to become the first “accidental” president. Ironically, Tyler disagreed with the Whigs, alienating his only support group. Van Buren did not go quietly, returning in 1844 in a vain attempt to secure the Democratic Party nomination.


References:


Paul F. Boller, Jr., Presidential Campaigns From George Washington to George W. Bush (Oxford University Press, 2004)

Webb Garrison, A Treasury of White House Tales (Rutledge Hill Press, 1989)

Michael F. Holt, The Rise and Fall of the American Whig Party: Jacksonian Politics and the Onset of the Civil War (Oxford University Press, 1999)

Page Smith, The Nation Comes of Age: A People’s History of the Ante-Bellum Years, Volume Four (McGraw-Hill Book Company, 1981)

First published in Suite101

Copyright Michael Streich. Written permission to republish required.

 Why Utopian Communities Ultimately Failed

Michael Streich

September 3, 2011

The attempt at establishing a closed community of like-minded believers dates back to the Puritan experience in New England. Over the course of American history, numerous groups, both large and small, have sought to create a Utopian community, separate from the “world’s people,” as the Shakers might say. Some of these societies endured for many years before succumbing to outside forces. Others, like the Mormons, evolved into major faith denominations. Those Utopian communities that survived did so only because they successfully adapted themselves to a life of co-existence while remaining true to the dynamics that created them. Others, however, were not as successful.


Utopian Aspirations in the Puritan Godly Communities

Puritans saw themselves as distinctly different from their neighbors. They were John Winthrop’s “City on a Hill,” taken from the Gospel of Matthew in the fifth chapter. Seeing themselves as God’s chosen and the “New Israel,” their theocratic faith community lived without compromise or tolerance of other beliefs. In 1628, they sent Thomas Morton back to England and closed his estate, Merry Mount, because of immoral activities taking place there, including sexual relations with Indians.

By the end of the century, however, non-Puritan influences surrounded the community, which had lost its initial zealousness. Internal dissent by Puritans like Roger Williams and Anne Hutchinson had fractured the solidarity of believers. In 1692, fearful that the devil was about to make war on their community, Puritans denounced and hanged their neighbors in the celebrated witch trials. The City on a Hill had been breached by evil from within.


The Shaker Communities Endure for Many Generations


Mother Ann Lee brought her fledgling Shaker community to America at the end of the 18th Century. Pursuing “authentic” Christianity, Shaker communities grew, drawing thousands to their simple lifestyle and communal living. Converts were drawn to the Shakers for many reasons, chief of which was a spiritual experience. Shakerism, however, also offered a step out of the world with its often chaotic economic and social fluctuations.

Shakers believed in celibacy, and the community expanded only because entire families converted; Shakers also accepted orphans. Shakers were not opposed to marriage; in fact, they blessed it. But the Shaker life was a special gift to fulfill a divine mission. Shakers declined during the latter years of the 19th Century as demand for their hand-made products decreased in the face of industrialization and new methods of mass production.


The Mormon Quest for a Utopian Community


From the first days of the Mormon experience in Palmyra, New York, members of Joseph Smith’s “Latter Day Saints” were persecuted. With every newly established community, the persecution heightened. Like the Shakers, Mormons were communal. Following the murder of Smith by intolerant neighbors, Brigham Young led the faithful to Utah, establishing the Deseret Kingdom.

After the Mexican-American War, the western territories came under the jurisdiction of the United States government, and the Mormon community was again threatened. The Latter Day Saints succeeded, however, because they adapted to outside influences without compromising their beliefs. Additionally, Mormon theology evolved with the times as the church repudiated plural marriage. Today, Mormons hold high political office and are viewed as a community with strong families and moral lifestyles.


Utopian Communities that Failed to Survive

Early 19th Century America saw many social experiments like the Oneida community, Robert Owen’s “secular socialism,” and Transcendentalist communities animated by the same spirit as Thoreau’s Walden Pond experience. Several of these communities were based on non-traditional religious and pseudo-religious beliefs. Because many of these smaller communities were driven by charismatic leaders, they eventually failed after the leaders died or were discredited.

Utopian communities formed, historically, during times of social upheaval. The Great Depression produced such communities, and during the turbulent Vietnam War period disenchanted young people joined communes like the Children of God, as well as more structured communities. Some followed Eastern religions like the Hare Krishnas, who managed several retreat communities in the U.S. Once the times changed, however, the appeal of Utopian communities waned. The desire to flee a complicated world may still be a prime motive for those seeking simplicity and security. If the precedents of history hold true, Utopianism may reappear in the future.


Sources:


Brian J. L. Berry, America’s Utopian Experiments: Communal Havens From Long-Wave Crises (University Press of New England, 1992)

Charles Nordhoff, American Utopias (Berkshire House, 1993)

Donald E. Pitzer, America’s Communal Utopias (University of North Carolina Press, 1997)

Copyright Michael Streich. Contact writer for republication rights.


Pacifists and Activists React to Media War Time Coverage

Mar 9, 2011 Michael Streich

The U.S. press has a long history of war reporting, often leading to the rise of pacifism and anti-war movements that have decades of precedent.

In the September 1966 issue of Esquire magazine, readers were able to catch a brief glimpse into the life of Tommy Rodd, an upper-middle class teenager with a bright future who refused to register for the military draft and fight in Vietnam. Rodd was no coward. Unlike the wars of the 21st Century, Vietnam, as seen by men like Rodd, was an everyday affair with television footage of jungle carnage and body bags shipped home to the quiet communities of white picket fences and steepled churches. Tommy Rodd was a pacifist who chose federal prison rather than a rifle. He was able to make this choice because the war filtered into every American living room.

Pacifism and the Openness of War in America

In 1916 Jeannette Rankin became the first woman elected to Congress. A Montana Republican, Rankin was one of the few representatives to vote against the war declaration taking the United States into the Great War and sending thousands of troops to the blood-soaked battlefields of Europe. She was not reelected.

Representative Rankin was not the only person opposed to the seemingly futile European conflict. Many Americans were appalled by the images of trench warfare and the staggering numbers of casualties. Rankin, however, opposed war on pacifist grounds.


Rankin ran again in 1940 and won. War once more lingered in the shadows, and despite the protestations of isolationists, other Americans, including President Franklin D. Roosevelt, appeared to be steering the nation toward participation in the conflict. From propaganda to the reality of despotism in Europe and Asia, the openness of war was in every newspaper and magazine. In London, Edward R. Murrow riveted the nation with his radio broadcasts of the Blitz.

Rankin Votes against War with Imperial Japan

When Speaker Sam Rayburn called for a roll call vote in the House on December 8, 1941, everyone held their breath as the roll was called, eventually coming to Representative Rankin. Rankin had attempted several times to gain the attention of the Speaker, only to be declared out of order. Her nay vote was the only vote cast against the declaration of war.

Korea and Vietnam Cast Doubt on the Righteousness of War

The Korean War was fought, ostensibly, to thwart Communist expansion, and Americans who opposed the conflict risked being labeled Communist sympathizers. Vietnam, however, was a different war. Some Americans began to question containment, and President Johnson’s escalation after 1964 attracted doubts and protests, notably among young Americans being drafted to fight.


The media abetted these protests by offering a forum to dissenting Americans as well as projecting the images of war on nightly news broadcasts. This increased during the presidency of Richard Nixon, who came to see the media as an enemy. For conscientious objectors, the images of war confirmed what many already knew: the government’s explanations were duplicitous.


Robert F. Drinan, S.J., who served in Congress for five terms and sat on the House Judiciary Committee during the Watergate investigation, wrote that “…the peace community would complain that they had been gagged if the government pressured them to cease their vigorous claims that the government is excessively belligerent and warlike.” But this was only possible as long as the nation’s media accurately reported the course of the war.

The Lessons of Pacifism, Activism, and Change through Media Openness

Sociologists refer to the success of the “Lilliput Strategy.” Though the term is most often used to describe anti-globalization coalitions, the same coalescence of often disparate groups both during and after Vietnam gave rise to entirely new movements. These movements believed that change was possible. For pacifists, it also meant greater accountability for the U.S. government and a rejection of long-held policies supporting despotic regimes.


At the same time, war became less visible. Congress replaced the military draft with an all-volunteer force, and the role of media coverage in conflict areas was curtailed. During President Reagan’s first administration, for example, the Grenada operation was carried out in secrecy, without media coverage.


Additionally, 21st Century wars are remote and media coverage is tightly controlled. The conflicts impose no sacrifice on Americans who, for the most part, cannot even identify Afghanistan or Yemen on a map. Unlike the extensive coverage of Vietnam when Tommy Rodd received his draft card, or of the wars Representative Rankin voted against earlier in the century, military footage today is carefully orchestrated, and any hint of a draft is speedily quashed in the Congress. No Americans want another Vietnam.

Lifting the Fog of War

From protest music to student activism, Vietnam was the last conflict that polarized a nation largely due to media coverage. In 2011, war coverage is carefully scripted despite on-going polls that demonstrate a willingness among many Americans to dramatically cut defense appropriations. As long as war is sanitized by the media, pacifism and activism will be checked and there will be no withdrawal of consent among Americans.

Sources:

  • Jeremy Brecher and others, “Globalization and Social Movements,” Globalization: The Transformation of Social Worlds (Wadsworth, 2012)
  • Robert F. Drinan, S.J., Can God & Caesar Coexist? (Yale University Press, 2004)
  • Mark Hamilton Lytle, America’s Uncivil Wars: The Sixties Era From Elvis To The Fall Of Richard Nixon (Oxford University Press, 2006)
  • Bernard Weinraub, “Four Ways to Go: Tommy Rodd Went to Jail,” Esquire, September 1966

Copyright Michael Streich. Contact the author to obtain permission for republication.



 Robert Owen's Utopian Communities in 19th Century America

Michael Streich

May 29, 2011

The family has always held prominence in American History, from colonial New England to the present. It represents the foundation of conservative values despite the fact that every new census reports fewer marriages and an ever increasing number of single-parent families. It thus came as no surprise that in the early 19th Century Robert Owen’s communitarian experiment was vigorously attacked and ultimately doomed to failure. Owen’s communities, much like other Utopian experiments of the time, rejected private property and established religion while embracing the communal raising of children and a socialist model of labor and wealth sharing.


Owen’s Clash with Established Values in American Society

Robert Owen was a Briton who had been successful in the early industrial climate of Britain, setting up model factory communities that paralleled his communitarian views. In the early 19th Century he brought his ideas to America, even addressing Congress and lecturing across the fledgling nation. In America, this was a time of great upheaval, culminating in the Panic of 1837.

Distressed Americans flocked to the various Utopian communities that emerged out of the tumultuous period, including the Shakers, Rappites, Mormons, and Transcendentalist experiments. Owen’s communities, however, never took root, perhaps because, as historian Page Smith writes, they were “intensely secular.” Owen’s rejection of the family as well as established religion was viewed as an attack on basic American values.


The Raising of Children in Utopian Communities


Karl Marx and Friedrich Engels were enamored with the communitarianism of Native American cultures, notably the Iroquois Confederacy, which appeared to validate a socialist model. According to anthropology professor Jack Weatherford, “The kinship states of the Indians became in Marxist thought exemplars of primitive communism.” Not only was all property held by the community, but it was the community that was responsible for the raising of children.

The rejection of the family unit can be traced back to ancient civilizations. The Greek philosopher Plato’s utopian Republic rejected the traditional family. In Sparta, boys were taken from their families at a young age to be trained as warriors, and the ancient practice of infanticide was believed to guarantee healthy and productive citizens. The late Yale University professor John Boswell wrote a detailed history of the abandonment of children from the ancient period into the Medieval era (The Kindness of Strangers, Pantheon). Children never occupied the position they do in modern society.


The American Family Rooted in New England Protestantism


Despite a revolution, independence, and the birth pangs of nationhood, 19th Century Americans still identified with the Puritan “New Israel.” Jerusalem had been reborn on the shores of a new continent with a mission and vision tied to John Winthrop’s City on a Hill. The cornerstone of that relationship was the family, supported by biblical injunctions.

Robert Owen argued against this entrenched social reality. Even the Mormons and Shakers understood it. Shakers practiced a vibrant form of Christianity, albeit outside the mainstream of what they called “the world’s people.” Mormons, despite their acceptance of plural marriage, supported the family structure.


Owen’s experiment, centered in New Harmony, failed for a variety of reasons. University of Texas professor Brian J. L. Berry argues that Owen spent little time at the various communities established according to his model but preferred to travel across the country and lecture. Additionally, many joining the community were “freeloaders” and intellectuals without farming experience. Finally, Owen’s attack on family and religion spurred a backlash of anger by Americans wedded to the Protestant ideal of faith and family.

Owen returned to Britain, but that ideal remained and persists today in Protestant ideology and conservative politics. The core of post-modern concerns regarding “same-sex marriage” is rooted in the historically American notion of family.


Sources:


Brian J. L. Berry, America’s Utopian Experiments: Communal Havens From Long-Wave Crises (Dartmouth College, University Press of New England, 1992)

Page Smith, The Nation Comes of Age: A People’s History Of The Ante-Bellum Years, Volume 4 (McGraw-Hill Book Company, 1981)

Jack Weatherford, Indian Givers: How The Indians Of The Americas Transformed The World (Fawcett Books, 1988)

Copyright owned by Michael Streich. Written permission needed for republishing.