
Opinion

The Cross: Gift from the Savior


But how? Understand that the Savior of mankind had to be more than just a man; otherwise, man would be saving himself, and that is impossible. Clearly then, the Savior had to be God! For again, only God can forgive sin and save sinners. Jesus spoke to the absolute necessity of seeing and believing this truth when He said, “…if ye believe not that I am He (God), ye shall die in your sins.” (Jn. 8:24), meaning to be cast from God’s presence and light into eternal darkness and damnation. Jesus also said, “…I am come that they might have life…” (eternal life) (Jn. 10:10), which He gives by dying in our place on the Cross! Jesus was saying to the Jews that in spite of their Covenant history, they did not have life and were dead (lost) UNTIL HE CAME! Note that this lost, hopeless state applies to Gentiles also; indeed, to all people of the world.

To help believe that Jesus was the God-Man on the Cross, consider the definition of Incarnate: embodied in flesh, esp. in human form. Also consider Incarnation: 1. The assumption by Jesus Christ of bodily form. 2. The bodily form assumed by a deity.

Jesus Christ, as a man, did not exist before being born of Mary. In this regard, God said, “…thou art my Son, this day have I begotten thee.” (Heb. 1:5) Please note there is no separation between Jesus as man and Jesus as God! So as the Incarnation reveals, the Creator Father God dwelt with His creation as the begotten God-Man (flesh), Jesus Christ, and thereby entered the bloodline of mankind. In so doing, He shed His own blood as a sacrifice for mankind’s sin!

ENTER: God’s Lamb, The Lamb Of God!

Cries for a Savior were answered at the Cross. The King of the Jews, the King of Israel; yea, the Eternal Father God who said, “…I am the first and the last, and beside me there is no God.” (Isa. 44:6), was buying back what was lost to Him to establish a holy nation and an everlasting kingdom! Being fully (100%) man and fully (100%) God, Jesus as a man died, but Jesus as God did not. So the 100% God (Jesus) raised up the 100% man (Jesus) from the dead with a new and glorified body!

Jesus, speaking in the first person singular as a man and as God said: “…destroy this temple (kill me) and in three days I will raise it up.” (Jn. 2:19). By saying “I will”, Jesus is saying He is God, for it shows Him present (alive) in the past, present and future at once (the same time), having no constraints as to time or space. Existing not only from the “beginning”, but from everlasting! So, He is alive as God while dead as a man, and is speaking from all time frames because His is an uninterrupted and never-ending life!

So, because of His glorious resurrection, I can shout with the song writer who said, “He’s alive… He’s alive and I’m forgiven. Heaven’s gates are open wide!” Eternal thanksgiving to my Lord and praises to the Darling of Heaven; and oh that I could bow before Him and kiss His lovely feet!

His message? I love you. My life, death and resurrection in your place for your life. So respond with the hymn writer who said, “I am coming Lord to thee, dear Lamb of Calvary. Humbly at thy cross I bow. Save me Jesus, save me now.”


Reverend Jess Shifflett
Front Royal, Virginia


“The First Muslim” by Lesley Hazleton


With tensions in the Middle East elevated due to the conflict in Gaza, and now American soldiers being killed by Iranian-backed terrorists, I believe it is wise to learn as much as possible about the Middle East in order to make the most informed decisions going forward.

A key to understanding the Middle East is to understand Islam. While there are many similarities between Christianity and Islam, the differences make the two religions seem worlds apart. Just as it is impossible to understand Christianity without knowing Jesus of Nazareth, it is impossible to understand Islam without knowing Muhammad. While there are dozens of portrayals of Jesus, most recently in “The Chosen,” there are no such programs about Muhammad, leaving most Westerners with little understanding of exactly who Muhammad was. To fill that void, Middle East correspondent Lesley Hazleton wrote “The First Muslim: The Story of Muhammad” to show that Muhammad was a complex leader who led a difficult life.

Like Jesus, Muhammad drastically changed the world around him, creating a religion that filled the world and shapes the lives of many today. Yet Muhammad was also a man with human flaws, flaws that changed his message from one of peace to one of conflict.

While Hazleton breaks her book into three parts, I found Muhammad’s life divided in half: the years before Medina and the years after. There was a fundamental change in Medina as Muhammad went from being an outsider in Mecca, preaching a new religion, to the undisputed religious and political leader in Medina. It is similar to Christianity, but in reverse order: whereas the Old Testament was much more militant while the New Testament spoke of peace, in Islam the revelations that make up the Quran move in the opposite direction.

Muhammad had little chance for success. Born after his father’s death and losing his mother when he was six, he grew up an orphan in a society that cared little for orphans. Yet, without any power or prestige, he earned a reputation for integrity and honesty as he worked his way up to positions of importance with his uncle’s trading caravans. However, just as he felt he had earned respect, he was reminded of his place when his uncle refused to let him marry his daughter because of his low status. Hazleton writes, “To a boy imbued with the rough egalitarianism of Bedouin life, all this could only have come as a shock. His own people had co-opted faith, piously declared it even as they contravened its most basic principles. From his perch on the sidelines, he saw the social injustices of what was happening all too clearly.”

One eventual positive in his life was Khadija, a wealthy woman 25 years his senior, who hired him to help with her caravans until they fell in love and she proposed marriage. While their years together were difficult, they loved and supported each other for 25 years.

When Muhammad first told her he had heard the voice of the angel Gabriel and feared he was either possessed or going mad, Khadija told him to listen and accept the messages. She stood by him as he began to preach the message of Islam and felt the hatred of the Meccan elite. Muhammad loved Khadija and refused any other marriage until after her death.

According to Hazleton, during these early years the verses of the Quran were “an impassioned protest against corruption and social inequity. They took the side of the poor and the marginalized, calling for advantaging the disadvantaged. They demanded a halt to the worship of the false gods of profit and power along with those of the totem stones. They condemned the concept of sons as wealth and the consequent practice of female infanticide. And above all, they indicted the arrogance of the wealthy—‘those who amass and hoard wealth,’ who ‘love wealth with an ardent passion,’ who ‘are violent in their love of wealth.’” Yet turning away from the gods whom the pilgrims of Arabia came to visit could greatly hurt the wealth of Mecca, so much so that when Muhammad’s uncle and protector died, other Meccan leaders teamed up to kill the threat to their livelihood. Fortunately for Muhammad, he was warned in a dream, and he and his followers made the trip to Medina, where he had been invited as a judge but eventually became the city’s leader and prophet.

In Medina, now as the leader, the revelations were less about a peaceful society and more about governance and control. Revelations now said, “Permission is granted to those who fight because they have been wronged…those who have been driven out of their houses without right only because they said our god is God.”

Consolidating his power, Muhammad eventually either exiled or killed the three Jewish tribes at Medina when they did not join Islam. He then turned his attention toward Mecca, and his forces began attacking its caravans. Eventually the two cities were in all-out war. Muhammad was able to take over Mecca, making himself the undisputed political and religious leader.

Given Muhammad’s rise from lowly outsider managing his uncle’s caravans to undisputed religious and political leader, one would expect changes in the Quranic verses. It may have been the burden of leadership or the loss of Khadija, but Muhammad changed in Medina, and the change was reflected in the Quran. The verses transformed from calls for societal action into a guide for enforcing God’s law.

Hazleton writes in such a way that the book reads more like a novel than nonfiction. Her storytelling brings Muhammad to life for Western readers. She presents his life as a timeless underdog’s journey, a boy who grows up to found a major world religion. She is respectful in her narration while showing that Muhammad is not just the prophet of Islam but also a man with both admirable and flawed characteristics. Hazleton also does an excellent job explaining the origins of Islam while challenging some false perceptions, such as restrictions on the role of women that arose after the prophet’s death. For anyone concerned with the region, it is necessary to understand Islam, and Hazleton’s “The First Muslim” is a perfect place to start for readers at any level.

James Finck is a professor of history at the University of Science and Arts of Oklahoma. He may be reached at HistoricallySpeaking1776@gmail.com.



Are Super Bowl Commercials Shaping Our Country? If So, At What Cost?


In 1984, Democrats needed a candidate to challenge the very popular Republican incumbent President Ronald Reagan. Democratic frontrunner Walter Mondale, a former Minnesota attorney general and U.S. senator who had also served as Jimmy Carter’s vice president, seemed as though he would be a shoo-in for the nomination; this was confirmed by his dominating win in the Iowa Caucus.

Yet, at the caucus, a lesser-known U.S. senator from Colorado fared much better than expected. Still, nobody paid much attention to Gary Hart. That is, until he shocked the political world by winning the New Hampshire primary. Though only 10 years younger than Mondale, Hart seemed to belong to a different generation. His youth and charisma reminded voters of Kennedy, something he did not try to discourage.

Hart then continued his run, winning both Vermont and Wyoming. He insisted he was a new type of Democrat with new ideas. He wanted to cut taxes while at the same time increasing welfare and healthcare for all. What halted Hart’s momentum came during the March 11 televised debate: after Hart laid out his plan, Mondale turned to the camera and said, “Where’s the beef?”

Mondale’s question implied Hart was all talk and no substance. While the slogan itself did not solely defeat Hart, it did stick to his campaign and made more people question his policies.

The line resonated with voters because it had recently aired as a slogan for Wendy’s Restaurants during the previous Super Bowl. The commercial starred three elderly ladies eating at Home of the Big Bun. When one lady opened her burger to find an exaggerated tiny hamburger patty, she exclaimed, “Where’s the beef?”

It’s no surprise that Super Bowl commercials made their way into political campaigns. Super Bowls in this country are a big deal. In fact, of the top 30 most watched television programs of all time, 23 are Super Bowls.

Part of the reason for the large audience is that many Americans love football. But that alone does not account for such high ratings. Another major factor is the commercials. Even viewers who are not sports fans tune in to this one game because of the heartwarming, comedic or sometimes outlandish advertisements.

During the regular season, only 35% of women watch football. Super Bowl ratings show that number jumps to 75%. That is a huge leap, and it’s mostly because of the commercials. Even among men, the next morning around the water cooler, it seems more people are debating the best commercials than recapping the plays on the field. Commercials have taken the Super Bowl from a paramount sporting event to a cultural phenomenon.

“Where’s the beef?” is not the only phrase to make it into our everyday vocabulary. In 1993, after Larry Bird and Michael Jordan played the greatest game of HORSE ever for a Big Mac and fries, “nothing but net” became a catchphrase now heard at every basketball contest from schoolyard pickup games to the NBA. In 2010, the E*TRADE baby first used the line, “Well, that’s going to cost you a lot of money,” a phrase that has been used in boardrooms ever since. And of course, let’s not forget that in 2000 the greeting for any group of guys became “Whassup” after the Budweiser commercial.

While “Where’s the beef?” may be the most famous example of the relationship between the Super Bowl and politics, there have always been subtle political or cultural messages, especially recently.

Many ads have catered to the left’s message of diversity and inclusion. Coca-Cola’s 2014 “It’s Beautiful” commercial showed every shade of humanity while “America the Beautiful” played in multiple languages, and Airbnb’s 2017 “We Accept” commercial was much the same as Coke’s, but with pictures of diverse people and written text claiming they accept everyone. Days after President Donald Trump announced his travel ban, Budweiser showed the harrowing immigrant story of its German co-founder, Adolphus Busch. And in 2017, Audi aired an ad featuring a father concerned that his daughter would not be treated fairly in a soapbox derby race full of boys.

The right has also had its share of ads, like the Servant Foundation’s “He Gets Us” campaign with the message that Jesus loves them, and a string of patriotic ads like the NFL’s “Ragged Old Flag” in 2020 and Chrysler’s 2012 “It’s Halftime in America” commercial starring Clint Eastwood. But probably the most notable is from 2002, when Budweiser’s famous Clydesdales knelt to honor the victims of 9/11. With no dialogue, the reverent Budweiser commercial aired only once.

While the first Super Bowl was played in 1967, most consider the first famous Super Bowl commercial to have come a few years later, in 1973. The ad was for Noxzema shaving cream, and the company cracked at least one successful code by using celebrities to push its products. This particular ad had Farrah Fawcett of Charlie’s Angels smearing shaving cream all over the face of New York Jets quarterback Joe Namath with the tagline, “Let Noxzema cream your face.” The line might not get past censors today, but it opened the door to pairings of famous football players and models; think “Mean” Joe Greene or Cindy Crawford.

Finally, it should be noted that the 1973 Noxzema commercial cost a whopping $42,000 (roughly $288,000 today), while 30-second spots for this year’s game ran around $7 million. Yet, with more than 100 million consumers expected to tune in, for companies with the means it is worth it. If a Super Bowl commercial stands out, the advertiser’s brand becomes immortal.

 James Finck is a professor of history at the University of Science and Arts of Oklahoma. He may be reached at HistoricallySpeaking1776@gmail.com.



Father, Son Give Insightful Review of Temple Grandin


I rarely get personal with my articles, but this week’s subject resonates with me. One of the hardest days of my life was when a doctor confirmed to my wife and me what we already suspected – that my son, my oldest child, had autism.

We knew that not all was lost. Compared to many, we are blessed in that he is high-functioning and incredibly smart. Yet it is still a blow to any parent to know that their child is different and will face challenges many other kids do not.

That has been the case with our child. While he excelled in school, graduating as valedictorian and earning the state regents’ scholarship, he struggled making connections. He has always wanted friends but does not know how to talk to his fellow students. It’s difficult for people he meets to understand what is going on in that big brain of his. He struggles to look people in the eye, does not know what to say and can’t read nonverbal cues. It’s easy to judge him as slow, until you talk to him about movies. Suddenly, he comes to life and knows more about movies than anyone I know: how they are made, who starred in and directed every film, even the techniques the directors used. I have read some of his movie reviews; he sees movies differently than most and understands things I did not even know I was supposed to understand.

My dream for my son is that the world understands he is special, and that autism makes him different, not lesser. That is the exact message of HBO’s “Temple Grandin.” Grandin’s condition allows her to see things differently, which has enabled her to make real improvements in her field. The biopic of her life and struggles has brought hope to thousands of parents like me who want what is best for their kids.

I teach at the University of Science and Arts of Oklahoma. I could write dozens of articles about why this school is special, but the one program of which I am most proud is The Neill-Wint Center for Neurodiversity. Started by Kathy Perry and sponsored by Phillip and Katie Wint, the center’s mission is to assist students with autism spectrum disorder in their transition to college life and to help ensure postsecondary success. Partly because of the center, on Feb. 23, as part of the Emerson-Weir Liberal Arts Series, Temple Grandin herself will be the keynote speaker.

My son, who is currently part of the Neill-Wint Center and lives on campus, comes home at least once a week and we watch a movie together. With his love of all movies and me being a historian, we tend to watch a classic. However, this past week he asked that we watch “Temple Grandin,” starring Claire Danes, in preparation for her upcoming visit. I was so inspired by the movie that I knew I needed to include a review in my column.

Grandin, born in 1947, dealt with autism in a time when the condition was still relatively unknown. The movie starts with her at a boarding school where, thanks to a strong mother and sainted teachers who recognized her gifts, she was able to excel. It then follows her through college and graduate school. At each level there were many who stood in her way, believing she was not capable of learning. At each level, she proved them wrong. Not only did she learn, but she excelled, becoming a published author and an expert in animal behavior.

At the end of the movie, she attends an autism conference where parents are shown trying to handle their autistic children in different ways. When Grandin speaks up and announces that she is autistic and has a Ph.D., suddenly every parent in the room wants to hear everything she has to say. Symbolically, every parent of an autistic child, including myself, was in that room. Seeing what she overcame shows each of us that our children can overcome as well. There are things I could say technically about the movie, but it only seems right that I turn that part over to my son.

“Temple Grandin is a fine film. It does a good job at explaining who Temple Grandin is, specifically when it comes to her work in agriculture that first brought her attention, as well as her life growing up with autism that most people know her for. Claire Danes is great as Grandin, disappearing into the role without her performance coming across as cartoony. Director Mick Jackson succeeds at getting across how Grandin thinks, thanks to the stylistic use of on-screen graphics as well as small cutaways to show how she interprets certain phrases literally. Temple Grandin might feel like a TV movie, and it probably won’t blow everyone’s minds away, but it’s still worth a watch.”

I am excited to see Dr. Grandin in person. Watching the movie has made her a personal hero. She broke down many barriers in higher education and paved the way for students like my son and others to chase their dreams. It is because of people like Dr. Grandin that USAO has already seen several students in the Neill-Wint Center walk across the stage at graduation.

That day at the doctor’s may have been one of my hardest days, but I also know that watching my son cross the stage in April when he graduates from college will be one of my absolute best.

James Finck is a professor of history at the University of Science and Arts of Oklahoma. He may be reached at HistoricallySpeaking1776@gmail.com.



Celebrating the Essential Role of School Libraries and Teacher-Librarians in Our Community


This past year, we’ve heard about libraries being battlegrounds. In Virginia alone, books have been removed from the shelves of public schools in Hanover County, Rockingham County, and Spotsylvania County, and our very own Samuels Public Library was in the national spotlight over the summer when its refusal to bend to book-banning efforts temporarily put its funding in jeopardy. The conversation about libraries has been serious, anxious, and urgent – all understandably so. But as we observe School Library Month, let’s pause to celebrate these libraries as safe spaces, vibrant hubs of knowledge and innovation, and champions of literacy, diversity, and inclusivity.

In our county’s school libraries, students from Pre-K through Grade 12 have opportunities to engage with literature, learn how to find and use information, experience the challenges and rewards of creating and sharing knowledge, and develop important digital and technological skills. Our county’s school librarians share the same passion for education as our wonderful, dedicated teachers – because they, too, are teachers, and they play a significant role in their school communities by making connections with students and supporting their personal and academic growth. School librarians in Virginia are also uniquely qualified to promote student literacy because, in addition to their training in education, they are educated in library science, which prepares them for the momentous task of providing students access to information by selecting materials according to informed criteria.

School librarians tie shoelaces, wipe tears (and snot!), remember students’ interests and get excited about finding them “just the right book,” offer a respite from the chaos of the school day, and always challenge students to learn more: about themselves, about others, about the world they live in and their place in it. It’s joyful to be a school librarian, and as a current student tackling the training required to take on this responsibility, I’m thrilled to experience that joy for myself in the near future. But it’s also an unsettling time to work in this field, when school librarians across the country are losing their jobs, and many of those who are able and willing to stay in their positions have found their agency and roles severely limited due to suspicion and fear surrounding libraries, books, and information.

This April, instead of debating the merits of books and questioning the intelligence and motivations of educators who have dedicated their professional and often personal lives to librarianship, let’s commit ourselves to supporting and investing in Virginia’s school libraries. Let’s celebrate their vital role in shaping the next generation of lifelong learners. Let’s do everything we can to ensure that today’s and tomorrow’s students have access to the transformative power of literacy. Our school libraries have the potential to enhance the future of our community dramatically. Let’s not stand in their way.

Lydia Buhl
Linden, Va.

(Darden College of Education & Professional Studies, Old Dominion University

LIBS 676: Library Media Services and the Curriculum, Professor Cynthia Stogdill)


Disclaimer: The opinions expressed in the letters published on this page are solely those of the respective authors and do not necessarily reflect the views or opinions of the Royal Examiner’s editorial team, its affiliates, or advertisers. The Royal Examiner does not endorse or take responsibility for the accuracy, completeness, or validity of any statements made by the authors. The Royal Examiner has not independently verified the statements and claims presented in the letters. Readers are encouraged to exercise their judgment and critical thinking skills when evaluating the content. Any reliance on the information in the letters is at the reader’s own risk.

While the Royal Examiner makes every effort to publish diverse opinions, it does not guarantee the publication of all received letters. The Royal Examiner reserves the right to edit letters for clarity, length, and adherence to editorial guidelines. Moreover, the Royal Examiner does not assume any liability for any loss or damage incurred by readers due to the content of the letters or any subsequent actions based on these opinions.

In submitting a letter to the editor, authors grant the newspaper the right to publish, edit, reproduce, or distribute the content in print, online, or any other form.

We value our readers’ engagement and encourage open and constructive discussions on various topics. However, the Royal Examiner retains the right to reject any letter that contains offensive language, personal attacks, or violations of any legal regulations. Thank you for being a part of our vibrant community of readers and contributors, and we look forward to receiving your diverse perspectives on matters of interest and importance.



Presidential Competence in an Age of Instantaneous Interaction and Decisiveness


A bombshell was dropped last month when Department of Justice Special Counsel Robert Hur released his findings on President Joe Biden’s handling of classified documents. The good news for Biden was that Hur did not plan to bring charges. The bad news was that even though Hur concluded Biden was actually guilty, no jury would convict him because Biden is too old and has “limited precision and recall.” Hur’s ultimate conclusion was that Biden is “a sympathetic, well-meaning elderly man with a poor memory.” The statements, though probably politically motivated, are damning to the president because to many they only confirm what they already suspect: Biden is too old and will never make it through four more years.

Age has always been a concern in presidential elections. It was one of the biggest issues facing Ronald Reagan when he ran for president at age 73. Yet that now seems young compared to the two presumptive candidates, Trump at age 77 and Biden at 81.

With Biden, between the report and what seem like mental slips over the last few years, voters have wondered what happens if a president becomes mentally unable to fulfill his duties. Historically speaking, it would not be the first time a president was mentally incapacitated; the first time it happened, the public never knew.

In 1912, Woodrow Wilson, the very progressive ex-governor of New Jersey, was elected President of the United States as a Democrat, only the second elected since James Buchanan in 1856. As a progressive, he shaped the direction of the nation, instituting the income tax, the direct election of senators and women’s suffrage. Arguably the two biggest events of Wilson’s time in office were, personally, his 1915 marriage to his second wife, Edith Galt, and, internationally, the outbreak of WWI in 1914. Wilson used American neutrality in the war as his campaign slogan, “Vote for Wilson! He kept us out of war,” when he ran for reelection in 1916. Yet only about a year later, America sided with the Allies and started shipping soldiers off to France.

Historians debate whether Wilson intended all along to enter the war. Those who believe he always planned on fighting do not think his progressive nature would allow him to stay out of a fight with such important outcomes. The war changed the map of Europe and toppled four major empires. Wilson knew the only way he would have a seat at the table after the war was as a fighting participant. Wilson, who held a Ph.D. in history and government from Johns Hopkins University and had served as president of Princeton University, was so confident he could solve all the world’s issues that he showed up at the negotiating table with his Fourteen Points and expected to dominate the meeting. While Wilson did not get all his points into the Treaty of Versailles, he did get his most desired one: establishing the League of Nations, an international body that could solve future problems before they escalated into war.

The problem for Wilson was the Republican Congress back home. Knowing the treaty needed Senate ratification, Wilson should have consulted key Republicans on it, but he was not that kind of president. When he presented the treaty to the Senate, it was rejected, largely over the League of Nations.

However, instead of compromising with Republican senators, Wilson took his cause to the people. He believed that the people would rally to his cause and force the Senate to accept the League. For months Wilson rode a train around the nation giving whistlestop speeches to any crowd that would listen. However, in October, overworked and physically exhausted, the president suffered a stroke that left him paralyzed and mentally impaired.

Instead of reporting the stroke, Edith and a small group decided to cover it up and tell the American people he was suffering from exhaustion. While the government continued to function normally, Edith began making the executive decisions, including meeting with cabinet members and foreign dignitaries. When Republicans demanded an audience, Edith pulled a scene straight out of Weekend at Bernie’s: she dressed Wilson up and put his bed in the shadows with his paralyzed side to the wall. Wilson managed enough to carry a very short conversation and appease his detractors. Edith kept up the act for over a year, knowing that, if discovered, the League of Nations would be doomed. In the end it did not matter, as the Republicans voted down American membership in the League, killing it before it even got started. As for Wilson, Edith was able to keep his secret until President Warren G. Harding was inaugurated.

I’m not sure whether Jill Biden would be up to the task like Edith Wilson, but fortunately, if something were to happen to the president, there are now laws in place. After the death of President John F. Kennedy, Congress pushed through the 25th Amendment, which set up the line of succession to the presidency. Section Four states, “Whenever the Vice President and a majority of either the principal officers of the executive departments or of such other body as Congress may by law provide, transmit to the President pro tempore of the Senate and the Speaker of the House of Representatives their written declaration that the President is unable to discharge the powers and duties of his office, the Vice President shall immediately assume the powers and duties of the office as Acting President.”

This section allows for the president to be replaced if he is no longer able to perform his executive duties. While so far the amendment’s transfer-of-power provisions have only been invoked temporarily, mostly for colonoscopies, many believe Section Four might be enacted for the first time no matter which elderly statesman holds the office.

James Finck is a professor of history at the University of Science and Arts of Oklahoma. He may be reached at HistoricallySpeaking1776@gmail.com.



History of American Political Parties, Part X: The Gilded Age


For many, the Gilded Age (1877-1900) is the most boring part of political history. All the presidents were bearded white men from New York or Ohio who are hard to tell apart. In fact, it was just as hard to differentiate between Republicans and Democrats, since neither party really did very much.

Elections were always exciting because the contests were very close, and because there was still no secret ballot, both parties did everything they could to influence voters, including massive picnics with great quantities of alcohol. Elections were such a celebration that voter turnout was around 80%. Once the secret ballot was instituted and politicians could no longer control the votes, voting rates dropped into the 40s in the next century.

It’s surprising how large voter turnout was in the Gilded Age considering the lack of differences between the two parties. Republicans remained the classic conservatives of positive government, which simply meant they did a little more. If we look at the government’s role based on the Preamble to the Constitution, then it has three jobs: ensure domestic Tranquility, provide for the common defense and promote the general Welfare. For domestic tranquility, the main thing the government did was use the army to break up strikes, really nothing else. For common defense, it did maintain a small army. But up to that point, America had never believed a democracy should maintain a standing army; that would only allow for tyranny. When an army was required, the people should fill its ranks. (Think Second Amendment.) America would not maintain a permanent standing army until the Cold War (1947-1991). As for general welfare, the government promoted economic growth through tariffs, but that was very controversial. The government also sold cheap land in the West to bring in some income. There was no concept of any type of safety net; the largest government expenditure was Union soldier pensions after the Civil War.

As for parties, Republicans looked closer to today’s liberals than conservatives. They were the party of big government; their constituency was comprised of businessmen because the party pushed for economic growth and protective tariffs to help American businesses. Black Americans, when they could vote, overwhelmingly voted Republican because it was the party of Lincoln and emancipation. Strong Protestants supported the party because they pushed for moral reforms like outlawing alcohol and gambling. Finally, Union soldiers voted Republican because of the pensions.

As for Democrats, they were the reverse and more closely resembled today’s Republican Party. They believed the best form of government is one that governs the least. This philosophy drew support from white Southerners who wanted the least government interference possible. It also drew support from Northern immigrants in the cities. Most of these were Catholics who believed it was the Church’s job to regulate morality, not the government’s.

The biggest issue for Democrats was that they were the party of white supremacy. This was not something they shied away from; they openly supported segregation, Jim Crow laws and ending Reconstruction (the era between 1865 and 1877 when the government abolished slavery, reintegrated once-seceded states and rebuilt the South after the Civil War).

Even though elections were always close, Republicans dominated during Reconstruction and the Gilded Age. After Republican Ulysses S. Grant completed his presidency, another Republican, Rutherford B. Hayes, from Ohio, won in 1876. James Garfield, Republican from Ohio, was elected in 1880. Garfield was assassinated and was replaced by New York Republican Chester Arthur. A Democrat did win in 1884 with Grover Cleveland from New York, but Cleveland lost reelection in 1888 to Republican Benjamin Harrison from Ohio. In 1892, Cleveland came back and won, making him the only U.S. president to serve two nonconsecutive terms. Finally, Ohio Republican William McKinley won the presidency in 1896.

This last election of the Gilded Age, held in 1896, is my third favorite election and a game-changer in American politics. In a Hail Mary attempt, Democrats completely changed their political ideology, which set them down the long road toward being the liberal party they are today.

In the future, this series will resume with the 20th century and how the modern-day Republicans and Democrats came into being.

James Finck is a professor of history at the University of Science and Arts of Oklahoma. He may be reached at HistoricallySpeaking1776@gmail.com.
