
Opinion

The Lost Game: Gridiron Memories of November 22, 1963

I was playing quarterback in a high school intramural flag-football championship game around 1:30 p.m. on Nov. 22, 1963.

The game went into overtime as the class period ground into its last minutes. My team needed a score to even the alternating-possession OT (we were ahead of our time) and extend things to the following day. Impatient, I lofted one deep and up for grabs – like Brett Favre occasionally does – and some defender outjumped my guy for it.

BANG! We lost, no tomorrow.

Little did I know that the bang of defeat that had just gone off in my head was the mere echo of a much louder bang that went off almost simultaneously 1,330 miles to the southwest.

That other bang I had yet to hear was one of another kind of defeat that I will, it seems, carry with me to the grave.

Somewhat dejected, I headed back to the locker room to shower before heading to my fifth-period English class. Someone ran out of the locker room to meet us and said, “The president’s been shot!” Bullshit, that kind of thing doesn’t happen except in history books, I thought. “That’s not funny,” I said.

Inside the Alexandria, Virginia, high school, not eight miles from the White House, things seemed normal as I prepared to shower. No solemn-faced coaches, no lockdown to protect the sons of then-Republican Congressman Gerald Ford. “The president’s been shot” was lost beneath what seemed normal adolescent locker-room banter. I began to return to a 15-year-old’s reality: sports, the thought of the girls’ locker room on the other side of a thick cement wall.

Then the PA system crackled and the locker room went unnaturally silent as the principal’s voice, not a secretary’s, asked for attention. A chill went down my spine, perhaps as a subconscious premonition that things were about to change in previously unimaginable ways flashed along subatomic particles throughout my brain. The tone first, then the words “President Kennedy has been shot” gravely confirmed what I had immediately denied as a plausible reality. One kid, a little red around the edges for that suburban Alexandria high school, said something to the effect of “good.” Though we were casual friends and recent teammates, I started swinging, and we went into a pile on the floor only to be quickly pulled apart by classmates and coaches. I had never wanted to damage someone as irrevocably as I did at that moment, and the two of us never spoke again, leaving a silent distance between us that precluded the necessity of re-engaging that primal impulse toward some sort of irreversible destruction.

President Kennedy leaves the White House for the final time.

The emotions were immediate, deep and apparently ran in the family. I didn’t find out until years later that at almost the same moment, following a similar remark, my father, a WWII Army veteran who had lived through Normandy and the Battle of the Bulge, was decking a total stranger in a D.C. medical building on I Street where he was waiting for my mother to complete a routine checkup.

Across the Potomac River, we sat quietly in our classrooms: no teaching, no discussion, no emergency mentoring. We sat alone, grappling with our thoughts, as did our teacher. The principal came on again and said the president was dead. The reaction was subdued except for a girl named Jacqueline Kennedy – though I think she spelled her first name differently than the president’s wife did. Spelling aside, she went off, sobbing, hysteria rising. The teacher took her outside the room to settle her down. It didn’t work; she ended up in the infirmary. I sometimes wonder what happened to Jackie Kennedy, my classmate. How did she ride out that 15-year-old’s identification with the now blood-stained Queen of Camelot?

Forty-odd years later I know that day was the measurable beginning of the direction of the balance of my life. Despite the immediate profundity of a presidential assassination, I couldn’t have recognized that JFK’s violent death would lead directly through a five-year span of political upheaval between my formative 15th and 20th birthdays. This and three other domestic assassinations – of Malcolm X, Martin Luther King Jr. and Robert Francis Kennedy – seemed to earmark the time through a litany of foreign political intrigue, murder and assassination that always seemed to lead in one direction – to the right, toward war, toward implicit corporate profiteering from war, toward social division, toward lies.

That is my perception, my belief – the bad guys won. That is my psychological watershed. Rather than living under the auspices of a state favored by both man and God, I was floating through the most recent episode of civilization in decline, fueled by greed, power, murder and conquest.

It took all of those next five years for me to begin to appreciate what had begun during that lost football game. By 1969 it was becoming apparent that a hopeful, youth-driven world counterculture had been beaten, along with the best and the brightest within the world political system who reflected or inspired its social idealism: the Americans named above, Salvador Allende, Alexander Dubcek, Patrice Lumumba, Che Guevara and others.

Around the world we had lost.

We would either be annihilated or assimilated – a foolish, inaccurate footnote to American and World History X – the fiction written by the winners.

I left Alexandria in 1967 for college. I moved from the specter of the federal capital to Richmond, the historical capital of the American Confederacy that had fought the ascendance of that federal system just over a century earlier. In retrospect it seemed an unconsciously profound symbolic move. Though I was through and through a son of the federal government, raised in its shadow by two parents it employed, I was soon to become suspicious – some would say paranoid – about its machinations, its intent, its history.

I followed my intellectual instincts for the next five years, studying sociology and psychology – how society and the human mind work. I guess I wanted to know why I had grown so alienated from the culture in which I lived. Was I crazy, or did I live in an insane world? I learned things about myself and my society between 1967 and 1973, and most of what I learned took me back to the day my team lost that high school intramural football game.

In college I learned that three days before John Kennedy’s inauguration, his predecessor, Dwight David Eisenhower, made an astonishing observation in his farewell address to the nation. I had grown up thinking of Eisenhower as a doddering, old, golf-playing general rewarded with the presidency for a job well done holding the Allied war effort together in Europe during World War II. My interest in the fate of his successor led me to a different view of Eisenhower. It began with that farewell address of Jan. 17, 1961.

Presidents Kennedy and Eisenhower at Camp David, April 22, 1961, five days after the first crisis of JFK’s presidency, the failed CIA-sponsored Bay of Pigs invasion.

On that day Eisenhower, the West Point graduate, career military man, general and president who led his country and its allies, first against Nazi Germany and then through the height of the Cold War with the Soviet Union, told his nation that the greatest threat it faced as he prepared to leave office was that born of its own military and corporate institutions in a profoundly changing American landscape.

“This conjunction of an immense military establishment and a large arms industry is new in the American experience,” Eisenhower told the American people of the corporate, political and military landscape that had arisen in the wake of World War II. “The total influence – economic, political, even spiritual – is felt in every city, every Statehouse, every office of the Federal government. We recognize the imperative need for this development. Yet we must not fail to comprehend its grave implications. Our toil, resources and livelihood are all involved; so is the very structure of our society.

“In the councils of government, we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex. The potential for the disastrous rise of misplaced power exists and will persist,” Eisenhower concluded.

The career soldier turned politician had apparently not thought it a sin to normalize relations with the Soviet Union, then our recent military ally, and reduce the rapidly expanding American military budget. This belief, according to Eisenhower biographers, led to much behind-the-scenes infighting with the evolving military and industrial institutions Eisenhower spoke of at the end of his eight-year presidency.

Less than three years after Eisenhower’s dour warning, his successor had his head blown off in the streets of Dallas, Texas, while I played football half a continent away. That successor, John Fitzgerald Kennedy, had also bucked the American military-corporate apparatus during his presidency – and, perhaps fatally, more directly and in more immediate situations than Eisenhower had.

First, just three months into office, Kennedy refused to commit to direct American military involvement during the 1961 invasion of Communist Cuba by a CIA-trained militia despite the urgings of the Joint Chiefs of Staff and CIA Director Allen Dulles. As a result of the intelligence misinformation and personal coercion he endured during that experience, Kennedy fired Allen Dulles as director of the CIA. He also developed enough distrust of the U.S. military command to avoid the armed, likely nuclear confrontation they suggested over Cuba during the missile crisis less than two years later.

President Kennedy addresses the nation on live television during the Cuban Missile Crisis, when the U.S. and Soviet Union came to the verge of nuclear war, according to later declassified Soviet intelligence documents.

Kennedy is even reported to have stated the intention of scattering what was threatening to become a rogue intelligence agency resistant to presidential oversight “into a thousand pieces” following a 1964 re-election that seemed a sure thing.

A great deal of debate still exists over whether Kennedy was planning implementation of another post-1964 election plan that would have flown further in the face of what Eisenhower originally named the American “Military-Industrial-Congressional Complex.” That much-discussed plan was a lessening of direct American involvement in Vietnam. That involvement in the fall of 1963 stood at 16,000 “advisors,” compared to the half million combat troops that would be sent there after his death. If true, as key Kennedy insiders assert, the plan reflected JFK’s growing belief that the Vietnam conflict was ultimately a civil war that would have to be won or lost by the South Vietnamese themselves – a decade and the bulk of 58,000 American and two million Vietnamese lives later, that belief proved correct.

Many years after the fact I heard a European investigative report that quoted Kennedy archives indicating his ambassador to South Vietnam, Henry Cabot Lodge, reporting back to Washington that presidential directives relayed through the embassy in 1963 – ordering CIA operatives in country to back off of aggressive covert actions, including assassinations – were simply being ignored in the field. I wondered then whether those “rogue” CIA elements had an inkling those directives would soon change, despite Kennedy’s overwhelming popularity as the 1964 election approached.

No, probably just a coincidence that Kennedy was soon shot down like a rabid dog on a parade route whose path had been realigned that day to go down Elm Street in front of the Texas School Book Depository in a town whose mayor was reportedly the brother of Allen Dulles’s former military liaison officer.

Coincidence too, I expect, that a former Marine named Lee Harvey Oswald worked in that School Book Depository. Oswald was the prodigal American son, who had “defected” to the Soviet Union with a perfect command of the Russian language following his assignment to a top-secret American military intelligence base in Japan from which American U-2 spy flights were launched over the Soviet Union. Later, the prodigal son would reconsider that defection – perhaps because the Soviets thought he was an American spy and kept a close check on him. I sometimes wonder at the benevolence of a nation that would welcome back its prodigal son with camera equipment to start a “new” career, rather than a little prison time for his alleged departure with top-secret information that was claimed to have compromised America’s U-2 spy missions. But how could his native land stay mad at the whimsical Oswald, who, despite his highly public pro-Castro activities in New Orleans, cultivated associations with a number of right-wing, anti-Castro figures based in both New Orleans and Miami as he “floundered” philosophically in the years between Russia and Dallas?

Above, JFK and Ike at Camp David in the days after the failed Bay of Pigs invasion of Cuba – likely when the sitting president found out he had been lied to by the CIA about the previous president’s authorizations, or lack thereof, for a CIA-sponsored invasion of Cuba. Photographer Paul Vathis wrote ‘They looked so lonely.’ Below, Vathis’s photo juxtaposed on the wall of the Newseum in D.C. with mob-connected Jack Ruby’s silencing of Lee Harvey Oswald, who died claiming he was set up as a patsy in the JFK assassination. JFK photo sources, credits: Public Domain; White House Photographs; John F. Kennedy Presidential Library and Museum, Boston; Robert Knudson; Paul Vathis; Abraham Zapruder; “The Men Who Killed Kennedy” Nigel Turner-produced British TV documentary series; AP; Roger Bianchini at Newseum, Washington D.C.

Pondering these things after launching my own college term paper research on the JFK assassination in 1969, I told my mother, “There are circumstances leading a lot of people to think your old (CIA) bosses were behind it.”

“I wouldn’t be surprised, the way they talked about him,” she said, surprising me with a frank appraisal of her early-1960s superiors at the top of the American intelligence apparatus.

Now 43 years gone I am the paranoid-tinged conspiracy freak sitting alone in the dark corners of dark bars, reflecting on the familiarity of low times and low lies glowering at me from the “enduring freedom” of a television screen hovering slightly above my still-focused eye.

And 43 years gone from that long-lost childhood football game, I find myself choking down one final coincidence – that the U.S. president gesturing at me from that screen, explaining the necessity of this country’s ongoing military-industrial occupation of Iraq, one of the world’s two primary oil fields, and the ultimate evil of its oil-rich neighbor Iran, is heir to a family legacy whose roots run deep into Texas oil, American politics and the directorship of the CIA.


Roger Bianchini
Front Royal, Virginia

First published on November 22, 2013 as part of a pull-out section of the Warren County Report on the ongoing significance of the assassination of President John F. Kennedy 50 years later.

Opinion

Father, Son Give Insightful Review of Temple Grandin


I rarely get personal with my articles, but this week’s subject resonates with me. One of the hardest days of my life was when a doctor confirmed to my wife and me what we already suspected – that my son, my oldest child, had autism.

We knew that not all was lost. Compared to many, we are blessed in that he is high-functioning and incredibly smart. Yet it is still a blow to any parent to learn that their child is different and will face challenges many other kids will not.

That has been the case with our child. While he excelled in school, graduating as valedictorian and earning the state regent’s scholarship, he struggled making connections. He has always wanted friends but does not know how to talk to his fellow students. It’s difficult for people he meets to understand what is going on in that big brain of his. He struggles to look people in the eye, does not know what to say and can’t understand nonverbal cues. It’s easy to judge him as slow, until you talk to him about movies. Suddenly, he comes to life and knows more about movies than anyone I know – how they are made, who starred in and directed every film, and even the techniques the directors used. I have read some of his movie reviews; he sees movies differently than most and understands things that I did not even know I was supposed to understand.

My dream for my son is that the world understands that he is special, and that autism makes him different, not lesser. That is the exact message of HBO’s Temple Grandin. Grandin’s condition allows her to see things differently, which has allowed her to improve practices in her field. The biopic of her life and struggles has brought hope to thousands of parents like me who want what is best for their kids.

I teach at the University of Science and Arts of Oklahoma. I could write dozens of articles about why this school is special, but the one program of which I am most proud is The Neill-Wint Center for Neurodiversity. Started by Kathy Perry and sponsored by Phillip and Katie Wint, the center’s mission is to assist students with autism spectrum disorder in their transition to college life and to help ensure postsecondary success. Partly because of the center, on Feb. 23, as part of the Emerson-Weir Liberal Arts Series, Temple Grandin herself will be the keynote speaker.

My son, who is currently part of the Neill-Wint Center and lives on campus, comes home at least once a week and we watch a movie together. With his love of all movies and me being a historian, we tend to watch a classic. However, this past week he asked that we watch Temple Grandin, starring Claire Danes, in preparation for her upcoming visit. I was so inspired by this movie that I knew I needed to include a review in my column.

Grandin, born in 1947, dealt with autism in a time when the condition was still relatively unknown. The movie starts with her at a boarding school where, because of a strong mother and sainted teachers who recognized her gifts, she was able to excel. It then follows her through college and graduate school. At each level there were many who stood in her way, believing she was not capable of learning. At each level, she proved them wrong. Not only did she learn, but she excelled, becoming a published author and an expert in animal behavior.

At the end of the movie, she is attending an autism conference where parents are shown trying to handle their autistic children in different ways. When Grandin speaks up and announces that she is autistic and has a Ph.D., suddenly every parent in the room wants to hear everything she has to say. Symbolically, every parent of an autistic child, including myself, was in that room. Seeing what she overcame shows each of us that our children can overcome as well. There are things I can say technically about the movie, but it only seems right that I turn that part over to my son.

“Temple Grandin is a fine film. It does a good job at explaining who Temple Grandin is, specifically when it comes to her work in agriculture that first brought her attention, as well as her life growing up with autism that most people know her for. Claire Danes is great as Grandin, disappearing into the role without her performance coming across as cartoony. Director Mick Jackson succeeds at getting across how Grandin thinks, thanks to the stylistic use of on-screen graphics as well as small cutaways to show how she interprets certain phrases literally. Temple Grandin might feel like a TV movie, and it probably won’t blow everyone’s minds away, but it’s still worth a watch.”

I am excited to see Dr. Grandin in person. Watching the movie has made her a personal hero. She broke down many barriers in higher education and paved the way for students like my son and others to chase their dreams. It is because of people like Dr. Grandin that USAO has already seen several students in the Neill-Wint Center walk across the stage at graduation.

That day at the doctor’s may have been one of my hardest days, but I also know that watching my son cross the stage in April when he graduates college will be one of my absolute best.

James Finck is a professor of history at the University of Science and Arts of Oklahoma. He may be reached at HistoricallySpeaking1776@gmail.com.


Opinion

Celebrating the Essential Role of School Libraries and Teacher-Librarians in Our Community

This past year, we’ve heard about libraries being battlegrounds. In Virginia alone, books have been removed from the shelves of public schools in Hanover County, Rockingham County, and Spotsylvania County, and our very own Samuels Public Library was in the national spotlight over the summer when its refusal to bend to book-banning efforts temporarily put its funding in jeopardy. The conversation about libraries has been serious, anxious, and urgent – all understandably so. But as we observe School Library Month, let’s pause to celebrate these libraries as safe spaces, vibrant hubs of knowledge and innovation, and champions of literacy, diversity, and inclusivity.

In our county’s school libraries, students from Pre-K through Grade 12 have opportunities to engage with literature, learn how to find and use information, experience the challenges and rewards of creating and sharing knowledge, and develop important digital and technological skills. Our county’s school librarians share the same passion for education as our wonderful, dedicated teachers – because they, too, are teachers, and our librarians play a significant role in their school communities by making connections with students and supporting their personal and academic growth. School librarians in Virginia are also uniquely qualified to promote student literacy because, in addition to their training in education, they are also educated in library science to prepare for the momentous task of providing students access to information by selecting materials using informed criteria.

School librarians tie shoelaces, wipe tears (and snot!), remember students’ interests and get excited about finding them “just the right book,” offer a respite from the chaos of the school day, and always challenge students to learn more, about themselves, about others, about the world they live in and their place in it. It’s joyful to be a school librarian, and as a current student tackling the training required to take on this responsibility, I’m thrilled to experience that joy for myself in the near future. But it’s also an unsettling time to work in this field, when school librarians across the country are losing their jobs, and many of those who are able and willing to stay in their positions have found their agency and roles severely limited due to suspicion and fear surrounding libraries, books, and information.

This April, instead of debating the merits of books and questioning the intelligence and motivations of educators who have dedicated their professional and often personal lives to librarianship, let’s commit ourselves to supporting and investing in Virginia’s school libraries. Let’s celebrate their vital role in shaping the next generation of lifelong learners. Let’s do everything we can to ensure that today’s and tomorrow’s students have access to the transformative power of literacy. Our school libraries have the potential to enhance the future of our community dramatically. Let’s not stand in their way.

Lydia Buhl
Linden, Va.

(Darden College of Education & Professional Studies, Old Dominion University

LIBS 676: Library Media Services and the Curriculum, Professor Cynthia Stogdill)


Disclaimer: The opinions expressed in the letters published on this page are solely those of the respective authors and do not necessarily reflect the views or opinions of the Royal Examiner’s editorial team, its affiliates, or advertisers. The Royal Examiner does not endorse or take responsibility for the accuracy, completeness, or validity of any statements made by the authors. The Royal Examiner has not independently verified the statements and claims presented in the letters. Readers are encouraged to exercise their judgment and critical thinking skills when evaluating the content. Any reliance on the information in the letters is at the reader’s own risk.

While the Royal Examiner makes every effort to publish diverse opinions, it does not guarantee the publication of all received letters. The Royal Examiner reserves the right to edit letters for clarity, length, and adherence to editorial guidelines. Moreover, the Royal Examiner does not assume any liability for any loss or damage incurred by readers due to the content of the letters or any subsequent actions based on these opinions.

In submitting a letter to the editor, authors grant the newspaper the right to publish, edit, reproduce, or distribute the content in print, online, or any other form.

We value our readers’ engagement and encourage open and constructive discussions on various topics. However, the Royal Examiner retains the right to reject any letter that contains offensive language, personal attacks, or violations of any legal regulations. Thank you for being a part of our vibrant community of readers and contributors, and we look forward to receiving your diverse perspectives on matters of interest and importance.


Opinion

Presidential Competence in an Age of Instantaneous Interaction and Decisiveness

A bombshell was dropped last month when Department of Justice Special Counsel Robert Hur released his findings on President Joe Biden’s handling of classified documents. The good news for Biden was that Hur did not plan to bring charges. However, the bad news was that even though Hur concluded Biden was actually guilty, no jury would convict him because Biden is too old and has “limited precision and recall.” Hur’s ultimate conclusion was that Biden is “a sympathetic, well-meaning elderly man with a poor memory.” The statements – though probably politically motivated – are damning to the president because to many they only confirm what they already suspect: Biden is too old and will never make four more years.

Age has always been a concern in presidential elections. It was one of the biggest issues facing Ronald Reagan when he ran for president at age 73. Yet that now seems young compared to the two presumptive candidates, Trump at age 77 and Biden at 81.

With Biden, between the reports and what seem like mental slips in the last few years, voters have wondered what happens if the president becomes mentally unable to fulfill his duties. Historically speaking, it would not be the first time a president was mentally incapacitated – only the last time it happened, the public never knew.

In 1912 the very progressive governor of New Jersey, Woodrow Wilson, was elected President of the United States as a Democrat, only the second since James Buchanan’s 1856 election. As a progressive, he shaped the direction of the nation, including instituting the income tax, the direct election of senators and women’s suffrage. Arguably, Wilson’s two biggest events in office were, personally, his marriage to his second wife, Edith Galt Wilson, in 1915 and, internationally, the beginning of WWI in 1914. Wilson used American neutrality in the war as his campaign slogan, “Vote for Wilson! He kept us out of war,” when he ran for reelection in 1916. Yet it was only about a year later that America sided with the Allies and started shipping soldiers off to France.

Historians debate whether Wilson intended all along to enter the war. Those who believe he always planned on fighting do not believe his progressive nature would allow him to stay out of a fight that had such important outcomes. The war changed the map of Europe and toppled four major empires. Wilson knew the only way he would have a seat at the table after the war was as a fighting participant. Wilson, who held a Ph.D. in history and government from Johns Hopkins University and had served as president of Princeton University, was so confident he could solve all the world’s issues that he showed up at the negotiating table with his Fourteen Points and expected to dominate the meeting. While Wilson did not get all his points into the Treaty of Versailles, he did get his most desired point: establishing the League of Nations, an international body that could solve future problems before they escalated into war.

The problem for Wilson was the Republican Congress back home. Knowing the treaty needed Senate ratification, Wilson should have consulted with key Republicans on the treaty, but he was not that kind of president. When he presented the treaty to the Senate it was rejected, largely over the League of Nations.

However, instead of compromising with Republican senators, Wilson took his cause to the people. He believed that the people would rally behind him and force the Senate to accept the League. For months Wilson rode a train around the nation giving whistlestop speeches to any crowd that would listen. However, in October, overworked and physically exhausted, the president suffered a stroke that left him paralyzed and mentally impaired.

Instead of reporting the stroke, Edith and a small group decided to cover it up and tell the American people he was suffering from exhaustion. While the government continued to function normally, Edith began making the executive decisions, including meeting with cabinet members and foreign dignitaries. When Republicans demanded an audience, Edith pulled a scene straight out of Weekend at Bernie’s: she dressed Wilson up and put his bed in the shadows with his paralyzed side to the wall. Wilson was able to pull it off with enough ability to carry a very short conversation to appease his detractors. Edith pulled off the act for over a year, knowing that, if discovered, the League of Nations would be doomed. In the end, it did not matter, as the Republicans voted down American membership in the League, killing it before it even got started. As for Wilson, Edith was able to keep his secret until President Warren G. Harding was inaugurated.

I’m not sure if Jill Biden would be up to the task like Edith Wilson, but fortunately, if something were to happen to the president, there are now laws in place. After the death of President John F. Kennedy, Congress pushed through the 25th Amendment, which set up the line of succession to the presidency. Section Four states, “Whenever the Vice President and a majority of either the principal officers of the executive departments or of such other body as Congress may by law provide, transmit to the President pro tempore of the Senate and the Speaker of the House of Representatives their written declaration that the President is unable to discharge the powers and duties of his office, the Vice President shall immediately assume the powers and duties of the office as Acting President.”

This section allows for the president to be replaced if he is no longer able to perform his executive duties. While so far the amendment has only been invoked temporarily, mostly for colonoscopies, there are many who believe it might be enacted for the first time no matter which elderly statesman holds the office.

James Finck is a professor of history at the University of Science and Arts of Oklahoma. He may be reached at HistoricallySpeaking1776@gmail.com.


Opinion

History of American Political Parties, Part X: The Gilded Age

For many, the Gilded Age (1877-1900) is the most boring part of political history. All the presidents were bearded white men from New York or Ohio who are hard to distinguish. In fact, it was just as hard to differentiate between Republicans and Democrats, since neither really did very much.

Elections were always exciting: the contests were very close, and because there was still no secret ballot, both parties did everything they could to influence voters, including massive picnics with great quantities of alcohol. Elections were such a celebration that voter turnout was around 80%. Once the secret ballot was instituted and politicians could no longer control the votes, voting rates dropped into the 40s in the next century.

It’s surprising how high voter turnout was in the Gilded Age considering the lack of differences between the two parties. Republicans remained the classic conservatives of positive government, which simply meant they did a little more. If we look at the government’s role based on the Preamble to the Constitution, then it has three jobs: ensure domestic Tranquility, provide for the common defense and promote the general Welfare. For domestic tranquility, the main thing the government did was use the army to break up strikes – really nothing else. For common defense, it did maintain a small army. But up to that point, America had never believed a democracy should maintain a standing army; that would only allow for tyranny. When an army was required, the people would fill its ranks. (Think Second Amendment.) America would not maintain a standing army until the Cold War (1947-1991). As for general welfare, the government promoted economic growth through tariffs, but that was very controversial. The government also sold cheap land in the West to bring in some income. There was no concept of any type of safety net, but the largest government expenditure was Union soldier pensions after the Civil War.

As for the parties, Republicans looked closer to today’s liberals than conservatives. They were the party of big government; their constituency was composed of businessmen because the party pushed for economic growth and protective tariffs to help American businesses. Black Americans, when they could vote, overwhelmingly voted Republican because it was the party of Lincoln and emancipation. Devout Protestants supported the party because it pushed for moral reforms like outlawing alcohol and gambling. Finally, Union soldiers voted Republican because of the pensions.

As for the Democrats, they were the reverse and more closely resembled today’s Republican Party. They believed the best form of government was one that governed the least. This philosophy drew support from white Southerners who wanted the least government interference possible. It also drew support from Northern immigrants in the cities. Most of these were Catholics who believed it was the Church’s job to regulate morality, not the government’s.

The biggest issue for Democrats was that they were the party of white supremacy. This was not something they shied away from; they openly supported segregation, Jim Crow laws and ending Reconstruction (the era between 1865 and 1877 when the government abolished slavery, reintegrated the once-seceded states and rebuilt the South after the Civil War).

Even though elections were always close, Republicans dominated during Reconstruction and the Gilded Age. After Republican Ulysses S. Grant completed his presidency, another Republican, Rutherford B. Hayes of Ohio, won in 1876. James Garfield, a Republican from Ohio, was elected in 1880. Garfield was assassinated and replaced by New York Republican Chester Arthur. A Democrat did win in 1884 with Grover Cleveland of New York, but Cleveland lost reelection in 1888 to Ohio-born Republican Benjamin Harrison. In 1892, Cleveland came back and won, making him the only U.S. president to serve two nonconsecutive terms. Finally, Ohio Republican William McKinley won the presidency in 1896.

That last election of the Gilded Age, held in 1896, is my third favorite election and a game-changer in American politics. In a Hail Mary attempt, Democrats completely changed their political ideology, which set them on the long road toward being the liberal party they are today.

In the future, this series will resume with the 20th century and how the modern-day Republicans and Democrats came into being.

James Finck is a professor of history at the University of Science and Arts of Oklahoma. He may be reached at HistoricallySpeaking1776@gmail.com.


Opinion

EDA Treasurer Urges Strategic Reflection on County Economic Choices

I serve as the treasurer of the Front Royal-Warren County EDA. I’ve received a number of calls and questions about our recent EDA discussion of the house at 158 Faith Way, where Jennifer McDonald and her husband Sammy North have lived. This property was previously forfeited by Ms. McDonald and Mr. North, but they have filed suit to prevent eviction and then followed up with a cash offer to settle that suit.

While I have no particular dog in this fight – I voted against any settlement with Ms. McDonald or Mr. North, and will continue to do so – I believe it is important to clarify some financial information and to correct some clear mistakes in the public record. In financial decisions details matter, and here the case is detailed, complex, and nuanced. Our elected public officials don’t seem to “do nuance” very well.

The settlement proposed with Ms. McDonald et al. was, in fact, a good financial outcome for the County. Mr. North has sued the EDA in order to retain the property. That litigation will cost a minimum of ten to twenty thousand dollars, and possibly much more. The proposed action was not a sale; it was an effort to settle yet another lawsuit and receive cash in the process.

Failing to settle the lawsuit over 158 Faith Way will cost the county more than $175,000, according to our best estimates. This is because of two details: the property has attached liens, which must be deducted from a sale; and any proceeds from a sale must be split with First Bank and Trust under a separate legal agreement with the bank. And then there are those out-of-pocket legal costs. The EDA’s initial vote to settle the lawsuit, rather than pursue additional (and seemingly endless) expensive litigation, was made in good faith and for the single purpose of getting more money for the taxpayers of Warren County. (And yes, I am defending the decision, even though I do not ultimately agree with it.)

The Board of Supervisors has clearly indicated to the EDA board that no more funding will be available for litigation. Similarly, we have been assured that no funding will be available for hiring a private investigator or accountants to pursue other assets that may be recoverable from the guilty parties. It is important to note here that in every legal proceeding the EDA has prevailed, and has recovered significant monies. But recoveries will never be enough to replace the stolen funds and cover the legal expenses.

Even after the criminal sentencing of McDonald and the conclusion of all related litigation, a larger issue will still overshadow the case, possibly for years to come.

More than $20 million was stolen from the County. Another $9 million (and change) was spent on legal fees to both find and recover the losses. And yet, in recent years, the Warren County Board of Supervisors has reacted by actually cutting the county’s budget. In fear of being accused of “raising taxes,” the political leadership of our community has looked the other way.

If thieves stole your entire month’s pay from your personal bank account, would you just stop paying your bills for the month? That’s the best analogy I can think of here.

A $29-million hole has been blown through our public budgets, and has not been replaced. Additional recoveries from the guilty are possible – but the decision-makers stopped paying for forensic accounting, and have said they will stop paying for litigation. So more recoveries are unlikely. Even more damning, it’s highly likely that several former public officials were either involved in McDonald’s schemes or knew of those who were. However, federal authorities have stated that they will not pursue any further prosecutions.

It is an unfortunate reality, but taxes must be raised, and I applaud the supervisors for moving in that direction. Past cuts to taxes during a period of nearly 7% annual inflation were a mistake, even before accounting for our extraordinary losses due to theft. Our community needs to learn to grapple with the ongoing, complex realities of the McDonald case.  Elected leadership must take on the challenging task of communicating hard and necessary – and unpopular – decisions to the public. The elected leadership of Warren County has struggled to understand and listen to sound advice, and take actions accordingly. The Board of Supervisors has a history of decisions that have not been in the long-term best interests of the citizens of the community, nor the economic health and vitality of the county. The blind obedience to the mantra of “no increased taxes” harms the long-term growth, vitality, and stability of our county in terms of the quality of our public schools, teacher retention, emergency services, social services from the youngest to the oldest of our citizens, and the work itself of economic development.

I want to repeat my point that financial matters are complex, and must be given better consideration than a quick public statement or a posting on Facebook. In the future, please dig deeper. I trust that each supervisor now understands, and will publicly admit, that they are in favor of spending that added $175,000 to deny any settlement with McDonald or North. I know I am, and I will continue to trade off this relatively small amount of cash for what I consider to be justice for our community.

In my view, there are things much more valuable than money. I personally opposed any settlement with McDonald, and I voted that way. Justice, integrity, and retribution are all more valuable than settling for what amounts to less than 3% of the total that McDonald owes the people of Warren County. Hopefully her upcoming criminal sentencing will begin to claw back some of that integrity. And on the financial side, our public officials will need to work to do the same.

Jim Wolfe
Front Royal, Va.
Treasurer, Warren County Front Royal EDA
Associate Professor of Management, George Mason University




Opinion

Deliberating the Constitutionality of Presidential Ballot Restrictions

An amendment meant to keep ex-Confederates from holding office after the Civil War is once again coming into play as Colorado and Maine announced that former President Donald Trump will not appear on the ballot of their upcoming presidential primaries.

Section 3 of the 14th Amendment to the U.S. Constitution states, “No person shall be a Senator or Representative in Congress, or elector of President and Vice President, or hold any office, civil or military, under the United States, or under any state, who, having previously taken an oath, as a member of Congress, or as an officer of the United States, or as a member of any state legislature, or as an executive or judicial officer of any state, to support the Constitution of the United States, shall have engaged in insurrection or rebellion against the same, or given aid or comfort to the enemies thereof. But Congress may, by a vote of two-thirds of each House, remove such disability.”

Setting aside any feelings about Trump and looking at this purely legally, I personally do not understand how someone not convicted of insurrection can be disqualified in the name of democracy. However, I will leave that argument to legal scholars and instead try to correct a popular online claim that Democrats have done this before, with Abraham Lincoln in 1860. While it is true Lincoln was not on any ballots in the South, it was not because of Democrats’ interference but because there was no official ballot in 1860.

During most of the 19th century, the government did not print official ballots. In fact, during the first several elections, nothing was written down at all. A voter came to the courthouse on voting day, swore on the Bible he was who he claimed to be (the first form of voter ID), and then announced his vote to the clerk, who recorded it. Eventually voting turned to paper, mostly scraps or ballots printed in newspapers, but the voting was still public knowledge. The public ballot allowed political machines like Tammany Hall to form and control votes, especially from new immigrants. Eventually, parties began printing their own ballots, already filled out, and passing them to their supporters. For most immigrants, a party representative met them on the docks and let them know that a job and lodging were waiting for them; all they had to do was vote for the party’s man. Parties even color-coded their ballots to guarantee compliance at the open polls.

It was during this time that Lincoln ran for president in 1860. Since the government did not produce a ballot, there is no way it could have excluded Lincoln in the South. The exclusion actually came from Lincoln’s own party. Because the Republican Party had no foothold in the South, there were no Republicans to create or distribute a ballot.

The only slave states where Lincoln received any popular votes were along the border, where the party had some support: Delaware at 23%, Kentucky at 0.9%, Maryland at 2.4% and Missouri at 10.3%. As a side note, one major reason the Deep South seceded quickly after Lincoln’s victory was that he could now begin to give out government jobs. The fear was that Southerners might become Republicans simply for the lucrative positions, and by the next election Republicans would have printed ballots.

After the Civil War, political machines continued to pressure voters with public ballots, leading to calls for reform. Finally, in the 1880s, states began adopting the Australian system, in which the government printed the ballots and voters cast them in secret, in an attempt to stop the corruption. It was about this time that voter turnout dropped from around 80% to 40%. With the open ballot, parties had done what they could to get voters to the polls, knowing they could control them. Once they lost that control, parties no longer made sure everyone showed up.

If looking for a better example to fit the current situation, look no further than Eugene Debs in the 1920 election. Debs had run for president four times as a Socialist Party candidate. This fifth time was different, as he was serving a ten-year stint in prison for violating the Sedition Act. In 1918, during World War I, President Woodrow Wilson pushed through the Sedition Act, making it illegal to criticize the government or the war. That same year Debs gave a speech criticizing both, which landed him in jail. His sentence only grew his support, and in 1920 the Socialist Party nominated “Convict 9653” for president. Even while serving time for attacking the nation, Debs was allowed on the ballot. Probably the biggest difference is that Debs polled only 3.4% of the popular vote, whereas if Trump is allowed to run, he might possibly win.

We are walking in uncharted territory with Trump’s primary ban. While the 14th Amendment does not require a criminal conviction, this could set a dangerous precedent. Even when Debs was convicted, he was allowed to run. Fortunately, the Supreme Court decided to take up the case.

James Finck is a professor of history at the University of Science and Arts of Oklahoma. He may be reached at HistoricallySpeaking1776@gmail.com.

Editor’s Note: The Supreme Court’s succinct ruling made it clear: under the Constitution’s “insurrectionist ban,” states lack the authority to disqualify a federal candidate from the ballot. The Court emphasized that this responsibility lies with Congress, not the states. Consequently, the implications of this decision extend well beyond the initial dispute in Colorado, indicating its broad impact.
