
KING: The Day Peace Broke Out Between Kissinger and Schlesinger

Henry Kissinger has died at age 100. I remember him through his archrival, James Schlesinger.

April 24, 1980, was a bleak day for the United States. It was the day we lost helicopters and eight men in the desert during Operation Eagle Claw, the failed attempt to rescue the hostages held by Iran.

Two Washington titans were out of office, chafing at their distance from power, their inability to take action and the attendant sense of impotence. They also disliked — no, hated — each other.

These giants were Henry Kissinger and James Schlesinger. Kissinger had been a national security adviser and secretary of state. He shaped geopolitical thinking for the latter half of the 20th century. He informed foreign policy as no other has.

Schlesinger had been the chairman of the Atomic Energy Commission, director of the CIA, secretary of defense and the first secretary of energy.

I had started covering Schlesinger as a journalist when he was at the AEC in 1971, and we formed a friendship that would last until his death.

I created The Energy Daily in 1973 and later Defense Week, high-impact newsletters dominant in their fields at the time. I wanted to know what was going on with the failed rescue attempt. Although Defense Week was weekly, we frequently put out daily supplements. Along with The Energy Daily, these were hand-delivered in Washington. We got the news out fast.

I had helped Schlesinger create the Department of Energy, and The Energy Daily served as a sounding board and, at times, as the public voice of his frustration with the Carter administration, where Schlesinger, a Republican, didn’t always fit.

I called Schlesinger to get the story on that fateful day in the Iranian desert. He astounded me by telling me that he was in close contact with Kissinger. “Henry has better sources than I do on this,” he said.

I remember that sentence verbatim because it was extraordinary to hear Schlesinger refer to Kissinger by his first name. Except for that day, when it was “Henry” all day, I never heard him do it. Before and afterward, it was always just “Kissinger,” often preceded by a derogatory qualification.

“Henry may know.” “I’ll ask Henry.” “Let me see what Henry has heard.” Schlesinger had an open line to Kissinger, asking questions on my behalf all day.

I assumed that the rift between two of the most formidable figures in Washington was bridged. Some said this animosity went back to their time at Harvard.

Certainly, it reached its zenith during the Nixon administration when both men were high officeholders with considerable input into national policy.

In 1984, Kissinger published one of the volumes of his memoirs. I asked Schlesinger if he had read the book. (He seemed to read everything.) He responded with a string of invective against Kissinger. Obscenities often flowed from Schlesinger, but this was epic. So much for the first names and respect of that one day of entente.

When Kissinger famously told The Washington Post’s Sally Quinn at a party that he was a “secret swinger,” he wasn’t far off. Kissinger loved the social world and his place in it.

By contrast, Schlesinger entertained sparingly at his modest home in Arlington, Virginia. My wife, Linda Gasparello, and I were there frequently, and it was always takeout Chinese food and lots of Scotch.

In all the years I knew him, Schlesinger only came to my home once, although I must have gone to his scores of times — especially toward the end of his life when he liked to talk about the British Empire with me and European history with Linda.

That one visit to an apartment I had in the center of Washington wasn’t pure socializing either. The deputy editor of The Economist, the legendary Norman Macrae, was the guest of honor. Schlesinger, then secretary of energy, was keen to meet Macrae, so he and his wife, Rachel, came.

In government, Kissinger thought Schlesinger was too hardline, too reckless in his attitude toward the Soviet Union, Iran and, later, Saddam Hussein. Schlesinger thought Kissinger’s reputation was overblown and that he enjoyed the machinations of negotiation without regard to the end result.

I never formally met Kissinger. But at a dinner in Washington where Kissinger had spoken and was taking questions afterward, someone at my table asked me to ask his question on the grounds that asking questions was my job.

I thought it was a stupid question, but I asked it anyway. Kissinger glowered at me, so everyone could see who had asked the question, and declared, “That is a stupid question.”


CHERRY: Moral Equivalency and Moral Clarity

When teaching about the Israeli-Palestinian conflict, I leave accounts of suffering aside.  Such accounts, regardless of whose suffering is being described, are inherently partial.  Neither side has a monopoly on misery. In terms of pain and sorrow, there truly is a moral equivalency between Israelis and Palestinians.

There is no moral equivalency, however, between the two sides’ motivations for their actions.  Hamas rejects a two-state solution because it would recognize the existence of a sovereign Jewish state in what was once the abode of Islam, dar al-Islam.  The one-state solution of Hamas requires the elimination of the Jewish State of Israel.  Israel rightly refuses to be an accomplice to its own demise.

What shocks my students the most is to learn that there was never a State of Palestine.  The only independent states that ever existed on that land are the ones described in the Bible—the Kingdoms of Israel and Judah.  The names of Judah and Israel were expunged by the Romans after the Jews’ second revolt against their harsh colonial rule in the second century.  They renamed what became an administrative province Syria Palaestina to echo the Israelites’ ancient enemies, long since vanquished, the Philistines.  They also exiled the Jews who had survived the bloody suppression of those revolts.

Jews are not colonizers.  You can’t colonize your own homeland.  In the late-19th century, the age of nationalism, the Zionist movement asserted the Jewish historical claim as the indigenous people of Zion.  The Arabs who had subsequently made that area their home saw the influx of European Jews as alien colonizers.  Their perception was understandable, but wrong.

When Britain used its international mandate to unilaterally break apart Palestine in 1922, granting 80 percent of the whole to the Hashemite Kingdom to create Trans-Jordan (now called Jordan), the Zionists expected the remaining territory west of the Jordan River to become the national home of the Jewish people.  That’s what was promised to them by the Balfour Declaration of 1917 and internationally sanctioned by the League of Nations in 1922.  The Zionists had purchased the land, developed the land, and then fought for the land.  Throughout history, those are the only three methods of acquiring real estate.  The Jews did all three.  Repeatedly.

Although Israel was eventually offered significantly less by the United Nations partition plan of 1947, it accepted.  The Arabs, alas, rejected the UN’s offer to create a second Arab state in Palestine.  Since then, Israel has fought three wars for its very existence: 1948’s War of Independence, 1967’s Six-Day War, and 1973’s Yom Kippur War.  In the intervening years, Israel made significant progress in achieving acceptance among its neighbors, beginning with Egypt and Jordan and extending to the countries of the Abraham Accords.  As normalization with Saudi Arabia became an imminent possibility, Hamas lashed out.

When Egyptian President Anwar Sadat became the first Arab leader to accept Israel’s existence, Israel returned the land it had conquered during those defensive wars.  Egypt got the Sinai, but Sadat was assassinated by a member of the Egyptian Islamic Jihad.  In the eyes of these Islamists, recognition of and rapprochement with Israel is a capital crime.

Fundamentalist Islamist regimes and organizations, like Iran, Hezbollah, Islamic Jihad, ISIS, al-Qaeda, the Muslim Brotherhood, and Hamas, are engaged in a jihad that brooks no compromise.  Their problem is not with the Israeli occupation; it is with the existence of a sovereign Jewish state.  Thus, when Israeli military operations to incapacitate Hamas inevitably, and tragically, cause civilian deaths, there is no moral equivalency.  Hamas murders civilians to murder civilians; when Israel unintentionally kills civilians, it is to eliminate the threat of Hamas committing additional barbaric atrocities in the future.

For many of us, our moral reflex is to show mercy on the innocents caught in the crosshairs.  But a ceasefire before Hamas is extirpated would amount to nothing but a brief interlude allowing Hamas to rearm with even more deadly weapons.  In our imperfect world, moral clarity dictates the eradication of Hamas for the possibility of peace.


KING: Artificial Intelligence — the Greatest Disruptor Ever?

To rephrase Leon Trotsky: You may not be interested in artificial intelligence, but artificial intelligence is interested in you.

Suddenly, long rumored and long awaited, AI is upon the world—a world that isn’t ready for the massive and permanent disruption it threatens.

AI could be the greatest disruptor in history, surpassing the arrival of the printing press, the steam engine, and electricity. Those all led to good things.

At this time, the long-term effects of AI are just speculative, but they could be terrifying, throwing tens of millions out of work and making a mockery of truth, rendering pictures and printed words unreliable.

There is no common view on the impact of AI on employment. When I ask, the scientists working on it point to the false fears that once greeted automation. In reality, jobs swelled as new products needed new workers.

My feeling is that this job scenario has yet to be proven with AI. Automation added to work by making old work more efficient and by creating things never before enjoyed, opening up new worlds of work in the process.

AI, it seems to me, is all set to subtract from employment, but there is no guarantee it will create great, new avenues of work.

An odd development, spurred by AI, might be a revival of unionism. More people might want to join a union in the hope that this will offer job security.

The endangered people are those who do less-skilled tasks, like warehouse laborers or fast-food servers. Already Wendy’s, the fast-food chain, is working to replace order-takers in the drive-through lanes with AI-operated systems that mimic human beings.

Also threatened are those who may find AI can do much, if not all, of their work as well as they do. They include lawyers, journalists, and musicians.

Here the AI impact could, in theory, augment or replace our culture with new creations: symphonies superior to those composed by Beethoven, or country songs better than those written by Kris Kristofferson.

I asked the AI-powered Bing search engine a question about Adam Smith, the 18th-century Scottish economist. Back came three perfect paragraphs upon which I couldn’t improve. I was tempted to cut-and-paste them into the article I was writing. It is disturbing to find out you are superfluous.

Even AI’s creators and those who understand the technology are alarmed. In my reporting, they range from John E. Savage, An Wang professor emeritus of computer science at Brown University, to Stuart J. Russell, professor of computer science at the University of California, Berkeley, and one of the preeminent researchers and authors on AI. They both told me that scientists don’t actually know how AI works once it is working. There is general agreement that it should be regulated.

Russell, whose most recent book is “Human Compatible: Artificial Intelligence and the Problem of Control,” was one of a group of prominent leaders who signed an open letter on March 29 urging a six-month pause in AI development until more is understood—leading, perhaps, to regulation.

And there’s the rub: How do you regulate AI? And once you have decided how, how would the rules be policed? By its nature, AI is amorphous and ubiquitous. Who would punish the violators, and how?

The public became truly aware of AI as recently as March 14 with the launch of GPT-4, the successor to GPT-3, which is the technology behind the chatbot ChatGPT. Billions of people went online to test it, including me.

The chatbot answered most of the questions I asked it more or less accurately, but often with some glaring error. It did find out about a friend of my teenage years, but she was from an aristocratic English family, so there was a paper trail for it to unearth.

Berkeley’s Russell told me that he thinks AI will make 2023 a seminal year “like 1066 [the Norman Conquest of England].”

That is another way of saying we are balanced on the knife-edge of history.

Of course, you could end AI, but you would have to get rid of electricity — hardly an option.

HOLY COW! HISTORY: The Woman Who Created Kids’ Television

Before there was “Sesame Street” …

Before there was “Mister Rogers’ Neighborhood” …

Before there was “Romper Room” …

Before there was “Captain Kangaroo” …

There was Miss Frances and “Ding Dong School.” Not only was the show the first in its genre, it literally created children’s television — and it set the bar very high, too. Let’s hop into the Wayback Machine and revisit 1952.

Television was brand new back then. TV stations were launching all over the country, and big, cumbersome televisions were popping up inside more and more American homes.

Judith Waller was the public service and educational programming director at WNBQ-TV in Chicago. Most local stations produced hours of programming daily, far more than they do now. As Waller was talking with her boss one day, he noted that with the Baby Boom in full swing, there were more than 235,000 preschool children in the Chicago area. Then he pointedly asked, “What are you going to do about it?”

Waller rolled into action. She devised a nursery school program designed to teach tykes watching at home. Because viewers would be little people, the show used six cameras that shot from angles toddlers would see. All props would be easily recognizable to little children.

Auditions were held for the program’s host. Frances Horwich was one of the educators who tried out for the gig. A woman of a certain age with a kindly disposition, she headed a local college’s education department. She lacked showbiz experience but had once taught nursery school. While being alone on set for a full hour each day scared her, she thought, “Why not?” and gave it a try. 

She was soon hired to be “Miss Frances,” and then successfully negotiated to own the rights to the show. When the producer’s 3-year-old son was told each episode would begin with the ringing of an old-fashioned teacher’s desk bell, he blurted out, “Ding Dong School!” They had the program’s title.

A pilot episode was filmed. One horrified station executive said the show was so bad it would kill television and make viewers listen to the radio again. So it was decided to air the program just once. WNBQ didn’t issue a press release announcing it or promoting it in any way. They figured they’d let it die of its own embarrassment.

And so, with no fanfare, “Ding Dong School” made its debut on Thursday morning, October 2, 1952. Primitive by today’s high-tech standards, it began with a closeup of Miss Frances’ hand ringing the aforementioned bell, followed by a cheesy studio organ playing the show’s equally cheesy theme song. Miss Frances sang in a warbly voice better suited to a country church choir than to television:

“I’m your school bell

ding dong ding;

boys and girls

all hear me ring.

Every time I

ding dong ding,

come with me

to play and sing.”

Then she jumped into the lesson. Miss Frances talked like she was speaking to actual children. “How are you, boys and girls? What are you doing today? (Pause.) Really? That’s good!”

WNBQ’s big brass cringed for an hour until the show ended. No one expected what came next. The station’s switchboard was flooded with more than 150 calls in 45 minutes as parents told how their kids loved the program. That was nothing compared to the tidal wave of enthusiastic fan mail that followed.

“Ding Dong School” instantly became part of WNBQ’s morning lineup five days a week. It was such a hit that NBC picked it up in 1953 and broadcast it nationally. Miss Frances even became a TV star. More than 12,000 children and parents attended a promotional event in Boston. When she and her husband flew to Florida for a vacation, kids on the plane recognized her — and sang the show’s theme song over and over all the way to Miami. (Likely making it history’s most miserable flight for the other passengers.)

But TV is a cutthroat business. Despite its success, NBC canceled “Ding Dong School” in 1956 for the more lucrative “The Price Is Right.” The show continued in syndication until 1965.

Miss Frances eventually moved to Arizona, where she dabbled in local public television until her death in 2001 at age 94.

Children’s television is one of broadcasting’s few success stories. And it’s largely due to the huge influence of a teacher and her little bell.


HOLY COW! HISTORY: From Mom to Motherhood Icon

Anna was a typical mom. She loved her kids with the passion found only in a mother’s heart. And something she did for her son made her an icon for mothers everywhere.

Here’s how it happened.

They say a woman can’t resist a man in a military uniform. That apparently was true with Anna. In the 1820s, she fell in love with a West Point cadet named George. They married in 1831.

A Southern belle from North Carolina, Anna started life with George in the North. Their first child, a boy named James, arrived in 1835. George resigned his Army commission and jumped into designing that brand-new high-tech transportation marvel — the railroad locomotive. He soon switched to constructing rail lines.

The young family grew. Besides raising three children from George’s first marriage, the couple had three more sons, two of whom died young. Anna doted on her surviving boys. William was a serious scholar, while James was a daydreamer with an artistic gift. Anna nurtured and encouraged both. She was strict (Sundays meant no toys and no book but the Bible) yet also very loving.

George’s skill at building railroads eventually led the family halfway around the world. Russia’s Czar Nicholas I sent representatives to study America’s booming railroad business. They were so impressed with George’s skills that they offered him his dream job: supervising the construction of a railroad linking Moscow and St. Petersburg. George checked with Anna, who said, “Go for it.” So, they headed off to Mother Russia, where they became friends with the czar and socialized with nobility.

By now, James’ talent as an artist was apparent. Anna pulled some royal strings and enrolled the boy in the prestigious Imperial Academy of Arts. The happy family seemingly had a bright future ahead.

Until George contracted cholera and suddenly died in 1849.

A sympathetic Nicholas offered to educate the boys at the Imperial School, but Anna politely declined. Shaken and heartbroken, she gathered up her children and moved first to Connecticut, then New York.

With her income slashed from $12,000 a year to just $1,500, pennies were pinched, dollars were squeezed, and somehow she made ends meet. She was even able to send William to medical school.

Though Anna hoped James would become a minister, he was instead appointed to the U.S. Military Academy when he turned 17.

But James and West Point weren’t a good fit. After three years of lackluster studies, the breaking point came when he failed a chemistry exam and quietly resigned. (James later said, “If silicon had been a gas instead of a solid, I’d be a major general today.”)

Free at last to indulge his love of painting, James headed first to Paris and then London.

While all that was happening, America was sliding ever closer to a civil war. Anna and James returned to her native North Carolina, where he became a Confederate surgeon. As the conflict raged, Anna increasingly missed her son across the pond. The Union’s naval blockade of Southern ports stood between them. But no cannon was powerful enough to stop a mother’s love. On a dark August night in 1863, Anna boarded the blockade runner Advance. It was a daring thing to do for a woman who was pushing 60. Yet Anna was determined.

The Advance slipped through the patrolling warships, and she had a joyous reunion with James at his London studio. Though caught off-guard by his flamboyantly bohemian lifestyle, she nevertheless showered Southern hospitality on his many friends by serving them tea, preserves and homemade biscuits.

A few years later, James asked Anna to pose for him. Some said she filled in for a model who couldn’t make it; others claim James intended his mother to be the subject all along. We do know he wanted her to stand. But she was now 67 and her health was failing, so he wound up painting her seated in profile, hands properly folded in her lap.

When he entered the work in a VIP showing in London in 1872, James titled it “Arrangement in Grey and Black No. 1.” Victorian critics wouldn’t accept it as an arrangement since it was clearly a portrait, so they renamed it “Portrait of the Artist’s Mother.”

It eventually morphed into the name we know today. Because James was James McNeill Whistler, Anna was Anna McNeill Whistler, and the portrait on display today at the Musée d’Orsay in Paris is simply dubbed “Whistler’s Mother.”

More than the likeness of one man’s mother, it is an enduring tribute to the love of mothers everywhere.

FLOWERS: In 2022, SCOTUS Righted a Grave Wrong

I generally hate year-in-review columns. They seem forced, like a list of things you must buy at the grocery store. Check this off, and then this, and we did this, and I need that, and we are out of this, and can we have extra of that, etc. Years blend into each other and it’s often hard to pick exceptional events, particularly since the same things seem to happen over and over again: Wars start and continue, and we think they end, and then they’re prolonged.

People die (surprise!) and we reflect on their lives, even when we might have forgotten they were still alive. Couples divorce and then find other partners they will eventually cast off in search of the perfect fit. Fads spring out of nowhere and insecure people with no particular talent film themselves on once-obscure social media apps in the hopes of boosting their self-esteem (after artificially boosting their lips and bosoms). Year after year, the same things tend to happen, and we try and frame them in a context where they seem historic.

But this year, something historic did happen, something that many people despaired of ever seeing, even though hope is the last thing to die. Since this is my column, this is my perspective. You won’t hear me talking about the tragic war in Ukraine, the January 6th Committee results, the disappointing red trickle at the mid-terms, the death of Sidney Poitier, or any of the other things that were indeed important (and about which I’ve written) but which did not stand out as the central, sea change event of 2022.

What defines for me the alpha and the omega of this year, the San Andreas Fault that splits two diametrically opposed tectonic plates, the BC and AD of our current historical timeline, is the Dobbs decision overturning Roe v. Wade.

That case, like Brown v. Board of Education, brought down a monolith of injustice: Legalized abortion by judicial fiat. The fiat was created by seven old men who ignored the voices of the American people and reached into some insubstantial and fictitious folds of constitutional jurisprudence to pluck out the right to kill a child.

Most would not be quite so blatant about it. They would replace “child” with “pregnancy” and “kill” with “terminate.” Those are the accepted terms in polite conversation, even though there is never anything polite about discussions around abortion. But the truth is clear and has been for almost 50 years.

January 22, 2023, would have marked the end of one of the bloodiest half centuries known to modern society: the 50th anniversary of the day Roe v. Wade, the decision legalizing abortion, was handed down by that all-male court. I keep emphasizing the gender of the justices since we have been force-fed a diet of “if you can’t get pregnant, you have no right to have an opinion” by pro-choice advocates. I am going to be generous here in using the term that they prefer, pro “choice,” even though I would invite the reader to reflect on what “choice” we are discussing. There are only two: Life and death. Pro-choice advocates find both to be equally acceptable. Roe v. Wade supported that position and perpetuated a myth that there was virtue and legitimacy to the idea that women have dominion over their own bodies and the body growing within them.

But in 2022, after 50 years of lost potential and lives sacrificed to convenience and a skewed sense of autonomy, a court composed of men and women ruled that abortion was no longer a “right” and that, indeed, it never had been. And even though the reaction was brutal and there are continued attempts to codify abortion rights into law, and even though there are states where women will continue to be able to “choose” termination, there is now, in this great country where immigrants find shelter and the oppressed find solace, an understanding that you cannot simply make up a right to do whatever you want, simply because you want to do it.

That principle transcends the issue of abortion. In 2022, women and men were told that no matter how much they want to engage in magical thinking and read the Constitution as a blueprint for living the lives they want, in the way they want, on the timeline they want, there are principles that are larger than their own narcissistic desires. One of them is the respect owed to other lives.

That is a lesson we should have figured out after the Civil War. It’s still a lesson we need to learn, and 2022 is bringing us closer to the point where we’re finally getting the message.


POWELL: Biden Assessment–Year One

An assessment of President Joe Biden’s first year clearly shows weakness in issues management. He has allowed more issues to become crises than he has solved: Energy, the border, urban violence, inflation, the supply chain, fentanyl deaths, the aftermath of COVID-19 policies, and Afghanistan.

Biden has two tendencies that contribute to his lack of success in issues management: Failure to get ahead of issues and misreading residual costs associated with his decisions. He was caught short on inflation and has still not acknowledged the residual issues associated with open borders and his energy policy.

Every president has had to manage complex issues and crises while in office. So, how does Biden stack up to some of his predecessors? The issues he has been called on to manage are much less grave in scope than the dual crises, Great Depression and World War II, faced by President Franklin Roosevelt. The decisions he has made are of lesser magnitude than President Harry Truman deciding to drop two atomic bombs on Japanese civilians to end the war.

He has demonstrated neither the political finesse of President Dwight Eisenhower in defusing the Little Rock school desegregation issue in 1957, nor the accountability of President John Kennedy, who immediately owned up to the failure of the Bay of Pigs invasion, nor that of President Ronald Reagan, who publicly admitted his mistakes after the Tower Commission Report on Iran-Contra.

He has not shown the agility of President Bill Clinton to triangulate after the American people resoundingly rejected his liberal approach to government in his first two years, nor has he shown the humility of President George H.W. Bush in ending the First Persian Gulf War when his stated objectives were reached and a retreating army could easily have been decimated.

The end of his first year shows a president who is following the example of President Lyndon Johnson, who squandered his presidency in pursuit of victory in Vietnam; President Jimmy Carter, whose presidency was paralyzed by the Iran Hostage Crisis; President Richard Nixon, who refused to acknowledge the consequences of Watergate until it was too late to save his presidency; President Gerald Ford, whose pardon of Nixon failed to heal the nation and sank his presidency; and President George W. Bush, who unilaterally expanded his mandate to wage the War on Terror and lost the support of the American people as a result. Biden during his first year would not abandon “Build Back Better” even though it had no chance of passing in the Senate.

There is ample evidence, based on Biden’s “secret” meeting with historians at the White House in March 2021, to support his desire to be great. The vehicle he has created to accomplish that goal appears to be his “Build Back Better” initiative. The $5 trillion program is transformative and it could be argued that it would set the stage for a post-capitalism economy in America.

The World Economic Forum defined the COVID-19 pandemic as the launch point for the move toward more global governance and “social capitalism,” but events, including the invasion of Ukraine by Russia, are overtaking it in importance and causing the Western nations to pause and rethink many of their assumptions about energy and the defense of the free world.

Effective crisis management is what elevates presidents to the pantheon of greatness. George Washington invented the presidency because he was the first president and was truly the “indispensable man” of his era. Abraham Lincoln embraced our founding values (rights are God-given and all men are created equal) as the moral foundation for civil war. Franklin Roosevelt asked for and was given a mandate from the American people to take extraordinary measures to tame the Great Depression. Lyndon Johnson embraced the ripeness of the civil rights issue and used his formidable political skills to enact his Great Society agenda. The opportunity for greatness is created in large measure by the times in which a president serves in office. Greatness cannot be achieved simply because a president wants to be great.

Biden was given the mandate to heal our nation’s divisions. Arguably, he has exacerbated them in year one. With razor-thin majorities in Congress, the opposite of FDR in 1933, Biden has tried to push through his own version of the “New Deal.” Lacking broad-based popular support, he desired to enact his own version of the “Great Society.” Biden misread FDR, who assiduously avoided his self-imposed third rail – being defined as a socialist because of his policies. He also did not give enough credit to LBJ, who was a master legislator and the hardest worker in Congress long before he became president. Their significant accomplishments were neither accidents nor givens.

There may be unforeseen issues and crises over the next three years that will provide Biden with the opportunity to achieve greatness. But that is not where he finds himself as he begins 2022. At this point, it would be best for him to focus on what he was elected to do – heal our divisions. Then, when opportunities present themselves, he would have a consensus to act boldly.


MYERS: Rediscovering America: A Quiz for Martin Luther King, Jr. Day

January 17 is Martin Luther King Jr. Day. On this holiday, we celebrate one of the great civil rights leaders of the 20th century. The Rev. King challenged Americans to uphold the Declaration of Independence’s promise “that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”

The quiz below, from the Ashbrook Center at Ashland University, provides an opportunity for you to test your knowledge of King and his efforts to further civil rights in America.

 

1. King became pastor of which Georgia church in 1960?

A. Wheat Street Baptist Church

B. Milestone Baptist Church

C. Ebenezer Baptist Church

D. Zion Baptist Church

 

2. Which leader inspired King’s belief in “nonviolent resistance” and, according to King, provided “the method for social reform that I had been seeking”?

A. Nelson Mandela

B. Dalai Lama

C. Eleanor Roosevelt

D. Mahatma Gandhi

 

3. King said in an interview that if he were marooned on a desert island with only one book other than the Bible, he would choose what book?

A. Plato’s “The Republic”

B. John Stuart Mill’s “On Liberty”

C. Ralph Waldo Emerson’s “Essays by RW Emerson”

D. James Baldwin’s “Notes of a Native Son”

 

4. After which civil rights victory did King become a recognized national leader of the civil rights movement?

A. Civil Rights Act of 1964

B. Montgomery Bus Boycott

C. Birmingham Campaign of 1963

D. School integration in Little Rock

 

5. In what city in 1968 was King assassinated?

A. Memphis, Tenn.

B. Jackson, Miss.

C. Little Rock, Ark.

D. Selma, Ala.

 

6. Before his famous “I Have a Dream” speech, King gave another speech from the Lincoln Memorial. What was the topic of that speech?

A. Vietnam War

B. Importance of nonviolence

C. Economic equality

D. African American voting rights

 

7. In 1964, King received what important award?

A. United Nations Human Rights Award

B. Nobel Peace Prize

C. Presidential Medal of Freedom

D. Congressional Medal of Honor

 

8. Which other civil rights leader did King work closely with before his death?

A. Ralph Abernathy

B. Jesse Jackson

C. Benjamin Mays

D. All of the above

 

9. What was the name of the speech King gave the day before his assassination?

A. “I’ve Been to the Mountaintop”

B. “The Three Evils of Society”

C. “Our God Is Marching On”

D. “The Other America”

 

10. Which president signed legislation designating Martin Luther King Jr. Day as a federal holiday?

A. Barack Obama

B. Jimmy Carter

C. Ronald Reagan

D. George Bush

 

Answers: 1-C, 2-D, 3-A, 4-B, 5-A, 6-D, 7-B, 8-D, 9-A, 10-C


HOLY COW! HISTORY: The Drinking Song Christmas Classic

It’s one of the very first Yuletide songs the youngest of children learn. The lyrics are easy and its music is peppy, making it the ideal holiday song for little kids. Who among us hasn’t heard toddlers warbling at the top of their little lungs to adoring parents,

“Jingle bells,

Jingle bells,  

Jingle all the way.”

Mom and Dad probably wouldn’t be beaming so happily if they knew the backstory of that beloved Christmas classic. Because when the song originally came out, it certainly wouldn’t have been considered “age-appropriate” by today’s standards.

It was called “One Horse Open Sleigh” when it was first published in 1857. Even back then, its words conjured images of simpler, more carefree times.

It’s difficult for 21st-century minds to imagine now, but a heavy snowfall often brought a treat to 19th-century folks. Consider these lines from a letter written in February 1865 by a Union soldier to his girlfriend in Upstate New York. “I presume you did not fail to take advantage of the deep snow. I imagine I see you on a cold pleasant moonlit night gliding over the crystal surface preceded by nettlesome steeds and the pleasant ringing of musical bells which seem to mock the joyous laughter of you and your companions.”

After enjoying one such sleigh ride, James Lord Pierpont recognized a song was waiting to be written.

He was an interesting character. The son of a New England minister, he ran away to sea at age 14, later returned, married, and started a family. He settled in Medford, Massachusetts, where his father pastored a Unitarian Church. The urge to wander returned with the 1849 California Gold Rush. Pierpont wound up in San Francisco, where he had a store and a photography studio before losing both in a fire. He returned to New England flat broke.

After Pierpont’s wife died in 1856, his brother accepted the pastorate of a Unitarian Church in Savannah, Georgia. Pierpont tagged along to serve as music minister. He gave organ and singing lessons on the side, all the while composing a steady stream of songs of his own. He became successful enough to take the daughter of Savannah’s mayor as his second wife.

In 1857 he released the song we still sing today. But “One Horse Open Sleigh” bombed so badly when it came out that he had to rebrand it. It was re-released in 1859 as “Jingle Bells.” Even then it was far from a best-seller.

Ironically, it was alcohol, not schoolkids, that spread the tune. “Jingle Bells” became a popular mid-Victorian drinking song, with singers clinking their glasses to imitate the sound of bells. And get this: the lyrics were considered racy for the time, too. A young couple sleigh riding without a chaperone? Hubba-hubba! Risqué stuff.

Despite the initial setbacks, “Jingle Bells” has stood the test of time. But one question still lingers: Where exactly did Pierpont pen the piece?

Medford, Massachusetts, and Savannah, Georgia, each claim to be the song’s birthplace. A plaque in downtown Medford asserts Pierpont wrote it at the Simpson Tavern in 1850. It even cites a Mrs. Otis Waterman, who said she remembered it.

Nonsense, Savannah adherents huff. Just look at the calendar. Pierpont was living in that coastal city when the song was first published. It stands to reason it was written there as well.

Which claim is accurate? Who knows! While history lovers like to nail down these little details, in this case, it really doesn’t matter.

From the moment it was recorded for the very first time on an Edison wax cylinder in 1889 to countless elementary school Christmas concerts around the country in 2021, it remains a timeless favorite. Which is why James Lord Pierpont is honored in the Songwriters Hall of Fame.

So, remember him the next time you belt out “Jingle Bells” at a Christmas party. And if the kids have been put to bed, go ahead and clink your glasses. Pierpont wouldn’t mind.


HOLY COW! HISTORY: Famous Candy Bars’ Name Game

In the aftermath of big elections in Virginia and New Jersey, pundits say a Red Wave may be replacing a Blue Wave. But one thing is certain: America’s children are riding a Sugar Wave as they finish the last of Halloween’s trick-or-treat candy.

Which poses an interesting question: How well do you know the stories behind the names of America’s beloved candy bars?

We begin with the Kit Kat bar. One day in the 1930s, a worker at a large confectionery plant in York, England, slipped a note into the suggestion box recommending a small candy bar that a working man could easily carry in his lunchbox. Britons quickly fell in love with the crunchy taste when Rowntree’s Chocolate Crisp debuted in 1935. But they were less enthusiastic about its clunky name.

Reaching into English history, the company resurrected the name of a mutton pie called a Kit Kat, served at 18th-century meetings of London’s political Kit-Cat Club. The new title played equally well when it was introduced on this side of the Atlantic after World War II.

Think the gooey goodness of the Milky Way candy bar was inspired by our galaxy? Think again.

When Mars Candy rolled it out in the early 1920s, it borrowed the name of a milkshake that was popular at the time. Americans didn’t mind that bit of plagiarism because when Milky Way went national in 1925, it racked up $800,000 in sales, or about $12.5 million today. Not bad when you consider the bars sold for a nickel apiece.

Speaking of Mars, who hasn’t at one time or another sunk a sweet tooth into the nougat-on-peanuts-on-caramel-on-milk-chocolate sensation of a Snickers bar? A logical guess would be that its name originated from the snickers that follow an amusing joke. But no. Snickers was actually named for the Mars family’s favorite horse!

Then there’s the widely popular 3 Musketeers bar. What on earth does Alexandre Dumas’ 1844 novel about 17th-century swashbuckling adventurers have to do with fluffy, whipped mousse covered in milk chocolate?

When it debuted in 1932, it was different from the candy bar we know today. The original version had three sections: chocolate, vanilla, and strawberry. Three tastes led to 3 Musketeers. Rationing during World War II forced Mars to drop the vanilla and strawberry pieces. Americans seemed happily content with just the chocolate part because it remained popular after wartime restrictions ended.

What about M&Ms? Americans were devouring the pill-sized sweets long before rapper Marshall Mathers piggybacked on their popularity by calling himself Eminem. The concept was taken from candy eaten by soldiers during the Spanish Civil War of the 1930s: a hard coating kept the chocolate from melting in hot climates.

When M&Ms debuted exactly 80 years ago this September, the name was drawn from confectionery royalty. It was created from the first letters of the last names of Forrest Mars, son of legendary Mars Candy founder Frank Mars, and Hershey Chocolate president William F.R. Murrie, who owned 20 percent of the product.

Which brings us to the mother of all candy bar names.

When Chicago’s Curtiss Candy Company introduced its Kandy Kake, a combination of peanuts, caramel, and chocolate, in 1920, it experienced what Kit Kat’s makers encountered. Folks loved the taste but hated the name. In 1921 it became Baby Ruth. It just so happened that at that precise moment, a certain New York Yankee named George Herman Ruth was knocking out home runs on his way to becoming a baseball superstar. So, Baby Ruth was named in honor of Babe Ruth, right?

Oh, no, Curtiss claimed with a straight face. The new name was actually a tribute to President Grover Cleveland’s daughter. Born in the White House in 1891, she was nicknamed “Baby Ruth.” Americans at the time were captivated by the child, following her first words, her first steps, and so on.

But believing Americans were motivated to plunk down a nickel for a candy bar named after a girl born 30 years earlier (and who sadly died of diphtheria at age 12) stretched credulity. A more likely explanation is the “Baby Ruth” Cleveland claim was a cover story that kept Curtiss from having to pay royalties to the Sultan of Swat.

By the way, Baby Ruth’s progenitor did some name changing of his own. Otto Schnering originally sold candy under his last name for years. Until World War I, when having a Germanic surname was suddenly bad for business. So, he adopted his mother’s maiden name, and it was the Curtiss Candy Company from then on.

What’s in a name? Shakespeare famously asked. When candy is involved, plenty!
