Now that I’ve gotten your attention, I’m tempted to use twisted phrasing, bent facts, key omissions and other such dark marketing powers to paint you a harrowing picture of my circumstances. Instead, I’ll go ahead and tell you the less-than-pitiable truth of how I got here.
Nine days ago, I returned from a lovely five-day vacation in the tropical waters of Grand Cayman with my boyfriend and his family. When we first arrived in Grand Cayman, things in the U.S. were just starting to get weird, but hadn’t yet gotten bad. Through sheer luck, we even managed to time our return plane trip so that we avoided the mass panic at DFW International Airport. Still, since we’d been “traveling internationally,” Boyfriend and I decided to quarantine ourselves for the requisite 14 days out of an abundance of caution.
So, essentially, I was sheltering in place before it was cool.
My boyfriend and I have since spent most days working from home together – him holed up on one end of the room with headphones in, and me holed up on the other end of the room with headphones in. For the interest of the general public, here is a non-comprehensive list of items we have accomplished during this time, ranked in order of Most Useful to Most Useless. Items in italics were accomplished by Boyfriend.
Attended virtual church and virtual church community group – I’ll say it: quarantine ain’t what it used to be back in the good ol’ Middle Ages.
Finalized multiple work-related projects – not to go all sappy, but we’re both incredibly lucky to work for companies that give us remote work capabilities.
Applied a liberal dosage of WD-40 to Girlfriend’s squeaky bathroom door.
Called my Grandma – she’d left a voicemail while I was flying over the Gulf on Monday that didn’t actually show up on my phone until Friday. What strange corridors of the Verizon network it got lost in, we may never know.
Took a field trip to Boyfriend’s house (we’ve taken the stance that if one of us is infected, the other probably is too) and cleaned out his closets. Many dust bunnies were slain.
Discovered a 12 pack of toilet paper I’d bought a month ago and forgotten in the back of my car. Yes, I also hoarded toilet paper before it was cool.
Exercised almost every day – I’ve been exploring workout videos on fitnessblender.com and utilizing my stationary bike.
Persuaded Girlfriend to join in daily exercise, mostly through bribery via chocolate cake.
Wiped off my bathroom counter for the first time in none-of-your-business.
Successfully made spaghetti squash.
Learned there was more than one way to cook spaghetti squash.
Threw a shark themed birthday party for two – Boyfriend’s birthday was this week, and I’d managed to order some shark-themed decorations (streamers, balloons, tablecloth, etc.) from Amazon right before things went from weird to weirder. Why shark? Unclear; some scholars point to origins as a macabre joke surrounding scuba diving in Grand Cayman.
Played Small World with Boyfriend – a board game I’d had on my shelf for quite some time, but hadn’t actually gotten into yet.
Learned how to play Small World, subsequently beat Girlfriend at Small World.
Played Villainous, a Disney villain-themed board game; ditto on the extended shelf life.
Learned how to play Villainous, subsequently beat Girlfriend at Villainous.
Taught Girlfriend how to play solitaire.
Learned how to play solitaire.
Played approximately 24 games of solitaire in 24 hours.
Accepted delivery of a large coffee cup full of gin, a six pack of rose cider, three rolls of toilet paper and a potato. No further elaboration will be given at this time.
Played approximately 54 games of Speed with Girlfriend.
Lost approximately 50 games of Speed.
Got Boyfriend to watch Frozen II.
Eventually got Boyfriend to stop pointing out plot holes in Frozen II and just enjoy the music.
Finished the TV show Firefly and watched its cinematic continuation, Serenity.
Yelled at Boyfriend because of certain [spoiler] in Serenity.
For some reason, spent approximately five minutes digging out my sticker collection so I could give Boyfriend a snail sticker and tell him it was the “snail of approval.”
Inundated Boyfriend with approximately 80,000 coronavirus memes.
Hid the chocolate cake while Boyfriend was in the bathroom.
Dropped ice down the back of Girlfriend’s shirt to extract information about whereabouts of chocolate cake.
Taped balloons to streamers so that the balloons hung from the ceiling.
Spent approximately ten minutes of work day “boop-ing” balloons on head.
Spent approximately ten minutes of work day “boop-ing” balloons on head.
Taped mass of balloons to Girlfriend’s desk while she wasn’t looking.
Dropped roll of toilet paper in the toilet.
When asked what he was doing, Boyfriend said he was shuffling cards. When asked why, he said, “’Cause every day I’m shufflin.’”
As their definitions quickly make clear, short stories, novelettes, and novellas are all short pieces of prose fiction. What, then, differentiates these different literary categories?
Short stories are the briefest of these three prose genres. While most definitions do not include a word limit, Random House Kernerman Webster’s College Dictionary notes that, as a general rule, short stories run no more than 10,000 words (“Short Story”). An article from WriterMag.com places the cap for a short story at 7,000 words (“The Novella”). To put these estimates in perspective, a short story of 10,000 words would fill about 40 pages of double-spaced text in a basic 12-point font.
One unique element of the short story is that it tends to include few characters and focus on one theme. This creates the “unity of effect” that is characteristic of this genre, according to the definition from the American Heritage Dictionary (“Short Story”).
The short story in action: “Signals” and other works by Tim Gautreaux, “The Gift of the Magi” by O. Henry, and “The Tell-Tale Heart” by Edgar Allan Poe
While novelettes lack a prescribed length, just like short stories and novellas, they tend to be between 8,000 and 15,000 words long (“The Novella”). A work of 15,000 words would be about 60 pages, using the same formatting listed above.
According to the Collins English Dictionary, common characteristics of the novelette are that it is “slight, trivial, or sentimental” (“Novelette”).
The novelette in action: “Two Hearts” by Peter S. Beagle, “—That Thou Art Mindful of Him” and “The Bicentennial Man” by Isaac Asimov, “Ender’s Game” by Orson Scott Card, and “The New Atlantis” by Ursula K. Le Guin
A novella is longer and more complex than a short story. This type of prose fiction often includes a moral lesson or satirical elements. In an article for The New Yorker, novelist Ian McEwan likens the novella to a movie and estimates that a typical screenplay averages 20,000 words, which he indicates is the normal length of a novella as well. Estimates from a Writer’s Digest article by Chuck Sambuchino and from WriterMag.com put the length of a novella between 20,000 and 50,000 words, with 30,000 as the average (“The Novella”). This means that a novella at its briefest is still twice the length of the longest short story.
Like a movie, a novella is more complex than a short story and may include one or two subplots and some rich character development, but within the constraints of a more abbreviated space than a novel would allow (McEwan).
The novella in action: Candide by Voltaire, The Decameron by Boccaccio, The Metamorphosis by Franz Kafka, and Heart of Darkness by Joseph Conrad.
What do writers do when they are procrastinating while putting together a story? They go hunting for writing resources to help them with worldbuilding! This is a concept I’ve always struggled with as a writer—I relish the dialogue but drag my feet with the setting. In developing the world for Death and Taxis, I have been researching writing resources for assisting with world development.
This resource is broken up by category and therefore gives more structure to the world development than the previous resources. The first section pertains to the physics and nature of the world, the second section to geography and natural resources, etc.
Any grammar Nazi worth his or her salt knows the atrocity of the missing antecedent. The worst of all errors is a sentence that begins with “it” (à la “It was a dark and stormy night”). What I have realized, though, is that few rules like these are so unarguable that someone hasn’t broken them with success. After all, the opening lines of Charles Dickens’ A Tale of Two Cities and Jane Austen’s Pride and Prejudice have an unforgettable ring: “It was the best of times, it was the worst of times” and “It is a truth universally acknowledged, that a single man in possession of a good fortune, must be in want of a wife.”
In fact, this scenario where the great authors break the rules reminds me of the final rule I learned in my photography class a couple of years ago. My teacher taught us photography composition rules, such as the rule of thirds and leading lines, but the list he referenced ended with the tip that rules are meant to be broken. And that is where true talent often shines through – where some people realize they are Austens, and others discover they aren’t.
Examples of “It” in Action
“It was a bright cold day in April, and the clocks were striking thirteen.” — George Orwell, 1984
“It was a pleasure to burn.” — Ray Bradbury, Fahrenheit 451
“It is a truth universally acknowledged, that a single man in possession of a good fortune, must be in want of a wife.” — Jane Austen, Pride and Prejudice
“It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, we had everything before us, we had nothing before us, we were all going direct to Heaven, we were all going direct the other way – in short, the period was so far like the present period, that some of its noisiest authorities insisted on its being received, for good or for evil, in the superlative degree of comparison only.” — Charles Dickens, A Tale of Two Cities
Why “It” Works
Clarity and precision are paramount when you write to explain, persuade, or inform. However, in literature and poetry, authors can break these rules and intentionally confound readers with unclear subjects. Sometimes a sense of mystery or confusion can be a tool instead of a hindrance. I think this power to create mystery and suspense is part of why a sentence that begins with it can be powerful in artistic writing situations when, in other contexts, the construction would be weak.
In the opening lines quoted above, each author makes a startling claim. If Bradbury had merely said “it was a pleasure to burn wood in the fireplace,” his readers would have responded “duh!” and slipped into boredom within seconds. The same would have been true if Orwell had stopped at “April.” His second clause packs the punch in the opening line from 1984 with its final word. Because clocks don’t strike thirteen. And if clocks are doing this, then something is wrong, and the audience already feels the wrongness with this simple, jarring statement.
Don’t state the obvious if you are going to start a sentence—and especially a book—with it. Make it count. Lure the audience in with a benign “it was…” and then catch them off guard. This sentence construction is likely to fall apart without a startling claim. In fact, I think this explains why “it was a dark and stormy night” is ridiculed, while the opening lines I referenced above are revered. Dark and stormy nights are commonplace. Describing a night in this way does little to set this particular evening apart from any other and fails to capitalize on the power and mystery of it.
Based on the claims above, perhaps a simple formula is possible: Successful sentence beginning with it = “It” + verb + startling claim (humorous, thought-provoking, intriguing, or surprising).
Now, the next time you’re about to pounce on a sentence with a missing antecedent, red pen in hand, remember that it serves a valuable role in writing and even has the power to form some of the most memorable quotes in the history of literature. Further, if you’re feeling creative, try using it to begin a story and see if the formula works for you (please share your sentence in the comments). Perhaps you will be the next Austen, Dickens, Bradbury, or Orwell.
How often do the “normal” people and moments in life capture national fascination? After all, the public and the media like to focus on stories that deviate from the norm, that are bigger than everyday life, and that take the audience away from their typical lives. However, a person or event occasionally becomes extraordinary by being quite ordinary and yet surprising the world in some unusual way. Harry S. Truman was one of these people.
In the biography The Accidental President: Harry S. Truman and the Four Months That Changed the World, A. J. Baime provides insight into Truman’s life, career, and the national and international impact of his time in office after FDR’s death. This story is fascinating as it shows how a Missourian with little money and almost no public presence rises to the highest seat of power in the United States. What makes Truman’s career even more remarkable is that he was extremely ordinary. Baime writes about Truman and his future wife Bess, “Bess Wallace was everything Harry was not. She was fashionable, athletic, and popular. Harry, in his own words, ‘was never popular. The popular boys were the ones who were good at games and had big, tight fists. I was never like that. Without my glasses, I was blind as a bat, and to tell the truth, I was kind of a sissy. If there was danger of getting into a fight, I always ran’” (44). Humorously, Baime explains that even though “Harry sat next to Bess Wallace in church school…[i]t took him five years to get up the courage to say hello” (44). These descriptions sound more like a depiction of Charlie Brown than of future president material.
Despite his ordinariness, though, Truman wins against all odds time and time again, and his honesty and hard work appear to have been key to his success. Also important to Truman’s character is his continuous dedication to his family. He always makes time to look after and stay in touch with his mother, sister, daughter, and wife. When his family is most concerned about the huge responsibility that has been thrust on him, Truman is worried about how being president will affect the privacy and lives of his family.
In contrast to his unimpressive personality and ordinary origins, Truman’s life is anything but ordinary, and The Accidental President is a fascinating biography. Baime packs the book with interesting details and narrates events in a story-like manner that makes the biography very readable. Thanks to Baime’s skillful juggling of places, people, and events, the different scenes of the story tie together smoothly and help the reader grasp what is happening simultaneously around the world.
While the title The Accidental President appropriately captures how unusual Truman’s career turned out to be, I think perhaps a more fitting title would be The Providential President. As much as people may criticize or disagree with Truman’s policies and decisions, he turned out to be the right man for his hour. Truman faced difficult decisions and stressful scenarios with courage, honesty, and dedication, and I think succeeding generations should take care before passing judgment on Harry S. Truman. After all, he had to make some of the hardest choices and deal with some of the greatest challenges any American president has ever confronted, and he did so without the clear support of the American people that an elected president would have had and without the history-making charisma that most world leaders have possessed. President Harry S. Truman proved a common man could become the leader of a world power and accomplish the extraordinary.
Baime, Albert J. The Accidental President: Harry S. Truman and the Four Months That Changed the World. New York, NY: Houghton Mifflin Harcourt, 2017.
Today, Caroline Bennett discusses music periodizations, pedagogy, and more, while highlighting the importance of studying a variety of musicians and musical styles.
Whenever someone tells a story, reads a textbook, writes an essay, or participates in a discussion, this person inevitably employs a set of preconceptions and a view of the world. In a discussion of periodizations in music history, historian James Webster notes that “periodizations serve the needs and desires of those who make and use them…This is so whoever ‘we’ are, and whether we conceive our historical intentions as ‘objective’ or interest-driven.” Webster’s claim also pertains to the current push to diversify the study of music. When historians or teachers decide which composers to talk about, they have certain objectives, and the attempt to diversify music history is a direct result of the value that American society currently places on inclusivity and diversity. Although this is not necessarily a wrong approach to music history, musicians should be conscious of why they study certain people or compositions. Musicians can actually achieve greater diversity in their view of the past by not making diversity the ultimate objective. Rather, musicians should strive to study and perform music that was impactful at the time that it was written, that serves an important pedagogical function, or that is timely and appropriate in a modern context. This goal, though daunting, is achievable if historians, teachers, and performers expand their knowledge of music and apply it to their respective disciplines.
Given the immensity of music history, it may appear unfeasible for music historians to talk about music that is not only excellent but also demonstrates diversity. However, this should not be the primary goal of historians. Instead, while conducting research, historians should notice any information that is thought-provoking or could potentially connect with other facts. If the name of an unknown composer is mentioned in a document, a historian should consider going off on a tangent and seeing where else the composer is mentioned or what pieces the person wrote. This may lead to exciting connections between the unknown composer and more famous composers, or occasionally result in the discovery of a truly great or influential artist. Additionally, historians have a second task: they should notice the time periods, countries, and societies that did not have many composers of diverse ethnicities or genders. For example, a prevalent reason why there were fewer and less well-known female and African-American composers in music history up until the 20th century is that they did not have good educational opportunities. Although this makes it harder for historians to include diverse composers in their writings and presentations, it is wise for historians to inform their audiences of these reasons because it gives context to the narrative and highlights the composers who did manage to overcome racial prejudice or social inequality, such as Scott Joplin, Ethel Smyth, William Grant Still, or Germaine Tailleferre.
Supplied with the wealth of resources that music historians share, music teachers can expand their knowledge of their instrument and its repertoire. It is important for teachers to be familiar with an assortment of pieces that not only come from various time periods but also have different purposes, contexts, and styles. This gives teachers an arsenal of works with which to inspire and challenge their students. Although a majority of the pieces that teachers assign their students will be by standard composers such as Bach, Mozart, Beethoven, and Schumann, if teachers are intimately familiar with their instrument’s canon they will have the freedom to choose pieces best suited to their student’s interests and abilities. Likely this will lead to more and more students studying works by Fanny Mendelssohn Hensel, Clara Schumann, and the like. For example, if a piano student expresses interest in learning a blues or jazz song, a teacher might assign “Saint Louis Blues” by African-American composer W.C. Handy. The benefits of this are twofold. Not only will the student likely be more motivated to practice the piece because it is appealing, but it will also present an opportunity for the teacher to introduce the student to a specific segment of music history. Indeed, teachers ought to always seek to incorporate music history into lessons and expect their students to become well acquainted with the story and repertoire of their instrument.
When musicians receive a well-rounded education and are knowledgeable about their instrument and its repertoire, concert programs are more likely to feature unique and lesser-known works. A performer who remembers that she enjoyed studying Amy Beach songs in high school will be more likely to search for more good pieces by Beach and include them on concert programs later in her career. This will in turn introduce audience members to pieces and composers that they may not have been familiar with before and inspire other musicians to study new works. Though not overtly related to diversifying music studies, this process will certainly affect people’s understanding of music history and eventually make a mark on musical canons. The story of how Mendelssohn’s performance of J.S. Bach’s St. Matthew Passion in the mid-19th century helped instigate renewed interest in Bach’s music, though not an example of diversity, certainly demonstrates the power of performing uncommon pieces. Even one concert can prompt more and more people to study music by an unfamiliar composer until that composer becomes an established figure in music history.
If music historians are diligent in following tangents in their research and discovering new composers and pieces, and if teachers assign a variety of works to their students and encourage their students’ curiosity about their instrument’s history and repertoire, and if performers constantly present the most innovative, interesting, and compelling works on their instruments, then music history and music canons will naturally become more diverse. Instead of making a conscious effort to change the way people view the past, and in the process imposing current values or agendas, musicians ought to encourage diversity and inclusivity via a different route. They should study and teach and perform the music that is most impactful, most influential, most imaginative, most intriguing. And although this approach demands much from musicians and requires a well-rounded education, the results will be invaluable. Historians, teachers, and performers will have a deeper, richer understanding of music, its history, and the world, and this in turn will make them better able to share music with their audiences.
1. James Webster, “Between Enlightenment and Romanticism in Music History: ‘First Viennese Modernism’ and the Delayed Nineteenth Century,” 19th Century Music, vol. 25, nos. 2-3 (2001-02): 110.
2. Laura Artesani, “Beyond Clara Schumann: Integrating Women Composers and Performers into General Music Classes,” General Music Today 25, no. 3 (2012): 23. MasterFILE Premier, EBSCOhost (accessed April 10, 2018).
3. J. Peter Burkholder, Donald Jay Grout, and Claude V. Palisca, A History of Western Music, Ninth edition (New York: W. W. Norton & Company, Inc., 2014), 461.
Artesani, Laura. “Beyond Clara Schumann: Integrating Women Composers and Performers Into General Music Classes.” General Music Today 25, no. 3 (2012): 23. MasterFILE Premier, EBSCOhost (accessed April 10, 2018).
Burkholder, J. Peter, Donald Jay Grout, and Claude V. Palisca. A History of Western Music. Ninth edition. New York: W. W. Norton & Company, Inc., 2014.
Webster, James. “Between Enlightenment and Romanticism in Music History: ‘First Viennese Modernism’ and the Delayed Nineteenth Century.” 19th Century Music 25, nos. 2-3 (2001-02): 108-126.
In a break with tradition, we four authors have decided to work together on a special post containing some favorite, old, new, funny, long, or fun to say English words. Read on to see what we found, and please share some of your own favorites in the comments below.
Given the plethora of words that are fun to say, I have just gone with my most recent discovery: sesquipedalian. Originally coined by the Roman writer Horace to warn young poets against using overly long words, it literally means “foot-and-a-half long.” The Webster definition of the modern word is: “1: having many syllables, long; 2: given to or characterized by the use of long words.”
While certainly not a commonly used term, sesquipedalian does roll off the tongue in a pleasing way with some practice. A few other words that have piqued my interest lately are: prescient, nepenthe, asphodel, castellated, and surcease.
Maybe you, like me, find new colloquialisms entertaining (Gasp! Young people are ruining the English language!). College-aged kids introduced the following to me in recent months: slap and bet. Be careful: they don’t mean what they traditionally mean!
“See you at the party tomorrow night?” “Bet!”
“Hey, how was that party the other night?” “That party was slap!”
The meaning can be inferred based on context – bet meaning you bet, and slap meaning good or great.
Two other words pertain to a person’s sphere of knowledge and are both new to me (one I learned 2 years ago and the other, yesterday):
Ultracrepidate: to go beyond one’s scope or province, esp. to criticize beyond one’s sphere of knowledge
Bailiwick:
1. A person’s specific area of interest, skill, or authority. See Synonyms at field.
2. The office or district of a bailiff.
British Literature is Professor Barrik’s bailiwick, but she enjoys ultracrepidating on early American literature as well.
I can’t imagine using either word in a normal conversation where I wasn’t trying to be condescending, obtuse, or humorous. So the above sentence will have to do.
Tongue Twister: I have several favorite tongue-twisters, but one of the best is arachibutyrophobia. Because we all need a word for that fear we have of peanut butter sticking to the roof of our mouth.
New to Me: I always called cars with a missing headlight “cyclops,” but this past weekend I learned paddidle, which has interesting origins as a driving game.
Perfect for the Purpose: Some words have an almost onomatopoeic quality where their sound and their definition match in a satisfying way. Two examples are incorrigible and indeed (said with Jeeves’ level of emphasis and a hint of indulgence and incredulity, two other great words).
I like so many things surrounding this word. I love the alliteration in the Merriam-Webster definition: “Not readily investigated, interpreted, or understood.” I like being this word. I like the challenge of scrutinizing (to use a sister expression) things that are this word. It’s got some fun synonyms, too: arcane, cryptic, enigmatic, impenetrable, uncanny.
I have loved this word ever since Calvin, of Calvin & Hobbes, used it in the following sentence: “I must obey the inscrutable exhortations of my soul.” It’s a feeling I often have myself.
I’ve actually used that very word in that very sentence several times, usually when justifying some inane thing I just said or did. If I’ve quoted it to a fellow Calvin & Hobbes lover, it’s an opportunity for bonding and swapping other favorite strips. If I’ve said it to anyone else, they’re likely to be even more confused. Which makes me, myself, a bit inscrutable.
English vocabulary may be a maze, but let’s own it in its delightful craziness. As Mark Twain reportedly said, “There is no such thing as the Queen’s English. The property has gone into the hands of a joint stock company and we own the bulk of the shares!” So let’s have fun with English in all its changing intricacy, sesquipedalianism, and inscrutability.
The following post is by guest author Caroline Bennett. Normally our musical columnist, she has expanded into U.S. history for today’s essay.
In most towns in the United States of America, preparations for July 4 begin nearly a month in advance. People become extra patriotic, swathing their front porches with bunting, lining the sides of their driveways with mini American flags, and stockpiling fireworks. Reenactments and parades take place across the country on July 4, and Americans travel to be with family and enjoy hotdogs and pies. Independence Day is one of the most popular holidays celebrated in the United States, marking the day a group of men from thirteen colonies declared independence from Great Britain, creating the foundation for the great nation that Americans know and love today. For all the importance Americans place on Independence Day, however, few realize that July 4, 1776, was not actually as important a day for American freedom as July 2 or even August 2, 1776. Many assume that the fading parchment exhibited in the National Archives in Washington, D.C., was written and signed on July 4, but in reality the process of declaring liberty from Great Britain spanned many years.
In the Course of Human Events
As American colonists became more and more restless under British rule in the 1770s, colonial leaders made the decision to coordinate the actions of the colonies by forming the Continental Congress (Thompson 190). The first Continental Congress convened briefly in the fall of 1774, and delegates from twelve colonies attended (Johnson 148). After the confrontation between colonists and British soldiers at Lexington and Concord, the second Continental Congress met in Philadelphia (Evans 229). A merchant from Massachusetts, John Hancock, presided over the Congress, and all thirteen colonies sent delegates. On July 7, 1775, the Congress approved a declaration authorizing the use of force against England (190). Originally, Thomas Jefferson, a young Virginian lawyer, was to pen this declaration, but after his writing proved too inflammatory, John Dickinson, a solicitor from Pennsylvania, wrote a second draft (190). M. Stanton Evans, an American journalist and author, notes in his book The Theme Is Freedom that most American colonists were not eager to declare independence from Great Britain (229). Colonial leaders spent decades writing manifestos and demanding redress, and Dickinson’s declaration assured readers that the Continental Congress was not dissolving the union between the colonies and Great Britain (196).
To many delegates, separation from Great Britain was still an alien concept in 1775. Numerous congressional delegates feared the colonies would fall into chaos if they threw off the structure of British government (Johnson 154). John Adams, a Massachusetts lawyer, dismissed these fears and unofficially led a group of supporters of independence. Adams and his associates debated before Congress over the course of many months, attempting to convince the delegates that a separation from Great Britain was necessary (Evans 231). Indeed, it was nearly a year after Dickinson penned his declaration, in May 1776, that some of the Congressional delegates finally made motions to declare British power null and encourage the colonies to set up governments of their own (231).
Thomas Jefferson returned to Philadelphia on May 14, having been absent since the previous December (Evans 231). He arrived just in time. On June 7, Richard Henry Lee, on behalf of the Virginia Assembly, asked Congress to “adopt a declaration of independence, prepare articles of confederation, and solicit ‘the assistance of foreign powers’” (Rakove 74). John Adams seconded Lee’s motion, and the following day the delegates renewed the debates on independence (McCullough 118). The delegates in support of independence were increasing in number, but still the Congress could not agree to declare the colonies free from Great Britain. On June 10, the delegates opposed to severing ties with Great Britain requested that the final vote be delayed until July 1, so that the Congressional delegates could send for new instructions from their colonies (McCullough 119). Congress agreed to the delay but appointed a committee consisting of Benjamin Franklin, John Adams, Roger Sherman, Robert Livingston, and Thomas Jefferson to draft a declaration of the colonies’ independence in the event that Congress voted in favor of the measure (Johnson 154).
Right of the People
Thomas Jefferson was the youngest member of the committee and had spent the least amount of time in Congress. Nevertheless, the committee nominated him to write the first draft of the Declaration of Independence. Whatever the committee’s reasons, Jefferson’s appointment was fortuitous: his rhetorical eloquence is powerful and serves as a testament to his education and extensive reading. Nevertheless, total credit for the Declaration of Independence should not go to Jefferson alone. He was part of a committee and wrote what his associates agreed upon. The Declaration of Independence was to be a corporate statement, after all, and it essentially summed up all the debates that had taken place in Congress over the past two years (Evans 232-233). David McCullough, a historian and lecturer, records in John Adams that Jefferson later wrote
Neither aiming at originality of principle or sentiment, nor yet copied from any particular and previous writing, [the Declaration of Independence] was intended to be an expression of the American mind, and to give to that expression the proper tone and spirit called for by the occasion. (121)
Because the Declaration represented the opinions of many different men, colonies, and beliefs, it was naturally influenced by a variety of sources. The colonial leaders certainly found inspiration in the writings of the philosopher John Locke, particularly in the concepts of his Second Treatise of Government (Rakove 78). Locke’s belief that a people should throw off the authority of their king or government after suffering repeated violations of their rights itself reflected the influence of his heritage in the English common law. The English common law was grounded in the idea that political authority came ultimately from the people, and that kings and other magistrates were their agents (Evans).
With centuries of history backing him, Jefferson wrote out the list of abuses charged against George III. Many people have speculated why the Declaration was addressed only to the king instead of the British Parliament. M. Stanton Evans points out that American colonists had never pledged allegiance to Parliament, only to the king of England, and therefore the Continental Congress only needed to address its sovereign (234). The reason for writing a declaration was simple: to let George III and the entire world know why the thirteen colonies could in good conscience throw off British rule and establish their own government. Excluding the introduction and conclusion, the Declaration of Independence is a summary of George III’s violations of authority. The king had obstructed justice and rule of law, made it difficult for legislative bodies to meet, kept standing armies in the colonies during times of peace, held mock trials for his guilty subordinates, deprived many colonists of trial by jury—the list goes on and on (Thompson 204-205). These were not violations made only by George III, however; he had been king but sixteen years. The colonists were charging George III and his predecessors with failure in their duty to justly use the power their people had given them.
Once Jefferson finished
writing his draft, he and the committee revised it. Impressed by Jefferson’s
conciseness and clarity of thought, Franklin, Adams, Sherman, and Livingston
made mostly minor changes (McCullough 121-122). The concepts the declaration presented
were powerful and undeniable, and the committee did not alter them. Most of the
changes made the document easier to read. For instance, the committee altered
the wording of the famous second paragraph, replacing Jefferson’s phrase “we
hold these truths to be sacred and undeniable” with the simpler “we hold these
truths to be self-evident” (121). The committee finished revising the draft of
the Declaration of Independence after a few days, and by June 28, 1776, Jefferson
and his associates were ready to present their Declaration of Independence to
Congress (Rakove 74).
To Throw Off Such Government
The second Continental Congress reconvened on July 1, and the debates on Richard Henry Lee’s resolution to declare independence continued. John Dickinson said that severing ties with England was premature, but acknowledged that his was an unpopular opinion. Dickinson knew that in opposing independence, he was ruining his career, but he believed that “thinking as I do on the subject of debate, silence would be guilt” (McCullough 126). After Dickinson delivered his moving speech, Adams stood and “wished now as never in his life…that he had the gifts of the ancient orators of Greece and Rome, for he was certain none of them ever had before him a question of greater importance” (126). Following Adams’ speech, other delegates took the floor, including John Witherspoon from New Jersey and Joseph Hewes from North Carolina. A preliminary vote followed the speeches. Pennsylvania, South Carolina, and Delaware voted against declaring independence, and the New York delegates abstained from the vote because they lacked instructions from their legislature (128). Though nine colonies favored independence, Adams and his associates hoped for a show of solidarity, and thus the final vote was postponed until the next day in hopes of more colonies changing their votes.
On July 2, 1776, the two chairs reserved for the Pennsylvania delegates were empty. Delegates John Dickinson and Robert Morris could not in good conscience vote in favor of independence, but they also knew how important it was for Congress to speak with one voice, so they absented themselves from the proceedings (McCullough 129). When the final vote was taken, New York once again abstained from the vote, but South Carolina and Delaware changed sides. The colonies’ decision to declare independence was unanimous, at least in the sense that no colony stood opposed (129). Adams joyously wrote to his wife later that evening,
The second day of July 1776 will be the most memorable epocha in the history of America. I am apt to believe that it will be celebrated by succeeding generations as the great anniversary festival…It ought to be solemnized with pomp and parade, with shows, games, sports, guns, bells, bonfires and illuminations from one end of this continent to the other from this time forward forever more. (McCullough 130)
Free and Independent States
Now that the Congress had agreed to declare the colonies free of British rule, the delegates began revising and approving the Declaration of Independence that Jefferson and the committee had written. Once again, many of the changes were minor, largely focused on making the writing less verbose and toning down some of Jefferson’s language. All in all, the Congress made more than eighty changes to Jefferson’s draft (McCullough 134). In the end, the concluding line of the declaration read, “And for the support of this Declaration, with a firm reliance on the protection of divine Providence, we mutually pledge to each other our lives, our fortunes, and our sacred honor” (Thompson 207).
The final sentence of the Declaration of Independence reveals the feelings of the Congressional delegates. Casting accusations at the king of England and declaring the thirteen colonies free of British control was no laughing matter, and the Congress knew full well it might face disbandment or punishment for treason (Armor). Nevertheless, twelve of the colonies ratified the Declaration of Independence on Thursday, July 4, 1776. The New York delegates initially abstained from the vote, but their legislature later approved the declaration, ultimately making the vote for the Declaration of Independence unanimous among the colonies (Armor). On July 5, printers began making copies of the momentous document, and the Congressional delegates sent copies to friends and to their legislatures. On July 8, the Declaration of Independence was read publicly in the State House Yard in Philadelphia, and the Liberty Bell was rung (Johnson 156).
The Declaration of Independence was not yet complete, however. A copy of the declaration was “elegantly engrossed on a single, giant sheet of parchment by Timothy Matlack, assistant to the secretary of Congress” (McCullough 137). On Friday, August 2, most of the Congressional delegates convened to sign the Declaration of Independence. Once more, there was no fuss or ceremony, and the delegates simply stepped forward and affixed their signatures. John Hancock, as president of the Congress, signed in the middle of the document, using large, flowing strokes. A number of other important delegates were noticeably absent—Richard Henry Lee, George Wythe, Oliver Wolcott, Elbridge Gerry. They signed later. A new member of Congress from New Hampshire, Matthew Thornton, affixed his signature in November 1776, and Thomas McKean of Delaware signed in January 1777 (138). Approving the declaration had been seditious enough. Now the delegates were writing their names on it—undoubtedly a treasonous act. As a result, the signing of the document remained a secret for some time (Armor).
The legal process of severing ties with Great Britain was concluded after the signing of the Declaration of Independence, but seven years would pass, and thousands of men would die, before the thirteen United States of America were truly independent. Nevertheless, Americans began celebrating their liberty and freedom just one year after the second Continental Congress ratified the Declaration of Independence. On July 4, 1777, a few local celebrations took place (Armor). After the surrender of Cornwallis at Yorktown, Americans began more extensive celebrations. Despite Adams’ conviction that July 2 would be commemorated, July 4 inexplicably became the day that Americans chose to celebrate their independence. In 1873, Pennsylvania became the first state to officially recognize July 4 as Independence Day (Armor). Other states followed suit soon after, and eventually Independence Day became a federal holiday in the United States of America.
For most Americans, the Fourth of July is a holiday that takes place in the heat of summer, involves fireworks and parades, and celebrates American independence. Many believe that the Declaration of Independence was signed on July 4, 1776, a day that ushered in celebrations and the ringing of the Liberty Bell. Declaring independence was not that simple, however. The truth is that numerous colonial leaders left their families and homes for months on end, argued with one another for years about the right course of action, and struggled with their personal doubts and fears. The Congressional delegates put their lives on the line by voting for independence, voting to approve the Declaration of Independence, and affixing their signatures to the same document. The delegates knew that what they were doing would go down in history, but they did not know whether they would be remembered as defeated traitors or as victorious American patriots. The delegates would probably not care what day Americans have chosen to celebrate independence. The second Continental Congress did not meet in Philadelphia in order to be remembered, but in order to give future generations of Americans a government that prized life, liberty, and the pursuit of happiness.
Armor, John. “‘Independence’ Day, Past and Present.” World & I 11.7 (1996): 72. History Reference Center. Web. 26 June 2016.
Evans, M. Stanton. The Theme Is Freedom. Washington, D.C.: Regnery Publishing, Inc., 1994. Print.
Johnson, Paul. A History of the American People. New York, NY: Perennial-HarperCollins, 1997. Print.
McCullough, David. John Adams. New York, NY: Simon and Schuster, 2001. Print.
Rakove, Jack N. The Annotated U.S. Constitution and Declaration of Independence. Cambridge, MA: Harvard University Press, 2009. eBook Collection (EBSCOhost). Web. 26 June 2016.
Thompson, Bruce, ed. The Revolutionary Period: 1750-1783. Farmington Hills, MI: Greenhaven Press, 2003. Print.
While surfing the internet looking at t-shirts one day, I came across a shirt with an image of a raven spelling out the word “nevermore.” Immediately recognizing the reference to Edgar Allan Poe’s famous poem “The Raven,” I grabbed my copy off the shelf and began to read. Upon opening the book, however, a small piece of paper fell out, and looking closer, I realized it held a list of words I had once jotted down to look up in the dictionary. One of the great advantages of reading older books is the expansive vocabulary found in them. While by no means a universal truth, many older authors (especially the ones who have stood the test of time) maintained a much stronger mastery of the English language than most people do today. This makes old books a great way to learn new (old) English words. Like just about anything, though, vocabulary growth cannot be obtained in a desultory manner, because then we just end up writing words down to reference later and sticking the list in a book. After rediscovering my vocabulary list in Poe’s book, I went on to reread “The Raven.” It took me all of five minutes, but in the process I discovered several more words to add to my small paper. So, find a good book, and whenever you get a chance, read and learn. There’s a whole wealth of words out there that the majority of people do not know, a whole treasure chest just waiting to be discovered. Here are some of the words that I found through reading Poe:
Quiescence –n. a state or period of inactivity or dormancy
Asphodel –n. an immortal flower said to grow in the Elysian fields
Desultory –adj. lacking a plan, purpose, or enthusiasm
Nepenthe –n. any drug or potion bringing welcome forgetfulness
At times, if you are like me, you will end up in a situation that you agreed to be in, but without fully understanding what you were signing up for and the risks involved. This is why a lot of wiser, older folks opine about experience being the best mentor; sometimes it’s good to listen!
Lace your shoes securely.
You will fall down a lot at first; the important thing is to not break any bones and to get back up and keep going. If you do this, eventually, one of the following will happen:
1. you’ll break something and have to stop.
2. your bum will go numb and you’ll stop minding the falls as much (preferable to option 1).
3. you’ll gain experience and fall less (the most preferable option).
Often, an inexperienced boarder (I have no idea who you might be thinking of!) will pick up speed, lose control, and wipe out. Learning to slow down takes practice but is 100% worth it.
Be sure to pack the right equipment for the activity (snow pants, beanies, snowproof gloves, etc.).
Learn from each mistake, and don’t get discouraged when you’re sore and it hurts.
Try not to collide with other people during your journey; it’s less fun for everyone.
We all make it to the bottom of the mountain eventually, just at different speeds and in different conditions; some arrive at the bottom on stretchers (not to be too macabre, but I did see it happen). Experienced boarders tend to arrive at the bottom quickly and with minimal injury.
Regarding chair lifts – sitting and waiting is a big part of snowboarding, and of life! It helps to have a friend alongside you, to pass the time.
While you’re sitting and waiting, remember to enjoy the scenery. Oooh! Ah!
To those who may be curious: no, I didn’t collide with anybody (whew!). I did fall a lot, but I returned safely home with no major injuries, and with more skill as a boarder than when I arrived. Since I had no experience when I arrived, this is not remarkable. Whether snowboarding is something I would ever do again, well…that is another post.