History Teaches Empathy At Its Best, Not Its Worst


As history continues to be devalued by today’s educational system, many prominent historians point to empathy as one of its many saving graces. John Lewis Gaddis writes that “Getting inside other people’s minds requires that your own mind be open to their impressions – their hopes and fears, their beliefs and dreams, their sense of right and wrong, their perception of the world and where they fit within it.” John Cairns adds that empathy “is the passport to gaining a genuine entry into the past as a foreign land, and something distinct from our time.” Through historical thinking, empathy allows people to unite themselves with mankind and to understand on a deeper level what it means to be human.

However, the teaching of empathy is not without its caveats. Paul Bloom’s recent book, Against Empathy: The Case for Rational Compassion, is one example. In an interview, Bloom explains,

“But empathy is surprisingly bad at making us good. It’s a spotlight focusing on certain people in the here and now. This makes us care more about them, but it leaves us insensitive to the long-term consequences of our acts and blind as well to the suffering of those we do not or cannot empathize with. Empathy is biased, pushing us in the direction of parochialism and racism. It is innumerate, favoring the one over the many. It can spark violence; our empathy for those close to us is a powerful force for war and atrocity toward others. It exhausts the spirit and can diminish the force of kindness and love.”

Interestingly, Bloom’s notion of empathy seems to be completely disconnected from the one that historians advocate. Historian John Fea clarifies,

“Empathy differs from sympathy. Empathy is all about understanding. It is an attempt to discover why a particular individual in the past acted in the way that he or she did. It might even mean exploring such actions in an attempt to grasp how he or she reflects the mentality of all of those living in that time and how such mentality differs from our own. Sympathy, however, carries a deeper moral component than empathy. The sympathetic person develops an emotional attachment – such as a desire for the other person to be happy – that can sometimes make empathy difficult and might even get in the way of an accurate historical interpretation.”

Fea’s notion of sympathy seems to align much more closely with Bloom’s idea of empathy. Perhaps poor definitions of these words undergird much of the controversy surrounding them. In fairness to Bloom, however, it is questionable whether empathy can truly be devoid of moral and emotional attachment. There is an emotionalism – or sympathetic bent (to use Fea’s definition) – that seems inherent to empathetic thinking. While this is certainly problematic for all of the reasons that Bloom describes, history provides its own solution: perspective.

The answer coming from historical thinking should not be a vain attempt to remove emotionalism and morality from empathy – or to discourage empathy altogether. Rather, it should be an increased emphasis on perspective (and the Five C’s that support it). Bloom himself believes that people should cultivate their ability to “stand back” in order to provide a more rationally effective program of care. Historical methodology is unique in its ability to do this. It teaches empathy at its best, not its worst.

History not only allows people to place themselves into the shoes of historical actors – to identify with them emotionally (even morally) – but also allows people to remove themselves from particular events, seeing them within the scope of a much larger narrative. It balances the immediacy of emotional attachment with the rationalistic nuances of a removed perspective. Today’s society, it seems, needs more of both. Perhaps, as more people begin to see the dangers of one approach without the other, they will increasingly turn to historical instruction for the solution.


Should History Be Used As A Christian Apologetic?


In 1931, Herbert Butterfield published The Whig Interpretation of History, a book that largely helped to define the historian’s craft. Although he uses more general terminology in condemning the practices of presentism, moral judgments, and oversimplification in historical study, it becomes apparent that Butterfield’s core definition of a whig interpretation of history is much more specific (and perhaps targeted) than his broad language implies. When mentioning the whig interpretation of history, Butterfield almost always associates it with people who see liberty as a desirable and perhaps naturally progressive end for mankind (especially Protestants). In this way, Butterfield attacks those who use the Reformation, French Revolution, etc. to explain the current benefits of a free society.

Butterfield’s work is helpful when contemplating the use of history as a Christian apologetic. His condemnation of those who use history to advance particular agendas – whether political, religious, or otherwise – is fitting. It becomes all too easy to distort a historical narrative when blinded by present-day motivations. Bookstores are filled with works by popular historians from all walks of life who succumb to these errors. Butterfield argues that history should be the “love of the past for the sake of the past.” On the surface, it appears that Butterfield’s historical methodology would preclude its use by those who are attempting to justify their religion.

Yet, Butterfield’s claim that history should be studied on its own terms is not without problems. One may question whether it is even possible to remove present-day motivation from historical analysis. After all, what use is history if it is not relevant in some way? Butterfield appears to be drawing a line in the sand that even he himself cannot help but fall short of.

The real question is not whether Christians should use history as an apologetic, but how. This is a much more difficult question to answer. Many conservative writers point to the numerous benefits that the Christian religion has had on society, thus affirming its truth and value – only to be criticized for excluding the religion’s darker historical elements. Others point to the providentialist vision of history, and how God’s existence is evident throughout world events. Although there is clearly some truth to these approaches, they nevertheless open themselves up to Butterfield’s criticisms – most of which are valid.

Perhaps the best use of history as a Christian apologetic is one that focuses on the humanity of those who compose it. History reveals that humans are complex, flawed, inventive, fragile, and moral creatures. It shows the value that humans possess (created in the image of God), the brokenness that they cannot free themselves of (sin), and their inherent longing for a better world (eternity). In other words, history is most effective as a Christian apologetic when it is left to itself – much in the way that Butterfield suggests.

This does not mean that Christians, and those who seek to defend their faith, cannot use history to support their claims. However, it does mean that history stands as an apologetic in and of itself, and does not need to be framed in “grand narratives” that verify certain points. True history – whether done by a Christian, a Hindu, or an atheist – naturally has existential implications. It should lead people to ask questions that go beyond historical study – to seek after truth that is relevant and personal. It should unite people to the human race and reveal the qualities that are inherent to all people.

For Christians, history can be applied as an apologetic by presenting it in a way that is loyal to the historian’s discipline. When conducted in this way, history is far more effective as an apologetic than when it is used to simply buttress argumentation. True historical study reaches to the human heart – a place Christians believe is not far from Jesus himself.

Can Calvinists Support Modern Education?


Historians tend to differ on the origins of “modern” education. Many point to the progressive philosopher John Dewey, whose educational reforms of the early twentieth century had a profound impact on the United States’ public school systems. However, even Dewey’s significant shifts in educational theory pale in comparison to the monumental changes that occurred during the Enlightenment era. Jean-Jacques Rousseau’s Emile, or On Education (1762) shaped western thought for decades after its publication. Even Dewey, writing over 100 years later, agreed with many of its premises.

Yet, the true shift towards modern education began even before Rousseau, with John Locke (1632-1704). The significance of Locke’s educational concept did not arise from methodological experimentation or trial and error (as much reform does today). Rather, it began with a shift in the theological articulation of childhood. Locke believed that the minds of children were like “blank slates” – an apparent break from the Calvinist concept of original sin. This nuance, according to historian Stephanie Schnorbus, carried significant epistemological implications.

Calvinists believed that the inherent sinfulness of children distorted their senses, making knowledge gained in this manner unreliable. It was only through God’s redeeming work that true knowledge could be obtained. As a result, information presented to children needed to be as clear and straightforward as possible. This was done primarily through catechisms and simple, written descriptions. Images were vague and open to interpretation, making them potential stumbling blocks for acquiring truth.

The Lockean view, however, came to very different conclusions. Schnorbus writes, “In contrast with earlier philosophers, Locke argued that all ideas came out of sensory perceptions. The only way to gain knowledge (and the fodder for thinking) was through the senses. … Sensory perceptions were not perfect, but the more there were, the better the chance for correct knowledge and eventual refinement of reason. People needed to love learning, so adults had to give children an appealing, sensory-diverse education to help them form correct ideals and a desire to learn at the same time.”

As almost all historians agree, it was the Lockean view of education that ultimately won out, becoming the predominant concept beginning in the mid-18th century. Even Calvinists came to embrace Lockean concepts. However, Schnorbus notes, “Calvinists who adopted the Lockean view of children would have been walking a fine line. To accept the view fully would have meant changing beliefs about original sin and the reliability of knowledge gained through the senses, in the end requiring a new way of seeing the relationship between God and man.” She concludes that Calvinists ended up using “one epistemology to teach the essentials of another.”

The modern implications of Schnorbus’ conclusions are fascinating. Do modern Calvinists recognize the unorthodox origins of their own educational practices? How do they reconcile the two?

It seems that there are three possible options:

1.) Modern Calvinists have a sloppy childhood theology. Even a brief search reveals that Calvinists seldom (if ever) write on this topic. Although a few Calvinist epistemological writings are available, they are not widely disseminated. Lay Calvinists are likely unaware that any contradiction may exist.

2.) Calvinist theologians are aware of theological and epistemological differences, and are seeking a return to more historical origins. This can be seen in The Gospel Coalition’s (TGC) push for catechetical instruction. As one of the leading organizations for Reformed theology, TGC seeks to influence laymen through spiritual, intellectual, and historical truth. In recent months, it has published a number of online articles advocating the use of catechisms. Perhaps an increased understanding of the theology at the heart of educational theory is leading them to the past for answers.

3.) The dichotomy between Lockean and Calvinist concepts of childhood is not as distinct as Schnorbus portrays it. Although most historians view Locke’s tabula rasa concept as a blatant contradiction of original sin, other historians, such as W. M. Spellman, raise questions about Locke’s perceived unorthodoxy. Spellman even suggests that Locke’s focus on moral education implies a depraved disposition. Whether this is true or not, it seems clear that several of Locke’s Calvinist contemporaries were open to reconciling many of his views. Even the 18th century’s biggest defender of Calvinist theology, Jonathan Edwards, once described children’s minds as being like “white paper,” and used many of Locke’s educational concepts.

 

In the end, perhaps there is at least a shred of truth to each of the options mentioned above. Educational concepts and practices are predicated on many presuppositions that often go unexamined. Calvinism itself is a complex branch of theology containing varying caveats and nuances. At the same time, modern Calvinism lacks figures who mirror the intellectual giants of its past. This seems to have been a natural process, as Calvinism distanced itself from accusations of close-mindedness and dogmatism. Nevertheless, there is a rich history in these educational and theological traditions that, for the benefit of modern society, should be explored and analyzed.

 

Teaching Historical Thinking in High School: A Case Study Using Jonathan Edwards


I recently had the opportunity to teach a group of 11th grade students about Jonathan Edwards and the First Great Awakening. Due to their Christian backgrounds (and the school’s ties to Reformed Theology), most students possessed basic prior knowledge of the topic. A short PowerPoint presentation in a traditional lecture format served to refresh their memories.

However, my efforts then turned to engaging the students in a deeper understanding of Edwards through the practice of historical thinking, a term popularized by Sam Wineburg’s book of the same title. My goal was at once to reveal the complexities of historical understanding and to shed light on the difficulties involved in the work of a historian.

To do this, I began by explaining how historians use primary and secondary sources to construct a narrative about a historical person or event. Next, I divided the students into pairs and gave each group a worksheet. At the top of the worksheet were two paragraphs:

Calvinism and Human Nature:

Calvinism is a major branch of Protestantism that follows the theological tradition and forms of Christian practice of John Calvin and other Reformation-era theologians. … Martin Luther, John Calvin and other Reformers used the term “total depravity” to articulate what they claimed to be the Augustinian view that sin corrupts the entire human nature. This did not, however, mean the loss of the imago Dei (image of God). … John Calvin used terms like “total depravity” to mean that, despite the ability of people to outwardly uphold the law, there remained an inward distortion which makes all human actions displeasing to God, whether or not they are outwardly good or bad. Even after regeneration, every human action is mixed with evil.

The Enlightenment and Human Nature:

The Enlightenment was an intellectual and philosophical movement which dominated the world of ideas in Europe during the 18th century. The Enlightenment included a range of ideas centered on reason as the primary source of authority and legitimacy. … John Locke was an English philosopher and physician, widely regarded as one of the most influential of Enlightenment thinkers and commonly known as the “Father of the Enlightenment.” … In Locke’s philosophy, tabula rasa was the theory that at birth the (human) mind is a “blank slate” without rules for processing data, and that data is added and rules for processing are formed solely by one’s sensory experiences. … As understood by Locke, tabula rasa meant that the mind of the individual was born blank, and it also emphasized the freedom of individuals to author their own soul. Individuals are free to define the content of their character—but basic identity as a member of the human species cannot be altered.

I read through each of these paragraphs with the entire class and emphasized their major points. Next, I turned students’ attention to the primary source excerpts by Jonathan Edwards printed at the bottom of the page. I instructed them to read these excerpts and answer the following questions:

  1. What words does Jonathan Edwards use in his depiction of children?
  2. How would you describe Jonathan Edwards’ writings about children? Is he harsh or gentle?
  3. Do you think that Jonathan Edwards had more of a “Calvinist” view of children, or an “Enlightenment” view of children? Why?

After giving each pair of students sufficient time to complete this task, I brought the class together again to review the answers. The discussion began slowly and with little interest. The students clearly thought that this task was too easy.

I called on a group to answer the first question, and they gave the following answers: “vipers,” “miserable,” “wicked,” “wild.” There were some confused looks. A different group responded. “Innocent,” “harmless,” “simple,” “white paper.” More confused looks.

I moved on to the next question. A group raised their hand and suggested that Edwards was mostly gentle in his description of children and seemed to deeply care about them. A boy in the back row retorted, “What? How can you say that? He called them ‘the devil’s children,’ and ‘stupid.’ He was very harsh to them.” Another student interjected, “Where do you see that? That’s not what my sources say.” The students began to exchange papers.

Quickly, I moved to the third question, asking students to raise their hand if they believed Edwards had a Calvinist view of children. Half of the students raised their hands. The other half believed that he was undoubtedly an adherent of an Enlightenment view. Questions began to pour forth. At first, students asserted that I must have given them the wrong worksheets; in fact, half of the pairs had received a different set of primary sources than the others. I affirmed to them that this was intentional, and that all of the excerpts were directly from Jonathan Edwards. Shocked at this realization, a student raised his hand. “So, why did Edwards change his mind?” “He didn’t,” I responded, “At least not as drastically as you may think.”

And so the question became: How can all of these excerpts be from the same person? Is it possible to explain them or reconcile them together?

Historical thinking.

Here are the excerpts that the different groups received:

Jonathan Edwards Excerpts:

As innocent as children seem to be to us, yet if they are out of Christ, they are not so in God’s sight, but are young vipers, and are infinitely more hateful than vipers, and are in a most miserable condition, as well as grown persons; and they are naturally very senseless and stupid, being “born as the wild [donkey’s] colt” [Job 11:12], and need much to awaken them. (“Some Thoughts Concerning the Revival: Ten Criticisms Answered,”)

There are multitudes of kinds of wickedness that children are guilty of. They serve the devil and behave themselves like the devil’s children. God is very angry with them for their sins. He is very angry to see their hearts so full of sin, to see them of such a wicked disposition. (“God is Very Angry at the Sins of Children,”)

Jonathan Edwards Excerpts:

Little children are innocent and harmless: they don’t do a great deal of mischief in the world: men need not be afraid of them: they are no dangerous sort of persons: their anger don’t last long: they don’t lay up injuries in high resentment, entertaining deep and rooted malice. … Little children are not guileful and deceitful; but plain and simple: they are not versed in the arts of fiction and deceit; and are strangers to artful disguises. (Religious Affections)

When they are young, they are newly come into the world and their minds, as to any prepossessions or prejudices of any judgement already formed or habit contracted, are like white paper: you may write or lay what colors you will upon it – though when once it is colored or written on, it is not so easily altered afterwards. (“Don’t Lead Others Into Sin”)

Although there is not enough room for an in-depth explanation of these documents (check out my M.A. thesis on Edwards and Puritan childhood), the deeper issue remains clear. Edwards was a complex man living in a complex time. His writings and sermons had specific purposes and were developed in response to particular events. Edwards also drew on allegory, typology, and biblical imagery to emphasize his points. He was at once a strong proponent of Calvinist orthodoxy and an avid reader of the “Father of the Enlightenment,” John Locke. We must view Edwards – and all of history – through this lens of understanding. True history cannot be simplified, and we must be wary of “scholars” who try to do so.

The students in the classroom caught on quickly to these truths. They came away not only with a better understanding of Edwards’ context and complexity, but also the context and complexity surrounding all people. After all, historical thinking is not only for history, but for today too.

 

 

Using Morality for Classroom Engagement


There may not be a better book regarding the practice of teaching history than Dr. Sam Wineburg’s Historical Thinking and Other Unnatural Acts: Charting the Future of Teaching the Past. It should be required reading for all social studies education majors. Although a post could (and perhaps should) be devoted to each chapter of the book, one section in particular stands out. Titled “Lost Words: Moral Ambiguity in the History Classroom,” it recounts a classroom observation in which public school teacher Richard Stinson engaged his history students with a moral question. In the midst of discussing “the rules that govern American society,” Stinson proceeded to ask:

“Is there any authority that transcends the Constitution?”

Students fidgeted at their desks, but no one uttered a word.

“Well,” Stinson continued, “What about moral authority or religious authority?”

The mere mention of religion set off a flurry of “oohs” and “ahs,” the students’ signal that a taboo had been broached. Paul, sitting in the front row, seemed to initiate a script that had been played many times before. “So, Mr. S., are you saying there is a God?” 

Wineburg goes on to explain how the classroom erupted in energy, as students bantered back and forth with Stinson and their peers. Several boys in the room boldly resisted the idea that citizens, especially soldiers, were bound to any sort of “divine authority.” Distressed by this thought, Stinson (himself a devout Christian) posed questions concerning Nazi war criminals and the tragedy at My Lai. The class ended with many of the students still not “won over” by Stinson’s historical and philosophical implications.

Although it may be easy to critique Stinson’s performance (perhaps a redirection to historical arguments of divine authority would have served him better), Wineburg rightly points out Stinson’s success “in creating an atmosphere where education is not ‘academic’ but instead is a process of debate, discussion, and questioning.”

Learning to Engage

Perhaps it is no surprise that many of today’s students see history classrooms as “boring” and “tedious.” This remains one of the biggest travesties in America’s modern educational system. In response, tools, activities, and other “gimmicks” continue to be developed for the purpose of better engaging history classrooms. However, Stinson’s example is a reminder that while these devices have their place, history teachers must first learn to engage students with the content itself. Participation is not an indicator of historical thinking. Instead, the goal must remain for educators to use historical thinking as the motivator for participation, a task that Stinson performs brilliantly (yet imperfectly).

Although much easier to conceptualize than practice, the implementation of moral questions into the history classroom may be the key. After all, morality is what makes us human. It is, in essence, a motivator in and of itself. Ultimate purposes, regardless of one’s worldview, are undergirded by moral justifications. A truly engaging classroom is one that appeals to a person’s humanity, and tugs at the strings of their heart. History, like theology and philosophy, is uniquely able to do this.

While personal morality may be the gateway to the past, it should not be confused with the past itself. Historical figures, even those to whom we can best relate, often thought vastly differently about the familiar concepts of “good” and “bad.” Students should be forced to interact with, and wrestle through, the implications of these various modes of thought – while simultaneously understanding the perils of making moral judgments.

The best method for presenting this moral content is perhaps one that is purposefully ambiguous. Stinson’s frustrating inability to directly convey the importance of moral absolutes may ironically have been his greatest strength. The detrimental effects of dogmatic religion teach us that people are more likely to disengage when moral assertions are forced down their throats. Moral questions are most compelling when students must contemplate them on their own, at least to some extent.

Conclusion

History teachers should not be afraid to guide and lead students in discussions of morality. It must be done purposefully and shrewdly, while drawing upon historical context. When moral content is presented somewhat ambiguously by the teacher, students are forced to think through moral implications and how moral decisions have affected historical events. In doing so, teachers will not only be promoting an engaged classroom, but much more importantly, a future generation that knows how to ask and solve moral questions about our world – something we seem to need now more than ever.

Making Moral Assertions: John Dewey and Race


A Guest Post by Alex Boggs: 

In my studies for doctoral classes, I came across several thinkers and educators who have substantively influenced the field. One such thinker is John Dewey. Dewey is a powerhouse of educational thought, particularly when it concerns skills-based curricula and lessons whose objective is to develop better-equipped citizens. He wrote in the early twentieth century after completing most of his training in the late nineteenth century. He impacted the field of education so deeply that scholars today still build on his research. Much like Vygotsky, Pavlov, and Skinner, Dewey has made a name for himself through his renown in the academy – to the point that some follow in his footsteps and apply his ideas directly to the current educational climate. Some take the praise of Dewey too far, as a few scholars have pointed out. Indeed, many in the education departments of the academy revere Dewey and operate as if they were his acolytes. For a budding historian, this raises a few eyebrows.

Dewey was an advocate for civil and social rights, and for improving the overall lives of children. These are certainly noble pursuits, worthy of praise and a positive reputation in the academy. However, when scholars so closely conjoin their present ideas to those proposed by Dewey in the past, a noteworthy problem arises. The result is a modern interpretation of Dewey’s thought that is anathema to the historian’s craft. At the heart of this thinking is a flawed sense of historical analysis.

Ultimately, it is wrong of scholars to provide a contemporary interpretation of Dewey’s pedagogy while ignoring its deeply-rooted historical context. In the field of history, this would be regarded as a blatant case of presentism. Most often this comes from scholars who assert that Dewey should have reflected more on race. Judging Dewey by applying present morals to his ideas or body of work is poor scholarship. Understanding the context in which Dewey lived is crucial to understanding the heart of what he advocated.

Dewey was first and foremost a staunch supporter of civil rights in the early twentieth century. As Susan Carle notes, Dewey was an influential part of the NAACP’s initial years and worked to promote access to quality education for all. An article by Sam Stack Jr. reveals that many people have misconstrued Dewey’s statements, suggesting that they applied only to those in affluent white neighborhoods. This is mostly false: Dewey himself was an advocate of multiculturalism as a fundamental component of developing a society prepared to engage with the challenges of the twentieth century.

Stack Jr. himself is typical of many modern scholars in lambasting Dewey’s lack of involvement in breaking down racial barriers with his writing. He notes,

We especially need to be wary of our devotion to Dewey when it comes to race. Given Dewey’s insistence that philosophy be informed by the context of ‘real life,’ it is dismaying that Dewey wrote very little about the contradictive role that race and racism play in lived experience.

This quote is somewhat troubling for a few reasons. First, it misunderstands the role of historical context in interpretation (whether because of Stack’s reading of Dewey’s contextualism or because of poor historical analysis). Second, despite Stack Jr.’s attempts to qualify it, the selection has implications that skew Dewey’s actual context.

Perhaps the best way to highlight why Dewey’s context of racism might not align with what these scholars expect of him is through John Steinbeck’s Travels with Charley. During his ventures in the South, Steinbeck came face to face with a brand of racism that he had not encountered before and did not understand. The virulent racism prevalent in the Jim Crow South was plainly evident to Steinbeck in late 1960 as he traveled through the region with his dog Charley. He summarizes his understanding of race by saying he was “kept out of real and emotional understanding of the agony not because I, a white, have no experience with Negroes but because of the nature of my experience.” Steinbeck admits that, because he grew up in a different social context, he did not adequately understand what he witnessed in a South undergoing the tumult of desegregation.

Steinbeck’s story pertains directly to an analysis of Dewey because it highlights the importance of context in discussions about race. Often, we are unable to see outside of our experience with race, or social issues, because our context makes us unfamiliar with others’ perspectives. This contextual foreignness takes on added importance in the realm of history, as we are even further removed from the lives of historical figures. Dewey commented quite frequently on the subject of race, but did so in ways that were embedded in his progressive mindset. The philosophies of Dewey were often couched in the socialist and Marxist principles contemporary to the late nineteenth and early twentieth century. His writings frequently refer to social reform in education, and his experience working to help build the NAACP and ACLU point to his advocacy of human rights. Dewey’s focus on class and social standing does not take away from the discussion of race in these contexts; it does, however, show that he had a different way of looking at the issues of race plaguing America.

Dewey’s engagement with racial awareness in the early twentieth century took more muted forms. Yet evidence points to his interest in race in America insofar as his expertise led him, which was education. In many ways, his advocacy of multicultural schools was an indictment of the homogeneity of many schools in the United States, not just in the American South. Still, scholars continue to ignore Dewey’s philosophical and geographic context (Dewey was from Vermont) by asserting that he said “virtually nothing about racism in American society.”

Dewey’s dealings with race, and his subsequent interpretation by modern scholars in education departments, raise two important questions with regard to morality. First, should Dewey have been more outspoken than he already was about race in the early 20th century? To this, I would say a reserved no, for all of the contextual reasons mentioned above. Second, should modern scholars show more empathy when looking at Dewey as a historical figure? Here I would give a resounding yes. After all, it is this emotional connection that allows students and scholars to better understand the context of people in the past.

[A Note by Russ Allen] Boggs’ article aptly reveals the complexity involved in a study of John Dewey. As he points out, the assertion by modern scholars that Dewey “should have reflected more on race” has both moral and historical implications. These implications cut much further and deeper than a simple analysis can convey. The Work of Redemption hopes to feature more of Boggs’ thoughts on this topic in the near future. 

 


Alex Boggs is earning his Ed.D through Liberty University. His research focuses on using documentary filmmaking for the purpose of promoting historical thinking in secondary school classrooms. Boggs also has an M.A. degree in 19th and 20th century European and American history. You can follow him on Twitter @jayfianakis.

Teaching Morality in History


Morality’s role in the work of a historian is a long-debated and messy topic. Scholars have cautioned against the practice of making “moral judgments” in both history writing and teaching. David Hackett Fischer suggests that in committing a “moralistic fallacy,” history becomes the handmaid of moral philosophy and present-day biases. Herbert Butterfield warns against making “pseudomoral judgments,” which often masquerade as moral ones, becoming “mixed and muddy affairs, part prejudice, part political animosity, with a dash of ethical flavoring wildly tossed into the concoction.” In essence, history becomes skewed by a moral perspective.

Although there is a significant amount of truth to these assertions, there appears to be a problem in the other direction as well – leaving morality out of history. Removing moral intent from the narrative leaves it utterly bleak, and detaches it from the inherent substance to which humans can most closely relate. Morality, after all, is the defining feature of humanity. It is what binds together people in the present with those in the past. A history devoid of morality risks falling into fatalism, dehumanization, and “leaving an unbridged gulf between the subject and the reader.”

Even self-proclaimed “secular” historians, devoid of any religious belief system, have realized the problems of detaching morality from history. In the wake of Watergate in the 1970s, historians became concerned that their removal of morality from historical narratives was partially to blame for the “eroding” moral quality of America’s most educated citizens. Gordon Wright reflects,

For a long time, of course historians comforted themselves with the thought that dispassionate value-free history would somehow secrete its own moral lessons, or would at least ensure that those who study it would be led somewhat automatically to sensible and judicious conclusions.

Despite agreeing with the intent of this approach, Wright cannot help but note that it “somehow leaves one vaguely unsatisfied.” He conjectures:

There are dangers built into all stances toward the teaching and writing of history, including the stance called perfect neutrality. … What too many of us have hesitated to do, I believe, is to take that final step – to risk a conclusion, to make a judgment, to advance and defend our view of how things were, and why, and what this meant to people of the time, and what it means to people of today. … 

…Perhaps it is a buried aspect of that old liberal heritage, so much maligned in our day; or perhaps it is a surviving spark of an evangelical upbringing. It has not yet driven me to the point of urging that we resurrect the label “moral science” as a category within which our profession might find its proper place. But it does impel me to think that for some of us at least, our search for truth ought to be quite consciously suffused by a commitment to some deeply held humane values. The effort to keep these two goals in balance may be precarious; but if we can manage it, perhaps we will be on the way to reestablishing the role of history as one- and not the least-of what we might fairly call the moral arts. 

Wright’s conclusion, while enlightening, is nevertheless vague. How shall historians and educators approach this “precarious balance?” Perhaps the most convincing answer comes from one of Wright’s predecessors, John Higham. In 1962, Higham wrote an article for The American Historical Review in which he expounded upon the role of the historian as a moral critic. Although controversial, Higham’s words are a welcome perspective that needs further consideration by modern historians – especially those who profess a religious faith. He contends that there is an important difference between making a moral judgment and being a moral critic, writing,

Let us beware of the easy temptations of moral judgment in essaying, the difficult adventure of moral criticism. Let us operate on any subject with a conviction of its dignity and worth. Let us grant to every actor in a moral drama the fullest measure of his particular integrity; let us not destroy the drama by hastening to condemn or to absolve. The serious historian may not wrap himself in judicial robes and pass sentence from on high; he is too much involved in both the prosecution and the defense. He is not a judge of the dead, but rather a participant in their affairs, and their only trustworthy intermediary.

To Higham, historians have a responsibility to lay bare the moral complexity of historical events and figures. Moral judgments are derived from moral standards – and rightly so – as any Christian will affirm. However, the problem with moral judgments is not that they are too overpowering, but that they are not deep enough. He notes,

But to try to lay down exact criteria is, I think, to misconceive our opportunity and to narrow our prospect. The historian is not called to establish a hierarchy of values, but rather to explore a spectrum of human potentialities and achievements. While maintaining his own integrity, while preserving the detachment that time and distance afford, he must participate in variety, allowing his subjects as much as possible to criticize one another. … 

... In the simplest sense, the historian commits to moral criticism all the resources of his human condition. He derives from moral criticism an enlarged and disciplined sensitivity to what men ought to have done, what they might have done, and what they achieved. His history becomes an intensive, concrete reflection upon life, freed from academic primness, and offering itself as one of the noblest, if also one of the most difficult and imperfect, of the arts.

If Higham’s advocacy of moral criticism is correct, then it has large implications not only for writing history, but also for teaching it. If morality is indeed a bridge between historical actors and present-day learners, perhaps a more concerted focus in this area will enhance student engagement. One need only look at the current state of American society to find that morality is just as important and relevant as it has ever been. By becoming true “moral critics” of history – examining all causes, perspectives, contexts, and worldviews – students will have a broader and firmer foundation on which to base their own morality. Perhaps, as Wright concludes, people will begin to “consider moral insight as something they can gain by skilled and patient historical study, not merely as something they cannot keep out of it.”

Can Theology and Historical Knowledge Make Society Moral?


Donald Trump’s ascent to the American Presidency has brought with it a slew of criticism from Liberals and Conservatives alike. Chief among these concerns is the idea that the United States is slipping into moral degeneracy, typified by the perceived “lack of virtue” exhibited by the man in our nation’s highest office. Ironically, much of Donald Trump’s political success is a reaction by Conservative Americans who feared Barack Obama’s “immoral” policies and the agenda of the political Left. Although Americans continue to disagree on what is right and what is wrong, the overwhelming narrative shows that almost all people are concerned with the morality of their fellow citizens. There is a general distress that further corruption will lead to destruction. This concept is not a new one, but rather has origins as old as the nation itself.

The United States’ founding fathers believed that the country’s success depended fully on the ability of its people to be moral and virtuous. Thomas Jefferson once stated, “No government can continue good but under the control of the people; and . . . . their minds are to be informed by education what is right and what wrong; to be encouraged in habits of virtue and to be deterred from those of vice.” Jefferson, along with other leaders, affirmed that morality and virtue could only be sustained through religion (made most apparent in the moral teachings of Jesus) and education (a thorough understanding of the world and its history). Perhaps America’s shift toward new forms of individualism and a bureaucratic industrial society during the 20th century pushed the once core principles of religion and historical education to the periphery, as is proposed in the book Habits of the Heart.

However, one must be cautioned before concluding that a refocus on these disciplines will produce a moral society. Elizabeth Fox-Genovese and Eugene D. Genovese’s Mind of the Master Class shows that this idea is much more complex. Their research reveals that white southern society in the U.S. during the 18th and 19th centuries was both theologically and historically proficient. They write:

Southerners needed to know from whence they came and where they were going. Hence, like other Americans, they turned to history as well as religion for moral guidance for nations and for individuals – for illumination of the rise and fall of empires and nations in consequence of incurring the wrath of God. …

… Southerners did not turn to history primarily for facts about the past but for illustrative accounts of admirable and despicable human behavior, the complexities of political and moral struggles, and the tensions between long-term linear and cyclical patterns. … 

… Separate departments of history came slowly to American colleges, with the South leading in the establishment of chairs of history. … In politics and public affairs conservative Southerners appealed to historical experience, not abstract principles, to determine the course of government. … 

… Revivals and camp meetings receded in the North while they long remained vigorous in the South. Methodists led a revival in 1826-27 that extended from Virginia to Georgia and then westward. Stores and schools shut down. …

… Rural as well as town folk often attended church several times a week: services on Sunday and perhaps prayer meeting and songfests on Wednesday; sometimes the church trials of backsliders on Saturday. … Ministers groused about people who seized upon some excuse to skip services. 

Southerners, especially those in the upper class, thought deeply about both history and theology. However, what makes this truth complex is the uneasy fact that, at the same time, they also partook in one of the world’s most brutal institutions – slavery.

Did southern states during the antebellum era subconsciously invalidate the words of their own Thomas Jefferson (who, ironically, owned slaves himself)? Are religion and education useless in producing a truly moral society?

The natural depravity exhibited in humans should be a caution that no society will ever be fully moral or virtuous. It is indeed an unattainable hope for this world. However, this does not mean that theology and history are useless for producing these desired qualities, generally speaking. Although there are numerous complexities surrounding the “southern mind” and its attitudes toward slavery during the Antebellum era, the example is revealing. It shows modern Americans not that theology and history are useless, but that more attention must be paid to how they are used. Religious inclinations are troublesome when they fall short of disinterested benevolence (to use an Edwardsian term), and historical knowledge is dangerous when it is devoid of empathy.

In order to restore morality and virtue to a degenerating modern society, two things must take place. First, American Christians (and other religious practitioners) must emphasize a theology of radical, self-sacrificial love. It must be a faith not driven by the defense of certain norms, but rather the pursuit of ultimate ones. Second, American educators must teach students from an early age the skill of sound historical thinking, because as Sam Wineburg writes, it is an inherently “unnatural act.” History cannot be used to advocate or defend certain agendas or positions, but must rather be examined on its own terms. Students must learn to empathize with historical characters – both those they personally like and dislike – in an effort to gain nuance and perspective.

While the current condition of the United States’ societal morality is unclear, there is no doubt that it can be improved. Further knowledge of theology and history – and how to practice them – may be a major solution, but we must proceed with a great deal of caution and humility, lest we point out the twig in the eye of our brothers of the past while ignoring the log in our own.

Of Two Spirits: The Resolutions of Franklin and Edwards


Each new year brings with it a spirit of self-reflection and resolution. Many Americans take up the noble mantle of personal improvement, and set goals that they believe will make them better family members and citizens.

The concept of “individual resolution” is not a new one. In 18th century North America, it was common for educated people to draw out these ideas in diaries and journals throughout the year. Two of the most noteworthy men to compose such lists were polymath Benjamin Franklin and New England reverend Jonathan Edwards.

As Bancroft Prize-winning historian George Marsden notes in his book A Short Life of Jonathan Edwards, both Franklin and Edwards were well-known contemporaries who each left their respective imprints on the fabric of American society. As transitional figures, the men largely stood apart from others in their thinking and actions. Franklin was a prominent Enlightenment thinker who often found it difficult to completely abandon his Calvinist ties. Edwards, on the other hand, was a staunch Calvinist who sometimes adapted Enlightenment ideas to the orthodoxy of his own worldview. It is this unique dichotomy that makes Franklin and Edwards fascinating parallels of one another.

In studying these two figures, perhaps there is no better point of comparison than their respective “resolutions.” Historian George S. Claghorne writes,

Scholars have long compared Edwards’ and Benjamin Franklin’s resolutions. In addition to arguing about Edwards’ and Franklin’s respective skills and significance as autobiographers, scholars have discussed the two men as philosophers, scientists, and religious commentators. They have seen in these representative figures two sides of the Enlightenment, as well as the different patterns of the American character. 

Franklin’s resolutions in his Autobiography stand in interesting comparison with Edwards’. Both men agreed on the value of making resolutions, evaluating their effectiveness, and following them lifelong. And the resolutions show that the two were united on the importance of speaking the truth, living in moderation, helping others, and doing one’s duty. Each counseled himself (and others) to avoid sloth, make good use of time, cultivate an even temper, and pray for divine assistance; and each offers an energetic, thoughtful approach to life.      

Beyond these similarities, however, the two differ greatly, and the resolutions reflect this. Franklin was satisfied with only thirteen resolutions, while the earnest Edwards drew out his list to seventy. They also differed in spirit and purpose. Franklin represents the Age of Reason. His emphasis is on this world and the preparation of a good citizen. His “Resolutions” were brief, epigrammatic, and eclectic. Jesus and Socrates equally merited imitation. Prayers were an afterthought in Franklin’s daily practice. In contrast, Edwards remained the exemplar of Puritanism, depicting himself, along with all humans, as weak and sinful, helpless without divine intervention. Because the ultimate intention of the “Resolutions” was to produce a soul fit for eternity with God, they served as a set of practical day-to-day guidelines for achieving that end. Edwards adjured himself to study the Scriptures above all other books and to pray steadfastly; Jesus was to be trusted as Lord; God was present, personal, and primary.

For Benjamin Franklin, resolutions were not only about self-improvement, but more importantly, self-fulfillment. Designed in a “stepping stone” format, Franklin believed that the gradual mastery of each “virtue” would increase his happiness and ability to thrive in life. For Jonathan Edwards, however, resolutions were instructions and maxims that should be followed all at once. Although the abundance and specificity of these resolutions may seem overbearing (and perhaps they were), Edwards did not view them as “pious hopes, romantic dreams,” or “legalistic rules.” Rather, they were a daily window to introspection, pointing him to the sustaining strength of God who would enable him to live up to them.

Although the days of Franklin and Edwards are long past, the dichotomy made apparent in their resolutions is very much alive. While one is unlikely to encounter a person whose resolutions match those of these two men, most resolutions can nevertheless be seen in light of their legacies. Some are made in the spirit of virtue and self-fulfillment, and others in the spirit of introspection and dependency.

As an evangelical Christian, I hope that none of my resolutions rely on my own power or promises of a happier life. These, I believe, will result only in disappointment. Even “noble virtues,” such as the ones Franklin enumerates, can become shallow idols. Edwards instead insists that “true virtue must chiefly consist in love to God” – a factor quite beyond the capacity of the self to achieve – an issue of the heart.

Indeed, it is the heart that truly needs to be resolved. And it is with this quandary that all resolutions must first deal.

Guns in College: Historical Context and the Dangers of Presentism


Last school year, President Jerry Falwell Jr. of Liberty University made a series of controversial statements concerning concealed firearms on the college campus. The first came in December, when he urged students to carry guns for self-defense against potential terrorist attacks. This came in the wake of the devastation in San Bernardino, CA. Falwell remarked, “If more good people had concealed-carry permits, then we could end those Muslims before they walked in and killed them.” Although poorly worded, Falwell’s point was clear. The Second Amendment, established by our nation’s Founding Fathers, permitted the use of guns for self-defense. To Falwell and other staunch Republicans, this logically applies to college campuses as well. Following suit, Falwell’s second controversial statement was his announcement that all Liberty students twenty-one years and older were permitted to keep concealed firearms in their dorms. The rationale behind this new policy can be seen in Liberty’s new educational video below:

With such controversy surrounding concealed carry at Liberty University and various other schools, an interesting article caught my eye, titled “Thomas Jefferson and James Madison Didn’t Want Guns on their College Campus.” Here is an excerpt:

In October of 1824, Thomas Jefferson and James Madison attended a board meeting of the University of Virginia, which would open the following spring. Jefferson and Madison had spent not a little time thinking about individual liberties. But minutes from the meeting show that their new school would not extend the right to bear arms to its red-brick grounds.

“No student shall, within the precincts of the University, introduce, keep or use any spirituous or vinous liquors, keep or use weapons or arms of any kind …” the board declared. 

Many quotes about the relationship of private firearm ownership and freedom — often deployed online by gun rights proponents — have been misattributed to Jefferson, or misconstrued. Jefferson is widely credited with saying, “The beauty of the Second Amendment is that it will not be needed until they try to take it,” but archivists at Monticello have found no evidence he made this statement. He is on record as having written, “Let your gun therefore be your constant companion of your walks” in a letter to his nephew. But historian Saul Cornell thinks Jefferson was most likely contemplating slave rebellion, not the need for self-defense from criminals.

I must admit that the arguments and evidence presented in this article are convincing. The history appears to be sound. Clearly Jefferson and Madison thought long and hard about what should and should not be allowed on a college campus. However, as a young historian, I am left to wonder about the context surrounding this ban on guns. When examining the transcription of the Visitors’ Minutes, it becomes clear that the Board desired a strict institution of higher learning, devoid of all potential distractions. Just five paragraphs before the ban on weapons is another statement that prohibits firing guns due to their loud and “disturbing” noise (pg. 6). If this is any indication, it is possible that Jefferson and Madison’s gun ban owed more to guns’ distracting nature than to their dangerous potential (although the board clearly understood the aggressive nature of its young students, as its prohibition on fighting with weapons shows). More support for this idea is found in the immediate context of the ban. Weapons are listed alongside “spirituous or vinous liquors,” servants, horses, dogs, smoking, and chewing as things not permitted on school grounds – each of them hardly life-threatening.

While my proposition above is mere conjecture (a deeper understanding of early American gun culture would be helpful here), the larger lesson to be learned is the importance of historical context and the dangers of presentism. Thomas Jefferson and James Madison lived in a much different world than we currently do. Using Jefferson as an apologetic either for or against concealed carry on college campuses is not only poor logic, but poor history. Given the modern threat of terrorism and mass shootings, who is to say whether Jefferson and Madison would see guns as a necessary tool for self-defense? After all, by historian Saul Cornell’s account, Jefferson advocated carrying firearms for protection against slave rebellion – would terrorism be any different? We will never know the answer. While the actions and beliefs of the Founding Fathers make for interesting conversation and provide helpful background knowledge, in the end we are left to our own devices in determining the best policies for our institutions.