Accessible American History: Connecting the Past to the Present
Ebook · 836 pages · 9 hours

About this ebook

In a comprehensive series of essays – addressing topics from the time of Columbus to the COVID-19 pandemic – Paul Swendson does in written form what he has spent more than twenty years doing as a community college history instructor: making American history “manageable, meaningful, and relevant” for everyday people. In addition to breaking down the fundamental topics of American history in a concise, easy-to-read fashion, this is a work of political and social commentary, relating the experiences, struggles, and decisions of past Americans to life in the United States today. As stated in the book’s introductory essay, “For if history teachers – and historians for that matter – make no effort to draw lessons from the data and to bring the facts to life, then we are merely engaged in a trivia exercise.” In the end, the goal of this book, like all good history teaching and writing, is to help its readers become a little wiser, and raising the essential questions is often more important than providing the “right” answers.

This book is ideal for anyone who is looking to get an overview of the basics of American history. It can also be a very effective supplemental reader in an American history survey course, stimulating classroom discussions that go beyond just learning the "facts." The author himself is currently using this book in his history courses, and many of the essays have evolved through his personal experiences working with junior high, high school, and college students. And since many of these students have not been history enthusiasts, the author has worked as hard at making the material engaging as at ensuring its accuracy.
Language: English
Publisher: Cypress Books
Release date: Dec 30, 2020
ISBN: 9780985000288


    Accessible American History:

    Connecting the Past to the Present

    Paul Swendson

    Third Edition

    Cypress Books

    Cypress, CA

    Copyright 2020

    ISBN: 978-0-9850002-8-8

    Opening Words:

    Reflections on Studying and Teaching History

    1. How and Why History Can Matter

    Writers of historical scholarship study primary source materials and use them to present original historical information. By that definition, this book is not a work of historical scholarship. It is instead a work of historical education, a type of book that I am the most qualified to write. Rather than focusing on original historical scholarship, I have chosen to be a community college instructor, a job that requires a different skill set. For while it is important for a teacher to develop and continually enhance his or her academic knowledge and skills, a good teacher, more than anything else, is a combination of a performer, organizer, coach, sage, critic, and enthusiast. A great scholar is not necessarily an effective teacher, and vice versa.

    As a community college history professor, I have devoted my career to organizing and attempting to explain information produced by historians. I see myself as a middleman relaying this information to students who, by and large, are unlikely to spend much time reading in-depth historical scholarship. Let’s face it. Much of what is called historical research is produced almost exclusively for historians and hard-core history buffs. If historical information is to have any impact beyond this small community of history lovers, then we need effective teachers who can make this stuff interesting and relevant for the general public. This is where I fit in.

    In the community college survey courses that I have taught for almost twenty years, I try to help students learn the basics about all of the major events and trends in American history. There is therefore not enough class time to study any one topic in nitpicky detail. In fact, if students are overloaded with information, they tend to learn less than if a smaller amount of material is covered. Unfortunately, many people associate history with the memorization of a mass of random, disconnected facts. Because of experiences in past history courses where the material was not broken down in a manageable and meaningful way, many people have been cursed to forever hate history.

    Being overwhelmed by random data, however, is only part of the problem. Even if the material is presented in an understandable way, many students will still see the study of history as pointless. Why do they have to hear stories and memorize facts about a bunch of dead people from the past? This is where the history teacher faces the most challenging and important task. We must help students to make some sort of a personal connection to the material, to try and imagine themselves in the situations faced by past human beings and recognize that in some ways, we all face similar experiences and challenges today. History never repeats itself completely, and we can never quite see the world through the eyes of people from other times and cultures, but there are general themes and patterns that seem to keep popping up. Also, there are certain fundamental traits that make up our human nature that remain consistent over time. People in the past, like today, enjoyed sex, tried to accumulate wealth, cared about their families, made stupid mistakes, and often feared change. By learning about the experiences of other individuals and societies trying to make their way in the world, we can learn lessons that will maybe make us a little wiser and more empathetic. In the end, history is the study of humankind’s collective memory. All of our knowledge, whether personal or collective, is based on past experience. As individuals, we know that red lights mean stop, jumping off of a cliff can be painful, and politicians occasionally lie because of things we have been told or have personally experienced in the past. If we lose our memories, then it is difficult to function.

    Some would say that this analogy between individual memory loss and historical ignorance works better in theory than in practice. Obviously, if you forget how to swim and fall into the water, then things will not end well. But does not knowing about some famous date, person, or event from hundreds of years ago have any impact on your life? I do agree, believe it or not, that many historical facts in themselves are irrelevant. Good historians and history teachers, however, are not primarily interested in compiling, memorizing, and presenting trivia. The goal is to draw more general, applicable meanings from all of the facts. If history is done correctly, then a student of history will gain a better understanding of how the world used to be and of how it became what it is today. The only way, after all, to make sense of the present is to turn to the past.

    Studying history can also have another practical benefit. To be an informed citizen, a certain amount of historical literacy is required. When reading newspapers, magazines, books, or web pages that seek to be informative, references to famous people and events of the past will inevitably be made. In most cases, the writer will not bother to define or describe well-known historical references. If the Cold War is mentioned in a publication that is directed to a reasonably informed audience, the writer will not say, “This, by the way, was an indirect and undeclared war between the Soviet Union and the United States.” The writer assumes that you know. Can you imagine a writer today bothering to stop and describe for readers the 9/11 attacks, Tiger Woods, Hurricane Katrina, LeBron James, or the coronavirus? If an American has not spent several years in a coma, then he or she will probably know at least the basics about these events and people.

    A person without a certain amount of historical literacy will have a hard time getting the full meaning out of writings about history, politics, economics, current events, or many other subjects. History is unavoidably integrated into the study of virtually any subject, so historical illiteracy can contribute to a more general academic and cultural illiteracy. If the ultimate success of our democracy rests with an informed voting public, historical and cultural illiteracy can be disastrous. So much of the battle over hearts and minds that we see in our country and world is a competition over which interpretations of the past will predominate. Whoever wins this competition will then have the greatest influence on our views of the present. In the course of this battle, history will often be twisted and misused to suit the needs of various powerful interests.

    After twenty-five years of teaching history at various levels, I am more convinced today than ever of its relevance. People with basic historical literacy will not be so easily talked into questionable ideas. In a nutshell, studying history can make a person wiser. Too often, education is seen merely as a means of getting job skills. In recent years, as our country has faced some tough times, it is clear that our main problem is not a shortage of job skills. Instead, it is a shortage of wisdom. God help us if a college education degenerates into nothing more than a job-training program creating workers who have little interest in or understanding of the circumstances that have shaped our country and world.

    So if you are interested in reading historical essays compiled by someone who has been out there in the community college trenches seeking to make American history manageable, meaningful, and relevant for the average 21st-century American, then you have come to the right place. Essentially, my essays on various history topics are lesson plans converted into written form. And in all my lesson plans, I make a continuing effort to show how the events of the past relate to the modern world. For if history teachers – and historians for that matter – make no effort to draw lessons from the data and to bring the facts to life, then we are merely engaged in a trivia exercise, helping students memorize enough to get their three units, forget the stuff immediately after the final, and move on to subjects that they see as actually relevant. Like journal articles and books rotting away in an old library, the hard work will not amount to much of anything.

    2. The Problem with Big Books

    I have trouble resisting used book sales, the clearance rack at bookstores, or my personal favorite: book giveaways. I have a particular weakness for big books. It is difficult, after all, to resist a low cost-per-page rate. These big books, however, pose a couple of problems: they take up a bunch of the increasingly precious space on my bookshelves, and, more importantly, they tend to mess with my self-esteem. 

    As each big ass book is added to the collection, I become more painfully aware of how much I have not read. They will just sit there for months or years on end, collecting dust as they mock my lack of time and knowledge. Of course, I can try and justify myself in resistance to their mockery. Taking on a big book, after all, is a major time commitment. Plus, if it eventually becomes clear that a book does not deserve the time, I have lost many precious hours that could have been spent on other literature. This is why I often gravitate toward magazines or collections of historical essays. I can get quick doses of information on a wide variety of topics, and as a teacher of history survey classes, that is probably the most productive thing that I can do. Sure, I might not be studying things in much detail, but who can remember all of those details anyway?

    I also find the authors of big books intimidating. How the hell did people find the time and amass the knowledge necessary to compose these giant things? The people that I find particularly impressive are all of those authors who managed to write these massive works before the era of word processors. How could a single individual write and revise Moby Dick, War and Peace, or The Brothers Karamazov with just pen and paper? They either got some serious writer’s cramp composing multiple drafts, or they did a lot of erasing, crossing out, or scribbling revisions in the margins. Plus, they did all of this work knowing that the pages could be easily lost. It would be a bit frustrating, after all, if Melville’s home caught on fire, Tolstoy’s dog ate his draft, or Dostoevsky spilled some vodka all over his manuscript. I wonder how many fantastic pieces of literature were lost or destroyed throughout history from simple carelessness or just bad luck?

    Of course, it’s also possible that Tolstoy cranked out War and Peace and published the first draft because it was too much of a pain in the ass to go back and do a lot of revising. In addition to saving time and trouble, the publisher would have the added benefit of charging more for the book due to its thickness. Then, decades into the future, an overly ambitious book buyer like myself might be unable to resist picking it up if it was offered at a discount rate. How could I resist buying War and Peace for only five bucks? It’s such a low cost-per-page rate, and it would look impressive sitting on my shelf.

    3.  Less is More

    One of the most valuable things that I learned when I was getting a teaching credential many years ago was the concept less is more. Twenty-five years of teaching have further strengthened my faith in this key concept. The idea is as simple as it sounds. When students are expected to learn a large amount of information in a short time, they may become overwhelmed and end up learning very little. By contrast, if a teacher breaks things down to a few essential ideas and covers these concepts thoroughly, students will have a better chance of retaining the information. Less material covered can lead to more material actually learned.

    This concept is particularly important in a history course. One of the reasons many people dislike history is that they view it as a mass memorization exercise in which they are expected to absorb various facts about famous events and (mostly dead) people. It is apparently nothing but a training course for either completing future crossword puzzles or impressing friends while watching Jeopardy. So if a college history professor dumps a one-hundred-question multiple-choice test on his or her students – a test filled with obscure, very specific dates and names – it will reinforce students’ frustrations with this subject. Memorization, of course, is an inevitable and essential part of any type of learning, and history teachers should expect students to memorize certain key facts. The trick is separating the key facts from the trivia.

    Here is a simple example of how less is more can work. World War II may be the most studied and discussed topic in American history. One can easily spend a lifetime studying nothing but this subject. I have met several people over the years, in fact, who could tell you practically everything about World War II and very little about anything else. (The History Channel used to be nicknamed the World War II channel.) We don’t have a lifetime, however, in a community college American history survey course; in fact, we only have three or four hours to spend on World War II. So how do you narrow down the mass of material that could be talked about into just a few hours?

    For every topic I cover, I try to break things down to a few key questions. My PowerPoint outline is then built around these simple questions. For World War II, the core questions are similar to those for every major war I cover, from the American Revolution to Vietnam. First, who was fighting and why? We then do a quick overview of the motives behind German and Japanese expansion. Second, how did the United States respond when World War II started, and what circumstances eventually led it to enter the war? We then talk about the reasons why the United States initially stayed out of the war and the process by which it gradually became involved, culminating in Japan’s attack at Pearl Harbor. Third, who won this war and why? We then talk about the advantages that the Allies had in the war, focusing primarily on their ability to produce more than the Germans and Japanese. Next, we discuss the impact that the war had on people living in the United States, looking at the economy, federal government power, and social change. Finally, we finish this topic by discussing issues that the United States had to deal with when the war ended, with the most important being the threat posed by its former ally: the Soviet Union.

    There you have it: World War II broken down into a single paragraph. And if you think that is impressive, you should see what I do during test reviews. Typically, during the class before a test, I go over everything that will be on the test in about ten minutes. Obviously, this is not done in tremendous detail, which is kind of the point. I want students to get the big picture straight in their minds before they worry about memorizing some of the details. They need to see that history is a story, and if you break it down to certain key questions, it is much easier to make sense out of that story.

    After conducting our review, I encourage them to predict the questions they are likely to be asked. All they have to do is look at the essential questions that we have covered for each topic. The test questions are right there in the class notes. They just need to take the main headings in the PowerPoint outlines and turn them into questions.

    Now I could give other, more detailed examples to explain the concept of less is more, but I think that I will stop at this point. Less, after all, is more.

    4.  The Case for Social History

    People often say that they don’t like history and that studying it is pointless. Over the years, I have come up with some different responses to history haters, and this essay will focus on one. When people say that they do not like history, it is important to ask them to define the term history. Many, I suspect, would say something about the memorization of important dates, events, and people. Then, if you were to ask what they mean by important, they would probably refer to the politically significant: wars, kings, presidents, revolutions, etc.

    When history is defined in such a limited way, it is no wonder that many people find it uninteresting and pointless. The overwhelming majority of people who have ever lived, after all, were not famous or politically important. They, like most of us, were average people focused on their day-to-day lives, so, like many people today, they did not spend a lot of time thinking about politics. During a crisis or an election (if they lived in a country that had them), politics could seem important. For the rest of the time, however, they were thinking about survival, getting ahead, having fun, raising families, establishing relationships, and doing everything else that people who have lives do. If politics does not draw much attention in the present, then the politics of the past must seem even more boring and irrelevant.

    But what if people could see that the everyday lives of average people are subjects that are as much a part of the field of history as all of that political stuff? Would history come more to life when topics were addressed that personally related to all of us average people? I used to get frustrated in history classes because I had trouble connecting the material that was covered to past people’s lives. It was only when I started taking classes that focused more attention on social and economic history that I started to really enjoy this field that has become my career. These subfields of history, after all, discuss the subjects that consume most of our lives. Since most people were (and still are) average, I tend to think that more historical attention should be focused on them. In the past and present, however, the average is not, by definition, news, so it tends to get left out of the story.

    There are two books listed at the end of this book that do a terrific job of bringing to life the day-to-day experiences of the average people of the past. One is More Work for Mother by Ruth Schwartz Cowan. I read this book when working on my master’s degree, and I give a brief summary of it every time that I teach a unit on the 1920s. What she basically does is give an overview of how new technological developments over the course of American history revolutionized housework. These include either the invention or increased availability of technologies such as washing machines, vacuum cleaners, refrigerators, furnaces, running water, and fancier stoves with multiple burners. (Can you imagine life without any of these?)

    The assumption, of course, is that all of these technologies make housework easier and less time consuming for the entire family. This is only a half-truth, however. When we were a mostly agricultural society, all family members had certain tasks to perform at home based on a traditional division of labor system. Men worked the fields, gathered the water, and chopped wood for fuel. Women took more responsibility for the indoor tasks of cooking, cleaning, and childcare. Today, in our industrial society, the traditional housework of men has been basically eliminated by technology. When my family needs heat, we push a button on the thermostat. Water comes out of the faucet, and food comes from the grocery store. Men began doing their work outside of the home, earning money in order to afford these labor-saving technologies.

    Women’s traditional housework, however, has not disappeared. They still have to cook, clean, and take care of the kids. Technology, of course, has made some of these tasks easier. Vacuum cleaners help clean floors, and washing machines save an enormous amount of time and energy doing laundry, so it seems that housewives would now have more free time. The problem, however, is that the new technologies also bring higher standards. Because washing machines exist, clothes are expected to be cleaner. If you have a vacuum cleaner or a modern stove, you are supposed to use these things. Doing these tasks may take less time, but they are now done more often in order to maintain these new standards. In addition, many women needed to work outside of the home in order to afford these labor-saving devices. Today, some people have realized that so-called traditional notions of what constitutes men’s and women’s work are outdated. We are not farmers anymore. Some men, however, are not so enlightened, and as a result, they have some worn-out, stressed-out wives.

    Another fascinating book – a book not directly related, however, to American history – is Ancient Inventions by Peter James and Nick Thorpe. As the title indicates, this is a historical overview of the earliest known evidence of technologies that we often associate with the modern world. To give you an idea of the book’s scope, here are some of the topics: medicine, sex life (my personal favorite), cosmetics, urban life, military technology, high tech, drugs, and sports. One thing this book does well is show how technological progress does not happen in a continuous, straight line. In other words, societies that existed thousands of years ago could be more advanced in certain ways than those that existed a few hundred years ago. The cities of the Indus Valley Civilization and of the Minoans of ancient Crete, in fact, would provide an improvement in living standards for many people today. But what this book also does is bring to life the day-to-day lifestyles and activities of the ancient world. And for me, the history of toilets, chewing gum, condoms, plastic surgery, automatic doors, beer, and other ancient technological innovations is more interesting than the genealogies of royal families or the specific details of presidential elections. I can personally relate to these technologies that impact my life more on a daily basis than most of the actions of any politician. In my Early World Civilizations class, I read excerpts from this book each time we complete a discussion of a civilization. In many cases, students find this much more interesting than the more conventional curriculum (especially when I describe the Kama Sutra).

    It is my fundamental belief that everyone likes history. If there is any subject that a person finds interesting in the present, then they will likely be curious about its history, whether it be skateboarding, sculpture, jellybeans, video games, sex toys, or anything else. Then, if you can capture their attention with the social history, you might have a better chance of keeping them awake for the political stuff, particularly if you can help them see its relevance. Of course, if a person has no interests, then he or she might be out of a history teacher’s reach. But people like that, if they actually exist, probably hate every subject.

    5. Can Historians Do Science?

    "Many people who celebrate the arts and the humanities, who applaud vigorously the tributes to their importance in our schools, will proclaim without shame (and sometimes even joke) that they don’t understand math or physics. They extoll the virtues of learning Latin, but they are clueless about how to write an algorithm or tell Basic from C++, Python from Pascal. They consider people who don’t know Hamlet from Macbeth to be Philistines, yet they might merrily admit that they don’t know the difference between a gene and a chromosome, or a transistor and a capacitor, or an integral and a differential equation."

    Walter Isaacson, from The Innovators

    This quotation partially applies to me. I enjoy and appreciate the benefits that science and technology bring, and I admire the people who have made and continue to make these wonders possible. I also talk at length about the economic and social effects of technological innovations in the history courses that I teach. But when it comes to understanding the scientific and engineering principles that underlie these amazing innovations, I do not have a clue, and I have been generally okay with my ignorance, chalking it up to a brain that is much better at understanding the abstract than the mechanical. I love to use computers and the internet, but I have no clue how the hell either of these things really work.

    When I started reading The Innovators by Walter Isaacson, I had no intention of trying to understand the intricacies of computer hardware and software. I just wanted to get a general understanding of the basic timeline in the evolution of computers and the internet. This way, I could accurately place these events in their general historical context and sound like I know what I am talking about when we discuss the Information Age in my classes. I was reading this, therefore, as a history book, not a computer science book.

    The only problem, as I found out early in the book, is that it is impossible to understand the history of computer development without knowing at least the basics of how a computer works. As I was reading his detailed descriptions of the earliest mechanical computers; the inventions of transistors, microchips, and microprocessors; and the packet switching that makes the internet operate, I struggled to get a handle on what the hell he was talking about. My first instinct was to do what I have often done when a history book forced me to think about science or engineering: just try to get the gist of what it is talking about and move on. But in this case, I actually found myself caring about the mechanics, and before I knew it, I was browsing websites that tried to explain how a transistor works and how these could be tied together into circuits. For probably the first time in my life, I actually found engineering to be interesting, and one thing became clear fairly quickly: if you really want to understand the engineering, then you also have to go back and understand some very basic scientific principles.

    It might take a lifetime, an extremely high IQ, or both to really get a grip on how a computer actually works. But there is no inherent reason why a social scientist like myself cannot at least get a general idea of what is going on inside of the machine. Few of us soft scientists and liberal artists, however, ever care enough to try, either because we have bought into the modern notion that knowledge must be compartmentalized or because science, math, and engineering are just too damn hard. When I switched from computer science to social science all of those years ago, after all, it wasn’t just because social science was more interesting. It was also a hell of a lot easier for me.

    What makes history such an endlessly fascinating topic to study and an intimidating subject to teach is that it is basically the study of everything. Because every human endeavor has played a role in shaping the past, a competent history teacher – particularly a teacher of broad survey courses – needs to have at least a working knowledge of a wide range of topics. Since I will never come remotely close to knowing everything, I constantly struggle to figure out how to use the limited time that I have for reading, listening to podcasts, watching documentaries, or doing whatever I can to be less ignorant. So how much time should I spend trying to get a working knowledge of math, science, engineering, computer science, and other technical subjects? Whatever choices I make in the future, I need to shake off two basic tendencies: first, the belief that it is acceptable for a historian to know little about subjects outside of the social sciences; and second, the belief that my brain is not hardwired to think in certain ways. If I expect people who struggle with history to pass a college level course, then I should be willing to struggle with stuff that I find difficult too.

    6.  Should History Teachers Express Their Opinions in Class?

    As the title of this book indicates, this collection of essays is as much about contemporary issues as it is about American history. In my view, historical study is of little value if it does not shed some light on the world of the present and teach us lessons that will help us to be a bit wiser in the future. Connecting the past to the present, however, is an inherently subjective enterprise, and when a history teacher like myself addresses contemporary political issues, I open myself up to being accused of bias, of pushing a personal political agenda. Some would argue that presenting personal opinions is not the role of a history teacher or of educators in general. Others only object when the personal opinions of teachers contradict their own. The following essay addresses the often-expressed claim that there is a liberal bias in education. More generally, however, it addresses the issue of history teachers expressing personal opinions of any kind:

    I have heard it said for many years that there is a liberal bias in higher education institutions. Apparently, colleges are filled with a bunch of ex-hippy – and sometimes still-practicing hippy – professors teaching young people why they should hate Republicans, corporations, religious conservatives, and, in some cases, America in general. I have even heard stories of a growing movement of conservatives who are compiling lists of liberal educators that they can use as ammunition in their attempts to bring more balance to American colleges and universities. (I guess that they want to make them fair and balanced, like Fox News.)

    A part of me wishes that people pressuring educational institutions to be more politically balanced would get their wish, and colleges would actively seek more conservative educators in order to correct this liberal bias. It would be entertaining, after all, to watch colleges enforce this policy. First, a college would have to conduct some kind of a test with its existing faculty to determine the political leanings of the current staff. This would likely be some sort of a standardized test consisting of questions regarding abortion, foreign policy, social programs, business regulation, and other controversial topics. Then, once they found out where the college was located on the liberal / conservative political spectrum, they could give the same basic test to prospective employees applying for positions. That way, they could hire people who would eventually bring things into greater political balance. Of course, if they wanted to fix things more quickly, they could fire excess liberals (or conservatives) that they currently had on staff and immediately hire people who would bring political balance. Or, if they wanted to save themselves all of this trouble, they could force all faculty members to sign some sort of an agreement in which they promised to never express political opinions in the classroom. Instead, they would simply express facts, and if, God forbid, they found themselves unable to avoid the discussion of some controversial topic, they would agree to give equal time and personal support to the conservative and liberal perspectives on the issue at hand.

    This sounds easy enough, but there are a variety of potential problems with a policy like this. First, colleges would find themselves dealing with all sorts of lawsuits from people claiming that they were discriminated against because of their political views. Then, if colleges were somehow able to overcome the lawsuit problem, they would still be setting a precedent for other movements to achieve proper balance. Maybe colleges, in the name of achieving balance, would be required to have an equal number of people from different religious perspectives. Aggressive affirmative action programs, much to the chagrin of many conservatives, could get a new lease on life, with colleges pressured to achieve proper balance by bringing in more women, ethnic minorities, or people with different sexual orientations. Developments such as these, while not entirely a bad thing, could get tricky and expensive, with educational institutions distracted from accomplishing their primary goals.

    Even if you could muddle your way through some of these legal problems, there are deeper philosophic problems with any attempt to force educational institutions to be politically balanced. The most basic problem is pretty simple: definition of terms. When a person claims that there is either a liberal or conservative bias in some type of an institution, what exactly does he or she mean? Very often, I suspect, the people who complain the most about political bias are those on the political fringes, and to people on the fringes, virtually anyone to the left or right of them is a member of the opposing faction. But to a political moderate, a person with moderate views would be recognized as such. The terms liberal and conservative, like any other political labels, mean different things to different people, and breaking people up into these two general categories is far too simplistic. All people – with the exception of the fascists and the communists on the extreme fringes – exist somewhere on a political continuum, so who gets to decide the point on this political continuum that represents the unbiased middle?

    The term biased also creates some philosophic problems. It seems like a simple enough word. An unbiased person is not trying to push some personal agenda. Instead, he or she tries to look objectively at an issue and is willing to present different perspectives on controversial questions. On the surface, being unbiased is a noble goal. However, if you take this unbiased ideal to an extreme, it can take away all of the potential value of studying fields like history, political science, economics, and many others. Some would say that history teachers, for instance, should limit themselves to teaching facts and not allow their personal bias to get into the class. On a superficial level, I agree with this point of view. But if a history class is going to be more than simply memorization and regurgitation of simple statistics and dates, then a teacher is forced to confront questions that have no definitive answers. We can all agree that Germany invaded Poland in 1939 and that Pearl Harbor was attacked on December 7, 1941, but if you ask the more valuable general question of why World War II occurred, there may be several reasonable explanations and points of view.

    The existence of different points of view, however, should not be a problem. A teacher just needs to present all of these points of view, including both the liberal and conservative perspectives on both past and current events. The problem, however, is that there are often dozens of explanations for why some particular event may have occurred. While some of these explanations can be supported by material evidence and reason, other explanations might be, for lack of a better term, kind of wacky. So, as a teacher, am I obligated to present an explanation simply because there are a certain number of people out there who may believe it is true, or should I limit my presentation to those explanations that historians have offered the most evidence to support? By making the unavoidable decision to present certain explanations while editing out others, I open myself up to the possibility of being labeled as biased. But if being unbiased means that I present information without any regard for what trained historians or my own common sense tell me is plausible, then I will happily proclaim myself biased. I am biased toward historical accuracy.

    This insistence on lack of bias and political balance is based somewhat on the assumption that there is no such thing as truth. Now anyone who has taken Philosophy 101 knows that it is virtually impossible to say that anything is true beyond a shadow of a doubt. The more complicated the question, the more difficult it is to state anything with any degree of certainty. Bias, therefore, is inevitable. So if bias is inevitable, then teachers should, regardless of their personal opinions, expose students to as many points of view as possible, right?

    While I agree with the basic philosophic argument, I still have this old-fashioned idea that there is such a thing as truth. We may never know exactly what it is, but the truth still exists. As a teacher, it is my job to help students absorb and seek information that is as accurate as possible. My job is not to help them memorize all of the points of view that exist in the world about any given topic. Many points of view, after all, are based on flimsy evidence. There may be people who believe that the world is flat, that the moon landing was faked, and that John F. Kennedy was shot by the smoking man in The X-Files, but I do not feel obligated to present these particular beliefs. Of course, in the real world, ridiculous ideas are not always so easy to recognize. However, if I am convinced through study and reflection that a commonly held belief is inaccurate, then it is my responsibility as an educator to say so.

    So is there a liberal bias in education? It depends on what you mean by the words liberal and bias. Still, people who make this claim raise some important issues. Teachers should be open-minded enough to explore different ideas and to present sometimes contradictory points of view to students. Pushing a political agenda should not be the primary goal of teachers, and they should never penalize students for disagreeing with their personal opinions. A classroom dedicated purely to the pursuit of truth may be an impractical dream, but it should be the primary goal of any educational enterprise. The pursuit of truth improves the quality of our thinking, and no decent thinker believes that all ideas and opinions about a given subject deserve to be valued equally. If all opinions are valued equally, then there is no point in studying history (or many other academic subjects) at all. We can all just believe the versions of the past that feel right.

    7. Are Massive Online Classes the Future?

    Some people are convinced that the college experience as we currently know it will become decreasingly common over time. Instead of people traveling to campuses to take courses in classrooms, they will be receiving their educations online, with thousands of students taking a single online course at the same time. Instead of a small number of students spending tens of thousands of dollars to hear lectures by the top professors in their fields, an unlimited number of average Americans will have access to these top educators at limited cost. Already, elite institutions like Stanford and MIT are offering these massive open online courses (or MOOCs) for free, with tens of thousands of people signing up. The future, therefore, is already here, and it is just a matter of time before these types of classes become the norm.

    It is easy to see the appeal of this new college model. A college education has become increasingly vital for success as time has passed, so even though the costs of college have been skyrocketing for decades, people feel compelled to pay. Collectively, Americans have amassed trillions of dollars of student loan debt, delaying or preventing people’s ability to achieve the American dream. But with only one professor being paid to teach these MOOCs, and the cost of providing physical facilities for classes eliminated, a college education can once again be affordable.

    As a parent of current college students and an American concerned about the future of my country, I am open to any ideas that people might have for making college more affordable. I don’t want to fork out tens or hundreds of thousands of dollars to send my kids to the best schools. I don’t want to live in a country where social mobility keeps decreasing and inequalities of wealth become even more entrenched. But at the same time, I am a community college history instructor. If people start taking massive online history classes, I could soon be out of a job.

    Also, as a community college instructor, I am particularly aware of some of the problems that institutions offering MOOCs will have to solve before these types of classes can truly replace college as we know it:

    Accountability – As a community college instructor, I teach classes that are relatively cheap to take and that have very limited admission requirements. It is therefore inevitable that students will come into my courses who are woefully unprepared when it comes to academic skills. If they begin to struggle, or if they decide that they are either unwilling or unable to invest the effort required to pass this college level class, there is little to stop them from dropping out. It didn’t cost them much money to sign up, after all, and they can always just start the class over at a later date.

    Everything that I just said is multiplied for MOOCs. Literally anyone can sign up, the classes cost nothing, and there is no consequence for failure. Until open online courses offered by Stanford start charging significant fees for taking classes, and until there is some sort of a penalty placed on a permanent record when people fail, I will not be impressed by the huge numbers of people taking these classes. Tell me how many people finished the class, not how many signed up. As every community college teacher knows, there is a big difference between a student who enrolls in a class and a student who takes it.

    Assessment – If grades are based on objective multiple-choice exams, then it doesn’t really matter how many people are taking a course. A computer program can grade tens of thousands of multiple-choice exams as easily as it can grade fifty. Multiple-choice exams, however, are only effective for testing a student’s ability either to memorize facts or to perform scientific/mathematical calculations. But if a teacher wants to assess the ability of students to perform higher level critical thinking, a type of assessment that is particularly important in the humanities and social sciences, then multiple-choice exams are not particularly effective.

    Now there may come a time when computer software can assess the quality of a research paper, essay, short story, or poem. But since we are not there yet, institutions offering MOOCs may have to hire an army of educators to grade written assessments. Costs, however, would inevitably go up, taking away one of the main advantages of MOOCs in the first place. The question, therefore, as it is with education in general, is whether the benefits of reducing costs outweigh the resulting limitations on methods of assessment.

    Cheating – In a perfect world, student assessment would primarily consist of take-home assignments that require more time than is available during a single class. Conventional tests, after all, are not reflective of what people will be asked to do in their future careers. Few people will ever sit down at a desk and be required to regurgitate a bunch of memorized information in an hour or two.

    The problem, however, is that it is impossible to know if the student enrolled in the class actually did the take-home assignment. It is likely that most students are honest, but there will always be a few cheaters. In this less than perfect world, I base student grades (in traditional classrooms) almost exclusively on work that they complete in class. But with a purely online class, this is not possible, and while there may be technologies and techniques to improve the odds that the enrolled student is doing the work, students throughout history have consistently shown a great deal of ingenuity when it comes to cheating.

    Cost – By their nature, online classes are cheaper to offer than conventional classes. But they are not free. When major universities offer free MOOCs, it is important to keep in mind that these free courses are being subsidized partially by students paying tens of thousands of dollars to take conventional classes on site. So until these MOOCs can start funding themselves, I will remain skeptical of their ability to change college education as we know it.

    Student / teacher interaction – Good college courses, as with good classes at any level, do not consist of a teacher standing in front of a room and performing a one-person show for the students. There needs to be a certain amount of back and forth happening between teacher and students, with students being forced to respond to prompts and having the ability to get immediate responses to questions. Sometimes, the best teaching moments happen one-on-one with discussions after class or during office hours.

    The internet, of course, can provide plenty of opportunities for interaction. Students can talk to teachers and to each other at all hours of the day via chat rooms, online conferencing, or email. It is difficult, however, for teachers to have any meaningful one-on-one interaction with students if they have a class of 20,000. As with assessment, this problem could be solved by hiring (at a high cost) dozens of tutors to respond to student inquiries. Automated tutors may also be able to fill at least some of this gap. But for the moment, automation cannot have the same effect as personal interaction with a knowledgeable, passionate, and human teacher, and some would argue that it never will.

    The college experience – At all levels, school has never been purely about education. Grammar schools are as much about providing day care and socializing children as they are about reading, writing, and math. When people reflect on their high school years, they often remember more about the extracurricular activities than about what happened in classrooms. And when people go off to college, they look forward to an even more rewarding rite of passage: living in dorms, joining fraternities, going to football games, attending events on campus, and squeezing in a party or two (or more). Needless to say, taking courses online cannot replicate these experiences.

    Ever since books became readily available, schools as educational entities have arguably been unnecessary, and the older that we get, the less we should need school. If K-12 schools do their jobs and teach people basic academic skills, then everyone who graduates from high school should be an independent learner. Especially in this dawning Information Age, college as we have known it should not be needed. For independent, self-motivated learners, the online education model is ideal. If these people want the knowledge without the frills, then I hope that there will be plenty of low-cost online programs, whether MOOCs or otherwise, available to them.

    Large numbers of students, however, are not the self-motivated, independent learners ideally suited for large online classes. Some lack basic academic skills and need some one-on-one help. Some need a set schedule with a physical class to attend that is run by a cheerleader/disciplinarian/role model who will help keep them on track. Still others want to experience the same rite of passage as their parents and grandparents.

    I have no doubt that online college education has a significant role to play in the present and future. But it is not necessarily well suited for everyone, and so long as some of the issues listed above remain, it may be a while before MOOCs replace college as we know it.

    8. History is More Complicated Than It Seems

    Some people see historians as individuals who compile facts about the past (and list these facts in books so that students in history classes can be forced to memorize them). There is a sharp distinction, however, between historical writing and the types of data compilation that one might find in an almanac or index. Historians search for patterns, try to isolate causes and effects, and dig for clues to understanding how the present world came into being. In short, it is the art of generalization, of drawing general meaning from a wide assortment of data.

    The broader the scope of the topic, the more sweeping the generalizations will be. An eight-hundred-page book that focuses on a narrow historical topic will be able to go into tremendous detail, providing the reader with a thorough understanding of what happened from a wide variety of points of view. But in a book like this one, which covers more than 500 years of American history in 130 short essays, broad generalizations are unavoidable. Yet even the writer of the longest book on the narrowest topic will be forced to leave a lot of things out, some of it by the author’s choice, and some out of ignorance. No matter how thorough historians might try to be, most details of the past are currently unknown or impossible to recover. We are forced to make generalizations from the data that we have available.

    So before you embark on my attempt to break American history down to the essentials and connect what happened in the past to the circumstances of the present, I have inserted this final opening essay as a disclaimer. The following book, by its nature, is going to be filled with sweeping generalizations. Due to the nature of historical writing and the genetic disposition of the human brain, things will be presented in a way that is more neat, orderly, and factual than the past really was (or ever will be). Our brains are wired to find order in the chaos, a fact that compelled ancient historians to try to make sense out of the past in the first place. When historians, like any writers, get down to writing their conclusions, these must be presented in a neat and orderly fashion in order to have any meaning at all. Unfortunately, this can create the impression that the past was also neat and orderly, and many of the various exceptions to the rules, those annoying facts that do not fit the patterns, get filtered out of the story.

    For example, one of the most common ways that we historians overly generalize is with our various methods of categorizing people. We often lump people into groups based on categories such as gender, age, race, ethnicity, religion, social class, political party, geographic region, or many others. Because categories like these are useful, the upcoming essays will, on more than one occasion, use generic distinctions such as rich and poor, Democrat and Republican, young and old, men and women, or North and South. With each of these categories, chapters or even books could be (and have been) written about the limitations of these generic ways that we classify people. There is no such thing as a single entity called the poor, the young, or the South in which all of the individuals in that group share the exact same experiences, interests, or points of view. But since this book is a collection of short essays, I will briefly describe the limitations of only one of the most common ways of classifying people in American history: skin color.

    The common, general (and pretty accurate) narrative for most of American history is the story of white dominance. As the story goes, people of European descent, after the accidental discovery of the New World by Columbus, swept aside the previous inhabitants fairly easily and came to dominate the Western Hemisphere. In order to profit from these new lands, white people in the Americas imported Africans as slaves, using them mostly to do the hard labor in the fields needed to produce and harvest valuable cash crops. White people were the conquerors and slaveholders, victimizing Native Americans and black Americans in order to build this New World.

    The term white, however, can be problematic, particularly in reference to Central America, South America, and the Caribbean. Because a relatively small number of Spanish people came to New Spain – and most of them were men – it was not long before a racial hodgepodge developed in Spain’s American lands, with most people an ethnic hybrid of Spanish, Native American, and African ancestry. In the English colonies and the United States, however, much larger numbers of Europeans came, so the attempt was generally made to maintain distinctions between whites and either Africans or Native Americans. Racial mixing still occurred, but if it was known that a person was partly Native American or African, then they were perceived as non-white. The illusion was therefore maintained that a person was either white or the other.

    But even among so-called white people, plenty of distinctions remained. Through most of the 1600s, the majority of laborers on plantations were white indentured servants, with people living and working in conditions little better than those of slaves. This would not be the last time that white Americans demonstrated a willingness to ruthlessly exploit other white people. When Irish immigrants poured into the United States beginning in the 1840s, many white Americans viewed the Irish as lower orders of life, willing to live and work in conditions that more civilized people would never tolerate. In the late 19th-early 20th centuries, when there was a marked increase in the number of immigrants from southern and Eastern Europe, many white Americans – some of them descendants of Irish immigrants – viewed these newcomers with alarm, wondering how these people who were so culturally different would impact the mainstream. Over time, as with the Irish, descendants of European immigrants would be able to blend into that mainstream, helped by their Caucasian faces. But even as the concept of white remained important, divisions based on religion, region, political ideology, and social class existed within the white community, making even the concept of a white community a gross oversimplification. Race mattered, but it was not by any means the only way that people might be classified.

    Native Americans in the general historical narrative, as already stated above, are perceived as victims of conquest. Because of sharp differences between Native Americans and Europeans, these two general, cultural groups were unable to coexist peacefully. Because people of European descent had superior weapons and brought diseases with them that Native Americans had never been exposed to, white people had no reason to make any attempt to share the land. Native Americans, after all, were relatively easy to conquer, and white people, given their worldview, felt perfectly justified in doing so.

    This narrative, while having a great deal of historical value, has a few basic weaknesses. Namely, it lumps all Native Americans into a single group with a particular culture. In some cases, certain Native American cultural groups did have a way of life that did not mix well with Europeans. Hunting cultures living on the Great Plains could not live side by side with people of the United States who wanted to use the land for farming or raising cattle. Hunter-gatherers on the West Coast could not practice their traditional way of life either when large numbers of settlers and gold miners arrived. But in eastern North America, many Native American cultures were agricultural, so in theory, they could have lived somewhat in harmony with European colonists and later the United States. In many cases, they showed a remarkable willingness to adapt or at least peacefully coexist, incorporating new technologies, establishing business relationships with whites, or signing treaties. Native Americans, however, were not generally the problem. English colonists and later the people of the United States, regardless of how Native Americans reacted to them, tended to lump all Native cultures together. Whether a Native American culture fought against the United States or made some sort of attempts to coexist peacefully, Indians were simply Indians, and they needed to get out of the way.

    But even this last statement is a bit of an over-generalization. There were many cases of white people in the Americas who sought to protect and defend Native American lands and rights. Some even found Native American cultures to be more appealing than so-called civilized life, and they chose to live on the frontier with the Indians. Given the fact that so many of the people who migrated to the Americas were more adventurous and unorthodox than average, it should not be surprising that some of them found it appealing to go native. But since their numbers were relatively small, they often get left out of the story.

    Black Americans, as mentioned earlier, are also generally perceived as victims. From the 1500s until the 1800s, the overwhelming majority of people of African descent were brought here as slaves. Most went to Central America, South America, and the Caribbean where many would toil in conditions even worse than would eventually be found in the United States. Not all of them, however, were successfully controlled and victimized. Many were trained warriors before being captured, and they took off when the first opportunity presented itself. Maroon communities of runaway slaves – who were sometimes joined by resistant Native Americans – could be found throughout Latin America, with many able to fight off Spanish attempts to bring them back under control. Runaway slaves would also at times wreak havoc on Spain’s attempts to transport valuable commodities and profit from its empire. Spanish authorities sometimes found it necessary to acknowledge the autonomy of maroon communities in order to stop this harassment. It is too simplistic, therefore, to always perceive Africans in the Americas as victims. And as mentioned earlier, given all of the racial mixing that has taken place over the centuries in Latin America, the term black in much of this part of the world has generally ceased to have any real meaning.

    In the English colonies and later the United States, slaves did not generally make up as large a percentage of the population as in Latin America. Autonomous communities of runaway slaves, therefore, were less common. Plenty of individuals, however, attempted to run away, and violent resistance occurred from time to time. But the most common forms of resistance tended to be more subtle: working slowly, faking illness, stealing small things, damaging property, etc. The ultimate form of resistance, however, was the successful effort by African-Americans to maintain their humanity. Relationships between slave families and friends on plantations were often very tight, and a new culture was created that was influenced by their African roots, the general American culture, and their experiences as slaves. While slave owners were generally unwilling to acknowledge it, this new African-American culture rubbed off on white Americans in all sorts of ways. Some white slave owners, however, due to their close interaction with black people, may have recognized to a certain degree that the races were not so different, with living and working conditions of slaves varying widely depending on the characters of their masters. In the
