Thursday, September 26, 2013

What Colleges Will Teach in 2025

America must resolve the conflict between knowledge and know-how
TIME Magazine cover, October 7, 2013. Photographs by Peter Hapak for TIME.
Reports on what supposedly educated Americans know—and more sensationally, don’t know—come along fairly regularly, each more depressing than the last.
A survey of recent college graduates commissioned by the American Council of Trustees and Alumni and conducted by GfK Roper last year found that barely half knew that the U.S. Constitution establishes the separation of powers. Forty-three percent failed to identify John Roberts as Chief Justice; 62% didn’t know the correct length of congressional terms of office.
Higher education has never been more expensive—or seemingly less demanding. According to the 2011 book Academically Adrift, by Richard Arum and Josipa Roksa, full-time students in 1961 devoted 40 hours per week to schoolwork and studying; by 2003 that had declined to 27 hours. And even those hours may not be all that effective: the book also notes that 36% of college graduates had not shown any significant cognitive gains over four years. According to data gathered by the Chronicle of Higher Education and American Public Media’s Marketplace, half of employers say they have trouble finding qualified recent college graduates to hire. Everybody has an opinion about what matters most. While Bill Gates worries about the dearth of engineering and science graduates, the American Academy of Arts and Sciences frets about the fate of the humanities.
Rising tuition costs, an underprepared workforce, an inhospitable climate for the humanities: each of these issues, among others, shapes arguments over higher education. True, polls suggest that most students are happy with their college experiences (if not their debt loads), elite institutions are thriving, U.S. research universities are the envy of the world, and a college degree remains the nation’s central cultural and economic credential. Yet it’s also undeniable that hand-wringing about higher education is so common that it almost forms an academic discipline unto itself or should at least count as a varsity sport.
And so wring the hands of many parents, employers, academics and alumni in the fall of 2013 as the undergraduate class of 2017 begins its freshman year—and as parents of the class of 2025 contemplate the costs and benefits of college down the road. “Higher education is facing a real crisis of effectiveness,” says Michael Poliakoff, vice president of policy at the American Council of Trustees and Alumni, a group that supports traditional core curriculums and postgraduate assessment tests. At the TIME Summit on Higher Education on Sept. 20, Secretary of Education Arne Duncan called for more accountability in higher education through the development of a university ratings system—one that could include the earning power of an institution’s graduates as a factor.
At a time when virtually every state is implementing new Common Core standards to increase the amount of general knowledge in math and English that a typical public-school student must master in K-12, there is renewed interest in the perennial collegiate argument over what’s called either general education or, more colloquially, core curriculum. At issue is whether there are certain books one should read and certain facts one should know to be considered a truly educated person—or at least a truly educated college graduate.
At the heart of the debate between traditionalists (who love a core) and many academics (who prefer to teach more specialized courses and allow students more freedom to set their own curriculums) is a tension between two different questions about the purposes of college. There are those who insist that the key outcome lies in the answer to “What should every college graduate know?”—perhaps minimizing the chances that future surveys will show that poor John Roberts is less recognizable than Lady Gaga. Others ask, “What should every college graduate know how to do?”
Those three additional words contain multitudes. The prevailing contemporary vision, even in the liberal arts, emphasizes action: active thought, active expression, active preparation for lifelong learning. Engaging with a text or question, marshaling data and arguments, and expressing oneself take precedence over the acquisition of general knowledge.
A caveat: the debate we are discussing here is focused mainly on selective schools, public and private, where there seems to be a persistent unease among key constituencies—parents, trustees, alumni and most of all employers—about undergraduate curriculums. The last time these questions were in circulation was in the 1980s, the years in which Education Secretary Bill Bennett pushed for renewed emphasis on the humanities and Allan Bloom of the University of Chicago published The Closing of the American Mind, a best seller that argued, among other things, that the great books were being wrongly marginalized if not totally neglected by the modern university.
That debate reflected larger arguments about the country’s trend toward the right under Ronald Reagan. What’s driving the core-standards conversation now is the ambition to succeed in a global economy and the anxiety that American students are failing to do so. How does the country relieve those fears and produce a generation of graduates who will create wealth and jobs? It’s a question that’s fueling the Obama Administration’s push for a ratings system, and it’s a question that isn’t going away.
The Roots of the Core
From the founding of Harvard College in 1636 until the Civil War, American university education was mostly about sending pious and hopefully well-read gentlemen forth into the world. As Louis Menand, a Harvard English professor and literary critic, has written, what Americans think of as the university is of relatively recent vintage. In 1862 the Morrill Act created land-grant universities, broadening opportunities for those for whom college had been a virtual impossibility. Menand and other historians of collegiate curriculums note that at Harvard in 1869, Charles William Eliot became president and created a culture in which the bachelor’s degree became the key credential for ongoing professional education—a culture that came to shape the rest of the American academy. The 19th century also saw the rise of the great European research university; the German model of scholar-teachers who educated undergraduates while pursuing their own research interests moved across the Atlantic.
The notion that a student should graduate with a broad base of knowledge is, in Menand’s words, “the most modern part of the modern university.” It was only after World War I, in 1919, that Columbia College undertook a general-education course, called Contemporary Civilization. By reading classic texts—from Plato’s Republic to The Prince to the Declaration of Independence, with the Bible and Edmund Burke thrown in for good measure—and discussing them in the context of enduring issues in human society, every student was compelled to engage with ideas that formed the mainstream of the American mind. The impetus for the move reflected a larger social and cultural concern with assimilating the children of immigrants into American culture. Robert Maynard Hutchins adopted a similar approach at the University of Chicago. The courses were not about rote memorization; they were (and are) centered on reading followed by discussion. They were (and are) required of all students, something that set Columbia and Chicago apart from many other colleges—and still does.
World War II helped bring about the Harvard Report of 1945, an effort by America’s oldest college to provide a common cultural basis not only for its elite students but also for the rising middle class. Students were expected to read, for example, the great books. As the decades went by, however, the assumption that there was a given body of knowledge or a given set of authors that had to be learned or read came under cultural and academic attack. Who was to say what was great? Why not let teachers decide what to teach and students decide what to study?
There are many cultural reasons for opposing the core. For instance, faculties generally dislike being told what to do. (Doesn’t everyone?) The most intelligent argument against a core? That the freedom to choose one’s academic path will stoke one’s curiosity and fuel experimentation. At places like Vanderbilt University (where I am a visiting faculty member) the curriculum alters the Columbia approach in two ways. First, students choose specific courses that the university believes provide what chancellor Nicholas Zeppos calls “both foundational knowledge and critical thinking. In other words, we encourage more student growth and risk taking in electing how one builds that foundation.” Second, rather than mandate a specific set of general-education courses, Vanderbilt asks undergraduates to meet distribution requirements, choosing classes in broadly defined fields including humanities and the creative arts, the history and culture of America, and international cultures. “So our approach,” says Zeppos, “allows for more exploration and risk taking.”
Knowledge itself changes, and not only in science and technology, where change is so rapid and self-evident. Appomattox will always have happened in April 1865, but one’s understanding of the causes, course and effects of the Civil War can shift. The prevailing academic culture puts more emphasis on developing a student’s ability to confront questions of interpretation, asking students more about why something occurred than when. But some raise reasonable concerns about this approach. “At prestigious schools, the majority of students come from strong backgrounds and will do well even without the core, but that is not the reality for all students,” says Poliakoff. “The core curriculum makes sure that all students develop the skills they need to be successful.”
So what to do?
A Question of Assessment
Page A1 of the Wall Street Journal often brings news that matters to America’s striving classes. One such story arrived this August. The headline “Are You Ready for the Post-College SAT?” was followed by a revealing subhead: “Employers say they don’t trust grade-point averages.” The piece explained the imminent arrival of an “SAT-like assessment that aims to cut through grade-point averages and judge students’ real value to employers.”
The Collegiate Learning Assessment, or CLA+, a voluntary test developed by a New York City–based nonprofit, the Council for Aid to Education, is to be administered to seniors at some 200 U.S. colleges and universities, including the University of Texas system and the liberal-arts St. John Fisher College near Rochester, N.Y., in an attempt to measure learning by asking critical-thinking questions. “Exit exams are an excellent idea because they are a quantifiable way of giving institutions and individuals the measure of the kind of progress they’re making,” says Poliakoff. And while an assessment like the CLA+ might help employers decide which students to hire, some argue that students and parents need more information to help choose a college. When Duncan told TIME’s education summit about the ratings system envisioned by the Obama Administration, he described an approach that would take into account many metrics, including graduation rates, graduate earnings and a graduate’s student debt. The basic question, Duncan said, is this: “How many students at an institution graduate at a reasonable cost without a lot of debt and get a job in the field they choose?”
Fair enough, but none of this tests general knowledge. You don’t have to be able to identify, say, Albert Einstein or explain the difference between a stock and a bond. Critics of the CLA+ argue that institutions may be penalized for attracting strong students who score highly as freshmen and then just as highly as seniors—thus showing no growth. Others have even more fundamental problems with the idea of a universal test. “The idea of the CLA+ is to measure learning at various institutions and compare them,” says Watson Scott Swail, president and CEO of the Education Policy Institute. “I don’t think that’s technically possible with such a diverse system of higher education. That’s based on the fact that all the curriculums are different, textbooks are different, and you’re expecting to get some measure of—in a very generic way across all curriculums—how someone learns in one institution compared to another. All institutions are different, and all of their students are different.”
So why not make the diversity of American higher education an ally in allaying concerns about how much core knowledge college graduates take with them into the world? Why not honor the independence of each institution and encourage every college to create a required general-education comprehensive exam as a condition for graduation? Ask each department for a given number of questions that it believes every graduate, regardless of major, should be able to answer. Formulate essay questions that would test a student’s capacity to analyze and reason. In other words, take the initiative.
Yes, the departmental discussions about what an educated person should know about chemistry or Chinese or communism would be fraught and long. The good news, however, is that the debates would be illuminating, forcing academics to look to first principles, which is almost always a healthy exercise in any field. Some institutions might decide that such an assessment just isn’t for them, but it’s an idea worth exploring, for colleges could then control the process rather than cede that authority to yet another standardized national test.
What is heartening to those who believe in the value of a passing acquaintance with Homer and the Declaration of Independence and Jane Austen and Toni Morrison as well as basic scientific literacy is that there is little argument over the human and economic utility of a mind trained to make connections between seemingly disparate elements of reality. The college graduate who can think creatively is going to stand the greatest chance of not only doing well but doing some good too. As long as the liberal-arts tradition remains a foundation of the curriculum in even the most elective of collegiate systems, there is hope that graduates will be able to discuss the Gettysburg Address—in a job interview at Google.
 —with reporting by Eliza Gray/New York and Maya Rhodan/Washington



Read more: http://nation.time.com/2013/09/26/the-class-of-2025/
