Showing posts with label civic knowledge. Show all posts

Thursday, May 5, 2011

When you get to be my age, a favorite pastime is bitching about how young people today, they don't know nuthin.  In that spirit I offer this New York Times story that confirms to the world that young people today, they don't know nuthin.

Based on this survey, the story points out a few distressing factoids:
  • Three-quarters of high school seniors could not name a power granted to Congress by the Constitution.
  • Fewer than half knew the purpose of the Bill of Rights.
  • Only one in ten knew anything about the checks and balances among the three branches of government.
Which leads us to suspect that young people today, they don't know nuthin.  As the NYT story reports:
“Today’s NAEP results confirm that we have a crisis on our hands when it comes to civics education,” said Sandra Day O’Connor, the former Supreme Court justice, who last year founded icivics.org, a nonprofit group that teaches students civics through Web-based games and other tools.  
Of course we don't really test on civics knowledge, at least not in a world of No Child Left Untested. My own high school children would do well, but they have the advantage of taking AP American History and a class in Government.  Apparently most do not, because some of the questions are so straightforward as to be, well, straightforward.

Friday, October 16, 2009

Classrooms and Political Engagement

An "open classroom environment" can have positive effects on adolescent "civic knowledge and appreciation of political conflict," according to this study in the academic journal Political Behavior.

Okay, the first question from you budding methodologists is -- what's an open classroom environment? 

Table 1 of the study (page 443) outlines what is basically a classroom environment where kids feel comfortable asking questions, expressing opinions, and where teachers respect student opinion (or at least, I suppose, don't beat the crap out of the brats for mouthing off).  But yeah, I can buy this.  So does it make a difference?

Table 2 includes a monumentally long regression analysis to support the argument.  Classroom environment, even after all those statistical controls, does explain some unique variance -- PhDweebspeak for "it remains a significant factor despite controlling for lots of other explanations" (a beta of a mere .06, but significant nonetheless).  So what predicted civic knowledge among kids?  Classroom environment, obviously, but the other significant factors were: expected education (how far you think you're going to go in school), race, reading of books, discussing politics at home, and perception of the classroom environment, which is apparently different from the actual environment.
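For the methods-curious, here's a quick back-of-the-envelope sketch in Python, with made-up numbers, of why a beta that small can still clear the significance bar.  It treats the standardized coefficient roughly like a correlation (a simplification -- the real test uses the coefficient's standard error from the full model), and lets a big sample do the rest:

```python
import math

# Hypothetical numbers: a standardized coefficient (beta) of .06 and a
# sample of 3,000 students (large school surveys run this size or bigger).
beta = 0.06
n = 3000

# Rough t-statistic for testing whether the coefficient differs from zero,
# using the correlation-style formula t = r * sqrt((n-2) / (1-r^2)).
t = beta * math.sqrt((n - 2) / (1 - beta ** 2))

# Share of variance uniquely explained -- tiny, even though "significant."
variance_explained = beta ** 2

print(f"t = {t:.2f}")  # comfortably past the 1.96 cutoff for p < .05
print(f"variance explained = {variance_explained:.4f}")  # under half a percent
```

The punch line: with thousands of cases, a predictor can be statistically significant while explaining well under one percent of the variance.  Significant is not the same as substantial.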

What drops out as predictors of civic knowledge?
  • Sex.  That's interesting given sex becomes a predictor for adults, with men scoring higher than women on tests of political knowledge (I've blogged on this research here and here).  But at this early age, no gender effect.  So the differences may emerge later.  Worthy of further investigation.
  • News media use.  At this age, less surprising.  Kids, even older ones, don't consume all that much news.  More meaningful would be the media habits of parents or guardians, but the data probed only student use of media.
  • Income and free lunch.  These essentially measure the same thing, which makes me worry about multicollinearity: you can't get a free lunch at public schools unless your family income falls below a federal poverty threshold, so the two variables carry nearly identical information.  I would have dropped one from the model, or perhaps combined the two in some way.
  • Social studies.  Gasp!  Taking social studies has no effect?  Bad news for those who hope classes like these, which have largely disappeared thanks to No Child Left Untested, will save the day. 
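The multicollinearity worry is easy to demonstrate.  A minimal Python sketch with invented numbers (nothing from the actual study): when a free-lunch flag is almost entirely a function of income, the two predictors are nearly redundant, and the variance inflation factor shoots past the usual warning level.

```python
# Invented data: family income (in $1,000s) and a free-lunch flag that
# kicks in below a cutoff -- the flag is almost a function of income.
incomes = [14, 16, 18, 20, 24, 26, 32, 34, 36, 38, 42, 44]
free_lunch = [1 if inc < 30 else 0 for inc in incomes]

def pearson_r(xs, ys):
    """Pearson correlation, computed by hand to stay dependency-free."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r(incomes, free_lunch)
vif = 1 / (1 - r ** 2)  # variance inflation factor of one predictor given the other

print(f"r = {r:.2f}, VIF = {vif:.1f}")  # past the common VIF cutoff of 5
```

When two predictors overlap this much, their individual coefficients get unstable standard errors -- which is exactly why I'd drop one or combine them.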
In all, an interesting peek at kids versus the jillion studies that examine U.S. adults.  There is some stuff above that makes for interesting follow-up studies, in particular a further investigation of sex differences in political knowledge scales.  Some good alternative explanations for that one have emerged of late (the kinds of questions asked, social learning, etc.).  More needs to be done.

Monday, October 12, 2009

Civic Engagement and Ideology

Everyone agrees civic engagement is important, and that young people need to be encouraged to participate.  After that, it gets kinda political.

Here's an Inside Higher Ed article that outlines the basic idea here, that colleges need to do more.  The lede:
There is strong support among students and faculty members for the idea that colleges have a role to play in encouraging civic engagement and promoting good citizenship. But there are real doubts about whether colleges are actually carrying out that role.

No argument here, and the article pumps a lot of tables at you to make the point.  A similar article can be found here at the Chronicle of Higher Education.   The lede:
Colleges are not promoting civic engagement nearly as strongly as their students, faculty members, and administrators believe they should be, says a report released today by the Association of American Colleges and Universities, a group that promotes liberal education.

To be journalistically picky, the "who said" should never be as long as the "what said" in a lede.  But that's the journalism prof in me rearing my ugly head.

Again, no big deal.  But you do get organizations with a somewhat political/partisan/ideological bent in this debate.  Go here and scroll down; you'll find a link to their guide to what colleges don't tell you.  Some of this is the old preserve-western-civilization argument (one I'm supportive of, but one that is nonetheless rife with ideological overtones). 

Click on a state, see how the universities score.  My school -- UGA -- got a B.  UF only got a C, meaning we finally beat 'em at something.  Rice, a great school in Texas, got an F.  That alone makes you wonder about the scoring system.

So sometimes what people know, or what is taught, is full of political undertones.

Saturday, September 19, 2009

Civics Knowledge -- No Longer A Priority?

Long piece in The Nation on the state of civics knowledge today in schools, etc., and the decline of civics as a priority.  Including this line:
If you believe that the success of our participatory democracy is directly related to how it prepares its youngest citizens, then you must worry that our democracy is in sorry shape. 

Yup, gotta agree.  But let's face it, No Child Left Untested does not include civics, so why bother?  Pass me the math book, please, gotta do a standardized test tomorrow, and next week, and the week after that ...

And I'll end with this from the article (which I encourage you to read in full):
Our young people's civic ignorance is a long-term threat. The decision to vote can be traced to our civic knowledge. "Nonvoting results from a lack of knowledge about what government is doing and where parties and candidates stand, not from a knowledgeable rejection of government or parties or a lack of trust in government," write Samuel Popkin and Michael Dimock. That was George Washington's point all along: active citizens are integral to democracy, and schools are the training grounds for those citizens.

Monday, June 1, 2009

Kinds of Knowledge

There are many ways to measure what people know. Sometimes scholars and others use them interchangeably, missing the nuances. Below are a few:

Civics Knowledge -- usually measured by questions such as "what branch of government interprets the constitution." These are aimed at understanding a person's base level of how government is structured and works. Rarely used as a dependent variable except by people who study the socialization of immigrants into the U.S. or youth learning the basics of government, it most often shows up as an independent or control variable for other kinds of political knowledge. But you'll often see this kind of question as a generic "political knowledge" measure, especially in news stories bemoaning the fact that so few people can identify some core aspect of democracy. Not a good measure of general political knowledge, and a lousy fit with media effects.

Campaign Knowledge -- usually a measure of an active political campaign, most often how various candidates stand on particular issues or, sometimes, questions like "what candidate recently said xyz." A very good measure if you're looking to explain specific campaign events or factors that lead to this kind of knowledge, such as what media best predict knowledge of some recent campaign event. Good measure with media variables.

Current Events Knowledge -- Seen perhaps most often as a measure of "political knowledge," here I'm separating it from "campaign knowledge" because, what the hell, I need more categories. Current events is a wide-ranging category. It could be something simple like "what party controls the U.S. Senate" or "what happened last week in Iraq?" This is a really good measure if you're studying media effects.

Political Actor Knowledge -- by actors I don't mean on the silver screen but rather measures that ask such questions as, "Who is Nancy Pelosi?" We typically prompt with a name and ask for the office, though sometimes we'll prompt with the office and ask for a name: "Who is the Speaker of the House?" Methodological note: a recent study explained the gender differences (men scoring higher than women on political knowledge tests) in part because women are rarely used in name-prompt questions. So in other words, men do better because we mostly ask about men who are political actors or public figures. So-so for media studies.

Misinformation -- better known as anti-knowledge, or knowledge of incorrect facts, such as the belief that Barack Obama is Muslim. This is the type of knowledge that most readily gets mixed up with attitudes (I'll discuss this later in the week in a post about public relations challenges). We don't study this one an awful lot, but it's a great one for media scholars since certain kinds of media content (talk radio, for example) tend to be associated with incorrect knowledge.
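As a footnote for the methods-minded, here's how these measures typically get operationalized: an additive index, one point per correct answer.  A minimal Python sketch -- the items echo the examples above, but the wording, answer key, and respondent are all invented for illustration (and the Senate answer is current as of this 2009 post):

```python
# Hypothetical answer key mixing the measure types described above.
answer_key = {
    "branch_interprets_constitution": "judicial",  # civics knowledge
    "speaker_of_the_house": "nancy pelosi",        # political actor knowledge
    "party_controls_senate": "democrats",          # current events knowledge
}

def knowledge_score(responses, key):
    """Count correct answers, ignoring case and stray whitespace."""
    return sum(
        1
        for item, correct in key.items()
        if responses.get(item, "").strip().lower() == correct
    )

respondent = {
    "branch_interprets_constitution": "Judicial",
    "speaker_of_the_house": "I don't know",
    "party_controls_senate": "Democrats",
}

print(knowledge_score(respondent, answer_key))  # 2 of 3 correct
```

Which kind of item you load into the key is exactly the choice the categories above are about -- an index built from civics items and one built from current-events items will behave very differently against media variables.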


There are no doubt others I'm missing. As they come to me, I'll add 'em. Suggestions welcome.

Friday, May 29, 2009

Short Stuff

Just a few bits and pieces found on the net.
  • If you follow Pennsylvania politics, you can take this quiz and see how much you really know (I didn't even try).
  • What is political ignorance? An economist and blogger explores the topic. At the bottom you can follow to later links and discussion.
  • A new book argues that to advance our understanding of political knowledge we must consider five principal areas of research: the traditional model, heuristic models, impression-driven models, affect-based models, and models of operative knowledge. I plan on reading the full chapter soon. Will report back.
  • Perhaps people don't do well on political knowledge tests because they're just not motivated to try hard. That's the focus of this conference paper (abstract only).

Thursday, December 4, 2008

America Gets an F?

The "civic literacy report" by the Intercollegiate Studies Institute gives America an "F" in what people know.

According to the site, which I've only started exploring, 71 percent of Americans failed a test on "America's heritage." Now that's a loaded term and I'll take some time, and another post, getting into what they asked. Instead I provide their key findings below:
  • Americans fail the test.
  • Americans agree colleges should teach "America's heritage."
  • College adds little to civic knowledge.
  • TV, especially TV news, dumbs down America.
  • College grads aren't all that smart either.
  • Elected officials score lower than regular, everyday people.

Again, I want to dig deeper into this one. I'm always suspicious when we talk about "heritage" and what it means, and whether it means the same thing to the people who put out this report that it does to others. Some of the sample questions are too esoteric to be considered "heritage" and to be honest have a certain ideological ring to them. No wonder people failed it.

btw, I got an 85, probably because I ran out of time and put "C" on some questions rather than think them through. Then again, I always was a "B" student.