There are many ways to measure what people know. Scholars and others sometimes use them interchangeably, missing the nuances. Below are a few:
Civics Knowledge -- usually measured by questions such as "what branch of government interprets the constitution." These are aimed at gauging a person's baseline understanding of how government is structured and works. Rarely used as a dependent variable except by people who study the socialization of immigrants into the U.S. or how youth learn the basics of government, it most often shows up as an independent or control variable for other kinds of political knowledge. But you'll often see this kind of question as a generic "political knowledge" measure, especially in news stories bemoaning the fact that so few people can identify some core aspect of democracy. Not a good measure of general political knowledge, and lousy in combination with media effects.
Campaign Knowledge -- usually a measure tied to an active political campaign, most often how various candidates stand on particular issues or, sometimes, questions like "what candidate recently said xyz." A very good measure if you're looking to explain specific campaign events or the factors that lead to this kind of knowledge, such as which media best predict knowledge of some recent campaign event. A good measure with media variables.
Current Events Knowledge -- seen perhaps most often as a measure of "political knowledge," though I'm separating it from "campaign knowledge" here because, what the hell, I need more categories. Current events is a wide-ranging category. It could be something simple like "what party controls the U.S. Senate" or "what happened last week in Iraq?" This is a really good measure if you're studying media effects.
Political Actor Knowledge -- by actors I don't mean on the silver screen but rather measures that ask such questions as, "Who is Nancy Pelosi?" We typically prompt with a name and ask for the office, though sometimes we'll prompt with the office and ask for a name: "Who is the Speaker of the House?" Methodological note: a recent study attributed the gender gap (men scoring higher than women on political knowledge tests) in part to the fact that women are rarely used in name-prompt questions. In other words, men do better because we mostly ask about men who are political actors or public figures. So-so for media studies.
Misinformation -- better known as anti-knowledge, or knowledge of incorrect facts, such as the belief that Barack Obama is Muslim. This is the type of knowledge that most readily gets mixed up with attitudes (I'll discuss this later in the week in a post about public relations challenges). We don't study this one an awful lot, but it's a great one for media scholars since certain kinds of media content (talk radio, for example) tend to be associated with incorrect knowledge.
There are no doubt others I'm missing. As they come to me, I'll add 'em. Suggestions welcome.