Interest in the difference between what people know (actual knowledge) and what people think they know (perceived knowledge) crosses a number of scholarly disciplines. I've done work on whether listening to talk radio or TV faux news shows increases knowledge or merely the perception that one is informed.
I'm not alone in my fascination with the difference between actual and perceived knowledge.
For example, this study finds elementary school teachers often overestimate what they know about literacy development in children. There's similar work in pharmacy, counseling, and political science (the latter gets into some fascinating gender differences and I hope to write more about this tomorrow).
My point? What people think they know can have a number of consequences, and thus should be studied alongside any standard measure of knowledge. What consequences? A person who feels informed may engage in early closure, failing to seek out additional information and perhaps reaching a faulty conclusion. People may fill up on the "empty calories" of junk news or entertainment-based news and think themselves informed when they're not.
Perceived knowledge can lead to greater feelings of efficacy. That's the good. But if those feelings of efficacy rest on weak, insubstantial actual knowledge, you have to worry about the final judgments that follow, whether in the classroom with teachers or in the voting booth with citizens. Or in any number of other scenarios, from doctors to government workers.
I've always wanted to dig deeper into the theoretical possibilities of actual versus perceived knowledge, perhaps even write a good NSF grant aimed at the consequences of this gap for democracy. Maybe I will. Some day. But I firmly believe that if you're studying knowledge, you also need to study perceived knowledge.