Suppose that I am on a fundraising mission for the Singularity Institute, and my travels bring me to a small, unexplored island where a tribe of hunter-gatherers has lived for the past thousand years. Being young and inexperienced, I try to convince them of the merits of Friendly AGI, blissfully unaware of the inferential distance between my memes and theirs. Obviously, they aren’t going to understand what the heck I am talking about; they don’t know what a computer is, let alone a transhuman computerized intelligence. But they probably wouldn’t say “We don’t understand you”; to admit ignorance would be to admit weakness, and to lose precious social status. Instead, the tribal shaman would probably conclude that I am talking about god-spirits, or witch-doctoring, or something else entirely.
This behavior is fairly common in humans and has been observed across a wide variety of situations. When faced with a new and unknown idea, we do not usually admit (even to ourselves) that it is new and unknown. Instead, we mentally “tweak” the idea, reinterpreting the words so that it falls into a pre-existing mental category. This was quite useful in tribal Africa; there really wasn’t much new under the sun, so if something sounded weird or unusual, it was probably something we had heard about before and simply misunderstood. But in the modern environment, where new ideas are published every thousandth of a second or so, the same habit can lead to large-scale misunderstandings across entire populations.
When Einstein’s theory of relativity (both special and general) was first published, it was widely considered counterintuitive and hard to understand. The mathematics behind relativity was not generally known (it still isn’t), so popular writers trying to explain the exciting new theory had to make do with fancy words like “space-time” and “mass-energy equivalence”. When the Nazis began denouncing the Jews, they labeled Einstein’s theory “pseudoscientific” and “abstract”; these words didn’t mean anything, but they sounded good, and the populace bought them. The populace might not have been able to understand four-vectors, but they could understand ivory-tower academics and pompous nonsense; after all, they had seen it all before. The Nazis, in effect, exploited the availability heuristic: instances of crackpottery were far more mentally available than instances of counterintuitive theories that turned out to be right, and so the public funneled relativity into the mental box labeled “crackpottery”.
Meanwhile, over in the United States, the new cultural fad was relativism: the idea that there is no “absolute truth”, but that different things are true for different people. The literature on special relativity had little to say about “absolute truth”, as the physicists writing it were more concerned with accuracy than with philosophical implications. But to the relativist philosophers, special relativity looked like the confirmation they had been waiting for: certain quantities varied from observer to observer (the Lorentz invariants seem to have been forgotten). Although the two actually had little to do with each other, they sounded alike in the popular literature, and so the representativeness heuristic kicked in. And lo, Einstein’s theories were brought forth in the philosophical literature as “scientific evidence!” that the relativists were right; Einstein’s theory was funneled into the “relativist philosophy” mental box.
This concept-funneling effect implies that if you write about transhumanism (or any esoteric topic), even if you manage to purge popular culture from your mental toolbox, people will still “understand” your ideas in terms of whatever is “hip”. And if you are misunderstood at the start of your writing, you will not get a chance to clarify yourself; people will automatically make the phony connection and start substituting their own experiences for your explanations. It may not be a good idea to write for the “common public”, or for any group that lacks the mental building blocks needed to form transhumanist ideas; if they misunderstand you once, you will probably never get to explain things properly at all.