
How funny is this word? The 'snunkoople' effect

Date:
November 30, 2015
Source:
University of Alberta
Summary:
How do you quantify something as complex and personal as humour? Researchers have now developed a mathematical method of doing just that -- and it might not be quite as personal as we think.

How do you quantify something as complex and personal as humour? University of Alberta researchers have developed a mathematical method of doing just that -- and it might not be quite as personal as we think.

"This really is the first paper that's ever had a quantifiable theory of humour," says U of A psychology professor Chris Westbury, lead author of the recent study. "There's quite a small amount of experimental work that's been done on humour."

"We think that humour is personal, but evolutionary psychologists have talked about humour as being a message-sending device."

The idea for the study was born from earlier research in which test subjects with aphasia were asked to review letter strings and determine whether they were real words or not. Westbury began to notice a trend: participants would laugh when they heard some of the made-up non-words, like snunkoople.

It raised the question -- how can a made-up word be inherently funny?

The snunkoople effect

Westbury hypothesized that the answer lay in the word's entropy -- a mathematical measure of how ordered or predictable it is. Non-words like finglam, with uncommon letter combinations, are lower in entropy than other non-words like clester, which have more probable combinations of letters and therefore higher entropy.

"We did show, for example, that Dr. Seuss -- who makes funny non-words -- made non-words that were predictably lower in entropy. He was intuitively making lower-entropy words when he was making his non-words," says Westbury. "It essentially comes down to the probability of the individual letters. So if you look at a Seuss word like yuzz-a-ma-tuzz and calculate its entropy, you would find it is a low-entropy word because it has improbable letters like Z."

Inspired by the reactions to snunkoople, Westbury set out to determine whether it was possible to predict what words people would find funny, using entropy as a yardstick.

"Humour is not one thing. Once you start thinking about it in terms of probability, then you start to understand how we find so many different things funny."

For the first part of the study, test subjects were asked to compare two non-words and select the option they considered to be more humorous. In the second part, they were shown a single non-word and rated how humorous they found it on a scale from 1 to 100.

"The results show that the bigger the difference in the entropy between the two words, the more likely the subjects were to choose the way we expected them to," says Westbury, noting that the most accurate subject chose correctly 92 per cent of the time. "To be able to predict with that level of accuracy is amazing. You hardly ever get that in psychology, where you get to predict what someone will choose 92 per cent of the time."

People are funny that way

This near-consensus about which non-words are funnier says a lot about the nature of humour and its role in human evolution. Westbury refers to a well-known 1929 study by the Gestalt psychologist Wolfgang Köhler in which test subjects were presented with two shapes, one spiky and one round, and were asked to identify which was a baluba and which was a takete. Almost all the respondents intuited that takete was the spiky object, suggesting a common mapping between speech sounds and the visual shape of objects.

The reasons for this may be evolutionary. "We think that humour is personal, but evolutionary psychologists have talked about humour as being a message-sending device. So if you laugh, you let someone else know that something is not dangerous," says Westbury.

He uses the example of a person at home who believes they see an intruder in their backyard. This person might then laugh when they discover the intruder is simply a cat rather than a cat burglar. "If you laugh, you're sending a message to whoever's around that you thought you saw something dangerous, but it turns out it wasn't dangerous after all. It's adaptive."

Just as expected (or not)

The idea of entropy as a predictor of humour aligns with a 19th-century theory from the German philosopher Arthur Schopenhauer, who proposed that humour is a result of an expectation violation, as opposed to a previously held theory that humour is based simply on improbability. When it comes to humour, expectations can be violated in various ways.

In non-words, expectations are phonological (we expect them to be pronounced a certain way), whereas in puns, the expectations are semantic. "One reason puns are funny is that they violate our expectation that a word has one meaning," says Westbury. Consider the following joke: Why did the golfer wear two sets of pants? Because he got a hole in one. "When you hear the golfer joke, you laugh because you've done something unexpected -- you expect the phrase 'hole in one' to mean something different, and that expectation has been violated."

The study may not be about to change the game for stand-up comedians -- after all, a silly word is hardly the pinnacle of comedy -- but the findings may be useful in commercial applications such as product naming.

"I would be interested in looking at the relationship between product names and the seriousness of the product," notes Westbury. "For example, people might be averse to buying a funny-named medication for a serious illness -- or it could go the other way around."

Finding a measurable way to predict humour is just the tip of the proverbial iceberg. "One of the things the paper says about humour is that humour is not one thing. Once you start thinking about it in terms of probability, then you start to understand how we find so many different things funny. And the many ways in which things can be funny."


Story Source:

Materials provided by University of Alberta. Original written by Kristy Condon. Note: Content may be edited for style and length.


Journal Reference:

  1. Chris Westbury, Cyrus Shaoul, Gail Moroschan, Michael Ramscar. Telling the world’s least funny jokes: On the quantification of humor as entropy. Journal of Memory and Language, 2016; 86: 141. DOI: 10.1016/j.jml.2015.09.001

