Editor's note: Clifford Nass is the Thomas M. Storke Professor at Stanford University and director of the Communication between Humans and Interactive Media (CHIMe) Lab. He is the author of "The Man Who Lied to his Laptop: What Machines Teach Us About Human Relationships," "Wired for Speech" and "The Media Equation."
(CNN) -- Speaking is profoundly human: More of the human brain is devoted to speech than to any other activity. People can have an IQ of 50, or a brain only one-third the normal size, and struggle with many simple tasks, yet they can still speak.
Humans are so tuned to words that from about the age of 18 months, children learn about eight to 10 new words a day, a rate that continues until adolescence.
Humans love to speak: When two hearing people encounter each other, they will speak, despite having other means of communication such as gesturing or drawing. Even when people speak different languages or come from different cultures, they will try to find common words and phrases.
One-day-old infants can distinguish speech from any other sounds and 4-day-olds can distinguish between their native language and other languages. Even in the womb, a fetus can distinguish her or his mother's voice from all other female voices. Adults can distinguish speech sounds at twice the rate of any other sounds, aided by special hair cells in the outer right ear.
Among all animals, only humans have the breathing apparatus and musculature needed to speak: despite "Planet of the Apes," no primate could speak like a person, even if its brain grew larger. Even human ancestors such as the Neanderthals could not possibly speak: speech is a new and remarkably impressive ability.
So, there is nothing so human as speech -- at least there wasn't until modern technologies came along. Thanks to striking advances in computers' ability to understand and produce speech, it is now common to use your telephone to make airline reservations, answer questions and search the Web.
Because of the shrinking size and increasing speed of computers, it is also possible to speak directly to your automobile.
From putting up with the car intoning, "Your door is ajar," we have moved to navigation systems that can tell you where to find a latte and car interfaces that understand spoken commands and even allow drivers to dictate e-mails and texts and to make phone calls.
What could be simpler and more natural than talking, even to a technology? And speaking to cars seems particularly desirable. We don't have to take our eyes from the road or our hands from the wheel to select buttons or make choices: Why not let our mouths and our ears do all the work?
Unfortunately, it's not so simple or so desirable.
Recent research by the AAA Foundation for Traffic Safety, conducted by David Strayer at the University of Utah, finds that this new technology can be so distracting that it impairs the ability to drive. The studies found that speaking while driving overloads our attention: It takes our minds, if not our eyes, off the road.
Here are three reasons why talking while driving is so distracting, and not as safe and effective as you might think:
People like to picture who they are talking with. When you speak with someone face-to-face, you "hear lips and see voices": Your brain automatically and easily focuses on the person.
When you speak on the telephone, you use brainpower to create a mental image of the person you are talking with: The less you know the person, the more mental workload it takes. When you talk to a car, use a phone in a car or dictate a text message, your brain has to do a great deal of work to picture with whom you are communicating. When you're thinking that hard, it's very difficult to pay attention to the road.
That's why talking on a cell phone -- hands free or not -- is much more dangerous than talking to a passenger. The need to imagine the other person steals attention from the road.
People want to be understood. Although people love to speak, few things are more frustrating than someone not listening. Listeners put a great deal of energy into showing that they are listening: They nod their heads, say "uh huh," open their eyes and change their posture. People are built to expect these signals of attention, but cars fail to provide them.
As a result, drivers become overly concerned with whether the car understands or is even listening, and their attention is again drawn away from the road. In addition, the car's voice lacks the rich vocal cues that indicate engagement and emotion, further suggesting that the car does not understand.
Cars are not native speakers. When you encounter someone who isn't fluent in your language, you have to put a great deal of effort into selecting the right words, avoiding idioms and speaking slowly and clearly. In these instances, speech is no longer an easy and natural means of communication.
While it is remarkable that cars can understand an ability that took millions of years of evolution to develop, the typical car recognition rate of 85% to 95% makes the car a mediocre second-language speaker. As a result, speech becomes effortful and demanding, stealing attention from the road.
Because of these problems, my laboratory and laboratories around the world are trying to find ways to support drivers in creating mental images, to show that the car wants to understand, and to enable the car to understand at levels equal to or even better than a person.
And soon cars will be driving themselves, so that people can ignore the road and multitask their way to fighting for attention from each other, just as they do outside the car.
The opinions expressed in this commentary are solely those of Clifford Nass.