The Anthropology of Everyday Life

Monday, June 12, 2017

Baby Food


 My daughter’s first solid food, at three months of age, was ossobuco, the Italian dish of meat, wine and vegetables. If we had been Italian, or even visiting Italy, this would have made sense. 

In fact, if she had been gumming her mashed up ossobuco in a restaurant in Italy the other diners would have ignored her, or clapped. 

Since then, she has gone through phases of liking or not liking particular foods, but this past weekend she texted me a photograph of herself eating jellyfish, and I thanked the ossobuco and the wide array of dishes that she’s been offered over nineteen years for her adventurous gastronomic spirit. 

But apparently, loading her baby spoon with food from other cultures is not all that was going on at our table. In a series of experiments with over 200 one-year-olds, developmental psychologists Zoe Liberman and Katherine Kinzler of Cornell University watched babies as the babies watched films of adults eating. This research protocol is a walk in the park because babies are fascinated with other people, and when they gaze at someone for a long time, it's meaningful. In general, the babies paid little attention if one person liked a food and the next person did as well, but they stared longer, presumably confused, when the subsequent diner was disgusted by the test food. 

More important, the babies also made layered social distinctions. If the two diners acted like friends and spoke the same language the babies expected them to like the same things. If they acted like enemies or spoke different languages, the babies expected different reactions to food.

We know that what we eat is highly cultural. Just discuss cupcakes and orange soda with the Maasai and watch them make the yuk face while we, in Western culture, would be hard pressed to drink a cocktail of milk and blood. Every culture’s diet is based on a particular kind of subsistence pattern linked to such mundane things as climate, topography, and available raw materials. The recent study shows that babies are not just being indoctrinated into their own cultures by the foods they are offered. They are also innately clocking people who look or talk the same or different, noting enemies and friends, figuring out who to trust and who not to trust. In other words, eating with others is one way babies go about filling in their social map. And eating alone is lonely.

As such, food is not just a cultural moment or a window to the past, it is not just identity or nutrition. Food and what we like or dislike is also one of the threads of connection that signal someone is one of us or not, a point of social communication that even infants recognize.

If I had known all this nineteen years ago, I might have paid more attention to the context of my daughter's first real meal. I would have seen her taking note of the reactions of the people at the table, good friends and devoted foodies who loved ossobuco. Her growing baby brain, already geared to such calculations, would have surely digested the fact that these people were part of our tribe and that she was culturally home.

Thursday, May 25, 2017

The Monkey in the Coal Mine

A recent outbreak of yellow fever in Brazil has resulted in at least 240 human deaths and over 4,400 monkey deaths. The outbreak has also had a secondary fatal effect on the monkeys—people are capturing and killing monkeys, or clubbing and stoning them to death, thinking the monkeys must somehow be at fault. 
In fact, the monkeys have nothing to do with it, and authorities are now begging citizens to stop killing them. 

Yellow fever is transmitted by mosquitoes, not mammals, and certainly not monkeys. As the death toll shows, these animals, fellow primates, are just as vulnerable to yellow fever as humans, maybe more so. 

And of course, the fault is really ours. Slash-and-burn agriculture, deforestation, and climate change have made swamps out of large swaths of tropical forest. Swamps where mosquitoes thrive. The human touch, fueled globally by greed, is turning a once pristine ecosystem into a charnel house. 

In that scenario, monkeys are actually useful and shouldn't be bludgeoned to death because they can be harbingers of infectious disease. (This sort of explanation, pointing out how some animal should be saved because it's useful to humans, pisses me off. But then I don't think humans are in charge of everything and every creature.) 

That is, Brazilian authorities point out, the monkeys are the tropical equivalent of  "canaries in the coal mine." Miners used to bring caged canaries into the mines and when a canary died, they knew it was time to get out of that hole as soon as possible. 

Danilo Simonini Teixeira, the president of the Brazilian Society of Primatology, says that people living in areas gripped with yellow fever don't seem to understand that monkeys are crucial to signaling the onset and march of diseases. Monkeys and humans are closely related primates, and so when monkeys start dying it means something bad for humans. 

Also, monkey deaths from yellow fever are putting some species, such as the golden lion tamarin, at risk of extinction.

Brazil has the greatest diversity of primate species on earth, and what a shame to lose any of it at the direct hand of humans, as if the human-caused habitat destruction weren't enough. 

When we are scared, we pick on the vulnerable, even when it isn't their fault, even when they have absolutely nothing to do with it, even if they are suffering as well. 

And even when those vulnerable are so incredibly beautiful.


Monday, May 22, 2017

A Head for the Future

Oh, the endless thinking. The ruminating that never, ever, stops, even when we are asleep. We think and think about the past and the future and often, very often, it takes effort to focus on the present.

I've always considered this mind buzz an unwelcome consequence of having a big brain. In fact, I believe that self-consciousness, which seems like such a good idea, is actually a human curse that we have to endure because it came along with the much more important puzzle solving skills that evolved to help us survive.

A recent opinion piece in the NY Times takes this idea a bit further—into the future. Authors Martin Seligman (a psychology professor at the University of Pennsylvania) and John Tierney (a science journalist) suggest that what really separates humans from other animals is not tool use nor language, but our ability to think about the future and come up with all sorts of possible scenarios.

The evolutionary advantage of this skill is obvious—lying awake at night trying to map where a tasty antelope might be going, or projecting where some tree might be fruiting, must have been a good thing.

But these days all this evolved forethought is not so advantageous. The problem is that humans are unable to turn off that mental shuffling through a zillion ways that things can turn out and we often get stuck on the negative possibilities while ignoring the positive possibilities.

Imagined catastrophes can be paralyzing, and they are the root of depression. Depression is, after all, the loss of hope and thinking that nothing will ever get better. In other words, the future looks bleak. But in reality, the future is unknown and things might actually turn out pretty well.

The trick is to include the good possibilities, not just the bad and depressing ones, when you let your mind wander on its own into the future.

Monday, May 15, 2017


We all have regrets, and usually they are highly personal. Most of them are about decisions we made long ago, and when ruminating (or obsessing) about these regrets, we fantasize that a different choice might have led to a different life. One that would, of course, have been better than how we ended up.
But there are some people who regret what they did to others, and that must be a hell of its own kind. And what if the people you hurt were strangers? What if you played a major role in actually changing their way of life? And not for the better?

I'm not talking about politicians, or despots, or law makers, but anthropologists, people who have also, in many situations, had a hand in destroying the very cultures they studied.

In the last few months, The New York Times published two articles on the people of the Andaman Islands. These islands are in the Bay of Bengal and under the jurisdiction of India. And what makes Andaman Islanders so special is that they are hunters and gatherers who have only recently been integrated into modern Indian life, and it's fair to say things are not going well.

Western culture has a very long history of trying to end, or protect, what it sees as "primitive cultures" (an insult in itself). Although the Indian government decided to try to protect the Jarawa and Sentinelese people by surrounding their land with a buffer zone, the modern world leaked in. Indian anthropologist T.N. Pandit, now 82, knows he is partly, or fully, to blame for encouraging these people to leave the forest and interact with Indians. For two decades he spent time with the islanders, and eventually, they did indeed leave their home and seek the goods of modern culture. The result is a destruction of aboriginal life and the very soul of a people. And lots of horrific culture clash.

For example, The Times also reported on the homicide of a 5-month-old baby that was conceived when a Jarawa woman was raped by an outsider. The light-skinned baby was clearly not a full Jarawa and apparently they have a tradition of killing babies outside their blood line.

It's a mess and the authorities don't know who to prosecute or what to do.

Pandit now regrets his role in what has turned out to be the usual story of a society that went from self-sufficiency and an intact cultural structure to corruption and poverty. As Pandit said to The Times, "Now they have gotten infected. They have been exposed to a modern way of life and they cannot sustain. They have learned to eat rice and sugar. We have turned a free people into beggars."

Mr. Pandit sits at home ruminating on his life as a destroyer of a culture and a people. But he is not alone. As the world becomes more populated, and its most remote environs penetrated, this story will continue to be the usual one. There is something about "modern" culture (read Western culture) that has a manifest destiny about it. We can't help butting in, and these groups can't help wanting what we have. 

Our culture and society, when first viewed, seems so shiny and full of great things to see and to own. But in the end, it just doesn't fit everyone. In fact, it doesn't even work for many who have been citizens of modern culture since they were born. 

Monday, February 6, 2017

Keep Going

It's no secret that Western culture is not exactly the most healthy of places. Sure, we have lots of stuff and lots of food, but our affluence has also brought lots of down time. Or sitting down time. On sofas, in cars, across La-z-Boys. And the result is not pretty.

Anthropologists have long suggested that lounging around and stuffing our faces is not exactly how we were evolutionarily brought up. 
Instead, the theory goes, humans are physically hunter-gatherers, people who have to run after game or wander about the landscape for tubers and so our bodies are supposed to be on the go.

This theory was recently underscored with the generous aid of hunter-gatherers themselves. Researchers from the University of Arizona and Yale University recruited Hadza hunter-gatherers in Tanzania to wear heart monitors for two weeks as they went about their day. 

As the data show, the Hadza have great heart health at any age and, as expected, it's because the Hadza are always on the go. They aren't running and jumping but simply briskly active for more than two hours a day. 

Men follow game all day and women walk into the bush and dig vigorously and so their heart rates are up. 

As a result their blood pressure is down and their heart muscle exercised. 

The Hadza also lie around a lot, but that's the reward for finding food, not the normal position for eating it.

Of course, this is not rocket science, but it is one way anthropologists have added to the conversation about the rate of obesity in Western culture and our modern health crisis. We now live in a world where it's almost mandatory to drive to get food, drive it home, and eat it sitting in front of the T.V.

Funny, Western culture also has the highest rate of depression in the world. Hey anthropologists, anything to add about that?

Monday, January 30, 2017

Lucy in the Trees Without Diamonds

When I first learned the human fossil record back in my undergraduate days, it was a straight shot from Homo habilis to modern humans. But since then, the path of human evolution has become a tangled tree with many branches, and it's much harder to explain. Now we have all kinds of Australopithecines and any number of the species Homo and with each discovery comes a rethinking of our past.

One of those controversies focuses on Australopithecus afarensis, and more specifically on the 40% complete fossil specimen affectionately known as "Lucy."

I admit that Lucy has a special place in my heart. For many years I hung out with the paleontologists who found her, and so her fame at the time as the most ancient hominid feels like part of my personal history.

During my first years as a professor at Cornell, I had the job of purchasing and organizing fossil casts so that students could look at, and even hold, their ancestors during Introductory Biological Anthropology.
For about two years, all sorts of casts were mailed to me, but none were as special as those of AL 288-1 (Lucy's official fossil name). She arrived in many boxes, and I had to unwrap each piece and place them, one by one and in correct anatomical orientation, onto sheets of foam set in wood drawers. It was, for me, not so much about getting ready for a class as a sacred, deeply moving, act.

I remember so distinctly holding the cast of Lucy's tiny pelvis and thinking about bipedalism and how that bone confirmed that Lucy and her kind walked on two legs, even 3.2 million years ago, meaning she was a human, not an ape.

But since that time, researchers have deduced that Lucy actually retained some ape characteristics—curved hand bones, long arms, and most recently, heavy use of her arm bones, all suggesting lots of time spent in trees, perhaps.

I don't really care where Lucy spent her time. She's my special fossil. It was this woman's bipedalism that pushed the hominid lineage back millions of years. In a sense, Lucy stood up for us. And she was, in fact, the first woman to stand up for herself and others.

Our legacy, in other words, started with her.
