Suzana Herculano-Houzel is an associate professor at the Federal University of Rio de Janeiro, Brazil, where she heads the Laboratory of Comparative Neuroanatomy. She is a Scholar of the James McDonnell Foundation, and a Scientist of the Brazilian National Research Council (CNPq) and of the State of Rio de Janeiro (FAPERJ). Her main research interests are the cellular composition of the nervous system and the evolutionary and developmental origins of its diversity among animals, including humans; and the energetic cost associated with body size and number of brain neurons and how it impacted the evolution of humans and other animals.
Her latest findings show that the human brain, with an average of 86 billion neurons, is not extraordinary in its cellular composition compared to other primate brains – but it is remarkable in its enormous absolute number of neurons, which could not have been achieved without a major change in the diet of our ancestors. That change was provided by the invention of cooking, which she proposes was a major watershed in human evolution, enabling the rapid expansion of the human brain. A short presentation of these findings is available at TED.com.
She is also the author of six books on the neuroscience of everyday life for the general public, a regular writer for the Scientific American magazine Mente & Cérebro since 2010, and a columnist for the Brazilian newspaper Folha de São Paulo since 2006, with over 200 articles published in this and other newspapers.
Luke Muehlhauser: Much of your work concerns the question “Why are humans smarter than other animals?” In a series of papers (e.g. 2009, 2012), you’ve argued that recent results show that some popular hypotheses are probably wrong. For example, the so-called “overdeveloped” human cerebral cortex contains roughly the same percentage of total brain neurons (19%) as the cerebral cortices of other mammals do. Rather, you argue, the human brain may simply be a “linearly scaled-up primate brain”: primate brains seem to follow more economical neuronal scaling rules than those of other mammals, and humans have the largest brain of any primate, and hence the most total neurons.
Your findings were enabled by a new method for neuron quantification developed at your lab, called “isotropic fractionator” (Herculano-Houzel & Lent 2005). Could you describe how that method works?
Suzana Herculano-Houzel: The isotropic fractionator consists pretty much of turning fixed brain tissue into soup – a soup of a known volume containing free cell nuclei, which can be easily colored (by staining the DNA that all nuclei contain) and thus visualized and counted under a microscope. Since every cell in the brain contains one and only one nucleus, counting nuclei is equivalent to counting cells. The beauty of the soup is that it is fast (total numbers of cells can be known in a few hours for a small brain, and in about one month for a human-sized brain), inexpensive, and very reliable – as much or more than the usual alternative, which is stereology.
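As a rough illustration of the arithmetic this implies – nuclei counted in small sampled aliquots, scaled up by the known total volume of the suspension – here is a minimal sketch in Python. The counts, volumes, and function name are hypothetical; this is not the lab’s actual protocol or software.

```python
def estimate_total_cells(counts_per_aliquot, aliquot_volume_ul, total_suspension_volume_ul):
    """Estimate total cells in a brain-tissue suspension of known volume.

    counts_per_aliquot: nuclei counted under the microscope in each sampled aliquot
    aliquot_volume_ul: volume of each aliquot, in microliters
    total_suspension_volume_ul: known total volume of the suspension, in microliters
    """
    # Mean nuclear density across the sampled aliquots (nuclei per microliter).
    mean_density = sum(counts_per_aliquot) / (len(counts_per_aliquot) * aliquot_volume_ul)
    # One nucleus per cell, so total nuclei = total cells.
    return mean_density * total_suspension_volume_ul

# Hypothetical example: four 10-uL aliquots drawn from a 50,000-uL suspension.
counts = [820, 790, 805, 815]
print(estimate_total_cells(counts, 10, 50_000))  # ~4.0 million cells
```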
Stereology, in comparison, consists of cutting entire brains into a series of very thin slices; processing the slices to allow visualization of the cells (which are otherwise transparent); delineating structures of interest; creating a sampling strategy to account for the heterogeneity in the distribution of cells across brain regions (a problem that is literally dissolved away in the detergent that we use in the isotropic fractionator); acquiring images of these small brain regions to be sampled; and actually counting cells in each of these samples. It is a process that can take a week or more for a single mouse brain. It is more powerful in the sense that spatial information is preserved (while the tissue is necessarily destroyed when turned into soup for our purposes), but on the other hand, it is much more labor-intensive and not appropriate for working on entire brains, because of the heterogeneity across brain parts.
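For comparison, the scaling step in a common stereological design (the optical fractionator, named here as an illustration rather than a description of any particular study) multiplies the cells counted in the sampled probes by the inverse of each sampling fraction. A minimal sketch with hypothetical numbers:

```python
def optical_fractionator_estimate(counted_cells, section_sampling_fraction,
                                  area_sampling_fraction, thickness_sampling_fraction):
    """Scale cells counted in sampled probes up to a whole-structure estimate.

    counted_cells: total cells counted across all sampled probes
    section_sampling_fraction: fraction of sections sampled (e.g. 1 in 10 -> 0.1)
    area_sampling_fraction: counting-frame area / sampling-grid area
    thickness_sampling_fraction: optical disector height / section thickness
    """
    return counted_cells / (section_sampling_fraction
                            * area_sampling_fraction
                            * thickness_sampling_fraction)

# Hypothetical numbers: 500 cells counted, 1 section in 10 sampled, 1% of each
# section's area sampled, and a disector spanning half the section thickness.
print(optical_fractionator_estimate(500, 0.1, 0.01, 0.5))  # 1,000,000 cells
```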