Spring 2013

The Real Trouble with Day Care

Mary Eberstadt

Re-published by permission of Mary Eberstadt and Sentinel, a member of Penguin Group, from Mary Eberstadt, Home Alone America: Why Today’s Kids Are Overmedicated, Overweight, and More Troubled Than Ever Before (New York: Sentinel, 2004), chapter 1 (pp. 1-21).

Not too long ago – ironically, on a day I had spent buried under just a little of the vast literature on what is called “early child development” – our ten-year-old daughter skipped home from school with some unexpectedly apt news. Her class would soon be volunteering some time at a local day-care center – and not just any day-care center, but the snazziest of several in our Washington, DC, neighborhood: a cheerful and inviting high-end sort of place much prized by the parents whose infants and small children spend their weekdays there.

Like most girls her age, this one adores babies and toddlers, so she was elated at the idea. It was all the more surprising then when she returned home on the day of her visit with a long face. As things turned out, the day-care center had not been the fun she had expected, and the reason was this: “There was a boy, a little boy, who was really sick and cried the whole time. His ear was all red, and he shrieked if they even touched it. The day-care ladies were nice and everything, but he wouldn’t stop. It was just so sad. All he did was keep screaming the same thing over and over: Mommy! Mommy! Mommy!”

In this way a distressed ten-year-old, empathizing with an even more distressed two-year-old, captured something I had been struggling to formulate for weeks – namely, exactly what our long-running national controversy over institutional child-care is not about. It is not about that screaming toddler. It is not in fact about the immediate emotional experience of any toddlers or babies who spend most of their waking hours out of their homes and in nonfamily care. That is to say, for all the many things our discussion is about, it is not about this perhaps most prosaic of facts: institutional care as it is experienced by real, live, very small children.

No, our ongoing national child-care debate – and it is a real enough debate, among the most heavily documented controversies of our time – is a more sanitized, abstract, at times even a fastidious thing. It is told of, by, and for educated adults, and its vernacular is that of scholarly social science. Does day-care affect long-term “personality development”? “Cognitive ability”? “Educational readiness”? Is “attachment theory” out and “early socialization” in? Where are the “longitudinal data” in all this, and just how “statistically significant” are those sample sizes? These are the sorts of things that we talk about when we talk about day-care, whether we ourselves are “for” it or not.

And just as the argument over institutional care is dominated by talk of outcomes and effects, so also is it advocated on the same basis: results. “My kids got dropped off at day-care,” as a feminist put it one Mother’s Day in the New York Times, “and one is now finishing up at Brown, and the other went through Harvard and Oxford.” “Our son,” bragged another in the Washington Post, also that Mother’s Day, “got a 3.6 grade point average in grad school and was the valedictorian of his class” – and in addition, “Our daughter [is a] Shakespearean actress.” The day-care proof, as advocates see it, is in the achievement pudding. In a 1997 book called When Mothers Work: Loving Our Children Without Sacrificing Ourselves, Joan K. Peters summarizes some of the research behind such boosterism: this British study argues that children of employed mothers read better than those of at-home ones; that American study claims that children left in day-care from one month on develop higher cognitive and language abilities; and Alison Clarke-Stewart’s work argues that day-care children are more confident and “socially skilled” than others.[1]

It is not only advocates who think that institutional care rises or falls by the standard of outcomes, but also, for different reasons, the critics of institutional care. For the most part these writers make the opposite empirical point – either that the data do not suggest the rosy outcomes advocates believe in or that the “good” data on cognitive and language skills are outweighed by the “bad” data on a variety of behavioral problems. The work of Jay Belsky, perhaps the best-known authority to raise questions about day-care’s possible negative impact on some children, exhibits both lines of empirical criticism. So does researcher Brian C. Robertson’s 2003 book, Day-Care Deception: What the Child-Care Establishment Isn’t Telling Us, which uses the “bad” data to argue that if parents knew more about the real facts of day-care, they would try harder to avoid it.[2] Moreover, even critics who have made nonempirical arguments against institutional care tend to invoke the long run – that is, the imagined effect on such protocitizens down the road. One particularly interesting recent example is a 2003 essay called “A Schoolhouse Built by Hobbes” by Bryce Christensen, which argues against day-care on the grounds that it weakens the attachment to family necessary for later character formation, thus contributing to the over-individuation of American society.[3]

Generally speaking, then, both the critics and the advocates of institutional care agree about one thing: It is the effects, whether behavioral or cognitive or other, that make or break the case for day-care. This emphasis on the long run is only natural, of course; parents do indeed care very much about results of all kinds. In fact, as the ones most likely to have the long-term interests of the child at heart, parents by definition must care about such things; it would be perverse if they did not.

Yet this focus on the long term, natural as it may be, has also obscured one important related point: To say that day-care should be judged on the long-term results is not to say that those results are the only measure by which to judge this experiment. Here, as in other serious arguments, ends aren’t everything; the question of what happens in the here and now also needs to be factored in.

Let us momentarily grant for the sake of argument that most children who grow up in institutional care turn out fine. To advocates this is where the controversy over day-care begins and ends; case closed. But they are wrong. The notion that “most kids will turn out fine anyway” does not end the question of whether institutional care is good or bad; actually, it should be only the beginning. That other question, about immediate effects, demands to be answered, too. It is not about whether day-care might keep your child out of Harvard ten or twenty years from now or launch him into it, but, rather, about the independent right or wrong of what happens to him day to day during the years that he is most vulnerable and unknowing. Reduced to its simplest form, that inquiry goes something like this: What about the way this radical change in care is experienced by babies and young children? Do we know anything about that, and, if so, does that knowledge deserve any moral weight at all?

This chapter is an attempt to answer that question about contemporaneous as opposed to long-term harm. It argues that institutional care is a bad idea for parents who do have a choice because it raises the quotient of immediate unhappiness in various forms among significant numbers of children, and the continuing ideological promotion of such separation causes the related harm of desensitizing adults to what babies and children actually need. Yes, many parents have to use day-care. But there is a difference between having to use it and celebrating the institution full-throttle. What follows is an argument about why that difference matters.

Day-Care as Germ Factory

The reason for beginning with institutional care, as opposed to other forms of substitute care, is simple: That is the chosen battleground of advocates who have argued over the years that such care is as good as or even better than maternal care or non-maternal care in other forms – an older sibling or grandparent, a babysitter in the home, a turn-taking arrangement with the mother next door, and so on.

This ideological defense of mother-child separation is not new, of course. As Allan Carlson showed recently in an interesting essay on the history of such attempts, its pedigree stretches all the way back to Plato and includes many other thinkers through the centuries.[4] In our own time such advocates generally have been dubbed “feminists.” I will refer to their ideology instead as “separationism” and to its advocates as “separationists,” for that is what they are – thinkers who urge institutional care not as an inevitable practical choice for some, but as a theoretical choice that allegedly advances higher personal and social goals. This is how institutional care has come to be rationalized and promoted.

One immediate harm of such care – or at least what some people would regard as harm – is familiar to all pediatricians and many parents. Day-care centers literally make children sick, and they do so a lot more efficiently than care at home. The screaming toddler with whom I opened this chapter is not the exception but the norm; he is perhaps on the extreme end of pain (of course, not all children in day-care spend their days this way), but it is the norm nonetheless. He represents the truth that just being in day-care increases the likelihood of physical distress. That is because infections are more likely among babies or toddlers tended to in an institutional setting – for three rather obvious reasons. First, infants in full-time care are almost certainly not being breast-fed, or not much at any rate, so the immunological benefits of human milk are not being supplied to them. This raises the risks of their contracting ailments no matter where they are. Second, certain specific things about babies and toddlers, such as diaper-wearing and constant hand-to-mouth contact, make them germ carriers beyond compare, especially germs transmitted by saliva or feces. Third, the sheer number of children encountered every day in such institutions – which is far higher than for children at home even in large families – further and dramatically raises the likelihood of infection. It is like playing pathogen roulette with five bullets instead of two.

In a medical nutshell, and as parents who use day-care already know, children in it tend to be sick more often than others. Consider the example of otitis media, commonly known as an ear infection and the single most common complaint that brings children to the doctor. Otitis media itself is not contagious but is caused by upper respiratory ailments (URAs) that are. Over the past couple of decades, as any pediatrician can tell you – to say nothing of those millions of parents still harboring a sticky pink bottle of antibiotic somewhere in the refrigerator – ear infections in children, especially young children, have risen dramatically. Why? For the reason that Dr. Charles Bluestone, an otolaryngologist (ear, nose, and throat specialist), gave one newspaper: “Virtually every study ever done on the increase in otitis media has shown that day-care is the most important difference.”[5]

And otitis media is only the beginning. One current American Academy of Pediatrics fact sheet on “Controlling Illness in Child Care Programs” – the title is suggestive in itself – enumerates a number of other infections that are spread more easily in day-care, from the common cold to gastrointestinal problems to any number of skin and eye infections (impetigo, lice, ringworm, scabies, cold sores, and conjunctivitis, or pinkeye). In fact, hepatitis A, which can be transmitted by contact with feces and is actually more serious for adults than for children, is such an issue in center-based care that this paper further recommends vaccines for “high-risk occupations” – that is, day-care workers.

Medically speaking, the story of day-care as germ central is relatively old news; it has been more than ten years since Pediatric Annals, an authoritative source for pediatricians, devoted a special issue to the subject and titled its editorial “Day-care, Day-care: Mayday! Mayday!”[6] But what has lagged in the popular understanding, at least to judge by the relative absence of writing on the subject, is what might be called the phenomenological face of all this – that is, what numbers like these mean in real life for people, including babies and toddlers.

Something like that need has lately been supplied by Harvard professor Jody Heymann, who devotes considerable space to examining real-life case studies of contemporary family life in her 2000 book on inequality, The Widening Gap. (It was based on extended interviews with more than eight hundred individuals, including workers in the child-care industry as well as parents.) The day-care employees repeatedly emphasize the problems of having to work not only around sick babies and children, but also around desperate parents who drop off those babies and children at day-care rather than miss a day of work. One center worker even coined the term “Tylenol signs” to describe what is evidently common practice: dosing a child with fever-lowering medicine at home or in the car just before drop-off, with the result that the caregivers do not realize the child has a fever until several hours later when the effects wear off and the child’s temperature goes back up. Of course this is contrary to the rules of most centers; since fevers usually mean that kids are contagious, they are supposed to stay home when they have them – but this apparently is a rule parents frequently breach. In fact, on account of this “Tylenol” practice, some caregivers also routinely interrogate children about what happened at home – specifically, whether or not they have had any “pink medicine.”

As anyone who has attended even one sick child can attest, the physical and emotional demands of several at once can strain many a “caregiver ratio” to the breaking point. “[M]any of the child-care providers we spoke with,” Heymann summarizes, “described having received children whose acute health problems made it impossible to provide adequate care either for them or for the well children under the child-care providers’ supervision” [emphasis added]. “Problems arose, for example, because the child-care providers could not keep clean and well hydrated the sick children who were vomiting or had diarrhea, give sufficient attention to the sick children’s other needs, and curb the spread of infectious diseases while also trying to care for the healthy children.”[7] Many parents further confirmed these negative findings to the research team. “Overall,” Heymann reports, “41 percent of the parents we interviewed extensively... said their working conditions had negatively affected their children’s health in ways that ranged from children being unable to make needed doctors’ appointments to children receiving inadequate early care, which resulted in their condition worsening.”[8]

Heymann’s account, sad and all too real, is one of several in recent years to have drawn attention to the poor quality of care in many centers and to infer the need for some national “solution” (paradoxically, more and also better day-care). Like most other such advocacy, Heymann’s emphasizes how emotionally difficult it can be for the parents who must manage all these competing claims at once. And who cannot feel for a stressed-out mother torn between an unforgiving workplace on the one hand and a sick baby on the other? To avoid that, as discussed by Arlie Russell Hochschild in her book The Time Bind, increasing numbers of corporations have devised ways to keep parents at their desks, including flex time and other leave arrangements as well as in-house care centers.[9]

Yet like most of the day-care literature, Heymann’s explains the sick child problem from the adult point of view – that is, the stress that a sick child adds to an already hectic schedule. As such, it is of limited moral utility. To get the full measure of the harm possibly transpiring, one must look at it from the point of view of the miserable ailing child in institutional care who is not only being deprived of the familiar people and things that might take the edge off his discomfort, but is also too young to understand where everyone else is and why he feels so bad. Shouldn’t his unhappiness and confusion and lack of fulfillment count for something in the day-care calculus, too? Life is indeed hard and misery abundant for all of us, and as some separationist literature reminds us, kids do have to get used to it. But why don’t advocates answer this question: What age, if any, is too young for induction into the school of hard knocks?

How Do You Spell “Aggression”?

Another immediate harm caused by institutional care, well documented if still bitterly resisted, is that day-care makes some children more belligerent and aggressive – and we are talking not only about the longer term here, but also about the here and now.

The latest evidence to back this claim, well publicized by all sides during the last two years, comes from lengthy investigations by the National Institute of Child Health and Human Development (NICHD), one subset of the National Institutes of Health (NIH). Beginning in 1989, a team of researchers began tracking children at ten different sites to determine what effects, if any, day-care was having on them. Over the years various adverse findings have been thrashed out in the media and elsewhere – for example, that babies and toddlers at various ages appeared less attached to their mothers depending on the amount of time spent in non-maternal care.[10] Even so, perhaps nothing about the NICHD project has proved quite as incendiary as the lead article published in the July/August 2003 issue of Child Development that asked, “Does Amount of Time Spent in Child Care Predict Socioemotional Adjustment During the Transition to Kindergarten?”

Yes, said the research, and not in a good way, at least for some. “The more time children spent in any of a variety of non-maternal care arrangements across the first 4.5 years of life, the more externalizing problems and conflict with adults they manifested at 54 months of age and in kindergarten, as reported by mothers, caregivers, and teachers” are perhaps the most quoted words of their report. “More time in care not only predicted problem behavior measured on a continuous scale in a dose-response pattern but also predicted at-risk (though not clinical) levels of problem behavior, as well as assertiveness, disobedience, and aggression.”[11]

As Jay Belsky, one of the lead researchers, explained elsewhere, the criteria for these problem behaviors were quite specific: aggression meant “cruelty to others, destroys own things, gets in many fights, threatens others, and hits others”; noncompliance/disobedience meant “defiant, uncooperative, fails to carry out assigned tasks, temper tantrums, and disrupts class discipline”; and assertiveness meant “bragging/boasting, talks too much, demands/wants attention, and argues a lot.” All three behaviors increased alongside the amount of time in non-maternal care. The effect did not hold for most of the children; Belsky stressed repeatedly that it was “modest.”

He also stressed, however, that even modest negative findings are important for this reason: “In the US more and more children are spending more and more time in non-maternal care than ever before.” Thus, something that has “a small effect on lots of children” can have a large impact on a given setting – such as school. As Belsky wrote, “Consider the consequences of being a teacher in a kindergarten classroom in which many children have a lot of early, extensive, and continuous child-care experience versus being a teacher in a classroom in which many fewer children have extensive child-care experience.” Given the aggression findings, to put his point rhetorically, in which room would you rather teach?

For daring to draw attention to these findings, Belsky has been excoriated by numerous colleagues as well as by many separationist writers – all the more so because the link between day-care and aggression was only the latest in a series of negative effects turned up by his research. His personal story, a fascinating example of the professional perils of ideological heresy, is told in detail in several places, among them Brian C. Robertson’s book, a chapter in Robert Karen’s thorough 1998 work, Becoming Attached, and a recent essay by Belsky himself titled “The Politicized Science of Day-Care.”[12] Yet as Robertson also documents, Belsky’s report on child aggression is only the latest to suggest that at least some children become more belligerent in day-care than elsewhere. “As far as aggressive behavior goes,” Robertson summarizes, “here too the recent studies simply underscore a long history of findings” – ranging from a 1974 report in Developmental Psychology that found higher levels of verbal and physical abuse among day-care children to numerous more recent studies which showed, as Belsky did, that at least some children institutionalized from infancy appear more likely to hit, kick, push, and otherwise behave badly than do children in noninstitutional care.[13]

This same idea – that institutionalized children might become more aggressive on account of their surroundings – also received strong independent support from a very different kind of study published in Child Development in 1998.[14] Here, researchers measured not behavior – which is intrinsically subjective – but, rather, levels of cortisol, a stress-related chemical, in day-care children. And what they found was suggestive in the extreme – or, as the researchers put it, “remarkable and unexpected.” While most humans apparently exhibit the same daily pattern in which cortisol is highest in the morning and falls in the afternoon, the day-care children tested showed exactly the opposite pattern: their cortisol levels were higher in the afternoon than in the morning. In other words, their internal stress, unlike that of other people, had apparently been mounting through their institutionalized day.

There is much more that one could relay in this social science vein about the connection between institutional care and aggression for at least some kids. Then again, just how many studies do we need to get the point? I have an independent, quite non-expert source for the same connection, a mental picture worth a hundred research bulletins: biting. Yes, biting. Sitting next to me is a stack of advisory literature written for people who run day-care centers or preschools, and apparently one of the most important things they must prepare for, to judge by the amount of attention it receives, is coping with the inevitable occasional outbreak of human biting. According to any number of authoritative sources, as one preschool publication puts it, the biting of one baby or toddler by another is “the earliest and most troublesome unacceptable behavior in the preschool,” one that “can sweep through a preschool like the measles.” Biting is one of the chief reasons that children are expelled from day-care and preschool. An astonishing range of “strategies” has been devised for handling the problem, a range that of course also speaks to its ubiquity. To browse the literature is to learn that many babies and toddlers in institutional care bite and bite a lot. They bite themselves, one another, and, of course, teachers and adults, too.

Why is this fact so remarkable? Because it doesn’t happen elsewhere the way it does in day-care. On scholastic.com, for example, a resource for teachers, parents, and students, one parent invited to “ask the experts” about parental concerns put the point plaintively: “My two-year-old has been biting other kids at day-care; however, she does not do this at home or at my friend’s house. Why would she bite only at day-care and play well everywhere else she goes?” Of course the “expert” answer is what one would expect – that the toddler may be lonely, in need of affection, frustrated, and so on. But the real point remains that day-care, at least as ordinary experience suggests, makes biting and the feelings associated with it more likely.

This is something some readers will know not only from reading expert literature, but also from their own experience. Of course, as the experts stress, biting is a natural thing. A baby or toddler might do it in fun or because he is teething or simply because he is curious about what will happen. Many of us have seen that kind of biting (and felt it, too). But chronic biting? Contagious biting? No, that is something else altogether, and it is not the way children, even very small children, ordinarily behave. And why does this difference matter? Because if randomly assembled children of the same ages do not spontaneously start using their teeth as weapons, whereas the same kinds of children assembled in a day-care situation do, this strongly suggests that the institutionalized ones are biting at least in part because something about their situation has them especially agitated. In other words, the attention given to biting in the literature on institutional care is itself a sign of what boosters deny – clear evidence that day-care is causing aggressive behavior.

Our skeptical reader might say, “So what? Maybe biting isn’t the best habit, but all of them will outgrow it. Besides, do any longitudinal studies show that recidivist biting of other children at the age of two predicts psychological or academic trouble down the road? No? Well, then, the problem is solved.”

But of course the problem is not solved at all, because our skeptical reader has asked what for our purposes is the wrong question – the one about ends, not means. The right question, the one addressing the overlooked moral dimension of all this, is: What, after all, is the mental state of a bunch of babies and toddlers who take up biting as a habit? And we can all figure out the answer to that without reaching for the social science bookshelf: Those kids aren’t happy. They are exhibiting a self-protective animal instinct, which suggests that they feel unprotected. It is something we would all understand readily enough if, say, zoo animals were to attack each other more frequently in their quarters than in the wild. (And if they did, we would, of course, deplore it and blame the zoo.) Doesn’t that apparent internal turmoil say something undesirable about how institutional care is experienced by at least some small children?

“Sick” Plus “Bad” Equals “Good”?

For parents who do not have options apart from institutional care, the increased likelihood that day-care children will be sick and unhappy is a fact of life. It is a necessary evil, regrettable but far better than the alternative, which is no care at all. And yet the most curious fact in all our day-care debate, one that brings us to a third and very interesting sort of harm being caused in all this, is that these problems are not seen that way by certain other adults – namely, the separationists dominant in the day-care debate.

These advocates do not see institutional care as a “necessary evil.” They do not write of mother-baby separation with the ambivalence most mothers feel. They refuse to acknowledge that day-care might cause damage of any kind to any child – unlike the many parents who must use it and who worry about just that. The least analyzed and perhaps also the weirdest dimension of our day-care wars so far is the insistence by such advocates that what most people think is bad news – more sick kids and worse-behaved ones – is actually good and maybe even great. And this brings us to a third kind of harm in our experiment in separation: the ideological defense of separationism is further coarsening adult moral sensibility.

For example, anyone actually charged with the care of little children knows that a sick baby or toddler is a uniquely pitiful thing, in part because such a child is too young to understand why. Yet such natural empathy is not the prism through which the sick child problem in day-care is viewed by our advocates. Generally speaking, their response to the sick kids problem has run one of two ways: Either ignore it altogether or rewrite the script so that sicker is actually better.

Thus, in A Mother’s Place: Choosing Work and Family Without Guilt or Blame, Susan Chira acknowledges “several studies have also shown that children in day-care suffer from more ear infections and illnesses in general,” and then brushes it off with “[but] they are hardier when they are older.”[15] Susan Faludi in Backlash sounds the same note: “They soon build up immunities.”[16] Similarly, when a well-publicized 2002 study showed that babies and toddlers in day-care get sick more often than those at home – about twice as many colds, for example – the advocate cheer going up around the country was notably creepy. As one lead researcher explained, this finding “lifts a heavy stone off the backs of guilt-ridden parents who put their children in large day-care centers. The benefit to having colds in the toddler years is that kids miss less school later when it counts.”[17]

Now step back from this discussion for a moment and ask yourself: if we were talking about anything but day-care here, would anyone be caught cheering for the idea that some little children get sick twice as often as others? I think we all know the answer to that one. And that dissonance raises the question of what exactly is going on with this sort of callousness about small children. It is very hard to spend even a day in charge of a sick baby or toddler and be able to accept the Nietzschean line that what does not kill him will make him stronger – in other words, that being sick is good for him. But what if you are not around it, if it has been made someone else’s problem? Might you then be a little less tuned in to just how much a sick baby or toddler needs?

And just as some people have managed to find “good news” in the increase in sick kids, so, too, has there been no lack of advocates who give a thumbs-up to the documented increase in aggression and other behavioral trouble. Belsky antagonist Alison Clarke-Stewart, for example, rationalized the aggression problem in 1989 this way: “Children who have been in day-care think for themselves and want their own way” and “are not willing to comply with adults’ arbitrary rules.” Others have gone further. A University of Chicago psychologist offered the particularly Orwellian response to the 2001 NICHD study that “aggression” was actually “self-assertion” and that day-care babies and toddlers were simply “much more sturdy little interactors” than tots at home. A writer for Salon similarly opined that it is “better to be smart and cheeky than dim and placid.” It was elsewhere suggested that the traits being measured by NICHD are the same alpha qualities of future corporate titans. As with the advocates who have no trouble finding a silver lining in sick kids, so has there been no shortage of those who have translated bad behavior into diapered rugged individualism.

And here again the moral sensibility of our separationists seems to be of a different order from that of most people – including most parents, whether they use day-care or not. Anyone who has ever done playground duty with small children knows exactly the difference between an “assertive” little boy playing loudly with a truck and another little boy who just used the same truck to hit another child over the head. Just about anyone who has spent time around small children knows the difference between real aggression and childish high spirits. But what about parents who aren’t around to learn this much in the first place? Might they not have a dimmer understanding of that distinction than other people do?

And here is the point in the argument where we leave the narrow matter of institutional care and look more widely at what is said about babies and children more generally in the service of the separationist experiment. Here, too, interestingly enough, the same sort of callousness implicit and explicit in the day-care literature makes routine appearances. Consider a recent example from the letters page of the Atlantic. Writer Caitlin Flanagan had recently penned a largely favorable review of a book by Laura Schlessinger, a review that angered some readers, including one named Nancy, who chided Flanagan for worrying overmuch about middle-class children of divorce. Flanagan aptly replied, “Since writing my review of Laura Schlessinger’s new book, I have had countless people tell me that they can’t stand her because she’s ‘mean.’ But Laura says you’ll hurt a child if you divorce; don’t do it. Nancy says she can’t work up much compassion for a nine-year-old from a broken home. So who’s mean?”[18]

What Flanagan did not go on to say in her short space, but what anyone reading the cable traffic on separationism will know, is that this bitter letter writer to the Atlantic is not alone. She represents a robust tradition of advocates and ideologues who have spent decades doing just what she did: getting very worked up over what mothers ought to have freedom to do and, simultaneously, becoming very dismissive of the possible fallout for children.[19] And once again it seems fair to ask whether practicing what one preaches has had the effect of numbing our separationist advocates just a little as to what babies and children actually need.

Look, for example, at what counts as the moral limbo bar in the day-care debate – the lowest one imaginable. Essentially, advocates have settled for this position: If it doesn’t lead to Columbine, bring it on. But that is obviously a very low perch from which to judge day-care or anything else. Commenting on the NICHD study linking time spent in day-care to aggression, scholar Stanley Kurtz observed something important that ought also to have been obvious to other readers: that the adverse implications were hardly limited to the kids bullying and hitting and that things were likely quite a bit worse than the numbers on aggression alone might suggest. Rather, “Chances are, if a significant percentage of children in day-care evidence clear behavioral problems, or show up as insecurely attached to their mothers, then there are plenty of other children in less obvious, but still significant trouble. If some kids are responding to chronic separation from their mothers with anger, surely others are feeling depressed. Low-level depression is a lot harder to find and verify observationally than obvious classroom bullying, but that doesn’t mean it’s not there.”[20] Less obvious, but still significant trouble. For advocates hardened by the demands of separationism, this kind of moral nuance does not exist.

Similarly, the insistence on the equality of “good” institutional care simply erases from the equation something important and also subjective: how very young humans see the world. Routine and familiarity are everything for small children. Yes, everything. I am no absolutist about non-maternal care – with four children that would be a physically and intellectually untenable position. Very often some warm body – an older sibling, a babysitter, my husband, assorted grandparents – stays with my youngest so that I can do any one of the many things that small children make difficult or impossible. But the separationist insistence that it doesn’t matter whether a baby or toddler is in the house or not simply rings ignorant of what the first two or three years of life are all about. Just being at home carries with it all those non-parental things so comforting to little children – from a familiar bump in the wall to the presence of a pet or sibling to a ripped-up book that must be found this minute.

Even the recent boomlet of lifestyle pieces about mostly well-off career women who have decided to stay home with their small children exhibits an inadvertently revealing one-sidedness of feeling – again, one obviously connected to the influence of separationist thought. One of the more discussed New York Times Magazine articles in 2003, for example, was “The Opt-Out Revolution” by Lisa Belkin. It argued that more women aren’t hitting the “glass ceiling” because they just don’t want to, and one reason they don’t want to is that they want to enjoy the company of their children. Similarly, Time magazine’s cover story in March 2004, “The Case for Staying Home,” cited dropping out of the rat race and enjoying the children as two lures that are perhaps more powerful than yesterday’s generation of mothers understood. Even mothers who are vigorously pro-separation speak of the same unbidden pull they feel toward their children. Joan K. Peters, as staunch a defender of day-care as any, has herself related, “Once, when I was late [getting home from work], I arrived nearly hysterical with worry that I had passed some absolute point of emotional safety for my infant – that in divine retribution for my absence, something awful might have happened. I was so upset that I snatched my daughter from my babysitter’s arms and sank with her on the couch, holding my coat around us both.”[21]

What could be more natural than that? Of course women and men want to enjoy their children; children are enormously enjoyable. But in that one-sided focus on what women want, a hidden but very real insensitivity betrays itself once more. If mother-child separation is so hard on mothers that even pro-separation feminists see it feelingly, then how much worse is that separation for a baby or toddler who does not understand time or distance?[22] Once more, doesn’t that added confusion and distress, all the harder for a being unable to grasp what is happening, carry moral weight of its own?

A third body of evidence that suggests how far our separationist experiment has dulled our thinkers to real babies and children is this: virtually every sophisticated school of thought now ascendant has participated one way or another in the rationalization of hands-off parenting. In an important book published in 1999, Kay S. Hymowitz broke particularly crucial theoretical ground explaining just this. She examined the state of American childhood, not from the bottom up but from the top, at the level of the numerous contemporary theories that have served to justify parental disengagement. Ready or Not: Why Treating Children as Small Adults Endangers Their Future – and Ours outlined in field after field (law, education, and psychology both popular and academic) how the past thirty years have seen a transformation in the way children are perceived – one that de-emphasizes adult guidance and authority while ultra-emphasizing the intrinsic capacities of the child in the absence of such guidance.[23] Uniting all these apparently disparate theories, she demonstrated, is “the idea of children as capable, rational, and autonomous, as beings endowed with all the qualities necessary for their entrance into the adult world – qualities such as talents, interests, values, conscience and a conscious sense of themselves.”

The same insistence that Hymowitz discerned in elite fields of thought is true also of popular child-rearing advice books, which take their direction from a medical establishment profoundly reluctant to roil the political waters over day-care. Almost all leading cultural authorities, including the American Academy of Pediatrics, have now managed a good word for the putative benefits of “early socialization,” which is to say non-parental child-rearing; and though some are careful about the issue of institutional care, almost all glow with the putative benefits of having mothers out of the house. The country’s leading popular child care experts have all revised downward over the years their estimations of just how much young children need their mothers, with every single one concluding that children need less of their mother’s time and presence than was previously thought.[24]

Then there is the telling literature of a different sort: the kind for children themselves. This literature emphasizes parental needs and resolutely draws a happy face over children’s longings; pamphlets exhort those too young to tie their shoes to be “independent,” and stories, articles, and self-help columns share the message that the happy and fulfilled (that is, less encumbered) parent is also the better parent.[25] Has anyone strolled the children’s aisles of the bookstore lately? Have you seen a copy of Carl Goes to Day-Care or any of the many other books for children who are years away from reading – who, indeed, don’t even have all their baby teeth yet – but are targeted for the theme that separating from Mommy every morning isn’t all that bad? Do we really think the new get-tough approach reflected in these texts for tots is in any way an improvement on the at-home adventures of Dick and Jane?

Those texts are also only one manifestation of the desensitization that proceeds apace. Not only ideologically but also practically, the signs of other envelope-pushing are out there – including round-the-clock day-care, or night-care, a trend already established in Scandinavia and now beginning to appear in the United States in response to parental demand.[26] Though only a dozen or so centers currently exist, every reporter mentioning the trend predicts robust growth; “some people have to be available [for work] at all hours,” as one trend analyst puts it.[27] And what is it like for these children who are not even allowed the familiarity of their own beds? Not to worry. After all, “each child brings something special to his or her cot: a pillow, a well-worn blanket, a favorite toy.”

Similarly, during 2003 alone, several stories sprang up around the country about parents using public libraries – yes, libraries – as emergency day-care centers, including depositing children there for the day who are far too young to read.[28] In short, from real-life stories to expert literature of all sorts, there is one and only one prevailing cultural answer to the question of just how much babies and toddlers need, and it’s this: They need less than previously thought.

Shrinking the Need Down to Size

Laura Schlessinger once asked members of an audience to stand up “if you could... come back as an infant... raised by a day-care worker, a nanny, or a babysitter.” No one did, and Schlessinger went on to ask why anyone who could choose otherwise would prefer this for their children. In effect, she was asking a question not about outcomes, but about the immediate moral content of the experiment. Of course she was excoriated in the usual places. But should she have been? How many readers thinking of their own childhoods would answer her question any other way?

In sum, the real trouble with day-care is twofold. One, it increases the likelihood that kids will be unhappy; and two, the chronic rationalization of that unhappiness renders adults less sensitive to children’s needs and demands in any form. Of course, as advocates often say, most children not in home-care are likely to turn out fine (they are resilient). Of course, many adults have to work, and some absolutely have to use out-of-home care. Of course, no one can have his mother all the time, and likely no one should. Of course also, by extension, children are only one of several actors in any given drama, even if they are also the most vulnerable; in other words, their immediate emotional needs cannot and do not always trump.

But can they, should they, ever trump? That is the question advocates will not answer. Single parents, frantic parents, infants being packed off to hospital-style rows of cribs called “school,” toddlers who go for institutionalized walks roped together like members of a miniature chain gang – this is what the experiment means day to day. But our separationists manage to worry instead about the opposite: an alleged excess of maternalism, of “over-parenting” (Joan K. Peters), an oppressive “mommy myth” (Susan J. Douglas and Meredith W. Michaels), and all the other phantoms said to be haunting and impeding – who else? – the modern mother.

Their own rhetoric and that of the long-running day-care wars prove overwhelmingly otherwise, and so do the plain facts. The 2000 census clinched the point that more and more mothers continue to opt for the workplace. Between 1975 and 1993 the percentage of children under age six with employed mothers rose from 33 to 55 per cent. By 2000 it had climbed to 70 per cent. Of course not all those women are working full-time and out of the house, but the trend away from home and toward the workplace is very clear. And so is what it represents: the near-total cultural about-face in the way society views working mothers. Once, as has been widely noted, staying home with one’s children was judged the right thing to do, both intrinsically and for reasons of the greater good, by mothers, fathers, and most of the rest of society. Today, the social expectations are exactly reversed.

Before we start worrying ourselves about the alleged perils of too much mothering, we might first look at how much energy and sophisticated thought continues to go into rationalizing too little mothering and what exactly that says about us. We have collectively become one of Shakespeare’s most unattractive characters – wicked daughter Regan who, when faced with an old father demanding his prerogatives of age, diminishes those wants of his over and over. However many horses and knights King Lear demands, she allows fewer; whatever he agrees to, she reduces further still. Just so, contrary to the bitter complaints of our separationists, has our social standard governing exactly what babies and children can demand of us veered in the direction of less.

Once upon a time, after all, parents and experts worried about whether five-year-olds needed a mother in the house; now, when kindergarten has become full days in many or even most districts, and before- and after-school programs abound, that worry has apparently gone the way of the buggy whip. Not so long ago, parents and experts wondered whether two- and three-year-olds could thrive if they were out of their homes and away from their families at preschools or day-care all day, but when packing them off became routine rather than rare, and subjecting them to a rotating set of strangers became thought of as a head start, a good many adults with other things to do decided that that problem had been pretty much solved, too. Having so efficiently shrunk the pool of children we might need to worry about by quite a lot, we now reduce ourselves to scholastic nitpicking over the few who are left: infants and toddlers.

Well, how about it? What real need does a five-year-old have of his mother or home? What need does a three-year-old have? A babe in arms?

King Lear has a pretty famous answer to questions like those: Oh, reason not the need. What the ideological devotion to day-care finally amounts to is just that – reasoning the need, ruthlessly trying to square what for the youngest children will always be a circle with many orbits but only one center.


[1] Joan K. Peters, When Mothers Work: Loving Our Children Without Sacrificing Ourselves (Reading, MA: Perseus Books, 1998), pp. 3–4.

[2] Brian C. Robertson, Day-Care Deception: What the Child-Care Establishment Isn’t Telling Us (San Francisco: Encounter Books, 2003).

[3] Bryce Christensen, “A Schoolhouse Built by Hobbes,” in The Child-Care “Crisis” and Its Remedies, Family Policy Review 1, no. 2 (Fall 2003).

[4] See, for example, Allan Carlson, “The Fractured Dream of Social Parenting,” ibid.

[5] Quoted in Kathleen Curry, “Children’s Ear Infections Rampant Across Country,” Lexington Herald-Leader, November 2, 1993.

[6] Robert A. Hoekelman, “Day-care, Day-care: May Day, May Day!” Pediatric Annals 20 (1991), p. 403. As the editorial further pointed out, it is not only the children in such centers but also their pregnant mothers and their pregnant day-care providers who are at risk – in the case of pregnant women, for fetal infections and for stillbirths.

[7] Jody Heymann, The Widening Gap: Why America’s Working Families Are in Jeopardy – and What Can Be Done About It (New York: Basic Books, 2000), p. 61.

[8] Ibid., p. 62.

[9] Arlie Russell Hochschild, The Time Bind: When Work Becomes Home and Home Becomes Work (New York: Metropolitan Books, 1997).

[10] NICHD Early Child Care Research Network, “Child Care and Mother-Child Interaction in the First Three Years of Life,” Developmental Psychology 35 (1999), pp. 1399-1413. See also Jay Belsky’s discussion of this study in “The Politicized Science of Day-Care,” Family Policy Review 1, no. 2 (Fall 2003), pp. 23-40.

[11] National Institute of Child Health and Human Development, Early Child Care Research Network, “Does Amount of Time Spent in Child Care Predict Socioemotional Adjustment During the Transition to Kindergarten?” Child Development 74, no. 4 (July/August 2003), pp. 976–1005.

[12] Robert Karen, Becoming Attached: First Relationships and How They Shape Our Capacity to Love (New York: Oxford University Press, 1994), chapter 22, “A Rage in the Nursery: The Infant Day-Care Wars.” See also Jay Belsky, “The Politicized Science of Day-Care,” in The Child-Care “Crisis” and Its Remedies, Family Policy Review 1, no. 2 (Fall 2003).

[13] Robertson, Day-Care Deception, p. 79.

[14] Kathy Tout et al., “Social Behavior Correlates of Cortisol Activity in Child Care: Gender Differences and Time-of-Day Effects,” Child Development 69 (1998), pp. 1247-62.

[15] Susan Chira, A Mother’s Place: Choosing Work and Family Without Guilt or Blame (New York: HarperPerennial, 1998), p. 117.

[16] Susan Faludi, Backlash: The Undeclared War Against American Women (New York: Random House, 1991), p. 43.

[17] Abraham B. Bergman, “Colds with a Silver Lining,” Archives of Pediatrics & Adolescent Medicine 156 (2002), p. 104.

[18] Caitlin Flanagan, Atlantic, April 2004. In her previous cover story in the same pages, Flanagan also shrewdly observed another interesting fact of our day-care wars – that some of the most passionate advocates do not use institutional care themselves. Many have instead in-the-house, one-on-one paid help.

[19] For examples of how this callousness permeates so-called Third Wave Feminism, see my “Feminism’s Children,” The Weekly Standard (November 5, 2001).

[20] Stanley Kurtz, “The Guilt Game,” nationalreview.com (April 26, 2001).

[21] Peters, When Mothers Work, p. 73.

[22] Thanks to Stanley Kurtz for the observation about Belkin’s essay. E-mail communication, October 2003.

[23] Kay S. Hymowitz, Ready or Not: Why Treating Children As Small Adults Endangers Their Future – and Ours (New York: Free Press, 1999). According to the progressive and neo-progressive theories dominant in education, children are self-motivated, inherently cooperative learners who will invent their own strategies on impulse. The idea of the self-sufficient child – even the self-sufficient baby and toddler – is also ingrained in current psychology. Experts from Piaget onward have stressed the rational, competent information-processing of the child, writing off any friction with this happy scenario to “developmental stages.” Influenced partly by such theories, forward-looking legal theorists – Hillary Rodham Clinton, among many others – have also stressed the autonomy and rights of the child against those of the parents (a movement driven particularly, as Hymowitz argued, by the political desire to allow minors easy access to abortion).

[24] See my “Putting Children Last,” Commentary (May 1995).

[25] For a representative list, see Arlie Russell Hochschild, The Time Bind, pp. 226-28.

[26] See, for example, Skip Thurman, “Day-Care Becomes Night-Care in Era of Busy Work Schedules,” Christian Science Monitor (October 23, 1997), and “A 24-Hour Day-care Trend?” CBSNews.com, November 13, 2003.

[27] Skip Thurman, “Day-Care Becomes Night-Care in Era of Busy Work Schedules,” Christian Science Monitor (October 23, 1997).

[28] See Leef Smith and Elaine Rivera, “Turning Librarians into Babysitters,” Washington Post, February 2, 2004. See also Kellie Patrick, “Libraries: Public Safety Isn’t Assured,” Philly.com (February 10, 2004).