Confessions of a Wayward Academic

by Tom Corbett


  Moving on, this is a drama in four acts. The first covers my involvement in the social indicators movement that flared anew in the early 1990s. The second act touches upon my participation in a frontal attack on the official poverty measure, a task that obviously needs to be done and yet remains just beyond our reach. The third discusses my participation on the National Academy of Sciences expert panel on research methods for evaluating welfare reform. Yes, even the slowest of us can rise to lofty heights. The final act describes my more general assault on what I saw as the deficiencies of our research methods given the rapidly changing landscape of social assistance at the turn of the century.

  As I noted earlier, it is difficult to pinpoint when and how I started constructing some of the counters in my candy store. I believe I started on this counter with a phone call from Robert Haveman one fine morning. He asked, “What do you know about social indicators, Tom?” I told him I knew nothing and hoped that would be the end of it. “I’m going to have someone drop some stuff off for you to look at, okay?” The words “what about the word nothing did you not understand, Bob?” were on the tip of my tongue when he hung up.

  This is where I always make a mistake. I can never quite get past my people-pleasing weakness. So, I resigned myself to looking at whatever he had for me to peruse. Within minutes I heard a thud outside my door. This scheme obviously had been well planned. With a sigh of resignation, I looked out to discover a box containing an assortment of reports and documents; the accomplice in Bob's plan, clearly feeling spasms of guilt, had already escaped from my view. Upon cursory review, I discovered that each publication employed an assortment of social indicators to assess the condition of various jurisdictions, large and small. It looked interesting but so what! My deeper suspicion was that Bob simply was cleaning out his office.

  Soon enough, the box of free reports was followed up by a suggestion from Bob that I solicit a planning grant to initiate a project for generating a status report on Wisconsin’s children. Obviously, no one else at the university wanted to do this. At such times, I would wonder why I never even got a chance to draw a short straw and thereby escape such missions that others clearly were avoiding. My only understanding of what a “status report” might look like came from this pile of reports Bob had someone dump in front of my door. Some of these documents dated back to the 70s, the 1870s that is, though many had seductive covers printed in glossy colors. Still, it started to look a bit intriguing to me. Even then, I did not have much of a life. Besides, I had begun to realize that big things can start with small beginnings.

  As I pondered the task, I considered that we had been living through a demographic earthquake. The number of children in single-parent homes had risen from about 8 percent in 1960 to over 25 percent in the early 1990s. Births to unmarried women had increased from 5 percent to about 30 percent over the same period. Some 70 percent of teen births were to unmarried mothers, up from 15 percent in 1960. The foster care population was exploding, jumping by some 60 percent in the several years after 1985.

  People were looking around in alarm. Is our civilized world lurching toward chaos and the abyss? Probably not, but perhaps we ought to be looking more closely at what was happening just in case. After all, you could get updates on what the equity markets were doing all day long. My smart phone (the irony of my having a smart phone is not lost on me) has me in touch with stock quotes from around the world on a real-time basis. If that is the case, why don’t we collect at least rough data routinely on our children whom we widely advertise as our most precious resource? So, I started down the road on a modest quest…to see if we (me?) should do a status report on Wisconsin’s children.

  Within a year, I forged a collaborative effort between IRP and the Wisconsin Council on Children and Families (WCCF) to secure small amounts of financial support from several Wisconsin foundations and from the University itself, no small victory getting the tight-fisted university to kick in. I recall that Tom Loftus, our intrepid child support champion in the Wisconsin legislature, had by now become part of this scheme. He recently had lost a bid to be governor against Tommy Thompson and was now unemployed.

  The two of us hit many funding possibilities. I recall a visit to a friend of his who was running for U.S. Senate in the Democratic Party primary race. This candidate was running for the seat that Russ Feingold ultimately was to win. The odd thing is that I wound up doing two morning-long briefings for Russ and his top campaign staff. These briefings took place rather early in his campaign when no one was giving Russ a chance. Though blazingly smart, a Rhodes scholar no less, he had little money and was thought of as a Madison liberal, a quality that did not always play well outstate. In truth, it never played well outstate.

  It turns out that I was the only egghead willing to put in some time with him when his prospects looked so bleak. After he won, we would periodically run into each other on the Friday night Midwest Express flights back to Madison…a popular method for getting home among those who spent half their lives in D.C. He would hail me as his welfare guy, occasionally sitting next to me on the flight. I recall one flight where we discussed how the welfare reform wars were playing out. We both agreed that reform was not yet the unmitigated disaster it would eventually become. I found it unusually rewarding to discuss policy with a politician who is smart and cares. Russ tried to recapture his Senate seat, which he lost during the 2010 debacle for the Democrats. I saw him at a campaign event in 2016, another disastrous year for Democratic candidates, as he tried to wrest his old seat from an unreconstructed Tea Party advocate but unfortunately failed. He still fondly recalled those long-ago briefings of mine that took place during far better times.

  Anyways, this other candidate was on the phone that day during Russ’s successful run for the seat, calling potential campaign donors. He tried to take a break to chat with Loftus, but his handler was clearly annoyed and kept pushing him to get back to the calls. There is no break from the 24/7 grind of raising campaign money, not even to help poor children. This is another reason I could never have run for public office even though the man who was to become the majority leader in the Wisconsin Assembly seriously asked me to run for political office when I was a young man. He was serious! Shockingly, my political suitor was a smart, educated man who sported an MA from the Kennedy School of Public Policy at Harvard. He should have known better. What the hell was he thinking?

  I could never do that no matter who asked. You need to prostitute yourself on a continuing basis to raise money. If I beg for money, it will be for a good reason, like a big yacht for me or something equally egregious. The bigger reason for my reluctance to run is that you had to be nice to people, or so I thought at the time. I don't like people. The biggest reason is my awful personality. I could never put up with all the whiners who want everything but never want to pay any taxes. I am sure I would be jailed for whacking some idiot upside the head when they cheesed me off while making totally stupid points.

  We also visited Senator Herb Kohl, a multimillionaire businessman and, until his retirement, a popular senator from Wisconsin. Despite his wealth and very public persona, he had to be just about the most reserved and shy man you could imagine. Trying to engage him in small talk was painful as Loftus and I worked our way around to the topic of whether one of the Kohl family foundations might kick in a few bucks for our worthy cause. We handed him a written summary of the plans to monitor how Wisconsin’s kids were doing. He crumpled our inspiring plans up in his hands before beating a hasty retreat to his inner office.

  I mentioned this to a colleague back at IRP whose husband worked for Herb for a while. She laughed. “You should have asked him about the Milwaukee Bucks,” she told me. At the time, Herb was an owner of the professional basketball team in Milwaukee, and apparently that is the one topic he could become animated about. Despite his severe shyness, I thought he was a very good senator.

  In short order, we put a funding package together. The WISKIDS initiative was launched as a joint venture by IRP and the Wisconsin Council on Children and Families (WCCF), a Madison-based advocacy organization for children. Over the next several years, this initiative produced several products—brief bulletins on selected topics, special reports such as a detailed look at schoolchildren in Madison, and the statewide KIDS COUNT report that is perhaps the signature product of the Casey Foundation.

  Perhaps the best outcome was that we lured Tom Kaplan back to Wisconsin and to IRP. Tom had been a highly respected state worker who found it increasingly difficult to work in the Wisconsin bureaucracy as the institutional culture changed from a program focus to a more political focus. He had found refuge teaching at a small Pennsylvania college when I called him to see if he would return to help us with WISKIDS. I am not sure I had finished the call before I heard a click on the line. He was racing to catch a flight back to Madison. Apparently, teaching kids at some backwater college was not a dream come true. He later took over as associate director of IRP when I stepped down.

  Virtually from day one of the WISKIDS effort, I sensed that the state of the art with respect to indicators of child well-being was quite primitive. Those doing this work were aware that the existing data sources were insufficient for the task at hand, as were the extant approaches for using those data. Eventually, I concluded that IRP should initiate a more concerted effort to address the larger set of technical and substantive issues related to producing and using these indicators. After several of the annual publications, it was clear that WCCF could produce the WISKIDS report without our help, perhaps better in fact. Either that or they thought I cost too much and decided that a monkey could do the job just as well. The smart money is on the monkey theory. Besides, I always lose interest as anything becomes a more-or-less routine task. I could never stand boredom.

  I was more attracted to the broader challenges of improving the use of social indicators. The frustrations and constraints of producing a status-of-children report are several. You are always walking a fine line between description and advocacy. Every step in the process (the selection of indicators, the mode of presentation, the choice of baseline or comparison data) is partly subjective and thus vulnerable to second-guessing. Just think how controversial the assessment of school performance has become. Critics rant about the impossibility of really getting at the value-added component of a school or group of teachers. How do you really account for uneven starting positions and overwhelming contextual impediments or advantages? You can always statistically control for some things, but do you get them all? Now, apply those same doubts and concerns to measures of how well communities or states are doing.

  While some purveyors of indicators have a transparent advocacy agenda, others strive to dispassionately portray the condition of children. The latter, despite their proclivities toward objectivity, recognize that their choices about data selection and presentation may well shape subsequent interpretations in unintended ways. In short, I could not help but notice that there were few accepted standards for choosing and displaying data in this emerging field of social indicators.

  The temptation to slip from the goal of informing society about social conditions to influencing current policy debates is ever present. For example, data often were arrayed in scorecard fashion—ordinal rankings of states or counties from best to worst. The implication was that jurisdictions at the bottom were losers. They should do more or at least do something different. When viewed more closely, the real meaning of such rankings is less clear. Are the favorably ranked jurisdictions working harder at protecting their kids, or do they simply have fewer problems to address in the first place? Culpability, it turns out, is a slippery notion. Making such determinations demands a more rigorous analysis than the mere publication of available data typically provides.

  The inevitable desire to assign meaning to descriptive data leads us to inappropriately assign blame or praise in some evaluative sense. If births to unmarried teens increase, then it must be the failure of (pick one or more) the schools, the federal government, Democrats or Republicans or Independents, television, working mothers and permissive fathers, or good old "rock and roll." Looking at temporal trends and casually assigning causality is very seductive but extremely hazardous. We all know that correlation is not causation, yet most of us forget this simple fact when data trends comport with our theoretical or normative priors.

  In addition, social indicators, at the time, tended to rely on whatever data were available. Thus, indicators tended to cluster in certain domains of child well-being that often covaried with one another. Other important areas were underrepresented or ignored altogether. Still other measures were confounded with supply-versus-demand problems. If the number of kids in foster care went up, is it because more kids were in trouble (a bad thing) or because more foster care slots suddenly appeared (perhaps a good thing)? Similar issues arose when reports of child abuse and neglect trended upward in an exponential fashion, as they had in the not too distant past. Was society going to hell in a handbasket (bad), or were teachers and nurses and neighbors being more attentive to the well-being of children within their purview and taking actions to protect them (good)? The number of people incarcerated in America increased from 300,000 to some 2.3 million over several decades. We now have one-quarter of all prisoners worldwide with only 5 percent of the world's population. What happened? Has society broken down? Was it our get-tough-on-crime approach that included harsh and mandatory sentences, perhaps a way of controlling urban minorities, or merely a growing demand for paying customers as we transitioned toward private, for-profit prisons? Take your pick.

  I pondered such mysteries at the time, and many more, largely because I had such a boring life and policy mysteries always amuse me. Perhaps I ponder them because the answers, if I were to stumble on any, might do a bit of good down the road as well. That, however, might be taking a rose-colored view of things. All this struck me as another adventure and I started looking around for more allies, particularly since I was not conversant in the technical data issues surrounding the collection of indicators. Robert Hauser, then director of IRP and a superb scholar and researcher, was an enthusiastic supporter of this idea.

  Bob Hauser and I visited Nicholas Zill in 1992. Nick then headed Child Trends, a respected research organization in D.C. We discussed the issues we believed needed addressing and reached an agreement that the time was right to push the social indicator agenda. Ah yes, another windmill at which to tilt my flagging policy lance! It turns out that the next year, by the summer of 1993, I was at ASPE attacking that other windmill…welfare reform. It was a good year for futile gestures. My location in D.C. was convenient for pulling together a team to push the indicator agenda. A planning team of IRP affiliates, staff from ASPE, and scholars from Child Trends (Kris Moore by then had replaced Nick Zill as CEO) put together a plan. I just love plans though typically I am less eager about carrying them out.

  Anyways, I helped lay out three national conferences to be held over the next three years. I always jumped at the chance to organize such fun activities since I knew I was not important enough to get invited on my own. The first would bring together a small number of important individuals in what would amount to a planning workshop on indicators of children’s well-being. The second would be a more ambitious and in-depth effort to bring together the best expertise available to focus on the issues identified in the planning workshop. The final session would be action-oriented, concentrating on legislative and resource issues necessary to translate the desire for a comprehensive set of indicators into reality.

  The band of co-conspirators had grown. Added to the mix were experts such as Jeffrey Evans from the Center for Population Research, Deborah Phillips, director of the Board on Children and Families at the National Academy of Sciences, William O'Hare, associate director of the Kids Count project at the Casey Foundation, and the ubiquitous Wendell Primus. We drew a host of top scholars from many disciplines and perspectives to systematically examine how a comprehensive array of indicators might be put together while dealing with numerous technical and logistical impediments. It was a herculean task, but the scholars went at it with gusto. The work produced some two hundred measures covering many distinct domains. It was a scholar's delight and a number cruncher's nirvana. Still, it is one thing to come up with a measure in the abstract, quite another to use it in the real world.

  I think that the surrounding policy environment added urgency to the agenda. There was a lot of talk about further devolution of social policy to the states, particularly after 1994 when Republicans took over the House of Representatives. If that were to come about, the classic ways in which program oversight was exercised would be eviscerated. Narrow, categorical programs for helping the poor were very visible. You pretty much knew how many child care slots there were or how many kids got a certain vaccination.

  When dozens of distinct programs are lumped together into broad funding streams with substantial control being given over to states and even local authorities, the picture can cloud over immediately. Planning flexibility and fungible dollars could be a force for extreme good or unmitigated evil. How would the good people in Washington know what was happening to vulnerable kids and why? A comprehensive set of social indicators, collected in reasonably efficient fashion and used to monitor population health and well-being, would be a great start toward ensuring program accountability, detecting positive or negative trends, and serving the "canary in the mine shaft" function. This last function can be thought of as an early warning mechanism that problems were emerging and some proactive actions were possibly required.
