The Sad Story of Learning Outcomes Assessment

I owe one of my worst moments as a college administrator to the learning outcomes assessment movement. I confess I did not try hard to forgive it, and forgive it I never have. In the early 1990s I was the dean of a College of Arts and Sciences at a public university, with about 20 department chairs who more or less “reported” to me. At that time the learning outcomes assessment movement was on its steady, disciplined march through higher education. Accrediting agencies were increasingly expecting us to demonstrate that we had documented learning outcomes: “Ninety percent of the students must demonstrate adequate knowledge of punctuation through a minimum score of 85 on the XYZ test.” We also had to show that we actually assessed student achievement of those expected outcomes and that we acted on the results of those assessments.

To help prepare the college to cope with this new expectation, I invited a national expert in the field to give the chairs and me a presentation on the subject. The presentation was dry, most of the chairs incurious and unmotivated, the pale winter sun casting its weak light over the darkening room as the audience dwindled because of one lame excuse or another. At the end I was embarrassed, the speaker demoralized, and most of the chairs as unenlightened as they were before. My later experiences with learning outcomes assessment were better, but never enough to make me happy.

In the 1980s and 90s learning outcomes assessment was the buzz in American higher education. It differed from grades or professional certification exams for individual students, since it judged the performance of an entire group of students enrolled in a course, a program, or even an entire college. As in the example above, the assessment was based on a predetermined objective with a predetermined level of success. The pressure for learning outcomes assessment came from legislators and government officials seeking more information about just what they were getting for their growing investment in higher education as it kept moving from an institution for the elite to an expectation for the majority of students. Some information, such as graduation rates and post-graduate employment rates, was already available, but often not this more direct analysis of what students were learning in their programs.

There were reasons for the appeal of this assessment movement. Leaving aside the apparent subjectivity of individual grades and the partisanship of alumni in love with their lost youth in the protective embrace of their alma mater, learning outcomes assessment had the feel if not the substance of objectivity (in our first paragraph example, why not 95% of the students with a score of 90 on that exam?). With its parade of statistics, numerical benchmarks, graphs, and bulleted and sub-bulleted reports, even before PowerPoint presentations spread their cloud of somnolence over stupefied audiences, learning outcomes assessment was indeed the latest thing. It was, however, a poor fit with the culture of higher education. Despite its numerous cheerleaders and heavy-handed enforcers, it remains a poor fit even today, many sad years later.

Erik Gilbert’s opinion essay in the August 14, 2015 Chronicle of Higher Education, “Does Assessment Make Colleges Better? Who Knows?” points to the crux of the problem. After nearly 40 years of workshops and seminars, exhortations and threats by accrediting bodies, and the hiring of numerous assessment administrators, prospective students and their parents really do not care whether a college has a good learning outcomes assessment program or any learning outcomes assessment program at all. Parents and students care about costs, general academic reputation, and post-graduation employability—not learning outcomes assessment. A large percentage of faculty remain indifferent or even hostile to this movement (significantly, no one has ever tried to get precise numbers) and few institutions can point to positive, substantial change arising from the results of these assessments. After nearly 40 years, the movement remains an orphan, institutionalized but unwelcome.

There are many causes of this sorry situation, including a justified faculty suspicion that this movement arises from a distrust of faculty judgement and a distaste for faculty autonomy. More importantly, we have no clear substantial evidence that this movement amounts to more than a new hill of paper. Those behind this movement—the federal and state education departments, the regional accrediting bodies, senior system and college administrators, and the recently created phalanx of assessment administrators—have created a paper empire that has not really demonstrated its usefulness. These individuals and their empire definitely need to be assessed, and soon.

I confess that as a college administrator I sometimes prodded and sometimes pushed faculty into participating in this effort. The useful results—and there were some—came from departments where the faculty actually talked to each other about the content and pedagogy of their courses and where those discussions led to changes to improve student learning. I suspect many of these departments would have done this without Big Brother looking over their shoulders. Somehow the focus needs to shift from top-down administration to encouraging better communication about curriculum and teaching within departments, and from hiring more assessment administrators to putting whatever few resources are left back into the one area where change is likely to really help students.

Tough Times Ahead: Older Professionals Look at the Future

As a part of my work on professional autonomy I have been interviewing professionals age 55 and over and asking them what they see as the future of professional autonomy in their fields. These professionals, some retired and some still working, usually tell me they had a fair amount of autonomy in their careers but do not foresee that same level of autonomy for those who follow them.

The two college professors I interviewed believe their profession is losing the high degree of professional self-direction that used to make the life of a college professor so attractive. The doctor I interviewed is not happy about the increasing restrictions he faces in his profession and feels that the situation will be even worse in the future. These perceptions from a very limited sample do not reflect the views and aspirations of younger professionals, who have not yet been interviewed for this project. Are these comments just the typical remarks of an older generation that life and work and love were brighter and stronger back when giants roamed the earth, namely when they were young giants? Or is there more here than a wistful regret that life isn’t the way it used to be?

I think the people I interviewed aren’t paranoid or suffering from clinical depression. (After all, they did have the motivation to agree to be interviewed by a former college administrator.) They could point to specific areas of their jobs where things were changing, and not for the better. The college professors pointed to the growth of part-time faculty at the expense of full-time faculty, who usually have a stronger commitment to their institutions and students, and to the growth of reporting requirements. The doctor I interviewed pointed to the growing power of insurance companies and “managers” of the health care process.

For medicine, higher education, and many other fields, a growing number of studies point precisely at these trends. Even those not reading this professional literature or its more boiled-down versions in places like the New York Times or the Economist must notice that computerization and information technology are profoundly changing the workplace in ways that are deeply upsetting to many people.

I have not yet interviewed younger professionals, but I ask myself what I would do if I were starting again as an English professor. I have also asked this question of the people I interviewed. One respondent told me that the young need to keep their options open in ways we did not feel were necessary when starting our careers. They must understand that they may need to move before their job is moved out from under them or that they may need to switch careers. (This respondent believes that academic tenure is doomed.) Options for fighting back include banding together with other professionals through professional associations, direct political involvement in choosing and electing candidates, and in some cases unions.

Young professionals also need to understand clearly what exactly they can contribute to their students, patients, or clients that a friendly robot can’t do (or at least what we now conceive such a robot can do). Most professionals pride themselves on their professional expertise, but to those they serve their patience, empathy, coaching ability, and judgement are now far more important. Any smart robot can write still another mediocre piece of literary criticism. It is much more difficult to convince an insecure community college student that she can write.

“You Don’t Owe Anybody Anything.” Oh Yes You Do.

There is a T-shirt that seems to be popular at some political rallies, emblazoned with the defiant message: “You don’t owe anybody anything.” The message is apparently popular as a riposte to the welfare state, fueled by the perception that unwed mothers shooting heroin suck up most government money—not primarily middle-class entitlements such as Medicare or Social Security and middle-class tax breaks such as the sacrosanct deduction for mortgage interest. It is also very American—a defiant individualism suspicious of the government and wary of any social obligation other than those we choose to accept. Unfortunately, it is also not true. All of us owe others more than we could ever possibly repay—from our parents to all the generations that preceded us who struggled to make a world that is immeasurably more comfortable for us than it was for almost all of them. We also owe a huge debt to all those people who steadily work at all those jobs we don’t typically want, from the barista who gives you that coffee you need to jump-start your heart at 7 a.m., to the janitor who makes sure your office is clean, to the EMT who will try to revive you when you collapse from the strain of being the center of the universe.

Of course, recognizing this debt doesn’t mean you owe your spendthrift sibling a loan for his down payment on a house, nor does it mean you should expect that the world owes you anything. The happiest and sanest people are probably those who recognize their debt to others but don’t expect the same recognition cast back on themselves.

A society where most people actually believed and lived out the philosophy summarized on that T-shirt would not survive for very long, even with countless aircraft carriers, drone aircraft, walls at the borders, endless security lines at airports, and intrusive surveillance of our personal lives. In this society anomie (rootlessness) would be the rule, not autonomy or a genuine independence. A society where anomie prevailed would be one with a high degree of distrust among its members, fragmented social bonds, and large numbers of individuals only marginally attached to the whole, with some of these lost souls convinced that they have the right to their own rage and to whatever means they choose to express it. If some of this description seems to fit the world we live in, that similarity is a sign of how far we have already gone down this road into a very dark forest.

Autonomy is the opposite of anomie. It arises from shared values, respect for others, and a self-respect based on an awareness of our own limitations and our obligations to others, even those we don’t especially like. Autonomy is almost certainly necessary for any democracy to function in the long run. The unease some of us feel about the future of democracy is in part a result of our suspicion that anomie is creeping into almost all corners of American life precisely at a time when what is needed is a common effort to rebuild and improve our inadequate infrastructure (if you don’t believe me, try driving on I-95 on a weekend in the summer), confront climate change, and figure out how different races, ethnicities, and religions can live together peacefully and productively. These areas may seem removed from this blog’s main theme of micromanagement at work and the consequent loss of autonomy in the workplace, but autonomy doesn’t start or end when we punch the clock. More on this matter later.

The Future of Work, or Does Work Have a Future?

The last eight years have been tough on many economies in the world, with depression in some (Greece), stagnation in others (most of the Euro Zone and Japan), a painfully slow recovery here in the U.S., and collapsed expectations elsewhere (Brazil). The picture is not one of universal gloom (India is one counter-example), but especially in advanced economies there has been a persistently high rate of unemployment, especially among the young. If the French economy is an example of where advanced economies are headed or where they are stuck, then they are in trouble. In France the rate of unemployment for the young (under 25 and desiring employment) is over 23% (Eurostat: EU Labor Force Survey, April 30, 2016) and has been in that neighborhood for years. The rate in Spain in February 2016 was an unbelievable 45.3% (Statista: Youth Unemployment Rate in EU Member States as of February 2016, seasonally adjusted). Even in the US the rate of unemployment for workers without advanced education or other credentials is depressingly high and shows few signs of improving dramatically. Of course, these troubles, coupled with troubles elsewhere, especially in the Middle East, have led to the rise of political extremism, often with a promise to return (you name the country) to the way life used to be. To me, all these promises (from whatever side of the political spectrum) range from the wildly optimistic to the delusionary. What if we can’t go back?

The cheerful answer to that question comes from many economists, represented nicely by the aptly named magazine, The Economist. Sure, old jobs disappear, but new ones will replace them. Blacksmiths give way to auto mechanics, who in turn partly give way to computer software engineers who design the programs that increasingly run our automobiles. New jobs will arise to replace the old, though with new and often more advanced qualifications needed for the work. In the long run, a new but temporary equilibrium will be established, with a more formally credentialed work force equipped to contribute to the brave new world. Unfortunately, there are several problems with this rosy view. The long run might be very long indeed, especially if you are a middle-aged worker with a particular range of job skills that is no longer in demand. Even if you retrain, the long run might not include you. And the logic of automation is such that fewer workers (software engineers) will be needed to perform the same functions once performed by many workers (automobile mechanics). In this country at least there are also large numbers of minimally qualified workers, often without even a high school degree, who have at best a tangential relationship to this economy or for that matter this society. Many are frequently unemployed, in and out of jail, and struggling with drug addiction or mental health issues. It is hard to know what place the brave new world has for them.

Not everyone is likely to qualify for a place, or at least a comfortable place, in this new world. Even for those with advanced credentials there are few guarantees. Whereas automation started as a way to replace physical labor, it now increasingly focuses on the office and on activities such as billing and various kinds of document processing that were once handled by humans. (See Simon Head’s Mindless: Why Smarter Machines are Making Dumber Humans.) Tax preparation and various accounting functions can now often be done by software. The entire field of graphic arts and communications has been transformed by computerization, and many jobs in the field—such as video or graphic production of promotional material—have disappeared. (One of my brothers ran a small company in that field and saw it transformed and shrunk by computer graphics.) I have seen arguments that ensuring everyone has a higher level of training will inevitably raise the standard of living in a certain area. That sounds wonderful, but is it true? Let’s take a town in West Virginia where there are basically no jobs because the coal mine has shut down, and let us ensure that every adult there obtains the equivalent of an associate’s degree from the local community college. What happens to that town? I’ll let you guess.

This blog is dedicated to the belief that work is important for people’s dignity and sense of purpose and that work itself should be an area where individuals are allowed to exercise professional judgement and discretion. I also believe that judgement cannot ultimately be automated, though there are a lot of very smart people out there betting a lot of money on the other team. But I also know that we are just at the very beginning of the age of computers and that none of us knows where this age is headed. Could we end up in a world where the vast majority are unemployed or significantly underemployed, where diversion on top of diversion holds our attention, and where demagogues hold sway over restless and angry mobs? Just asking the question.

HubSpot: Overrating Company Culture

The April 9, 2016 issue of The New York Times contained an opinion piece severely critical of various “new technology” firms that claim to give workers autonomy (what firm doesn’t?) and boast of their flexibility and employee-friendliness, but in fact often turn out to be a “digital sweatshop” (Dan Lyons, “Congratulations, You’ve Been Fired”). The surroundings are pleasant: plenty of light, comfortable lounges, exercise rooms, even bean bag chairs in some cases, and a mode of operation that is casual in a business-casual sort of way: few or no suits, first names for everyone, open office spaces. What is not highlighted in these companies’ portrayals of their happy workers is that “flexible” hours often mean being on call 24 hours a day, and that liberal vacation policies are meaningless because in the internet-driven 24/7 global economy vacations can at any moment become work if your smartphone is anywhere within 100 miles of you. Beneath the pseudo-casual exterior lie the same basic power relations that exist in most employer-employee relationships: the boss may dress like you or even worse, and he (especially he in many of these companies) may be called by his first name, but he still has the power to make your life very miserable in any number of ways. The high-tech company recently in the spotlight is the appropriately named HubSpot, where, among other indignities, “graduation” ceremonies were held for employees who were being terminated.

What I find interesting about HubSpot is not its bad behavior (for some of which, by the way, it was smart enough to apologize) but its faith in corporate culture as the key to success. HubSpot has published its “Culture Code,” where of course it values employee “autonomy and ownership,” but the code also makes it clear that the work hours are “Whenever,” the workplace “Wherever,” and the job tenure “Whatever.” HubSpot may see this flexibility as liberating; others could see it as something closer to the world of arbitrary authority described so carefully by Franz Kafka. HubSpot wants employees to “commit maniacally” to its mission and metrics. HubSpot also admits that when it hires it is looking for “culture fit” over skills and experience, since it believes that “Compromising on culture fit is mortgaging the culture.” The problem with “culture fit,” however, is that it is very often used as an excuse to exclude anyone who does not look or act exactly like the current employees. HubSpot even admits it has a problem with diversity. I wonder why.

But whatever we think about HubSpot’s culture, and however many business gurus preach the importance of corporate culture, is it true that culture is the key to success? If you read biographies of business leaders or books on the history of American business, such as Richard S. Tedlow’s excellent Giants of Enterprise: Seven Business Innovators and the Empires They Built, you find that success can be produced by very different firms: from the traditionally hierarchical (IBM in the days of Big Blue) to the supposedly flat (Apple), from the closed and secretive (Apple can fit this category) to the more open and apparently transparent such as Microsoft, from the employee-friendly such as Eastman Kodak in its heyday to the awful such as Revlon under the leadership of Charles Revson.

Management fashions come and go. Today being flat, flexible, and open-structured is in vogue, while being hierarchical, with fairly structured planning mechanisms and a stress on corporate decorum, is consigned to the trash along with last year’s calendars. But is the failure rate of these contemporary firms with their open office/blue jean/bean bag culture any lower than it has ever been for new firms? From what I know, the failure rate is still incredibly high—at least 80%. The office culture guarantees nothing; indeed it may even hinder success if you spend so much time networking with your co-workers that you don’t have time to think for yourself. And are there firms that remain successful by running more or less as they always have? Judging by the continuity of brand names in the supermarket and elsewhere, one would assume there are.

Culture is not the magic wand that its proponents claim it is. The keys to success in business are having the right product at the right price at the right time. The intelligence and diligence of the firm’s leaders count for a great deal, more than our gurus are sometimes willing to admit. No culture can shield an organization from bad decisions and no culture can magically produce good ones. All around the country there are attempts to reproduce the success of North Carolina’s Research Triangle or California’s Silicon Valley. None has really worked in the way its proponents have hoped, in part because the momentum of being first really does make a difference. Sorry, HubSpot. There seems to be no magic recipe we can all follow to organizational success. If HubSpot eventually becomes a profitable success (it claims to be on the way), it may be despite its somewhat goofy “culture” and because some group in the firm made smart decisions and had the discipline to carry them out.

Handling a Micromanager

In my last blog, “Failure is Underrated,” I discussed one of the primary reasons that supervisors micromanage, namely their own fear of failure. There are too many workplaces where the fear of failure is so great that experimentation is discouraged and mediocrity is disguised as success. Why? Because doing so allows reasons for change or improvement to be ignored. Yet those organizations most willing to accept and learn from failure will have a greater chance of long-term survival than firms that only want good news and only reward the bearers of good tidings. There are numerous examples of such self-defeating behavior in post-WWII American history, from the “light at the end of the tunnel” in Vietnam to the complacency of Detroit in the early stages of increased foreign competition to the lethargic response of Kodak executives as their core business largely disappeared.

But what about the victims of micromanagement, those who suffer daily at work from a stifling lack of autonomy? What can they do to improve their situation? What can you do when your boss is a micromanager? Actually, the problem overlaps one that probably hundreds of millions of people face every day: What do you do when your boss is a jerk? Not all jerk bosses are micromanagers and not all micromanaging bosses are jerks—some are overprotective perfectionists who think they are doing you and your organization a big favor. But the issue is more or less the same: someone with authority over you behaves in a way that you find unacceptable to some degree, ranging from the mildly annoying to the repulsive. Unfortunately, there is no easy solution, and most of the solutions carry some risks. As I see it, the options are 1) keep your resume updated, 2) open a dialogue with your supervisor, 3) move the issue up the ladder, 4) consider a transfer within the same organization, and 5) say good-bye. These options are of course common sense, but they deserve a couple of comments.

First, your resume, the dusty one in the drawer that you have trouble finding. You need to keep it current so that you don’t forget important aspects of your career in case you ever need to put a resume together in a hurry. Keeping your resume current also gives you some perspective on how you might look to other people if you were in the market for another job. If all of your significant accomplishments are over five years or (depending on the field) even two years old, your updated resume is sending you a warning signal as clear as a stratospheric blood pressure reading. Something needs to be done.

Most important, updating your resume is a sign that you know nothing lasts forever and that ill winds can blow even in the most blissful of locations. Our culture has a profound distaste for the concept of the wheel of fortune, since we want to believe that our merit alone put us where we are and others’ lack of the same put them in the grubby little hole they so clearly deserve to be in. But I have seen more than one instance where a beloved native-son chancellor of a state system of higher education went quickly from hero to unemployed after a change in just a couple of the members of the governing board. Probably our profoundest instinct is to believe we are indispensable, but we all know that is also one of the biggest lies we can tell ourselves.

As for the other steps, ranging from dialogue to going over the boss’s head, the steps listed above are meant to be taken in sequence. Start friendly and try to avoid confrontations. Don’t approach your boss until you have done something out of the ordinary in a good way that will predispose the boss in your favor. I would point out to the boss how giving you more scope for your work will allow you to be more productive and will give the boss more time for far more significant tasks than peering over your shoulder. I would move this discussion up the ladder only when it is clear that your supervisor is not going to change. I would also remain polite even though we live in a culture that devalues politeness. Politeness can be a shield for the less powerful. In this situation the less powerful person is almost always you.

Sad to say, don’t be surprised if you don’t really get what you want. Individual behavior is hard to change, and often micromanagers change only through extensive experience accompanied by the attendant exhaustion. Remember also that the instinctual response of most supervisors is to protect the hierarchy; even if they have reservations about your boss, they will support him or her. There is plenty of bitter evidence for this in the data of the Workplace Bullying Institute, which shows that most workplace bullying is top-down, that in only 26% of reported cases is the perpetrator punished, dismissed, or quits, and that in 76% of cases the target is fired, forced out, transferred, or quits (Workplace Bullying Institute, 2014 WBI U.S. Workplace Bullying Survey). The odds are not in your favor. There is a push for more workplace bullying laws, but there is of course resistance to such change. There are also unions, but not in many professional fields and not for lots of other workers as well. Standing and fighting may be the most heroic option, but it may not leave you the most satisfied with the outcome.

Two items to keep in mind if you want to deal with micromanagement. First, if at all possible, be prepared to move if the situation bothers you enough. Depending on circumstances, moving can be enormously difficult, but it is your last resort and you need to recognize that upfront. Moving can also be a very liberating experience. Second, outperform what is expected of you. Doing so strengthens your case internally, gives you confidence, and builds your resume if you want or even need to move. You need to see yourself as, and you need to be, someone in control of your own life.

Failure is Underrated

My last blog on the flight from autonomy ended with a promise to take up the question of what you should do if you are a micromanager, even a “closet” micromanager who tries hard not to be. Millions of words have been written on this topic, most of which boil down to supervisors needing to manage tasks while giving their staff the leeway to do their work. Managers should not do the work assigned to others or spend the day carefully scrutinizing the work of employees, but should focus on ensuring that the task is accomplished. At Xerox many years ago (and perhaps still today) all supervisors who were to be promoted into the inner sanctum of stock options and bigger headaches were required to attend a several-week seminar entitled “Managing Tasks Through People.” The seminar title pretty much conveys the point of the course. Unfortunately, we all know that many supervisors find it very hard to carry out this advice. There may be many reasons for their inability to step back and let people do their jobs, including a supervisor’s love of the job she used to have rather than the more remunerative one she now has. Examples abound of great teachers who end up as mediocre and miserable school principals or talented engineers who become not-so-terrific bosses. But the biggest reason for micromanagement is almost certainly fear of failure—both personal failure and organizational failure.

Failure is of course not popular in a world that glorifies success, as ours certainly seems to. Despite the teachings of our religious and ethical heritages, most of us probably regard humility as appropriate only for fools and other suckers. Failure is also the greatest threat to a technological civilization, since bad things happen when airplane engines fail or the power grid collapses. The premise and the promise of our civilization are that it is “fail-safe.” However, in our quest to be “fail-safe” we may become “learning-proof,” because we learn more from failure than from success. Success is the pat on the back, the assurance that we were right all along and that all we need to do is keep repeating what we did before. Failure—if we recognize it for what it is—tells us that we need to make changes and that we need to step back, think carefully, examine options, question assumptions. Failure is far more conducive to “thinking out of the box” than success is. Success breeds complacency and the tendency to repeat what works. But if you repeat yourself as circumstances inevitably change, eventually the results will change as well. If you are in a competitive environment, the more predictable your behavior, the more likely your competitors will figure out how to get the jump on you.

Constant failure, of course, can lead to despair, but constant success is not the unalloyed bliss that its worshippers imagine. It carries within it the seeds of its own destruction. Understanding this hard reality doesn’t mean we should cultivate failure or “bet the farm” on every venture. But we do need to create in our own work lives and those of others a climate of experimentation where failure is accepted as the price of learning. An annual evaluation system for an organization would be far more useful if it asked what was learned from experimentation and failure rather than requiring a predictable and tedious charting of success in meeting goals. Besides, in an organizational world flooded with data, the big shots usually already know whether you met your targets or not.

Those big shots who believe that “failure” is the real “f word” need to recognize that one of the worst kinds of failure is the failure that pretends to be something else, namely success. That kind of dishonesty can be fostered by a culture that worships success. Ironically, the inability to acknowledge failure can make failure, when it occurs, far more severe and cataclysmic. Just ask the Detroit auto executives who for years underestimated the threat of low-cost, high-quality Japanese competition. How many of them admitted failure until they had no choice? Perhaps humility ultimately has greater evolutionary survival value, for organizations and for the human species. Some of us would not be surprised.
