Scapegoats and Holy Cows: Climate Activism and Livestock

Opposition to livestock has become part of climate activism. Veganism is growing, particularly amongst affluent Westerners, and billions of dollars are flowing into the associated “animal-free meat and dairy” industry. This will result in yet more people forced off their land and away from self-sufficiency, give more profits and power to corporations, and may have little or no positive impact on the environment.

Until recently, Greta Thunberg kept a filmed appeal to stop eating meat and dairy as the first item on her Twitter account—she has been a vegan for half her life, so that is not surprising. Her message begins with pandemics but swiftly segues to climate change, as might be expected. (Assertions linking deforestation with pandemics are tenuous and speculative: there is no established link between COVID-19 and deforestation or the wildlife trade.) The film was made by Mercy for Animals, which she thanks.

The film remained at the top of her Twitter account for months. She has several million followers, so the value of the advertising she gave this little-known not-for-profit must run into millions of dollars. As opposition to livestock has become a major plank of climate activism, it is worth looking at how the world’s biggest climate influencer chooses to influence it.

Greta Thunberg’s 2021 Mercy for Animals film: “If we don’t change, we are f***ed.”

Mercy for Animals is an American NGO with the stated purpose of ending factory farming because it is cruel to animals, a fact with which few would disagree. There are other reasons to shun factory-farmed meat in favour of meat from animals raised on pasture, not least because some of the meat thus produced is subsequently heavily processed using unhealthy ingredients and then shipped long distances. The reason factory-farmed meat remains profitable is, obviously, that it is cheap, and that those who cannot afford expensive free-range or organic meat have little other choice.

There is no doubt that factory farming is an industrial process that pollutes. There is also no doubt that an average Western—especially urban—diet contains a lot of unhealthy things, including too much meat. But whether or not folk who eat sensible amounts of local, organic meat and dairy, and try to stay fit and healthy, would have any significant impact on the planet’s climate by changing their diet is another matter, which I will come back to.

Mercy for Animals’ beliefs go much further than opposing animal cruelty. The organisation believes in anti-speciesism: the idea that humans have no right to impose their will on other animals or to “exploit” them. It is a view shared by a growing number of people, especially vegans in the Global North. Thunberg goes so far as to believe that only vegans can legitimately “stand up for human rights,” and wants non-vegans to feel guilty. Even more radical is Google co-founder Larry Page, who reportedly thinks robots should be treated as a living species, just one that is silicon-based rather than carbon-based!

Whatever novel ideas anti-speciesists think up, no species would evolve without favouring its own. Our ancestors would never have developed their oversized brains if they had not eaten scavenged or hunted meat, and we have always lived in symbiosis with other animals, sometimes to the benefit of both. It seems likely that the wolf ancestors of dogs freely elected to live close to humans, taking advantage of our hearths and our ability to store game. In this, the earliest proven instance of domestication, perhaps each species exploited the other.

Having visited many subsistence hunters and herders over the last half century, I know that the physical – and spiritual – relationship they have with the creatures they hunt, herd or use for transport, is very different from that of most people (including me!). Most of us now have little experience of the intimacy that comes when people depend at first-hand on animals for survival.

Hunters, for example, often think they have a close connection with their game, one based on respect and exchange. A good Yanomami huntsman in Amazonia does not eat his own catch but gives it away to others. Boys are taught that if they are generous like this, the animals will approach them to offer themselves willingly as prey. Such a belief encourages strong social cohesion and reciprocity, which could not be more different from Western ideals of accumulation. The importance of individual cows to African herders, or of horses to the Asian steppe dwellers who, we think, first started riding them in earnest, can be touchingly personal, and the same can be found all over the world.

Everyone knows that many small children, if they feel safe, have an innate love of getting up close and personal with animals, and projects enabling deprived city kids to interact with livestock on farms can improve mental wellbeing and make children happier.

This closeness to other species is a positive experience for many, clearly including Thunberg; her film features her visiting an English animal sanctuary and cuddling one of her pet dogs. Those who believe speciesism is of great consequence, on the other hand, seem to seek a separation between us and other animals, whilst paradoxically advancing the idea that there is none. Animals are to be observed from a distance, perhaps kept as pets, but never “exploited” for people’s benefit.

Mercy for Animals does not stop at opposing factory farming. It is against the consumption of animal products altogether, including milk and eggs, and thinks that all creatures, including insects, must be treated humanely. Using animals for any “work” that benefits people is frowned upon. For example, the foundation holds the view that sheepdogs are “doubly problematic” because both dogs and sheep are exploited. It accepts, however, that they have been bred to perform certain tasks and may “experience stress and boredom if not given . . . work.” In a communication to me, the organisation confirmed that it is also (albeit seemingly reluctantly) accepting of pets, as they are “cherished companions with whom we love to share our lives”, and without them we would be “impoverished”. Exactly the same could be said for many working dogs, of course.

Anyway, this not-for-profit believes that humans are moving away from using animals for anything, not only meat, but milk, wool, transport, emergency rescue, and everything else. It claims “several historical cultures have recognized the inherent right of animals to live . . . without human intervention or exploitation,” and thinks we are slowly evolving to a “higher consciousness” which will adopt its beliefs. It says this is informed by Hindu and Buddhist ideals and that it is working to “elevate humanity to its fullest potential.”

We all exalt our own morality of course, but professing a higher consciousness than those who think differently casts a supremacist shadow. The alleged connection with Indian religions is a common argument but remains debatable. The sacredness of cows, for example, is allied to their providing the dairy products widespread in Hindu foods and rituals. The god Krishna, himself a manifestation of the Supreme Being Vishnu, was a cattle herder. The Rig Veda, the oldest Indian religious text, is clear about their role: “In our stalls, contented may they stay! May they bring forth calves for us . . . giving milk.” Nearly a third of the world’s cattle are thought to live in India. Would they survive the unlikely event of Hindus converting to veganism?

Most Hindus are not wholly vegetarian. Although a key tenet of Hindu fundamentalism over recent generations is not eating beef, the Rig Veda mentions cows being ritually killed in an earlier age. The renowned Swami Vivekananda, who first took Hinduism and yoga to the US at the end of the 19th century and is hailed as one of the most important holy men of his era, wrote that formerly, “A man [could not] be a good Hindu who does not eat beef,” and reportedly ate it himself. Anyway, the degree to which cows were viewed as “sacred” in early Hinduism is not as obvious as many believe. The Indus Civilisation of four or five thousand years ago, to which many look for their physical and spiritual origins, was meat-eating, although many fundamentalist Hindus now deny it.

Vegetarians are fond of claiming well-known historical figures for themselves. In India, perhaps the most famous is Ashoka, who ruled much of the subcontinent in the third century before Christ and was the key proponent of Buddhism. He certainly advocated compassion for animals and was against sacrificial slaughter and killing some species, but it is questionable whether he or those he ruled were actually vegetarian.

Whatever Ashoka’s diet included, many Buddhists today are meat-eaters, like the Dalai Lama and most Tibetans—rather avid ones in my experience—and tea made with butter is a staple of Himalayan monastic life. Mercy for Animals, however, remains steadfast in its principles, asserting, “Even (sic!) Jewish and Muslim cultures are experiencing a rise in animal welfare consciousness.”

Mercy for Animals might look at how racists have supported animal rights over the last hundred years, sometimes cynically and sometimes not. “Concern for animals can coexist with a strong strain of misanthropy, and can be used to demonise minority groups as barbaric, uncivilised and outdated . . . in contrast to supposedly civilised, humane Aryans. . . . The far right’s ventures into animal welfare is sometimes coupled with ‘green’ politics and a form of nature mysticism.”

Mercy for Animals was founded by Milo Runkle, a self-styled “yogi” who lives in Los Angeles. He was raised on an Ohio farm and discovered his calling as a teenager on realising the cruelty of animal slaughter. He is now an evangelical vegan who believes an “animal-free” meal is “an act of kindness”. He is also a keen participant in the billion-dollar Silicon Valley industry trying to make and sell “meat and dairy” made from plants, animal cells and chemicals. He is a co-founder of the Good Food Institute and sits on the board of Lovely Foods. Like others in the movement, he rejects the term “fake” and insists that the products made in factories—backed by billionaires like Richard Branson and Bill Gates—are real meat and dairy, just made without animals.

The multi-million dollar Good Food Institute is also supported by Sam Harris, a US philosopher who came to prominence with his criticism of Islam, which he believes is a religion of “bad ideas, held for bad reasons, leading to bad behaviour”, and constitutes “a unique danger to all of us.”

Milo Runkle, in white, and vegan friends, 2019.

Ersatz animal products are of course ultra-processed, by definition. They rely on genetic modification, are expensive, and produce a significant carbon footprint, although figures for the gases emitted by any type of food depend on thousands of variables and are extremely complex to calculate. The numbers bandied about are often manipulated and should be viewed with caution, but it seems that the environmental footprint of “cultivated meat” may actually be greater than that of pork or poultry.

Is opposing livestock—and not just factory farming—and promoting veganism and fake meat and dairy really an effective way of reducing environmental pollution? Few people are qualified to assess the numerous calculations and guesses, but it is clear that the different sides in the anti-livestock debate make vastly different claims. They range from livestock contributing some 14 per cent of greenhouse gas emissions to a clearly exaggerated 50 per cent—and the fact that livestock on pasture also benefits the atmosphere is rarely mentioned by its critics. Thunberg plumps for a vague “agriculture and land use together” category, which she thinks accounts for 25 per cent of all greenhouse gas emissions, but which of course includes plants. It is also important to realise that some grazing lands simply cannot produce human food except as animal pasture. Take livestock out of the picture in such places, and the amount of land available for food production immediately shrinks.

In brief, some vegetarians and vegans may produce higher greenhouse gas emissions than some omnivores—it all depends on exactly what they consume and where it is from. An out-of-season vegetable that has travelled thousands of miles to reach your plate has a high carbon footprint; the same thing, grown locally in season, has a much lower one. If you are in Britain and buy, for example, out-of-season aubergines, peas, asparagus, or Kenyan beans, you are likely consuming food with a high environmental impact.

In any event, there is no doubt that a locally sourced, organically raised—or wild—animal is an entirely different creature from one born and living in a factory on the other side of the world. There is also no doubt that the factory version could be a legitimate target for climate activism. So could the felling of established forests, whether it is for cattle, animal feed or any number of things.

*

Why should anyone who does not want real meat or dairy want to eat an expensive lookalike made entirely in a factory? Is it mere taste, habit, or virtue signalling? Few would dispute that the food we eat is at the centre of our identity. This has long been recognised by social scientists, and is in plain sight in the restaurant quarter of every city, everywhere in the world. “You are what you eat” is also as scientific as it is axiomatic.

3D printed meat.

Diet is central to many religions, and making people change what they eat, whether through the mission, the schoolroom, or legal prohibitions, has long been a significant component of the colonial enterprise of “civilising the natives”. Many traditional indigenous diets are high in animal protein, nutrient-rich, and either low in fat or high in marine sources of fat. Restricting the use of traditional lands and prohibiting hunting, fishing and trapping—as well as constant health edicts extolling low animal fat diets—have been generally disastrous for indigenous people’s wellbeing, and this is particularly noticeable in North America and Australia. The notorious residential schools in North America, where indigenous children were taken from their families and forced into a deliberately assimilationist regime, provided children with very little meat, or much of anything for that matter. Many died.

Western campaigns around supposedly improving diet go far beyond physical welfare. For example, the world’s best-known breakfast cereal was developed by the Seventh Day Adventist and fiercely vegetarian Kellogg brothers in 1894. They were evangelical about the need to reduce people’s sex drive. Dr Kellogg advocated a healthy diet of his Corn Flakes, which earned him millions. He separately advised threading silver wire through the foreskin and applying acid to the clitoris to stop the “doubly abominable” sin of masturbation. Food choices go beyond animal cruelty or climate change!

The belief that meat-eating—particularly red meat—stimulates sexual desire and promotes devilish masturbation is common in Seventh Day Adventism, a religion founded in the US in the 1860s out of an earlier belief called Millerism. The latter held that Christ would return in 1844 to herald the destruction of the Earth by fire. Seventh Day Adventism is a branch of Protestantism, the religion that has always underpinned American attitudes about material wealth being potentially allied to holiness. I have written elsewhere on how Calvinist Protestant theology from northern Europe underpins the contemporary notion of a sinful humankind opposing a divine “Nature”, and it is noteworthy that Seventh Day Adventism began at exactly the same time as the US national park movement, in the 1860s.

Although not widely known by the general public, Seventh Day Adventism is one of the world’s fastest-growing religions, and it has sought to push its opposition to meat into wider American attitudes for over a century. For example, the American Dietetic Association was co-founded in 1917 by a colleague of Kellogg, Lenna Cooper. It evolved into the Academy of Nutrition and Dietetics and is now the world’s largest organisation of nutrition and dietetics practitioners.

Protestants figuring out what God wants humans to eat dates from before Seventh Day Adventism. The famous founder of Methodism, John Wesley, did not eat meat; some years after he died, a few of his followers started the vegetarian Bible Christian Church in England’s West Country. They sent missionaries to North America a generation before the foundation of Seventh Day Adventism and were also closely involved in establishing the Vegetarian Society in England in 1847—three years after Christ did not come to end the world with fire as originally predicted. It was this society that first popularised the term “vegetarian”. In 1944, a hundred years after that non-appearance of Christ, the word “vegan” was coined.

Fundamentalist Christians might believe that humankind’s supposedly vegan diet in the Garden of Eden should be followed by everyone, and that is obviously open to question from several points of view. What is clearer, and worth repeating, is that the “normal” Western urban diet, particularly North American, contains a lot of highly processed factory foods and additives and is just not great for human health.

It is also true that, in spite of generations of colonialism trying to erode people’s food self-sufficiency, hundreds of millions of people still depend on eating produce—animal as well as vegetable—which is collected, hunted, caught or herded by their own hands, or by others close by, often sustainably and organically. Perhaps rather paradoxically, Thunberg visited Sami reindeer herders the year before her Mercy for Animals film. They are recognised as indigenous people in her part of the world and are about as far from veganism as is possible. They not only eat the reindeer—including its milk, cheese and blood—but also consume fish, moose and other animals. As far as I know, there are no indigenous peoples anywhere in the world who are vegan.

Sami haute cuisine, about as far from veganism as imaginable.

Like the Sami, about one quarter of all Africans depend on sustainable herding, and the pastoralists in that continent have an enviable record of knowing how to survive the droughts that have been a seasonal feature in their lives for countless generations. It is also the case that pasturelands created or sustained by their herds are far better carbon sinks than new woodlands.

Some wild as well as domesticated animal species feed a lot of people. In spite of conservationist prohibitions and its relentless demonisation, “bushmeat” is more widespread than is admitted and remains an important nutritional source for many Africans. Denigrating it has an obviously racist tone when compared to how “game” is extolled in European cuisine. If you are rich, you can eat bushmeat; if you are poor, you cannot.

Many do not realise that bushmeat is openly served in African restaurants, particularly in South Africa and Namibia, the countries with by far the highest proportion of white citizens. During the hunting season, no less than 20 per cent of all (red) meat eaten is from game, with ostrich, springbok, warthog, kudu, giraffe, wildebeest, crocodile and zebra all featuring on upmarket menus. Meanwhile, poor Africans risk fines, beatings, imprisonment or worse if they hunt the same creatures. When “poachers” are caught or shot, Western social media invariably erupts with brays of how they deserve extreme punishment.

The Carnivore, Johannesburg (also in Nairobi), “Africa’s Greatest Eating Experience”, makes a feature of bushmeat on its menus.

Some conservationists would like to end both herding and hunting and, even more astonishingly, advocate for Africans to eat only chicken and farmed fish. In real life, any step towards that luckily unattainable goal would result in an increase in malnutrition, in the profits of those who own the food factories and supply chains, and probably in greenhouse gas emissions as well.

Controlling people’s health and money by controlling their access to food has always loomed large in the history of human subjugation. Laying siege was always a guaranteed way of breaking an enemy’s body and spirit. If most food around the world is to be produced in factories—like fake meat and dairy—then the factory owners will control human life. The drive to push small-scale hunters, herders and farmers off their land, supposedly for rewilding or conservation, is a step towards that ruin.

The clamour against meat and dairy goes far beyond opposition to factory farming, and that is the problem. Of course, there is nothing wrong with celebrating vegetarianism and veganism, but claiming they are the product of a higher consciousness or morality, and labelling as cruel or guilty those who stick to their existing diet, as Thunberg and Runkle do, turns them into religious beliefs. Such beliefs invariably carry fundamentalist undertones that can tip all too easily into violence against non-believers.

“Meet [sic] is murder” – vandalism of meat and cheese shops is a common tactic of vegan activists.

Some vegans go beyond persuasion and try to force others to their belief whether they like it or not. One way they do this is by illegally raiding factory farms to “liberate” the animals, as Milo Runkle did; others engage in low-level vandalism like spray-painting meat and cheese shops or breaking windows, or go further and wreck vehicles. The fact that the most extremist animal rights activists—usually referencing veganism—do all of this and a great deal more, including physical threats, arson, grave robbing (sic), and planting bombs, is unfortunately no invented conspiracy theory.

The most extreme protests, involving firebombs and razor blades in letters, are normally reserved for those who use animal tests in research. The homes of scientists are the usual targets, although other places such as restaurants and food processing plants are also in the firing line. One US study found that the activists behind the violence were all white, mostly unmarried men in their 20s. Their beliefs echoed those of many ordinary climate activists: support for biodiversity; that humans should not dominate the earth; that governments and corporations destroy the environment; and that the political system will not fix the crisis.

An organisation called Band of Mercy (unrelated to Mercy for Animals) was formed in 1972 and renamed the Animal Liberation Front four years later. Starting in Britain, where by 1998 it had grown to become “the most serious domestic terrorist threat”, it spawned hundreds of similar groups in forty countries around the world. Membership is largely hidden, but the groups do seek publicity—in one year alone, they claimed responsibility for 554 acts of vandalism and arson.

Of course, moderate vegans are not responsible for the violence of a small minority, but history shows that where there are lots of people looking for a meaningful cause, some will pursue the one they latch onto in extreme ways. In brief, there is a problematic background to opposing meat and dairy that should be faced. Big influencers must accept a concomitantly big responsibility in choosing what to endorse, and the most powerful among them must be sensitive to the inevitability of extremist interpretations of whatever they demonise.

We know that digital communication is a new and effective way of stoking anger that can lead to violence. For example, the risk that Muslims in India today might be murdered by Hindu fundamentalists if they are even suspected of eating beef seems to have increased with the proliferation of social media. Characterising a meal as cruel if it includes meat or even dairy, as Runkle wants us to, could be used to stoke deadly flames far from his West Coast home.

Hindu fundamentalists, having lynched a Muslim suspected of storing beef, burn his effigy in response to an inquiry that found that he had not.

More broadly, well-off influencers trying to make others feel guilty about what they eat should be careful about unintended consequences. Disordered eating damages many people, especially young girls who already face challenges around their transition to adulthood. In addition to everyday teenage angst and biology, they face the relentless scourge of social media, now with eco- and COVID-19 anxiety as added burdens. In a rich country like the UK, suicide has become the main cause of death for young people. In that context, telling people they are guilty sinners if they carry on eating what they, or their parents, have habitually eaten could set off dangerous, cultish echoes.

On another level, corporations and NGOs should stop trying to deprive people of any food self-sufficiency they might have left, and stop kicking them off their territories and into a dependence on factories from which the same corporations profit.

The obvious lesson from all this is to eat locally produced organic food as much as possible, if one can. That is a good choice for health, local farming, sustainability, and reducing pollution. Those who want to might also choose to eat less meat and dairy, or none at all. That is a good choice for those who oppose animal slaughter, believe milk is exploitation, or decide that vegan is better for them. However, claiming veganism means freedom from guilt and sin and is a key to planetary salvation is altogether different and, to say the least, open to question.

Thunberg’s core message in her Mercy for Animals film is “We can change what we eat”, although she admits that some have no choice. In reality, choosing what to eat is an extraordinarily rare privilege, denied to most of the world’s population, including the poor of Detroit and Dhaka. The world’s richest large country has 37 million people who simply do not have enough to eat, of anything; six million of these Americans are children. Those lucky enough to possess the privilege of choice do indeed have an obligation to use it thoughtfully. In that respect anyway, Thunberg is right.

Stephen Corry (b. 1951, Malaya) was projects director of Survival International, the global movement supporting indigenous and tribal peoples, from 1972, and has been its director since 1984.

The Possibilities and Perils of Leading an African University

This is the first of a ten-part series of reflections on various aspects of my experiences over six years as Vice Chancellor of USIU-Africa that will be expanded into a book.

For six years, from 2016 to 2021, I was Vice Chancellor (President) of a private university in Kenya, the United States International University-Africa. It was an honor and privilege to serve in that role. It marked the apex of my professional academic life. It offered an incredible opportunity to make my small contribution to the continued development of the university itself, put into practice my scholarly research on African higher education, and deepen my understanding of the challenges and opportunities facing the sector at a time of tumultuous change in African and global political economies.

When I took the position, I was quite familiar with both African universities and Kenya as a country. I was a product of African higher education, having undertaken my undergraduate studies at the University of Malawi, my home country, in the 1970s. I had done my PhD dissertation at Dalhousie University in Canada on Kenya’s economic and labor history, for which I spent about fifteen months in Kenya in 1979-1980.

Later, I taught at Kenyatta University in Nairobi for five and a half years, from 1984 to 1989. That is one reason the position of Vice Chancellor at USIU-Africa eventually proved attractive to me. I would be returning to my African “intellectual home.” Or so I thought. I came back to a different country, as I will elaborate later in my reflections.

After I left Kenya at the beginning of January 1990, I spent the next 25 years at Canadian and American universities. But Africa was always on my mind, as an epistemic and existential reality, the focus of my intellectual and political passions, the locus of my research work and creative writing. My scholarly studies on intellectual history examined the construction of ideas, disciplines, interdisciplines, and higher education institutions and their African provenance, iterations, and inflections.

Over the years I had published numerous books and papers on African studies and universities, including, in 2004, African Universities in the 21st Century (Vol. I: Liberalization and Internationalization and Vol. II: Knowledge and Society), and, in 2007, The Study of Africa (Vol. I: Disciplinary and Interdisciplinary Encounters and Vol. II: Global and Transnational Engagements).

In early 2015, I was commissioned to write the Framing Paper for the 1st African Higher Education Summit on Revitalizing Higher Education for Africa’s Future, held in Dakar, Senegal, on March 10-12. I was also one of the drafters of the Summit Declaration and Action Plan. So I was well versed in the key issues facing African higher education. But leading an actual African university proved a lot more complex and demanding, as this series will show.

The vice chancellor’s position at USIU-Africa was advertised after the Dakar Summit. Initially, it had little appeal for me. My earlier experiences at Kenyatta University had left me wary of working as an “expatriate”, as a foreigner, in an African country other than my own. In fact, in 1990 I wrote a paper on the subject, “The Lightness of Being an Expatriate African Scholar,” which was delivered at the renowned conference convened by the Council for the Development of Social Science Research in Africa, held in Uganda in late November of that year, out of which emerged the landmark Kampala Declaration on Intellectual Freedom and Social Responsibility. The paper was included in my essay collection, Manufacturing African Studies and Crises, published in 1997.

The paper began by noting, “The lack of academic freedom in Africa is often blamed on the state. Although the role of the state cannot be doubted, the institutions dominated by the intellectuals themselves are also quite authoritarian and tend to undermine the practices and pursuit of academic freedom. Thus, the intellectual communities in Africa and abroad, cannot be entirely absolved from responsibility for generating many of the restrictive practices and processes that presently characterize the social production of knowledge in, and on, Africa. In many instances they have internalized the coercive anti-intellectualist norms of the state, be it those of the developmentalist state in the South or the imperialist state in the North, and they articulate the chauvinisms and tyrannies of civil society, whether of ethnicity, class, gender or race.”

The rest of the paper delineated, drawing from my experiences at Kenyatta, the conditions, contradictions, constraints, exclusions, and marginalization of African expatriate scholars in African countries that often force them to trek back to the global North where many of them studied or migrated from, as I did.

Once I returned from the diaspora to Kenya in 2016, I soon realized, to my consternation, that xenophobia had actually gotten worse, as I will discuss in later sections. It even infected USIU-Africa, which took pride in being an “international American university.” In my diasporic excitement to “give back” to the continent, to escape the daily assaults of racism that people of African descent are often subjected to in North America, Europe and elsewhere, I had invested restorative Pan-African intellectual and imaginative energies in a rising developmental, democratic, integrated and inclusive post-nationalist Africa.

Over the next six years, I clung desperately to this fraying ideal. It became emotionally draining, but intellectually clarifying and enriching. I became an Afro-realist, eschewing both the debilitating Afro-pessimism of Africa’s eternal foes and the exultant bullishness of Afro-optimists.

In 2015, as I talked to the US-based search firm handling the vice chancellor recruitment, and to some of my close friends and colleagues in the diaspora, I warmed up to the idea of a diaspora return. The colleagues included those who had participated in the Carnegie African Diaspora Fellowship Program (CADFP). The program was based on research I conducted in 2011-2012 for the Carnegie Corporation of New York (CCNY) on the engagement of African diaspora academics in Canada and the United States with African higher education institutions.

CADFP was launched in 2013 and I became chair of its Advisory Council, comprising prominent African academics and administrators. This was one of four organs of the program; the other three were CCNY, providing funding; the Institute of International Education (IIE), offering management support; and my two former universities in the US (Quinnipiac) and Kenya (USIU-Africa), hosting the Secretariat. Several recipients ended up returning to work on the continent long after their fellowships. I said to myself, why not me?

For various reasons, my position as Vice President for Academic Affairs in Connecticut had turned out to be far less satisfactory than I had anticipated. I was ready for a new environment, challenges, and opportunities. So I put in an application for the USIU-Africa vice chancellorship. There were 65 candidates altogether. The multi-stage search process replicated the ones I was familiar with in the US, but it was novel in Kenya, where the appointment of vice chancellors tends to be truncated to an interview lasting a couple of hours or so, in which committee members score the candidates, sometimes on dubious ethnic grounds.

At the time I got the offer from USIU-Africa, I had two other offers: a provostship in Maryland, and the founding CEO position at the African Research Universities Alliance. Furthermore, I had been one of the last two candidates for a senior position at one of the world’s largest foundations, from which I withdrew. I chose USIU-Africa after long deliberations with my wife and closest friends. Becoming vice chancellor would give me an opportunity to test, implement, and refine my ideas on the Pan-African project of revitalizing African universities for the continent’s sustainable transformation.

USIU-Africa had its own attractions as the oldest private secular university in Kenya. Originally established in 1969 as a branch campus of an American university of the same name, based in San Diego, with other branches in London, Tokyo, and Mexico City, it was the only university in the region that enjoyed dual accreditation, by the Commission for University Education in Kenya and the Western Association of Schools and Colleges in the United States. Moreover, it was the most international university in the region, with students from more than 70 countries; an institution that seemed to take diversity and inclusion seriously; a comprehensive university with several schools offering bachelor’s, master’s, and doctoral programs; and one that boasted seemingly well-maintained physical and electronic infrastructure poised for expansion. The position prospectus proclaimed the university’s ambitions to become research intensive.

Six months before my wife and I packed our bags for Kenya, I took up a fellowship at Harvard University to work on a book titled The Transformation of Global Higher Education: 1945-2015, which was published in late 2016. I had long been fascinated by the history of ideas and of knowledge-producing institutions around the world, and this book gave me an opportunity to pursue that fascination by examining the development of universities and knowledge systems on every continent—the Americas, Europe, Asia, and of course Africa. Writing the book filled me with excitement bordering on exhilaration, not least because it marked only the second time in my academic career that I was on sabbatical.

I thought I was as prepared as I could be to assume leadership of a private African university. As I showed in my book, by 2015 private universities outnumbered public ones across the continent, 972 out of 1,639; in 1999, there had been only 339 private universities. Still, public universities predominated in student enrollments, and although many had lost their former glory, they were often much better than most of the fly-by-night profiteering private institutions sprouting all over the place like wild mushrooms.

Africa of course needed more universities to overcome its abysmally low tertiary enrollment ratios, but the haphazard expansion taking place often without proper planning and the investment of adequate physical, financial, and human resources only succeeded in gravely undermining the quality of university education. The quality of faculty and research fell precipitously in many countries and campuses as I have demonstrated in numerous papers.

Serving in successive administrative positions, ranging from college principal and acting director of the international program at Trent University in Canada to, in the United States, center director and department chair at the University of Illinois, college dean at Loyola Marymount University, and academic vice president at Quinnipiac University, I had come to appreciate that once you step onto the administrative ladder, even if by accident or reluctantly, as in my case, there are imperatives one has to undertake in preparing for the next level.

Universities are learning institutions and as such university leaders at all levels from department chairs to school deans to management to board members must be continuous learners. This requires an inquisitive, humble, agile, open, creative, entrepreneurial, and resilient mindset.

It entails, first, undergoing formal training in university leadership. Unfortunately, this is underdeveloped in much of Africa, as higher education leadership programs hardly exist in most countries. As part of my appointment, I asked for professional training opportunities to be included in my contract, for the simple reason that I had never been a VC before, so I needed to learn how to be one! In summer 2016 and summer 2017, I attended Harvard University’s seminars, one for new presidents and another on advancement leadership for presidents. Not only did I learn a lot, but I also built an invaluable network of presidential colleagues.

Second, university leaders must familiarize themselves with and understand trends in higher education by reading widely on developments in the sector. In my case, for two decades I immersed myself in the higher education media, subscribing to The Chronicle of Higher Education and later Times Higher Education, and reading Inside Higher Ed, University World News, and other outlets. As vice chancellor I took to producing a weekly digest of summaries of pertinent articles for the university’s leadership teams. I got the impression few bothered to read them, so after a while I stopped. I delved into the academic media because I wanted to better understand my role and responsibilities as an administrator. Over time, this morphed into an abiding fascination with the history of universities and other knowledge-producing institutions and systems.

Third, it is essential to develop the propensity for consulting, connecting, and learning from fellow leaders within and outside one’s institution. For a director, chair or dean, that means colleagues in those positions as well as those to whom one reports. The same is true for deputy vice chancellors or vice presidents. For provosts, executive vice presidents and presidents, the circle for collegial and candid conversations and advice narrows considerably and pivots to external peers.

In my case, this was immensely facilitated by joining boards including those of the International Association of Universities, the Kenya Education Network, better known as KENET, and the University of Ghana Council, and maintaining contacts with Universities South Africa. These networks together with those from my previous positions in Canada and the United States proved invaluable in sustaining my administrative and intellectual sanity.

Fourth, it is imperative to develop a deep appreciation and respect for the values of shared governance. Embracing and practicing shared governance is hard enough among the university’s internal stakeholders comprising administrators, faculty, staff, and students. It’s even more challenging for the external stakeholders including members of governing boards external to the academy. This was one of the biggest challenges I faced at USIU-Africa as I’ll discuss in a later installment.

Fifth, it is critical to appreciate the extraordinary demands, frustrations, opportunities and joys of leadership in African universities. Precisely because many of these universities are relatively new and suffer severe capacity challenges in funding, facilities, qualified faculty, and well-prepared students, they offer exceptional opportunities for change and impact. Again, as will be elaborated in a later section, I derived levels of satisfaction as vice chancellor that were higher than I had experienced in previous positions at much older and better-endowed Canadian and American institutions, where university leaders are often caretakers of well-oiled institutional machines.

Sixth, during my long years of university leadership at various levels I had cultivated what I call the 6Ps: passion for the job, people engagement, planning for complexity and uncertainty, peer learning, process adherence, and partnership building. This often encompasses developing a personal philosophy of leadership. As I shared during the interviews for the position and throughout my tenure, I was committed to what I had crystallized into the 3Cs: collaboration, communication and creativity, in pursuit of the 3Es: excellence, engagement, and efficiency, based on the 3Ts: transparency, trust, and trends.

Seventh, it is important to pursue what my wonderful colleague Ruthie Rono, who served as Deputy Vice Chancellor during my tenure, characterized as the 3Ps: protect, promote, and project, in this case, the mission, values, priorities, and interests of the institution as a whole, not sectarian agendas. She often reminded us that this had been her role as Kenya’s ambassador to several European and Southern African countries during a leave of absence from USIU-Africa: to safeguard Kenya’s interests. Unfortunately, outside the management team this was not always the case among the other governing bodies, as will be demonstrated later.

Eighth, as an administrator one has to balance personal and institutional voices, develop an ability to forgive and forget, and realize that it’s often not about you, but the position. Of course, so long as you occupy the position what you do matters; you take credit and blame for everything that happens in the institution even if you had little to do with it. Over the years as I climbed the escalator of academic administration, I confronted the ever-rising demands and circuits of institutional responsibility and accountability. You need to develop a thick skin to deflect the arrows of personal attack without absorbing them into your emotions. You need to anticipate and manage the predictable unpredictability of events.

Ninth, I had long learned the need to establish work balance as a teacher, scholar, and administrator. In this case, as an administrator I taught and conducted research within the time constraints of whatever position I held. I did the same during my time as vice chancellor. I taught one undergraduate class a year, attended academic conferences, and published research papers to the surprise of some faculty and staff and my fellow vice chancellors. I always reminded people that I became an academic because I was passionate about teaching and research. Being an administrator had actually opened new avenues for pursuing those passions. I had a satisfying professional life before becoming vice chancellor and I would have another after I left.

There was also the question of work-life balance. Throughout my administrative career I have always tried to balance as best I can my roles as a parent, husband, friend, and colleague. Moreover, I maintained outside interests: my love for travel; the creative, performing and visual arts; voracious reading habits developed in my youth over a wide range of subjects and genres; not to mention the esthetics of cooking, the joys of eating out, and taking long walks. I found my neighborhood of Runda in Nairobi quite auspicious for the invigorating physical and mental pleasures of walking, which I did every day for more than an hour during weekdays and up to two hours on weekends.

Not being defined by my position made it easier to strive to perform to the best of my ability without being consumed by the job, or becoming overly protective of the fleeting seductions of the title of vice chancellor. I asked colleagues to call me by my first name, but save for one or two they balked, preferring the colorless concoction “Prof.” Over the years I had acquired a capacity to immerse myself in and enjoy whatever position I occupied with the analytical predisposition of an institutional ethnographer. So I took even unpleasant events and nasty surprises as learning and teachable moments.

This enabled me to develop the tenth lesson: leave the position when you have given your best and still have the energy to follow other positions or pursuits. When I informed the Board of Trustees, Chancellor, and University Council, fourteen months before the end of my six-year contract, that I would be leaving when it expired, some people within and outside USIU-Africa, including my fellow vice chancellors, expressed surprise that I was not interested in another term.

The fact of the matter is that the average tenure of university presidents in many countries is getting shorter. This is certainly true in the United States. According to a 2017 report on the college presidency by the American Council on Education, while in the past presidents used to serve for decades—my predecessor served for 21 years—“The average tenure of a college president in their current job was 6.5 years in 2016, down from seven years in 2011. It was 8.5 years in 2006. More than half of presidents, 54 percent, said they planned to leave their current presidency in five years or sooner. But just 24 percent said their institution had a presidential succession plan.” Whatever the merits of longevity, creativity and fresh thinking are not among them!

A major reason for the shortening of American university presidencies was given by William H. McRaven, a former military commander who planned the raid that killed Osama bin Laden, as he announced his departure as chancellor of the University of Texas system after only three years: the job of college president, along with that of the leader of a health institution, is “the toughest job in the nation.” In my case, there was a more mundane and compelling reason. My wife and I had agreed before I accepted the position that I would serve only one term. Taking the vice chancellorship represented a huge professional and financial sacrifice for her.

By the time I assumed the position, I believed I had acquired the necessary experiences, skills and mindset for the pinnacle of university leadership. Over the next six years I experienced the joys and tribulations of the job in dizzying abundance. This was evident almost immediately.

Two days after we arrived in Nairobi, we were invited to the home of one of my former students at Kenyatta University and the University of Illinois. Both he and his wife, whom we had known in the United States from the days they were dating, were prominent public figures in Kenya; she later became a cabinet minister in President Kenyatta’s administration. We spent New Year’s Day at their beautiful home together with their two lovely and exceedingly smart daughters and some of their friends and relatives, eating great food including meat roasted in the Kenyan style. It was a fabulous welcome. We felt at home.

But the bubble soon burst. Hardly two weeks later, our home in the tony neighborhood of Runda was invaded one night by armed thugs. I was out of town at a university leadership retreat. My wife was alone. While she was not physically molested, she was psychologically traumatized. So was I. The thugs went off with all her jewelry, including her wedding ring, my clothes and shoes, and our cellphones and computers. My soon-to-be-finished book manuscript on The Transformation of Global Higher Education was on my stolen computer. It was a heinous intellectual assault.

Our Kenyan and foreign friends and acquaintances showered us with sympathy and support. Some commiserated with us by sharing their own stories of armed robbery, what the media called, with evident exasperation, “Nairobbery”. We later learnt there was more to our hideous encounter: the specter of criminal xenophobia. It was a rude awakening to the roller coaster of highs and lows we would experience over the next six years of my tenure as Vice Chancellor of USIU-Africa.

Both of us had fought too many personal, professional, and political battles in our respective pasts to be intimidated. We were determined to stay, to contribute in whatever way we could to higher education in our beloved motherland.


Rethinking the Internationalization of African Universities Post-COVID-19

For the developed countries, higher education internationalization is part of their arsenal of global soft power, while the developing countries value it for its potential to build high-quality human capital.

African universities have had complex and contradictory experiences with internationalization rooted in colonial and postcolonial histories and the international division of intellectual labor. Africa’s insertion and positioning in the global knowledge economy have largely been characterized by marginality spawned by asymmetrical and unequal relations with the global centers. This leads to a paradox. Because of the histories of colonial and postcolonial dependence and underdevelopment, African higher education is simultaneously highly internationalized and the most internationally peripheral system.

Some of the challenges and opportunities for higher education internationalization in general and for African universities specifically have been exposed by the COVID-19 pandemic. In this presentation, I would like to explore possible new directions that may develop in the post-COVID era. I have argued in a series of essays and books, including most recently, The Transformation of Global Higher Education: 1945-2015 (2016) and Africa and the Disruptions of the Twenty-first Century (2021) that the motivations and rationales of higher education internationalization are as multifaceted as the institutional contexts and actors are diverse.

Some of my remarks in the next few paragraphs are derived from a chapter in my latest book. For the rest of the presentation, I seek to explore, rather tentatively as the pandemic continues to ravage economies, health care systems, organizations and institutions (including universities) around the world, the systemic fractures, hierarchies, and inequalities in higher education internationalization, and some of the opportunities for imagining and investing in different processes, patterns, and practices that could be more equitable and mutually beneficial for African countries and universities and their global partners.

Anatomy of Internationalization

The incentives and justifications for higher education internationalization are variously articulated at national, sectoral, and institutional levels. They include national development and demographic imperatives. For decades internationalization has followed the historic trails and reproduced the unequal flows of people, programs, and paradigms between the global South and the global North, the polite bureaucratic terminology for the old imperial metropoles and their former colonial dependencies.

For the developed countries, higher education internationalization is part of their arsenal of global soft power, while the developing countries value it for its potential to build high-quality human capital. For example, it was reported in October 2021 that “France is strengthening partnerships with the African higher education sector – a move perceived to be part of a broader strategy of higher education diplomacy, or ‘soft power’ aimed at strengthening African alliances to serve France’s cultural, economic and political agendas.” Among the initiatives launched were “a five-year €130 million (about US$151 million) fund to support African digital start-ups through the Digital Africa initiative… a three-year €30 million fund for democracy in Africa, in which African universities could play an important part. In addition, a ‘House of African worlds and diasporas’ will be established.” All paltry gestures and homage to the tired discourse of foreign aid.

For its part, also in October, “The British Council has announced the names of nine South African universities that were awarded grants under the Innovation for Africa project, which is part of its Going Global Partnerships program… The project, which is university-driven and learning- and networking-focused, is designed to support the development of Africa-UK university partnerships that build institutional capacity for higher engagement in entrepreneurship.”

Demographic pressures vary but reinforce these demands. Internationalization provides an important outlet for excess and specialized demands for higher education from the emerging economies and countries of the global South with their bulging youthful populations. At the same time, in the increasingly aging countries of the global North, importing students from the global South is critical to universities facing a youth demographic squeeze.

For the higher education sector, there is an assortment of economic, political, social, cultural, and academic imperatives and rationales. Economically, internationalization is often justified in terms of preparing students for careers in a globalized economy, enhancing national development and competitiveness, and as a means of generating extra-institutional income. Politically, it is valorized for promoting international understanding, global citizenship, and intercultural competency in an increasingly polarized and dangerous world. Its sociocultural imperative lies in the need to cultivate intercultural literacy in progressively multicultural societies.

There are also specific institutional rationales. Many universities pursue internationalization for financial reasons, as a critical revenue stream, since foreign students tend to be charged higher fees than local students are. It is maintained that internationalization facilitates inter-institutional co-operation, competition, and comparison, which can enhance the quality of higher education by compelling institutions to meet or rise to international standards. In a globalized world, internationalization is seen as an indispensable part of institutional recognition and branding, an essential attribute and asset in the intensifying competition for talented students, faculty, resources, and reputational capital among universities within and among countries.

Surveys conducted by the International Association of Universities, including the latest, undertaken in 2018, show that more than 90% of respondents mention internationalization as a priority in their strategic plans. The most important expected benefits of internationalization included “Enhanced international cooperation and capacity building” and “Improved quality of teaching and learning.” The main institutional risk identified was that “International opportunities [were] accessible only to students with financial resources,” followed by “Difficulty to assess/recognize quality of courses/programs offered by foreign institutions,” and “Excessive competition with other higher education institutions.”

At the societal level, the main risks to internationalization were “Commodification and commercialization of education programs” as well as “Brain drain.” Institutional leaders were seen as the main internal drivers, while externally it was “Business and industry demand”, “Demand from foreign higher education institutions”, and “Government policy”. As for obstacles, they include “Insufficient financial resources,” followed by “Administrative/bureaucratic difficulties,” “Lack of knowledge of foreign languages,” and “Difficulties of recognition and equivalences of qualifications, study programs and course credits.”

In all regions the most critical internationalization activity remains student mobility, followed by strategic partnerships and international research collaboration. Many institutions have identified geographic priorities for internationalization. To quote the report, “Europe and, to a lesser extent, North America are considered priority regions by all other regions, while Asia & Pacific is the top priority region for North American HEIs and the second most important region for European HEIs. Africa and the Middle East were identified as priority regions only by their own institutions who replied to the Survey.” Sehoole and Lee show that not all internationally mobile African students want to go to the global North; many prefer the comforts of familiarity, studying in other African countries that offer quality education, such as South Africa. Overall, North American responses in the IAU report diverged from the global norm, and there were some differences within regions as well.

Historically, higher education internationalization has entailed the flows of people, programs, and paradigms. When it comes to people, the focus has largely been on student flows, and far less on faculty mobility. Globally, student mobility has been decidedly unequal. According to UNESCO Institute for Statistics datasets, the number of outbound international students rose from 4.8 million in 2015 to 6.1 million in 2019, the latest year for which data is available. It is important to note that internationally mobile students represent a small proportion of tertiary students, a mere 2.2% in 2015 and 2.6% in 2019.

The largest share of internationally mobile students is claimed by Asia: 2.4 million, or 50.2% of the world total, in 2015, rising to 3.1 million (51.8%) in 2019. Europe came next, accounting for 919,192 (19.2%) in 2015 and 1,001,931 (16.52%) in 2019. Africa accounted for 510,491 (10.7%) in 2015 and 577,308 (9.5%) in 2019. The shares for Latin America were 285,451 (6.0%) and 386,532 (6.4%), and for Northern America 132,161 (2.8%) and 152,541 (2.5%), respectively.

The script is flipped when it comes to inbound internationally mobile students: Europe and Northern America claimed the lion’s share. In 2015, Europe received 1.95 million international students (40.9%), while Northern America got 1.08 million (22.6%), a combined total for the two regions of 63.44% of the world total. Asia’s intake rose in absolute terms from 999,348 (29.9%) to 1.5 million (24%), even as its percentage share fell. Africa’s share declined from 226,155 (4.7%) to 224,312 (3.7%). Latin America and the Caribbean received 162,511 international students in 2015 and 239,769 in 2019. Oceania, principally Australia and New Zealand, which exported very few students (29,610 in 2015 and 31,285 in 2019), received 359,434 in 2015 and 570,180 in 2019.

In each of these regions there were notable differences. The bulk of Europe’s mobile students went to other European countries. In Asia, China, Japan, Malaysia, and the United Arab Emirates dominated. In Africa, South Africa, Egypt, and Morocco claimed a disproportionate share. Globally, the United States led followed by the United Kingdom, Australia, Canada, France, Germany, the Russian Federation, and China. Several countries benefited from the anti-immigration backlash of the Trump administration in the US and Brexit in Britain. In July 2021, the Biden administration called for “Renewed U.S. Commitment” to international education in an unprecedented joint statement issued by the Departments of Education and State that promised to reverse policies of the Trump Administration and outlined a comprehensive national strategy.

The Disruptions of COVID-19

The outbreak of COVID-19 at the turn of 2020 led to the closure of educational institutions, from primary schools to universities, around the world. Internationalization in the form of physical mobility was brought to a screeching halt. In many countries international students found themselves stranded as campuses closed, unable to go home or to study and work in their countries of residence. Some were subjected to increased racist abuse, scapegoating, xenophobia, and hate speech stoked by right-wing populist politicians and pundits who pathologized the presence and bodies of foreigners, especially those from China and other parts of Asia and the global South, as diseased and deadly vectors of the pandemic. Students planning to go abroad languished at home as their dreams of overseas study were unceremoniously postponed.

Familiar international engagements froze as travel restrictions led to cancellations and rescheduling of professional conferences, research collaborations became harder to pursue as the academic world moved to Zoom, organizations supporting the mobility of students and scholars became paralyzed, and retrenchments by cash-strapped institutions threatened the livelihoods, well-being, and careers of academics. Doomsayers predicted the end of higher education internationalization as it had developed since World War II.

The pandemic is of course still unfolding its lethal effects and consequences for global economies and societies. Many higher education institutions are still grappling with its massive devastation. However, they have also increasingly learned to manage and navigate its disruptions, to develop new modalities of teaching and learning, research and scholarship, and public service and engagement, and to reinvent themselves as hubs of innovation and entrepreneurship. The capacities to contain the crisis and plan for a more resilient, robust, and promising future vary enormously between and within countries and institutions. Overall, African universities, and smaller and poorer universities elsewhere, have been the least effective in containing the destructive effects of the pandemic and in planning for reform and transformation; many of them wishfully and desperately hope for a magical restoration of the pre-COVID-19 era.

As an administrator, I witnessed this tragic fatalism among some academics at many universities in Kenya, including my own, and elsewhere on the continent: a pervasive and perverse denial that the pandemic was destined to usher in profound changes. Keen to protect their already measly privileges, they willfully wallowed in a seductive, blinding, facile, retrogressive, and nostalgic hankering for the return of the past. The past that some long to rehabilitate was of course no golden age. On the contrary, it was characterized by inequalities, exclusions, hierarchies, and asymmetries. In the case of Africa, it reproduced the marginalization of the continent’s universities, scholarly communities, and knowledge production systems.

I have maintained that COVID-19 should be harnessed as an accelerator for much-needed institutional change. One paper identified seven key transformations that were already underway and were exacerbated by the pandemic: reduced resources, growing competition, the impact of the fourth industrial revolution, the changing nature of jobs, shifts in university demographics, growing public demands on universities, and, of course, the impact of COVID-19 itself. These changes have implications in six key areas for universities in Africa and elsewhere: rethinking the nature of academic programs; the delivery of teaching and learning; human resources; resource mobilization and utilization; institutional partnerships; and the nature of leadership skills.

In another paper, I argued that what is at stake for some universities is survival, for others stability, and for many sustainability. Institutional survival is a precondition for stability, which is essential for sustainability. Confronting the entire higher education sector is the question of its raison d’être, its value proposition in a digitalized world accelerated by COVID-19. I focused on four critical dimensions: promoting progressive digital transformation, effective leadership, strong institutional cultures, and sustainable funding for African universities. For the first two I proposed a dozen strategies each, and for the last two seven strategies each.

The twelve-point agenda for the digital transformation of universities was outlined in considerable detail in a lengthy article. In a keynote address at the quadrennial conference of the Association of African Universities, I crystallized many of my reflections, arguing that quality higher education was indispensable for Africa’s future.

My propositions are not simply based on the appealing aphorisms that opportunity is the flip side of crisis, or that a crisis is too valuable to waste. They arise out of my reading of history as a historian, which shows that moments of global emergency, including pandemics, give added impetus to trends already underway and serve to reshuffle the international pecking order and the competitiveness and sustainability of powers, socioeconomic systems, and institutions. Those who wish to recreate the past after a crisis as profound as the pandemic are doomed to remain frozen as others charge past them into the future spawned by that very crisis.

For higher education institutions the changes will include the modes of internationalization that have prevailed for decades. During the pandemic universities have generally shown, contrary to their purported aversion to change, an incredible capacity to embrace reform, using digital technologies and experimenting with new modalities of delivering teaching, research, and support services for students, faculty, and staff.

For example, progressive universities have adopted “virtual internships and exchanges, navigating shifting enrollment patterns and student markets and reconsidering best practices in student services. While a certain measure of retrenchment and re-direction is inevitable in the short term, this pandemic is also presenting opportunities for us to broaden the scope and direction of what we do in international education. Whether we welcome these opportunities or not, these changes could potentially move us in very positive and new directions, directions that we might have otherwise not realized or been slower to pursue.”

Rethinking Internationalization

I would like to propose an eleven-pronged agenda for rethinking internationalization. None of what I say is new; it simply encapsulates trends that are both old and new. First, there is need for an expansive view of internationalization that integrates internationalization abroad and at home and embraces internationalization of the curriculum and research. Internationalization of the curriculum refers to curriculum integration that maximizes “students’ learning through international and intercultural experience.” It implies international exposure that is mediated through intentional interactions with local communities that have origins in different parts of the world, and embedding global perspectives in the curriculum. “A pedagogy of encounter,” write Leask and Green, “is a powerful concept because it does not rely on mobility. There are many opportunities to engage students in intercultural and global settings in class, on campus, and in local communities.”

This is an agenda for domestic and international intercultural engagement and understanding, for “border crossing” for all students rather than the privileged few who are able to travel abroad. After all, as noted earlier, only a tiny minority of tertiary students can study abroad. In terms of research, it entails valorizing equitable, transparent, and respectful practices of knowledge production and co-creation of research projects and agendas, and penalizing exploitative, fraudulent, and unprofessional behaviors. Student research, experiential and service learning, and international and virtual internships should be incorporated.

Second, there is need to promote transformative technology-enhanced partnerships through inter-institutional collaborations and consortia that offer innovative enrollment opportunities for students, online program management, virtual and in-person internships, quality assurance, and more seamless credit transfer. The widespread adoption and acceptance of emerging digital technologies, online communication and learning platforms, and virtual scholarly engagements forced by COVID-19 should be used by universities to develop effective mechanisms for collaborative online course development, curation, and exchange through multi-institutional and transnational alliances, registries, and protocols. There are also opportunities for students to create and foster global online communities with those who share their curricular and co-curricular interests.

The development of the immersive technologies of virtual, augmented, and mixed reality offers incredible opportunities for virtual internationalization, foreign language learning, co-teaching, internships, and the provision of student services. Particularly exciting are the possibilities of promoting transnational lifelong learning, continuing education, and professional education programs for all those seeking upskilling and reskilling with globally valued shorter degrees and micro-credentials, as the changing nature of jobs and lengthening lifespans upend the three-stage cycle of life (education, work, retirement) in favor of a multi-stage cycle in which education and working life continually intersect and retirement is pushed further away.

Given the wide inter- and intra-institutional disparities in access to these technologies, it is critical to enhance the capabilities of disadvantaged institutions and communities through bilateral and multilateral forums and the mobilization of business, international and intergovernmental agencies, and philanthropic foundations and donors.

This underscores the third critical issue: how to finance sustainable and equitable internationalization among institutions and countries that sometimes have divergent economic resources and needs. In addition to aggressively sourcing external funding for international partnerships and projects, it is critical to foster candid discussions and the development of pragmatic financial models and institutional investments. It entails rethinking student aid for dispersed students and many other financial aspects of the distributed university, whose functions and impact are spread across domestic and international domains, and physical and virtual arenas.

Fourth, internationalization should focus more on building faculty and staff capabilities and collaborations, in addition to the traditional focus on student mobility and exchanges. Worldwide, international faculty constitute a tiny minority of overall faculty and staff members. The IAU report revealed that “At 20% of HEIs there are no international academic staff members at all and at 34% the percentage of international academic staff is less than 5%. Latin America & the Caribbean is the region with the smallest percentage of international academic staff while Asia & Pacific and North America have the highest.” Internationalization of faculty must be a critical part of what Jane Knight, the Canadian scholar, calls ‘knowledge diplomacy,’ a term she deems broader than “cultural diplomacy,” “science diplomacy,” or “education diplomacy”; it refers to the deployment of internationalization to build a horizontal, two-way process of bilateral and multilateral inter-institutional relations that is essential for tackling pressing global challenges.

Institutional indifference, ignorance, xenophobia, and racism often militate against the effective internationalization of faculty and staff. This is as true of countries in the global North as it is of those in the global South, as I know from personal experience, having worked over the past four decades in Canada, the US, Jamaica, and Kenya, where I was ‘othered’ on racial or national grounds. The IAU report continues, “International experience is valued at the majority of HEIs, but it is considered more an added value than a fundamental requirement, and there is still a substantial group of HEIs (30%) where international experience is not perceived as something important for hiring and promoting academic staff. International experience seems to be highly valued in the Middle East but very little so in North America. In other regions, it seems to be valued more in Asia & Pacific and Europe than in Africa and Latin America & the Caribbean.”

Universities that are serious about internationalization should seek to employ more international faculty. They should also collaborate in building each other’s human resource capacities through faculty development and leadership opportunities, in ways that are mutually beneficial and leverage their respective assets. This is to stress the importance of building partnerships from an asset-based mindset, not a deficit-based mindset, in which the focus is on the potential each party brings to the partnership, rather than the problems. This is a particularly pernicious problem in North-South partnerships in which the currency is material wealth and brand, rather than the possibilities of forging social transformation that come from connectedness, cross-fertilization, and a mutuality that encompasses equity, autonomy, solidarity, and participation.

This leads to the fifth challenge and opportunity we must confront, namely, promoting ethical internationalization. This means pursuing inclusive internationalization in line with the increasingly popular discourses of equity, diversity, inclusion, and belonging on North American and European campuses, although these are often championed more in rhetoric than in practice. We must always ask: how do we integrate students, faculty, and staff from marginalized communities, countries, and intellectual traditions in internationalization endeavors that are holistic, rigorous, robust, and transformative? The ethics of internationalization also entails sharing the results of international research, experiences, and interventions widely with the scholarly communities and societies in which the knowledge was produced, and being purposeful in making a meaningful contribution to society as part of a comprehensive knowledge management process.

Furthermore, ethical internationalization calls for what Luciane Stallivieri calls balance, which seeks mutual reciprocity and cooperation regarding “geographic location; valuing different languages for the purposes of teaching, not just English; promoting South-South partnerships to enhance the knowledge produced in other parts of the planet; and creating an equilibrium with regard to incoming and outgoing mobility. Internationalization goals should be based on equal opportunities, symmetry, equivalence of interests and a focus on the equilibrium of international activities, especially when we design our internationalization plans and policies.” This implies institutional and intellectual accountability and integrity. These efforts must be part of the larger agenda of what some call decolonizing international partnerships, stripping them of the enduring imbalances bequeathed by the histories of imperialism and colonialism.

Sixth, smart internationalization, as the renowned Ethiopian higher education scholar Damtew Teferra calls it, requires integrating internationalization into institutional missions, values, strategic plans, budgeting priorities, and culture. It entails university administrations taking the lead, the university community developing a shared understanding and commitment, engaging and mobilizing faculty and staff, and developing dual-purpose resource strategies in which existing programs are enriched by incorporating internationalization initiatives and orientations.

The issue of differential tuition for domestic and international students is part of the calculus of dumb and smart internationalization. Many universities in North America, Europe, and Australasia charge international students much higher tuition, sometimes as much as four times what domestic students pay. They rake in billions—$45 billion in the US in 2018, $15.5 billion in Canada in 2016, £22.6 billion in the UK in 2015/16, and $37.6 billion in Australia in 2018, according to official data. These revenues help many a university balance its books and subsidize domestic students.

Treating foreign students as “cash cows” raises troubling moral questions and is unsustainable. Some studies indicate that “evidence from tuition fee reforms for international students suggests that the number of international students coming to a country can fall dramatically following an increase in tuition fees. For example, Denmark (in 2006) and Sweden (in 2011) introduced tuition fees for international students from outside the European Economic Area and experienced a sharp decline in the number of international students, by 20% and 80% respectively.” However, in the United States, when the Big Ten Academic Alliance of research-intensive universities raised tuition by 29% between 2007–08 and 2014–15, they experienced “an increase of 74% in international student enrollment.”

This underscores that different markets and institutions attract different types of international students. Choudaha and de Wit distinguish between “explorers” and “highflyers,” who have resources, and “strugglers” and “strivers,” who are more price sensitive. The main Asian markets of India and China will increasingly dry up as these countries build world-class universities, following the path of earlier major exporters of students, namely Japan and South Korea. The burgeoning African market does not have the levels of wealth found in Asia to replace the declining Asian market, notwithstanding the Africa Rising/Rising Africa narrative, which finds ever fewer cheerleaders as African economies reel from pandemic-generated recessions.

Fay Patel argues that COVID-19, which led to sharp declines in international student flows and demonization of foreign students as part of pandemic panic and hysteria, opens possibilities of what she calls “humanizing international higher education.” She believes this will be achieved through the “glocalization of learning partnerships and frameworks from the developing community perspective so that compassion, kindness and empathy will form the basis for shared co-construction of knowledge in a post-COVID-19 world.” She urges international student communities to lead this agenda.

For smart internationalization to progress, the onus is on universities in the global North and global South to put their own houses in order. This entails restoring healthy and sustainable public investment in higher education, which fell into precipitous decline with the rise of neo-liberalism at the turn of the 1980s. In the United States student debt has rocketed to $1.6 trillion and is higher than credit card debt. In many African countries universities are virtually bankrupt, which manifests itself in dilapidated infrastructure, poor salaries, low morale, declining instructional quality, and depressed research productivity. The private universities that have mushroomed in recent years are often glorified high schools, while the public ones are ethnic enclaves mired in pettiness, corruption, and toxic politics, and averse to planning for the future. Internationalization must be built on the healthy trunks of robust, resilient, and responsible universities, not the flimsy reeds of unstable, indigent, insular, and intolerant institutions.

Seventh, as part of the internationalization agenda universities need to become stronger advocates for more progressive forms of globalization as the specter of de-globalization spreads its xenophobic tentacles. The pandemic has led to border closures and retreat into the imagined safety of national laagers and brutally exposed the fictions of national isolation and security as it marched relentlessly around the world across the imaginary divides of race, class, gender, nationality, ethnicity, religion, and other social inscriptions. Universities must, as noted in a report by the UNESCO International Institute for Higher Education in Latin America and the Caribbean issued in March 2021, take “active responsibility for our common humanity, promoting well-being and sustainability, drawing strength from intercultural and epistemic diversity, and upholding and creating interconnectedness at multiple levels.”

Universities have a more immediate sectoral interest in developing systems for easier credit transfer and qualification recognition frameworks, especially as transnational multi-institutional partnerships expand. Various regional Higher Education Areas have been created. In November 2019, UNESCO adopted the Global Convention on the Recognition of Qualifications concerning Higher Education, marking “the first United Nations treaty on higher education with a global scope. [It is] designed to facilitate international academic mobility and promote the right of individuals to have their higher education qualifications evaluated through a fair, transparent, and non-discriminatory manner… Moreover, it promotes the recognition of refugees’ qualifications, even in cases where documentary evidence is lacking.”

The latter follows the European Qualifications Passport for Refugees, which “provides a tested method for assessing and describing qualifications that cannot be adequately documented.” This was an important move given the scale of global internally displaced and refugee flows, which reached 82.4 million in 2020, double the number in 2010 and higher than ever before. One of the initiatives I am most proud of during my tenure as Vice Chancellor at USIU-Africa was the $63.2 million scholarship program with the Mastercard Foundation, which earmarked 25% of the places for forcibly displaced youth. Similar programs exist at several European and North American universities.

Eighth, there is need to interrogate the impact of rankings, which have become increasingly influential and ubiquitous. They have helped sanctify global academic capitalism by generating fierce competition among universities in a relentless race for reputational status, one that validates the hegemony of institutions in the global North and the primacy of research, often at the expense of teaching and learning and social impact. To be sure, the Times Higher Education Impact Rankings were introduced a few years ago to assess universities against the United Nations’ Sustainable Development Goals; their indicators compare institutions across four broad areas: research, stewardship, outreach, and teaching. However, the traditional rankings, in which the top performers trade places among themselves, still hold sway.

Ninth, there is need to probe the growth of international research and its implications, as evident in the expansion of publications with international co-authors and increased funding for international research through grants from international organizations, national agencies, and universities’ own resources. Commenting on an EU report titled Benefits and Costs of Transnational Collaborative Partnerships in Higher Education, Orosz and Craciun note, “There is plenty of evidence that international research collaborations result in more and better publications and patents.”

What are the prospects for forging new modes of academic hospitality that promote productive intellectual exchange, creativity, and the symmetrical and collective pursuit and production of knowledge? The recent collapse of the prominent academic partnership between Yale University and the National University of Singapore, coming on top of the restrictions faced by other major American universities in China and growing distrust of Chinese academics in the US, has raised questions about the sustainability of such collaborations in the face of geopolitical tensions and rivalries.

Tenth, the phenomenon of international program and provider mobility (IPPM), which constitutes a crucial leg of the internationalization triad, needs to be critically assessed. The growth of IPPM is evident in the establishment of “international branch campuses, international joint universities, joint and double degree programs, franchise arrangements and distance education.” In the UK, for example, 52% of all international students enrolled in a UK qualification-awarding program take some of their program through IPPM provision. In IPPM host countries such as Mauritius, approximately 43% “of all local students are enrolled in some type of IPPM. Meanwhile, in Malaysia, Singapore, Hong Kong and Botswana 20 to 30 per cent of all local students are enrolled in IPPM courses.” Thus, programs and providers are increasingly moving to students rather than the other way around.

Unfortunately, the growth of IPPM retains and reproduces the enduring asymmetric flows between the global North and the global South. In the words of a report by the American Council on Education issued in 2014, “among the institutions surveyed, enrollment in joint and dual degree programs administered by U.S. institutions is heavily skewed towards students from the partner country; participation of American students is limited, and study participants were not optimistic that this situation is likely to change. Given this imbalance, collaborative degree programs may be more of a proxy for recruiting international students and are likely to contribute to the continuing ‘imbalance of trade’ in outward and inward flows of students. The data also reveal geographic imbalances; partner institutions are concentrated in Europe and Asia, with almost no representation in Africa. These issues are worthy of note by institutions that wish to engage their U.S.-based students in international education and to establish truly collaborative, reciprocal relationships with a diversity of international partners.”

The report concludes: “Comprehensive, integrated internationalization requires a holistic approach that includes attention to curriculum, faculty engagement, leadership, strategic planning, and other key areas. Partnerships that not only allow credit transfer back and forth, but truly engage faculty, staff, and students from both institutions around substantive issues such as curriculum design and instructional philosophies have the greatest potential to advance internationalization on all these fronts. Whether through joint degrees or other program models, establishing such in-depth collaborations will help U.S. institutions lay the groundwork for mutuality and sustainability in their international partnerships, and advance not only the internationalization of U.S. higher education but also the global higher education enterprise as a whole.” I couldn’t agree more.

Let me conclude by sharing brief reflections on an important aspect of internationalization, the eleventh item on my agenda. As you might know, or would appreciate from my personal and professional biography, as an African diaspora academic I put a great premium on the role of the diaspora as an indispensable global partner in Africa’s development drive, including in higher education. Not only have I written extensively on the subject; my research on the African academic diaspora led to the establishment of the Carnegie African Diaspora Fellowship Program (CADFP), whose success has exceeded expectations.

For example, to date the program has sponsored nearly 500 African-born academics in the United States and Canada to work with dozens of universities in six selected African countries. I firmly believe the African academic diaspora, both the new and the historic diasporas, represents a huge intellectual asset for the development and globalization of African knowledge economies. However, as is true in other realms of international relations, the relationships between academics on the continent and those in the diaspora are exceedingly complex.

As I have noted in my own work, and as is clear in the recently published African Diaspora Toolkit, these relations are characterized by collaboration and conflict, misunderstanding and mutuality, inclusivity and indifference. They are conditioned by historic and prevailing institutional, intellectual, and ideological structures, systems, and values, as well as by individual behaviors and attitudes, all of which must be clearly understood and continually negotiated and cultivated to ensure reciprocal benefits. African diaspora partnerships must be embraced as an integral part of the internationalization of African higher education. This is equally true of the global diasporas of other world regions.

Long Reads

Conservation: What Do John Muir’s Writings and Thoughts Mean Today?

Conservation practice today is based on settler colonialism because it is invariably led by the needs, sentiments and aspirations of outsiders. This is a fundamental flaw that began with European immigrants to North America.


John Muir is widely acknowledged as the “father” of conservation thinking, indeed considered by many to be a conservation hero whose standing straddles two centuries. Whether or not that is justified is a different issue, but the pinnacle of this adulation is probably the reference to him as the “Father of the National Parks”. A historical examination of the global network of protected areas shows that the original US national parks, like Yellowstone, were the inspiration behind the establishment of similar structures all over the world.

From a personal perspective, a number of different things inspired my childhood fascination with wildlife and wild places. A major one was the wonderful hardcover book “The American Wilderness” by John Muir, a somewhat unlikely read for the typical African child, but a product of my having spent part of my childhood in the United States. Others were the romanticized and shallow portrayals of African wildlife in films like “Born Free” and “Serengeti Shall Not Die” by Joy Adamson and Bernhard Grzimek, respectively, which were made in the 1960s but really became staples in the 1970s. One theme that has endured from the turn of the 20th century into the beginning of the 21st has been the dominance of white people in the global conservation narrative.

When we read the works of John Muir today, what stands out to critical observation is his gift of expression and passion for the wilderness he describes. However, therein also lies the deep malaise that grew from charismatic men like him and their gift for imparting their beliefs. Conservation practice around the world today is based loosely on the fortress conservation model developed in 19th-century North America. History books reveal portraits of a society that had little or no place for any human perspective that wasn’t Christian, male, and white. These dominant perspectives were enforced by continuous violence perpetrated on Native American nations, women, black slaves brought from Africa, and Hispanic people from further south.

One of Muir’s more famous works, Our National Parks, was published in 1901 and caught the attention of President Theodore Roosevelt. The two corresponded regularly, and in 1903 Roosevelt visited Muir and they undertook a camping trip in Yosemite that is still considered the “most significant camping trip” in history. There, together, beneath the trees, they laid the foundation of Roosevelt’s expansive conservation programmes. This was the beginning of the global protected area network, driven largely by hubris and the need for self-actualization through the purported protection of “nature”. The convergence of ideas between these two famous men also conferred acceptability on the notion of a “pristine wilderness” devoid of human presence. The seeds of racism in conservation practice were also sown during this time by Muir, who regarded Native Americans as “unclean”, something of a stain on the pristine wilderness that was Yosemite.

The perceptions of European colonists around the world determined what part of biodiversity was worth killing and what was worth saving, resulting in a strange situation where they simultaneously occupied the place of killers and that of “saviours”. Even today, where the intrinsic values of biodiversity seem to be indelibly stained by the needs of tourism, the desires and aspirations of white people continue to influence what in nature is to be eliminated as vermin, what is to be hunted for prestige as trophies, and what is precious enough to be protected through the use of force and violence.

In some cases, species like the African elephant are hunted by white hunters for prestige while being protected by white saviours from black “poachers”. Indeed, conservation practice is one of the few facets of human endeavour where duplicity is accepted as the norm, with endangered wildlife protected with military might while those who are wealthy enough gladly pay for licences to kill it for fun. Muir’s contemporaries, like Teddy Roosevelt, were extraordinarily proficient killers of wildlife, yet are celebrated as pioneers of conservation. To fully understand the true import of John Muir’s story and legacy, conservation scholars ought to delve into American history to understand the context within which he was living and marvelling at this “beautiful wilderness” in which he found himself.

First was the creation of Yellowstone National Park in March 1872. This park is widely acknowledged to be the foundation upon which the development of protected areas as a conservation tool grew into the widely held paradigm we see today. We know very well what Yellowstone is in the history of America as written by the European colonists, but what is Yellowstone on the ground? A two-million-acre expanse of land from which the ancestors of the Kiowa and Crow Nations were excluded in order to provide a recreation area for the settler colonists. Conservation was and still is an integral cog in the wheel that is colonialism, because it “erases” indigenous people from landscape and lexicon. Even today, the history of Yellowstone as detailed on the US National Park Service website contains no account of the Native Americans’ role in the history of the park. The history of Yellowstone National Park is therefore resolutely “white”.

The efficacy of terrestrial wildlife conservation is widely (and erroneously) stated by science to be proportional to the geographical size of the “protected area”. Conservation science knows, but never acknowledges, that the geographical size of a protected area is also directly proportional to the degree of violence required to establish and maintain it. This thinking is also an inadvertent admission that present-day conservation science is based on settler colonialism, primarily the notion that wildlife cannot sustainably share landscapes with humans. Yet the sharing of habitats, landscapes, and resources with wildlife is an ancient and present reality of indigenous populations all over the world. It is a widely accepted fact that human populations, consumption patterns, and carbon footprints have changed irreversibly over the centuries. However, humans are a unique species in that we are able to change the carrying capacity of our habitats through behaviour. When we examine this capacity for behaviour change through the prism of natural resource use, it is conservation practice in its purest form, the very essence of civilization.

By their own historical accounts, settler society in North America was still a very primitive and violent society in the 18th and 19th centuries, the gun having become an essential part of everyday life for people everywhere: on farms, in the wild, and even in urban settings. The Hobbesian nature of North American settler society of the period gave rise to the Second Amendment to the US Constitution, giving every citizen the right to bear arms. This was a self-justifying law, because the right to bear arms in itself gave rise to the overarching “self-defence” justification for bearing arms.

This is the world in which John Muir distinguished himself through his appreciation of wilderness and natural spaces. Through that appreciation and his gift for writing, he managed to bring nature “home” for the white settler population, gathering a vast following in the process. The amount of interest in nature that Muir managed to garner was also a reflection of the entitlement to it felt by the European settler population. The settlers’ admiration for nature made no reference to the Native American nations that they found living with, and using, these resources. The universal admiration for national parks also reveals ignorance, or tacit acceptance, of the fact that protected areas are almost universally created through acts of violence and disenfranchisement. The peace and tranquillity that protected areas represent is actually the result of areas being “cleansed” of the indigenous people who used them, an act that in itself reduces the owners of resources to the level of a nuisance to be removed or otherwise dealt with.

Racism is very difficult to deny or escape in conservation literature over the generations, and Muir wasn’t immune to this difficulty. His writings reveal a grudging admiration for the ways in which the Native Americans lived off the land on naturally available resources while leaving a very light footprint on the landscape. Muir’s own limitations in this regard were brought into sharp relief by his inability to find sufficient food during his forays into the wilds of Yosemite, the main constraint on the amount of time he could spend out in the wild. He was conflicted in the manner in which he regarded nature as clean and pristine, while regarding the natives who blended so well into it as “unclean”.

For all his typically detailed and expressive writing, Muir studiously avoids mentioning any positive interactions with, or assistance from, the Native Americans, although he does describe interactions and conversations with Chinese immigrants. This starkly illustrates conservation’s greatest prejudice—the disregard for indigenous peoples. Muir could not plausibly have spent so much time exploring, sketching, painting, and describing Yosemite without regularly encountering its original residents. This was a deliberate effort to erase them from the Yosemite narrative, and it is a sad testament to the culture of conservation that this erasure was accepted without question for over a century.

Once we understand the attitudes of John Muir and European settler colonialism, the pall of racism that stains his writings becomes more visible, for instance in the way he refers to the Natives’ knowledge of their environment as “instinctive behaviour”, a description commonly used in zoology to describe the behaviour of various wildlife species. In addition, there is the conflicted manner in which one who considers himself a naturalist grudgingly admires indigenous knowledge, while stating that it is beyond the remit of what he describes as “civilized whites”.

Another vital lesson that contemporary conservation scholars should draw from Muir’s descriptions of his experiences is the nexus between western Christianity and conservation’s attendant prejudices. Muir grew up in a staunchly Christian family, and the spiritual tone is very “audible” in his writing, particularly when he describes beautiful features and landscapes. More telling are the repeated references to the Native Americans as “unclean”. He expressly refers to “dirt” on their faces, but it is well known that face painting is an integral part of the culture of many Native American nations. Moreover, personal hygiene was never an important attribute of the 19th-century “civilized white” outdoorsman. This was more a reference to “heathens” in the biblical sense—recognizing them as human, but unable to attain the imagined level of “cleanliness” upon which he placed “civilized whites” and nature.

In order to understand the impact (or lack thereof) that John Muir had on the behaviour of the “civilized whites” of his time, we need to examine the state of biodiversity in North America at the time. His lifetime (1838–1914) straddled a period of precipitous decline in wildlife species that had astounded settlers by their sheer abundance. Bison roamed the prairies in their tens of millions until the mid-19th century, but were hunted down to fewer than a hundred animals by the 1880s. They were killed for their tongues and hides, with the rest of the animal left to rot. Even by contemporary standards, the sheer destructiveness and bloodlust defy belief. Some “heroes” like William “Buffalo Bill” Cody are still celebrated today for having single-handedly killed thousands of bison.

Two notable declines of avifauna in North America also occurred during Muir’s years, namely those of the ivory-billed woodpecker and the passenger pigeon—species that neither posed any danger to, nor were in direct conflict with, human populations. The passenger pigeon’s decline was far more precipitous, and the species finally went extinct when the last specimen died in captivity in 1914. To put this into context, the passenger pigeon was once the most abundant bird in North America, with a population estimated at around 5 billion. Passenger pigeons were hunted by the Native Americans for food, but the precipitous decline and eventual extinction were driven by the arrival of Europeans and hunting for food on a commercial scale. The ivory-billed woodpecker was also decimated by hunting, probably driven by the fact that it was a large, brightly coloured, and highly visible bird.

The casual destruction is starkly illustrated by reports that in the late 1940s, when the bird was critically endangered, some hunters and fishermen were still using ivory-billed woodpecker flesh to bait traps and fishing hooks. Given the low level of technological advancement at the time, these extirpations demonstrated an extraordinary commitment to killing, one that indigenous populations around the world are psychologically equipped neither to understand nor to perpetrate.

The unique neurosis of conservation interests and naturalists is notable in the fact that the final precipitous decline of both these species was driven by collectors who went out to shoot specimens for private and institutional collections. The inexplicable pride that collectors take in writing scientific papers about the “last collected” specimen is alien to indigenous people, who do not hunt species to extirpation because, even where they are hunters, an animal reduced to a few specimens is no longer worth the energy and time required to hunt and kill it. For a “conservation scientist”, however, when there are only two left, there is an irresistible compulsion to find them, kill them, and collect them. The reward is fame through media coverage, scientific publications, promotion in academia and, bizarrely, conservation grants to help “prevent extinction ever happening again”.

The hundreds of skins of these decimated species that remain in the collections of universities and museums bear silent witness to the wanton destruction done in the name of conservation “science”. So the most important question of our times is this: if we claim that John Muir inspires our conservation work more than a century after his passing, why wasn’t he able to stop the mass extinctions precipitated by the destructive nature of his fellow “civilized whites” in North America during his lifetime? It is very difficult to find any record of his trying to do so, despite all his lofty social connections. The lack of concern among conservationists over the actions of white people obviously isn’t a new phenomenon.

Conservation being a “noble obligation”, therefore, set the stage for the exclusion of the proletariat and the celebration of the nobles, regardless of what they actually do. Our state of knowledge in 2021 needs to acknowledge that the sustainable use of natural resources is the cornerstone and a necessary preoccupation of indigenous civilizations all over the world. This is incontestable, because the indigenous societies that didn’t have this culture died out, leaving only ruins as evidence of their past existence. The fact that wildlife and biodiversity still exist side by side with these societies implies that conservation was, and still is, part and parcel of their cultures and livelihoods. In over 20 years of research and practice in wildlife conservation policy, I still haven’t encountered a word for “conservation” in any African language, because it was simply a principle governing how people lived their lives. The word used most often in Kiswahili, for example, is uhifadhi, which translates more accurately as “keep”, more akin to an item kept on a shelf than to a living system with producers and consumers.

Therefore, conservation as a structured, abstract, and discrete concept is a creation of destructive people, those who need to create barriers to their own consumption, which extends far beyond the remit of their needs and into the realm of wanton destruction. It is a concept that lives very comfortably with contradiction and duplicity, for example the protection of wildlife not for its intrinsic value but in order to satisfy the desires of those who seek dominion over it. This fundamental flaw is the reason why wildlife conservation in Africa has been unable to escape from racism and violence to this day. Its mindset grew from the thoughts, imaginations, and deeds of Theodore Roosevelt and his bloodthirsty harvest of wildlife during his 1909 Kenya safari. The animals he brutally killed for his self-actualization are still displayed in the Smithsonian National Museum of Natural History, which sponsored his trip and still celebrates this slaughter to this day.

Many euphemisms have been used to describe his trip, including high-brow terms like “expedition” and “scientific collection”, but the truth is that he was neither a scientist nor an explorer. He was a wealthy American seeking self-actualization and the demonstration of dominance through the slaughter of African wildlife. This is still the profile of westerners’ involvement in African conservation in the 21st century, the only notable differences being that some of them are now women and some come to “save” rather than slaughter our natural heritage.

In the story of John Muir, great importance is attached to his association and closeness with Theodore Roosevelt, which of course is perceived as significantly bolstering his already considerable credentials as a conservationist. So who was “Teddy” Roosevelt, and what does he mean to those of us seeking harmony with our natural heritage today? He was a picture of the self-interested need to possess and dominate nature that so often masquerades as love for nature and wildlife. This sentiment is also a very comfortable redoubt for racism, because it instantly places indigenous people living in situ in the position of “obstacles” to conservation and the survival of ecosystems. Roosevelt was an unabashed racist, and conservation will continue to suffer until it can escape from the intellectual clutches of its prejudiced icons.

Racism amongst individuals wasn’t remarkable in early 20th-century America, but scholars of conservation today must pause for thought at the manner in which racists are accepted and celebrated by individuals and institutions in our field. The American Museum of Natural History in New York features a larger-than-life bronze statue of Roosevelt at its front entrance. He is astride a magnificent horse, flanked by two people on foot: a Native American and a black man. The location of the statue implies appreciation of Roosevelt as a conservationist, but the statue itself portrays him as a conqueror of sorts, giving a nod to white supremacy. We in conservation accepted it and never once questioned its message. It took the social upheavals and racial tensions of 2020 to commence discussions around its removal, almost 80 years after it was erected.

There are many cases in history where racists escaped odium because their prejudices were closeted. Theodore Roosevelt wasn’t of that ilk. He described Native Americans as “squalid savages” and justified the taking of their lands to spread white European “civilization”. It isn’t difficult to imagine that this sentiment played an important part in his affinity for John Muir, especially since the creation of national parks was a faster and more effective way to appropriate lands belonging to indigenous people. It still is, all over the world. The following are excerpts from his spoken and written words. One of the most starkly racist statements from a world leader was probably Roosevelt’s expressed opinion on the lengthy genocide perpetrated by European settlers on the Native American population: “The most righteous of all wars is a war with savages, though it is apt to be also the most terrible and inhuman . . . A sad and evil feature of such warfare is that whites, the representatives of civilization, speedily sink almost to the level of their barbarous foes.”

It isn’t a stretch of the imagination to perceive the part this sentiment played in the mutual admiration between Muir and Roosevelt. Muir’s view of Native Americans as “unclean” would have been bolstered by the unequivocal support of a racist president. This, in turn, would have greatly enhanced the moral acceptability of the violent eviction of Native Americans to make room for the National Parks that are so celebrated today.

Roosevelt’s racism wasn’t restricted to the natives. This was his opinion on the annexation of Texas in 1845: “It was of course ultimately to the great advantage of civilization that the Anglo-American should supplant the Indo-Spaniard.” His opinions on black people probably give us the most pause for thought when we examine the spread of fortress conservation around the world. On slavery, Roosevelt said: “I know what a good side there was to slavery, but I know also what a hideous side there was to it, and this was the important side.” He also believed that the “average Negro” was not fit to take care of himself, and that this was the cause of what he referred to as “the Negro problem”.

It isn’t debatable that Theodore Roosevelt was a racist, and racism isn’t a new malaise afflicting societies around the world, but it is imperative that we ask ourselves why conservation is the one field that can make a racist acceptable and even celebrated around the world. Judging from the views he aired in the public domain, Roosevelt’s bigotry bordered on white supremacy, yet he was a much-admired figure during and after his presidency, receiving many accolades, including the 1906 Nobel Peace Prize. In the 100 years since his death in 1919, he has been relentlessly celebrated as a champion of nature with a slavish devotion that only conservation and religious cults are able to inspire. He is praised for “setting aside” land for National Parks, even though the land didn’t belong to him and the process violated Native Americans’ rights.

In over 20 years of research and practice in wildlife conservation policy, I still haven’t encountered a word for “conservation” in any African language.

Conservationists even celebrated the centenary of his slaughter of Kenyan wildlife, somehow seeing it as the inspiration for their conservation work. The safari was meticulously documented in photographs and journals, but through it all, Africans were conspicuous by their absence from the narrative. The photographs depict hundreds of black porters carrying heavy loads on foot while Roosevelt and his white companions rode on horseback. The inspiration for the bronze statue outside the American Museum of Natural History is obvious. His detailed journals, narrating his bloodthirsty interaction with African wildlife without “interference” from indigenous populations, helped give rise to the myth of an untrammelled African “wilderness” devoid of human presence or influence.

Sadly, this myth remains the cornerstone of what western tourists aspire to experience. It is ludicrous that conservation science extends this fallacy even to the rangelands of East Africa, which archaeology acknowledges as the cradle of mankind and which have developed in the presence of humans for over a million years. Indigenous peoples have been vilified by conservationists for generations as “primitive”, “unclean” and “uncivilized”, but it is now time for us to acknowledge the truth and confront the fact that those dubious distinctions are actually features of conservationists and their chosen profession.

As we evolve into more intellectually astute or “civilized” practitioners of conservation, we will have to look at how attitudes have been shaped by Christianity in nations where the settler colonialists included Christian missionaries. The dominion of man over other creatures is a well-known Christian tenet, and dominion over the same in foreign lands is an expression of expansionism and imperialism over lands and peoples. An examination of the fortress conservation model as it has been practiced around the world, with its perceived successes and failures, is essentially a walk along the path trodden by European colonists in partnership with Western Christian missionaries.

The fortress conservation model has either died or had to remodel itself radically in those societies where Christian missionaries and colonialists found strong and structured religious traditions that preceded them. Knowledge of this history is vital to an understanding of contemporary conservation narratives.

China and India are the world’s two most populous nations. They are also home to a magnificent array of biodiversity, including iconic endangered species like the giant panda and the tiger, respectively. A common narrative in the conservation arena today is the claim that human populations are compromising the space available for wildlife, and that space needs to be “secured” for wildlife in one way or another. We have known for some time now that this is a racially biased concern because it is never raised in reference to any country whose indigenous inhabitants are white, despite the fact that some, like the UK, have relatively high population densities and hardly any biodiversity worth speaking of. Countries like my homeland, Kenya, that have a well-developed tourism industry carry the added intellectual burden of selling a spurious product which presumes to place indigenous people in the position of bystanders and props. Terms like “winning space for wildlife” are de rigueur in conservation circles, with the attendant loss of resource rights remaining unsaid.

Closer scrutiny of all the noise surrounding human population as a challenge to conservation reveals that China and India are conspicuous by their absence from this discussion. You are unlikely to hear China’s human population described on global platforms as a threat to the survival of the giant panda, or India’s population as a threat to the survival of the tiger. A similar argument could be extended to Indonesia, at the nexus of high population density, coexistence with biodiversity and low penetration of Christianity, a religion that has now been replaced by conservation as the most effective vehicle of Caucasian hegemony in the world today.

Conservation practice today is based on settler colonialism because it is invariably led by the needs, sentiments and aspirations of outsiders. This is a fundamental flaw that began with European immigrants to North America like John Muir, who presumed to claim ownership of, concern for and value in a natural heritage beyond that of the Natives in whose presence the entire ecosystem had evolved. Today, it is incumbent upon conservation practitioners plying their trade away from their homelands to correct this by understanding and accepting local people’s aspirations. We need to accept that these aspirations could include the desire for us to go away and leave them alone. We in conservation need to read more deeply in the social sciences and move away from the notion that ours is purely a field of biology. The Australian anthropologist Patrick Wolfe accurately (if inadvertently) described conservation’s neuroses in his 2006 paper, “Settler Colonialism and the Elimination of the Native”. We must realize that the challenges we are facing aren’t discrete events but a flawed, cruel and unjust structure that must be dismantled before it collapses on itself.

The foibles and faults of individuals like John Muir, therefore, aren’t the problem so much as the structure that they and others created, one that removes humans from nature and worships the resultant falsehood as a fetish. Yosemite, Yellowstone, Serengeti, Kruger, Tsavo, Corbett, Kaziranga and all the other national parks around the world are monuments to this fetish. We must value them not just for their ecological worth, but for the memories of the brutalized populations who were victims of their creation and continue to be victims of their maintenance. Anything else is patently false. We in Africa are constantly under assault from “saviours” who love African wildlife but hate African people. We must temper our celebration of conservation “icons” with knowledge of their deeds and context. If we do this, we’ll understand that someone like John Muir was just a relatively sensible member of a brutal and primitive colonial settler class who appreciated nature and sought to take it away from its indigenous owners.

The relentless need that westerners have to impose the colonial model of conservation on indigenous peoples of other races is less about concern for biodiversity and more about the need to deny the existence of the civilizations that preceded them in these lands. In my experience, one of the highest forms of civilization is the capability of living with wildlife for millennia without destroying it, and for this alone, indigenous peoples deserve to be celebrated, not vilified, displaced and occasionally killed. This regular injustice had been accepted and celebrated for the last 100 years, until completely unrelated events in 2020 reinvigorated the Black Lives Matter movement. Suddenly, global consciousness of racial injustice has been heightened, with calls for the removal of monuments to racists around the world, including the Roosevelt statue at the American Museum of Natural History.

Things came full circle in July 2020, when the Sierra Club distanced itself from John Muir, its icon and founder, admitting that he was in fact a racist. Is this new information? No, it isn’t. John Muir was one of those people who achieved “icon” status in their lifetimes, so society has always been aware of what he said and thought. Our acknowledgement today of his prejudices is simply a sign that the primitive conservation field is gradually becoming civilized. Ideally, we shouldn’t regard the legend of John Muir as an inspiration for anything we do in the future, but as a prism through which we can view the false edifice we refer to as “conservation”. It is also a glimpse into conservation’s dark past, one that we should neither forget nor repeat. Through this prism of historical knowledge, “Yosemite” is an essential historical work for anyone seeking to understand where conservation has come from, and where it should be aiming to go.
