

The Potential of Indigenous Knowledge in Building Climate Resilience


The depth of knowledge that pastoralist communities possess offers an opportunity to integrate it with scientific observations and so improve their adaptation to climate change.


“I am convinced that climate change, and what we do about it, will define us, our era, and ultimately the global legacy we leave for future generations. Today, the time for doubt has passed”.  Secretary-General Ban Ki-moon, 24 September 2007

All earlier geological epochs resulted from natural processes, but the changes currently being experienced by the planetary ecosystem are anthropogenic—human-induced. Our over-reliance on fossil fuels and unsustainable economic practices have led to irreversible climate change. The two main causes of climate change are the excessive use of carbon-based energy sources, which emit most greenhouse gases, and a consumption model that outstrips the available resources.

While wealthy nations are the largest emitters of greenhouse gases, the poor developing countries that cause the least emissions are the most affected by climate change. For instance, the African continent contributes the least to greenhouse gas emissions but is disproportionately affected by climatic changes. With 17 per cent of the global population, Africa contributes a paltry 2 to 3 per cent of CO2 emissions, while the US, with 4 per cent of the world’s population, generates 14 per cent of global CO2 emissions.

Climate change manifests mainly in rising global temperatures, which have been increasing since record keeping began in the 1880s, with 2016 and 2020 the hottest years on record and 2011-2020 the warmest decade. The warming results in extreme events and an increased frequency of natural hazards such as droughts, floods, storms, hurricanes and wildfires, often with catastrophic consequences for human life and livelihoods.

These climatic hazards pose existential threats to societies that directly rely on natural capital for their livelihoods. This is particularly the case for Africa, where the vast majority of the population depends for their livelihoods on activities that are primarily rainfall-dependent, such as rain-fed crop production and livestock rearing. These livelihood activities are highly susceptible to climate-related hazards and shocks, rendering rural households vulnerable to shifts and changes in the climate.

Climate change creates favourable conditions for the spread of diseases and the establishment of invasive alien species. For instance, areas previously not mosquito-prone due to low temperatures are now likely to become malaria zones. Since the populations in such areas lack immunity towards the disease, they are likely to be more susceptible than those who inhabit regions prone to malaria. Extreme weather patterns also cause an increase in resource-based conflicts and population displacement—the number of climate refugees is on the rise.

The response by governments—particularly those of rich nations—to address the underlying causes of climate change remains unsatisfactory. Forecasting and early warning systems remain poor in developing countries due to limited meteorological capacities, making it difficult to ascertain the extent and the impact of climate change. Local communities, on the other hand, rely on their indigenous knowledge and practice to make forecasts and adapt to the changing climate.

Indigenous knowledge and climate change 

Indigenous communities are not victims of climate change; they are holders of vast intergenerational indigenous knowledge that is critical to climate change mitigation and adaptation. With advances in science, these local knowledge systems have been disregarded as they are perceived as not based on any scientific principles. In recent decades, however, there has been a realization that local communities have developed intricate knowledge systems that are used to forecast climatic events, plan resource use, and adapt to climatic changes. Climate change and resilience experts are calling for the integration of local knowledge systems with scientific know-how to develop sustainable responses to the challenges of climate change.

Indigenous communities are not victims of climate change; they are holders of vast intergenerational indigenous knowledge that is critical to climate change mitigation and adaptation.

Pastoralist communities have borne the brunt of climate change. Although pastoralism has evolved to adapt to a changing climate, the increase in hazardous climatic events often catches pastoralist communities on the back foot. However, their sustained livestock production within marginal environments is testimony that their knowledge systems and practices have somehow enabled them to navigate the vagaries of climatic hazards.

While the pastoralist communities of northern Kenya are of different ethnic backgrounds, the basic tenets of their local knowledge systems are similar and interlinked.

The Borana are a Cushitic community that inhabits the rangelands of southern Ethiopia and northern Kenya. They have a rich indigenous knowledge system that helps them make decisions within the highly variable environment in which they produce.

Holders of knowledge and its application

Knowledge holders and knowledge types are diverse, and they are drawn upon in a triangulating way to reduce the uncertainty associated with forecasting. The sources of the knowledge are also varied. They include observing the behaviour of livestock and wild animals and observing the constellations, and ultimately correlating these observations over time and space through rich oral traditions.

Central to the forecasting knowledge of the Borana is a detailed calendar comprising twelve lunar months (Abraasa, Ammaji, Gurandhalaa, Bittateesa, Caamsa, Buufa, Wacabaji, Obora Gudda, Obora Diqqaa, Birraa, Ciqqaaqaa, Sadaasaa), each month having 27 days (ayyana). The reckoning of the ayyana is based on a lunar cycle similar to that of the Mayan, Hindu and Chinese calendars. The custodians of this knowledge are called ayyaantu. They use a deep understanding of the constellations to keep track of the calendar days, help with understanding seasonality, and foretell the timing and intensity of rainfall based on the stars. The ayyaantu’s knowledge is hereditary—a father passes it on to a selected son, who passes it on in the same fashion. Under certain circumstances, others from a non-ayyaantu family may also be trained by the ayyaantu; this ensures the passing down of the knowledge in cases where a direct heir is lacking. For in-depth learning, a son’s only duty during his father’s lifetime is to study; an ayyaantu cannot disseminate the information.
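The basic structure just described, twelve lunar months each reckoned through 27 named days, can be sketched in a few lines of code. The following minimal Python sketch is purely illustrative and is not part of the Borana system itself; in particular, treating the ayyana as a simple repeating cycle, rather than as the star-based reckoning the ayyaantu actually perform, is an assumption made only to show the calendar’s arithmetic.

```python
# Minimal illustrative sketch of the calendar structure described above:
# twelve lunar months, each reckoned through 27 ayyana. Treating the ayyana
# as a simple repeating cycle is an assumption made purely for illustration.
MONTHS = [
    "Abraasa", "Ammaji", "Gurandhalaa", "Bittateesa", "Caamsa", "Buufa",
    "Wacabaji", "Obora Gudda", "Obora Diqqaa", "Birraa", "Ciqqaaqaa", "Sadaasaa",
]
DAYS_PER_MONTH = 27  # each month comprises 27 named days (ayyana)

def month_and_ayyana(day_index: int) -> tuple[str, int]:
    """Map a running day count (0-based) to a (month, ayyana number) pair."""
    year_day = day_index % (len(MONTHS) * DAYS_PER_MONTH)
    month = MONTHS[year_day // DAYS_PER_MONTH]
    ayyana_number = year_day % DAYS_PER_MONTH + 1
    return month, ayyana_number

# Example: day 40 of the cycle (index 39) falls on the 13th ayyana of Ammaji.
print(month_and_ayyana(39))
```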

The observations of occurrences in each ayyana over the years are kept in memory and used to forecast climatic conditions. A significant observation is, for example, the sighting of the new moon, which is used to determine the performance of the upcoming rainy season. The ayyana on which a rainy season begins is also an indication of the general performance of the rains. For example, if the onset of the rains falls on basaa days (basaa durra and basaa balloo), the season is expected to be good. It is also on basaa days that sacrifices and prayers for rain—goromti rooba, the she-goat sacrifice—are normally conducted.

Usually, the information gathered from the readings of the constellations and its interpretation is disseminated during public gatherings. Based on this system, the Borana ayyaantus have perceived changes in the climate through a decrease in the length of the rainy season since the 1960s.

The observations of occurrences in each ayyana over the years are kept in memory and used to forecast climatic conditions.

Each ayyana is associated with either good or bad omens. For this reason, beyond forecasting the weather, an important function of the ayyaantu is to determine the dates on which important ceremonies should take place so that they fall on days associated with good omens. For instance, on the day a child is born, particularly a boy child, the parents immediately consult the ayyaantu to know the omens associated with that day.

Celestial bodies and climate forecasting

The interpretation of the constellations is both a means of counting the calendar days and a major climate forecasting system. The interpretations are based on the timing of the rising of certain stars, their colour, and the alignment of the constellations with the moon. The main stars observed include Lamaii (Beta Triangulum), Buusan (Pleiades), Sorsa (Aldebaran), Algajima (Bellatrix), Arba (Central Orion), Walla (Saiph) and Basaa (Sirius).

The interpretation of the stars is done in relation to the traditional months and the respective days of the month, called ayyana. How a specific star aligns with the moon on successive nights is carefully observed and used to forecast rainfall or the severity of the dry season. For instance, if the lamaii (Beta Triangulum) stars are aligned with the moon on the 14th of Biraa, the 11th of Ciqaa, the 9th of Sadaasaa, the 7th of Abraasaa, the 5th of Ammaji, and the 3rd of Gurandhala, then the dry season that follows is interpreted to be normal, with limited negative impacts on people and livestock. This example illustrates the complexity of the knowledge system and the intricate observations that can only be memorised by individuals trained as ayyaantu. The knowledge is the preserve of a few, which gives its holders a revered status in the community.

The sun and the moon are also used in weather forecasting. The degree of tilt of the new moon towards the north or south, depending on the month, provides an indication of how the rainy season will perform. When a ring forms around the moon or the sun, it is associated with the occurrence of heavy rains.

Livestock and wild animals

The Borana observe the behaviour of cattle closely to determine whether the season ahead will be good or bad. For instance, indications of poor conditions ahead include cattle being reluctant to go out to graze or to return to the kraal; lead cattle refusing to take up the role of leading the herd to graze; cattle sleeping close to each other in the kraal and at times defecating while lying down; cattle licking each other around the throat; and cows fretting over and looking out for their calves even after suckling them.

Following slaughter, livestock are also a source of information, particularly through the “reading” of the entrails (uusa) by specialists called uuchu. The reading of entrails is an art that sons learn from their fathers, while others learn by following experienced uuchu. The marks in the entrails are observed and used to foretell rainfall. Often, several uuchu observe the same entrails and each provides their own reading. The interpretations can differ, leading to lively arguments that demonstrate the variation in interpretation.

The Borana observe the behaviour of cattle closely to determine whether the season ahead will be good or bad.

The specific sounds of certain birds, frogs, hyenas, and foxes are also used as rainfall indicators. When the ground squirrel digs out the soil and the ant constructs its mound in the late afternoon during the dry season, instead of doing so at night, this indicates that the rainy season is about to begin.

The information from these various sources is triangulated, and further reference is made to the knowledge of oral historians (jarsa argaa dhageeti) who have the ability to reckon the cyclic occurrence of climatic events.

The foregoing is an attempt to show the various age-old sources of information used by pastoralist communities to enable them to make decisions within the highly variable environment in which they produce. Sustaining livestock production in this environment requires a high degree of livestock mobility, which in turn requires reliable forecasting and good information on the availability of resources. It is a knowledge system that has evolved to fulfil these constant needs for information on which to base decisions.

Opportunities for knowledge integration for informed decisions 

The current climate change situation calls for close collaboration between local actors and global decision-makers. The depth of knowledge that the pastoralist communities possess provides an opportunity for its integration with scientific observations for improved adaptation by these vulnerable communities.

There is currently an increased appreciation within scientific circles of the knowledge held by local indigenous communities. The local knowledge system has been developed over time and passed on across generations, and it enjoys greater local applicability and acceptability. Integrating the practical aspects of this local knowledge with scientifically generated knowledge will bridge the divide between knowledge systems and increase the acceptability of climate forecasting information.

Poor investment in equipment and a lack of adequate human resources lead to a dearth of reliable information for planning and decision-making. In Kenya, the meteorological department is currently the primary source of climate information. Bringing the two systems of local and scientific knowledge together would exploit areas of complementarity that can improve resilience planning in the country.


Dr Jaro Arero is a Science Policy Expert. Dr Hussein Tadicha is a Natural Resources Management Expert and Executive Director of the Center for Research and Development in Drylands.


Fourth Industrial Revolution: Innovation or New Phase of Imperialism?

Africans must enter the Fourth Industrial Revolution in a manner that upholds our human dignity, our liberty as communities and individuals, and our human agency.


“Welcome to tomorrow!” and “Tomorrow is already here!” are popular phrases often used in the context of the so-called Fourth Industrial Revolution (“4IR”). Thus at the Sight Tech Global conference held on 2nd and 3rd December 2020, one of the plenary sessions was titled “Our AI future is already here”. In Profit and Prejudice: The Luddites of the Fourth Industrial Revolution, Paul Donovan summarises the past three industrial revolutions as (1) steam power, (2) electric power, and (3) computer power. Nicholas Johnson and Brendan Markey-Towler speak of the four revolutions as the industrial revolution, the technological revolution, the digital revolution, and the fourth industrial revolution. They go on to note that the Fourth Industrial Revolution is the current period of economic transition since the mid-2000s, characterized by a fusion of new digital technologies, rooted in advances from the Digital Revolution, with technological applications in the physical and biological domains. Similarly, Klaus Schwab, Founder and Executive Chairman of the World Economic Forum, formerly the European Management Forum, observes that the Fourth Industrial Revolution is characterized by a fusion of technologies that is blurring the lines between the physical, digital and biological spheres.

Nancy W. Gleason cites MIT’s Erik Brynjolfsson and Andrew McAfee as referring to 4IR as the Second Machine Age (“2MA”). According to them, while the first machine age was about the automation of manual labour and physical strength, the 2MA’s technological progress in digital hardware, software and networks is about the automation of knowledge. At the core of the automation of knowledge is artificial intelligence (AI). Johnson and Markey-Towler explain: “Artificial intelligence, especially when endowed with machine learning algorithms, is a technology which seeks to mimic the functioning of the human mind, and which can therefore mimic human action guided by a process that mimics human thought.” Johnson and Markey-Towler further observe that artificial intelligence has greatly enhanced the use of robots:

…, the 4IR moves the goalposts from automation to smartization, whereby intelligently programmed software and robots are able to collect new data during the regular course of their operation, share it with other approved devices on the network, analyse the data, and use the conclusions to update their course of action. The 4IR took “dumb” autonomous machines and made them “smart.” This step was essential to the development of technological marvels such as self-driving cars and trucks and next-generation industrial robotics.

During the December 2020 Sight Tech Global conference, which I referred to at the beginning of this article, Kai-Fu Lee, one of the world’s top scientists and top investors in the field of artificial intelligence and author of AI Superpowers: China, Silicon Valley, and the New World Order, observed that the current generation’s breakthrough in a type of AI called neural nets, sometimes referred to as deep learning, has enabled remarkable advances in areas such as computer vision and natural language processing. He went on to state that today’s AI capabilities are so great in this raw form that what is needed now are the engineers and, most importantly, the data to make the most of all the possibilities. He explained:

… computers … can see and hear at the same level as people now. So with speech recognition for machine translation and for object recognition, AI is now at about the same level as humans. And AI is improving rapidly, based on its ability to take a huge amount of data whether it’s spoken language or recorded videos to really train itself to do better and better. So over time, it will be a better see-er and hear-er than humans.” Referring to what he calls the third wave of artificial intelligence as perception AI, Lee spoke of “… extending and expanding this power throughout our lived environment, digitizing the world around us through the proliferation of sensors and smart devices. These devices are turning our physical world into digital data that can be analyzed and optimized by deep learning algorithms.

Nevertheless, Donovan notes that the phrase “industrial revolution” entered common usage long after the first industrial revolution had begun. He explains that Karl Marx’s collaborator on The Communist Manifesto, Frederick Engels, used the phrase in German in the 1840s, and the phrase was first used in English by Arnold Toynbee in 1882. This points to the fact that human beings often name something quite a while after they have experienced it, and the same has been true of 4IR, although we may have named it earlier than the first three because we are now more used to the idea of industrial revolutions than those who went before us were.

Klaus Schwab listed emerging 4IR technology breakthroughs in fields such as artificial intelligence, robotics, the Internet of Things, autonomous vehicles, 3-D printing, nanotechnology, biotechnology, materials science, energy storage and quantum computing among the things that would drastically change our lives. Indeed, the lives of the peoples of Africa are already being touched by 4IR in ways that many of them are yet to perceive—their smart phones, with their “Location” function on, are beaming data about their movements to networks, and the data are then sold to high-tech transport companies desperate for information about traffic flow in cities; many of them unwittingly allow phone apps to access their microphones and cameras, with the real possibility of their conversations and actions being monitored; their emails and social media accounts are being monitored for information about them that is sold to marketers, advertisers and politicians who use it for “targeted messaging”; their faces are increasingly being scanned by cameras connected to face-recognition software ostensibly to enhance security, but with the real possibility of surveillance for purposes unknown to them.

Human beings often name something quite a while after they have experienced it

What is likely to be more alarming to many, however, is the fact that the combination of artificial intelligence and robotics supported by high-speed online connectivity is threatening to render jobless in a few years’ time those without requisite new skills.

In 2021, Rob Floyd informed us that the African Centre for Economic Transformation (ACET), working with other institutional partners and nearly 40 data scientists and machine learning experts from around the globe, had completed the continent’s first “Artificial Intelligence Challenge”, ostensibly to help predict what infrastructure Africa will need in the future. According to Floyd, the exercise sought to identify machine learning tools and approaches that can inform policy decisions. The data scientists created models and designed methodologies that could help determine what infrastructure to build, where to build it, and what factors would have long-term economic impacts on the continent.

The fourth industrial revolution perpetuating western imperialism

According to the Merriam-Webster Dictionary, imperialism is “the policy, practice, or advocacy of extending the power and dominion of a nation especially by direct territorial acquisitions or by gaining indirect control over the political or economic life of other areas.” The peoples of Africa, Asia, the Americas, Australia and New Zealand first bore the brunt of Western imperialism in the form of colonialism. In The Invention of Africa, the Congolese philosopher V.Y. Mudimbe notes that “colonialism and colonization basically mean ‘organization’, ‘arrangement’. The two words derive from the Latin word colere, meaning to cultivate or to design.” He goes on to point out that the colonists (those settling a region), as well as the colonialists (those exploiting a territory by dominating a local majority) have all tended to organize and transform non-European areas into fundamentally European constructs.

Thus the politics, economics and systems of knowledge production in colonised territories were designed to imitate those of their Western colonisers. At independence in the late 1950s and early 1960s, classical colonialism in Africa was replaced by neo-colonialism. Kwame Nkrumah, in the Introduction to his Neo-Colonialism, the Last Stage of Imperialism, wrote: “The neo-colonialism of today represents imperialism in its final and perhaps its most dangerous stage. …. The essence of neo-colonialism is that the State which is subject to it is, in theory, independent and has all the outward trappings of international sovereignty. In reality its economic system and thus its political policy is directed from outside.”

In a chapter in The Disruptive Fourth Industrial Revolution: Technology, Society and Beyond, Rashied and Bhamjee observe that industrialisation in 4IR could easily continue along the path of coloniality, in which the wealthy countries of the Northern hemisphere exploit the resources of countries in the South, but that it could also result in some of the wealthier countries of the Global South exploiting their poorer counterparts. During the Third Industrial Revolution, the inequality between the wealthy countries in the North and the poor ones in the South was regularly referred to as “the digital divide” — a divide that is already finding its way into 4IR. Thus as Donovan observes, there are many people who cannot afford a smart phone and a data plan to enjoy the benefits of 4IR, so that “The democratisation of communication only applies to those above a certain income level.”

Indeed, the Digital Economy Report 2019, released by the UN Conference on Trade and Development, highlighted the disproportionate concentration of the digital economy in the United States and China, with the rest of the world trailing considerably, especially countries in Africa and Latin America. According to the Report, the United States and China accounted for 90 per cent of the market capitalization value of the world’s 70 largest digital platforms, over 75 per cent of the cloud computing market, 75 per cent of all patents related to blockchain technology, and 50 per cent of global spending on the Internet of Things.

“The democratisation of communication only applies to those above a certain income level.”

The report predicted that under current regulations and policies, this trajectory was likely to continue, contributing to increasing inequality. Yet perhaps even more disturbing is the digital divide right inside each of our countries in Africa, where the middle class enjoys virtually all the benefits of 4IR technologies that their counterparts in the affluent West and East enjoy, while the vast majority of their compatriots still grapple with a lack of basic amenities such as piped water and electric power, so that for them the issue of entering the digital world does not even arise. This latter digital divide significantly contributes to the perpetuation of the neo-colonial structures of domination for the benefit of the West and East.

Furthermore, in the edited volume The Fourth Industrial Revolution and Its Impact on Ethics, Geneviève Tanguay notes that disregard for factors such as cultural identity and political convictions is often reflected in the very design of 4IR products themselves. For example, observes Tanguay, machine learning algorithms, although designed to help in problem-solving and decision-making, are vulnerable to biases and errors arising either from their creators or from the datasets used to train the systems themselves. Tanguay goes on to write that Amazon’s time- and resource-intensive effort to build an Artificial Intelligence (AI) recruitment tool was shot through with bias against women: engineers reportedly attributed this bias to the AI combing through CVs submitted to the company over a 10-year period, most of which were submitted by men.

Donovan points out that consumers can now boycott companies that do not agree with their political positions: apps even suggest alternative products with better scores. However, he goes on to caution that, “With an app, the opinion that works out the details is someone else’s opinion. …. If the shopper has different priorities to the app designer, they may spend in areas they do not actually support.” In addition, observes Donovan, although it is often claimed that the communication technologies have democratized communication, “Algorithms give preference to some social-media users. They also will censor others. Government censorship was commonplace 300 years ago. The private-sector equivalent is the demonetisation, downgrading or banning of published content.” Such censorship from the so-called big tech has escalated in the era of COVID-19, ostensibly in a bid to fight the virus through scientifically-based information.

Disregard for factors such as cultural identity and political convictions is often reflected in the very design of 4IR products themselves.

Moreover, Western cultures are putting non-Western cultures under great pressure to allow themselves to be assimilated in the global (read “largely Western”) cultural pool on the false presumption that they are inferior to Western cultures. Thus in a chapter in African Values, Ethics, and Technology, Maleselo John Lamola points out that as the peoples of Africa use 4IR technologies designed with a Western cultural bias, they are negatively affected at a fundamental level:

The culturally disadvantaged user is … simultaneously mesmerised and alienated by an object that imposes itself as instrumental for the efficiencies of her life; during the same experience she must align her way of doing things to the intricacies of the operation of this device or machine, as well as to the social role it is cast to serve in her life.

A crucial aspect of human welfare is personal liberty, entailing rights such as those of free association, movement, expression and privacy. Yet 4IR is eroding these very liberties through surveillance: smart phones now easily “hear” and “see” much more than their users intend or know. Besides, governments are consolidating various databases (such as those on health insurance, births and deaths, voters’ lists, and criminal records) into single super-databases, so that at the click of a button those with access can view a citizen’s information in astoundingly fine details that can be used against him or her. Thus in the run-up to the 2020 US elections, some US citizens wrote a parody of the famous American civil war-period song “His Truth Goes Marching On”, part of which stated:

Our right to privacy is gone, devices are the spies.

For government surveillance those are now the ears and eyes.

They use the corporate data, no subpoenas, no surprise,

And still we don’t catch on.

All this calls to mind George Orwell’s dystopian novel, 1984, in which the single party, embodied by the mythical “Big Brother”, deploys 4IR-type technologies to monitor not only the people’s actions, but also their thoughts. The “inner party” consists of an elite which wields power by getting the “outer party” members to do their bidding. The party has a “thought police” which deploys all manner of 4IR type technologies to keep tabs on members of the outer party, including “telescreens” in homes and in public places that “listen to” and “see” all that the citizens say and do round the clock. The thought police are even able to read the thoughts of the members of the “outer party” and unleash punishments on them for any dissenting ideas. The “proles” (short for “proletariat”) are the illiterate masses, deeply despised by both inner and outer party members, and hardly have any interaction with the political process.

Furthermore, the party constantly re-writes history to suit its immediate purposes and fabricates narratives about consistent and abundant economic growth, about a mythical enemy of the state called Goldstein, and about never-ending war with this or that foreign power. It is working on a language called “Newspeak” to totally replace “Oldspeak” (English as we now know it), with a view to reducing the vocabulary of the language in order to eventually make it impossible for anyone to entertain or express critical thoughts against the regime. The party’s three slogans are: War is Peace, Freedom is Slavery, Ignorance is Strength.

Indeed, 4IR technologies now make a global dictatorship a much more conceivable possibility than it was for the first readers of 1984.

Kalundi Serumaga has illustrated how Africa’s land is “The Final Frontier of Global Capital”. This corroborates Mordecai Ogada’s assertion that “Conservation interests have built a cauldron into which the extremely wealthy are pouring startling amounts of money to subvert systems, grab lands, and plunder resources.” Yet the domination of Africa’s land that global capital seeks to achieve is being greatly aided by 4IR technologies that not only enhance the digitization of land records, but also detailed surveillance and high-tech warfare using 4IR-driven devices such as drones.

Which way forward?

We in Africa ought to urgently clarify our moral values and, based on them, formulate clear guidelines to restrain developers and marketers of 4IR technologies, who serve the same old imperialists and some new ones, from dehumanising our people by manipulatively imposing technological innovations on them. Thus we ought to deeply reflect on the social visions of our forerunners such as that of Pixley ka Isaka Seme in his celebrated 1906 Columbia University speech titled “The Regeneration of Africa”. Seme spoke of a regenerated African civilization whose most essential departure “is that it shall be thoroughly spiritual and humanistic — indeed a regeneration moral and eternal.” As Lamola explains, for Seme, “the surrender of human agency to machines is … not fathomed. His was a novel conception of the possibility of the symbiosis of scientific progress with human spirituality.” Thus in my recent journal article on “The Fourth Industrial Revolution”, I proposed four normative considerations that, in my view, ought to guide the initiatives of the peoples of Africa in their deployment of 4IR technologies, namely, inclusiveness to meet the needs of all human beings, affordability to bridge the digital divide, respect for the right to cultural identity to guard against cultural imperialism, and ethical orientation as the over-arching guide to building a truly human society.

The domination of Africa’s land that global capital seeks to achieve is being greatly aided by 4IR technologies.

In sum, as we the peoples of Africa enter the Fourth Industrial Revolution, we ought to do so in a manner that upholds our human dignity, our liberty as communities and individuals, and, as a result, our human agency. This will entail a conscious and consistent repudiation of Eurocentrism in the realm of technology, in line with Frantz Fanon’s admonition in the final chapter of The Wretched of the Earth, a book he diligently worked to complete during the last ten months of his life:

If we want to turn Africa into a new Europe …, then let us leave the destiny of our countries to Europeans. They will know how to do it better than the most gifted among us.

But if we want humanity to advance a step further, if we want to bring it up to a different level than that which Europe has shown it, then we must invent and we must make discoveries.



Higher Education: Managing Institutional Change in an African University

Change is inevitable in the lives of nations, institutions, and individuals, but it is not easy because of entrenched mindsets, habits, and behaviors. USIU-Africa was no exception.


After the approval of USIU-Africa’s 2015-2020 Strategic Plan and my 90 Days of Listening, I rolled up my sleeves for the exacting and exciting work of leading institutional change. I had participated in similar exercises in various contexts and leadership positions, but never in an African university and never as vice chancellor. So my previous experiences in Canada and the United States could only serve, at best, as broad guideposts.

I had to consciously guard against the dangers of transplantation from the far better resourced, much older, and larger American and Canadian universities I was familiar with, notwithstanding the fact that USIU-Africa partly saw itself as an American institution. It was jointly accredited in Kenya and the United States, and had until the early 2000s been a branch campus of an American university of the same name based in San Diego, California. In fact, the position prospectus had explicitly indicated that familiarity with both higher education systems was a desirable attribute for the next vice chancellor.

Importing Educational Models

I had discussed the question of the export of foreign higher education models to Africa and other regions of the global South in the chapter on internationalization in my book, The Transformation of Global Higher Education, 1945-2015. I noted, “The new universities created after independence often replicated the institutional structures, instructional practices, and intellectual values of their colonial predecessors and imperial models… across much of Africa, at the turn of the twenty-first century, instructional languages, practices, and materials, as well as administrative systems and nomenclature, modes of academic organization, research methodologies, paradigms, and themes remained tied to the patterns and trends in Europe.”

The American university model joined the fray as the number of American-trained academics and the prestige of US universities in Africa rose. I wrote, “The model is variously encapsulated in the preeminence of the research university, the prominence given to liberal arts education, or the primacy of market values. In its contemporary incarnation, it is seen as a system whose institutions have become ever more commercialized, their governance corporatized, students consumerized, knowledge production commodified, learning credentialized, and faculty casualized.”

It is a malleable model “that allows its exporters and importers to project attributes, both real and imaginary, that they wish to highlight and embrace in branding and bracing themselves in the intensifying global competition for resources, reputations, and relevance. The model manifests itself in the establishment of American-style institutions, adoption of US-centered academic cultures, and performance of US-institutional identities.” Thus, the importation of the US model involves “the appropriation and performance of the institutional structures, styles, and symbols of US higher education.”

This is what I found at USIU-Africa, an institution that combined, sometimes uneasily, its Americanness and Kenyanness. Hybrid identities for institutions or individuals can be a source of creativity and empowerment. They can also generate perpetual confusion, contradictions and inconsistencies that engender institutional inertia and paralysis. This sometimes manifested in a continuous selectivity in which people would invoke whichever American or Kenyan university practice was most advantageous to whatever position they supported and wished to advance.

USIU-Africa embraced several key aspects of American universities, such as the general education curriculum for undergraduate studies and the semester system. On a lighter note, we celebrated American holidays and events such as the Fourth of July and Black History Month, and named our three semesters Fall, Spring, and Summer even though the names had no connection to Kenyan weather seasons—the Summer Semester coincided with the coldest part of the year in Kenya. It was an interesting example of cognitive dissonance that underlined more serious challenges of reconciling divergent institutional identities.

American accreditation gave our students wishing to continue their studies in the United States and in parts of Europe an advantage in that their credits and degrees were recognized. Students from other local universities had to navigate various barriers to entry, given the poor perception outside the continent of the quality of African higher education, sanctified by their relatively low standing in global rankings that have become increasingly ubiquitous and critical in the international division of intellectual labor.

But USIU-Africa didn’t adopt several key features of American universities. Faculty never underwent the rigors of the tenure system, nor were they entitled to sabbaticals, although they could take unpaid leave of absence. While several schools offered graduate degrees, there was no graduate school to coordinate graduate enrollments and standards across the university. And despite a purported commitment to enhancing research, there was no holistic research policy or protocols.

The typical two-semester system in American universities was turned into a fully-fledged three-semester system. Faculty were obliged to teach for two semesters and were paid as adjuncts for the third. Given the relatively low pay, most taught all three semesters, leaving little room for research. Students paid tuition each semester rather than for the academic year, which made annual financial planning challenging. But it also meant that students who could afford to take all three semesters could finish their undergraduate degrees in about three years instead of the expected four.

These are some of the contexts that framed the various changes the university leadership and I undertook during my six year tenure as vice chancellor. There are many dynamics and dimensions of institutional change. Borrowing from the contentious neo-liberal discourse of national reform for economic growth and development, the following seven stand out: getting priorities right, getting governance right, getting policies right, getting processes right, getting communication right, getting resources right, and getting culture right.

In this reflection, I’ll mainly focus on the first four, and examine the others in more detail in subsequent reflections.

Setting Strategic Priorities

The university’s priorities under my tenure were clearly laid out in the new 2015-2020 strategic plan. I was impressed by the robust, transparent, inclusive, and participatory process through which the plan was developed prior to my arrival, which resonated with my own views and experiences with effective strategic planning elsewhere.

Over the next six years we assiduously sought to implement and evaluate the plan’s five priorities and twelve objectives. The monitoring and evaluation matrix measured more than 400 action items, which we later agreed were rather too many, as some of these were routine operational matters that would go on regardless of any plan.

The five goals included: (1) “Provide globally competitive and innovative academic programs incorporating research and co-curricula activities for holistic education;” (2) “Expand and efficiently manage the university’s financial and human resources to meet its capital and operating expenditures;” (3) “Improve human resource management using best practices;” (4) “Expand, maintain, and optimize use of physical facilities and technology;” and (5) “Increase visibility and enhance quality services to internal and external customers.”

I’ll address the implementation of most of these goals in later reflections. Here, I want to discuss process issues.

Strategic plans provide a critical guide for institutional direction, ranging from introducing new initiatives to strengthening the university’s mission, values and role in society, to allocating resources and restructuring the organization. They are not cast in stone insofar as the external landscape, and even the internal environment, often present unforeseen challenges and opportunities.

For example, when the strategic plan was approved in March 2016, we didn’t anticipate the changes in the national examination system for the Kenya Certificate of Secondary Education that led to a sharp drop in students eligible for university entry. And no one, of course, could have predicted the massive devastation and disruption of the Covid-19 pandemic. But we had to navigate both, a subject I’ll discuss more fully in a later reflection.

Suffice it to say here that fidelity to strategic plans and priorities has to be counterbalanced by flexibility, agility, and adaptability to manage unforeseen circumstances, while maintaining adherence to institutional values and protecting fundamental institutional interests.

Successful implementation of strategic priorities depends to a large extent on the coherence, commitment, competence, and effectiveness of the university leadership at all levels. Reflecting its hybrid identity, USIU-Africa had an unusual and uncommonly large governance structure by the standards of Kenyan and American universities.

The Dynamics of Governance

USIU-Africa has ten governing organs and persons. At the apex is the Board of Trustees, the ultimate fiduciary, a self-perpetuating body that appoints the chancellor, and the university council. The chancellor is the ceremonial head of the institution who presides over commencement among other prescribed duties, while the university council appoints and evaluates the vice chancellor (I was appointed by the Board of Trustees as the council was created just before my inauguration) and provides oversight and approves university policies and budgets.

The vice chancellor is the chief executive officer of the university who oversees its administrative and academic affairs and chairs the management board whose members run different divisions, and the university senate that deliberates and makes decisions or recommendations on academic matters. The management and senate constitute governing organs in their own right. The university charter and statutes revised in compliance with Kenya’s Universities Act of 2012, also recognize the faculty, staff, student, and alumni councils as governance organs.

I had to juggle all these governance bodies in addition to the various internal and external constituencies of stakeholders. They included students, faculty, and staff who didn’t always feel adequately represented by their respective councils, or whom I had to engage independently in any case as the university’s vice chancellor.

In the exit reports and re-accreditation letters from the reviews conducted during my tenure, both regulators, the Commission for University Education in Kenya and the Western Association of Schools and Colleges in the US (the latter posts its institutional evaluation reports on its website), identified governance challenges as one of the university’s biggest structural constraints needing urgent attention and rectification.

They singled out the relationship between the Board of Trustees and the University Council, in which the latter was cleaved from the former. Even more problematic was the relationship between the Faculty Council and the University Senate, in which the former lost many of its functions to the latter and increasingly assumed a welfare role akin to that of a trade union. At every leadership retreat, management ran sessions on the roles and responsibilities of the different governing bodies, sometimes assisted by external consultants. Unfortunately, these structural challenges were not addressed by the board or council despite repeated requests by management, which proved costly in managing crises, as I’ll discuss later.

Thus, the governance system at USIU-Africa had its own unique features that were neither distinctly Kenyan nor American. Such creativity was to be expected and in some cases was commendable. However, all too often this resulted in unnecessary structural dysfunctions.

Unlike at public and private universities in Kenya, the top governance organs were not appointed by the government or the proprietors of the institution. In my six years, not once did the board or council conduct self-evaluations, as is common among governing boards in American universities. In late 2018, the US-based Association of Governing Boards was commissioned, at considerable cost, to undertake a survey titled Comprehensive Evaluation of the Vice Chancellor and Baseline Assessment of the Board of Trustees and University Council, the first in the university’s history. The AGB’s report was never tabled and discussed by the board or council, let alone summarized for the university community as promised.

Moreover, during my tenure I was evaluated for only half the time, so there were years when I had no key performance indicators that could be cascaded to the rest of the academic and administrative leadership in the institution. Many members of the board and council with academic backgrounds and from the US found this rather strange, and several left within a short time of their appointment, to the loss of the university. Many were disconcerted by the lack of understanding of university institutions displayed by several key members on the board and council.

By the time I left, virtually all the leaders of the council, including the chair, vice chair, and chairs of committees, had non-academic backgrounds. This undermined understanding of shared governance and fostered a culture of micromanagement, which is almost invariably counterproductive and doesn’t allow managers to optimize their professional skills, take responsibility, and learn from failures.

In this, governance at USIU-Africa reflected the university’s location in a slowly democratizing society whose political culture and socialization were shackled by deeply entrenched reflexes of authoritarianism and the legacies of rule by decree. Clearly, universities, even international transplants, are immersed in their domestic political ecologies, as several prestigious American universities have discovered in Asia from China to Singapore to the United Arab Emirates. In Europe, there’s the case of the American-style Central European University, established by the Hungarian-born American philanthropist George Soros, which was forced to relocate from Budapest, Hungary, to Vienna, Austria, by the illiberal and populist regime of Viktor Orbán.

This is of course neither new nor peculiar to universities fashioned after the US model. Africa’s colonial and postcolonial universities borrowed the institutional shells of universities in the imperial metropoles, not the substance of structural autonomy and academic freedom, which was itself, of course, often contested.

Managing contemporary universities is harder than ever as many university leaders everywhere would attest. Besides the internal stakeholders, externally there are what are called in the United States the helicopter or snowplow parents, media pundits, politicians, and ideologues for whom universities often provide soft targets, and the ubiquitous social media with its limitless capacity for fueling mendacity, acrimony, trolling, academic incivility and bullying.

Other powerful external actors include alumni, the private sector, philanthropic donors, international and intergovernmental agencies, and non-governmental and community organizations, all harboring their own pressing and, sometimes unrealistic and conflicting, expectations of higher education institutions and their leaders.

Arising out of the above are ideological pressures on universities from across the political spectrum for representativeness. In short, in many countries, including Kenya, universities have become embroiled in the culture wars and the incendiary polarization and partisanship of the larger polity.

Undertaking Institutional Reform

When I joined USIU-Africa, I was immediately struck by the excessive power and expectations of the vice chancellor. In keeping with authoritarian institutional or national cultures, the VC was the “big chief,” almost singularly responsible for making many academic and administrative appointments, signing checks even for mundane amounts, and dispensing favors and punishment to those who crossed him or her.

It replicated the highly personalized and patrimonial exercise of power in African autocracies. A more generous reading is that it reflected the pangs of expansion from a small institution to a larger and more complex one that needed more explicit and sustainable structures, policies, and processes. I had witnessed similar transitions at several Canadian and American universities. At USIU-Africa there were no academic departments; instead, there were programs. Colleges did not have their own dedicated budgets that they controlled. They had no staff for communication, advancement, faculty development, and external outreach. Everything was centralized.

One of my priorities and expectations from the board and later the council was to undertake institutional reform, to align organizational structure to strategic priorities, build on the university’s assets for future growth, raise the quality and reputation of its programs and partnerships, and generate more revenues to support the bold aspirations of the strategic plan.

In the first year, management and I launched several key initiatives. One was an exhaustive forensic audit covering the previous five years, aimed at improving operations and building effective systems university-wide. The audit facilitated the integration of our financial, human resource, and electronic systems to remove opportunities for mistakes or malfeasance.

Another was an extensive and inclusive organizational review, led by an external consultant, titled Job Evaluation, Salary & Organization Structure Review. The recommendations from the review led to organizational restructuring and strengthened talent management processes. The outcomes included adjustments to salaries and allowances, the establishment of clearer career advancement paths for staff, the establishment of academic departments, and the appointment of new school deans and department chairs. When I joined the university, all the deans were male and none was a full professor; one was even an assistant professor who had not been promoted in more than twenty years.

Both were the first comprehensive reviews of their kind in the university’s history. Predictably, there were “winners” and “losers” from these and subsequent reforms. In succeeding years, we undertook several surveys out of which new policies were developed. This always involved exhaustive consultations with the university’s key internal constituencies and governance bodies, as well as benchmarking with other universities locally, regionally, and internationally. For the audits we often engaged reputable consultancy firms.

As can be expected, the various reforms had their supporters, opponents, and straddlers. Or to put it differently, each new policy and structural change had its advocates, antagonists, and ambivalents. This was in keeping with the so-called 40-40-20 rule, which posits that when trying to influence a community to change, 40% tend to agree, another 40% need to be convinced, and the remaining 20% will never be persuaded. So leaders should spend their time on the middle 40%.

But the 20% often bide their time and regroup. They tend to use any future crisis to articulate and generalize their grievances. Some even resort to ideological, racist, misogynistic, and xenophobic attacks, as well as bullying and mobbing of the leaders and their supporters. I mentioned some of the xenophobic attacks I was occasionally subjected to in the last two reflections. In the age of social media, those opposed to changes that don’t benefit their personal or sectarian interests eagerly mobilize it for virulent personal attacks.

I always urged the academic and administrative leadership to stay the course, to keep our eyes on the prize of institutional change for the continued enhancement of our beloved university. As part of this agenda, it was imperative to follow institutional priorities, policies, processes, and procedures. For each reform initiative we followed what I call the 6Ps: clearly identifying the problem we were trying to rectify, the policy that would guide any review, the process we would follow, best practices in other institutions at home and abroad, the desired product, and determining how we would promote and operationalize it.

In the language of change management models, this entails, first, identifying the need for change; second, determining the change agenda, including cost and risk analysis; third, assessing the needs and interests of stakeholders and communicating with them; fourth, implementing the change; and finally, monitoring it.

There are of course many models of change management. One is Kurt Lewin’s three-stage model of “unfreezing” organizational behavior, implementing change, and “refreezing” by sustaining the enacted change. Another is John Kotter’s eight-step model, which comprises creating a sense of urgency, building a guiding coalition, forming a strategic vision, communicating the change vision, empowering broad-based action, generating short-term wins, consolidating gains and producing more change, and institutionalizing new approaches.

The ADKAR model focuses on how people adapt to change through awareness, desire, knowledge, ability, and reinforcement. For its part, the McKinsey 7-S model calls for paying attention to strategy, structure, systems, skills, staff, style, and shared values. The change acceleration process model advances seven steps, including identifying a champion or champions to lead change, creating a shared need, shaping a vision, mobilizing commitment, monitoring progress, and integrating change into the organization’s systems, structures and culture.

Several scholars have adapted some of these models from industry to universities as competition and demands for value and impact among higher education institutions intensify. Change in universities is especially hard because they are concentrated communities of experts, valorize shared governance, are characterized by organizational decentralization, and have multiple divergent constituencies, all of which tends to make them risk-averse.

The diversity of change models noted above underscores the variability and multi-dimensionality of organizational change in higher education in terms of the processes for executing change, stakeholder participation and communication, leadership commitment and style, and the development of empowering behaviors and culture.

Among the numerous surveys we conducted in subsequent years were those on student retention, student employability, and alcohol and drug abuse. Among the newly developed policies were those on sexual harassment, the inclusion of persons living with disabilities, data protection, crisis management, fraud and corruption prevention, whistleblowing, religion on campus, business continuity, and signing authority limits stipulating thresholds for different signatories.

The reviews undertaken comprised those of the constitutions of the faculty, staff, and student councils, the HR Policies and Procedures Manual, the Employee Handbook, and the Faculty Handbook. In 2019-2020, the university developed its first full research manual. Other exercises encompassed a legal audit, a review of the internal audit function, and an update of the university's risk register.

The process of institutional reform, with its complex dynamics and dimensions, demands and disappointments, as well as opportunities and gratifications, taught me a lot about how notoriously difficult but critical it is to implement progressive, effective, and sustainable change in universities as they seek to strengthen their academic programs, operational systems, service delivery, and social impact.

Change is inevitable in the lives of nations, institutions, and individuals, but it is not easy because of entrenched mindsets, habits, and behaviors. USIU-Africa was no exception. Ironically, that was a source of equanimity for me.


Ideas

Insights for a Different African Urban Future: A Reflection

Alternatively, there could emerge a leadership that seeks to respect each ambition, and find a happy medium between them, by first addressing the question: what are these cities for, and how will they feed and maintain themselves?


From recent developments, it would appear that there are four contenders vying for ownership of Africa's urban spaces: the international financial system, the operators of the informal economy, the new elite still holding on to the post-colonial dream of building shiny new metropolises, and, finally, Mother Nature.

Beginning with the last: she only recently staked an angry claim to the low-lying areas of Kenya's capital Nairobi, in a stunning six-hour rainstorm that overwhelmed the city's road and drainage systems. The usual evening commute congestion escalated into a disaster of biblical proportions. Compounded by a widespread power outage, large numbers of motorists had to spend the entire night marooned in their immobilised cars.

The nightmare of that evening became even more horrifying when many of those marooned motorists, parents among them, received news that their children were trapped in school buses on their way home. With police cars, fire engines and ambulances also immobilized, those children spent much of the night inside the buses, watching the rising rainwater inch up to bus window level.

In the end, at least 10 city residents were reported to have died in the floods, property worth millions was destroyed, and what remained of the city's reputation as perhaps the longest-standing "modern" sub-Saharan city outside South Africa became bogged down in less than useful official explanations.

Perhaps the greatest indication of how devastating the incident has been was the very rare sight of a Kenyan public official, in this case the Governor of the city, actually making a public apology to hard-bitten Nairobians.

The same scenario played out in Accra in early June last year, and in Lagos. For the city-zens of Accra, the recent floods have had a nasty political aftermath. When many pointed out that the flooding was the result of informal constructions along the city's drainage system, the authorities diagnosed the problem to be the old Sodom and Gomorrah slum. Attempts to demolish it provoked the worst riots witnessed in the city's recent history.

Urbanisation is the dominant trend on the continent, albeit not proceeding as quickly as in other parts of the world. The key question therefore is no longer if, but rather: how, and to whose benefit? Who will be the owners and shapers of the emergent African urban spaces, and what will be their ambitions for them?

In his book Open Veins of Latin America: Five Centuries of the Pillage of a Continent, the radical Uruguayan writer Eduardo Galeano dissects the meaning of the patterns of infrastructure there, particularly the road and rail networks, to reveal a legacy of exploitative intent. The networks, he explains, were laid down primarily to serve the extractive agenda of the imperial powers that brought them. In short, the roads and railways were like slashed veins bleeding treasure from the interior out to the coast by the shortest possible route, for onward relaying back to Europe.

Similar questions could be raised about many major urban centres in Africa. In name and in nature, most seem to have been established in locations logical only from the perspective of the dominant empire of the day – whether Arab or European.

As Kenyan journalist Christine Mungai explains in her May 2015 Mail and Guardian article ‘Not just trees’, even the vegetation on these spaces was often initially imposed from other eco-spaces.

Many cities are products of the process of colonialism, emerging somewhere between initial encounter and eventual occupation. Some, like Nairobi, were essentially way-stations along the continuing journey further into the interior. Others were mission stations, trading posts, or the colonial forts near which the last battles that defeated the soon-to-be-colonised natives were fought. In any event, they owe their existence more to the arrival of the new power than to the aspirations of the old one.

It is for this reason that they originally tended to bear European names ("Lagos" is Portuguese for "lakes", for example), or the names of monarchs, founders (or of their spouses, and sometimes their hometowns), favoured saints, and even the immediate bosses of the day.

What they all tend to have in common is a spatial and cultural disregard for the sensibilities of the people among whom they were planted. The real native African urban settlements were often either subsumed into this new reality, or left to atrophy and die out in what became “upcountry” spaces.

Therefore, with very few exceptions, the contemporary idea of an urban space in Africa comes infused with notions of new beginnings, a thing quite separated from the past (of which the hinterland is seen as a remnant) and not really answerable to it in any tangible way.

The rising urban middle class fortifies this space. After all, it is a cultural value established by, and enshrined in, the dreams of post-colonialism. A modern, buzzing city was the perfect symbol of one having earned one's place in the modern world. Square pegs in round holes, our cities thus speak of elite aspirations of arrival rather than serving as melting pots of genuine human interaction. As such, much of the work of city administration is a civilizing mission to discipline the unruly natives into fitting the idea. It rarely ever works the other way around.

Nigerian journalist Dolapo Aina reflects this posture in his opinions on what ails Lagos, dating from its days as the capital. The space is wholly segregated between, on the one hand, a settled wealthy elite whose water and power supplies have long been separate from what the Nigerian state has to offer, and, on the other, a large mass of struggling, often newly arrived poor referred to as "JJC" (Johnny Just Come). The clear inference is that it is the duty of the JJC to conform to the norms of the city.

Policy for Kampala is currently being driven by a similar thought process. Whatever the future of the city, its current managers seem to feel that the presence of the urban poor, owners and operators of a vast informal economy, should not be part of it.

This makes one thing clear: the actors in the informal economy are the one category of claimant that finds itself in retreat. In a pattern reflecting what is also happening in parts of major European cities, notably London, a process of "gentrification" is driving hawkers, stall owners, and low-income transporters out of the more financially lucrative parts of the cities.

This is essentially a more sustained expression of the same spasmodic impetus to rid the cities of their visible poor. The impulse behind these Potemkin visions is usually provoked by the hurried preparations to host one international event or another; in the autocracies of the '70s and '80s especially, the "idle and disorderly", vagabonds, street beggars and other failed entrants to the modern project would be forcibly rounded up and "disappeared" for the duration of the event.

The figurative descendants of those poor are the participants in the large informal economies, the anonymous millions of the sprawling megapolises we see today. Notably, they have become a distribution network of sorts for the massive influx of cheap manufactured goods flooding in from the factories of South Asia, and for the tonnes of freshly harvested upcountry produce brought daily into the city. Out of their capacity to endure transport hardships and pre-dawn market indignities, a certain native intelligence has developed, and is now deployed, usually at below-market rates, to 'push product' down the capillaries of the system.

Modernist thinking holds a very determined vision of creating cities that, in terms of infrastructure, architecture and amenities, would rival the post-colonial wonders of the Far East (Singapore, Hong Kong, Shanghai, etc.). The critical gap in this plan is, frankly, the absence of the large, ruthlessly exploited natural resources and economies that underwrote those wonders. The irony here is that the national state, acting on behalf of the local business elite, may have putative control but in real terms possesses neither the discipline nor the wherewithal to support such long-term capital investment.

This is indeed where another of the abovementioned potential “claimants” to the African urban space comes in. Chinese largesse aside, there is a growing appetite among the technocracies managing our cities to finance their new ambitions for large infrastructure projects by floating municipal bonds on the international markets.

South Africa's Cape Town and Rwanda's Kigali, as well as the venerable Addis Ababa, have already done so, with the last even being heavily oversubscribed. Kampala now seeks to get in on the act. The city's technocrats recently applied to the national government for permission to issue bonds worth $500 million on the global market, according to the financial news agency Bloomberg.

Clearly, there will be a process to every option, and with every option, an attendant risk.

For those seeking the modern "shining city on a hill", the price will be paid in democracy and demographics. In Kampala, it has led to the locking out of the hugely popular Mayor of Kampala, who probably drew the largest-ever majority in the last election. Alarmed by his popular mandate, the national government swiftly replaced him with a hand-picked city commissioner whom it could more readily control.

Reports from Cape Town indicate a creeping, unspoken policy of fiscally – and therefore as a by-product, racially – resegregating the city.

For the global financial markets, this view of Africa's urban spaces as viable loan destinations through the municipal bond market may well presage the kind of scenario that led to the debt crisis of the early 1980s, or, closer to home, present-day Greece. Where it was once national coffers that were drained and citizens shocked into austerity, exposing cities to the vagaries of the international bond markets threatens the onset of a similar crisis.

Is anybody asking what 'conditionalities' were attached to these municipal loans? How were they structured, and who will bear the cost of repayment?

A worst-case scenario may see beleaguered city governments employing brutal means to ensure uninterrupted (taxable) production, such as the bloody suppression of the 2012 miners' strike at Marikana.

For the owners of the informal economy and its markets, the spectre of a new economic apartheid looms large: the creation of a regulatory regime that, deeming their 'business models' unsustainable, will seek to banish them from the spaces they currently occupy.

And so, in this emerging era of the privatized city, what is to happen to the urban poor? Can a city really exist without them? After decades of practice inspired by a similar "brave new world" ethos, modern medicine has come to realise the importance of balance. The mass use of antibiotics has precipitated a crisis in how the human body manages infection. Put simply, western and westernized bodies were becoming so internally sanitised that the entry of any foreign organism could set off a physical crisis. This has led to the practice of "probiotics", whereby organisms once considered mere germs are re-introduced into the over-sanitised body so as to revive the patient's natural resistance.

A parallel can be drawn with the thinking that the “un-rich” automatically constitute a blight on the cityscape. Just as the body needs what were mistakenly regarded as mere germs, a living city may also need low-income citizens, and less-than-aesthetically-pleasing settlements, to keep the economy healthy.

Those who feel nature must make way for "progress" still seem to be guided by the industrial-revolution thinking of the original planners of the colonial cities, who planted them right on top of spaces that would best have been avoided.

Much of eastern and southern Kampala, also a city prone to floods, is actually drained and filled-in swampland and waterways, as even the original local-language place names imply. A similar situation existed in Lagos, where a collection of swampy islands became joined together in decades of traffic-jammed misery, until the Kayemba people, seven hundred kilometres to the north, were displaced to enable the creation of the new capital, Abuja, in 1991.

With the arrival of an aspirational, even acquisitive and avaricious, middle class, this process also continues gradually on a retail basis, with well-connected individuals now encroaching on the remaining river basins, swamps, urban woodlands and nominally "protected areas". As many angry commentators on Nairobi social media pointed out, this was the primary cause of the devastating floods that paralysed the city. Encroach and build where you may, they noted, but rainwater will still insist on finding its level.

Compounding the error by building further on places created by another culture for very different imperatives may not be the wisest of the decisions to which modern Africa's governments are committing.

Alternatively, there could emerge a leadership that seeks to respect each ambition, and find a happy medium between them, by first addressing the question: what are these cities for, and how will they feed and maintain themselves? However, the necessary re-think – which compels us all to reconsider our assumptions about what “development” should actually mean in the African context – may threaten many of these entrenched interests.

As for Mother Nature, She will continue to rain on all of man’s aspirations.

