
Under Fire: Forced Evictions and Arson Displace Nairobi’s Poor


Urban displacements greatly diminish the living conditions of already desperate populations living on the brink of poverty.


On 15 November, Minoo Kyaa, a community activist from Mukuru kwa Njenga, South Nairobi, tweeted:

We keep asking each other “we unaenda wapi?” [Where are you going?] and even tho it isn’t funny we laugh about it and stare at each other in disbelief.

At the time, Minoo was living in a tent on the site of her former home, along with some 40,000 other forcibly evicted Mukuru residents. Their dwellings had been demolished to make way for a link road connecting the city’s industrial zone to a contentious new expressway, plunging them into a humanitarian crisis. A pet project of incumbent President Uhuru Kenyatta – intended to be another symbol of his “legacy” – the seventeen-mile toll-road, dubbed a “Road for the Rich” by critics, aims to facilitate speedier movement between Nairobi’s Jomo Kenyatta Airport and the Central Business District.

Last November, the authors of this article convened a meeting with activists and community representatives from across Nairobi to understand the causes and consequences of displacement in urban contexts. We wanted to get a clearer sense of the circumstances under which poor urban residents are regularly compelled to leave their homes. Four of those who attended, roughly a third of our participants, were and remain directly affected by events in Mukuru, which they recounted in disturbing detail, providing troubling insights into the conditions of extreme precarity under which Nairobi’s poorest residents are repeatedly forced to rebuild lives and livelihoods following forced evictions.

The first wave of displacement in Mukuru, they told us, began abruptly on 10 October, just two days after a public announcement that they should move to make way for the road. Some say they received notice only moments before the Nairobi Metropolitan Services (NMS) and the Kenya National Highway Authority (KeNHA) bulldozers appeared. Others were caught unawares: “We only saw bulldozers, the General Service Unit [the paramilitary force known as ‘GSU’] and men in blue [regular police officers],” said Rukia, a Mukuru resident also living at the time on the site of her demolished home. She continued: “[It was] like they were marching at Nyayo Stadium,” in reference to the military parades these forces perform before dignitaries at the National Stadium during state functions – with forceful intent and an exactness in implementing an agenda.

Within a short time, heavy machinery had razed residences, businesses, places of worship and schools to the ground under the watch of the police and the GSU. According to local residents and reports in the media, at least one person was killed by a police bullet during the protests that followed. There are reports of others being buried under the rubble as they sought to salvage their belongings, or dying of heartbreak.


Evicted households were offered neither compensation nor alternative housing. Consequently, with nowhere to go, and in a bid to protect the spaces they had known as home for decades from land-grabbing “cartels” and thieves, many found themselves encamped on the rubble of their former homes. Their temerity and resistance in the face of the evictions and the looming private developers continue to inspire.

Who are these “cartels”, as locals refer to them? They are well-connected citizens with interests in property development who, because of their likely links to the government, can prompt land grabs in areas considered informal. Their association with the state, though difficult to prove, is suggested by the continued role of the police during the Mukuru evictions, where they have responded to demonstrators with teargas, water cannons, and live ammunition. During the most recent unrest, sparked by attempts to clear the area of residents and demarcate the space with beacons on 27 December 2021, two residents were shot dead and scores of others were injured. Today, months after the first eviction in early October, many of the displaced remain homeless and are unable to re-establish livelihoods in Mukuru in the face of a formidable alliance of enemies: property developers backed by the police and the government.

Informal evictions: Nairobi’s incendiary displacements

None of this is unique. Of the fifteen activists and community representatives from across the city who participated in our November focus group – from Mathare to Mukuru, to Baba Dogo and Kayole – two-thirds had faced the reality or the threat of development-induced displacement. Each case involved populations in informal housing settlements having to make way for concrete structures – roads such as the Jomo Kenyatta International Airport-Westlands expressway, or residential high-rise buildings.

And yet these “formal” evictions – in which official involvement is often signalled by the presence of bulldozers flanked by various police forces – are just part of the story. We also learned of fires that residents believe are intentionally started to clear slums. It is virtually impossible to prove that these conflagrations are started with the deliberate intention to grab land. However, there are good grounds for the perception that these fires serve as “informal evictions”. Crucially, they have occurred in locations where the poor reside on desirable land; where land-title arrangements are contested, and/or where proximity to the city attracts the construction of more lucrative housing than the shacks that once stood upon these sites. As one activist commented, “In almost all places where there is a fire, a high-rise building will come up.”


Participants at our November meeting counted 14 episodes of displacement since the beginning of 2020, either through the construction of infrastructure or by the setting of fires. These took place in Shauri Moyo, Deep Sea, Kariobangi, Korogocho Market, Nyayo Village and Kisumu Ndogo, Baba Dogo, Njiru, Ruai, Gikomba, Viwandani, Mathare, Kibera and Kangemi. In total, these violent episodes involving either arson or demolition by bulldozers have deterritorialized, either permanently or temporarily, thousands of Nairobi residents over the last two years.

The social consequences of urban displacement 

Despite their diverse causes and contexts, urban displacements share a common set of consequences. Above all, they greatly diminish the living conditions of already desperate populations living on the brink of poverty. While they do not take place across borders, those who are affected live and suffer in ways comparable to the plight of refugees. Evictions typically involve the demolition of property, arrests and fines, and often feature brutal violence of the kind described earlier – the expulsion of entire communities from their homes, the disruption of livelihoods, and the loss of and damage to belongings and identity documents.

And, certainly, this impact is gendered. Women, habitually the caregivers, have had to look after children in Mukuru in conditions of even greater precarity than usual. In the absence of housing structures and a community that can protect each other, the threat of sexual and gender-based violence looms larger than before – but one example of the gendered impacts of forced evictions.

The hardship experienced by those displaced in urban contexts is persistent, with many being forced to move on more than one occasion. In several of the aforementioned sites, evictions have occurred more than once within relatively short spans of time. For example, Dagoretti Centre was demolished several times by the City Council between 1971 and 1978. In the same era, Soko ya Mawe (1975), Mafik (1979) and the villages of Light Industries (1980) also faced evictions. Indeed, the history of displacement in Nairobi is as old as the city.

Urban displacement: past, present, future

As early as 1902, four years into the emergence of Nairobi as a railway town, the “Indian Bazaar” was demolished for being “unhygienic” – a result of racialized projections that would lead these evictions to recur twice by 1907. Africans, whose very presence in the city was conditional upon their registration as workers, had to contend with the regular demolition of their dwellings, legalized by the 1922 Vagrancy Act.

During the Emergency period, between 1952 and 1960, whole settlements in the Eastlands area, such as Mathare and Kariobangi, were flattened as they were perceived to harbour anti-colonial agitators and undesirable city dwellers. In the decades since the colonial evictions, post-colonial urban governance has continued to borrow from a similar toolbox: from Mji wa Huruma, to Muoroto, to Kibagare settlements, thousands of Nairobi residents have been forced to make way for concrete: usually roads, buildings or housing for more prosperous citizens.

Today, over 60 per cent of Nairobi’s population lives in its informal settlements, which occupy just 5 per cent of the city’s residential area. Many homes in these “slums” are built of corrugated iron sheets, and residents lack access to adequate sewerage, electricity and water – services denied to those without the titles that would confer tenure rights to their dwellings. Over the years, justifications for the violent displacement of “informal” (we would say informalized) sector workers and residents have included concerns over tax evasion, trespassing, traffic congestion and food safety. Yet the highway that displaced Mukuru residents was equally informal: it did not feature in the 2014 Masterplan for Nairobi, nor was a strategic environmental assessment of its costs undertaken. It nevertheless continues to be defended by the government, including the National Environment Management Authority (NEMA), as a viable means to “decongest” the city. Evidently, concerns with order, modern aesthetics and “hygiene” have always prevailed over the principles of equity and inclusion in the governance of Nairobi, and it is probably for these reasons that the Evictions and Resettlement Procedures Bill – introduced to Parliament in 2012 – has not been passed.


Despite the existence of a legislative framework to draw upon, such as Article 40 of the 2010 Constitution, which upholds the protection of property, the majority of our elected representatives do not prioritize the formulation of policies that protect those at risk from the inhumane consequences of urban displacement. Evictions have been widespread for decades, and, as we have noted above, there is anecdotal evidence to suggest they may be taking increasingly sinister forms, with fires being deployed to expel and intimidate those living in areas considered “informal”.

If documents such as Kenya Vision 2030 are anything to go by, the present scenario, in which the poorest members of urban society are repeatedly displaced in violent, unjust and often illegal evictions, is likely to worsen. This development plan, frequently invoked by technocratic proponents of large-scale hypermodern architecture, is used to justify an ever greater proliferation of concrete infrastructure. And as the infrastructure it portends is materialized in order to realize Vision 2030 or presidential legacies, more communities will likely be forced to move. All of which underlines the need for an urgent response from civil society, which must scrutinize the role of the state, county governments, and private interests in inflicting incessant housing insecurity, and psychological and physical trauma, on already marginalized communities.

This article is the first in a series on migration and displacement in and from Africa, co-produced by The Elephant and the Heinrich Böll Foundation’s African Migration Hub, which is housed at its new Horn of Africa Office in Nairobi.


Wangui Kimari is an editorial board member at Africa Is a Country. She is based in Nairobi. Constant Cap is a graduate urban planner and rugby referee in Nairobi.


The Genocidal War on Tigray and the Future of Ethiopia

For any negotiations to succeed, the international community should refrain from deciding on the future of Ethiopia and attempting to salvage an irredeemable genocidal regime.


With its irreconcilable fundamental contradictions, Ethiopia lies behind a veil of nostalgic glorification of its unceremonious past. It is not an overstatement to say that countries like Ethiopia, obsessed with ultra-nationalist pride, live in past glories but are desperate about the future. Such political entities have little hope in their shared future and invariably look for a pretext to commit atrocities against their people and to launch aggression, for the sole purpose of either creating a monolithic national image to the liking of the political elites or using the crisis as a unifying factor to sustain the lifeline of the state.

After almost 23 years of attempts to transition from a unitarist empire to a federal democratic republic, Abiy Ahmed’s appointment as prime minister in 2018 consolidated Ethiopia’s formation defects and accelerated the country’s course towards inevitable disintegration.

Abiy Ahmed’s government waged a premeditated genocidal war on Tigray using as pretext the holding of successful elections and the establishment of a legitimate government in Tigray. Ironically, the same government has indefinitely postponed the 2020 election and unconditionally extended its term with the support of a far-right political elite that has perfected the art of midwifing dictatorship and is hell-bent on returning Ethiopia to its imperial past.

A year of genocidal war on Tigray

More than a year into the genocidal war waged on Tigray by the joint Ethiopian and Eritrean defence forces and Amhara forces, with the participation of foreign powers, the people of Tigray are completely under siege. Despite diplomatic engagements, international calls, and condemnations of Ethiopian and Eritrean officials and of the Amhara regional government, no meaningful, targeted actions have been taken to end the war on Tigray and to ensure the withdrawal of Eritrean troops and Amhara militia. Nor has any meaningful action been taken to pursue a peaceful resolution of the political crisis and to allow unhindered humanitarian access to avert famine in Tigray.

The international community’s response has not only lacked coherent and concrete action; it has also reflected divergent interests. At the beginning of the war, its reaction was to accept the Abiy regime’s narrative that the incursion was a law enforcement operation. Throughout the war, and despite calls for unhindered humanitarian access and the protection of human rights, the approach has been to blame all the parties to the conflict without any regard for their respective levels of participation and accountability; at times it has been tantamount to blaming the dead for dying.

Reinventing peace 

Following the rout of the Abiy regime and its allied forces in most parts of Tigray, including Mekelle, on 28 June 2021, the subsequent deepening of defensive strategic depth by the Tigray Defense Forces, and the formation of an alliance with the Oromo Liberation Army and allied forces, the tone and sense of urgency of the regional and international players have dramatically escalated. The coordinated reaction, including by institutions that had been silent throughout the war, leads to suspicions that there are attempts to save the empire and the regime against the spirit of the constitution. Strikingly, while extending complete and unconditional legitimacy to the other parties to the war on Tigray, they have been reluctant to address the government of Tigray by its proper nomenclature.

In an attempt to salvage the state, the international community, the United Nations Security Council, and the African Union Peace and Security Council, in reaffirming their strong commitment to the sovereignty, political independence, territorial integrity, and unity of Ethiopia, have shown more regard for the interests of the significant players than for the aspirations of the Ethiopian people embodied in the constitution. In procedure and in substance, the future of Ethiopia should be addressed primarily within the framework of the constitution. The far-right Ethiopianist camp with which the prime minister associates himself is convinced that the Ethiopia it wants to see cannot be built on a democratic order. Unsurprisingly, its members are convinced that they are best served by a dictatorial regime that provides an unchallenged environment better suited to imposing their values. It is not a coincidence that they blindly supported the Haile Selassie imperial order, the Derg dictatorship, and now the Abiy genocidal regime. In a nutshell, given the modus operandi of a political elite in power that has consistently viewed the installation of a democratic order as an existential threat to its survival, the prospect of sustaining Ethiopia as a democratic republic would require a generous optimism. And yet the convening of an all-inclusive dialogue could regulate the constitutional state succession, with the constitution providing the rules of engagement.


Equally, while the US and the European Union have declined to recognize the election process, the communiqué of the first meeting of the African Union Peace and Security Council on 8 November 2021 repeated the strategic mistake made by the UNSC in its Resolution 2216 on the civil war in Yemen. Notably, the reiteration against any attempt at unconstitutional change of government, together with the African Union’s unconditional recognition of the sham election and the “new government”, reinforces the utter bias of the African Union towards salvaging the Abiy regime under the guise of the Obasanjo-led negotiation process. In this regard, if the AU-led process is to make any meaningful contribution, the policy organs of the AU should start by recognizing the parties to the conflict equally, leaving the legitimacy question to the domestic stakeholders, and limiting their levels of engagement until a legitimate government is established through an all-inclusive transitional arrangement.

Incentivizing unacceptable behaviour 

It is public knowledge that the war waged on Tigray, the arbitrary arrests and undue harassment of credible political party leaders, and the non-convening of elections in three regional states were all attempts to obtain legitimacy by force. In a statement on 30 September 2020, the government of Tigray made it clear that decisions and laws passed by the Prime Minister, the Council of Ministers, the House of People’s Representatives, and the House of Federation after the expiry of their mandates are null and void. Therefore, since the convening of an election presupposes a constitutional mandate, the established government lacks constitutional and political legitimacy: the election was held by a regime without authority.

For any negotiations to succeed, the international community should refrain from deciding on the future of Ethiopia and attempting to salvage an irredeemable genocidal regime. Instead, it should give equal recognition to the parties to the conflict, leave the legitimacy question to the domestic stakeholders, and limit its levels of engagement until a legitimate government is established through an all-inclusive transitional arrangement.


Europe at War: Normal Service Has Resumed

After an eight-decade hiatus, Europe is again at war, and as with all European conflicts past, the invasion of Ukraine is about business; when capitalists want something, they find an excuse to start a war in order to get it.


Nobody should really be surprised by the conflict taking place inside the Republic of Ukraine. In their long modern history, Europeans have been at war, or in preparation for war, or recovering from one, longer than they have been at peace.

Western Europe has been the key driver of conflicts at home and globally for the last three centuries. European armies have war graves in just about every country on every continent.

The only surprise is that they have been able to keep their warlike behaviour in check for the last seventy-seven years (if we exclude the fighting that followed the break-up of Yugoslavia in the 1990s) since the end of their 1939-1945 war, which they spread to much of the rest of the world.

And even that peace was only because they managed, for once, to come to an agreement about the thing that drives their conflicts: money.

Ambassador Martin Kimani, Kenya’s permanent representative to the United Nations, did an important thing when he asserted the idea that Africans can also have an opinion on world events, drawing on lived African historical experience.

In his February speech to the Security Council, while criticizing the then anticipated Russian military entry into Ukrainian territory, Ambassador Kimani urged Russian leaders to follow the example set by Africa’s post-colonial leaders and simply accept post-empire borders as they are. He also urged them to put their faith in international diplomacy, in order to resolve such disputes.

Deep down, these words will sound strange to European ears on all sides of the Ukraine dispute. The historical record shows that this is simply not how these people do business, and certainly not the white powers of Western Europe (which birthed other white powers like the United States and Canada). For them, war is the norm, and when they say “peace”, they mean their successful imposition of conditions to their liking on the side they have defeated.


The conflict now located in Ukraine has been brewing for quite some time. It is an expression of a wider tension between the continuing ambitions of Western countries and their economic masters, and the interests of Russia in the various forms it has taken before, during and after becoming the world’s first, biggest and most powerful non-capitalist state.

There has never been a period of actual good relations between Russia and the Western European powers in over one hundred years. And places like Ukraine are where this has often played out. The great plains of Europe, lying between Russia proper and the powers of the West, made up of shifting, weaker states, have always been a buffer zone.

In the first phase, this was the fight between the German and Russian empires during the 1914-1918 war, which led to both the collapse of the Russian monarchy, and the dissolution of the German Empire.

The second phase was between 1920 and 1939, when various combinations of Western European powers sponsored rebellions, small wars and sabotage in an attempt to dislodge the communist-led Union of Soviet Socialist Republics (USSR) regime that had eventually taken over the Russian state following the collapse of the monarchy there.

This was only briefly suspended by the rise of fascism to state power in Spain, Italy and Germany, setting the conditions for the 1939-1945 war.

But much as it was dressed up as a war against the fascism of Hitler’s Germany, this was in fact a war against Germany’s attempt to re-establish an empire to replace the one taken from it under the terms of the treaty ending the 1914-1918 war. During that war, the capitalist Western powers were embarrassed to have had to make an anti-Hitler alliance with the very Soviet Union they had been trying to undermine militarily only a few years earlier.


The end of that war gave rise to the third phase, between 1946 and 1991, when the effort to remove the communists (whose reach had now expanded to control parts of central Europe) resumed and became an all-consuming fixation of Western statecraft. Now led by the United States, it re-oriented all Western political, diplomatic and military thinking to see the Soviet Union, and its satellite states, as the principal enemy.

It is in this phase, known as the Cold War, that institutions like the US-dominated military alliance known as the North Atlantic Treaty Organization (NATO: 1949) and the well-known American Central Intelligence Agency (CIA: 1947) in the West, and the rival Soviet-led Warsaw Pact (Treaty of Friendship, Co-operation and Mutual Assistance: 1955) in Eastern Europe, were formed. This phase officially came to an end with the collapse and dissolution of the USSR in 1991. The Warsaw Pact dissolved in 1991 while, conversely, NATO has just kept on going.

Comprising 15 member states by 1955, and committed to mutual defence for fifty years, NATO has never properly explained why it continues to exist. There are a number of contradictions. The Cold War itself did not last 50 years, and the NATO side won anyway, yet the alliance has gone on to admit fifteen new members, thus doubling its membership. What is more, the bulk of these new member states are former territories of the Warsaw Pact, with membership being offered to even more, such as Ukraine, which used to be part of the Soviet Union proper. In other words, NATO became twice its original size after the reason for its creation no longer existed.

This brings us to the fourth phase, running from 1991 to Russia’s February invasion of Ukraine, which is a whole story in itself.

Russia’s unease with this expansion—expressed in a number of failed diplomatic initiatives, with Ukraine increasingly at the epicentre—was never really taken seriously. The immediate trigger was a 2014 coup in Ukraine that brought a pro-West government to power. There followed a series of measures against the Russian ethnic minority of Ukraine, as well as proscriptions against the symbols and legacy (both good and bad) Russia had left in Ukraine during the communist era. In particular, there was the public rehabilitation of the legacies of fascist organizations that had collaborated with Hitler’s forces during the German invasions of the 1940s, and the public tolerance of new fascist organisations. It is one thing to wonder why anyone should find it desirable to join a political identity with such a record. It is another to ask why such politics should even be permissible in a society claiming to be civilized.

How this invasion ends will mark the start, and shape the nature, of the fifth phase.

Africans are not obliged to take sides. But there is a human obligation to share knowledge and experience, as Ambassador Kimani has done. And any call for the avoidance of armed conflict is a good thing.

More than once in the last century, Europeans have dragged us into their conflicts in a bout of global racism.

Therefore, scenes of Africans being discriminated against on the Ukraine-Poland border as they tried—like many other peoples in Ukraine—to flee the looming conflict, should have been expected.

European culture is racist, and it did not become racist when it arrived in the Americas, Asia and Africa; it was its racism that took it there in the first place. What is more, Europeans actually began their racism among themselves.

Eastern Europe is Slavic country. “Slavic” is how the Eurasian people described themselves, as a concept of praise. However, these people had been conquered in the 9th century (in other words, in yet another inter-European war), and had been reduced to what would now be called slavery.

So, Western European history ascribed a different meaning to the name. “Slavonic” was turned to mean “captive” in Latin.  “Slav” is where the word “slave” in Western European languages comes from.

European racism—now directed at mainly non-white people—may be less expressive and performative at home as compared to the settler spaces it created overseas, because it is less directly in the presence of black people, and it is also more secure and confident in itself at home. But it is always there; it is just a matter of opportunity and circumstance (such as a border).

The Nazi Germany era was in many ways a condensed form of the already 400-year-old white supremacist project that had seen white Europeans forcibly settle themselves in the Americas from the Arctic to the Antarctic, in Australia, New Zealand, and all of southern Africa. In all cases, these incursions (which Hitler called “Lebensraum”, literally “space for living in”, when he applied them to Eastern Europe) began with genocides, and were sustained by them.


Being hemmed in militarily, Hitler’s Germany found it necessary to mobilize its population massively. It did this by appealing to their racism: victimizing a significant minority in an acute intensification of perhaps the longest-standing racial prejudice in European public life, the vilification of people of Jewish descent, as well as by picking on its neighbours.

Underneath the usual romanticisation of the conflicts among Europeans lies the story of coal and iron. Until perhaps the 1960s, the Alsace-Lorraine region, which lies where the lands of France and Germany meet, held the largest known deposits of iron ore in the Western world. Together with the abundant supplies of the coal in the neighbouring regions, this created the opportunity for the bulk production of perhaps the most significant material to industrialization—steel.

On top of the already mentioned 1914-1918 British-German war, which led to Germany’s loss of its entire global empire as well as territory closer to home, and the 1939-1945 British-French-American-Russian war against Germany, Italy, and Japan, which left Europe militarily split in half for the following four decades, there had already been the Franco-Prussian war of 1870-1871, which had ended with German occupation of France. All of these were essentially conflicts over the Alsace-Lorraine region.

It was in the Soviet Union, which Germany gambled on also invading in 1941, that the beginnings of a reversal of fortunes for Germany took shape. This gave rise to the heroic politics of the Americans leading the massive landings on the shores of Western Europe in a race for Berlin, the German capital. The real panic was to capture Germany before the Russians advancing from the east did. Perhaps they feared that Russia would reclaim the territories it had conceded to Germany as part of the process of pulling out of the 1914-1918 war, in which the then-new communist regime had felt it had no side.


This is why from the day of the German defeat in 1945, up until its reunification in 1990, all the countries that had fought Hitler’s armies had their armies in the ridiculous situation of each controlling a cramped sector of Germany’s capital Berlin, while Berlin as a whole was itself deep inside Soviet-controlled territory (because the Soviet Union’s Red Army had overrun German territory before the Western armies got there).

Russia has always been governed by a cultural tension between its actual Asiatic roots and an underlying tendency to embrace a more Western European identity. This “Westernizer tendency” (as it is known) played a role in bringing the monarchy to the crisis in which the Russian Empire found itself. In trying to become more like the more industrialised powers to the west, Imperial Russia had become financially indebted to them.

The socialist revolution under the communist party put an end to this, and in so doing saved the Russian state from collapse.

All foreign investments, as well as locally owned private concerns, were nationalized. Furthermore, key elements of Slavic culture, such as language, were synthesized into education and science in a way that allowed for the rapid spread of education and technological knowledge. This enabled the country to make a rapid leap forward technologically, and to become an industrial and military power by the middle of the last century.

The political leaders were also adept at keeping the country’s enemies at bay through military and diplomatic manoeuvring. The coming to power of the Russian communists in October 1917 only intensified this, because under the Russian monarchy, Russia had been in alliance with the big powers of the West (France and Britain), in fighting Germany in the 1914-1918 war. It was of great use partly because, being to the east of Germany, Russia formed a whole other front. Those powers were very annoyed when Russia’s new rulers pulled out of the conflict.


From that moment, the fight was no longer over the respective profit-seeking factions of several empire states seeking to grab valuable territory and markets for themselves. It became a fight between all such factions collectively on the one side, versus a huge country taken over by a political party that was opposed to private profit-making to begin with, on the other.

But with this loss of a crucial ally in the ongoing war, three things were at stake for the powers to the west. First, Germany, which had only really united as one nation in the 1871 war (minus Austria; that would be organized later by Hitler), now had more opportunities and room to manoeuvre in the conduct of the war. There was the immediate possibility of Germany taking over all the installations and resources that the Russian forces had left scattered all over the eastern front, from Finland and Siberia to the central European plains.

Second, the substantial economic and war debts, mentioned earlier, that the powers to the west held over a broke imperial Russia were now at risk of not being honoured.

Finally, the prospect of the communist party finally taking power, especially in a major country, raised the prospect of communism (by this time a movement with nearly eighty years of struggle behind it) gaining popularity in all the major capitals of Europe. For (mainly Western European) capitalist governments, this would be a political disaster.

Germany lost the war anyway. And, as said, the big powers to the west immediately turned their attention to supporting a combination of Russian forces trying to remove the communists from power in a growing Russian civil war between 1920 and 1922. Eventually, after deploying a few military expeditions, and even engineering a couple of coup attempts, they gave up and went home. But they were to continue sponsoring Russian exile groups in sporadic incursions and attacks on the growing communist state for many years after, until 1939 when Britain and the United States, principally, needed to make an about-turn and form an alliance with the very same Soviet Union they had been undermining, against Hitler.

It paid off well; the record shows that Nazi Germany’s decisive defeat took place on the Eastern front, at great human and material cost to the Soviet Union. Soviet losses to Nazi Germany exceeded 26 million people, including some 10 million soldiers.

Therefore, beyond the earlier historic rivalries, by 1945 significant countries of Western Europe were collectively hostile to the Soviet Union, the culmination of a process that had begun shortly after the communist takeover of power in 1917, but which also predated it.

Indeed, as soon as the ’39-’45 hostilities ended in Europe with the capture of Berlin, the Western powers immediately reverted to a stance of armed hostility towards the Soviet Union. It is said that the legendary American general George Patton had to be removed from command because he was calling for an immediate attack on the Soviet forces in Germany, followed by an invasion of Moscow. This stance has effectively continued even after the demise of communist rule in Russia. The old game of lusting after the territories of the Balkans and beyond has resumed.

This then, is the Russian experience of Western powers, right from the start of the last century, whether as the Russian Empire, the communist state, or as the Russian Federation.

What began as a free-for-all among the competing, ambitious ruling classes of the various European empires developed, once Russia had been besieged by Western debt, into a quasi-unity of those ruling classes in a joint attempt to prevent the spread of communism among the ordinary people. Once that was achieved, they all went back to seeking economic advantage over the weaker parts of Europe. Hence the 1990s wars over the re-division of the Balkan states, and their seduction into the Western debt-based economic system.

Whether democratic or not, any Russian head of state would do well to understand NATO’s interest in Eastern European countries now bordering Russia in this context. President Vladimir Putin, whatever one may think of him, certainly holds a sense of this history.


Russia fears it may be seen as the next prize; the very name “Ukraine” literally means “border” or “frontier” in some Slavic languages. The only new development is that wealthy Russians probably also harbour the same ambitions, and wish to expand their own place in the Russian economy.

All this tells us Africans four critical things.

First, that these wars are about business: making money, or seizing territory to make money from later. When capitalists want something, they find an excuse to start a war in order to get it. These recurrent conflicts were only suspended for the last eighty years by the creation of a trade mechanism that enabled interested European countries to access resources for their domestic industries without having to physically control the territory. This mechanism began life as the European Coal and Steel Community, later became the European Economic Community and then the European Community. Today, it is known as the European Union.

To Europeans, fighting is normal. And they are very good at it, on the whole. They manufacture their own weapons, and make money out of that, too. War, for the European, is a relatively sustainable activity.

Even modest-sized European cities will have a monument (if not whole cemeteries) to the dead of more than one war. There are about 68,000 war memorials in the UK alone, and 3,000 war cemeteries in France.

Tiny Belgium holds about 800 military cemeteries for the 1914-1918 and 1939-1945 wars alone. It is why military-style language (e.g. “to pull a flanker”, and “to steal a march”, in common English) peppers a lot of casual Western speech. It is why most Western armed forces retain standing divisions trained to be quickly transported far abroad, and to fight in terrain very unlike their home territories. Europeans (and white America) are warmongers. That is the historical and contemporary record, quite contrary to the political propaganda they produce in their media and education systems.

The second lesson is that among white powers (of which the Russian state is one) there is never any real principle involved. Millions died fighting “Nazis”, only for the politicians that sent them to their deaths to recruit those very same Nazi leaders into their own programmes.


For example, one Arthur L. Rudolph was a German Nazi-era scientist brought to the United States in 1945 for his rocket-making expertise. He was even honoured by the United States National Aeronautics and Space Administration (NASA). He is considered to be the “father” of the Saturn V rocket upon which the Apollo moon-landing programme depended.

More directly, one Adolf Heusinger, a German general who served as chairman of the NATO Military Committee from 1961 to 1964, had in an earlier life been a colonel in Hitler’s General Staff, and had been directly involved in planning the invasion of the Soviet Union.

The Soviet Union also grabbed from defeated Germany as many Nazi scientists as it could lay its hands on. But their case was handled more like a reparative abduction than the offer of an entirely new comfortable life.

Today’s enemy may become tomorrow’s friend, and today’s friend was the enemy yesterday. It is just their culture of politics, war and diplomacy. Get involved at your own risk.

Third, that when these giants fight, they do so on an industrial scale. Their conflicts often spill across other borders and territories. Their weaponry brings mass death, and their logistical and human resource needs often suck in people who have very little to do with the actual cause of the conflict. In Africa, it is only Ethiopia that has come anywhere near this scale of war-making.

Today’s Democratic Republic of Congo was plundered for the rubber and copper needed to make tires and bullet casings for the 1914-1918 war. The Lumumba-led independence government fell victim to the Cold War rivalry over Congo’s uranium deposits as part of the America vs. Soviet Union nuclear arms race.

Hundreds of thousands of black Africans faced off and killed each other as loyal soldiers of the German and British armies fighting for German Tanganyika and British Uganda and Kenya, respectively.

Ukrainians, like all peoples everywhere, matter. That is why Ukraine’s real independence from either power is important to the rest of the world. For the Western powers, it would be nice to have Ukraine, but Russia as a whole is the real prize.

Whatever one may wish to now call it, Ukraine is a place of wealth and potential profit. It is the second largest country in Europe by area, holding significant reserves of uranium, titanium, manganese, iron, mercury and coal.

It is a world leader in the production and export of a whole range of agricultural products (corn, potatoes, rye, wheat and eggs, all of which are central to the processed food industry).


Ukraine is also a country with a significant body of advanced industrial knowledge.

And as with the Alsace-Lorraine, and the earlier wars to dislodge the Communist Party of the Soviet Union from power, this is about Western corporations (and Russian oligarchs) looking to increase their wealth.

This is not an African story, but it is certainly beginning to look like one. In the history of the conflicts of the modern world, certain zones stand out as having suffered from the accident of being located where strategic resources were to be found. Before the DRC, there was Western Europe and the Middle East. With lots of minerals and fertile land, all that is missing in Ukraine is a population too weak, too poor, and too divided to think and speak for itself. War, autocratic government, and CIA-sponsored “good governance” workshops have been known to supply those.

Therefore, Ambassador Kimani’s advice notwithstanding, Africans are better off staying away from all this, just as Ukraine would have been wiser to stay out of the Russia-NATO rivalries.

While the white powers were not fighting in Ukraine, they were still promoting fighting somewhere else. Now that they have kicked off in Europe as well, their unusual break of eight decades of peace at home is finally over. “Normal service has resumed.”

Europe is at war with Europe, in Europe, once again.


Reflections on Medical Ethics in the Era of COVID-19

The intense centralisation of health services has killed the doctor-patient relationship while hospitals have now become centres for gathering detailed patient information that is exploited by pharmaceutical companies.


According to William Ruddick, the wide field usually referred to as “medical ethics” comprises a range of disciplines, including medical ethics proper (primarily medical doctor-centred), healthcare ethics (including nurses and other healthcare providers), clinical ethics (focused on hospital case decisions with the aid of diverse committees and consultants), and bioethics (including general issues of reproduction, the fair distribution of organs and other scarce life-saving resources, and the protection of the biosphere). All discussions of medical ethics proceed from the assumption that, all things being equal, all patients have moral status. As Matjaž Zwitter observes in a chapter on “Moral Status”, “There should be no doubt that all of us with a capability of deciding about ourselves have moral status.” Zwitter further points out that only beings with moral status can be meaningfully said to have rights.

However, in our time, the misconception is widespread that science, of which medicine is a part, is all about “objective” observation of facts without any consideration of values (standards by which we judge some things to be good or bad, right or wrong, beautiful or ugly, and so on). Nevertheless, we human beings cannot live without values, because it is they that make life truly human by enabling us to choose our goals and the appropriate means of attaining them. Thus, in the introduction to his Medical Ethics: A Very Short Introduction, psychiatrist Tony Hope writes, “As my clinical experience grew so I became increasingly aware that ethical values lie at the heart of medicine. Much emphasis during my training was put on the importance of using scientific evidence in clinical decision-making. Little thought was given to justifying, or even noticing, the ethical assumptions that lay behind the decisions. So I moved increasingly towards medical ethics, wanting medical practice, and patients, to benefit from ethical reasoning.” In what follows, I examine the viability of the doctor-patient relationship, undergirded by medical ethics, in the era of COVID-19.

Principles of medical ethics in the era of COVID-19

One of the best-known texts associated with medical ethics is the Hippocratic Oath authored in ancient Greece about 2400 years ago. It required a person being admitted to the position of a medical doctor to swear by a number of healing gods to uphold certain ethical standards. The oath established several principles of medical ethics that are still considered crucial to the conduct of a medical doctor today. At the heart of medical ethics are questions regarding what is morally acceptable or morally unacceptable for a doctor to do in the course of caring for the sick. Three of the key issues in medical ethics are commitment on the part of the doctor to do only good to the patient, to respect the patient’s right to accept or decline a medical procedure, and to conduct medical research in line with sound ethical principles.

Doing only good to the patient

According to William Ruddick, the Hippocratic injunction “Strive to help, but above all, do no harm” is the ruling moral maxim in the doctor-patient relationship. In current discussion, this maxim has been codified in the oft-cited principles of beneficence (action to promote the good/welfare) and non-maleficence (refraining from doing harm). For the doctor to achieve these noble goals, he or she must be able to utilize their medical knowledge in a free atmosphere in which their only concern is the patient’s well-being, without having to worry about demands from an elaborate medical care bureaucracy. Yet in the era of COVID-19, the doctor has been turned into a functionary of just such a bureaucracy, receiving instructions from local and global health authorities, and being stopped from using certain medications even if he/she and his/her patient would have liked to use them.

Respecting informed consent

The principle of informed consent stipulates that the patient has a right to accept or decline a medical procedure after being duly furnished with information about what it entails and the possible positive and negative impacts arising from it. As I indicated in COVID-19 Vaccine Mandates in the Light of Public Health Ethics, the medical ethical principle of informed consent is based on the conviction that each and every human being is endowed with intrinsic infinite worth (dignity) and human agency (the capacity of the person to act out of his/her own uniquely human viewpoint). Underlying the two considerations is the assumption that the human person is a ‘know-er’, since it is impossible to adequately enjoy human dignity and human agency without knowledge. All this implies the idea of human rights—certain entitlements due to every person by virtue of his/her humanity.

Human dignity, human agency and human rights presume the autonomy of the individual. In his On Liberty, John Stuart Mill asserted the autonomy of the individual as follows:

“The only part of the conduct of any one, for which he is amenable to society, is that which concerns others. In the part which merely concerns himself, his independence is, of right, absolute. Over himself, over his own body and mind, the individual is sovereign.”

In a chapter on “Autonomy and Its Limitations”, Matjaž Zwitter highlights at least five characteristics of individual autonomy. Understood as the right to self-determination, autonomy includes the right to information and protection of privacy. In an ideal situation, a patient with full autonomy participates in all essential medical decisions, and consents to every invasive procedure. Nevertheless, even patients with full capacity have the right to transfer their autonomy to others such as family members, friends, or to their physicians. In cases where patients are unable to decide for themselves and therefore with limited autonomy, surrogate decision-making is justified. Nevertheless, a doctor is not morally obliged to respect a directive by a surrogate decision-maker if this directive is clearly against the patient’s interests. Some persons make advance directives, to be followed in case of their future incapacity to participate in decisions regarding their treatment. While such written or oral directives are helpful, their validity may be re-considered in situations that the person could not have foreseen at the time of making the advance directive.


Thus, in “COVID-19 Vaccine Mandates in the Light of Public Health Ethics”, I pointed out that, in the light of the notions of human dignity, human agency and human rights manifesting in medical care as informed consent, any measure imposed on the patient in the name of containing COVID-19 is paternalism, that is, the treating of adults as though they were children. This is as true in medical care, where the doctor-patient relationship is in operation, as it is in public health policy, where health authorities institute measures for the welfare of populations.

Research ethics in medicine

Progress in the medical field rides on research, but therein also lies the danger of the violation of the moral principles that ought to govern the doctor-patient relationship. Consequently, as Adebayo A. Ogungbure notes, the aim of medical research ethics is to ensure that research projects involving human subjects are carried out without causing harm to the subjects involved.

One of the most outrageous violations of medical research ethics was the “Tuskegee Study of Untreated Syphilis in the Negro Male”, of which Ogungbure writes:

“The Tuskegee Study of Untreated Syphilis in the African American Male was the longest experiment on human beings in the history of medicine and public health. Conducted under the auspices of the US Public Health Service (USPHS), the study was originally projected to last six months but ended up spanning forty years, from 1932 to 1972. The men used as subjects in the study were never told that they had the sexually transmitted disease. The term “bad blood” was coined to falsely depict their medical condition. The men were told that they were ill and promised free care. Offered therapy “on a golden platter”, they became willing subjects. The USPHS did not tell the men that they were participants in an experiment; . . .

Though the study was organised and managed from Washington, the participants dealt with a black nurse named Eunice Rivers, who helped with transportation to the clinic, free meals, even burials. The project did not stop until Peter Buxtun, a former PHS venereal disease investigator, shared the truth about the study’s unethical methods with an Associated Press reporter.”

As Ogungbure further explains, the health authorities went to great lengths to ensure that the men in the “Tuskegee Study” were denied treatment, even after penicillin had become the standard cure for syphilis in the mid-1940s. He points out that the ignominious study only came to an end when the Associated Press published a well-researched article about it by whistle-blowing reporter Jean Heller. As a result, writes Ogungbure, congressional hearings about the Study took place in 1973, and the following year the United States Congress passed legislation creating the National Commission for the Protection of Human Subjects of Biomedical and Behavioural Research. Apologising for the Tuskegee Syphilis Study on 16 May 1997, President Bill Clinton described it as “deeply, profoundly, and morally wrong”.

Yet in the early years during which African Americans in Alabama were being ravaged by the Tuskegee Syphilis Study, Adolf Hitler’s regime in Germany was busy conducting, in the name of eugenics (a set of beliefs and practices that aim to improve the genetic quality of a human population by excluding people and groups judged to be inferior or promoting those judged to be superior), grossly unethical research on segments of the population that it considered inferior to the mythical Aryan race. Ogungbure explains that one of the consequences of the atrocities committed by Nazi Germany was the drafting of the Nuremberg Code by an international panel of experts on medical research, human rights and ethics, which served as the initial model for those few public and private research and professional organisations that voluntarily chose to adopt guidelines or rules for research involving human subjects.


The following are the ten basic principles of the Nuremberg Code: Seek the voluntary consent of the human subject; conduct only an experiment that is necessary, and whose results will promote the good of society; an experiment on humans ought to only follow experiments on animals; an experiment ought to avoid all unnecessary physical and mental suffering and injury; no experiments likely to cause death or disabling injury should be undertaken; the humanitarian importance of the problem to be solved by the experiment ought to exceed the degree of risk involved; the experimental subject should be protected against even remote possibilities of injury, disability or death; an experiment ought to be conducted only by scientifically qualified persons; the human subjects should be at liberty to opt out of an experiment at any stage; the scientist in charge must be prepared to terminate an experiment at any stage if he/she has any reason to believe that its continuation is likely to result in injury, disability or death.

Several other documents on medical research ethics have been issued since the Nuremberg Code, including the World Medical Association’s Declaration of Helsinki of 1964, which has been revised several times since, and the United States’ Belmont Report of 1979.

Centralisation killing the doctor-patient relationship 

One of the dominant trends in our day is the centralisation of services. Those of us who are older recall a time when the branch bank manager had considerable freedom to make decisions. Then, with digitisation, came the motto “Every Branch is Your Branch”, because every branch was now directly linked to the head office. What we were not told was that the branch manager would henceforth be a mere functionary who had to wait for decisions on every minor detail from the head office. We have witnessed similar developments in the university system: the Commission for University Education (CUE) now exercises massive control over the operations of universities in Kenya, so that although there are over forty public universities in the country, CUE requires all of them to operate along the same lines, leaving lecturers very little room to exercise their time-honoured academic freedom.

Similarly, as hospitals have grown in physical size as well as in the number of personnel, so have their centralising bureaucratic procedures (“red tape”). Doctors have to comply with elaborate protocols put in place by hospital management to avoid or reduce the number of court cases filed against the hospitals. At the same time, elaborate chains of command pile pressure on doctors to comply with hospital policies even when those policies are contrary to patients’ interests. For example, one leading hospital chain in Kenya requires doctors to “request” three different tests on every patient suspected of having malaria, significantly raising the patients’ bills. If the results show that patients do have malaria, doctors in the hospital chain are again duty-bound to administer only a specific set of drugs. In short, doctors have very little say in the whole process.

Furthermore, hospitals have now become centres for gathering detailed information about patients for purposes other than the patients’ welfare. Many of my Kenyan readers have probably noticed that they can only purchase drugs or have tests done in private hospitals after giving their phone numbers to the personnel at the front desk. This enables the hospitals to access a patient’s personal records and build a detailed history of all the drugs and tests that he/she has procured from that hospital over the years. This is precisely the kind of information which large pharmaceutical companies are eager to buy from the hospitals for a handsome price. In the era of the Fourth Industrial Revolution, pharmaceutical companies use artificial intelligence to analyse this massive information (“big data”) and gain a very clear picture of trends in the health of individuals and populations, enabling them to design business plans that bring them massive profits.

As hospitals have grown in physical size as well as in the number of personnel, so have their centralising bureaucratic procedures.

The death of the doctor-patient relationship on the back of intense centralisation has already taken a huge toll on the quality of healthcare in Kenyan hospitals. According to the “Kenya Patient Safety Survey” conducted by the Ministry of Health in 2013, a patient’s safety could not be guaranteed in the majority of medical facilities in the country: only 13 of the 493 public and private health facilities surveyed in 29 counties achieved a score greater than one on an ascending scale of 0 to 3. The report stated, “Overall safety compliance was relatively poor, with less than one per cent of public facilities and only about two per cent of private facilities achieving a score greater than one in all five areas of risks assessed.” For instance, less than 10 per cent achieved a score greater than one in providing safe clinical care to patients. Of the 13 that scored more than one, 11 were private facilities and only two were public. Furthermore, less than six per cent of public hospitals achieved a score greater than one in having a competent workforce. According to the report, this state of affairs had in some instances resulted in death.

Moreover, in mid-2015, twenty-eight children in Busia County became partially paralysed as a result of medical malpractice. According to The Standard, the children’s partial paralysis arose from injections given in the six months between December 2014 and June 2015, although those with severe paralysis reported initial complaints after treatment in 2013. The Standard quoted then Cabinet Secretary James Macharia as stating, “Our initial investigations point towards medical malpractice from inappropriate injection techniques as the primary cause of partial paralysis in all the 28 children.”

This is precisely the kind of information which large pharmaceutical companies are eager to buy from the hospitals for a handsome price.

Yet in the era of COVID-19, the heavy centralisation of hospital operations that is stifling the traditional doctor-patient relationship has reached an unprecedented level. COVID-19 testing and treatment are heavily centralised and meticulously directed by the highest health authorities in each country. In many countries, doctors are strictly forbidden from using COVID-19 vaccines and therapies that have not yet been approved by the World Health Organisation (WHO) and adopted by their own top health authorities. Countries are also required to share their data on COVID-19 infections, hospitalisations, vaccinations and deaths with the WHO. Many national health authorities run online COVID-19 databases through which citizens can register for vaccination, download their vaccination certificates, and show proof of vaccination. Various governments also have arrangements among themselves to verify the authenticity of international travellers’ vaccination certificates or passports.

Furthermore, David Ngira and John Harrington inform us that WHO recommendations are generally used as a form of quality control by domestic regulators, who view them as a guarantee of safety and effectiveness. Ngira and Harrington also point out that many African states have relied wholly on the WHO Global Advisory Committee on Vaccine Safety, given their weak national drug regulators and the limited capacity of the Africa Centres for Disease Control and Prevention (Africa CDC). The Africa CDC itself deems vaccines safe for use by member states on the basis of WHO recommendations. This means that the doctor no longer has the latitude to guide his/her patient strictly on the basis of his/her medical training and experience, but must instead follow protocols formulated by local and global health authorities.

Thus, in the face of the intense centralisation of medical care in the era of COVID-19, a doctor acts only on “orders from above”. In such a situation, the time-honoured principles of medical ethics, namely, the single-minded promotion of the good of the patient, confidentiality, respect for the patient’s right to informed consent, and the imperative of moral integrity in medical research, all of which held the doctor personally responsible for what he/she did in the course of his/her work, are inconceivable. The loser in this undesirable paradigm shift is the patient; the winner is the wealthy who have turned medical care into a business. I recently heard a senior Israeli medical professor state that when politics is mixed with science, all that remains is politics. To which I add that when medical care is mixed with business, all that remains is business.
