Facing the fallibility of the human body

Wednesday 8 September I had woken up reminding myself to chat to Anson for tips on having an MRI, as I was scheduled for one the next day. Anson was the ideal person to reach out to – not only had he experienced his fair share of MRIs over the past year following his diagnosis with head and neck cancer, but he is also a dear friend, colleague and mentor. Professor Anson Mackay was one of my three PhD supervisors and, based at UCL, was my primary point of contact during the split-site component of my PhD, which I spent at the Environmental Change Research Centre in UCL's Geography department. Since completing my PhD, Anson has become my most valued mentor, an enthusiastic research collaborator, and really a very dear friend. 

As someone who is far more comfortable hiding away from fear than facing it, I managed to ‘forget’ to contact Anson throughout that day. The idea popped into my head every once in a while, only to be squashed by the next conference talk that I hopped onto Zoom for; I was in the middle of the SSAG/SAAG Conference that week, presenting two papers and mentoring a large group of my postgraduate students who were presenting for the first time at an academic conference. I didn’t even think I had that much to fear – the MRI, MRA and Doppler were very much a precaution to rule out anything sinister. I have struggled with my thyroid this year. I have an autoimmune thyroid condition – Hashimoto’s thyroiditis – which is treated by replacing the thyroid hormones that the body under-produces. At the beginning of January, these thyroid levels were being quite dramatically over-replaced through over-medication and an acute thyroiditis, resulting in a range of unpleasant symptoms of hyperthyroidism. Once this was detected and my dose of thyroid replacement hormones was reduced, it took about three months to reach the normal range, and a further three months to start to feel myself again. The first three months coincided with the very stressful preparation and teaching of an all-online module to our first years, as the second wave of COVID-19 was in full force. There was no time for sick leave, and I just pushed through, one day at a time. Back to the present: at the nine-month mark I was still struggling with occasional dizziness, and hence it made sense to do an MRI (and MRA and Doppler of the carotid artery) to check that there wasn’t anything more sinister behind it. I was largely convinced there was not. 

I saw Anson’s name pop up in my inbox that evening – and as I saw it, I remembered again to ask him for tips. His email, although partly discussing possible projects and PhD co-supervision, was also letting me know that the CT scan he’d recently had showed abnormalities. My heart sank. He was still optimistic in that email that it could be a red herring, and gave me some really excellent advice on how to stay calm in an MRI. 

I did manage to stay calm, and thank goodness the MRI, MRA and Doppler came back clear. I did wonder briefly, while lying in the MRI, whether my productivity and success were perhaps just an Amelia Shepherd-type brain tumour (Grey’s Anatomy fans would understand). Fortunately not. Anson, however, was not as lucky. After fighting and overcoming neck cancer in 2020, and subsequently running multiple ultramarathons, he has just been diagnosed with a new primary cancer – this time of the lungs. I received a text from him to let me know last week, shortly before he started publishing on his cancer blog again. The news is devastating. To me personally, but also because I can only just begin to understand what this has felt like for him and his partner David. I am scared – one of the most important and influential people in my life has cancer for a second time; this time it is far more difficult to treat, and so he’s looking at what life with cancer will look like, rather than the path to eradicate it. We have so many projects planned, projects that involve helicopter trips to South Africa’s highest peak, road trips through Lesotho to collect samples, and so much data to analyse. I have so much still to learn from Anson – about diatoms and isotopes, about academia, and about life. I need to keep reminding myself that I will still have time to learn from Anson; we just might need to both pace ourselves a bit better.

It also all feels so terribly unfair. Anson has run ultramarathons in the past 18 months! The feeling of illness being unfair is one I feel in relation to both Anson’s diagnosis and my own health challenges – a year ago I was pushing myself too hard, working long hours, taking on too much, and struggling to handle stress effectively. All very standard in academia, and indeed often glamourised. I had been doing this my whole life. It is no surprise, then, that I would eventually have to face the fallibility of my own health, and at the beginning of the year it was scary not understanding what was wrong and whether this was all just my thyroid. However, nine months later I have taken a five-month sabbatical (albeit still a busy one, with many research deadlines and commitments to various societies), walked at least 2 km each day (most days 3.5 km, some days 8 km), developed a better sleep cycle, reduced my workload to something a little bit more manageable, started doing yoga regularly, and eaten a more balanced diet. Yet I still struggle with my health fairly often; I still don’t feel myself as reliably as I used to, and can’t just push myself to handle the next thing that lands on my plate. It feels unfair – I am taking things slowly, I am paying attention to my health, and I still don’t feel quite right. I don’t know how to feel that way again, and while I do want to feel reliably healthy, perhaps I shouldn’t want or aspire to go back to doing anything and everything. 

Life throws curveballs at us, and the lessons aren’t always clear. Sometimes bad news is just bad news, and if this year has taught me anything, it is to sit with the fear, the loss, the devastation and just feel it. As difficult as that is. And then learn – how to make work more sustainable, how to enjoy each day, and how to become the best parts of the people I most look up to.

The ones left behind

Last week, I cycled past a bus reading ‘5G – don’t get left behind’ on its back. This very bus drives through Cape Town’s city centre and its more affluent suburbs, but also transports many workers who come in from low-income areas. The message bothered me. It was there to sell a product and thus not necessarily meant to convey a meaningful message. Still, it did echo assumptions that I find to be prominent in discussions on digital media and technological developments more broadly.

For one thing, there is the premise that humanity will improve with increasing access to information. Information flows tend to be almost religiously celebrated as having supreme value in and of themselves (also referred to as dataism), as being inherently progressive, and as levelling social playing fields.

Presenting technisation as a lofty ideal or a superior mode of being to achieve, rather than something created from a particular vantage point, effectively veils the authoritative regimes of the technological revolution we are currently witnessing. This includes the cultures and values embedded in tech products. Very few women and people of colour are hired in tech industries, leading to the development of problematic algorithms.[1] Even more problematically, designs and codes are presented as neutral and gender- and colour-blind, much like the employment politics in big tech.[2]

Adding to their opacity is the fact that tech products are often portrayed as independent actors. Power relations precipitating unequal access to resources that tie in with social, economic and educational developments are, consequently, neatly brushed under the discursive carpet. Framing access as a matter of capability and choice (reach it, grab it – or else get left behind) rather than something that forms part of a historical development supports the prioritisation of the needs of some while the experiences of others (those who cannot reach) are rendered even less visible and relevant for imagined futures.

In Cape Town, where the geographic, economic and social divisions of Apartheid are notoriously persistent, the ‘don’t get left behind’ paradigm seems particularly cynical. It foreshadows an even more unequal future and places the responsibility for ‘being left behind’ onto individuals unable, for example, to invest in 5G products. This form of exclusion severs itself from problematic histories of division and portrays the divisions to come as both avoidable (one can make the “right” choices and catch up with tech) and as an inescapable future of insiders and outsiders – much like the narratives of numerous sci-fi plots.

[Image: ‘Why sci-fi could be the secret weapon in China’s soft-power arsenal’, Financial Times]

It was while studying Tinder that I grew increasingly intrigued by what lies behind the shiny, promising exteriors of technologies and artificial intelligence (AI). This is why I want to continue studying their impact on our well-being, social identities, politics, economies and demographic developments. Something I am very curious about is the role of algorithms in how we as their users come to understand ourselves, the world around us, and how we relate to others. I’m especially interested in the impacts of technologies on relationships of trust.

The more I read about AI more broadly, the more I find myself getting irritated with its overly positivistic representations. Especially when people like Amazon CEO Jeff Bezos shamelessly flaunt their extraordinary wealth by taking a quick trip to space in a phallic-shaped rocket – and making some extra cash by selling spare seats to similarly wealthy people.

When products like the new Tesla humanoid robot named Optimus are developed, and when Amazon’s AI assistant Alexa seems to have learned a little too much about your habits, it is useful to think back to Bezos’s phallus-shaped rocket – just as a memento of how the products we are sold as progressive are anything but neutral, nor are they necessarily designed for our needs. While there are well-intentioned inventions (especially in the medical field), AI and big tech should not be treated as inherently superior approaches to human sense-making, but rather as complementing it if well developed. This is because tech solutions are not “semi-sentient”, as ultrarich AI enthusiast Elon Musk promises his new human-replacement robot will be; they only have the “sense” of morality that has been encoded in them.

If left unchecked, the trajectory of dataism may very well be to the detriment of humanism. Thankfully, this is not a sci-fi movie or a zero-sum game. We are in a position in which we can still decide just how to handle these seemingly inevitable developments that are sprung on us from Silicon Valley and co. We can contextualise them and look at them as the political and socially momentous projects that they are. “Don’t get left behind” messages in this context should serve as a wake-up call. But instead of letting them induce panic and self-questioning as the advertisers appear to intend, we should treat them as a reminder to consider people at the margins and to design appropriate interventions, instead of placing blame in the most inappropriate ways.


[1] For more on this, read D’Ignazio, C. and Klein, L.F., 2020. Data Feminism. MIT Press.

[2] See Noble, S.U., 2018. Algorithms of Oppression. New York University Press.