I am if you are, and if you aren’t I still am.

I am…

Take a moment. Breathe in. 

Say, “I am…” and the first few things that come to mind. Notice how these thoughts feel. Any words that follow “I am…” have the power to mould and manoeuvre your sense of self.

I am human. I am curious. I am kind. It is perhaps one of the greatest instincts of the human condition to attach ourselves to a sense of identity. This may be rooted in connection, community or companionship. Perhaps identity stems from creation, control, or ceremony. To construct a comfortable and assured interaction with the environment, we tell ourselves (and those around us) who we are. I am not my research, though I am working in the field of sleep science – diagnosing obstructive sleep apnoea in persons living with HIV. This involves tracking the brain patterns of a sleeping patient, as well as their breathing. I am constantly reminded to be humble in my knowledge acquisition.

I am a learner. I am a teacher. I am a neuroscientist. Effectively, this means I study the squishy, convoluted pink organ housed within the skull. This lump of biologically active stuff, which somewhat governs our lived experience, fascinates me so deeply that I am compelled to tell you why it is part of who I am.

As you read this sentence, your brain is making associations between what I write; the sounds in your environment; any aromas wafting past your nostrils; and even the temperature of your body. When you think back to this moment, your brain will recount – within milliseconds – all the sensations activated within you to remind you of this experience.

It is often claimed that the average human brain creates about 60 000 thoughts every day!

We can practise calming or stimulating our minds by the type and timing of awareness we employ. I might be so bold as to say this awareness is a series of thoughts. So, what is a thought? A thought is an electrochemical trace that occupies multi-dimensional space in your brain. A thought is the internal experience of how we process external stimuli. This internal experience relates to one’s senses and (new term incoming) somatosensation, or the sensory relationship of our bodies with the space around them – a tickle, an itch, a chill. We even have this epic internal ‘sixth sense’ called interoception – sensing what we feel within our bodies! In some ways, I agree that what we think, we can become.

Still, I am more than just my brain’s interpretations of my body’s sensations.

Humans have humanity. We adapt to circumstance and unite in hardship. I am an activist. I am an advocate. I am an ally. I situate myself at the intersection of neuroscience, public health, and social justice. I have more than just a love for science – I have a love for sharing science. This brings me to a chilling (but in no way “chilled”) fact:

In 2020, the Annual Mental State of the World Report showed that 36% of South Africans are living in mental health distress. Let that number sink in. That is more than one in three people. I dream of a day when we see this number crumble like the last rusk in the packet. My research aims will likely centre around this dream for as far into our future as I can imagine. This percentage is not the fault of our brains, but a psychosocial consequence of centuries of suffering and oppression.

Restructuring the paradigm of cognitive wellness requires not only including minority groups, but building new systems with excluded groups at the centre of our focus. While I have an ongoing love affair with the brain, I feel even more inspired by Black joy, trans joy and accessible places for people with disabilities. As I pursue my neuroscientific dreams, I want to cultivate safer mental health spaces and research outcomes for LGBTQPIA+ people, Indigenous peoples and disabled persons.

There is no quick fix for mental health reform, but I am committed to proactively prioritising both systemic and systematic wellness. I invite you to ask yourself, “Am I?”

The ones left behind

Last week, I cycled past a bus with ‘5G – don’t get left behind’ printed on its back. This very bus drives through Cape Town’s city centre and its more affluent suburbs, but also transports many workers who come in from low-income areas. The message bothered me. It was there to sell a product and thus not necessarily meant to convey a meaningful message. Still, it did echo assumptions that I find to be prominent in discussions on digital media and technological developments more broadly.

For one thing, there is the premise that increasing access to information will improve humanity. Information flows tend to be almost religiously celebrated as having supreme value in and of themselves (also referred to as dataism), as being inherently progressive, and as levelling social playing fields.

Presenting technisation as a lofty ideal or a superior mode of being to achieve, rather than something created from a particular vantage point, effectively veils the authoritative regimes of the technological revolution we are currently witnessing. This includes the cultures and values embedded in tech products. Very few women and people of colour are hired in tech industries, leading to the development of problematic algorithms.[1] Even more problematically, designs and codes are presented as neutral and gender- and colour-blind, much like the employment politics in big tech.[2]

Adding to their opacity is the fact that tech products are often portrayed as independent actors. Power relations precipitating unequal access to resources that tie in with social, economic and educational developments are, consequently, neatly brushed under the discursive carpet. Framing access as a matter of capability and choice (reach it, grab it – or else get left behind) rather than something that forms part of a historical development supports the prioritisation of the needs of some while the experiences of others (those who cannot reach) are rendered even less visible and relevant for imagined futures.

In Cape Town, where the geographic, economic and social divisions of Apartheid are notoriously persistent, the ‘don’t get left behind’ paradigm seems particularly cynical. It foreshadows an even more unequal future and places the responsibility for ‘being left behind’ onto individuals unable, for example, to invest in 5G products. This form of exclusion severs itself from problematic histories of division and portrays the divisions to come as both avoidable (one can make the “right” choices and catch up with tech) and as an inescapable future of insiders and outsiders – much like the narratives of numerous sci-fi plots.

[Image: ‘Why sci-fi could be the secret weapon in China’s soft-power arsenal’, Financial Times]

It was while studying Tinder that I grew increasingly intrigued by what lies behind the shiny, promising exteriors of technologies and artificial intelligence (AI). This is why I want to continue studying their impact on our well-being, social identities, politics, economies and demographic developments. Something I am very curious about is the role of algorithms in how we, as their users, come to understand ourselves, the world around us, and how we relate to others. I’m especially interested in the impacts of technologies on relationships of trust.

The more I read about AI more broadly, the more I find myself getting irritated with its overly positivistic representations, especially when people like Amazon founder Jeff Bezos shamelessly flaunt their extraordinary wealth by taking a quick trip to space in a phallic-shaped rocket and making some extra cash by selling the spare seats to similarly wealthy people.

When products like Tesla’s new humanoid robot, Optimus, are developed, and when Amazon’s AI assistant Alexa seems to have learned a little too much about your habits, it is useful to think back to Bezos’s phallus-shaped rocket – a reminder that the products we are sold as progressive are neither neutral nor necessarily designed for our needs. While there are well-intentioned inventions (especially in the medical field), AI and big tech should not be treated as inherently superior approaches to human sense-making, but rather, when well developed, as complements to it. Tech solutions are not “semi-sentient”, whatever ultrarich AI enthusiast Elon Musk promises of his human-replacement robot; they have only the “sense” of morality that has been encoded in them.

If left unchecked, the trajectory of dataism may very well be to the detriment of humanism. Thankfully, this is not a sci-fi movie or a zero-sum game. We are still in a position to decide how to handle these seemingly inevitable developments sprung on us by Silicon Valley and co. We can contextualise them and look at them as the politically and socially momentous projects that they are. “Don’t get left behind” messages in this context should serve as a wake-up call. But instead of letting them induce the panic and self-questioning the advertisers appear to intend, we should treat them as a reminder to consider people at the margins and to design appropriate interventions, instead of placing blame in the most inappropriate ways.


[1] For more on this, read D’Ignazio, C. and Klein, L.F., 2020. Data Feminism. MIT Press.

[2] See Noble, S.U., 2018. Algorithms of Oppression. New York University Press.