On tonight's doomscroll, 02 May 2025

My algorithm knows me better than my therapist

But does it really? Or is it just whispering vapid, predictable, and echo-chamber-y nothings?


Monica Loya, “Face filter”, oil on canvas, 50 x 70 cm

Three days into my breakup, I notice that my For You page has metamorphosed. Gushy reels about couples slow dancing to Florence + The Machine’s ‘Never Let Me Go’ (which I would immediately share with the former love of my life) have been swiftly replaced by handwritten quote cards that say things like “You’re too full of life to be half loved by someone”. Damn right, I think to myself. My next session isn’t until Sunday, and my therapist is blissfully unaware of the avalanche that’s rushing towards her. But my algorithm is all caught up. Perhaps it’s the quick scroll-aways from twosome content, the moving of my ex’s chat to the ‘General’ tab, or the nonchalant double-tap on that one post about get-over-him getaways—I can’t really tell.

Born in a generation characterised by early and extensive exposure to the internet, smartphones, and social media, I am no stranger to how the math works. This is not Black Mirror-level paranoia about the digital world. In fact, more often than not, it’s comforting to have a behind-the-scenes librarian of sorts pick up on the subtle cues and provide what I need the most, as I need it. Hooked on a recipe series? Part 7 is on my feed as it is published. Obsessed with a new stand-up comic? She appears at tasteful intervals with a segment from her set. Fixated on the latest tiff between two A-listers? I am offered the hottest take coming from that guy in Tasmania or that new development my favourite anon gossip handle has worked out. When it works, it’s almost magical.

Research by GlobalWebIndex (February 2025) finds that 63.9 per cent of the world’s population uses social media, for an average of two hours and 21 minutes every day. It is undoubtedly gratifying to have those hours of your day populated with curated pieces of content, cherry-picked from the chaos of the Internet. Yet, the algorithm doesn’t work remotely as well as those real-life tastemakers who have nourished my brain and helped me form nuanced opinions in the past. It doesn’t introduce me to the epicurean importance of pleasure when I have been fawning over the stoic glory of hardships, like my college professor once did. Or throw me a curveball, like my mother did when she made me exclusively read Indian fiction for almost a year as she felt my narratives were becoming too Eurocentric.

“That’s because the algorithm’s real job is not to recommend,” a friend who works at Meta, and wishes to remain anonymous, tells me. “It is to reinforce.” The filtering system that runs our feed feeds us only a combination of what we ask for and more of what we take a bite of. From our steadfast political opinions to our niche interests in fashion, technology, culture, sex, beauty and Pedro Pascal, it gives us more of what we already enjoy (or pretend to) without challenging our worldview. “The one job that the algorithm is made for is to keep you on the app, not make you feel uncomfortable and jump off,” my friend explains. It is therefore not surprising that while we find community in like-minded people and build para-social relationships with strangers, content creators and celebrities, we lose out on varying and complex perspectives. Our thoughts, ideologies and emotions are incessantly validated, creating a silo where everything else feels like a naked threat.

Don’t get me wrong. I love my screen time and am acutely aware of the convenience of the internet. Apart from the riveting deep dives and rabbit holes, I also enjoy the power to, say, look up (read: stalk) my romantic interests, catch up on my quota of Katy-Perry-in-space memes, or reverse Google image search a vintage Bottega Veneta to add to my wishlist. I am also the first to mildly scream “okay boomer!” in my head at anyone cribbing about how the World Wide Web has ruined us, because in many ways it has only made us better. But when we have access to so much, and yet allow algorithms to decide what we consume, we risk developing thoughts that are not nuanced at the very least, and that are not our own at the very most.


When was the last time you typed out a URL to read articles on a specific website without a call-to-action luring you in? Verified a nugget of history that a hastily filmed video narrated to you as you doomscrolled the night away? Double-checked if your hate for that actor is second-hand? As we consume neatly packaged information presented within comfortable ecosystems by giant tech corporations, we are more algorithmically complacent than ever—our preferences and opinions ready to be sold in ziplocked data sets, not meant to be multifaceted or paradoxical at all.

Our favourite platforms do give us masked options to opt out. Instagram has the ‘not interested’ button, while YouTube has settings to stop tailored suggestions. But that doesn’t solve the problem. It makes your feed look random while continuing to filter content according to location, age, sex and other hidden data points. “Algorithms, originally designed to systematically address problems—whether it’s finding the shortest path, sorting data, or making predictions—are now solving an issue heavily fuelled by the need to homogenise and capitalise on the human mind one way or another. They are doing exactly what needs to be done—creating spaces where everyone thinks in extremes, and consumes recklessly,” says Delhi-based psychologist Adrija Aggarwal.

I muster up the courage to talk to my therapist a week after my heartbreak. I mention the reels and their sudden tonal shift. Even though the spiteful quotes are absolute gems, I complain about how I would still like to see videos of couples that give me hope of rediscovering love. She hits me with a time-tested banger: “And how does that make you feel?” The question is often dismissed as tired mental-health speak, yet I can’t think of one more important in a world riddled with automated intelligence. If apps on our phones know us better than our kin, and if the machines curate our point of view, all we have left as a metric of our truth is how we feel about what we see. That makes me feel mysteriously human, if not somewhat optimistic.
