Andre Franca

When Technology Serves (Not Controls)

Have you ever asked yourself about the difference between technology that works for you and technology that quietly works against you? It's not always obvious at first. A tool can seem helpful, even essential, until you realize you've handed over more control than you meant to. The line between servant and master is surprisingly thin in our digital lives.

Take my phone, for instance. For years, I thought I was managing it just fine. I'd check notifications when they came in, respond to messages promptly, keep up with everything. I felt productive, connected, available. Then one day I caught myself reaching for it during a conversation with someone I actually cared about, and I realized the truth: I wasn't managing my phone. It was managing me. Every buzz and banner was a tiny interruption, a small theft of attention that added up to something significant. I was living in reactive mode, responding to whatever demanded my focus rather than choosing where to direct it.

The shift came when I started asking a simple question about every piece of technology I use: does this respect my agency, or does it assume I have none? Most social media platforms, for example, rarely respect agency. They're designed to maximize engagement, which is a polite way of saying they're designed to keep you scrolling even when you'd rather be doing something else. The infinite feed isn't a feature for your benefit. It's a feature that benefits from you. Long ago I deleted most social accounts and apps from my phone, not because I'm anti-technology, but because I'm pro-me. Mastodon, for instance, I still use occasionally on my computer, where the friction of opening a browser and typing a URL gives me just enough pause to ask myself if this is really what I want to be doing right now.

Email is another interesting case. I spent years with notifications turned on, treating every incoming message like it deserved immediate attention. The problem wasn't email itself, but the expectation it created that I should always be available. I turned off notifications a few years ago, and it felt radical at the time. Now it feels obvious. I check email when I choose to check email, not when someone else decides to send me something. The difference is subtle but profound. I went from being interrupted constantly to being in control of my own attention. Nothing terrible happened. The urgent messages turned out to be less urgent than they claimed. The world kept turning.

I've become particular about the tools I invite into my life. I use a plain text editor for writing instead of something with fonts and formatting and internet connectivity, because I don't want my writing environment trying to be helpful in ways I didn't ask for. I use a password manager because remembering dozens of complex passwords is a job I'm happy to delegate to software. I use a GPS when I'm genuinely lost or need to look at traffic conditions, but I try to learn the route so I don't need it next time. The pattern is consistent: I want technology that does what I tell it to do, not technology that tells me what to do.

The sneakiest tech isn't obviously controlling. It's the stuff that presents itself as convenience while quietly shaping your behavior. Recommendation algorithms are like this. They seem helpful, suggesting things you might like based on what you've liked before. But they also trap you in a bubble of sameness, showing you more of what you already know and less of what might surprise you. I've started deliberately seeking out randomness, browsing physical bookstores where algorithms have no power, asking friends for recommendations instead of letting Netflix decide what I should watch next.

Smart home devices are fascinating in this regard. I have none, though. That's a level of ambient surveillance I'm not comfortable with, no matter how convenient it might be to ask a disembodied voice for the weather. The convenience isn't worth the trade-off in privacy and agency.

It's easy to drift into dependency without noticing. Technology is seductive. It promises to make life easier, and often it does, but there's always a question of what you're giving up in exchange. I don't think the answer is to reject technology entirely. That's not realistic or desirable. The answer is to be intentional about which tools you use and why. To remember that you're in charge, not the software. To choose tools that enhance your capabilities without diminishing your autonomy. It's a constant negotiation, and it requires paying attention to how technology makes you feel and behave over time, not just in the moment of initial convenience. When technology serves rather than controls, it fades into the background, becoming an extension of your intent rather than a distraction from it. That's the kind of relationship with tech I'm trying to cultivate, one deliberate choice at a time.
