Our Rest and Choice When AI Does the Work
edition seventy-five of the newsletter 'data uncollected'

Welcome to data uncollected, a newsletter designed to enable nonprofits to listen, think, reflect, and talk about data we missed and are yet to collect. In this newsletter, we will talk about everything the raw data is capable of – from simple strategies of building equity into data+AI processes to how we can make a better community through purpose-driven analysis.


Someone asked me a brilliant question recently: Do you feel anger when you see what these big tech companies are doing with their technologies, even as we somehow still need to use those technologies?

I will extend my response to that question here: yes, I am angry.

I am angry because

  1. CRM vendors and other product companies are behaving in ultra-capitalistic ways, with rapidly changing subscription fees and pricing bands.
  2. Tech companies are introducing tiered “premium memberships” not by adding more quality to the existing tiers, but by stripping decent services out of the “basic” membership and parking them in higher-paid bands.
  3. Big tech companies are acting way out of character in response to this social and political moment, supporting missions, products, ideas, and spaces that are far from the values of justice or inclusive innovation.
  4. We, as day-to-day consumers, are so easily collected, measured, and segmented by these platforms, with whom we no longer share values.
  5. Not every individual or nonprofit has the privilege to immediately boycott products—and now many are wondering where they stand with these misaligned actions.

But anger doesn’t take us far.

We are living a moment in our arc of humanity where artificial intelligence is marketed as the ultimate solution for overwork. Headlines promise us more time, greater efficiency, and the gift of “focus.” Tools advertise themselves with soft colors and calming language: “Let us handle the busywork so you can get back to what matters.” The subtext is—AI will give you rest.

And now I am struggling (which is what we are exploring today)—are we truly able to rest when AI takes over?

Because if you pause and look closely, a deeper question emerges: who actually gets to rest when AI does the work, and who doesn’t?

This is not just theoretical curiosity on my part. I am bringing this question to you and me because it connects to equity, labor distribution, and justice inside nonprofits and civil society.

Whether this story is one of horror or magic is yet to be seen (or perhaps it belongs to both camps?), but let’s lay out some context for where we are.

AI today writes the grant application, summarizes the report, and drafts the donor letter. Sophisticated or not, the tools exist. You have seen those over-dramatic Facebook ads: people leaning back, sipping coffee, smiling with creepy confidence just because they used some AI tool. Great, except that for many nonprofit workers, the rest promised by automation doesn’t materialize. Automation speeds up certain parts of the operation, yes, but rest is a different story, because blank spaces on the calendar are easily filled with new expectations.

Say a staff member who once spent ten hours preparing a report is now told, “Use AI. With AI, you can do this in two!” Except that the hours saved are not given back as downtime. They are silently reallocated to new tasks, faster deadlines, or higher targets.

Somehow, AI efficiency quietly turns into a treadmill.

And this treadmill doesn’t run at the same speed for everyone. Rest is unevenly distributed, just as it has always been in our sector.

The same organization may use AI to streamline communications or strategy decks (say, in higher roles), while its frontline support staff and program coordinators still stay late navigating glitches when automated systems confuse vulnerable clients. Or, larger nonprofits with budgets to invest in sophisticated tools might genuinely lighten staff workload, while smaller groups rely on free versions, spending extra labor to monitor and correct what AI produces.

And while staff may find relief from some administrative tasks, community members at the receiving end might still face the frictions of belonging — language barriers, inaccessible systems, or risks of misaligned segmentations.

In short, that promised rest of one group may translate into more work for another. In this way, rest becomes not a shared outcome, but a privilege.

Meanwhile, frontline staff and volunteers, often women or people of color, absorb the pressures of “doing more with less.” The cultural norm of overwork in the nonprofit sector deepens further when AI is introduced into this culture without clarity and intention, risking the intensification of inequities.

There is another, quieter layer here that is harder to name. AI doesn’t just promise rest; it often threatens our feeling of choice.

When technology starts making decisions on our behalf—what task to prioritize, which story to tell, which data to highlight—it begins to narrow the field of possible actions before we even notice the narrowing.

The choice has been pre-made, and we are invited only to confirm it. The design of “smart systems” is often a design of quiet containment: fewer clicks, fewer steps, fewer options.

Efficiency feels like freedom until you realize you are moving only within the lanes someone else drew.

This sense of choicelessness is not accidental.

It’s built into how technology scales. A dashboard recommends what to track. A CRM decides who to flag as “lapsed.” A chatbot phrases the thank-you note in a way that shows empathy but doesn’t feel it.

These systems are not neutral; they carry the logic of the people who created them—the funders who define success, the vendors who define productivity, the executives who define efficiency. Each click of automation can make the worker feel smaller, less necessary, less autonomous.

Yet it is vital to remember that choice never truly disappears. It only becomes harder to see. The choice to question the default. The choice to slow down when the system asks you to hurry. The choice to re-insert human judgment in places where it was quietly removed.

Unless we take deliberate steps to protect and promote staff care and curiosity, AI can become another tool of extraction, pulling labor from people’s days without reinvesting in their well-being.

This is why I want to remind you and me in this edition: our organizations, which support and compose philanthropy, still have a choice—both moral and practical.

What if you treated AI not as a productivity tool, but as a redistribution tool? What if every hour saved through automation wasn’t absorbed into the endless backlog of tasks, but shared between people and the mission? What if completing 30 tasks (instead of 15) from a 40-task list were not the default reason, or point of pride, for using AI?

Imagine an organization that declares: “Every hour AI saves us, we return half to our people and half to our community.” That would mean AI cutting report-writing hours in half, and staff using the extra time to leave early on Fridays or to spend time in real conversation with partners. AI could summarize meeting notes not so leaders could cram another meeting in, but so every staff member could reclaim 30 minutes a week as genuine rest.

Choicelessness can fade the moment you name a new possibility.

But naming it will require our courage—because choice comes with responsibility. It means leaders must decide what boundaries to set around AI’s use, how to measure not only output but human well-being, and how to protect staff from the assumption that efficiency should always equal more work. It also means asking whether the time you save on your end creates new barriers for the communities you serve.

At the heart of this is a need for recognition that rest and agency are not indulgences but forms of equity.

Rest is not a reward for productivity; it is an indicator of trust. And choice - choice, my friend, is the foundation of dignity. Both are must-haves if we want systems that do not automate decision-making without accountability.

I say: when technology tells us, “This is the only way forward,” our quiet resistance—the act of asking, "Why this way? Who benefits?"—is what will keep us human.

The next time someone says, “AI will give you more time,” pause and ask: "Where will that time be used? Can it offer me rest in the ways I need rest? Will that form of rest still allow me to keep my choices?"

****************************************************************

*** So, what do I want from you today (my readers)?

Question for you: how do you define your rest if/when AI can do parts of your work?


