Information is Still Power: Who Will Determine Your Agent’s Access in the Agentic AI Era?

In the silent courts of ancient empires, where torchlight danced upon gilded walls, scribes guarded scrolls as if they were the very breath of life. Monarchs and pontiffs treated knowledge as their most guarded treasure, admitting only the chosen few into their inner sanctums. A monk’s illuminated manuscript passed hand to hand; each reader earned the privilege of knowing. Yet today, in an age where data floods every device and screen, information has become almost meaningless unless you hold the keys to its use. True influence now resides not in possessing facts but in commanding the gates through which they flow: our permission architectures. These invisible frameworks determine who exercises agency, who tastes the sweetness of recognition, and who languishes in the bitterness of exclusion.

This article unpacks how human societies have always leveraged information as power, how modern organizations translate that power into programmable permission layers, and how social media, this century’s most powerful architecture of trust, segments our dopamine responses to bind us ever tighter to its algorithms. Along the way, we’ll draw on insights from The Voter’s Guide to AI: Why Nearly Everything You’ve Been Told is False! and from a forthcoming work on ubiquitous connectivity, "access" as the architecture of permission and trust, and reasoning agents, weaving together history, psychology, and strategy to reveal the moral imperatives for redesigning our information commons.

1. From Sacred Scrolls to Digital Gateways

Before Gutenberg’s movable type democratized the written word, most knowledge was held in fortified monasteries, royal treasuries, and patriarchal courts. Information—medical treatises, legal codes, philosophical dialogues—traveled at the pace of a messenger horse, shrouded in secrecy. A copyist’s mistake could rewrite history; a lost codex could erase entire civilizations.

Yet the ancient paradigm contained within it the seed of modern power: scarcity. By restricting who could see and record, authorities controlled not only what people knew but also how they perceived the world. Priests who alone could read scripture shaped moral codes; scholars who decoded celestial charts determined the fates of kings. The physical act of turning a parchment page was itself an exercise of power.

Contrast this with our digital age: data is everywhere, yet raw information seldom moves the needle. Terabytes of tweets, research papers, financial transactions, and surveillance feeds traverse global networks every second. In such abundance, knowledge loses its scarcity value. Instead, power has migrated to the control planes, the digital gateways that define who can query, compute, modify, or share.

“Information is abundant. Access is what makes it actionable.”

Gutenberg shattered the monopoly on text; the Internet vaporized it altogether. But as physical gates fell, new digital barricades rose: APIs hidden behind paywalls, authentication walls too opaque even for seasoned developers, policy-as-code that only specialists can interpret. In this new ecosystem, the distinction between who has data and who can use it has never been more consequential.

2. Access: The New Currency of Control

When information is everywhere, the architecture of access becomes the fulcrum of influence. Carroll’s “Seven Degrees of Access Power” reframes our understanding of modern hierarchies:

  1. Connectivity: The baseline visibility of a system, simply being able to ping an endpoint.
  2. Discovery: Knowing where data and workflows reside.
  3. Authentication: Proving you are who you claim to be.
  4. Authorization: Holding the permissions that permit basic actions.
  5. Execution: The right to run operations, trigger transactions, and enact change.
  6. Orchestration: Chaining actions across multiple systems to achieve complex outcomes.
  7. Delegation: Empowering digital agents or proxies to act on your behalf, autonomously.
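Because the seven degrees form a strict ordering, they are easy to model in code. A minimal sketch in Python (the class and helper names are mine, not part of the framework):

```python
# Carroll's "Seven Degrees of Access Power" as an ordered enum.
# Names and the has_agency helper are illustrative, not from the article.
from enum import IntEnum

class AccessDegree(IntEnum):
    CONNECTIVITY = 1    # can ping an endpoint
    DISCOVERY = 2       # knows where data and workflows reside
    AUTHENTICATION = 3  # can prove identity
    AUTHORIZATION = 4   # holds permissions for basic actions
    EXECUTION = 5       # can run operations and enact change
    ORCHESTRATION = 6   # can chain actions across systems
    DELEGATION = 7      # can empower agents to act autonomously

def has_agency(degree: AccessDegree) -> bool:
    """True agency begins at execution, per the framing above."""
    return degree >= AccessDegree.EXECUTION

print(has_agency(AccessDegree.AUTHORIZATION))  # False: a spectator
print(has_agency(AccessDegree.DELEGATION))     # True: an agent
```

The ordering makes the article’s central distinction mechanical: the first four degrees compare below `EXECUTION`, the threshold at which observation becomes transformation.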

Possessing only the first four leaves you a spectator: you can observe but not transform. True agency emerges at the levels of execution, orchestration, and delegation. These are the layers where decisions are made, operations are automated, and influence radiates outward through both human and machine proxies.

The analogy to ancient hierarchies is striking. Kings commanded armies (execution); chancellors coordinated campaigns across realms (orchestration); envoys and governors acted as the monarch’s proxies (delegation). In digital feudalism, those with only “read” privileges are serfs, able to look upon the system’s riches but denied their use.

3. Trust Architectures and Digital Feudalism

As physical barriers crumbled, digital walls mushroomed. Today’s digital feudalism is enforced not by stone towers but by lines of code: access-control lists, role-based policies, identity brokers, and adaptive risk engines. When well designed, these systems can outpace human bureaucracies in fairness and auditability. When poorly designed, they replicate and amplify bias, consigning entire communities to the margins of the information economy.

Static Roles vs. Dynamic Policies

  • Static Roles: Traditional corporate IT often assigns permissions based on an employee’s title or department. Once granted, those rights persist until manually revoked, leading to privilege creep and insider threats.
  • Dynamic Policies: Next-generation platforms define access through context—time, location, device posture, risk signals. Permissions adapt in real time, rising and falling with the threat landscape.

Under a dynamic model, a user in New York might access sensitive data at 9 a.m. yet be locked out moments later if anomalous behavior is detected. Dynamic systems, however, require transparency: when a permission is denied, the user must understand why, lest trust erode and productivity grind to a halt.
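That transparency requirement can be made concrete in code. Below is a hedged sketch of a dynamic policy check that returns a plain-language reason with every decision; the context fields, thresholds, and function names are all illustrative:

```python
# Sketch: a context-aware ("dynamic") permission check that always returns
# a human-readable reason with its decision, so a denial is never a black box.
# All names and thresholds here are invented for illustration.
from dataclasses import dataclass

@dataclass
class RequestContext:
    user: str
    hour: int          # local hour of the request, 0-23
    risk_score: float  # 0.0 (benign) to 1.0 (highly anomalous)

def evaluate(ctx: RequestContext) -> tuple[bool, str]:
    if not (9 <= ctx.hour < 18):
        return False, "denied: outside business hours (09:00-18:00)"
    if ctx.risk_score > 0.7:
        return False, f"denied: anomalous behavior (risk {ctx.risk_score:.2f})"
    return True, "granted: context within policy"

print(evaluate(RequestContext("alice", hour=9, risk_score=0.1)))
print(evaluate(RequestContext("alice", hour=9, risk_score=0.9)))
```

The design choice worth copying is the return type: decision and reason travel together, so the user-facing layer can surface the “why” without a second lookup.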

Policy as Code

In high-performing, data-driven organizations, access policies are no longer scribbled on flowcharts; they are written in code, version-controlled alongside application logic, subjected to automated testing, and deployed through CI/CD pipelines. The benefits are manifold: consistency, repeatability, and the ability to simulate policy changes before they go live.
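A toy illustration of the idea, with rules as plain data that automated tests can exercise before a change ships (the schema is invented for this sketch and does not correspond to any particular policy engine):

```python
# Toy "policy as code": rules are plain data that can live in version
# control alongside the application and be checked by automated tests.
POLICIES = [
    {"role": "analyst", "resource": "reports", "actions": {"read"}},
    {"role": "admin",   "resource": "reports", "actions": {"read", "write"}},
]

def is_allowed(role: str, resource: str, action: str) -> bool:
    """Grant if any policy matches role, resource, and action."""
    return any(
        p["role"] == role and p["resource"] == resource and action in p["actions"]
        for p in POLICIES
    )

# The kind of regression check a CI pipeline could run before a
# policy change is deployed:
assert is_allowed("analyst", "reports", "read")
assert not is_allowed("analyst", "reports", "write")
```

Because the rules are data, a proposed change is just a diff, reviewable and testable like any other code change.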

However, policy as code introduces new challenges:

  • Complexity: As policies grow, so does the cognitive load on developers.
  • Opacity: Without proper documentation and tooling, even engineers struggle to decipher nested conditions.
  • Governance: Who owns the policies? How are conflicts resolved? What compliance standards apply?

If left unchecked, policy as code can become a black box, reproducing the very feudal structures it sought to transcend.

4. The Dopamine Economy of Social Reward

Human beings are hardwired for social bonding. From infancy, our brains link approval to survival. A caregiver’s smile triggers dopamine release; exclusion activates brain regions that overlap with those engaged by physical pain. Evolutionary pressures made social inclusion imperative: those who bonded survived, those who strayed perished.

Enter social media, the greatest orchestrator of social reward loops ever created. Platforms have weaponized dopamine by engineering three core mechanics:

  1. Notification Triggers: Each ping promises social validation. From a friend request to a comment on a post, the uncertainty of timing and magnitude mirrors the variable-ratio reward schedules known to reinforce compulsive behavior.
  2. Like and Reaction Systems: Beyond mere binary likes, modern networks use diverse reactions—hearts, applause, laughter—to fine-tune emotional feedback and quantify social standing.
  3. Follower and Share Counts: Public metrics become reputational capital, gamifying our interactions and turning our personal networks into arenas of status competition.

Through A/B testing and machine learning, platforms continuously refine which cues trigger the greatest engagement. In effect, they segment our dopamine responses, calibrating the frequency and magnitude of rewards to maximize time spent scrolling, posting, and craving the next hit of affirmation.
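The variable-ratio schedule behind notification triggers can be simulated in a few lines. In this sketch, a “reward” lands after an unpredictable number of actions; the parameters and function name are illustrative:

```python
# Sketch of a variable-ratio reward schedule: rewards arrive after an
# unpredictable number of actions, averaging one per `mean_ratio` actions.
# Seeded for reproducibility; all numbers are illustrative.
import random

def simulate_scrolling(actions: int, mean_ratio: int, seed: int = 42) -> list[int]:
    """Return the action indices at which a 'reward' (like, reply) landed."""
    rng = random.Random(seed)
    return [i for i in range(actions) if rng.random() < 1 / mean_ratio]

hits = simulate_scrolling(actions=100, mean_ratio=10)
print(len(hits), "rewards in 100 actions")
print("gaps between rewards:", [b - a for a, b in zip(hits, hits[1:])])
```

Running this shows roughly one reward per ten actions, but with wildly uneven spacing. It is precisely that unevenness, not the average rate, that behavioral research associates with compulsive checking.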

5. Degrees of Separation: Control Through Network Topology

Carroll’s “One-Degree World” posits that every entity—human or system—is one permission or agent away from every other. Social media brings this abstraction to vivid life through the topology of our networks:

  • First-Degree Connections: Friends, followers—trusted nodes where our self-image is cushioned.
  • Second-Degree Connections: Friends-of-friends—zones of semi-trust where shared context exists but fidelity diminishes.
  • Third-Degree and Beyond: The algorithmic public sphere where context is lost and trust is entirely managed by opaque recommendation engines.

Every piece of content is tagged with an invisible access score that influences who sees it. That score is determined by factors including:

  • Your posting frequency and engagement history
  • The reciprocity and strength of your connections
  • The platform’s broader objectives, be they growth, engagement, or ad revenue

With every share, like, and comment, platforms refine their perception of your influence footprint. Popular content receives broader distribution; fringe content is suppressed or sandboxed. Through this control of information flows, platforms steer not only what we know but also what we feel about ourselves and the world.
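To make the idea of an invisible access score concrete, here is a deliberately simplified, hypothetical scoring function. Real ranking systems are proprietary and far more complex; every weight and signal below is invented:

```python
# Hypothetical "access score": a weighted blend of the signals listed above.
# Weights and signal names are invented for illustration only.
def access_score(engagement_history: float,   # 0-1: author's recent engagement
                 connection_strength: float,  # 0-1: tie strength to the viewer
                 platform_objective: float    # 0-1: fit with platform goals
                 ) -> float:
    weights = (0.4, 0.35, 0.25)  # arbitrary illustrative weights
    signals = (engagement_history, connection_strength, platform_objective)
    return sum(w * s for w, s in zip(weights, signals))

# All else equal, a close friend's post outranks a stranger's:
print(access_score(0.5, 0.9, 0.5))
print(access_score(0.5, 0.1, 0.5))
```

Even this toy version makes the article’s point visible: the third input belongs to the platform, not to either user, yet it moves the score just as surely as the social signals do.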

6. Classical Allusion: The Covenant of Fellowship

In ancient Greece, the polis was more than a physical city; it was a covenant among citizens, a shared pledge to uphold laws and participate in the assembly. Only those who met property and birth-based criteria could vote, speak, or hold office. Thus, political agency was both a right and a privilege, earned through status and oath.

Today’s digital covenants are no less momentous. Every user contract and every terms-of-service document is an unspoken pact, a covenant of connection that grants us fleeting moments of agency in exchange for our attention, our data, and our identities. Yet few pause to read these modern oaths, much less question their fairness or negotiate their terms.

Imagine if, in the ancient agora, attendees were blindfolded to the laws they must live by, handed a tablet they could not read, and told to trust the priests who interpreted them. We have effectively replicated that scenario in the digital realm, and the stakes could not be higher: our reputations, mental health, civic participation, and even democratic processes hinge on these hidden frameworks.

7. Designing Ethical Architectures of Permission

If permissions are the new bastions of power, building them with moral rigor is imperative. An “agentic rights” framework can guide the redesign of digital gatekeeping:

  1. The Right to Inspect: Users must have transparent visibility into why they receive or are denied access. Permission labels, akin to nutrition facts on food, should disclose which signals and trade-offs govern data curation and content promotion.
  2. The Right to Challenge: When logic misfires—when an innocent post is shadow-banned or a benign user flagged as risky—there must be recourse. Appeals processes, automated review channels, and human oversight committees must be built into every platform.
  3. The Right to Delegate Wisely: Users should configure their own proxies (personal data agents, custom filters, trusted recommendation engines) rather than having them imposed. A civilization of self-sovereign agents empowers individuals to shape their own information diets.

Operationalizing these rights demands new tooling:

  • Permission Nutrition Labels: Human-readable summaries of policy as code, displayed alongside every data request and content recommendation.
  • User-Driven Policy Editors: Low-code interfaces enabling nontechnical users to adjust their own permission rules and thresholds.
  • Continuous Alignment Audits: Regular reviews testing whether reward loops promote well-being or exploit cognitive vulnerabilities, scored against clear ethical metrics rather than raw engagement.
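A permission nutrition label could be as simple as rendering a machine-readable policy into plain language. A sketch, with an invented policy schema:

```python
# Sketch of a "permission nutrition label": rendering a machine-readable
# policy as a plain-language summary. The policy schema is invented here.
def nutrition_label(policy: dict) -> str:
    lines = [f"Permission label for: {policy['name']}"]
    lines += [f"  - signal used: {s}" for s in policy["signals"]]
    lines.append(f"  - data shared with: {', '.join(policy['shared_with'])}")
    lines.append(f"  - appeal available: {'yes' if policy['appealable'] else 'no'}")
    return "\n".join(lines)

label = nutrition_label({
    "name": "feed ranking",
    "signals": ["engagement history", "connection strength"],
    "shared_with": ["advertisers (aggregate only)"],
    "appealable": True,
})
print(label)
```

The point of the sketch is the direction of generation: the label is derived from the policy itself, so it cannot silently drift out of sync with what the code actually enforces.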

8. Strategic Imperatives for Modern Leadership

CEOs, policymakers, and platform architects face an existential dilemma: to govern not only people but the logic that governs people. Four actions rise above the rest:

  1. Audit Response Latency: Measure the time between a user’s request, whether a data query or a content post, and the system’s grant-or-deny response. High latency frustrates users and erodes trust; near-instant response amplifies the illusion of agency, for better or worse.
  2. Codify Organizational Purpose: Beyond features and roadmaps, define your company’s values in machine-readable policies. If you champion community health over pure engagement, that imperative must be encoded into your recommendation algorithms and access-control systems.
  3. Empower the Edge: Hand real agency to community stewards (moderators, power users, third-party auditors) so that decisions reflect diverse perspectives rather than top-down dictates. Cultivating distributed governance can inoculate platforms against monocultural echo chambers.
  4. Institute Alignment Councils: Convene multidisciplinary ethics boards of philosophers, technologists, and behavioral scientists to oversee the evolution of access policies and reward mechanisms. These councils should meet regularly, publish findings, and wield real influence over product roadmaps.
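The latency audit in imperative 1 needs little more than a timer around the decision path. A sketch, where `handle_request` stands in for any real grant-or-deny logic:

```python
# Sketch of a response-latency audit: timing the gap between a request
# and the system's grant/deny decision. handle_request is a stand-in.
import time

def handle_request(payload: str) -> str:
    time.sleep(0.01)  # stand-in for real policy evaluation work
    return "grant"

def audit_latency(payload: str) -> tuple[str, float]:
    """Return the decision and the elapsed seconds it took to produce."""
    start = time.perf_counter()
    decision = handle_request(payload)
    return decision, time.perf_counter() - start

decision, seconds = audit_latency("read:reports")
print(decision, f"{seconds * 1000:.1f} ms")
```

In practice the value comes from aggregating these measurements over time and per decision type, so regressions in the permission path are caught the way performance regressions already are.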

Execution of these imperatives requires more than good intentions; it demands cross-functional collaboration between engineers, designers, legal teams, and external stakeholders. The greatest risk is not malevolence but myopia—building systems optimized for narrow business metrics while eroding the very social fabric they depend on.

9. Toward a Commons of Shared Agency

Beneath every tweet, every push notification, every API request, lies a fundamental question: Who decides what we see and why? The answer is etched into invisible gates—permission lists, ranking algorithms, policy as code. If left in the hands of purely commercial actors, these gates will persist as tools of control, shaping our beliefs, moods, civic participation, and even our sense of self.

Yet we can choose another path, one that reclaims power for individuals and communities:

  • Reframe Success Metrics: Elevate mental health indicators, civic engagement, and knowledge diversity above raw time-on-site and click-through rates. If a platform’s success hinges on user well-being, its architecture of permission must reflect that priority.
  • Foster Digital Literacy: Teach everyone, not just technologists, how permission architectures and dopamine loops operate. Empower users to navigate networks with intention rather than drift aimlessly into algorithmic traps.
  • Regulate Access Markets: Treat critical social platforms as public utilities, subject to fair-access mandates and transparency requirements. When networks become de facto town squares, they must uphold basic covenants of free expression and non-discrimination.
  • Invest in Counter-Architectures: Support open-source social fabrics where users own their data, deploy their own agents, and shape their own reward structures. Decentralized protocols can insulate communities from the whims of single vendors.

By weaving these strands together, we can transform our digital commons from arenas of manipulation into ecosystems of genuine collaboration, creativity, and trust.

Conclusion

In this age, the true fortress is not built of stone or steel but of code, the laws, policies, and algorithms that grant or withhold agency. Information flows unceasingly, but access remains precious. Through meticulously calibrated reward loops, social media and enterprise systems have honed our craving for acceptance into a science, fracturing attention, self-esteem, and society itself.

Yet the same principles that enable manipulation can be reclaimed for liberation. By recognizing permission as the architecture of power and by infusing our gatekeeping systems with ethical purpose, we can reclaim the narrative of influence. The question before every leader and every citizen is simple: will we remain serfs to invisible algorithms or become architects of a new, human-centered commons of connection?

Only by redesigning our permission architectures with moral rigor can we ensure that information once again becomes a force for collective enlightenment rather than a tool of division and control. In that reimagined world, the gates we build will no longer lock us in but invite us to step into shared agency—and ultimately, shared destiny.
