Agora
A Constitution for the Digital Age
Draft v0.6 — March 2026
Not in the fine print. In the agora.
Preamble
“men have become the tools of their tools” - Henry David Thoreau
Technology is more powerful than ever, yet it is often hostile, intrusive, or even dangerous. An unchecked market follows a predictable pattern: a useful tool gets monetized, the product suffers, and basic human dignity is violated along the way.
- Your data gets sold.
- The app breaks.
- What was once free becomes a subscription.
- Kids harm themselves.
The line is pushed to the limit of what legal statutes require, and often beyond it.
Complex user agreements, pop-up windows, “essential cookies,” and illegible terms and conditions all feign consent.
To rein in the power of technological systems, this “techno-social contract” delineates a rightful relationship between a human being and the system itself.
Constitution
We the people have a fundamental right to consent to power, whether that power resides with a government or a technological system.
Technological systems have power over people in many fundamental ways:
- They identify you — deciding who you are in the system, or whether you exist at all
- They track you — recording what you do, where you go, and when, and the resulting data can be lost, locked, sold, or erased often without your consent
- They decide for you — algorithms determine what you see, what you qualify for, what you’re denied
- They distract you — shaping your attention, your urgency, your priorities
- They don’t let you leave — making departure difficult, punitive, or impossible, trapping your data and your presence
- They answer to no one — when systems fail, remedies may be hard to come by
A technological system does not hold power by divine right, and any power it claims must be legible and accountable.
A system should acknowledge its subject as a human person rather than a user.
A just interaction requires an agreement written on behalf of those subject to the power of the system, not from within the system itself.
Those who build and control powerful systems, whether individuals, organizations, companies, or governments, should be held accountable for the systems they build, and should answer to the people when those systems fail to honor these rights.
The following articles delineate our rights in interacting with a technological system.
Articles
I — Identity
Identity is the most consequential distinction a technological system makes. When a system identifies an individual, that identity becomes part of the system.
A system that identifies a human individual must acknowledge, affirm, and act in accordance with their rights as set out in this document.
Typically, a system uses language that identifies individuals as “users.” It must instead recognize each human as an able actor, capable of consent and possessed of unalienable rights.
II — Record
When a system tracks the actions, movements, or transactions of a person, it creates a record. That record is not the property of the system. It is a shared artifact of the relationship, and the person it describes has rights over it: the right to see it, the right to challenge it, and the right to understand how it is used.
III — Restraint
A system must acknowledge the boundaries of what its purposes require. Not everything about a person should be captured, categorized, or made legible.
A system has an obligation not to know certain things. A person’s right to opacity exists in many contexts.
A system must operate within a bounded, clearly defined framework that extends only to its core purposes, and those purposes must align with the rights of the people it serves and interacts with. These core purposes should default to a narrow scope, and that scope should be justified through honest disclosure.
IV — Transparency
A system must present itself honestly to the people it serves. Its rules, its logic, and its actions must be legible to those it affects.
When a system makes a decision that impacts a person — approving, denying, ranking, recommending, restricting — that person has the right to understand what happened and why. When a system changes its rules, those changes must be communicated clearly and in advance.
A system that cannot explain itself to the people it serves has no legitimate authority over them.
V — Decision
When a system makes a determination that affects a person’s outcomes, opportunities, or standing, it exercises judgment. That judgment carries weight regardless of whether it was made by a human or an algorithm.
This article does not concern the general design choices inherent in any system. It concerns the specific determinations a system makes about an individual person that affect their rights, opportunities, or outcomes.
A system must not delegate consequential decisions to automated processes without clear justification and meaningful human review. A person has the right to know when a decision about them has been made by an automated system, and the right to have that decision reviewed by a human being.
The power to decide is not neutralized by the complexity of the system that exercises it. An algorithm is not an excuse. It is a choice made by the people who built and deployed it, and it must be answerable as such.
VI — Attention
A system that notifies, alerts, or otherwise demands the attention of a person exercises a form of power over that person’s time, focus, and mental energy.
This power must be exercised with restraint and respect. A person has the right to control when and how a system may demand their attention.
Notifications must serve the interest of the person, not the engagement metrics of the system. A system must never exploit human psychology to manufacture urgency, compel interaction, or prevent departure.
The attention of a human being is a faculty to be respected.
VII — Exit
A person has the right to leave any system, for any reason, at any time. This right is unconditional and does not require justification.
When a person chooses to leave, the system must provide a clear and accessible means of departure. Upon request, any records pertaining to the person must be either returned to them in a readable format, permanently removed, or — where retention is required by law or public obligation — clearly disclosed and minimally scoped.
Departure from a system must not be punitive, prohibitively difficult, or effectively impossible.
VIII — Accountability
Those who build, own, and operate systems governed by this document bear an obligation that cannot be delegated to the technology itself.
A system must identify the entity responsible for its operation. A person affected by the actions of a system must have a means of reaching a responsible human being — not solely an automated process. When a system causes harm through error, negligence, or design, there must be a clear path to remedy.
No system may claim that its complexity, scale, or automated nature absolves it of responsibility to the people it serves. The obligation follows the power. Where the system exercises power, the operator answers for it.
Enforcement
This constitution will only be enforceable when people demand it of corporate, political, and social organizations.
It begins with a commitment by those who agree with its principles.
Agora Declaration
Displaying the blue-shield “agora” icon is a public declaration that an organization, system, or individual is committed to building software on the principles of the Agora constitution.
License
This document will be maintained as a living text, revised through open and transparent process, and made freely available to all who wish to adopt, implement, or build upon it.
It is licensed under Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0).
You are free to share and adapt it for any purpose, provided you give attribution and distribute any derivatives under the same license.
(See LICENSE.md or https://creativecommons.org/licenses/by-sa/4.0/)