Al Newkirk
Project

Filtered-For Content.

An architecture for a new kind of social media platform — one that eliminates algorithms, moderators, and privileged user hierarchies by giving every user control over their own experience.

Whitepaper

Filtered-For Content (FFC) — RFC


01 — Definition

What is Filtered-For Content?

Filtered-For Content is an architecture for a new kind of social media platform built on three commitments: users control what they see, content spreads only through deliberate human action, and every account operates under identical rules. There are no moderators deciding what is acceptable. There is no algorithm deciding what is important. There are no special users who get different treatment. The platform provides the infrastructure. The user makes the choices.

02 — The Core Model

Three commitments. No compromise.

Follow-Based Delivery

Your feed contains content from the accounts you follow — in reverse chronological order. No public timeline. No trending page. No "recommended for you." The only way content from an account you do not follow enters your feed is if someone you follow reposts it.

Filtering Instead of Moderation

Every post carries structured content tags from a controlled vocabulary. Users set inclusion and exclusion rules to shape their feed. The platform does not decide what is appropriate. You do. Exclusion rules are absolute — boundaries are hard, not suggestions.
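
The filtering model reduces to a short predicate: exclusions are checked first and always win, then inclusions narrow what remains. A sketch, assuming posts carry tag sets from the controlled vocabulary; `passes_filter` and its parameters are illustrative:

```python
def passes_filter(post_tags: set[str],
                  exclude: set[str],
                  include: set[str]) -> bool:
    """Exclusion is a hard boundary: any excluded tag hides the post,
    unconditionally. If an include set is given, at least one tag must
    match it; an empty include set means 'allow everything else'."""
    if post_tags & exclude:      # absolute: one excluded tag hides the post
        return False
    if include:                  # opt-in narrowing
        return bool(post_tags & include)
    return True
```

A post tagged `{"politics", "satire"}` is hidden by `exclude={"politics"}` even when `"satire"` is included — boundaries beat preferences.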

User Equality

No verified accounts. No blue checkmarks. No influencer tiers. No moderator roles. No algorithmic boosts for popular users. Every account operates under identical rules, with identical capabilities and identical treatment. Influence is earned through content, not granted by the platform.

03 — What FFC Replaces

The architecture we discard.

  • Centralized moderation. Inconsistent at scale, politically fragile, and built on an underclass of traumatized workers reviewing the worst content the internet produces.
  • Algorithmic amplification. Engagement-based ranking that rewards outrage, divisiveness, and misinformation — because those generate the most clicks.
  • User hierarchies. Verification badges, influencer tiers, and moderator power that distort the platform in favor of whoever the platform decides matters most.
  • Engagement-driven advertising. A business model that turns user attention into the product, creating a structural conflict between profitability and well-being.

04 — What FFC Produces

The results that follow.

  • User control. Every piece of content in your feed arrived because you or someone you trust chose to put it there.
  • Structural anti-abuse. Follow-based visibility, username economics, rate limiting, and behavioral friction make spam operations expensive and low-reach — without a single moderator.
  • Aligned economics. Revenue from username leasing, paid content persistence, and opt-in advertising — not from maximizing time on screen.
  • Ephemeral by default. Content expires unless you choose otherwise. No permanent archive to weaponize. The platform is a stream, not a vault.

05 — Key Mechanisms

How the architecture works.

Discovery works without algorithms — through public lists, discovery pools, random profile browsing, and organic syndication. You find things through people you trust, through browsing, through shared interests, and through deliberate exploration. The entire podcast ecosystem proves this model works at massive scale.

Abuse is handled through structural friction, not surveillance. Five compounding layers — follow-based visibility, username cost barriers, follow rate limiting, human verification, and behavioral friction — make spam operations simultaneously expensive, slow, low-reach, and operationally fragile. A 10,000-account spam network costs roughly $100,000 per year and reaches almost nobody.
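
The cost figure implies roughly $10 per account per year; that per-account price is inferred from the stated total, not quoted directly. The arithmetic itself is trivial:

```python
# Assumed per-account cost: the whitepaper's "10,000 accounts costs
# roughly $100,000 per year" implies about $10/account/year in
# username leasing alone, before rate limits and verification friction.
USERNAME_LEASE_PER_YEAR = 10.00  # USD, inferred assumption

def spam_network_cost(accounts: int) -> float:
    """Yearly username-leasing cost for a spam network of this size."""
    return accounts * USERNAME_LEASE_PER_YEAR
```

The cost scales linearly with network size while reach stays near zero, since follow-based visibility means bought accounts start with no audience.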

Every post has a time-to-live. Once the platform reaches mass adoption, free content expires after 24 hours. Paid tiers extend that to 30 days, one year, or forever. The platform stores a fraction of what traditional platforms accumulate — and the revenue to pay for storage arrives with the demand.

FFC is bot-friendly and API-first. Automated accounts, scheduled posts, batch operations, and full API-driven management are first-class features. The distinction is not human vs. bot. It is consent vs. abuse.
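
What "bots are first-class" could mean at the API surface: automated and human clients build the same request. The endpoint shape and field names below are hypothetical, not a published FFC API:

```python
import json

def scheduled_post_payload(body: str, tags: list[str],
                           publish_at: int, ttl_tier: str = "free") -> str:
    """Build a JSON payload for a hypothetical POST /v1/posts endpoint.
    Bots and humans use the same endpoint, fields, and rate limits —
    the distinction is consent vs. abuse, not human vs. machine."""
    return json.dumps({
        "body": body,
        "tags": sorted(tags),      # structured tags from the controlled vocabulary
        "publish_at": publish_at,  # epoch seconds; a future time means scheduled
        "ttl": ttl_tier,
    })
```

Because every post declares its tags in the same structured field, downstream filtering works identically on human and automated content.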

06 — The Argument

Why this matters.

Social media is broken — not at the policy level, but at the architecture level. Centralized content moderation, algorithmic amplification, and privileged user hierarchies are not bugs in the system. They are the system. Reform within the current model has been tried. Every major platform has invested billions. The problems persist because the problems are the architecture.

FFC is not theoretical. Every component described in the whitepaper has precedent in production systems serving millions or billions of users. Follow-based feeds powered Twitter to 310 million users. Username leasing has operated for over thirty years in DNS. Ephemeral content built Snapchat to 750 million monthly active users. The question is not whether it can work. The question is whether we are willing to build it.