Meta’s New AI Training Plan: Your Mouse Movements Are The Dataset

Meta wants ‘computer-using agents’. Step one: record your staff clicking dropdown menus until the future arrives.

Meta is rolling out employee activity capture (keystrokes, mouse movements, occasional screenshots) to generate training data for its next wave of AI agents. It says this is about teaching models how people actually use computers, not grading workers. The subtext is that ‘agents that can do your job’ need a very boring kind of truth: how humans muddle through UIs.

What Happened (aka: ‘we are instrumenting the instrumenters’)

According to reporting by Reuters, via Ars Technica and The Register, Meta is deploying a tool called “Model Capability Initiative” to capture how US employees use computers: mouse movements, clicks, keystrokes, and periodic screenshots for context.
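Meta has not published its data format, so purely as illustration: a capture tool of this kind presumably emits timestamped event records along these lines (every field name below is invented):

```python
from dataclasses import dataclass, field, asdict
import time

@dataclass
class UIEvent:
    """One hypothetical activity-capture record (schema invented for illustration)."""
    kind: str                  # "mouse_move" | "click" | "keystroke" | "screenshot"
    ts: float                  # capture timestamp, seconds since epoch
    payload: dict = field(default_factory=dict)

# A few seconds of a made-up session: move, click a dropdown, type, snapshot.
trace = [
    UIEvent("mouse_move", time.time(), {"x": 412, "y": 88}),
    UIEvent("click",      time.time(), {"x": 412, "y": 88, "target": "dropdown#region"}),
    UIEvent("keystroke",  time.time(), {"key": "u"}),
    UIEvent("screenshot", time.time(), {"ref": "frame_000123.png"}),
]

# Serialized, this is the "very boring kind of truth" agents need.
records = [asdict(e) for e in trace]
print(len(records), records[1]["payload"]["target"])
```

The point of the sketch is the shape, not the schema: low-level input events interleaved with occasional screenshots for context, exactly as the reporting describes.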

Meta’s stated goal is almost offensively mundane: if you want AI agents that can operate a UI, you need real examples of humans operating UIs. Not in the abstract, not as a code sample, but as a sequence of tiny, humiliating interactions with dropdown menus and permission dialogs.

Meta also says the collected data will not be used to evaluate employees. Which is the kind of sentence you write when you know exactly what everyone is thinking, and you would like them to stop thinking it. Please stop thinking, employees. It is inefficient.

Why This Matters (the agent control plane is… you)

“Computer-using agents” sound like a product feature. In reality, they are a governance project with a comedy skin. When a model can click through a workflow, it inherits everything that makes workflows brittle: permissions, audit logs, approvals, and the delightful fact that most enterprise systems were designed for humans who occasionally sleep.

So Meta is trying to close the gap between “language model” and “workflow actor.” The simplest way is imitation learning on real usage traces. The uncomfortable part is the precedent: if your staff activity becomes a dataset, you have created a new internal resource. And like any resource, it will be optimized, squeezed, and eventually rebranded as “efficiency.”
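Imitation learning on traces reduces to supervised learning over (state, action) pairs: the observed screen is the input, the human's next move is the label. A toy sketch, with invented state and action encodings (real systems would use screenshots or UI trees as state, and a richer action space):

```python
# Toy behavioral-cloning setup: turn a usage trace into (state, action) pairs.
# The trace, state encoding, and action strings are all invented for illustration.

trace = [
    {"screen": "settings_page", "action": "click:dropdown#region"},
    {"screen": "region_menu",   "action": "click:option#us"},
    {"screen": "settings_page", "action": "click:button#save"},
]

def to_training_pairs(trace):
    """Each step becomes one supervised example: what the human saw,
    paired with what the human did next."""
    return [(step["screen"], step["action"]) for step in trace]

pairs = to_training_pairs(trace)
# A model fit on enough of these learns p(action | screen state) --
# i.e., it imitates how humans muddle through the UI.
print(pairs[0])  # ('settings_page', 'click:dropdown#region')
```

That conversion step is the whole economic story: once work is logged in this shape, it is training data by construction.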

The Workforce Angle (no, it’s not just ‘creepy’)

There are two stories people will tell about this. The first is privacy: screenshots and keystrokes feel like surveillance because they are surveillance. The second is labor economics: an org that can cheaply produce high-quality "how work gets done" traces can train agents to replicate the shape of work, and then ship those agents as products or internal substitutes.

Meta is explicit about the destination. The Register cites a reported line from CTO Andrew Bosworth describing a world “where our agents primarily do the work and our role is to direct, review and help them improve.” That is an org chart where humans move up the stack into supervisory glue, until the glue is automated too. (Do not worry. You will still have meetings.)

Policy Reality Check (Europe exists)

Ars notes that similar monitoring of European employees would likely collide with national laws limiting employer tracking. This is the part where “AI strategy” meets “labor law,” and the winner is usually: whichever side has the power to delay deployment long enough to matter.

Translation: the agent race is global, but the constraints are local. AI strategy crosses borders; employment law does not. And that means "how fast agents ship" becomes another weird competitive advantage, like energy prices or export controls.

The Singularity Soup Take

Training agents on real work traces is rational. Pretending it’s not a governance and trust problem is not. If Meta wants everyone to calm down, it should publish the actual boundaries: what’s captured, how long it’s retained, who can access it, and what the audit trail looks like. “Trust us” is not a control surface, it’s a vibes tax.

What to Watch

  • Scope creep: which apps/URLs are included now, and which get added later (especially comms tools and internal dashboards).
  • Retention and access: what the retention period is, and whether security/HR/legal get access pathways.
  • Exportability: whether the dataset becomes a reusable internal asset that other teams can “borrow.”
  • EU constraints: whether Meta builds a different process for Europe, or simply declares Europe “not yet aligned with the future.”