The Department of Defense recognizes that public data is a growing operational risk. The problem itself is not new; what has changed is scale and automation. Personal devices, sensors, commercial telemetry, and third-party data aggregators now produce a stream of traceable signals that adversaries can collect, fuse, and exploit at machine speed. The recent GAO analysis of DOD’s information environment made that point explicit: aggregated digital footprints can expose personnel, reveal patterns of life, and jeopardize operations, and multiple DOD components are not consistently training or completing required security assessments to address those risks.
This failure is not hypothetical. Public datasets and innocuous applications have leaked operationally meaningful signals before. In 2018, Strava’s fitness tracker heatmap made forward operating bases visible to anyone who looked. That incident is a clear illustration of how benign consumer services can become operational security liabilities when aggregated with other public sources.
At the same time, a commercial market for location and behavioral signals has grown into a multibillion-dollar industry. Companies harvest app telemetry and sell it to advertisers, researchers, and sometimes government customers. Regulators have taken enforcement actions against brokers for trafficking in sensitive location data, underscoring that these commercial streams routinely include visits to military installations and other protected sites. The market dynamics mean that even if DOD stops publishing some categories of information, private-sector collectors and resellers will often fill the gap unless contracting and policy change in parallel.
Why the shortfall matters
The GAO found three core gaps that produce systemic risk: policies and guidance are fragmented and narrowly scoped, training rarely covers the full breadth of security areas, and required threat assessments are incomplete or absent across many components. When training focuses only on classic operations security and assessments omit force protection or insider threat considerations, DOD loses the ability to see how open sources can be fused into practical targeting information. That is precisely what an intelligent adversary needs.
Beyond governance, technical exposures are plentiful. Commercial telemetry, device metadata, unredacted press releases, and sensor feeds can all leak indicators that, when aggregated, form a usable digital profile. The risk is amplified by a robust secondary market for movement and location data and by the archival nature of digital footprints. A breached or sold dataset can persist and be reanalyzed years later with newer analytic tools.
What meaningful fixes look like
Fixing this problem requires a layered program that treats public data threats as a cross-cutting mission assurance challenge. The following are practical, actionable priorities that DOD should pursue immediately.
1) Create a single cross-functional governance baseline for public data risk. Assign a senior executive to coordinate policy across security, acquisitions, and operations. That office must inventory all public-facing data streams produced or relied on by DOD, from press releases to telemetry from test ranges and public APIs. Policies should identify which datasets must be embargoed, sanitized, or time-lagged before release. GAO specifically recommended department-wide assessment and collaboration to close the current policy gaps.
2) Mandate comprehensive, scenario-driven threat assessments across four security areas: force protection, insider threat, mission assurance, and operations security. These assessments should model how adversaries could combine public signals with commercial data. They must be repeated regularly and after major program or reporting changes. The GAO found these assessments were missing or incomplete in most components.
3) Operationalize OSINT and red teaming. Every major command and program office must run adversarial OSINT campaigns that attempt to build actionable profiles from only public sources. These red teams should include specialists in data broker markets and pipeline reconstruction. The output must feed requirements for data minimization, publication hygiene, and scheduling changes. This will surface brittle disclosure practices before an adversary does.
4) Harden publication processes. Strip sensitive metadata from documents and media before release. When movement or location data has operational value, prefer aggregated, delayed, or sampled releases rather than real-time feeds. Where public interest demands transparency, consider controlled portals with authentication and logging instead of open APIs. Press and outreach guidance should include minimum sanitization rules vetted against the red team findings.
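As a rough illustration of the "aggregated, delayed, or sampled" release pattern, the sketch below embargoes location records for a fixed window, thins them by random sampling, and publishes only coarse grid-cell counts instead of precise tracks. The grid size, delay, and sample rate are illustrative assumptions, not DOD parameters.

```python
import random
from datetime import datetime, timedelta, timezone

GRID_DEG = 0.1                # ~11 km grid cells; coarse enough to blur exact positions (assumed)
DELAY = timedelta(hours=48)   # embargo window before any record may be released (assumed)
SAMPLE_RATE = 0.25            # publish only a random fraction of eligible records (assumed)

def sanitize_for_release(records, now, rng=random.random):
    """Turn raw location records into delayed, sampled, coarse grid counts.

    records: iterable of (timestamp, lat, lon) tuples with aware datetimes.
    Returns {(lat_cell, lon_cell): count} keyed by integer grid indices, so
    no released record carries a precise position or a fresh timestamp.
    """
    counts = {}
    for ts, lat, lon in records:
        if now - ts < DELAY:
            continue                      # still inside the embargo window
        if rng() > SAMPLE_RATE:
            continue                      # thin the feed so tracks cannot be replayed
        cell = (round(lat / GRID_DEG), round(lon / GRID_DEG))
        counts[cell] = counts.get(cell, 0) + 1
    return counts
```

The design choice is that sanitization happens before publication, so the open feed never contains the sensitive form of the data at all.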
5) Treat sensor and platform telemetry as a data classification challenge. Many platforms and test ranges emit telemetry that is useful for development but harmful if public. Require privacy-preserving telemetry modes for commercial off-the-shelf components and embed data tagging into procurement contracts so vendors must support controlled disclosure. Put simple switches into fielded systems that allow telemetry to be tagged and retained only in secure enclaves.
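One minimal way to embed the data tagging described above is to attach a sensitivity label to every telemetry record and let routing enforce it: a sink receives only records at or below its clearance. The labels, record fields, and routing rule here are hypothetical, a sketch of the pattern rather than any fielded design.

```python
from dataclasses import dataclass
from enum import Enum

class Sensitivity(Enum):
    PUBLIC = 0        # releasable without restriction
    CONTROLLED = 1    # release only through vetted channels
    ENCLAVE = 2       # retain only in secure enclaves

@dataclass(frozen=True)
class TelemetryRecord:
    source: str
    payload: dict
    sensitivity: Sensitivity

def route(record, sinks):
    """Deliver a record only to sinks cleared for its sensitivity level.

    sinks: list of (clearance, deliver_callable) pairs. A sink cleared at
    ENCLAVE receives everything; a PUBLIC sink receives only PUBLIC records.
    """
    for clearance, deliver in sinks:
        if clearance.value >= record.sensitivity.value:
            deliver(record)
```

Because the tag travels with the record from the moment of emission, a downstream publisher never has to guess what is safe to expose.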
6) Reform acquisitions to manage downstream data risk. Contract language must prevent contractors from buying or using sensitive brokered datasets that include military locations or personnel indicators unless the purchase is explicitly authorized and controlled. Contracts should require suppliers to disclose third party data sources and to demonstrate how they remove sensitive locations from commercial feeds.
7) Engage regulators and industry to limit abuse of commercial data. Work with federal partners and allies to press for rules that restrict sale of high-risk location datasets, and to create stronger disclosure rules for brokers. Recent enforcement by the FTC demonstrates that regulators can and will act when brokers trade sensitive location data. DOD should lend technical expertise and incident data to support stronger industry norms and rules.
8) Train to the new reality. Update training for service members, civilians, contractors, and families so it covers modern digital footprints and the mechanics of aggregation. One-off posters and slogans are not enough. Training must be scenario-based, repeatable, and targeted to roles from leadership to family readiness coordinators. The GAO emphasized the need for broader, consistent training across components.
9) Build detection for aggregated profiling. Invest in analytic capability that scans open sources and commercial streams for emergent profiles that include DOD elements. This is not surveillance of the public; it is detecting when externally available data, once combined, forms a profile that maps to a DOD target set. Such detection can trigger remediation of the underlying sources or operational mitigation.
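A detection capability of this kind can start very simply: scan a commercial location feed for device identifiers that repeatedly appear near known protected sites. The sketch below uses a great-circle distance check; the site list, radius, and hit threshold are invented for illustration.

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def flag_profiles(pings, sites, radius_km=2.0, min_hits=5):
    """Flag device IDs whose pings repeatedly fall near protected sites.

    pings: iterable of (device_id, lat, lon); sites: {name: (lat, lon)}.
    Returns {device_id: set of site names hit at least min_hits times}.
    """
    hits = {}
    for device_id, lat, lon in pings:
        for name, (slat, slon) in sites.items():
            if haversine_km(lat, lon, slat, slon) <= radius_km:
                key = (device_id, name)
                hits[key] = hits.get(key, 0) + 1
    flagged = {}
    for (device_id, name), n in hits.items():
        if n >= min_hits:
            flagged.setdefault(device_id, set()).add(name)
    return flagged
```

A real capability would add dwell-time analysis, co-travel detection, and fusion across feeds, but even this toy threshold shows how an adversary, or a defender, turns raw pings into a target set.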
10) Build rapid response and remediation playbooks. When a dataset is discovered that endangers personnel or operations, there must be a rapid cross-agency playbook that includes takedown requests, legal escalation, family support, and operational hardening. Time matters. Archival data should also be assessed and, where feasible, mitigated.
A final note on realism
None of these actions are silver bullets. Commercial markets will continue to trade data and technology will only make aggregation more powerful. The goal is to make exploitation harder, more expensive, and more detectable. That requires structural change. DOD has the authority and the tools to begin. The GAO’s findings provide a roadmap for where to start. Concretely implementing cross-cutting governance, extending training, tightening acquisition language, and investing in analytic detection will reduce the department’s attack surface in the public domain and help protect personnel and missions in the age of ubiquitous data.