visions) received their content through the air while devices that wanted to be mobile (telephones) received their content over fixed cables. MIT's Nicholas Negroponte predicted that information going through the cables (phone calls) would go through the air and information going through the air (television programs) would be delivered via cable.1 Negroponte called this "trading places." He was right, and the effect was profound.

Everywhere the talk is about "big data" and how much better an instrumented society will be. The cumulative sum of the curves for computing, storage, and bandwidth is this: in 1986 you could fill the world's total storage using the world's total bandwidth in two days. Today, it would probably take something over nine months of the world's total bandwidth to fill the world's total storage,2 but because of replication, synchronization, and sensor-driven autonomy, it is no longer really possible to know how much data there is. Decision making that depends, or depended, on knowing how much data there is is over.

Nevertheless, it is clear that the future will be data rich and that the tools acting on that data will be dual use. The classic triad of cybersecurity has long been confidentiality, integrity, and availability, and we have heretofore prioritized confidentiality as the pinnacle goal of cybersecurity, especially in the military sector. That will not be the case going forward, and not just because the rising generation has a relaxed complacency about the tradeoffs inherent to information sharing. In the civilian sector, integrity will supplant confidentiality as cybersecurity's pinnacle goal. In the military sector, weapons against data integrity already far surpass weapons against data confidentiality.

This trading places—this eclipse of confidentiality by integrity—is solidly underway. Already algorithms learn rather than being taught. What they learn depends on what they are fed and how their learning is scored. This is behavioral reinforcement of a form that would be entirely familiar to B.F. Skinner: you don't teach the subject the desired behavior, you reward the subject for accidentally exhibiting the desired behavior. You don't look into the mind of the human subject nor into the structure of the self-modifying algorithm; you just look at the objective reality of their behavior per se. This is not so much humanity's purposive creation of an intelligence as an unforced error of assumption that a utile intelligence will appear if given enough training sets.

It was three years ago when the count of networked devices exceeded the count of human beings.3 Qualcomm's Swarm Lab at UC Berkeley predicts 1,000 radios per human by 2025, while Pete Diamandis' book Abundance calls for 45 × 10¹² networked sensors by 2035. These kinds of scale cannot be supervised; they can only be deployed and left to free-run. If any of this free-running is self-modifying, then the concept of attack surface is just plain over, as is the concept of trustworthy computing, at least as those two are presently understood. Their data inputs are what control them, not their code. Protecting confidentiality when data is coming from 10³ radios per person is as irrelevant as it is infeasible, but protecting its integrity had better be doable, and all the more so if algorithms are data-fueled black boxes that are not obligated to give you an answer when you ask why they made such and such a decision.

Negroponte's "trading places" was the story that defined the previous decade. Cybersecurity's "trading places" will be the story that defines the next one.

References
1. G. Gilder, "Into the Telecosm," Harvard Business Review, Mar./Apr. 1991; https://hbr.org/1991/03/into-the-telecosm.
2. M. Hilbert, "World's Information Capacity PPTs"; www.martinhilbert.net/WorldInfoCapacityPPT.html.
3. D. Geer, "Implications of the IoT," USENIX ;login:, Dec. 2016; geer.tinho.net/fgm/fgm.geer.1612.pdf.

Daniel E. Geer Jr. is the chief information security officer of In-Q-Tel. Contact him at dan@geer.org.