I’ve Seen People Misuse Intelligence Before. It Never Ends Well.
The intelligence “customer” has a responsibility to speak and act with humility.

IS IRAN’S NUCLEAR WEAPONS PROGRAM completely “obliterated,” as President Trump claims? The intelligence is still coming in—including a leaked intercept of a phone call between two Iranian government officials who were apparently relieved that the American strike wasn’t as effective as they had feared. On Sunday, the director general of the International Atomic Energy Agency said that Iran’s nuclear program had suffered “severe” but not “total damage” and warned that uranium enrichment could resume in “months.”
The fate of the Iranian nuclear program is, of course, concerning. And President Trump may yet be proved right. But the handling of intelligence is a very serious, delicate thing, and the rush by so many leaders—political leaders, intelligence leaders, military leaders, and others—to take first reports as the whole truth is deeply disconcerting. That’s how people get killed.
In the lead-up to the 2003 Iraq War, two now-infamous phrases came to define the failure of early intelligence: “slam dunk” and “pockets of resistance.” The former, uttered by CIA Director George Tenet in a high-stakes White House briefing, assured the president that the case for Saddam Hussein’s weapons of mass destruction was airtight. The latter came from Secretary of Defense Donald Rumsfeld after U.S. troops entered Baghdad, when he denied that there was a guerrilla campaign against the American-led coalition and insisted peace was around the corner for months and months.
Neither statement held up over time.
They were not just premature—they were catastrophically wrong. In both cases, early intelligence was used not to confirm, deny, or question assumptions that would guide better decisions, but to endorse desired outcomes. And our military paid the price.
Dealing with intelligence is an old problem. Writing in the early nineteenth century, Carl von Clausewitz observed that “Many intelligence reports in war are contradictory; even more are false, and most are uncertain.”
Clausewitz understood that intelligence is not a crystal ball. It is a tool—fragile, incomplete, and distorted by fear, friction, and political interest. In short: intelligence is fleeting, so treating it as divine truth, especially in the early stages of any war, is a grave mistake.
To borrow a phrase, intelligence is useful not because it is easy, but because it is hard. Gathering information that is intentionally kept secret is difficult and dangerous. Interpreting that information in ways that are honest, helpful, clear, and relevant requires training and a healthy competition of ideas. Using intelligence, as military and political leaders are charged with doing, requires judgment and humility.
I’ve heard it said that there are only intelligence failures and policy successes: When things go wrong, the intelligence officers get blamed because, working in the shadows, they can’t defend themselves. But when things go right, policymakers take victory laps, even when their ability to make the right decision was the result of excellent intelligence work.
The reality is more complicated. History is replete with examples of intelligence producers and consumers abusing the process, misusing the product, or just plain getting things wrong. Even in my own career in uniform, I saw it on multiple occasions.
INTELLIGENCE FAILURES ARE NOTHING NEW. Just to pick a handful of examples:
In 1961, as the newly inaugurated President John F. Kennedy made dramatic, last-minute changes to the Bay of Pigs operation (including, among other things, moving the invasion site to Fidel Castro’s favorite fishing spot), the CIA failed to give Kennedy updated estimates of success. Having promised that the original plan to land CIA-trained Cuban exiles on the island would spark a domestic revolt against Castro, the CIA never warned him that the changes to the plan seriously imperiled the invasion force, and with it the likelihood of any strategic success. The exiles were crushed, and the Cuban people never rose. Intelligence had mistaken hope for analysis.
In 1967, the military and the CIA disputed the number of enemy combatants the United States was facing in Vietnam. The military said the number was about 250,000; the CIA said it was closer to 500,000. There is evidence that the military’s figure was distorted by pressure from above to make the situation look more manageable than it was. Instead of presenting policymakers with the different analyses and explaining where they came from and what they meant, the military browbeat the CIA into adopting a lowball figure of 248,000. This wasn’t honest intelligence; it was a political feedback loop in which the military told its analysts to report what it wanted policymakers to hear. American military leaders and presidents repeatedly told the public that victory was near, that there was a “light at the end of the tunnel.” Just a few months later, the Tet Offensive belied that impression and made it seem like the combined forces of the Viet Cong and the North Vietnamese Army could bring the battle to the Americans whenever and wherever they chose. (In reality, the Tet Offensive was the last exertion of the Viet Cong, which ceased to be a significant fighting force for the remainder of the war.)
Before the Yom Kippur War of 1973, despite raw intelligence that was “plentiful, ominous, and accurate,” both Israeli and U.S. intelligence missed the imminent threat posed by both Egypt and Syria. Their “concept” of Arab incapacity blinded analysts and policymakers in both countries to the Egyptian and Syrian war preparations that were hiding in plain sight.
In 1978, U.S. intelligence reportedly informed President Jimmy Carter that Iran was “not in a revolutionary or even a ‘pre-revolutionary’ situation.” Within months, the Shah was overthrown, and Ayatollah Ruhollah Khomeini was in power.
In 1995, both NATO and the UN failed to see the impending genocide in Bosnia and Herzegovina. Despite escalating warnings, the intelligence community failed to grasp the full intent of Bosnian Serb forces under Ratko Mladić, so insufficient military action was taken to protect the “safe area” of Srebrenica. Bosnian Serb forces easily overpowered the Dutch battalion (“Dutchbat”) of just 400 soldiers under UN auspices, and within days, more than 8,000 Bosniak men and boys were massacred. It remains one of the worst atrocities on European soil since World War II—and a tragic example of analysis misreading not the capacity for violence, but the will to carry it out.
Most recently, in 2022, most of the intelligence community predicted that Russian invaders would take Kyiv and that the Ukrainian government would fall within 72 hours. Ukraine defied those predictions—and continues to fight today.
The causes of these errors are myriad: Some were overly pessimistic, others overly optimistic; some were shaped too much by policy, others too blind to policy changes; some were due to a lack of information, others to poor understanding of plentiful information. What they all had in common was the belief that the future was basically predictable. War is never predictable.
We ask our analysts to predict the future all the time, but seasoned analysts will often say, “We can try to tell you what’s going on right now, but we deal in facts, and the future isn’t a fact yet.”
I COULD LIST EVEN MORE EXAMPLES of historical intelligence failures at the strategic level, but I still bear the scars from misusing intelligence at the tactical level.
During the Iraq Surge, when I commanded forces in northern Iraq, I would often remind my intelligence officers—particularly the younger analysts—never to provide “single-source intel” in isolation. I was strict on this point. The battlefield was far too complex, the stakes too high, and the potential for manipulation or misunderstanding too real. I adopted that rule after being duped by too many intelligence “tips” we had acted on—like the multiple Iraqi sources who gave us the supposed locations of hidden yellowcake storage areas. At one point, our intelligence told us we had sealed off our zone from al Qaeda terrorists and controlled all the ingresses—until a soldier at a checkpoint banged on a fuel truck with a hammer, heard a hollow sound, and discovered twenty terrorists hiding inside.
I learned that our young division intelligence analysts, eager to contribute and confident in a single intercepted communication or informant tip, would believe they had uncovered the key to an operation. But over months in combat, I had learned—often the hard way—that actionable intelligence requires corroboration, cross-checking, and a healthy dose of skepticism.
A single intercepted radio call might point to an ambush. A suspicious vehicle spotted near a checkpoint could suggest a bombing. But we needed more than that. We needed pattern analysis, secondary sources, and sometimes, the patience to wait.
In war, acting on bad intelligence isn’t just an embarrassment. It gets people killed.
In war, governmental and military leaders may never get the full picture. But that does not justify acting on and proclaiming what you want to be true. Leaders must not be wedded to their desired outcomes, especially when intelligence is involved. They must remain open to re-evaluation, flexible in their planning, and brutally honest about what they don’t know.
When I told my intel teams in Iraq to avoid single-source assessments, it wasn’t just about accuracy—it was about leadership. It was about instilling the understanding that uncertainty is part of the job when dealing with enemies on the battlefield. Responsible leaders ask the hard questions, even when the pressure to act or publicly state certainties is enormous.
In an age of faster communications, real-time surveillance, and political impatience, the temptation to treat the “first report” as the final word is stronger than ever. But it’s exactly in these moments that we must return to first principles:
Good intelligence informs decisions; it does not justify them. And it should never be delivered—or received—with hubris. In war and in policy, humility is not weakness. It is a virtue that respects the gravity of the decisions being made—decisions on which lives often depend.