Okay, enough silly math. Here's how it really breaks down.
From the perspective of a large drainage area (10+ square miles), the first 0.10 to 0.25 inches of rainfall produce no runoff because they fill natural depressions, pools, puddles, etc. The exception is when enough rain has fallen in the previous week or so to have already filled those tiny reservoirs. The next 2 inches or so produce between 0.5 and 2 inches of runoff, depending on land use and soil type. Sand and pasture produce less runoff, while clay, bedrock, and urban areas produce more. After about 4 inches of rain has fallen, essentially all of it becomes runoff.
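If you want to play with those numbers, here's a quick Python sketch of that rule of thumb. The breakpoints and the runoff fraction are just the ballpark figures above, not anything out of LCRA's or USACE's actual models:

```python
def runoff_depth_in(rain_in, abstraction_in=0.2, runoff_frac=0.6):
    """Rough rule-of-thumb runoff (inches) from storm rainfall (inches).

    abstraction_in: initial losses to depressions/puddles (~0.10-0.25 in;
                    closer to zero if the previous week was already wet).
    runoff_frac:    fraction of the "middle" rainfall that runs off
                    (~0.25 for sandy pastures, up to ~1.0 for clay,
                    bedrock, or urban areas).
    """
    if rain_in <= abstraction_in:
        return 0.0                                    # soaked up by depressions
    if rain_in <= 4.0:
        return (rain_in - abstraction_in) * runoff_frac
    # beyond ~4 in of total rain, essentially everything extra runs off
    return (4.0 - abstraction_in) * runoff_frac + (rain_in - 4.0)
```

For a 2.2-inch storm this returns roughly 0.5 inches with runoff_frac=0.25 and 2.0 inches with runoff_frac=1.0, which brackets the range quoted above.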
The fancy hydraulic models that the guys over at LCRA, USACE, etc. use generally end up with about 10 inches of runoff for the 100-year event (which is 11.5 inches of total rain). Trouble is, over what portion of the watershed? Lake Travis's watershed is 38,130 square miles, way, way larger than even epic-level storms like TS Allison or TS Bill. You'll never get 10+ inches over the entire watershed; instead, some areas get 25 inches while others get 0.25 inches. The 1955 flood only averaged out to 0.45 inches of rain over the whole watershed. Talking about 10" across the whole watershed is silly, because it has never happened and likely never will.
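For scale, here's the back-of-envelope volume if you somehow did put 10 inches of runoff over all 38,130 square miles (my arithmetic, nothing from the models):

```python
watershed_sq_mi = 38_130
watershed_acres = watershed_sq_mi * 640        # 1 sq mi = 640 acres -> ~24.4 million acres
runoff_ft = 10 / 12                            # 10 inches of runoff, in feet
volume_acre_ft = watershed_acres * runoff_ft   # ~20.3 million acre-ft
print(f"{volume_acre_ft:,.0f} acre-ft")        # -> 20,336,000 acre-ft
```

That's roughly 26 times the Travis flood pool discussed below, which is the whole point.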
The dam operators use the flood pool to "absorb" the flood wave rushing down the watershed and then slowly release it over a few weeks or months instead of the few hours in which Mother Nature dumped it. The flood pool for Travis (assuming this is accurate) is 787,000 acre-ft, which is about 0.40 inches of runoff spread over the watershed. The spillway starts running at elevation ~714 ft, so any flow over the spillway will not be the initial rush of the storm but the small tail end that trickles in days or weeks later.
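And for what it's worth, that 0.40-inch figure checks out against the same watershed area:

```python
flood_pool_acre_ft = 787_000
watershed_acres = 38_130 * 640                         # ~24.4 million acres
depth_in = flood_pool_acre_ft / watershed_acres * 12   # convert feet to inches
print(f"{depth_in:.2f} in of runoff")                  # -> 0.39 in
```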