The Fitness of Physical Models

How a 1950s-era, 1.5-acre mock-up of the hydrology of the Bay Area might still be able to complement real science in the age of computer modeling.

Ranger Thomas Downs leads a group of visitors to a point above San Pablo Bay in Sausalito, California. Gesturing toward the Pacific Ocean, he clearly enjoys himself as he exclaims, “There’s the Golden Gate Bridge!” The kids in the group grin at a California gray whale breaching in the distance.

The four main bays that make up the San Francisco Bay estuary can all be seen from here: San Pablo, Suisun, Central and South Bay. The entire San Francisco Bay, including its famous bridges, is visible, and the whale is spectacular.

This view is possible from only one vantage point on Earth: inside a WWII-era warehouse.

That warehouse holds the Bay Model, the largest working hydraulic model in the United States; its 1.5 acres replicate a 1,600-square-mile area that runs from the Pacific Ocean to the Sacramento Delta. It is not an exact replica: the delta has been shifted 45 degrees so that it fits into the building, and the whale and the Pacific Ocean are painted on a wall.

Made of reinforced concrete and dotted with copper tabs to control tidal simulations, the model — constructed in 1959 by the Army Corps of Engineers — is a throwback to an earlier era before computers took over modeling. Besides its nostalgic charm, the model has value both as a piece of San Francisco Bay history and as a tool for teaching about natural resources.

In 2000, the model’s mission officially changed from scientific research to education and culture, and more than 150,000 people visit the free attraction each year. But in the era of climate change and rising sea levels, its partisans say it still can contribute real science.

Research on things like estuary hydraulics and fluid dynamics that began when the model was built has had enormous implications for both San Francisco Bay and the rest of California. Water diversion plans, oil spill impacts, and the effects of dredging on salinity have all been studied extensively here, adding a physical component to the computer modeling that has overtaken hydrology research.

“The model made it possible for the general public to interface with the scientists,” says Nancy Rogers, who managed the visitor center for the Corps from 1985 to 2001. “And people are interested in the science.”

Rogers says there is a debate over the value of physical modeling, especially now that computers have become so powerful and don’t require the space that physical models do. Meredith Williams, director of environmental data, information and technology at the San Francisco Estuary Institute, says that computer modeling has become so accurate that physical modeling is often bypassed altogether. “Computer models are validated now by looking at existing historical data.”

Small, generic “flume” models are still used by individual researchers, especially at universities, but models of specific locations, such as the Army Corps’ 14-acre model of Chesapeake Bay, no longer exist.

But no one disputes the model’s ability to teach. “Being able to see the whole bay, and to realize that what we do in one part of the bay affects all the other parts — people realize now that (the bay) isn’t an unlimited resource,” Rogers says.

School and university groups still tour the facility, but the San Francisco Bay they have seen at the model for the last year and a half is a dry one, the light blue paint that lines it substituting for the water that normally reproduces a 24-hour tidal cycle in just 14 minutes. In March 2009 the water was drained so construction could begin on a remodel paid for with more than $15 million in federal economic stimulus funds. That money allowed the installation of 2,492 solar panels on the roof, seismic retrofitting, and extensive repairs that should keep the model operating for at least another 50 years.

• • • • • • • • • • • • • • •

The Rhode Island-sized estuary the model represents is the drainage point for more than 50,000 square miles of watershed that runs the length of the Central Valley and provides more than 40 percent of California’s freshwater. The estuary itself is relatively shallow: less than 13 feet deep in most places, thanks in part to a thick layer of mud and silt sent cascading down the Sacramento and San Joaquin rivers during California’s Gold Rush.

The damage from the Gold Rush is difficult to overstate. According to the Corps of Engineers, hydraulic mining sent seven times more silt into the bay than was displaced during the digging of the Panama Canal; Gold Rush silt is still making its way downstream.

The Gold Rush also spurred efforts to fill in the bay. Hundreds of ships lay abandoned in Yerba Buena Cove and elsewhere as newly outfitted gold diggers scrambled up and scrabbled in the Sierra Nevada mountains. (The PBS documentary Saving the Bay nicely illustrates this.) Merchants, short on land, began doing business from the abandoned ships, and as the hills behind the cove were leveled to make more usable space, their sand was used to fill in around the ships, transforming them into buildings. Such “bayfill” continued through most of the next century as people sought to push the water’s edge farther into the bay, ultimately shrinking it by a third.

Water, and the human desire to master it, lay behind the model’s creation.

Map that was included in the Reber Plan.

Schemes to divert the Sierra Nevada’s abundant meltwater to San Francisco and Los Angeles have long circulated in California — recall the damming of Yosemite’s Hetch Hetchy Valley or Jack Nicholson’s Chinatown. Just after World War II the controversial “Reber Plan” combined this primordial urge with national security. The plan proposed building enormous dams close to where the San Francisco-Oakland Bay and Richmond-San Rafael bridges now stand. The dams would have carried multilane highways for cars, rapid transit and freight rails; most of the eastern side of the bay would be filled in, and huge locks and a freshwater shipping canal would let both freight and military ships pass.

Gray Brechin, a historical geographer at the University of California, Berkeley, and the author of Imperial San Francisco: Urban Power, Earthly Ruin, points out that fear of a Pearl Harbor-style attack on San Francisco made the plan seem attractive. “Political cartoons of the day show that people were terrified,” Brechin says. “The Reber Plan seemed to provide the answer; people could be moved in and out of the area relatively quickly.” And then, of course, there was the water.

Supporters of the Reber Plan claimed the freshwater reservoirs created by the dams would solve California’s water problems. The meltwater that traveled from the Sierra Nevada, through the Delta, and out the Golden Gate could be captured and sent throughout the state. Newspapers all over California editorialized in favor of the Reber Plan, and Congress held hearings featuring the plan’s author, schoolteacher and theater producer John Reber. The Army Corps of Engineers received orders to construct a working hydraulic model to see whether the plan would work.

The Corps, at first, wanted to build the model in Vicksburg, Mississippi, home to its efforts to tame another waterway, the Mississippi River. But Californians lobbied to have it built near the bay it would mimic, and a warehouse at Marinship, the shipyard that churned out Liberty Ships during World War II, was handed over to the Corps.

The new physical model began testing the Reber Plan in 1959, and it did its job all too well for the plan’s backers: Corps engineers determined not only that the plan would be catastrophic for the estuary, but that the bay was much too shallow for the reservoirs to work. The reservoirs would instead become enormous evaporation ponds, incapable of providing any discernible amount of freshwater.

• • • • • • • • • • • • • • •

In the early years, testing conducted at the model was an enormous task; studies required dozens of people, and the instruments were cumbersome.

Marvin Horton, a mechanical engineer who worked at the model in 1961, describes the at times mind-numbing effort to collect data in those days: “The velocity meter was a little wheel in the water. You had to count the number of revolutions and then convert them to feet per second.” Changes were gradually made to the system, often by Horton and other mechanics devising their own ways to improve the data collection.

Horton returned to the model in 1992, when it was under contract to UC Berkeley. A computer now ran the machine that determined the depth of the tidal simulations, but it still required someone to sit at the keyboard to type in adjustments continuously.

“Even taking a bathroom break was risky,” he recalls. “You had to get someone to cover for you for every minute.”

Not all requests came from scientists. Law enforcement officials called more than once for help in determining where bodies that had washed up might have originated. Once, a woman called asking for help locating her wedding ring, which she’d lost at a beach. Although the model can’t pinpoint exactly where an item might end up, Horton recalls with a grin the time the Discovery Channel used the model to reenact the 1962 escape by three inmates from Alcatraz. “No matter what the starting point, they ended up in the open ocean,” he says.

Dan Schaaf, a principal in the civil engineering firm Schaaf and Wheeler, visited the model on a field trip as a grad student in 1994. He was so fascinated by the research being done there that he vowed to return to do testing. “Very few places then even did physical modeling anymore, because it required space and money.”

With hydraulic computer modeling, a scientist “calibrates” the model against the real world. Gauges placed in the bay measure tidal depth, salinity, and water velocity, and that data is fed into the computer. In theory, when new variables are introduced, the results should match the real world. “Computer guys always have kind of dismissed physical modeling, and the research money is all in computer [modeling] these days,” says Schaaf. “I’m a guy who now does computer modeling all day every day, but physical modeling gives you security. It means that what you are doing has more validity.”
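
In rough terms, that calibrate-then-validate loop can be sketched in a few lines of code: tune a simple tide model to one stretch of gauge data, then check it against observations it never saw. This is only a minimal illustration; the single-constituent harmonic tide model and the synthetic hourly gauge record below are invented for the example, not the Corps’ or the institute’s actual software.

```python
import numpy as np

def harmonic_tide(t_hours, amplitude, phase, mean_level, period=12.42):
    """Water level predicted from a single M2 (principal lunar) tidal constituent."""
    omega = 2 * np.pi / period
    return mean_level + amplitude * np.cos(omega * t_hours - phase)

def calibrate(t_obs, level_obs, period=12.42):
    """Least-squares fit of amplitude, phase, and mean level to gauge observations."""
    omega = 2 * np.pi / period
    # cos(wt - p) = cos(p)cos(wt) + sin(p)sin(wt), so the fit is linear in
    # (A*cos p, A*sin p, mean level).
    design = np.column_stack(
        [np.cos(omega * t_obs), np.sin(omega * t_obs), np.ones_like(t_obs)]
    )
    (a, b, mean_level), *_ = np.linalg.lstsq(design, level_obs, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a), mean_level

def validate(t_new, level_new, params):
    """Root-mean-square error of the calibrated model on data it never saw."""
    predicted = harmonic_tide(t_new, *params)
    return np.sqrt(np.mean((predicted - level_new) ** 2))

# Hypothetical gauge record: hourly water levels (in feet) over one week.
t = np.arange(0.0, 168.0)
observed = 3.0 + 2.1 * np.cos(2 * np.pi / 12.42 * t - 0.8)
observed += np.random.default_rng(0).normal(0.0, 0.1, t.size)

params = calibrate(t[:120], observed[:120])        # tune on the first five days
print("held-out RMSE (feet):", validate(t[120:], observed[120:], params))
```

Holding some data back is the point Schaaf is making: a model that reproduces observations it was never tuned to has a stronger claim to validity, whether the check comes from a hold-out gauge record or from a physical model.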

Schaaf recalls a conference about estuary hydraulics in the early 1990s that featured a session titled “Is the Bay Model Still Viable?” He elaborates: “And they basically said it is still viable, but it needs to be updated. It needs to run a true tidal cycle and provide automated data collection.” Freshwater releases also needed to be variable, so that the spring runoff could be effectively modeled.

[youtube]fy16vKonJUM[/youtube]

The Corps of Engineers responded to the challenge with a huge investment, and Schaaf and his colleagues began to research instrumentation. Among other things, they converted the manual water measurement to an ultrasound system and made the model’s tidal cycle programmable instead of an average over a 19-year tidal epoch. Marvin Horton, the mechanical engineer who recalled counting the revolutions of the velocity meter, figured out a system to ensure all 286 of the model’s 7-ton, 12-foot-by-12-foot sections would stay level.
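
A programmable tide of the kind Schaaf describes can be thought of as a sum of harmonic constituents replayed on the model’s compressed clock, in which a full day of real tides passes in roughly 14 minutes. The sketch below is purely illustrative; the constituent amplitudes and phases are placeholders, not the Bay Model’s actual control parameters.

```python
import numpy as np

# Hypothetical harmonic constituents: (name, period in real hours, amplitude in
# feet, phase in radians). Placeholder values, not the Bay Model's actual tide.
CONSTITUENTS = [
    ("M2", 12.42, 1.8, 0.0),   # principal lunar semidiurnal
    ("S2", 12.00, 0.5, 0.4),   # principal solar semidiurnal
    ("K1", 23.93, 1.2, 1.1),   # lunisolar diurnal
    ("O1", 25.82, 0.7, 2.0),   # lunar diurnal
]

# One real day (24 * 60 minutes) plays out in roughly 14 minutes of model time.
TIME_SCALE = 24 * 60 / 14.0   # real minutes per model minute

def commanded_level(model_minutes, mean_level=3.0):
    """Water level (feet) the tide machine should produce at a model-time instant."""
    real_hours = model_minutes * TIME_SCALE / 60.0
    level = mean_level
    for _, period, amplitude, phase in CONSTITUENTS:
        level += amplitude * np.cos(2 * np.pi * real_hours / period - phase)
    return level

# Sample the commanded level once per model second over one simulated day.
model_t = np.arange(0, 14 * 60) / 60.0            # model minutes
levels = np.array([commanded_level(t) for t in model_t])
print(f"commanded range over one model day: {levels.min():.2f} to {levels.max():.2f} ft")
```

Swapping in constituents for a particular date, or varying the freshwater inflows for a wet spring, is what makes such a cycle “programmable” rather than a single averaged tide.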

Schaaf had returned to the model — he worked there from 1997 to 2000, and became the “default director” when David Fry left in 1999 — but he was never able to participate directly in physical model testing.

“We were hoping to work on the airport expansion, and a guy at USC wanted to research tsunami oscillation in the bay,” he says.

But the research never materialized. The end of physical modeling at the site arrived with a new Corps of Engineers supervisor in 2000. “He axed it,” Schaaf says, explaining that the supervisor considered the Bay Model obsolete in the age of computer modeling. Schaaf disagrees. “Everyone liked the hybrid idea, the idea that computer modeling would be validated by the physical model.”

He doubts that the Bay Model will ever again be used for research, but he believes it could still contribute, even though getting it ready for actual testing would require money.

“The model could be used to validate theories regarding climate change, water supply, habitat, navigation, land use, flood control, and various disasters. The difficulty would be getting the numeric modeling community to embrace the model’s capabilities.”

A month after Ranger Downs arrived at the model, the water was drained and construction began, so for the last year and a half he has been giving tours of a dry model. A few weeks ago, engineers attempted to fill the model once again, but numerous leaks were spotted. The latest projection is that it will be refilled before the end of the year. Downs is anxious to have the water flowing again: “It’s the essential element that makes the model work, the essential element that makes the dynamics of San Francisco Bay understandable.”
