
The Expanding State of the Infrastructure

Researchers apply a layer of sophistication to a "System of Systems"

Few words have taken on as much baggage in recent history as infrastructure.

For more than two decades, infrastructure has kept questionable company, appearing with adjectives like crumbling and aging, substandard and neglected.

It has been linked to tragedies like Hurricane Katrina (2005), the Los Angeles-Northridge Earthquake (1994) and the recent crisis at Japan’s earthquake- and tsunami-battered Fukushima nuclear power plant.

Politicians of all stripes have catalogued the decline of America’s public works and facilities and have pegged the cost of meaningful overhaul at $1 trillion or more.

Lehigh researchers who study the infrastructure are hardly oblivious to its shortcomings – many have sounded the warnings that helped bring the topic to the forefront of the public imagination.

But today, these researchers are focusing on the potential of infrastructure. They are working on advances in smart systems, software and sensors that will make it possible to allocate resources more efficiently and build more durable structures. They envision a day when infrastructure will be sustainable, engineered for a longer lifecycle and able to withstand extreme events without major damage.

When that day arrives, infrastructure will keep company with words like intelligent, autonomous and environmentally friendly. Technology will enable people to assume more responsibility for the resources they consume while affording greater protection against hackers, failures and down time.

Infrastructure is a broad term, encompassing the range of physical structures and services – levees and wastewater treatment plants, highways and airports, power grids and wireless networks, even schools and hospitals – that enable a society to function.

In the past two years, Lehigh’s engineering researchers have carved out strategic areas of focus in new areas such as the smart grid while enhancing their traditional expertise in the civil infrastructure and power generation. Their approach is governed by a philosophy that regards the infrastructure as a system of systems that are integrated and interdependent.

The following pages explore a few infrastructure research projects at Lehigh.

Saving lives, preserving community
Lehigh’s structural engineers took a step toward the goal of sustainable infrastructure recently with a successful experiment on the world’s largest earthquake shake table.

The test verified the superior performance of a reinforced concrete building system containing earthquake-resisting technology developed at Lehigh’s ATLSS (Advanced Technology for Large Structural Systems) Center.

Sustainable infrastructure, says ATLSS director Richard Sause, protects lives while enabling building and transportation facilities to remain operational after an earthquake. In the process, it preserves the social and economic value of the community.

“Think of San Francisco,” says Sause. “What will happen when the next big earthquake hits? Twenty years ago, the question was, ‘Can we build structures that protect the lives of people?’

“Today, the question is, ‘Where are people going to live if there’s extensive damage to the infrastructure?’ What will happen if one-third or more of the people and businesses have to leave?”

The ATLSS technology is a self-centering system with reinforced concrete wall panels designed to “rock” during an earthquake. After shaking concludes, post-tensioned steel strands act like a rubber band to pull the building back to its original position.

The shake table test was conducted at the Hyogo Earthquake Engineering Research Center, or E-Defense Center, in Japan. Lehigh researchers joined peers from the Network for Earthquake Engineering Simulation in the U.S. and the National Research Institute for Earth Science and Disaster Prevention in Japan. The project was led by E-Defense researchers and funded by the Japanese government.

Researchers built two full-scale models of a reinforced-concrete four-story building – one with the ATLSS system and one with conventional reinforced concrete. The shake table simulated the 1995 Kobe earthquake.

“The self-centering post-tensioned concrete wall system [sustained] very little damage under very strong earthquake ground motions,” Sause and Wesley Keller, a Ph.D. candidate, reported.

“We think that type of performance should be expected. By contrast, the conventional reinforced concrete in the adjacent building was badly damaged.”

Self-centering post-tensioned concrete walls are made by casting panels of reinforced concrete and then feeding steel cables through pre-existing hollow ducts in the panels. When the panels are in place, the cables are stretched and then anchored at the top and bottom of the wall, which clamps the panels together.

Lehigh researchers have found that by using “unbonded” post-tensioned steel, concrete walls can be designed to perform well under strong ground shaking. By not bonding the steel to the surrounding concrete, says Sause, deformations in the steel are distributed over a relatively long length rather than concentrated in a small critical region. Strain levels in the steel are thus significantly decreased during earthquake loading.

“An unbonded post-tensioned structure remains nearly elastic during earthquake shaking. As a result, it returns to its original shape after the earthquake without the need for costly repairs,” says Sause.
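The strain-reduction argument can be illustrated with a back-of-the-envelope calculation (all numbers below are hypothetical, chosen only to show the trend): the same base elongation produces far lower strain when spread over a strand's full unbonded length than when concentrated near a crack in a bonded strand.

```python
def strand_strain(elongation_m: float, free_length_m: float) -> float:
    """Average axial strain = elongation / length over which it develops."""
    return elongation_m / free_length_m

elongation = 0.02       # 20 mm of strand elongation at the wall base (assumed)
bonded_region = 0.2     # bonded strand: strain localizes over ~0.2 m near a crack (assumed)
unbonded_length = 10.0  # unbonded strand: elongation spreads over the wall height (assumed)

print(strand_strain(elongation, bonded_region))    # strain concentrated at the crack
print(strand_strain(elongation, unbonded_length))  # 50x lower strain over the full height
```

With these illustrative numbers, the bonded case reaches a strain of 0.1 while the unbonded case stays at 0.002, which is why the unbonded strands can remain nearly elastic.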

In the event of earthquakes, greater resilience
In another project, ATLSS researchers are evaluating the potential for magnetorheological (MR) dampers to minimize seismic damage to structures. They study the dampers using hybrid simulation, which integrates two types of data – data generated by numerical models of structural components that are well understood and can be modeled analytically and data collected simultaneously from lab experiments on components that are not well understood.

MR dampers, say Sause and James Ricles, professors of structural engineering, improve the resilience of a multi-story steel-framed structure to earthquakes by minimizing its drift and vibration during a seismic event. During an earthquake, says Ricles, floors in a high-rise can drift laterally, damaging beams, columns, pipes and wires. A building that looks sturdy can still be deemed unsafe to use and can cost more to repair than to demolish and rebuild.

Hybrid simulation, says Sause, increases the amount of information researchers can gather from an experiment.

“Hybrid simulation allows us to evaluate the performance of very large buildings under the dynamic loading of an earthquake in real time and also to compare designs,” says Sause. “Such data would be too expensive to collect from physical experiments alone.”
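The loop at the heart of hybrid simulation can be sketched in a few lines (a toy linear example, not the ATLSS implementation): at each time step, the analytical substructure and a stand-in for the experimentally measured specimen each contribute a restoring force, and the combined equation of motion is integrated numerically.

```python
def experimental_substructure(displacement_m: float) -> float:
    """Stand-in for the physical specimen: in a real hybrid test this force
    is *measured* from the lab apparatus, not computed. Assumed linear here."""
    return 2.0e6 * displacement_m  # hypothetical specimen stiffness, N/m

def hybrid_step(x, v, ground_accel, dt=0.001, mass=1.0e5, k_analytical=4.0e6):
    """One semi-implicit Euler step of the coupled analytical + experimental system.
    All constants are hypothetical; a real test uses a more careful integrator."""
    restoring = k_analytical * x + experimental_substructure(x)
    accel = (-restoring - mass * ground_accel) / mass
    v_new = v + accel * dt
    x_new = x + v_new * dt
    return x_new, v_new

# Drive one step from rest with a 1 m/s^2 ground pulse (hypothetical input):
x, v = hybrid_step(0.0, 0.0, 1.0)
print(x, v)  # small negative displacement and velocity relative to the ground
```

In an actual real-time hybrid test, `experimental_substructure` is replaced by hydraulic actuators imposing the displacement on the lab specimen and load cells reporting the force back within the same time step.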

Sause and Ricles evaluated the benefit of using MR dampers by performing hybrid simulations of a nine-story building subjected to conditions equivalent to those of the Northridge Earthquake. The actual building was condemned after the earthquake.

“MR dampers are unique,” says Ricles. “Their properties can be controlled by varying an applied electrical current. The fluid inside the damper contains iron particles, which form linear chains that align with the induced magnetic field when a current is applied.

“This alignment increases the viscosity of the fluid and restricts its ability to move through the orifices of the damper. The result is a change in the yield strength and energy dissipation capability of the fluid.”
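Ricles' description maps onto the simplest textbook idealization of an MR damper, the Bingham plastic model, in which the damper force is a viscous term plus a current-dependent friction (yield) term. The constants below are hypothetical, chosen only to show the trend, and are not the ATLSS group's actual damper model.

```python
import math

def mr_damper_force(velocity_m_s: float, current_amps: float,
                    c0: float = 50e3, fy_per_amp: float = 100e3) -> float:
    """Bingham-plastic idealization: viscous force + current-controlled yield force.
    c0 [N*s/m] and fy_per_amp [N/A] are hypothetical constants."""
    yield_force = fy_per_amp * current_amps
    return c0 * velocity_m_s + yield_force * math.copysign(1.0, velocity_m_s)

# Same piston velocity, increasing current -> sharply higher damping force:
print(mr_damper_force(0.1, 0.0))  # viscous term only
print(mr_damper_force(0.1, 2.0))  # viscous term plus current-induced yield force
```

The key property for semi-active control is visible here: the force is changed electrically, in milliseconds, without any moving mechanical parts.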

“The dampers significantly reduce the vibration and drift of the structure,” says Yunbyeong Chae, a Ph.D. student. The building that was rendered unusable by the Northridge Earthquake, he adds, would not have had to be condemned if it had been fitted with MR dampers.

The test was performed at the NEES Real-Time Multi-Directional Earthquake Simulation Facility in the ATLSS Center, with funding from NSF and the state of Pennsylvania.

For the grid, an intelligent interface
No part of America’s infrastructure, says Rick Blum, is more overdue for a fresh coat of intelligent systems than the electrical grid that generates, transmits and distributes power to more than 300 million people.

Blum, professor of electrical and computer engineering, is one of a cluster of Lehigh researchers studying the smart grid. The group also includes Shalinee Kishore, associate professor of electrical and computer engineering; Lawrence Snyder, associate professor of industrial and systems engineering; and Liang Cheng, associate professor of computer science and engineering. The group formed a year ago when engineering faculty, meeting with experts from industry, government and national labs, determined that Lehigh’s expertise in systems engineering was ideally suited to help overhaul the grid.

“The electrical grid,” says Blum, “needs to be able to respond to demand and to control distribution in real time. We have to figure out when consumers need power, how much they need and how much power is being generated at a given time by a given plant.

“The smart grid will increase energy efficiency and decrease carbon emissions. It will integrate renewable energy sources like wind and solar. Because power generated by these sources is variable, the prices charged for power must become variable as well.”

Information architecture overlaying the smart grid, says Kishore, will more efficiently match supply with demand.

“The smart grid will send information on real-time prices directly to consumers, allowing them to make decisions regarding the purchase of power,” she says.

“Homes will have energy management controllers [EMCs] and smart meters. Your EMC will be programmed to know your power usage patterns and preferences. It will look at prices in real time and make decisions for you.”

Communication between EMCs and the grid, says Kishore, will enable consumers to use power when it is priced most cheaply. Communication among EMCs will allow car batteries to be charged and dishwashers to be run on a schedule that spreads out demand for power.

This leveling effect, says Snyder, will help power companies avoid costly periods of peak demand and even costlier brownouts and blackouts.

“A utility always tries to meet peak demand, but this is very expensive,” says Snyder. “The peak usually lasts a short period of time. But a utility has to maintain expensive equipment – usually older, more polluting equipment – to be able to turn on power as needed no matter what.”

Kishore and Snyder are developing mathematical models that optimize communication among EMCs and between EMCs and the grid. The model, says Snyder, is similar to a CSMA (carrier sense multiple access) protocol that enables a node in a network of sensors to transmit information only when it detects that other nodes in the network are idle.

“Let’s say your EMC wants to run the air conditioner, dishwasher and clothes dryer,” says Snyder, who is developing an optimization algorithm for the model. “It asks the other EMCs in the network, ‘Do I have enough power?’ They might respond, ‘No, you have to delay one of your tasks.’

“This is like rationing, but it’s structured so that users never really feel it.”
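The listen-before-acting idea can be sketched in a few lines of code (the capacity and appliance loads are hypothetical): an EMC starts a deferrable appliance only if the load already claimed by its neighbors leaves room under a shared cap, and otherwise delays the task.

```python
CAPACITY_KW = 20.0  # assumed shared capacity for the neighborhood

def try_start(name: str, task_kw: float, active_loads: dict,
              capacity_kw: float = CAPACITY_KW) -> bool:
    """CSMA-style check: claim the load only if the shared 'channel' has room;
    otherwise report that the task should be delayed."""
    if sum(active_loads.values()) + task_kw <= capacity_kw:
        active_loads[name] = task_kw
        return True
    return False

loads = {"house_A_air_conditioner": 12.0}
print(try_start("house_B_dryer", 5.0, loads))  # True  (17 kW fits under 20 kW)
print(try_start("house_C_dryer", 5.0, loads))  # False (22 kW would exceed the cap)
```

As in a sensor network, no central dispatcher is needed for this check; each EMC senses the aggregate state and defers on its own.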

Without communicating with one another, says Kishore, too many EMCs might schedule tasks during periods of low power demand, distorting usage and pricing patterns. Kishore and Snyder have run tests showing that this distortion would actually cause a greater spike in peak demand. To avoid this, a smart grid would act in concert with EMCs to manage the number of power-driven tasks performed at various times during the day. It would implement its decisions to achieve fairness without infringing on customers’ privacy.

“Pricing is just one mechanism to achieve efficiency,” says Kishore. “Communication is also essential. Our scheme shows you can develop coordinator-based communications protocols that allow the leveling of peak usage.”

The cell phone user as chemical detective
In the not-too-distant future, says Miltos Hatalis, the ubiquitous cell phone will acquire a new, silent function.

Fitted with arrays of gas sensors, it will monitor the air for leaks of toxic chemicals.

In fact, says Hatalis, professor of electrical and computer engineering, groups of cell phones will serve as dynamic, wireless networks of chemical sensors – first for police and other first responders and later for average phone users.

“What we see,” says Hatalis, “is the day when every cell phone will be a mobile chemical lab detecting and analyzing harmful chemicals. The phones will send data to a central location, which will correlate data to achieve a realistic picture of a regional environment.”

None of this will come to fruition until sensors can be integrated cheaply into cell phones.

Hatalis’ group is working with NASA to make multi-channel sensors that will be integrated with carbon-nanotube (CNT) sensing materials. The CNTs will be functionalized to absorb certain chemicals selectively. This will cause changes in their electrical resistance that signify gases have been detected.

One goal is to increase the number of sensors that can be fitted into a phone and to ensure their accurate performance.

“As the number of sensors increases,” says Hatalis, “it becomes impractical to wire each sensor directly to the circuit that measures the change in electrical resistance.

“We’ve developed an array of 64 sensors that requires only 16 wires. We’re exploring a device with 256 channels and 24 wires. Our goal is to have arrays of thousands of sensors that can be read with just a few input-output wires.

“By reducing the number of wires and utilizing glass substrates, we’ll end up with a device that is smaller, cheaper and much easier to integrate into a cell phone.”
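Hatalis’ 64-sensor, 16-wire figure is what simple row/column (matrix) addressing predicts: an r × c grid needs only r + c wires, because each sensor is selected by one row line and one column line. (The 256-channel, 24-wire device implies a more aggressive multiplexing scheme that this square-grid sketch does not capture.)

```python
import math

def wires_for_square_matrix(n_sensors: int) -> int:
    """Row/column addressing on a square grid: side + side wires
    instead of one dedicated wire per sensor."""
    side = math.ceil(math.sqrt(n_sensors))
    return 2 * side

print(wires_for_square_matrix(64))    # 16 -- matches the 8x8, 16-wire array
print(wires_for_square_matrix(4096))  # 128 wires could address thousands of sensors
```

The wire count grows with the square root of the sensor count, which is what makes arrays of thousands of sensors plausible with only a handful of input-output lines.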

Taking the sting out of explosions
Terrorist attacks have motivated engineers to design structures that withstand explosions without collapsing while also minimizing danger from flying debris.

Clay Naito, associate professor of structural engineering, has worked for six years with the Air Force Research Laboratory and the Portland Cement Association to assess the blast resistance of precast concrete panels. His group conducts full-scale blast tests to evaluate the resistance to explosions of standard wall panel construction techniques. The goal is to design the panels more accurately and efficiently.

Wall panels are typically designed to resist loads from handling, construction, wind pressure, thermal expansion and shrinkage, and from floors and roofs. Flexural (bending) resistance is also a concern.

During an explosion, a wall sustains a pressure increase of up to 28,000 pounds per square foot, which falls to a negative pressure, creating suction on the panel, before returning to ambient conditions. All of this happens in less than a 20th of a second.

Even a small explosion can cause a pressure increase 20 times greater than the maximum static load a panel is designed to support. Inertial and flexural resistance, says Naito, help a structure withstand a blast.

In an NSF project, Naito is attempting to improve the flexural performance of precast concrete sandwich panels by using reinforcement strategies that combine bonded and unbonded wire strands with an internal layer of insulating foam.

“An analytical study is under way to determine the most effective arrangement of bonded to unbonded strands,” says Naito.

Under close-proximity detonations, the pressure of the initial impact generates a compression wave that penetrates the thickness of the panel and reflects off an interior face as a tension wave. If this wave exceeds the tensile capacity of the concrete, fragments break off, or spall. If the spall depth exceeds half the thickness of the panel, a breach typically occurs.
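The two thresholds described above lend themselves to a simple rule-of-thumb classifier. The numeric values in the example are hypothetical, not taken from Naito’s tests.

```python
def blast_damage_mode(reflected_stress_mpa: float, tensile_capacity_mpa: float,
                      spall_depth_m: float, panel_thickness_m: float) -> str:
    """Rule-of-thumb classification following the two thresholds in the text:
    spalling when the reflected tension wave exceeds the concrete's tensile
    capacity, breach when spall depth exceeds half the panel thickness."""
    if reflected_stress_mpa <= tensile_capacity_mpa:
        return "no spall"
    if spall_depth_m > panel_thickness_m / 2:
        return "breach"
    return "spall"

print(blast_damage_mode(2.0, 3.5, 0.0, 0.20))   # no spall
print(blast_damage_mode(5.0, 3.5, 0.06, 0.20))  # spall
print(blast_damage_mode(5.0, 3.5, 0.12, 0.20))  # breach
```

Raising the concrete’s effective tensile capacity, which is what the fiber reinforcement study targets, moves the first threshold and shrinks the conditions under which spalling begins.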

“The propagation of the compression wave is reduced by the presence of low-density foam insulation,” says Naito. “We plan to study this effect and assess the potential for supplemental reinforcing materials such as carbon fibers, nylon and other synthetic fibers to improve the tensile strength of the concrete layers.”

Naito also collaborates with Auburn University. Blast testing is carried out at the Tyndall Air Force Base in Florida.

Treating bridges as a system
The ambient vibrations caused by wind, river flow and car traffic are not the most dramatic loads a bridge sustains, says Shamim Pakzad, assistant professor of structural engineering. Forced vibrations from large trucks and earthquakes cause the most significant stresses.

But an assessment of ambient vibrations can paint a revealing portrait of a bridge’s structural health and enable engineers to evaluate more precisely the effect of an extreme event.

Pakzad and his students use networks of wireless sensors to study three truss bridges near Lehigh. On the Northampton Street Bridge connecting Easton, Pa., to Phillipsburg, N.J., the students installed 28 sensor units. In one day, they collected 3 million data points, or more than 100,000 data points per sensor.

The sensors record ground vibrations to the bridge’s foundation as well as response vibrations of the beams, columns and bridge deck. The baseline data will help engineers detect damage caused over the long term by truck traffic or over the shorter term by an extreme event.

The three bridges, says Pakzad, function as a system whose performance affects the social and economic life of eastern Pennsylvania.

“If one bridge is taken out of service, its traffic has to be taken up by other bridges in the region. This increases the risk to the other bridges and requires us to look at the behavior of the overall system.

“Instead of looking at one bridge and evaluating its prognosis, you look at all the bridges as a system. A decision about one affects the others.”

Pakzad recently helped lead a team that installed 64 wireless sensor units on San Francisco’s Golden Gate Bridge. The units worked as effectively as conventional wired sensors at a fraction of the cost.

Each wired unit on the Golden Gate cost thousands of dollars, says Pakzad. Each prototype installed by his group cost $200. Mass production and new design could cut that to $10.