“Because lab tests are expensive and time-consuming, you can do only a limited number of them. Simulation enables you to extend your findings, but it goes only so far. Even the most sophisticated calculations will not produce the right result until you can confirm your mathematical model with experimental results.
“You can then use mathematical modeling to project the results of lab tests to a similar, but distinct, set of tests. As more experimental data comes in, you refine your model to improve its predictions.”
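The calibrate-and-refine loop Roy describes can be sketched in a few lines of code. Everything below is hypothetical, the power-law model, the loads and the measurements alike; it is meant only to show the shape of the workflow, not any model actually used in the lab.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(load, k, n):
    """Hypothetical response model: deflection as a power law of load."""
    return k * load ** n

# Pretend lab measurements (load, measured response) from a first test series.
loads = np.array([1.0, 2.0, 4.0, 8.0])
measured = np.array([0.9, 2.1, 4.2, 8.5])

# Confirm the mathematical model against the experiments.
params, _ = curve_fit(model, loads, measured, p0=[1.0, 1.0])

# Project the results to a similar but distinct test never run in the lab...
print("predicted response at load 6.0:", model(6.0, *params))

# ...then refine the model as new experimental data comes in.
loads = np.append(loads, 6.0)
measured = np.append(measured, 6.3)
params, _ = curve_fit(model, loads, measured, p0=params)
```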
Advances in computational capabilities over the last 15 years or so have enabled engineers to analyze and simulate more realistically how an entire structure and its design respond to different types of failure modes.
“Our goal,” says Roy, “is to define the infinite-life threshold for the variations of the critical connections in these structures so that we know that no fatigue-induced failure will occur during their design lifetime. This will enable future standards to specify how many cycles a structure can sustain.”
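The article does not give Roy's fatigue model, but the notion of an infinite-life threshold can be illustrated with a textbook Basquin-type S-N relation: above an assumed endurance limit, cycles to failure fall off as a power of the stress amplitude; below it, no fatigue failure is predicted at all. All parameter values here are invented for illustration.

```python
import math

# Illustrative Basquin S-N parameters (assumed, not from the article):
# stress amplitude sigma_a = SIGMA_F * (2 * N) ** B, with N cycles to failure.
SIGMA_F = 900.0    # fatigue strength coefficient, MPa
B = -0.09          # fatigue strength exponent
ENDURANCE = 180.0  # endurance limit, MPa: the "infinite life" threshold

def cycles_to_failure(stress_amplitude_mpa):
    """Estimated cycles a connection sustains at a given stress amplitude."""
    if stress_amplitude_mpa <= ENDURANCE:
        return math.inf  # below the threshold, no fatigue failure is predicted
    return 0.5 * (stress_amplitude_mpa / SIGMA_F) ** (1.0 / B)

for s in (150.0, 250.0, 400.0):
    print(f"{s:6.1f} MPa -> {cycles_to_failure(s):.3g} cycles")
```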
To shade or to shine
HPC has long played a role in the imaging of tumors, says Tamas Terlaky, but it has not yet reached its full potential.
“Massively parallel computing technology gives us an optimal image of the tumor and the healthy organs alongside it,” says Terlaky, the chair of the department of industrial and systems engineering. “Once you have this image, you have to decide how to radiate the tumor. Twenty years ago, radiologists used a large gamma ray beam that revolved around the body, radiating from different angles. But this was a uniform beam that often burned healthy tissue and tumor alike.”
Advances in computational modeling enable today’s radiologists to limit this damage by modulating the intensity of the beam as it moves in an arc around the tumor. This Intensity-Modulated Radiation Therapy (IMRT) is achieved by focusing the beam through a grid of pinholes as small as 1 mm across that can be shaded. Radiologists and medical physicists solve linear equations to determine which holes should be shaded, and by how much, at a large number of positions along the arc.
“You need to decide where to shade and where to shine from each position as the beam moves around,” says Terlaky. “This involves millions of variables and is far beyond what a medical doctor can do by intuition. You need a mathematical model capable of solving millions of equations to calculate the optimal pattern of radiation.”
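The article does not publish Terlaky's formulation, but a toy version of the shade-or-shine problem is easy to state: choose a nonnegative intensity for each pinhole-and-angle combination so that the delivered dose, a linear function of those intensities, matches the prescription as closely as possible. The matrix sizes and dose values below are invented; the real problems have millions of rows and columns.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)

# Toy problem: 6 voxels (4 tumor, 2 healthy), 10 pinhole/angle combinations.
# A[i, j] = dose delivered to voxel i per unit intensity through opening j.
A = rng.uniform(0.0, 1.0, size=(6, 10))

# Prescribed dose: high on tumor voxels, near zero on healthy tissue.
d = np.array([10.0, 10.0, 10.0, 10.0, 0.5, 0.5])

# Solve min ||A x - d|| subject to x >= 0: intensities cannot be negative.
# A fully "shaded" opening corresponds to x_j = 0.
x, residual = nnls(A, d)

print("opening intensities:", np.round(x, 2))
print("delivered dose:     ", np.round(A @ x, 2))
```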
This model must also overcome the uncertainties inherent in radiation therapy, says Terlaky. As your tumor is being radiated, for example, you are breathing and your body is moving. Moreover, the image on which your mathematical model is based may no longer be accurate, as you may have gained or lost weight since it was taken.
To develop models robust enough to deal with these uncertainties, Terlaky uses SeDuMi (Self-Dual Minimization), a software package that solves optimization problems over symmetric cones and can link the linear and nonlinear aspects of IMRT.
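SeDuMi itself is a MATLAB package. As a rough sketch of the kind of cone program it solves, consider a hypothetical robust dose constraint, written here in Python with the cvxpy modeling library: if the dose coefficients for a healthy voxel are known only to within a ball of radius rho, insisting that the dose limit hold for every coefficient vector in that ball yields the second-order cone constraint a·x + rho·‖x‖ ≤ b. All numbers are illustrative.

```python
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(1)

n = 10                                   # pinhole intensities to choose
a = rng.uniform(0.0, 1.0, size=n)        # nominal coefficients, healthy voxel
target = rng.uniform(0.5, 1.5, size=n)   # coefficients for the tumor voxel
b = 4.0                                  # healthy-tissue dose limit
rho = 0.1                                # radius of the uncertainty ball

x = cp.Variable(n, nonneg=True)

# Maximize tumor dose while the healthy-tissue limit holds for the worst case
# over all coefficient vectors within rho of a (a second-order cone constraint).
problem = cp.Problem(
    cp.Maximize(target @ x),
    [a @ x + rho * cp.norm(x, 2) <= b],
)
problem.solve()
print("robust tumor dose:", problem.value)
```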
“Thanks to SeDuMi, we can now solve optimization problems that are much bigger and more complicated than ever before, and we can solve them much more quickly,” says Terlaky, who directed McMaster University’s School of Computational Engineering and Science in Ontario before joining Lehigh’s faculty in 2008.
“Fifteen years ago, there was no way to solve these problems. SeDuMi has helped us solve a small fraction of them.”
Only a minority of Internet users, says Brian Davison, click past the first page of a list of search results, and barely 10 percent make it to the third page. Hundreds of millions of searches are conducted daily and a query can yield millions of results. Factor in human nature and you have the ingredients for an invisible and adversarial contest: While search engines seek to provide accurate results for clients, shady content providers use link-bombing, blog spam, comment spam and other gimmicks to falsely boost their rankings.
The phenomenon has given rise to a field of study called adversarial information retrieval (IR), which helps search engines identify Web sites that manipulate search engine rankings. Adversarial IR researchers have held five annual AIRWeb workshops; Davison, an associate professor of computer science and engineering, organized the first three and gave the keynote address at AIRWeb 2009 in Madrid, Spain.
Davison collaborates with researchers from Yahoo, Google and Microsoft to improve the quality of search engine rankings. In his work, which is supported by an NSF CAREER Award and by Microsoft, he writes algorithms that perform contextual link analysis to help search engines achieve more accurate rankings. This type of analysis gauges the reputation of a Web site by evaluating the sites that link to it.
“If 50,000 pages match a query,” says Davison, “which are most authoritative? Search engines track the number of times Web page authors link to a page. We try to be more intelligent by factoring in the topics of the sites that link to the pages that come up in a search. We determine this by following links on the Internet. The links that a Web site chooses are regarded by search engines as expressions of popularity.”
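The article does not detail Davison's algorithms, but the flavor of link-based reputation with a contextual twist can be shown with a tiny PageRank-style power iteration in which links from on-topic pages are weighted more heavily. The four-page web, the topic flags and the weights are all invented.

```python
import numpy as np

# Invented four-page web: (src, dst) means page src links to page dst.
edges = [(0, 1), (0, 2), (1, 2), (2, 0), (3, 2), (3, 1)]
on_topic = np.array([1.0, 1.0, 1.0, 0.0])  # page 3 is off-topic for the query

n = 4
W = np.zeros((n, n))  # W[dst, src] = weight of the link src -> dst
for src, dst in edges:
    # Contextual twist: a link from an on-topic page counts triple.
    W[dst, src] = 3.0 if on_topic[src] else 1.0

# Column-normalize so each page splits its vote among its out-links.
col_sums = W.sum(axis=0)
W = W / np.where(col_sums > 0, col_sums, 1.0)

# Damped power iteration (the classic PageRank recurrence).
damping, rank = 0.85, np.full(n, 1.0 / n)
for _ in range(100):
    rank = (1 - damping) / n + damping * (W @ rank)

print("reputation scores:", np.round(rank / rank.sum(), 3))
```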
Davison analyzes hundreds of millions of Web pages. Hoping to push that number to one billion, he has assembled dozens of hard drives with tens, and eventually hundreds, of terabytes of storage. “To follow the Web in a believable way, we work with ever-larger collections of data. As a result, we need machines with lots of memory. We can perform our calculations much faster if our entire data set fits in memory.”
Fighting search engine spam, Davison said at AIRWeb 2009, is like playing a high-stakes game of chess against an opponent who constantly changes the rules.
“The Web is becoming the sum of human knowledge. It’s vital to know what information can be considered true or objective.”