
what Punk Rock OR is reading

Have a wonderful Fourth of July weekend!

  1. The queen of college tours: a post by Bill Cook about the TSP and how to solve it.
  2. When maps lie: a fascinating read about geography and map literacy
  3. How analytics transformed the NBA
  4. An overpass built for a bear
  5. Finding the beauty in optimization models: visualizing MPS files by Imre Polik at SAS. I also found a 1987 paper by Irv Lustig [pdf] that does just that using old school tools.

happy belated anniversary operations research and management science!

A recent literature review turned up a reference to a classic 1981 paper by Marshall Fisher on the Lagrangian relaxation method for integer programming. I was surprised to see a 2004 publication date, and upon a closer look, I noticed that the paper had been republished in 2004 in a special issue of Management Science devoted to the ten most influential papers in the journal’s first 50 years. I didn’t have a blog in December 2004 when the issue came out, so I am going to wish Management Science a belated anniversary 10.5 years later.

The list of papers is pretty amazing. It includes:

  1. Linear Programming Under Uncertainty by George Dantzig
  2. Dynamic Version of the Economic Lot Size Model by Harvey M. Wagner, Thomson M. Whitin
  3. A Suggested Computation for Maximal Multi-Commodity Network Flows by L. R. Ford Jr., D. R. Fulkerson
  4. Optimal Policies for a Multi-Echelon Inventory Problem by Andrew Clark and Herbert Scarf
  5. Jobshop-like Queueing Systems by James Jackson
  6. Games with Incomplete Information Played by “Bayesian” Players, I–III: Part I. The Basic Model by John Harsanyi
  7. A New Product Growth for Model Consumer Durables by Frank Bass
  8. Models and Managers: The Concept of a Decision Calculus by John D. C. Little
  9. The Lagrangian Relaxation Method for Solving Integer Programming Problems by Marshall L. Fisher
  10. Information Distortion in a Supply Chain: The Bullwhip Effect by Hau L. Lee, V. Padmanabhan, and Seungjin Whang

Later I discovered an expanded list of the 50 most influential papers. Additionally, there are anniversary review papers in every issue of the 50th volume of Management Science, including “Improving emergency responsiveness with management science” by Linda Green and Peter Kolesar (one of my favorites). More are here.

The journal Operations Research celebrated its 50th anniversary in 2002. Volume 50, issue 1 of Operations Research is dedicated to the celebration, and it contains 33 articles with musings on the origins of important breakthroughs in operations research:

  1. The Genesis of “Optimal Inventory Policy” by Kenneth J. Arrow
  2. Solving Real-World Linear Programs: A Decade and More of Progress by Robert E. Bixby
  3. Crime Modeling by Alfred Blumstein
  4. Army Operations Research—Historical Perspectives and Lessons Learned by Seth Bonder
  5. Abraham Charnes and W. W. Cooper (et al.): A Brief History of a Long Collaboration in Developing Industrial Uses of Linear Programming by W. W. Cooper
  6. Linear Programming by George B. Dantzig
  7. Richard Bellman on the Birth of Dynamic Programming by Stuart Dreyfus
  8. Some Origins of Operations Research in the Health Services by Charles D. Flagle
  9. The First Linear-Programming Shoppe by Saul Gass
  10. The Origins of Traffic Theory by Denos C. Gazis
  11. Early Integer Programming by Ralph E. Gomory
  12. War and Peace: The First 25 Years of OR in Great Britain by K. Brian Haley
  13. Energy Modeling for Policy Studies by William W. Hogan
  14. Learning How to Plan Production, Inventories, and Work Force by Charles C. Holt
  15. Comments on the Origin and Application of Markov Decision Processes by Ronald A. Howard
  16. Navy Operations Research by Wayne P. Hughes Jr.
  17. Interdisciplinary Meandering in Science by Samuel Karlin
  18. How Networks of Queues Came About by Jim Jackson
  19. Creating a Mathematical Theory of Computer Networks by Leonard Kleinrock
  20. Being in the Right Place at the Right Time by Harold W. Kuhn
  21. Public Sector Operations Research: A Personal Journey by Richard Larson
  22. Philip M. Morse and the Beginnings by John D. C. Little
  23. Operations Research at Arthur D. Little, Inc.: The Early Years by John F. Magee
  24. Efficient Portfolios, Sparse Matrices, and Entities: A Retrospective by Harry M. Markowitz
  25. Perspectives on the Evolution of Simulation by Richard E. Nance and Robert G. Sargent
  26. Memoirs on Highway Traffic Flow Theory in the 1950s by G. F. Newell
  27. Decision Analysis: A Personal Account of How It Got Started and Evolved by Howard Raiffa
  28. Inventory Theory by Herbert E. Scarf
  29. Game Theory and Operations Research: Some Musings 50 Years Later by Martin Shubik
  30. Analysis, Design, and Control of Queueing Systems by Shaler Stidham Jr.
  31. And Then There Were None by Harvey M. Wagner
  32. Applied Probability in Great Britain by Peter Whittle

There are other anniversary collections. To mark the 50th anniversary of Gomory’s 1958 cutting-plane paper, Springer published a book called “50 Years of Integer Programming, 1958–2008,” edited by M. Juenger, Th. M. Liebling, D. Naddef, G. L. Nemhauser, W. R. Pulleyblank, G. Reinelt, G. Rinaldi, and L. A. Wolsey. The book contains new material summarizing important integer programming algorithms and ideas that have been introduced over the years. The book is described as follows:

In 1958, Ralph E. Gomory transformed the field of integer programming when he published a short paper that described his cutting-plane algorithm for pure integer programs and announced that the method could be refined to give a finite algorithm for integer programming. In January of 2008, to commemorate the anniversary of Gomory’s seminal paper, a special session celebrating fifty years of integer programming was held in Aussois, France, as part of the 12th Combinatorial Optimization Workshop. This book is based on the material presented during this session.

Peter Horner wrote a 2002 article in OR/MS Today marking the 50th anniversary of the profession, reflecting on the history of INFORMS and its predecessor societies and on where we need to go as a field. He ends his article with this:

After 50 years of combined history, INFORMS still finds itself knee-deep in confusion. Plenty of problems have been solved… and plenty of problems remain.

What is your favorite OR/MS birthday memory?


what Punk Rock OR is reading

Here are a few things I am reading:

  1. How an economist helped patients find the right kidney donors: an NPR article about Nobel Laureate Al Roth’s work in matching markets
  2. The unreasonable effectiveness of Random Forests.
  3. Anna Nagurney’s post on why she has fallen in love with Sweden.
  4. What it’s like as a “girl” in the lab: from the New York Times.
  5. Math-inspired art from Wired includes Bob Bosch’s TSP art
  6. Make craftier engineers: Why students should learn to sew in STEM classes
  7. Yes, androids dream of electric sheep – a Guardian article about how Google set up a feedback loop in its image classification algorithms


Left: Original painting by Georges Seurat. Right: processed images by Matthew McNaughton, Software Engineer


my national academies committee experience & risk-based flood insurance

I had the pleasure of serving on a National Academies committee the past two years. Our report entitled “Tying flood insurance to flood risk for low-lying structures in the floodplain” was just released [Link].

If you don’t know much about the National Academies, it is a private, independent, nonprofit institution that provides technical expertise on important societal problems (engineering, in my case). National Academies committees like the one I served on address a specific challenge and have a very specific charge. Each committee is composed of a bunch of really smart people who work together to answer the questions posed in the charge. FEMA provided the charge for my committee.

The specific charge is below, but a bit of background is necessary to understand why the problem is so important and why it had to be addressed now. Recently, I blogged about floods and their huge impact on society [Link]. After a series of hurricanes that caused extensive flood damage to properties, the National Flood Insurance Program (NFIP) was created in 1968 to reduce the risk of flood and mitigate flood damage by encouraging better flood management and building practices. The idea was that homeowners in flood-prone areas (“Special Flood Hazard Areas” – areas with an annual chance of flooding of 1% or more) would have to purchase flood insurance to help cover the cost of disasters. Today, most homeowners in Special Flood Hazard Areas pay the going rate based on an elaborate formula set by FEMA. There are currently about 5.5 million flood insurance policies.

Those houses that already existed in a Special Flood Hazard Area in 1968 could be grandfathered into the program and receive subsidized rates. Over time, the hope was that these existing houses in Special Flood Hazard Areas would be replaced, thus reducing exposure in flood-prone areas. But they were not. They continue to exist and are expensive for FEMA when disasters strike. This is a huge problem. FEMA’s insurance premium formula and risk-based actuarial rates are incredibly sensitive to the elevation of the home relative to base flood elevation. These homeowners may pay $200 per year for a flood insurance premium when a risk-based actuarial rate may be thousands of dollars. These houses are negatively elevated, meaning that they are below base flood elevation and flood frequently. There are a lot of these structures out there, and they are costly to FEMA.

The Biggert-Waters (BW) Flood Insurance Reform Act of 2012 required these subsidized policies to disappear overnight, turning this important problem into an urgent one. Subsequent legislation changed some of this, but the bottom line was that subsidized rates would rise, substantially for some. FEMA wanted a review of how it sets its rates so that they would be credible, fair, and transparent. That is where the committee came in.

Here is our study charge set by FEMA. In our conversations with FEMA actuaries, FEMA asked for shorter-term (within 5 years) and longer-term recommendations for improving their methods. FEMA asked us to look at how premiums are set and how the process could be improved. We focused on the math; another committee addressed the societal impact of the changes.

Study Charge
An ad hoc committee will conduct a study of pricing negatively elevated structures in the National Flood Insurance Program. Specifically, the committee will:

  1. Review current NFIP methods for calculating risk-based premiums for negatively elevated structures, including risk analysis, flood maps, and engineering data.
  2. Evaluate alternative approaches for calculating “full risk-based premiums” for negatively elevated structures, considering current actuarial principles and standards.
  3. Discuss engineering, hydrologic, and property assessment data and analytical needs associated with fully implementing full risk-based premiums for negatively elevated structures.
  4. Discuss approaches for keeping these engineering, hydrologic, or property assessment data updated to maintain full risk-based rates for negatively elevated structures.
  5. Discuss feasibility, implementation, and cost of underwriting risk-based premiums for negatively elevated structures, including a comparison of factors used to set risk-based premiums.

We reached ten conclusions:

  1. Careful representation of frequent floods in the NFIP water surface elevation–probability functions (PELV curves) is important for assessing losses for negatively elevated structures.
  2. Averaging the average annual loss over a large set of PELV curves leads to rate classes that encompass high variability in flood hazard for negatively elevated structures, and thus the premiums charged are too high for some policyholders and too low for others.
  3. NFIP claims data for a given depth of flooding are highly variable, suggesting that inundation depth is not the only driver of damage to structures or that the quality of the economic damage and inundation depth reports that support the insurance claims is poor.
  4. When the sample of claims data is small, the NFIP credibility weighting scheme assumes that U.S. Army Corps of Engineers damage estimates are better than NFIP claims data, which has not been proven.
  5. Levees may reduce the flood risk for negatively elevated structures, even if they do not meet NFIP standards for protection against the 1 percent annual chance exceedance flood.
  6. When risk-based rates for negatively elevated structures are implemented, premiums are likely to be higher than they are today, creating perverse incentives for policyholders to purchase too little or no insurance. As a result, the concept of recovering loss through pooling premiums breaks down, and the NFIP may not collect enough premiums to cover losses and underinsured policyholders may have inadequate financial protection.
  7. Adjustments in deductible discounts could help reduce the high risk-based premiums expected for negatively elevated structures.
  8. Modern technologies, including analysis tools and improved data collection and management capabilities, enable the development and use of comprehensive risk assessment methods, which could improve NFIP estimates of flood loss.
  9. Risk-based rating for negatively elevated structures requires, at a minimum, structure elevation data, water surface elevations for frequent flood events, and new information on structure characteristics to support the assessment of structure damage and flood risk.
  10. The lack of uniformity and control over the methods used to determine structure replacement cost values and the insufficient quality control of NFIP claims data undermine the accuracy of NFIP flood loss estimates and premium adjustments.

You can read more about our report and its conclusions in the press release.

The committee was composed of 12 members and included civil engineers, risk analysts, actuaries, and one retired FEMA employee. Our fearless chair David Ford did a lot of the heavy lifting in terms of crafting our core conclusions. National Academies staff member Anne Linn was incredibly helpful in getting us focused, keeping us on track, and writing the report. National Academies staff member Anita Hall handled the logistics and was incredibly responsive to our travel needs. The committee met in person four times and wrote parts of the report. The report was sent out to reviewers, and we revised parts of it in response to reviewer comments, much like in a peer-reviewed journal. We couldn’t have done this without David and Anne (many thanks!).

Serving on the committee helped me understand the importance of flooding from many perspectives. I bought a new house during my time on the committee. My new house is on top of a ridge with virtually no chance of flooding.

Serving on the committee also helped me to learn about state-of-the-art techniques in civil engineering and risk-based insurance. Our colleagues in other fields do some pretty cool things, and we can all work together to make the world a better place. I’m proud of our final report – I hope it leads to more credible, fair, and transparent NFIP flood insurance premiums.



aviation security: is more really more?

Aviation security has been in the news this week after ABC released a report suggesting that 95% of mock explosives and weapons made it through airport checkpoint screening undetected in covert tests.

There are several operations research challenges in passenger screening that address how the Transportation Security Administration (TSA) can set up, staff, and use its limited resources for screening passengers. My advisor Sheldon Jacobson has been working on aviation security issues since 1996 (!) and has amassed a large number of papers on passenger and baggage screening. His work provides the critical technical analysis that is at the foundation of security operations at commercial airports throughout the United States, including the fundamental analysis that laid the basis for risk-based security, which in turn led to TSA PreCheck. I wrote several of these papers with Sheldon when I was a PhD student.

Sheldon Jacobson was interviewed by Defense One about risk-based passenger screening issues.

“Ultimately, we’re dealing with people’s intent more than items. Which concerns you more: a person who has no bad intent but who has an item on them like a knife or a gun, or someone who has bad intent but doesn’t have such an item?” [Jacobson] said. “Most people are comfortable with the former rather than the latter. A person with bad intent will find a way to cause damage. A person without bad intent who happens to have an item on them is not the issue.”

Risk-based systems can help solve that problem, but only when used correctly. The most famous and widely used is TSA’s PreCheck, which launched in December 2013. It allows U.S. citizens and permanent residents who submit to a somewhat strict background check (including an in-person meeting and fingerprint scan) to receive expedited screening at airports for five years. Jacobson says the best thing policy-makers could do to improve airport security is to get a lot more people into PreCheck.

The TSA screening policies focus more on finding prohibited items than on identifying passengers who intend to do harm. As evidence of this, think about the time and energy used to find bottles of liquids and gels that we have accidentally left in our bags. As further evidence, recall that the box cutters and small knives used in the September 11, 2001 hijackings were not even prohibited in the first place.

Ultimately, an overhaul of screening requires more than just operations research. We also need new technologies for screening and new training programs. Promising new technology may be just around the corner. Fast Company has an article about how to use biometrics to identify those with bad intent (rather than those who accidentally left a water bottle in their carry-on).

I’ll end today’s post with a recent article from The Onion. The Onion is a bit pessimistic – we will always have security challenges. Hopefully we can make big improvements in security at airports and if we do, operations research will have played an important role.


flood risks and management science

This week’s flooding in Texas highlights how vulnerable we are to flood risks. Texas is extremely prone to flooding yet is among the worst states when it comes to flood-control spending. Texas is exposed to significant riverine flooding in addition to storm surge flooding caused by hurricanes and tropical storms. Texas has the second-most flood insurance premiums in the US, behind only Florida.

In the past year, I have been serving on a National Academies committee on flood insurance for negatively elevated structures. I have learned a lot about flood insurance, incentives, and risk. I can’t say anything about the report yet except that it will be released soon, but I can tell you a little about the problem and how management science has helped us understand how to mitigate flood risks.

Floods occur frequently (although not frequently enough to motivate people to mitigate against the risks), and when floods occur, they do a lot of damage. The damage is usually measured in terms of structure and property damage, but flooding also leads to loss of life and injuries. Flooding is not just a coastal issue – floods occur along rivers, in areas with high water tables, and in urban areas where infrastructure channels water in such a way that it creates flood risks. Cook County, Illinois has chronic urban flooding issues that are expensive. Floods lead to enormous socioeconomic costs. Two-thirds of Presidential disaster declarations involve floods.

The basic approach to managing flood risks is to (1) encourage people not to build in floodplains and (2) build less flood-prone structures and communities to reduce the damage when floods do occur. Getting this to happen is tough. Improving infrastructure requires an enormous investment, whether by society (e.g., flood walls), communities (e.g., FEMA’s Community Rating System), or individuals (e.g., elevating a house). A Citylab article criticizes Texas for not investing in infrastructure that could reduce the impact of floods.

On an individual level, flood insurance is required for those who live in “Special Flood Hazard Areas” (a floodplain; FEMA defines a “Special Flood Hazard Area” as an area with a 1% or greater annual chance of flooding). Flood insurance can be really expensive, which can encourage individual homeowners to either forgo insurance or mitigate against flooding. Elevating a house is expensive, but it may be a good financial choice if it reduces flood insurance premiums by thousands of dollars per year. The reality is that most homeowners do not have a flood insurance policy even when it is required, because insurance is expensive and is perceived as unnecessary. Many homeowners in floodplains go decades between floods, and despite the warnings and requirements, they do not see the need to pay so much for flood insurance when they have not experienced a flood.

I recommend reading Catherine Tinsley and Robin Dillon-Merrill’s paper on “near miss” events in Management Science and their follow-up paper with Matthew Cronin. Their papers demonstrate that when someone survives an event like a natural disaster that was not fatal or too traumatic (e.g., a flood that didn’t do too much/any damage), they are likely to make riskier decisions in the future (e.g., they cancel their flood insurance).

A Wharton report by Jeffrey Czajkowski, Howard Kunreuther and Erwann Michel-Kerjan entitled “A Methodological Approach for Pricing Flood Insurance and Evaluating Loss Reduction Measures: Application to Texas” specifically analyzed flood loss reduction methods in Texas. They recommend supplementing NFIP coverage with private flood insurance (FEMA currently provides homeowners with flood insurance through the National Flood Insurance Program) to encourage more homeowners to purchase insurance. They also evaluate the impact of elevating structures in the most flood-prone areas, and they find that mitigation through elevation can be instrumental in reducing much of the damage.

What have you experienced with flooding?


domino optimization art

Domino opt-art version of me

I discovered a picture of me in my student lab – one of the students optimized me for a class project using dominos(!)  My second blog post ever was about Bob Bosch’s optimization art – see some of his domino art here. It’s worth revisiting opt art.

Bob Bosch wrote about his domino optimization models in Math Horizons, a trade journal from the Mathematical Association of America, and OR/MS Today. Bob also does other types of optimization art (TSP art, mosaic art, etc.). Let’s take a closer look at domino art.

The art is created by solving an integer programming model that finds the arrangement of dominoes that is closest to the picture. When complete sets of dominoes are used, there is a limited number of each type of tile, and the basis of the optimization model is to intelligently assign each domino (a limited resource) to the “right” part of the photo.

First, the photo of interest is divided into an m × n grid of squares. The goal is to fully use s complete sets of dominoes, where each set contains 55 tiles and each tile covers two squares. Therefore, the photo must be divided into squares such that it satisfies

mn = 110s.

For example, s = 3 sets gives mn = 330 squares, which a 15 × 22 grid satisfies.

The photo must then be divided into a set P of adjacent pairs of squares to account for the different domino orientations (i.e., laying a tile horizontally or vertically). Decision variable x_{dp} is 1 if we place domino d on the photo to cover pair p, and 0 otherwise. There is a parameter c_{dp} that captures the “cost” of placing domino d on the photo to cover pair p, based on the brightness of the photograph and the number of dots on the tile. This then gives us an integer programming model that is a variation of the assignment problem:

[Figure: the integer programming model for domino art]
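Here is a sketch of that formulation, reconstructed from the surrounding description (my notation, not taken verbatim from Bob’s article): D is the multiset of dominoes in the s sets, P is the set of candidate adjacent-pair placements, and P(i,j) is the set of placements that cover square (i,j).

```latex
\begin{align*}
\min \quad & \sum_{d \in D} \sum_{p \in P} c_{dp}\, x_{dp} \\
\text{s.t.} \quad & \sum_{p \in P} x_{dp} = 1 \qquad \text{for every domino } d \in D, \\
& \sum_{d \in D} \sum_{p \in P(i,j)} x_{dp} = 1 \qquad \text{for every square } (i,j), \\
& x_{dp} \in \{0,1\} \qquad \text{for all } d \in D,\ p \in P.
\end{align*}
```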

The objective function minimizes the deviation between the photo and the placement of the dominoes. The first set of constraints ensures that all dominoes are used. The second set of constraints ensures that each square of the photo is covered by exactly one domino. Bob’s article in Math Horizons has all the details on constructing the set P and computing the costs. There is no shortage of cool opt art photos of Bob’s creations – check out his stuff on Twitter and his web site.
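If you want to play with the idea, here is a tiny, self-contained sketch in Python using the open-source PuLP library. It is my own toy reconstruction under made-up assumptions (a hypothetical 2 × 2 “photo,” two hypothetical dominoes, and a simple squared-mismatch cost), not Bob’s actual code or cost function.

```python
# Toy sketch (not Bob Bosch's code) of the domino assignment model described above.
import pulp

# Made-up photo: brightness of each square in a 2 x 2 grid (0 = dark, 9 = bright).
brightness = {(0, 0): 1, (0, 1): 8, (1, 0): 3, (1, 1): 6}
squares = list(brightness)

# Candidate placements: all horizontally and vertically adjacent pairs of squares.
pairs = [((0, 0), (0, 1)), ((1, 0), (1, 1)),   # horizontal placements
         ((0, 0), (1, 0)), ((0, 1), (1, 1))]   # vertical placements

# Two hypothetical dominoes, each identified by its pair of dot counts.
dominoes = [(1, 8), (3, 6)]

def cost(dots, pair):
    """Placeholder mismatch cost between a domino's dots and the photo's brightness,
    taking the cheaper of the tile's two orientations (not Bosch's exact formula)."""
    a, b = dots
    s1, s2 = pair
    forward = (a - brightness[s1]) ** 2 + (b - brightness[s2]) ** 2
    backward = (b - brightness[s1]) ** 2 + (a - brightness[s2]) ** 2
    return min(forward, backward)

model = pulp.LpProblem("domino_art", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (range(len(dominoes)), range(len(pairs))), cat="Binary")

# Objective: total deviation between the photo and the domino placement.
model += pulp.lpSum(cost(dominoes[d], pairs[p]) * x[d][p]
                    for d in range(len(dominoes)) for p in range(len(pairs)))

# Every domino must be placed exactly once.
for d in range(len(dominoes)):
    model += pulp.lpSum(x[d][p] for p in range(len(pairs))) == 1

# Every square of the photo must be covered by exactly one domino.
for s in squares:
    model += pulp.lpSum(x[d][p]
                        for d in range(len(dominoes))
                        for p in range(len(pairs)) if s in pairs[p]) == 1

model.solve(pulp.PULP_CBC_CMD(msg=False))
for d in range(len(dominoes)):
    for p in range(len(pairs)):
        if x[d][p].value() > 0.5:
            print(f"domino {dominoes[d]} covers squares {pairs[p]}")
```

Even on this toy instance, the structure matches the model above: one binary variable per (domino, placement) pair, one constraint per domino, and one constraint per square.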

What do you think of my domino photo? I think it’s terrific, but I prefer the non-optimized version of me.

