Tag Archives: homeland security

aviation security: there and back again

This week I attended the CREATE/TSA Symposium on Aviation Security at the University of Southern California campus. The symposium was jointly organized by the Center for Risk and Economic Analysis of Terrorism Events (CREATE) and the Transportation Security Administration (TSA).

It was a nice conference attended by academics, those at government agencies (TSA, DHS, Coast Guard, etc.), and those in the private sector. It was a good mix of attendees and speakers, and no one was shy about raising interesting and provocative ideas. Many issues were discussed in the conference from multiple viewpoints, including:

  • Are we more concerned with people with a nefarious intent and no threat items or people with no bad intent but with threat items?
  • How do we even begin to characterize the deterrent effect?
  • Good security means making tradeoffs between efficiency, effectiveness, and cost.
  • Government agencies want more collaboration with academics. Almost all non-academic speakers mentioned this.
  • What about drone security?

It was clear that aviation is still a favorite target among terrorists and that aviation security issues are still challenging. Operations research tools such as risk analysis and optimization are needed to put good ideas into action. It was nice to hear that the practitioners feel this way too. We will always have security challenges, and OR will always help us address some of these challenges.

My advisor Sheldon Jacobson talked about his work in this area, including his work with me that introduced the concept of risk-based screening (see a previous article here). Two other PhD students followed me and continued work in this area. Our work addresses how to optimally target scarce screening resources at passengers based on their risk. The models are resource allocation models that allocate screening resources to passengers statically and dynamically (in real time). The central theme is to use limited screening resources wisely. There are inherent tradeoffs in these decisions: with a fixed set of resources, targeting too many resources at low-risk passengers leaves fewer resources for higher-risk passengers.
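To give a flavor of the static version, here is a minimal knapsack-style sketch (the numbers and the greedy policy are illustrative only, not the actual published models):

```python
# Toy static screening-allocation sketch (illustrative only): with a
# fixed number of enhanced-screening slots, send the highest-risk
# passengers to enhanced screening and everyone else to standard.
def allocate_screening(risk_scores, enhanced_capacity):
    """Return a dict mapping passenger index -> 'enhanced' or 'standard'.

    Greedy on risk is optimal in this toy setting because every
    enhanced slot is identical and each passenger uses exactly one.
    """
    ranked = sorted(range(len(risk_scores)),
                    key=lambda i: risk_scores[i], reverse=True)
    assignment = {i: "standard" for i in range(len(risk_scores))}
    for i in ranked[:enhanced_capacity]:
        assignment[i] = "enhanced"
    return assignment

scores = [0.02, 0.90, 0.10, 0.75, 0.05]
print(allocate_screening(scores, 2))  # passengers 1 and 3 get enhanced screening
```

The real models are richer (multiple device types, dynamic arrivals, uncertainty in the risk estimates), but the tradeoff is the same: every enhanced slot spent on a low-risk passenger is one that cannot go to a higher-risk passenger.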

Some of the critical findings from our research include:

  • We want to match passenger risk with the right amount of security resources.
  • Risk-based screening is great because it uses limited screening resources in an intelligent way. Random screening, or screening everyone with all of the resources, is not an intelligent use of resources (although some randomness can be effective when used intelligently – it just shouldn’t be the only way to use limited resources).
  • When risk is underestimated, high value security resources get used on high risk passengers (a good thing). Finding a threat passenger is like finding a needle in a haystack. Underestimating risk helps you make a smaller haystack.
  • When risk is overestimated, high value security resources get used on low risk passengers, which may leave fewer high value security resources available for high risk passengers. Overestimating risk prevents you from making a smaller haystack (everyone looks risky!)
  • TSA PreCheck implicitly underscreens by weeding out many of the non-risky passengers to make a smaller “haystack.” PreCheck has the potential to make the air system safer in low risk, cost-constrained environments. Side note: TSA PreCheck didn’t exist when I was a PhD student working in this area, but earlier ideas and programs were out there (e.g., trusted traveler programs).
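The under/overestimation point can be illustrated with a toy Monte Carlo, entirely of my own invention (none of these numbers come from our papers): passengers whose estimated risk exceeds a threshold are flagged for enhanced screening, and a fixed budget caps how many flagged passengers can actually be screened.

```python
import random

def fraction_caught(bias, n=10000, budget=500, threshold=0.95,
                    risky_cutoff=0.995, trials=50, seed=0):
    """Toy model (illustrative only): each passenger has a true risk
    score uniform on [0, 1]; anyone whose *estimated* score
    (true score + bias) exceeds the threshold is flagged for enhanced
    screening. If more passengers are flagged than the budget allows,
    a random subset gets screened. Returns the fraction of truly
    high-risk passengers (score > risky_cutoff) who get screened."""
    rng = random.Random(seed)
    caught, risky = 0, 0
    for _ in range(trials):
        scores = [rng.random() for _ in range(n)]
        flagged = [i for i, s in enumerate(scores) if s + bias > threshold]
        screened = set(flagged if len(flagged) <= budget
                       else rng.sample(flagged, budget))
        for i, s in enumerate(scores):
            if s > risky_cutoff:
                risky += 1
                caught += i in screened
    return caught / risky

# Overestimating risk floods the checkpoint (a bigger haystack);
# mild underestimation keeps every screening slot on the riskiest few.
print(fraction_caught(bias=0.05))   # overestimate: too many flagged
print(fraction_caught(bias=-0.02))  # mild underestimate: smaller haystack
```

In this sketch, overestimation flags far more passengers than the budget can handle, so high-risk passengers must compete for slots; mild underestimation flags a set small enough that everyone in it, including the truly risky, gets screened.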

It was nice hearing from TSA practitioners who read my papers with Sheldon and used our ideas to guide changes to policy.

Sheldon will give the long version of this talk in Arlington, Virginia on August 5 at a WINFORMS meeting. Details are here.

You can also listen to my podcast interview with Sheldon about aviation security from 2011 here.

Special thanks to Dr. Ali Abbas (CREATE director), Kenneth Fletcher (TSA), and Jerry Booker (TSA) for organizing the conference, and to Stephen Gee, Lori Beltran, and Michael Navarrete for their hard work putting it together. Ali promised to write an OR/MS Today article about the symposium, so stay tuned for more details.

TSA/CREATE Symposium attendees


Sheldon Jacobson talks about our aviation security research

 


aviation security: is more really more?

Aviation security has been in the news this week after ABC released a report suggesting that 95% of explosives go undetected when passengers go through checkpoint screening at airports.

There are several operations research challenges in passenger screening that address how the Transportation Security Administration (TSA) can set up, staff, and use its limited resources for screening passengers. My advisor Sheldon Jacobson has been working on aviation security issues since 1996 (!) and has amassed a large number of papers on passenger and baggage screening. His work provides the critical technical analysis that is at the foundation of security operations at commercial airports throughout the United States, including the fundamental technical analysis that laid the basis for risk-based security, which in turn led to TSA PreCheck. I wrote several of these papers with Sheldon when I was a PhD student.

Sheldon Jacobson was interviewed by Defense One about risk-based passenger screening issues.

“Ultimately, we’re dealing with people’s intent more than items. Which concerns you more: a person who has no bad intent but who has an item on them like a knife or a gun, or someone who has bad intent but doesn’t have such an item?” [Jacobson] said. “Most people are comfortable with the former rather than the latter. A person with bad intent will find a way to cause damage. A person without bad intent who happens to have an item on them is not the issue.”

Risk-based systems can help solve that problem, but only when used correctly. The most famous and widely used is TSA’s PreCheck, which launched in December 2013. It allows U.S. citizens and permanent residents who submit to a somewhat strict background check (including an in-person meeting and fingerprint scan) to receive expedited screening at airports for five years. Jacobson says the best thing policy-makers could do to improve airport security is to get a lot more people into PreCheck.

The TSA screening policies focus more on finding prohibited items than on stopping people with bad intent. As evidence of this, think about the time and energy used to find bottles of liquids and gels that we have accidentally left in our bags. As further evidence, recall that the box cutters and small knives used in the September 11, 2001 hijackings were not even prohibited at the time.

Ultimately, an overhaul of screening requires more than just operations research. We also need new technologies for screening and new training programs. Promising new technology may be just around the corner. Fast Company has an article about how to use biometrics to identify those with bad intent (rather than those who accidentally left a water bottle in their carry-on).

I’ll end today’s post with a recent article from The Onion. The Onion is a bit pessimistic – we will always have security challenges. Hopefully we can make big improvements in security at airports and if we do, operations research will have played an important role.


Punk Rock OR Podcast #4: Sheldon Jacobson on aviation security

The fourth edition of the Punk Rock OR Podcast is out. With the 10th anniversary of September 11 coming up, I decided a podcast episode on aviation security was in order. Dr. Sheldon Jacobson from the University of Illinois at Urbana-Champaign agreed to chat with me about his research on aviation security to highlight the role of operations research in homeland security.

Don’t forget to subscribe to the podcast feed via the podcast web site.

 

Sheldon Jacobson


terrorism analytics

For this month’s blog challenge, I was inspired by one of the month’s big stories: how Osama bin Laden was caught.

The Navy SEALs deservedly get a lot of credit for the role they play in the ongoing wars on terror, but nerds also play a critical role in fighting terror. The intelligence that played a role in finding bin Laden depended on people on the ground in foreign countries as well as on analytics. Many intelligence agencies are populated by nerds who use analytical techniques on the large volume of data they collect.

We’ll never know exactly how important analytics is in fighting terrorism, but I’ve written a few thoughts here.

Various US government agencies collect and analyze an enormous amount of data on a daily basis.  The NSA collects data equivalent in size to the Library of Congress every six hours.  All of this data obviously cannot be scrutinized at a detailed level (hopefully they don’t get to all of it; I may be put on a watch list if someone looks at the Google search terms I used to write this post).  A data-rich environment can lead to excellent decision-making if care is taken in deciding how to target one’s limited analytical resources.  In the terrorism example, how does one determine

  1. which cell phone communications to record?
  2. which phone conversations deserve a transcript and which emails need to be translated?
  3. which data to summarize as metadata?

Another problem with terrorism analytics is the lack of a proper dependent variable.  For example, suppose you collect some cell phones that were used by known terrorists. If you want to map the terrorists’ social networks by examining the calls sent and received from their phones, it is impossible to know whether their calls were made to other terrorists (unless some of the numbers belong to known terrorists).

This problem is not unlike, say, credit card companies trying to detect fraud.  Both terrorism and fraud detection involve finding a needle in a haystack.  However, terrorist social networks are large and involve many types of transactions (rather than, say, just credit card transactions). Osama bin Laden used flash drives and written communication delivered by courier, whereas others lower in the food chain use cell phones, land lines, email, etc. Credit card companies can also make decisions, like dropping risky customers, that have no analog in fighting terror.
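The social-network idea can be sketched with a plain adjacency map built from call records (all phone identifiers below are hypothetical, and heavy overlap with known numbers is only a weak signal worth a closer look, not evidence of anything):

```python
from collections import defaultdict

# Toy call-graph sketch (all identifiers hypothetical): build an
# adjacency map from call records, then rank unknown numbers by how
# many known "seed" phones they are in contact with.
def rank_contacts(call_records, seeds):
    """call_records: iterable of (caller, callee) pairs.
    seeds: set of phone numbers already tied to known suspects.
    Returns unknown numbers sorted by how many seeds they touch."""
    neighbors = defaultdict(set)
    for a, b in call_records:
        neighbors[a].add(b)
        neighbors[b].add(a)
    overlap = {num: len(nbrs & seeds)
               for num, nbrs in neighbors.items() if num not in seeds}
    return sorted(overlap, key=overlap.get, reverse=True)

calls = [("s1", "x"), ("s2", "x"), ("s1", "y"), ("z", "y"), ("s2", "w")]
print(rank_contacts(calls, {"s1", "s2"}))  # "x" ranks first (touches both seeds)
```

Note the dependent-variable problem from the paragraph above is still present: the ranking says nothing about whether "x" is a co-conspirator, a family member, or a pizza place.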

It’s “easier” for a credit card company to determine who is fraudulent based on having more knowledge about their customers and having more certainty about their dependent variable (whether fraud is an issue). My credit card company called me while I was on vacation this winter, since my unusual purchases set off some kind of red flag.  I was able to verify that no fraud was taking place after I answered a few questions.  I was glad that they were looking out for me. No harm no foul.

Analytics used for fighting terror includes mining cell phone traffic for patterns, performing social network analysis of terrorist organizations, and creating systems for analyzing the risk of air passengers or cargo containers (this report summarizes some of the analytical techniques that have been used). There are certainly some fascinating examples that are classified, but we’ll have to speculate about those.

I’ve enjoyed the other blog posts about Analytics, especially those that discuss how analytics fits with the past and future of operations research. Please check out the other OR blogs to read more about analytics.



A post from the INFORMS computing society conference: are application-oriented conferences too specialized?

I am attending the INFORMS Computing Society (ICS) Conference this week.  This is my first ICS Conference.  In addition to the focus on computing, the conference selected homeland security for its theme, with a secondary theme of energy security.  I initially found it unusual for an already-specialized conference to have an application-area focus.

On second thought, many conferences have application-area focuses, such as transportation, health, and risk analysis.  The application-focused conferences tend to attract people with a broad range of interests who use a variety of tools.  I suppose that the dual focus on computing would not be too limiting, but homeland security seems more focused than, say, health applications.

As a homeland security researcher, I have to say that I was eventually won over.  The talks were all pretty interesting to me, and there was a strong theme to the talks in nearly every session.  Best of all, the conference was populated with other homeland security enthusiasts, so the questions asked during the talks were really insightful.  The audience questions quite often gave me even more to think about than the talks.

Since I am interested in both computation and homeland security, nearly all of the talks appealed to me.  Despite having a mere five tracks, I often had to make some tough choices between talks scheduled at the same time and missed more than a few talks that I wanted to see.

The five concurrent sessions were a nice contrast with the 75 concurrent sessions at the INFORMS Annual Meeting.  Walking between buildings to session hop is not ideal.  When I started attending the INFORMS Annual Meeting, there were about 50 concurrent sessions, and all sessions fit within one convention center.  I have been hoping for the number of tracks to be cut back, but they seem to grow every year.  But this tangent should really be its own post, so I’ll stop reflecting for now and come back to this theme later.

I would like to know if those who are less interested in homeland security enjoyed the ICS theme of homeland security as much as I did.  Given that many homeland security applications are ultimately aimed at managing risk, they have broad applicability beyond homeland security and even extreme events.  So I am not sure if the application was really all that limiting.

I hope not too many were deterred by the application focus.  Did you attend ICS?  If so, what did you think of the homeland security focus?


OR and the intelligence community

The (security) intelligence community (IC) is mainly made up of people with a non-technical background (such as political science majors).  Although the IC makes tough decisions under uncertainty given limited information (one of the things that OR does so well), the IC mainly uses intuition and expert judgment to make decisions.  Ed Kaplan, who gave the Philip McCord Morse lecture at INFORMS 2010, spoke about the intelligence community and the role that OR needs to play in the IC.

There are clear needs to use advanced analytical methods in the IC.  The NSA, for example,  collects 1.7B emails per day.  How do they determine which emails to read and analyze?  Clearly, they only have the resources to read a few.  How should they allocate their resources between collecting new data and analyzing the data?  The good news is that the NSA hires OR people (they even have a summer program for graduate students in OR), and the CIA is starting to hire OR people too.

Kaplan reminds us that intelligence is not just an operations problem.  People with language and cultural skills are needed to interpret the nuances and make tough judgments that no algorithm can do.  Human agents have been highly successful at using non-analytical techniques to detect suicide bombing attempts in Israel, which resulted in a plummeting number of successful suicide attacks.  OR cannot replace the human element, but it can certainly aid it.  And often it’s routine law enforcement–not intelligence–that makes the difference.

There have been some attempts at using OR to solve intelligence problems.  Ed Kaplan introduced two obscure articles that were a lot of fun.  A 1967 article in a classified CIA journal (declassified in 1994) applies Bayes rule to the Cuban missile crisis.  The paper uses input from 200 agents in the field who reported that they believed the Soviets were up to something in Cuba, even though none had visual evidence.  The odds started out at 10-to-1 against the Soviets building missiles in early 1962, and successive applications of Bayes rule slowly shifted them to 3-to-1 against and then to even (1-to-1).
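The machinery behind that analysis is the odds form of Bayes rule: posterior odds = prior odds times likelihood ratio, applied once per report. The likelihood ratios below are invented for illustration; the 1967 article's actual inputs aren't reproduced here.

```python
# Odds-form Bayes rule: each piece of evidence multiplies the current
# odds by its likelihood ratio, P(evidence | missiles) / P(evidence | no missiles).
# Starting at 10-to-1 against means odds of 1/10 *in favor*; a report
# twice as likely under "building missiles" doubles those odds.
# All likelihood ratios here are made up for illustration.
def update_odds(prior_odds_for, likelihood_ratios):
    odds = prior_odds_for
    for lr in likelihood_ratios:
        odds *= lr
    return odds

odds = update_odds(1 / 10, [2.0, 1.5])  # two hypothetical reports
print(odds)                             # roughly 0.3, i.e., about 3-to-1 against
print(update_odds(odds, [10 / 3]))      # roughly 1.0, i.e., even odds
```

The appeal of the odds form is that each new agent report is a single multiplication, which is presumably why it was workable with 1962-era computing (that is, pencil and paper).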

J. Michael Steele published a paper in Management Science entitled “Models for Managing Secrets” based on the Tom Clancy Theorem, which states that the time to detect a secret is inversely proportional to the square of the number of people who know it (stated without proof in The Hunt for Red October).  The paper applies Poisson processes to “prove” Tom Clancy’s theorem.  It also analyzes countermeasures according to how they would result in a secret being kept for longer.
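One way to see where an inverse-square law could come from, using a toy Poisson model of my own (a simplification, not Steele's actual formulation): if each ordered pair among the n people who know the secret independently produces a leak at rate lam, the first leak is exponential with total rate lam * n * (n - 1), so the expected time to detection falls off roughly like 1 / n**2.

```python
import random

# Toy stand-in for the Tom Clancy theorem (my simplification, not
# Steele's model): with n(n-1) ordered pairs of knowers, each leaking
# independently at rate lam, the minimum of n(n-1) iid exponentials is
# itself exponential with the summed rate, so E[T] = 1 / (lam * n * (n-1)).
def expected_detection_time(n, lam=0.01):
    return 1.0 / (lam * n * (n - 1))

def simulated_detection_time(n, lam=0.01, trials=20000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(rng.expovariate(lam) for _ in range(n * (n - 1)))
    return total / trials

for n in (2, 4, 8):
    print(n, expected_detection_time(n), round(simulated_detection_time(n), 2))
```

Doubling the number of people who know the secret roughly quarters the expected time until it gets out, which matches the spirit of Clancy's claim.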

Kaplan has written a few papers on intelligence, including his recent paper Terror Queues, which applies queuing models to determine how agents (servers) can interdict (serve) terrorists (customers) before the terrorists complete an attack (leave the queue).
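A much-simplified version of that idea (this is not Kaplan's actual terror-queue model, which tracks undetected plots explicitly; it's a back-of-the-envelope race between two exponential clocks): treat each plot as a race between completion at rate mu and interdiction, where each of c agents contributes detection effort at rate theta.

```python
# Toy plot-vs-interdiction race (not Kaplan's actual model): a plot is
# interdicted if the combined agent detection clock (rate c * theta)
# fires before the plot-completion clock (rate mu). For competing
# exponentials, the first-to-fire probability is rate / total rate.
def interdiction_probability(c, theta, mu):
    return c * theta / (c * theta + mu)

for agents in (1, 5, 10, 20):
    p = interdiction_probability(agents, theta=0.1, mu=1.0)
    print(agents, round(p, 3))
```

Even this crude version shows the diminishing returns to staffing that queueing models make precise: each additional agent raises the interdiction probability by less than the one before.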

Kaplan finished the lecture by encouraging researchers to continue to examine the many problems in IC from an OR perspective.  How do you think OR could be used by the IC?

 


Emily Stoll on using math/OR in the real world

Emily Stoll from Johns Hopkins Applied Physics Laboratory gave a talk on balancing life and work as part of the Women in Math program.  She offered several lessons that she has learned along the way.

Stoll has a degree in civil engineering, a degree in applied math, and an MBA.  She talked about all of the different things you can do with an applied math degree.  Most of her work involves homeland security applications.  As a mathematical analyst, she analyzed submarine data using the Chi-squared test, the Kolmogorov-Smirnov test, fault tree analysis, and other statistical tests.  She used design of experiments as well as modeling and simulation to improve port security.  Her research on receiver operating characteristic (ROC) curves for IED detection was featured in the TSA’s blog.

One of the more interesting projects Stoll was involved with is the US Navy “Marine Mammals” program. She helped to optimize the location of dolphins and sea lions to interdict dangerous materials (such as mines) and for swimmer defense.  Amazingly, the US Navy has been using the marine mammals program since the Vietnam War era.

All of Stoll’s work requires the use of statistics.  It’s nice to know that the tools I teach students in STAT 541 (an introductory statistics course for engineers) are widely used in industry, even by mathematicians and engineers who don’t consider themselves to be statisticians.

Stoll’s excellent life lessons include:

  • Take your time to think about a job offer before accepting
  • Know what you want before you go after it.
  • Build and use your network.
  • Very few decisions in life are Life Decisions.
  • Sometimes you have to take a risk.
  • You can do it all, just not at the same time.
  • Do what works for you.
  • Realize that it can be done.
  • Realize that you will need help.
  • Realize that almost every other woman in your position is struggling with the same decisions.
  • Figure out what is important to you and make that your priority.
