Can Computers Detect Suspicious Behavior?

from the minority-report dept

The arrest earlier this week of most-wanted criminal Warren Steed Jeffs was a successful use of behavioral profiling. The cops who pulled over the car carrying Jeffs didn’t recognize him, but they suspected they had someone big due to his visibly pulsating carotid artery. Upon seeing that, they summoned backup and eventually realized who they had pulled over. But the problem with this type of security is that it doesn’t scale very well: one-on-one encounters are costly and time-intensive, and it’s difficult to train people in effective behavioral profiling. A group of scientists in Australia is now trying to develop algorithms that recognize suspicious behavior. For example, the computer might be able to identify whether someone deliberately left a briefcase unattended, or whether that person looked nervous while shoving it beneath a chair. The technology hopes to improve on largely useless facial recognition techniques, which only work with known suspects and tend to overwhelm security forces with false positives. Sadly, this new approach, apart from being several years off, would likely run into many of the same problems.
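The false-positive problem the article mentions is, at bottom, a matter of base rates: when genuine threats are vanishingly rare, even an accurate detector buries screeners in false alarms. A quick back-of-the-envelope sketch (all numbers below are illustrative assumptions, not figures from the article):

```python
# Base-rate arithmetic: why even an accurate behavioral detector swamps
# security staff with false alarms. All numbers are assumed for illustration.

travelers_per_day = 1_000_000   # passengers screened daily (assumption)
actual_threats = 1              # real threats in that population (assumption)
true_positive_rate = 0.99       # detector catches 99% of real threats
false_positive_rate = 0.001     # flags just 0.1% of innocent travelers

flagged_innocent = (travelers_per_day - actual_threats) * false_positive_rate
flagged_guilty = actual_threats * true_positive_rate

# Probability that a flagged person is actually a threat (precision):
precision = flagged_guilty / (flagged_guilty + flagged_innocent)

print(f"False alarms per day: {flagged_innocent:.0f}")
print(f"Chance a flagged person is a real threat: {precision:.4%}")
```

Even with a detector this good, roughly a thousand innocent travelers get flagged for every real threat caught, which is exactly the failure mode already seen with facial recognition.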



Comments on “Can Computers Detect Suspicious Behavior?”

17 Comments
Snowie says:

Pity the shaky!

One of my close friends flew to visit me in the UK last week for a few days. Upon his return, the stringent security measures now in force at UK airports kicked in with a vengeance. My mild-mannered, gentle IT friend checked in with plenty of time to spare, but boarded the plane last. Why? He suffers from depression, and the medication he takes makes his hands tremble. I guess being surrounded by armed police, given the excitable nature of the times we are living in, didn’t help matters either.

Gabriel Tane (profile) says:

Hit it on the head

I think Joe got it right with that last sentence. Since computers are nowhere near able to rationalize this information into a creative judgment of character, this is going to do nothing more than flag “probables” that then have to be checked by humans.

But at least it’s a step forward. Hopefully, we’ll get those cool Star Trek computers that can hold a conversation and interpret what you want them to do.

Like I always say, it’ll be a great day when my computer does what I want it to do, not what I tell it to do.

eb says:

Brings up the question

of whether the software will be able to interpret why the person is nervous/anxious. Maybe he’s just found out his wife is cheating on him, maybe he’s contemplating selling his employer’s business intelligence to a competitor, or maybe he’s a terrorist with a bomb. It’s still going to take a human being to make the call, unless we forbid air travel to any but the placid members of the prozac nation.

BotGuy says:

artificial intelligence

Yeah… this is pointless technology at this point in the game. “Common sense” AI is nowhere near what it needs to be for something like this to be applicable. Come up with all the algorithms you want; unless you have a computer that can make rationalized deductions in a non-static, non-specialized environment, it’s all pointless. Unless, of course, your algorithms actually accomplish such a thing, in which case why bother showcasing them as intended? The implications for AI development would greatly overshadow their use in the aforementioned field.

Search Engine WEB (user link) says:

NOW WHAT

So now, people will train themselves NOT to show expressions when committing an illegal act, or to wear tinted eyeglasses and bangs or caps.

In fact, it may get to the point of trained terrorists or career criminals using chemicals to paralyze their faces before commencing an act.

People are nervous if they have some guilt or fear – mind control may now be taught to them.

bored now says:

Of course we already have a working example

of this in banks’ IFDS (Intelligent Fraud Detection Systems). These are neural-net-based systems used to detect suspicious patterns of transactions on credit cards. They’ve been in place for some years now and, although the dataset they operate on is vastly more limited than the proposals mentioned here, I believe we can draw some conclusions.

What conclusions? Take a typical exchange with one of my (UK) card issuers (after I’ve called them back on a known good number):
Bank: “We’ve suspended your Visa card because we’ve detected a suspicious pattern of transactions.”
Me: “Do tell.”
Bank: “It’s been used repeatedly in…AMERICA!”
Me: “And do you know what type of card it is dear?”
Bank: “Oh yes. It’s an American Airlines affinity card.”
Me: “Does that fact not give you a clue…?”

In other words, I’d be very, very worried about an unacceptably high rate of false positives, not to mention the very real risks arising from false negatives, once people downgrade other security measures due to blind faith in the all-knowing expert system.
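The anecdote above maps onto a simple failure mode: a rule that looks at transaction location while ignoring the context of the card itself. A toy sketch of such a rule (hypothetical logic and data, not any bank's actual system):

```python
# Toy fraud rule illustrating the false positive in the anecdote above:
# flag repeated foreign transactions, ignoring what kind of card it is.
# All data and thresholds are hypothetical.

def flag_suspicious(transactions, home_country="UK", threshold=3):
    """Naive rule: suspend the card if it sees `threshold` or more
    transactions outside the cardholder's home country."""
    foreign = [t for t in transactions if t["country"] != home_country]
    return len(foreign) >= threshold

card = {
    # Context the rule never consults: this card is *meant* to be used abroad.
    "type": "American Airlines affinity card",
    "transactions": [
        {"country": "US", "amount": 42.00},
        {"country": "US", "amount": 18.50},
        {"country": "US", "amount": 130.00},
    ],
}

# The rule fires on exactly the usage pattern the card is designed for:
# a false positive of the kind described in the comment.
print(flag_suspicious(card["transactions"]))
```

The fix is not a cleverer threshold but richer context, which is precisely the “common sense” that both this comment and the article suggest machines still lack.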
