Monday, 21 November 2016

Week 3 [21-27.11.2016] - Automated Inference on Criminality using Face Images

Hello,
This week I would like to present the article Automated Inference on Criminality using Face Images.
This study concerns automated inference of criminality based solely on still face images. Using supervised machine learning, the authors build four classifiers (logistic regression, KNN, SVM, CNN) to discriminate between criminals and non-criminals, trained on facial images of 1,856 real persons controlled for race, gender, age and facial expression, nearly half of whom were convicted criminals. All four classifiers produced evidence, the authors claim, for the validity of automated face-induced inference on criminality, despite the historical controversy surrounding the topic. They also report that the variation among criminal faces is significantly greater than among non-criminal faces, which suggests that criminal and non-criminal face images populate two quite distinct manifolds.
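As a rough illustration of the kind of pipeline described above (my own sketch, not the authors' code: the feature vectors below are random stand-ins for real facial measurements, scikit-learn is assumed, and the CNN is omitted):

```python
# Hypothetical sketch: compare three of the paper's four classifier types
# with 10-fold cross-validation. With random features and labels, accuracy
# hovers around chance; the paper reports far higher numbers on real data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, d = 1856, 64                        # 1,856 subjects, 64 synthetic features
X = rng.normal(size=(n, d))            # stand-in for face-derived features
y = rng.integers(0, 2, size=n)         # 1 = "criminal" label in the study

classifiers = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(),
}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```

On purely random inputs like these, all three hover near 0.5; the contested question is what drives the much higher accuracy on the real dataset.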

I would like to encourage you to share your thoughts on the problems of privacy, facial recognition, and any facial analysis that produces such far-reaching conclusions. Instead of sharing photos via Facebook and other services, are we going to protect our images and try to strictly reserve the rights to them?

31 comments:

  1. In Europe, companies have to seek your permission before using facial recognition technology for commercial purposes. This is why Facebook has decided not to offer its photo-sharing app Moments in the region: it does not offer an opt-in facility. Just two US states, Illinois and Texas, have adopted Europe's approach. In the UK, the Data Protection Act stipulates that we have to be informed when we are under camera surveillance and by whom. We also have the right to request any recorded images we feature in. Unfortunately, people don't always play by the rules. It's very difficult to know how our images are being used and whether our right to privacy is being respected. The digital surveillance genie is out of the bottle, and many privacy campaigners have little faith that regulators will ever be able to stuff it back in. Personally, I agree with the civil rights groups which argue that we have a fundamental right to privacy and that wanting to be anonymous does not mean we have "something to hide". We shouldn't be identifiable if we don't want to be.

    1. Thank you for your reply. Where did you find the information about permission for facial recognition? Is it regulated anywhere, and how?
      I agree with you. I think that we have a fundamental right to privacy and it should be respected. It reminds me of the scary Russian app which searches social media looking for your face: https://sputniknews.com/russia/201610211046600392-russia-technology-facefind/

    2. http://www.bbc.com/news/technology-33199275

  2. First of all, the whole idea seems completely crazy to me. Some people have quite non-standard permanent facial expressions; others simply make a facial expression because they are thinking about something unpleasant. My first thought was a question: are facial features really a determinant of whether someone has the potential to be a criminal or not? I would like to emphasise that the whole research idea was started by two Chinese researchers. For me it is the first step towards computers taking over humanity. If computers are better at inferring criminality, free of the biases and subjective judgments of human observers, then the next step will be replacing humans in the courts and the police and letting a computer decide whether someone should go to jail or not. I wish good luck to countries where the death penalty is allowed.

    This subject is somewhat related, for me, to speed cameras that take photos of car drivers. I think facial recognition would be very helpful in that situation. I have heard of multiple cases where the police could not punish anyone because several people used the car but no one would admit to driving it during the offence.

    I think the problem of illegal use of images is very broad: from the use of photographs on websites, through using them in advertising (often for products to which we would never lend our image), up to the creation of fictitious social profiles to impersonate another person.

    1. Thanks for your reply. I think that any research concerning a relationship between appearance and behaviour should be performed carefully, because harmful conclusions can be drawn. This research goes a step too far. I think we should all sensibly decide what is right and wrong. I hope that computers will not take control over some areas of human life.

  3. I agree that this is quite a controversial topic, with no easy "one size fits all" solution. On the one hand, quick identification of criminals might help prevent many large-scale crimes, murders, terrorist attacks, etc. Even a few such successes might be enough to compensate for multiple false positives.
    On the other hand, it is worth considering whether we can really identify a criminal correctly based solely on a face image. Moreover, what does such identification give us?
    Our major point of interest should be to prevent crime, not to identify a potential criminal. Even if somebody is a criminal, he might not commit a crime today; we cannot charge him without evidence. Alternatively, he might be a past convict trying to start a new life - where does that leave us?
    To summarise, I would strongly argue that such technology alone, without context, is not enough to identify criminals.

    1. Hello, indeed it is a controversial topic. Such a technology may provide some benefits; however, it may also lead to exaggeration and abuse. To many of us it is reminiscent of 'Minority Report', which no longer seems to be science fiction.

  4. Sorry, I am not going to bother talking about privacy. On the Internet, privacy is virtually non-existent.
    However, the topic you covered seems interesting. The fact that we can tell apart the faces of criminals and... regular people has far-reaching consequences. Since our looks are deeply encoded in our DNA... it means there could be a test for whether or not a person has criminal inclinations. That could also possibly lead to genetic therapy.
    However, this could be dangerous... Knowing a bit of history, and what the Nazis did to homosexuals and mentally ill people, I am quite afraid that such a therapy could lead to monstrosities...

    1. Hi, I have to agree with you that on the Internet there is no such thing as privacy. Our pictures, video calls, likes, tweets, shares, even things we think we don't share (metadata) can provide useful data for anyone who is interested in them.

      Besides, I agree with you that the presented research is quite dangerous, and it also reminds me of the Nazis. People should not be judged by their appearance. We need a common-sense reaction.

  5. Awkward stuff. I wonder how the convolutional neural network picked up its features in the hidden layers; some images of the CNN feature maps would improve the article. It also seems to me that the training set should resemble the structure of society: not half of our society are criminals, so why is that the case in the training data? And the article is obviously missing a definition of a criminal: are they really bad guys like mobsters, drug dealers, murderers or rapists, or are there some muggers or thieves among them? I suppose that in China the sentence "Give me a man and I will find the crime" is still in use.

    1. Hi! Thank you for sharing your opinion. I agree with you. The group of people taken into consideration should correspond to the structure of society if we want to use this model anywhere. I think that either this study was not performed properly, or its conclusions should contain information about the constraints on the applicability of the model.

  6. This is a very interesting topic, and this article might be a pivot point in the field of criminology. Since the beginning of the 18th century, scientists studying the aetiology of crime have tried to answer the question: why do some people commit crime and others not? The most picturesque theory was the one created by Cesare Lombroso, who measured the length of bones and tried to identify a correlation between people's morphological features and the fact of committing crime. Subsequent analysis determined that this concept was nonsense and demolished its foundations. The research paper, posted on the arXiv preprint server hosted by Cornell University, reminds me of Lombroso's work. My biggest concern is the underlying assumption that by analysing certain features of facial expression, software is able to identify potential criminals. Common sense suggests that a person's expression can change every few seconds and is inherent to their physiognomy. Accidental recognition of some shapes in the expression might lead to spurious correlations, such as nose-and-mouth triangles which subsequently identify an individual as a criminal.

    In terms of data privacy, I find the topic in this context moot, given the easy access to cameras: everyone can record people pretty much everywhere. There is no control over such data anymore.

    1. Hello, thanks for your comment.
      Do you think that the answer to the question 'why?' is a person's appearance? I think we would be taking a step backwards in criminology.
      We have all seen mugshots somewhere, and we have all built up mental sets of criminals' 'looks', yet when it comes to judging a person in court we depend solely on the facts, not on how the accused looks.

  7. Hello, it is an important topic in today's world. We had a situation where a person from an external company had a problem with CCTV: he wanted the industrial cameras turned off during his visit. People have different approaches to their own privacy. To be honest, in my opinion tracking nowadays is everywhere: we use credit cards, check in at hotels, airports, etc.

    Solutions which analyse people's behaviour are implemented in many places. An integrated database for tracking every person in the world would be huge, but tracking criminals can limit their chances of doing something bad. I agree with Mikołaj: a definition of a criminal is missing.

    Another point: what if someone puts on a mask or has plastic surgery? Tracking such people is much more complex, and facial recognition might be just one of many methods.

    1. Hi, thank you for sharing your experience. It seems that this research would be more reasonable if they had collected images of people committing certain crimes (from industrial cameras, for example) and tried to forecast who is about to commit a crime. In my opinion, it is not the permanent features of a face that should be taken into consideration, but the expression.

  8. Is that article about the George Orwell novel Nineteen Eighty-Four, often published as 1984?

    Probably yes, as the authors come from the People's Republic of China.
    Getting back to basics.

    The problem is that machine learning is adept at picking up on human biases in data sets and acting on those biases, as proved by multiple recent incidents.

    One major concern going forward is that of false positives, that is, identifying innocent people as guilty, especially if this program is used in any sort of real-world criminal justice setting.
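    The false-positive concern can be made concrete with a quick Bayes calculation (all numbers below are illustrative assumptions, not figures from the paper):

```python
# Base-rate arithmetic for an automated "criminality" flag.
prevalence = 0.005   # assumed: 0.5% of the scanned population are criminals
sensitivity = 0.90   # assumed: P(flagged | criminal)
specificity = 0.90   # assumed: P(not flagged | innocent)

# Total probability of being flagged, then Bayes' rule for the posterior.
p_flagged = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
p_criminal_given_flag = sensitivity * prevalence / p_flagged
print(f"P(criminal | flagged) = {p_criminal_given_flag:.1%}")  # prints 4.3%
```

    Even with an optimistic 90% accuracy in both directions, under these assumed numbers fewer than one in twenty flagged people would actually be criminals.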

  9. This paper shows that there is a detectable difference between the criminal and non-criminal faces in its dataset. Obviously, this would be the case if criminals actually had different faces. However, it could also be the case that people with more extreme faces are more likely to be convicted. That is, the paper could simply be showing that its dataset is biased.
    For instance, the dataset they're using here is fairly small, and while they did use 10-fold cross-validation, that is still on the less-than-ideal side for neural nets, especially CNN architectures, which are usually pretty deep. Furthermore, the dataset itself seems fairly questionable to me. I'm not sure how much I trust the Chinese criminal justice system to adequately adjudicate culpability in the first place, but even setting aside such admittedly conspiratorial notions, it seems rather odd that nearly half of their positive samples are not in fact convicted criminals but merely suspects. I do not find their attempts at playing devil's advocate persuasive, as it is not readily obvious exactly how they obtained or used the three different datasets in their testing.
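    The dataset-bias worry can be demonstrated with a toy experiment (entirely synthetic; scikit-learn assumed): two classes of random "faces" that differ only by a small incidental offset are still separated well by a linear classifier, so high reported accuracy by itself does not show that a model has learned anything about criminality:

```python
# Synthetic demo: classifier accuracy driven purely by a confound.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, d = 400, 32
y = rng.integers(0, 2, size=n)       # arbitrary "criminal" labels
X = rng.normal(size=(n, d))          # random "faces": no real signal at all
X += y[:, None] * 0.5                # confound: class-1 photos slightly brighter

acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=10).mean()
print(f"cross-validated accuracy on the confound alone: {acc:.2f}")
```

    The accuracy comes out far above chance even though the "faces" are pure noise; any systematic difference in how the two sets of photos were produced could play the role of the brightness offset here.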

  10. It seems to me that automated face-induced inference on criminality could help with objective judgment during trials. However, it can be used only as an assistive technology; in my opinion it can never be the final oracle.
    About images published on social sites: it depends on us what kinds of photos are posted and, ipso facto, shared. Should we protect our photos? Yes, we should, for the sake of our privacy. But in practice it isn't possible: once a photo is published online, it is permanently shared on the network.

  11. An interesting article and a controversial topic. The idea of recognizing criminals and non-criminals from pictures reminds me a little of the 'Minority Report' movie. It could lead to finding 'criminals' even before they commit a crime, on the basis of a face photo. I also agree with Mikolaj that the set used to train the neural network should reflect society, and that the people labelled as 'criminals' should be described more accurately. In my opinion there is a huge difference between a murderer and a person who did not pay their taxes.

  12. I don't like the idea of doing research in a field that can potentially go beyond our privacy. Not that I'm scared of it; I just think it's ugly. Why should an algorithm decide whether someone should be suspected or watched, just because his or her facial features fit some template? Furthermore, we already have the ability to guess emotions, or to say something about others, just by looking at their faces. Why do we need an algorithm for that? Why do we need a virtual algorithm that could say 'Hey, this guy could be a murderer, because his face fits a template', if we already have this kind of sense? Maybe it's interesting that criminal faces are similar in some way, but don't we know that without an algorithm? Maybe research into similar psychological 'patterns' of real criminals should be done instead, so that the examinations and studies could be run behind the court's doors.

    In my opinion, using face-recognition algorithms to scan people's faces just to try to match their features to some criminal template and mark them as a potential threat is very far from being ethical.

    1. Hello, thank you for your comment. I have the same feelings about this topic. It is a step backwards for science.

  13. Privacy? What is that? :) Let's be serious: 99% of the population connected to the Internet has no privacy. The remaining 1% are people who understand that and use special software to keep their lives as private as possible. We are selling our lives to Facebook, Google, Microsoft, Apple and so on. Everybody is happy that services like Facebook, Gmail, Outlook, Google Analytics and others are free and everybody can use them. But let's be serious: nowadays nothing is free; the only question is what the currency is. In the case of these services, we are the currency, together with the information about our lives. Recently I read about what information you can get from Facebook if you pay them. There are 92 things that can blow your mind: for example, based on the data you're sharing, Facebook is able to say where you live, how big your household is, what your sexual preferences are, and many other very personal things. I can only guess that the Master of the Universe (I mean Google) knows even more.
    Face recognition is just another natural step in this area. They will say that it is for our safety, or to give us a better offer in the shop, and we as a society will believe them and sell our lives.

    1. Hi, thank you for sharing your opinion. There is no such thing as privacy. A person is part of a group and is just another observation in the dataset; each of us represents many different groups. This article presents a very similar situation, in which someone interested in our specific features tries to reach us, but with consequences like going to jail instead of yet another campaign (adverts, mails, etc.).

  14. The topic of recognizing criminals' faces is quite old; nonetheless, I think it is very important, not only in terms of public safety but also our own. There is one big problem: every country has different privacy laws, so for this to make sense there would have to be some agreement between countries (maybe some agreement with Facebook about sharing pictures?). When it comes to privacy in today's world (especially with so many social apps), we have to accept that there is no such thing as privacy and there won't be. As somebody said here, it looks like a script from the movie Minority Report.

  15. Thank you for a very interesting article; I agree with the previous speakers. Indeed, this is a very controversial approach. Unfortunately, there are honest people with a suspicious appearance and also very dishonest people with the face of an angel. Searching for criminals' features in faces somewhat belies the presumption of innocence.
    Privacy on the Internet: probably the safest approach is to assume it does not exist. Unfortunately, some people are unaware of the risks and care very little about their privacy, which often brings them problems. Just look at what people write on various portals and how they protect their passwords. Looking at it from a distance, we might think they behave as if they were looking for trouble.

  16. To be honest, I am shocked. I was not aware of the study and its far-fetched conclusions. I guess for some scientists the saying "Don't judge a book by its cover" is no longer valid. I don't believe that criminals share particular facial features. There are dishonest people who look very decent and there are honest, hard-working people who have crooked mouths. When it comes to the Internet and privacy, I don't believe there can be any privacy on the Internet, and I really admire the gullibility of people who think otherwise.

  17. I would like to encourage you to share your thoughts on the problems of privacy, facial recognition, and any facial analysis that produces such far-reaching conclusions.
    More than 117 million adults are included in a 'virtual, perpetual line-up' which authorities can use to track citizens, raising concerns over privacy and profiling.
    Take a look at this article: https://www.theguardian.com/world/2016/oct/18/police-facial-recognition-database-surveillance-profiling
    In my honest opinion, if we can somehow reduce the number of terrorists, it's a clear win. I don't mind being monitored in a public area.

    Instead of sharing photos via Facebook and other services, are we going to protect our images and try to strictly reserve the rights to them?
    I won't elaborate on this, but I'm not sharing any photos. This is the best way to protect our privacy.

  18. A very interesting example of the use of artificial intelligence. It is quite surprising that we can classify someone as a criminal on the basis of his or her facial features.
    I think that the privacy of our images would not be a problem if we knew they were used with respect for the law. On the one hand, recognizing criminals with artificial intelligence may seem a way to increase our safety; on the other hand, in the wrong hands it might also be treated as a reason to suspect innocent people who are classified as criminals. The problem appears when someone uses it with bad intentions. Also, powerful algorithms are more accessible than ever thanks to simpler tools, and with a little knowledge almost every software engineer can try to write similar classifiers. This is a good reason to care about the safety of our data.

  19. This comment has been removed by the author.

  20. Hey, first of all I would like to apologise for my late reply. This is indeed a very interesting topic. I think that this use of technological advancement might start to look like the movie Minority Report, in which people were convicted of crimes before committing them. I suspect there are traits and characteristics that make psychopaths more likely to have a characteristic facial expression. I really don't know; it's just a thought. However, I am not really convinced this method will be taken seriously.

  21. I think that facial images should be protected, but using face-recognition mechanisms, for example at football matches or at airports, is a good direction. It is quite a controversial subject: we want to be anonymous and at the same time we want to feel safe in public places.

    ReplyDelete