AI is being trained on train passengers. One thing worries me the most
Did you know that while traveling by train, your face can be scanned by an AI camera without your knowledge? Artificial intelligence testing is in full swing.
AI cameras are used to monitor crowds, detect theft and suspicious activity, and track passengers’ emotions.
Thousands of people using UK trains appear to have been scanned by Amazon software as part of extensive artificial intelligence trials. An image recognition system was used to estimate the age, gender, and likely emotions of travelers, with the suggestion that this data could be fed into advertising systems in the future.
Over the past two years, eight railway stations have tested AI surveillance technology, using CCTV cameras to alert staff to potentially dangerous incidents. The stated aims were passenger safety and crime reduction.
The extensive trials used an object recognition system that can identify elements in video footage. It can detect people stepping onto the tracks, predict platform overcrowding, and flag antisocial behavior such as shouting, running, or smoking; it can also help prevent bicycle theft. In separate trials, sensors were used to detect slippery floors, overflowing garbage bins, and flooded drains.
Details of the research are made available at the request of interested organizations.
The implementation of artificial intelligence surveillance in public spaces, without wider consultation and debate, is a deeply disturbing step.
– says Jake Hurfurt of the civil liberties group Big Brother Watch, who analyzed the research.
According to the documents, each station had five to seven sensors or cameras. The most disturbing element of the research is the analysis of passenger demographics and emotions, such as joy, sadness, and anger. In this way, the system attempted to measure passenger satisfaction, with the data then to be used “to obtain maximum advertising and retail revenues.”
Emotion detection is, according to artificial intelligence researchers, highly controversial. Some even believe it should be banned, given how difficult it is to tell how a person really feels from sound or images alone. So far, none of the systems use facial recognition technology, whose purpose is to match people’s identities against those in databases.
