SUSPICION THAT THE APP 'TURNING' THE YOUNG OLD IS COLLECTING PERSONAL DATA

FaceApp - an image-editing application powered by artificial intelligence (AI) - is suspected of automatically collecting users' images without authorization.

FaceApp has gone viral worldwide over the past few days thanks to its ability to use AI to transform facial images, making subjects look older, change gender and more. It is used by many celebrities, and downloads have soared on both the App Store and Play Store. However, experts in Vietnam and around the world believe the software is quietly collecting data and advise against using it.

FaceApp, popular in Vietnam and several other countries, uses AI to transform users' images.
Mr. Vo Do Thang, Director of the Athena Network Security Center, said that FaceApp is a cloud-based image-processing application and carries a high risk of facial data theft. "When you agree to hand over images to this software, your information has most likely already been collected. That personal information can be sold to advertisers, AI research companies or facial recognition organizations," Mr. Thang explained.
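
For context, a cloud-based editor of this kind typically works by sending the photo itself to a remote server rather than processing it on the phone. The Python sketch below is a minimal, hypothetical illustration of that pattern; the endpoint name and payload are assumptions for illustration, not FaceApp's actual API.

    import requests  # third-party HTTP library

    # Hypothetical endpoint; the real service's API and payload are not public.
    UPLOAD_URL = "https://api.example-photo-editor.com/v1/filters/old"

    def apply_cloud_filter(image_path: str) -> bytes:
        """Send a local photo to a remote server and return the edited result.

        The key point: the original image leaves the device and is stored,
        at least temporarily, on infrastructure the user does not control.
        """
        with open(image_path, "rb") as f:
            response = requests.post(UPLOAD_URL, files={"image": f}, timeout=30)
        response.raise_for_status()
        return response.content  # edited image bytes returned by the server

Once the photo has been posted this way, whatever the server operator does with it afterwards is outside the user's control, which is the risk Mr. Thang describes.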

According to Mr. Thang, biometrics and facial recognition are developing rapidly, and in the future people will do everything through this technology. "If your face is exposed, crooks can access other information, including sensitive financial data. The risk of having money or credit card numbers stolen, or of your identity being used to pay someone else's bill, is entirely likely to materialize in the near future," Mr. Thang warned.

According to technology expert Pham Hong Phuoc, users should think carefully before using FaceApp, because once they agree to its terms and access rights, their photos are likely to become "public property" worldwide. Phuoc is also worried that user images could be analyzed, reconstructed into 3D models and used to unlock mobile devices, or used to create fake accounts on social networks for fraud or harassment.

In the past few days, many security professionals around the world have also warned about FaceApp's privacy violations and data collection. In a Twitter post, American developer Joshua Nozzi claimed the app automatically accesses images stored on smartphones, without permission and without specific notice.

David Shipley, a security expert at Beauceron Security (Canada), said the product is advertised as free, but the "cost" here is the user. "Just from collecting facial images, users already face a series of risks," Shipley emphasized. "For example, the images can be used to recognize and identify you, or sold to whoever needs them."

Ariel Hochstadt, co-founder of the online security company vpnMentor (USA), even fears FaceApp could become a spying tool. "When we grant camera access, the app could secretly record someone, upload the footage to a remote server and collate it with the huge database of faces collected earlier," Hochstadt said. "With that huge amount of information and imagery, FaceApp could well be turned into a tracking tool for the Russian government."
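
The "collation" Hochstadt describes is, at its core, a nearest-neighbour search over face embeddings: a recognition model turns each face into a numeric vector, and a newly captured image is matched against the stored vectors. The Python sketch below illustrates only that matching step with toy random vectors; the 128-dimensional embeddings and the 0.8 similarity threshold are assumptions for illustration, not details of any FaceApp system.

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Cosine similarity between two face-embedding vectors."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def find_best_match(query, database, threshold=0.8):
        """Return the identity whose stored embedding is closest to the query,
        or None if nothing clears the similarity threshold."""
        best_name, best_score = None, threshold
        for name, embedding in database.items():
            score = cosine_similarity(query, embedding)
            if score > best_score:
                best_name, best_score = name, score
        return best_name

    # Toy data: in practice the vectors come from a face-recognition model.
    rng = np.random.default_rng(0)
    database = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
    query = database["person_a"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
    print(find_best_match(query, database))  # -> "person_a"

A new photo of a previously enrolled face lands close to its stored vector, which is why a large enough face database makes re-identification of uploaded images straightforward.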

Facing growing concerns, the US Democratic Party recently called on the authorities to investigate FaceApp. According to CNN, many of the party's lawmakers cautioned that an application from Russia could potentially threaten the 2020 US presidential campaign. "It would be deeply troubling if the personal information of US citizens went abroad and fell into the hands of forces hostile to Washington," Senator Chuck Schumer warned.

FaceApp is software from a Russia-based company, launched in January 2017.
Meanwhile, a FaceApp representative denied the allegations. "Most images are deleted from our servers within 48 hours of upload. We do not sell or share any user data with third parties," a FaceApp spokesperson said in a statement.

FaceApp is a mobile photo-editing application developed by Russia's Wireless Lab, launched in January 2017 on iOS and February 2017 on Android. The software uses AI to transform faces in an image into different states: from sad to smiling, younger, older, or even a different gender.
