Deepfake Videos Are Also After Our Money

The dazzling promises of new technologies are numerous. New technology adds speed to life and helps people save time, and time is money. But new technologies can also earn money by creating added value, increasing production and reducing costs. Humans, moreover, are prone to make mistakes, forget things, or fall short; the advanced capabilities of technology can eliminate much of that human risk. Thus humankind is always hungry for new technologies, and for new gains.

Artificial Intelligence (AI) is, in fact, a challenge to the human intelligence that built civilizations, created science and technology, and shaped the Earth into its current form. Humankind was not satisfied with creating a digital mind into which it could transfer everything it knows. Algorithms developed through machine learning have begun to surpass human performance at specific tasks, and these self-learning systems have started to replace humans in various fields of life.

The finance sector has embraced artificial intelligence, which enables 24/7 interaction with customers. Banks are also carried along by the charm of an estimated $447 billion in savings by 2023 from artificial intelligence in human resources and points of service. AI has touched almost every field of our lives, including retail banking: opening individual accounts, alerting customers to idle balances in their accounts, and, in investment banking, choosing the terms and investment tools best suited to a customer’s profile.

Internet users, estimated to reach 4 billion by the end of this year, have grown accustomed to the immense comfort of AI-driven e-services in many fields. The growth in e-customers has made e-security as much a priority as e-competition in the finance sector. Roughly half a trillion dollars in savings depends on customers being able to make their transactions “from their living room, in the safest way possible.”

How safe is “video identity”?

Technological developments have made us safer, right? First, biometric photograph matching was presented as the step beyond personal passwords of letters, numbers, and symbols in digital identity verification. Then videos took the place of photographs in digital banking and payment systems. Digital platforms like Binance, where more than 100 cryptocurrencies are traded, switched from photographs to videos after criminals accessed user accounts using photographs of others. Meanwhile, in the international finance sector, anti-money-laundering (AML) efforts against fraud, money laundering, and the financing of terror have imposed “Know Your Customer” obligations on banks. When the European Union amended its Fourth Money Laundering Directive on 20 May 2017, it paved the way for European banks to identify customers by “video interview.” Banks now offer various banking transactions via video call, with customers scanning their legal identity documents on the web or in a mobile application without visiting a physical branch.

In fact, people pursuing “e-theft and fraud” with artificial intelligence are working just as hard as the technology companies that made banking services fast and easy. Deepfake videos that are “hard to distinguish from the original,” which emerged in the same years, have become a serious threat to electronic “Customer Video Identification” systems (e-KYC).

Fake videos are on the verge of perfection with deepfakes

Security analysts are concerned about the safety of video identification because of advances in deepfake videos. Prompted by such concerns, China’s e-commerce giant Alibaba.com was forced to make an announcement about Zao, the deepfake application that took China by storm, claiming that Zao cannot deceive the Face Recognition Payment system of Alibaba’s Alipay. How long that claim will hold true is unknown. But it is clear that deepfake videos, as a product of artificial intelligence, are becoming more “persuasive” every day.

Deepfake technology is built on the generative adversarial network (GAN), a deep-learning architecture that sets two artificial neural networks with different roles against each other: a generator that produces the fake video frames and a discriminator that tries to tell fake from real. The discriminator analyzes each fake, and its verdict serves as feedback that makes the generator’s next attempt more convincing and realistic. Deepfake creators can therefore keep iterating, eliminating errors with each round of feedback, until they reach the level of “the most persuasive fraud.” Once deepfake videos are of a quality indistinguishable from the original, they move toward a perfection that can deceive “Video Identification” systems. At that point, digital transactions involving every kind of financial instrument, including bank accounts, credit cards, insurance policies, stocks, and other securities, become targets for theft and fraud committed with deepfake video identities.
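The generator-versus-discriminator feedback loop described above can be sketched in miniature. The toy example below is a hypothetical illustration, not real deepfake code: instead of video frames, the “real data” is just numbers drawn around 4.0, the generator is a single linear function, and the discriminator is a one-variable logistic classifier. All names and parameters are invented for the sketch, but the training dynamic, where the discriminator’s feedback steadily pulls the generator’s output toward the real data, is the same adversarial principle GANs apply at vastly larger scale.

```python
import numpy as np

# Toy GAN on 1-D data. "Real" samples come from a Gaussian centred at 4.0.
# Generator:     G(z) = w_g*z + b_g, with noise z ~ N(0, 1)
# Discriminator: D(x) = sigmoid(w_d*x + b_d), scoring how "real" x looks
rng = np.random.default_rng(0)
REAL_MEAN, REAL_STD = 4.0, 1.0

w_g, b_g = 1.0, 0.0          # generator starts far from the real mean
w_d, b_d = 0.0, 0.0          # discriminator starts uninformative
lr, batch = 0.01, 64

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for step in range(5000):
    # --- Discriminator update: push D(real) toward 1, D(fake) toward 0 ---
    real = rng.normal(REAL_MEAN, REAL_STD, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = w_g * z + b_g
    d_real = sigmoid(w_d * real + b_d)
    d_fake = sigmoid(w_d * fake + b_d)
    # Gradients of binary cross-entropy w.r.t. the discriminator's logit
    g_real = d_real - 1.0
    g_fake = d_fake
    w_d -= lr * np.mean(g_real * real + g_fake * fake)
    b_d -= lr * np.mean(g_real + g_fake)

    # --- Generator update: use D's feedback to look "more real" ---
    z = rng.normal(0.0, 1.0, batch)
    fake = w_g * z + b_g
    d_fake = sigmoid(w_d * fake + b_d)
    g_logit = d_fake - 1.0            # generator wants D(fake) -> 1
    w_g -= lr * np.mean(g_logit * w_d * z)
    b_g -= lr * np.mean(g_logit * w_d)

print(f"generator output mean: {b_g:.2f} (real data mean: {REAL_MEAN})")
```

After training, the generator’s mean output has drifted from 0 toward the real data’s mean of 4.0 purely through the discriminator’s feedback, without ever seeing the real samples directly. This is exactly the self-correcting loop that lets deepfake creators “spend more time eliminating errors” until the fakes become hard to distinguish from the original.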

Will Siri become a deepfake accomplice?

The finance sector’s deepfake test actually started with “voice recognition,” before “video identification.” Voice assistants, themselves a product of artificial intelligence, have become an indispensable part of daily life and have taken over from human customer representatives in various banking transactions. From smartphones to navigation devices, voice interfaces known by names like Siri, Alexa, and Cortana entered our lives as mobile AI technology, and they now play an active role in banking. The newest Alexa “skill” from TD Bank, one of the largest banks in the US, lets customers place stock trades with just their voice. “Erica,” the voice assistant in Bank of America’s mobile application, analyzes customer patterns and offers recommendations for controlling daily spending. “Voice banking” is already tempting cyberthieves.

It is highly likely that widespread “deep” technologies will be used to defraud banks and payment providers by impersonating customers. In the first recorded crime of this kind, fraudsters imitated the voice of the CEO of a German-based company to trick its subsidiary in England into a $243,000 loss. Now a fraudster needs only to capture your voice, with a small recording device or a call from an unknown number. Cybercriminals will try to make Alexa, Erica, Siri, or other flavors of digital voice assistants accomplices in account takeover (ATO) through voice services or call-center fraud. Experts warn financial institutions not to abandon intelligent rules for verifying users against digital databases as they sink into the attraction of “voice banking.” Multi-layered security structures will be necessary against deepfake fraud in the future.
