Welcome to the Droos4u website.
The story goes that a group of hackers built an application that uses artificial intelligence to mimic people's voices: the accent, the pace, the whole manner of speaking. They tried it out on a chief executive at an English company, calling him while posing as his boss at the German branch and telling him to transfer $243,000 to a particular bank account in favour of a Hungarian supplier. The man transferred the money, which was then moved through several accounts until it disappeared.
So beware of anyone who calls you saying "I'm your friend so-and-so, send me some money on Vodafone Cash because I have nothing on me to get home"... in the end you'll find out it's someone scamming you.
That was the translated summary; the original English report follows.
As reported by Muhammad Elghdban
So-called artificial intelligence apps like Zao have been stirring up controversy over their potential for abuse, such as defeating facial recognition systems.
The Chinese deepfake video app proved itself to be widely popular as users had fun transplanting their digital faces onto footage from movies and popular TV shows such as “Game of Thrones.”
But deepfake video isn’t the only area raising concerns. The ability to make convincing deepfake audio, mimicking the voice of real people, is also ringing alarm bells due to its potential for abuse by criminals and scammers.
A report in The Wall Street Journal brings those fears into the spotlight with a claim that an energy firm was defrauded out of $243,000.
According to the report, the chief executive of the unnamed UK-based firm believed he was talking to his boss at the company’s German parent company, when he was ordered to immediately move €220,000 (approximately US $243,000) into what he thought was the bank account of a Hungarian supplier.
Rüdiger Kirsch, a fraud expert at the firm’s insurer, told the WSJ that the executive was told the payment was urgent and should be made within the hour. The request was made more believable because the UK-based CEO recognized his boss’s “slight German accent” and the “melody” of his voice on the phone.
The funds were duly transferred to an account under the criminals’ control in Hungary, then on to an account based in Mexico, before ultimately being moved elsewhere.
When the scammers tried the trick again to request a further payment, the UK company became suspicious, noticing that the calls were originating from Austria rather than Germany.
Quite what makes Kirsch believe that deepfake technology was being used, rather than just someone who is really good at doing an impression of a particular German chief executive, is not made clear.
Although some media reports have suggested that this is the first noted instance of deepfake audio being used in a scam, that may not be accurate.
A couple of months ago, a representative of security firm Symantec told BBC News that they knew of three cases where “seemingly deepfaked audio” of different chief executives had been used to trick staff into transferring money into bank accounts under the control of scammers.
Unfortunately, when I quizzed Symantec at the time for further information, they were unable to confirm who the victims had been, how they came to believe the CEOs’ conversations had been mimicked through deepfake technology, or even what country the affected companies were based in.
What’s clear, regardless of the method used to dupe staff into believing they are speaking to someone they’re not, is the need for systems and procedures that confirm large transfers of money or sensitive data are properly authorised.
A simple phone call clearly can no longer be considered enough.
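To make that recommendation concrete, here is a minimal sketch (in Python, with entirely hypothetical names and a hypothetical threshold) of the kind of policy gate a finance team might enforce: any transfer above a set amount requires a second, distinct approver plus a confirmation over an independent channel, such as calling back a number already on file. Under that policy, a single convincing phone call, deepfaked or not, can never release funds on its own.

```python
from dataclasses import dataclass, field

# Hypothetical policy threshold: transfers at or above this amount need
# extra controls beyond a phone instruction.
LARGE_TRANSFER_THRESHOLD_EUR = 10_000

@dataclass
class TransferRequest:
    amount_eur: float
    beneficiary: str
    requested_by: str                           # who relayed the instruction
    approvals: set = field(default_factory=set) # identities of approvers
    out_of_band_confirmed: bool = False         # verified via a call-back to
                                                # a number on file, not one
                                                # supplied by the caller

def authorise(request: TransferRequest) -> bool:
    """Return True only if the request satisfies the transfer policy."""
    if request.amount_eur < LARGE_TRANSFER_THRESHOLD_EUR:
        return True  # small transfers follow the normal workflow
    # Large transfers need an approver other than the person who relayed
    # the instruction, AND an independent out-of-band confirmation.
    has_second_approver = any(
        approver != request.requested_by for approver in request.approvals
    )
    return has_second_approver and request.out_of_band_confirmed

# Example: an "urgent" phone request for EUR 220,000 with no call-back
# confirmation and no second approver is rejected outright.
req = TransferRequest(amount_eur=220_000, beneficiary="ACME Kft.",
                      requested_by="ceo.uk")
assert authorise(req) is False
```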