FBI Warns Against Misuse of Artificial Intelligence
web3unplugged.io, March 14, 2024

After a quick scroll through any social media network, users are highly likely to encounter content created by artificial intelligence.

For example, photos of people standing next to intricate, impossible-to-create wood carvings seem to be making the rounds, garnering thousands of likes and shares.

Many seem to believe these photos represent someone's painstaking craftsmanship. In reality, producing one is as simple as typing "man standing next to a realistic wooden carving of a grizzly bear" into an image generator and then tidying up the result with photo editing software.

While using artificial intelligence to collect likes on Instagram, Facebook or one's social network of choice is rather benign, the FBI warns that the technology can cause serious damage in the wrong hands.

"In the image and video domain the threat ranges from damaging election integrity by creating fake videos of politicians to harming individuals' lives through bullying and intimidation," the FBI said. "The latter might be accomplished through nonconsensual pornography that superimposes a victim's face into explicit content or uses techniques to alter a clothed image of the victim to make them appear naked.

"But other potential vectors exist, such as audio deepfakes which might be used to fool a financial officer to make a bank transfer based on a 'call from the CEO' or a deepfake call that fools a family member into thinking a loved one is in danger and needs a ransom or other financial payment made to secure their safety."

Although using artificial intelligence to create images, video, audio or text is new, manipulating media for nefarious ends is not.

"The history of manipulated media is a continual game of cat and mouse – whenever a new technique is developed to manipulate media, researchers figure out a way to detect it," the FBI said. "Once it becomes possible to detect the manipulation, a new technique will be developed, and the cycle starts again. The current era is only different from prior ones in that the development cycle is greatly accelerated."

And because manipulated media is not a new problem, laws already on the books allow law enforcement agencies to arrest and prosecute people who harm others using what the FBI calls synthetic media.

"The laws relating to fraud apply regardless of the means by which fraud occurs, so synthetic-media-enabled financial fraud is still subject to prosecution," the FBI said. "Likewise, in cases such as when fake robocalls are used to dissuade voters from voting (a clear effort to stop people from exercising their constitutional rights), laws exist to allow for prosecution.

"The FBI is constantly collaborating with our international, federal and local law enforcement partners to counter this fight and protect victims from this threat."

The bureau also noted that the same technological advances that make media easier to manipulate also make it easier to build tools that detect the manipulation.

"So, while the number of people performing as 'black hats' has increased, so, too, has the number of 'white hats,'" the FBI said.

According to the bureau, beyond using a detector built to spot synthetic media, a number of signs can indicate that a piece of media was manipulated or generated by AI.

In images and videos, distortions, warping or inconsistencies, such as mismatched earrings on the left and right ears or the wrong number of fingers, may indicate that the media was manipulated or created with generative AI programs.

Other telltale signs include noticeably unnatural head or torso movements, out-of-sync face and lip movements in videos, and static added to audio files to hide the artifacts created during the process of making deepfake recordings.

The FBI also offered a way to defeat the scam mentioned above, in which a caller mimics the voice of a loved one and claims to be in danger or in need of money: establish a "challenge phrase" within your family so you can verify the identity of the person on the other end of the line.

"If you are going to be on social media and the Internet, it is inevitable that you will encounter synthetic media," the FBI said. "Be skeptical."