FBI Warns Against Misuse Of Artificial Intelligence

Admin CG | March 14, 2024

After a quick scroll through any social media network, users are highly likely to encounter content created by artificial intelligence. 

For example, images of people standing next to intricate, impossibly detailed wood carvings seem to be making the rounds, garnering thousands of likes and shares.

Many seem to believe these photos represent someone’s painstaking craftsmanship. In reality, it’s as simple as typing “man standing next to a realistic wooden carving of a grizzly bear” into an image generator and then using photo editing software to clean the result up a bit.
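To illustrate how low the barrier really is, here is a minimal sketch of prompt-based image generation using the open-source diffusers library. The specific model checkpoint and prompt are illustrative assumptions, not something the FBI or this article specifies.

```python
# A minimal sketch of text-to-image generation with the open-source
# diffusers library. The checkpoint name below is an illustrative
# assumption; any Stable Diffusion checkpoint works the same way.
import torch
from diffusers import StableDiffusionPipeline

# Download and load a pretrained text-to-image model.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # also runs on CPU, just much more slowly

# One sentence of text in, one photorealistic image out.
prompt = "man standing next to a realistic wooden carving of a grizzly bear"
image = pipe(prompt).images[0]
image.save("wood_carving.png")
```

A few lines like these, plus some light retouching, are all that stands between a text prompt and a viral “wood carving” photo.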

While using artificial intelligence to collect likes on Instagram, Facebook or one’s social media of choice is rather benign, the FBI warns that this technology can cause serious damage in the wrong hands.  

“In the image and video domain the threat ranges from damaging election integrity by creating fake videos of politicians to harming individuals’ lives through bullying and intimidation,” the FBI said. “The latter might be accomplished through nonconsensual pornography that superimposes a victim’s face into explicit content or uses techniques to alter a clothed image of the victim to make them appear naked.  

“But other potential vectors exist, such as audio deepfakes which might be used to fool a financial officer to make a bank transfer based on a ‘call from the CEO’ or a deepfake call that fools a family member into thinking a loved one is in danger and needs a ransom or other financial payment made to secure their safety.” 

Even though using artificial intelligence to create images, video, audio or text files is new, the manipulation of various forms of media to achieve nefarious goals is not.  

“The history of manipulated media is a continual game of cat and mouse – whenever a new technique is developed to manipulate media, researchers figure out a way to detect it,” the FBI said. “Once it becomes possible to detect the manipulation, a new technique will be developed, and the cycle starts again. The current era is only different from prior ones in that the development cycle is greatly accelerated.”

And because media manipulation is not a new idea, there are already laws on the books that allow law enforcement agencies to arrest and prosecute people who harm others using what the FBI calls synthetic media.

“The laws relating to fraud apply regardless of the means by which fraud occurs, so synthetic-media-enabled financial fraud is still subject to prosecution,” the FBI said. “Likewise, in cases such as when fake robocalls are used to dissuade voters from voting (a clear effort to stop people from exercising their constitutional rights), laws exist to allow for prosecution. 

“The FBI is constantly collaborating with our international, federal and local law enforcement partners to counter this fight and protect victims from this threat.” 

The organization noted that the same technological advances that make media manipulation easier also make it easier for others to develop ways of detecting that manipulation.

“So, while the number of people performing as ‘black hats’ has increased, so, too has the number of ‘white hats,’” the FBI said.  

According to the bureau, beyond running a detector built to spot different types of synthetic media, there are a number of signs that can indicate whether a piece of media has been manipulated or generated by AI.

For example, distortions, warping, or inconsistencies, such as different earrings on the left and right ears or the wrong number of fingers in images or videos, may be indicators that the media has been manipulated or created using generative AI programs. 

Other telltale signs include noticeably unnatural head or torso movements, face and lip movements that are out of sync in videos, and static added to deepfake audio files to hide artifacts created during their generation.

The FBI also offered a way to avoid the aforementioned scam, in which someone mimics the voice of a loved one, calls, and claims to be in danger or in need of money.

The organization said to establish a “challenge phrase” within your family, so you can verify the identity of the person on the other end of the line.  

“If you are going to be on social media and the Internet, it is inevitable that you will encounter synthetic media,” the FBI said. “Be skeptical.”

