How the new Instegogram threat creates liability for organizations
CSO Online
December 26, 2023
By Daniel Garrie, Jennifer Deutsch and Peter Halprin
Organizations may be liable for images they post on social media that contain malicious code, even if they were unaware of the code.
Writing in 2017, one of the authors of this article noted that, “Social media networks represent the largest, most dynamic risk to organizational security and allocating liability.” Unfortunately, with the growth of social media networks since then, this threat has only increased. First identified in 2016, this risk combines digital image steganography and social media in the corporate environment. While neither steganography nor social media is new, combining the two as a tool for malware distribution is novel.
What is Instegogram?
This scheme, known as “Instegogram,” is the use of social networks, Instagram in particular, as a threat actor’s command-and-control site. Instegogram is unique in that “once the remote system is compromised, encoded images can be posted from the command machine using Instagram’s API. The remote system will download the image, decode it, execute the encoded commands, encode the results in another image, and post back to Instagram.” Instegogram was created for academic purposes, but its potential use as part of a malware attack raises the question of who would be liable for such an attack.
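To make the underlying mechanism concrete, the following is a minimal, hypothetical sketch of least-significant-bit (LSB) image steganography, the class of technique Instegogram builds on. It is illustrative only: it hides a short command string in the low bits of raw pixel bytes, whereas a real attack would also have to survive Instagram's JPEG re-encoding and use its API for posting, neither of which is shown here.

```python
# Illustrative LSB steganography sketch (assumed raw, lossless pixel data).
# Not the actual Instegogram implementation.

def encode(pixels: bytearray, message: bytes) -> bytearray:
    """Hide a length-prefixed message in the low bit of each pixel byte."""
    payload = len(message).to_bytes(4, "big") + message
    bits = [(byte >> (7 - i)) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite only the lowest bit
    return out

def decode(pixels: bytearray) -> bytes:
    """Recover the hidden message by reassembling the pixel low bits."""
    def read_bytes(bit_offset: int, count: int) -> bytes:
        value = bytearray()
        for b in range(count):
            byte = 0
            for i in range(8):
                byte = (byte << 1) | (pixels[bit_offset + b * 8 + i] & 1)
            value.append(byte)
        return bytes(value)
    length = int.from_bytes(read_bytes(0, 4), "big")  # 4-byte length prefix
    return read_bytes(32, length)

cover = bytearray(range(256)) * 4      # stand-in for an image's pixel bytes
stego = encode(cover, b"whoami")       # command machine encodes a command
print(decode(stego))                   # compromised host decodes and runs it
```

Because only the lowest bit of each byte changes, the carrier image is visually indistinguishable from the original, which is what makes image-based command channels hard to spot in ordinary social media traffic.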
Instegogram attacks could remove liability protections
Under Section 230 of the Communications Decency Act (CDA), companies that offer web-hosting services are typically shielded from liability for most content that customers or malicious users place on the websites they host. However, such protection may cease if the website controls the information content. A company that uses a social media network to create the picture or develop information would arguably control that information and thus may not be immune. That is, if a service provider is “responsible, in whole or in part, for the creation or development of the offending content,” its actions could fall outside the CDA’s protections.
To read the full article, go to CSO Online