Independent Cybersecurity Audits Are Powerful Tools for Boards

Bloomberg Law

 

March 11, 2024

 

By Daniel B. Garrie, Esq.

Board members today increasingly face personal liability for their organization’s cyber posture. This has raised the stakes of attestations and created a need to gain insight into cyber programs.

One of the most effective ways to do so is through independent cybersecurity audits. This essential component of responsible organizational governance can demonstrate proactive leadership and reveal possible blind spots. Cybersecurity audits are also necessary for compliance with regulations that hold the board and C-suite accountable for verifying the efficacy of their company’s cybersecurity program.

Recent Regulations

Growing cyber regulatory oversight is demanding dynamic evidence of compliance. The Securities and Exchange Commission’s 2023 rules on cybersecurity risk governance and public company incident disclosure require boards of directors to oversee corporate cybersecurity management and demonstrate active oversight, while facing personal liability for failures. Public reporting companies must also:

  • Disclose all material cybersecurity incidents within four business days
  • Describe process(es) used to identify, assess, and manage material risks from cybersecurity threats, and their effect on business strategy, results of operations, or financial condition
  • Describe the board’s oversight of cybersecurity risks and leadership’s role in assessing and managing material risks from cybersecurity threats

Another recent example is the New York State Department of Financial Services’ amended cybersecurity regulation, which requires covered entities to conduct independent audits of their cybersecurity programs and integrates cybersecurity into business strategy. Changes include:

  • Additional controls and requirements for more regular risk and vulnerability assessments, along with more robust incident response, business continuity, and disaster recovery planning
  • Updated notification requirements, which include reporting ransomware payments
  • Updated direction for companies to invest in at least annual training and cybersecurity awareness

To read the full article, go to Bloomberg Law

Using Special Masters in Social Media Litigation to Streamline Discovery: Navigating the complexities of these cases

ALM

 

February 21, 2024

 

By Daniel B. Garrie, Esq.

There has been an explosion of litigation in recent years related to the potentially life-threatening effects of social media usage among adolescents. Hundreds of cases have been filed in state and federal courts, many of which have been consolidated into multidistrict litigations or otherwise consolidated within state court departments. One such example is a recent California state court case in which the court overruled Snap’s demurrer on claims that Snap’s conduct in designing and implementing its social media platform, Snapchat, resulted in the foreseeable deaths of plaintiffs’ children, who overdosed on fentanyl.

Litigations involving social media can be complex and highly technical. Such cases often involve collecting and analyzing large amounts of data from social media websites and apps. This can present challenges for the lawyers and judges involved, as they may not be familiar with social media platforms and managing the unique types of data generated and stored on these platforms. Engaging a discovery special master can help streamline discovery in social media litigations to ensure that the right data is collected as efficiently as possible.

Social media repositories present unique issues for the discovery process. For instance, the repositories that hold a user’s social media data are controlled by a third party (e.g., Meta, X, Snap, etc.). Obtaining a user’s data typically requires the user to download their information using the application at issue. However, this download will only capture a snapshot of the user’s data at the time of the download. This means that any subsequent changes to a user’s social media data that occur after the download would not be captured by this snapshot.
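The snapshot limitation described above can be illustrated with a short, purely hypothetical sketch (the data structures below are illustrative and do not reflect any platform's actual export format):

```python
import copy

def take_snapshot(account: dict) -> dict:
    """Simulate a user-initiated data download: a point-in-time copy."""
    return copy.deepcopy(account)

# A hypothetical account at the moment the user downloads their data
account = {"posts": ["photo_1", "photo_2"], "messages": ["hi"]}
snapshot = take_snapshot(account)

# The user later deletes a post and sends a new message
account["posts"].remove("photo_2")
account["messages"].append("new message")

# The snapshot still reflects the earlier state; later changes are absent
assert snapshot["posts"] == ["photo_1", "photo_2"]
assert "new message" not in snapshot["messages"]
```

This is why a single download cannot substitute for an ongoing preservation protocol: anything created, edited, or deleted after the download falls outside the snapshot.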

To read the full article, go to ALM

Small Law Firms Must Take Action and Address Cybersecurity and Privacy Regulations

ALM

 

February 15, 2024

 

By Daniel Garrie, Esq., Peter A. Halprin, Esq., and Elsa Ramo, Esq.

Cybersecurity and privacy regulations have become increasingly important in recent years due to the exponential growth of technology and the internet. The legal industry, including small law firms, is not immune to these challenges. In fact, small law firms must prioritize cybersecurity and privacy regulations to protect their clients’ sensitive information and maintain their professional reputations. This article explores the reasons why small law firms need to care about cybersecurity and privacy regulations and provides recommended first steps.

Six Reasons Why Small Law Firms Should Be Concerned About Cybersecurity and Privacy Regulations

 

    1. Ethical Obligations

As legal professionals, lawyers have an ethical obligation to protect their clients’ confidential information. Rules of professional conduct across various jurisdictions emphasize the importance of maintaining client confidentiality and safeguarding client data. Failing to uphold these ethical obligations can lead to disciplinary action.

From social media posts to a third-party vendor managing the website to a company processing a credit card payment on behalf of the firm, the lawyer has an ethical responsibility to ensure that all parties that interface with the law firm are operating under strict confidentiality obligations and taking steps to prevent the disclosure of confidential information.

    2. Legal Obligations

Small law firms may also be subject to privacy regulations, such as the California Consumer Privacy Act (CCPA). Non-compliance with these regulations can result in financially devastating consequences for small law firms.

As noted above, it is not simply a limited duty for the attorney to maintain confidentiality, but rather the attorney and law firm have a legal obligation to ensure that client information is stored in a way that protects privacy. More often than not, small law firms are paperless and store virtually all of their data electronically, so the law firm must ensure that how, where, and who is storing that data is in compliance with applicable law.

To read the full article, go to ALM

The advantages of mediating Computer Fraud and Abuse Act disputes

Daily Journal


February 6, 2024

 

By Daniel Garrie, Hon. Gail A. Andler

 

The Computer Fraud and Abuse Act of 1986 (“CFAA”), codified at Title 18, Section 1030 of the United States Code, is probably best known as the primary federal law governing cybercrime in the United States today. However, the CFAA also provides for civil remedies, which some companies have seen as a way to recover monetary damages for losses suffered from data breaches or cyberattacks that the government is either unwilling or unable to prosecute due to the explosion of cybercrime in recent years. For CFAA cases pursued as civil matters, mediation can be an effective tool for resolving disputes, saving the parties time and money in a way that gets to the heart of the technical issues.

What is the CFAA?  

At a high level, the CFAA prohibits unauthorized access of protected computer systems and the distribution, theft, or damage of information from a computer or network. The CFAA also includes provisions prohibiting other computer related offenses such as computer espionage, trafficking in passwords, and transmitting malicious code.  

Notably, the CFAA also allows individuals to bring civil actions for violations of the CFAA. Pursuant to Section 1030(g) of the CFAA, “Any person who suffers damage or loss by reason of a violation of this section may maintain a civil action against the violator to obtain compensatory damages and injunctive relief or other equitable relief.”  The CFAA limits the right of a private individual to bring a civil action only where the violation: (1) modifies or impairs medical examination, diagnosis, treatment or care of a person or persons; (2) causes physical injury to any person; (3) causes a threat to public health or safety; or (4) causes “loss” to one or more persons during any one-year period aggregating at least $5,000 in value. 

This gives entities that have been the victim of cybersecurity breaches a useful tool in situations where law enforcement does not pursue the matter. There are also advantages for the plaintiff as the “preponderance of evidence” standard of civil cases is much lower than the “beyond a reasonable doubt” standard of criminal cases.

To read the full article, go to Daily Journal

Harnessing the Power of Technical E-Discovery Neutrals in Litigation

New York Law Journal

 

February 2, 2024

 

By Daniel Garrie, Leo M. Gordon

 

The digital age has significantly transformed the legal landscape, particularly in the realm of discovery. Electronic discovery (e-discovery) has emerged as a critical part of modern litigation regarding the identification, collection and production of electronically stored information (ESI). However, e-discovery processes can be complex, given the sheer volume and diversity of digital data, combined with the technical intricacies of data management and retrieval. This is where technical e-discovery neutrals come into play, offering their specialized expertise to manage and streamline e-discovery processes, potentially narrowing disputes and saving significant time and cost.

 

Role of Technical E-Discovery Neutrals

Technical e-discovery neutrals are legal professionals with expertise in both the law and technology. They work to facilitate efficient and cost-effective e-discovery processes by advising parties on technical issues, ensuring compliance with relevant legal standards, and arbitrating disputes over ESI. Their involvement can range from consultative roles to more formal appointments by courts, especially in complex litigation where the e-discovery process might be contentious or technically challenging.

 

To read the full article, go to New York Law Journal

Understanding the Distinct Roles of E-Discovery and Digital Forensics

Daily Journal

December 28, 2023

By Daniel Garrie, Hon. Gail A. Andler

E-discovery and digital forensics are two distinct and nuanced concepts that are often conflated in the world of legal technology. While both fields converge in their utilization of digital data and may overlap once litigation is instituted, their applications, methodologies, and implications in legal proceedings significantly differ.

E-Discovery is by its nature employed once litigation (or arbitration, under some rules) has commenced; digital forensics implicates the prelitigation obligation of preservation, as discussed below, and perhaps other aspects of the discipline which may come into play for pre-litigation mediation or other forms of alternative dispute resolution.

Take, for example, the hypothetical situation of a key employee (“Former Employee”) leaving Business A to start a competing business, Business B. As soon as Business B or Former Employee is put on notice that Business A may dispute some aspect of Former Employee’s actions in leaving Business A or engaging at Business B, digital forensics must come into play to identify, preserve, and maintain certain electronically stored information of all concerned. Early mediation efforts may take place with the sides, separately or together, utilizing a digital forensics expert to review hard drives or phones to determine whether information has been accessed, downloaded, or deleted. In our hypothetical, it is not until litigation or arbitration permitting discovery commences that e-discovery may come into play, potentially overlapping with digital forensics activities. A more expansive discussion of each follows below.

Understanding the roles and characteristics of these two critical facets of legal practice can aid legal professionals in managing the technical aspects of legal proceedings more efficiently and avoiding costly pitfalls. This article provides an overview of the defining features of e-discovery and digital forensics and how they are used in distinct ways in the legal field.

To read the full article, go to Daily Journal

How the new Instegogram threat creates liability for organizations

CSO Online

December 26, 2023

By Daniel Garrie, Jennifer Deutsch and Peter Halprin

Organizations might be at risk of liability for images containing malicious code they post on social media even if they were unaware of it.

Writing in 2017, one of the authors of this article noted that, “Social media networks represent the largest, most dynamic risk to organizational security and allocating liability.” Unfortunately, with the growth of social media networks since then, this threat has only increased. First identified in 2016, this risk combines digital image steganography and social media in the corporate environment. While neither steganography nor social media are new, it is novel to combine both as a tool for malware distribution.

What is Instegogram?

This scheme, known as “Instegogram,” is the use of social networks, Instagram in particular, as a threat actor’s command-and-control site. Instegogram is unique in that “once the remote system is compromised, encoded images can be posted from the command machine using Instagram’s API. The remote system will download the image, decode it, execute the encoded commands, encode the results in another image, and post back to Instagram.” Instegogram was created for academic purposes, but its potential use as part of a malware attack poses the question of who would be liable for such an attack.
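For readers unfamiliar with digital image steganography, the core idea, hiding data in the least significant bits of pixel values, can be sketched in a few lines. This is a deliberately simplified, benign illustration of the general technique, not the actual Instegogram tooling:

```python
def embed(pixels: bytes, payload: bytes) -> bytes:
    """Hide each payload bit in the lowest bit of successive pixel bytes."""
    bits = [(b >> i) & 1 for b in payload for i in range(8)]
    assert len(bits) <= len(pixels), "payload too large for cover image"
    out = bytearray(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & 0xFE) | bit  # overwrite the least significant bit
    return bytes(out)

def extract(pixels: bytes, length: int) -> bytes:
    """Recover `length` bytes of hidden data from the pixel stream."""
    out = bytearray()
    for byte_idx in range(length):
        value = 0
        for i in range(8):
            value |= (pixels[byte_idx * 8 + i] & 1) << i
        out.append(value)
    return bytes(out)

cover = bytes(range(256))      # stand-in for raw image pixel data
stego = embed(cover, b"cmd")   # near-identical to the cover to the eye
assert extract(stego, 3) == b"cmd"
```

Because only the lowest bit of each pixel byte changes, the altered image is visually indistinguishable from the original, which is what makes image-based command channels on social platforms so difficult to detect.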

Instegogram attacks could remove liability protections

Under Section 230 of the Communications Decency Act (CDA), companies that offer web-hosting services are typically shielded from liability for most content that customers or malicious users place on the websites they host. However, such protection may cease if the website controls the information content. A company that uses a social media network to create the picture or develop information would arguably control that information and thus may not be immune. That is, if a service provider is “responsible, in whole or in part, for the creation or development of the offending content,” its actions could fall outside the CDA’s protections.

To read the full article, go to CSO Online

 
Preparing Law Students For A New, AI-Assisted Legal World

Law360

November 15, 2023

By Daniel Garrie, Ryan Abbott and Karen Silverman

The legal profession is no stranger to change. From the invention of the printing press to the dawn of the internet, each technological revolution has reshaped how legal professionals work.

Today, another seismic shift is underway as artificial intelligence emerges as a powerful tool in the legal landscape.

A recent study from the University of Minnesota Law School highlighted this transformation, showcasing how AI — specifically, GPT-4, a large language model — aided low-performing law students in improving their exam scores on multiple-choice questions. However, with AI assistance, high-performing students saw a decline in their essay scores.[1]

The findings suggest a potential equalizing effect of AI within the legal profession, which hints at the broader implications of integrating AI in legal education.

This article will delve into the convergence of legal education and AI, exploring the methodologies to train future lawyers with AI and the myriad challenges that may ensue.

The Role of AI in Legal Education

Law schools across the globe are beginning to integrate AI into their curricula, aiming to prepare students for a future where AI plays a central role in legal practice. Here are some of the ways in which AI is transforming, or can transform, legal education.

Personalized Learning

AI-powered adaptive learning platforms can give students a more tailored learning experience catering to their strengths and weaknesses. These platforms can suggest or generate personalized content and exercises by analyzing students’ past performance and learning preferences, improving educational outcomes.

Virtual Simulation and Scenario-Based Learning

AI can facilitate immersive and interactive learning experiences through virtual simulations and chatbots. These platforms can simulate real-world legal scenarios and provide instant feedback, allowing students to develop improved problem-solving and critical thinking skills in a controlled setting.

To read the full article, go to Law360

Don’t Rush to AI and ML Without a Governance Framework

Security Current

October 26, 2023


By David Cass

The rapid adoption of artificial intelligence and machine learning yields tremendous benefits. But as with any transformational technology that can affect human lives and societal structures, there are attendant governance challenges.

Effective governance of AI and ML requires a blueprint to ensure these technologies are used safely, ethically, and responsibly. Understanding the risks associated with these technologies, such as biases, potential misuse, and privacy concerns, is essential. A governance framework will help ensure our organizations have transparency and accountability in their implementation of AI and ML, and they promote the responsible use of these technologies to avoid misuse or unintended consequences.

Having a framework also helps to build trust among the general public and the organization’s stakeholders regarding the deployment of AI and ML. You need to have a standard against which you will be measured.

Key components you need for an effective AI/ML governance framework include:

* Clear objectives. There should be well-defined goals and principles to ensure that any AI or ML introduced is fair, reduces bias, and adheres to the ethical principles you define.

* Clearly defined roles and responsibilities. You want to make sure that you delineate the roles and responsibilities of those involved in developing, deploying, monitoring, and testing AI models.

* Data management. Guidelines on data collection have to be clearly spelled out. What data are being collected? How are data being stored? How are data being processed? How are they being used?

* Implementing transparency. How do you document the processes? How do you document the algorithms and the data sources that are used? This will help you explain the model and potentially explain decisions it may make if you’re called before a board of directors, congressional committee, or some other regulatory or governing body. You need to be able to reconstruct what happened, not just from a regulatory point of view, but to ensure there’s nothing wrong with the model.
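Purely as an illustrative sketch (the class and field names below are assumptions, not any standard schema), the components above could be captured as a structured record that an organization maintains for each deployed model:

```python
from dataclasses import dataclass, field

@dataclass
class ModelGovernanceRecord:
    """One record per deployed AI/ML model, mirroring the components above."""
    objectives: list[str]        # well-defined goals and ethical principles
    roles: dict[str, str]        # responsibility -> accountable party
    data_sources: list[str]      # what data is collected and from where
    data_handling: str           # how data is stored and processed
    algorithm_docs: str          # documented processes and algorithm notes
    decision_log: list[str] = field(default_factory=list)  # for reconstruction

# Hypothetical record for a single model
record = ModelGovernanceRecord(
    objectives=["reduce bias in outcome scoring"],
    roles={"monitoring": "ML Ops team", "testing": "QA"},
    data_sources=["application_form_fields"],
    data_handling="encrypted at rest; processed in-region",
    algorithm_docs="gradient-boosted trees, v2; features documented internally",
)
record.decision_log.append("2023-10-01: decision X; rationale captured")
assert record.roles["testing"] == "QA"
```

Keeping such a record current is what makes it possible to reconstruct a model's behavior when questioned by a board, regulator, or governing body.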

To read the full article, go to Security Current

 
Ethical Principles Must Undergird AI

Security Current

November 1, 2023

 

By David Cass

Artificial intelligence needs to be deployed in a way that benefits humanity. That requires looking beyond the short-term model to long-term use and AI’s widescale impact on the broader society.

As the use of artificial intelligence and machine learning grows, so, too, will the deployment of automated decision-making systems that could greatly impact well-being, privacy, and livelihood. Organizations must, therefore, develop ethical principles to guide the design, development, and deployment of AI and ML systems to ensure that the power of these technologies is used responsibly.

This is a two-stage process: the first stage is developing the principles; the second is defining the core AI ethics principles that will guide the organization.

When developing the principles, the first step is to get multidisciplinary input from a mixed community of ethicists, technologists, legal experts, and sociologists. Representatives of affected communities — for example, health care or finance — also have to be involved to guarantee there’s a comprehensive understanding of the potential implications of the technology’s use.

The second step would be a broader public consultation if it’s an AI or ML model that impacts society at large. Public consultations, such as a town hall, can offer insights from ordinary citizens who might be affected while helping to foster trust in the use of AI and ML.

Because AI is evolving so quickly, regularly reviewing ethical principles is critical to ensure they remain relevant.

It’s also important to put a feedback mechanism in place to ensure that the AI developers, users, and affected individuals can provide observations and critiques on the AI systems and their implications once they’re deployed. It’s important to know whether the system is working as expected.

To read the full article, go to Security Current