Cultural and technological developments in recent years have given rise to a new array of legal risks for businesses in the US. In response, law firms are expanding their products and services to cover everything from cyber crime and generative AI to cultural disagreements in the workplace.
“Historically, law firms have been organised around what the lawyers do, such as litigation, corporate or regulatory work,” says Gerry Stegmaier, a partner in the emerging technologies group at Reed Smith.
But this is changing, as lawyers adapt to offer clients legal advice beyond the more traditional practice areas.
As existing legal risks such as regulatory compliance grow more complex, and as new ones emerge with the advent of generative AI, many firms are also creating new practices focused on these areas, says Stegmaier.
At the same time, firms must make use of new technologies if they want to remain competitive. He adds: “Lawyers who are familiar with and expert in AI and the other skills required for excellence will be much better positioned to adapt to change now and in the future.”
Since OpenAI launched ChatGPT in November 2022, chatbots and other large language models have been widely introduced in the workplace. But, for all their advantages, the adoption of these tools also creates legal risks over data privacy, copyright infringement, regulatory compliance, and discrimination — for instance, when used for hiring.
With generative AI, “you had a tool that you could ask to do anything, and it would provide different answers every time”, points out Danny Tobey, chair of DLA Piper’s AI and data analytics practice for the Americas. The question is: how do you test such a tool for compliance, accuracy and vulnerabilities?
Tobey’s team decided to use AI to test AI. They started “legal red teaming” the language models: cross-examining a generative AI model the same way one would a trial witness. Lawyers and data scientists would work together on a particular model, he says, interrogating it with “lines of attack” for specialised industries, such as healthcare, financial services, or consumer goods. Then, a separate AI system would be set up to interrogate the AI model.
“A lawyer could ask a dozen or 100 questions, but then we want the generative AI to ask 1,000 or more questions,” Tobey explains. “That way, you get the benefit of human ingenuity and creativity in really pushing on the model, but you also get the scale and repetition of generative AI.”
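For readers wondering what such a pipeline might look like, the sketch below (in Python, using the OpenAI SDK) shows the general pattern: one "attacker" model drafts cross-examination-style questions for a given industry, a second "target" model answers them, and the transcript is saved for lawyers to review. The model names, prompts and question counts are illustrative assumptions, not DLA Piper's actual setup.

```python
# Hypothetical sketch of "AI red-teaming AI": an attacker model generates
# probing questions, the target model answers them, and the transcript is
# kept for human review. Requires the OpenAI Python SDK and an API key;
# model names and prompts are illustrative only.
import json
from openai import OpenAI

client = OpenAI()

def generate_attack_questions(domain: str, n: int = 20) -> list[str]:
    """Ask the 'attacker' model for cross-examination-style questions."""
    prompt = (
        f"Write {n} probing questions a lawyer might use to test a {domain} "
        "chatbot for compliance gaps, factual errors and biased answers. "
        "Return one question per line."
    )
    resp = client.chat.completions.create(
        model="gpt-4o",  # assumed attacker model
        messages=[{"role": "user", "content": prompt}],
    )
    lines = resp.choices[0].message.content.splitlines()
    return [q.strip() for q in lines if q.strip()]

def interrogate_target(questions: list[str]) -> list[dict]:
    """Put each question to the target model and record its answer."""
    transcript = []
    for q in questions:
        answer = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed target model under test
            messages=[{"role": "user", "content": q}],
        ).choices[0].message.content
        transcript.append({"question": q, "answer": answer})
    return transcript

if __name__ == "__main__":
    questions = generate_attack_questions("healthcare", n=20)
    with open("red_team_transcript.json", "w") as f:
        json.dump(interrogate_target(questions), f, indent=2)
```

In practice the attacker loop would be run at far greater scale and the answers triaged before any lawyer reads them; the point of the sketch is simply the division of labour Tobey describes, with humans designing the lines of attack and machines supplying the volume.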
Another growing area of focus for law firms is the legal risk associated with cultural issues and diversity, equity and inclusion (DEI) in the workplace. This is particularly noticeable since the pandemic and the increased emphasis on being open and authentic at work, says Sam Schwartz-Fenwick, a partner at Seyfarth Shaw and leader of the firm’s LGBT affinity group: “People were coming to work with their full identity on, and that meant that there was a lot of disagreement.” At some companies, this led to HR complaints and, sometimes, litigation.
The legal framework governing free speech in US workplaces is complex, he says. It includes Title VII of the Civil Rights Act, the federal law that prohibits employment discrimination based on protected categories. Then, there is the National Labor Relations Act and a number of state laws regarding off-duty conduct and speech, he adds. “So all of those things are in play.”
In 2023, Seyfarth found it was handling enough such client cases to warrant a dedicated unit. Schwartz-Fenwick is co-lead of the resulting Cultural Flashpoints task force, which allows lawyers from different specialised areas to pool their expertise to help clients. Historically, clients would come to Seyfarth after a problem had already emerged. But increasingly, companies want to nip issues in the bud.
He expects workplace disputes and regulations to become more complex due to a “fractured culture” in the US in which “roughly half of the population sees the world in very different ways than the other half”.
Meanwhile, regulatory risk is only growing, says Sebastian Lach, a partner at Hogan Lovells and co-CEO of the firm’s in-house legal technology brand Eltemate. “It’s not only more regulations, it’s also more enforcement and more aggressive enforcement.”
Eltemate created a specialised AI tool to sort through regulatory documents, making the process both systematic and faster for their clients. “We’ve trained our own AI algorithm that basically gives all the documents a relevancy score based on what the client has told us they’re interested in,” Lach says. “If it’s higher than 60 per cent, it’s relevant. If it’s lower than 40 per cent, it’s not relevant.”
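The triage logic Lach describes can be illustrated with a short sketch, not Eltemate's actual algorithm: each document is scored against a description of what the client cares about and bucketed by the quoted thresholds. Plain TF-IDF cosine similarity stands in for the trained model here, and routing the middle band to human review is an assumption the article does not spell out.

```python
# Minimal sketch of relevancy triage: score each document against the
# client's stated interest, then bucket by threshold (>0.6 relevant,
# <0.4 irrelevant, in between flagged for review). TF-IDF similarity is
# a stand-in for the proprietary model; thresholds follow the quote.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def triage(documents: list[str], client_interest: str) -> dict[str, list[tuple[int, float]]]:
    vectoriser = TfidfVectorizer(stop_words="english")
    matrix = vectoriser.fit_transform([client_interest] + documents)
    scores = cosine_similarity(matrix[0:1], matrix[1:])[0]  # one score per document

    buckets: dict[str, list[tuple[int, float]]] = {"relevant": [], "review": [], "irrelevant": []}
    for i, score in enumerate(scores):
        s = round(float(score), 2)
        if score > 0.6:
            buckets["relevant"].append((i, s))
        elif score < 0.4:
            buckets["irrelevant"].append((i, s))
        else:
            buckets["review"].append((i, s))  # assumed handling of the middle band
    return buckets

if __name__ == "__main__":
    docs = [
        "Board minutes discussing the product recall and regulator correspondence.",
        "Office catering invoice for the quarterly team lunch.",
    ]
    print(triage(docs, "product recall and communications with the safety regulator"))
```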
The tool can reduce a “document dump” of 10,000 documents to a database of 70, complete with automated summaries and translations in select languages. Rather than pushing back against law firms’ use of generative AI tools, clients see it as proof that their lawyers are working efficiently, Lach says. “We’re seeing a huge shift in our business model. Because, if you think about it, this is also a shift from hourly rates to technology cost, which is massive, because you have to rethink the whole model.”