this post was submitted on 05 Mar 2024
54 points (93.5% liked)


When they say that "they have an army of lawyers," or that Disney has more lawyers than animators and things like that, do they tho? Is an army of lawyers really effective? Do companies actually have an "army" of lawyers to draft and sign documents?

[–] oxjox@lemmy.ml 12 points 8 months ago (3 children)

I would imagine it's only a matter of time before AI can do the majority of the work for law firms. I'll have to ask my IP lawyer friend about this.

[–] vis4valentine@lemmy.ml 19 points 8 months ago (3 children)

Lawyers who have tried to use AI so far have lost their cases miserably.

[–] Fubarberry@sopuli.xyz 25 points 8 months ago (1 children)

That's because we only hear about AI being used by lawyers when they use it wrong and it hallucinates a case that doesn't exist, and then they don't actually verify the case themselves.

I'm sure lawyers are already using it successfully, we just don't hear about successful cases.

And right now they're using general-purpose LLMs. I'm sure we'll get models actually focused on legal knowledge in the future that will do much better than the current ones.

[–] sonori@beehaw.org 1 points 8 months ago

Most of the recent change in AI has been due to OpenAI's approach of combining a fairly basic transformer with ever more training data: going from all the books they could pirate for GPT-3 to essentially the entire text of the internet for GPT-4. Smaller subject-specific models have made relatively little progress in the last ten to fifteen years, so I don't think a chatbot like GPT-4 that regurgitates more specific information with high accuracy is likely to be on the table anytime soon.

A better search engine seems far more suited to such a task than a generative system anyway.
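To make that search-engine point concrete, here's a minimal sketch of plain keyword-weighted retrieval over a handful of case files, assuming scikit-learn; the case texts and the query are made-up placeholders, not a real legal corpus:

```python
# Minimal sketch: rank a few (made-up) case texts against a query using
# TF-IDF and cosine similarity -- a plain search engine, no generative model.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

cases = {
    "smith_v_jones": "Breach of contract over late delivery of industrial goods.",
    "acme_v_doe": "Trademark infringement and likelihood of consumer confusion.",
    "roe_v_megacorp": "Wrongful termination and retaliation under state labor law.",
}

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(cases.values())  # one row per case text

query = "employee fired in retaliation for reporting safety violations"
scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]

# Print the cases ranked by relevance to the query, best match first.
for name, score in sorted(zip(cases, scores), key=lambda pair: -pair[1]):
    print(f"{score:.3f}  {name}")
```

Nothing here generates text; it only ranks what already exists, which is the appeal of that approach.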

[–] rdyoung@lemmy.world 11 points 8 months ago (1 children)

First off, it's not AI, it's an LLM, basically a better way to collate and search data. It's a tool that they should be using for research, but they had better not be using ChatGPT or any of the other publicly available ones. I would hope that by now someone has launched, or is working on, one that was trained with data from law books, existing case law, etc. Then you could also feed it any discovery documents that come in, and it could help highlight what is important.

[–] Atemu@lemmy.ml -2 points 8 months ago* (last edited 8 months ago) (2 children)

a better way to collate and search data

[citation needed]

Though I'm sure your LLM could hallucinate some for you!

[–] rdyoung@lemmy.world 4 points 8 months ago* (last edited 8 months ago)

Do I need to define collate? Maybe it wasn't the best choice of verbiage, but the point still stands. The quality of the output is always relative to the quality of the input. That's why a growing number of companies are training their own LLMs with data from their own databases instead of trying to rely on external datasets.

For the record, I'm not talking about ones that you can ask a question and get an answer. I was talking about law firms using a local or privately hosted LLM to scan through discovery documents and find keywords, or related terms, that may be relevant to the case they are working on. Especially now that a lot of discovery is digital.
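For what that discovery-scan idea could look like in practice, here's a rough sketch using a locally run embedding model (sentence-transformers is assumed; the issues and passages are invented), flagging passages that look semantically related to the issues in a case rather than only exact keyword hits:

```python
# Rough sketch: flag discovery passages that look related to the case issues,
# using a small embedding model that runs locally (no data leaves the firm).
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small model, runs on a laptop

case_issues = [
    "internal warnings about brake failure",
    "instructions to delete emails about the defect",
]

# In practice these would be chunks pulled from thousands of discovery files.
passages = [
    "Reminder: the quarterly picnic has moved to Friday.",
    "Engineering flagged intermittent braking issues in the 2019 batch.",
    "Please purge the defect thread before the audit next week.",
]

issue_emb = model.encode(case_issues, convert_to_tensor=True)
passage_emb = model.encode(passages, convert_to_tensor=True)
scores = util.cos_sim(issue_emb, passage_emb)  # issues x passages similarity

for i, issue in enumerate(case_issues):
    for j, passage in enumerate(passages):
        score = float(scores[i][j])
        if score > 0.4:  # arbitrary "worth a human look" threshold
            print(f"[{score:.2f}] {issue!r} -> {passage!r}")
```

A human still has to read whatever gets flagged; the model only narrows the pile.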

I can't give more detail than this because it may not be public yet, but I am aware of one company working on its own LLM to let clients more easily find info published on their platform: the kind of material that would take longer to skim through than to just search for.

[–] FlashMobOfOne@lemmy.world 3 points 8 months ago* (last edited 8 months ago)

I love that term "hallucinate".

That's as big a euphemism as the word "faith", and like "faith", it's used to mask glaring operational deficiencies. It reminds me of the time I test drove a used car and there was a clear steering issue, which the salesman called a "shimmy".

[–] Death_Equity@lemmy.world -2 points 8 months ago* (last edited 8 months ago) (2 children)

Because we don't actually have AI. We have people following paint by numbers, not artists.

True AI, and not the sparkling programming we have, will be more effective than any lawyer.

[–] captainlezbian@lemmy.world 2 points 8 months ago (1 children)

Oh, you mean that thing that hasn’t been proven possible yet?

[–] Death_Equity@lemmy.world 2 points 8 months ago (1 children)

“Not within a thousand years will man ever fly” -Wilbur Wright 1901

Two years later he and his brother achieved the first successful test of powered flight. Their flight would last 12 seconds and cover 120 ft with a top speed of 6.8 mph.

The SR-71 Blackbird, flown 61 years after that first powered flight, had a top speed of 2,190 mph and a range of 2,500 miles.

True AI will happen unless temporary stars are all the rage.

[–] captainlezbian@lemmy.world 2 points 8 months ago

Yeah, the Wright brothers were great, and it pains me to say this as a Daytonian engineer, but they were also completely full of themselves. At the time there was good reason to believe heavier-than-air flight was not only possible but coming soon. Lighter-than-air flight was not only already happening but had been used in conflicts; there was a hot air balloonist involved in the Paris Commune.

But my doubts are about the possibility, immediacy, and practicality of an artificial device having human-level or greater cognition in ways that mimic organic brains. These questions aren't me just being some doubter (though that is valid given the sheer resources being thrown at these systems and the way we're being asked to leave problems to them rather than seeking more immediate alternatives); they're based on discussions with artificial intelligence specialists who don't have a financial stake in the technology.

[–] rdyoung@lemmy.world -1 points 8 months ago

Who downvoted you? I've been arguing the same thing since AI became the buzzword of the decade. No one seems to understand what artificial intelligence actually is and how these current systems are anything but. They aren't even really a step in that direction, because the underlying software and hardware aren't anywhere near ready to emulate a human brain, or even a lower animal's.

[–] Brkdncr@lemmy.world 7 points 8 months ago

Yes, lots of contract and doc review billable hours are going to go away. It's going to be devastating.

Doc review is pretty awful though so maybe it’s for the better.

[–] FlashMobOfOne@lemmy.world 2 points 8 months ago (1 children)

All AI does is determine the probability of the next word that's about to be said.
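As a rough illustration of that point (assuming the Hugging Face transformers library, with GPT-2 standing in for a bigger model), this is more or less all that's happening at each step:

```python
# Sketch: look at the model's probability distribution over the *next* token.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "The court finds the defendant"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, sequence_length, vocab_size)

# Softmax over the last position gives the next-token probabilities.
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, 5)

for p, idx in zip(top.values, top.indices):
    print(f"{float(p):.3f}  {tokenizer.decode([int(idx)])!r}")
```

Everything the model "writes" is just repeated draws from distributions like that one.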

There definitely will come a time when an AI can craft legal thought, but it is a long, long time off.

Source: I'm a legal tech who's actually helping my firm test legal gen AI platforms, all of which produce information that can't be relied upon without human validation.

[–] oxjox@lemmy.ml 2 points 8 months ago (1 children)

I currently use Copilot to help me solve problems with Microsoft Power Platform. "AI", the generic misleading word we use to describe advanced search, can scour the web and solve a problem for me in seconds. It's only a matter of time before a more advanced algorithm "learns" all the case law to ever exist. At some point, I imagine, you'll be able to type in all the details about your case and the machine will find all the applicable court cases in a matter of minutes. It's still up to the lawyers to apply and utilize this, but the research, the stuff that takes an army of lawyers, will be done in a fraction of a fraction of the time.

I said it's a matter of time, not that it's happening now or will happen tomorrow. But I do believe it will happen in our lifetime, very likely within the next ten years.

[–] FlashMobOfOne@lemmy.world 1 points 8 months ago (1 children)

Yep. I use ChatGPT myself to help with research on coding and other issues.

I don't expect AI is going to replace humans anytime soon, but using it is going to be an essential skill, and people and companies who don't learn how will definitely go extinct.

[–] oxjox@lemmy.ml 1 points 8 months ago

It is most certainly already replacing some humans. One example is some video content now being generated by AI rather than a digital artist; same for stock photos.

All mechanization and automation is designed to aid or replace humans. In some cases it's about safety or about effort. But ultimately, it's about a company increasing their profitability. Profits will motivate adoption.