Description
Context
The current phrasing on AI-to-human collaboration mentions hallucinations as a problem, but that's not the real problem.
Comment
We need positive framing here (and also update the docs!)

Compare:
- agents are powerful but hallucinate
- agents make mistakes
Bad, bad, bad. No: agents are amazing. The problem is not that they hallucinate or make mistakes. It's the sheer amount of changes they create. If an agent makes 1 change a day, you can manually diff and verify that everything is okay.

BUT agents create 1000's of changes in 1 day. THAT's where Lix helps: see and control what agents do (OMG SUCH A GOOD TAGLINE), irrespective of whether they make mistakes.
Transcript
08:57
I think the core problem that Lix solves is that agents generate so many, so many changes. And it's hard to keep track of what AI agents do.
09:06
Sure, if they hallucinate, that's just an outcome of them doing so many changes. I mean, look at that. Assume that an AI agent does one change a day.
09:13
Is it problematic that they're hallucinating? No, it's not, because you have the time to look at that one change. But if they do a thousand changes in a day, that becomes problematic, because sure, now, do you have time to go through a thousand changes?
09:27
Like, how do you even know? That's the main problem, how do you even know what changes the AI agent did?
09:31
If you have one change a day, sure, you can manually compare the document and check what's off. But the main problem is the amount of changes.
09:38
And hallucinations are just an outcome. So, we start this section with a bold but imperfect hook: AI agents are powerful, they are the future.
09:49
Part of that future is that AI agents create a shit ton of changes. And staying on top of what AI agents do is extremely hard.
10:00
This is where Lix helps. No hallucination talk, no dwelling on the problems. People will solve the problems with AI agents. They will hallucinate less, and so on and so forth.
10:10
The amount of changes they create is just overwhelming. Show a demo here. So, at the step where you are, you're giving this awesome hook.
Proposal
Change the docs to say that the amount of changes agents generate is the problem, not whether they hallucinate.