Twice now I’ve experienced the fallout of bugs in my coworkers’ code, and when I looked into it, the bug had been introduced by Copilot.
Think about that for a second.
I’m trying to accept that everyone I talk to at work about these systems (I won’t dignify them with the term “intelligence”) ignores my warnings and treats me like a fool for refusing to use them, but now I also have to clean up the mess others make by trusting these things.
This isn’t sustainable.
@mav very true
@requiem fixing problems with LLM output is going to be every job soon
@benbrown it is becoming my primary objective to prevent this.
Hopefully by providing compelling alternatives, but otherwise, by any means necessary.
@requiem the gold rush mentality has already taken over