Though making an unreliable intern is amazing and was impossible 5 years ago…

thank fuck sama invented the concept of doing a shit job

I mean, it’s not shit at everything; it can be quite useful in the right context (GitHub Copilot is a prime example). Still, it doesn’t surprise me that these first-party LLM benchmarks are full of smoke and mirrors.

That GitHub Copilot and friends are useful? I would argue that their utility is rather subjective, but there are indications that they improve developer productivity.

I’m unsure if you’ve used tools like GH Copilot before, but it primarily operates through “completions” (“spicy autocorrect” in its truest form) rather than a chatbot-like interface. It’s mostly good for filling out boilerplate and code that has a single obvious solution; not game-changing intelligence by any means, but useful in relieving the programmer of various menial tasks.
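To make that concrete, here's a minimal, hypothetical illustration (the function and class are invented for this post, not taken from any actual Copilot session) of the "single obvious solution" and boilerplate cases a completion tool tends to fill in:

```python
# What the programmer types: a signature and docstring...
def fahrenheit_to_celsius(temp_f: float) -> float:
    """Convert a temperature from Fahrenheit to Celsius."""
    # ...and the kind of obvious body an inline completion typically suggests:
    return (temp_f - 32) * 5 / 9


# Boilerplate is similar: start a dataclass and the remaining fields are
# often proposed from surrounding context in the file.
from dataclasses import dataclass

@dataclass
class User:
    id: int
    name: str
    email: str
```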

May I ask, what evidence are you hoping to see in particular?

I too want a taxi driver that doesn’t know how to drive a car but can adjust the little TV content in the back.
Psh, I mean, all he has to do is step on the gas pedal and the car does all the work anyway, right? I’m glad he doesn’t have to think too much about it, so he has more time to get the thermostat just right.

I mean… yeah? That’s kind of the point. It’s not driving; it’s the copilot. You’re the one driving, and it will get the thermostat right because you’re busy operating the vehicle and want to keep your attention on the road. That seems useful to me.

If you already have an idea of the code you want to write and start typing it, Copilot can help autocomplete it so you can focus on actually solving whatever problem you’re working on instead of searching for the correct syntax online. I understand shitting on AI is fun and there are plenty of valid criticisms to be made, but this is actually kind of useful.
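As a rough sketch of that workflow (the log format and the helper name are made up for illustration), the comment is what you type and the body is the sort of completion that saves a trip to the search engine for regex and strptime syntax:

```python
import re
from datetime import datetime

# extract the timestamp from a log line like "2024-05-15 12:01:33 ERROR ..."
def parse_log_timestamp(line: str) -> datetime:
    match = re.match(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})", line)
    if match is None:
        raise ValueError(f"no timestamp found in: {line!r}")
    return datetime.strptime(match.group(1), "%Y-%m-%d %H:%M:%S")
```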

how could we possibly be critical of the technology that at best replicates basic editor functionality (templating, syntax completion), outputs wildly incorrect code, and burns rainforests?

The Register · Microsoft's carbon emissions up nearly 30% thanks to AI, by Dan Robinson

I’m not saying you can’t be critical of it, but templating and syntax completion are in fact useful. Suggesting incorrect code is obviously bad, but all of this stuff is still relatively new and I’m sure it’ll get better with time. Can’t we at least try to be a little optimistic about what this stuff is capable of when we give our criticisms, instead of having knee-jerk reactions that make it out to be the harbinger of the apocalypse?

Side point to address the linked article: yes, computing systems use energy. If our energy grid is overly reliant on fossil fuels with harmful emissions, that doesn’t mean we need to stop the advancement of our computers. It means we need to stop relying so heavily on fossil fuels in our grid.

@LargeMarge @self you seem very keen to find a problem for your beloved solution. Did you used to be a Blockchainer?

Nope, I just feel like there’s a lot of reactionary content out there about AI. It’s still in its infancy, and a lot of the tech bros behind these companies are full of shit and overhype it, which is exactly why I was also skeptical about ChatGPT passing the bar exam when it initially happened. But even with that said, it’s still a tool that can be applied in useful ways, such as giving suggestions for code or correcting grammar as you type.

There’s just no nuance in these discussions, and you’re a perfect example of that.

The ball is in the parking lot.

@LargeMarge boiling the oceans because it might be good one day. Ok then