And, whilst I’m here, a post from someone who tried using Copilot to help with software dev for a year.
I think my favourite bit was:
Don’t use LLMs for autocomplete, use them for dialogues about the code.
Tried that. It’s worse than a rubber duck, which at least knows to stay silent when it doesn’t know what it’s talking about.
https://infosec.exchange/@david_chisnall/113690087142854474
(and also https://en.m.wikipedia.org/wiki/Rubber_duck_debugging for those who haven’t come across it)
If it were merely a search engine, it would risk not being AI enough. We already have search engines, and no one is gonna invest in that old garbage. So instead, it finds something you might want that’s been predigested for ease of AI consumption (Retrieval), dumps it into the context window alongside your original question (Augmentation), and then bullshits about it (Generation).
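For anyone who hasn’t seen it written down, the whole pipeline really is about that small. Here’s a toy sketch of the Retrieval/Augmentation/Generation loop described above — the keyword-overlap retriever and the stubbed generate() are placeholders of my own, not any vendor’s API:

```python
# Toy sketch of the Retrieval-Augmented Generation loop.
# The retriever is crude keyword overlap and generate() is a stub;
# a real system would swap in a vector store and an actual LLM call.

DOCS = [
    "Rubber duck debugging: explain your code aloud to an inanimate duck.",
    "RAG stuffs retrieved documents into the prompt before generation.",
    "Search engines rank documents; they do not paraphrase them back at you.",
]

def retrieve(question: str, docs: list[str], k: int = 2) -> list[str]:
    """Retrieval: score each document by keyword overlap with the question."""
    q_words = set(question.lower().split())
    scored = sorted(docs, key=lambda d: len(q_words & set(d.lower().split())), reverse=True)
    return scored[:k]

def augment(question: str, passages: list[str]) -> str:
    """Augmentation: dump the retrieved text into the context alongside the question."""
    context = "\n".join(f"- {p}" for p in passages)
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"

def generate(prompt: str) -> str:
    """Generation: placeholder for the LLM call that bullshits about the context."""
    return f"[model output conditioned on {len(prompt)} characters of prompt]"

if __name__ == "__main__":
    question = "What does RAG actually do?"
    prompt = augment(question, retrieve(question, DOCS))
    print(prompt)
    print(generate(prompt))
```

That’s the entire trick: the “retrieval” step is a search engine wearing a trench coat, and everything after it is prompt assembly.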
Think of it as exactly the same stuff the LLM folk have already tried to sell you: an attempt to work around the limitations of training and data availability by providing “cut and paste as a service”, generating ever more complex prompts for you in the hope that this time you’ll pay more for it than it costs to run.