Your mom's lol
Hexarei
I printed and assembled one myself for about $90 for all parts, so I can somewhat understand some of the prices... But whew, that's expensive
I'm gonna stop responding to this asinine thread now before you continue to demean us both with your nonsense.
Simpler language is fine when it's accurate.
Your simplification is inaccurate and could mislead people into thinking GPTs are just advanced regex matching engines.
They are not. They are closer to autocorrect on steroids.
Analysis. It uses it, but not by "matching it". The training data is not included in the final model. No GPT can access its training data at runtime.
Training analyzes the contents of the training data and creates a statistical model representing the likelihoods of various tokens, built from a complex series of mathematical transformations that encode various attributes of those tokens.
3Blue1Brown has a great series on the actual math behind it, I would highly recommend educating yourself on what GPTs actually do. It's way more interesting than simple matching.
You said it matches text to its training data, which it does not do.
Your single-phrase statement only works for very short, non-repetitive phrases. As soon as your phrase repeats a token more than a few times, the statistics for those tokens change, which can result in nonsensical output that loops through subsections of the training data.
And even then, for that single non-repetitive phrase, the reason you would get that single phrase back is not because the model is "matching on" the phrase. It is because the token weights would effectively encode that the statistical likelihood of the "next token" in the generated output is 100% for a given token when the evaluated token precedes it in the training phrase. Or in other words: your training data being a single phrase manipulates the statistics so that the most likely output is that single phrase.
However, that is a far cry from simple "matching" against the training data, which is what you said it does.
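If it helps, here's a deliberately oversimplified sketch of that single-phrase case: a word-level next-token counter in Python. It's nowhere near a real transformer, and the phrase, the word-level "tokens", and the greedy decoding are my own toy choices, but it shows the point that the model only stores statistics and still reproduces the phrase without ever searching the training text.

```python
# Toy word-level "next token" model -- nothing like a real transformer, just
# enough to show the statistics argument.

from collections import Counter, defaultdict


def train(corpus):
    """Count which token follows which. These counts are the entire 'model';
    the original text is not kept anywhere."""
    counts = defaultdict(Counter)
    for text in corpus:
        tokens = text.split()
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    return counts


def generate(model, start, max_tokens=20):
    """Emit the most likely next token at each step (greedy decoding)."""
    out = [start]
    while len(out) < max_tokens and out[-1] in model:
        out.append(model[out[-1]].most_common(1)[0][0])
    return " ".join(out)


# Single non-repetitive phrase: every "next token" probability is 100%,
# so generation reproduces the phrase -- no lookup or matching involved.
single = train(["colorless green ideas sleep furiously"])
print(generate(single, "colorless"))  # colorless green ideas sleep furiously

# Repeat a token and the statistics change: greedy decoding now loops,
# which is the nonsensical-repetition failure mode mentioned above.
repeated = train(["a b a c a b"])
print(generate(repeated, "a"))  # a b a b a b ...
```

A real GPT replaces the counting with learned weights over embeddings and attention layers, but the core idea is the same: predict the next token from statistics derived from the training data, not by looking the text up.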
They do not store anything verbatim; they instead store the directions in which various words and related concepts relate to one another in some gigantic multidimensional space.
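To make "directions in a multidimensional space" a bit more concrete, here's another toy sketch. The 4-dimensional vectors below are numbers I made up for illustration (real embeddings are learned from the training data and have hundreds or thousands of dimensions, and the analogy only works approximately there):

```python
# Made-up 4-dimensional "embeddings" to illustrate relationships as directions.
# These numbers are hand-picked so the arithmetic works out exactly.

import math


def cosine(a, b):
    """Similarity of direction: 1.0 = same direction, 0.0 = unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm


emb = {
    "king":  [0.9, 0.8, 0.2, 0.0],
    "queen": [0.9, 0.2, 0.8, 0.0],
    "man":   [0.1, 0.8, 0.2, 0.0],
    "woman": [0.1, 0.2, 0.8, 0.0],
    "apple": [0.0, 0.0, 0.0, 0.9],
}

# Related concepts point in similar directions; unrelated ones don't.
print(cosine(emb["king"], emb["queen"]))  # ~0.76
print(cosine(emb["king"], emb["apple"]))  # 0.0

# Relationships are directions too: king - man + woman lands on queen here.
analogy = [k - m + w for k, m, w in zip(emb["king"], emb["man"], emb["woman"])]
print(cosine(analogy, emb["queen"]))  # 1.0 (constructed to be exact in this toy)
```

The point being: what gets stored is geometry like this, not the sentences themselves.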
I highly suggest you go learn what they actually do before you continue talking out of your ass about them
That's not how GPTs work
I've always heard it as "Where you mean to say one thing but fuck your mother"
If just telling her what you want isn't enough for you to feel like you're communicating effectively, try asking if you can add to it by telling her how you want it, and then maybe expand to how you want to feel about it.
E.g. not just "I would like oral" but instead, "I would like oral, and I'd love to hear that you're enjoying it, however you want to express it." <- This is a request that is direct and specific but doesn't feel robotic or unceremonious IMHO.
I have ASD and my wife doesn't, so we've established that it often makes the most sense to just explicitly ask one another, "what can I do for you tonight?" Which leads to very specific answers about what we're wanting to get out of it and how we can best achieve that together. "I've been thinking about you in this way" or "I'd like to know what it looks/feels/tastes/sounds like when you ..." followed by describing whatever action would best fulfill the desire, followed by any specifics and how we're feeling about it now. "Now that we've talked about it I'm definitely excited to see that" and such.
Dunno if that's helpful but there might be ways to make it feel more special while still being explicit and direct! Just talk about the how and why and how you feel about it.
Honestly this thing solved a very specific problem for me: My car has no good way to mount a phone for car usage so I've always kept it in the center console. Car Thing just put a remote on my dash for that with buttons for presets and easy song skipping.
I only got it for $10 though, and that was two years ago. It has convinced me to get a new head unit with Android Auto support on it for sure
Just here in the comments as a top level comment, as an explanation of the image