nyan

joined 1 year ago
[–] [email protected] 13 points 7 months ago

Personally, I'm of the opinion that we should just set all the clocks to UTC and stop trying to coordinate wall-clock time and solar time, since wall-clock time is arbitrary anyway.

[–] [email protected] 5 points 7 months ago

Half of the human population is of below-average intelligence. They will be that dumb. Guaranteed. And safeguards generally only get added after someone notices that a wrong answer is, in fact, wrong, and complains.

In part, I believe someone's going to die because large corporations will only get serious about controlling what their LLMs spew when faced with criminal charges or a lawsuit that might make a significant gouge in their gross income. Until then, they're going to at best try to patch around the exact prompts that come up in each subsequent media scandal. Which is so easy to get around that some people are likely to do so by accident.

(As for humans making up answers, yes, some of them will, but in my experience it's not all that common—some form of "how would I know?" is a more likely response. Maybe the sample of people I have contact with on a regular basis is statistically skewed. Or maybe it's a Canadian thing.)

[–] [email protected] 6 points 7 months ago* (last edited 7 months ago) (4 children)

How about taking advice on a medical matter from an LLM? Or asking the appropriate thing to do in a survival situation? Or even seemingly mundane questions like "is it safe to use this [brand name of new model of generator that isn't in the LLM's training data] indoors?" Wrong answers to those questions can kill. If a person thinks the LLM is intelligent, they're more likely to take the bad advice at face value.

If you ask a human about something important that's outside their area of competence, they'll probably refer you to someone they think is knowledgeable. An LLM will happily make something up instead, because it doesn't understand the stakes.

The chance of any given LLM query killing someone is, admittedly, extremely low, but given a sufficiently large number of queries, it will happen sooner or later.

[–] [email protected] 8 points 7 months ago* (last edited 7 months ago) (7 children)

Calling a cat a dog won't make her start jumping into ponds to fetch sticks for you. And calling a glorified autocomplete "intelligence" (artificial or otherwise) doesn't make it smart.

Problem is, words have meanings. Well, they do to actual humans, anyway. And associating the word "intelligence" with these stochastic parrots will encourage nontechnical people to believe LLMs actually are intelligent. That's dangerous—potentially life-threatening. Downplaying the technology is an attempt to prevent this mindset from taking hold. It's about as effective as bailing the ocean with a teaspoon, yes, but some of us see even that as better than doing nothing.

[–] [email protected] -3 points 7 months ago (2 children)

Yes, there is C++ code still being written, and it's a reasonable choice for some lower-level and complex code, but it's a much smaller percentage of the whole than it was even ten years ago. Web stack stuff tends to be written in memory-managed languages, and it probably accounts for more lines of new code than anything else these days (note that I didn't specify good code). You can have a whole career without ever getting down into the weeds.

Similarly, assembler still had some practical applications in games and video codecs when I got out of school. These days, I wouldn't expect to see hand-written assembler outside of an OS kernel or other specialized low-level use. It's still not gone, but it's been gradually going for many years now. Languages without memory management will likely never completely disappear, and they have massive inertia because of the sheer number of C utility libraries lying around, but they're gradually becoming more marginalized.

What it comes down to is: understanding how memory works is useful and broadening for someone who wants to program, but it's no longer necessary even for a professional. (I think we're mostly in agreement on everything except relative importance, in other words.)

[–] [email protected] -2 points 7 months ago

You don't need to understand the details of how memory is allocated to understand that taking up too much space is bad, and that there's often a tradeoff between programmer time, machine execution time, and memory allocated, though.

[–] [email protected] 9 points 7 months ago* (last edited 7 months ago) (9 children)

Some people found the primitive ELIZA chatbot from 1966 convincing, but I don't think anyone would claim it was true AI. Turing Test notwithstanding, I don't think "convincing people who want to be convinced" should be the minimum test for artificial intelligence. It's just a categorization glitch.

[–] [email protected] -2 points 7 months ago (7 children)

Manual memory management has about as much applicability these days as assembler did back when I was doing my degree. It should be covered as part of learning How Things Work Under the Hood, and it's still needed for some kinds of specialist work, but many—perhaps even the majority of—people writing code will never need to deal with it in the real world, because the languages in which most code is written these days all have some form of automatic memory management.

[–] [email protected] 1 points 7 months ago* (last edited 7 months ago) (1 children)

Why does any studio ever reboot anything? Guaranteed audience. Less risk than putting out something new. At the same time, the original aired so long ago that they can't really continue on from it without potentially confusing new viewers (or at least, I think that's their logic).

[–] [email protected] 1 points 7 months ago

This particular dog appears to be attached to the "Ohio State University [something] Medical Center", according to what I can make out at the top of that badge.

[–] [email protected] 2 points 7 months ago

Displaying it would be kind of tough unless you used oversized faux pixels, though. I mean, ultimately everything that goes on a normal screen is translated to raster.

[–] [email protected] 2 points 7 months ago

And if anyone ever asks about it, just say that the shadow went "off the block" toward the back of the bench.
