[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

The training of an attention network is not meaningfully different from training a perceptron, which was invented in 1957. Interconnected networks of computers date back to 1969. finger captured the spirit of Twitter in 1971. The businesses may have changed, but the concepts you speak of were very well understood in 1980.
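
To make that concrete, here is a minimal sketch (PyTorch assumed, with the perceptron stood in for by a single linear unit and all shapes and hyperparameters purely illustrative): the training loop, backpropagation plus a gradient step, is the same whether the model is a perceptron or an attention layer; only the forward pass differs.

```python
# Minimal sketch: the same gradient-descent loop trains a perceptron-style
# linear unit and an attention layer; only the forward pass changes.
import torch
import torch.nn as nn

x = torch.randn(64, 16)   # toy inputs (illustrative shapes)
y = torch.randn(64, 1)    # toy targets

perceptron = nn.Linear(16, 1)  # single linear unit standing in for the 1957 perceptron
attention = nn.MultiheadAttention(embed_dim=16, num_heads=1, batch_first=True)
readout = nn.Linear(16, 1)

def train(params, forward, steps=100):
    opt = torch.optim.SGD(list(params), lr=0.01)
    for _ in range(steps):
        loss = nn.functional.mse_loss(forward(x), y)
        opt.zero_grad()
        loss.backward()   # same backprop + gradient step in both cases
        opt.step()

train(perceptron.parameters(), lambda inp: perceptron(inp))
train(list(attention.parameters()) + list(readout.parameters()),
      lambda inp: readout(attention(inp.unsqueeze(1), inp.unsqueeze(1),
                                    inp.unsqueeze(1))[0].squeeze(1)))
```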

A contract precisely specifying "generative AI", "Twitter", and "the Internet" would be silly even today. That would be like a DVD specifying "You may only watch this while sitting on a Chesterfield flower-print couch in your living room, on an RCA CRT television". My understanding from you is that the intent is simply to prohibit non-humans from reading the work. That is something that could have been easily written in 1980.

Hell, robots doing human-like things were front and centre in popular culture in the 1980s. It would have been impossible not to realize that robots reading books was an almost certain future. To act surprised now that it is happening is an act nobody is buying.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

There is a massive difference between AI tech in the 70s and today. The scale we're able to achieve is orders of magnitude beyond what was dreamed of. These modern issues were expected to arrive much later, giving the legal system more time to catch up. Our legal system can force a common baseline of behavior on our new technology, and that will be necessary for a healthy balance of power.

[–] [email protected] 0 points 1 year ago* (last edited 1 year ago)

There is a massive difference between AI tech in the 70s and today.

Not really. We've learned a few tricks along the way, but the fundamentals of neural networks have not changed at all. The most significant progress in AI has come from compute becoming orders of magnitude faster. Which we knew, with reasonable confidence, was going to happen. Moore's Law and all that.
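
For the Moore's Law point, a quick back-of-the-envelope sketch (the 1975 baseline and the roughly two-year doubling period are assumptions for illustration, not figures from this thread):

```python
# Rough Moore's Law arithmetic: compute doubling every ~2 years
# from an assumed 1975 baseline up to 2023.
years = 2023 - 1975
doublings = years / 2
factor = 2 ** doublings
print(f"{doublings:.0f} doublings -> roughly {factor:.1e}x more compute")
# 24 doublings -> roughly 1.7e+07x more compute, about seven orders of magnitude
```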

The scale we're able to achieve is orders of magnitude beyond what was dreamed of.

While I disagree, the scale is irrelevant. The slow systems of the 1970s were maybe ingesting one book rather than millions of books, but legally there is no difference between one book and millions of books. If we are to believe that a machine has no legal right to read a book, then reading even one book violates that.

our new technology

What new technology? The "Attention Is All You Need" paper, which gave rise to LLMs, showed a way to train neural networks faster, but it is not the speed at which machines can read books that is in question. Nobody has suggested that the legal contention lies in traffic law, with computers breaking speed limits.
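
For reference, the core of that paper is scaled dot-product attention. A short sketch of what it computes (PyTorch assumed, toy tensor shapes): the contribution was a more parallelizable way to train sequence models, not any new kind of machine reading.

```python
# Sketch of scaled dot-product attention from "Attention Is All You Need"
# (toy tensors; PyTorch assumed).
import math
import torch

def scaled_dot_product_attention(q, k, v):
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))  # query-key similarity
    weights = torch.softmax(scores, dim=-1)                    # attention weights
    return weights @ v                                         # weighted sum of values

q = k = v = torch.randn(2, 5, 16)  # (batch, sequence, embedding)
print(scaled_dot_product_attention(q, k, v).shape)  # torch.Size([2, 5, 16])
```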

We've been doing this for decades upon decades upon decades. Incremental improvements in doing it faster do not change the legal picture. To pretend that the world was suddenly flipped upside down is ridiculously disingenuous.