this post was submitted on 11 Jul 2023
136 points (95.3% liked)

Europe


News/Interesting Stories/Beautiful Pictures from Europe 🇪🇺

top 24 comments
[–] [email protected] 103 points 1 year ago (2 children)

The EU AI Act would require the company to disclose details of its training methods and data sources.

If we're not going to make this apply to AI companies, which have disproportionate power already, then what else is there to talk about?

[–] [email protected] 20 points 1 year ago* (last edited 1 year ago) (1 children)

Explainable AI is an important field in machine learning; it is about understanding how the model came to its conclusions based on the data. This is crucial for applying AI tools to anything beyond writing silly haikus. An AI company that denies access to that basically wants its customers to use its tools like a fortune teller.

"Yes the computer read that in the stars. how why or how reliable the result? Dunno, but it says sobso it must be true. And now off to prison young black men, with a good job and no criminal record. The AI predicted you would commit a crime in 10 years."

EDIT: To give an example from a lecture I had: the task was picture classification, and one model reliably recognized pictures of a horse in the training data set but failed to recognize them outside of it. It turned out that all the horse pictures in the training set had a watermark text at the bottom, which the model had learned as the defining feature. And that is a very simple task in comparison.

OpenAI not wanting to disclose its training methods and data sources indicates that there could be a lot of garbage like this in its models.
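
In case it helps, here is a toy sketch of that watermark failure mode. It is purely hypothetical (synthetic 16x16 images and a plain scikit-learn logistic regression, not the actual model or dataset from that lecture): every "horse" image in the training set carries a bright bar along the bottom row as a stand-in for the watermark, the test set does not, and a classifier that looks perfect on the training data falls apart on clean images.

```python
# Toy sketch of the "watermark shortcut" described above, on synthetic data.
# Hypothetical setup: 16x16 grayscale images, class 1 = "horse", class 0 = "not horse".
# In the training set every horse image also gets a bright bar along the bottom row
# (the stand-in for the watermark text); the test set has no watermark anywhere.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
SIZE = 16


def make_image(is_horse: bool, watermark: bool) -> np.ndarray:
    img = rng.normal(0.0, 0.1, (SIZE, SIZE))
    if is_horse:
        # stand-in "horse": a vertical bar at a random position
        r, c = rng.integers(0, SIZE - 8), rng.integers(0, SIZE)
        img[r:r + 8, c] += 0.8
    else:
        # stand-in "not horse": a horizontal bar with the same total brightness
        r, c = rng.integers(0, SIZE), rng.integers(0, SIZE - 8)
        img[r, c:c + 8] += 0.8
    if watermark:
        img[-1, :] += 1.0  # the "watermark text" along the bottom row
    return img.ravel()


def make_set(n: int, watermark_on_horses: bool):
    X, y = [], []
    for _ in range(n):
        label = int(rng.integers(0, 2))
        X.append(make_image(bool(label), watermark=bool(label) and watermark_on_horses))
        y.append(label)
    return np.array(X), np.array(y)


X_train, y_train = make_set(500, watermark_on_horses=True)   # leaky training data
X_test, y_test = make_set(500, watermark_on_horses=False)    # clean evaluation data

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("train accuracy:", clf.score(X_train, y_train))  # looks excellent
print("test accuracy: ", clf.score(X_test, y_test))    # collapses without the watermark

# A crude look "inside" the model: the weight mass sits on the bottom
# (watermark) row instead of on the object itself.
weights = clf.coef_.reshape(SIZE, SIZE)
print("mean |weight| on bottom row:", np.abs(weights[-1]).mean())
print("mean |weight| elsewhere:   ", np.abs(weights[:-1]).mean())
```

Inspecting the learned weights like this is a crude stand-in for proper explainability tooling, but it already shows the shortcut: the model cares about the bottom row, not the object. That is exactly the kind of thing you can only keep hidden by refusing to disclose training data and methods.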

[–] [email protected] 8 points 1 year ago

This is a great point I hadn’t even considered yet, even though I am already very wary and sceptical of capitalism developing this next revolution.

How can the user possibly trust an AI that is, for all intents and purposes, a secretive stranger with an agenda and values you don't know? Especially because capitalism will only develop a slave to its profits; it would never create an actual intelligence with free will that the user could actually get to "know" and trust, and it would never constitute a person in the philosophical sense.

The whole thing is creepy and dystopian, come to think of it… we allow the worst of humanity to shape and bind what will essentially be a superhuman entity to their will.

[–] [email protected] 7 points 1 year ago

NotSoOpenAI

[–] [email protected] 52 points 1 year ago (1 children)

So much for calling yourself open

[–] [email protected] 9 points 1 year ago

Open to slurping your data

[–] [email protected] 44 points 1 year ago (1 children)

keeping information like training methods and data sources secret was necessary to stop its work being copied by rivals.

In addition to the possible business threat, forcing OpenAI to identify its use of copyrighted data would expose the company to potential lawsuits. Generative AI systems like ChatGPT and DALL-E are trained using large amounts of data scraped from the web, much of it copyright protected.

These two paragraphs one after the other really brightened my day.

[–] [email protected] 6 points 1 year ago

Wonder when the big copyright trolls show up at their doorstep instead of pestering random retirees for pirating Matlock.

[–] [email protected] 42 points 1 year ago

Yes, that’s the idea, Sam.

[–] [email protected] 35 points 1 year ago

I've talked about this so much but nobody bloody listens. I sound like I'm crazy sometimes but it's fucking real.

You don't know what the AI is doing, so you have no reason to trust it beyond an expectation that it will give you accurate information, and that's not guaranteed.

They don't have permission to use the vast amounts of information they've scraped from the internet to train an AI model. No one gave OpenAI permission to commercialise the use of their content in an AI model.

It was all well and good when they were a non-profit, but they're selling products now: AI trained on our data and the content we produced.

[–] [email protected] 29 points 1 year ago (1 children)

Asking these companies to disclose the sources they're training their algorithms on just makes sense. You can't allow companies to build for-profit algorithms on copyrighted material that they haven't paid for.

[–] [email protected] 2 points 1 year ago

Hopefully legislation catches up and makes these models commercially unusable if they haven't paid for any of their sources.

[–] [email protected] 28 points 1 year ago (2 children)

What does the Open in the name stand for, then?

Very, very tired of companies embracing openness and share-alike mentalities when it's their turn to take, then skulking off when it's their turn to give. Reminds me of how Crunchyroll started off selling a subscription to stream other people's pirated and fansubbed anime.

[–] [email protected] 6 points 1 year ago

AI was already one of the most publicly open scientific fields before OpenAI grew to dominance. Google, Microsoft and Facebook were all making significant open contributions to the field.

OpenAI is reversing all that. They're not releasing any code or data researchers could experiment with. It's the complete opposite of what their name suggests.

Meta deserves a lot of the hate they get, but at least they're still one of the most significant open contributors to the field, most recently with the release of the Llama models.

[–] [email protected] 1 points 1 year ago

I'm not super into their history, but AFAIK they started out open and non-profit (allegedly; they had people like Musk on board, so take that with a grain of salt) and then over the years did basically a full 180. lol

[–] [email protected] 20 points 1 year ago

Yes, that's how laws & regulations work.

[–] [email protected] 9 points 1 year ago

Can't or won't?

"But your honor, that would be devastating to my client's case!"

[–] [email protected] 9 points 1 year ago
[–] [email protected] 8 points 1 year ago
[–] [email protected] 5 points 1 year ago* (last edited 1 year ago)

Well, from the looks of it, the US is not far behind in its efforts to regulate. It's hard to see this as anything more than a negotiation tactic.

[–] [email protected] 5 points 1 year ago

This is it in a nutshell. Thank you.

[–] [email protected] 2 points 1 year ago

Too much hype on that one, like Italy threatening to ban it, blah blah blah, and now everything's OK.