AI Stuff

294 readers
1 user here now

A place for all things artificial intelligence

Stay up-to-date with the latest news, reviews, and insightful discussions about artificial intelligence. Whether you're interested in machine learning, neural networks, natural language processing, or AI applications, this is the place to be!

Subscribe: [email protected]

Rules

1. Stay on topic: All posts should be directly related to artificial intelligence. This includes discussions, news, research, tutorials, applications, and anything else specifically about AI.

2. No reposts/rehosted content: Submit original sources, unless the content is not available in English. Reposts about the same AI-related content are not allowed.

3. No self-promotional spam: Only active members of the community may post their AI-related apps, projects, or resources, and they must actively participate in discussions. Please avoid posting self-promotional content that does not contribute to the AI community.

4. No editorializing titles: When sharing AI-related articles or content, refrain from changing the original titles. You may add the author's name if relevant.

5. No offensive/low-effort content: Avoid posting offensive, irrelevant, or low-effort content that does not contribute positively to the AI community.

6. No unauthorized polls/bots/giveaways: Do not create unauthorized polls, use bots to generate content, or organize giveaways related to AI without proper authorization.

7. No affiliate links: Posting AI-related affiliate links is not allowed.

founded 1 year ago
 
  • The U.S. is updating rules to prevent American chipmakers from selling AI chips to China that skirt existing restrictions.
  • These updated rules will expand the scope of restricted chips and require reporting for certain shipments.
  • The move aims to prevent U.S. technology from strengthening China's military but risks complicating U.S.-China diplomatic efforts.
  • Consumer chips are exempt, but the new rules aim to close loopholes and evolve with technology advancements.

cross-posted from [email protected]

  • Google's Bard is developing a "Memory" feature that allows the chatbot to remember and adapt to user-specific details and preferences.
  • Currently, each conversation with Bard starts afresh, requiring users to restate particulars like dietary restrictions.
  • The Memory feature will improve personalized interactions, such as recommending recipes based on a user's noted vegetarianism or travel suggestions considering the number of kids a user has.
  • Users will be able to manage their stored preferences via a Memory page and can delete incorrect or unwanted memories.
  • A toggle will allow users to easily disable the Memory function for privacy concerns or to have non-personalized conversations.

cross-posted from [email protected]

cross-posted from: https://infosec.pub/post/2221265

NYT gift article expires in 30 days.

https://ghostarchive.org/archive/FAewq

By Andrew R. Chow

Over the past few months, shiny metallic orbs have materialized in cities around the world, from New York to Berlin to Tokyo. Their creators hail the orbs as revolutionary devices ushering in a new era of global humanity and financial stability. Their detractors slam them as invasive, dystopian and exploitative.

Welcome to the rollout of Worldcoin, an AI-meets-crypto project from OpenAI founder Sam Altman that has stirred endless controversy. The startup uses orbs to scan people’s eyes in exchange for a digital ID and possibly some cryptocurrency, depending on what country they live in.

Altman and his co-founder Alex Blania hope that Worldcoin will provide a new solution to online identity in a digital landscape rife with scams, bots and even AI imposters. But privacy experts are concerned about Worldcoin’s collection of biometric data and how, exactly, the project will keep and protect that data going forward. On Aug. 2, Kenya became the first country to suspend Worldcoin’s activities, and its government launched a multi-agency investigation into the project’s practices.

Here’s what to know about the new technology.

What is the aim of Worldcoin?

Navigating the online world in 2023 can often feel like an endless obstacle course of lurking dangers. Many websites require a login and password. Scammers have all sorts of strategies to get you to click on links and send them money. Bots run rampant on social media platforms.

And with the rise of AI, our collective ability to discern who is a human online and who isn’t is about to become much worse. Earlier this year, OpenAI’s GPT-4 was even able to convince a human to solve a CAPTCHA—a technology designed to differentiate humans from bots—on its behalf.

As a founder and the CEO of OpenAI, Sam Altman bears responsibility for problems like this. Worldcoin, then, serves as his potential solution: a way to definitively distinguish between humans and AIs. If all humans online could prove that they were, in fact, humans, then scams and imposters would dramatically decrease, and the digital landscapes would become more accurate representations of us as a society.

So in order to prove that humans are humans, Worldcoin scans irises, which are unique to their owners. This technique is not unlike the biometric scans conducted by CLEAR or Apple’s Face ID.

Once Worldcoin has received a unique iris scan, the project issues a digital identity called a World ID. The ID is not a user’s biometric data itself, but an identifier created using a cryptography method called zero-knowledge proofs.

If World IDs catch on, then holders could theoretically use them to sign on to all websites, just like Google offers single sign-on services. The difference, Altman argues, is that a Worldcoin login will be more secure and unlinked to other information, including a user’s email, name or photograph. If the zero-knowledge proof aspect of the tech works properly, Worldcoin would allow ID holders to log into a website without that action being traceable by other people or any government.
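
To make the idea concrete, here is a minimal sketch of deriving a non-reversible identifier from a biometric template. This is an illustration under stated assumptions, not Worldcoin’s actual protocol: a salted SHA-256 commitment stands in for the zero-knowledge machinery the project describes, and every name in the snippet is invented for the example.

```python
import hashlib
import secrets


def derive_identifier(iris_code: bytes, salt: bytes) -> str:
    """Derive a fixed-size identifier from a biometric template.

    A salted hash hides the raw template: the identifier can be stored and
    compared, but the iris code cannot be recovered from it. Worldcoin's
    real scheme relies on zero-knowledge proofs rather than a bare hash;
    this is only a simplified stand-in.
    """
    return hashlib.sha256(salt + iris_code).hexdigest()


# Hypothetical usage: the orb produces an iris template, and only the
# derived identifier is kept while the raw biometric is discarded.
salt = secrets.token_bytes(16)  # assumed per-deployment secret salt
iris_template = b"placeholder binary iris code produced by the orb"
world_id_like = derive_identifier(iris_template, salt)
print(world_id_like)
```

In this sketch a website would store only the derived identifier, which is why, if something like it worked as intended, a login would not expose the user’s email, name, or photograph.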

In June, Okta became the first major company to allow users to sign in with Worldcoin.

Why are people signing up?

Worldcoin officially launched in July, with the project embarking on a multi-city sign-up tour. Altman posted a video of long lines outside Orb centers, and said that the project was scanning in a new user every eight seconds.

Worldcoin claims that more than two million people have registered for World IDs. Some people have signed up out of curiosity or affinity for the technology. But many more, it seems, signed up for a simpler reason: money. To galvanize interest in the project, Worldcoin created its own crypto token, called WLD, and has offered 25 units (currently worth about $60) to anyone who scans their eyes into an orb. That offer does not extend to the U.S., however, due to the country’s strict regulatory environment around crypto products.

Worldcoin’s cryptocurrency element serves as a marketing ploy, a global financial infrastructure, and a way to entice continued venture capital funding. (Tools for Humanity, the company behind Worldcoin, just raised $115 million in a funding round and was valued at $3 billion last year. Around 14% of all WLD tokens have been earmarked for investors.) Worldcoin, which Altman says he conceived of in 2019, has for years been entwined with the crypto ecosystem, for better or worse. It was publicly announced in the summer of 2021, when crypto was near its height, and the project’s original backers include Sam Bankman-Fried, the former CEO of FTX, accused of perpetrating an $8 billion fraud, and the collapsed crypto hedge fund Three Arrows Capital.

Sam Altman says he hopes that Worldcoin’s financialized element could lead to universal basic income (UBI) for its users. Altman has long been interested in UBI. When he was the president of startup accelerator Y Combinator, he led a pilot trial that aimed to give $1,500 a month to Oakland families. But the project was significantly delayed and reduced in scale. Elizabeth Rhodes, project director of YC Research—the incubator’s nonprofit arm, which has since spun off—told Wired in 2018 that “it’s harder to give away money than you might think.”

Worldcoin also isn’t the first project to entwine digital identity with universal basic income. Last year, a project called Proof of Humanity attempted something similar—and for a while, was handing out $50 to $100 monthly in crypto to everyone who signed up. But the token’s value cratered along with the larger market.

Santiago Siri, a board member of Proof of Humanity, says that Worldcoin’s cryptographic mechanisms designed to protect identity represent a step forward from his project, which required new users to post a public video to verify their identity. Siri hopes that Altman will leverage Worldcoin and his growing revenue from OpenAI to redistribute wealth. “AI can certainly hold true the promise of delivering a global universal basic income, because it can generate that kind of sustainable wealth throughout time. And because it will displace jobs, they have the ethical obligation to actually support UBI initiatives,” he says.

How Altman plans to pay for Worldcoin’s UBI, however, remains murky. “The hope is that as people want to buy this token, because they believe this is the future, there will be inflows into this economy. New token-buyers is how it gets paid for, effectively,” Altman told Coindesk recently.

While Altman hopes that the price of WLD will go up, its value so far has been volatile. WLD debuted at $7.50 but has since fallen to roughly a third of that, hovering between $2 and $3 over the last week.

Why is Worldcoin facing criticism?

Many critics have called Worldcoin’s business—of scanning eyeballs in exchange for crypto—dystopian and some have compared it to bribery. Many people are understandably hesitant to surrender their biometric data to a private for-profit startup with uncertain aims; even Altman himself acknowledged a “clear ick factor.”

Worldcoin says the biometric information on the orbs is deleted after being processed and converted into cryptographic code. But a history of Silicon Valley companies mishandling data has left a sour taste in people’s mouths, and some fear that the iris scans could be used for surveillance or be sold to third parties. “Don’t use biometrics for anything,” Edward Snowden wrote on Twitter in response to Altman’s post about Worldcoin in 2021. “The human body is not a ticket-punch.”

Meanwhile, Worldcoin’s rollout has been plagued by troubling reports from around the world. An extensive 2022 article from the MIT Technology Review found evidence that the project used deceptive practices to sign people up in countries like Indonesia, Kenya and Chile. Spanish-language speakers, for instance, were given terms of service notices in English, and people in Sudan were enticed with AirPod giveaways without being told what exactly they were scanning their eyeballs for.

A representative for the Worldcoin Foundation responded to the Technology Review article in a statement to TIME, writing: “The article is not representative of the project's operations, included inaccurate information and offered a narrow, personal perspective of the incredibly early stages of the project. It is also not an accurate representation of the project’s global operations today.”

This year, hackers stole the login credentials of Worldcoin operators who are tasked with signing up new users, allowing the hackers to view internal information. An article by the crypto publication BlockBeats alleged that people in Cambodia and Kenya were selling their iris data for as little as $30 a pop to speculators on the black market, who hoped that the WLD they collected from the scans would increase in price.

“I think that there are important questions to be asked about whether or not this opens the path towards artificial intelligence colonialism,” Proof of Humanity’s Siri, who is Argentinian, says. “We have seen the orb being deployed in third-world developing countries whose rules about identity and privacy might not be as strong as they are in the European Union or in the United States.”

Regulators around the world have been watching Worldcoin’s rise closely. The French data protection watchdog announced a probe into the project over its “questionable” data collection and preservation. A U.K. regulator issued a similar warning. And the Kenyan government demanded Worldcoin cease its data collection activities there, writing that the project posed “legitimate regulatory concerns that require urgent action.”

A Worldcoin Foundation representative responded to the news in Kenya in a statement to TIME, writing: “The demand for Worldcoin’s proof of personhood verification services in Kenya has been overwhelming, resulting in tens of thousands of individuals waiting in lines over a three-day period to secure a World ID…Worldcoin remains committed to providing an inclusive, privacy-preserving, decentralized on-ramp to the global digital economy and looks forward to resuming its services in Kenya while working closely with local regulators and other stakeholders.”

When Worldcoin co-founder Alex Blania was asked by Bloomberg News about login theft and black market sales, he dismissed their impact. “Of course there will be fraud,” he said. “It will not be a perfect system, especially early on.”

Eager early adopters recently descended upon a Mexico City cafe where their eyes were scanned by a futuristic sphere, part of an ambitious project that ultimately seeks to create a unique digital identification for everyone on the planet.

Mexico is one of nearly three dozen countries where participants are allowing the sphere, outfitted with cameras and dubbed an orb, to scan their irises. The project's goal is to distinguish people from bots online, while doling out a cryptocurrency bonus as an incentive to participate.

The so-called Worldcoin project is a biometric verification tool led by Sam Altman, the chief executive of OpenAI, and the crypto company he co-founded, Tools for Humanity.

Not directly AI related but the Sam Altman tie-in is interesting. This doesn't sound creepy at all... thoughts?

Aug 8 (Reuters) - Alphabet's (GOOGL.O) Google and Universal Music (UMG.AS) are in talks to license artists' voices and melodies for artificial intelligence-generated songs, Financial Times reported on Tuesday, citing four people familiar with the matter.

The music industry is grappling with "deepfake" songs, made using generative AI, that mimic artists' voices, often without their consent.

The goal behind the talks is to develop a tool for fans to create tracks legitimately and pay the owners of the copyrights for them, the report said, adding that artists would have the choice to opt into the process.

Discussions between Google and Universal Music are at an early stage and no product launch is imminent, while Warner Music (WMG.O) is also in talks with Google about a product, the report added.

The companies did not immediately respond to Reuters' requests for comment.

Since 2018, a dedicated team within Microsoft has attacked machine learning systems to make them safer. But with the public release of new generative AI tools, the field is already evolving.

The Dungeons & Dragons role-playing game franchise says it won’t allow artists to use artificial intelligence technology to draw its cast of sorcerers, druids and other characters and scenery.

Zoom Video Communications, Inc. recently updated its Terms of Service to encompass what some critics are calling a significant invasion of user privacy.

  • CoreWeave, a company that provides cloud services, has secured a $2.3 billion loan using Nvidia chips as collateral.
  • This large loan reflects a growing trend of securing loans with physical assets, especially when banks aren't lending as much.
  • CoreWeave has grown quickly thanks to a boom in AI. It has special access to advanced Nvidia chips, which gives it an advantage over big cloud providers like Microsoft, Amazon, and Google.
  • CoreWeave will use the loan to buy more chips, build data centers, and hire more staff. It's aiming to have 14 data centers in the U.S. by the end of the year.
  • Earlier this year, CoreWeave also raised $421 million in equity, pushing its value to over $2 billion.