this post was submitted on 26 Oct 2024
11 points (100.0% liked)

AI

4197 readers

Artificial intelligence (AI) is intelligence demonstrated by machines, unlike the natural intelligence displayed by humans and animals, which involves consciousness and emotionality. The distinction between the former and the latter categories is often revealed by the acronym chosen.

founded 3 years ago
 

I've been reading about recent research on how the human brain processes and stores memories, and it's fascinating! It seems that our brains compress and store memories in a simplified, low-resolution format rather than as detailed, high-resolution recordings. When we recall these memories, we reconstruct them based on these compressed representations. This process has several advantages, such as efficiency, flexibility, and prioritization of important information.

Given this understanding of human cognition, I can't help but wonder why AI isn't being trained in a similar way. Instead of processing and storing vast amounts of data in high detail, why not develop AI systems that can compress and decompress input like the human brain? This could potentially lead to more efficient learning and memory management in AI, similar to how our brains handle information.

Are there any ongoing efforts in the AI community to explore this approach? What are the challenges and benefits of training AI to mimic this aspect of human memory? I'd love to hear your thoughts!

top 4 comments
[–] [email protected] 9 points 1 month ago* (last edited 1 month ago)

AI does work like that.

With (variational) auto-encoders, it's very explicit.

With shallow convolutional neural networks, it's fun to visualize the trained kernel weights, as they often form abstract, to me dreamlike, representations of the thing being trained for. Although derived through a different method, search for "eigenfaces" as an example of what I mean.

In the recent hype model architecture, attention and transformers, the encoded state can be thought of as a compressed version of its input, though human interpretation of those values is challenging.
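To make the "compressed version of its input" idea concrete, here's a toy sketch (not from any real model; the vectors, query, and dimensions are made up) of attention-style pooling, where a whole sequence of vectors gets squeezed into a single context vector via softmax-weighted averaging:

```python
import math

def softmax(xs):
    # numerically stable softmax: subtract the max before exponentiating
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(seq, query):
    # score each input vector against the query (dot product)
    scores = [sum(q * v for q, v in zip(query, vec)) for vec in seq]
    weights = softmax(scores)
    dim = len(seq[0])
    # the whole sequence is "compressed" into one weighted sum
    return [sum(w * vec[d] for w, vec in zip(weights, seq)) for d in range(dim)]

# three 2-d input vectors collapse into a single 2-d context vector
seq = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
context = attention_pool(seq, query=[1.0, 0.0])
```

Real transformer attention uses learned query/key/value projections and many heads, but the core move is the same: the output is a fixed-size summary weighted toward whatever the query "attends" to.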

[–] [email protected] 7 points 1 month ago (1 children)

That's kinda how neural networks actually function. They don't store massive amounts of raw data but, similar to us, tweak and adjust complex pathways of neurons that kinda just convert an input into a response.

When you ask an LLM a question you are actually getting a list of words based on probabilities, not anything the LLM had to "think about" before responding. During its training, different patterns fed to the AI tweak and balance how and when specific neurons should fire. One way to think about it is that "memories", or data, are stored in how the paths are formed, not actually in the core of the neuron itself.
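A drastically simplified illustration of "a list of words based on probabilities" (this is a toy bigram counter, nowhere near a real LLM, and the corpus is invented): the "knowledge" lives entirely in the counts, and generation is just reading off a probability table.

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# the model's entire "memory": counts of which word follows which
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def next_word_probs(word):
    # turn raw counts into a probability distribution over next words
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(next_word_probs("the"))  # -> {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
```

An LLM replaces the count table with billions of learned weights and conditions on far more context, but the output at each step is still a probability distribution over possible next tokens.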

There are several hundred configurations of artificial neural networks that can mimic different functions of our brains, including memory.

[–] CoderSupreme 0 points 1 month ago* (last edited 1 month ago) (1 children)

Oh, so it’s mostly a side effect, but they are still primarily being trained to predict the next word.

[–] [email protected] 1 points 1 month ago

Not necessarily; sometimes dimensionality reduction (the more common term for what is basically compression) is the explicit goal.

Can be used for outlier detection, similarity search, etc.

During training, you find a projection of the input, for example an image, to a smaller space, and then back to the original image. This is referred to as encoding and decoding. The error function would be a measure of how similar the input and output images are.
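The encode/decode training loop described above can be sketched in a few dozen lines. This is a toy linear autoencoder with hand-derived gradients; all the specifics (dimensions, data, learning rate) are invented for illustration, and a real one would use a framework, nonlinearities, and images instead of 4-number "inputs":

```python
import random

random.seed(0)
IN_DIM, LATENT_DIM = 4, 2   # project 4 values down to 2 and back
lr = 0.02

# encoder and decoder weight matrices, randomly initialized
W_enc = [[random.uniform(-0.5, 0.5) for _ in range(IN_DIM)] for _ in range(LATENT_DIM)]
W_dec = [[random.uniform(-0.5, 0.5) for _ in range(LATENT_DIM)] for _ in range(IN_DIM)]

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def reconstruction_error(x):
    z = matvec(W_enc, x)       # encode: compress to the smaller space
    x_hat = matvec(W_dec, z)   # decode: project back to the original space
    return sum((a - b) ** 2 for a, b in zip(x_hat, x))  # squared error

data = [[1, 0, 1, 0], [0, 1, 0, 1], [1, 1, 1, 1]]
initial = sum(reconstruction_error(x) for x in data)

for _ in range(1000):
    for x in data:
        z = matvec(W_enc, x)
        x_hat = matvec(W_dec, z)
        d_xhat = [2 * (a - b) for a, b in zip(x_hat, x)]   # dL/dx_hat
        # backprop the error through the decoder into the latent code
        d_z = [sum(W_dec[i][j] * d_xhat[i] for i in range(IN_DIM))
               for j in range(LATENT_DIM)]
        for i in range(IN_DIM):                            # decoder update
            for j in range(LATENT_DIM):
                W_dec[i][j] -= lr * d_xhat[i] * z[j]
        for j in range(LATENT_DIM):                        # encoder update
            for k in range(IN_DIM):
                W_enc[j][k] -= lr * d_z[j] * x[k]

final = sum(reconstruction_error(x) for x in data)
print(final < initial)  # reconstructions get closer to the inputs
```

The bottleneck (2 numbers instead of 4) is what forces the network to find a compressed representation; making the latent space too large lets it just copy the input through.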