this post was submitted on 15 Nov 2024

Futurology

[–] [email protected] 2 points 3 hours ago (1 children)

But you can accelerate neural nets better with a GPU, right? They've got a lot more parallel matrix-multiplication compute than any NPU you can slap on a CPU.

[–] [email protected] 1 point 14 minutes ago

It all depends on the GPU. If it's something integrated into the CPU, it probably won't do much better; if it's a $2000 dedicated GPU with 48 GB of VRAM, it will be very powerful for neural net computing. NPUs are most often implemented as small, low-power, embedded solutions. Their goal is not to compete with data centers or workstations; it's to enable some basic "AI" features on portable devices, e.g. a "smart" camera with object recognition that gives you alerts.
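
For a rough feel of that gap, here's a minimal sketch (assuming PyTorch is installed; the matrix size and loop count are just illustrative) that times a batch of large matrix multiplications on whatever accelerator is available. A dedicated GPU will typically finish this far faster than the CPU fallback; NPUs are usually reached through vendor runtimes rather than torch.cuda, so they aren't covered here.

```python
import time
import torch

# Prefer a CUDA GPU if one is present, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A 4096x4096 float32 matmul is a reasonable stand-in for a dense NN layer.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)

start = time.perf_counter()
for _ in range(10):
    c = a @ b
if device.type == "cuda":
    torch.cuda.synchronize()  # wait for the GPU to actually finish its queued work
elapsed = time.perf_counter() - start

print(f"device={device}, 10 matmuls took {elapsed:.3f}s")
```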