this post was submitted on 26 Aug 2024
170 points (94.7% liked)

As we all know, AC won the "War of the Currents". The main reason is that AC voltage is easy to step up and down with just a ring of iron and two coils (a transformer), and high voltage lets us transmit power over longer distances with less loss.

Now, the War of the Currents happened around 1900, and our technology has improved a lot since then. We have useful diodes and transistors now, we have microcontrollers and buck/boost converters. We can convert DC voltage efficiently today.
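To make the buck/boost point concrete, here is a minimal sketch of the ideal (lossless) conversion ratios these converters aim for; real converters lose a few percent to switching and conduction losses, and the function names here are just for illustration:

```python
def buck_vout(v_in: float, duty: float) -> float:
    """Ideal buck (step-down) converter: Vout = D * Vin, with 0 <= D <= 1."""
    return duty * v_in


def boost_vout(v_in: float, duty: float) -> float:
    """Ideal boost (step-up) converter: Vout = Vin / (1 - D), with 0 <= D < 1."""
    return v_in / (1.0 - duty)


# Stepping a 400 V DC link down with 50% duty gives 200 V;
# boosting 12 V with 25% duty gives 16 V.
print(buck_vout(400.0, 0.5))   # 200.0
print(boost_vout(12.0, 0.25))  # 16.0
```

The output voltage is set purely by the switching duty cycle, which is why a microcontroller can regulate it.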

Additionally, photovoltaics naturally produces DC. Whereas a traditional generator has an easier time producing AC, a photovoltaic plant has to convert its power to AC, which, if I understand correctly, incurs significant loss.

And then there's the issue of stabilizing the frequency. When you have one big producer (one big hydro-electric dam or coal power plant), then stabilizing the frequency is trivial, because you only have to talk to yourself. When you have 100000 small producers (assume everyone in a bigger area has photovoltaics on their roof), then suddenly stabilizing the frequency becomes more challenging, because everybody has to work in exactly the same rhythm.

I wonder, would it make sense to change our power grid from AC to DC today? I know it would obviously be a lot of work, since every consuming device would have to change what power it accepts from the grid. But in the long run, could it be worth it? Also, what about insular (island) networks? Would it make sense there? Thanks for taking the time to read this, and I'm willing to go into the maths if that's relevant to the discussion.

[–] [email protected] 3 points 3 months ago (1 children)

do you have a source for that?

I know about the buck/boost DC-to-DC converters, but they don't really use AC internally.

[–] [email protected] 13 points 3 months ago (1 children)

Is a square wave not AC? Current is flowing in and out of an inductor 100k times a second.

Could that 100 kHz square wave excite a transformer and produce usable current on the secondary? Absolutely it could, and that's how larger SMPSes (switch-mode power supplies) work.
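The AC content of that square wave can be made concrete with its Fourier series: a symmetric ±V square wave consists of a fundamental of amplitude 4V/π plus smaller odd harmonics, all of which a transformer can couple. A quick numeric sketch (not tied to any specific supply):

```python
import math


def square_wave_harmonic(v_peak: float, n: int) -> float:
    """Amplitude of the n-th harmonic of a symmetric +/-v_peak square wave.
    Only odd harmonics exist: a_n = 4 * v_peak / (n * pi)."""
    if n % 2 == 0:
        return 0.0
    return 4.0 * v_peak / (n * math.pi)


# A +/-100 V, 100 kHz square wave carries most of its energy in a
# 100 kHz sine of roughly 127 V amplitude -- clearly AC.
print(round(square_wave_harmonic(100.0, 1), 1))  # 127.3
```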

If you're looking for a "pure DC to pure DC" converter, that's called a linear regulator, and it's wildly inefficient: it works by varying the conductance of a pass transistor, and the excess voltage is dissipated as heat. They're only really practical at low currents.

[–] [email protected] -1 points 3 months ago (1 children)

It's more pulsed than alternating IMO. It never goes negative, and there isn't a consistent frequency.

[–] [email protected] 6 points 3 months ago

It depends where you measure. If you measure across the inductor, it absolutely goes negative.

The frequency is generally fixed, the duty cycle will vary.
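The "it goes negative across the inductor" point follows from volt-second balance: in steady state the inductor's average voltage over one switching period must be zero, so if it's positive during the on-time it has to be negative during the off-time. A numeric sketch for an ideal buck converter:

```python
def inductor_avg_voltage(v_in: float, v_out: float, duty: float) -> float:
    """Average inductor voltage over one period of an ideal buck converter.
    On-time:  V_L = Vin - Vout
    Off-time: V_L = -Vout
    In steady state (Vout = D * Vin) the average is exactly zero."""
    on_contribution = (v_in - v_out) * duty
    off_contribution = -v_out * (1.0 - duty)
    return on_contribution + off_contribution


# 12 V in at 50% duty -> 6 V out; the inductor sees +6 V, then -6 V.
print(inductor_avg_voltage(12.0, 6.0, 0.5))  # 0.0
```

So measured across the inductor, the waveform swings both polarities every cycle, which is why it can drive a transformer.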

A variable speed drive can be fed with DC. Is the output AC or DC? I know you need a three phase AC motor to wire up to it.

Is audio DC? It doesn't have a fixed frequency. Amplifiers pulse DC and then remove or 'block' the DC offset so speakers see AC.
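The "block the DC offset" step amounts to subtracting the signal's mean; in hardware that's roughly what a series coupling capacitor does. A discrete-time sketch:

```python
def block_dc(samples: list[float]) -> list[float]:
    """Remove the DC offset from a signal by subtracting its mean --
    a crude stand-in for a series coupling capacitor (high-pass filter)."""
    mean = sum(samples) / len(samples)
    return [s - mean for s in samples]


# A unipolar pulse train (0 V / 5 V, never negative) becomes a signal
# that swings both polarities -- AC from the speaker's point of view.
pulses = [0.0, 5.0, 0.0, 5.0]
print(block_dc(pulses))  # [-2.5, 2.5, -2.5, 2.5]
```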

It seems like people in this thread have a very strict definition of AC being a 60Hz sine wave, and everything else must be DC.