The biggest issue would be the microchips, which require some really precise machinery to manufacture.
1930s - complete reverse engineering
By then they had an understanding of both semiconductors and computational theory. Using semiconductive materials to compute wasn't yet a thing, but the concept wouldn't come as much of a surprise. Some kind of reproduction is likely - probably not a 5 nm manufacturing process like modern chip factories, but they could make something.
1890s - eventual understanding, but not able to manufacture
Measuring devices were sensitive enough by then to detect tiny electrical fluctuations. They would be able to tell that the device works by processing electrical signals, and even capture those signals. The biggest missing piece is the mathematical theory - they wouldn't immediately understand how those electrical signals produce images and results. Reproduction - no. Maybe they would get an idea of what's needed - refining silicon and introducing other materials into it - but there's no way they could do it with the equipment of the day.
1830s - electricity goes into a tiny box and does calculations, wow!
This is the age of the first great electrical discoveries. They would be in awe of what is possible and understand, on a high level, how it's supposed to work. Absolutely no way to make it themselves.
1730s - magic, burn the witch!
Sir Bedevere: And what do you burn, apart from witches?
Peasant: More witches!
The novel ways we've come up with to make processors and circuit boards over the past 40 years have been pretty amazing. I believe you're giving people of the 1930s too much credit here. For instance, the entire industry has known for the past 40+ years that making chips smaller with more transistors yields better performance. It's taken this long, and a lot of manufacturing "tricks", to get down to what we have today. Same thing for RAM and hard drives.
And the code that makes it all run would be completely unreadable, much less the code for things that wouldn't have been named, created, or even thought of yet. Nor would they know how to program or read anything off the solid-state drive or the RAM.
The first "digital computer" was made in 1945. You would bump that up a bit sooner by giving them a laptop in the 1930s, but most things since then have been just trying to refine the manufacturing process. They wouldn't be able to recreate the laptop at all. Not even in the 1980s would they be able to create it.
I looked it up: the first SEM was built in 1937, and commercial ones arrived in 1965.
10,000 B.C. - UgA BuGa!