this post was submitted on 16 Mar 2024
136 points (100.0% liked)

Space

[โ€“] [email protected] 29 points 6 months ago (1 children)

What's really cool is they wanted to inspect the FDS to see if any part of it was corrupted, and it was sending a whole damned readout back to us the entire time. No one could figure that out until now, though.

[โ€“] [email protected] 18 points 6 months ago* (last edited 6 months ago) (3 children)

Right! I wonder how the probe sent an entire memory dump back without them realizing. Was it programmed to do that when a system failed or something?

[โ€“] [email protected] 24 points 6 months ago

That person who enabled the debug flag on their last command is shitting their pants at the moment

[โ€“] [email protected] 15 points 6 months ago* (last edited 6 months ago) (2 children)

Good question. Makes me wonder if it's part of a system debug programmed into it that was later forgotten or something. The guy who put it in could be long gone and never have documented it?

[โ€“] [email protected] 14 points 6 months ago (2 children)

It's very well documented, just 4-8 documentation systems ago and never migrated because no one thought it was important.

[โ€“] [email protected] 12 points 6 months ago (2 children)

It's insane to me how many government agencies simply forget about things because nobody thought a certain file or document was important enough to update. The only ways to access the information are to find the person who wrote it, find where it's stored and dig through millions of unrelated files, or spend a ton of money to reverse engineer the thing you once made.

Just look at the US nuclear arsenal. Some of the warheads they began updating back in the day no longer had any documentation because of how many times the files changed hands. Things got lost. People moved to other projects or left the line of work altogether. There was no way to recover the full process for making another one, so they threw money at it until they figured out how to make it again.

How many files have accidentally fallen into a box that got shredded? How many times has something been lost to the entirety of Mankind because it fell behind a shelf (and who wants to spend the afternoon moving the entire shelf for a single file)?

[โ€“] [email protected] 10 points 6 months ago

> to find the person who wrote it

Somebody who was twenty years old when Voyager 1 launched is now 67. Even the junior members of the team are retired now. The senior members are way beyond the average life expectancy.

[โ€“] [email protected] 6 points 6 months ago

It was a completely different world back then. I'm not justifying it, but it was the way it was.

I grew up in the 80s.

I used to draw a lot, make comics, recreate the covers of my favorite music albums, etc. I also liked to record whatever I thought was funny on cassette tapes.

Back then, I didn't have the mindset like "I should archive this. Who knows when I will need it!"

I kept my Commodore VIC-20 games and programs, stuff I wrote, on cassette tapes too. I had a notebook detailing my projects, etc. Again, no "let's back this up or store it for years to come."

When it was time to dispose of things, you just... did. Or reused the cassettes, or the notebooks, or whatever.

Granted, my use case is way different from that of a government with nuclear warheads. But yup. Different time, different world, different mindsets.

[โ€“] [email protected] 4 points 6 months ago (1 children)

Is that true, or are you making a joke? Because the documentation is probably a big binder of paper.

[โ€“] [email protected] 1 points 6 months ago

It was a joke; it's how I tend to find most documentation at work, no matter where I'm working.

[โ€“] [email protected] 8 points 6 months ago

Could also be that the thing had a buffer overflow kind of fault. Instead of just sending its intended buffer, the check for the end has broken and it's continuously sending the entire contents of its memory.
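
As a purely hypothetical sketch (made-up names, not actual Voyager/FDS code), a broken end-of-buffer check could behave something like this in C: instead of stopping at the end of the telemetry buffer, the transmit loop keeps walking through whatever sits after it in memory.

```c
/* Hypothetical illustration only -- not actual Voyager/FDS code.
 * Shows how a corrupted end-of-buffer value could make a system
 * transmit everything in memory instead of just its telemetry buffer. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

#define TLM_LEN  16u                /* intended telemetry buffer length   */
#define MEM_SIZE 64u                /* pretend total memory size          */

static uint8_t memory[MEM_SIZE];    /* telemetry buffer lives at offset 0 */

/* stand-in for the downlink: just print each byte */
static void radio_send_byte(uint8_t b) { printf("%02X ", (unsigned)b); }

/* 'end' is normally TLM_LEN; if the stored value is corrupted, the loop
 * walks right past the buffer and dumps whatever follows it. (Capped at
 * MEM_SIZE here only to keep the demo itself within bounds.) */
static void send_telemetry(size_t end)
{
    for (size_t i = 0; i < end && i < MEM_SIZE; i++)
        radio_send_byte(memory[i]);
    printf("\n");
}

int main(void)
{
    for (size_t i = 0; i < MEM_SIZE; i++)
        memory[i] = (uint8_t)i;     /* fill memory with recognizable data */

    send_telemetry(TLM_LEN);        /* healthy: 16 bytes of telemetry     */
    send_telemetry(MEM_SIZE);       /* corrupted end value: whole memory  */
    return 0;
}
```

Strictly speaking that's a buffer over-read rather than an overflow, but the visible symptom would be the same: the downlink carries far more than the intended buffer.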

[โ€“] [email protected] 8 points 6 months ago

There's a little bit more context (although not a lot) at the NASA blog, which seems to be the source for this article. Basically, it looks like they instructed it to go to different memory addresses and run whatever code was there in order to try to bypass any corrupted sections. One result was this memory dump. The reason they didn't immediately identify it was that it wasn't formatted the way normal telemetry is.
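
For a rough picture of why a raw readout might not be spotted right away, here's a hypothetical sketch (the sync word and frame layout are illustrative assumptions, not Voyager's actual downlink format): ground software that looks for a fixed frame structure would classify anything else as unrecognized rather than as a memory dump.

```c
/* Hypothetical sketch -- not the actual Voyager ground software.
 * A decoder expecting a fixed sync marker would flag a raw memory
 * readout as "unrecognized" instead of identifying it as a dump. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

#define SYNC_MARKER 0x1ACFFC1Du   /* example sync word, assumed for illustration */

typedef enum { FRAME_TELEMETRY, FRAME_UNRECOGNIZED } frame_kind;

static frame_kind classify_frame(const uint8_t *buf, size_t len)
{
    if (len < 4)
        return FRAME_UNRECOGNIZED;

    uint32_t sync = ((uint32_t)buf[0] << 24) | ((uint32_t)buf[1] << 16) |
                    ((uint32_t)buf[2] << 8)  |  (uint32_t)buf[3];

    /* Normal telemetry starts with the expected sync marker; a raw dump
     * of FDS memory generally would not, so it lands here as
     * "unrecognized" until someone inspects the bits by hand. */
    return (sync == SYNC_MARKER) ? FRAME_TELEMETRY : FRAME_UNRECOGNIZED;
}

int main(void)
{
    uint8_t normal[]   = { 0x1A, 0xCF, 0xFC, 0x1D, 0x00, 0x01 };
    uint8_t raw_dump[] = { 0x42, 0x17, 0x99, 0x03, 0xAB, 0xCD };

    printf("normal frame:   %s\n",
           classify_frame(normal, sizeof normal) == FRAME_TELEMETRY
               ? "telemetry" : "unrecognized");
    printf("raw dump frame: %s\n",
           classify_frame(raw_dump, sizeof raw_dump) == FRAME_TELEMETRY
               ? "telemetry" : "unrecognized");
    return 0;
}
```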