
Hi,

My question certainly stems from the imposter syndrome I'm living with right now for no good reason. When looking for solutions to embedded C problems, I come across a lot of posts from people who have a deep understanding of the language and of how an MCU works at the machine-code level.

When I read these posts, I do understand what the author is saying, but it really makes me feel like I should know more about what's happening under the hood.

So my question is this: how do you rate yourself in your most-used language? Do you understand its subtleties and nuances?

I know this doesn't necessarily make me a bad firmware dev, but damn does it make me feel like one when I read these posts.

I get that this is a subjective question without one right answer, but I'd be interested in hearing about different experiences in the hope of easing my imposter syndrome.

Thanks


> Thanks for the input

You're welcome!

> I haven't used assembly in a long while; I know where to look to understand all the instructions, but I can't tell right off the bat what a chunk of assembly code does.

Oh, me neither. And that's not what I think is necessary; what's important is that you can generally imagine the sorts of operations going on under the hood for any given line of code. There's no magic "generate a hash for a string" CPU operation; ultimately, something is going to iterate over a series of memory locations and perform a few math operations on each to produce a numeric output. I think this awareness is enormously valuable in developers, and it helps them think about the code they're writing in a way that usually improves it.
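For instance, here's a minimal sketch of that idea in C, using the well-known djb2 string hash (just one common choice picked for illustration; the comment doesn't refer to any specific hash):

```c
#include <stdint.h>

/* There is no single "hash a string" CPU instruction: the machine just
 * walks a series of memory locations and does a little arithmetic on
 * each byte. This is the classic djb2 hash, picked only as an example. */
uint32_t djb2_hash(const char *s)
{
    uint32_t hash = 5381;                /* arbitrary starting constant  */
    for (; *s != '\0'; s++)              /* iterate over memory, by byte */
        hash = hash * 33 + (uint8_t)*s;  /* a multiply and an add each   */
    return hash;
}
```

Compile something like that with `-S` sometime and you'll see exactly that picture: loads, multiplies, adds, and a branch.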

> Algorithms, I am terrible at these because I rarely use them.

You use them all the time! Anything longer than a single operation is an algorithm.

Nobody is going to ask you to write a search function; however, being aware of Big-O notation, and being able to reason about time and space complexity, is important. On the backend, it's critical. It's important even if you're a front-end developer - I blame the whole NodeJS library fiasco on too little awareness of dependency complexity among the majority of JS developers.
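To make the complexity point concrete, here's a toy C sketch (my own example, not one from the thread): checking for duplicates by comparing every pair is O(n²), which is harmless for a handful of sensor readings but ruinous for a million log entries.

```c
#include <stdio.h>

/* Pairwise duplicate check: roughly n*n/2 comparisons, i.e. O(n^2).
 * Sorting the array first and scanning neighbours would be O(n log n). */
static int has_duplicate(const int *a, size_t n)
{
    for (size_t i = 0; i < n; i++)          /* n outer iterations...     */
        for (size_t j = i + 1; j < n; j++)  /* ...times up to n more     */
            if (a[i] == a[j])
                return 1;
    return 0;
}

int main(void)
{
    int readings[] = { 3, 7, 1, 9, 7 };
    printf("duplicate: %s\n",
           has_duplicate(readings, sizeof readings / sizeof readings[0])
               ? "yes" : "no");
    return 0;
}
```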

> I tend to work with finite state machines, which are close to algorithms, but not quite the same thing.

I'd absolutely call FSM work "algorithms", and it sounds as if the projects you're working on are where these fundamentals matter most. Interfaces between hardware components? That's one of the most fraught topics in CIS! So. Many. Pitfalls. Shit, you probably have to worry about clock speeds and communication skew; there's absolutely a huge corpus of material about algorithms for handling the stuff you're working with, like vector clocks. That's a fabulous, interesting field. It's also super tedious and requires a huge attention to detail which I lack, so in a way I envy you, but I'm also glad I'm not you.
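To show how FSM work and "algorithms" overlap, here's a minimal switch-based state machine in C that parses a made-up serial frame format (0x7E start byte, a length byte, then payload - purely hypothetical, not anything described in this thread):

```c
#include <stdint.h>
#include <stdio.h>

/* A minimal switch-based FSM sketch in the spirit of firmware work.
 * The frame format is invented for illustration only. */
typedef enum { WAIT_START, READ_LEN, READ_PAYLOAD } state_t;

typedef struct {
    state_t state;
    uint8_t len;      /* expected payload length   */
    uint8_t received; /* payload bytes seen so far */
} parser_t;

/* Feed one byte; returns 1 when a full frame has been consumed. */
int parser_feed(parser_t *p, uint8_t byte)
{
    switch (p->state) {
    case WAIT_START:
        if (byte == 0x7E)                 /* hypothetical start marker */
            p->state = READ_LEN;
        break;
    case READ_LEN:
        p->len = byte;
        p->received = 0;
        p->state = (byte == 0) ? WAIT_START : READ_PAYLOAD;
        return (byte == 0);               /* zero-length frame is done */
    case READ_PAYLOAD:
        if (++p->received == p->len) {
            p->state = WAIT_START;        /* frame complete; reset */
            return 1;
        }
        break;
    }
    return 0;
}

int main(void)
{
    parser_t p = { WAIT_START, 0, 0 };
    const uint8_t stream[] = { 0x7E, 0x02, 0xAB, 0xCD };
    for (size_t i = 0; i < sizeof stream; i++)
        if (parser_feed(&p, stream[i]))
            puts("frame received");
    return 0;
}
```

The switch-per-state shape keeps every transition explicit, which is exactly the attention-to-detail part of the job.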