this post was submitted on 17 Oct 2023
Programming
In most cases, an IPC mechanism would be better. You open up named pipes (which can carry anything, so why not JSONL) for two-way communication and use that. Or shared memory of some kind. Only one of the processes would actually write the file to disk once in a while, to avoid on-disk corruption and to let the saving process run checks and corrections.
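As a rough illustration of the named-pipe idea, here is a minimal one-way sketch in Python (the pipe path, the event payloads, and the single reader/writer pair are assumptions for the demo; a real setup would use two pipes for two-way traffic):

```python
import json
import os
import tempfile
import threading

# Create a named pipe (FIFO) in a temp directory. POSIX only.
fifo_path = os.path.join(tempfile.mkdtemp(), "events.fifo")
os.mkfifo(fifo_path)

def writer():
    # Opening a FIFO for writing blocks until a reader opens the other end.
    with open(fifo_path, "w") as pipe:
        for event in ({"event": "started"}, {"event": "progress", "pct": 50}):
            pipe.write(json.dumps(event) + "\n")  # one JSON object per line

t = threading.Thread(target=writer)
t.start()

received = []
with open(fifo_path) as pipe:  # blocks until the writer connects
    for line in pipe:          # hits EOF when the writer closes its end
        received.append(json.loads(line))
t.join()

print(received)
```

Opening each end of a FIFO blocks until the other side connects, which is why the writer runs in its own thread in this sketch.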
But if it's really needed, here are some thoughts: theoretically, you could write JSONL (JSON Lines), because that's simple and human-readable for debugging. If process A wants to append a new line (a new event, for example), it has to check that the file ends with "\n" and write-append the new line in one go. Process B gets the filesystem event that the file changed and reads it. It would be better, of course, to create a simple 0-byte ".lock" file while the writing is going on and remove it afterwards. To avoid corruption, you could also add checksums and the like. Depending on your use case, that can work. I mean, I sometimes open one file in two editors, and of course you can lose modifications you made while one editor reloads the file changed by the other.
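The append-with-lock-file idea could be sketched like this (the 0-byte lock file, the per-record checksum field, and the file names are illustrative assumptions, not a robust protocol; for anything serious, real file locking such as `fcntl.flock` would be safer):

```python
import hashlib
import json
import os
import tempfile

data_path = os.path.join(tempfile.mkdtemp(), "events.jsonl")
lock_path = data_path + ".lock"  # 0-byte lock file, per the idea above

def append_event(event):
    # O_CREAT | O_EXCL makes lock creation atomic: it fails if the
    # lock file already exists, i.e. another process is mid-write.
    fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    os.close(fd)
    try:
        line = json.dumps(event)
        # Store a checksum alongside the payload so readers can detect
        # corruption (e.g. a half-written line).
        record = {"data": line,
                  "sha256": hashlib.sha256(line.encode()).hexdigest()}
        with open(data_path, "a") as f:
            f.write(json.dumps(record) + "\n")  # append one full line in one go
    finally:
        os.remove(lock_path)  # release the lock

def read_events():
    events = []
    with open(data_path) as f:
        for raw in f:
            record = json.loads(raw)
            # Skip any record whose checksum doesn't match.
            if hashlib.sha256(record["data"].encode()).hexdigest() == record["sha256"]:
                events.append(json.loads(record["data"]))
    return events

append_event({"event": "saved", "n": 1})
print(read_events())
```

The atomic `O_CREAT | O_EXCL` open avoids the check-then-create race a plain `os.path.exists` test would have, but this is still only a cooperative convention between the two processes.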