Redkey

joined 1 year ago
[–] Redkey 2 points 2 days ago* (last edited 2 days ago)

I apologize; between OP's post and a look at the OnlyOffice website, I got the impression that it was only a web app that required a web server to run. After reading another comment here I looked harder at the website and found the download links for the standalone versions.

[–] Redkey 4 points 3 days ago* (last edited 2 days ago) (2 children)

Where are these conversations happening? I could see a lot of enterprise-focused groups potentially getting behind OnlyOffice, but individual home users? Not so much.

EDIT: My mistake! I didn't realize that there are standalone versions of OnlyOffice in addition to the web app version.

[–] Redkey 1 points 2 weeks ago

That's kind of the bare bones of how it works, underneath all the abstraction layers and pretty GUIs.

Then it evolves.

First, you start splitting your code into multiple source files, either because your programs get too big to keep scrolling up and down one huge file to cross-check things, or because you want to incorporate someone else's code into your program, and it's more than just one or two functions you can easily copy and paste. You can still keep compiling and linking all of this in one step, but the command gets so long that you make a shell script/batch file as a shortcut.
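For example, the "one big command in a shell script" stage might look something like this (the file names here are made up, and the exact compiler and flags will depend on your setup):

```
#!/bin/sh
# build.sh - compile and link everything in one step
gcc -Wall -O2 -o game main.c graphics.c input.c sound.c
```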

After that, you might want to mix-and-match various source files to target different platforms, or to make other bulk changes, and you start going down the rabbit hole of having your shell script take arguments, rather than having a dozen different scripts. And then one day you take another look at "make" and realize that whereas before it seemed like impenetrable overengineering, it now makes complete and obvious sense to you.

Then you discover using "make" (or a similar utility) to split compilation and linking into separate steps, which used to seem nonsensical; but now you're dealing with codebases that take more than a couple of seconds to compile, or with precompiled libraries or DLLs, and you get comfortable with the idea of hanging on to compiled object files and (re)using them when the source for that part of the program hasn't changed.
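To make that concrete, the split looks roughly like this at the command line (hypothetical file names again; "make" essentially automates these steps and skips the ones whose inputs haven't changed):

```
# Compile each source file to an object file, without linking.
gcc -Wall -c main.c       # produces main.o
gcc -Wall -c graphics.c   # produces graphics.o

# Link the object files into the final executable.
gcc -Wall -o game main.o graphics.o

# Later, if only graphics.c has changed, recompile just that file and
# relink; main.o is reused as-is.
gcc -Wall -c graphics.c
gcc -Wall -o game main.o graphics.o
```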

And finally (maybe) you look at some of the crazy stuff in fancy IDEs and understand why it's there: it's just a representation of all this other stuff that you now know about and feel competent with. I say "maybe" because I've been programming for over 35 years, occasionally professionally but mostly as a hobbyist, and there are still things in IDEs that I either don't understand or don't see the point of. But knowing the underlying principles makes me feel comfortable enough to ignore them.

[–] Redkey 9 points 2 weeks ago (1 children)

I hadn't heard of Kate before, so I can't offer much hands-on advice. I dug around and found a "handbook" here: https://docs.kde.org/stable5/en/kate/kate/index.html

Unfortunately it does look like you need to define a project to compile/run anything, which appears to require manually creating a .kateproject file in the directory as outlined here: https://docs.kde.org/stable5/en/kate/kate/kate-application-plugin-projects.html#project-create

I had exactly the same problem when I moved from languages that were interpreted, or that combined the IDE and runtime environment into one, to languages with their own external compiler. Unfortunately, open source project user documentation is often terrible for beginners (what I found above for Kate seems to be no exception), and IDEs often seem to be written by people who don't really expect anyone to actually use the included build options (to be fair, most folks seem to like using their own separate build utilities, so this is probably often the case).

If you can tell us which compiler or interpreter you're using (e.g. gcc, clang, Python), someone can probably tell you how to compile and/or run a single-file program from the terminal with a fairly simple command.
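For example, a single-file C program built with gcc usually comes down to something like this (file names are placeholders, and the exact flags depend on your setup):

```
gcc -Wall -o hello hello.c   # compile and link into an executable called "hello"
./hello                      # run it

# An interpreted language like Python skips the compile step entirely:
python3 hello.py
```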

[–] Redkey 3 points 3 weeks ago

Original webcomic brought to you by K009 @ tumblr.

[–] Redkey 4 points 1 month ago

In communities for the Murderbot Diaries series of books, I sometimes see this game mentioned as a good fit for the feel of that universe. What I've seen in clips of playthroughs bears that out; I bought the game a while ago but haven't gotten around to actually installing it yet.

Anyway, I just wanted to shout out the Murderbot series as something that folks may be interested in if they enjoyed this game's world and are looking for something to read.

[–] Redkey 1 points 1 month ago* (last edited 1 month ago) (1 children)

If you or anyone else is interested in playing more, I recommend:

  • Silent Hill 2: Restless Dreams (aka Director's Cut). Not a continuation of the story of the first game, but a separate story in the same universe. Generally agreed to take everything good from the first game and improve upon it. The "Restless Dreams" version has a substantial extra scenario which adds some backstory and lore, but should probably be played only after completing the main game.
  • Silent Hill 3. This one does continue the story of the first game, somewhat. To be honest, I remember enjoying it, but not many of the particulars.
  • Silent Hill 4: The Room. Started out as a separate game unrelated to the Silent Hill mythos, but was rewritten to become an SH game during development. This sounds like it might be a terrible cash-in, but it really is a perfect fit for the SH universe. IMO almost as strong as SH2.
  • Silent Hill: 0rigins. A PSP game set as a prequel to the first game. A little light on story, and with some odd combat mechanics, but I still found it very enjoyable. I played the later PS2 port.
  • Silent Hill: Shattered Memories. A "reimagining" of the story from the first game. It plays and feels very different to the previous games, but I still enjoyed it quite a lot.
  • Silent Hill: Orphan. A series of point-and-click adventures for Java-enabled mobile phones from the 2000s. Totally different mechanics from the mainline games, but they do the atmosphere and story well if you don't mind the slower pace of point-and-click. They run on some J2ME emulators.
  • Silent Hill: Alchemilla. A free fan-game centered on the Alchemilla Hospital, but also including several other locations. First-person view with many puzzles and no combat. Very polished and really nails the atmosphere.

I played a little of Silent Hill: Homecoming but got tired of it about 1/3 of the way through (I guess). I also bought Silent Hill: Downpour but gave up on that even more quickly. I don't recommend either of them. Things introduced in the earlier games for specific psychological reasons related to the plot - especially sexy monster nurses and Pyramid Head - tend to be regurgitated in the later games for no real reason other than "Silent Hill", which removes their impact completely.

[–] Redkey 4 points 1 month ago (1 children)

I think I was kinda in the same boat as you.

In theory, I loved the fact that if you wanted to check, the game would tell you when you theoretically had enough information to identify one of the crew or passengers, so you knew where to focus your thinking. But I got stuck on some characters whose identities seemed to me to be implied or hinted at, but for whom I didn't think I had positive proof.

I eventually got tired of reviewing the same scenes over and over, looking for some detail I had overlooked, and read a walkthrough to find out what I was missing. It turned out that I hadn't missed anything, and "an educated guess" was the standard the game expected, not "definitive proof". But I was burnt out on the game by that point and stopped playing.

[–] Redkey 7 points 2 months ago (1 children)

Despite this, I still bet that they post "nvm fixed it" an hour or two later.

[–] Redkey 4 points 3 months ago* (last edited 3 months ago)

The way I see it, there are two separate issues for discussion here.

The first is permanently altering a classic console. That's an issue of historical preservation, and I'm not going to get into that.

The second issue is whether, once you're prepared to go as far as removing the original optical drive, you might as well drop the console entirely and go the emulation route. To me, suggesting this shows a lack of understanding of how emulation works.

A real console consists of IC semiconductors and discrete components that propagate electrical fields and shuffle the occasional electron around. A software emulator is a bag of rules and tricks that tries to replicate the overall output of a console. Even FPGA-based emulators aren't 100% perfect, because their gates and connections aren't configured identically to the original hardware.

Game consoles are very complex systems that operate via the interplay of dozens of intricate subsystems. That's why emulators start off supporting only a handful of games, and rarely reach 100% compatibility. Emulator developers are forever picking the next emulation inconsistency from the bug report list, tracking down what their emulator is doing differently to the original hardware, and then adding a new rule for dealing with that particular case. If they're lucky, a couple of other games will also start working better. If they're unlucky, a couple of other games will start working worse.

(For the interested, the author of BSNES wrote a detailed article about these issues for Ars Technica.)

Take the Atari 2600. It's a very old console that was very popular. The community has full schematics not just for the mainboard, but even the CPU and custom video chip. More patient people than me have sat for hours with oscilloscopes and test ROMs to probe the console inside and out. There are emulators that can play every game that was released back in the day without fault. Heck, the emulator I use is so advanced that you can set it to emulate specific revisions of the console with specific CRT TV parameters, and it will glitch in the same way that the game would glitch on that combination of hardware in real life. But it's still not a "perfect" emulation! Homebrew developers are still finding quirks in the real 2600 hardware that the emulators don't replicate, at least until the next update.

I have a PS2 which plays my games from an internal hard drive, and which has its output fed through an HDMI converter. Why don't I just emulate it? Well, if you want to play FFX, or MGS2, or Ratchet & Clank, that'll work great. Those are popular games, and emulator developers have put a lot of effort into making sure that the rules of their emulation work for those games. But I have dozens of more obscure games that have game-breaking glitches or don't launch at all under emulation. I also still have hundreds of discs that I don't want to paw through, and that are slowly degrading until one day they'll no longer work, as well as an optical drive that gets a little closer to wearing out for good every time I use it, and a big, modern TV that hates analog inputs (not to mention no room for a bulky CRT).

Getting the data into the console, and getting the final video and audio out, are both fairly well understood and can usually be reimplemented reliably. But the heart of the console, where the data is turned into executing code, mixed with player input, and transformed into the output? That's where the actual magic happens.

In my opinion, saying that if you're going to replace an optical drive then you may as well just emulate the whole thing is a bit like saying that if you're going to talk to Angela over the phone instead of in person, then you may as well just replace her with a well-trained AI chatbot.

[–] Redkey 4 points 3 months ago (1 children)

Why bother? Because feeding data into the console and getting audio-visual signals out of it are both very well understood and can actually be replicated with essentially total accuracy. But the complex operations and subtle interactions of CPU, VDUs, RAM, and other support chips can't. That's the important part of the console, not the optical drive or the analog video output.

Software emulators and FPGA-based systems give it a good try, and can often run the majority of software for a console at an acceptable fidelity for most users, but they're a long, long way from being 1:1 perfect, and the more recent the console, the more games either don't run properly or don't run at all.

[–] Redkey -1 points 3 months ago

Oh my goodness, replacing the optical drive with a modern solution isn't close to halfway to complete software emulation. It's not even 20% of the way there.
