Kissaki

joined 2 years ago
[–] Kissaki 1 points 14 minutes ago

IIRC I've had pip fail like that too. Unable to build a lib it included.

[–] Kissaki 1 points 44 minutes ago

Which is a fair point. But it's kinda beside the issue of the split leading to a split in community and content to some degree.

[–] Kissaki 1 points 45 minutes ago

Sounds like API specifications and an implementation of those could establish a path forward?

Certainly a lot of effort, but I assume these replications (with stolen code) at least supported discoverability and certainty about the APIs in use.

How complex is libogc?

[–] Kissaki 1 points 49 minutes ago (3 children)

Exposure may be the incentive but I've seen divergence between split communities. Crossposting is another activity, another barrier. Sure, it may just be two clicks, but if a reasonable number of users feel at home in one of the two, they likely won't look at the other. Especially when much of it is duplication.

[–] Kissaki 5 points 5 hours ago* (last edited 5 hours ago) (1 children)

I could tell you how I would like it to be, but it's probably not what others think. I find most of programmerhumor not funny.

To me, iiiitttt - and the short description fits it to some degree - should be about ridiculous stuff in IT. Be it ridiculous programs, behaviors, or users.

Programmerhumor seems like it could be something different. Or not. Joke languages, silly experiments and demonstrations like extra bad UI, or jokes (opposed to real occurrences).

🤷‍♀️

/edit: Also, as a sidenote: I dislike this community's name - purely because it's neither memorable nor easily writable.

[–] Kissaki 1 points 5 hours ago (5 children)

That, and a split community meaning less content in each.

[–] Kissaki 1 points 6 hours ago

Interactions look very impactful, nice work!

[–] Kissaki 1 points 6 hours ago

Almost! Time for a disk failure!

Do you have backups? :P

[–] Kissaki 6 points 6 hours ago (1 children)

I don't know how much experience you have [in developing games or software], but to me

I expect it to be ready in about four years.

I’m working on a 4X grand strategy game, which is basically at least four games smashed into one.

sounds to me like it'll more realistically take 8 to 12 years, or end up with less than was planned and the original plan cancelled in one way or another.

I'm also skeptical about the shared synergy effects of individually released titles. Trading Game and Colony Management Game sound like very different things. While you can reuse and share some aspects of them, if the gameplay logic is separate, they will diverge and make it harder to keep a shared base, and gameplay-specific stuff is separate anyway. If you go this route, be mindful of this and design for a shared base only to the degree where it makes sense.

As for your listing of plus and minus sides, I find the plus side much more convincing, both in what the points say and in their impact.

It's better to reduce scope and risks, and gain experience. It's better to develop a following. It's better to learn about the whole process and expectations - towards yourself, how it will work out, success, etc.

"I might get sidetracked" will be a thing either way. I guess you mean by working on and improving the smaller titles instead of the summation title of all of them. This assessment is not convincing to me - specifically with the huge scope of the alternative in mind.

I don't know about your current following, but to me, increasing exposure and the number of titles seems like it has a higher chance of increasing your following and satisfying them.

If your mini-games turn out to be bad, rather than seeing that as a downside, take it as a chance to improve and gain feedback, or to cut costs and risks. Without them, you'd have released a huge investment and title that is bad and that nobody wants to play or buy. That sounds much worse to me.

Even with smaller titles released, once concluded, they could be bundled or iterated on into the bigger title you had originally planned. So I don't think it's one or the other, but a much less risky and more productive way of building towards that vision that has many uncertainties.

I don't think it's a very innovative or surprising or novel development or release strategy. It just makes sense.

[–] Kissaki 3 points 18 hours ago

"Press enter to start" - I press numpad enter and it adds another player :P

[–] Kissaki 3 points 18 hours ago* (last edited 18 hours ago)

Y2ROLL on Steam

Planned Release Date: Q2 2025

 

GitHub

Theia IDE is compatible with VS Code APIs and can install and use VS Code extensions. It has additional APIs for customizations not available in VS Code.

Have you tried Theia IDE? Any assessments or experiences to share?

 

Abstract:

When a website is accessed, a connection is made using HTTPS to ensure that it ends with the website owner and that subsequent data traffic is secured. However, no further assurances can be given to a user. It is therefore a matter of trust that the site is secure and treats the information exchanged faithfully. This puts users at risk of interacting with insecure or even fraudulent systems. With the availability of confidential computing, which makes execution contexts secure from external access and remotely attestable, this situation can be fundamentally improved.

In this paper, we propose browser-based site attestation that allows users to validate advanced security properties when accessing a website secured by confidential computing. This includes data handling policies such as the data provided being processed only during the visit and not stored or forwarded. Or informs the user that the accessed site has been audited by a security company and that the audited state is still intact. This is achieved by integrating remote attestation capabilities directly into a commodity browser and enforcing user-managed attestation rules.

Some excerpts:

Such a secured context is encrypted at all times, but is decrypted within the CPU only when the context is about to be executed. Thus, code and data are now also protected from unwanted access during execution. In order to validate that confidential computing applies to a secured context, remote attestation must be performed. During this process, a request is sent to a secured context, which in turn requests an attestation report from a Hardware Root of Trust (HRoT) local to the platform.

We argue that end users could also benefit greatly from the extended guarantees of confidential computing when accessing a secured website. However, there are two main obstacles: First, there is no standardized way for users to detect a secured context and perform remote attestation. Second, if remote attestation is enabled, users must be able to interpret an attestation result to decide whether the remote site is trustworthy.

In this paper, we present site attestation, which takes advantage of confidential computing to improve trust and security when surfing the Web.

7 CONCLUSION

Today, when accessing websites, users have to trust that the remote system is secure, respects data protection laws, and is benevolent. With the availability of confidential computing, remote execution contexts can be secured from external access and become attestable. Site attestation proposes to secure websites through confidential computing and perform remote attestation with trustworthiness policies while surfing the Web, reducing the need to blindly rely on the website’s reputation.
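The attestation round trip in the excerpts (browser sends a request, the secured context obtains a report from a Hardware Root of Trust, the browser checks it against user-managed rules) can be sketched minimally. This is a toy model, not the paper's implementation: the HRoT is simulated with an HMAC key, and all function names here are illustrative.

```python
import hashlib
import hmac
import os

# Toy model: the HRoT is simulated by a shared HMAC key. In real
# confidential computing the report would be signed by hardware and
# verified against the vendor's certificate chain.
HROT_KEY = b"simulated-hardware-root-of-trust-key"


def hrot_sign(measurement: bytes, nonce: bytes) -> bytes:
    """Simulated HRoT: sign the secured context's measurement plus a fresh nonce."""
    return hmac.new(HROT_KEY, measurement + nonce, hashlib.sha256).digest()


def request_attestation(site_measurement: bytes, nonce: bytes):
    """Secured-context side: answer with its measurement and an HRoT report."""
    return site_measurement, hrot_sign(site_measurement, nonce)


def verify_site(site_measurement: bytes, trusted_measurements: set) -> bool:
    """Browser side: send a fresh nonce, then check the report signature
    and the user-managed policy (here: a set of trusted measurements)."""
    nonce = os.urandom(16)  # freshness: prevents replay of old reports
    measurement, report = request_attestation(site_measurement, nonce)
    signature_ok = hmac.compare_digest(report, hrot_sign(measurement, nonce))
    policy_ok = measurement in trusted_measurements
    return signature_ok and policy_ok


audited = hashlib.sha256(b"audited site build v1").digest()
print(verify_site(audited, {audited}))                               # True
print(verify_site(hashlib.sha256(b"tampered").digest(), {audited}))  # False
```

The second call fails not because the report is forged (the simulated HRoT signs whatever runs) but because the measurement is not in the user's policy - mirroring the paper's point that users need interpretable attestation rules, not just raw reports.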

GitHub repo with Nginx, httperf, and Firefox code

14
submitted 2 days ago* (last edited 2 days ago) by Kissaki to c/git
 

For those familiar with Git terminology:

The simplest way to assemble a triangular workflow is to set the branch’s merge key to a different branch name, like so:

[branch "branch"]
    remote = origin
    merge = refs/heads/default

This will result in the branch pullRef as origin/default, but pushRef as origin/branch, as shown in Figure 9.

Working with triangular forks requires a bit more customization than triangular branches because we are dealing with multiple remotes. […]
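For the fork case the excerpt alludes to, a config sketch could look like the following (remote names, branch name, and URLs are hypothetical; the keys `remote.pushDefault`, `branch.<name>.remote`, and `branch.<name>.merge` are real git config):

```ini
[remote "upstream"]
    url = https://example.com/project/project.git
    fetch = +refs/heads/*:refs/remotes/upstream/*
[remote "origin"]
    url = https://example.com/you/project.git
    fetch = +refs/heads/*:refs/remotes/origin/*
[remote]
    pushDefault = origin
[branch "feature"]
    remote = upstream
    merge = refs/heads/main
```

With this, `git pull` on feature integrates upstream's main, while a plain `git push` goes to the fork's feature branch - the triangular flow.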

 

Explicit Assembly References are stand-alone assemblies directly referenced in your project. They are not pulled in through NuGet packages, project references, or the Global Assembly Cache (GAC). These assemblies often represent legacy .NET Framework components, especially those compiled for 32-bit, which are not easily upgraded to modern .NET and may exist outside of package management.

Until now, the Toolbox in the Windows Forms designer only displayed controls sourced from NuGet packages or project references.
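For context, an explicit assembly reference of the kind described above looks roughly like this in a csproj (assembly name and path are made up for illustration):

```xml
<!-- A stand-alone DLL referenced directly by path, outside of
     NuGet, project references, and the GAC. -->
<ItemGroup>
  <Reference Include="LegacyVendorControls">
    <HintPath>..\libs\LegacyVendorControls.dll</HintPath>
  </Reference>
</ItemGroup>
```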

13
submitted 2 weeks ago by Kissaki to c/dotnet
 

This first push resulted in NuGet Restore times being cut in half, which was a reasonable stopping point for our work. However, along the way, we realized that a more extensive rewrite could improve performance by a factor of 5x or more.

Written from the perspective of several team members, this entry provides a deep dive into the internals of NuGet, as well as strategies to identify and address performance issues.

