TheHobbyist

joined 1 year ago
[–] [email protected] 15 points 2 days ago

I think they mean that Signal on desktop does not encrypt its content at rest, which is acknowledged and not an issue they intend to address.

But it seems to have changed recently? I'm only learning this now, as I went looking for a source.

Source: https://candid.technology/signal-encryption-key-flaw-desktop-app-fixed/

[–] [email protected] 84 points 3 days ago (6 children)

I'm with you all the way, really, except that, truly, KDE Plasma and dark mode are the superior choices, obviously :)

[–] [email protected] 25 points 1 week ago

And the Netherlands are 6th! But the hardest part will be reaching that million threshold... We still have a lot of time, but the pace has certainly slowed over the last few weeks compared to the skyrocketing of the early days. I think we will need to spread more awareness of the campaign, perhaps try to reach mainstream media somehow...

[–] [email protected] 8 points 1 week ago

Infomaniak is the largest Swiss cloud provider. They offer multiple services, including domain purchase and management, cloud computing, and more, and they have a good reputation. They also hold a Swiss cloud certification, meaning they are able to host data in Switzerland and manage it from Switzerland. If you trust Switzerland for privacy, I think by extension you can trust them.

[–] [email protected] 20 points 1 week ago (3 children)
  • July 23 - 3.12%
  • August 23 - 3.18%
  • September 23 - 3.02%
  • October 23 - 2.92%
  • November 23 - 3.22%
  • December 23 - 3.82%
  • January 24 - 3.77%
  • February 24 - 4.03%
  • March 24 - 4.05%
  • April 24 - 3.88%
  • May 24 - 3.77%
  • June 24 - 4.05%
  • July 24 - 4.45%
  • August 24 - 4.55%

[–] [email protected] 53 points 1 week ago (2 children)

We had captchas to solve that problem a while ago. Turns out, some people are willing to be paid a miserable wage to solve captchas for bots. How would this be any different? Being human becomes a monetizable service which can simply be rented out to automated systems. No "personhood" check can prevent this.

[–] [email protected] 1 point 1 week ago

Looks kind of average to me. Also, I wonder whether it would not have made more sense to build it on Lunar Lake, given it was announced/released this late?

[–] [email protected] 10 points 1 week ago

I question why anyone would have a monopoly or exclusive right to a topic of conversation :)

But I actually disagree for another, more fundamental reason: US politics do in fact involve and impact the rest of the world. The USA leads NATO and has previously threatened to withdraw from it, as well as from the WHO; it gets involved (sometimes violently) in foreign affairs (Afghanistan). It is a huge market that many foreign companies try to sell into or raise funding from. It hosts the biggest stock exchange (still today, I guess).

The influence of the USA has extended well beyond its borders for quite a while now. It is only normal for people to feel involved despite not being from the USA. The same goes for Russia and China.

[–] [email protected] 13 points 1 week ago* (last edited 1 week ago) (9 children)

The age of DRM means that they can now "unlaunch" the game and force a refund on you while you give up the game. Why? What if someone liked it and wanted to keep playing? Is this an online-only game? This is just sad.

edit: this is a good time to remind people: if you live in the EU, please support the "Stop Killing Games" initiative. It has just passed a third of the required signatures and still has 10 months to go:

https://eci.ec.europa.eu/045/public/#/screen/home

[–] [email protected] 2 points 1 week ago (1 children)

Is there any source I can read to find out more about this?

[–] [email protected] 9 points 1 week ago (3 children)

[...] after confirming the database contains images of Dutch citizens.

How could they confirm this?

[–] [email protected] 5 points 1 week ago (4 children)

In the deep learning community, I know of someone using Parquet for their dataset and annotations. It allows you to select which data you want to retrieve from the dataset and stream only that, nothing else. It is a rather effective method if you have many different annotations for different use cases and want to be able to select only the ones you need for your application.
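
If it helps to make that concrete, here is a minimal sketch of the idea, assuming pyarrow and a hypothetical annotations file with made-up column names:

```python
# Sketch: read only the columns you need from a Parquet dataset.
# Assumes pyarrow (pip install pyarrow); the file name and column
# names are hypothetical stand-ins for a real annotation schema.
import pyarrow.parquet as pq

# Only "image_path" and "bbox" are read from disk; every other
# annotation column is skipped thanks to the columnar layout.
table = pq.read_table("annotations.parquet", columns=["image_path", "bbox"])
print(table.num_rows, table.column_names)
```

The same idea extends to row filtering, so each training job can stream just the subset of annotations it actually needs.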

 

Hi folks,

I'm seeing there are multiple services which externalise the task of "identity provider" (e.g. login with Facebook, Google, or whatnot).

In my case, I am curious about Tailscale, a VPN service which lets one choose an identity provider/SSO among Google, Microsoft, GitHub, Apple, and OIDC.

How can I find out what data is actually communicated to the identity provider? Its task should simply be to decide whether I am who I claim to be, nothing more. But I'm guessing there may be some subtleties.

In the case of Tailscale, would the identity provider know where I'm trying to connect? Or more?

Answers and insights much appreciated! There does not seem to be much information on the topic online.
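
To make the question a bit more concrete: my understanding is that the identity provider hands back a signed ID token whose claims can be inspected, so something like the sketch below (assuming the PyJWT library and a token captured from one's own login flow, e.g. via browser dev tools) should at least show what the provider asserts about me:

```python
# Sketch: inspect the claims inside an OIDC ID token (a JWT).
# Assumes PyJWT (pip install pyjwt); the token string is a placeholder
# for one captured from your own login flow.
import jwt

id_token = "<paste your captured ID token here>"

# Decode WITHOUT verifying the signature -- fine for inspection,
# never for making authentication decisions.
claims = jwt.decode(id_token, options={"verify_signature": False})
print(claims)  # typically: iss, sub, aud, exp, iat, and e.g. email/name
```

That only covers the provider-to-service direction, though; what the provider learns about my Tailscale usage is the part I'm less sure about.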

 

Hi folks, I'm considering setting up an offsite backup server and am seeking recommendations for a smallish form factor PC. Mainly: are there suitable, popular second-hand PCs which meet the following requirements?

  • fits 4x 3.5" HDD
  • Smaller than a regular tower (e.g. mATX or ITX)
  • Equipped with at least a 6th or 7th gen Intel CPU (for power efficiency and transcoding, in case I actually want it to do some transcoding), with video output.
  • Ideally with upgradeable RAM

Do you know of something which meets those specs and is fairly common on the second-hand market?

Thanks!

Edit: I'm looking for a prebuilt system, such as a Dell OptiPlex or similar.

 

Yesterday, there was a livestream scheduled by Louis Rossmann, titled "Addressing futo license drama! Let's see if I get fired...". I was unable to watch it live, and now the stream seems to be gone from YouTube.

Did it air and was later removed? Or did it never happen in the first place?

Here's the link to where it was meant to happen: https://www.youtube.com/watch?v=HTBYMobWQzk

Cheers

Edit: a new video was recently posted at the following link: https://www.youtube.com/watch?v=lCjy2CHP7zU

I do not know if this was the supposedly edited and reuploaded video or if this is unrelated.

 

DeepComputing is preparing a RISC-V based motherboard to be used in existing Framework Laptop 13s!

Some snippets from the Framework blog post (the link to which is provided below):

The DeepComputing RISC-V Mainboard uses a JH7110 processor from StarFive which has four U74 RISC-V cores from SiFive.

This Mainboard is extremely compelling, but we want to be clear that in this generation, it is focused primarily on enabling developers, tinkerers, and hobbyists to start testing and creating on RISC-V.

DeepComputing is also working closely with the teams at Canonical and Red Hat to ensure Linux support is solid through Ubuntu and Fedora.

DeepComputing is demoing an early prototype of this Mainboard in a Framework Laptop 13 at the RISC-V Summit Europe next week.

Announcement: https://frame.work/blog/introducing-a-new-risc-v-mainboard-from-deepcomputing

The upcoming product page (no price/availability yet): https://frame.work/products/deep-computing-risc-v-mainboard

Edit: Adding a link to the announcement by DeepComputing: https://deepcomputing.io/a-risc-v-world-first-independently-developed-risc-v-mainboard-for-a-framework-laptop-from-deepcomputing/

29 points, submitted 5 months ago* (last edited 5 months ago) by [email protected] to c/[email protected]
 

From Simon Willison: "Mistral tweet a link to a 281GB magnet BitTorrent of Mixtral 8x22B—their latest openly licensed model release, significantly larger than their previous best open model Mixtral 8x7B. I’ve not seen anyone get this running yet but it’s likely to perform extremely well, given how good the original Mixtral was."

 

Hi all,

I think around 1 or 2 years ago, I stumbled upon the personal blog of an Asian woman (I think) working at OpenAI. She had numerous extensive, fascinating posts on a dark-themed blog, going into the technical details of language model embeddings and such.

I can no longer find that blog and have no other information to go by. Would anyone possibly know which blog I'm referring to? It would be very much appreciated.

 

Hi folks,

I seem to be having some internet connectivity issues lately and I would like to monitor my access to the internet. I have a homelab and was wondering whether someone has perhaps something like a docker container which pings a custom website every so often and plots a timeline of when the connection was successful and when it was not.

Or perhaps you have another suggestion? I know of dashboards like Grafana, but I don't know whether they can be configured to actually generate that data or whether they rely on a third party to feed it to them. Thanks!
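
For reference, this is roughly the kind of thing I had in mind, as a minimal sketch (the target URL, interval, and log path are arbitrary placeholders; assumes Python 3 with the requests library):

```python
# Minimal connectivity probe: append one timestamped up/down sample
# per interval to a CSV that a dashboard (e.g. Grafana) could plot.
# Assumes `requests`; URL, interval, and file path are placeholders.
import csv
import time
from datetime import datetime

import requests

TARGET = "https://example.com"  # any reliable external endpoint
INTERVAL = 60                   # seconds between probes
LOGFILE = "uptime_log.csv"

with open(LOGFILE, "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        try:
            # Any HTTP response at all counts as "the link is up".
            requests.get(TARGET, timeout=5)
            up = 1
        except requests.RequestException:
            up = 0
        writer.writerow([datetime.now().isoformat(), up])
        f.flush()  # persist each sample immediately
        time.sleep(INTERVAL)
```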

 

Just wanted to share my appreciation of the game.

I grabbed a copy of this game a year ago, taking advantage of a sale, ahead of the massive update. Then I forgot about it and never touched it.

Fast forward a year, and now that I've got a Steam Deck, I decided to dive into the game. I love it. I'm just a few hours in, but I can already say this is among my favorite games: the broad openness of the world, the level of detail, the characters, the interactive dialogues, the items, the strategies, the game mechanics. It's a very involved game. It really is up there. Thank you CDPR for this game and this update.

 

I was exploring the fps and refresh rate slider, and I realized that when setting the framerate limiter to 25, the refresh rate was incorrectly set to 50 Hz on the OLED version, when 75 Hz would be the more appropriate setting: both are integer multiples of 25 fps, but 75 Hz is the highest one the panel offers, for the same reason 30 fps is paired with 90 Hz and not 60 Hz. Anyone else seeing the same behavior? Is there an explanation I'm missing here?

 

Hi folks, I'm looking for a specific YouTube video which I watched around 5 months ago.

The gist of the video: it compared the transcoding performance of an Intel iGPU used natively versus passed through to a VM. From what I recall there was a significant performance hit, around 50% or so (in terms of transcoding fps). I believe the test was performed with Jellyfin. I don't remember whether it was using XCP-ng, Proxmox, or another OS, and I don't remember which channel published the video or when, just that I watched it sometime between April and June this year.

Anyone recall or know what video I'm talking about? Possible keywords include: quicksync, passthrough, sriov, iommu, transcoding, iGPU, encoding.

Thank you in advance!

 

Hi y'all,

I am exploring TrueNAS and configuring some ZFS datasets. As ZFS provides some parameters to fine-tune its setup to the type of data, I thought it would be good to take advantage of them. So here I am with the simple task of choosing the appropriate "record size".

Initially I thought: well, this is simple. The dataset is meant to store videos, movies, and TV shows for a Jellyfin docker container, so generally large files, and a record size of 1M sounds like a good idea (as suggested in Jim Salter's cheatsheet).

Out of curiosity, I ran Wendell's magic command from Level1Techs to get a sense of the file size distribution:

find . -type f -print0 | xargs -0 ls -l \
  | awk '{ n=int(log($5)/log(2)); if (n<10) { n=10; } size[n]++ }
         END { for (i in size) printf("%d %d\n", 2^i, size[i]) }' \
  | sort -n \
  | awk 'function human(x) { x[1]/=1024; if (x[1]>=1024) { x[2]++; human(x) } }
         { a[1]=$1; a[2]=0; human(a); printf("%3d%s: %6d\n", a[1], substr("kMGTEPYZ", a[2]+1, 1), $2) }'

Turns out it's not that simple, as I quickly discovered. The directory is indeed mostly videos, but it also holds many tiny files: subtitles, NFOs, and small illustration images, all valuable for Jellyfin's media organization.

That's where I'm at. The way I see it, there are several options:

    1. Let's not overcomplicate it: just run with the default 128K ZFS dataset recordsize and roll with it. It won't be such a big deal.
    2. Let's try to be clever about it: make 2 datasets, one with a recordsize of 4K for the small files and one with a recordsize of 1M for the videos, then select one as the "main" dataset and use symbolic links for each file in the other dataset so that all content is "visible" from within one file structure (roughly what I imagine is sketched after this list). I haven't dug much into how I would automate it, and it might not play nicely with the *arr suite? Perhaps overly complicated...
    3. Make all video files MKV files, embed the subtitles, and rename the videos to make NFOs as unnecessary as possible for movies and TV shows (though these will still be useful for private videos, YT downloads, etc.)
    4. Other?
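
Here is that rough sketch for option 2, with made-up dataset mountpoints, just to give an idea of the automation:

```python
# Sketch for option 2: expose files from a 4K-recordsize dataset inside
# the 1M-recordsize media tree via symlinks. Both paths are examples.
import os

SMALL = "/mnt/tank/media-small"  # hypothetical recordsize=4K dataset
MAIN = "/mnt/tank/media"         # hypothetical recordsize=1M dataset

for root, _dirs, files in os.walk(SMALL):
    for name in files:
        src = os.path.join(root, name)
        dst = os.path.join(MAIN, os.path.relpath(src, SMALL))
        os.makedirs(os.path.dirname(dst), exist_ok=True)
        if not os.path.lexists(dst):  # don't clobber existing entries
            os.symlink(src, dst)
```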

So what do you think? And how have you personally set it up? I would love to get some feedback, especially if you are also using ZFS and have a video library with a dedicated dataset. Thanks!

Edit: Alright, so I found a post by Jim Salter which goes into more detail regarding record size. It cleared up my misconception: recordsize is not the same as the block size, it is just the maximum size of the chunks in which data is read and written, and it can easily be changed at any time. So I'll be sticking with a 1M recordsize and leaving it at that despite having many smaller files, because what matters most is streaming the larger files effectively. Thank you all!

 

Dave2D, who's been supportive of Framework, preordered the Laptop 16.

He's a bit concerned about the pricing and questions the upgradability of the Laptop 16 specifically.

Personally, I understand his point, but I think upgradability alone is probably not a good reason to buy the Laptop 16. It's always been a package, which includes:

  • repairability
  • modularity
  • support of the movement/mission
  • the versatility of reusing parts for other use cases (e.g. the motherboard as a thin client)
  • a laptop that actually does not have Linux as an afterthought
  • the openness of the expansion card (and hopefully expansion bay) ecosystem
  • and maybe even more?

It's true that the laptop is expensive when you compare it spec for spec, but that was never the reason to buy it either. Do I wish it were cheaper? You bet. But as with all new startups, if it works out, if it scales, prices could come down. Long live Framework!
