this post was submitted on 24 Jan 2024
890 points (98.8% liked)

Who would've thought? This isn’t going to fly with the EU.

Article 5.3 of the Digital Markets Act (DMA): "The gatekeeper shall not prevent business users from offering the same products or services to end users through third-party online intermediation services or through their own direct online sales channel at prices or conditions that are different from those offered through the online intermediation services of the gatekeeper."

Friendly reminder that you can sideload apps without jailbreaking or paying for a dev account using TrollStore, which exploits CoreTrust bugs to bypass/spoof app signature validation, on an iPhone XR or newer running iOS 14.0 through 16.6.1 (any iOS version on the iPhone X and older).

Install guide: Trollstore

[–] [email protected] 1 points 9 months ago (1 children)

The audit is not for you. Closed-source software is audited all the time, but the results of those audits are generally confidential. This is about finding security bugs, not deliberate backdoors.

The key question is who you trust. Sure, open source can be audited by everyone, but is it? You can't audit all the code you use yourself; even if you have the skills, it's simply too much. So you still need to trust another person or company, which doesn't change the equation much.

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago) (1 children)

In practice, most common open source software is used and contributed to by hundreds of people. So it naturally does get audited by that process. Closed-source software can't be confirmed not to be malicious, and therefore can't be confirmed to be secure; so, back to my original point, it can't be private.

I didn't go into that much detail in my original comment, but it is what I meant when I first wrote it. As far as "does everyone audit the software they use" goes, the answer is obviously no. But the software I use is mostly FOSS, contributed to by dozens of users, sometimes including myself. So when alarms are rung over even the smallest things, you get a better idea of the attack vectors and privacy implications.

[–] [email protected] 1 points 9 months ago (1 children)

In practice, most common open source software is used and contributed to by hundreds of people. So it naturally does get audited by that process.

Just working on software is not the same as actively looking for exploits. Software security auditing requires a specialised skill set. Open source also makes it easier for black-hat hackers to find exploits.

Hundreds of people working on something is a double-edged sword: it also makes it easier for someone to sneak in an exploit. A single-character mistake can cause an exploitable bug, and if you are intent on introducing such an issue deliberately, it can be very hard to spot; even if it is caught, it can be explained away as an honest-to-god mistake.

By contrast, lots of software companies screen their employees, especially if they are working on critical code.

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago)

I don't know if you really believe what you're saying, but I'll answer anyway. I worked at Manulife, the largest private insurance company in Canada. Setting aside the fact that our security team was mostly focused on pen testing (which, as you know, unlike an audit tells you little about whether a system is actually secure), the audits were infrequent and limited in scope. Most corporations don't do audits at all (and hire the cheapest engineers for the job), and as a consumer there's no easy way to tell which audits covered the security aspects you care about.

If you want to talk more about the security of open source, beyond what's already mentioned above: not only are Google, Canonical, and Red Hat growing their open source security teams (together employing close to 1,000 people whose job is to audit and patch popular open source software), but open source projects can also pay for audits themselves (see Mullvad or Monero as examples).

I will concede that it is possible for proprietary software to be secure. But in practice it rarely is, and it's too hard to tell. It's certainly not secure compared to similar open source offerings.