That's interesting, given that in California pre-school is for 4-year-olds and kindergarten is the year after that.
Almost every personal computer that isn't a MacBook is poorly secured due to the lack of filesystem encryption by default. Nobody encrypts their data at rest, so you just have to pull the drive and read it with another computer. Hell, I don't encrypt my entire file system despite being aware of this because of the inconvenience of the added boot time, but everything that matters is encrypted and backed up across multiple devices.
The best thing anyone can do is keep the amount of critical, digital data they have to a minimum, keep that data encrypted and backed up, and use a password manager properly. That alone makes it exceedingly unlikely you will ever be a victim of cybercrime solely because you're more of a pain in the ass to compromise than 99.9% of the world.
I personally have almost 10 TB of data across all my systems, but of that maybe 10 MB is actually valuable to anyone but me.
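For the "keep what matters encrypted and backed up" part, even something as simple as symmetrically encrypting the handful of files that matter before syncing them goes a long way. A minimal sketch using Python's cryptography package (the file names and key handling here are illustrative assumptions, not a prescription for a specific setup):

    # Minimal sketch: symmetric, authenticated encryption of one file with
    # the `cryptography` package's Fernet. File names are illustrative.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # keep this somewhere safe, e.g. your password manager
    fernet = Fernet(key)

    with open("tax_documents.pdf", "rb") as f:
        ciphertext = fernet.encrypt(f.read())

    with open("tax_documents.pdf.enc", "wb") as f:
        f.write(ciphertext)

    # Later, on any machine that has the key:
    # plaintext = Fernet(key).decrypt(ciphertext)

The key itself then becomes the one thing you protect, which is exactly what the password manager is for.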
I don't think so, because it requires you to prove you actively work there, and those who leave are given an alumni role and grandfathered in. It's mainly just a lot of PIP and toxicity that gets discussed, and memeing about how dog shit things are.
Even within SF there are plenty of great areas, but "peace and tranquility in the Sunset District" doesn't make headlines. SF has a ton of problems and I really hope we can fix them in the long term, but they tend to be concentrated in certain parts of the city. Saying all of SF is like this is akin to saying the entire Bay Area is like SF. Both are massive overgeneralizations.
I can't think of any neighborhood in SF where I'd choose one of these places over literally anywhere else. Too much good cheap food here.
This is already a thing. I'm part of a 25k-person Discord server for current and former Amazon/AWS employees. We discuss a ton about the company's inner workings, navigating the toxic AF environment, and helping people find other jobs. Nothing is ever trade-secret level, but that Discord would give any competitor a massive leg up in direct competition with Amazon.
It's trained on western media so this shouldn't be surprising as those are the two biggest threats to the western world. An AI trained on China's intranet would likely nuke the US, Russia, and select SEA countries.
I mean I live in the most expensive region of the US and live pretty comfortably, but go off paying to see ads and have content taken from you I guess.
Please never bring up CNF again. I'm a year out of college, two years out of finite automata, and I still shudder when it's brought up.
Do yourself a favor next time and rent a floor sander for like $80 a day, and save your orbital sander for the edges. That or duct tape the orbital sander to a stick.
That was a pretty interesting read. However, I think it conflates correlation and causation a little too strongly. The overall vibe of the article is that developers who use Copilot are writing worse code across the board. I don't think that's necessarily the case, for a few reasons.
The first is that Copilot is just a tool, and like any tool it can easily be misused. It definitely makes programming accessible to people it would not have been accessible to before. We have to keep in mind that it's allowing a lot of people who are very new to programming to build massive programs they otherwise would not have been able to. It's also going to be relied on more heavily by those who are newer, because it's a more useful tool to them, but it will also let them learn more quickly.
The second is that they use a graph with an unlabeled y-axis to show an increase in reverts, and never indicate whether it's measuring raw lines of code or a percentage of lines of code. This matters because Copilot lets people write a fuck ton more code. It legitimately makes me write at least 40% more. Any increase in reverts is simply a function of writing more code. I actually suspect it leads to me reverting a smaller percentage of lines, because it forces me to reread the code the AI outputs multiple times to make sure it's valid.
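To make the raw-count vs. rate distinction concrete, here's a toy calculation (the numbers are made up, purely to illustrate the argument):

    # Toy numbers, purely illustrative: raw revert counts can rise while the
    # revert *rate* falls, as long as total lines written grows faster.
    before = {"written": 1000, "reverted": 50}  # without Copilot
    after = {"written": 1400, "reverted": 60}   # with Copilot (~40% more code)

    for label, s in (("without Copilot", before), ("with Copilot", after)):
        rate = s["reverted"] / s["written"] * 100
        print(f"{label}: {s['reverted']} lines reverted ({rate:.1f}% of lines written)")

Raw reverts go up while the revert rate goes down, so a graph of the first says nothing about the second.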
This ultimately comes down to the developer who's using the AI. It shouldn't be writing massive complex functions. It's just an advanced, context-aware autocomplete that happens to save a ton of typing. Sure, you can let it run off and write massive parts of your code base, but that's akin to hitting the next word suggestion on your phone keyboard a few dozen times and expecting something coherent.
I don't see it much differently than when high-level languages first became a thing. The introduction of Python allowed a lot of people who would never have written code in their lives to jump in and immediately be productive. Both provide accessibility to more people than the tools before them, and I don't think that's a bad thing, even if there are some negative side effects. Besides, anything that really matters should have thorough code reviews and strict standards. If janky AI-generated code is getting into production, that's a process issue, not a tooling issue.
I got stopped with a Panettone once. Thankfully this was in EWR so the Italian-American gate agent understood why I'd be smuggling one to the west coast.