jvisick

joined 2 years ago
[–] jvisick 42 points 2 years ago (1 children)

“Sorry, our unbelievably massive military budget is only for active duty military. Best we can do is schedule you an appointment to talk to someone next year about the benefits you won’t be receiving.”

[–] jvisick 34 points 2 years ago* (last edited 2 years ago) (7 children)

“If a student uses the college search tool on CB.org, the student can add a GPA and SAT score range to the search filters. Those values are passed [to Facebook]”

So they don’t associate your official score with your browser, but presumably students who are using that search tool would be searching with their real score - or a range close to it.

The headline is fairly leading, but the statement from the College Board is also fairly misleading. They’re not directly selling your official score to advertisers, but they’re indirectly selling data about you that gives a pretty good idea of your score.

[–] jvisick 7 points 2 years ago

“I don’t think it’s good enough to have a blanket conception to not trust them completely.”

On the other hand, I actually think we should, as a rule, not trust the output of an LLM.

They’re great for generative purposes, but I don’t think there’s a single valid case where the accuracy of their response should be outright trusted. Any information you get from an AI model should be validated.

There are many cases where a simple once-over from a human is good enough, but any time it tells you something you didn’t already know, you should not trust it and, if you want to rely on that information, you should validate that it’s accurate.

[–] jvisick 1 points 2 years ago (1 children)

Those look like small cards. But why not just inspect the page and grab the CSS if you’re so focused on re-creating them? Why try to find and shoehorn a component from a UI library to look exactly like that, when your browser will tell you exactly why it looks the way it does?

[–] jvisick 4 points 2 years ago

Yeah, and those blood suckers are the richest assholes out there

[–] jvisick 1 points 2 years ago (1 children)

That’s a little pedantic, don’t you think?

[–] jvisick 4 points 2 years ago

But didn’t you know? The poor cops are scared! Why would they check if a suspect is armed when they can just kill them and say they thought they were in danger?

[–] jvisick 8 points 2 years ago (1 children)

It entirely depends on the culture around it. Is everyone expected to model underwear for their store? If someone doesn’t want to - is the culture supportive, neutral, dismissive, or antagonistic? Are they expected to do it but just allowed to choose not to? Or is there no expectation to do it, but volunteers are welcomed?

I can’t imagine anyone is being forced to, but it wouldn’t surprise me if the company culture is dismissive or demeaning of people who would rather not.

[–] jvisick 21 points 2 years ago (1 children)

Notice the “up to” in their offer. It’s likely commission-based, with inflated numbers to lure the developer into doing it - to trick them into thinking exactly what you’ve said here.

I’d imagine what they actually pay out after you cave is significantly lower - but by then you’ve already sold out your users, so you might as well leave their tracking in there.

[–] jvisick 12 points 2 years ago (1 children)

"To get a base salary of $170k you know you need to work hard as an Engineer, this sucks."

As someone who has worked as a UPS driver and now as a software developer, I can say that the UPS drivers definitely work harder than your average engineer.

That quote is also deftly ignoring the fact that you’re generally paid for the value you generate, not how hard you work.
