jim

joined 2 years ago
[–] jim 2 points 7 months ago (1 children)

EOL for 3.8 is coming up in a few short weeks!

[–] jim 2 points 7 months ago

So cool!! Mercury is definitely the most mysterious inner planet because of how difficult it is to get a space probe there, even though it's the closest planet to the Sun.

The spacecraft will arrive next year, and I can't wait for all the science it will uncover!

[–] jim 6 points 7 months ago

Haha, I've been waiting for the 4K/8K reference in this volume. Poor Anna.

[–] jim 6 points 7 months ago

TIL this exists

[–] jim 18 points 7 months ago

The complainant suggested other manga to replace the series such as Chainsaw Man, To Your Eternity, and The Seven Deadly Sins among others.

lol

[–] jim 3 points 7 months ago (1 children)

I also like the POSIX “seconds since 1970” standard, but I feel that should only be used in RAM when performing operations (time differences in timers etc.). It irks me when it’s used for serialising to text/JSON/XML/CSV.

I've seen bugs where programmers tried to represent dates as epoch time in seconds or milliseconds in JSON. So something like a "pay date" would be represented by a timestamp, and off-by-one errors would creep in because whatever time library the programmer was using would do a time zone conversion on the timestamp and then truncate to the date portion.

If the programmers had used ISO 8601-style formatting, I don't think they would have included the time part, and the bug could have been avoided.

Use dates when you need dates and timestamps when you need timestamps!
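A minimal sketch of that bug, assuming a hypothetical "pay date" of 2024-03-01 that was serialised as a UTC-midnight epoch timestamp and then read back by a consumer in UTC-5:

```python
from datetime import datetime, timezone, timedelta

# Hypothetical "pay date" of 2024-03-01, serialised as the epoch
# timestamp for midnight UTC on that day.
epoch_seconds = int(datetime(2024, 3, 1, tzinfo=timezone.utc).timestamp())

# A consumer in UTC-5 converts the timestamp to local time and
# truncates to a date -- and the pay date shifts back a day.
local = datetime.fromtimestamp(epoch_seconds, tz=timezone(timedelta(hours=-5)))
print(local.date())  # 2024-02-29 -- off by one!

# Serialising the date itself (ISO 8601) carries no time part,
# so there is nothing for a time zone conversion to shift.
iso_date = "2024-03-01"
print(datetime.strptime(iso_date, "%Y-%m-%d").date())  # 2024-03-01
```

The timestamp version only looks wrong in some time zones, which is why this class of bug slips past tests run in UTC.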

[–] jim 11 points 7 months ago

Do you use it? When?

Parquet is mostly used for big-data batch processing. It's a columnar file format optimized for large aggregation queries. It's not human-readable, so you need a library like Apache Arrow to read and write it.

I would use parquet in the following circumstances (or combination of circumstances):

  • The data is very large
  • I'm integrating this into an analytical query engine (Presto, etc.)
  • I'm transporting data that needs to land in an analytical data warehouse (Snowflake, BigQuery, etc.)
  • Consumed by data scientists, machine learning engineers, or other data engineers

Since the data is columnar, a query like select sum(sales) from revenue is much cheaper and faster if the underlying data is in Parquet than in CSV.

The big advantage of CSV is that it's more portable. CSV as a data file format has been around forever, so it's used in a lot of places where Parquet can't be.
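A toy, stdlib-only sketch of why the columnar layout makes that sum cheap (this is not real Parquet, which needs a library like Apache Arrow; it just mimics row-oriented vs column-oriented storage with made-up sales data):

```python
import csv
import io

# Made-up sales rows for illustration.
rows = [
    {"region": "east", "sales": 120},
    {"region": "west", "sales": 340},
    {"region": "east", "sales": 95},
]

# Row-oriented (CSV-like): every field of every row must be
# parsed just to total one column.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["region", "sales"])
writer.writeheader()
writer.writerows(rows)
buf.seek(0)
total_csv = sum(int(r["sales"]) for r in csv.DictReader(buf))

# Column-oriented (Parquet-like): the "sales" column is stored
# contiguously, so an aggregation touches only that column and
# can skip "region" entirely.
columns = {"region": ["east", "west", "east"],
           "sales": [120, 340, 95]}
total_columnar = sum(columns["sales"])

print(total_csv, total_columnar)  # 555 555
```

Real Parquet adds per-column compression and statistics on top of this layout, which is where most of the speedup on large data comes from.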

[–] jim 4 points 7 months ago

Wow everyone seems to love P3 but I actually liked P4 better. I mean I really enjoyed both, but P4 was a more immersive experience for me. I should reboot my vita and play it again.

I really felt like P4 had deeper connections and relationships between the characters. It felt more real, and that made the tension in the game more exciting. I love every second of it and am still trying to find a game like it.

Don't get me wrong, P3 was great also. The gameplay was superb and the characters were all great. But P4 still has a special place in my heart.

[–] jim 4 points 7 months ago

The autocomplete is nice, but I don't find it a game-changer. The comment about writing tests is on point though; that's the only place I've found it useful.

[–] jim 1 points 7 months ago

Welcome to the world of light novels (and their adaptations).

[–] jim 10 points 8 months ago

They're asking TV manufacturers to block a VPN app on the TV, not to block VPNs in general.

[–] jim 3 points 8 months ago

Dawww, I hope they make up... for the sake of the world lol
