this post was submitted on 02 Aug 2024
229 points (96.4% liked)

Programmer Humor

 
top 19 comments
[–] RonSijm 38 points 3 months ago

O(n)? More Like Oh(No)

[–] [email protected] 25 points 3 months ago (1 children)
[–] [email protected] 7 points 3 months ago (1 children)
[–] [email protected] 5 points 3 months ago* (last edited 3 months ago)

O(BB(n))

"That matches no single computable function" is just an excuse.

[–] MajorHavoc 23 points 3 months ago (1 children)

There's O(1), O(n), O(n lg n), O( this code is crap ).

[–] [email protected] 14 points 3 months ago* (last edited 3 months ago) (1 children)

[Cries in matrix multiplication]

[–] sukhmel 5 points 3 months ago (1 children)

Matrix multiplication is O(n) if you do it in parallel /j

[–] [email protected] 7 points 3 months ago* (last edited 3 months ago) (2 children)

Umm, AKCTSHUALLY it gets more like O(n^2^) in parallel, assuming you're using a physically achievable memory. There's just a lot of criss-crossing the entries have to do.

Strassen's algorithm gets O(n^2.8^) in serial by being terrible, and the weird experimental ones get O(n^2.3^), but the asymptotic benefits of Coppersmith-Winograd and friends only kick in if you're a Kardashev III civilisation computing a single big product on a Dyson sphere.

[–] Mikina 3 points 2 months ago (1 children)

I can't decide whether this sentence is a joke or not. It has the same tone that triggers my PTSD from my CS degree classes, and I do recognize some of the terms, but it also sounds like it's just throwing random science terms around, as if you asked an LLM to talk about math.

I love it.

Also, it's apparently real and correct.

[–] [email protected] 1 points 2 months ago

Thank you, I'm glad to share the pain of numerical linear algebra with anyone who will listen.

[–] sukhmel 2 points 3 months ago (1 children)

Yeah, in fact, I somehow calculated it under the assumption that n is the number of elements in the matrix, not the side length (assuming a square matrix).

But I am impressed to know that there are serial algorithms that approach O(n²), thank you for sharing that info

[–] [email protected] 3 points 3 months ago* (last edited 3 months ago)

Yeah, they work by turning the problem into some crazy kind of group theory and attacking it that way. Every once in a while someone shaves the exponent down slightly, just by implementing the deep math in a more efficient way. A new approach will be needed if it is in fact possible to get down to O(n^2^), though. Strassen's is a divide-and-conquer algorithm, and each step of the recursion looks like this:

    S[1] = B[1, 2] - B[2, 2]
    S[2] = A[1, 1] + A[1, 2]
    S[3] = A[2, 1] + A[2, 2]
    S[4] = B[2, 1] - B[1, 1]
    S[5] = A[1, 1] + A[2, 2]
    S[6] = B[1, 1] + B[2, 2]
    S[7] = A[1, 2] - A[2, 2]
    S[8] = B[2, 1] + B[2, 2]
    S[9] = A[1, 1] - A[2, 1]
    S[10] = B[1, 1] + B[1, 2]
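    // seven recursive half-size products instead of the naive eight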
    P[1] = STRASSEN(A[1, 1], S[1])
    P[2] = STRASSEN(S[2], B[2, 2])
    P[3] = STRASSEN(S[3], B[1, 1])
    P[4] = STRASSEN(A[2, 2], S[4])
    P[5] = STRASSEN(S[5], S[6])
    P[6] = STRASSEN(S[7], S[8])
    P[7] = STRASSEN(S[9], S[10])
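    // combine the seven products into the four quadrants of C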
    C[1..n / 2][1..n / 2] = P[5] + P[4] - P[2] + P[6]
    C[1..n / 2][n / 2 + 1..n] = P[1] + P[2]
    C[n / 2 + 1..n][1..n / 2] = P[3] + P[4]
    C[n / 2 + 1..n][n / 2 + 1..n] = P[5] + P[1] - P[3] - P[7]
    return C

In my copy of Introduction to Algorithms, it says something like "this is the most bullshit algorithm in the book and it's not close" underneath. You can make it a bit neater by representing the multiplication operation as a 3-dimensional tensor, but at the end of the day it's still just a stupid arithmetic trick. One that's built into your GPU.
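If anyone wants to play with it, here's a quick NumPy sketch of the same recursion (the function name, the `leaf` cutoff, and the power-of-two size assumption are mine, not the book's):

    import numpy as np

    def strassen(A, B, leaf=64):
        # A minimal sketch: assumes A and B are square, with a power-of-two size.
        n = A.shape[0]
        if n <= leaf:
            return A @ B  # below the cutoff, plain multiplication is faster anyway

        h = n // 2
        A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
        B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]

        # the ten sums/differences from the pseudocode above
        S1, S2, S3, S4, S5 = B12 - B22, A11 + A12, A21 + A22, B21 - B11, A11 + A22
        S6, S7, S8, S9, S10 = B11 + B22, A12 - A22, B21 + B22, A11 - A21, B11 + B12

        # seven recursive half-size products instead of the naive eight: the whole trick
        P1 = strassen(A11, S1, leaf)
        P2 = strassen(S2, B22, leaf)
        P3 = strassen(S3, B11, leaf)
        P4 = strassen(A22, S4, leaf)
        P5 = strassen(S5, S6, leaf)
        P6 = strassen(S7, S8, leaf)
        P7 = strassen(S9, S10, leaf)

        # reassemble the four quadrants of C
        C = np.empty_like(A)
        C[:h, :h] = P5 + P4 - P2 + P6
        C[:h, h:] = P1 + P2
        C[h:, :h] = P3 + P4
        C[h:, h:] = P5 + P1 - P3 - P7
        return C

    # sanity check against NumPy's built-in multiplication
    A = np.random.rand(256, 256)
    B = np.random.rand(256, 256)
    assert np.allclose(strassen(A, B), A @ B)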

[–] [email protected] 18 points 3 months ago

Just add a delay that pads the execution time out to 10 seconds. O(1) ez.
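A minimal sketch of the technique, with a made-up `solve` standing in for whatever the real work is:

    import time

    def solve(data):
        # hypothetical stand-in for the actual work; here just an O(n log n) sort
        return sorted(data)

    def solve_in_exactly_ten_seconds(data):
        start = time.monotonic()
        result = solve(data)
        # sleep off whatever is left of the 10-second budget,
        # so every call "takes the same time" regardless of input size
        time.sleep(max(0.0, 10.0 - (time.monotonic() - start)))
        return result

Works great right up until the real work takes longer than 10 seconds.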

[–] [email protected] 14 points 3 months ago

That's still good! I'm proud of you for working through the parts of the problem that you were capable of.

[–] [email protected] 5 points 3 months ago

Great, now I’m hungry for apples.

[–] [email protected] 2 points 3 months ago* (last edited 3 months ago) (1 children)

Why would you want a specific time complexity? Wouldn't it be better if it's faster? /s

[–] [email protected] 5 points 3 months ago (2 children)

Likely they want a lower time complexity.

For example, a question can be trivially solved in O(n^2), but no solution faster than O(n) is known, so they ask for O(n).
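The classic two-sum style of problem is a good illustration (my example, not necessarily what any particular site asks): the obvious nested loop is O(n^2), while the hash-map version they actually want is O(n).

    def two_sum_quadratic(nums, target):
        # the trivially correct O(n^2) version: try every pair of indices
        for i in range(len(nums)):
            for j in range(i + 1, len(nums)):
                if nums[i] + nums[j] == target:
                    return i, j
        return None

    def two_sum_linear(nums, target):
        # the O(n) version the grader wants: one pass, remembering where each value was seen
        seen = {}
        for i, x in enumerate(nums):
            if target - x in seen:
                return seen[target - x], i
            seen[x] = i
        return None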

[–] [email protected] 2 points 2 months ago* (last edited 2 months ago)

Most of the time O(n^2) is optimized to O(n log n). You'll get some sort of award if you can figure out a sorting function that runs in O(n).

[–] sukhmel 1 points 3 months ago

That's a huge leap from O(n²) to O(n). In this example it would likely be good to at least specify that it should be strictly less than the best known solution (not sure if there are such cases on leet code, I thought they only restrict you to what is known to be solvable).