don't blame me i voted for turbo pascal
Programmer Humor
Welcome to Programmer Humor!
This is a place where you can post jokes, memes, humor, etc. related to programming!
For sharing awful code there's also Programming Horror.
Rules
- Keep content in English
- No advertisements
- Posts must be related to programming or programmer topics
I loved Turbo Pascal. Anyone using that FOSS Delphi equivalent thing?
To be fair, Python is just glue for code written in lower-level languages when it comes to AI
A lot of it is C in a Python raincoat
The underlying linear algebra routines are written in… FORTRAN.
I've never played with FORTRAN, but I've done some linear algebra with MATLAB. MATLAB was interesting for its native handling of matrices. What makes FORTRAN so good at linear algebra?
the main thing that makes fortran preferable to C is the way it handles arrays and vectors. due to different pointer semantics, the compiler can make stronger assumptions about how they sit in memory, meaning fewer operations need to be done for a given calculation.
Interesting. Is this a fundamental limitation of C, or is it just preferable and easier to use FORTRAN when implementing it?
Meaning, could the same performance be achieved in C, but the optimized libraries are already written, so why bother? Or can C basically not achieve the memory optimization at all?
you can get the same performance by using the restrict keyword in C.
basically, C allows pointer aliasing while fortran does not, which means C programs need to be able to handle cases where a value is accessed from multiple locations. fortran doesn't have to, so a lot of accesses can be optimized into immediates, or loops unrolled without guards.
restrict is a pinky-promise to the compiler that no overlapping takes place, i.e. that a value will only be accessed through one pointer. it's basically rust ownership semantics without enforcement.
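here's a rough sketch of the difference in C (toy function, names made up, not from any real library):

```c
/* toy example: the same loop with and without the no-aliasing promise */
void combine(float *out, const float *a, const float *b, int n) {
    /* out might overlap a or b, so the compiler has to assume every store
       to out[i] can change a[] and b[], and re-read them each iteration */
    for (int i = 0; i < n; i++)
        out[i] = a[i] + 2.0f * b[i];
}

void combine_restrict(float *restrict out,
                      const float *restrict a,
                      const float *restrict b, int n) {
    /* pinky-promise: no overlap, so loads can be hoisted out of the loop
       and the whole thing vectorized/unrolled without aliasing guards */
    for (int i = 0; i < n; i++)
        out[i] = a[i] + 2.0f * b[i];
}
```

compile both at -O2 or -O3 and compare the generated assembly if you want to see what the compiler does with the promise.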
That reminds me, I had a ride share driver named Blas, and I had to giggle and tell them about it.
Which can be ASM in a C raincoat
Which can be ASMR depending on pronunciation and tone of voice.
depending on pronunciation and tone of voice.
Thank you I suppose?
Does one even have to actually write Python code, except for frontends? I'd assume you just load the model, weights and maybe training data into pytorch/tensorflow.
Doesn't seem to be the case, some popular servers:
And then of course talking to these servers can be done in any language that has a library for it, or even one that just handles network requests, although Python is a nice choice. Possibly the process of training models is heavier on Python dependencies than inference is; I haven't actually done anything with that, though.
Python-wrapped C, for the most part.
There's also a whole lot that's just C/C++ exposing a Python interface, without any wrapping.
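For illustration, a minimal sketch of what that can look like with the CPython C API (the module and function names here are made up, not from any real project):

```c
/* cglue.c -- hypothetical module, just to show the shape of it */
#include <Python.h>

/* the actual work happens in plain C */
static PyObject *cglue_add(PyObject *self, PyObject *args) {
    double x, y;
    if (!PyArg_ParseTuple(args, "dd", &x, &y))
        return NULL;
    return PyFloat_FromDouble(x + y);
}

static PyMethodDef cglue_methods[] = {
    {"add", cglue_add, METH_VARARGS, "Add two numbers in C."},
    {NULL, NULL, 0, NULL}
};

static struct PyModuleDef cglue_module = {
    PyModuleDef_HEAD_INIT, "cglue", "C in a Python raincoat.", -1, cglue_methods
};

PyMODINIT_FUNC PyInit_cglue(void) {
    return PyModule_Create(&cglue_module);
}
```

Build that into an extension module and, from the Python side, import cglue; cglue.add(1, 2) looks like any other Python call.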
That's okay, I love them both. But not equally. (Don't tell Python.)
I am lobbing rocks at you because of that admission.
Every old timer knows AI is supposed to be written in Prolog.
Love a language that doesn't care if you're using inputs to get outputs or using outputs to get inputs
One of the guys who taught me Prolog wrote the book: https://www.inf.fu-berlin.de/lehre/SS09/KI/folien/merritt.pdf
That book's opening image is indeed telling
It sure made sense forty years ago. And I'd bet that the examples in that book are more AI than today's LLMs.
I have this one! It's probably at my folks' place; I'll definitely put it behind my chair so people can see it during video calls.
Python hatched out of the egg on the cover.
Is Python not considered to be any good?
As far as I know, many Python libraries that need performance are mainly written in C++
...It's okay. I've programmed in far, far worse languages. ...It's got its advantages. It's got its problems. 🤷🏻‍♀️
Edit: If you need a serious answer: Much like BASIC, it's a language often used in teaching programming. In that sense, I guess it's much better than BASIC. You can, like, actually use it for real-world applications. If you're using BASIC for real-world applications in this day and age, something has gone really wrong.
If you're using BASIC for real-world applications in this day and age, something has gone really wrong.
Visual Basic is essentially the same as C# if they’re both working with the .NET framework, if I recall correctly.
But yes.
Python is phenomenal for prototyping IMO.
Once you need performance, it's best to use another language (even partially).
But quickly banging out a concept, to me, is the big win for Python.
But quickly banging out a concept, to me, is the big win for Python.
For me the best language for quickly banging out a concept has always been the one I'm most familiar with at the moment.
Once you need performance
If you need more performance. Many things just don't.
Python is great, but it's so forgiving that it's easy to write garbage code if you're not very proficient and don't use the right tools with it.
The only objectively bad (major) thing against it is speed. Not that it matters much for most applications, though, especially considering that most number-crunching tasks will use libraries that have their critical paths written in a systems language:
numpy, pandas, polars, scikit-learn, pytorch, tf, spacy; all of them use another language to do the CPU-intensive tasks, so it really doesn't matter much that you're using Python at the surface.
Python is a tradeoff between ease of development and performance. If you do things the "normal" way (i.e. no Cython), your programs will oftentimes severely underperform compared with something written in a relatively lower-level language. Even Java outperforms it.
But you can shit out a program in no time. Or so I've been told. Python is pretty far from the things I'm interested in programming, so I haven't touched it much.
It's certainly not very fast
It's okay, but it's a bit slow and dynamic typing in general isn't that great IMO.
Dynamic typing is shit. But type annotations plus CI checkers can give you the same benefits in most cases.
It doesn't have dynamic typing FFS, variables are typed. You mean declarations.
You can't have statically typed objects, because they are of indeterminate length.
it is a dynamically typed language, but it's not a weakly typed language.
good is subjective; it depends on the opinions of the group.
objectively, Python is a smoldering pile of trash waiting to completely ignite. it does have one thing going for it though.
it's not JavaScript.
Would it have been any less shitty if it had instead been written in assembly?