this post was submitted on 21 Mar 2025
223 points (97.4% liked)
Programmer Humor
In practice, the cost of rendering vectors vs. raster images is negligible, because rendering a raster image isn't free either, especially if you're going to package large images in your app.
Raster images do not need to be rendered - see Rendering:
Note that "render" is a fairly generic term; it is sometimes used loosely, as in "render to the screen," to mean simply displaying something. "Rasterisation" may be the better term here, since it applies only to vector graphics and is the part of the process I am referring to.
In any case, aside from possibly reading fewer bytes from disk, the vector case incurs all the same compute and memory costs as the raster image; it just adds the overhead of computing the bitmap. On modern hardware that doesn't take terribly long, but it does mean we're spending extra compute just to launch/load things.
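To make the "added overhead" concrete, here's a toy sketch (not any real library's API) of what rasterisation does: a vector shape is just a description, and turning it into pixels costs a coverage test per pixel, which a raster image never has to pay.

```python
# Toy rasteriser: a "vector" circle is just a centre and radius; turning
# that description into pixels is the extra compute the vector path pays.
def rasterise_circle(cx, cy, r, width, height):
    bitmap = [[0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # One coverage test per output pixel.
            if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
                bitmap[y][x] = 1
    return bitmap

# A raster image skips straight to this point: the pixel grid already
# exists after decoding and only needs to be copied to the screen.
bitmap = rasterise_circle(8, 8, 4, 16, 16)
```

Real rasterisers (scanline or GPU-based) are far smarter than this per-pixel loop, but the shape of the cost is the same: work proportional to the output resolution, done at load/draw time.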
I meant render in the sense of displaying the encoded data as pixels on the screen; I understand how the different kinds of rendering work. My point is that reading and displaying image data isn't free. It can be cheaper than drawing vectors, but there is still a cost, and it grows with the size of the image. Raster images also tend to use more memory as a rule.
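The memory point can be sketched with back-of-envelope arithmetic (assuming 4 bytes per RGBA pixel, a common but not universal layout): once decoded, a raster image's footprint depends only on its dimensions, no matter how small the compressed file on disk was.

```python
# Assumed 4 bytes per pixel (RGBA); actual formats vary.
def decoded_size_bytes(width, height, bytes_per_pixel=4):
    # In-memory cost of a decoded bitmap: dimensions dominate,
    # not the size of the compressed PNG/JPEG on disk.
    return width * height * bytes_per_pixel

# e.g. a 4096x4096 image decodes to 64 MiB of pixel data,
# even if the file on disk is only a few hundred KiB.
mib = decoded_size_bytes(4096, 4096) // (1024 * 1024)
```

A vector file avoids storing those pixels on disk, but as noted above, it still has to produce a bitmap of comparable size once it's actually drawn at that resolution.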