this post was submitted on 05 Jun 2024
96 points (98.0% liked)

Programming

all 46 comments
[–] [email protected] 58 points 5 months ago (5 children)

My secret to high uptime:

while True:
    try:
        main()
    except:
        pass
[–] [email protected] 9 points 5 months ago

that was hilarious xD

[–] alexdeathway 4 points 5 months ago

Flask developer?

[–] [email protected] 4 points 5 months ago

Someone is absolutely going to think this is a real recommendation and do it.

[–] [email protected] 3 points 5 months ago (3 children)

Lurking beginner here, why is this bad?

[–] [email protected] 24 points 5 months ago

Basically, it sweeps errors under the rug.

No error handling, just go again.

[–] [email protected] 13 points 5 months ago (1 children)

There you go:

# Start an infinite loop because True will always be True
while True: 
    # try to run the main function, usually where everything happens
    try:
        main()
    # if an exception is raised in the main function, simply discard it (pass) and restart the loop
    except:
        pass
[–] [email protected] 2 points 5 months ago

Thank you for that answer! That makes sense.

[–] [email protected] 4 points 5 months ago

This gives some better context. https://stackoverflow.com/questions/21553327/why-is-except-pass-a-bad-programming-practice

But essentially ignoring every single error a program could generate is not great. It'd be better to know what those errors are and fix/prevent them from occurring in the first place.
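For contrast, a minimal sketch of the other direction: catch only the exceptions you actually expect, and at least record them (the names here are illustrative, not from the original post):

```python
import logging

def main():
    raise ValueError("bad input")  # stand-in for a real failure

# Catch only what you expect and log it; anything unexpected
# (KeyboardInterrupt, typos, genuine bugs) still crashes loudly
# instead of being silently swallowed by a bare `except: pass`.
try:
    main()
except ValueError:
    logging.exception("main() failed with a known error")
```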

[–] [email protected] 44 points 5 months ago (2 children)

Don't need to activate your venv to use it.

Just use venv/bin/python my-file.py from a script or a terminal from your project root.

[–] [email protected] 13 points 5 months ago (1 children)

You don't need to use venv at all, break the mold and do it all global

[–] [email protected] 4 points 5 months ago
[–] [email protected] 0 points 5 months ago

Fuck yeah thanks man

[–] [email protected] 31 points 5 months ago (2 children)

Not necessarily a trick that's always useful but I always forget this.
You can get async REPL by calling python -m asyncio.

Also, old trick - in need of simple http server serving static files?
python -m http.server

[–] [email protected] 9 points 5 months ago (2 children)

I really don't know what asyncio does. can you elaborate?

[–] [email protected] 16 points 5 months ago* (last edited 5 months ago) (3 children)

So, the word here is parallelism. It's not something specific to python, asyncio in python is just the implementation of asynchronous execution allowing for parallelism.

Imagine a pizza restaurant that has one cook. This is your typical non-async, non-threading python script - single-threaded.
The cook checks for new orders, picks up the first one and starts making the pizza one instruction at a time - fetching the dough, waiting for the ham slicer to finish slicing, ... eventually putting the unbaked pizza into the oven and sitting there waiting for the pizza to bake.
The cook is rather inefficient here; instead of waiting for the ham slicer and oven to finish their jobs he could be picking up new orders, starting new pizzas and fetching/making other ingredients.

This is where asynchronicity comes in as a solution: the cook is your single thread, and the machines are mechanisms that have to be started but don't have to be waited on - these are usually various sockets, file buffers (notice these are what your OS can handle for you on the side, hence asyncIO).
So, the cook configures the ham slicer (puts a block of ham in) and starts it - but does not wait for each ham slice to fall out so he can put it on the pizza. Instead he picks up a new order and goes through the motions until the ham slicer is done (or until he requires the slicer to cut a different ingredient, in which case he would have to wait for the ham task to finish first, put ...cheese there and switch to finishing the first order with ham).

With proper asynchronicity your cook can now handle a lot more pizza orders, simply because his time is not spent so much on waiting.
Making a single pizza is not faster, but in total the cook can handle making more of them in the same time - this is the important bit.


Coming back to why an async REPL is useful: it comes down to how python implements async - with special ("colored") functions:

async def prepare_and_bake(pizza):
  await oven.is_empty()  # await - a context switch can occur and python will check if other asynchronous tasks can be continued/finalized
  # so instead of blocking here, waiting for the oven to be empty the cook looks for other tasks to be done
  await oven.bake(pizza)  
  ...

The function prepare_and_bake() is an asynchronous function (async def), which makes it special. I would have to dive into event loops here to fully explain why an async REPL is useful, but in short: you can't call async functions directly to execute them - you have to schedule them.
The async REPL is here to help with that, allowing you to do await prepare_and_bake() directly, in the REPL.
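To make the "you can't call async functions directly" point concrete, here's a small sketch (the coroutine is made up for illustration) of what a normal script has to do, versus what the async REPL lets you skip:

```python
import asyncio

async def fetch_order():
    # stand-in for real I/O, e.g. reading from a socket
    await asyncio.sleep(0.01)
    return "pizza"

# Calling fetch_order() directly just creates a coroutine object and
# runs nothing; a normal script must hand it to an event loop:
result = asyncio.run(fetch_order())
print(result)  # pizza

# In `python -m asyncio` you could instead type `await fetch_order()`
# right at the prompt - no asyncio.run() boilerplate needed.
```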


And to give you an example where async does not help, you can't speed up cutting up onions with a knife, or grating cheese.
Now, if every ordered pizza required a lot of cheese you might want to employ a secondary cook to preemptively do these tasks (and "buffer" the processed ingredients in a bowl so that your primary cook does not have to always wait for the other cook to start and finish).

This is called concurrency: multiple tasks that require direct work and can't be relegated to a machine (the OS - or, to be precise, can't just be started and awaited upon) are done at the same time.
In a real example, if something requires a lot of computation (calculating something like the nth Fibonacci number, applying a function to a list with a lot of entries, ...) you would want to employ secondary threads or processes so that your main thread does not get blocked.
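As a sketch of that last point (the numbers are illustrative): the standard library's concurrent.futures can push CPU-bound work onto worker threads so the main thread stays free, even though the work itself is not I/O:

```python
from concurrent.futures import ThreadPoolExecutor

def fib(n):
    # CPU-bound work: async/await cannot make this faster
    return n if n < 2 else fib(n - 1) + fib(n - 2)

# Run the heavy calls on worker threads so the main thread stays free
with ThreadPoolExecutor() as pool:
    results = list(pool.map(fib, [20, 21, 22]))
print(results)  # [6765, 10946, 17711]
```

(Swap ThreadPoolExecutor for ProcessPoolExecutor if you also want to sidestep the GIL for pure-Python number crunching.)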

To summarize, async/parallelism helps in cases where you can delegate (IO) processing to the OS (usually reading/writing into/out of a buffer), but it does not make anything go faster in itself - it is just more efficient, because you don't have to wait so much, which is often a problem in single-threaded applications.

Hopefully this was a somewhat understandable explanation haha. Here is some recommended reading https://realpython.com/async-io-python/

Final EDIT: Reading it myself a few times, a pizza bakery example is not optimal; a better example would have been something where one has to talk with other people, but those other people don't have immediate responses - to better drive home that this is mainly used for Input/Output tasks.

[–] [email protected] 6 points 5 months ago

Thank you very much!! That was very informative <3

[–] [email protected] 6 points 5 months ago

Final EDIT: Reading it myself few times, a pizza bakery example is not optimal

Yeah, I kept getting hung up on the notion of toppings being sliced or grated to-order, rather than bought from Sysco pre-prepped in big plastic bags like any normal pizzaria would do. Your analogy be fancy!

[–] [email protected] 1 points 5 months ago (1 children)

Have you got concurrency and parallelism swapped around?

[–] [email protected] 1 points 5 months ago* (last edited 5 months ago)

In what part exactly?
The example is not perfect, I can see that myself. If I read into it too much there could be an overlap with concurrency, e.g. the (IO) tasks awaited & delegated to the OS could be considered a form of concurrency, but other than that I do think it's close to describing how async usually works.

[–] [email protected] 5 points 5 months ago* (last edited 5 months ago)

asyncio provides "cooperative concurrency" for python.

Let's say you need to download 10 webpages in python; someone might do

result1 = requests.get(...)
result2 = requests.get(...)
....
result10 = requests.get(...)

The downside is that each requests.get() blocks until the HTTP request is done, and if each webpage loads in 5 seconds your program needs 50 seconds to download all pages.

You can do something like spawn 10 threads, but threading has its own downsides.

What cooperative concurrency does is allow these coroutines (tasks) to tell Python to do something else while a function is waiting for something... I think it's best to read some Python examples. https://docs.python.org/3/library/asyncio-task.html#coroutines

The examples there solve the requests.get() problem with plain asyncio, but it's probably better to use libraries that build around asyncio.
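A hedged sketch of the same idea without any third-party library - the fetch here is simulated with asyncio.sleep, where a real program would use an async HTTP client:

```python
import asyncio

async def fetch(url):
    # stand-in for an async HTTP request
    await asyncio.sleep(0.05)
    return f"<html for {url}>"

async def main():
    urls = [f"https://example.com/{i}" for i in range(10)]
    # all ten "downloads" overlap, so total wall time is roughly
    # one fetch rather than ten fetches back to back
    return await asyncio.gather(*(fetch(u) for u in urls))

pages = asyncio.run(main())
print(len(pages))  # 10
```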

[–] amenji 2 points 5 months ago

Using http.server is my go-to sanity check for whether I configured my network firewall correctly or not.

[–] [email protected] 22 points 5 months ago (2 children)

Walrus operator - := - envious of the C devs being able to simultaneously assign and return a value? Envy no more, we've got it.
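A couple of quick sketches of where := earns its keep:

```python
import re

line = "error: disk full"
# assign and test in one expression, much like C's `if ((m = ...))`
if (match := re.search(r"error: (.+)", line)):
    print(match.group(1))  # disk full

# also handy for reusing an intermediate value in a comprehension
data = [1, 2, 3, 4]
halved = [h for n in data if (h := n / 2) > 1]
print(halved)  # [1.5, 2.0]
```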

[–] [email protected] 15 points 5 months ago (1 children)

For those curious about the drama & lack of wide adoption surrounding the walrus operator

https://dev.to/renegadecoder94/the-controversy-behind-the-walrus-operator-in-python-4k4e

It's a shame because it's a really nice feature.

[–] [email protected] 7 points 5 months ago

You need to see it to believe it I think. I was generally on the side of “too complex” but then came across instances perfect for it and used it right away and found it pleasant.

I’m still generally on the side of “too complex” though, and think there are probably better things for PSF to work on (cough packaging cough).

[–] [email protected] 7 points 5 months ago

I am the Walrus

[–] [email protected] 13 points 5 months ago (1 children)

If using pyenv to support multiple python versions, when creating venvs, make sure to pass --copies to it.

% python3 -m venv venv --copies

Ordinarily, venv uses symbolic links back to the current version of the python binary. A lot of tools and IDEs don't traverse symbolic links. That flag actually copies the real binaries over to the venv.

This avoids a metric ton of hard-to-diagnose misery later on.

[–] [email protected] 2 points 5 months ago

Yeah, I wish I knew this about a year ago. Thanks.

[–] [email protected] 12 points 5 months ago (1 children)

If you're on Linux (or Mac), add an alias to your .bashrc:

alias activate="source env/bin/activate"

Now you can activate your venv by just running activate in the project root!

[–] [email protected] 2 points 5 months ago

Very nice, now acti in my rc.

[–] [email protected] 8 points 5 months ago (1 children)

Use fewer loops and more comprehensions.

Also, for the love of $DEITY, embrace EAFP!

[–] [email protected] 15 points 5 months ago

EAFP - "Easier to ask for forgiveness than for permission".

For those who are (like me) unfamiliar with this... acronym?
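The contrast with LBYL ("look before you leap") is easiest to see side by side; a small sketch:

```python
config = {"host": "localhost"}

# LBYL: check first, then act
if "port" in config:
    port = config["port"]
else:
    port = 8080

# EAFP: just try it and handle the failure if it comes
try:
    port = config["port"]
except KeyError:
    port = 8080
print(port)  # 8080
```

EAFP avoids the gap between the check and the use, and is the idiomatic style for dicts, files, and attributes in Python (though for this particular case, dict.get("port", 8080) is shorter still).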

[–] [email protected] 7 points 5 months ago

I forget where I originally found this and Google on my phone was unhelpful.

My favorite annoying trick is x -=- 1. It looks like it shouldn't work, because -=- is not a valid operator, but it functions as an increment equivalent to x += 1.

It works because -= still functions as "subtract and assign", but the second minus applies to the 1, making it -1.
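Seen with explicit parentheses:

```python
x = 5
x -=- 1   # parsed as x -= (-1), i.e. x = x - (-1)
print(x)  # 6
```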

[–] [email protected] 6 points 5 months ago* (last edited 5 months ago) (1 children)

You can feign immutability on class attributes by playing with __setattr__. I don't remember the exact way to do it, but it's roughly like this:

class YourClass:
    def __setattr__(self, name, value):
        if not hasattr(self, name):
            super().__setattr__(name, value)
        else:
            # handle as you wish if the
            # attr/value already exists:
            # pass, raise, whatever
            raise AttributeError(f"can't reassign {name!r}")

I say "feign immutability" because there are still cases in which an attr value can change, such as:

  • the underlying attribute contains a list and appending to the list
  • the underlying attribute is a class and modifying the class's attributes
  • pretty much anything that does "in place" changes, because so much of python is referential and has side effects.
[–] [email protected] 6 points 5 months ago

I have to mention dataclasses here, especially with frozen=True.

Seriously, use dataclasses whenever possible, they're great.
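A minimal frozen dataclass sketch - reassignment raises instead of silently mutating:

```python
from dataclasses import FrozenInstanceError, dataclass

@dataclass(frozen=True)
class Point:
    x: float
    y: float

p = Point(1.0, 2.0)
try:
    p.x = 5.0  # frozen=True turns attribute assignment into an error
except FrozenInstanceError:
    print("immutable")  # immutable
```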

[–] [email protected] 6 points 5 months ago (2 children)
[–] anzo 2 points 5 months ago

Uiii, I'm flying!

[–] [email protected] 1 points 5 months ago

python -m antigravity

[–] [email protected] 5 points 5 months ago (4 children)

Can I request a hack? How do I handle several different versions of Python installed, which one is used for pip stuff, and how sudo/running as services changes all of this.

[–] [email protected] 12 points 5 months ago

You can use pyenv. it will handle everything. https://github.com/pyenv/pyenv

[–] [email protected] 9 points 5 months ago* (last edited 5 months ago)

There are like 10,000 different solutions, but I would just recommend using what's built in to python

If you have multiple versions installed you should be able to call python3.12 to use 3.12, etc

Best practice is to use a different virtual environment for every project, which is basically a copy of an existing installed python version with its own packages folder. Calling pip with the system python installs it for the entire OS. Calling it with sudo puts the packages in a separate package directory reserved for the operating system and can create conflicts and break stuff (as far as I remember, this could have changed in recent versions)

Make a virtual environment with python3.13 -m venv venv the 2nd one is the directory name. Instead of calling the system python, call the executable at venv/bin/python3

If you do source venv/bin/activate it will temporarily replace all your bash commands to point to the executables in your venv instead of the system python install (for pip, etc). deactivate to revert. IDEs should detect the virtual environment in your project folder and automatically activate it

[–] [email protected] 2 points 5 months ago

I started using hatch lately and really like how I can manage everything from the pyproject.toml file

https://github.com/pypa/hatch

[–] Corbin 4 points 5 months ago

PyPy exists. (This is news to around 95% of the community.)

[–] [email protected] 2 points 5 months ago

If using asyncio is too impenetrable, try using Trio instead. It's a sensibly-designed asynchronous library, to the point that you'll find it's easier to write non-trivial Python programs in Trio from the start, rather than bolting-on async support later.

Asyncio is just plain weird, IMO, exposing more low-level concerns than is customary for Python. Whereas Trio lets you get things done intuitively.