logging_strict

joined 1 year ago
[–] logging_strict 1 points 7 hours ago* (last edited 7 hours ago)
from multiprocessing import Lock
l = Lock()

flake8 .... way too ambiguous
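For context, it's presumably pycodestyle's E741 check ("ambiguous variable name") that flake8 raises on `l`; a descriptive name sidesteps it:

```python
from multiprocessing import Lock

l = Lock()     # flake8 (pycodestyle E741): ambiguous variable name 'l'
lock = Lock()  # a descriptive name keeps the linter quiet
```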

[–] logging_strict 1 points 7 hours ago

so what if it's 100x slower /nosarc

[–] logging_strict 1 points 5 days ago

You are in for a real treat!

Here is how to step in and get the locals

This technique depends on there being only one return statement

https://logging-strict.readthedocs.io/en/stable/code/tech_niques/context_locals.html

[–] logging_strict 1 points 5 days ago

Multiple return statements are unusual. In very rare situations i understand. But the rule is: never do that.

When there is only one return statement, can step into the function to see the local variables
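Not necessarily how logging-strict does it, but a generic sketch of capturing a function's locals at its single return, via a sys.settrace hook:

```python
import sys

def get_locals(func, *args, **kwargs):
    """Run func and capture its local variables at the return event."""
    captured = {}

    def tracer(frame, event, arg):
        if event == "return" and frame.f_code is func.__code__:
            # At the (single) return, the frame still holds all locals.
            captured.update(frame.f_locals)
        return tracer

    old = sys.gettrace()
    sys.settrace(tracer)
    try:
        ret = func(*args, **kwargs)
    finally:
        sys.settrace(old)  # always restore the previous trace function
    return ret, captured

def add(a, b):
    total = a + b
    return total

ret, locs = get_locals(add, 2, 3)
```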

[–] logging_strict 0 points 1 week ago (3 children)

The sample code for lazy imports looks wrong

STRIPE = None

def _stripe():
    global STRIPE
    if STRIPE is None:
        import stripe

        return stripe
    return STRIPE

STRIPE is never changed. And two return statements in the same function?!

Anyways can imagine how to do lazy imports without relying on the given code sample.

[–] logging_strict 1 points 1 week ago

so does zfs, so does wayland, so does trying out every distro, so does trying out every text editor and associated plugins, so does trying out ventoy, so does GrapheneOS, ...

Everything makes life easier, but comes down to,

Linux isn't free, it costs you your time

Which can be reframed, what do you really want to spend your time on?

If i really really really had to answer that overly honestly, want:

  • my GUI apps non-blocking on heavy background processes

  • distributing out tasks to many computers and runners

None of which screams or necessitates systemd or zfs or wayland or trying out every distro, every text editor every plugin, ventoy, or GrapheneOS.

Not in a house with a fox with a crow and a bow a knot on a cot or relaxed in the snow, i will not eat never ending random suggestions Sam, i will never eat them Sam i am.

[–] logging_strict 1 points 2 weeks ago

Packaging seems to be a separate skill. Separate from coding. Lots of people are good at coding. Then hit the packaging roadblock.

Can step in and white knight, but too many projects are like that.

[–] logging_strict 1 points 2 weeks ago

How to separate requirements handling and build backend

then drain-swamp (+drain-swamp-action) which supports build plugins with built in plugin for specifying semantic version.

Prefer to tell the builder the desired version rather than to be told. Options: 'tag', 'current', or specify the version. Version bump is silly.

This way can set the semantic version before tagging a release.

Warning

This is hardcore. Not for noobs, the faint of heart, or wise guys. If you don't need build plugins then maybe drain-swamp is not for you. If you are doing something in setup.py, that you can't live without, then might have a looksie at drain-swamp.

[–] logging_strict 1 points 2 weeks ago

No, that’s… against community rules :) I don’t like the common use of venvs or .toml very much and I don’t like their use by other people and “timid” is also diplomatic. So you’re getting timid, and we get to get along and we can agree to disagree on the use of .venvs and we can wish each other a pleasant day.

Think you broke the Internet. That's brilliant /nosarc.

Want you to write my code of misconduct!

[–] logging_strict -5 points 2 weeks ago (1 children)

It’s literally 10x faster

reminds me of the ol' joke

young bull: let's run down the hill and get us a heifer

old bull: let's walk down and do 'em all

wtf is your rush?

It literally doesn't matter how long it takes. Especially when Astral has moved on.

[–] logging_strict 0 points 2 weeks ago (1 children)

It’s literally 10x faster. I’m not sure what kind of person wouldn’t care about that. On that, lets agree to disagree.

Thru magic Astral has funding. I don't. So why shoulder the risk that their magical situation will continue forever.

When Astral goes tits up, which we seem to agree on, and everyone is whining, crying, and screaming, at least there is one or more fallback(s) written in Python which is/are maintainable by people without magical super powers.

[–] logging_strict 0 points 2 weeks ago

I have no need for this kind of tool, because I don’t have version conflicts. Does this manage my dependencies in other ways?

Happily no. wreck attempts to do only one thing. If you don't have version conflicts in your requirements files then whatever you are doing, keep doing that.

No idea what .in is.

requirements-*.in files are placed in folders. So requirements/whatever.in --> requirements/whatever.lock and requirements/whatever.unlock
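So a requirements folder might look like (file names hypothetical):

```
requirements/
├── prod.in       # hand written
├── prod.lock     # generated, fully pinned (for apps)
└── prod.unlock   # generated, loosely pinned (for packages which are not apps)
```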

Are they still .txt or is there a new file standard for .lock and .unlock?

.txt is meaningless or exceedingly broad. A text file huh? Well that explains everything.

The standard is what works.

use of venvs

Containerization, especially for GUIs and apps, is better than depending on venvs. Until it's not. Then still need venvs

4
submitted 2 weeks ago* (last edited 2 weeks ago) by logging_strict to c/python
 

Market research

This post is only about dependency management, not package management, not build backends.

You know about these:

  • uv

  • poetry

  • pipenv

You are probably not familiar with:

  • pip-compile-multi

    (toposort, pip-tools)

You are definitely unfamiliar with:

  • wreck

    (pip-tools, pip-requirements-parser)

pip-compile-multi creates lock files. Has no concept of unlock files.

wreck produces both lock and unlock files. venv aware.

Both sync dependencies across requirement files

Both act only upon requirements files, not venv(s)

Up to speed with wreck

You are familiar with .in and .txt requirements files.

.txt is split out into .lock and .unlock. The latter is for packages which are not apps.

Create .in files that are interlinked with -r and -c. No editable builds. No URLs.

(If this is a deal breaker feel free to submit a PR)

pins files

pins-*.in files are for common constraints. The huge advantage here is to document the why.

Without the documentation even the devs have no idea whether or not the constraint is still required.

pins-*.in files are split up to tackle one issue each. The beauty is the issue must be documented with enough detail to bring yourself up to speed.

Explain the origin of the issue in terms a 6 year old can understand.
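A hypothetical pins file, one issue per file, with the why documented in place (names and issue invented for illustration):

```
# requirements/pins-myissue.in (hypothetical)
# somepkg 2.0 breaks our plugin loading (upstream issue #123).
# Remove this pin once the fix lands in a release.
somepkg<2.0
```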

Configuration

python -m pip install wreck

This is logging-strict's pyproject.toml


[tool.wreck]
create_pins_unlock = false

[[tool.wreck.venvs]]
venv_base_path = '.venv'
reqs = [
    'requirements/dev',
    'requirements/kit',
    'requirements/pip',
    'requirements/pip-tools',
    'requirements/prod',
    'requirements/manage',
    'requirements/mypy',
    'requirements/tox',
]

[[tool.wreck.venvs]]
venv_base_path = '.doc/.venv'
reqs = [
    'docs/requirements',
]

[project]
dynamic = [
    "optional-dependencies",
    "dependencies",
    "version",
]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements/prod.unlock"] }
optional-dependencies.pip = { file = ["requirements/pip.lock"] }
optional-dependencies.pip_tools = { file = ["requirements/pip-tools.lock"] }
optional-dependencies.dev = { file = ["requirements/dev.lock"] }
optional-dependencies.manage = { file = ["requirements/manage.lock"] }
optional-dependencies.docs = { file = ["docs/requirements.lock"] }
version = {attr = "logging_strict._version.__version__"}

Look how short and simple that is.

The only thing you have to unlearn is being so timid.

More venvs. More constraints and requirements complexity.

Do it

mkdir -p .venv || :;
pyenv version > .venv/python-version
python -m venv .venv

mkdir -p .doc || :;
echo "3.10.14" > .doc/python-version
cd .doc && python -m venv .venv; cd - &>/dev/null

. .venv/bin/activate
# python -m pip install wreck
reqs fix --venv-relpath='.venv'

There will be no avoidable resolution conflicts.

Preferable to do this within tox-reqs.ini

Details

In TOML, quote paths with single quotes (literal strings, so backslashes aren't treated as escapes). The paths are relative, without the last file suffix.

If pyproject.toml is not in the cwd, pass --path='path to pyproject.toml'

create_pins_unlock = false tells wreck to not produce .unlock files for pins-*.in files.

DANGER

This is not for the faint of heart. Avoid it if you can. This is for the folks who often say, Oh really, hold my beer!

For pins that span venvs, add the file suffix .shared

e.g. pins-typing.shared.in

wreck deals with one venv at a time. Files that span venvs have to be dealt with manually and carefully.

Issues

  1. no support for editable builds

  2. no url support

  3. no hashes

  4. your eyes will tire and brains will splatter on the wall, from all the eye rolling after sifting thru endless posts on uv and poetry and none about pip-compile-multi or wreck

  5. Some folks love having all dependencies managed within pyproject.toml. These folks are deranged and it's impossible to convince them otherwise. pyproject.toml is a config file, not a database. It should be read only.

  6. a docs link on pypi.org is 404. Luckily there are two docs links. Should really just fix that, but it's left like that to see if anyone notices. No one did.

4
submitted 3 weeks ago* (last edited 3 weeks ago) by logging_strict to c/python
 

Finally got around to creating a gh profile page

The design is to give activity insights on:

  • what Issues/PRs working on

  • future issues/PRs

  • for fun, show off package mascots

All out of ideas. Any suggestions? How did you improve your github profile?

14
submitted 1 month ago* (last edited 1 month ago) by logging_strict to c/python
 

From helping other projects, have run across a fundamental issue for which web searches have not given appropriate answers.

What should go in a tarball and what should not?

Is it only the build files, python code, and package data and nothing else?

Should it include tests/ folder?

Should it include development and configuration files?

Have seven published packages which include almost all the files and folders. Including:

.gitignore,

.gitattributes,

.github folder tree,

docs/,

tests/,

Makefile,

all config files,

all tox files,

pre-commit config file

My thinking is that the tarball should have everything needed to maintain the package, but this belief has been challenged: that the tarball is not the appropriate place for that.

Thoughts?

 

PEP 735 what is its goal? Does it solve our dependency hell issue?

A deep dive, and out comes this limitation:

The mutual compatibility of Dependency Groups is not guaranteed.

-- https://peps.python.org/pep-0735/#lockfile-generation

Huh?! Why not?

mutual compatibility or go pound sand!

pip install -r requirements/dev.lock
pip install -r requirements/kit.lock -r requirements/manage.lock

The above code, purposefully, does not afford pip a fighting chance. If there are incompatibilities, they'll come out when trying randomized combinations.

Without a means to test for and guarantee mutual compatibility, end users will always find themselves in dependency hell.

Any combination of requirement files (or dependency groups), intended for the same venv, MUST always work!
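The "randomized combinations" idea can be sketched like this; every generated pip command (file names hypothetical) must resolve cleanly:

```python
import itertools

req_files = [
    "requirements/dev.lock",
    "requirements/kit.lock",
    "requirements/manage.lock",
]

# Every non-empty combination of requirement files intended for the same venv.
commands = []
for r in range(1, len(req_files) + 1):
    for combo in itertools.combinations(req_files, r):
        flags = " ".join(f"-r {path}" for path in combo)
        commands.append(f"pip install {flags}")

print(len(commands))  # 2**3 - 1 = 7 combinations to try
```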

What if this is scaled further, instead of one package, a chain of packages?!

11
submitted 3 months ago* (last edited 3 months ago) by logging_strict to c/python
 

In a requirements-*.in file, at the top, are lines with -c and -r flags, each followed by a relative path to another requirements-*.in file (ignoring URLs).

Say have docs/requirements-pip-tools.in

-r ../requirements/requirements-prod.in
-c ../requirements/requirements-pins-base.in
-c ../requirements/requirements-pins-cffi.in

...

The intent is compiling this would produce docs/requirements-pip-tools.txt

But there is confusion as to which flag to use. It's non-obvious.

constraint

Subset of the requirements features. Intended to restrict package versions. A constraint does not install the package; it only caps the version if the package is required elsewhere.

Does not support:

  • editable mode (-e)

  • extras (e.g. coverage[toml])
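A minimal illustration of the difference (paths hypothetical): -r pulls packages in, -c only caps versions of packages something else already requires:

```
# docs/requirements.in (hypothetical)
-r ../requirements/prod.in       # install everything prod installs
-c ../requirements/pins-base.in  # constrain versions only; installs nothing

sphinx
```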

Personal preference

  • always organize requirements files in folder(s)

  • don't prefix requirements files with requirements-; just doing it here for illustration

  • DRY principle applies; split out constraints which are shared.
