logging_strict

joined 1 year ago
[–] logging_strict 3 points 14 hours ago* (last edited 14 hours ago)

Hello! i have an inferiority complex. Would like to give everyone the impression that i'm a very important person.

For my packages, how do i make the import and install duration correlate to my deep inferiority complex? To give the impression of compiling a massive code base written in a low-level language. Rather than this duck-typed language, statically type checked with pre-py312 knowhow (which is the truth!).

American Python packages should run like American motorcycles, bleeding oil all over the road.

Lets Make America clunky af again

This may or may not be sarcasm

It's really really dangerous to project a parody onto a package author who's written both a build backend and a requirements parser. If Trump found out about this, the build backend might incorporate tariffs.

This is one plugin away from becoming a feature

Heil cobra!!

[–] logging_strict 5 points 15 hours ago

Because foreign packages have been STEALING our CPU cycles for TOO LONG! It's time to put AMERICA FIRST and make importing FAIR and BALANCED again!

[–] logging_strict 1 points 3 days ago* (last edited 3 days ago)

i use interrogate

Which has a fun bug. It uses ast to compile every module. If a module contains a syntax error, you get a traceback showing only the one line that contains the issue, but neither the module path nor the line number.

The only way to find the issue is to just test your code base and ignore interrogate until the code base is more mature.

interrogate author and maintainers: UX ... what's that?

The most obvious bug in the history of bugs ... setuptools maintainers, oh that's a feature request
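Not interrogate's API — just a stdlib sketch of what a sane error report could look like, walking a hypothetical project root and recovering the module path and line number the traceback omits:

```python
import ast
import pathlib


def find_syntax_errors(root: str) -> list[str]:
    """Parse every module under root; report path and line number
    for any syntax error, instead of a bare one-line traceback."""
    problems = []
    for path in sorted(pathlib.Path(root).rglob("*.py")):
        try:
            ast.parse(path.read_text(), filename=str(path))
        except SyntaxError as exc:
            problems.append(f"{path}:{exc.lineno}: {exc.msg}")
    return problems
```

Run it over the package folder before interrogate, and the one mystery line gets a file path and line number attached.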

[–] logging_strict 1 points 4 days ago

No normal sqa user will have any clue how to do this.

might wanna take this up with the sqlalchemy maintainers OR look at the sqlalchemy tests for how they are testing driver integration.

Some other dialect+driver will do the exact same thing as this jdbc driver.

[–] logging_strict 1 points 6 days ago (2 children)
informix+ifx_jdbc://hostname:port/db_name;INFORMIXSERVER=server_name;delimident=y;user=user;password=pass

becomes

informix+ifx_jdbc://user:pass@hostname:port/db_name?INFORMIXSERVER=server_name&delimident=y

If this is a SQLAlchemy engine connection URL, then it would look like the fixed one.
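A minimal stdlib sketch of that rewrite (the function name and the 9088 port are made up for illustration; in a real SQLAlchemy setup you would build the URL with sqlalchemy.engine.URL.create instead of string munging):

```python
def jdbc_to_sqlalchemy_url(jdbc_url: str) -> str:
    """Rewrite a JDBC-style URL with ;key=value params into
    SQLAlchemy's dialect+driver://user:pass@host:port/db?opts form."""
    head, _, params = jdbc_url.partition(";")
    scheme, _, rest = head.partition("://")
    # Collect the ;key=value tail into a dict
    opts = dict(p.split("=", 1) for p in params.split(";") if p)
    user = opts.pop("user")
    password = opts.pop("password")
    query = "&".join(f"{k}={v}" for k, v in opts.items())
    return f"{scheme}://{user}:{password}@{rest}?{query}"


print(jdbc_to_sqlalchemy_url(
    "informix+ifx_jdbc://hostname:9088/db_name"
    ";INFORMIXSERVER=server_name;delimident=y;user=user;password=pass"
))
# → informix+ifx_jdbc://user:pass@hostname:9088/db_name?INFORMIXSERVER=server_name&delimident=y
```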

[–] logging_strict 1 points 6 days ago

https://github.com/OpenInformix/IfxPy/blob/master/README.md

This does not appear to support SQLAlchemy. Has a dbapi2 but no official driver+dialect supported by SQLAlchemy.

In which case, why are you bothering? Time would be better spent adding SQLAlchemy support than using the dbapi2 directly. Otherwise the code will be completely non-portable when you decide to switch to another driver+dialect.

[–] logging_strict 1 points 1 week ago

setuptools is for enforcing a cartel; naively, you can simplify that to "for building".

I hope uv completely replaces setuptools and build. Then the maintainers can move on to another racket.

[–] logging_strict 0 points 1 week ago* (last edited 1 week ago)

Regular dependencies lack the host url and hashes. Those are the most important parts.

For the full details, i encourage you to read pep751

^^ look a link! Oh so clickable and tempting. Go ahead. You know that pretty blue font-color is just asking for it. And after clicking the font-color changes colors. Wonder what font-color it'll become? Hmmmm

[–] logging_strict 1 points 1 week ago

git sources are not allowed by pypi.org

pypi.org cartel does not like competition; github repos are no exception.

Try publishing packages with git sourced dependencies and find out the hard way, or save time and take my word for it.

-- author of wreck

[–] logging_strict 1 points 1 week ago* (last edited 1 week ago)

i love requirements files, venv, and pyenv.

Bringing requirements into pyproject.toml does not have enough value add to bother with. My requirements files are hierarchical, extensively using -r and -c options, AND venv aware.

pep751 does bring value, by stating both the host url and the hash of every package.

setuptools will fight this to continue their stranglehold on Python

[–] logging_strict 1 points 1 week ago

i'm sad to report

wreck 0.3.4.post0 no longer emits build front end options into .lock files wreck#30.

Background on the efforts to beg and plead with the setuptools maintainers to bend ever so slightly.

Continuing from the denied request for a way to pass build front end options thru requirements files, so non-pypi.org hosts are known: setuptools#4928

This hurts those hosting packages locally or remotely on non-pypi.org package index servers. For those who are, the packages themselves give no clue where the dependencies and transitive packages are hosted.

Each and every user would need to have a ~/.pip/pip.conf or pass the --extra-index-url pip install cli option. And somehow know all the possible package index servers.
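Such a per-user config might look like this (hostname hypothetical):

```ini
# ~/.pip/pip.conf
[global]
extra-index-url = https://pkgs.example.internal/simple
```

Every user on every machine would need this, for every private index. That's the burden being described.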

This allows the pypi.org cartel to continue along its merry way unimpeded.

Wish pep751 good luck, and may there be a .unlock equivalent. i do not yet understand how the pep751 implementers will bypass setuptools and build.

[–] logging_strict 1 points 1 week ago (2 children)

From the hatch docs, i'm not seeing where it discusses publishing to alternative package warehouses.

4
submitted 2 months ago* (last edited 2 months ago) by logging_strict to c/python
 

Market research

This post is only about dependency management, not package management, not build backends.

You know about these:

  • uv

  • poetry

  • pipenv

You are probably not familiar with:

  • pip-compile-multi

    (toposort, pip-tools)

You are definitely unfamiliar with:

  • wreck

    (pip-tools, pip-requirements-parser)

pip-compile-multi creates lock files. Has no concept of unlock files.

wreck produces both lock and unlock files. venv aware.

Both sync dependencies across requirement files

Both act only upon requirements files, not venv(s)

Up to speed with wreck

You are familiar with .in and .txt requirements files.

.txt is split out into .lock and .unlock. The latter is for packages which are not apps.

Create .in files that are interlinked with -r and -c. No editable builds. No urls.

(If this is a deal breaker feel free to submit a PR)
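As an illustration (file names hypothetical), a dev.in interlinked with -r and -c might look like:

```
# requirements/dev.in
-r prod.in        # dev installs everything prod does
-c pins-base.in   # constrain versions without installing
pytest
coverage
```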

pins files

pins-*.in are for common constraints. The huge advantage here is documenting the why.

Without the documentation, even the devs have no idea whether or not the constraint is still required.

pins-*.in files are split up so each tackles one issue. The beauty is the issue must be documented with enough detail to bring yourself up to speed.

Explain the origin of the issue in terms a 6 year old can understand.
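A hypothetical pins file in that spirit, one issue per file, with the version bound made up for illustration:

```
# requirements/pins-cffi.in
#
# Newer cffi releases dropped wheels for an interpreter our CI still
# builds against. Like a toy that only fits certain boxes, a too-new
# cffi does not fit our oldest box.
# Remove this pin when that interpreter is retired.
cffi<1.17
```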

Configuration

python -m pip install wreck

This is logging-strict pyproject.toml


[tool.wreck]
create_pins_unlock = false

[[tool.wreck.venvs]]
venv_base_path = '.venv'
reqs = [
    'requirements/dev',
    'requirements/kit',
    'requirements/pip',
    'requirements/pip-tools',
    'requirements/prod',
    'requirements/manage',
    'requirements/mypy',
    'requirements/tox',
]

[[tool.wreck.venvs]]
venv_base_path = '.doc/.venv'
reqs = [
    'docs/requirements',
]

[project]
dynamic = [
    "optional-dependencies",
    "dependencies",
    "version",
]

[tool.setuptools.dynamic]
dependencies = { file = ["requirements/prod.unlock"] }
optional-dependencies.pip = { file = ["requirements/pip.lock"] }
optional-dependencies.pip_tools = { file = ["requirements/pip-tools.lock"] }
optional-dependencies.dev = { file = ["requirements/dev.lock"] }
optional-dependencies.manage = { file = ["requirements/manage.lock"] }
optional-dependencies.docs = { file = ["docs/requirements.lock"] }
version = {attr = "logging_strict._version.__version__"}

Look how short and simple that is.

The only thing you have to unlearn is being so timid.

More venvs. More constraints and requirements complexity.

Do it

mkdir -p .venv || :;
pyenv version > .venv/python-version
python -m venv .venv

mkdir -p .doc || :;
echo "3.10.14" > .doc/python-version
cd .doc && python -m venv .venv; cd - &>/dev/null

. .venv/bin/activate
# python -m pip install wreck
reqs fix --venv-relpath='.venv'

There will be no avoidable resolution conflicts.

Preferable to do this within tox-reqs.ini

Details

The TOML file format expects the paths to be single quoted (literal strings, so no escaping). The paths are relative, without the last file suffix.

If pyproject.toml is not in the cwd, pass --path='path to pyproject.toml'

create_pins_unlock = false tells wreck to not produce .unlock files for pins-*.in files.

DANGER

This is not for the faint of heart. Avoid it if you can. This is for the folks who often say, Oh really, hold my beer!

For pins that span venvs, add the file suffix .shared

e.g. pins-typing.shared.in

wreck deals with one venv at a time. Files that span venvs have to be dealt with manually and carefully.

Issues

  1. no support for editable builds

  2. no url support

  3. no hash support

  4. your eyes will tire and your brain will splatter on the wall from all the eye rolling, after sifting thru endless posts on uv and poetry and none about pip-compile-multi or wreck

  5. Some folks love having all dependency management within pyproject.toml. These folks are deranged and it's impossible to convince them otherwise. pyproject.toml is a config file, not a database. It should be read only.

  6. a docs link on pypi.org is 404. Luckily there are two docs links. Should really just fix that, but it's left like that to see if anyone notices. No one did.

4
submitted 2 months ago* (last edited 2 months ago) by logging_strict to c/python
 

Finally got around to creating a gh profile page

The design is to give activity insights on:

  • what Issues/PRs working on

  • future issues/PRs

  • for fun, show off package mascots

All out of ideas. Any suggestions? How did you improve your github profile?

14
submitted 2 months ago* (last edited 2 months ago) by logging_strict to c/python
 

From helping other projects, i have run across a fundamental issue for which web searches have not given appropriate answers.

What should go in a tarball and what should not?

Is it only the build files, python code, and package data and nothing else?

Should it include tests/ folder?

Should it include development and configuration files?

Have seven published packages which include almost all the files and folders. Including:

  • .gitignore

  • .gitattributes

  • .github folder tree

  • docs/

  • tests/

  • Makefile

  • all config files

  • all tox files

  • pre-commit config file

My thinking is that the tarball should have everything needed to maintain the package, but this belief has been challenged: that the tarball is not appropriate for that.

Thoughts?

 

PEP 735 what is its goal? Does it solve our dependency hell issue?

A deep dive, and out comes this limitation:

The mutual compatibility of Dependency Groups is not guaranteed.

-- https://peps.python.org/pep-0735/#lockfile-generation

Huh?! Why not?

mutual compatibility or go pound sand!

pip install -r requirements/dev.lock
pip install -r requirements/kit.lock -r requirements/manage.lock

The above code, purposefully, does not afford pip a fighting chance. If there are incompatibilities, they'll come out when trying randomized combinations.

Without a means to test for and guarantee mutual compatibility, end users will always find themselves in dependency hell.

Any combination of requirement files (or dependency groups), intended for the same venv, MUST always work!

What if this is scaled further, instead of one package, a chain of packages?!
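A sketch of that randomized-combination idea (pure stdlib; the file names are hypothetical and the actual pip invocation is left commented out, since each combination should go into a fresh venv):

```python
import itertools


def lock_combinations(lock_files, max_size=2):
    """Yield every combination of lock files up to max_size.
    Every combination intended for the same venv MUST install cleanly."""
    for n in range(1, max_size + 1):
        yield from itertools.combinations(sorted(lock_files), n)


for combo in lock_combinations(
    ["requirements/dev.lock", "requirements/kit.lock", "requirements/manage.lock"]
):
    args = [arg for f in combo for arg in ("-r", f)]
    # In a throwaway venv:
    # subprocess.run([sys.executable, "-m", "pip", "install", *args], check=True)
    print("pip install", *args)
```

Any combination that fails to resolve is exactly the mutual-compatibility hole PEP 735 declines to guarantee.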

11
submitted 4 months ago* (last edited 4 months ago) by logging_strict to c/python
 

In a requirements-*.in file, at the top of the file, are lines with -c and -r flags followed by the path to another requirements-*.in file. These are relative paths (ignoring URLs).

Say we have docs/requirements-pip-tools.in

-r ../requirements/requirements-prod.in
-c ../requirements/requirements-pins-base.in
-c ../requirements/requirements-pins-cffi.in

...

The intent is that compiling this would produce docs/requirements-pip-tools.txt

But there is confusion as to which flag to use. It's non-obvious.

constraint

Subset of requirements features. Intended to restrict package versions. A constraint on its own does not install the package!

Does not support:

  • editable mode (-e)

  • extras (e.g. coverage[toml])
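To make the -r vs -c distinction concrete (file names and version bound hypothetical):

```
# prod.in — pulled in via -r: cffi WILL be installed
-c pins-cffi.in
cffi

# pins-cffi.in — referenced via -c: only caps the version,
# never causes an install on its own
cffi<1.17
```

So: -r when the packages must actually be installed, -c when you only want to restrict versions.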

Personal preference

  • always organize requirements files in folder(s)

  • don't prefix requirements files with requirements-; i'm just doing it here

  • DRY principle applies; split out constraints which are shared.
