logging_strict

joined 11 months ago
[–] logging_strict 1 points 3 days ago

As the quantity of dependencies and the complexity of their relationships increase, so too does the need for management tools to deal with the chaos.

Most Python coders cope by keeping things overly simple, avoiding complexity at all costs.

Do you fully embrace requirement file complexity or do you avoid it?

  1. assumes one venv

  2. has no way to deal with unavoidable incompatibilities, which may be due to a package becoming unmaintained or overly zealous limiting of allowed versions

  3. has no way to adapt to security vulnerabilities (e.g. CVE-2024-9287)

  4. has no intelligent way to normalize both direct and transitive dependency versions across lock files

12
submitted 3 days ago* (last edited 3 days ago) by logging_strict to c/python
 

From helping other projects, I have run across a fundamental issue for which web searches have not given appropriate answers.

What should go in a tarball and what should not?

Is it only the build files, Python code, and package data, and nothing else?

Should it include tests/ folder?

Should it include development and configuration files?

I have seven published packages which include almost all the files and folders, including:

  • .gitignore

  • .gitattributes

  • .github folder tree

  • docs/

  • tests/

  • Makefile

  • all config files

  • all tox files

  • pre-commit config file

My thinking is that the tarball should have everything needed to maintain the package, but this belief has been challenged: the claim is that the tarball is not the appropriate place for that.
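For what it's worth, with a setuptools build backend the sdist contents can be steered explicitly. A sketch of a MANIFEST.in that would keep the maintenance files listed above (setuptools-specific; other build backends have their own mechanisms, and the exact file names here are just the common ones):

```text
# MANIFEST.in -- sketch; setuptools sdist only
include .gitignore .gitattributes Makefile tox.ini .pre-commit-config.yaml
graft tests
graft docs
graft .github
```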

Thoughts?

[–] logging_strict 1 points 1 week ago

Ignoring concurrency.

For a write to be transactional, validate-pyproject would need to be called twice:

once prior to the read and again prior to the write.

Is that occurring always?

Haven't checked whether validate-pyproject has an API, so that it can be called on a str rather than only on a file.

[–] logging_strict 0 points 1 week ago (2 children)

A strict schema and a spec are not the same. The package validate-pyproject can check whether a pyproject.toml follows the spec, while not using a strict schema.

A schema is similar to using Rust: every element is strictly typed. Whether a field is an int or a str is not enforced by a spec.

If there were a strict schema, the package validate-pyproject would be unnecessary.

[–] logging_strict 1 points 1 week ago

But it's treated 100% like a crappy CRUD database with no modern features or SQL.

It's a file containing records with a strict schema, and nothing else.

[–] logging_strict 1 points 1 week ago (4 children)

The strictyaml schema holds a pinch of nuance.

The value argument is automagically coerced to a str. Which is nice, since the field value can be either an integer or a str, and I want a str, not an int.

A Rust solution would be superior, but the Python API is reasonable; not bad at all.
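Roughly what that coercion buys, as a toy stand-in (not strictyaml's actual internals): a strictly-typed str field accepts either scalar and always hands the consumer a str:

```python
def coerce_to_str(record: dict) -> dict:
    """Toy stand-in for a strict schema with str-typed fields:
    whether yaml handed us 24 or "24", the consumer sees "24"."""
    return {key: str(value) for key, value in record.items()}
```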

[–] logging_strict 1 points 1 week ago

parsing lock files

There are a few edge cases in parsing dependency URLs. So it's not completely black and white.

Just have to look over pip-compile-multi to see that (I have high praise for that project, don't get me wrong).
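One such edge case, sketched naively (a hypothetical helper, not pip-compile-multi's code): a lock line can be a plain pin or a PEP 508 direct URL requirement, and a `#` inside a URL is a fragment (e.g. `#sha256=...`), not a comment, so naive comment-stripping corrupts it:

```python
def parse_lock_line(line: str):
    """Naive sketch of classifying one lock-file line.
    Note: stripping everything after the first '#' would corrupt
    URL requirements carrying a #sha256=... fragment."""
    line = line.strip()
    if " @ " in line:                     # PEP 508 direct URL requirement
        name, url = line.split(" @ ", 1)
        return ("url", name.strip(), url.strip())
    if "==" in line:                      # ordinary pinned requirement
        name, version = line.split("==", 1)
        return ("pin", name.strip(), version.strip())
    return ("other", line, None)
```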

[–] logging_strict 1 points 1 week ago (1 children)

Then I'm misunderstanding what dependency groups are supposed to be storing. Just the equivalent of requirements.in files, mapped to a target? And no -c (constraints) support?!

Feels like tying one hand behind our back.

[–] logging_strict 1 points 1 week ago (2 children)

Especially in JS; some package.json files are super long. The sqlite author would blush looking at that.

[–] logging_strict 2 points 1 week ago* (last edited 1 week ago) (2 children)

You are not wrong, yaml can be confusing.

Recently got tripped up on sequence of mapping of mapping. Which is just a simple list of records.

But for the life of me, couldn't get a simple example working.

Ended up reversing the logic.

Instead of parsing a yaml str, I created the sample list of dict and asked strictyaml to produce the yaml str.

Turns out the record is indented four spaces, not two.

- file: "great_file_name_0.yml"
    key_0: "value 0"
- file: "great_file_name_1.yml"
    key_0: "value 0"

Something like ^^. That is a yaml database. It has records, a schema, and can be safely validated!

The strictyaml documentation covers ridiculously simple cases. There are no practical examples. So it was no help.

Parser kept complaining about duplicate keys.

[–] logging_strict 1 points 1 week ago (10 children)

In this super specific case, the data being worked with is a long list of dicts: a schema-less table. There would be frequent updates to this data as package versions are upgraded, fixes are made, and security patches are added.

[–] logging_strict 1 points 1 week ago* (last edited 1 week ago)

Not going in circles; this is helpful for me.

If you have strong arguments for a read-write toml, I'd like to hear them.

[–] logging_strict 1 points 1 week ago* (last edited 1 week ago)

Highly suggest reading the strictyaml docs

The author lays out both

Should be required reading for anyone dealing with config files, especially those encountering yaml.

Warning: after reading these, and confirming the examples yourself, packages using pyyaml will come off as lesser.

 

PEP 735: what is its goal? Does it solve our dependency hell issue?

A deep dive, and out comes this limitation:

The mutual compatibility of Dependency Groups is not guaranteed.

-- https://peps.python.org/pep-0735/#lockfile-generation

Huh?! Why not?

mutual compatibility or go pound sand!

pip install -r requirements/dev.lock
pip install -r requirements/kit.lock -r requirements/manage.lock

The above commands, purposefully, do not afford pip a fighting chance. If there are incompatibilities, they'll come out when trying randomized combinations.

Without a means to test for and guarantee mutual compatibility, end users will always find themselves in dependency hell.

Any combination of requirement files (or dependency groups), intended for the same venv, MUST always work!
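A naive sketch of such a mutual compatibility test (hypothetical helpers; real resolution involves version ranges, markers, and extras, so this only catches contradictory exact pins):

```python
def parse_pins(lock_text: str) -> dict[str, str]:
    """Collect name==version pins from one lock file (naive: skips
    comments and option lines, ignores hashes, markers, extras, URLs)."""
    pins = {}
    for line in lock_text.splitlines():
        line = line.strip()
        if line.startswith(("#", "-")) or "==" not in line:
            continue
        name, _, rest = line.partition("==")
        pins[name.strip().lower()] = rest.split()[0]
    return pins


def mutually_compatible(*lock_texts: str) -> bool:
    """True when no two lock files pin the same package differently."""
    merged: dict[str, str] = {}
    for text in lock_texts:
        for name, version in parse_pins(text).items():
            if merged.setdefault(name, version) != version:
                return False
    return True
```

Running a check like this across every lock-file combination intended for the same venv is the guarantee the PEP leaves on the table.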

What if this is scaled further, instead of one package, a chain of packages?!

11
submitted 2 months ago* (last edited 2 months ago) by logging_strict to c/python
 

In a requirements-*.in file, at the top, are lines with -r and -c flags followed by a path to another requirements-*.in file. These use relative paths (ignoring URLs).

Say we have docs/requirements-pip-tools.in

-r ../requirements/requirements-prod.in
-c ../requirements/requirements-pins-base.in
-c ../requirements/requirements-pins-cffi.in

...

The intent is that compiling this would produce docs/requirements-pip-tools.txt

But there is confusion as to which flag to use. It's non-obvious.

constraint

A subset of requirements features, intended to restrict package versions. A constraint alone does not install the package; the package is installed only if something else requires it!

Does not support:

  • editable mode (-e)

  • extras (e.g. coverage[toml])
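The distinction can be made mechanical. A sketch (hypothetical helper) that splits the top-of-file flags, with -r meaning "install these too" and -c meaning "only limit versions":

```python
def classify_refs(in_file_text: str):
    """Split -r (nested requirements: installed) lines from
    -c (constraints: version limits only, never installed) lines."""
    requirements, constraints = [], []
    for line in in_file_text.splitlines():
        line = line.strip()
        if line.startswith("-r "):
            requirements.append(line[3:].strip())
        elif line.startswith("-c "):
            constraints.append(line[3:].strip())
    return requirements, constraints
```

Fed the docs/requirements-pip-tools.in example above, it would report one nested requirements file and two constraints files.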

Personal preference

  • always organize requirements files in folder(s)

  • don't prefix requirements files with requirements-; only doing it here for illustration

  • DRY principle applies; split out constraints which are shared.
