Depends on your data, but there are two major contenders for that title: 7z (with solid mode off) and zpaq. You will probably get slightly better compression with zpaq, but it's not widely known.
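For reference, creating each kind of archive from the command line looks roughly like this (paths and archive names are placeholders; check your tool version, flags can differ between builds):

    # 7-Zip: -ms=off disables solid mode so files stay individually extractable
    7z a -ms=off -mx=9 archive.7z /path/to/files

    # zpaq: -method 5 is the slowest/strongest preset
    zpaq add archive.zpaq /path/to/files -method 5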
I tried with zpaq but it told me that the archive type did not support partial extraction
That is kind of inconsequential, as you can always compress the files individually if you wish and then make a tar with all of them together.
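A minimal sketch of that approach (directory and filenames are placeholders; assumes 7z and tar are on your PATH):

    # compress each file on its own, then bundle the results into one tar
    for f in /path/to/files/*; do
        7z a -mx=9 "$f.7z" "$f"
    done
    tar cf bundle.tar /path/to/files/*.7z

Each .7z can then be pulled out of the tar and extracted on its own.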
The question is what kind of files you have; different algorithms will do better or worse depending on the data. And of course turning solid archives off costs most algorithms some compression ratio if the files are similar to each other.
images and videos
mostly jpg png mp4 webm
It depends on the dataset. I would suggest 7z and simply unchecking "solid archive". There is info here on running a test to find the best compression: Link
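If you want to run a quick test yourself, something along these lines works (a rough sketch; the sample directory and archive names are placeholders):

    # compare compressed sizes across tools on a representative sample
    7z a -ms=off test.7z sample/
    zpaq add test.zpaq sample/ -method 5
    tar czf test.tar.gz sample/
    ls -l test.7z test.zpaq test.tar.gz

Run it on a sample that actually resembles your full dataset, otherwise the numbers won't transfer.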
You may want to look into filesystem compression, as it will be much easier to implement and may suit your needs.
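On ZFS or Btrfs, for example, transparent compression is a one-liner (the dataset and mount names here are hypothetical):

    # ZFS: enable lz4 on a dataset; applies to newly written data
    zfs set compression=lz4 pool/archive

    # Btrfs: mount with zstd compression
    mount -o compress=zstd /dev/sdX /mnt/archive

Keep in mind that already-compressed media like jpg/mp4/webm won't shrink much either way; both filesystems effectively skip data that doesn't compress.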