The standard package manager in the Python ecosystem is pip.
For an overview of installing packages in Python, head over to the tutorial.
As per the documentation, pip takes the following steps (a sketch of driving pip from code follows the list):
Identify the base requirements. The user supplied arguments are processed here.
Resolve dependencies. What will be installed is determined here.
Build wheels. All the dependencies that can be are built into wheels.
Install the packages (and uninstall anything being upgraded/replaced).
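To watch these steps happen, pip can be driven programmatically through the current interpreter; a minimal sketch, using requests as a stand-in for any PyPI package:

```python
import subprocess
import sys

# Running "python -m pip" in a subprocess is the documented way to
# invoke pip from code; "requests" is just an example package.
package = "requests"

# pip processes the argument, resolves dependencies, builds or downloads
# wheels, and installs them; --verbose surfaces each of these steps.
subprocess.run(
    [sys.executable, "-m", "pip", "install", "--verbose", package],
    check=True,
)
```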
To keep your environment clean you should set up and activate a venv. By using a venv you do not change the globally installed versions of packages. Pip will still use the global pip cache, so multiple environments installing the same packages waste minimal disk space.
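Besides the usual python -m venv on the command line, an environment can also be created with the standard-library venv module; a minimal sketch (the directory name .venv is just a common convention):

```python
import venv

# Create an isolated environment in ./.venv; with_pip=True bootstraps
# pip into it so packages can be installed right away.
venv.create(".venv", with_pip=True)

# Activation happens in the shell, not in Python:
#   source .venv/bin/activate      (Linux/macOS)
#   .venv\Scripts\activate         (Windows)
```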
Complex packages such as ML frameworks have many dependencies. In some cases it may be simpler to use a Docker image, if one is provided by the developers, or conda.
When searching for tensorflow on PyPI you will see that the files have the extension .whl.
These are wheel files: zip files with a custom extension, defined in PEP 427.
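Because a wheel is just a zip archive, its contents can be inspected with the standard-library zipfile module; a minimal sketch, assuming a wheel has already been downloaded (the filename below is hypothetical):

```python
import zipfile

# A hypothetical wheel; the name follows the PEP 427 convention:
# {name}-{version}-{python tag}-{abi tag}-{platform tag}.whl
wheel_path = "example_pkg-1.0.0-py3-none-any.whl"

with zipfile.ZipFile(wheel_path) as wheel:
    # A wheel holds the package's modules plus a .dist-info directory
    # with METADATA, WHEEL, and RECORD files describing the distribution.
    for name in wheel.namelist():
        print(name)
```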
When building wheels that need to run on many Linux distributions, it can be beneficial to use manylinux.
From Read the Docs: “Setuptools is a collection of enhancements to the Python distutils that allow developers to more easily build and distribute Python packages, especially ones that have dependencies on other packages.” The documentation page mentions that setuptools is used to build the Python egg distributable format, which is deprecated in favor of wheels.
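As an illustration, a minimal setup.py using setuptools might look like the following sketch; the project name, version, and dependency are hypothetical:

```python
from setuptools import setup, find_packages

setup(
    name="example-pkg",            # hypothetical project name
    version="0.1.0",
    packages=find_packages(),      # discover packages automatically
    install_requires=[
        "requests>=2.0",           # a dependency on another package
    ],
)
```

With the wheel package installed, running python setup.py bdist_wheel produces a .whl file in the dist/ directory.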
Some of the Python magic is achieved through binary dependencies. Some of the tools used to bring those into the interpreter are: