
Installing Python packages from local file system folder to virtualenv with pip

Is it possible to install packages using pip from the local filesystem?

I have run python setup.py sdist for my package, which has created the appropriate tar.gz file. This file is stored on my system at /srv/pkg/mypackage/mypackage-0.1.0.tar.gz.

Now, in a virtual environment, I would like to install packages coming either from PyPI or from the specific local location /srv/pkg.

Is this possible?

PS I know that I can specify pip install /srv/pkg/mypackage/mypackage-0.1.0.tar.gz. That will work, but I am talking about using the /srv/pkg location as another place for pip to search if I typed pip install mypackage.

I was looking to install a PyPI package without setup.py, from a .whl wheel file; it installed once I downloaded the correct wheel for my Python version and ran pip install <Path-to-WHL-file>.

kejbaly2

What about:

pip install --help
...
  -e, --editable <path/url>   Install a project in editable mode (i.e. setuptools
                              "develop mode") from a local project path or a VCS url.

e.g. pip install -e /srv/pkg

where /srv/pkg is the top-level directory where 'setup.py' can be found.
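
For context, a minimal sketch of the layout this expects, with illustrative names:

/srv/pkg/
├── setup.py
└── mypackage/
    └── __init__.py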


This will install the package in develop mode, meaning it will just link back to where the sources are. If by any chance the sources are moved or deleted, importing the package will fail.
@MarcoDinacci What's interesting about --editable is that it seems to look into the local package's directory and set the source as a git repo if there is one - a bit more flexible than just a folder. I can't find documentation for this though.
While this is correct for installing a particular package, especially one under active development on a local machine or VCS URL, it does not answer the question about searching a parent directory for all local package sources, as opposed to one particular package source. The accepted answer works when you have a directory or URL with multiple packages you want to pip install from.
@Simon is there a way to tell pip not to look for a git repo? If I do pip install -e, it checks out the code from the git remote; I would like to install the code as it is (with changes not yet in the remote).
I made a simple working example of this if it's helpful: github.com/MareoRaft/…
Mikko Ohtamaa

I am pretty sure that what you are looking for is the --find-links option.

You can do

pip install mypackage --no-index --find-links file:///srv/pkg/mypackage
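
If you want pip to always search that folder (the question's PS), the same option can be made persistent. A minimal sketch, assuming a user-level pip configuration file (path varies by platform) or an environment variable:

# ~/.config/pip/pip.conf
[install]
find-links = file:///srv/pkg/mypackage

# or, for a single shell session:
export PIP_FIND_LINKS=file:///srv/pkg/mypackage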

pip install mypackage --no-index --find-links file:///srv/pkg/mypackage should work.
It does help. And we can use the -i option of pip to treat it as a local PyPI.
The equivalent easy_install command is easy_install --allow-hosts=None --find-links file:///srv/pkg/mypackage mypackage
Note that --find-links will still allow searching on PyPI if the package is not found in the specified location or if a newer version is available. On Windows, I find that combining this with --only-binary=packagename (or --only-binary=:all:) is extraordinarily useful. This prevents pip from trying to download any packages with native dependencies that only have source distributions on PyPI (and thus requiring compilation), which is the most common reason I have to try to install from a directory. Unlike --no-index, I can still install other packages from PyPI.
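
For illustration, the --only-binary combination described in the last comment might look like this (package name and path are placeholders):

pip install --find-links file:///srv/pkg --only-binary=:all: mypackage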
Dimitar

From the installing-packages page you can simply run:

pip install /srv/pkg/mypackage

where /srv/pkg/mypackage is the directory containing setup.py.

Additionally¹, you can install it from the archive file:

pip install ./mypackage-1.0.4.tar.gz

¹ Although this is noted in the question, it is included here due to its popularity.


This should be the default answer. No need to make pip hunt around with --find-links if you know exactly where your package is on the local machine.
This is the easiest solution! I could add that installing to a location other than the default also works: pip install ./mypackage-1.0.4.tgz --prefix C:\OtherLocation
Jean-Francois T.

I am installing pyfuzzy but it is not in PyPI; it returns the message: No matching distribution found for pyfuzzy.

I tried the accepted answer

pip install  --no-index --find-links=file:///Users/victor/Downloads/pyfuzzy-0.1.0 pyfuzzy

But it does not work either and returns the following error:

Ignoring indexes: https://pypi.python.org/simple
Collecting pyfuzzy
  Could not find a version that satisfies the requirement pyfuzzy (from versions: )
No matching distribution found for pyfuzzy

At last, I found a simple, good way here: https://pip.pypa.io/en/latest/reference/pip_install.html

Install a particular source archive file.
$ pip install ./downloads/SomePackage-1.0.4.tar.gz
$ pip install http://my.package.repo/SomePackage-1.0.4.zip

So the following command worked for me:

pip install ../pyfuzzy-0.1.0.tar.gz

Hope it can help you.


This worked for me. I tried the other approaches and was tripped up by the "Could not find a version that satisfies the requirement" error. Thanks.
Community

This is the solution that I ended up using:

import pip


def install(package):
    # Debugging
    # pip.main(["install", "--pre", "--upgrade", "--no-index",
    #         "--find-links=.", package, "--log-file", "log.txt", "-vv"])
    pip.main(["install", "--upgrade", "--no-index", "--find-links=.", package])


if __name__ == "__main__":
    install("mypackagename")
    raw_input("Press Enter to Exit...\n")

I pieced this together from pip install examples as well as from Rikard's answer on another question. The "--pre" argument lets you install non-production versions. The "--no-index" argument avoids searching the PyPI indexes. The "--find-links=." argument searches in the local folder (this can be relative or absolute). I used the "--log-file", "log.txt", and "-vv" arguments for debugging. The "--upgrade" argument lets you install newer versions over older ones.
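
Note that pip removed its top-level pip.main API in pip 10, so the script above only works with old pip versions; current pip documentation recommends running pip in a subprocess instead. A rough Python 3 sketch of the same install helper under that approach (the package name is just a placeholder):

import subprocess
import sys


def install(package):
    # Run the current interpreter's pip; the flags mirror the script above:
    # upgrade if already installed, skip PyPI, and search the local folder.
    subprocess.check_call([
        sys.executable, "-m", "pip", "install",
        "--upgrade", "--no-index", "--find-links=.", package,
    ])


if __name__ == "__main__":
    install("mypackagename")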

I also found a good way to uninstall them. This is useful when you have several different Python environments. It's the same basic format, just using "uninstall" instead of "install", with a safety measure to prevent unintended uninstalls:

import pip


def uninstall(package):
    response = raw_input("Uninstall '%s'? [y/n]:\n" % package)
    if "y" in response.lower():
        # Debugging
        # pip.main(["uninstall", package, "-vv"])
        pip.main(["uninstall", package])
    pass


if __name__ == "__main__":
    uninstall("mypackagename")
    raw_input("Press Enter to Exit...\n")

The local folder contains these files: install.py, uninstall.py, mypackagename-1.0.zip


Thanks. The "--find-links=." and "--no-index", where the key in glueing together a python-script inside my utility-package, that first removes the old-version-package from site-packages then installs a tar.gz'ed package from a subdir of the utility-package-folder (did not knew about --find-links=.), then creates the wheel and installs it. All automated via plumbum and click. If someone wants it, I'll post a link. Upvoted.
Jean-Francois T.

The --find-links option does the job, and it works from a requirements.txt file too!

You can put package archives in some folder and always pick up the latest one without changing the requirements file, for example a requirements folder:

.
├───requirements.txt
└───requirements
    ├───foo_bar-0.1.5-py2.py3-none-any.whl
    ├───foo_bar-0.1.6-py2.py3-none-any.whl
    ├───wiz_bang-0.7-py2.py3-none-any.whl
    ├───wiz_bang-0.8-py2.py3-none-any.whl
    ├───base.txt
    ├───local.txt
    └───production.txt

Now in requirements/base.txt put:

--find-links=requirements
foo_bar
wiz_bang>=0.8

A neat way to update proprietary packages: just drop a new one in the folder.

In this way you can install packages from the local folder AND PyPI with the same single call: pip install -r requirements/production.txt
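
For reference, a production.txt in this layout would typically just pull in the shared base file plus production-only pins; a hypothetical example (contents are illustrative, not from the original answer):

# requirements/production.txt
-r base.txt
# production-only packages would be pinned here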

PS. See my cookiecutter-djangopackage fork for how to split requirements and use a folder-based requirements organization.


Thanks, this is even better than what I was thinking of doing!
What are the .whl files? Are they needed? What are the local.txt and production.txt files? I feel like there's too much stuff in the example that only makes it more complicated and confusing.
@ashrasmun these are the packages that you would like to install from the local filesystem. If you do not know what .whl files are, then you are probably looking at the wrong question, because this is exactly about how to install packages from the local filesystem using pip.
@JanuszSkonieczny fair enough. Thanks for explanation and guidance on which knowledge is needed to understand the answer.
bunbun

Assuming you have a virtualenv and a requirements.txt file, you can define inside this file where to get the packages from:

# Published pypi packages 
PyJWT==1.6.4
email_validator==1.0.3
# Remote GIT repo package, this will install as django-bootstrap-themes
git+https://github.com/marquicus/django-bootstrap-themes#egg=django-bootstrap-themes
# Local GIT repo package, this will install as django-knowledge
git+file:///soft/SANDBOX/python/django/forks/django-knowledge#egg=django-knowledge
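
Requirements files also accept plain local filesystem paths alongside PyPI names and VCS URLs, which matches the question directly; hypothetical lines (paths are illustrative):

# Local source directory containing setup.py
./libs/mypackage
# Local sdist or wheel archive
/srv/pkg/mypackage/mypackage-0.1.0.tar.gz
# Editable install from a local path
-e ./libs/mypackage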

Sławomir Lenart

Having your requirements in requirements.txt and eggs_dir as a directory,

you can build your local cache:

$ pip download -r requirements.txt -d eggs_dir

then, using that "cache" is simple like:

$ pip install -r requirements.txt --find-links=eggs_dir


Thanks, the -r was the part I was missing, as the docs don't seem to indicate it is required. I expected pip to just install everything from --find-links, but you also need to tell it, in some way, what to install from that directory. Cheers!
Oliver

To install only from local you need 2 options:

--find-links: where to look for dependencies. There is no need for the file:// prefix mentioned by others.

--no-index: do not look in pypi indexes for missing dependencies (dependencies not installed and not in the --find-links path).

So you could run from any folder the following:

pip install --no-index --find-links /srv/pkg /path/to/mypackage-0.1.0.tar.gz

If your mypackage is set up properly, it will list all its dependencies, and if you used pip download to fetch the cascade of dependencies (i.e. dependencies of dependencies, etc.), everything will work.

If you want to use the PyPI index when it is accessible, but fall back to local wheels when it is not, you can remove --no-index and add --retries 0. You will see pip pause for a bit while it tries to check PyPI for a missing dependency (one not installed), and when it finds it cannot reach it, it will fall back to local. There does not seem to be a way to tell pip to "look for local ones first, then the index".
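
As a concrete instance of that fallback variant, using the same placeholder paths as above:

pip install --find-links /srv/pkg --retries 0 /path/to/mypackage-0.1.0.tar.gz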


Brooke Yang

What you need is the --find-links option of pip install.

-f, --find-links If a url or path to an html file, then parse for links to archives. If a local path or file:// url that's a directory, then look for archives in the directory listing.

In my case, after python -m build, the tar.gz package (and the .whl file) are generated in the ./dist directory.

pip install --no-index -f ./dist YOUR_PACKAGE_NAME

Any tar.gz Python package in ./dist can be installed this way.

But if your package has dependencies, this command will raise an error. To solve this, you can either pip install those dependencies from the official PyPI index first and then add --no-deps, like this:

pip install --no-index --no-deps -f ./dist YOUR_PACKAGE_NAME

or copy your dependencies' packages into the ./dist directory.
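
Spelled out, the first variant is a two-step sequence; a sketch with a made-up dependency name:

pip install requests                 # the dependency, installed from PyPI first (name is made up)
pip install --no-index --no-deps -f ./dist YOUR_PACKAGE_NAME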


Aldo 'xoen' Giambelluca

I've been trying to achieve something really simple and failed miserably, probably I'm stupid.

Anyway, if you have a script/Dockerfile which downloads a Python package zip file (e.g. from GitHub) and you then want to install it, you can use the file:/// prefix to install it, as shown in the following example:

$ wget https://example.com/mypackage.zip
$ echo "${MYPACKAGE_MD5}  mypackage.zip" | md5sum --check -
$ pip install file://$PWD/mypackage.zip

NOTE: I know you could install the package straight away using pip install https://example.com/mypackage.zip, but in my case I wanted to verify the checksum (you can never be paranoid enough), and I failed miserably when trying to use the various options that pip provides / the #md5 fragment.

It's been surprisingly frustrating to do something so simple directly with pip. I just wanted to pass a checksum and have pip verify that the zip was matching before installing it.

I was probably doing something very stupid but in the end I gave up and opted for this. I hope it helps others trying to do something similar.


Ben Caine

In my case, it was because this library depended on another local library, which I had not yet installed. Installing the dependency with pip, and then the dependent library, solved the issue.


rubmz

If you want to install one local package (package A) to be used inside another local project/package (B), this is quite simple. All you need is to cd into (B) and call:

pip install /path/to/package(A)

Of course, you will first need to build and install package (A) with:

sudo python3 ./setup.py install

And each time you change package (A), just run setup.py again in package (A), then run pip install ... again inside the consuming project/package (B).