What is __init__.py for in a Python source directory?
Without __init__.py, the directory is a namespace package, not a regular package. It's not the same thing, as @methane pointed out with an example here.
It used to be a required part of a package (old, pre-3.3 "regular package", not newer 3.3+ "namespace package").
Python defines two types of packages, regular packages and namespace packages. Regular packages are traditional packages as they existed in Python 3.2 and earlier. A regular package is typically implemented as a directory containing an __init__.py file. When a regular package is imported, this __init__.py file is implicitly executed, and the objects it defines are bound to names in the package’s namespace. The __init__.py file can contain the same Python code that any other module can contain, and Python will add some additional attributes to the module when it is imported.
But just click the link; it contains an example, more information, and an explanation of namespace packages, the kind of packages without __init__.py.
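For illustration, here is a minimal sketch (the package names are made up) contrasting the two kinds of packages on Python 3.3+:

# regular package: __init__.py is executed when the package is imported
regular_pkg/
    __init__.py
    mod.py

# namespace package: no __init__.py, yet still importable on 3.3+
namespace_pkg/
    mod.py

# assuming the parent directory is on sys.path, both import the same way:
import regular_pkg.mod
import namespace_pkg.mod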
Files named __init__.py are used to mark directories on disk as Python package directories. If you have the files
mydir/spam/__init__.py
mydir/spam/module.py
and mydir is on your path, you can import the code in module.py as
import spam.module
or
from spam import module
If you remove the __init__.py file, Python will no longer look for submodules inside that directory, so attempts to import the module will fail.
The __init__.py file is usually empty, but it can be used to export selected portions of the package under a more convenient name, hold convenience functions, etc. Given the example above, the contents of the init module can be accessed with
import spam
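For instance, a hypothetical spam/__init__.py could re-export a name from module.py and add a helper of its own (cook and breakfast are made-up names, just to illustrate the pattern):

# spam/__init__.py
from spam.module import cook     # re-export under the shorter "spam.cook" name

def breakfast():                 # convenience function living in __init__.py
    return cook('eggs')

# client code
import spam
spam.breakfast()                 # helper defined in __init__.py
spam.cook('bacon')               # name re-exported from spam.module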
Based on this, __init__.py was required under Python 2.x and is still required under Python 2.7.12 (I tested it), but it is allegedly no longer required from Python 3.3 onwards, and is not required under Python 3.4.3 (I tested it). See stackoverflow.com/questions/37139786 for more details.
What's the point of putting import spam inside __init__.py?
The import spam isn't inside __init__.py; he has it inside main.py or whatever file needs to import the contents of spam. You can treat spam as an object you import and use functions defined within spam/__init__.py.
In addition to labeling a directory as a Python package and defining __all__, __init__.py allows you to define any variable at the package level. Doing so is often convenient if a package defines something that will be imported frequently, in an API-like fashion. This pattern promotes adherence to the Pythonic "flat is better than nested" philosophy.
An example
Here is an example from one of my projects, in which I frequently import a sessionmaker called Session to interact with my database. I wrote a "database" package with a few modules:
database/
__init__.py
schema.py
insertions.py
queries.py
My __init__.py contains the following code:
import os
from sqlalchemy.orm import sessionmaker
from sqlalchemy import create_engine
engine = create_engine(os.environ['DATABASE_URL'])
Session = sessionmaker(bind=engine)
Since I define Session here, I can start a new session using the syntax below. This code would be the same executed from inside or outside of the "database" package directory.
from database import Session
session = Session()
Of course, this is a small convenience -- the alternative would be to define Session in a new file like "create_session.py" in my database package, and start new sessions using:
from database.create_session import Session
session = Session()
Further reading
There is a pretty interesting reddit thread covering appropriate uses of __init__.py here:
http://www.reddit.com/r/Python/comments/1bbbwk/whats_your_opinion_on_what_to_include_in_init_py/
The majority opinion seems to be that __init__.py files should be very thin to avoid violating the "explicit is better than implicit" philosophy.
engine, sessionmaker, create_engine, and os can all also be imported from database now... seems like you've made a mess of that namespace.
You can use __all__ = [...] to limit what gets imported with import *. But aside from that, yes, you're left with a messy top-level namespace.
Names starting with an underscore are not picked up by import * by default. E.g.: import os as _os and use _os inside the __init__.py module in place of os.
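Putting both comments together, a tidier variant of that __init__.py might look like this (just a sketch, not the author's original code):

# database/__init__.py
import os as _os                                   # "_" names are skipped by "from database import *"
from sqlalchemy import create_engine as _create_engine
from sqlalchemy.orm import sessionmaker as _sessionmaker

__all__ = ['Session']                              # limits what "import *" pulls in

_engine = _create_engine(_os.environ['DATABASE_URL'])
Session = _sessionmaker(bind=_engine)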
There are 2 main reasons for __init__.py:

For convenience: the other users will not need to know your functions' exact location in your package hierarchy (documentation).

your_package/
    __init__.py
    file1.py
    file2.py
    ...
    fileN.py

# in __init__.py
from .file1 import *
from .file2 import *
...
from .fileN import *

# in file1.py
def add():
    pass

Then others can call add() by

from your_package import add

without knowing file1's inside functions, like

from your_package.file1 import add

If you want something to be initialized; for example, logging (which should be put in the top level):

import logging.config
logging.config.dictConfig(Your_logging_config)
__init__.py may be useful sometimes, but not always.
The __init__.py file makes Python treat directories containing it as packages.
Furthermore, it is the first file to be loaded when the package is imported, so you can use it to execute code that you want to run whenever the package is loaded, or to specify the submodules to be exported.
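For instance, a hypothetical package could do both in its __init__.py (mypkg and submodule_a are invented names):

# mypkg/__init__.py
print('initializing mypkg')   # runs when the package is first imported

from . import submodule_a     # makes mypkg.submodule_a available after "import mypkg"

__all__ = ['submodule_a']     # controls what "from mypkg import *" exposes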
Since Python 3.3, __init__.py is no longer required to define directories as importable Python packages.
Check PEP 420: Implicit Namespace Packages:
Native support for package directories that don’t require __init__.py marker files and can automatically span multiple path segments (inspired by various third party approaches to namespace packages, as described in PEP 420)
Here's the test:
$ mkdir -p /tmp/test_init
$ touch /tmp/test_init/module.py /tmp/test_init/__init__.py
$ tree -at /tmp/test_init
/tmp/test_init
├── module.py
└── __init__.py
$ python3
>>> import sys
>>> sys.path.insert(0, '/tmp')
>>> from test_init import module
>>> import test_init.module
$ rm -f /tmp/test_init/__init__.py
$ tree -at /tmp/test_init
/tmp/test_init
└── module.py
$ python3
>>> import sys
>>> sys.path.insert(0, '/tmp')
>>> from test_init import module
>>> import test_init.module
References:
https://docs.python.org/3/whatsnew/3.3.html#pep-420-implicit-namespace-packages
https://www.python.org/dev/peps/pep-0420/
Is __init__.py not required for packages in Python 3?
Although Python works without an __init__.py file, you should still include one.
It specifies that the directory should be treated as a package, so include it (even if it is empty).
There is also a case where you may actually use an __init__.py file:
Imagine you had the following file structure:
main_methods
|- methods.py
And methods.py contained this:
def foo():
return 'foo'
To use foo() you would need one of the following:
from main_methods.methods import foo # Call with foo()
from main_methods import methods # Call with methods.foo()
import main_methods.methods # Call with main_methods.methods.foo()
Maybe you need (or want) to keep methods.py inside main_methods (runtimes/dependencies for example), but you only want to import main_methods.
If you changed the name of methods.py to __init__.py then you could use foo() by just importing main_methods:
import main_methods
print(main_methods.foo()) # Prints 'foo'
This works because __init__.py is treated as part of the package.
Some Python packages actually do this. An example is with JSON, where running import json is actually importing __init__.py from the json package (see the package file structure here):
Source code: Lib/json/__init__.py
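In other words, the familiar top-level json API is simply whatever that __init__.py defines or re-exports:

import json

# these names resolve against json/__init__.py, which delegates to the
# json.encoder / json.decoder submodules internally
print(json.dumps({'a': 1}))    # '{"a": 1}'
print(json.loads('{"a": 1}'))  # {'a': 1}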
In Python the definition of a package is very simple. As in Java, the hierarchical structure and the directory structure are the same. But you have to have __init__.py in a package. I will explain the __init__.py file with the example below:
package_x/
|-- __init__.py
|-- subPackage_a/
|------ __init__.py
|------ module_m1.py
|-- subPackage_b/
|------ __init__.py
|------ module_n1.py
|------ module_n2.py
|------ module_n3.py
__init__.py can be empty, as long as it exists. It indicates that the directory should be regarded as a package. Of course, __init__.py can also set the appropriate content.
If we add a function in module_n1:
def function_X():
    print("function_X in module_n1")
    return
After running:
>>> from package_x.subPackage_b.module_n1 import function_X
>>> function_X()
function_X in module_n1
Then we followed the package hierarchy and called module_n1's function. We can use __init__.py in subPackage_b like this:
__all__ = ['module_n2', 'module_n3']
After running:
>>> from package_x.subPackage_b import *
>>> module_n1.function_X()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named module_n1
Hence, when using * imports, which modules get imported from the package is subject to the __init__.py content.
(An explicit import such as from package_x.subPackage_b.module_n1 import function_X still works regardless, since __all__ only affects * imports.)
An __init__.py file makes Python treat the directory it is in as a loadable module.
For people who prefer reading code, I put Two-Bit Alchemist's comment here.
$ find /tmp/mydir/
/tmp/mydir/
/tmp/mydir//spam
/tmp/mydir//spam/__init__.py
/tmp/mydir//spam/module.py
$ cd ~
$ python
>>> import sys
>>> sys.path.insert(0, '/tmp/mydir')
>>> from spam import module
>>> module.myfun(3)
9
>>> exit()
$
$ rm /tmp/mydir/spam/__init__.py*
$
$ python
>>> import sys
>>> sys.path.insert(0, '/tmp/mydir')
>>> from spam import module
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named spam
>>>
It facilitates importing other Python files. When you place this file in a directory (say stuff) containing other .py files, then you can do something like import stuff.other.
root\
stuff\
other.py
morestuff\
another.py
Without this __init__.py inside the directory stuff, you couldn't import other.py, because Python doesn't know where the source code for stuff is and is unable to recognize it as a package.
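A minimal sketch of what becomes possible once the marker file is there (some_function is a made-up name standing in for whatever other.py defines):

# with stuff/__init__.py in place, and root on sys.path:
import stuff.other           # stuff is now recognized as a package
from stuff import other
other.some_function()        # call into other.py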
An __init__.py file makes imports easy. When an __init__.py is present within a package, function a() can be imported from file b.py like so:
from b import a
Without it, however, you can't import directly. You have to amend the system path:
import sys
sys.path.insert(0, 'path/to')  # the directory containing b.py, not the file itself
from b import a
One thing __init__.py allows is converting a module to a package without breaking the API or creating extraneous nested namespaces or private modules*. This helps when I want to extend a namespace.
If I have a file util.py containing
def foo():
...
then users will access foo with
from util import foo
If I then want to add utility functions for database interaction, and I want them to have their own namespace under util, I'll need a new directory**, and to keep API compatibility (so that from util import foo still works), I'll call it util/. I could move util.py into util/ like so,
util/
__init__.py
util.py
db.py
and in util/__init__.py do
from util import *
but this is redundant. Instead of having a util/util.py file, we can just put the util.py contents in __init__.py and the user can now
from util import foo
from util.db import check_schema
I think this nicely highlights how a util package's __init__.py acts in a similar way to a util module.
* this is hinted at in the other answers, but I want to highlight it here
** short of employing import gymnastics. Note it won't work to create a new package with the same name as the file, see this
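For completeness, a rough sketch of the end state described above (foo and check_schema abbreviated to signatures only):

util/
    __init__.py    # holds the old util.py contents, e.g. def foo(): ...
    db.py          # e.g. def check_schema(): ...

# client code keeps working unchanged:
from util import foo
from util.db import check_schema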
Couldn't you just do from util import check_schema, since you already did from util import * in __init__.py?
The from util import * would be in util/__init__.py, and so wouldn't import db; it would import the contents of util/util.py. I'll clarify the answer.
If you're using Python 2 and want to load siblings of your file, you can simply add the parent folder of your file to the system path for the session. It will behave about the same as if your current file were an init file.
import os
import sys
dir_path = os.path.dirname(__file__)
sys.path.insert(0, dir_path)
After that regular imports relative to the file's directory will work just fine. E.g.
import cheese
from vehicle_parts import *
# etc.
Generally you want to use a proper __init__.py file instead, though, but when dealing with legacy code you might be stuck with, for example, a library hard-coded to load a particular file and nothing else. For those cases this is an alternative.
__init__.py: It is a Python file found in a package directory; it is invoked when the package or a module in the package is imported. You can use this to execute package initialization code, i.e. whenever the package is imported the Python statements are executed first, before the other modules in this folder get executed. It is similar to the main function of a C or Java program, but it lives in the Python package (folder) rather than in a core Python file. Global variables defined in this __init__.py file are also accessible once the package is imported into a Python file.
For example, I have an __init__.py file in a folder called pymodlib; this file contains the following statements:
print(f'Invoking __init__.py for {__name__}')
pystructures = ['for_loop', 'while__loop', 'ifCondition']
When I import this package pymodlib in my solution module, notebook, or Python console, these two statements get executed while importing. So in the log or console you would see the following output:
>>> import pymodlib
Invoking __init__.py for pymodlib
In the next statement of the Python console, I can access the global variable:
>>> pymodlib.pystructures
It gives the following output:
['for_loop', 'while__loop', 'ifCondition']
Now, from Python 3.3 onwards the use of this file is optional for making a folder a Python package, so you can skip including it in the package folder.
Then do sys.path.insert(0, '/path/to/datetime'), replacing that path with the path to whatever directory you just made. Now try something like from datetime import datetime; datetime.now(). You should get an AttributeError (because it is importing your blank file now). If you were to repeat these steps without creating the blank init file, this would not happen. That's what it's intended to prevent.
I get ImportError: attempted relative import with no known parent package. My structure: /PyToHtml with __init__.py, pytohtml.py, test.py, where test.py has: from .pytohtml import HTML