Tutorial - start here!

Well, this module actually does nothing.

It’s a skeleton module that serves as a template for python3 projects:

Some features:

  • Autogenerated documentation

  • Python packaging

  • Self-contained docker-compose scheme within the package (optional)

  • Creating a debian package of your software (very optional)

  • Development and production downloads from git

  • Uploading your package to PyPi

  • A C++ extension with SWIG: pass numpy arrays back and forth between python and C++ (optional)

  • Live/hot-reload software development using notebooks

  • Proper code organization

  • Proper logging (never optional!)

What’s in it?

Let’s take a closer look at the directory structure:

skeleton/               # python package dir
    setup.py            # standard python install script
    bash/               # helper scripts
    sudo/               # scripts for hard dependencies: sudo apt-get etc.
    build/              # python package build
    debian/             # debian package of your python package (optional)
    docker/             # Files for dockerization (optional) (see below)
    docs/               # this documentation
    licenses/           # your favorite FOSS licenses - pick one
    notebook/           # interactive code development with notebooks
    skeleton/           # python source code directory
        data/           # package's static data
        ini/            # a default/example ini config file --> for microservices, the final one should live in secrets (**)
        greeters/       # demo
        qt/             # if you're into PyQt/PySide2 (optional)
        skeleton_cpp/   # interfaced cpp code as this python package's submodule OR as a separate python package

As you can see, there’s a lot here. Some more details:

  • You can also run your package as a microservice using a docker-compose-dev.yml file (see below)

  • The package includes a C++ submodule whose code is interfaced to python and compiled on the fly when you install the python package (see below)

  • Directory notebook/ shows you how to use notebooks for interactive code development. There you can also test calls to the C++ part.

If you wish to create a docker(-compose) scaffold, just type:

bash/dockerize.bash

After running it, your directory structure looks like this:

skeleton/
    dev.bash                # softlinked to --> (A)
    docker-compose-dev.yml  # softlinked to --> (B)
    secrets/                # password, etc. secrets for microservice deployment
        dev.ini             # copied from (C)
    datashare/              # directory for data sharing with docker run

    skeleton/               # THE ORIGINAL PYTHON PACKAGE DIRECTORY (see above)
        bash/               # helper scripts
            dev.bash        # (A) helper that runs docker-compose (copies (C) always into secrets/dev.ini)
        docker/             # Dockerfile(s) (optional) of this microservice
            docker-compose-dev.yml
                            # (B) docker-compose file for dockerization/microservice
        skeleton/           # python source code directory
            ini/
                default.ini # (C) default .ini file

i.e. a series of extra directories and softlinks is created, while the python package directory stays unmodified.

It is recommended to run the dockerize.bash script after you have completed the “Creating Your Own Package” chapter (see below).

Why do it this way? We want a self-contained python package with dockerization information in it. On the other hand, we don’t want our package/repo to be a docker-compose “wireframe”. Let’s emphasize that: the only thing we upload to git is the python package directory.

Once you have created that directory structure & links, it’s a great starting point for developing your python package into a microservice. Of course, for your final service, possibly comprising several microservices, you would create a separate repo containing only docker-compose-dev.yml (and little more) and use the following scheme:

docker-compose-dev.yml # consider using docker-compose "include" clause from directories microservice-1/docker/ etc.
datashare/
secrets/
microservice-1/     # as git submodule
microservice-2/     # as git submodule
microservice-3/     # as git submodule
...

For more info about git submodules, please see this.

Creating Your Own Package

Let’s suppose your name is Janne Jantunen and you have decided to create a new package named your_package_name. Proceed like this:

git clone https://github.com/elsampsa/skeleton
mv skeleton your_package_name
cd your_package_name # the self-contained python package dir
bash/reinit.bash
bash/cleanpy.bash
bash/setauthor.bash "Janne Jantunen"
bash/setver.bash 0 0 1

That should change all occurrences of “skeleton” (in the file and directory names and inside the files) into “your_package_name”, among other things.

If you don’t want the cpp extensions, edit setup.py & comment out the cpp section.

If you want to use the provided dockerization scheme (see above), run next:

bash/dockerize.bash
cd your_package_name

This will create a series of directories and softlinks.

Finally, run this command:

cd your_package_name
pip3 install --user -e .

This way your package is installed in “editable mode” (that extra “-e”). It creates a link from python’s site-packages directory (on Linux, look for $HOME/.local/lib/python3.x/site-packages/*.egg-link files) to your local directory. Now python’s package system can find your module.

If you did not use the -e switch, it would simply copy your code into python’s site-packages directory, and your future changes to the codebase would not be picked up.
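
To double-check that the editable install indeed points at your working copy, a quick sanity check in python (here skeleton stands for your_package_name):

import skeleton
print(skeleton.__file__)   # should show a path inside your local source tree, not site-packages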

Now you can try these commands:

skeleton -h
skeleton-service -h

They map to skeleton/skeleton/cli.py and service.py (see setup.py for how this mapping is done).

One is a desktop-like end-point that caches configurations under the user’s home dir, while the other is a microservice-like end-point that uses an .ini file for configuration and sensitive information.
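
Such command mappings are typically wired up with setuptools console_scripts entry points. A minimal sketch of what the relevant part of setup.py could look like (the entry-point function name “main” is an assumption; see the actual setup.py for how it is really done):

from setuptools import setup, find_packages

setup(
    name="skeleton",
    version="0.0.1",
    packages=find_packages(),
    # each console command maps to a "module:function" pair
    entry_points={
        "console_scripts": [
            "skeleton = skeleton.cli:main",             # the "skeleton" command
            "skeleton-service = skeleton.service:main"  # the "skeleton-service" command
        ]
    },
)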

You can modify the documentation by editing the .rst files in the docs/ directory. The idea is that you hide/delete this page (tutorial.rst) from your own module.

After installing the package, you can re-compile and check the documentation with:

cd docs
./compile.bash
cd ..
firefox docs/index.html

If you want to push your newly minted python project into a git repo, do:

git init
git remote add origin https://[your-personal-git-repository]/your_package_name.git
git add *
git commit -m "initial commit"
git push -u origin master

NOTE: be sure to do this only in the python package directory, not in the whole dockerization directory scheme (if you use that).

Please note the file .gitignore, which lists the files we do not want in our package.

The name of your package (which is also the name of your python module) should not contain “weird characters”, e.g. “.”, “,”, “-” etc.

Git tags can be added with:

git tag v1.2 -m "cool version v1.2"
git push origin --tags

However, it’s recommended to use the bash/git_tag.bash script together with the bash/setver.bash script.

Now production users can install specific revisions of the software with pip3 (see here)

Docker

If you ran the dockerization script, your directory structure should look as explained here. In order to build the docker image and create a running container, simply run in the main directory:

dev.bash build
dev.bash up

The C++ extension is installed into the docker image, while all other code is read from the local directory via a shared docker mount.

The docker container sees this directory structure (see Dockerfile.dev and docker-compose-dev.yml for more details):

/usr/src/app/          # directory on the docker image

    requirements.txt   # files copied from the python package directory
    MANIFEST.in        # into the docker image
    setup.py           #

    skeleton/          # python SOURCE CODE DIRECTORY as a shared docker mount
                       # NOTE: NOT the python package directory (see above)
                       # docker shall not write __pycache__ directories here
                       # (see "PYTHONPYCACHEPREFIX" in Dockerfile.dev)
        ...            #
        docker/        # docker entry-point scripts
        ...            #

Dependencies

Keep them up to date in requirements.txt.

Files

This page has been produced with the file skeleton/docs/tutorial.rst. Go ahead and open it in your editor. Open also “index.rst”. Edit them accordingly. That said, let’s take a closer look at the directory structure (in the following, skeleton is synonymous with your_package_name):

  • skeleton/ : This is the python package directory

    README.md

    Readme file in online markdown format (nice for github). Play around with online markdown here

    licenses/

    Pick a FOSS license and copy it here as “COPYING”

    CHANGELOG

    Recent changes in the code

    setup.py

    Python setup script for creating packages

    MANIFEST.in

    Used by setup.py

    .gitignore

    Files ignored by git

  • skeleton/sudo : System-specific install scripts

    Sometimes you just need to use apt-get install for some base packages & other libraries that your package needs. It’s good practice to collect all those extra super-user commands into a separate script. For example, the ubuntu.bash script in the bootstrap/ directory installs the build environment & swig that you need to compile the cpp extensions.

  • bash/ : Some helper scripts

    dockerize.bash

    Creates a docker-compose “wireframe” directory structure around your python package

    add_untracked.bash

    Add all new files to git

    changestr.bash

    Helper script to change strings in your package

    reinit.bash

    Reinitializes the package name (you used this in the very beginning)

    cleanpy.bash

    Removes __pycache__ directories and .pyc files; consider running this before “git add .”

    dev.bash

    Docker-compose short-hand command

    make_venv.bash

    Short-hand command for creating a python virtualenv

    upload.bash

    Uploads your python package into PyPi

    test_upload.bash

    Uploads your python package into PyPi test repository

    setver.bash

    Helper script to change the version of your package

    setauthor.bash

    Helper script to change the author of the package

    git_tag.bash

    Pushes version information into git. This script is modified by setver.bash

    git_rm_tag.bash

    Removes the version information (if you made an error)

  • skeleton/docs/ : Documentation and autogenerated documentation lives here

    index.html

    Redirects to the autogenerated html [don’t touch]

    index.rst

    The main index starts from here

    tutorial.rst

    The file you are looking at right now (or the html version of it)

    intro.rst

    Introduction of the module

    requirements.rst

    What the user needs in order to use this module

    examples.rst

    Copy-pastable examples for the api user

    submodules.rst

    Documentation generated automatically from source code. Just add a new entry here for each submodule, don’t touch otherwise

    license.rst

    Copyright and License

    authors.rst

    Who is the maintainer & author

    compile.bash

    Always run this after modifying your documents / source code [don’t edit]

    clean.bash

    Clean autogenerated documentation [don’t edit]

    conf.py

    Sphinx configuration file. Edit this and try different styles

    .nojekyll

    Dummy file. This is needed for the on-line documentation to work with GitHub [don’t touch]

    snippets/

    Example snippet source files and scripts for generating pages from them. Also edit the file requirements.txt here.

    generated/

    Auto-generated documents [don’t touch]

  • skeleton/skeleton : Python source code directory

    __init__.py

    Python module initialization

    greeters/

    Submodule

    data/

    Static data the module needs and that is included in the python package (see the sketch after this listing for one way to read it at runtime)

  • skeleton/skeleton_cpp/ : A numpy C++ extension module that uses SWIG

    README.md

    read this first

    compile.bash

    stand-alone compilation command. Remember that when you run pip3 install, it also compiles the package

    runswig.bash

    stand-alone swig command.

    skeleton_cpp_module.cpp

    cpp example

    skeleton_cpp_module.h

    dummy cpp example

    skeleton_cpp_module.i

    an example interface file

    skeleton_cpp_module.py

    generated by swig

    skeleton_cpp_module_wrap.cpp

    generated by swig
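
A side note on the data/ directory mentioned above: files that ship inside the python package are best read via importlib.resources instead of hard-coded paths. A minimal sketch, assuming Python 3.9+ and a hypothetical file data/example.txt:

from importlib import resources

# "skeleton" is the installed module name; the file name below is hypothetical
text = (resources.files("skeleton") / "data" / "example.txt").read_text()
print(text)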

It’s time to start documenting! Edit the files docs/*.rst. Here are some nice tips for using Sphinx and here are some more.

To create and recreate the docs (after changing the code, etc.), run in the docs/ directory:

./compile.bash

To see your documentation, launch

firefox index.html

Online autodocumentation

Github users

After creating the git repo, also create a site for your project like this: Settings => GitHub pages => Source: master branch / docs folder => press Save. Now your documentation is online on GitHub! (Don’t forget to include the “docs/” folder in git.)

Gitlab users

Using a generic service that runs a gitlab server? No problem. There is a simple hack to put the autodocs online.

  • In gitlab, create a wiki page for your project and write the words “Hello world” there. Next, clone the wiki repository (not the main repository, just the wiki repository) using the “clone repository” button on the right.

  • Edit your project’s “docs/compile.bash”. Uncomment the options for Gitlab and set the directory of your wiki repository correctly. Run “./compile.bash”. Now the whole documentation tree has been copied to your wiki repo directory.

  • In the wiki repo directory, open “home.md” in an editor. Modify it to look like this:

For documentation
[click here](_build/html/index.html)
  • You still need to use git to add and push all the files to the wiki repository online (there are instructions in “compile.bash”)

Organize with namespaces

You can also organize your module under namespaces. Read more about it here.

Let’s assume two modules:

namespace_subpackage1/
    setup.py            Remember that automatic package finder does not work for namespace packages
    etc.
    namespace/          Keep this directory empty
        subpackage1/
            __init__.py
            submodule1
            submodule2

The other one being

namespace_subpackage2/
    setup.py            Remember that automatic package finder does not work for namespace packages
    etc.
    namespace/          Keep this directory empty
        subpackage2/
            __init__.py
            submodule1
            submodule2

Once installed, the following works:

from namespace.subpackage1 import something
from namespace.subpackage2 import something_else
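
Since the automatic package finder does not handle namespace packages, the setup.py of each part lists its packages explicitly (or uses setuptools’ find_namespace_packages). A minimal sketch for the first module, with assumed names:

from setuptools import setup

setup(
    name="namespace_subpackage1",
    version="0.0.1",
    # list the namespaced package explicitly; alternatively:
    # packages=find_namespace_packages(include=["namespace.*"])
    packages=["namespace.subpackage1"],
)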

Code Organization

This tutorial comes with a module that has a single submodule called “greeters” that is organized as follows:

greeters/
    __init__.py
    base.py
    fancy.py

The idea is that base.py has some base class definitions that are used by fancy.py to create derived classes. __init__.py has been tweaked so that the user can import with

from your_package_name.greeters import FancyHelloWorld

instead of the cumbersome

from your_package_name.greeters.fancy import FancyHelloWorld

In general, use __init__.py only for exposing your API. Do not write any classes or methods therein.
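
For the greeters submodule this boils down to a one-liner; a minimal sketch of what greeters/__init__.py could contain (the actual file may re-export more names):

# your_package_name/greeters/__init__.py
# Expose the public API here; no classes or methods are defined in this file.
from .fancy import FancyHelloWorld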

The file cli.py is the command-line entry point for the whole application. See setup.py for how a command is mapped to cli.py.
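
If you are unsure what such an entry point looks like, here is a minimal, hypothetical sketch of a cli.py (illustrative only; see the actual skeleton/skeleton/cli.py):

import argparse

def main():
    # parse command-line arguments for the "skeleton" command
    parser = argparse.ArgumentParser(description="skeleton command-line interface")
    parser.add_argument("--verbose", action="store_true", help="enable verbose output")
    args = parser.parse_args()
    if args.verbose:
        print("verbose mode on")

if __name__ == "__main__":
    main()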

Packaging

If you have done everything as instructed here, creating a distributable python package can be done as follows:

cd your_package_name
python3 setup.py sdist

Your distributable python package is now in directory “dist/”. You can install it with:

pip3 install --upgrade dist/your_package_name-version.tar.gz

The setup.py script automatically finds and includes python packages in the distribution package. In “MANIFEST.in” we also tell it to include the complete “docs/” directory and an auxiliary file from the “greeters” submodule. See “setup.py” for more instructions.

Polish your setup script with the following cycle:

rm dist/*
python3 setup.py sdist
pip3 install --user --upgrade --verbose dist/your_package_name-version.tar.gz

The verbose option is nice for spotting problems with, say, your post-install script defined in setup.py. Once you have gotten rid of all the errors, you can be sure that it also works when people install your python package directly from git using pip3.

Package testing

When testing the pip3 installation, use virtualenv to see that you got the dependencies right. First, create a virtualenv:

virtualenv -p python3 test

Then let’s use that virtualenv (we also clean up PYTHONPATH) to test your production system:

cd test
source bin/activate
export PYTHONPATH=
pip3 install --upgrade git+https://[your-personal-git-repository]/your_package_name.git

See here how the end user would install your python module directly from git.

To exit from virtualenv, use:

deactivate

PyPi

If you upload your python package to PyPi (the Python Package Index), end users can install it simply with:

pip3 install --user packagename

To use the python package repositories, follow these steps:

  1. Create an account in PyPi repository and in PyPi test repository

  2. Create file $HOME/.pypirc with the following contents:

[distutils]
index-servers =
    pypi
    test

[pypi]
username: xxx
password: xxx

[test]
repository: https://test.pypi.org/legacy/
username: xxx
password: xxx

The modern way to connect safely to PyPi is via a token. The [pypi] section then looks like this:

[pypi]
username = __token__
password = a_very_long_token

Get your token from pypi.org!

  3. Install twine

pip3 install --user twine

Now you can use the scripts test_upload.bash and upload.bash to send your package to the python (test) repository (first, edit those scripts).

To install from test repository (instead of the official one), use:

pip3 install --user --extra-index-url https://test.pypi.org/simple/ packagename

Debian packaging

Create a .deb package like this:

make -f debian/rules package

Uploading to an Ubuntu PPA

Start like this:

debuild -S -sa

Now in the upper level directory you will have the file skeleton_0.0.0-0ubuntu1-bionicppa1_source.changes. You can send it to your Ubuntu PPA (personal package archive) with:

dput ppa:your-name/your-repo skeleton_0.0.0-0ubuntu1-bionicppa1_source.changes

Example snippets

The following python code can be downloaded from [here]

from skeleton.greeters import FancyHelloWorld
# let's create an instance
gr=FancyHelloWorld(person="Sampsa")
# print the result
print(gr)

Check out the directory “docs/snippets/”. It serves as a collection of small example programs. Open and edit “form_snippets.bash” there, then run it. Remember to recompile the documentation once you’ve run that script.

Notebooks

Notebooks can be used for interactive code development and testing!

Go to the notebook/ directory and run:

jupyter notebook

Open the example notebook. Observe how neatly you can do interactive code development with notebooks. The example notebook also shows how to use the installed python package (including the cpp extensions).
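
A typical first cell for live/hot-reload development might look roughly like this (the autoreload magics are standard IPython; the import follows the snippet shown earlier):

# reload edited source files automatically before executing each cell
%load_ext autoreload
%autoreload 2

from skeleton.greeters import FancyHelloWorld
gr = FancyHelloWorld(person="Sampsa")
print(gr)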

Miscellaneous

  • Check out in the source code how I initialize and check a large number of parameters in the constructor (a rough illustration of one such pattern is sketched at the end of this section)

  • To keep your code clean and PEP8 compliant, there’s a nice tool called “autopep8”. Install it with:

pip3 install --user autopep8

Now you can fix a python file like this:

autopep8 --aggressive --in-place yourfile.py
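
As for the constructor-parameter checking mentioned above, the actual implementation is in the source code; purely as an illustration, one common pattern for initializing and validating many parameters looks roughly like this:

class Greeter:
    # name -> (expected type, default value)
    parameter_defs = {
        "person": (str, "nobody"),
        "repeat": (int, 1),
        "shout":  (bool, False),
    }

    def __init__(self, **kwargs):
        # assign each declared parameter, falling back to its default
        for name, (typ, default) in self.parameter_defs.items():
            value = kwargs.pop(name, default)
            if not isinstance(value, typ):
                raise TypeError("parameter '%s' must be %s" % (name, typ.__name__))
            setattr(self, name, value)
        # anything left over is an unknown parameter
        if kwargs:
            raise TypeError("unknown parameters: %s" % list(kwargs))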