Using geometric transformations

In this example, we will see how to use geometric transformations in the context of image processing.

from __future__ import print_function

import math
import numpy as np
import matplotlib.pyplot as plt

from skimage import data
from skimage import transform as tf

margins = dict(hspace=0.01, wspace=0.01, top=1, bottom=0, left=0, right=1)

Basics

Several different geometric transformation types are supported: similarity, affine, projective and polynomial.
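Each of these types has a corresponding transformation class. As a quick sketch (the parameter values below are arbitrary, chosen only for illustration):

```python
import math
from skimage import transform as tf

# A similarity transform: uniform scale, rotation and translation.
similarity = tf.SimilarityTransform(scale=2, rotation=math.pi / 6,
                                    translation=(10, 20))
# An affine transform additionally allows per-axis scale and shear.
affine = tf.AffineTransform(scale=(1.5, 0.8), rotation=math.pi / 6,
                            shear=0.2, translation=(10, 20))
# Projective and polynomial transforms are usually estimated from
# point correspondences rather than built from explicit parameters.
projective = tf.ProjectiveTransform()
polynomial = tf.PolynomialTransform()

# The linear transform types all store a 3x3 homogeneous matrix.
print(similarity.params.shape)
```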

Geometric transformations can be created either from explicit parameters (e.g. scale, shear, rotation and translation) or from a transformation matrix.

First we create a transformation using explicit parameters:

tform = tf.SimilarityTransform(scale=1, rotation=math.pi / 2,
                               translation=(0, 1))
print(tform.params)
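For reference, a similarity transform with scale s, rotation θ and translation (tx, ty) corresponds to the homogeneous 3x3 matrix with first row (s·cosθ, -s·sinθ, tx) and second row (s·sinθ, s·cosθ, ty). A numpy-only sketch, which should mirror the printed tform.params above up to floating-point noise:

```python
import math
import numpy as np

def similarity_matrix(scale, rotation, translation):
    """Homogeneous 3x3 matrix of a 2D similarity transform."""
    c, s = math.cos(rotation), math.sin(rotation)
    tx, ty = translation
    return np.array([[scale * c, -scale * s, tx],
                     [scale * s,  scale * c, ty],
                     [0.0,        0.0,       1.0]])

m = similarity_matrix(1, math.pi / 2, (0, 1))
print(m)
```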

Alternatively, you can define a transformation from the transformation matrix itself:

matrix = tform.params.copy()
matrix[1, 2] = 2
tform2 = tf.SimilarityTransform(matrix)

These transformation objects can then be used to apply forward and inverse coordinate transformations between the source and destination coordinate systems:

coord = [1, 0]
print(tform2(coord))
print(tform2.inverse(tform(coord)))
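Under the hood, these calls are matrix-vector products in homogeneous coordinates. A numpy-only sketch using tform2's matrix from above (a 90-degree rotation with translation (0, 2), written with the cosine terms rounded to exact zeros):

```python
import numpy as np

# tform2's matrix: 90-degree rotation, translation (0, 2).
matrix = np.array([[0., -1., 0.],
                   [1.,  0., 2.],
                   [0.,  0., 1.]])

def apply(matrix, coord):
    """Apply a homogeneous 3x3 transform to one (x, y) coordinate."""
    x, y = coord
    hx, hy, hw = matrix @ np.array([x, y, 1.0])
    return np.array([hx / hw, hy / hw])

forward = apply(matrix, [1, 0])                   # like tform2([1, 0])
backward = apply(np.linalg.inv(matrix), forward)  # like tform2.inverse(...)
print(forward, backward)
```

The inverse transformation is simply the inverse of the homogeneous matrix, so applying it to the forward result recovers the original coordinate.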

Image warping

Geometric transformations can also be used to warp images:

text = data.text()

tform = tf.SimilarityTransform(scale=1, rotation=math.pi / 4,
                               translation=(text.shape[0] / 2, -100))

rotated = tf.warp(text, tform)
back_rotated = tf.warp(rotated, tform.inverse)

fig, (ax1, ax2, ax3) = plt.subplots(ncols=3, figsize=(8, 3))
fig.subplots_adjust(**margins)
plt.gray()
ax1.imshow(text)
ax1.axis('off')
ax2.imshow(rotated)
ax2.axis('off')
ax3.imshow(back_rotated)
ax3.axis('off')

Figure: the original image, the rotated image, and the back-rotated image (plot_geometric_1.png)

Parameter estimation

In addition to the basic functionality described above, you can also estimate the parameters of a geometric transformation with a least-squares fit to point correspondences.
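For intuition, here is what such a least-squares fit looks like for the simpler affine case, using only numpy and hypothetical point data (skimage's projective estimation solves a similar linear system in homogeneous coordinates):

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares estimate of a 2D affine transform mapping src -> dst.

    Solves dst ~= [x, y, 1] @ P for the 3x2 parameter matrix P and
    returns it transposed as a 2x3 affine matrix.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    X = np.hstack([src, np.ones((len(src), 1))])  # (n, 3) design matrix
    P, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return P.T

# Hypothetical demo data: points related by a pure translation of (5, -3).
src = np.array([(0, 0), (0, 50), (300, 50), (300, 0)])
dst = src + np.array([5, -3])
A = estimate_affine(src, dst)
print(A)
```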

Among other things, this can be used for image registration or rectification, where you have a set of control points or homologous/corresponding points in two images.

Let’s assume we want to recognize letters in a photograph that was taken not from the front but at an angle. In the simplest case, that of a planar paper surface, the letters are projectively distorted, and simple matching algorithms cannot match such distorted symbols. One solution is to warp the image so that the distortion is removed and then apply a matching algorithm:

text = data.text()

src = np.array((
    (0, 0),
    (0, 50),
    (300, 50),
    (300, 0)
))
dst = np.array((
    (155, 15),
    (65, 40),
    (260, 130),
    (360, 95)
))

tform3 = tf.ProjectiveTransform()
tform3.estimate(src, dst)
warped = tf.warp(text, tform3, output_shape=(50, 300))

fig, (ax1, ax2) = plt.subplots(nrows=2, figsize=(8, 3))
fig.subplots_adjust(**margins)
plt.gray()
ax1.imshow(text)
ax1.plot(dst[:, 0], dst[:, 1], '.r')
ax1.axis('off')
ax2.imshow(warped)
ax2.axis('off')

Figure: the source image with the control points marked in red, and the rectified output (plot_geometric_2.png)
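With exactly four point correspondences (no three collinear), the eight parameters of a projective transform are determined exactly, so the estimated transform should reproduce the control points. A quick sanity check on the fit above:

```python
import numpy as np
from skimage import transform as tf

src = np.array([(0, 0), (0, 50), (300, 50), (300, 0)])
dst = np.array([(155, 15), (65, 40), (260, 130), (360, 95)])

tform3 = tf.ProjectiveTransform()
tform3.estimate(src, dst)

# Residual distance between each mapped source point and its
# destination; for an exact four-point fit this is essentially zero.
residuals = np.linalg.norm(tform3(src) - dst, axis=1)
print(residuals)
```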

plt.show()

Python source code and IPython Notebook: download (generated using skimage 0.11dev)
