Modules#

openpifpaf#

An open implementation of PifPaf.

class openpifpaf.Annotation(keypoints, skeleton, sigmas=None, *, categories=None, score_weights=None, suppress_score_index=None)#

set(data, joint_scales=None, *, category_id=1, fixed_score=None, fixed_bbox=None)#

Set the data (keypoint locations, category, …) for this instance.

json_data(coordinate_digits=2)#

Data ready for json dump.

class openpifpaf.Configurable(**kwargs)#

Make a class configurable with CLI and by instance.

Warning

This is an experimental class. It is in limited use already but should not be expanded for now.

To use this class, inherit from it in the class that you want to make configurable. There is nothing else to do if your class does not have an __init__ method. If it does, accept extra keyword arguments (kwargs) in its signature and pass them on to the super constructor.

Example:

>>> class MyClass(openpifpaf.Configurable):
...     a = 0
...     def __init__(self, myclass_argument=None, **kwargs):
...         super().__init__(**kwargs)
...     def get_a(self):
...         return self.a
>>> MyClass().get_a()
0

Instance configurability allows a class configuration variable to be overwritten with an instance variable by passing that variable as a keyword argument to the class constructor:

>>> MyClass(a=1).get_a()  # instance variable overwrites value locally
1
>>> MyClass().get_a()  # while the class variable is untouched
0

classmethod cli(parser: argparse.ArgumentParser)#

Extend an ArgumentParser with the configurable parameters.

classmethod configure(args: argparse.Namespace)#

Configure the class from parsed command line arguments.
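
Together, cli() and configure() connect class configuration to the command line. A sketch of how the two hooks are typically overridden together (the --my-a flag and MyClass are illustrative, not part of the library):

import argparse

import openpifpaf


class MyClass(openpifpaf.Configurable):
    a = 0

    @classmethod
    def cli(cls, parser: argparse.ArgumentParser):
        # expose the class variable as a command line flag
        parser.add_argument('--my-a', type=int, default=cls.a)

    @classmethod
    def configure(cls, args: argparse.Namespace):
        # write the parsed value back to the class variable
        cls.a = args.my_a


parser = argparse.ArgumentParser()
MyClass.cli(parser)
MyClass.configure(parser.parse_args(['--my-a', '2']))
assert MyClass.a == 2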

class openpifpaf.Predictor(checkpoint=None, head_metas=None, *, json_data=False, visualize_image=False, visualize_processed_image=False)#

Convenience class to predict from various inputs with a common configuration.

batch_size = 1#

Batch size for the data loader.

device = device(type='cpu')#

Torch device on which the model runs.

fast_rescaling = True#

Whether to use fast rescaling.

loader_workers = None#

Number of data loader workers.

long_edge = None#

Long edge used to rescale input images.

classmethod cli(parser: argparse.ArgumentParser, *, skip_batch_size=False, skip_loader_workers=False)#

Add command line arguments.

When using this class together with datasets (e.g. in eval), skip the CLI arguments for batch size and loader workers as those will be provided via the datasets module.

classmethod configure(args: argparse.Namespace)#

Configure from command line parser.

dataset(data)#

Predict from a dataset.

enumerated_dataloader(enumerated_dataloader)#

Predict from an enumerated dataloader.

dataloader(dataloader)#

Predict from a dataloader.

image(file_name)#

Predict from an image file name.

images(file_names, **kwargs)#

Predict from image file names.

pil_image(image)#

Predict from a Pillow image.

pil_images(pil_images, **kwargs)#

Predict from Pillow images.

numpy_image(image)#

Predict from a numpy image.

numpy_images(numpy_images, **kwargs)#

Predict from numpy images.

image_file(file_pointer)#

Predict from an opened image file pointer.
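
A minimal usage sketch, assuming a pretrained checkpoint name such as shufflenetv2k16 and the (predictions, ground-truth annotations, image meta) return convention of the single-input prediction methods:

import PIL.Image

import openpifpaf

predictor = openpifpaf.Predictor(checkpoint='shufflenetv2k16')
pil_im = PIL.Image.open('image.jpg')

# single-input methods return (predictions, gt_anns, image_meta)
predictions, gt_anns, image_meta = predictor.pil_image(pil_im)
for ann in predictions:
    print(ann.json_data())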

openpifpaf.datasets#

Datasets and tools to load data in batches.

class openpifpaf.datasets.DataModule#

Base class to extend OpenPifPaf with custom data.

This class gives you all the handles to train OpenPifPaf on a new dataset. Create a new class that inherits from this one to handle a new dataset.

  1. Define the PifPaf heads you would like to train, for example CIF (Composite Intensity Fields) to detect keypoints and CAF (Composite Association Fields) to associate joints.

  2. Add class variables, such as paths to annotations and to training/validation images.

batch_size = 1#

Data loader batch size.

head_metas: Optional[List[openpifpaf.headmeta.Base]] = None#

A list of head metas for this dataset. Set as instance variable (not class variable) in derived classes so that different instances of head metas are created for different instances of the data module. Head metas contain the base stride which might be different for different data module instances. When loading a checkpoint, entries in this list will be matched by name and dataset to entries in the checkpoint and overwritten here.

classmethod cli(parser: argparse.ArgumentParser)#

Command line interface (CLI) to extend argument parser for your custom dataset.

Make sure to use unique CLI arguments for your dataset. For clarity, we suggest starting every CLI argument with the name of your new dataset, e.g. --<dataset_name>-train-annotations.

All PifPaf commands will still work; e.g. there is no need to implement --checkpoint to load a model.

classmethod configure(args: argparse.Namespace)#

Take the parsed argument parser output and configure class variables.

metrics() List[openpifpaf.metric.base.Base]#

Define a list of metrics to be used for eval.

train_loader() torch.utils.data.dataloader.DataLoader#

Loader of the training dataset.

A COCO data loader is already available, or a custom one can be created and called here. To modify the preprocessing steps applied to your images (for example, scaling images during training):

  1. chain them using torchvision.transforms.Compose(transforms)

  2. pass them to the preprocessing argument of the dataloader

val_loader() torch.utils.data.dataloader.DataLoader#

Loader of the validation dataset.

The augmentation and preprocessing should be the same as for train_loader. The only difference is the set of data. This makes it possible to inspect the train/val curves for overfitting.

As in the train_loader, the annotations should be encoded fields so that the loss function can be computed.

eval_loader() torch.utils.data.dataloader.DataLoader#

Loader of the evaluation dataset.

For local runs, it is common that the validation dataset is also the evaluation dataset. For submissions to a competition server that holds the private ground truth, this is changed to a test dataset (without ground truth annotations) to produce predictions.

This loader shouldn’t have any data augmentation. The images should be as close as possible to the real application. The annotations should be ground truth annotations in the same format that the decoder is expected to output.
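
A skeleton of a custom data module following these conventions (the MyDataModule name, the --mydataset-* flag and the keypoint setup are all illustrative):

import argparse

import openpifpaf


class MyDataModule(openpifpaf.datasets.DataModule):
    train_annotations = 'data/train.json'

    def __init__(self):
        super().__init__()
        # instance variable (not class variable), as required for head_metas
        cif = openpifpaf.headmeta.Cif('cif', 'mydataset',
                                      keypoints=['head', 'tail'],
                                      sigmas=[0.05, 0.05])
        self.head_metas = [cif]

    @classmethod
    def cli(cls, parser: argparse.ArgumentParser):
        group = parser.add_argument_group('data module MyDataModule')
        group.add_argument('--mydataset-train-annotations',
                           default=cls.train_annotations)

    @classmethod
    def configure(cls, args: argparse.Namespace):
        cls.train_annotations = args.mydataset_train_annotations

    def train_loader(self):
        # return a torch.utils.data.DataLoader of encoded fields
        raise NotImplementedError

    def val_loader(self):
        raise NotImplementedError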

openpifpaf.decoder#

Collections of decoders: fields to annotations.

class openpifpaf.decoder.Decoder#

Generate predictions from image or field inputs.

When creating a new generator, the main implementation goes into __call__().

classmethod cli(parser: argparse.ArgumentParser)#

Command line interface (CLI) to extend argument parser.

classmethod configure(args: argparse.Namespace)#

Take the parsed argument parser output and configure class variables.

classmethod factory(head_metas) List[openpifpaf.decoder.decoder.Decoder]#

Create instances of an implementation.

__call__(fields, *, initial_annotations=None) List[openpifpaf.annotation.Base]#

For a single image, convert fields to annotations.

classmethod fields_batch(model, image_batch, *, device=None)#

From image batch to field batch.

batch(model, image_batch, *, device=None, gt_anns_batch=None)#

From image batch straight to annotations batch.
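
A sketch of the inference path through these hooks, assuming model is an already loaded network with matching head_metas, image_batch is a preprocessed image tensor batch, CifCaf is the pose decoder implementation, and the field batch indexes per image:

import openpifpaf

# create decoder instances that match the model's heads
decoder = openpifpaf.decoder.CifCaf.factory(model.head_metas)[0]

# image batch -> field batch -> per-image annotations
fields_batch = openpifpaf.decoder.Decoder.fields_batch(model, image_batch)
annotations = decoder(fields_batch[0])

# or in one step for the whole batch:
annotations_batch = decoder.batch(model, image_batch)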

openpifpaf.encoder#

Convert a set of keypoint coordinates into target fields.

Takes an annotation from a dataset and turns it into the ground truth for a field.

openpifpaf.headmeta#

Head meta objects contain meta information about head networks.

This includes the name, the name of the individual fields, the composition, etc.

class openpifpaf.headmeta.Base(name: str, dataset: str)#

class openpifpaf.headmeta.Cif(name: str, dataset: str, keypoints: List[str], sigmas: List[float], pose: Optional[Any] = None, draw_skeleton: Optional[List[Tuple[int, int]]] = None, score_weights: Optional[List[float]] = None, decoder_seed_mask: Optional[List[int]] = None, training_weights: Optional[List[float]] = None)#

Head meta data for a Composite Intensity Field (CIF).

class openpifpaf.headmeta.Caf(name: str, dataset: str, keypoints: List[str], sigmas: List[float], skeleton: List[Tuple[int, int]], pose: Optional[Any] = None, sparse_skeleton: Optional[List[Tuple[int, int]]] = None, dense_to_sparse_radius: float = 2.0, only_in_field_of_view: bool = False, decoder_confidence_scales: Optional[List[float]] = None, training_weights: Optional[List[float]] = None)#

Head meta data for a Composite Association Field (CAF).

class openpifpaf.headmeta.CifDet(name: str, dataset: str, categories: List[str], training_weights: Optional[List[float]] = None)#

Head meta data for a Composite Intensity Field (CIF) for Detection.

class openpifpaf.headmeta.TSingleImageCif(name: str, dataset: str, keypoints: List[str], sigmas: List[float], pose: Optional[Any] = None, draw_skeleton: Optional[List[Tuple[int, int]]] = None, score_weights: Optional[List[float]] = None, decoder_seed_mask: Optional[List[int]] = None, training_weights: Optional[List[float]] = None)#

Single-Image CIF head in tracking models.

class openpifpaf.headmeta.TSingleImageCaf(name: str, dataset: str, keypoints: List[str], sigmas: List[float], skeleton: List[Tuple[int, int]], pose: Optional[Any] = None, sparse_skeleton: Optional[List[Tuple[int, int]]] = None, dense_to_sparse_radius: float = 2.0, only_in_field_of_view: bool = False, decoder_confidence_scales: Optional[List[float]] = None, training_weights: Optional[List[float]] = None)#

Single-Image CAF head in tracking models.

class openpifpaf.headmeta.Tcaf(name: str, dataset: str, keypoints_single_frame: List[str], sigmas_single_frame: List[float], pose_single_frame: Any, draw_skeleton_single_frame: Optional[List[Tuple[int, int]]] = None, keypoints: Optional[List[str]] = None, sigmas: Optional[List[float]] = None, pose: Optional[Any] = None, draw_skeleton: Optional[List[Tuple[int, int]]] = None, only_in_field_of_view: bool = False, training_weights: Optional[List[float]] = None)#

Tracking Composite Association Field.
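
A sketch of constructing CIF and CAF head metas for a small custom dataset with two keypoints (all names and values are illustrative):

import openpifpaf

keypoints = ['head', 'tail']
sigmas = [0.05, 0.05]
skeleton = [(1, 2)]  # 1-based keypoint indices, COCO-style

cif = openpifpaf.headmeta.Cif('cif', 'mydataset',
                              keypoints=keypoints, sigmas=sigmas)
caf = openpifpaf.headmeta.Caf('caf', 'mydataset',
                              keypoints=keypoints, sigmas=sigmas,
                              skeleton=skeleton)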

openpifpaf.metric#

class openpifpaf.metric.Base#

accumulate(predictions, image_meta, *, ground_truth=None)#

For every image, accumulate that image’s predictions into this metric.

Parameters
  • predictions – List of predictions for one image.

  • image_meta – Meta dictionary for this image as returned by the data loader.

  • ground_truth – Ground truth information as produced by the eval loader. Optional because some metrics (e.g. pycocotools) read the ground truth separately.

stats()#

Return a dictionary of summary statistics.

The dictionary should be of the following form and can contain an arbitrary number of entries with corresponding labels:

{
    'stats': [0.1234, 0.5134],
    'text_labels': ['AP', 'AP0.50'],
}

write_predictions(filename, *, additional_data=None)#

Write predictions to a file.

This produces a metric-compatible output of the predictions and is used for test challenge submissions where a remote server holds the private test set.

Parameters
  • filename – Output filename of prediction file.

  • additional_data – Additional information that might be worth saving along with the predictions.
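
A toy metric implementing this interface (tracking the mean number of predicted annotations per image; purely illustrative):

import openpifpaf


class MeanDetections(openpifpaf.metric.Base):
    def __init__(self):
        self.counts = []

    def accumulate(self, predictions, image_meta, *, ground_truth=None):
        self.counts.append(len(predictions))

    def stats(self):
        mean = sum(self.counts) / max(len(self.counts), 1)
        return {'stats': [mean], 'text_labels': ['mean detections']}

    def write_predictions(self, filename, *, additional_data=None):
        pass  # this toy metric does not write prediction files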

openpifpaf.network#

Backbone networks, head networks and tools for training.

class openpifpaf.network.BaseNetwork(name: str, *, stride: int, out_features: int)#

Common base network.

Parameters
  • name – a short name for the base network, e.g. resnet50

  • stride – total stride from input to output

  • out_features – number of output features

classmethod cli(parser: argparse.ArgumentParser)#

Command line interface (CLI) to extend argument parser.

classmethod configure(args: argparse.Namespace)#

Take the parsed argument parser output and configure class variables.

class openpifpaf.network.HeadNetwork(meta: openpifpaf.headmeta.Base, in_features: int)#

Base class for head networks.

Parameters
  • meta – head meta instance to configure this head network

  • in_features – number of input features which should be equal to the base network’s output features

classmethod cli(parser: argparse.ArgumentParser)#

Command line interface (CLI) to extend argument parser.

classmethod configure(args: argparse.Namespace)#

Take the parsed argument parser output and configure class variables.
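
A sketch of a custom backbone under this interface (the single conv layer is illustrative; a real backbone would be a deeper network):

import torch

import openpifpaf


class MyBaseNetwork(openpifpaf.network.BaseNetwork):
    def __init__(self):
        # total stride 16, 1280 output feature maps
        super().__init__('mybackbone', stride=16, out_features=1280)
        self.backbone = torch.nn.Conv2d(3, 1280, kernel_size=16, stride=16)

    def forward(self, x):
        return self.backbone(x)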

openpifpaf.show#

Drawing primitives.

class openpifpaf.show.AnimationFrame(*, fig_width=8.0, fig_init_args=None, video_output=None, second_visual=False)#

Animations.

video_fps = 10#

Frames per second.

video_dpi = 100#

Video DPI.

show = False#

Whether to call matplotlib's show().

class openpifpaf.show.Canvas#

Canvas for plotting.

All methods expose Axes objects. To get a Figure object, call ax.get_figure() on the axis.
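
For instance, a sketch assuming a Pillow image pil_im and that Canvas.image is available as a context-manager classmethod (a pattern used in the library's prediction examples):

import openpifpaf

with openpifpaf.show.Canvas.image(pil_im) as ax:
    ax.plot([10, 100], [10, 100])
    fig = ax.get_figure()  # Figure object from the Axes, as described above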

class openpifpaf.show.KeypointPainter(*, xy_scale=1.0, highlight=None, highlight_invisible=False, **kwargs)#

Paint poses.

The constructor can take any class attribute as a parameter and overwrite the global default for that instance.

Example to create a KeypointPainter with thick lines:

>>> kp = KeypointPainter(line_width=48)

openpifpaf.transforms#

Transform input data.

class openpifpaf.transforms.Preprocess#

Preprocess an image with annotations and meta information.

class openpifpaf.transforms.Assert(function, message=None)#

Inspect (and assert on) the current image, anns, and meta.

class openpifpaf.transforms.Compose(preprocess_list: List[Optional[openpifpaf.transforms.preprocess.Preprocess]])#

Execute given transforms in sequential order.

class openpifpaf.transforms.Crop(long_edge, use_area_of_interest=True)#

Random cropping.

static area_of_interest(anns, valid_area)#

Area that contains annotations with keypoints.

class openpifpaf.transforms.Encoders(encoders)#

Preprocess operation that runs encoders.

class openpifpaf.transforms.HFlip(keypoints, hflip)#

Horizontally flip image and annotations.

class openpifpaf.transforms.Blur(max_sigma=5.0)#

Blur image.

class openpifpaf.transforms.ImageTransform(image_transform)#

Transform image without modifying annotations or meta.

class openpifpaf.transforms.JpegCompression(quality=50)#

Add JPEG compression.

class openpifpaf.transforms.MinSize(min_side=1.0)#

Convert annotations below a given size to crowd annotations.

class openpifpaf.transforms.CenterPad(target_size: Union[int, Tuple[int, int]])#

Center-pad to a given target size.

class openpifpaf.transforms.DeterministicEqualChoice(transforms: List[openpifpaf.transforms.preprocess.Preprocess], salt: int = 0)#

Deterministically choose one of the transforms.

Parameters
  • transforms – a list of transforms

  • salt – integer that, combined with meta['image_id'], determines the choice of transform

class openpifpaf.transforms.RandomApply(transform: openpifpaf.transforms.preprocess.Preprocess, probability: float)#

Randomly apply another transformation.

Parameters
  • transform – another transformation

  • probability – probability to apply the given transform

class openpifpaf.transforms.RandomChoice(transforms: List[Optional[openpifpaf.transforms.preprocess.Preprocess]], probabilities: List[float])#

Choose a single random transform.

class openpifpaf.transforms.RotateBy90(angle_perturbation=0.0, fixed_angle=None, prepad=False)#

Randomly rotate by multiples of 90 degrees.

class openpifpaf.transforms.RotateUniform(max_angle=30.0, prepad=True)#

Rotate by a random angle uniformly drawn from a given angle range.

class openpifpaf.transforms.RescaleAbsolute(long_edge, *, fast=False, resample=Resampling.BILINEAR)#

Rescale to a given size.

class openpifpaf.transforms.RescaleRelative(scale_range=(0.5, 1.0), *, resample=Resampling.BILINEAR, absolute_reference=None, fast=False, power_law=False, stretch_range=None)#

Rescale relative to input image.

class openpifpaf.transforms.ToAnnotations(converters)#

Convert inputs to annotation objects.

class openpifpaf.transforms.ToCrowdAnnotations(categories)#

Input to crowd annotations.

class openpifpaf.transforms.ToDetAnnotations(categories)#

Input to detection annotations.

class openpifpaf.transforms.ToKpAnnotations(categories, keypoints_by_category, skeleton_by_category)#

Input to keypoint annotations.

class openpifpaf.transforms.UnclippedArea(*, threshold=0.5)#

Only keep annotations that retain at least a given fraction of their original area.

class openpifpaf.transforms.UnclippedSides(*, margin=10, clipped_sides_okay=2)#

Only keep annotations with a given number of unclipped sides.
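
A sketch of a composed training-time preprocessing pipeline built from the classes above (sizes and probabilities are illustrative, and the (image, anns, meta) calling convention is an assumption):

import openpifpaf

preprocess = openpifpaf.transforms.Compose([
    openpifpaf.transforms.RescaleRelative(scale_range=(0.5, 1.0)),
    openpifpaf.transforms.RandomApply(openpifpaf.transforms.Blur(), 0.1),
    openpifpaf.transforms.Crop(385),
    openpifpaf.transforms.CenterPad(385),
])

# assumed convention: preprocess operations map (image, anns, meta) triples
image, anns, meta = preprocess(image, anns, meta)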

openpifpaf.visualizer#

Higher-level drawing functions.

class openpifpaf.visualizer.Caf(meta: openpifpaf.headmeta.Caf)#

Visualize CAF field.

class openpifpaf.visualizer.Cif(meta: openpifpaf.headmeta.Cif)#

Visualize a CIF field.

class openpifpaf.visualizer.CifDet(meta: openpifpaf.headmeta.CifDet)#

Visualize a CifDet field.

class openpifpaf.visualizer.CifHr(*, stride=1, field_names=None)#

Visualize the CifHr map.

class openpifpaf.visualizer.Occupancy(*, field_names=None)#

Visualize occupancy map.

class openpifpaf.visualizer.Seeds(*, stride=1)#

Visualize seeds.

predicted(seeds)#

Seeds are: confidence, field_index, x, y, …