Implementation Details

Wrapper classes

The tools implemented in this package provide wrapper classes for preprocessing, feature extraction and face recognition algorithms that are implemented in other packages of bob.bio. The basic idea is that the wrapped algorithms are provided with several frames of the video. For this purpose, the bob.bio.video.FrameSelector can be applied to select one or several frames from the source video. For each of the selected frames, the faces are aligned, either using hand-labeled data or after detecting the faces using bob.bio.face.preprocessor.FaceDetect. Afterward, features are extracted, models are enrolled using several frames per video, and the scoring procedure fuses the scores between one model and the several probe frames of a probe video. If one of the base algorithms requires training, the wrapper classes forward the training data accordingly.
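
For illustration, here is a minimal sketch of selecting frames from a video; the constructor parameters max_number_of_frames and selection_style, and the fact that the selector can be called directly on a NumPy array of frames, are assumptions based on the legacy bob.bio.video interface.

import numpy
import bob.bio.video

# select at most 20 frames, spread evenly over the video
# (parameter names are assumptions)
frame_selector = bob.bio.video.FrameSelector(
    max_number_of_frames=20,
    selection_style='spread')

# a dummy gray-level video with 100 frames of 64x80 pixels
video = numpy.random.rand(100, 64, 80)

# calling the selector returns a bob.bio.video.FrameContainer
# holding the selected frames
frames = frame_selector(video)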

Hence, in this package we provide three wrapper classes:

  • bob.bio.video.preprocessor.Wrapper
  • bob.bio.video.extractor.Wrapper
  • bob.bio.video.algorithm.Wrapper

Each of these wrapper classes is created with a base algorithm that performs the actual preprocessing, extraction, projection, enrollment or scoring. The base class can be specified in three different ways. The most prominent way is surely to use one of the registered Resources. A more sophisticated way is to provide an instance of the wrapped class, or even a string that represents a constructor call of the desired object. Finally (though rarely used), you can provide the path of a Configuration File.
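
As a sketch, these options might look as follows for the preprocessor wrapper; the resource names and the FaceDetect constructor argument are assumptions and not checked here.

import bob.bio.video
import bob.bio.face

# 1. the name of a registered resource
preprocessor = bob.bio.video.preprocessor.Wrapper('landmark-detect')

# 2. an instance of the wrapped class
#    (the face_cropper argument is an assumption) ...
preprocessor = bob.bio.video.preprocessor.Wrapper(
    bob.bio.face.preprocessor.FaceDetect(face_cropper='face-crop-eyes'))

#    ... or a string that represents a constructor call of that class
preprocessor = bob.bio.video.preprocessor.Wrapper(
    "bob.bio.face.preprocessor.FaceDetect(face_cropper='face-crop-eyes')")

# 3. the path of a configuration file (not shown here)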

The IO of the preprocessed frames, and of the extracted or projected features, is provided by the bob.bio.video.FrameContainer interface. This frame container reads and writes bob.io.base.HDF5File objects, in which it stores information about the frames. Additionally, it uses the IO functionality of the wrapped classes to actually write the data in the desired format. Hence, all IO functionalities of the wrapped classes need to be able to handle bob.io.base.HDF5File.
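
The following sketch illustrates this IO; the FrameContainer methods add() and save(), as well as constructing a container directly from an open HDF5 file, are assumptions based on the legacy interface.

import numpy
import bob.io.base
import bob.bio.video

# build a container with two dummy frames
container = bob.bio.video.FrameContainer()
container.add('frame_0', numpy.random.rand(64, 80))
container.add('frame_1', numpy.random.rand(64, 80))

# write the frames and their information to an HDF5 file
hdf5 = bob.io.base.HDF5File('frames.hdf5', 'w')
container.save(hdf5)
hdf5.close()

# read the container back from the same file
hdf5 = bob.io.base.HDF5File('frames.hdf5', 'r')
container = bob.bio.video.FrameContainer(hdf5)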

Note

The video extensions also integrate with the specialized scripts provided by bob.bio.gmm.

Registered Resources

This package does not provide registered resources for the wrapper classes. Hence, when you want to run an experiment using the video wrapper classes, you can create the wrappers inline on the command line:

verify.py --database youtube --preprocessor 'bob.bio.video.preprocessor.Wrapper("landmark-detect")' --extractor 'bob.bio.video.extractor.Wrapper("dct-blocks")' --algorithm 'bob.bio.video.algorithm.Wrapper("gmm")' ...

Databases

All video databases defined here rely on the bob.bio.base.database.BioDatabase interface, which in turn uses the verification databases implemented in the corresponding bob.db packages.

After downloading and extracting the original data of the data sets, the scripts need to know where the data is installed. For this purpose, the verify.py script can read a special file in which those directories are stored; see the Installation Instructions. By default, this file is located in your home directory, but you can specify another file on the command line. A sketch of such a file is given after the list below.

The other option is to change the directories directly inside the configuration files. Here is the list of databases registered as resources, together with their replacement strings, in alphabetical order:

  • MOBIO: 'mobio-video'

    • Videos: [YOUR_MOBIO_VIDEO_DIRECTORY]
  • Youtube: 'youtube'

    • Frames: [YOUR_YOUTUBE_DIRECTORY]

      Note

      You can choose either of the frame directories, i.e., the frames_images_DB directory containing the original images, or the aligned_images_DB directory containing pre-cropped faces.
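
As a sketch, such a directory file could look as follows; the default file name ~/.bob_bio_databases.txt, the line format, and the paths are assumptions that you need to adapt to your own setup:

[YOUR_MOBIO_VIDEO_DIRECTORY] = /path/to/your/mobio/videos
[YOUR_YOUTUBE_DIRECTORY] = /path/to/your/youtube/frames_images_DB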

You can use the databases.py script to check which data directories are correctly set up.