NOTICE: This software (or technical data) was produced for the U.S. Government under contract, and is subject to the Rights in Data-General Clause 52.227-14, Alt. IV (DEC 2007). Copyright 2023 The MITRE Corporation. All Rights Reserved.
API Overview
In OpenMPF, a component is a plugin that receives jobs (containing media), processes that media, and returns results.
The OpenMPF Batch Component API currently supports the development of detection components, which are used to detect objects in image, video, audio, or other (generic) files that reside on disk.
Using this API, detection components can be built to provide:
- Detection (Localizing an object)
- Tracking (Localizing an object across multiple frames)
- Classification (Detecting the type of object and optionally localizing that object)
- Transcription (Detecting speech and transcribing it into text)
How Components Integrate into OpenMPF
Components are integrated into OpenMPF through the use of OpenMPF's Component Executable. Developers create component libraries that encapsulate the component detection logic. Each instance of the Component Executable loads one of these libraries and uses it to service job requests sent by the OpenMPF Workflow Manager (WFM).
The Component Executable:
- Receives and parses job requests from the WFM
- Invokes functions on the component library to obtain detection results
- Populates and sends the respective responses to the WFM
The basic pseudocode for the Component Executable is as follows:
component->SetRunDirectory(...)
component->Init()
while (true) {
    job = ReceiveJob()
    if (component->Supports(job.data_type))
        component->GetDetections(...) // Component logic does the work here
    SendJobResponse()
}
component->Close()
Each instance of a Component Executable runs as a separate process.
A component developer implements a detection component by extending MPFDetectionComponent.
As an alternative to extending MPFDetectionComponent directly, a developer may extend one of several convenience adapter classes provided by OpenMPF. See Convenience Adapters for more information.
Getting Started
The quickest way to get started with the C++ Batch Component API is to first read the OpenMPF Component API Overview and then review the source for example OpenMPF C++ detection components.
Detection components are implemented by:
- Extending MPFDetectionComponent.
- Building the component into a shared object library. (See HelloWorldComponent CMakeLists.txt.)
- Creating a component Docker image. (See the README.)
API Specification
The figure below presents a high-level component diagram of the C++ Batch Component API:
The Node Manager is only used in a non-Docker deployment. In a Docker deployment the Component Executor is started by the Docker container itself.
The API consists of Component Interfaces, which provide interfaces and abstract classes for developing components; Job Definitions, which define the work to be performed by a component; Job Results, which define the results generated by the component; Component Adapters, which provide default implementations of several of the MPFDetectionComponent interface functions; and Component Utilities, which perform actions such as image rotation and cropping.
Component Interface
MPFComponent - Abstract base class for components.
Detection Component Interface
MPFDetectionComponent extends MPFComponent - Abstract class that should be extended by all OpenMPF C++ detection components that perform batch processing.
Job Definitions
The following data structures contain details about a specific job (work unit):
- MPFImageJob extends MPFJob
- MPFVideoJob extends MPFJob
- MPFAudioJob extends MPFJob
- MPFGenericJob extends MPFJob
Job Results
The following data structures define detection results:
- MPFImageLocation
- MPFVideoTrack
- MPFAudioTrack
- MPFGenericTrack
Components must also include two Component Factory Functions.
Component Interface
The MPFComponent class is the abstract base class utilized by all OpenMPF C++ components that perform batch processing.
IMPORTANT: This interface should not be implemented directly, because no mechanism exists for launching components based on it. Currently, the only supported component type is detection, and all batch detection components should instead extend MPFDetectionComponent.
SetRunDirectory(string)
Sets the value of the private run_directory data member, which contains the full path of the parent folder above where the component is installed.
- Function Definition:
void SetRunDirectory(const string &run_dir)
- Parameters:
Parameter | Data Type | Description |
---|---|---|
run_dir | const string & | Full path of the parent folder above where the component is installed. |
- Returns: none
IMPORTANT: SetRunDirectory is called by the Component Executable to set the correct path. This function should not be called within your implementation.
GetRunDirectory()
Returns the value of the private run_directory data member, which contains the full path of the parent folder above where the component is installed. This parent folder is also known as the plugin folder.
- Function Definition:
string GetRunDirectory()
- Parameters: none
- Returns: (string) Full path of the parent folder above where the component is installed.
- Example:
string run_dir = GetRunDirectory();
string plugin_path = run_dir + "/SampleComponent";
string config_path = plugin_path + "/config";
Init()
The component should perform all initialization operations in the Init member function.
This will be executed once by the Component Executable, on component startup, before the first job, after SetRunDirectory.
- Function Definition:
bool Init()
- Parameters: none
- Returns: (bool) Return true if initialization is successful; otherwise, return false.
- Example:
bool SampleComponent::Init() {
// Get component paths
string run_dir = GetRunDirectory();
string plugin_path = run_dir + "/SampleComponent";
string config_path = plugin_path + "/config";
// Setup logger, load data models, etc.
return true;
}
Close()
The component should perform all shutdown operations in the Close member function.
This will be executed once by the Component Executable, on component shutdown, usually after the last job.
This function is called before the component instance is deleted (see Component Factory Functions).
- Function Definition:
bool Close()
- Parameters: none
- Returns: (bool) Return true if successful; otherwise, return false.
- Example:
bool SampleComponent::Close() {
// Free memory, etc.
return true;
}
GetComponentType()
The GetComponentType() member function allows the C++ Batch Component API to determine the component "type." Currently, MPF_DETECTION_COMPONENT is the only supported component type. APIs for other component types may be developed in the future.
- Function Definition:
MPFComponentType GetComponentType()
- Parameters: none
- Returns: (MPFComponentType) Currently, MPF_DETECTION_COMPONENT is the only supported return value.
- Example:
MPFComponentType SampleComponent::GetComponentType() {
return MPF_DETECTION_COMPONENT;
}
Component Factory Functions
Every detection component must include the following macros in its implementation:
MPF_COMPONENT_CREATOR(TYPENAME);
MPF_COMPONENT_DELETER();
The creator macro takes the TYPENAME of the detection component (for example, "HelloWorld"). This macro creates the factory function that the OpenMPF Component Executable will call in order to instantiate the detection component. The creation function is called once, to obtain an instance of the component, after the component library has been loaded into memory.
The deleter macro creates the factory function that the Component Executable will use to delete that instance of the detection component.
These macros must be used outside of a class declaration, preferably at the bottom or top of a component source (.cpp) file.
Example:
// Note: Do not put the TypeName/Class Name in quotes
MPF_COMPONENT_CREATOR(HelloWorld);
MPF_COMPONENT_DELETER();
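The macros hide a pair of C-linkage factory functions that the Component Executable can resolve after loading the shared library. As a rough, simplified sketch (the FakeComponent base class and the exact expansion shown here are illustrative assumptions, not the SDK's actual definitions), the pattern looks like this:

```cpp
#include <string>

// Simplified stand-in for the MPFComponent base class; the real class lives
// in the OpenMPF C++ component SDK. Shown only to illustrate the factory
// pattern that MPF_COMPONENT_CREATOR / MPF_COMPONENT_DELETER generate.
struct FakeComponent {
    virtual ~FakeComponent() = default;
    virtual std::string Name() const = 0;
};

struct HelloWorld : FakeComponent {
    std::string Name() const override { return "HelloWorld"; }
};

// The macros roughly expand to C-linkage factory functions like these, which
// the Component Executable can look up by name (e.g. via dlsym) after
// dlopen()-ing the component library.
extern "C" FakeComponent* component_creator() { return new HelloWorld; }
extern "C" void component_deleter(FakeComponent* component) { delete component; }
```

C linkage matters here because it gives the functions unmangled names, so the executable can locate them in any component library without knowing the component's class name.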
Detection Component Interface
The MPFDetectionComponent class is the abstract class utilized by all OpenMPF C++ detection components that perform batch processing. This class provides functions for developers to integrate detection logic into OpenMPF.
IMPORTANT: Each batch detection component must implement all of the GetDetections() functions or extend from a superclass which provides implementations for them (see Convenience Adapters). If your component does not support a particular data type, it should simply:
throw MPFDetectionException(MPF_UNSUPPORTED_DATA_TYPE);
Convenience Adapters
As an alternative to extending MPFDetectionComponent
directly, developers may extend one of several convenience adapter classes provided by OpenMPF.
These adapters provide default implementations of several functions in MPFDetectionComponent
and ensure that the component's logic properly extends from the Component API. This enables developers to concentrate on implementation of the detection algorithm.
The following adapters are provided:
- Image Detection (source)
- Video Detection (source)
- Image and Video Detection (source)
- Audio Detection (source)
- Audio and Video Detection (source)
- Generic Detection (source)
Example: Creating Adapters to Perform Naive Tracking: A simple detector that operates on videos may simply go through the video frame by frame, extract each frame's data, and perform detections on that data as though it were processing a new, unrelated image each time. As each frame is processed, one or more MPFImageLocations are generated.
Generally, it is preferred that a detection component that supports VIDEO data is able to perform tracking across video frames to appropriately correlate MPFImageLocation detections across frames.
An adapter could be developed to perform simple tracking. This would correlate MPFImageLocation detections across frames by naively looking for bounding box regions in each contiguous frame that overlap by a given threshold, such as 50%.
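A minimal sketch of that overlap-based idea, using simplified stand-in types rather than the SDK's real MPFImageLocation and MPFVideoTrack structures (the type and function names here are illustrative):

```cpp
#include <algorithm>
#include <map>
#include <vector>

// Simplified stand-ins for MPFImageLocation / MPFVideoTrack, just enough to
// sketch naive overlap-based tracking.
struct Box { int x, y, w, h; };
struct Track { int start_frame, stop_frame; std::map<int, Box> frame_locations; };

// Fraction of the smaller box covered by the intersection of the two boxes.
double Overlap(const Box& a, const Box& b) {
    int ix = std::max(0, std::min(a.x + a.w, b.x + b.w) - std::max(a.x, b.x));
    int iy = std::max(0, std::min(a.y + a.h, b.y + b.h) - std::max(a.y, b.y));
    double inter = static_cast<double>(ix) * iy;
    double smaller = std::min(a.w * a.h, b.w * b.h);
    return smaller > 0 ? inter / smaller : 0.0;
}

// Greedily append each frame's detections to a track whose detection in the
// immediately preceding frame overlaps by at least `threshold`; otherwise
// start a new track.
std::vector<Track> NaiveTrack(const std::map<int, std::vector<Box>>& detections,
                              double threshold = 0.5) {
    std::vector<Track> tracks;
    for (const auto& [frame, boxes] : detections) {
        for (const Box& box : boxes) {
            Track* match = nullptr;
            for (Track& t : tracks) {
                auto prev = t.frame_locations.find(frame - 1);
                if (prev != t.frame_locations.end() &&
                        Overlap(prev->second, box) >= threshold) {
                    match = &t;
                    break;
                }
            }
            if (match) {
                match->stop_frame = frame;
                match->frame_locations[frame] = box;
            } else {
                tracks.push_back({frame, frame, {{frame, box}}});
            }
        }
    }
    return tracks;
}
```

A production adapter would be more careful (e.g. matching each detection to at most one track per frame, and tolerating gaps of skipped frames), but the core correlation step is this overlap test between contiguous frames.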
Supports(MPFDetectionDataType)
Returns true or false depending on whether the data type is supported.
- Function Definition:
bool Supports(MPFDetectionDataType data_type)
- Parameters:
Parameter | Data Type | Description |
---|---|---|
data_type | MPFDetectionDataType | The data type of the job's media. Return true if the component supports IMAGE, VIDEO, AUDIO, and/or UNKNOWN (generic) processing. |
- Returns: (bool) True if the component supports the data type; otherwise, false.
- Example:
// Sample component that supports only image and video files
bool SampleComponent::Supports(MPFDetectionDataType data_type) {
return data_type == MPFDetectionDataType::IMAGE || data_type == MPFDetectionDataType::VIDEO;
}
GetDetections(MPFImageJob …)
Used to detect objects in an image file. The MPFImageJob structure contains the data_uri specifying the location of the image file.
Currently, the data_uri is always a local file path. For example, "/opt/mpf/share/remote-media/test-file.jpg". This is because all media is copied to the OpenMPF server before the job is executed.
- Function Definition:
std::vector<MPFImageLocation> GetDetections(const MPFImageJob &job);
- Parameters:
Parameter | Data Type | Description |
---|---|---|
job | const MPFImageJob& |
Structure containing details about the work to be performed. See MPFImageJob |
- Returns: (std::vector<MPFImageLocation>) The MPFImageLocation data for each detected object.
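The shape of an image GetDetections() implementation can be sketched with simplified stand-in types (the ImageJob and ImageLocation structs below are illustrative stand-ins for the SDK's MPFImageJob and MPFImageLocation, and the "detector" is a placeholder):

```cpp
#include <map>
#include <string>
#include <vector>

// Simplified stand-ins for the SDK types; the real definitions are in the
// OpenMPF C++ component SDK.
using Properties = std::map<std::string, std::string>;
struct ImageJob { std::string job_name, data_uri; Properties job_properties, media_properties; };
struct ImageLocation {
    int x_left_upper, y_left_upper, width, height;
    float confidence;
    Properties detection_properties;
};

// Sketch: read the image at job.data_uri, run the detector, and return one
// ImageLocation per detected object. A real component would decode the file
// (e.g. with cv::imread) and run its algorithm; this stand-in emits a single
// hard-coded detection.
std::vector<ImageLocation> GetDetections(const ImageJob& job) {
    std::vector<ImageLocation> locations;
    locations.push_back({0, 0, 100, 50, 0.9f, {{"CLASSIFICATION", "backpack"}}});
    return locations;
}
```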
GetDetections(MPFVideoJob …)
Used to detect objects in a video file. Prior to being sent to the component, videos are split into logical "segments" of video data and each segment (containing a range of frames) is assigned to a different job. Components are not guaranteed to receive requests in any order. For example, the first request processed by a component might be for frames 300-399 of Video A, while the next request may cover frames 900-999 of Video B.
- Function Definition:
std::vector<MPFVideoTrack> GetDetections(const MPFVideoJob &job);
- Parameters:
Parameter | Data Type | Description |
---|---|---|
job | const MPFVideoJob& |
Structure containing details about the work to be performed. See MPFVideoJob |
- Returns: (std::vector<MPFVideoTrack>) The MPFVideoTrack data for each detected object.
GetDetections(MPFAudioJob …)
Used to detect objects in an audio file. Currently, audio files are not logically segmented, so a job will contain the entirety of the audio file.
- Function Definition:
std::vector<MPFAudioTrack> GetDetections(const MPFAudioJob &job);
- Parameters:
Parameter | Data Type | Description |
---|---|---|
job | const MPFAudioJob & |
Structure containing details about the work to be performed. See MPFAudioJob |
- Returns: (std::vector<MPFAudioTrack>) The MPFAudioTrack data for each detected object.
GetDetections(MPFGenericJob …)
Used to detect objects in files that aren't video, image, or audio files. Such files are of the UNKNOWN type and handled generically. These files are not logically segmented, so a job will contain the entirety of the file.
- Function Definition:
std::vector<MPFGenericTrack> GetDetections(const MPFGenericJob &job);
- Parameters:
Parameter | Data Type | Description |
---|---|---|
job | const MPFGenericJob & |
Structure containing details about the work to be performed. See MPFGenericJob |
- Returns: (std::vector<MPFGenericTrack>) The MPFGenericTrack data for each detected object.
Detection Job Data Structures
The following data structures contain details about a specific job (work unit):
- MPFImageJob extends MPFJob
- MPFVideoJob extends MPFJob
- MPFAudioJob extends MPFJob
- MPFGenericJob extends MPFJob
The following data structures define detection results:
- MPFImageLocation
- MPFVideoTrack
- MPFAudioTrack
- MPFGenericTrack
MPFJob
Structure containing information about a job to be performed on a piece of media.
- Constructor(s):
MPFJob(
const string &job_name,
const string &data_uri,
const Properties &job_properties,
const Properties &media_properties)
- Members:
Member | Data Type | Description |
---|---|---|
job_name | const string & | A specific name given to the job by the OpenMPF framework. This value may be used, for example, for logging and debugging purposes. |
data_uri | const string & | The URI of the input media file to be processed. Currently, this is a file path. For example, "/opt/mpf/share/remote-media/test-file.avi". |
job_properties | const Properties & | Contains a map of <string, string> which represents the property name and the property value. The key corresponds to the property name specified in the component descriptor file described in the Component Descriptor Reference. Values are determined when creating a pipeline or when submitting a job. Note: The job_properties map may not contain the full set of job properties. For properties not contained in the map, the component must use a default value. |
media_properties | const Properties & | Contains a map of <string, string> of metadata about the media associated with the job. The entries in the map vary depending on the type of media. Refer to the type-specific job structures below. |
Job properties can also be set through environment variables prefixed with MPF_PROP_. This allows users to set job properties in their docker-compose files. These will take precedence over all other property types (job, algorithm, media, etc.). It is not possible to change the value of properties set via environment variables at runtime, so they should only be used to specify properties that will not change throughout the entire lifetime of the service (e.g. Docker container).
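The precedence rule can be sketched as follows. This is an illustration only: the helper name is hypothetical, and for simplicity it only overrides keys already present in the map. In a real deployment the Component Executor applies MPF_PROP_ overrides before the job ever reaches the component.

```cpp
#include <cstdlib>
#include <map>
#include <string>

// Illustrative sketch: any environment variable named MPF_PROP_<NAME>
// overrides the job property <NAME>.
std::map<std::string, std::string> ApplyEnvOverrides(
        std::map<std::string, std::string> job_properties) {
    static const std::string prefix = "MPF_PROP_";
    for (auto& [name, value] : job_properties) {
        if (const char* env = std::getenv((prefix + name).c_str())) {
            value = env;  // environment variables take precedence
        }
    }
    return job_properties;
}
```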
MPFImageJob
Extends MPFJob
Structure containing data used for detection of objects in an image file.
- Constructor(s):
MPFImageJob(
const string &job_name,
const string &data_uri,
const Properties &job_properties,
const Properties &media_properties)
MPFImageJob(
const string &job_name,
const string &data_uri,
const MPFImageLocation &location,
const Properties &job_properties,
const Properties &media_properties)
- Members:
Member | Data Type | Description |
---|---|---|
job_name | const string & | See MPFJob.job_name for description. |
data_uri | const string & | See MPFJob.data_uri for description. |
location | const MPFImageLocation & | An MPFImageLocation from the previous pipeline stage. Provided when feed forward is enabled. See Feed Forward Guide. |
job_properties | const Properties & | See MPFJob.job_properties for description. |
media_properties | const Properties & | See MPFJob.media_properties for description. Includes media-type-specific key-value pairs. |
MPFVideoJob
Extends MPFJob
Structure containing data used for detection of objects in a video file.
- Constructor(s):
MPFVideoJob(
const string &job_name,
const string &data_uri,
int start_frame,
int stop_frame,
const Properties &job_properties,
const Properties &media_properties)
MPFVideoJob(
const string &job_name,
const string &data_uri,
int start_frame,
int stop_frame,
const MPFVideoTrack &track,
const Properties &job_properties,
const Properties &media_properties)
- Members:
Member | Data Type | Description |
---|---|---|
job_name | const string & | See MPFJob.job_name for description. |
data_uri | const string & | See MPFJob.data_uri for description. |
start_frame | const int | The first frame number (0-based index) of the video that should be processed to look for detections. |
stop_frame | const int | The last frame number (0-based index) of the video that should be processed to look for detections. |
track | const MPFVideoTrack & | An MPFVideoTrack from the previous pipeline stage. Provided when feed forward is enabled. See Feed Forward Guide. |
job_properties | const Properties & | See MPFJob.job_properties for description. |
media_properties | const Properties & | See MPFJob.media_properties for description. Includes media-type-specific key-value pairs. |
IMPORTANT: FRAME_INTERVAL is a common job property that many components support. For frame intervals greater than 1, the component must look for detections starting with the first frame, and then skip frames as specified by the frame interval, until or before it reaches the stop frame. For example, given a start frame of 0, a stop frame of 99, and a frame interval of 2, the detection component must look for objects in frames numbered 0, 2, 4, 6, ..., 98.
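The FRAME_INTERVAL rule above can be sketched as a small helper (the function name is illustrative, not part of the SDK):

```cpp
#include <vector>

// Starting at start_frame, take every interval-th frame, never passing
// stop_frame. An interval that is missing or less than 1 is treated as 1.
std::vector<int> FramesToProcess(int start_frame, int stop_frame, int interval) {
    std::vector<int> frames;
    if (interval < 1) interval = 1;
    for (int f = start_frame; f <= stop_frame; f += interval) {
        frames.push_back(f);
    }
    return frames;
}
```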
MPFAudioJob
Extends MPFJob
Structure containing data used for detection of objects in an audio file. Currently, audio files are not logically segmented, so a job will contain the entirety of the audio file.
- Constructor(s):
MPFAudioJob(
const string &job_name,
const string &data_uri,
int start_time,
int stop_time,
const Properties &job_properties,
const Properties &media_properties)
MPFAudioJob(
const string &job_name,
const string &data_uri,
int start_time,
int stop_time,
const MPFAudioTrack &track,
const Properties &job_properties,
const Properties &media_properties)
- Members:
Member | Data Type | Description |
---|---|---|
job_name | const string & | See MPFJob.job_name for description. |
data_uri | const string & | See MPFJob.data_uri for description. |
start_time | const int | The time (0-based index, in milliseconds) associated with the beginning of the segment of the audio file that should be processed to look for detections. |
stop_time | const int | The time (0-based index, in milliseconds) associated with the end of the segment of the audio file that should be processed to look for detections. |
track | const MPFAudioTrack & | An MPFAudioTrack from the previous pipeline stage. Provided when feed forward is enabled. See Feed Forward Guide. |
job_properties | const Properties & | See MPFJob.job_properties for description. |
media_properties | const Properties & | See MPFJob.media_properties for description. Includes media-type-specific key-value pairs. |
MPFGenericJob
Extends MPFJob
Structure containing data used for detection of objects in a file that isn't a video, image, or audio file. The file is of the UNKNOWN type and handled generically. The file is not logically segmented, so a job will contain the entirety of the file.
- Constructor(s):
MPFGenericJob(
const string &job_name,
const string &data_uri,
const Properties &job_properties,
const Properties &media_properties)
MPFGenericJob(
const string &job_name,
const string &data_uri,
const MPFGenericTrack &track,
const Properties &job_properties,
const Properties &media_properties)
- Members:
Member | Data Type | Description |
---|---|---|
job_name | const string & | See MPFJob.job_name for description. |
data_uri | const string & | See MPFJob.data_uri for description. |
track | const MPFGenericTrack & | An MPFGenericTrack from the previous pipeline stage. Provided when feed forward is enabled. See Feed Forward Guide. |
job_properties | const Properties & | See MPFJob.job_properties for description. |
media_properties | const Properties & | See MPFJob.media_properties for description. Includes a media-type-specific key-value pair. |
Detection Job Result Classes
MPFImageLocation
Structure used to store the location of detected objects in an image file.
- Constructor(s):
MPFImageLocation()
MPFImageLocation(
int x_left_upper,
int y_left_upper,
int width,
int height,
float confidence = -1,
const Properties &detection_properties = {})
- Members:
Member | Data Type | Description |
---|---|---|
x_left_upper | int | Upper left X coordinate of the detected object. |
y_left_upper | int | Upper left Y coordinate of the detected object. |
width | int | The width of the detected object. |
height | int | The height of the detected object. |
confidence | float | Represents the "quality" of the detection. The range depends on the detection algorithm. 0.0 is lowest quality. Higher values are higher quality. Using a standard range of [0.0 - 1.0] is advised. If the component is unable to supply a confidence value, it should return -1.0. |
detection_properties | Properties & | Optional additional information about the detected object. There is no restriction on the keys or the number of entries that can be added to the detection_properties map. For best practice, keys should be in all CAPS. See the section on ROTATION and HORIZONTAL_FLIP below. |
- Example:
A component that performs generic object classification can add an entry to detection_properties where the key is CLASSIFICATION and the value is the type of object detected.
MPFImageLocation {
x_left_upper = 0, y_left_upper = 0, width = 100, height = 50, confidence = 1.0,
{ {"CLASSIFICATION", "backpack"} }
}
Rotation and Horizontal Flip
When the detection_properties map contains a ROTATION key, it should be a floating point value in the interval [0.0, 360.0) indicating the orientation of the detection in degrees in the counter-clockwise direction. In order to view the detection in the upright orientation, it must be rotated the given number of degrees in the clockwise direction.
The detection_properties map can also contain a HORIZONTAL_FLIP property that will either be "true" or "false".
The detection_properties map may have both HORIZONTAL_FLIP and ROTATION keys.
The Workflow Manager performs the following algorithm to draw the bounding box when generating markup:
- Draw the rectangle ignoring rotation and flip.
- Rotate the rectangle counter-clockwise the given number of degrees around its top left corner.
- If the rectangle is flipped, flip horizontally around the top left corner.
In the image above you can see the three steps required to properly draw a bounding box. Step 1 is drawn in red, step 2 in blue, and step 3, the final result, in green. The detection for the image above is:
MPFImageLocation {
x_left_upper = 210, y_left_upper = 189, width = 177, height = 41, confidence = 1.0,
{ {"ROTATION", "15"}, { "HORIZONTAL_FLIP", "true" } }
}
Note that the x_left_upper, y_left_upper, width, and height values describe the red rectangle. The addition of the ROTATION property results in the blue rectangle, and the addition of the HORIZONTAL_FLIP property results in the green rectangle.
One way to think about the process is "draw the unrotated and unflipped rectangle, stick a pin in the upper left corner, and then rotate and flip around the pin".
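That pin analogy can be written out as coordinate math. A sketch, assuming image coordinates with the y-axis pointing down (the helper name is illustrative, not part of the SDK):

```cpp
#include <cmath>

struct Point { double x, y; };

// Rotate the offset (dx, dy) from the top-left "pin" counter-clockwise by
// `degrees` (as seen on screen, with y growing downward), then optionally
// flip horizontally around the pin. Returns the transformed corner.
Point TransformCorner(Point pin, double dx, double dy, double degrees, bool flip) {
    const double kPi = 3.14159265358979323846;
    double rad = degrees * kPi / 180.0;
    // With a downward y-axis, a visual counter-clockwise rotation is:
    double rx = dx * std::cos(rad) + dy * std::sin(rad);
    double ry = -dx * std::sin(rad) + dy * std::cos(rad);
    if (flip) rx = -rx;  // horizontal flip around the pin
    return {pin.x + rx, pin.y + ry};
}
```

Applying this to each of the four corners of the unrotated, unflipped rectangle reproduces the Workflow Manager's three markup steps.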
Rotation-Only Example
The Workflow Manager generated the above image by performing markup on the original image with the following detection:
MPFImageLocation {
x_left_upper = 156, y_left_upper = 339, width = 194, height = 243, confidence = 1.0,
{ {"ROTATION", "90.0"} }
}
The markup process followed steps 1 and 2 in the previous section, skipping step 3 because there is no HORIZONTAL_FLIP.
In order to properly extract the detection region from the original image, such as when generating an artifact, you would need to rotate the region in the above image 90 degrees clockwise around the cyan dot currently shown in the bottom-left corner so that the face is in the proper upright position.
When the rotation is properly corrected in this way, the cyan dot will appear in the top-left corner of the bounding box. That is why its position is described using the x_left_upper and y_left_upper variables. They refer to the top-left corner of the correctly oriented region.
MPFVideoTrack
Structure used to store the location of detected objects in a video file.
- Constructor(s):
MPFVideoTrack()
MPFVideoTrack(
int start_frame,
int stop_frame,
float confidence = -1,
map<int, MPFImageLocation> frame_locations = {},
const Properties &detection_properties = {})
- Members:
Member | Data Type | Description |
---|---|---|
start_frame | int | The first frame number (0-based index) that contained the detected object. |
stop_frame | int | The last frame number (0-based index) that contained the detected object. |
frame_locations | map<int, MPFImageLocation> | A map of individual detections. The key for each map entry is the frame number where the detection was generated, and the value is an MPFImageLocation calculated as if that frame was a still image. Note that a key-value pair is not required for every frame between the track start frame and track stop frame. |
confidence | float | Represents the "quality" of the detection. The range depends on the detection algorithm. 0.0 is lowest quality. Higher values are higher quality. Using a standard range of [0.0 - 1.0] is advised. If the component is unable to supply a confidence value, it should return -1.0. |
detection_properties | Properties & | Optional additional information about the detected object. There is no restriction on the keys or the number of entries that can be added to the detection_properties map. For best practice, keys should be in all CAPS. |
- Example:
NOTE: Currently, MPFVideoTrack.detection_properties do not show up in the JSON output object, nor are they used by the WFM in any way.
A component that detects text can add an entry to detection_properties where the key is TRANSCRIPT and the value is a string representing the text found in the video segment.
MPFVideoTrack track;
track.start_frame = 0;
track.stop_frame = 5;
track.confidence = 1.0;
// frame_locations is a previously populated map<int, MPFImageLocation> of per-frame detections
track.frame_locations = frame_locations;
track.detection_properties["TRANSCRIPT"] = "RE5ULTS FR0M A TEXT DETECTER";
MPFAudioTrack
Structure used to store the location of detected objects in an audio file.
- Constructor(s):
MPFAudioTrack()
MPFAudioTrack(
int start_time,
int stop_time,
float confidence = -1,
const Properties &detection_properties = {})
- Members:
Member | Data Type | Description |
---|---|---|
start_time | int | The time (0-based index, in ms) when the audio detection event started. |
stop_time | int | The time (0-based index, in ms) when the audio detection event stopped. |
confidence | float | Represents the "quality" of the detection. The range depends on the detection algorithm. 0.0 is lowest quality. Higher values are higher quality. Using a standard range of [0.0 - 1.0] is advised. If the component is unable to supply a confidence value, it should return -1.0. |
detection_properties | Properties & | Optional additional information about the detection. There is no restriction on the keys or the number of entries that can be added to the detection_properties map. For best practice, keys should be in all CAPS. |
NOTE: Currently, MPFAudioTrack.detection_properties do not show up in the JSON output object, nor are they used by the WFM in any way.
MPFGenericTrack
Structure used to store the location of detected objects in a file that is not a video, image, or audio file. The file is of the UNKNOWN type and handled generically.
- Constructor(s):
MPFGenericTrack()
MPFGenericTrack(
float confidence = -1,
const Properties &detection_properties = {})
- Members:
Member | Data Type | Description |
---|---|---|
confidence | float | Represents the "quality" of the detection. The range depends on the detection algorithm. 0.0 is lowest quality. Higher values are higher quality. Using a standard range of [0.0 - 1.0] is advised. If the component is unable to supply a confidence value, it should return -1.0. |
detection_properties | Properties & | Optional additional information about the detection. There is no restriction on the keys or the number of entries that can be added to the detection_properties map. For best practice, keys should be in all CAPS. |
Exception Types
MPFDetectionException
Exception that should be thrown by the GetDetections() methods when an error occurs. The content of the error_code and what() members will appear in the JSON output object.
- Constructors:
MPFDetectionException(MPFDetectionError error_code, const std::string &what = "")
MPFDetectionException(const std::string &what)
- Members:
Member | Data Type | Description |
---|---|---|
error_code | MPFDetectionError | Specifies the error type. See MPFDetectionError. |
what() | const char* | Textual description of the specific error. (Inherited from std::exception.) |
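A sketch of how such an exception is typically thrown from a GetDetections() implementation, using a simplified stand-in class rather than the real MPFDetectionException (the names below are illustrative):

```cpp
#include <stdexcept>
#include <string>

// Simplified stand-in mirroring MPFDetectionException: an exception carrying
// an error code plus a textual description.
enum DetectionError { MPF_DETECTION_SUCCESS, MPF_COULD_NOT_OPEN_MEDIA };

class DetectionException : public std::runtime_error {
public:
    DetectionException(DetectionError error_code, const std::string& what = "")
        : std::runtime_error(what), error_code(error_code) {}
    DetectionError error_code;
};

// Example use inside a GetDetections() implementation:
void OpenMedia(const std::string& path) {
    bool opened = false;  // pretend the open failed for illustration
    if (!opened) {
        throw DetectionException(MPF_COULD_NOT_OPEN_MEDIA,
                                 "Could not open " + path);
    }
}
```

Including the offending file path or property name in the what() message is worthwhile, since that text ends up in the JSON output object.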
Enumeration Types
MPFDetectionError
Enum used to indicate the type of error that occurred in a GetDetections() method. It is used as a parameter to the MPFDetectionException constructor. A component is not required to support all error types.
ENUM | Description |
---|---|
MPF_DETECTION_SUCCESS | The component function completed successfully. |
MPF_OTHER_DETECTION_ERROR_TYPE | The component function has failed for a reason that is not captured by any of the other error codes. |
MPF_DETECTION_NOT_INITIALIZED | The initialization of the component, or the initialization of any of its dependencies, has failed for any reason. |
MPF_UNSUPPORTED_DATA_TYPE | The job passed to a component requests processing of a job of an unsupported type. For instance, a component that is only capable of processing audio files should return this error code if a video or image job request is received. |
MPF_COULD_NOT_OPEN_DATAFILE | The data file to be processed could not be opened for any reason, such as a permissions failure, or an unreachable URI. Use MPF_COULD_NOT_OPEN_MEDIA for media files. |
MPF_COULD_NOT_READ_DATAFILE | There is a failure reading data from a successfully opened input data file. Use MPF_COULD_NOT_READ_MEDIA for media files. |
MPF_FILE_WRITE_ERROR | The component received a failure for any reason when attempting to write to a file. |
MPF_BAD_FRAME_SIZE | The frame data retrieved has an incorrect or invalid frame size. For example, if a call to cv::imread() returns a frame of data with either the number of rows or columns less than or equal to 0. |
MPF_DETECTION_FAILED | General failure of a detection algorithm. This does not indicate a lack of detections found in the media, but rather a breakdown in the algorithm that makes it impossible to continue to try to detect objects. |
MPF_INVALID_PROPERTY | The component received a property that is unrecognized or has an invalid/out-of-bounds value. |
MPF_MISSING_PROPERTY | The component received a job that is missing a required property. |
MPF_MEMORY_ALLOCATION_FAILED | The component failed to allocate memory for any reason. |
MPF_GPU_ERROR | The job was configured to execute on a GPU, but there was an issue with the GPU or no GPU was detected. |
MPF_NETWORK_ERROR | The component failed to communicate with an external system over the network. The system may not be available or there may have been a timeout. |
MPF_COULD_NOT_OPEN_MEDIA | The media file to be processed could not be opened for any reason, such as a permissions failure, or an unreachable URI. |
MPF_COULD_NOT_READ_MEDIA | There is a failure reading data from a successfully opened media file. |
Utility Classes
For convenience, OpenMPF provides the MPFImageReader (source) and MPFVideoCapture (source) utility classes to perform horizontal flipping, rotation, and cropping to a region of interest. Note that when using these classes, the component will also need to use the class to perform a reverse transform to convert the transformed pixel coordinates back to the original (e.g. pre-flipped, pre-rotated, and pre-cropped) coordinate space.
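For the simplest case, cropping, the reverse transform is just an offset. A sketch of the idea (the RoiCrop struct is illustrative, not the SDK's API; MPFImageReader and MPFVideoCapture handle this, plus rotation and flipping, for you):

```cpp
// A detection found at (x, y) in the cropped frame must be shifted back by
// the region-of-interest offset to land in the original image's coordinate
// space.
struct RoiCrop {
    int roi_x, roi_y;  // top-left corner of the crop in the original image

    // Map cropped-frame coordinates back to original-image coordinates.
    void ReverseTransform(int& x, int& y) const {
        x += roi_x;
        y += roi_y;
    }
};
```

Forgetting the reverse transform is a common bug: the detections look correct against the cropped frame but are misplaced when the WFM draws them on the original media.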
C++ Component Build Environment
A C++ component library must be built for the same C++ compiler and Linux version that is used by the OpenMPF Component Executable. This is to ensure compatibility between the executable and the library functions at the Application Binary Interface (ABI) level. At this writing, the OpenMPF runs on Ubuntu 20.04 (kernel version 5.13.0-30), and the OpenMPF C++ Component Executable is built with g++ (GCC) 9.3.0-17.
Components should be supplied as a tar file, which includes not only the component library, but any other libraries or files needed for execution. This includes all other non-standard libraries used by the component (aside from the standard Linux and C++ libraries), and any configuration or data files.
Component Development Best Practices
Single-threaded Operation
Implementations are encouraged to operate in single-threaded mode. OpenMPF will parallelize components through multiple instantiations of the component, each running as a separate service.
Stateless Behavior
OpenMPF components should be stateless in operation and give identical output for a provided input (i.e. when processing the same MPFJob
).
GPU Support
For components that want to take advantage of NVIDIA GPU processors, please read the GPU Support Guide. Also ensure that your build environment has the NVIDIA CUDA Toolkit installed, as described in the Build Environment Setup Guide.
Component Structure for non-Docker Deployments
It is recommended that C++ components are organized according to the following directory structure:
componentName
├── config - Optional component-specific configuration files
├── descriptor
│ └── descriptor.json
└── lib
└── libComponentName.so - Compiled component library
Once built, components should be packaged into a .tar.gz containing the contents of the directory shown above.
Logging
It is recommended to use Apache log4cxx for OpenMPF Component logging. Components using log4cxx should not configure logging themselves. The Component Executor will configure log4cxx globally. Components should call log4cxx::Logger::getLogger("<componentName>") to get a reference to the logger. If you are using a different logging framework, you should make sure its behavior is similar to how the Component Executor configures log4cxx as described below.
The following log LEVELs are supported: FATAL, ERROR, WARN, INFO, DEBUG, TRACE.
The LOG_LEVEL environment variable can be set to one of the log levels to change the logging verbosity. When LOG_LEVEL is absent, INFO is used.
Note that multiple instances of the same component can log to the same file. Also, logging content can span multiple lines.
The logger will write to both standard error and ${MPF_LOG_PATH}/${THIS_MPF_NODE}/log/<componentName>.log.
Each log statement will take the form:
DATE TIME LEVEL CONTENT
For example:
2016-02-09 13:42:42,341 INFO - Starting sample-component: [ OK ]