Part 2. DeviceData
Overview & Data Scope
- DeviceData: closely related to the concepts of Device and DeviceMark
-
The latter derives largely from analyses of DeviceData. Various computational modules in the Service stack are the main consumers of DeviceData. The end-user may also access all or part of the DeviceData through a DeviceMark, which may contain references to bulkier DeviceData items.
As discussed in the DeviceMode section, once the context of the Example service is set, the service stack configures the Example devices to start streaming the data selected for the DeviceMode role assigned to each. The DeviceManager module then passes the incoming data to the DeviceDataManager module, which processes the raw data into DeviceData objects before turning them over to the DeviceMarkManager module.
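A minimal sketch of this hand-off is shown below; the module names follow the spec, while the method names (on_raw_frame, build_device_data, submit) are illustrative assumptions, not normative interfaces.

```python
# Illustrative sketch of the ingest pipeline: DeviceManager -> DeviceDataManager -> DeviceMarkManager.
# Method names are assumptions for illustration only.

class DeviceDataManager:
    def __init__(self, mark_manager):
        self.mark_manager = mark_manager

    def on_raw_frame(self, device_id, raw_bytes, timestamp):
        # Process the raw payload into a DeviceData object ...
        data = self.build_device_data(device_id, raw_bytes, timestamp)
        # ... then turn it over to the DeviceMarkManager.
        self.mark_manager.submit(data)

    def build_device_data(self, device_id, raw_bytes, timestamp):
        ...  # platform-specific processing of the raw data


class DeviceManager:
    def __init__(self, data_manager):
        self.data_manager = data_manager

    def handle_stream_packet(self, device_id, raw_bytes, timestamp):
        # Incoming data, selected per the device's DeviceMode role, is forwarded
        # to the DeviceDataManager for processing.
        self.data_manager.on_raw_frame(device_id, raw_bytes, timestamp)
```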
In its degree of abstraction, or information extraction, a DeviceData object lies between the raw data and the highly compact, A.I.-enhanced DeviceMarks. Within the computational stack, a DeviceData object exists as an instantiation of a subclass of the base DeviceData class. The implementation is specific to the platform on which the service stack runs and is beyond the scope of this specification. However, the spec shall define the requisite data model and methods, as well as the transport-ready manifestations for both storage and streaming, which should largely follow the model laid down by the MPEG-4 standard and its extensions.
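The following is a minimal sketch of what such a base DeviceData data model might look like; the field and method names are assumptions for illustration and would be fixed by the normative data model, not by this example.

```python
# Hypothetical base-DeviceData data model (field names are illustrative assumptions).
from dataclasses import dataclass
from typing import Optional

@dataclass
class DeviceData:
    device_id: str                   # originating Device
    mode_role: str                   # DeviceMode role the stream was selected for
    media_type: str                  # e.g. "depth-map", "audio-segment", "annotation"
    timestamp: float                 # capture time (seconds since epoch)
    duration: float = 0.0            # 0.0 for single-frame data
    payload: Optional[bytes] = None  # raw or processed buffer, if held in memory
    asset_url: Optional[str] = None  # set once archived as an external asset

    def to_storage_form(self) -> bytes:
        """Serialize into a transport-ready container for storage or streaming."""
        raise NotImplementedError  # provided by platform-specific subclasses
```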
- DeviceData objects
-
Examples of typical DeviceData objects include depth-map buffer objects, object representations of an image, timed textual annotations of frame images, morphologically transformed frame-image channels, audio segments, composite images created from multiple video sources, etc., as well as their assemblage into a video-like rendition spanning a set amount of time.
For the latter, just as standard MPEG videos are constructed from several video, audio, and metadata channels, the service stack should provide a set of functions that assemble such processed frames and audio segments into a coherent media container ready for streaming on demand, with the requisite security, encode/decode, and compression schemes.
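As one possible realization of such an assembly function, the sketch below shells out to the ffmpeg CLI to mux processed frame images and an audio track into an MP4 container; the file-naming pattern and codec choices are assumptions, and any security or encryption scheme would be layered on top of this container step.

```python
# Sketch: assemble processed frames + audio into a streamable MP4 via ffmpeg.
import subprocess

def assemble_media_container(frame_pattern: str, audio_path: str,
                             out_path: str, fps: int = 30) -> None:
    """Mux a sequence of processed frames and an audio track into an MP4.

    frame_pattern: printf-style numbering, e.g. "frames/frame_%05d.png"
    """
    subprocess.run([
        "ffmpeg", "-y",
        "-framerate", str(fps), "-i", frame_pattern,  # processed frame images
        "-i", audio_path,                             # processed audio segment
        "-c:v", "libx264", "-pix_fmt", "yuv420p",     # widely playable video codec
        "-c:a", "aac",                                # audio codec
        "-shortest",                                  # stop at the shorter stream
        out_path,
    ], check=True)
```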
Functionally, DeviceData objects play multiple roles in the Example workflow.
- First, they act as the ingredients on which the A.I. recipe operates (i.e., in the DeviceMark extraction workflow) and on which DeviceMark generation is based. Much of the DeviceData used this way may exist only within the service stack and be purged after use.
- Second, some DeviceData objects deemed worthy of archival may remain persistent in the system: the DeviceMarkManager may instruct the DeviceDataManager to archive them in a specific form and store their references in the DeviceMark, to be invoked by the end-user.
While some types of DeviceData may be small enough (e.g., vector properties such as face rectangles, or textual content) to be included as part of the DeviceMark data, bulky buffer items such as depth maps, optical-flow maps, etc. should be treated as assets. Typical assets such as video segments are stored as files in the system, and their URLs (optionally with a time-stamp) are included in the DeviceMark. These archived assets allow end-users to examine a situation more closely, and also allow the service stack to re-process them with better information, if necessary.
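The sketch below illustrates this split, assuming a DeviceMark that carries small results inline and references bulky items as archived assets; the field names (inline_data, asset_refs) are hypothetical and not part of the spec.

```python
# Illustrative sketch: small DeviceData inline vs. bulky DeviceData as asset references.
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

@dataclass
class AssetRef:
    url: str                            # location of the archived asset file
    start_time: Optional[float] = None  # optional time-stamp into the asset
    end_time: Optional[float] = None

@dataclass
class DeviceMark:
    device_id: str
    created_at: float
    inline_data: Dict[str, Any] = field(default_factory=dict)  # e.g. face rectangles, text
    asset_refs: List[AssetRef] = field(default_factory=list)   # e.g. depth maps, video clips

# Small vector/textual results go inline ...
mark = DeviceMark(device_id="cam-01", created_at=1700000000.0)
mark.inline_data["face_rects"] = [(120, 80, 64, 64)]
# ... while bulky buffers are archived and referenced by URL with optional time-stamps.
mark.asset_refs.append(AssetRef(url="file:///assets/cam-01/clip_0042.mp4",
                                start_time=12.0, end_time=18.5))
```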
Note that DeviceData objects may be created and managed throughout the workflow, irrespective of whether a Device is open or not. When a Device is open and actively monitored, the DeviceData may be archived as an asset either as single-frame data or as a short video-like file, depending on the nature of the analytics algorithm that consumes the DeviceData to arrive at a decision regarding DeviceMark generation.
For example, a single recognized face image may be sufficient for deciding whether a family member or a stranger appeared at a spot; on the other hand, a short video clip showing a dangerous action would be worth storing if the decision is based on analysis of moving images.
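A hypothetical sketch of that archival-granularity decision follows; the analyzer names and the choose_archive_form helper are assumptions used only to illustrate the distinction between still-image and motion-based analytics.

```python
# Sketch: choose the archival form based on the kind of analytics that produced the decision.
from enum import Enum

class ArchiveForm(Enum):
    FRAME = "single-frame"
    CLIP = "short-video"

def choose_archive_form(analyzer_kind: str) -> ArchiveForm:
    # Recognition on still images: one frame is sufficient evidence.
    if analyzer_kind in ("face-recognition", "object-detection"):
        return ArchiveForm.FRAME
    # Action/motion analysis: keep the moving images that drove the decision.
    return ArchiveForm.CLIP
```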
- DeviceDataManager
-
Given the wide range of media formats and other types of data, the DeviceDataManager takes the form of an ensemble of various data-processing sub-modules. These sub-modules may utilize most of the existing standard media formats (JPEG, MPEG, WAV, M4V, etc.), but they also handle translation between the proprietary formats designed and managed by sensor/device developers and the Example-conforming Reference Data formats used by analytics modules and downstream apps. An analogous example would be the plug-ins for RAW-format processing alongside standard JPEG/PNG support in photographic applications.
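A minimal sketch of this plug-in arrangement is shown below, focusing only on the translator-registry facet of the DeviceDataManager; the registry and translator names are assumptions for illustration.

```python
# Sketch: plug-in translators that convert proprietary payloads into the Reference Data format.
from typing import Callable, Dict

# A translator takes a raw payload and returns Reference-Data-formatted bytes.
Translator = Callable[[bytes], bytes]

class DeviceDataManager:
    def __init__(self):
        self._translators: Dict[str, Translator] = {}

    def register_translator(self, source_format: str, fn: Translator) -> None:
        # source_format examples: "vendorX-depth-raw", "jpeg", "wav"
        self._translators[source_format] = fn

    def to_reference_data(self, source_format: str, payload: bytes) -> bytes:
        try:
            return self._translators[source_format](payload)
        except KeyError:
            raise ValueError(f"no translator registered for {source_format!r}")
```

Device or sensor developers would supply such translators alongside their proprietary formats, much as RAW-processing plug-ins accompany camera-specific raw files in photographic applications.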