Monitoring the activity of human operators in a world of M2M communications is moving from a dream to a practical, ubiquitous reality. Historically, the human interface was based on touch-oriented entry (gesture, typing, pointing, touching); today, however, systems have many ways to detect human-to-machine interaction, with non-touch motion sensing, 3D gesture, temperature, sound, and visual processing among the leading cost-effective methods. All of these methods now report to a central data repository, and must do so without tethering the products. Their wireless interfaces carry a serial data stream, requiring transformations between serial and parallel modes as the high-speed data is moved around the system. As a result, wireless technology and new high-speed wireless connectivity links to these sensor acquisition systems are the new norm.
Types of data traffic
Network traffic can be grouped into categories based on the size and type of data sent from the sensor system, and on what needs to be stored in the database. The high-level groups for these systems are: (A) small data, local processing, acknowledged transfer; (B) small data, remote processing, acknowledged transfer; (C) large data packet, local processing, acknowledged transfer; (D) large data packet, remote processing, acknowledged transfer; and (E) one-way streaming data from the sensors for another compute environment to handle.
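The five categories can be captured in a small dispatch helper. The sketch below is illustrative only; the 1 KB small-data threshold and the flag names are assumptions for the example, not values taken from any standard.

```python
from enum import Enum

class TrafficClass(Enum):
    A = "small data, local processing, acknowledged"
    B = "small data, remote processing, acknowledged"
    C = "large packet, local processing, acknowledged"
    D = "large packet, remote processing, acknowledged"
    E = "one-way streaming"

SMALL_LIMIT = 1024  # bytes; an assumed threshold for "small data"

def classify(size_bytes: int, remote: bool, streaming: bool) -> TrafficClass:
    """Map a transfer's size and processing location to groups A-E."""
    if streaming:
        return TrafficClass.E
    if size_bytes <= SMALL_LIMIT:
        return TrafficClass.B if remote else TrafficClass.A
    return TrafficClass.D if remote else TrafficClass.C
```

A system designer would use such a mapping once, at design time, to pick the protocol family (small-packet versus Wi-Fi versus streaming) for each sensor path.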
Small data, local processing
Application control interfaces have moved from simple push buttons or full keyboards to menu-based operation through a touchpad or touchscreen. These interfaces require that the control function be defined and expressed as a state machine, procedure, sequence, or algorithm that completes the task at hand. A prime example is a large-screen Point-of-Sale (POS) terminal.
Figure 1: Touchscreen tabletop POS terminal.
Figure 1 shows a touchscreen computer with full, high-resolution graphics serving as the "touch" icons on the screen. When an item is selected, the terminal performs a sequence of steps: updating the register information, initializing the payment processing system, and updating the inventory. Because the processing is local, all register-control actions take place at the terminal, and the only information transferred to the central computer is the SKU and the number of units. This reduced data set is typically less than 1 KB and can be transferred over a number of protocol options, both wired and wireless, with multiple choices among the wireless options. These POS terminals are line-cord operated, so power during the transfer is not an issue. For such small data sets, ZigBee®, Z-Wave®, and Bluetooth® are good alternatives to the overhead of Wi-Fi®.
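A payload this small can be packed into a fixed-width binary record before being handed to the radio. The sketch below shows one hypothetical frame layout; the 12-byte SKU field and the other field widths are assumptions for illustration, not an actual POS wire format.

```python
import struct

# Hypothetical fixed-width record: 12-byte SKU, 16-bit unit count,
# 32-bit transaction number; little-endian, no padding ("<" prefix).
RECORD_FMT = "<12sHI"

def pack_sale(sku: str, qty: int, txn: int) -> bytes:
    """Pack one sale event into an 18-byte frame, far below the 1 KB budget."""
    return struct.pack(RECORD_FMT, sku.encode("ascii"), qty, txn)

def unpack_sale(frame: bytes):
    """Recover (sku, qty, txn) on the central-computer side."""
    sku, qty, txn = struct.unpack(RECORD_FMT, frame)
    return sku.rstrip(b"\0").decode("ascii"), qty, txn
```

An 18-byte frame fits comfortably in a single ZigBee, Z-Wave, or Bluetooth packet, which is why those protocols suit this traffic class.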
The newest generation of human interface for the POS terminal is the multi-connect wireless handheld terminal shown in Figure 2. This device uses more power than most devices of its size because of the number of wireless connection modes and scan capabilities it must support. It features not only Wi-Fi but also cellular connections, so it can transfer data from any location. Terminals of this type generally use SMS-style messaging, rather than full digital data sessions, when they communicate with the home system over the Wi-Fi and GSM bands; the resulting data sets are small, consisting of just the SKU and transaction number. This allows cost-effective use of the terminal without worry about lost data in the transfer.
Figure 2: Wireless handheld portable customer POS terminal.
Small data, remote processing
Handheld devices such as the one shown in Figure 3 were originally designed for wired docking-station applications. They suffer from the reduced processing power available in small form-factor designs. Because they have smaller CPUs and less memory, much of the processing and complexity of analyzing the human touch interface must be done at the central location.
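The division of labor can be sketched as follows: the device only serializes raw touch samples, and the host performs the gesture analysis. The message fields, device name, and the tap-versus-swipe distance threshold below are illustrative assumptions, not taken from any real terminal.

```python
import json

def encode_touch_events(events):
    """Device side: ship raw (t_ms, x, y) samples with no local analysis."""
    return json.dumps({"dev": "hh-01", "events": events}).encode()

def classify_gesture(payload: bytes) -> str:
    """Host side: a placeholder tap-vs-swipe rule standing in for the
    real, heavier analysis done at the central location."""
    events = json.loads(payload)["events"]
    dx = events[-1][1] - events[0][1]
    dy = events[-1][2] - events[0][2]
    return "swipe" if (dx * dx + dy * dy) ** 0.5 > 30 else "tap"
```

The raw-sample payload stays small (tens of bytes per touch), so the transfer still fits the small-data traffic class even though the interpretation happens remotely.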
Figure 3: Handheld POS terminal features (Courtesy Verifone).
The human interfaces for these devices are migrating to an industrial design and form factor that uses capacitive touch-sensitive keys or displays, with a drop-in RF radio to add wireless capability. Development of these designs is aided by Cypress' hardware development kits for PSOC® products, which show how to improve reliability by eliminating mechanical keys and select pads. The development kit is shown in Figure 4. In addition to the change in the GUI, the device's connectivity can take the standard serial data output and use a completed module for the wireless link. Such modules are available for low data blocks (such as the Bluetooth protocol) using devices like the Panasonic PAN1325 Class 1 Bluetooth Module (see Figure 5).
Figure 4: Cypress PSOC 3 FirstTouch™ Starter Kit (Courtesy Cypress Semiconductors).
Figure 5: Panasonic Wi-Fi and Bluetooth module (Courtesy Panasonic).
Large data, local and remote processing
An additional challenge for some of the new portable devices is image- and recognition-based processing in embedded designs. The smaller CPUs and MCUs focus on formatting the images in memory and transferring them either to a separate local compute engine, such as a DSP or multicore configuration, or to a remote, off-device location where the analysis of the human touch interface is completed. As a result, larger data sets, such as the full database of information in a POS transaction, must be sent from the unit.
These transactions may involve 1D and 2D bar codes, UPC codes, QR codes, RF inventory tags, SKU numbers, object quantities, and sometimes even visual images from a camera or scanner. The data sets are in the 100 KB to multi-MB range. For such large pieces of data, Bluetooth, ZigBee, and Z-Wave, which are all small-packet protocols, are not efficient. Instead, Wi-Fi in single- or multiple-antenna configurations is the best choice.
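Moving a 100 KB to multi-MB image over a packet network means splitting it into frames and reassembling them at the far end. A minimal sketch, assuming a hypothetical 1400-byte payload (near a typical Ethernet/Wi-Fi MTU) and a simple sequence-number header:

```python
import struct

CHUNK = 1400  # assumed payload size per frame, near a typical MTU

def split(image: bytes):
    """Yield frames with a 4-byte header: 16-bit sequence, 16-bit total."""
    total = (len(image) + CHUNK - 1) // CHUNK
    for seq in range(total):
        body = image[seq * CHUNK:(seq + 1) * CHUNK]
        yield struct.pack("<HH", seq, total) + body

def reassemble(frames) -> bytes:
    """Rebuild the image; tolerates frames arriving out of order."""
    parts = {}
    total = 0
    for f in frames:
        seq, total = struct.unpack("<HH", f[:4])
        parts[seq] = f[4:]
    return b"".join(parts[i] for i in range(total))
```

In practice Wi-Fi's TCP/IP stack does this segmentation for you; the point of the sketch is why thousands of small-packet exchanges over Bluetooth, ZigBee, or Z-Wave would be far less efficient for the same transfer.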
There are analog front ends, such as the combined Bluetooth/802.11b/g module from Epcos, that take the data from the digital cores and handle all of the analog processing for the two protocols. In an area of only 4.5 mm x 3.2 mm, this adds high-speed 2.4 GHz wireless connectivity to an existing "wire-based" design.
Large data sets can also be exported through fully assembled transceiver units, which are available for multiple protocols, including cellular communications. Transceiver units are needed on both the end-point device and the remote processing station, and they support bandwidths of multiple Mbps per antenna. The units range from small USB-pluggable modules that can retrofit any human interface device with USB serial I/O, to larger wired units with their own power supplies that connect to existing wired Ethernet ports. Fully assembled transceiver products are available from a number of manufacturers, including Digi International, Laird, Multi-Tech Systems, Inc., and Roving Networks, Inc.
Streaming data for remote processing
Streaming data has transitioned from a wired connection to a remote location into a remotely accessed, wireless function. The capability of wireless cameras to monitor human actions in both the visible and alternate light spectra is changing remote monitoring. Given a wired power source, these cameras support full local IP network management over multiple protocols in an autonomous mode. Incorporating both sound and image capture in a single unit, IP cameras can broadcast upwards of 100 Mbps of data on a continual basis for applications such as optical inspection, logging up to 128 images per block at shutter rates of 1/1000 sec to 1/4000 sec.
The blocks of data can then be processed locally in a system application or, more typically, shipped en masse to a data store and compute processing machine. The CCD units typically incorporate some form of multicore or DSP processing to convert the raw camera scan data into an encoded, standardized image format such as TIFF or JPEG; the conversion is handled by a codec in the device. Products such as the Aven 26100-100 1/3" color CCD with DSP (see Figure 6) have these features and work in integrated machine vision applications. These cameras accept two-way communication and support adjustment of functions such as iris, shutter, AGC, white balance, mirror function, B.L.C. (backlight compensation), positive/negative, digital zoom, and flicker.
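The 128-images-per-block figure suggests a simple batching stage between the camera's encoder and the bulk upload. A minimal sketch of that stage (the generator interface is an assumption for illustration, not the camera's actual API):

```python
BLOCK_SIZE = 128  # images per block, per the camera capability cited above

def batch_frames(frames, block_size=BLOCK_SIZE):
    """Group a stream of encoded frames into fixed-size blocks so they
    can be shipped en masse to the data store."""
    block = []
    for frame in frames:
        block.append(frame)
        if len(block) == block_size:
            yield block
            block = []
    if block:  # flush a final, partially filled block
        yield block
```

Batching this way amortizes per-transfer overhead (connection setup, headers, acknowledgments) across many frames, which matters at continuous 100 Mbps rates.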
Figure 6: Aven 1/3” DSP color CCD (Courtesy Aven).
The human interface that results from these images can be incorporated into a touchscreen display. With most touch-sensitive displays, the video image can be auto-centered so that POS-style transactions can be performed on the streaming data to support factory-floor automation or physical facility access. Wireless control makes remote facility access (gates, door release, elevator, call access) possible in secure locations away from the access area. As long as the interface is powered, the access-control methods can communicate with central computing for verification.
A newer area for incorporating wireless connections is biometric-based sensor interfaces, which are generally stand-alone and not connected to standard IT networks. An example is the Atmel® FingerChip® Biometric Module (see Figure 7). This evaluation system allows complete development of the secure sensor authentication interface and communicates over standard I/O. The module can then be connected using a private protocol over an independent channel (for example, a channel that is not a commercial cellular carrier, or an encrypted Wi-Fi connection), allowing secure information transmission and response networking. Coupled with optical image capture and touch interfaces, personnel access to a facility or area can be not only managed but also recorded and analyzed.
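The module's actual protocol is private, but the general idea of authenticated transmission over an independent channel can be illustrated with a generic message authentication code. This sketch covers integrity and authenticity only, not encryption, and assumes the shared key is provisioned out of band; none of it reflects Atmel's real implementation.

```python
import hashlib
import hmac
import os

SHARED_KEY = os.urandom(32)  # in a real system, provisioned out of band

def sign_template(template: bytes) -> bytes:
    """Append an HMAC-SHA256 tag so the host can verify the sensor data
    was produced by a device holding the shared key."""
    return template + hmac.new(SHARED_KEY, template, hashlib.sha256).digest()

def verify(message: bytes) -> bytes:
    """Strip and check the 32-byte tag; raise on any mismatch."""
    template, tag = message[:-32], message[-32:]
    expected = hmac.new(SHARED_KEY, template, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return template
```

The constant-time comparison (`hmac.compare_digest`) matters here: a naive byte-by-byte comparison can leak tag information through timing, which is exactly the kind of side channel a security device must avoid.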
Figure 7: Atmel FingerChip Biometric Module (Courtesy Atmel).
Wireless connectivity also allows access cards and ID cards with magnetic stripes to be used in new locations. As explained previously for the POS terminal application, card readers and smart card readers can be incorporated into existing equipment to track who the operators are and the time and date of use. Some equipment has no active use-tracking and cannot be easily integrated with such products; here, the wireless interface, used in the small-data, remote-processing mode, allows a user card swipe to be added as a portable, low-cost connectivity solution. Use-tracking of equipment is becoming a large financial concern, enabling optimization of maintenance and downtime as well as personnel scheduling for resources. Having imaging capability along with wireless tracking lets a single centralized location keep track of what is happening with the hard assets of the company.
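The card-swipe event itself fits the small-data, remote-processing model: a short, self-describing record sent off to the central tracker. A minimal sketch with hypothetical field names:

```python
import json
import time

def swipe_record(card_id: str, equipment_id: str) -> bytes:
    """Build a small use-tracking event; field names are illustrative."""
    event = {
        "card": card_id,
        "equipment": equipment_id,
        "ts": int(time.time()),  # seconds since the epoch
    }
    return json.dumps(event).encode()
```

All interpretation (who the card belongs to, whether access is allowed, how the usage feeds maintenance scheduling) happens at the central location, keeping the retrofit hardware minimal.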
Next generation features
The newest trend in remote human-interface management concerns not only the monitoring capabilities, but where the monitoring is done. The trend is to collect and process all of the transaction and security information in a private cloud addressable at a fixed, secured IP address. The data is then analyzed into visual reports, representative video or fixed images, that are reviewed on a remote compute device. These reports have moved from simple SMS messages, to photos, to down-sampled live video and received video streams. The end devices are now smartphones and tablets, which get their multi-Mbps bandwidth primarily from 802.11 Wi-Fi. Standard 802.11b/g systems used a single antenna; 802.11n/ac systems are MIMO (multiple input, multiple output) and can support up to four simultaneous data streams in or out, allowing data rates up to Gbps levels. At these rates, multiple data sets can be analyzed and sent to a single device, so the entire visual dynamic can shift. The design changes lie in the application software: there is no fundamental shift in the design architecture for these high-speed applications compared with the low-speed ones. It is simply a matter of identifying the correct design model (A-E) to use.
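The Gbps-level figure follows from multiplying per-stream rates by the stream count. As one worked example, an 802.11ac spatial stream at MCS 9 over an 80 MHz channel carries about 433 Mbps, so four streams approach 1.7 Gbps at the PHY level; real application throughput is lower after MAC and protocol overhead.

```python
def aggregate_rate_mbps(per_stream_mbps: float, streams: int) -> float:
    """Ideal aggregate PHY rate for a MIMO link: streams carry data in
    parallel, so rates add. Usable throughput is lower in practice."""
    return per_stream_mbps * streams

# Four 802.11ac streams at ~433 Mbps each (MCS 9, 80 MHz channel):
four_stream_ac = aggregate_rate_mbps(433, 4)  # 1732 Mbps, i.e. ~1.7 Gbps
```

The same arithmetic shows why a single-antenna 802.11b/g link, at tens of Mbps, cannot feed multiple simultaneous video streams to one device.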
As the mobile experience grows, the human interfaces to these devices need to follow a simple, consistent method across home and industrial applications. It is critical that these systems be based on interaction that is intuitive and does not require any significant learning or training curves.