First, we created self-driving cars. Then, robots with artificial intelligence. Finally, when the Creator Ci20 was born, machines went out of human control. However, someone didn't give up: John Connor developed a new advanced system called VIPE and put the Ci20 under control. Machines began to protect humans.

True Terminator story (1:15 min)

The “Terminator Vision System” project

Skynet forces consist of dozens of factory-produced fighting terminators, including T-1 Tanks, HK-Aerial Hunters, T-7 Tetrapods, Hydrobots and T-800 Robots. Human detection is quite an easy task for Skynet terminators, because human beings significantly differ from robots.

To fight Skynet, humans need a vision system that should not only distinguish robots from humans, but also recognize man-like terminators and spies, manage humans' access levels, etc. The vision system should be able to work both autonomously and in coordination with command centers, to provide driving directions for autonomous vehicles and to designate targets for autonomous defense systems.

The developed Terminator Vision System (TVS) is a key component for automating assault, defense and secure battle missions. It is responsible for battlefield orientation, aiming, movement, etc. The crucial task of the vision system for both combat and defense applications is the identification and classification of human beings.

Terminator Vision System

Terminator Vision System structure

The TVS is a complex Cyber-Physical System that combines computing, control and mechanical parts.

The heart of the TVS is the new compact and high-performance Creator Ci20 platform: a single-board "all-in-one" computer. It carries out the main functions of the TVS: human detection, classification and identification. Additionally, the Ci20 runs the operating system (Debian 8 Linux) and hosts the peripherals (a Microsoft LifeCam HD-3000 camera, a keyboard/mouse when needed, a monitor, etc.). The high-speed Wi-Fi connection allows remote access to the board and reporting of information to the main host.

A compact Arduino Nano board is used as a control device for the mechanical periphery. The Arduino is attached to the Ci20 via a serial interface; it receives control commands from the Ci20 and directly drives the mechanical equipment, which can move platforms, perform various measurements and establish physical control.
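The actual command format of the serial link is not described above, so as an illustration here is a minimal sketch of one plausible text protocol: a single line `P<angle>\n` pans the head to the given angle. The names `encode_pan`/`decode_pan` and the protocol itself are hypothetical, not the TVS implementation.

```cpp
#include <cstdio>
#include <string>

// Hypothetical Ci20 -> Arduino command: "P<angle>\n" pans to <angle> degrees.
// Encodes a pan command, clamping the angle to the servo's 0..180 range.
std::string encode_pan(int angle) {
    if (angle < 0) angle = 0;
    if (angle > 180) angle = 180;
    char buf[16];
    std::snprintf(buf, sizeof(buf), "P%d\n", angle);
    return std::string(buf);
}

// Parses one received line; returns true and stores the angle if it is
// a well-formed pan command, false otherwise (malformed or out of range).
bool decode_pan(const std::string& line, int* angle) {
    if (line.size() < 2 || line[0] != 'P') return false;
    int value = 0;
    if (std::sscanf(line.c_str() + 1, "%d", &value) != 1) return false;
    if (value < 0 || value > 180) return false;
    *angle = value;
    return true;
}
```

A line-oriented text protocol like this is easy to debug over a serial monitor, which matters when the receiving side is a small Arduino sketch.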

TVS structure

As a power source for the TVS, we reuse a successful solution from our previous project: the two-port USB power bank CANYON CNE-CPB78BL. It powers both the Ci20 (with the camera through it) and the Arduino (with the servo through it). The power bank solves two tasks at once: it provides up to 8 hours of autonomous operation (the Creator Ci20 draws up to 0.8 A max, the Arduino Uno up to 0.2 A max) and it provides a coordinated power source for data transfers between the boards (no need for a common electrical ground).
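The "up to 8 hours" figure can be sanity-checked with a back-of-the-envelope calculation. The power bank's capacity is not stated above; the sketch below assumes a nominal 7800 mAh (suggested by the "78" in the model name), which is an assumption, not a documented spec.

```cpp
// Rough autonomous-runtime estimate: capacity divided by total draw.
// ASSUMPTION: 7800 mAh nominal capacity for the CNE-CPB78BL.
// Worst-case load: Ci20 with camera (800 mA) + Arduino with servo (200 mA).
double runtime_hours(double capacity_mah, double load_ma) {
    return capacity_mah / load_ma;
}
// runtime_hours(7800.0, 800.0 + 200.0) gives roughly 7.8 hours,
// which is consistent with the stated "up to 8 hours".
```

Real runtime will be somewhat lower due to conversion losses and the battery's effective capacity, which is why the worst-case currents are used.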

TVS physical scheme

Body structure

We tried to develop the TVS project in a disciplined way, rather than by trial and error. Thus, we started from sketches in which all parts, sizes and couplings were measured and coordinated.

1 / 3 • TVS sketches

Then, we developed 3D models for all necessary parts.

1 / 2 • TVS 3D models

Then, we printed the parts on a 3D printer and painted them chrome.

1 / 3 • TVS printed parts

During assembly and testing, some issues were revealed: for example, we had not taken into account the weight of the moving head, and our servo was not able to turn it. Therefore, we had to replace the servo with a more powerful one and modify the structure to fit it. Finally, everything worked.

TVS assembling (60 sec)

Embedded software

For the TVS software development, we used the VIPE visual development environment. VIPE implements a visual programming approach based on Domain-Specific Languages (DSLs). It allows developing portable software for both general-purpose and embedded systems. A program is constructed from visual operators of the DSL, and only minor involvement of the textual (C/C++) language is required to implement specific functionality.

The VIPE environment

For the TVS we selected the computer vision DSL. The VIPE computer vision DSL is based on the OpenCV library and includes more than 80 functions, which cover most of the required TVS face detection and recognition functionality.

VIPE computer vision DSL

The VIPE environment made it easy to add more vision functions, such as face identification against a database, output messages, etc. Additionally, we added functions to interact with the Arduino using the WiringX library.

1 / 8 • TVS with computer vision DSL in VIPE

The TVS program can be easily executed on various platforms with the VIPE multi-platform back-end tools. Depending on the selected target platform, the back-end allows a one-button run on the host PC or a one-button deployment directly to the Ci20 board.

VIPE one-button deployment

Finally, we've got a cross-platform solution that can run both in a general-purpose and in an embedded environment. The developed application implements many functions: video stream acquisition, face detection and identification, personal info display, mechanical equipment control via the Arduino, etc. All this functionality was implemented with visual programming in VIPE!

TVS development process in VIPE (1:45 min)

The embedded Arduino code is rather small, because it only controls the servo rotation according to the Ci20 commands. However, future versions of the TVS will move much more control functionality into it.
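In essence, all that code has to do is map a commanded angle to a servo pulse width. The sketch below illustrates that mapping under the common hobby-servo convention of a 1000–2000 µs pulse for 0–180°; the actual range of the TVS servo is not given above, so both the convention and the function name are assumptions.

```cpp
// Maps a commanded pan angle to a servo pulse width in microseconds.
// ASSUMPTION: the common 1000 us (0 deg) to 2000 us (180 deg) convention;
// real servos vary, and the Arduino Servo library defaults differ slightly.
int angle_to_pulse_us(int angle) {
    if (angle < 0) angle = 0;          // clamp to the servo's travel
    if (angle > 180) angle = 180;
    return 1000 + (angle * 1000) / 180;
}
```

On the Arduino side this value would typically be fed to something like the Servo library's `writeMicroseconds()` each time a new command arrives over serial.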

Summary

The Creator Ci20 is a high-performance platform that can handle various kinds of tasks, from robotics control up to heavy computation. It gave us a chance to develop, compose and tightly integrate a set of tasks into a single autonomous complex with a wide area of possible applications.

Running the TVS (55 sec)

We see many further steps to extend the TVS. We are planning to integrate the TVS with a mobile platform, so it will be able not only to track a target but also to pursue it. The TVS can also be applied in real control, access and security systems. In addition, we look forward to using the newest Imagination Technologies/ELVEES chips, such as the ELISE SoC.

Acknowledgement

The project was done in the IHPCNT department of SUAI, Saint Petersburg, Russia.

The research leading to these results received funding from the Ministry of Education and Science of the Russian Federation under agreement #14.575.21.0021, identifier RFMEFI57514X0021.