
AI Robotics Case - Open Source AI Object Tracking with SMD

Introduction: Definition and Importance of Object Tracking and AI Object Tracking

Object tracking is a sophisticated technology that entails the continuous observation of moving objects to monitor their position and status over time. This process typically utilizes a combination of sensors, cameras, and specialized software to capture and analyze data about the object's movements and behaviors. In its most fundamental form, object tracking helps in identifying and following an object as it traverses through a given space, ensuring that its trajectory and changes are accurately recorded.

The advent of Artificial Intelligence (AI) has dramatically transformed the field of object tracking, significantly enhancing its efficiency and precision. AI algorithms, with their advanced capabilities, improve the tracking process by providing more accurate predictions and analyses of an object’s path and behavior. This is especially beneficial in complex environments where traditional tracking methods may fall short [1].

In this article, we share detailed information about an AI-supported object tracking application built on ACROME's Smart Motion Devices (SMD) robotic products and open-source AI libraries. The article first gives background on AI-supported object tracking applications, and then walks through the implementation, covering both the software and the hardware in detail. The source code is shared via the GitHub repository and the hardware parts list is provided in the Gitbook document.

Below, you can see a video of the application. For more information about the application, continue reading this article:

Examples of AI-Supported Object Tracking Applications

In industrial settings, AI-supported object tracking elevates operational efficiency by enabling precise monitoring of machinery, inventory, and production processes. It helps in identifying potential issues before they escalate, thus optimizing workflows and reducing downtime. In military applications, AI-driven object tracking is crucial for strategic operations, including surveillance and reconnaissance. It enhances situational awareness and ensures that critical movements and activities are tracked with high accuracy.

An AI object tracking example of street navigation - Image is taken from https://www.augmentedstartups.com/blog/how-to-implement-object-detection-using-deep-learning-a-step-by-step-guide


Industrial Applications of AI-Supported Object Tracking

AI-driven object tracking brings substantial advancements to several critical areas, reshaping how operations are managed. When it comes to production processes, AI excels at swiftly identifying bottlenecks and inefficiencies within production lines. By analyzing real-time data, AI not only highlights these issues but also provides targeted solutions that enhance production speed and optimize resource allocation, leading to a more streamlined and productive workflow. In the realm of quality control, AI plays a pivotal role by continuously monitoring products throughout the manufacturing process. This real-time oversight ensures that each item adheres to quality standards, resulting in improved overall product quality and consistency. 

Additionally, in inventory management, AI-driven tracking systems offer a sophisticated approach to managing stock levels. By meticulously tracking raw materials and finished goods, AI helps prevent unnecessary stockpiling and ensures a timely and efficient supply chain. This leads to better resource management and a more balanced inventory, supporting smooth operational processes and reducing waste. Overall, AI-driven object tracking significantly enhances efficiency, quality, and resource management across industrial operations.

Object detection is the starting point of a tracking application - Image is from https://www.press.bmwgroup.com/

Military Applications of AI-Supported Object Tracking

In military contexts, AI-supported object tracking revolutionizes operational effectiveness in multiple critical ways. For intelligence gathering and reconnaissance, AI-enhanced drones and satellite systems rapidly and accurately collect data across expansive areas, providing timely and comprehensive intelligence. This capability is essential for effective surveillance and situational awareness. In the realm of border security, AI analysis of data from cameras and radar systems plays a crucial role in detecting smuggling activities and illegal crossings, significantly improving the ability to manage and secure borders. When it comes to target tracking and guidance, AI delivers precise targeting and control of munitions, enhancing operational accuracy and reducing the likelihood of collateral damage. Furthermore, AI-powered unmanned aerial vehicles (UAVs) are instrumental in monitoring large regions efficiently, swiftly identifying and tracking moving targets with high precision. 

AI-supported object tracking is used for tracking target vehicles in military applications - Image is taken from https://medium.com/@rizwanye/python-object-detection-using-yolov5-on-identifying-infra-red-military-assets-soldiers-supply-cc1a528f9da5

Overall, AI-supported object tracking not only boosts operational efficiency and security in military applications but also stands as a fundamental technology shaping the future of both military and industrial solutions.

Example Application Details: AI-Supported Object Tracking with a Differential Drive Open-Source Mobile Robot

The project employs various libraries and algorithms to enable a robot to recognize and track specific objects. It utilizes the MobileNet-SSD model for object detection, a webcam for image acquisition, a Jetson Nano for running the AI-supported robotics software, and ACROME’s SMD products for motion control, the mechanical setup, and other sensory tasks. Here are the key points of the major components used in the application:

  • MobileNet-SSD CAFFE Model: Optimized for real-time object detection, this model is integrated with the SMD position and motion library to accurately track the selected object.
  • Jetson Nano: Ideal for image processing and for running hardware-accelerated (GPU) AI algorithms.
  • Smart Motion Devices (SMD) Products: Ideal for managing low-level hardware interactions such as motion control, servo motor management, and auxiliary sensor data acquisition. The mobile robot is also based on the SMD Building Set, a modular, reliable structure made of aluminum alloy parts and flexible joints that allows easy integration of different motor and sensor components, enhancing the project's flexibility.

This object tracking example project demonstrates the successful implementation of AI algorithms, facilitated by ACROME’s Smart Motion Devices (SMD) product line, for building a simple yet effective mobile robot using various SMD modules and an easy-to-use programming library. We will share more information about the robot in the following sections.

Software Details

The complete source code for this project is available on this GitHub repository, which provides all the necessary files and documentation for implementation. 

For a better understanding of the underlying logic and flow of the source code, we have provided a comprehensive flowchart diagram below. This flowchart visually represents the step-by-step processes and decision points that govern the operation of the object detection and tracking algorithm. Each component of the flowchart corresponds to a specific function or module within the source code, offering a clear overview of how the system works as a whole.

Software Flowchart of the AI-Supported Object Tracking Mobile Robot

The software has the following open-source library requirements:

  • OpenCV: The OpenCV Python library provides functions for computer vision and image processing. Its DNN module also loads and runs pre-trained Caffe models for image classification and object detection tasks.
  • SMD: This library provides easy-to-use Python modules and methods for interfacing with Acrome Smart Motor Driver products.
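
Before the detection loop below can run, the Caffe model has to be loaded and each camera frame converted into a network input blob. The following is a minimal sketch of that step using OpenCV's DNN module; the model file names are placeholders for the files provided in the GitHub repository:

   # Minimal sketch: load the MobileNet-SSD Caffe model with OpenCV's DNN module.
   # The file names are placeholders; use the model files from the repository.
   import cv2

   net = cv2.dnn.readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                                  "MobileNetSSD_deploy.caffemodel")

   frame = cv2.imread("example.jpg")              # or a frame grabbed from the webcam
   frame_resized = cv2.resize(frame, (300, 300))  # the network expects 300x300 input

   # Standard MobileNet-SSD preprocessing: scale pixels around a mean of 127.5
   blob = cv2.dnn.blobFromImage(frame_resized, 0.007843, (300, 300), 127.5)
   net.setInput(blob)
   detections = net.forward()                     # shape (1, 1, N, 7), used below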

The example application has two important functions. The first one detects the object and calculates its coordinates. Here is the part of the code responsible for this:

   # Prediction of network
   detections = net.forward()

   # Size of the resized frame (network input is 300x300)
   cols = frame_resized.shape[1]
   rows = frame_resized.shape[0]

   # The detections array stores class, location and confidence
   # values at fixed indices for each detected object.
   for i in range(detections.shape[2]):
       confidence = detections[0, 0, i, 2]  # Confidence of prediction
       if confidence > args.thr:  # Filter weak predictions
           class_id = int(detections[0, 0, i, 1])  # Class label

           # Object location in the resized (300x300) frame
           xLeftBottom = int(detections[0, 0, i, 3] * cols)
           yLeftBottom = int(detections[0, 0, i, 4] * rows)
           xRightTop   = int(detections[0, 0, i, 5] * cols)
           yRightTop   = int(detections[0, 0, i, 6] * rows)

           # Factors for scaling back to the original frame size
           heightFactor = frame.shape[0] / 300.0
           widthFactor  = frame.shape[1] / 300.0

           # Scale the detection to the original frame
           xLeftBottom = int(widthFactor * xLeftBottom)
           yLeftBottom = int(heightFactor * yLeftBottom)
           xRightTop   = int(widthFactor * xRightTop)
           yRightTop   = int(heightFactor * yRightTop)

Once the software has calculated the coordinates of the AI-detected object, they are converted into motion commands. The robot either tracks the object with its pan-tilt camera or turns its own body, using the following code:

    # Turn the body of the robot toward the specified degree
    def turn(self, deg):
        d = 0.5
        deg -= 90
        self.m.set_velocity(0, deg*d)
        self.m.set_velocity(1, deg*d)
        self.m.set_servo(0, 1, 95)      # re-centre the pan servo while turning
        self.servo_pos = 95
        time.sleep(1)
        self.m.set_velocity(0, 0)
        self.m.set_velocity(1, 0)

    # If the object is close, track it with only the servo
    def track_object_close(self, error):
        kp = 0.02
        kd = 0.001

        # PD (proportional-derivative) calculation for the servo, max 180, min 0
        change = kp*error + kd*(self.last_error_servo - error)
        self.servo_pos = min(180, max(0, self.servo_pos + change))
        self.last_error_servo = error
        # Servo position
        print(f"Servo:{self.servo_pos}")
        self.m.set_servo(0, 1, int(self.servo_pos))

        # If the object is getting out of frame, remember the direction
        # and turn the body once the servo nears its limits
        if self.servo_pos > 95:
            self.las_dir = 1
        else:
            self.las_dir = 0
        if self.servo_pos > 160:
            self.turn(self.servo_pos)
        elif self.servo_pos < 20:
            self.turn(self.servo_pos)

There are some important intermediate tasks and conversions as well; however, the complexity of the robotics-related tasks is minimized thanks to the SMD software library used in the project.
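
One of those intermediate conversions is turning the detected bounding box into the pixel error consumed by track_object_close(). A minimal sketch of that step is shown below; the helper name and the example frame width are illustrative and not taken from the project source:

   # Illustrative helper (not from the project source): convert a detected
   # bounding box into the horizontal pixel error used by track_object_close().
   def bbox_to_error(xLeftBottom, xRightTop, frame_width):
       box_center_x = (xLeftBottom + xRightTop) / 2.0
       return box_center_x - frame_width / 2.0    # positive -> object right of centre

   # Example: a box spanning x = 400..520 in a 640-pixel-wide frame
   error = bbox_to_error(400, 520, 640)            # (400 + 520)/2 - 320 = 140 pixels
   # robot.track_object_close(error)               # would pan the camera toward the object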

About SMD Software Library

The motion commands m.set_servo and m.set_velocity, together with the sensor measurement commands, simplify both the software development process and the measurement tasks. Here are some major advantages of using ACROME’s SMD hardware and software in this project:

  • Modular Structure: Facilitates easy integration of various motor and sensor components, increasing the project's flexibility.
  • Easy Configuration: Offers a comprehensive yet straightforward API for configuring and controlling motors and servos, allowing precise control over motor speeds and servo angles.
  • Quick Integration: The ready-made functions in SMD accelerate software development, boosting overall project efficiency.
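
In practice, those ready-made functions reduce the motion layer of this project to a handful of calls. The sketch below uses only the calls that appear in the snippets above; how the m object is constructed and the exact meaning of the module and servo indices are assumptions, so refer to the ACROME SMD Python Library Document [4] for the actual initialization:

   # Sketch only: `m` is the SMD master object created in the project's setup
   # code; its construction is not shown here (see the SMD library document [4]).
   def demo_motion(m):
       m.set_velocity(0, 20)   # left drive motor (module ID 0, assumed)
       m.set_velocity(1, 20)   # right drive motor (module ID 1, assumed)
       m.set_servo(0, 1, 95)   # pan servo; the last argument is the angle in degrees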

Hardware Details

Sensors and actuators are the most important parts of an autonomous mobile robot; together they make up the sensory and motor systems that allow the machine to perceive and navigate its environment. Sensors act as the robot's sensory organs, providing real-time data about the surroundings, such as obstacles, terrain, and ambient conditions. This information is essential for the robot to navigate safely and precisely, make sound decisions, and adapt to changing conditions.

Robot Hardware: Imaging Sensor

The robot relies on a single sensor: a typical USB camera. In this project we used a Logitech HD camera with a 25 FPS frame rate and 1024×768 resolution. This single sensory organ, powered by AI object detection algorithms, provides real-time data about the surroundings. We mounted the camera onto the SMD Building Set's Pan-Tilt kit, which is motorized with 2x servo motors and controlled via SMD Servo add-on modules.

The SMD Pan-Tilt kit is utilized for moving the camera to search the environment without rotating the robot's body

Through the lens of the camera, the robot perceives obstacles, topography, and ambient conditions, allowing it to navigate with precision and make agile decisions. The MobileNet-SSD CAFFE AI model processes this visual information, enabling the robot to detect, classify, and track objects seamlessly. This setup exemplifies the power of AI, where sophisticated algorithms compensate for the lack of multiple sensors, showcasing that a single camera can be as effective as a suite of sensory devices.
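
Grabbing frames from the USB camera and pinning it to the stated resolution is a routine OpenCV task. A minimal sketch is shown below; the device index 0 is an assumption, since the camera may enumerate differently on the Jetson Nano:

   import cv2

   cap = cv2.VideoCapture(0)                  # assumed device index for the USB camera
   cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1024)    # match the Logitech camera's resolution
   cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 768)

   ret, frame = cap.read()                    # one BGR frame for the detector
   if not ret:
       raise RuntimeError("Could not read a frame from the USB camera")
   cap.release()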

Robot Hardware: Robot Chassis and Electronics based on Smart Motion Devices

ACROME’s Smart Motion Devices (SMD) product family forms the infrastructure of the mobile robot used in this project. SMD products fall into three categories, and we use multiple items from each category:

  1. SMD Motor Driver Modules: In this project we used 2x SMD RED Brushed DC Motor driver modules, each connected to a 100 RPM DC motor with a maximum stall torque output of 34 kg⋅cm.

Mobile robots need a motor driver system to control and drive their motors. ACROME’s SMD (Smart Motion Devices) RED Brushed DC Motor driver is used to control and drive the motors in this project.

The SMD RED Brushed DC Motor driver module

The SMD RED is not only easy to use but also “SMART”, meaning that it offloads motor speed control from the main controller (a Jetson Nano in this project). It also collects data from the sensor modules and delivers power through its daisy-chained power network.

  2. SMD Sensor Add-On Modules: In addition to the main drive motor control, we also used SMD servo motor modules to control the servo motors of the SMD Pan-Tilt kit, as well as the SMD Ultrasound module for obstacle avoidance. Obstacle avoidance can be developed further by adding other sensor modules such as Reflectance or Ambient Light.

The SMD sensor modules are connected to each other and to the driver module in a daisy-chain network.

  3. SMD Building Set: This is a modular set of aluminum plates, hinges, joints, and wheels, as well as the pan-tilt mechanism for the camera. Different robot types (differential drive, Ackermann, omni-wheel, etc.) can easily be built with the SMD Building Set.

Various mobile robots can be built with the SMD Building Set; the tracking mobile robot is also built with the same hardware kit.

Computing Hardware

The backbone of this object-tracking robot is the Jetson Nano, a powerhouse in the world of edge AI computing. Compact yet capable, the Jetson Nano provides the computational muscle required to run the Caffe AI model efficiently, communicates with the SMD network to drive the robot and the camera's pan-tilt arm, and runs the tracking algorithm. Its integration into the robot exemplifies how advanced hardware can complement cutting-edge AI algorithms, creating a seamless and powerful object tracking system with minimal resources.
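
How much of that GPU muscle is actually used depends on the OpenCV build. With an OpenCV version (4.2 or later) compiled with CUDA support, the DNN module can be pointed at the Jetson's GPU explicitly; the sketch below assumes such a build, and the model file names are again placeholders:

   # Assumes an OpenCV build with CUDA support; without it, the DNN module
   # falls back to CPU inference.
   import cv2

   net = cv2.dnn.readNetFromCaffe("MobileNetSSD_deploy.prototxt",
                                  "MobileNetSSD_deploy.caffemodel")
   net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)
   net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)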

Application Areas of the Project and Future Developments

The AI object-tracking robot, driven by the Jetson Nano and cutting-edge AI algorithms, is poised to revolutionize various fields with its remarkable versatility.

In the forefront of its applications is security and surveillance, where the robot’s real-time tracking capabilities significantly enhance safety by monitoring and identifying potential threats.

Retail environments also benefit greatly, as the robot streamlines inventory management and tracks customer interactions, leading to optimized operations and improved service. In healthcare, the robot’s ability to monitor patient movements and manage medical supplies brings about greater efficiency and better care.

Industrial sectors see transformative improvements as the robot oversees production lines and equipment, boosting productivity and safety. Logistics and warehousing are optimized with the robot’s precise tracking of inventory and goods, reducing errors and enhancing efficiency. Agriculture is similarly advanced through real-time crop and livestock monitoring, leading to more effective resource management. Smart homes leverage the robot’s tracking features to enhance security and convenience.

Lastly, in the entertainment sector, the robot provides engaging and interactive experiences, enhancing visitor interactions in dynamic environments. This array of applications highlights the robot’s pivotal role in driving innovation across diverse sectors through intelligent, real-time data processing and tracking.

References

[1] https://sertiscorp.medium.com/an-overview-of-object-tracking-use-cases-challenges-and-applications-f5689794c3ba

[2] https://medium.com/@MShivani/object-detection-using-caffe-7ae939a9d910

[3] https://medium.com/@tauseefahmad12/object-detection-using-mobilenet-ssd-e75b177567ee

[4] ACROME SMD Python Library Document

Author

Erhan Türker
Intern Engineer
