Updated On
February 21, 2025

AI Robotics Case - Controlling SMD Mobile Robots with Groq

At ACROME Robotics, we develop easy-to-understand and easy-to-replicate content about AI robotics applications. In a recent article, we introduced the use of Large Language Model (LLM) based AI engines in mobile robots. This is a rapidly evolving area, and we are trying different AI engines and comparing their performance. We also published another article that uses MobileNet SSD's pre-trained deep-learning models on a small, low-cost single-board computer for visual tracking of objects with a mobile robot built from ACROME's Smart Motion Devices (SMD) products, another area seeing constant updates with new AI tools and algorithms.

What is Groq and why should you consider using Groq for controlling a mobile robot?

Groq is an AI inference platform developed by Groq, Inc., a company founded by former Google engineers. Rather than being an LLM itself, Groq builds custom inference hardware, the Language Processing Unit (LPU), and a cloud API that serves open-weight transformer models such as Meta's Llama family at very low latency. This makes it capable of processing large amounts of text and generating high-quality responses in real time.

Why did we select Groq for this application?

According to Groq, its low-latency inference makes it well suited for real-time applications that require fast and accurate text processing, such as chatbots, virtual assistants, and language translation systems. In this application, our goal is indeed to develop a customized robotics chatbot. Groq is also a good fit for applications that require customized solutions, such as tailoring a model's behavior to specific tasks and domains.

Using Groq for mobile robot control makes your robot smarter, more flexible, and highly efficient. Compared with traditional programming, an AI-powered system can understand user commands in natural language, helping to build more human-centered robotics solutions. With further optimization, the AI can process sensor data, respond instantly to environmental changes, determine the optimal path, and avoid obstacles while optimizing its movement. Additionally, it can analyze the robot's sensor data (encoder, current, battery voltage, etc.) to monitor sub-system performance and predict potential failures before they occur.

With Groq's high-speed AI models, the decision-making process is accelerated, opening the door to seamless remote management and integration with cloud-based control systems. As a result, the robot not only executes predefined tasks but can also adapt to the latest developments in the AI robotics field, improving its usability and performance.

Controlling Mobile Robots with Groq AI

The AI-powered autonomous mobile robot combines the simplicity and modularity of ACROME's SMD platform with the intelligence of Groq AI to deliver smooth motion control and real-time decision-making. The application was developed for educational purposes; however, it can be reused for automation and everyday robotics tasks as well.

The robot enables precise motor control, ensuring stability and adaptability in dynamic environments. We have integrated a chatbot connected to Groq AI, allowing real-time communication and intelligent decision-making. The robot can easily be equipped with LiDAR, cameras, and other sensors, enabling obstacle detection and autonomous navigation. Users can interact with the robot via a voice or text-based chatbot, which processes commands and queries Groq AI for real-time data, enhancing the robot’s decision-making capabilities.

Whether used in automated warehouses, smart factories, or customer service environments, this combination provides not only precise movement and scalable electronics, but also interactive engagement and real-time adaptability. The combination of advanced motor control with ACROME's Smart Motion Devices products, AI-driven processing, and Groq's vast knowledge base makes a powerful, intelligent, and efficient robotic system that represents the future of smart automation and AI-integrated robotics.

The code enables remote control of an SMD RED motor through a web-based API, allowing users to manage motor operations with simple commands. It automatically detects the SMD motor via the USB connection, establishes communication, and provides functions to start, stop, and adjust motor speed. Users can start the motor at a specific speed, stop it instantly, or modify its velocity dynamically. The system integrates Groq AI, which enhances motor performance by predicting movement patterns, optimizing speed adjustments, and ensuring precise control in real time. It also logs all operations and potential errors to a file for monitoring and troubleshooting. By combining SMD motor control with Groq-powered optimization, the program provides an efficient, adaptive, and user-friendly solution for automation and robotics applications.
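The auto-detection step can be sketched as a simple port scan. This is an illustrative sketch, not the project's actual code: `probe` is a placeholder for the real handshake, which would open the port with the acrome-smd library and scan for module IDs.

```python
import glob

def find_gateway(candidates, probe):
    """Return the first serial port where an SMD gateway answers the probe.

    `probe` is a callable that tries to reach an SMD device on a port and
    returns True on success; in the real system it would use the
    acrome-smd library's device scan (hypothetical usage).
    """
    for port in candidates:
        if probe(port):
            return port
    return None

# On Linux, candidate USB serial ports can be listed with glob:
usb_ports = glob.glob("/dev/ttyUSB*")
```

A usage sketch: `find_gateway(usb_ports, probe)` returns the gateway's port name, or `None` if no device answers, which the caller can report as a connection error.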

Robot Hardware Details

At the heart of the project lies a mobile robot, built with the modular Smart Motion Devices (SMD) product group. More information is available on the GitBook pages of the SMD products. The SMD products provide an open path for modifying the robot without barriers.

Here are the major parts of the mobile robot and the details of each item in the project.

SMD RED Brushed DC Motor Driver from SMD Electronics Set: The robot is equipped with ACROME's Smart Motion Devices, which provide high torque and precise positioning capabilities. These devices are crucial for the accurate control of the robot’s movements.

Brushed DC Motors from SMD Electronics: The robot uses two DC motors, driven by the SMD RED modules, in a differential-drive configuration. These motors are responsible for the robot's mobility, allowing it to perform linear and radial movements as well as rotations. You may check the Differential Robot Projects in the SMD documentation to learn more about differential mobile robot applications.
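For reference, differential-drive kinematics convert one body velocity command into the two wheel speeds the motor drivers need. This is generic textbook math, not code from the project:

```python
def wheel_speeds(v, omega, track_width):
    """Convert a body velocity command to left/right wheel speeds.

    v           -- forward speed in m/s
    omega       -- rotation rate in rad/s (positive = counter-clockwise)
    track_width -- distance between the two wheels in meters
    """
    v_left = v - omega * track_width / 2.0
    v_right = v + omega * track_width / 2.0
    return v_left, v_right
```

Driving straight gives equal wheel speeds, while a pure rotation command produces equal and opposite speeds, spinning the robot in place.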

Raspberry Pi: The Raspberry Pi serves as the central control unit, running the Flask API that manages the robot's commands. It interfaces with the SMD modules through the SMD USB Gateway module and communicates with the client-side PC over a wireless (or occasionally wired, for small tasks) network. The SMD products have a native Python API.

USB Gateway from SMD Electronics: The SMD communication network connects to the main controller through the USB Gateway, which works best with USB-capable host controllers. Alternatively, UART (TTL) communication is possible with SMD's Arduino Gateway Modules.

Ultrasonic Distance Sensor Module from SMD Electronics: Multiple ultrasonic modules (two or four) are mounted on the robot's chassis and used to prevent collisions. Thanks to the daisy-chain connection, each sensor connects to a nearby SMD RED module. Power and communication are carried over a single RJ-45 type cable, which reduces wire clutter and poor connections.
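The collision-prevention logic can be illustrated with a minimal threshold check. The sensor names and the 20 cm stop threshold are assumptions for the sketch, not values from the project:

```python
def safe_to_move(distances_cm, threshold_cm=20.0):
    """Return False if any ultrasonic reading is below the stop threshold.

    distances_cm maps a sensor position (e.g. 'front_left') to its latest
    reading in centimeters. The threshold value is an assumed example.
    """
    return all(d >= threshold_cm for d in distances_cm.values())
```

The motion loop would call this before each movement step and halt the motors when it returns `False`.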

Battery Management System from SMD Electronics: A battery pack powers both the Raspberry Pi and the motors, ensuring consistent operation during the robot's movement and control processes.

Mechanical Parts from SMD Building Set: The robot chassis is built with the modular SMD building set parts. Major parts are Plates, Joints and the Wheel set. The mechanical parts have different options and alternative mounting points, which gives users freedom to alter the design with minimal effort.

Software Details

The software enables remote control of the robot through a web-based application, allowing users to manage tasks with simple commands.

Starting with the Wi-Fi setup, we establish communication with the robot using the IP scanner panel.

Because the AI part of the robot integrates with Groq AI, users need to enter their own API key once; it is used to access the user's own Groq account.

With the robot and the Groq API successfully connected, the robot is ready to receive commands from the prompt screen. The system allows users to control the robot with voice or text commands, making it highly interactive. Users can either type into the “Command” section and click the “Send Command” button, or start a speech-recognition task with the “Start Listening” button.

Currently, the robot executes commands sequentially and provides written feedback in the application with the result of each command. The application enhances performance by predicting movement patterns, optimizing speed adjustments, and ensuring precise control in real time. Additionally, it logs all operations and potential errors to a file for monitoring and troubleshooting.
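The operation/error log described above could be set up with Python's standard logging module. The file name, format, and the `log_command` helper are illustrative assumptions, not the project's actual configuration:

```python
import logging

# Assumed log file name and format for the sketch.
logger = logging.getLogger("smd_robot")
logger.setLevel(logging.INFO)
handler = logging.FileHandler("robot_operations.log")
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
logger.addHandler(handler)

def log_command(command, result):
    """Record each executed command and its outcome for troubleshooting."""
    if result.get("status") == "ok":
        logger.info("command=%s result=%s", command, result)
    else:
        logger.error("command=%s failed: %s", command, result.get("error"))
```

Every command then leaves a timestamped trace in `robot_operations.log`, with failures recorded at error level so they stand out during troubleshooting.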

This video shows a simple text command entry for controlling the mobile robot built with SMD products and Groq AI:

This video shows a sequence of commands for controlling the same mobile robot:

Tele-operation of the mobile robot with the mobile application (No AI usage in here):

The software is structured to support real-time communication, modular architecture, and extensibility for future updates. The AI part of the application provides functions to start, stop, and adjust motor speed. Users can send commands such as starting the motor at a specific speed, stopping it instantly, or modifying its velocity dynamically.

Client-Side (Android Application)

The mobile application is built using Flutter and serves as the primary user interface for controlling the motion kit. It connects to the Raspberry Pi via Wi-Fi and provides several key functionalities:

Main Features:

1. Device Discovery & Connection

  - The app scans the local network to find available Raspberry Pi devices running the control software.

  - It filters out non-Linux devices and presents a list for selection.

2. Wi-Fi Configuration & Management

  - Allows users to manually enter network SSID and password.

  - Can switch between predefined network profiles for different locations.

3. AI-Powered Voice & Text Commands

  - Users can enter commands like `"Move forward 50 cm, then turn right"` using speech-to-text conversion.

  - AI processes the command and translates it into precise movement instructions.

4. Manual Control Panel

  - Provides on-screen joystick controls for real-time manual navigation.

  - Displays robot telemetry (battery level, speed, network status).

5. Error Handling & Notifications

  - Detects connection issues and provides user-friendly alerts.

  - If an incorrect command is given, the system suggests alternative phrasing.
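The command breakdown in feature 3 can be sketched as a small rule-based parser. This is an illustrative stand-in for the AI parsing step; the regex patterns and the default values (30 cm, 90 degrees) are assumptions, not the project's actual grammar:

```python
import re

# Hypothetical patterns mapping free-form phrases to structured steps.
PATTERNS = [
    (re.compile(r"move (forward|backward)(?: (\d+) ?cm)?", re.I),
     lambda m: {"command": f"move_{m.group(1).lower()}",
                "distance": int(m.group(2) or 30)}),   # 30 cm default assumed
    (re.compile(r"turn (left|right)(?: (\d+) ?degrees)?", re.I),
     lambda m: {"command": f"turn_{m.group(1).lower()}",
                "degrees": int(m.group(2) or 90)}),    # 90 deg default assumed
    (re.compile(r"stop", re.I), lambda m: {"command": "stop"}),
]

def parse_command(text):
    """Split a free-form instruction into structured movement steps."""
    steps = []
    for clause in re.split(r",|\bthen\b", text):
        for pattern, build in PATTERNS:
            match = pattern.search(clause)
            if match:
                steps.append(build(match))
                break
    return steps
```

In the real system this parsing is delegated to the LLM, which handles far more phrasings than a fixed rule set can.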

Pseudo-Function Design

The pseudo-function design ensures an efficient and structured flow of command execution and feedback. The process is divided into several layers:

Processing Steps:

1. User Input Layer

  - Receives user commands from voice or text input.

2. AI Parsing Layer

  - Converts commands into structured movement instructions.

3. Communication Layer

  - Transmits API requests to Raspberry Pi.

4. Execution Layer

  - Robot processes API commands and executes movement.

5. Feedback Layer

  - Sends motion status and telemetry back to the user interface.
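The five layers above can be sketched as a single driver function with injected callables, so each layer can be swapped or mocked. The function names are illustrative, not the project's actual API:

```python
def run_pipeline(raw_text, parse, send, feedback):
    """Drive one user command through the layers described above.

    raw_text -- user input layer: the voice/text command as a string
    parse    -- AI parsing layer: text -> list of structured steps
    send     -- communication + execution layers: step -> result dict
    feedback -- feedback layer: list of results -> user-facing summary
    """
    steps = parse(raw_text)
    results = [send(step) for step in steps]
    return feedback(results)
```

Injecting the layers this way also makes the flow easy to unit-test with stub functions in place of the AI and the robot.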

Integration with LLM (Large Language Model)

The project utilizes Groq AI to enable natural language understanding. The AI performs the following tasks:

Key Functionalities:

Command Breakdown: AI understands and structures complex movement instructions.

Error Detection: AI identifies ambiguous commands and requests clarification.

Learning Mechanism: The system adapts to frequently used commands for faster response.

Multilingual Support: Potentially supports different languages for user interaction.

Llama-3.3-70B-Versatile Integration: The llama-3.3-70b-versatile model, served by Groq, enhances processing efficiency, ensuring accurate interpretation and response generation.

Guidance for LLM

To ensure accuracy and robustness, the LLM follows structured guidance principles:

1. Predefined Command Sets

  - The AI recognizes and prioritizes well-defined motion instructions.

2. Context Awareness

  - AI maintains memory of previous commands for sequential movements.

3. Data Logging & Training

  - Command history is stored for continuous improvement of response accuracy.

4. Real-Time Processing

  - AI processes inputs with minimal latency for smooth robot operation.
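Under these guidance principles, the chat payload sent to the model might be assembled as below. The system prompt wording and the helper name are illustrative assumptions; only the model name comes from the article. The resulting list is what would be passed as `messages` to Groq's OpenAI-compatible chat-completions endpoint:

```python
GROQ_MODEL = "llama-3.3-70b-versatile"  # model named in the article

def build_messages(history, user_command):
    """Assemble the chat payload: guidance prompt + prior turns + new command.

    history is the list of previous chat turns, kept so the model retains
    context for sequential movements.
    """
    system = (
        "You are a robot motion planner. Reply only with the commands "
        "move_forward, move_backward, turn_left, turn_right or stop, "
        "each with its distance in cm or angle in degrees."
    )
    messages = [{"role": "system", "content": system}]
    messages += history                        # context awareness
    messages.append({"role": "user", "content": user_command})
    return messages
```

Keeping the predefined command set in the system prompt is what lets the client translate the model's replies directly into API calls.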

The User Interface (UI)

The Flutter-based UI is designed to be clean, intuitive, and user-friendly. It consists of:

Main Screens:

1. Home Screen

  - Displays available Raspberry Pi devices for connection.

2. Control Panel

  - Provides joystick-based manual control.

  - Allows AI-based command execution.

3. Settings Screen

  - Wi-Fi configuration options.

  - API key management for AI integration.

4. Telemetry Dashboard

  - Shows real-time sensor data from the robot.

Robot Side of the Software

The robot software runs on Raspberry Pi and serves as the command execution engine.

Core Functions:

- Receives API requests and translates them into movement instructions.

- Controls the motors using the Acrome SMD Python library.

- Manages network configurations for seamless connectivity.

- Executes predefined safety checks to prevent collision.

Flask-Based RESTful API

A Flask-based API is implemented on the Raspberry Pi to handle communication with the client application.

API Functionalities:

- Motion commands (forward, backward, turn left, turn right, stop).

- System diagnostics (Wi-Fi status, battery level, sensor readings).

- Error reporting (command failures, connection issues).
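The dispatch from endpoint to robot action can be sketched in plain Python, with the Flask route wiring omitted. The handler names and the `actions` mapping are hypothetical; the endpoint paths mirror the motion commands listed above:

```python
def handle_request(path, params, actions):
    """Dispatch one API request to a robot action.

    actions maps hypothetical command names ('forward', 'left', ...) to
    callables that drive the motors; params holds the query-string values.
    """
    handlers = {
        "/move_forward": lambda: actions["forward"](int(params["cm"])),
        "/move_backward": lambda: actions["backward"](int(params["cm"])),
        "/turn_left": lambda: actions["left"](int(params["degrees"])),
        "/turn_right": lambda: actions["right"](int(params["degrees"])),
        "/stop": lambda: actions["stop"](),
    }
    if path not in handlers:
        return {"status": "error", "message": f"unknown endpoint {path}"}
    handlers[path]()
    return {"status": "ok"}
```

In the real application each entry would be a Flask route, but the table-driven shape makes the command set easy to extend.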

Control Functions Defined in the RESTful API

The API defines various movement control functions that are exposed via HTTP endpoints:

| **Endpoint** | **Functionality** |
| --- | --- |
| `/move_forward?cm=X` | Moves forward by X cm |
| `/move_backward?cm=X` | Moves backward by X cm |
| `/turn_left?degrees=Y` | Turns left by Y degrees |
| `/turn_right?degrees=Y` | Turns right by Y degrees |
| `/stop` | Stops all motion |

API Endpoint Structure

Each API endpoint follows a structured format with:

- Request type: `POST`

- Parameters: Distance, direction, or angle

- Response: JSON status updates with success/failure messages

Example Request:

```json
{
  "command": "move_forward",
  "distance": 50
}
```

Python Library of the SMD Modules

The Acrome SMD Python library (`acrome-smd`) is used for precise motor control.

Library Features:

- Low-level motor control

- Velocity and acceleration adjustments

- Custom movement functions

- Error handling and safety limits
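The safety-limit idea can be illustrated with a small wrapper around the motor interface. The `driver` object stands in for the real acrome-smd motor handle, and its `set_velocity` method is an assumption for the sketch; only the clamping logic is shown:

```python
class SafeMotor:
    """Clamp velocity commands to a configured safety limit before
    forwarding them to the underlying motor driver (stubbed here)."""

    def __init__(self, driver, max_speed):
        self.driver = driver          # stand-in for the real SMD interface
        self.max_speed = max_speed    # safety limit, units of the driver

    def set_velocity(self, speed):
        clamped = max(-self.max_speed, min(self.max_speed, speed))
        self.driver.set_velocity(clamped)  # hypothetical driver call
        return clamped
```

Wrapping the driver this way keeps the safety rule in one place, so every code path (AI commands, joystick, API) passes through the same limit.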

Results and Further Reading

Whether you are starting out with AI robotics or considering new tools and robotics projects, the SMD product family will help you at every level. Feel free to check the do-it-yourself projects at different levels available on the SMD Projects documentation page. Contact us for more information or share your own experience.

Author

Ashkan Zanjani
Software Engineer

Discover Acrome

Acrome was founded in 2013. Our name stands for ACcessible RObotics MEchatronics. Acrome is a worldwide provider of robotic experience with software & hardware for academia, research and industry.