Merge branch 'main' into 35-als-gebruiker-wil-ik-dat-mijn-data-word-opgeslagen-in-een-database-om-data-terug-te-zien

2025-01-13 10:26:29 +01:00
36 changed files with 2043 additions and 380 deletions

.gitignore

@@ -13,7 +13,7 @@ src/Socket/a.out
src/C++/Driver/cmake_install.cmake
src/C++/Socket/a.out
src/C++/Driver/Makefile
-src/C++/Driver/vgcore*
+vgcore*
src/C++/Driver/cmake_install.cmake
src/C++/Driver/Makefile
src/C++/Driver/log
@@ -31,3 +31,5 @@ CMakeFiles/
Makefile
CMakeCache.txt
cmake_install.cmake
+src/C++/OpenCV/main
+.vs

README.md

@@ -1,93 +1,8 @@
-# TI-project
+# TI-project - Kobuki
## Getting started
To make it easy for you to get started with GitLab, here's a list of recommended next steps.
Already a pro? Just edit this README.md and make it your own. Want to make it easy? [Use the template at the bottom](#editing-this-readme)!
## Add your files
- [ ] [Create](https://docs.gitlab.com/ee/user/project/repository/web_editor.html#create-a-file) or [upload](https://docs.gitlab.com/ee/user/project/repository/web_editor.html#upload-a-file) files
- [ ] [Add files using the command line](https://docs.gitlab.com/ee/gitlab-basics/add-file.html#add-a-file-using-the-command-line) or push an existing Git repository with the following command:
```
cd existing_repo
git remote add origin https://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-project.git
git branch -M main
git push -uf origin main
```
## Integrate with your tools
- [ ] [Set up project integrations](https://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-project/-/settings/integrations)
## Collaborate with your team
- [ ] [Invite team members and collaborators](https://docs.gitlab.com/ee/user/project/members/)
- [ ] [Create a new merge request](https://docs.gitlab.com/ee/user/project/merge_requests/creating_merge_requests.html)
- [ ] [Automatically close issues from merge requests](https://docs.gitlab.com/ee/user/project/issues/managing_issues.html#closing-issues-automatically)
- [ ] [Enable merge request approvals](https://docs.gitlab.com/ee/user/project/merge_requests/approvals/)
- [ ] [Set auto-merge](https://docs.gitlab.com/ee/user/project/merge_requests/merge_when_pipeline_succeeds.html)
## Test and Deploy
Use the built-in continuous integration in GitLab.
- [ ] [Get started with GitLab CI/CD](https://docs.gitlab.com/ee/ci/quick_start/index.html)
- [ ] [Analyze your code for known vulnerabilities with Static Application Security Testing (SAST)](https://docs.gitlab.com/ee/user/application_security/sast/)
- [ ] [Deploy to Kubernetes, Amazon EC2, or Amazon ECS using Auto Deploy](https://docs.gitlab.com/ee/topics/autodevops/requirements.html)
- [ ] [Use pull-based deployments for improved Kubernetes management](https://docs.gitlab.com/ee/user/clusters/agent/)
- [ ] [Set up protected environments](https://docs.gitlab.com/ee/ci/environments/protected_environments.html)
***
# Editing this README
When you're ready to make this README your own, just edit this file and use the handy template below (or feel free to structure it however you want - this is just a starting point!). Thanks to [makeareadme.com](https://www.makeareadme.com/) for this template.
## Suggestions for a good README
Every project is different, so consider which of these sections apply to yours. The sections used in the template are suggestions for most open source projects. Also keep in mind that while a README can be too long and detailed, too long is better than too short. If you think your README is too long, consider utilizing another form of documentation rather than cutting out information.
## Name
Choose a self-explaining name for your project.
## Description
-Let people know what your project can do specifically. Provide context and add a link to any reference visitors might be unfamiliar with. A list of Features or a Background subsection can also be added here. If there are alternatives to your project, this is a good place to list differentiating factors.
+This project is a Kobuki robot that drives around in dangerous areas and uses a camera to detect objects in its path.
-## Badges
-On some READMEs, you may see small images that convey metadata, such as whether or not all the tests are passing for the project. You can use Shields to add some to your README. Many services also have instructions for adding a badge.
+## Photos
+![Kobuki](/docs/assets/KobukiPhoto.jpg)
## Visuals
Depending on what you are making, it can be a good idea to include screenshots or even a video (you'll frequently see GIFs rather than actual videos). Tools like ttygif can help, but check out Asciinema for a more sophisticated method.
## Installation
Within a particular ecosystem, there may be a common way of installing things, such as using Yarn, NuGet, or Homebrew. However, consider the possibility that whoever is reading your README is a novice and would like more guidance. Listing specific steps helps remove ambiguity and gets people to using your project as quickly as possible. If it only runs in a specific context like a particular programming language version or operating system or has dependencies that have to be installed manually, also add a Requirements subsection.
## Usage
Use examples liberally, and show the expected output if you can. It's helpful to have inline the smallest example of usage that you can demonstrate, while providing links to more sophisticated examples if they are too long to reasonably include in the README.
## Support
Tell people where they can go to for help. It can be any combination of an issue tracker, a chat room, an email address, etc.
## Roadmap
If you have ideas for releases in the future, it is a good idea to list them in the README.
## Contributing
State if you are open to contributions and what your requirements are for accepting them.
For people who want to make changes to your project, it's helpful to have some documentation on how to get started. Perhaps there is a script that they should run or some environment variables that they need to set. Make these steps explicit. These instructions could also be useful to your future self.
You can also document commands to lint the code or run tests. These steps help to ensure high code quality and reduce the likelihood that the changes inadvertently break something. Having instructions for running tests is especially helpful if it requires external setup, such as starting a Selenium server for testing in a browser.
## Authors and acknowledgment
Show your appreciation to those who have contributed to the project.
## License
For open source projects, say how it is licensed.
## Project status
If you have run out of energy or time for your project, put a note at the top of the README saying that development has slowed down or stopped completely. Someone may choose to fork your project or volunteer to step in as a maintainer or owner, allowing your project to keep going. You can also make an explicit request for maintainers.


@@ -0,0 +1,51 @@
# Systemd Services
## What is a service
A service is a program or script that runs in the background and is managed by the init system (systemd). Enabled services are started at boot time and run until the system is shut down, and they can be started, stopped, and restarted by the system administrator.
## How to manage services with systemd
## Starting a service
To start a service, use the `systemctl start` command followed by the service name. For example, to start the `apache2` service, use the following command:
```bash
sudo systemctl start apache2
```
## Stopping a service
To stop a service, use the `systemctl stop` command followed by the service name. For example, to stop the `apache2` service, use the following command:
```bash
sudo systemctl stop apache2
```
## Restarting a service
To restart a service, use the `systemctl restart` command followed by the service name. For example, to restart the `apache2` service, use the following command:
```bash
sudo systemctl restart apache2
```
## Enabling a service
To enable a service to start at boot time, use the `systemctl enable` command followed by the service name. For example, to enable the `apache2` service, use the following command:
```bash
sudo systemctl enable apache2
```
## Creating a new service
To create a new service, you need to create a new service file in the `/etc/systemd/system/` directory. The service file should have a `.service` extension and contain the following sections:
### Example service file:
```bash
[Unit]
# Description of the service
Description=FlaskApp
# Start the service after the network is up
After=network.target

[Service]
# Run the service as a specific user
User=ishak
# Working directory of the service
WorkingDirectory=/home/ishak/rooziinuubii79/src/Python/flask/web/
# Command to start the service
ExecStart=/usr/bin/gunicorn -w 3 -b 127.0.0.1:5000 app:app
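# Note: if you want the service to start at boot with "systemctl enable"
# (see "Enabling a service" above), the unit also needs an [Install] section;
# a minimal example:
[Install]
WantedBy=multi-user.target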
```
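After creating or editing a service file, run `sudo systemctl daemon-reload` so systemd picks up the change, and then start or enable the service with the commands shown above.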

docs/assets/KobukiPhoto.jpg (new binary file, 491 KiB)

docs/code/Mqtt.md

@@ -0,0 +1,50 @@
# MQTT
## What is MQTT?
MQTT is a lightweight messaging protocol designed for IoT devices. It enables efficient communication between IoT devices, servers, and applications by letting them publish and subscribe to messages on topics.
## How to connect
To connect to an MQTT broker you first need to create an instance of the `MqttClient` class.
Example:
```cpp
// server address, client ID, username, password
MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi", "rpiwachtwoordofzo"); // create a client object
```
Later, in the setup function, you need to call `client.connect();` to connect to the MQTT broker.
```cpp
client.connect();
```
Once the instance is connected, you can subscribe to topics or publish messages to them.
## Subscribing and receiving messages
Example subscribing to a topic:
```cpp
void setup(){
client.subscribe("home/commands");
}
```
Example receiving latest message from a topic:
```cpp
std::string foo(){
std::string latestMqttMessage = "";
latestMqttMessage = client.getLastMessage();
return latestMqttMessage;
}
```
If you want to subscribe to multiple topics, you need to create multiple instances of the MQTT class, as sketched below.
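A minimal sketch, assuming the same `MqttClient` wrapper shown above; the second topic name and the client IDs are placeholders, not values from the project:
```cpp
#include <string>

// One MqttClient instance per subscribed topic (hypothetical topic/client names).
// Each instance needs its own client ID, because most brokers drop an existing
// session when a second client connects with the same ID.
MqttClient commandClient("ws://145.92.224.21/ws/", "KobukiRPI-commands", "rpi", "rpiwachtwoordofzo");
MqttClient statusClient("ws://145.92.224.21/ws/", "KobukiRPI-status", "rpi", "rpiwachtwoordofzo");

void setup() {
    commandClient.connect();
    commandClient.subscribe("home/commands");

    statusClient.connect();
    statusClient.subscribe("home/status"); // placeholder topic
}

// Each instance keeps track of the last message on its own topic.
std::string readStatus() {
    return statusClient.getLastMessage();
}
```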
## Publishing messages
Example publishing a message:
```cpp
void foo(std::string Message){
//topic, payload
client.publishMessage("kobuki/example", Message);
}
```

docs/code/OpenCV.md

@@ -0,0 +1,69 @@
# OpenCV
## Requirements
We want the camera to detect what is happening on the video feed and identify objects, so that the robot can recognize dangers.
## Issues
* OpenCL not detecting the GPU
* Solution: https://github.com/Smorodov/Multitarget-tracker/issues/93
## Installation
### Dependencies
* GLEW (for OpenGL)
* the OpenCV C++ library
Install OpenCV with:
```bash
sudo apt-get install libopencv-dev
```
## Code explanation
### Opening the camera with OpenCV
```cpp
VideoCapture cap(0); // Open the default camera (0), which points to /dev/video0; change the number to select a different camera
if (!cap.isOpened()) { // if the camera could not be opened, print an error message
cerr << "Error: Could not open camera" << endl;
return;
}
```
### Taking a picture and storing it in a variable
```cpp
Mat frame; //create a new Matrix variable called frame
while (true) {
cap >> frame; // Capture a new image frame.
if (frame.empty()) { // if the frame is empty, print an error and try again
cerr << "Error: Could not capture image" << endl;
continue;
}
```
### Encoding the image to send it over MQTT
```cpp
vector<uchar> buf; // create a dynamic buffer for the image
imencode(".jpg", frame, buf); //encode the image to the buffer
auto* enc_msg = reinterpret_cast<unsigned char*>(buf.data());
```
```cpp
void CapnSend() {
  // Publish the encoded image data (the encoding step is shown in the snippet above)
  client.publishMessage("kobuki/cam", string(enc_msg, enc_msg + buf.size()));
  cout << "Sent image" << endl;
  std::this_thread::sleep_for(std::chrono::milliseconds(300)); // Send an image every 300 ms
}
}
```
## Sources
* https://github.com/UnaNancyOwen/OpenCVDNNSample/tree/master


@@ -0,0 +1,25 @@
# Kobuki driver
## How do I communicate with the Kobuki
You can communicate with the Kobuki over USB serial or the large serial port on the front. We chose the USB port, paired with a Raspberry Pi.
The Kobuki sends a sensor-data packet every 20 ms at a baud rate of 115200, and every packet starts with the same two header bytes, 0xAA and 0x55.
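As a simple illustration, here is a minimal sketch (not the project's actual parser) that scans a buffer of received bytes for that 0xAA 0x55 header:
```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Return the index of the first 0xAA 0x55 header in the received bytes, or -1 if none is found.
// A real parser would then read the length byte that follows and verify the checksum.
int findKobukiHeader(const std::vector<uint8_t>& bytes) {
    for (std::size_t i = 0; i + 1 < bytes.size(); ++i) {
        if (bytes[i] == 0xAA && bytes[i + 1] == 0x55) {
            return static_cast<int>(i);
        }
    }
    return -1;
}
```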
## Kobuki payloads
To communicate with the Kobuki we need to send it payloads. These are structured the same way as the payloads the Kobuki sends.
```cpp
unsigned char KobukiPayload[11] = {
0xaa, // Start byte 1
0x55, // Start byte 2
0x08, // Payload length (the first two header bytes don't count)
0x01, // payload type (0x01 = control command)
0x04, // Control byte or additional identifier
actual_speed % 256, // Lower byte of speed value (max actual_speed 1024)
actual_speed >> 8, // Upper byte of speed value
0x00, // Placeholder for radius
0x00, // Placeholder for radius
0x00 // Placeholder for checksum (will be applied later)
};
```
You can also find the documentation about the payloads on the Kobuki website.
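The checksum placeholder in the payload above has to be filled in before sending. Below is a minimal sketch of the scheme commonly used for this protocol, an XOR over every byte after the two 0xAA 0x55 header bytes; the helper name is ours, and the exact definition should be checked against the Kobuki documentation:
```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical helper: XOR all bytes between the 0xAA 0x55 header and the last
// byte, then store the result in the last byte (the checksum placeholder).
void applyKobukiChecksum(unsigned char* payload, std::size_t length) {
    uint8_t checksum = 0;
    for (std::size_t i = 2; i + 1 < length; ++i) { // skip the two header bytes
        checksum ^= payload[i];
    }
    payload[length - 1] = checksum;
}

// Usage with the 11-byte KobukiPayload array shown above:
// applyKobukiChecksum(KobukiPayload, sizeof(KobukiPayload));
```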


@@ -499,32 +499,6 @@ void CKobuki::doRotation(long double th) {
    usleep(25 * 1000);
}
// combines navigation to a coordinate and rotation by an angle, performs
// movement to the selected coordinate in the robot's coordinate system
void CKobuki::goToXy(long double xx, long double yy) {
long double th;
yy = yy * -1;
th = atan2(yy, xx);
doRotation(th);
long double s = sqrt(pow(xx, 2) + pow(yy, 2));
// reset the robot's coordinate system
x = 0;
y = 0;
iterationCount = 0;
theta = 0;
// std::cout << "mam prejst: " << s << "[m]" << std::endl;
goStraight(s);
usleep(25 * 1000);
return;
}
/// @brief Makes the robot move forward for 3 seconds
/// @param speedvalue How fast it will drive forward from 0 - 1024
void CKobuki::forward(int speedvalue) {
@@ -592,7 +566,7 @@ void CKobuki::robotSafety(std::string *pointerToMessage) {
        parser.data.CliffCenter || parser.data.CliffRight) {
        std::cout << "Safety condition triggered!" << std::endl; // Debug print
        *pointerToMessage = "estop";
-        forward(-100); // reverse the robot
+        forward(-300); // reverse the robot
    }
    std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(100)));
}
@@ -606,8 +580,10 @@ void CKobuki::robotSafety() {
        parser.data.BumperRight || parser.data.CliffLeft ||
        parser.data.CliffCenter || parser.data.CliffRight) {
        std::cout << "Safety condition triggered!" << std::endl; // Debug print
-        forward(-100); // reverse the robot
+        forward(-300); // reverse the robot
    }
+    std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(100)));
}
}


@@ -31,7 +31,6 @@
#include <chrono>
#include <sstream>
#include "KobukiParser.h"
-#include "graph.h"
using namespace std;


@@ -2,6 +2,8 @@
#include <iostream>
// should use the check value, or look at the payload length to see which extra fields are present
int KobukiParser::parseKobukiMessage(TKobukiData &output, unsigned char *data) {
+    std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(20))); // avoid busy waiting; the Kobuki sends a message every 20 ms
    int rtrnvalue = checkChecksum(data);
    if (rtrnvalue != 0) {
        // std::cerr << "Invalid checksum" << std::endl;


@@ -2,6 +2,8 @@
#define KOBUKIPARSER_H
#include <vector>
+#include <thread>
struct TRawGyroData {
    int x, y, z;


@@ -1,71 +0,0 @@
#ifndef GRAPH1010
#define GRAPH1010
#include <stdio.h>
#include <stdlib.h>
#include <vector>
using namespace std;
#define GRAPH_ENABLED true
class plot {
public:
FILE *gp;
bool enabled,persist;
plot(bool _persist=false,bool _enabled=GRAPH_ENABLED) {
enabled=_enabled;
persist=_persist;
if (enabled) {
if(persist)
gp=popen("gnuplot -persist","w");
else
gp=popen("gnuplot","w");
}
}
void plot_data(vector<float> x,const char* style="points",const char* title="Data") {
if(!enabled)
return;
fprintf(gp,"set title '%s' \n",title);
fprintf(gp,"plot '-' w %s \n",style);
for(int k=0;k<x.size();k++) {
fprintf(gp,"%f\n",x[k]);
}
fprintf(gp,"e\n");
fflush(gp);
}
void plot_data(vector<float> x,vector<float> y,const char* style="points",const char* title="Data") {
if(!enabled)
return;
fprintf(gp,"set title '%s' \n",title);
fprintf(gp,"plot '-' w %s \n",style);
for(int k=0;k<x.size();k++) {
fprintf(gp,"%f %f \n",x[k],y[k]);
}
fprintf(gp,"e\n");
fflush(gp);
}
~plot() {
if(enabled)
pclose(gp);
}
};
/*
int main(int argc,char **argv) {
plot p;
for(int a=0;a<100;a++) {
vector<float> x,y;
for(int k=a;k<a+200;k++) {
x.push_back(k);
y.push_back(k*k);
}
p.plot_data(x,y);
}
return 0;
}
*/
#endif


@@ -5,6 +5,7 @@ MqttClient::MqttClient(const std::string& address, const std::string& clientId,
// here all the parameters for the connection are set
: client_(address, clientId), username_(username), password_(password), callback_(*this) {
    client_.set_callback(callback_);
    options.set_clean_session(true);
    options.set_mqtt_version(MQTTVERSION_3_1_1); // For MQTT 3.1.1
    if (!username_.empty() && !password_.empty()) {
@@ -36,7 +37,6 @@ void MqttClient::subscribe(const std::string& topic, int qos) {
void MqttClient::publishMessage(const std::string& topic, const std::string& payload) {
    try {
-        std::cout << "Publishing message: " << payload << std::endl;
        client_.publish(topic, payload)->wait();
    } catch (const mqtt::exception& exc) {
        std::cerr << "Error: " << exc.what() << std::endl;


@@ -12,63 +12,52 @@ CKobuki robot;
std::string readMQTT();
void parseMQTT(std::string message);
void CapnSend();
-MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi","rpiwachtwoordofzo"); // create a client object
+//ip, clientID, username, password
+MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi", "rpiwachtwoordofzo"); // create a client object
std::string message = "stop";
std::string serializeKobukiData(const TKobukiData &data);
void sendKobukiData(TKobukiData &data);
-void setup() {
-    unsigned char *null_ptr(0);
-    std::cout << "Attempting to start communication with Kobuki..." << std::endl;
-    robot.startCommunication("/dev/ttyUSB0", true, null_ptr);
-    if (!robot.isConnected()) {
-        std::cerr << "Failed to start communication with Kobuki." << std::endl;
-    } else {
-        std::cout << "Successfully started communication with Kobuki." << std::endl;
-    }
-    // connect mqtt server and sub to commands
+void setup()
+{
+    unsigned char *null_ptr(0);
+    robot.startCommunication("/dev/ttyUSB0", true, null_ptr);
+    //connect mqtt server and sub to commands
    client.connect();
    client.subscribe("home/commands");
}
-int main() {
-    setup();
-    std::thread image(CapnSend);
+int main()
+{
+    setup();
+    std::thread image (CapnSend);
    std::thread safety([&]() { robot.robotSafety(&message); });
    std::thread sendMqtt([&]() { sendKobukiData(robot.parser.data); });
-    while (true) {
-        if (!robot.isConnected()) {
-            std::cout << "Kobuki is not connected anymore. Reconnecting..." << std::endl;
-            robot.startCommunication("/dev/ttyUSB0", true, nullptr);
-            while (!robot.isConnected()) {
-                std::cout << "Attempting to reconnect..." << std::endl;
-                std::this_thread::sleep_for(std::chrono::seconds(1));
-            }
-            std::cout << "Reconnected to Kobuki." << std::endl;
-        }
-        std::string message = readMQTT();
-        if (!message.empty()) {
-            parseMQTT(message);
-        }
-    }
+    while(true){
+        std::string message = readMQTT();
+        if (!message.empty()){
+            parseMQTT(message);
+        }
+    }
    sendMqtt.join();
    safety.join();
    image.join();
}
-std::string readMQTT() {
-    static std::string lastMessage;
+std::string readMQTT()
+{
+    static std::string lastMessage;
    std::string message = client.getLastMessage();
-    if (!message.empty() && message != lastMessage) {
-        std::cout << "MQTT Message: " << message << std::endl;
-        lastMessage = message;
-    }
+    if (!message.empty() && message != lastMessage)
+    {
+        std::cout << "MQTT Message: " << message << std::endl;
+        lastMessage = message;
+    }
    // Add a small delay to avoid busy-waiting
    std::this_thread::sleep_for(std::chrono::milliseconds(100));
@@ -335,45 +324,37 @@ std::string serializeKobukiData(const TKobukiData &data) {
// create extra function to send the message every 100ms
// needed it so it can be threaded
void sendKobukiData(TKobukiData &data) {
    while (true) {
-        // if(!robot.isConnected()){
-        //     std::cout << "Kobuki is not connected anymore" << std::endl;
-        //     robot.startCommunication("/dev/ttyUSB0", true, nullptr);
-        //     while(!robot.isConnected()){
-        //         std::this_thread::sleep_for(std::chrono::seconds(1));
-        //     }
-        // }
        client.publishMessage("kobuki/data", serializeKobukiData(data));
        std::cout << "Sent data" << std::endl;
        std::this_thread::sleep_for(std::chrono::milliseconds(1000));
    }
}
void CapnSend() {
    VideoCapture cap(0);
    if (!cap.isOpened()) {
        cerr << "Error: Could not open camera" << endl;
        return;
    }
    Mat frame;
    while (true) {
        cap >> frame; // Capture a new image frame
        if (frame.empty()) {
            cerr << "Error: Could not capture image" << endl;
            continue;
        }
        // Convert the image to a byte array
        vector<uchar> buf;
        imencode(".jpg", frame, buf);
-        auto *enc_msg = reinterpret_cast<unsigned char *>(buf.data());
+        auto* enc_msg = reinterpret_cast<unsigned char*>(buf.data());
        // Publish the image data
        client.publishMessage("kobuki/cam", string(enc_msg, enc_msg + buf.size()));
        cout << "Sent image" << endl;
-        std::this_thread::sleep_for(
-            std::chrono::milliseconds(300)); // Send image every 1000ms
+        std::this_thread::sleep_for(std::chrono::milliseconds(200)); // Send image every 200ms
    }
}


@@ -1,15 +0,0 @@
cmake_minimum_required(VERSION 3.10)
set(CMAKE_CXX_STANDARD 23)
# Find the Paho MQTT C++ library
find_library(PAHO_MQTTPP_LIBRARY paho-mqttpp3 PATHS /usr/local/lib)
find_library(PAHO_MQTT_LIBRARY paho-mqtt3a PATHS /usr/local/lib)
# Include the headers
include_directories(/usr/local/include)
# Add the executable
add_executable(my_program main.cpp)
# Link the libraries
target_link_libraries(my_program ${PAHO_MQTTPP_LIBRARY} ${PAHO_MQTT_LIBRARY})


@@ -1,64 +0,0 @@
#include <iostream>
#include <mqtt/async_client.h>
#include <thread> // For std::this_thread::sleep_for
#include <chrono> // For std::chrono::seconds
// Define the address of the MQTT broker, the client ID, and the topic to subscribe to.
const std::string ADDRESS("mqtt://localhost:1883"); // Broker address (Raspberry Pi)
const std::string CLIENT_ID("raspberry_pi_client");
const std::string TOPIC("home/commands");
// Define a callback class that handles incoming messages and connection events.
class callback : public virtual mqtt::callback {
// Called when a message arrives on a subscribed topic.
void message_arrived(mqtt::const_message_ptr msg) override {
std::cout << "Received message: '" << msg->get_topic()<< "' : " << msg->to_string() << std::endl;
}
// Called when the connection to the broker is lost.
void connection_lost(const std::string& cause) override {
std::cerr << "Connection lost. Reason: " << cause << std::endl;
}
// Called when a message delivery is complete.
void delivery_complete(mqtt::delivery_token_ptr token) override {
std::cout << "Message delivered!" << std::endl;
}
};
int main() {
// Create an MQTT async client and set up the callback class.
mqtt::async_client client(ADDRESS, CLIENT_ID);
callback cb;
client.set_callback(cb);
// Set up the connection options (such as username and password).
mqtt::connect_options connOpts;
connOpts.set_clean_session(true);
connOpts.set_user_name("ishak");
connOpts.set_password("kobuki");
connOpts.set_mqtt_version(MQTTVERSION_3_1_1);
try {
// Try to connect to the broker and wait until successful.
std::cout << "Connecting to broker..." << std::endl;
client.connect(connOpts)->wait(); // Connect with the provided options
std::cout << "Connected!" << std::endl;
// Subscribe to the specified topic and wait for confirmation.
std::cout << "Subscribing to topic: " << TOPIC << std::endl;
client.subscribe(TOPIC, 1)->wait(); // Subscribe with QoS level 1
// Keep the program running to continue receiving messages from the broker.
while (true) {
std::this_thread::sleep_for(std::chrono::seconds(1)); // Sleep to reduce CPU usage
}
} catch (const mqtt::exception &exc) {
// Catch any MQTT exceptions and display the error message.
std::cerr << "Error: " << exc.what() << std::endl;
return 1;
}
return 0; // Return 0 to indicate successful execution
}


@@ -0,0 +1,44 @@
cmake_minimum_required( VERSION 3.6 )
# Require C++23
set( CMAKE_CXX_STANDARD 23 )
set( CMAKE_CXX_STANDARD_REQUIRED ON )
set( CMAKE_CXX_EXTENSIONS OFF )
set(BUILD_MODE Debug)
# Create Project
project( Sample )
add_executable( YOLOv4 util.h main.cpp )
# Set StartUp Project
set_property( DIRECTORY PROPERTY VS_STARTUP_PROJECT "YOLOv4" )
# Find Package
# OpenCV
find_package( OpenCV REQUIRED )
if( OpenCV_FOUND )
# Additional Include Directories
include_directories( ${OpenCV_INCLUDE_DIRS} )
# Additional Dependencies
target_link_libraries( YOLOv4 ${OpenCV_LIBS} )
endif()
# Download Model
set( MODEL https://github.com/AlexeyAB/darknet/releases/download/darknet_yolo_v3_optimal/yolov4.weights )
file( DOWNLOAD
"${MODEL}"
"${CMAKE_CURRENT_LIST_DIR}/yolov4.weights"
EXPECTED_HASH SHA256=e8a4f6c62188738d86dc6898d82724ec0964d0eb9d2ae0f0a9d53d65d108d562
SHOW_PROGRESS
)
# Download Config
set( CONFIG https://raw.githubusercontent.com/AlexeyAB/darknet/master/cfg/yolov4.cfg )
file( DOWNLOAD
"${CONFIG}"
"${CMAKE_CURRENT_LIST_DIR}/yolov4.cfg"
EXPECTED_HASH SHA256=a6d0f8e5c62cc8378384f75a8159b95fa2964d4162e33351b00ac82e0fc46a34
SHOW_PROGRESS
)

src/C++/OpenCV/YOLOv4 (new binary executable)

src/C++/OpenCV/coco.names

@@ -0,0 +1,80 @@
person
bicycle
car
motorbike
aeroplane
bus
train
truck
boat
traffic light
fire hydrant
stop sign
parking meter
bench
bird
cat
dog
horse
sheep
cow
elephant
bear
zebra
giraffe
backpack
umbrella
handbag
tie
suitcase
frisbee
skis
snowboard
sports ball
kite
baseball bat
baseball glove
skateboard
surfboard
tennis racket
bottle
wine glass
cup
fork
knife
spoon
bowl
banana
apple
sandwich
orange
broccoli
carrot
hot dog
pizza
donut
cake
chair
sofa
pottedplant
bed
diningtable
toilet
tvmonitor
laptop
mouse
remote
keyboard
cell phone
microwave
oven
toaster
sink
refrigerator
book
clock
vase
scissors
teddy bear
hair drier
toothbrush

src/C++/OpenCV/main.cpp

@@ -0,0 +1,209 @@
#include <iostream>
#include <string>
#include <vector>
#include <opencv2/opencv.hpp>
#include <opencv2/dnn.hpp>
#include <filesystem>
#include <fstream>
#include "util.h"
// Helper function to check if a file exists
bool fileExists(const std::string &path)
{
return std::filesystem::exists(path);
}
// Function to read class names from a file
std::vector<std::string> _readClassNameList(const std::string &path)
{
std::vector<std::string> classes;
// Check if file exists
if (!fileExists(path))
{
throw std::runtime_error("Class names file not found: " + path);
}
// Try to open and read file
std::ifstream file(path);
if (!file.is_open())
{
throw std::runtime_error("Unable to open class names file: " + path);
}
std::string line;
while (std::getline(file, line))
{
if (!line.empty())
{
classes.push_back(line);
}
}
if (classes.empty())
{
throw std::runtime_error("No classes found in file: " + path);
}
return classes;
}
int main(int argc, char *argv[])
{
try
{
// Open Video Capture
cv::VideoCapture capture = cv::VideoCapture(0);
if (!capture.isOpened())
{
std::cerr << "Failed to open camera device" << std::endl;
return -1;
}
// Read Class Name List and Color Table
const std::string list = "coco.names";
const std::vector<std::string> classes = _readClassNameList(list);
const std::vector<cv::Scalar> colors = getClassColors(classes.size());
// Debug: Print the size of the colors vector
std::cout << "Number of colors: " << colors.size() << std::endl;
// Read Darknet
const std::string model = "yolov4.weights";
const std::string config = "yolov4.cfg";
cv::dnn::Net net = cv::dnn::readNet(model, config);
if (net.empty())
{
std::cerr << "Failed to load network" << std::endl;
return -1;
}
// Set Preferable Backend
net.setPreferableBackend(cv::dnn::DNN_BACKEND_OPENCV);
// Set Preferable Target
net.setPreferableTarget(cv::dnn::DNN_TARGET_OPENCL);
while (true)
{
// Read Frame
cv::Mat frame;
capture >> frame;
if (frame.empty())
{
cv::waitKey(0);
break;
}
if (frame.channels() == 4)
{
cv::cvtColor(frame, frame, cv::COLOR_BGRA2BGR);
}
// Create Blob from Input Image
cv::Mat blob = cv::dnn::blobFromImage(frame, 1 / 255.f, cv::Size(416, 416), cv::Scalar(), true, false);
// Set Input Blob
net.setInput(blob);
// Run Forward Network
std::vector<cv::Mat> detections;
net.forward(detections, getOutputsNames(net));
// Draw Region
std::vector<int32_t> class_ids;
std::vector<float> confidences;
std::vector<cv::Rect> rectangles;
for (cv::Mat &detection : detections)
{
if (detection.empty())
{
std::cerr << "Detection matrix is empty!" << std::endl;
continue;
}
for (int32_t i = 0; i < detection.rows; i++)
{
cv::Mat region = detection.row(i);
// Retrieve Max Confidence and Class Index
cv::Mat scores = region.colRange(5, detection.cols);
cv::Point class_id;
double confidence;
cv::minMaxLoc(scores, 0, &confidence, 0, &class_id);
// Check Confidence
constexpr float threshold = 0.2;
if (threshold > confidence)
{
continue;
}
// Retrieve Object Position
const int32_t x_center = static_cast<int32_t>(region.at<float>(0) * frame.cols);
const int32_t y_center = static_cast<int32_t>(region.at<float>(1) * frame.rows);
const int32_t width = static_cast<int32_t>(region.at<float>(2) * frame.cols);
const int32_t height = static_cast<int32_t>(region.at<float>(3) * frame.rows);
const cv::Rect rectangle = cv::Rect(x_center - (width / 2), y_center - (height / 2), width, height);
// Add Class ID, Confidence, Rectangle
class_ids.push_back(class_id.x);
confidences.push_back(confidence);
rectangles.push_back(rectangle);
}
}
// Remove Overlap Rectangles using Non-Maximum Suppression
constexpr float confidence_threshold = 0.5; // Confidence
constexpr float nms_threshold = 0.5; // IoU (Intersection over Union)
std::vector<int32_t> indices;
cv::dnn::NMSBoxes(rectangles, confidences, confidence_threshold, nms_threshold, indices);
// Draw Rectangle
for (const int32_t &index : indices)
{
// Bounds checking
if (class_ids[index] >= colors.size())
{
std::cerr << "Color index out of bounds: " << class_ids[index] << " (max: " << colors.size() - 1 << ")" << std::endl;
continue;
}
const cv::Rect rectangle = rectangles[index];
const cv::Scalar color = colors[class_ids[index]];
// Debug: Print the index and color
std::cout << "Drawing rectangle with color index: " << class_ids[index] << std::endl;
constexpr int32_t thickness = 3;
cv::rectangle(frame, rectangle, color, thickness);
std::string label = classes[class_ids[index]] + ": " + std::to_string(static_cast<int>(confidences[index] * 100)) + "%";
int baseLine;
cv::Size labelSize = cv::getTextSize(label, cv::FONT_HERSHEY_SIMPLEX, 0.5, 1, &baseLine);
int top = std::max(rectangle.y, labelSize.height);
cv::rectangle(frame, cv::Point(rectangle.x, top - labelSize.height),
cv::Point(rectangle.x + labelSize.width, top + baseLine), color, cv::FILLED);
cv::putText(frame, label, cv::Point(rectangle.x, top), cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(255, 255, 255), 1);
}
// Show Image
cv::imshow("Object Detection", frame);
const int32_t key = cv::waitKey(1);
if (key == 'q')
{
break;
}
}
cv::destroyAllWindows();
return 0;
}
catch (const std::exception &e)
{
std::cerr << "Error: " << e.what() << std::endl;
return -1;
}
}
// cloned and fixed from https://github.com/UnaNancyOwen/OpenCVDNNSample/tree/master

src/C++/OpenCV/util.h

@@ -0,0 +1,61 @@
#ifndef __UTIL__
#define __UTIL__
#include <vector>
#include <string>
#include <fstream>
#include <opencv2/dnn.hpp>
#include <opencv2/core.hpp>
#include <opencv2/highgui.hpp>
// Get Output Layers Name
std::vector<std::string> getOutputsNames( const cv::dnn::Net& net )
{
static std::vector<std::string> names;
if( names.empty() ){
std::vector<int32_t> out_layers = net.getUnconnectedOutLayers();
std::vector<std::string> layers_names = net.getLayerNames();
names.resize( out_layers.size() );
for( size_t i = 0; i < out_layers.size(); ++i ){
names[i] = layers_names[out_layers[i] - 1];
}
}
return names;
}
// Get Output Layer Type
std::string getOutputLayerType( cv::dnn::Net& net )
{
const std::vector<int32_t> out_layers = net.getUnconnectedOutLayers();
const std::string output_layer_type = net.getLayer( out_layers[0] )->type;
return output_layer_type;
}
// Read Class Name List
std::vector<std::string> readClassNameList( const std::string list_path )
{
std::vector<std::string> classes;
std::ifstream ifs( list_path );
if( !ifs.is_open() ){
return classes;
}
std::string class_name = "";
while( std::getline( ifs, class_name ) ){
classes.push_back( class_name );
}
return classes;
}
// Get Class Color Table for Visualize
std::vector<cv::Scalar> getClassColors( const int32_t number_of_colors )
{
cv::RNG random;
std::vector<cv::Scalar> colors;
for( int32_t i = 0; i < number_of_colors; i++ ){
cv::Scalar color( random.uniform( 0, 255 ), random.uniform( 0, 255 ), random.uniform( 0, 255 ) );
colors.push_back( color );
}
return colors;
}
#endif // __UTIL__

src/C++/OpenCV/yolov4.cfg (new file, 1158 lines; diff not shown)

src/Python/YOLO/app.py

@@ -0,0 +1,41 @@
from ultralytics import YOLO
import cv2
import numpy as np
import requests
import time
model = YOLO("yolo11n.pt")
#try to fetch the image from the given url
def fetch_image(url):
try:
response = requests.get(url)
response.raise_for_status()
image_array = np.frombuffer(response.content, np.uint8)
image = cv2.imdecode(image_array, cv2.IMREAD_COLOR)
return image
except requests.RequestException as e:
print(f"Error: Could not fetch image - {e}")
return None
# URL of the photostream
url = "http://145.92.224.21/image"
while True:
frame = fetch_image(url)
if frame is None:
print("Error: Could not fetch image, retrying...")
time.sleep(1) # Wait for 1 second before retrying
continue
# Predict on the frame
results = model(frame)
# Display the results
results[0].show()
# Exit if 'q' is pressed
if cv2.waitKey(1) & 0xFF == ord('q'):
break
cv2.destroyAllWindows()


@@ -0,0 +1 @@
__pycache__


@@ -0,0 +1,18 @@
FROM python:3.9
WORKDIR /app
COPY . .
RUN apt-get update && apt-get install -y libgl1
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "web/app.py"]
#build instruction: sudo docker buildx build -t flaskapp:latest .
#run instruction: sudo docker run --network="host" flaskapp:latest
# need to use network host to connect to the host's mqtt server


@@ -0,0 +1,5 @@
Flask==3.1.0
paho-mqtt==1.6.1
ultralytics==8.3.58
opencv-python-headless==4.6.0.66
numpy==1.23.4


@@ -1,13 +1,54 @@
-from flask import Flask, request, render_template, jsonify, g
+from flask import Flask, Response, request, render_template, jsonify
import paho.mqtt.client as mqtt
+from ultralytics import YOLO
+import cv2
+import numpy as np
+import threading
import mysql.connector
import json
app = Flask(__name__)
-# Global variables
+# Load a model
+model = YOLO("yolo11n.pt")  # pretrained YOLO11n model
kobuki_message = ""
latest_image = None
+processed_image = None
+yolo_results = []
+# Lock for thread-safe access to shared variables
+lock = threading.Lock()
+# List of class names (example for COCO dataset)
+yolo_classes = list(model.names.values())
+def on_message(client, userdata, message):
+    global kobuki_message, latest_image, processed_image, yolo_results
+    if message.topic == "kobuki/data":
+        kobuki_message = str(message.payload.decode("utf-8"))
+    elif message.topic == "kobuki/cam":
+        with lock:  # Lock the shared variables so threads can't access them at the same time and you can't get half-processed images
+            latest_image = np.frombuffer(message.payload, np.uint8)
+            latest_image = cv2.imdecode(latest_image, cv2.IMREAD_COLOR)
+            # Process the image with YOLO
+            results = model(latest_image)
+            yolo_results = []
+            processed_image = latest_image.copy()  # Create a copy for processing
+            for result in results:
+                for box in result.boxes:
+                    class_id = int(box.cls.item())
+                    class_name = yolo_classes[class_id]
+                    yolo_results.append({
+                        "class": class_name,
+                        "confidence": box.conf.item(),
+                        "bbox": box.xyxy.tolist()
+                    })
+                    # Draw bounding box on the processed image
+                    x1, y1, x2, y2 = map(int, box.xyxy[0])
+                    cv2.rectangle(processed_image, (x1, y1), (x2, y2), (0, 255, 0), 2)
+                    cv2.putText(processed_image, f"{class_name} {box.conf.item():.2f}", (x1, y1 - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
# Global MQTT setup
def on_message(client, userdata, message):
@@ -26,6 +67,9 @@ mqtt_client.username_pw_set("server", "serverwachtwoordofzo")
mqtt_client.connect("localhost", 1884, 60)
mqtt_client.loop_start()
mqtt_client.subscribe("kobuki/data")
+mqtt_client.subscribe("kobuki/cam")
+mqtt_client.on_message = on_message  # this line needs to be under the function definition otherwise it can't find which function it needs to use
mqtt_client.on_message = on_message  # this line needs to be under the function definition otherwise it can't find which function it needs to use
@@ -60,11 +104,6 @@ def control():
    else:
        return ('Unauthorized', 401, {'WWW-Authenticate': 'Basic realm="Login Required"'})
-@app.route('/data', methods=['GET'])
-def data():
-    return kobuki_message
@app.route('/move', methods=['POST'])
def move():
    data = request.get_json()
@@ -127,5 +166,28 @@ def sensor_data(kobuki_message):
    except mysql.connector.Error as err:
        print(f"Database error: {err}")
+@app.route('/data', methods=['GET'])
+def data():
+    return kobuki_message
+@app.route('/image')
+def image():
+    global processed_image
+    with lock:  # Lock the shared variables so threads can't access them at the same time and you can't get half-processed images
+        if processed_image is not None:
+            _, buffer = cv2.imencode('.jpg', processed_image)
+            return Response(buffer.tobytes(), mimetype='image/jpeg')
+        else:
+            return "No image available", 404
+@app.route('/yolo_results', methods=['GET'])
+def yolo_results_endpoint():
+    global yolo_results
+    with lock:
+        return jsonify(yolo_results)
if __name__ == '__main__':
    app.run(debug=True, port=5000)


@@ -1,41 +1,61 @@
-document.querySelectorAll(".btn").forEach(button => {
-    button.addEventListener("click", async function(event) { // make the function async
+document.addEventListener("DOMContentLoaded", function() {
+    document.querySelectorAll(".btn").forEach(button => {
+        button.addEventListener("click", function(event) {
            event.preventDefault(); // prevents page refresh
            // Get the value of the button
            const direction = event.target.value;
-        try {
-            const response = await fetch("/move", {
+            fetch("/move", {
                method: "POST",
                headers: {
                    "Content-Type": "application/json"
                },
                body: JSON.stringify({ direction: direction })
+            })
+            .then(response => response.json())
+            .then(data => {
+                console.log("Success:", data);
+            })
+            .catch(error => {
+                console.error("Error:", error);
            });
-            const data = await response.json();
-            console.log("Success:", data);
-        } catch (error) {
-            console.error("Error:", error);
-        }
+        });
+    });

    // Fetch data from the server
    async function fetchData() {
-        try {
        const response = await fetch("/data");
        const data = await response.json();
        return data;
-        } catch (error) {
-            console.error("Error:", error);
-        }
    }

    // Parse the data and show it on the website
    const data = await fetchData();
    const sensorDataContainer = document.getElementById("sensor-data");
    sensorDataContainer.innerHTML = ""; // Clear previous data
-    //for each object in json array create a new paragraph element and append it to the sensorDataContainer
+    // For each object in JSON array, create a new paragraph element and append it to the sensorDataContainer
    for (const [key, value] of Object.entries(data)) {
        const dataElement = document.createElement("p");
        dataElement.textContent = `${key}: ${value}`;
        sensorDataContainer.appendChild(dataElement); // Add the element to the container
    }
-});
+    // Update the image
+    function updateImage() {
+        var img = document.getElementById("robot-image");
+        img.src = "/image?" + new Date().getTime(); // Add timestamp to avoid caching
+    }
+    // Fetch and display sensor data every 1 second
+    setInterval(parseData, 1000);
+    // Update the image every 100 milliseconds
+    setInterval(updateImage, 100);
});


@@ -167,4 +167,4 @@ th,td {
th {
    background-color: #f2f2f2;
    text-align: left;
}



@@ -1,7 +0,0 @@
import sys
import logging
logging.basicConfig(stream=sys.stderr)
sys.path.insert(0, "/home/ishak/rooziinuubii79/src/Python/flask/web")
from app import app as application


@@ -0,0 +1,13 @@
[Unit]
Description=kobukiDriver
After=network.target
[Service]
User=user1
WorkingDirectory=/home/user1/rooziinuubii79/src/C++/Driver/
ExecStart=/home/user1/rooziinuubii79/src/C++/Driver/kobuki_control
Restart=always
RestartSec=5
[Install]
WantedBy=multi-user.target


@@ -0,0 +1,7 @@
allow_anonymous false
password_file /etc/mosquitto/passwordfile
listener 8080
protocol websockets
listener 1884
protocol mqtt


@@ -0,0 +1,22 @@
server {
listen 80;
server_name 145.92.224.21;
# Proxy WebSocket connections for MQTT
location /ws/ {
proxy_pass http://localhost:9001;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host $host;
}
# Proxy HTTP connections for Flask
location / {
proxy_pass http://localhost:5000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}


@@ -0,0 +1,7 @@
stream {
server {
listen 9001;
proxy_pass localhost:8080;
}
}