122 Commits

Author SHA1 Message Date
1431d2c164 updated set up docs 2025-01-27 08:41:19 +01:00
0a8b96a45a increase image compression level and re-enabled thread 2025-01-22 13:46:47 +01:00
69eba455f9 added image compression to thread 2025-01-22 13:32:43 +01:00
e262325565 disabled image logic 2025-01-22 12:28:36 +01:00
ishak jmilou.ishak
f493665275 never called the function anywhere 2025-01-21 16:13:26 +01:00
ishak jmilou.ishak
899aa94b40 separate yolo results because it now updates only when going to /yolo_results 2025-01-21 16:01:59 +01:00
ishak jmilou.ishak
d5524d7890 shouldn't have done POST 2025-01-21 15:09:23 +01:00
ishak jmilou.ishak
976840c6b2 forgot to do system prune 2025-01-21 14:28:40 +01:00
ishak jmilou.ishak
88364561ea got error when running new docker 2025-01-21 14:27:44 +01:00
ishak jmilou.ishak
64d2aedc3b feat: allow POST method for yolo_results endpoint 2025-01-21 14:16:26 +01:00
ishak jmilou.ishak
6597cb133a checking for error in db input 2025-01-21 14:10:31 +01:00
ishak jmilou.ishak
ec44cb955b feat: implement automatic reconnection for Kobuki when disconnected 2025-01-21 12:29:03 +01:00
ishak jmilou.ishak
c74b9a8758 refactor: use dynamic port detection for Kobuki communication setup 2025-01-21 12:26:41 +01:00
ishak jmilou.ishak
b20b9b693a changed portname to const char 2025-01-21 12:25:06 +01:00
ishak jmilou.ishak
99599a6c21 refactor: change portname parameter to const char* in startCommunication and improve USB device check logic 2025-01-21 12:23:20 +01:00
ishak jmilou.ishak
29ef742a94 fix: use const_cast for port string in Kobuki communication 2025-01-20 15:22:16 +01:00
ishak jmilou.ishak
0e9d9dda68 let kobuki connect on open ports now 2025-01-20 15:18:37 +01:00
ishak jmilou.ishak
2c630bf89b refactor: improve Kobuki connection handling and add USB device check 2025-01-20 12:11:38 +01:00
ishak jmilou.ishak
ef3407b742 refactor: reorder includes and improve code formatting for readability 2025-01-20 11:15:45 +01:00
ishak jmilou.ishak
3d9a68ff7f added else statement 2025-01-20 11:09:12 +01:00
ishak jmilou.ishak
aedff1c2cc made more debug 2025-01-20 10:58:34 +01:00
ishak jmilou.ishak
c31689ac70 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2025-01-16 13:55:25 +01:00
ishak jmilou.ishak
1ab718a472 new reconnect function 2025-01-16 13:55:24 +01:00
bbade2384c Merge branch 'usb-reconnect' into 'main'
opencv camera logic rewrite

See merge request technische-informatica-sm3/ti-projectten/rooziinuubii79!5
2025-01-16 12:36:32 +01:00
e04cff3d65 opencv camera logic rewrite 2025-01-16 12:28:40 +01:00
ishak jmilou.ishak
36aaee9bad removed commented code 2025-01-16 12:15:14 +01:00
7b51330675 update image refresh logic to be more optimized 2025-01-15 16:32:32 +01:00
3bb44ad4ab updated readme 2025-01-15 16:32:32 +01:00
ishak jmilou.ishak
fb12b20a0b changed column name, added print 2025-01-15 15:46:39 +01:00
ishak jmilou.ishak
1b3ccd1e72 commented yolor_result_db 2025-01-15 15:21:29 +01:00
ishak jmilou.ishak
a16abe068c returned to other function 2025-01-15 15:11:47 +01:00
ishak jmilou.ishak
9f7d7e7ac9 fixed insert into typo 2025-01-15 14:53:10 +01:00
ishak jmilou.ishak
6f34a0f554 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2025-01-15 14:43:41 +01:00
ishak jmilou.ishak
364f6e5259 test whether this is the problem for the camera 2025-01-15 14:43:40 +01:00
7c30d838f7 added thread.sleep to prevent console flood 2025-01-15 14:40:38 +01:00
ishak jmilou.ishak
50bf777f78 cam does not work 2025-01-15 14:27:33 +01:00
ishak jmilou.ishak
95e2d292c9 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2025-01-15 14:10:17 +01:00
ishak jmilou.ishak
3367f1dbd2 return yolo result db 2025-01-15 14:09:46 +01:00
e273e175cb remove old code 2025-01-15 13:53:33 +01:00
06e08a2cfb camera reconnection added 2025-01-15 13:52:13 +01:00
ishak jmilou.ishak
4e78caa577 Comment out call to yolo_results_db function in on_message handler 2025-01-15 12:22:44 +01:00
ishak jmilou.ishak
5b0e843654 Add call to yolo_results_db function after processing YOLO results 2025-01-15 11:36:54 +01:00
ishak jmilou.ishak
8b66702605 changed function 2025-01-14 16:50:11 +01:00
ishak jmilou.ishak
d8b3ec2938 added thread 2025-01-14 16:39:52 +01:00
ishak jmilou.ishak
97076dfe05 Add Kobuki connection monitoring and automatic start/stop functionality 2025-01-14 16:35:51 +01:00
ishak jmilou.ishak
967bc8247c Refactor YOLO results handling by separating database insertion logic into a dedicated function 2025-01-14 15:37:08 +01:00
ishak jmilou.ishak
5d61579973 Refactor YOLO results endpoint to handle empty results and improve database insertion logic 2025-01-14 14:22:50 +01:00
ishak jmilou.ishak
ebd88e43ab Add error handling and database insertion for YOLO results 2025-01-14 13:23:48 +01:00
ishak jmilou.ishak
2fbe18be76 went back 2025-01-14 13:14:38 +01:00
ishak jmilou.ishak
74d9687af5 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2025-01-14 12:30:56 +01:00
ishak jmilou.ishak
48023773c6 went back to older version. db gets empty rows 2025-01-14 12:30:54 +01:00
56ac9cf687 change dockerfile command 2025-01-14 12:11:50 +01:00
ishak jmilou.ishak
3232ff121f changed db connection 2025-01-14 12:08:08 +01:00
5844387b19 fixed small merge mistake 2025-01-14 11:56:28 +01:00
ishak jmilou.ishak
b48243f831 changed sensor data to db in other function 2025-01-13 14:57:52 +01:00
317731ec87 python merge fix 2025-01-13 11:00:43 +01:00
441ca19578 repaired js after merge 2025-01-13 10:44:20 +01:00
7f807d0031 added g import from flask 2025-01-13 10:39:23 +01:00
c0ec6901c4 edited python requirements 2025-01-13 10:33:46 +01:00
2fa8fb2926 Merge branch '35-als-gebruiker-wil-ik-dat-mijn-data-word-opgeslagen-in-een-database-om-data-terug-te-zien' into 'main'
Resolve "Als gebruiker wil ik dat mijn data word opgeslagen in een database om data terug te zien"

Closes #35

See merge request technische-informatica-sm3/ti-projectten/rooziinuubii79!4
2025-01-13 10:27:00 +01:00
3fddee73c7 Merge branch 'main' into 35-als-gebruiker-wil-ik-dat-mijn-data-word-opgeslagen-in-een-database-om-data-terug-te-zien 2025-01-13 10:26:29 +01:00
ishak jmilou.ishak
1fd88c7636 added some info on the readme 2025-01-08 15:19:03 +01:00
ff7b148556 start readme 2025-01-08 14:20:38 +01:00
6fe28f997a edited kobuki speedvalue for safety 2025-01-07 12:51:24 +01:00
1bf9ebddab added mutex in python 2025-01-07 11:30:04 +01:00
585a0e9a52 fix thread crash 2025-01-07 11:19:47 +01:00
cb988a5260 store new image in processed_image 2025-01-07 11:09:33 +01:00
5e01e25d9c comment update 2025-01-06 16:40:40 +01:00
0d184261fd updated systemd kobukidriver service file 2025-01-06 16:40:00 +01:00
ccaa722973 edited startup file for kobukidriver
(everything works now)
2025-01-06 16:31:52 +01:00
7d1b878c30 fix yolo image boxes 2025-01-06 16:11:03 +01:00
228c508012 attempt to fix broken code 2025-01-06 16:02:17 +01:00
7845feb9f8 update yolo naming in image 2025-01-06 15:55:24 +01:00
20d6d8799d attempt to show name next to image box 2025-01-06 15:45:29 +01:00
9c7c774030 change boxing and text error of YOLO 2025-01-06 15:26:00 +01:00
0832da0d3b change mqtt port in python 2025-01-06 15:16:13 +01:00
a59b9c8714 requirements update 2025-01-06 15:11:27 +01:00
4a05ec5efc added dockerfile 2025-01-06 15:11:21 +01:00
c3d575ccf1 added yolo image object detection to /image 2025-01-06 13:00:00 +01:00
1b0b1e87ce change directories and added driver service config 2025-01-06 11:44:08 +01:00
9c41d64c69 remove mqtt debug print 2025-01-06 11:23:27 +01:00
b48eda9735 change image refresh time 2025-01-06 10:35:54 +01:00
629f9cba92 attempt to fix javascript 2025-01-06 10:34:00 +01:00
51aad34c78 added bracket 2025-01-06 10:08:05 +01:00
1b3fead2b3 documentation 2025-01-06 09:49:56 +01:00
ishak jmilou.ishak
61651a9a02 try to see why sensor value doesn't show on website 2024-12-18 12:22:22 +01:00
ishak jmilou.ishak
e5881f1b37 making message print only once 2024-12-18 11:56:43 +01:00
50b6b83299 fix headerfile 2024-12-17 11:19:15 -04:00
10597ba37d avoid busy waiting 2024-12-17 11:17:25 -04:00
6ab5716797 Documentation kobuki, mqtt, opencv 2024-12-13 18:33:24 +01:00
50b90e461f gitignore update 2024-12-13 10:52:53 +01:00
2811036595 change port of mqtt flask 2024-12-12 14:01:39 +01:00
58f1a931a6 Merge branch 'OpenCV' into 'main'
Open cv

See merge request technische-informatica-sm3/ti-projectten/rooziinuubii79!3
2024-12-12 13:48:59 +01:00
c9d3b0f795 Merge branch 'main' into 'OpenCV'
# Conflicts:
#   src/Python/flask/web/app.py
#   src/Python/flask/web/static/script.js
2024-12-12 13:48:21 +01:00
85af15d7a3 change default camera 2024-12-12 13:28:43 +01:00
a1b50a3780 changes to video settings 2024-12-12 13:27:06 +01:00
b86528595e change camera 2024-12-11 16:51:01 +01:00
eef4f9c79c revert video format change 2024-12-11 16:39:45 +01:00
3c23d37be1 change video format 2024-12-11 16:37:01 +01:00
c2886d32c9 use libcamera with picam 2024-12-11 16:30:14 +01:00
8158c85d6e use astra backend 2024-12-11 16:12:16 +01:00
e682969ec8 code revert 2024-12-11 16:07:26 +01:00
0dfc3b5c13 attempt with gstreamer 2024-12-11 15:43:05 +01:00
7f786d5197 change camera logic 2024-12-11 15:37:31 +01:00
60ba177dc2 add pipeline for picam 2024-12-11 15:34:53 +01:00
e9f998b3e7 set V4L2 backend 2024-12-11 15:28:21 +01:00
7eeaba482e removed attempt for camera detection 2024-12-11 14:50:02 +01:00
e8db00120f update video camera logic 2024-12-11 14:47:29 +01:00
c65f310e81 cleanup 2024-12-11 14:46:58 +01:00
ec3e83ef7f changed ip address and cmakelist 2024-12-11 14:35:42 +01:00
480d36393a update website so it shows image 2024-12-10 13:29:58 +01:00
fea0f19857 update ip address 2024-12-10 13:29:50 +01:00
e1135dac0f update cmakelist 2024-12-10 13:13:45 +01:00
2f4e5ae096 re enable robot communication 2024-12-09 10:31:01 +01:00
9e07a243ea receive images from mqtt server and display on endpoint 2024-12-03 12:06:12 +01:00
b93a5f2dca added mosquitto conf 2024-12-02 14:00:29 +01:00
911b870786 remove unused library 2024-12-02 13:59:27 +01:00
dd39bd3021 fixed mqtt and sockets and reverse proxy after 5 hours 2024-12-02 13:44:15 +01:00
8aa54805ac Grabbed existing program off GitHub and repaired it 2024-11-27 21:25:48 +01:00
d26d277c3c driver cleanup 2024-11-26 13:32:14 +01:00
508d2ed4e2 added base OpenCV script and documentation 2024-11-25 11:46:24 +01:00
3e202acc8d gitignore update 2024-11-25 11:46:07 +01:00
37 changed files with 2303 additions and 398 deletions

4
.gitignore vendored
View File

@@ -13,7 +13,7 @@ src/Socket/a.out
src/C++/Driver/cmake_install.cmake
src/C++/Socket/a.out
src/C++/Driver/Makefile
src/C++/Driver/vgcore*
vgcore*
src/C++/Driver/cmake_install.cmake
src/C++/Driver/Makefile
src/C++/Driver/log
@@ -31,3 +31,5 @@ CMakeFiles/
Makefile
CMakeCache.txt
cmake_install.cmake
src/C++/OpenCV/main
.vs

135
README.md
View File

@@ -1,93 +1,70 @@
# TI-project
## Getting started
To make it easy for you to get started with GitLab, here's a list of recommended next steps.
Already a pro? Just edit this README.md and make it your own. Want to make it easy? [Use the template at the bottom](#editing-this-readme)!
## Add your files
- [ ] [Create](https://docs.gitlab.com/ee/user/project/repository/web_editor.html#create-a-file) or [upload](https://docs.gitlab.com/ee/user/project/repository/web_editor.html#upload-a-file) files
- [ ] [Add files using the command line](https://docs.gitlab.com/ee/gitlab-basics/add-file.html#add-a-file-using-the-command-line) or push an existing Git repository with the following command:
```
cd existing_repo
git remote add origin https://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-project.git
git branch -M main
git push -uf origin main
```
## Integrate with your tools
- [ ] [Set up project integrations](https://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-project/-/settings/integrations)
## Collaborate with your team
- [ ] [Invite team members and collaborators](https://docs.gitlab.com/ee/user/project/members/)
- [ ] [Create a new merge request](https://docs.gitlab.com/ee/user/project/merge_requests/creating_merge_requests.html)
- [ ] [Automatically close issues from merge requests](https://docs.gitlab.com/ee/user/project/issues/managing_issues.html#closing-issues-automatically)
- [ ] [Enable merge request approvals](https://docs.gitlab.com/ee/user/project/merge_requests/approvals/)
- [ ] [Set auto-merge](https://docs.gitlab.com/ee/user/project/merge_requests/merge_when_pipeline_succeeds.html)
## Test and Deploy
Use the built-in continuous integration in GitLab.
- [ ] [Get started with GitLab CI/CD](https://docs.gitlab.com/ee/ci/quick_start/index.html)
- [ ] [Analyze your code for known vulnerabilities with Static Application Security Testing (SAST)](https://docs.gitlab.com/ee/user/application_security/sast/)
- [ ] [Deploy to Kubernetes, Amazon EC2, or Amazon ECS using Auto Deploy](https://docs.gitlab.com/ee/topics/autodevops/requirements.html)
- [ ] [Use pull-based deployments for improved Kubernetes management](https://docs.gitlab.com/ee/user/clusters/agent/)
- [ ] [Set up protected environments](https://docs.gitlab.com/ee/ci/environments/protected_environments.html)
***
# Editing this README
When you're ready to make this README your own, just edit this file and use the handy template below (or feel free to structure it however you want - this is just a starting point!). Thanks to [makeareadme.com](https://www.makeareadme.com/) for this template.
## Suggestions for a good README
Every project is different, so consider which of these sections apply to yours. The sections used in the template are suggestions for most open source projects. Also keep in mind that while a README can be too long and detailed, too long is better than too short. If you think your README is too long, consider utilizing another form of documentation rather than cutting out information.
## Name
Choose a self-explaining name for your project.
# TI-project - exploration robot Kobuki
## Description
Let people know what your project can do specifically. Provide context and add a link to any reference visitors might be unfamiliar with. A list of Features or a Background subsection can also be added here. If there are alternatives to your project, this is a good place to list differentiating factors.
This project is a Kobuki robot that drives around in dangerous areas and detects objects in its path using a camera. The purpose of the project is to explore dangerous areas without risking human lives. You can control the robot with the controller on the website.
## Badges
On some READMEs, you may see small images that convey metadata, such as whether or not all the tests are passing for the project. You can use Shields to add some to your README. Many services also have instructions for adding a badge.
## Visuals
Depending on what you are making, it can be a good idea to include screenshots or even a video (you'll frequently see GIFs rather than actual videos). Tools like ttygif can help, but check out Asciinema for a more sophisticated method.
## Photos
![Kobuki](/docs/assets/KobukiPhoto.jpg)
## Installation
Within a particular ecosystem, there may be a common way of installing things, such as using Yarn, NuGet, or Homebrew. However, consider the possibility that whoever is reading your README is a novice and would like more guidance. Listing specific steps helps remove ambiguity and gets people to using your project as quickly as possible. If it only runs in a specific context like a particular programming language version or operating system or has dependencies that have to be installed manually, also add a Requirements subsection.
## Usage
Use examples liberally, and show the expected output if you can. It's helpful to have inline the smallest example of usage that you can demonstrate, while providing links to more sophisticated examples if they are too long to reasonably include in the README.
### Requirements
## Support
Tell people where they can go to for help. It can be any combination of an issue tracker, a chat room, an email address, etc.
- Kobuki robot
- Raspberry Pi (minimum 3B)
- Camera
- power supply for Raspberry Pi
- laptop or computer
## Roadmap
If you have ideas for releases in the future, it is a good idea to list them in the README.
### Steps
## Contributing
State if you are open to contributions and what your requirements are for accepting them.
1. **Install Python and Pip**
- Ensure you have Python installed on your system. You can download it from [python.org](https://www.python.org/).
- Pip is the package installer for Python. It usually comes with Python, but you can install it separately if needed.
For people who want to make changes to your project, it's helpful to have some documentation on how to get started. Perhaps there is a script that they should run or some environment variables that they need to set. Make these steps explicit. These instructions could also be useful to your future self.
2. **Clone Our Repository**
- Clone our repository to your local machine by doing the following:
- Open your terminal
- Change the current working directory to the location where you want the cloned directory.
- Type `git clone https://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79.git`
3. **Install the required packages**
- Install the following packages on the server: "docker docker-buildx mosquitto nginx"
- Install the following packages on the Raspberry Pi: "g++ make cmake libopencv-dev libssl-dev", https://github.com/eclipse-paho/paho.mqtt.c, https://github.com/eclipse-paho/paho.mqtt.cpp
4. **Run the project**
#### Server side
- Run the following commands in the terminal to start the website:
- `cd src/Python/flask`
- `sudo docker buildx build -t flaskapp:latest .`
- `sudo docker run --network="host" --restart=always flaskapp:latest`
- Run the following commands in the terminal to start the MQTT broker:
- `cd src/config/server/`
- `mosquitto -c mosquitto.conf`
- Run the following commands in the terminal to start the Nginx server:
- `cd src/config/server/`
- `cp nginx.conf /etc/nginx/nginx.conf`
- `cp nginx-sites.conf /etc/nginx/sites-enabled/nginx-sites.conf`
#### Raspberry Pi side
- Run the following commands to build and start the driver:
- `cd src/C++/Driver`
- `cmake ..`
- `make`
- `./kobuki_driver`
- Run the following commands to autostart the driver on startup of the Raspberry Pi:
- `cd src/config/rpi/`
- `cp kobukiDriver.service /etc/systemd/system/kobukiDriver.service`
- `systemctl enable kobukiDriver.service`
- `systemctl start kobukiDriver.service`
## Extra notes
Don't forget to change the IP address in the `src/C++/Driver/src/main.cpp` file to the IP address of the server.
You can also document commands to lint the code or run tests. These steps help to ensure high code quality and reduce the likelihood that the changes inadvertently break something. Having instructions for running tests is especially helpful if it requires external setup, such as starting a Selenium server for testing in a browser.
## Authors and acknowledgment
Show your appreciation to those who have contributed to the project.
## License
For open source projects, say how it is licensed.
## Project status
If you have run out of energy or time for your project, put a note at the top of the README saying that development has slowed down or stopped completely. Someone may choose to fork your project or volunteer to step in as a maintainer or owner, allowing your project to keep going. You can also make an explicit request for maintainers.

View File

@@ -0,0 +1,51 @@
# Systemd Services
# What is a service
A service is a program or script that runs in the background and is managed by the system. Services are started at boot time and run until the system is shut down. Services can be started, stopped, and restarted by the system administrator.
# How to manage services on systemd
## Starting a service
To start a service, use the `systemctl start` command followed by the service name. For example, to start the `apache2` service, use the following command:
```bash
sudo systemctl start apache2
```
## Stopping a service
To stop a service, use the `systemctl stop` command followed by the service name. For example, to stop the `apache2` service, use the following command:
```bash
sudo systemctl stop apache2
```
## Restarting a service
To restart a service, use the `systemctl restart` command followed by the service name. For example, to restart the `apache2` service, use the following command:
```bash
sudo systemctl restart apache2
```
## Enabling a service
To enable a service to start at boot time, use the `systemctl enable` command followed by the service name. For example, to enable the `apache2` service, use the following command:
```bash
sudo systemctl enable apache2
```
## Creating a new service
To create a new service, you need to create a new service file in the `/etc/systemd/system/` directory. The service file should have a `.service` extension and contain the following sections:
### Example service file:
```bash
[Unit]
Description=FlaskApp #description of the service
After=network.target #start the service after the network is up
[Service]
User=ishak #start the service as a specific user
WorkingDirectory=/home/ishak/rooziinuubii79/src/Python/flask/web/ #working directory of the service
ExecStart=/usr/bin/gunicorn -w 3 -b 127.0.0.1:5000 app:app #command to start the service
```

BIN
docs/assets/KobukiPhoto.jpg Normal file

Binary file not shown.

After

Width:  |  Height:  |  Size: 491 KiB

50
docs/code/Mqtt.md Normal file
View File

@@ -0,0 +1,50 @@
# MQTT
## What is MQTT?
MQTT is a lightweight messaging protocol made for IoT devices. It allows efficient communication between IoT devices, servers, and applications by letting them publish and subscribe to messages.
## How to connect
To connect to an MQTT server you need to create an instance of the MqttClient class.
Example:
```cpp
// server address, client ID, client username, client password
MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi", "rpiwachtwoordofzo"); // create a client object
```
Later, in the setup function, you need to call `client.connect();` to connect to the MQTT server.
```cpp
client.connect();
```
Once the instance is created and connected, you can subscribe to topics or publish messages to topics.
## Subscribing and receiving messages
Example subscribing to a topic:
```cpp
void setup(){
client.subscribe("home/commands");
}
```
Example receiving latest message from a topic:
```cpp
std::string foo(){
std::string latestMqttMessage = "";
latestMqttMessage = client.getLastMessage();
return latestMqttMessage;
}
```
If you want to subscribe to multiple topics, you need to create multiple instances of the MqttClient class, as sketched below.
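A minimal sketch of what that could look like, assuming the MqttClient wrapper used above; the second client ID and the extra topic here are illustrative, not taken from the project:
```cpp
#include "MQTT/MqttClient.h"

// One MqttClient instance per topic; each instance keeps track of the last
// message for the single topic it is subscribed to.
MqttClient commandClient("ws://145.92.224.21/ws/", "KobukiRPI_cmd", "rpi", "rpiwachtwoordofzo");
MqttClient dataClient("ws://145.92.224.21/ws/", "KobukiRPI_data", "rpi", "rpiwachtwoordofzo");

void setupClients() {
    commandClient.connect();
    commandClient.subscribe("home/commands");

    dataClient.connect();
    dataClient.subscribe("kobuki/data");
}
```
Note that each instance needs its own client ID; if two clients share an ID, the broker will keep disconnecting one of them.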
## Publishing messages
Example publishing a message:
```cpp
void foo(std::string Message){
// topic, payload
client.publishMessage("kobuki/example", Message);
}
```

69
docs/code/OpenCV.md Normal file
View File

@@ -0,0 +1,69 @@
# OpenCV
## Requirements
We want the camera to detect what is happening in the video feed and identify objects, so it can spot dangers.
## Issues
* OpenCL not grabbing the GPU
* Solution: https://github.com/Smorodov/Multitarget-tracker/issues/93
## Installation
### Dependencies
* glew (for openGL)
* opencv C++ lib
How to install OpenCV
```bash
sudo apt-get install libopencv-dev
```
## Code explanation
### Opening the camera with OpenCV
```cpp
VideoCapture cap(0); //Open the default camera (0), points to /dev/video0. You could also change the number to the preferred camera
if (!cap.isOpened()) { // if the camera cannot be opened, print an error message
cerr << "Error: Could not open camera" << endl;
return;
}
```
### Taking a picture and storing it in a variable
```cpp
Mat frame; //create a new Matrix variable called frame
while (true) {
cap >> frame; // Capture a new image frame.
if (frame.empty()) { // if the frame is empty, print an error and try again
cerr << "Error: Could not capture image" << endl;
continue;
}
```
### Encoding the image for sending it over MQTT
```cpp
vector<uchar> buf; // create a dynamic buffer for the image
imencode(".jpg", frame, buf); //encode the image to the buffer
auto* enc_msg = reinterpret_cast<unsigned char*>(buf.data());
```
```cpp
void CapnSend() {
        // ...camera opening, frame capture, and JPEG encoding as shown in the sections above...
        // Publish the encoded image data (buf and enc_msg come from the encoding step)
        client.publishMessage("kobuki/cam", string(enc_msg, enc_msg + buf.size()));
        cout << "Sent image" << endl;
        std::this_thread::sleep_for(std::chrono::milliseconds(300)); // Send an image roughly every 300 ms
    } // end of the capture loop started in "Taking a picture"
}
```
## Sources
* https://github.com/UnaNancyOwen/OpenCVDNNSample/tree/master

View File

@@ -0,0 +1,25 @@
# Kobuki driver
## How do I communicate with the Kobuki
You can communicate with the Kobuki over USB serial or the big serial port on the front. We chose the USB port, paired with a Raspberry Pi.
The Kobuki sends a message every 20 ms at a baud rate of 115200. The message contains all the sensor data and always starts with the same two bytes, 0xAA and 0x55. A minimal read loop that syncs on this header is sketched below.
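This sketch assumes a hypothetical readByte() helper that blocks until the next byte arrives on the serial port; it is an illustration of the framing, not the driver's actual code:
```cpp
#include <cstdint>
#include <vector>

uint8_t readByte(); // hypothetical: blocks until one byte is read from the serial port

// Wait for the 0xAA 0x55 header, then read the length byte, that many payload
// bytes, and the trailing checksum byte.
std::vector<uint8_t> readKobukiPacket() {
    while (true) {
        if (readByte() != 0xAA) continue;
        if (readByte() != 0x55) continue;

        uint8_t length = readByte(); // number of payload bytes that follow
        std::vector<uint8_t> packet = {0xAA, 0x55, length};
        for (int i = 0; i < length; ++i) {
            packet.push_back(readByte());
        }
        packet.push_back(readByte()); // checksum byte
        return packet;
    }
}
```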
## Kobuki payloads
To communicate with the Kobuki we need to send payloads to it. These are structured the same way as the payloads that the Kobuki sends.
```cpp
unsigned char KobukiPayload[11] = {
0xaa, // Start byte 1
0x55, // Start byte 2
0x08, // Payload length (the first two bytes don't count)
0x01, // payload type (0x01 = control command)
0x04, // Control byte or additional identifier
actual_speed % 256, // Lower byte of speed value (max actual_speed 1024)
actual_speed >> 8, // Upper byte of speed value
0x00, // Placeholder for radius
0x00, // Placeholder for radius
0x00 // Placeholder for checksum (will be applied later)
};
```
You can also find the documentation about the payloads on the Kobuki website.
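For reference, a sketch of how the checksum placeholder could be filled in. According to the Kobuki protocol documentation the checksum is the XOR of every byte after the two 0xAA 0x55 header bytes; treat this as an illustration against the payload array above, not the driver's exact code:
```cpp
#include <cstddef>

// XOR all bytes between the header and the checksum slot, then store the
// result in the last element of the payload (the checksum placeholder).
void applyChecksum(unsigned char *payload, std::size_t totalLength) {
    unsigned char checksum = 0;
    for (std::size_t i = 2; i < totalLength - 1; ++i) { // skip 0xAA 0x55, stop before the checksum slot
        checksum ^= payload[i];
    }
    payload[totalLength - 1] = checksum;
}

// Usage with the payload above: applyChecksum(KobukiPayload, sizeof(KobukiPayload));
```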

View File

@@ -0,0 +1,78 @@
# Kobuki automatic reconnect
My task was to make the Kobuki reconnect automatically when the connection with the Pi is lost. Right now you have to connect the Pi to the Kobuki manually every time, which is not convenient if someone does not know how to do that.
The connection is made with ttyUSB0. That is the first port we can use for the Kobuki.
```cpp
robot.startCommunication("/dev/ttyUSB0", true, null_ptr);
```
I made a function that checks whether the Pi is still connected to the Kobuki. In the if statement it checks every 5 seconds whether the ttyUSB0 port is available.
```cpp
void checkKobukiConnection()
{
while (true)
{
std::lock_guard<std::mutex> lock(connectionMutex);
// Check whether the device is available
if (!std::ifstream("/dev/ttyUSB0")){
if (kobuki_connected){
cout << "Kobuki disconnected: USB device not found." << endl;
kobuki_connected = false;
}
std::this_thread::sleep_for(std::chrono::seconds(5));
continue; // Try again later
}
```
Here I check whether the Kobuki is connected; if not, it has to reconnect to ttyUSB0.
```cpp
// Check whether the Kobuki is connected
if (!robot.isConnected()){
if (kobuki_connected){
cout << "Kobuki disconnected." << endl;
kobuki_connected = false;
}
cout << "Attempting to reconnect Kobuki..." << endl;
robot.startCommunication("/dev/ttyUSB0", true, nullptr);
if (robot.isConnected()){
cout << "Kobuki reconnected successfully!" << endl;
kobuki_connected = true;
}
else{
cout << "Failed to reconnect Kobuki, retrying in 5 seconds..." << endl;
}
}
```
Now I have the problem that when I unplug the cable and plug it back into the Kobuki, the Pi no longer wants to connect. This happens because the system thinks the port "ttyUSB0" is still in use. When I plug the cable back in, the port "ttyUSB1" is used instead, because ttyUSB0 is not released.
```bash
ishak@raspberrypi:~ $ dmesg | tail -n 20
[10516.084132] usb 1-1.3: Product: iClebo Kobuki
[10516.084144] usb 1-1.3: Manufacturer: Yujin Robot
[10516.084155] usb 1-1.3: SerialNumber: kobuki_AI02MQMK
[10516.091210] ftdi_sio 1-1.3:1.0: FTDI USB Serial Device converter detected
[10516.091414] usb 1-1.3: Detected FT232R
[10516.099169] usb 1-1.3: FTDI USB Serial Device converter now attached to ttyUSB1
[10574.100491] usb 1-1.3: USB disconnect, device number 34
[10574.101596] ftdi_sio ttyUSB1: FTDI USB Serial Device converter now disconnected from ttyUSB1
[10574.101735] ftdi_sio 1-1.3:1.0: device disconnected
[10579.697816] usb 1-1.3: new full-speed USB device number 35 using dwc_otg
[10579.829776] usb 1-1.3: New USB device found, idVendor=0403, idProduct=6001, bcdDevice= 6.00
[10579.829821] usb 1-1.3: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[10579.829836] usb 1-1.3: Product: iClebo Kobuki
[10579.829848] usb 1-1.3: Manufacturer: Yujin Robot
[10579.829860] usb 1-1.3: SerialNumber: kobuki_AI02MQMK
[10579.840148] ftdi_sio 1-1.3:1.0: FTDI USB Serial Device converter detected
[10579.840351] usb 1-1.3: Detected FT232R
[10579.842208] usb 1-1.3: FTDI USB Serial Device converter now attached to ttyUSB1
[10612.745819] hwmon hwmon1: Voltage normalised
[10614.761829] hwmon hwmon1: Undervoltage detected!
```
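The fix for this (visible in the main.cpp diff further down) is to stop hard-coding /dev/ttyUSB0 and instead scan /dev for whichever ttyUSB port is currently present. A condensed version of that findKobukiPort() helper:
```cpp
#include <filesystem>
#include <string>

// Return the first /dev/ttyUSB* device found, or an empty string if none exists.
std::string findKobukiPort() {
    for (const auto &entry : std::filesystem::directory_iterator("/dev")) {
        std::string device = entry.path().string();
        if (device.find("ttyUSB") != std::string::npos) {
            return device; // first port found, e.g. /dev/ttyUSB1 after a replug
        }
    }
    return ""; // no port found
}
```
The reconnect loop then passes this result to startCommunication() instead of the fixed "/dev/ttyUSB0".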

View File

@@ -499,32 +499,6 @@ void CKobuki::doRotation(long double th) {
usleep(25 * 1000);
}
// combines navigation to a coordinate and rotation by an angle, performs
// movement to the selected coordinate in the robot's coordinate system
void CKobuki::goToXy(long double xx, long double yy) {
long double th;
yy = yy * -1;
th = atan2(yy, xx);
doRotation(th);
long double s = sqrt(pow(xx, 2) + pow(yy, 2));
// resetnem suradnicovu sustavu robota
x = 0;
y = 0;
iterationCount = 0;
theta = 0;
// std::cout << "mam prejst: " << s << "[m]" << std::endl;
goStraight(s);
usleep(25 * 1000);
return;
}
/// @brief Makes the robot move forward for 3 seconds
/// @param speedvalue How fast it will drive forward from 0 - 1024
void CKobuki::forward(int speedvalue) {
@@ -592,7 +566,7 @@ void CKobuki::robotSafety(std::string *pointerToMessage) {
parser.data.CliffCenter || parser.data.CliffRight) {
std::cout << "Safety condition triggered!" << std::endl; // Debug print
*pointerToMessage = "estop";
forward(-100); // reverse the robot
forward(-300); // reverse the robot
}
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(100)));
}
@@ -606,8 +580,10 @@ void CKobuki::robotSafety() {
parser.data.BumperRight || parser.data.CliffLeft ||
parser.data.CliffCenter || parser.data.CliffRight) {
std::cout << "Safety condition triggered!" << std::endl; // Debug print
forward(-100); // reverse the robot
forward(-300); // reverse the robot
}
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(100)));
}
}

View File

@@ -31,7 +31,6 @@
#include <chrono>
#include <sstream>
#include "KobukiParser.h"
#include "graph.h"
using namespace std;

View File

@@ -2,6 +2,8 @@
#include <iostream>
// should use the check value or look at the payload length to see which extra fields are present
int KobukiParser::parseKobukiMessage(TKobukiData &output, unsigned char *data) {
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(20))); //avoid busy waiting. The kobuki sends a message every 20ms
int rtrnvalue = checkChecksum(data);
if (rtrnvalue != 0) {
// std::cerr << "Invalid checksum" << std::endl;

View File

@@ -2,6 +2,8 @@
#define KOBUKIPARSER_H
#include <vector>
#include <thread>
struct TRawGyroData {
int x, y, z;

View File

@@ -1,71 +0,0 @@
#ifndef GRAPH1010
#define GRAPH1010
#include <stdio.h>
#include <stdlib.h>
#include <vector>
using namespace std;
#define GRAPH_ENABLED true
class plot {
public:
FILE *gp;
bool enabled,persist;
plot(bool _persist=false,bool _enabled=GRAPH_ENABLED) {
enabled=_enabled;
persist=_persist;
if (enabled) {
if(persist)
gp=popen("gnuplot -persist","w");
else
gp=popen("gnuplot","w");
}
}
void plot_data(vector<float> x,const char* style="points",const char* title="Data") {
if(!enabled)
return;
fprintf(gp,"set title '%s' \n",title);
fprintf(gp,"plot '-' w %s \n",style);
for(int k=0;k<x.size();k++) {
fprintf(gp,"%f\n",x[k]);
}
fprintf(gp,"e\n");
fflush(gp);
}
void plot_data(vector<float> x,vector<float> y,const char* style="points",const char* title="Data") {
if(!enabled)
return;
fprintf(gp,"set title '%s' \n",title);
fprintf(gp,"plot '-' w %s \n",style);
for(int k=0;k<x.size();k++) {
fprintf(gp,"%f %f \n",x[k],y[k]);
}
fprintf(gp,"e\n");
fflush(gp);
}
~plot() {
if(enabled)
pclose(gp);
}
};
/*
int main(int argc,char **argv) {
plot p;
for(int a=0;a<100;a++) {
vector<float> x,y;
for(int k=a;k<a+200;k++) {
x.push_back(k);
y.push_back(k*k);
}
p.plot_data(x,y);
}
return 0;
}
*/
#endif

View File

@@ -5,6 +5,7 @@ MqttClient::MqttClient(const std::string& address, const std::string& clientId,
//here all the @PARAMS are getting set for the connection
: client_(address, clientId), username_(username), password_(password), callback_(*this) {
client_.set_callback(callback_);
options.set_clean_session(true);
options.set_mqtt_version(MQTTVERSION_3_1_1); // For MQTT 3.1.1
if (!username_.empty() && !password_.empty()) {
@@ -36,7 +37,6 @@ void MqttClient::subscribe(const std::string& topic, int qos) {
void MqttClient::publishMessage(const std::string& topic, const std::string& payload) {
try {
std::cout << "Publishing message: " << payload << std::endl;
client_.publish(topic, payload)->wait();
} catch (const mqtt::exception& exc) {
std::cerr << "Error: " << exc.what() << std::endl;

View File

@@ -1,5 +1,7 @@
#include <iostream>
#include <thread>
#include <fstream>
#include <filesystem>
#include "MQTT/MqttClient.h"
#include "KobukiDriver/CKobuki.h"
#include <opencv4/opencv2/opencv.hpp>
@@ -8,50 +10,56 @@
using namespace std;
using namespace cv;
CKobuki robot;
std::atomic<bool> kobuki_connected(false);
std::string readMQTT();
void parseMQTT(std::string message);
void CapnSend();
MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi","rpiwachtwoordofzo"); // create a client object
void checkKobukiConnection();
// ip, clientID, username, password
MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi", "rpiwachtwoordofzo"); // create a client object
std::string message = "stop";
std::string serializeKobukiData(const TKobukiData &data);
void sendKobukiData(TKobukiData &data);
void setup() {
unsigned char *null_ptr(0);
std::cout << "Attempting to start communication with Kobuki..." << std::endl;
robot.startCommunication("/dev/ttyUSB0", true, null_ptr);
if (!robot.isConnected()) {
std::cerr << "Failed to start communication with Kobuki." << std::endl;
} else {
std::cout << "Successfully started communication with Kobuki." << std::endl;
}
// connect mqtt server and sub to commands
std::string findKobukiPort()
{
for (const auto& entry : std::filesystem::directory_iterator("/dev"))
{
std::string device = entry.path().string();
if (device.find("ttyUSB") != std::string::npos)
{
return device; // Return the first port found
}
}
return ""; // Geen poort gevonden
}
void setup()
{
std::string port = findKobukiPort();
unsigned char *null_ptr(0);
robot.startCommunication(const_cast<char*>(port.c_str()), true, null_ptr);
// connect mqtt server and sub to commands
client.connect();
client.subscribe("home/commands");
}
int main() {
int main()
{
setup();
std::thread image(CapnSend);
std::thread safety([&]() { robot.robotSafety(&message); });
std::thread sendMqtt([&]() { sendKobukiData(robot.parser.data); });
std::thread safety([&](){ robot.robotSafety(&message); });
std::thread sendMqtt([&](){ sendKobukiData(robot.parser.data); });
std::thread connectionChecker(checkKobukiConnection);
connectionChecker.detach(); // Let this thread run independently
while (true) {
if (!robot.isConnected()) {
std::cout << "Kobuki is not connected anymore. Reconnecting..." << std::endl;
robot.startCommunication("/dev/ttyUSB0", true, nullptr);
while (!robot.isConnected()) {
std::cout << "Attempting to reconnect..." << std::endl;
std::this_thread::sleep_for(std::chrono::seconds(1));
}
std::cout << "Reconnected to Kobuki." << std::endl;
}
while (true)
{
std::string message = readMQTT();
if (!message.empty()) {
if (!message.empty())
{
parseMQTT(message);
}
}
@@ -61,11 +69,54 @@ int main() {
image.join();
}
std::string readMQTT() {
std::mutex connectionMutex;
void checkKobukiConnection()
{
while (true)
{
std::lock_guard<std::mutex> lock(connectionMutex);
std::string port = findKobukiPort();
if (port.empty()) {
if (kobuki_connected) {
cout << "Kobuki disconnected: No USB device found." << endl;
kobuki_connected = false;
}
std::this_thread::sleep_for(std::chrono::seconds(5));
continue; // Try again later
}
// Check whether the Kobuki is connected
if (!robot.isConnected()){
if (kobuki_connected){
cout << "Kobuki disconnected." << endl;
kobuki_connected = false;
}
cout << "Attempting to reconnect Kobuki..." << endl;
robot.startCommunication(const_cast<char*>(port.c_str()), true, nullptr);
if (robot.isConnected()){
cout << "Kobuki reconnected successfully!" << endl;
kobuki_connected = true;
}
else{
cout << "Failed to reconnect Kobuki, retrying in 5 seconds..." << endl;
}
}
// Wait before checking again
std::this_thread::sleep_for(std::chrono::seconds(5));
}
}
std::string readMQTT()
{
static std::string lastMessage;
std::string message = client.getLastMessage();
if (!message.empty() && message != lastMessage) {
if (!message.empty() && message != lastMessage)
{
std::cout << "MQTT Message: " << message << std::endl;
lastMessage = message;
}
@@ -75,32 +126,49 @@ std::string readMQTT() {
return lastMessage;
}
void parseMQTT(std::string message) {
if (message == "up") {
void parseMQTT(std::string message)
{
if (message == "up")
{
robot.forward(350);
} else if (message == "left") {
}
else if (message == "left")
{
robot.setRotationSpeed(4);
} else if (message == "right") {
}
else if (message == "right")
{
robot.setRotationSpeed(-4);
} else if (message == "down") {
}
else if (message == "down")
{
robot.forward(-350);
} else if (message == "stop") {
}
else if (message == "stop")
{
robot.sendNullMessage();
robot.sendNullMessage();
} else if (message == "estop") {
}
else if (message == "estop")
{
robot.forward(-400);
} else {
}
else
{
std::cout << "Invalid command" << std::endl;
}
}
void logToFile() {
while (true) {
void logToFile()
{
while (true)
{
TKobukiData robotData = robot.parser.data;
std::ofstream outputFile("log",
std::ios_base::app); // Open file in append mode to
// not overwrite own content
if (outputFile.is_open()) { // check if the file was opened successfully
if (outputFile.is_open())
{ // check if the file was opened successfully
// Get current time
std::time_t now = std::time(nullptr);
outputFile << "Timestamp: " << std::ctime(&now);
@@ -158,7 +226,9 @@ void logToFile() {
outputFile << "UDID1: " << robotData.extraInfo.UDID1 << "\n";
outputFile << "UDID2: " << robotData.extraInfo.UDID2 << "\n";
outputFile.close();
} else {
}
else
{
std::cerr << "Error opening file\n";
}
@@ -166,8 +236,10 @@ void logToFile() {
}
}
void sendIndividualKobukiData(const TKobukiData &data) {
while (true) {
void sendIndividualKobukiData(const TKobukiData &data)
{
while (true)
{
std::cout << "Kobuki Data wordt gepubliceerd naar kobuki/data/timestamp: "
<< data.timestamp << std::endl;
client.publishMessage("kobuki/data/timestamp",
@@ -255,7 +327,8 @@ void sendIndividualKobukiData(const TKobukiData &data) {
client.publishMessage("kobuki/data/extraInfo/UDID2",
std::to_string(data.extraInfo.UDID2));
if (!data.gyroData.empty()) {
if (!data.gyroData.empty())
{
const auto &latestGyro = data.gyroData.back();
client.publishMessage("kobuki/data/gyroData/x",
std::to_string(latestGyro.x));
@@ -269,7 +342,8 @@ void sendIndividualKobukiData(const TKobukiData &data) {
}
}
std::string serializeKobukiData(const TKobukiData &data) {
std::string serializeKobukiData(const TKobukiData &data)
{
std::string json =
"{\"timestamp\":" + std::to_string(data.timestamp) +
",\"BumperCenter\":" + std::to_string(data.BumperCenter) +
@@ -322,7 +396,8 @@ std::string serializeKobukiData(const TKobukiData &data) {
",\"UDID1\":" + std::to_string(data.extraInfo.UDID1) +
",\"UDID2\":" + std::to_string(data.extraInfo.UDID2) + "},\"gyroData\":[";
if (!data.gyroData.empty()) {
if (!data.gyroData.empty())
{
const auto &latestGyro = data.gyroData.back();
json += "{\"x\":" + std::to_string(latestGyro.x) +
",\"y\":" + std::to_string(latestGyro.y) +
@@ -332,17 +407,13 @@ std::string serializeKobukiData(const TKobukiData &data) {
json += "]}";
return json;
}
// create extra function to send the message every 100ms
// needed it so it can be threaded
void sendKobukiData(TKobukiData &data) {
while (true) {
// if(!robot.isConnected()){
// std::cout << "Kobuki is not connected anymore" << std::endl;
// robot.startCommunication("/dev/ttyUSB0", true, nullptr);
// while(!robot.isConnected()){
// std::this_thread::sleep_for(std::chrono::seconds(1));
// }
// }
void sendKobukiData(TKobukiData &data)
{
while (true)
{
client.publishMessage("kobuki/data", serializeKobukiData(data));
std::cout << "Sent data" << std::endl;
std::this_thread::sleep_for(std::chrono::milliseconds(1000));
@@ -350,30 +421,47 @@ void sendKobukiData(TKobukiData &data) {
}
void CapnSend() {
VideoCapture cap(0);
if (!cap.isOpened()) {
cerr << "Error: Could not open camera" << endl;
return;
}
Mat frame;
while (true) {
cap >> frame; // Capture a new image frame
if (frame.empty()) {
cerr << "Error: Could not capture image" << endl;
continue;
int COMPRESSION_LEVEL = 90;
VideoCapture cap(0); // Open the camera
if (!cap.isOpened()) {
cerr << "Error: Could not open camera" << endl;
return;
}
// Convert the image to a byte array
vector<uchar> buf;
imencode(".jpg", frame, buf);
auto *enc_msg = reinterpret_cast<unsigned char *>(buf.data());
Mat frame;
while (true) {
if (!cap.read(frame)) {
cout << "Reconnecting camera" << endl;
cap.release();
std::this_thread::sleep_for(std::chrono::seconds(1));
// Attempt to reconnect to the camera
cap.open(0);
if (!cap.isOpened()) {
cerr << "Error: Could not reconnect to camera" << endl;
std::this_thread::sleep_for(std::chrono::seconds(1)); // Wait before retrying
continue;
} else {
cout << "Reconnected to camera" << endl;
continue;
}
}
// Publish the image data
client.publishMessage("kobuki/cam", string(enc_msg, enc_msg + buf.size()));
cout << "Sent image" << endl;
std::this_thread::sleep_for(
std::chrono::milliseconds(300)); // Send image every 1000ms
}
vector<uchar> imgbuf;
vector<int> compression_params;
compression_params.push_back(IMWRITE_JPEG_QUALITY); // Set JPEG quality
compression_params.push_back(COMPRESSION_LEVEL); // Adjust the quality level (0-100, lower = more compression)
// Encode the image into the byte buffer with the specified compression parameters
imencode(".jpg", frame, imgbuf, compression_params);
// Convert the vector<uchar> buffer to a string (no casting)
string enc_msg(imgbuf.begin(), imgbuf.end());
// Publish the compressed image data (MQTT, in this case)
client.publishMessage("kobuki/cam", enc_msg);
cout << "Sent compressed image" << endl;
std::this_thread::sleep_for(std::chrono::milliseconds(200)); // Send image every 200ms
}
}

View File

@@ -1,15 +0,0 @@
cmake_minimum_required(VERSION 3.10)
set(CMAKE_CXX_STANDARD 23)
# Find the Paho MQTT C++ library
find_library(PAHO_MQTTPP_LIBRARY paho-mqttpp3 PATHS /usr/local/lib)
find_library(PAHO_MQTT_LIBRARY paho-mqtt3a PATHS /usr/local/lib)
# Include the headers
include_directories(/usr/local/include)
# Add the executable
add_executable(my_program main.cpp)
# Link the libraries
target_link_libraries(my_program ${PAHO_MQTTPP_LIBRARY} ${PAHO_MQTT_LIBRARY})

View File

@@ -1,64 +0,0 @@
#include <iostream>
#include <mqtt/async_client.h>
#include <thread> // For std::this_thread::sleep_for
#include <chrono> // For std::chrono::seconds
// Define the address of the MQTT broker, the client ID, and the topic to subscribe to.
const std::string ADDRESS("mqtt://localhost:1883"); // Broker address (Raspberry Pi)
const std::string CLIENT_ID("raspberry_pi_client");
const std::string TOPIC("home/commands");
// Define a callback class that handles incoming messages and connection events.
class callback : public virtual mqtt::callback {
// Called when a message arrives on a subscribed topic.
void message_arrived(mqtt::const_message_ptr msg) override {
std::cout << "Received message: '" << msg->get_topic()<< "' : " << msg->to_string() << std::endl;
}
// Called when the connection to the broker is lost.
void connection_lost(const std::string& cause) override {
std::cerr << "Connection lost. Reason: " << cause << std::endl;
}
// Called when a message delivery is complete.
void delivery_complete(mqtt::delivery_token_ptr token) override {
std::cout << "Message delivered!" << std::endl;
}
};
int main() {
// Create an MQTT async client and set up the callback class.
mqtt::async_client client(ADDRESS, CLIENT_ID);
callback cb;
client.set_callback(cb);
// Set up the connection options (such as username and password).
mqtt::connect_options connOpts;
connOpts.set_clean_session(true);
connOpts.set_user_name("ishak");
connOpts.set_password("kobuki");
connOpts.set_mqtt_version(MQTTVERSION_3_1_1);
try {
// Try to connect to the broker and wait until successful.
std::cout << "Connecting to broker..." << std::endl;
client.connect(connOpts)->wait(); // Connect with the provided options
std::cout << "Connected!" << std::endl;
// Subscribe to the specified topic and wait for confirmation.
std::cout << "Subscribing to topic: " << TOPIC << std::endl;
client.subscribe(TOPIC, 1)->wait(); // Subscribe with QoS level 1
// Keep the program running to continue receiving messages from the broker.
while (true) {
std::this_thread::sleep_for(std::chrono::seconds(1)); // Sleep to reduce CPU usage
}
} catch (const mqtt::exception &exc) {
// Catch any MQTT exceptions and display the error message.
std::cerr << "Error: " << exc.what() << std::endl;
return 1;
}
return 0; // Return 0 to indicate successful execution
}

View File

@@ -0,0 +1,44 @@
cmake_minimum_required( VERSION 3.6 )
# Require C++11 (or later)
set( CMAKE_CXX_STANDARD 23 )
set( CMAKE_CXX_STANDARD_REQUIRED ON )
set( CMAKE_CXX_EXTENSIONS OFF )
set(BUILD_MODE Debug)
# Create Project
project( Sample )
add_executable( YOLOv4 util.h main.cpp )
# Set StartUp Project
set_property( DIRECTORY PROPERTY VS_STARTUP_PROJECT "YOLOv4" )
# Find Package
# OpenCV
find_package( OpenCV REQUIRED )
if( OpenCV_FOUND )
# Additional Include Directories
include_directories( ${OpenCV_INCLUDE_DIRS} )
# Additional Dependencies
target_link_libraries( YOLOv4 ${OpenCV_LIBS} )
endif()
# Download Model
set( MODEL https://github.com/AlexeyAB/darknet/releases/download/darknet_yolo_v3_optimal/yolov4.weights )
file( DOWNLOAD
"${MODEL}"
"${CMAKE_CURRENT_LIST_DIR}/yolov4.weights"
EXPECTED_HASH SHA256=e8a4f6c62188738d86dc6898d82724ec0964d0eb9d2ae0f0a9d53d65d108d562
SHOW_PROGRESS
)
# Download Config
set( CONFIG https://raw.githubusercontent.com/AlexeyAB/darknet/master/cfg/yolov4.cfg )
file( DOWNLOAD
"${CONFIG}"
"${CMAKE_CURRENT_LIST_DIR}/yolov4.cfg"
EXPECTED_HASH SHA256=a6d0f8e5c62cc8378384f75a8159b95fa2964d4162e33351b00ac82e0fc46a34
SHOW_PROGRESS
)

BIN
src/C++/OpenCV/YOLOv4 Executable file

Binary file not shown.

80
src/C++/OpenCV/coco.names Normal file
View File

@@ -0,0 +1,80 @@
person
bicycle
car
motorbike
aeroplane
bus
train
truck
boat
traffic light
fire hydrant
stop sign
parking meter
bench
bird
cat
dog
horse
sheep
cow
elephant
bear
zebra
giraffe
backpack
umbrella
handbag
tie
suitcase
frisbee
skis
snowboard
sports ball
kite
baseball bat
baseball glove
skateboard
surfboard
tennis racket
bottle
wine glass
cup
fork
knife
spoon
bowl
banana
apple
sandwich
orange
broccoli
carrot
hot dog
pizza
donut
cake
chair
sofa
pottedplant
bed
diningtable
toilet
tvmonitor
laptop
mouse
remote
keyboard
cell phone
microwave
oven
toaster
sink
refrigerator
book
clock
vase
scissors
teddy bear
hair drier
toothbrush

209
src/C++/OpenCV/main.cpp Normal file
View File

@@ -0,0 +1,209 @@
#include <iostream>
#include <string>
#include <vector>
#include <opencv2/opencv.hpp>
#include <opencv2/dnn.hpp>
#include <filesystem>
#include <fstream>
#include "util.h"
// Helper function to check if a file exists
bool fileExists(const std::string &path)
{
return std::filesystem::exists(path);
}
// Function to read class names from a file
std::vector<std::string> _readClassNameList(const std::string &path)
{
std::vector<std::string> classes;
// Check if file exists
if (!fileExists(path))
{
throw std::runtime_error("Class names file not found: " + path);
}
// Try to open and read file
std::ifstream file(path);
if (!file.is_open())
{
throw std::runtime_error("Unable to open class names file: " + path);
}
std::string line;
while (std::getline(file, line))
{
if (!line.empty())
{
classes.push_back(line);
}
}
if (classes.empty())
{
throw std::runtime_error("No classes found in file: " + path);
}
return classes;
}
int main(int argc, char *argv[])
{
try
{
// Open Video Capture
cv::VideoCapture capture = cv::VideoCapture(0);
if (!capture.isOpened())
{
std::cerr << "Failed to open camera device" << std::endl;
return -1;
}
// Read Class Name List and Color Table
const std::string list = "coco.names";
const std::vector<std::string> classes = _readClassNameList(list);
const std::vector<cv::Scalar> colors = getClassColors(classes.size());
// Debug: Print the size of the colors vector
std::cout << "Number of colors: " << colors.size() << std::endl;
// Read Darknet
const std::string model = "yolov4.weights";
const std::string config = "yolov4.cfg";
cv::dnn::Net net = cv::dnn::readNet(model, config);
if (net.empty())
{
std::cerr << "Failed to load network" << std::endl;
return -1;
}
// Set Preferable Backend
net.setPreferableBackend(cv::dnn::DNN_BACKEND_OPENCV);
// Set Preferable Target
net.setPreferableTarget(cv::dnn::DNN_TARGET_OPENCL);
while (true)
{
// Read Frame
cv::Mat frame;
capture >> frame;
if (frame.empty())
{
cv::waitKey(0);
break;
}
if (frame.channels() == 4)
{
cv::cvtColor(frame, frame, cv::COLOR_BGRA2BGR);
}
// Create Blob from Input Image
cv::Mat blob = cv::dnn::blobFromImage(frame, 1 / 255.f, cv::Size(416, 416), cv::Scalar(), true, false);
// Set Input Blob
net.setInput(blob);
// Run Forward Network
std::vector<cv::Mat> detections;
net.forward(detections, getOutputsNames(net));
// Draw Region
std::vector<int32_t> class_ids;
std::vector<float> confidences;
std::vector<cv::Rect> rectangles;
for (cv::Mat &detection : detections)
{
if (detection.empty())
{
std::cerr << "Detection matrix is empty!" << std::endl;
continue;
}
for (int32_t i = 0; i < detection.rows; i++)
{
cv::Mat region = detection.row(i);
// Retrieve Max Confidence and Class Index
cv::Mat scores = region.colRange(5, detection.cols);
cv::Point class_id;
double confidence;
cv::minMaxLoc(scores, 0, &confidence, 0, &class_id);
// Check Confidence
constexpr float threshold = 0.2;
if (threshold > confidence)
{
continue;
}
// Retrieve Object Position
const int32_t x_center = static_cast<int32_t>(region.at<float>(0) * frame.cols);
const int32_t y_center = static_cast<int32_t>(region.at<float>(1) * frame.rows);
const int32_t width = static_cast<int32_t>(region.at<float>(2) * frame.cols);
const int32_t height = static_cast<int32_t>(region.at<float>(3) * frame.rows);
const cv::Rect rectangle = cv::Rect(x_center - (width / 2), y_center - (height / 2), width, height);
// Add Class ID, Confidence, Rectangle
class_ids.push_back(class_id.x);
confidences.push_back(confidence);
rectangles.push_back(rectangle);
}
}
// Remove Overlap Rectangles using Non-Maximum Suppression
constexpr float confidence_threshold = 0.5; // Confidence
constexpr float nms_threshold = 0.5; // IoU (Intersection over Union)
std::vector<int32_t> indices;
cv::dnn::NMSBoxes(rectangles, confidences, confidence_threshold, nms_threshold, indices);
// Draw Rectangle
for (const int32_t &index : indices)
{
// Bounds checking
if (class_ids[index] >= colors.size())
{
std::cerr << "Color index out of bounds: " << class_ids[index] << " (max: " << colors.size() - 1 << ")" << std::endl;
continue;
}
const cv::Rect rectangle = rectangles[index];
const cv::Scalar color = colors[class_ids[index]];
// Debug: Print the index and color
std::cout << "Drawing rectangle with color index: " << class_ids[index] << std::endl;
constexpr int32_t thickness = 3;
cv::rectangle(frame, rectangle, color, thickness);
std::string label = classes[class_ids[index]] + ": " + std::to_string(static_cast<int>(confidences[index] * 100)) + "%";
int baseLine;
cv::Size labelSize = cv::getTextSize(label, cv::FONT_HERSHEY_SIMPLEX, 0.5, 1, &baseLine);
int top = std::max(rectangle.y, labelSize.height);
cv::rectangle(frame, cv::Point(rectangle.x, top - labelSize.height),
cv::Point(rectangle.x + labelSize.width, top + baseLine), color, cv::FILLED);
cv::putText(frame, label, cv::Point(rectangle.x, top), cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(255, 255, 255), 1);
}
// Show Image
cv::imshow("Object Detection", frame);
const int32_t key = cv::waitKey(1);
if (key == 'q')
{
break;
}
}
cv::destroyAllWindows();
return 0;
}
catch (const std::exception &e)
{
std::cerr << "Error: " << e.what() << std::endl;
return -1;
}
}
// cloned and fixed from https://github.com/UnaNancyOwen/OpenCVDNNSample/tree/master

61
src/C++/OpenCV/util.h Normal file
View File

@@ -0,0 +1,61 @@
#ifndef __UTIL__
#define __UTIL__
#include <vector>
#include <string>
#include <fstream>
#include <opencv2/dnn.hpp>
#include <opencv2/core.hpp>
#include <opencv2/highgui.hpp>
// Get Output Layers Name
std::vector<std::string> getOutputsNames( const cv::dnn::Net& net )
{
static std::vector<std::string> names;
if( names.empty() ){
std::vector<int32_t> out_layers = net.getUnconnectedOutLayers();
std::vector<std::string> layers_names = net.getLayerNames();
names.resize( out_layers.size() );
for( size_t i = 0; i < out_layers.size(); ++i ){
names[i] = layers_names[out_layers[i] - 1];
}
}
return names;
}
// Get Output Layer Type
std::string getOutputLayerType( cv::dnn::Net& net )
{
const std::vector<int32_t> out_layers = net.getUnconnectedOutLayers();
const std::string output_layer_type = net.getLayer( out_layers[0] )->type;
return output_layer_type;
}
// Read Class Name List
std::vector<std::string> readClassNameList( const std::string list_path )
{
std::vector<std::string> classes;
std::ifstream ifs( list_path );
if( !ifs.is_open() ){
return classes;
}
std::string class_name = "";
while( std::getline( ifs, class_name ) ){
classes.push_back( class_name );
}
return classes;
}
// Get Class Color Table for Visualize
std::vector<cv::Scalar> getClassColors( const int32_t number_of_colors )
{
cv::RNG random;
std::vector<cv::Scalar> colors;
for( int32_t i = 0; i < number_of_colors; i++ ){
cv::Scalar color( random.uniform( 0, 255 ), random.uniform( 0, 255 ), random.uniform( 0, 255 ) );
colors.push_back( color );
}
return colors;
}
#endif // __UTIL__

1158
src/C++/OpenCV/yolov4.cfg Normal file

File diff suppressed because it is too large Load Diff

Binary file not shown.

41
src/Python/YOLO/app.py Normal file
View File

@@ -0,0 +1,41 @@
from ultralytics import YOLO
import cv2
import numpy as np
import requests
import time
model = YOLO("yolo11n.pt")
#try to fetch the image from the given url
def fetch_image(url):
try:
response = requests.get(url)
response.raise_for_status()
image_array = np.frombuffer(response.content, np.uint8)
image = cv2.imdecode(image_array, cv2.IMREAD_COLOR)
return image
except requests.RequestException as e:
print(f"Error: Could not fetch image - {e}")
return None
# URL of the photostream
url = "http://145.92.224.21/image"
while True:
frame = fetch_image(url)
if frame is None:
print("Error: Could not fetch image, retrying...")
time.sleep(1) # Wait for 1 second before retrying
continue
# Predict on the frame
results = model(frame)
# Display the results
results[0].show()
# Exit if 'q' is pressed
if cv2.waitKey(1) & 0xFF == ord('q'):
break
cv2.destroyAllWindows()

View File

@@ -0,0 +1 @@
__pycache__

View File

@@ -0,0 +1,18 @@
FROM python:3.9
WORKDIR /app
COPY . .
RUN apt-get update && apt-get install -y libgl1
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "web/app.py"]
#build instruction: sudo docker buildx build -t flaskapp:latest .
#run instruction: sudo docker run --network="host" --restart=always flaskapp:latest
# need to use network host to connect to the host's mqtt server

View File

@@ -0,0 +1,6 @@
Flask==3.1.0
paho-mqtt==1.6.1
ultralytics==8.3.58
opencv-python-headless==4.6.0.66
numpy==1.23.4
mysql-connector-python==9.1.0

View File

@@ -1,24 +1,53 @@
from flask import Flask, request, render_template, jsonify, g
from flask import Flask, Response, request, render_template, jsonify, g
import paho.mqtt.client as mqtt
from ultralytics import YOLO
import cv2
import numpy as np
import threading
import mysql.connector
import json
app = Flask(__name__)
# Global variables
# Load a model
model = YOLO("yolo11n.pt") # pretrained YOLO11n model
kobuki_message = ""
latest_image = None
processed_image = None
yolo_results = []
# Global MQTT setup
def on_message(client,userdata, message):
global kobuki_message, latest_image
# Lock for thread-safe access to shared variables
lock = threading.Lock()
# List of class names (example for COCO dataset)
yolo_classes = list(model.names.values())
def on_message(client, userdata, message):
global kobuki_message, latest_image, processed_image, yolo_results
if message.topic == "kobuki/data":
kobuki_message = str(message.payload.decode("utf-8"))
with app.app_context():
sensor_data(kobuki_message) # Store the data in the database
elif message.topic == "kobuki/cam":
latest_image = message.payload
with lock: # Lock the shared variables so another thread never reads a half-processed image
latest_image = np.frombuffer(message.payload, np.uint8)
latest_image = cv2.imdecode(latest_image, cv2.IMREAD_COLOR)
# Process the image with YOLO
results = model(latest_image)
yolo_results = []
processed_image = latest_image.copy() # Create a copy for processing
for result in results:
for box in result.boxes:
class_id = int(box.cls.item())
class_name = yolo_classes[class_id]
yolo_results.append({
"class": class_name,
"confidence": box.conf.item(),
"bbox": box.xyxy.tolist()
})
# Draw bounding box on the processed image
x1, y1, x2, y2 = map(int, box.xyxy[0])
cv2.rectangle(processed_image, (x1, y1), (x2, y2), (0, 255, 0), 2)
cv2.putText(processed_image, f"{class_name} {box.conf.item():.2f}", (x1, y1 - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
# Create an MQTT client instance
mqtt_client = mqtt.Client()
@@ -26,10 +55,10 @@ mqtt_client.username_pw_set("server", "serverwachtwoordofzo")
mqtt_client.connect("localhost", 1884, 60)
mqtt_client.loop_start()
mqtt_client.subscribe("kobuki/data")
mqtt_client.subscribe("kobuki/cam")
mqtt_client.on_message = on_message # this assignment must come after the function definition, otherwise the name on_message does not exist yet
# Database connection function
def get_db():
if 'db' not in g: # 'g' is request-specific and only lives for the duration of a request
@@ -56,15 +85,11 @@ def index():
@app.route('/control', methods=["GET", "POST"])
def control():
if request.authorization and request.authorization.username == 'ishak' and request.authorization.password == 'kobuki':
yolo_results_db()
return render_template('control.html')
else:
return ('Unauthorized', 401, {'WWW-Authenticate': 'Basic realm="Login Required"'})
@app.route('/data', methods=['GET'])
def data():
return kobuki_message
@app.route('/move', methods=['POST'])
def move():
data = request.get_json()
@@ -83,17 +108,10 @@ def move():
cursor.close()
db_connection.close()
return jsonify({"status": "success", "direction": direction})
@app.route("/database")
def database():
db = get_db()
cursor = db.cursor()
cursor.execute("SELECT * FROM kobuki_data")
rows = cursor.fetchall()
cursor.close()
return str(rows)
def sensor_data(kobuki_message):
@app.route('/data', methods=['GET'])
def data():
try:
# Parse the JSON string into a Python dictionary
data = json.loads(kobuki_message)
@@ -113,19 +131,55 @@ def sensor_data(kobuki_message):
# Database-insert
db = get_db()
cursor = db.cursor()
with db.cursor() as cursor:
# Make sure the `kobuki_data` table has the columns `name` and `value`
sql_sensor = "INSERT INTO kobuki_data (name, value) VALUES (%s, %s)"
cursor.executemany(sql_sensor, sensor_data_tuples)
# Commit and close the cursor
db.commit()
cursor.close()
# Make sure the `kobuki_data` table has the columns `name` and `value`
sql_sensor = "INSERT INTO kobuki_data (name, value) VALUES (%s, %s)"
cursor.executemany(sql_sensor, sensor_data_tuples)
# Commit and close the cursor
db.commit()
cursor.close()
except json.JSONDecodeError as e:
print(f"JSON decode error: {e}")
except mysql.connector.Error as err:
print(f"Database error: {err}")
return kobuki_message
@app.route('/image')
def image():
global processed_image
with lock: # Lock the shared variables so a half-processed image is never served to a request
if processed_image is not None:
_, buffer = cv2.imencode('.jpg', processed_image)
return Response(buffer.tobytes(), mimetype='image/jpeg')
else:
return "No image available", 404
@app.route('/yolo_results', methods=['GET'])
def yolo_results_endpoint():
global yolo_results
return jsonify(yolo_results)
def yolo_results_db():
global yolo_results
with lock:
try:
db = get_db()
with db.cursor() as cursor:
sql_yolo = "INSERT INTO image (class, confidence) VALUES (%s, %s)"
yolo_tuples = [(result["class"], result["confidence"]) for result in yolo_results]
print(f"YOLO Tuples: {yolo_tuples}") # Debug statement
cursor.executemany(sql_yolo, yolo_tuples)
db.commit()
cursor.close()
except mysql.connector.Error as err:
print(f"Database error: {err}")
except Exception as e:
print(f"Unexpected error: {e}")
if __name__ == '__main__':
app.run(debug=True, port=5000)
app.run(debug=True, port=5000)
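For manually poking at the routes defined above, a small client script can be handy. This is a sketch only: the host and the basic-auth credentials are the ones that appear in this diff, while the "forward" direction value is an assumption (the real values come from the control page's buttons, which are not shown here).

import requests

BASE = "http://145.92.224.21"   # host used elsewhere in this repo; adjust as needed

# Latest YOLO-annotated frame, served as JPEG by /image
r = requests.get(f"{BASE}/image")
if r.ok:
    with open("frame.jpg", "wb") as f:
        f.write(r.content)

# Current detections as JSON from /yolo_results
print(requests.get(f"{BASE}/yolo_results").json())

# Send a drive command like the front-end JavaScript does.
# "forward" is an assumed button value; the real ones come from control.html.
print(requests.post(f"{BASE}/move", json={"direction": "forward"}).json())

# The control page sits behind HTTP basic auth
print(requests.get(f"{BASE}/control", auth=("ishak", "kobuki")).status_code)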


@@ -1,41 +1,63 @@
document.querySelectorAll(".btn").forEach(button => {
button.addEventListener("click", async function(event) { // Make the function async
event.preventDefault(); // prevents page refresh
document.addEventListener("DOMContentLoaded", function() {
document.querySelectorAll(".btn").forEach(button => {
button.addEventListener("click", function(event) {
event.preventDefault(); // prevents page refresh
// Get the value of the button
const direction = event.target.value;
// Get the value of the button
const direction = event.target.value;
try {
const response = await fetch("/move", {
fetch("/move", {
method: "POST",
headers: {
"Content-Type": "application/json"
},
body: JSON.stringify({ direction: direction })
})
.then(response => response.json())
.then(data => {
console.log("Success:", data);
})
.catch(error => {
console.error("Error:", error);
});
const data = await response.json();
console.log("Success:", data);
} catch (error) {
console.error("Error:", error);
}
});
});
// Fetch data from the server
async function fetchData() {
// Fetch data from the server
async function fetchData() {
try {
const response = await fetch("/data");
const data = await response.json();
return data;
} catch (error) {
console.error("Error:", error);
}
}
// Parse the data and show it on the website
// Parse the data and show it on the website
async function parseData() {
const data = await fetchData();
const sensorDataContainer = document.getElementById("sensor-data");
sensorDataContainer.innerHTML = ""; // Clear previous data
//for each object in json array create a new paragraph element and append it to the sensorDataContainer
// For each object in JSON array, create a new paragraph element and append it to the sensorDataContainer
for (const [key, value] of Object.entries(data)) {
const dataElement = document.createElement("p");
dataElement.textContent = `${key}: ${value}`;
sensorDataContainer.appendChild(dataElement); // Add the element to the container
sensorDataContainer.appendChild(dataElement); // Add the element to the container
}
});
});
}
// Update the image
async function updateImage() {
let img = document.getElementById("robot-image");
img.src = "/image?" + new Date().getTime(); // Add timestamp to avoid caching
// Wait for 200 milliseconds before fetching the next image
setTimeout(updateImage, 200);
}
// Fetch and display sensor data every 1 second
setInterval(parseData, 1000);
// Start updating the image
updateImage();
});


@@ -167,4 +167,4 @@ th,td {
th {
background-color: #f2f2f2;
text-align: left;
}
}

Binary file not shown.


@@ -1,7 +0,0 @@
import sys
import logging
logging.basicConfig(stream=sys.stderr)
sys.path.insert(0, "/home/ishak/rooziinuubii79/src/Python/flask/web")
from app import app as application


@@ -0,0 +1,13 @@
[Unit]
Description=kobukiDriver
After=network.target
[Service]
User=user1
WorkingDirectory=/home/user1/rooziinuubii79/src/C++/Driver/
ExecStart=/home/user1/rooziinuubii79/src/C++/Driver/kobuki_control
Restart=always
RestartSec=5
[Install]
WantedBy=multi-user.target


@@ -0,0 +1,7 @@
allow_anonymous false
password_file /etc/mosquitto/passwordfile
listener 8080
protocol websockets
listener 1884
protocol mqtt
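To check that this broker configuration accepts authenticated clients on the plain MQTT listener, a short paho-mqtt sketch could look like the one below; the credentials are the ones the Flask app uses, and the payload is purely illustrative.

import json
import paho.mqtt.client as mqtt

# Sketch: publish one test message to the authenticated listener on port 1884.
# Username/password match the Flask app; the payload is made up.
client = mqtt.Client()
client.username_pw_set("server", "serverwachtwoordofzo")
client.connect("localhost", 1884, 60)
client.publish("kobuki/data", json.dumps({"battery": 160}))  # topic used by the driver and web app
client.disconnect()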


@@ -0,0 +1,22 @@
server {
    listen 80;
    server_name 145.92.224.21;

    # Proxy WebSocket connections for MQTT
    location /ws/ {
        proxy_pass http://localhost:9001;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }

    # Proxy HTTP connections for Flask
    location / {
        proxy_pass http://localhost:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}


@@ -0,0 +1,7 @@
stream {
    server {
        listen 9001;
        proxy_pass localhost:8080;
    }
}
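These two nginx snippets chain together: a client connects to port 80 under /ws/, the HTTP server block proxies the WebSocket upgrade to the local stream listener on 9001, and the stream block forwards it to mosquitto's WebSocket listener on 8080. The paho-mqtt sketch below is one way to smoke-test that path; the host and credentials come from this repository, everything else is an assumption.

import paho.mqtt.client as mqtt

# Sketch: connect over WebSockets through the nginx chain
# (port 80 /ws/  ->  stream listener 9001  ->  mosquitto websockets listener 8080).
def on_connect(client, userdata, flags, rc):
    print("connected, rc =", rc)
    client.subscribe("kobuki/data")

def on_message(client, userdata, msg):
    print(msg.topic, len(msg.payload), "bytes")

client = mqtt.Client(transport="websockets")
client.ws_set_options(path="/ws/")          # matches the location /ws/ block above
client.username_pw_set("server", "serverwachtwoordofzo")
client.on_connect = on_connect
client.on_message = on_message
client.connect("145.92.224.21", 80, 60)
client.loop_forever()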