191 Commits

Author SHA1 Message Date
ff89b8a742 revert and change port number 2025-01-14 15:07:44 +01:00
eaee4ebd60 change port 2025-01-14 15:06:54 +01:00
a4ba54161a change streaming protocol to udp 2025-01-14 15:04:06 +01:00
2448691e63 rtsp stream opencv using gstreamen 2025-01-14 14:55:59 +01:00
ff7b148556 start readme 2025-01-08 14:20:38 +01:00
6fe28f997a edited kobuki speedvalue for safety 2025-01-07 12:51:24 +01:00
1bf9ebddab added mutex in python 2025-01-07 11:30:04 +01:00
585a0e9a52 fix thread crash 2025-01-07 11:19:47 +01:00
cb988a5260 store new image in processed_image 2025-01-07 11:09:33 +01:00
5e01e25d9c comment update 2025-01-06 16:40:40 +01:00
0d184261fd updated systemd kobukidriver service file 2025-01-06 16:40:00 +01:00
ccaa722973 edited startup file for kobukidriver
(everything works now)
2025-01-06 16:31:52 +01:00
7d1b878c30 fix yolo image boxes 2025-01-06 16:11:03 +01:00
228c508012 attempt to fix broken code 2025-01-06 16:02:17 +01:00
7845feb9f8 update yolo naming in image 2025-01-06 15:55:24 +01:00
20d6d8799d attempt to show name next to image box 2025-01-06 15:45:29 +01:00
9c7c774030 change boxing and text error of YOLO 2025-01-06 15:26:00 +01:00
0832da0d3b change mqtt port in python 2025-01-06 15:16:13 +01:00
a59b9c8714 requirements update 2025-01-06 15:11:27 +01:00
4a05ec5efc added dockerfile 2025-01-06 15:11:21 +01:00
c3d575ccf1 added yolo image object detection to /image 2025-01-06 13:00:00 +01:00
1b0b1e87ce change directories and added driver service config 2025-01-06 11:44:08 +01:00
9c41d64c69 remove mqtt debug print 2025-01-06 11:23:27 +01:00
b48eda9735 change image refresh time 2025-01-06 10:35:54 +01:00
629f9cba92 attempt to fix javascript 2025-01-06 10:34:00 +01:00
51aad34c78 added bracket 2025-01-06 10:08:05 +01:00
1b3fead2b3 documentation 2025-01-06 09:49:56 +01:00
ishak jmilou.ishak
61651a9a02 try to see why sensor value doesnt show on website 2024-12-18 12:22:22 +01:00
ishak jmilou.ishak
e5881f1b37 making message print only once 2024-12-18 11:56:43 +01:00
50b6b83299 fix headerfile 2024-12-17 11:19:15 -04:00
10597ba37d avoid busy waiting 2024-12-17 11:17:25 -04:00
6ab5716797 Documentation kobuki, mqtt, opencv 2024-12-13 18:33:24 +01:00
50b90e461f gitignore update 2024-12-13 10:52:53 +01:00
2811036595 change port of mqtt flask 2024-12-12 14:01:39 +01:00
58f1a931a6 Merge branch 'OpenCV' into 'main'
Open cv

See merge request technische-informatica-sm3/ti-projectten/rooziinuubii79!3
2024-12-12 13:48:59 +01:00
c9d3b0f795 Merge branch 'main' into 'OpenCV'
# Conflicts:
#   src/Python/flask/web/app.py
#   src/Python/flask/web/static/script.js
2024-12-12 13:48:21 +01:00
85af15d7a3 change default camera 2024-12-12 13:28:43 +01:00
a1b50a3780 changes to video settings 2024-12-12 13:27:06 +01:00
b86528595e change camera 2024-12-11 16:51:01 +01:00
eef4f9c79c revert video format change 2024-12-11 16:39:45 +01:00
3c23d37be1 change video format 2024-12-11 16:37:01 +01:00
c2886d32c9 use libcamera with picam 2024-12-11 16:30:14 +01:00
8158c85d6e use astra backend 2024-12-11 16:12:16 +01:00
e682969ec8 code revert 2024-12-11 16:07:26 +01:00
0dfc3b5c13 attempt with gstreamer 2024-12-11 15:43:05 +01:00
7f786d5197 change camera logic 2024-12-11 15:37:31 +01:00
60ba177dc2 add pipeline for picam 2024-12-11 15:34:53 +01:00
e9f998b3e7 set V4L2 backend 2024-12-11 15:28:21 +01:00
7eeaba482e removed attempt for camera detection 2024-12-11 14:50:02 +01:00
e8db00120f update video camera logic 2024-12-11 14:47:29 +01:00
c65f310e81 cleanup 2024-12-11 14:46:58 +01:00
ec3e83ef7f changed ip adress and cmakelist 2024-12-11 14:35:42 +01:00
480d36393a update website so it shows image 2024-12-10 13:29:58 +01:00
fea0f19857 update ip adress 2024-12-10 13:29:50 +01:00
e1135dac0f update cmakelist 2024-12-10 13:13:45 +01:00
2f4e5ae096 re enable robot communication 2024-12-09 10:31:01 +01:00
ishak jmilou.ishak
606506e40c update review feedback with additional documentation and planning notes 2024-12-03 13:37:44 +01:00
ishak jmilou.ishak
6efd95fb32 feedback sprint 4 2024-12-03 13:34:43 +01:00
ishak jmilou.ishak
5eff7fccba fix 2024-12-03 12:45:18 +01:00
ishak jmilou.ishak
a03894e52e changed MQTT connection port from 1884 to 80 2024-12-03 12:18:02 +01:00
9e07a243ea receive images from mqtt server and display on endpoint 2024-12-03 12:06:12 +01:00
b93a5f2dca added mosquitto conf 2024-12-02 14:00:29 +01:00
911b870786 remove unused library 2024-12-02 13:59:27 +01:00
dd2a1b56c4 changed port 2024-12-02 12:58:14 +00:00
dd39bd3021 fixed mqtt and sockets and reverse proxy after 5 hours 2024-12-02 13:44:15 +01:00
ishak jmilou.ishak
1563528b67 changed file location 2024-12-02 13:32:45 +01:00
ishak jmilou.ishak
2e5af52ba8 changed mqtt port to 8080 2024-12-02 12:45:26 +01:00
ishak jmilou.ishak
eb04d35d40 changed path 2024-12-02 10:13:42 +01:00
ishak jmilou.ishak
80fcb1ccc3 daily stand up 02/12/2024 2024-12-02 09:13:34 +01:00
ishak jmilou.ishak
62cdf98098 testing phpmyadmin 2024-11-30 15:18:01 +01:00
ishak jmilou.ishak
db6fa156c9 Update WSGI path to point to the correct application directory 2024-11-28 13:21:24 +01:00
ishak jmilou.ishak
048790ec8b Add WSGI entry point for the application 2024-11-28 13:16:03 +01:00
8aa54805ac Grabbed existing progam off github and repaired it 2024-11-27 21:25:48 +01:00
ishak jmilou.ishak
aca6644c02 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-26 14:55:47 +01:00
ishak jmilou.ishak
492f506aa2 daily stand up-/ hoofdvraag en deelvragen 2024-11-26 14:55:45 +01:00
d26d277c3c driver cleanup 2024-11-26 13:32:14 +01:00
c76ba93e82 code req 2024-11-26 13:30:38 +01:00
508d2ed4e2 added base OpenCV script and documentation 2024-11-25 11:46:24 +01:00
3e202acc8d gitignore update 2024-11-25 11:46:07 +01:00
0bfba0bffe Added comments 2024-11-24 16:42:34 +01:00
ishak jmilou.ishak
8a5b349040 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-20 17:14:55 +01:00
ishak jmilou.ishak
a41ea1b70c styling website 2024-11-20 17:14:54 +01:00
eb804c888c updated credentials to secure mqtt 2024-11-20 16:29:22 +01:00
a028a6f88f change mqtt credentials 2024-11-20 16:07:39 +01:00
47b29a1c55 added both json and idividual topic sending in cpp 2024-11-20 15:49:40 +01:00
528de4f3f4 code conventions 2024-11-20 15:49:40 +01:00
ishak jmilou.ishak
c16ba3cf9d busy with website redesign 2024-11-20 15:38:56 +01:00
ishak jmilou.ishak
82c4381143 daily stand up 19/11/2024 2024-11-20 11:45:54 +01:00
ishak jmilou.ishak
25b1fa8c35 changed locations of different files 2024-11-18 13:07:02 +01:00
ishak jmilou.ishak
cc9aefa424 made admonition work 2024-11-18 09:18:49 +01:00
ishak jmilou.ishak
3d95479840 fixed drop down 2024-11-18 09:15:56 +01:00
ishak jmilou.ishak
b2a24779f5 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-18 09:13:36 +01:00
ishak jmilou.ishak
97efd7d6e1 started with daily 18-11 2024-11-18 09:13:35 +01:00
Mees Roelofsz
ef52dbefe4 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-13 16:06:57 +01:00
Mees Roelofsz
f24b88bd68 fixed markdown 2024-11-13 16:06:53 +01:00
ishak jmilou.ishak
f9767965a1 wrote comments 2024-11-13 15:01:15 +01:00
ishak jmilou.ishak
a6b1b3bd1e removed space 2024-11-13 14:20:14 +01:00
ishak jmilou.ishak
c9efba62d4 testing vpn 2024-11-13 13:41:40 +01:00
661fdb9d26 Added placeholder webite for main website 2024-11-12 14:30:45 +01:00
0e30854b51 fix html and added password to enter page 2024-11-12 14:13:43 +01:00
ishak jmilou.ishak
194920bdad refert to older version to check if this is the problem 2024-11-06 15:08:47 +01:00
ishak jmilou.ishak
a67f5238b6 Enhance MQTT message handling and add data fetching in Flask app 2024-11-06 14:44:12 +01:00
ishak jmilou.ishak
d534940370 Refactor safety checks and improve message handling in CKobuki class 2024-11-06 14:06:15 +01:00
89b608b759 comments and cleanup 2024-11-06 14:01:39 +01:00
6dba1d0262 added sensordata to website 2024-11-06 13:59:40 +01:00
f95b78d236 comment 2024-11-06 13:08:48 +01:00
6326227be6 made working endpoint for kobuki data 2024-11-05 19:11:29 +01:00
5f4a7606ce attempt to fix endpoint 2024-11-05 17:07:40 +01:00
9cdf9aa44d intergrated everything and added simple endpoint for mqtt data 2024-11-05 17:06:07 +01:00
7424c2d033 made it so it automaticly sends all its data in json to kobuki/data 2024-11-05 14:20:24 +01:00
47f29f0f0a whole struct to json 2024-11-05 13:47:26 +01:00
222b14b9e5 can send messages to server from cpp 2024-11-05 12:55:02 +01:00
Mees Roelofsz
ce5412847f added to a correctly named file 2024-11-05 12:43:28 +01:00
Mees Roelofsz
8dde47febb added ino file 2024-11-05 12:39:06 +01:00
Mees Roelofsz
6dcf25c810 added tinygpslibrary 2024-11-05 12:38:59 +01:00
d31c70269c remove old parser 2024-11-04 15:16:18 +01:00
206bf965e5 fix safety 2024-11-04 14:42:53 +01:00
9cb5468885 flip rotation value to negative 2024-11-04 14:39:05 +01:00
a422456747 change rotation way 2024-11-04 14:36:46 +01:00
a508d5c881 stop the robot 2024-11-04 14:10:17 +01:00
ishak jmilou.ishak
4aa0356a8a Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-04 14:06:28 +01:00
ishak jmilou.ishak
1b747edb8d FIXED CSS 2024-11-04 14:06:27 +01:00
f3033b1632 change rotation controls 2024-11-04 14:00:35 +01:00
f111030e73 added better safety 2024-11-04 13:37:09 +01:00
0a43e2ef57 added more preciese controls 2024-11-04 13:31:40 +01:00
95bd144c2e drive testing 2024-11-04 13:29:18 +01:00
c3270a1c47 fix thread going out of scope 2024-11-04 13:23:02 +01:00
0dc267536e fix nullpointer 2024-11-04 13:19:31 +01:00
60a51daf7b added sleep 2024-11-04 13:16:10 +01:00
3af23f61cf start safety thread 2024-11-04 13:10:16 +01:00
b90bca0060 re-added robotsafety 2024-11-04 13:05:17 +01:00
ishak jmilou.ishak
7cc2fbd8ec Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-04 12:32:33 +01:00
ishak jmilou.ishak
d14108d656 Add 'Stop' button to direction control in index.html 2024-11-04 12:32:32 +01:00
3d7bcf84e4 added more drive logic 2024-11-04 12:31:30 +01:00
25a179ded6 fix last build errors 2024-11-04 12:29:24 +01:00
a3ed0f0f30 used wrong operator 2024-11-04 12:23:34 +01:00
c246258c6d . 2024-11-04 12:22:31 +01:00
94f1b40e74 fix compile errors 2024-11-04 12:19:38 +01:00
d845d01e0a reset mqtt var after running command 2024-11-04 12:18:14 +01:00
ishak jmilou.ishak
005021e2d5 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-04 12:17:18 +01:00
ishak jmilou.ishak
38dfccbd28 Change fetch endpoint from /send-direction to /move 2024-11-04 12:17:17 +01:00
d9f0abd86c fix 2024-11-04 12:15:14 +01:00
ishak jmilou.ishak
85d54f8422 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-04 12:14:43 +01:00
ishak jmilou.ishak
99676cc83a Refactor button event handling to send direction via fetch to /send-direction 2024-11-04 12:14:42 +01:00
d09352861f reset message after input 2024-11-04 12:13:52 +01:00
ff599e127d remove debug prints 2024-11-04 11:58:06 +01:00
cf59adae8b attempt to fix 2024-11-04 11:56:57 +01:00
7449f713fa added breaks 2024-11-04 11:55:27 +01:00
ishak jmilou.ishak
705ffcd98d Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-04 11:54:22 +01:00
ishak jmilou.ishak
ca9b81c03e Refactor move endpoint to accept JSON input and update form submission method 2024-11-04 11:54:21 +01:00
b64ccd51eb more debug prints 2024-11-04 11:53:58 +01:00
7e1e46bcc3 added debugging print 2024-11-04 11:50:51 +01:00
60f27af05d some changes 2024-11-04 11:44:51 +01:00
ishak jmilou.ishak
17e1399643 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-04 11:43:21 +01:00
ishak jmilou.ishak
d673b4e0ad loading js correctly 2024-11-04 11:43:20 +01:00
e708cbb6ea added comments and controls for mqtt 2024-11-04 11:43:04 +01:00
ishak jmilou.ishak
41034ba85c fixed fetch 2024-11-04 11:34:42 +01:00
ishak jmilou.ishak
21ed9c5080 changed location 2024-11-04 10:55:39 +01:00
ishak jmilou.ishak
19bc1a4184 made get method 2024-11-04 10:26:15 +01:00
b8b61df756 it compiles again 2024-11-04 10:09:39 +01:00
896897d1b0 added thread library to driver 2024-11-04 09:36:21 +01:00
fd4615f0af re-added functions that where removed when merged 2024-11-04 09:32:02 +01:00
9c5825fd61 fix driver 2024-11-04 09:27:18 +01:00
ishak jmilou.ishak
6f17c1ed6f named action /move 2024-11-04 09:23:07 +01:00
a46900dfd2 Merge branch 'test_script_movement' 2024-10-31 21:51:04 +01:00
2919e20970 update gitignore 2024-10-31 21:49:17 +01:00
ishak jmilou.ishak
c4a694e601 moved venv to the .gitignore 2024-10-24 14:07:11 +02:00
d78389992e added a comment 2024-10-22 15:33:19 +02:00
7fa04a5c35 made it so we can read indiviual buttons 2024-10-22 15:10:08 +02:00
1175444abf edited cmakelist 2024-10-21 13:09:46 +02:00
79f21f8755 testing 2024-10-21 12:34:03 +02:00
9935375a5a comment 2024-10-21 10:41:57 +02:00
d41e6432b3 fixed safety of robot and that it always drives when its told to drive 2024-10-21 10:41:48 +02:00
b5f5491222 attempt to make function so robot doenst throw itself from a cliff 2024-10-20 17:35:55 +02:00
856d60eea2 Junk removal 2024-10-18 16:22:47 +02:00
6a0024ebb8 rotation sort of works now 2024-10-16 16:43:24 +02:00
0011096977 INfrastructure update 2024-10-14 14:34:45 +02:00
5ead72c47e uitleg welke sensoren gebruiken en waarom 2024-10-14 13:57:48 +02:00
ishak jmilou.ishak
216b28d27c added template user and learning story 2024-10-14 12:10:03 +02:00
b5e619ac0a attempt to make robot rotate 2024-10-09 14:24:23 +02:00
c38491ce20 added to parser so we can read raw sensordata from cliffsensor (thanks anwar) 2024-10-08 15:05:41 +02:00
c9d128ecd1 todo commnets from ferry at expert review 2024-10-08 14:27:38 +02:00
Mees Roelofsz
927126a797 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-10-08 13:53:27 +02:00
Mees Roelofsz
6235af77b9 changed word 2024-10-08 13:53:25 +02:00
9a4cd473a6 added and fixed logger 2024-10-08 13:29:24 +02:00
Mees Roelofsz
acd4b9589c To Be Determined 2024-10-08 13:13:27 +02:00
Mees Roelofsz
c129ae4980 added more tabs 2024-10-08 13:09:45 +02:00
Mees Roelofsz
cdbb538238 decision written out 2024-10-08 13:05:34 +02:00
Mees Roelofsz
26cc0e2041 finished brainstorm 2024-10-08 12:59:34 +02:00
2bee0ffc25 fix nonsense errors 2024-10-08 12:53:22 +02:00
cede3689f4 cleanup and logger creation 2024-10-08 12:39:31 +02:00
71 changed files with 3868 additions and 785 deletions

11 .gitignore vendored
View File

@@ -13,9 +13,16 @@ src/Socket/a.out
src/C++/Driver/cmake_install.cmake
src/C++/Socket/a.out
src/C++/Driver/Makefile
src/C++/Driver/vgcore*
vgcore*
src/C++/Driver/cmake_install.cmake
src/C++/Driver/Makefile
src/C++/Driver/log
build/
venv
src/C++/MQTT/CMakeFiles
src/C++/MQTT/Makefile
src/C++/MQTT/CMakeCache.txt
src/C++/MQTT/cmake_install.cmake
src/Python/flask/web/__pycache__
venv
build/
@@ -24,3 +31,5 @@ CMakeFiles/
Makefile
CMakeCache.txt
cmake_install.cmake
src/C++/OpenCV/main
.vs

View File

@@ -0,0 +1,21 @@
```
Introduction of the learning story:
Fill in a short description of the learning story here.
Try to link it directly to the project the students are going to do.
```
**What am I going to learn?**
```
Put an overview here of the skills a student is going to learn.
Once the student has ticked all the boxes, they have understood this topic well.
```
- [ ] <- Make checklists like this
**How am I going to learn this?**
```
Put the list of sources here.
```
- [ ] <- Make checklists like this

View File

@@ -0,0 +1,27 @@
As a user I want ..., so that ... (delete this line and put the text in the title)
[description of the user story and its context]
**Tasks**
This user story is split into a number of tasks. Complete the list below yourself.
- [ ] Task 1.
- [ ] Task 2.
- [ ] ...
**Acceptance criteria**
Acceptance criteria are specific requirements the user story must meet. They are usually unique per user story.
- [ ] Acceptance criterion 1
- [ ] Acceptance criterion 2
- [ ] ...
**Definition of Done: Hardware**
- [ ] All acceptance criteria of the user story have been ticked off.
- [ ] Your work is documented.
- [ ] You have carried out tests.
- [ ]

View File

@@ -1,93 +1,8 @@
# TI-project
## Getting started
To make it easy for you to get started with GitLab, here's a list of recommended next steps.
Already a pro? Just edit this README.md and make it your own. Want to make it easy? [Use the template at the bottom](#editing-this-readme)!
## Add your files
- [ ] [Create](https://docs.gitlab.com/ee/user/project/repository/web_editor.html#create-a-file) or [upload](https://docs.gitlab.com/ee/user/project/repository/web_editor.html#upload-a-file) files
- [ ] [Add files using the command line](https://docs.gitlab.com/ee/gitlab-basics/add-file.html#add-a-file-using-the-command-line) or push an existing Git repository with the following command:
```
cd existing_repo
git remote add origin https://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-project.git
git branch -M main
git push -uf origin main
```
## Integrate with your tools
- [ ] [Set up project integrations](https://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-project/-/settings/integrations)
## Collaborate with your team
- [ ] [Invite team members and collaborators](https://docs.gitlab.com/ee/user/project/members/)
- [ ] [Create a new merge request](https://docs.gitlab.com/ee/user/project/merge_requests/creating_merge_requests.html)
- [ ] [Automatically close issues from merge requests](https://docs.gitlab.com/ee/user/project/issues/managing_issues.html#closing-issues-automatically)
- [ ] [Enable merge request approvals](https://docs.gitlab.com/ee/user/project/merge_requests/approvals/)
- [ ] [Set auto-merge](https://docs.gitlab.com/ee/user/project/merge_requests/merge_when_pipeline_succeeds.html)
## Test and Deploy
Use the built-in continuous integration in GitLab.
- [ ] [Get started with GitLab CI/CD](https://docs.gitlab.com/ee/ci/quick_start/index.html)
- [ ] [Analyze your code for known vulnerabilities with Static Application Security Testing (SAST)](https://docs.gitlab.com/ee/user/application_security/sast/)
- [ ] [Deploy to Kubernetes, Amazon EC2, or Amazon ECS using Auto Deploy](https://docs.gitlab.com/ee/topics/autodevops/requirements.html)
- [ ] [Use pull-based deployments for improved Kubernetes management](https://docs.gitlab.com/ee/user/clusters/agent/)
- [ ] [Set up protected environments](https://docs.gitlab.com/ee/ci/environments/protected_environments.html)
***
# Editing this README
When you're ready to make this README your own, just edit this file and use the handy template below (or feel free to structure it however you want - this is just a starting point!). Thanks to [makeareadme.com](https://www.makeareadme.com/) for this template.
## Suggestions for a good README
Every project is different, so consider which of these sections apply to yours. The sections used in the template are suggestions for most open source projects. Also keep in mind that while a README can be too long and detailed, too long is better than too short. If you think your README is too long, consider utilizing another form of documentation rather than cutting out information.
## Name
Choose a self-explaining name for your project.
# TI-project - Kobuki
## Description
Let people know what your project can do specifically. Provide context and add a link to any reference visitors might be unfamiliar with. A list of Features or a Background subsection can also be added here. If there are alternatives to your project, this is a good place to list differentiating factors.
This project is a Kobuki robot that drives around in dangerous areas and uses a camera to detect objects in its path.
## Badges
On some READMEs, you may see small images that convey metadata, such as whether or not all the tests are passing for the project. You can use Shields to add some to your README. Many services also have instructions for adding a badge.
## Photos
![Kobuki](/docs/assets/KobukiPhoto.jpg)
## Visuals
Depending on what you are making, it can be a good idea to include screenshots or even a video (you'll frequently see GIFs rather than actual videos). Tools like ttygif can help, but check out Asciinema for a more sophisticated method.
## Installation
Within a particular ecosystem, there may be a common way of installing things, such as using Yarn, NuGet, or Homebrew. However, consider the possibility that whoever is reading your README is a novice and would like more guidance. Listing specific steps helps remove ambiguity and gets people to using your project as quickly as possible. If it only runs in a specific context like a particular programming language version or operating system or has dependencies that have to be installed manually, also add a Requirements subsection.
## Usage
Use examples liberally, and show the expected output if you can. It's helpful to have inline the smallest example of usage that you can demonstrate, while providing links to more sophisticated examples if they are too long to reasonably include in the README.
## Support
Tell people where they can go to for help. It can be any combination of an issue tracker, a chat room, an email address, etc.
## Roadmap
If you have ideas for releases in the future, it is a good idea to list them in the README.
## Contributing
State if you are open to contributions and what your requirements are for accepting them.
For people who want to make changes to your project, it's helpful to have some documentation on how to get started. Perhaps there is a script that they should run or some environment variables that they need to set. Make these steps explicit. These instructions could also be useful to your future self.
You can also document commands to lint the code or run tests. These steps help to ensure high code quality and reduce the likelihood that the changes inadvertently break something. Having instructions for running tests is especially helpful if it requires external setup, such as starting a Selenium server for testing in a browser.
## Authors and acknowledgment
Show your appreciation to those who have contributed to the project.
## License
For open source projects, say how it is licensed.
## Project status
If you have run out of energy or time for your project, put a note at the top of the README saying that development has slowed down or stopped completely. Someone may choose to fork your project or volunteer to step in as a maintainer or owner, allowing your project to keep going. You can also make an explicit request for maintainers.

View File

@@ -2,35 +2,51 @@
```mermaid
classDiagram
Apache <--> Flask
Flask <--> MariaDB
ESP32 --> RPI : Wired communication
RPI <--> Wireguard : MQTT with a VPN connection
Wireguard <--> Flask
VirtualMachine <--> RPI
Kobuki <--> RPI
namespace Server {
class VirtualMachine {
+Apache()
+Flask()
+SocketServer()
+MariaDB()
Python/C++
Database
Website
class Wireguard {
+VPN()
}
class Apache {
+ReverseProxy()
}
class MariaDB {
+Database()
}
class Flask {
Python
+Webserver()
+endpoints communication()
}
}
namespace robot {
class RPI {
Receiver
Sensors
C++
+SocketClient()
+Kobuki()
}
namespace Kobuki {
class RPI {
+KobukiCommunication()
+ESP32Communication()
C++
}
class Kobuki {
+data
}
class ESP32 {
+TVOC()
+ECO2()
+Temperature()
+LDR()
+Camera()
+GPS()
+ToF()
}
}
```

View File

@@ -0,0 +1,51 @@
# Systemd Services
# What is a service
A service is a program or script that runs in the background and is managed by the system. Services are typically started at boot time and run until the system is shut down, and they can be started, stopped, and restarted by the system administrator.
# How to manage services with systemd
## Starting a service
To start a service, use the `systemctl start` command followed by the service name. For example, to start the `apache2` service, use the following command:
```bash
sudo systemctl start apache2
```
## Stopping a service
To stop a service, use the `systemctl stop` command followed by the service name. For example, to stop the `apache2` service, use the following command:
```bash
sudo systemctl stop apache2
```
## Restarting a service
To restart a service, use the `systemctl restart` command followed by the service name. For example, to restart the `apache2` service, use the following command:
```bash
sudo systemctl restart apache2
```
## Enabling a service
To enable a service to start at boot time, use the `systemctl enable` command followed by the service name. For example, to enable the `apache2` service, use the following command:
```bash
sudo systemctl enable apache2
```
## Creating a new service
To create a new service, you need to create a new service file in the `/etc/systemd/system/` directory. The service file should have a `.service` extension and contain the following sections:
### Example service file:
```ini
[Unit]
# Description of the service
Description=FlaskApp
# Start the service after the network is up
After=network.target
[Service]
# Run the service as this user
User=ishak
# Working directory of the service
WorkingDirectory=/home/ishak/rooziinuubii79/src/Python/flask/web/
# Command that starts the service
ExecStart=/usr/bin/gunicorn -w 3 -b 127.0.0.1:5000 app:app
[Install]
# Required so the service can be enabled at boot with systemctl enable
WantedBy=multi-user.target
```
Note that systemd only supports comments on their own line, and remember to run `sudo systemctl daemon-reload` after adding or editing a service file so systemd picks up the change.

View File

@@ -1,2 +0,0 @@
# Feedback expert review

BIN docs/assets/KobukiPhoto.jpg Normal file (binary file not shown; 491 KiB)

50 docs/code/Mqtt.md Normal file
View File

@@ -0,0 +1,50 @@
# MQTT
## What is MQTT?
MQTT is a lightweight messaging protocol made for IoT devices. It lets IoT devices, servers, and applications communicate efficiently by publishing and subscribing to messages.
## How to connect
To connect to an MQTT server you need to create an instance of the class.
Example:
```cpp
// server address, Client ID, Client Username, Client Password
MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi", "rpiwachtwoordofzo"); // create a client object
```
Later, in the setup function, you need to call ```client.connect();``` to connect to the MQTT server.
```cpp
client.connect();
```
Once you are connected and the instance is initialised, you can subscribe to topics or publish messages to them.
## Subscribing and receiving messages
Example subscribing to a topic:
```cpp
void setup(){
client.subscribe("home/commands");
}
```
Example receiving latest message from a topic:
```cpp
std::string foo(){
std::string latestMqttMessage = "";
latestMqttMessage = client.getLastMessage();
return latestMqttMessage;
}
```
If you want to subscribe to multiple topics you need to create multiple instances of the MQTT class.
## Publishing messages
Example publishing a message:
```cpp
void foo(std::string Message){
//channel, payload
client.publishMessage("kobuki/example", Message);
}
```
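Putting the calls above together, here is a minimal end-to-end sketch. The broker address, credentials and topic names are the example values used above; the polling loop itself is only illustrative.
```cpp
#include <chrono>
#include <string>
#include <thread>
#include "MqttClient.h"

int main() {
    // Same constructor arguments as in the example above:
    // server address, client ID, client username, client password
    MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi", "rpiwachtwoordofzo");

    client.connect();                  // connect to the broker
    client.subscribe("home/commands"); // one topic per MqttClient instance

    for (int i = 0; i < 10; ++i) {
        // Poll the most recently received message and echo it on another topic.
        std::string last = client.getLastMessage();
        if (!last.empty()) {
            client.publishMessage("kobuki/example", "echo: " + last);
        }
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
    return 0;
}
```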

69 docs/code/OpenCV.md Normal file
View File

@@ -0,0 +1,69 @@
# OpenCV
## Requirements
We want the camera to detect what is happening on the video feed and identify it, so that it can recognise dangers.
## Issues
* OpenCL not grabbing the GPU
* Solution: https://github.com/Smorodov/Multitarget-tracker/issues/93
## Installation
### Dependencies
* GLEW (for OpenGL)
* the OpenCV C++ library
How to install OpenCV:
```bash
sudo apt-get install libopencv-dev
```
## Code explanation
### Opening the camera with OpenCV
```cpp
VideoCapture cap(0); //Open the default camera (0), which points to /dev/video0. Change the number to use a different camera
if (!cap.isOpened()) { //if the camera could not be opened, print an error message
cerr << "Error: Could not open camera" << endl;
return;
}
```
### Taking a picture and storing it in a variable
```cpp
Mat frame; //create a new Mat (matrix) variable called frame
while (true) {
cap >> frame; // Capture a new image frame.
if (frame.empty()) { //if the frame is empty, print an error and try again
cerr << "Error: Could not capture image" << endl;
continue;
}
```
### Encoding the image for sending it over MQTT
```cpp
vector<uchar> buf; //create a dynamic buffer for the encoded image
imencode(".jpg", frame, buf); //encode the image to the buffer
auto* enc_msg = reinterpret_cast<unsigned char*>(buf.data());
```
```cpp
void CapnSend() {
// Convert the image to a byte array
// Publish the image data
client.publishMessage("kobuki/cam", string(enc_msg, enc_msg + buf.size()));
cout << "Sent image" << endl;
std::this_thread::sleep_for(std::chrono::milliseconds(300)); // send an image roughly every 300 ms
}
```
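For reference, here is a self-contained sketch that combines the snippets above into one capture-encode-publish loop. It assumes the `MqttClient` wrapper documented in `docs/code/Mqtt.md`; the client ID `KobukiCam` is illustrative.
```cpp
#include <opencv2/opencv.hpp>
#include <chrono>
#include <iostream>
#include <string>
#include <thread>
#include <vector>
#include "MqttClient.h" // assumed: the wrapper documented in docs/code/Mqtt.md

using namespace cv;
using namespace std;

int main() {
    MqttClient client("ws://145.92.224.21/ws/", "KobukiCam", "rpi", "rpiwachtwoordofzo");
    client.connect();

    VideoCapture cap(0); // open the default camera (/dev/video0)
    if (!cap.isOpened()) {
        cerr << "Error: Could not open camera" << endl;
        return 1;
    }

    Mat frame;
    vector<uchar> buf;
    while (true) {
        cap >> frame; // capture one frame
        if (frame.empty()) {
            cerr << "Error: Could not capture image" << endl;
            continue;
        }
        buf.clear();
        imencode(".jpg", frame, buf); // JPEG-encode the frame into the byte buffer
        // Publish the raw JPEG bytes as the MQTT payload.
        client.publishMessage("kobuki/cam", string(buf.begin(), buf.end()));
        this_thread::sleep_for(chrono::milliseconds(300)); // roughly 3 frames per second
    }
}
```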
## Sources
* https://github.com/UnaNancyOwen/OpenCVDNNSample/tree/master

View File

@@ -0,0 +1,10 @@
# Requirements
1. It compiles on x86 and ARM architectures
2. No duplicated code
3. Comments on tricky code
4. Doxygen comments on every function, unless the function name already makes it clear (see the example below)
5. Keep your code readable
6. No dead code
7. Use TODO comments (TODO TREE)
8.
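A minimal sketch of the Doxygen style meant by rule 4 (the function `clampSpeed` and its 0 - 1024 range are illustrative, borrowed from the driver's speed convention):
```cpp
/// @brief Clamp a requested speed to the range the Kobuki accepts.
/// @param speedvalue The requested speed, 0 - 1024.
/// @return The clamped speed value.
int clampSpeed(int speedvalue) {
    if (speedvalue < 0) return 0;
    if (speedvalue > 1024) return 1024;
    return speedvalue;
}
```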

View File

@@ -22,3 +22,32 @@ The ideas we came up with where:
- A mobile scarecrow that will ride around a farm scaring birds, checking for optimal growing conditions using sensors, and checking (and possibly fighting) pests.
- A robot that would be sent into buildings with a gas leak to check where the leak is strongest and thus where it is most likely coming from (possibly also a lighting mechanism for controlled explosions in cases where a disaster can be prevented without loss of human life).
- A mobile fire extinguisher placed in buildings or terrains that are prone to fires. The robot would be able to detect fires and extinguish them before they get out of hand. Or control the fire before the fire department arrives.
- A mobile solar panel that moves to the spot with the strongest sunlight (within a certain area), after which it would plug itself into a battery and charge it. This battery could then be used to power a building or a car.
- A robot that is dropped in dangerous environments to scout the area and report back to the people who sent it. This could be used in warzones, dangerous buildings, or other dangerous environments.
It scouts for stuff in the air using air sensors, it scouts for stuff on the ground using cameras, and it scouts for possible hazards like fire using infrared and other sensors.
### Decision
In the end we decided to go with the final idea, partly because it merges some of the most interesting aspects of the other ideas. We think it fits best with our goal of minimizing the loss of human life, because it can be used not just for fires or gas leaks but also in war zones and other dangerous environments. This gives it far more use cases than the other ideas, and a buyer can cover a lot of dangerous situations with just these robots.
## Realization & Plan
### Hardware
#### Sensors
placeholder
#### IC's
placeholder
#### Motors
placeholder
#### Batteries
placeholder
### Software

View File

@@ -0,0 +1,25 @@
# Kobuki driver
## How do I communicate with the Kobuki
You can communicate with the Kobuki over USB serial or via the large serial port on the front. We chose the USB port, paired with a Raspberry Pi.
The Kobuki sends a message every 200ms at a baud rate of 115200. The message contains all the sensor data and always starts with the same 2 bytes: 0xAA and 0x55.
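As a sketch of that serial setup (a minimal example; the device path `/dev/ttyUSB0` and the helper names `openKobuki` and `syncToHeader` are illustrative, while the real driver keeps its descriptor in `HCom`):
```cpp
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>
#include <cstdio>

// Open the Kobuki's USB serial port at 115200 baud, 8N1, raw mode.
int openKobuki(const char *device /* e.g. "/dev/ttyUSB0" */) {
    int fd = open(device, O_RDWR | O_NOCTTY);
    if (fd < 0) {
        perror("open");
        return -1;
    }
    termios tty{};
    tcgetattr(fd, &tty);
    cfmakeraw(&tty);            // raw byte stream: no echo, no line editing
    cfsetispeed(&tty, B115200); // 115200 baud in both directions
    cfsetospeed(&tty, B115200);
    tty.c_cflag |= (CLOCAL | CREAD);
    tty.c_cflag &= ~CSTOPB;     // 1 stop bit
    tcsetattr(fd, TCSANOW, &tty);
    return fd;
}

// Read byte by byte until the 0xAA 0x55 start-of-message marker is seen.
bool syncToHeader(int fd) {
    unsigned char prev = 0, cur = 0;
    while (read(fd, &cur, 1) == 1) {
        if (prev == 0xAA && cur == 0x55) {
            return true; // the rest of the message follows
        }
        prev = cur;
    }
    return false; // read error or port closed
}
```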
## Kobuki payloads
To communicate with the Kobuki we need to send it payloads. These are structured the same way as the payloads the Kobuki sends.
```cpp
unsigned char KobukiPayload[11] = {
0xaa, // Start byte 1
0x55, // Start byte 2
0x08, // Payload length (the first 2 bytes don't count)
0x01, // payload type (0x01 = control command)
0x04, // Control byte or additional identifier
actual_speed % 256, // Lower byte of speed value (max actual_speed 1024)
actual_speed >> 8, // Upper byte of speed value
0x00, // Placeholder for radius
0x00, // Placeholder for radius
0x00 // Placeholder for checksum (will be applied later)
};
```
You can also find the documentation about the payloads on the Kobuki website.
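The checksum placeholder is filled in by XOR-ing every byte after the two start bytes and storing the result in the last byte, exactly as the driver's `sendNullMessage` does elsewhere in this diff. A minimal sketch (the helper name `applyChecksum` is illustrative):
```cpp
#include <cstddef>

// XOR every byte between the two 0xAA 0x55 start bytes and the checksum slot,
// then store the result in the last byte (index 10 for this 11-byte payload).
void applyChecksum(unsigned char payload[11]) {
    unsigned char checksum = 0;
    for (std::size_t i = 2; i < 10; ++i) {
        checksum ^= payload[i];
    }
    payload[10] = checksum;
}
```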

View File

@@ -1,22 +1,21 @@
# **Project Plan - Robot for Dangerous Environments**
### Project description
The project concerns the development of a robot for dangerous environments. The robot is intended to explore and map hazardous situations. It is equipped with various sensors, and we will add a camera and additional sensors of our own to explore the surroundings. The data is collected and analysed to build a picture of the situation. The robot can be deployed in various situations, such as fires, collapses and other dangerous scenarios.
### 1. Organisational Context
Several factors matter in the development of the robot. Societal changes such as sustainability and the growing demand for technological solutions for dangerous working environments play a major role. The robot can be deployed in dangerous situations where, for example, people cannot go inside.
**Next steps:**
- Focus on sustainability and reusability.
- Commit to safety and efficiency.
### 2. Ethics
Ethical questions are central to the development of the robot. The security of the user's data must be taken into account, and the environmental impact must be minimised by using sustainable materials such as wood and metal. Privacy also plays an important role: the collected data must be used and stored securely and must follow the EU privacy guidelines.
**Points we need to take into account:**
@@ -27,7 +26,6 @@ Ethical questions are central to the development of the robot. Account must be taken
The project is carried out from a clear plan in which each sprint focuses on one part of the project: building the driver, testing, and connecting every part of the project. We use the Agile methodology, so everything can still change. In doing so we also have to take the ethical and organisational aspects into account, such as sustainability and safety.
### Approach
**Working method:** Use of Agile project management for flexibility.
@@ -36,4 +34,4 @@ The project is carried out from a clear plan in which each sprint focuses on one part
**Sustainability & Improvement:** Ensure that we can use sustainable materials and that we can keep improving the robot.
This approach ensures that the project is not only technically successful, but also meets ethical and organisational standards.

View File

@@ -0,0 +1,40 @@
# What are we going to build
## Sensors
* Camera
* GPS module
* Temperature sensor
* TVOC sensor
* ECO2 sensor
* LDR sensor
* Time of Flight sensor
## What are we going to do with the sensors?
### Camera
The camera is used to take pictures of the surroundings, for example to gather information when the robot is stuck; it also gives the option of obtaining information without being on site.
### GPS module
The GPS module is used to determine the robot's location and to indicate where points of interest are located.
### Temperature, TVOC and ECO2 sensors
These sensors are meant to measure the environment and check whether it is safe for people to enter.
### LDR sensor
The LDR sensor is used to measure the light level and decide whether a lamp on the robot needs to be switched on for the camera.
### Time of Flight sensor
The Time of Flight sensor is used to measure the distance between the robot and a wall, so the robot does not bump into it.
## The project
In the event of a fire, or on factory sites with hazardous substances, it can be necessary to carry out reconnaissance of a suspect environment. It is then unwise to send people inside; in those cases the emergency services fall back on a reconnaissance robot. The goal of the project is to build a reconnaissance robot that can independently map a dangerous environment.

View File

@@ -1 +0,0 @@
# home

View File

@@ -1,5 +0,0 @@
- [x] Kobuki werkt met driver.
- [x] Ik kan de data uitlezen.
- [ ] Data wordt correct weergegeven.
- [ ] Ik kan de data laten zien in op de website.
- [ ] Ik kan de kobuki besturen vanaf de website.

View File

@@ -0,0 +1,20 @@
# Daily stand ups
??? note "Daily Stand-ups Sprint 4"
| Day | Submitted by | What did you do yesterday | What will you do today | Any blockers? |
| ---------- | ------------ | ----------------------------- | -------------------------------------------------- | ------------------------------------ |
| 18/11/2024 | Ishak | --- | English, fix repo, start on a new user story | --- |
| 18/11/2024 | Sam | --- | English, process medium-stake feedback | None |
| 18/11/2024 | Yannick | --- | English, documentation, merge code | None |
| 18/11/2024 | Mees | --- | English, research | None |
| 19/11/2024 | Ishak | English, fix repo | workshop | --- |
| 19/11/2024 | Sam | English, process feedback | workshop | None |
| 19/11/2024 | Yannick | merge code, make diagram | workshop, documentation | None |
| 19/11/2024 | Mees | nothing | workshop, fix include path | include path does not work |
| 26/11/2024 | Ishak | Workshop | database, record English video | phpMyAdmin does not work (know the problem) |
| 26/11/2024 | Sam | opencv | opencv | --- |
| 26/11/2024 | Yannick | sick | sick | --- |
| 26/11/2024 | Mees | English video | stepper motor | vscode does not work |
| 02/12/2024 | Ishak | database | database | --- |
| 02/12/2024 | Sam | opencv | camera feed on website | --- |
| 02/12/2024 | Yannick | sick, documentation | housing for ESP | --- |

View File

@@ -0,0 +1,12 @@
# sprint review 4 feedback
- Definition of done SMART maken
- Uitgebreider beschrijven wat er voor de definition of done nodig is
- Testen van de software niet meer dan een halve A4
- Acceptatie criteria beter uitschrijven( vragen aan ed)
- Meer software ontwikkelen
- kijken of we met een punten systeem kunnen werken in user stories. zo kan je zien hoe groot een user story is.
- read.me file aanpassen
- meer aan documentatie doen.
- technisch iets te uitdagend
- planning beter maken

View File

@@ -34,17 +34,22 @@ plugins:
modules: [mkdocs_macros_mdocotion]
markdown_extensions:
- attr_list
- md_in_html
- fenced_code
- pymdownx.highlight:
linenums: true
use_pygments: true
- pymdownx.inlinehilite
- pymdownx.snippets
- pymdownx.tabbed
- pymdownx.superfences:
custom_fences:
- name: mermaid
class: mermaid
format: !!python/name:pymdownx.superfences.fence_code_format
- attr_list
- admonition
- pymdownx.details
- pymdownx.superfences
- md_in_html
- fenced_code
- pymdownx.highlight:
linenums: true
use_pygments: true
- pymdownx.inlinehilite
- pymdownx.snippets
- pymdownx.superfences:
custom_fences:
- name: mermaid
class: mermaid
format: !!python/name:pymdownx.superfences.fence_code_format
- toc:
permalink: true
- pymdownx.details

View File

@@ -6,7 +6,10 @@ set(CMAKE_CXX_STANDARD 23)
find_library(PAHO_MQTTPP_LIBRARY paho-mqttpp3 PATHS /usr/local/lib)
find_library(PAHO_MQTT_LIBRARY paho-mqtt3a PATHS /usr/local/lib)
include_directories(/usr/local/include)
# Find OpenCV package
find_package(OpenCV REQUIRED)
find_package(OpenEXR REQUIRED)
include_directories(${OpenCV_INCLUDE_DIRS})
set(SOURCE_FILES
src/KobukiDriver/KobukiParser.cpp
@@ -20,4 +23,4 @@ set(SOURCE_FILES
add_executable(kobuki_control ${SOURCE_FILES})
# Link the static libraries
target_link_libraries(kobuki_control ${PAHO_MQTTPP_LIBRARY} ${PAHO_MQTT_LIBRARY} pthread)
target_link_libraries(kobuki_control ${PAHO_MQTTPP_LIBRARY} ${PAHO_MQTT_LIBRARY} ${OpenCV_LIBS} pthread OpenEXR::OpenEXR)

View File

@@ -3,6 +3,7 @@
#include "termios.h"
#include <cstddef>
#include <iostream>
#include <thread>
// plot p;
static std::vector<float> vectorX;
@@ -281,199 +282,6 @@ int CKobuki::measure() {
return 0;
}
// int CKobuki::parseKobukiMessage(TKobukiData &output, unsigned char *data)
// {
// int rtrnvalue = checkChecksum(data);
// // if the checksum is bad, ignore it
// if (rtrnvalue != 0)
// return -2;
// int checkedValue = 1;
// // until we have gone through the whole length
// while (checkedValue < data[0])
// {
// // basic data subload
// if (data[checkedValue] == 0x01)
// {
// checkedValue++;
// if (data[checkedValue] != 0x0F)
// return -1;
// checkedValue++;
// output.timestamp = data[checkedValue + 1] * 256 +
// data[checkedValue]; checkedValue += 2; output.BumperCenter =
// data[checkedValue] && 0x02; output.BumperLeft =
// data[checkedValue] && 0x04; output.BumperRight =
// data[checkedValue] && 0x01; checkedValue++; output.WheelDropLeft
// = data[checkedValue] && 0x02; output.WheelDropRight =
// data[checkedValue] && 0x01; checkedValue++; output.CliffCenter =
// data[checkedValue] && 0x02; output.CliffLeft = data[checkedValue]
// && 0x04; output.CliffRight = data[checkedValue] && 0x01;
// checkedValue++;
// output.EncoderLeft = data[checkedValue + 1] * 256 +
// data[checkedValue]; checkedValue += 2; output.EncoderRight =
// data[checkedValue + 1] * 256 + data[checkedValue]; checkedValue
// += 2; output.PWMleft = data[checkedValue]; checkedValue++;
// output.PWMright = data[checkedValue];
// checkedValue++;
// output.ButtonPress = data[checkedValue];
// checkedValue++;
// output.Charger = data[checkedValue];
// checkedValue++;
// output.Battery = data[checkedValue];
// checkedValue++;
// output.overCurrent = data[checkedValue];
// checkedValue++;
// }
// else if (data[checkedValue] == 0x03)
// {
// checkedValue++;
// if (data[checkedValue] != 0x03)
// return -3;
// checkedValue++;
// output.IRSensorRight = data[checkedValue];
// checkedValue++;
// output.IRSensorCenter = data[checkedValue];
// checkedValue++;
// output.IRSensorLeft = data[checkedValue];
// checkedValue++;
// }
// else if (data[checkedValue] == 0x04)
// {
// checkedValue++;
// if (data[checkedValue] != 0x07)
// return -4;
// checkedValue++;
// output.GyroAngle = data[checkedValue + 1] * 256 +
// data[checkedValue]; checkedValue += 2; output.GyroAngleRate =
// data[checkedValue + 1] * 256 + data[checkedValue]; checkedValue
// += 5; // 3 unused
// }
// else if (data[checkedValue] == 0x05)
// {
// checkedValue++;
// if (data[checkedValue] != 0x06)
// return -5;
// checkedValue++;
// output.CliffSensorRight =
// data[checkedValue + 1] * 256 + data[checkedValue];
// checkedValue += 2;
// output.CliffSensorCenter =
// data[checkedValue + 1] * 256 + data[checkedValue];
// checkedValue += 2;
// output.CliffSensorLeft =
// data[checkedValue + 1] * 256 + data[checkedValue];
// checkedValue += 2;
// }
// else if (data[checkedValue] == 0x06)
// {
// checkedValue++;
// if (data[checkedValue] != 0x02)
// return -6;
// checkedValue++;
// output.wheelCurrentLeft = data[checkedValue];
// checkedValue++;
// output.wheelCurrentRight = data[checkedValue];
// checkedValue++;
// }
// else if (data[checkedValue] == 0x0A)
// {
// checkedValue++;
// if (data[checkedValue] != 0x04)
// return -7;
// checkedValue++;
// output.extraInfo.HardwareVersionPatch = data[checkedValue];
// checkedValue++;
// output.extraInfo.HardwareVersionMinor = data[checkedValue];
// checkedValue++;
// output.extraInfo.HardwareVersionMajor = data[checkedValue];
// checkedValue += 2;
// }
// else if (data[checkedValue] == 0x0B)
// {
// checkedValue++;
// if (data[checkedValue] != 0x04)
// return -8;
// checkedValue++;
// output.extraInfo.FirmwareVersionPatch = data[checkedValue];
// checkedValue++;
// output.extraInfo.FirmwareVersionMinor = data[checkedValue];
// checkedValue++;
// output.extraInfo.FirmwareVersionMajor = data[checkedValue];
// checkedValue += 2;
// }
// else if (data[checkedValue] == 0x0D)
// {
// checkedValue++;
// if (data[checkedValue] % 2 != 0)
// return -9;
// checkedValue++;
// output.frameId = data[checkedValue];
// checkedValue++;
// int howmanyFrames = data[checkedValue] / 3;
// checkedValue++;
// output.gyroData.reserve(howmanyFrames);
// output.gyroData.clear();
// for (int hk = 0; hk < howmanyFrames; hk++)
// {
// TRawGyroData temp;
// temp.x = data[checkedValue + 1] * 256 + data[checkedValue];
// checkedValue += 2;
// temp.y = data[checkedValue + 1] * 256 + data[checkedValue];
// checkedValue += 2;
// temp.z = data[checkedValue + 1] * 256 + data[checkedValue];
// checkedValue += 2;
// output.gyroData.push_back(temp);
// }
// }
// else if (data[checkedValue] == 0x10)
// {
// checkedValue++;
// if (data[checkedValue] != 0x10)
// return -10;
// checkedValue++;
// output.digitalInput = data[checkedValue + 1] * 256 +
// data[checkedValue]; checkedValue += 2; output.analogInputCh0 =
// data[checkedValue + 1] * 256 + data[checkedValue]; checkedValue
// += 2; output.analogInputCh1 = data[checkedValue + 1] * 256 +
// data[checkedValue]; checkedValue += 2; output.analogInputCh2 =
// data[checkedValue + 1] * 256 + data[checkedValue]; checkedValue
// += 2; output.analogInputCh3 = data[checkedValue + 1] * 256 +
// data[checkedValue]; checkedValue += 8; // 2+6
// }
// else if (data[checkedValue] == 0x13)
// {
// checkedValue++;
// if (data[checkedValue] != 0x0C)
// return -11;
// checkedValue++;
// output.extraInfo.UDID0 = data[checkedValue + 3] * 256 * 256 * 256
// +
// data[checkedValue + 2] * 256 * 256 +
// data[checkedValue + 1] * 256 +
// data[checkedValue];
// checkedValue += 4;
// output.extraInfo.UDID1 = data[checkedValue + 3] * 256 * 256 * 256
// +
// data[checkedValue + 2] * 256 * 256 +
// data[checkedValue + 1] * 256 +
// data[checkedValue];
// checkedValue += 4;
// output.extraInfo.UDID2 = data[checkedValue + 3] * 256 * 256 * 256
// +
// data[checkedValue + 2] * 256 * 256 +
// data[checkedValue + 1] * 256 +
// data[checkedValue];
// checkedValue += 4;
// }
// else
// {
// checkedValue++;
// checkedValue += data[checkedValue] + 1;
// }
// }
// return 0;
// }
long double CKobuki::gyroToRad(signed short GyroAngle) {
long double rad;
@@ -701,33 +509,9 @@ void CKobuki::doRotation(long double th) {
usleep(25 * 1000);
}
// combines navigation to a coordinate and rotation by an angle, performs
// movement to the selected coordinate in the robot's coordinate system
void CKobuki::goToXy(long double xx, long double yy) {
long double th;
yy = yy * -1;
th = atan2(yy, xx);
doRotation(th);
long double s = sqrt(pow(xx, 2) + pow(yy, 2));
// reset the robot's coordinate system
x = 0;
y = 0;
iterationCount = 0;
theta = 0;
// std::cout << "mam prejst: " << s << "[m]" << std::endl;
goStraight(s);
usleep(25 * 1000);
return;
}
void CKobuki::forward(int speedvalue, long double distance) {
/// @brief Makes the robot move forward for 3 seconds
/// @param speedvalue How fast it will drive forward from 0 - 1024
void CKobuki::forward(int speedvalue) {
// Use the goStraight logic to determine the speed and distance
// Calculate the actual speed and radius values based on the conversion table
@@ -755,3 +539,84 @@ void CKobuki::forward(int speedvalue, long double distance) {
uint32_t pocet;
pocet = write(HCom, &message, 11);
}
/// @brief Makes the kobuki rotate
/// @param degrees Rotation in degrees
void CKobuki::Rotate(int degrees) {
// convert degrees to radians
float radians = degrees * PI / 180.0;
// Calculate the rotation speed in radians per second
double RADS_PER_SEC = 1;
// calculate the rotation time and take its absolute value
float rotation_time = std::abs(radians / RADS_PER_SEC);
// Use original function to set the rotation speed in mm/s
setRotationSpeed(radians);
// Sleep for the calculated rotation time
std::this_thread::sleep_for(
std::chrono::milliseconds(static_cast<int>(rotation_time * 1000)));
// Stop the robot after the rotation
setRotationSpeed(0);
}
/// @brief Robot safety function to be run in another thread. Makes sure the robot does not throw itself off the table. Only use this when the speed is lower than 350
/// @param pointerToMessage Point this at the current control message; the function resets it to "estop" when the robot bumps into something, so it doesn't keep executing the previous command
// TODO: make this return bool so it can be used in the control part
void CKobuki::robotSafety(std::string *pointerToMessage) {
while (true) {
if (parser.data.BumperCenter || parser.data.BumperLeft ||
parser.data.BumperRight || parser.data.CliffLeft ||
parser.data.CliffCenter || parser.data.CliffRight) {
std::cout << "Safety condition triggered!" << std::endl; // Debug print
*pointerToMessage = "estop";
forward(-300); // reverse the robot
}
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(100)));
}
}
/// @brief Robot safety function to be run in another thread. Makes sure the robot does not throw itself off the table. Only use this when the speed is lower than 350
void CKobuki::robotSafety() {
while (true) {
if (parser.data.BumperCenter || parser.data.BumperLeft ||
parser.data.BumperRight || parser.data.CliffLeft ||
parser.data.CliffCenter || parser.data.CliffRight) {
std::cout << "Safety condition triggered!" << std::endl; // Debug print
forward(-300); // reverse the robot
}
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(100)));
}
}
/// @brief When called, the robot gets a control message to stop whatever it's doing
void CKobuki::sendNullMessage() {
unsigned char message[11] = {
0xaa, // Start byte 1
0x55, // Start byte 2
0x08, // Payload length (the first 2 bytes don't count)
0x01, // payload type (0x01 = control command)
0x04, // Control byte or additional identifier
0x00, // Lower byte of speed value
0x00, // Upper byte of speed value
0x00, // Placeholder for radius
0x00, // Placeholder for radius
0x00 // Placeholder for checksum
};
message[10] = message[2] ^ message[3] ^ message[4] ^ message[5] ^ message[6] ^
message[7] ^ message[8] ^ message[9];
// Send the message
uint32_t pocet;
pocet = write(HCom, &message, 11);
}

View File

@@ -12,7 +12,7 @@
////*************************************************************************************
#ifndef KOBUKI_CLASS_123456789
#define KOBUKI_CLASS_123456789
#define PI 3.141592653589793238462643383279502884L /* pi */
#define MS_INSTRUCTION_DELAY 25
#include <stdio.h>
@@ -31,95 +31,9 @@
#include <chrono>
#include <sstream>
#include "KobukiParser.h"
#include "graph.h"
using namespace std;
// typedef struct
// {
// unsigned short x;
// unsigned short y;
// unsigned short z;
// }TRawGyroData;
// typedef struct
// {
// //Hardware Version
// unsigned char HardwareVersionMajor;
// unsigned char HardwareVersionMinor;
// unsigned char HardwareVersionPatch;
// //Firmware Version
// unsigned char FirmwareVersionMajor;
// unsigned char FirmwareVersionMinor;
// unsigned char FirmwareVersionPatch;
// //Unique Device IDentifier(UDID)
// unsigned int UDID0;
// unsigned int UDID1;
// unsigned int UDID2;
// //Controller Info
// unsigned char PIDtype;
// unsigned int PIDgainP;
// unsigned int PIDgainI;
// unsigned int PIDgainD;
// }TExtraRequestData;
// typedef struct
// {
// //---basic package
// unsigned short timestamp;
// //bumpers
// bool BumperLeft;
// bool BumperCenter;
// bool BumperRight;
// //cliff
// bool CliffLeft;
// bool CliffCenter;
// bool CliffRight;
// // wheel drop
// bool WheelDropLeft;
// bool WheelDropRight;
// //wheel rotation
// unsigned short EncoderRight;
// unsigned short EncoderLeft;
// unsigned char PWMright;
// unsigned char PWMleft;
// //buttons
// unsigned char ButtonPress;// 0 no, 1 2 4 for button 0 1 2 (7 is all three)
// //power
// unsigned char Charger;
// unsigned char Battery;
// unsigned char overCurrent;
// //---docking ir
// unsigned char IRSensorRight;
// unsigned char IRSensorCenter;
// unsigned char IRSensorLeft;
// //---Inertial Sensor Data
// signed short GyroAngle;
// unsigned short GyroAngleRate;
// //---Cliff Sensor Data
// unsigned short CliffSensorRight;
// unsigned short CliffSensorCenter;
// unsigned short CliffSensorLeft;
// //---Current
// unsigned char wheelCurrentLeft;
// unsigned char wheelCurrentRight;
// //---Raw Data Of 3D Gyro
// unsigned char frameId;
// std::vector<TRawGyroData> gyroData;
// //---General Purpose Input
// unsigned short digitalInput;
// unsigned short analogInputCh0;
// unsigned short analogInputCh1;
// unsigned short analogInputCh2;
// unsigned short analogInputCh3;
// //---structure with data that appears only on request
// TExtraRequestData extraInfo;
// }TKobukiData;
typedef long(*src_callback_kobuki_data) (void *user_data, TKobukiData &Kobuki_data);
class CKobuki
@@ -156,10 +70,15 @@ public:
// control functions
void goStraight(long double distance);
void forward(int speedvalue, long double distance);
void forward(int speedvalue);
void doRotation(long double th);
void goToXy(long double xx, long double yy);
void Rotate(int degrees);
std::ofstream odometry_log;
void robotSafety(std::string *pointerToMessage);
void robotSafety(); //overload
void sendNullMessage();
bool safetyActive = false;
KobukiParser parser;

View File

@@ -1,7 +1,9 @@
#include "KobukiParser.h"
#include <iostream>
//should use checkedValue, or look at the payload length to see which extra fields are present
int KobukiParser::parseKobukiMessage(TKobukiData &output, unsigned char *data) {
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(20))); //avoid busy waiting. The kobuki sends a message every 20ms
int rtrnvalue = checkChecksum(data);
if (rtrnvalue != 0) {
// std::cerr << "Invalid checksum" << std::endl;
@@ -108,7 +110,7 @@ void KobukiParser::parseBasicData(TKobukiData &output, unsigned char *data, int
checkedValue += 2;
output.BumperCenter = (data[checkedValue] & 0x02) >> 1;
output.BumperLeft = (data[checkedValue] & 0x04) >> 2;
output.BumperRight = data[checkedValue] & 0x01;
checkedValue++;
output.WheelDropLeft = (data[checkedValue] & 0x02) >> 1;
output.WheelDropRight = data[checkedValue] & 0x01;
@@ -125,7 +127,9 @@ void KobukiParser::parseBasicData(TKobukiData &output, unsigned char *data, int
checkedValue++;
output.PWMright = data[checkedValue];
checkedValue++;
output.ButtonPress = data[checkedValue];
output.ButtonPress1 = data[checkedValue] & 0x01;
output.ButtonPress2 = data[checkedValue] & 0x02;
output.ButtonPress3 = data[checkedValue] & 0x04;
checkedValue++;
output.Charger = data[checkedValue];
checkedValue++;
@@ -151,13 +155,13 @@ void KobukiParser::parseGyroData(TKobukiData &output, unsigned char *data, int &
checkedValue += 5; // 3 unused
}
void KobukiParser::parseCliffSensorData(TKobukiData &output, unsigned char *data, int &checkedValue){
output.CliffSensorRight = data[checkedValue];
checkedValue++;
output.CliffSensorCenter = data[checkedValue];
checkedValue++;
output.CliffSensorLeft = data[checkedValue];
checkedValue++;
void KobukiParser::parseCliffSensorData(TKobukiData &output, unsigned char *data, int &checkedValue) {
output.CliffSensorRight = (data[checkedValue] << 8) | data[checkedValue + 1];
checkedValue += 2;
output.CliffSensorCenter = (data[checkedValue] << 8) | data[checkedValue + 1];
checkedValue += 2;
output.CliffSensorLeft = (data[checkedValue] << 8) | data[checkedValue + 1];
checkedValue += 2;
}
void KobukiParser::parseWheelCurrentData(TKobukiData &output, unsigned char *data, int &checkedValue){

View File

@@ -2,6 +2,8 @@
#define KOBUKIPARSER_H
#include <vector>
#include <thread>
struct TRawGyroData {
int x, y, z;
@@ -19,7 +21,8 @@ struct TKobukiData {
int CliffCenter, CliffLeft, CliffRight;
int EncoderLeft, EncoderRight;
int PWMleft, PWMright;
int ButtonPress, Charger, Battery, overCurrent;
int ButtonPress1, ButtonPress2, ButtonPress3;
int Charger, Battery, overCurrent;
int IRSensorRight, IRSensorCenter, IRSensorLeft;
int GyroAngle, GyroAngleRate;
int CliffSensorRight, CliffSensorCenter, CliffSensorLeft;

View File

@@ -1,71 +0,0 @@
#ifndef GRAPH1010
#define GRAPH1010
#include <stdio.h>
#include <stdlib.h>
#include <vector>
using namespace std;
#define GRAPH_ENABLED true
class plot {
public:
FILE *gp;
bool enabled,persist;
plot(bool _persist=false,bool _enabled=GRAPH_ENABLED) {
enabled=_enabled;
persist=_persist;
if (enabled) {
if(persist)
gp=popen("gnuplot -persist","w");
else
gp=popen("gnuplot","w");
}
}
void plot_data(vector<float> x,const char* style="points",const char* title="Data") {
if(!enabled)
return;
fprintf(gp,"set title '%s' \n",title);
fprintf(gp,"plot '-' w %s \n",style);
for(int k=0;k<x.size();k++) {
fprintf(gp,"%f\n",x[k]);
}
fprintf(gp,"e\n");
fflush(gp);
}
void plot_data(vector<float> x,vector<float> y,const char* style="points",const char* title="Data") {
if(!enabled)
return;
fprintf(gp,"set title '%s' \n",title);
fprintf(gp,"plot '-' w %s \n",style);
for(int k=0;k<x.size();k++) {
fprintf(gp,"%f %f \n",x[k],y[k]);
}
fprintf(gp,"e\n");
fflush(gp);
}
~plot() {
if(enabled)
pclose(gp);
}
};
/*
int main(int argc,char **argv) {
plot p;
for(int a=0;a<100;a++) {
vector<float> x,y;
for(int k=a;k<a+200;k++) {
x.push_back(k);
y.push_back(k*k);
}
p.plot_data(x,y);
}
return 0;
}
*/
#endif

View File

@@ -1,20 +1,23 @@
#include "MqttClient.h"
MqttClient::MqttClient(const std::string& address, const std::string& clientId, const std::string& username, const std::string& password)
//client_ is the MQTT connection
//the constructor parameters below configure that connection
: client_(address, clientId), username_(username), password_(password), callback_(*this) {
client_.set_callback(callback_);
connOpts_.set_clean_session(true);
connOpts_.set_mqtt_version(MQTTVERSION_3_1_1); // For MQTT 3.1.1
options.set_clean_session(true);
options.set_mqtt_version(MQTTVERSION_3_1_1); // For MQTT 3.1.1
if (!username_.empty() && !password_.empty()) {
connOpts_.set_user_name(username_);
connOpts_.set_password(password_);
options.set_user_name(username_);
options.set_password(password_);
}
}
void MqttClient::connect() {
try {
std::cout << "Connecting to broker..." << std::endl;
client_.connect(connOpts_)->wait();
client_.connect(options)->wait();
std::cout << "Connected!" << std::endl;
} catch (const mqtt::exception& exc) {
std::cerr << "Error: " << exc.what() << std::endl;
@@ -32,6 +35,15 @@ void MqttClient::subscribe(const std::string& topic, int qos) {
}
}
void MqttClient::publishMessage(const std::string& topic, const std::string& payload) {
try {
client_.publish(topic, payload)->wait();
} catch (const mqtt::exception& exc) {
std::cerr << "Error: " << exc.what() << std::endl;
throw;
}
}
/// @brief Only needed when the program doesn't keep itself alive
void MqttClient::run() {
// Keep the client running to receive messages
@@ -41,6 +53,7 @@ void MqttClient::run() {
}
void MqttClient::Callback::message_arrived(mqtt::const_message_ptr msg) {
//lock the variable; it automatically unlocks when going out of scope thanks to lock_guard
std::lock_guard<std::mutex> lock(client_.messageMutex_);
client_.lastMessage_ = msg->to_string();
}
@@ -55,7 +68,9 @@ void MqttClient::Callback::delivery_complete(mqtt::delivery_token_ptr token) {
/// @brief Get the last message received from the MQTT broker
/// @return The last message received in a string
//std::string is the datatype of the return value
std::string MqttClient::getLastMessage() {
//lock the variable; it automatically unlocks when going out of scope thanks to lock_guard
std::lock_guard<std::mutex> lock(messageMutex_);
return lastMessage_;
}
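The class now exposes publishMessage() alongside the mutex-protected getLastMessage(). A minimal usage sketch of this API (broker address, client id, credentials and payload below are placeholders, not the project's real configuration):
#include "MqttClient.h"
#include <chrono>
#include <iostream>
#include <thread>
int main() {
    MqttClient client("mqtt://localhost:1883", "example_client", "user", "password"); // placeholder settings
    client.connect();
    client.subscribe("home/commands");
    client.publishMessage("kobuki/data", "{\"Battery\":160}"); // publish a made-up test payload
    while (true) {
        std::string msg = client.getLastMessage(); // thread-safe copy of the last received message
        if (!msg.empty()) {
            std::cout << "last command: " << msg << std::endl;
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(100)); // avoid busy waiting
    }
}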

View File

@@ -13,6 +13,7 @@ public:
void subscribe(const std::string& topic, int qos = 1);
void run();
std::string getLastMessage();
void publishMessage(const std::string& topic, const std::string& payload);
private:
class Callback : public virtual mqtt::callback {
@@ -27,7 +28,7 @@ private:
};
mqtt::async_client client_;
mqtt::connect_options connOpts_;
mqtt::connect_options options;
Callback callback_;
std::string username_;
std::string password_;
@@ -35,4 +36,4 @@ private:
std::mutex messageMutex_;
};
#endif // MQTTCLIENT_H
#endif //MQTTCLIENT_H

View File

@@ -1,5 +1,5 @@
#include "MqttClient.h"
//example file for testing
int main(){
MqttClient client("mqtt://localhost:1883", "raspberry_pi_client", "ishak", "kobuki");
client.connect();

View File

@@ -1,79 +1,323 @@
#include <iostream>
#include <cmath>
#include <thread>
#include "KobukiDriver/graph.h"
#include "MQTT/MqttClient.h"
#include "KobukiDriver/CKobuki.h"
#include <opencv4/opencv2/opencv.hpp>
#include <opencv4/opencv2/core.hpp>
using namespace std;
using namespace cv;
CKobuki robot;
int movement();
std::string readMQTT();
MqttClient client("mqtt://localhost:1883", "KobukiRPI", "ishak", "kobuki");
void parseMQTT(std::string message);
void CapnSend();
//ip, clientID, username, password
MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi", "rpiwachtwoordofzo"); // create a client object
std::string message = "stop";
std::string serializeKobukiData(const TKobukiData &data);
void sendKobukiData(TKobukiData &data);
void setup()
{
unsigned char *null_ptr(0);
robot.startCommunication("/dev/ttyUSB0", true, null_ptr);
//connect to the MQTT server and subscribe to the commands topic
void setup(){
unsigned char *null_ptr(0);
robot.startCommunication("/dev/ttyUSB0", true, null_ptr);
client.connect();
client.subscribe("home/commands");
}
int main(){
setup();
while(true){
readMQTT();
}
client.run();
return 0;
int main()
{
setup();
std::thread image (CapnSend);
std::thread safety([&]() { robot.robotSafety(&message); });
std::thread sendMqtt([&]() { sendKobukiData(robot.parser.data); });
while(true){
std::string message = readMQTT();
if (!message.empty()){
parseMQTT(message);
}
}
sendMqtt.join();
safety.join();
image.join();
}
std::string readMQTT()
{
std::string message = client.getLastMessage();
if (!message.empty()) {
std::cout << "MQTT Message: " << message << std::endl;
static std::string lastMessage;
std::string message = client.getLastMessage();
if (!message.empty() && message != lastMessage)
{
std::cout << "MQTT Message: " << message << std::endl;
lastMessage = message;
}
// Add a small delay to avoid busy-waiting
std::this_thread::sleep_for(std::chrono::milliseconds(100));
return message;
}
void parseMQTT(std::string message)
{
if (message == "up")
{
robot.forward(350);
}
else if (message == "left")
{
robot.setRotationSpeed(4);
}
else if (message == "right")
{
robot.setRotationSpeed(-4);
}
else if (message == "down")
{
robot.forward(-350);
}
else if (message == "stop")
{
robot.sendNullMessage();
robot.sendNullMessage();
}
else if (message == "estop")
{
robot.forward(-400);
}
else
{
std::cout << "Invalid command" << std::endl;
}
}
void logToFile()
{
while (true)
{
TKobukiData robotData = robot.parser.data;
std::ofstream outputFile("log", std::ios_base::app); // Open the file in append mode so existing content is not overwritten
if (outputFile.is_open())
{ // check if the file was opened successfully
// Get current time
std::time_t now = std::time(nullptr);
outputFile << "Timestamp: " << std::ctime(&now);
// Write data to the file
outputFile << "analogInputCh0: " << robotData.analogInputCh0 << "\n";
outputFile << "analogInputCh1: " << robotData.analogInputCh1 << "\n";
outputFile << "analogInputCh2: " << robotData.analogInputCh2 << "\n";
outputFile << "analogInputCh3: " << robotData.analogInputCh3 << "\n";
outputFile << "digitalInput: " << robotData.digitalInput << "\n";
outputFile << "timestamp: " << robotData.timestamp << "\n";
outputFile << "BumperCenter: " << robotData.BumperCenter << "\n";
outputFile << "BumperLeft: " << robotData.BumperLeft << "\n";
outputFile << "BumperRight: " << robotData.BumperRight << "\n";
outputFile << "WheelDropLeft: " << robotData.WheelDropLeft << "\n";
outputFile << "WheelDropRight: " << robotData.WheelDropRight << "\n";
outputFile << "CliffCenter: " << robotData.CliffCenter << "\n";
outputFile << "CliffLeft: " << robotData.CliffLeft << "\n";
outputFile << "CliffRight: " << robotData.CliffRight << "\n";
outputFile << "EncoderLeft: " << robotData.EncoderLeft << "\n";
outputFile << "EncoderRight: " << robotData.EncoderRight << "\n";
outputFile << "PWMleft: " << robotData.PWMleft << "\n";
outputFile << "PWMright: " << robotData.PWMright << "\n";
outputFile << "ButtonPress1: " << robotData.ButtonPress1 << "\n";
outputFile << "ButtonPress2: " << robotData.ButtonPress2 << "\n";
outputFile << "ButtonPress3: " << robotData.ButtonPress3 << "\n";
outputFile << "Charger: " << robotData.Charger << "\n";
outputFile << "Battery: " << robotData.Battery << "\n";
outputFile << "overCurrent: " << robotData.overCurrent << "\n";
outputFile << "IRSensorRight: " << robotData.IRSensorRight << "\n";
outputFile << "IRSensorCenter: " << robotData.IRSensorCenter << "\n";
outputFile << "IRSensorLeft: " << robotData.IRSensorLeft << "\n";
outputFile << "GyroAngle: " << robotData.GyroAngle << "\n";
outputFile << "GyroAngleRate: " << robotData.GyroAngleRate << "\n";
outputFile << "CliffSensorRight: " << robotData.CliffSensorRight << "\n";
outputFile << "CliffSensorCenter: " << robotData.CliffSensorCenter << "\n";
outputFile << "CliffSensorLeft: " << robotData.CliffSensorLeft << "\n";
outputFile << "wheelCurrentLeft: " << robotData.wheelCurrentLeft << "\n";
outputFile << "wheelCurrentRight: " << robotData.wheelCurrentRight << "\n";
outputFile << "frameId: " << robotData.frameId << "\n";
outputFile << "HardwareVersionPatch: " << robotData.extraInfo.HardwareVersionPatch << "\n";
outputFile << "HardwareVersionMinor: " << robotData.extraInfo.HardwareVersionMinor << "\n";
outputFile << "HardwareVersionMajor: " << robotData.extraInfo.HardwareVersionMajor << "\n";
outputFile << "FirmwareVersionPatch: " << robotData.extraInfo.FirmwareVersionPatch << "\n";
outputFile << "FirmwareVersionMinor: " << robotData.extraInfo.FirmwareVersionMinor << "\n";
outputFile << "FirmwareVersionMajor: " << robotData.extraInfo.FirmwareVersionMajor << "\n";
outputFile << "UDID0: " << robotData.extraInfo.UDID0 << "\n";
outputFile << "UDID1: " << robotData.extraInfo.UDID1 << "\n";
outputFile << "UDID2: " << robotData.extraInfo.UDID2 << "\n";
outputFile.close();
}
else
{
std::cerr << "Error opening file\n";
}
// Add a small delay to avoid busy-waiting
std::this_thread::sleep_for(std::chrono::milliseconds(100));
return message;
std::this_thread::sleep_for(std::chrono::seconds(2)); // Sleep for 2 seconds
}
}
void sendIndividualKobukiData(const TKobukiData &data) {
while (true) {
client.publishMessage("kobuki/data/timestamp", std::to_string(data.timestamp));
client.publishMessage("kobuki/data/BumperCenter", std::to_string(data.BumperCenter));
client.publishMessage("kobuki/data/BumperLeft", std::to_string(data.BumperLeft));
client.publishMessage("kobuki/data/BumperRight", std::to_string(data.BumperRight));
client.publishMessage("kobuki/data/WheelDropLeft", std::to_string(data.WheelDropLeft));
client.publishMessage("kobuki/data/WheelDropRight", std::to_string(data.WheelDropRight));
client.publishMessage("kobuki/data/CliffCenter", std::to_string(data.CliffCenter));
client.publishMessage("kobuki/data/CliffLeft", std::to_string(data.CliffLeft));
client.publishMessage("kobuki/data/CliffRight", std::to_string(data.CliffRight));
client.publishMessage("kobuki/data/EncoderLeft", std::to_string(data.EncoderLeft));
client.publishMessage("kobuki/data/EncoderRight", std::to_string(data.EncoderRight));
client.publishMessage("kobuki/data/PWMleft", std::to_string(data.PWMleft));
client.publishMessage("kobuki/data/PWMright", std::to_string(data.PWMright));
client.publishMessage("kobuki/data/ButtonPress1", std::to_string(data.ButtonPress1));
client.publishMessage("kobuki/data/ButtonPress2", std::to_string(data.ButtonPress2));
client.publishMessage("kobuki/data/ButtonPress3", std::to_string(data.ButtonPress3));
client.publishMessage("kobuki/data/Charger", std::to_string(data.Charger));
client.publishMessage("kobuki/data/Battery", std::to_string(data.Battery));
client.publishMessage("kobuki/data/overCurrent", std::to_string(data.overCurrent));
client.publishMessage("kobuki/data/IRSensorRight", std::to_string(data.IRSensorRight));
client.publishMessage("kobuki/data/IRSensorCenter", std::to_string(data.IRSensorCenter));
client.publishMessage("kobuki/data/IRSensorLeft", std::to_string(data.IRSensorLeft));
client.publishMessage("kobuki/data/GyroAngle", std::to_string(data.GyroAngle));
client.publishMessage("kobuki/data/GyroAngleRate", std::to_string(data.GyroAngleRate));
client.publishMessage("kobuki/data/CliffSensorRight", std::to_string(data.CliffSensorRight));
client.publishMessage("kobuki/data/CliffSensorCenter", std::to_string(data.CliffSensorCenter));
client.publishMessage("kobuki/data/CliffSensorLeft", std::to_string(data.CliffSensorLeft));
client.publishMessage("kobuki/data/wheelCurrentLeft", std::to_string(data.wheelCurrentLeft));
client.publishMessage("kobuki/data/wheelCurrentRight", std::to_string(data.wheelCurrentRight));
client.publishMessage("kobuki/data/digitalInput", std::to_string(data.digitalInput));
client.publishMessage("kobuki/data/analogInputCh0", std::to_string(data.analogInputCh0));
client.publishMessage("kobuki/data/analogInputCh1", std::to_string(data.analogInputCh1));
client.publishMessage("kobuki/data/analogInputCh2", std::to_string(data.analogInputCh2));
client.publishMessage("kobuki/data/analogInputCh3", std::to_string(data.analogInputCh3));
client.publishMessage("kobuki/data/frameId", std::to_string(data.frameId));
client.publishMessage("kobuki/data/extraInfo/HardwareVersionPatch", std::to_string(data.extraInfo.HardwareVersionPatch));
client.publishMessage("kobuki/data/extraInfo/HardwareVersionMinor", std::to_string(data.extraInfo.HardwareVersionMinor));
client.publishMessage("kobuki/data/extraInfo/HardwareVersionMajor", std::to_string(data.extraInfo.HardwareVersionMajor));
client.publishMessage("kobuki/data/extraInfo/FirmwareVersionPatch", std::to_string(data.extraInfo.FirmwareVersionPatch));
client.publishMessage("kobuki/data/extraInfo/FirmwareVersionMinor", std::to_string(data.extraInfo.FirmwareVersionMinor));
client.publishMessage("kobuki/data/extraInfo/FirmwareVersionMajor", std::to_string(data.extraInfo.FirmwareVersionMajor));
client.publishMessage("kobuki/data/extraInfo/UDID0", std::to_string(data.extraInfo.UDID0));
client.publishMessage("kobuki/data/extraInfo/UDID1", std::to_string(data.extraInfo.UDID1));
client.publishMessage("kobuki/data/extraInfo/UDID2", std::to_string(data.extraInfo.UDID2));
int movement()
{
int text;
while (true)
{
cin >> text;
if (!data.gyroData.empty()) {
const auto& latestGyro = data.gyroData.back();
client.publishMessage("kobuki/data/gyroData/x", std::to_string(latestGyro.x));
client.publishMessage("kobuki/data/gyroData/y", std::to_string(latestGyro.y));
client.publishMessage("kobuki/data/gyroData/z", std::to_string(latestGyro.z));
}
if (text == 1)
{
robot.forward(1024, 1);
}
else if (text == 2)
{
// 1 is full circle
robot.doRotation(0.25);
}
else if (text == 3)
{
// Add your code here for text == 3
}
else
{
try
{
robot.doRotation(text);
throw "NaN";
}
catch (const char *msg)
{
cerr << msg << endl;
}
}
}
std::this_thread::sleep_for(std::chrono::milliseconds(1000));
}
}
std::string serializeKobukiData(const TKobukiData &data) {
std::string json = "{\"timestamp\":" + std::to_string(data.timestamp) +
",\"BumperCenter\":" + std::to_string(data.BumperCenter) +
",\"BumperLeft\":" + std::to_string(data.BumperLeft) +
",\"BumperRight\":" + std::to_string(data.BumperRight) +
",\"WheelDropLeft\":" + std::to_string(data.WheelDropLeft) +
",\"WheelDropRight\":" + std::to_string(data.WheelDropRight) +
",\"CliffCenter\":" + std::to_string(data.CliffCenter) +
",\"CliffLeft\":" + std::to_string(data.CliffLeft) +
",\"CliffRight\":" + std::to_string(data.CliffRight) +
",\"EncoderLeft\":" + std::to_string(data.EncoderLeft) +
",\"EncoderRight\":" + std::to_string(data.EncoderRight) +
",\"PWMleft\":" + std::to_string(data.PWMleft) +
",\"PWMright\":" + std::to_string(data.PWMright) +
",\"ButtonPress1\":" + std::to_string(data.ButtonPress1) +
",\"ButtonPress2\":" + std::to_string(data.ButtonPress2) +
",\"ButtonPress3\":" + std::to_string(data.ButtonPress3) +
",\"Charger\":" + std::to_string(data.Charger) +
",\"Battery\":" + std::to_string(data.Battery) +
",\"overCurrent\":" + std::to_string(data.overCurrent) +
",\"IRSensorRight\":" + std::to_string(data.IRSensorRight) +
",\"IRSensorCenter\":" + std::to_string(data.IRSensorCenter) +
",\"IRSensorLeft\":" + std::to_string(data.IRSensorLeft) +
",\"GyroAngle\":" + std::to_string(data.GyroAngle) +
",\"GyroAngleRate\":" + std::to_string(data.GyroAngleRate) +
",\"CliffSensorRight\":" + std::to_string(data.CliffSensorRight) +
",\"CliffSensorCenter\":" + std::to_string(data.CliffSensorCenter) +
",\"CliffSensorLeft\":" + std::to_string(data.CliffSensorLeft) +
",\"wheelCurrentLeft\":" + std::to_string(data.wheelCurrentLeft) +
",\"wheelCurrentRight\":" + std::to_string(data.wheelCurrentRight) +
",\"digitalInput\":" + std::to_string(data.digitalInput) +
",\"analogInputCh0\":" + std::to_string(data.analogInputCh0) +
",\"analogInputCh1\":" + std::to_string(data.analogInputCh1) +
",\"analogInputCh2\":" + std::to_string(data.analogInputCh2) +
",\"analogInputCh3\":" + std::to_string(data.analogInputCh3) +
",\"frameId\":" + std::to_string(data.frameId) +
",\"extraInfo\":{\"HardwareVersionPatch\":" + std::to_string(data.extraInfo.HardwareVersionPatch) +
",\"HardwareVersionMinor\":" + std::to_string(data.extraInfo.HardwareVersionMinor) +
",\"HardwareVersionMajor\":" + std::to_string(data.extraInfo.HardwareVersionMajor) +
",\"FirmwareVersionPatch\":" + std::to_string(data.extraInfo.FirmwareVersionPatch) +
",\"FirmwareVersionMinor\":" + std::to_string(data.extraInfo.FirmwareVersionMinor) +
",\"FirmwareVersionMajor\":" + std::to_string(data.extraInfo.FirmwareVersionMajor) +
",\"UDID0\":" + std::to_string(data.extraInfo.UDID0) +
",\"UDID1\":" + std::to_string(data.extraInfo.UDID1) +
",\"UDID2\":" + std::to_string(data.extraInfo.UDID2) + "},\"gyroData\":[";
if (!data.gyroData.empty()) {
const auto& latestGyro = data.gyroData.back();
json += "{\"x\":" + std::to_string(latestGyro.x) +
",\"y\":" + std::to_string(latestGyro.y) +
",\"z\":" + std::to_string(latestGyro.z) + "}";
}
json += "]}";
return json;
}
//extra function that publishes the data periodically (every second)
//split out so it can run in its own thread
void sendKobukiData(TKobukiData &data) {
while (true) {
client.publishMessage("kobuki/data", serializeKobukiData(data));
std::cout << "Sent data" << std::endl;
std::this_thread::sleep_for(std::chrono::milliseconds(1000));
}
}
void CapnSend() {
int fps = 15;
int width = 800;
int height = 600;
VideoCapture cap("/dev/video0");
if (!cap.isOpened()) {
cerr << "Error: Could not open camera" << endl;
return;
}
VideoWriter out("appsrc ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast bitrate=600 key-int-max=" + to_string(fps * 2) + " ! video/x-h264,profile=baseline ! mpegtsmux ! udpsink host=127.0.0.1 port=5001",
CAP_GSTREAMER, 0, fps, Size(width, height), true);
if (!out.isOpened()) {
cerr << "Error: Can't open video writer" << endl;
return;
}
Mat frame;
while (true) {
cap >> frame; // Capture a new image frame
if (frame.empty()) {
cerr << "Error: Could not capture image" << endl;
continue;
}
// Write the frame to the UDP stream
out.write(frame);
cout << "Sent image" << endl;
std::this_thread::sleep_for(std::chrono::milliseconds(1000 / fps)); // Control the frame rate
}
}
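CapnSend() pushes H.264 in an MPEG-TS container over UDP to 127.0.0.1:5001. A minimal sketch of a matching receiver, assuming an OpenCV build with GStreamer support and the standard udpsrc/tsdemux/avdec_h264 plugins installed (not part of this repository):
#include <opencv2/opencv.hpp>
#include <iostream>
int main() {
    // Pipeline mirrors the sender: MPEG-TS over UDP on port 5001, H.264 inside
    cv::VideoCapture in("udpsrc port=5001 caps=\"video/mpegts,systemstream=true\" ! "
                        "tsdemux ! h264parse ! avdec_h264 ! videoconvert ! appsink",
                        cv::CAP_GSTREAMER);
    if (!in.isOpened()) {
        std::cerr << "Error: can't open UDP stream" << std::endl;
        return 1;
    }
    cv::Mat frame;
    while (in.read(frame)) {
        cv::imshow("Kobuki stream", frame);
        if (cv::waitKey(1) == 'q') break;
    }
    return 0;
}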

View File

@@ -1,15 +0,0 @@
cmake_minimum_required(VERSION 3.10)
set(CMAKE_CXX_STANDARD 23)
# Find the Paho MQTT C++ library
find_library(PAHO_MQTTPP_LIBRARY paho-mqttpp3 PATHS /usr/local/lib)
find_library(PAHO_MQTT_LIBRARY paho-mqtt3a PATHS /usr/local/lib)
# Include the headers
include_directories(/usr/local/include)
# Add the executable
add_executable(my_program main.cpp)
# Link the libraries
target_link_libraries(my_program ${PAHO_MQTTPP_LIBRARY} ${PAHO_MQTT_LIBRARY})

View File

@@ -1,56 +0,0 @@
#include <iostream>
#include <mqtt/async_client.h>
#include <thread> // For std::this_thread::sleep_for
#include <chrono> // For std::chrono::seconds
const std::string ADDRESS("mqtt://localhost:1883"); // Adjust if necessary
const std::string CLIENT_ID("raspberry_pi_client");
const std::string TOPIC("home/commands");
class callback : public virtual mqtt::callback {
void message_arrived(mqtt::const_message_ptr msg) override {
std::cout << "Ontvangen bericht: '" << msg->get_topic()
<< "' : " << msg->to_string() << std::endl;
// Do something with the message, for example trigger a GPIO action
}
void connection_lost(const std::string& cause) override {
std::cerr << "Verbinding verloren. Oorzaak: " << cause << std::endl;
}
void delivery_complete(mqtt::delivery_token_ptr token) override {
std::cout << "Bericht afgeleverd!" << std::endl;
}
};
int main() {
mqtt::async_client client(ADDRESS, CLIENT_ID);
callback cb;
client.set_callback(cb);
mqtt::connect_options connOpts;
connOpts.set_clean_session(true);
connOpts.set_user_name("ishak");
connOpts.set_password("kobuki");
connOpts.set_mqtt_version(MQTTVERSION_3_1_1); // For MQTT 3.1.1
try {
std::cout << "Verbinden met broker..." << std::endl;
client.connect(connOpts)->wait();
std::cout << "Verbonden!" << std::endl;
std::cout << "Abonneren op topic: " << TOPIC << std::endl;
client.subscribe(TOPIC, 1)->wait();
// Keep the client running so it keeps receiving messages
while (true) {
std::this_thread::sleep_for(std::chrono::seconds(1)); // Wait to reduce CPU usage
}
} catch (const mqtt::exception &exc) {
std::cerr << "Fout: " << exc.what() << std::endl;
return 1;
}
return 0;
}

View File

@@ -0,0 +1,44 @@
cmake_minimum_required( VERSION 3.6 )
# Require C++23 (or later)
set( CMAKE_CXX_STANDARD 23 )
set( CMAKE_CXX_STANDARD_REQUIRED ON )
set( CMAKE_CXX_EXTENSIONS OFF )
set(BUILD_MODE Debug)
# Create Project
project( Sample )
add_executable( YOLOv4 util.h main.cpp )
# Set StartUp Project
set_property( DIRECTORY PROPERTY VS_STARTUP_PROJECT "YOLOv4" )
# Find Package
# OpenCV
find_package( OpenCV REQUIRED )
if( OpenCV_FOUND )
# Additional Include Directories
include_directories( ${OpenCV_INCLUDE_DIRS} )
# Additional Dependencies
target_link_libraries( YOLOv4 ${OpenCV_LIBS} )
endif()
# Download Model
set( MODEL https://github.com/AlexeyAB/darknet/releases/download/darknet_yolo_v3_optimal/yolov4.weights )
file( DOWNLOAD
"${MODEL}"
"${CMAKE_CURRENT_LIST_DIR}/yolov4.weights"
EXPECTED_HASH SHA256=e8a4f6c62188738d86dc6898d82724ec0964d0eb9d2ae0f0a9d53d65d108d562
SHOW_PROGRESS
)
# Download Config
set( CONFIG https://raw.githubusercontent.com/AlexeyAB/darknet/master/cfg/yolov4.cfg )
file( DOWNLOAD
"${CONFIG}"
"${CMAKE_CURRENT_LIST_DIR}/yolov4.cfg"
EXPECTED_HASH SHA256=a6d0f8e5c62cc8378384f75a8159b95fa2964d4162e33351b00ac82e0fc46a34
SHOW_PROGRESS
)

src/C++/OpenCV/YOLOv4 Executable file (binary file not shown)

src/C++/OpenCV/coco.names Normal file
View File

@@ -0,0 +1,80 @@
person
bicycle
car
motorbike
aeroplane
bus
train
truck
boat
traffic light
fire hydrant
stop sign
parking meter
bench
bird
cat
dog
horse
sheep
cow
elephant
bear
zebra
giraffe
backpack
umbrella
handbag
tie
suitcase
frisbee
skis
snowboard
sports ball
kite
baseball bat
baseball glove
skateboard
surfboard
tennis racket
bottle
wine glass
cup
fork
knife
spoon
bowl
banana
apple
sandwich
orange
broccoli
carrot
hot dog
pizza
donut
cake
chair
sofa
pottedplant
bed
diningtable
toilet
tvmonitor
laptop
mouse
remote
keyboard
cell phone
microwave
oven
toaster
sink
refrigerator
book
clock
vase
scissors
teddy bear
hair drier
toothbrush

src/C++/OpenCV/main.cpp Normal file
View File

@@ -0,0 +1,209 @@
#include <iostream>
#include <string>
#include <vector>
#include <opencv2/opencv.hpp>
#include <opencv2/dnn.hpp>
#include <filesystem>
#include <fstream>
#include "util.h"
// Helper function to check if a file exists
bool fileExists(const std::string &path)
{
return std::filesystem::exists(path);
}
// Function to read class names from a file
std::vector<std::string> _readClassNameList(const std::string &path)
{
std::vector<std::string> classes;
// Check if file exists
if (!fileExists(path))
{
throw std::runtime_error("Class names file not found: " + path);
}
// Try to open and read file
std::ifstream file(path);
if (!file.is_open())
{
throw std::runtime_error("Unable to open class names file: " + path);
}
std::string line;
while (std::getline(file, line))
{
if (!line.empty())
{
classes.push_back(line);
}
}
if (classes.empty())
{
throw std::runtime_error("No classes found in file: " + path);
}
return classes;
}
int main(int argc, char *argv[])
{
try
{
// Open Video Capture
cv::VideoCapture capture = cv::VideoCapture(0);
if (!capture.isOpened())
{
std::cerr << "Failed to open camera device" << std::endl;
return -1;
}
// Read Class Name List and Color Table
const std::string list = "coco.names";
const std::vector<std::string> classes = _readClassNameList(list);
const std::vector<cv::Scalar> colors = getClassColors(classes.size());
// Debug: Print the size of the colors vector
std::cout << "Number of colors: " << colors.size() << std::endl;
// Read Darknet
const std::string model = "yolov4.weights";
const std::string config = "yolov4.cfg";
cv::dnn::Net net = cv::dnn::readNet(model, config);
if (net.empty())
{
std::cerr << "Failed to load network" << std::endl;
return -1;
}
// Set Preferable Backend
net.setPreferableBackend(cv::dnn::DNN_BACKEND_OPENCV);
// Set Preferable Target
net.setPreferableTarget(cv::dnn::DNN_TARGET_OPENCL);
while (true)
{
// Read Frame
cv::Mat frame;
capture >> frame;
if (frame.empty())
{
cv::waitKey(0);
break;
}
if (frame.channels() == 4)
{
cv::cvtColor(frame, frame, cv::COLOR_BGRA2BGR);
}
// Create Blob from Input Image
cv::Mat blob = cv::dnn::blobFromImage(frame, 1 / 255.f, cv::Size(416, 416), cv::Scalar(), true, false);
// Set Input Blob
net.setInput(blob);
// Run Forward Network
std::vector<cv::Mat> detections;
net.forward(detections, getOutputsNames(net));
// Draw Region
std::vector<int32_t> class_ids;
std::vector<float> confidences;
std::vector<cv::Rect> rectangles;
for (cv::Mat &detection : detections)
{
if (detection.empty())
{
std::cerr << "Detection matrix is empty!" << std::endl;
continue;
}
for (int32_t i = 0; i < detection.rows; i++)
{
cv::Mat region = detection.row(i);
// Retrieve Max Confidence and Class Index
cv::Mat scores = region.colRange(5, detection.cols);
cv::Point class_id;
double confidence;
cv::minMaxLoc(scores, 0, &confidence, 0, &class_id);
// Check Confidence
constexpr float threshold = 0.2;
if (threshold > confidence)
{
continue;
}
// Retrieve Object Position
const int32_t x_center = static_cast<int32_t>(region.at<float>(0) * frame.cols);
const int32_t y_center = static_cast<int32_t>(region.at<float>(1) * frame.rows);
const int32_t width = static_cast<int32_t>(region.at<float>(2) * frame.cols);
const int32_t height = static_cast<int32_t>(region.at<float>(3) * frame.rows);
const cv::Rect rectangle = cv::Rect(x_center - (width / 2), y_center - (height / 2), width, height);
// Add Class ID, Confidence, Rectangle
class_ids.push_back(class_id.x);
confidences.push_back(confidence);
rectangles.push_back(rectangle);
}
}
// Remove Overlap Rectangles using Non-Maximum Suppression
constexpr float confidence_threshold = 0.5; // Confidence
constexpr float nms_threshold = 0.5; // IoU (Intersection over Union)
std::vector<int32_t> indices;
cv::dnn::NMSBoxes(rectangles, confidences, confidence_threshold, nms_threshold, indices);
// Draw Rectangle
for (const int32_t &index : indices)
{
// Bounds checking
if (class_ids[index] >= colors.size())
{
std::cerr << "Color index out of bounds: " << class_ids[index] << " (max: " << colors.size() - 1 << ")" << std::endl;
continue;
}
const cv::Rect rectangle = rectangles[index];
const cv::Scalar color = colors[class_ids[index]];
// Debug: Print the index and color
std::cout << "Drawing rectangle with color index: " << class_ids[index] << std::endl;
constexpr int32_t thickness = 3;
cv::rectangle(frame, rectangle, color, thickness);
std::string label = classes[class_ids[index]] + ": " + std::to_string(static_cast<int>(confidences[index] * 100)) + "%";
int baseLine;
cv::Size labelSize = cv::getTextSize(label, cv::FONT_HERSHEY_SIMPLEX, 0.5, 1, &baseLine);
int top = std::max(rectangle.y, labelSize.height);
cv::rectangle(frame, cv::Point(rectangle.x, top - labelSize.height),
cv::Point(rectangle.x + labelSize.width, top + baseLine), color, cv::FILLED);
cv::putText(frame, label, cv::Point(rectangle.x, top), cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(255, 255, 255), 1);
}
// Show Image
cv::imshow("Object Detection", frame);
const int32_t key = cv::waitKey(1);
if (key == 'q')
{
break;
}
}
cv::destroyAllWindows();
return 0;
}
catch (const std::exception &e)
{
std::cerr << "Error: " << e.what() << std::endl;
return -1;
}
}
// cloned and fixed from https://github.com/UnaNancyOwen/OpenCVDNNSample/tree/master

src/C++/OpenCV/util.h Normal file
View File

@@ -0,0 +1,61 @@
#ifndef __UTIL__
#define __UTIL__
#include <vector>
#include <string>
#include <fstream>
#include <opencv2/dnn.hpp>
#include <opencv2/core.hpp>
#include <opencv2/highgui.hpp>
// Get Output Layers Name
std::vector<std::string> getOutputsNames( const cv::dnn::Net& net )
{
static std::vector<std::string> names;
if( names.empty() ){
std::vector<int32_t> out_layers = net.getUnconnectedOutLayers();
std::vector<std::string> layers_names = net.getLayerNames();
names.resize( out_layers.size() );
for( size_t i = 0; i < out_layers.size(); ++i ){
names[i] = layers_names[out_layers[i] - 1];
}
}
return names;
}
// Get Output Layer Type
std::string getOutputLayerType( cv::dnn::Net& net )
{
const std::vector<int32_t> out_layers = net.getUnconnectedOutLayers();
const std::string output_layer_type = net.getLayer( out_layers[0] )->type;
return output_layer_type;
}
// Read Class Name List
std::vector<std::string> readClassNameList( const std::string list_path )
{
std::vector<std::string> classes;
std::ifstream ifs( list_path );
if( !ifs.is_open() ){
return classes;
}
std::string class_name = "";
while( std::getline( ifs, class_name ) ){
classes.push_back( class_name );
}
return classes;
}
// Get Class Color Table for Visualize
std::vector<cv::Scalar> getClassColors( const int32_t number_of_colors )
{
cv::RNG random;
std::vector<cv::Scalar> colors;
for( int32_t i = 0; i < number_of_colors; i++ ){
cv::Scalar color( random.uniform( 0, 255 ), random.uniform( 0, 255 ), random.uniform( 0, 255 ) );
colors.push_back( color );
}
return colors;
}
#endif // __UTIL__

src/C++/OpenCV/yolov4.cfg Normal file

File diff suppressed because it is too large.

Binary file not shown.

View File

View File

@@ -0,0 +1,503 @@
/*
TinyGPS++ - a small GPS library for Arduino providing universal NMEA parsing
Based on work by and "distanceBetween" and "courseTo" courtesy of Maarten Lamers.
Suggestion to add satellites, courseTo(), and cardinal() by Matt Monson.
Location precision improvements suggested by Wayne Holder.
Copyright (C) 2008-2013 Mikal Hart
All rights reserved.
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 2.1 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
*/
#include "TinyGPS++.h"
#include <string.h>
#include <ctype.h>
#include <stdlib.h>
#define _GPRMCterm "GPRMC"
#define _GPGGAterm "GPGGA"
#define _GNRMCterm "GNRMC"
#define _GNGGAterm "GNGGA"
TinyGPSPlus::TinyGPSPlus()
: parity(0)
, isChecksumTerm(false)
, curSentenceType(GPS_SENTENCE_OTHER)
, curTermNumber(0)
, curTermOffset(0)
, sentenceHasFix(false)
, customElts(0)
, customCandidates(0)
, encodedCharCount(0)
, sentencesWithFixCount(0)
, failedChecksumCount(0)
, passedChecksumCount(0)
{
term[0] = '\0';
}
//
// public methods
//
bool TinyGPSPlus::encode(char c)
{
++encodedCharCount;
switch(c)
{
case ',': // term terminators
parity ^= (uint8_t)c;
case '\r':
case '\n':
case '*':
{
bool isValidSentence = false;
if (curTermOffset < sizeof(term))
{
term[curTermOffset] = 0;
isValidSentence = endOfTermHandler();
}
++curTermNumber;
curTermOffset = 0;
isChecksumTerm = c == '*';
return isValidSentence;
}
break;
case '$': // sentence begin
curTermNumber = curTermOffset = 0;
parity = 0;
curSentenceType = GPS_SENTENCE_OTHER;
isChecksumTerm = false;
sentenceHasFix = false;
return false;
default: // ordinary characters
if (curTermOffset < sizeof(term) - 1)
term[curTermOffset++] = c;
if (!isChecksumTerm)
parity ^= c;
return false;
}
return false;
}
//
// internal utilities
//
int TinyGPSPlus::fromHex(char a)
{
if (a >= 'A' && a <= 'F')
return a - 'A' + 10;
else if (a >= 'a' && a <= 'f')
return a - 'a' + 10;
else
return a - '0';
}
// static
// Parse a (potentially negative) number with up to 2 decimal digits -xxxx.yy
int32_t TinyGPSPlus::parseDecimal(const char *term)
{
bool negative = *term == '-';
if (negative) ++term;
int32_t ret = 100 * (int32_t)atol(term);
while (isdigit(*term)) ++term;
if (*term == '.' && isdigit(term[1]))
{
ret += 10 * (term[1] - '0');
if (isdigit(term[2]))
ret += term[2] - '0';
}
return negative ? -ret : ret;
}
// static
// Parse degrees in that funny NMEA format DDMM.MMMM
void TinyGPSPlus::parseDegrees(const char *term, RawDegrees &deg)
{
uint32_t leftOfDecimal = (uint32_t)atol(term);
uint16_t minutes = (uint16_t)(leftOfDecimal % 100);
uint32_t multiplier = 10000000UL;
uint32_t tenMillionthsOfMinutes = minutes * multiplier;
deg.deg = (int16_t)(leftOfDecimal / 100);
while (isdigit(*term))
++term;
if (*term == '.')
while (isdigit(*++term))
{
multiplier /= 10;
tenMillionthsOfMinutes += (*term - '0') * multiplier;
}
deg.billionths = (5 * tenMillionthsOfMinutes + 1) / 3;
deg.negative = false;
}
#define COMBINE(sentence_type, term_number) (((unsigned)(sentence_type) << 5) | term_number)
// Processes a just-completed term
// Returns true if new sentence has just passed checksum test and is validated
bool TinyGPSPlus::endOfTermHandler()
{
// If it's the checksum term, and the checksum checks out, commit
if (isChecksumTerm)
{
byte checksum = 16 * fromHex(term[0]) + fromHex(term[1]);
if (checksum == parity)
{
passedChecksumCount++;
if (sentenceHasFix)
++sentencesWithFixCount;
switch(curSentenceType)
{
case GPS_SENTENCE_GPRMC:
date.commit();
time.commit();
if (sentenceHasFix)
{
location.commit();
speed.commit();
course.commit();
}
break;
case GPS_SENTENCE_GPGGA:
time.commit();
if (sentenceHasFix)
{
location.commit();
altitude.commit();
}
satellites.commit();
hdop.commit();
break;
}
// Commit all custom listeners of this sentence type
for (TinyGPSCustom *p = customCandidates; p != NULL && strcmp(p->sentenceName, customCandidates->sentenceName) == 0; p = p->next)
p->commit();
return true;
}
else
{
++failedChecksumCount;
}
return false;
}
// the first term determines the sentence type
if (curTermNumber == 0)
{
if (!strcmp(term, _GPRMCterm) || !strcmp(term, _GNRMCterm))
curSentenceType = GPS_SENTENCE_GPRMC;
else if (!strcmp(term, _GPGGAterm) || !strcmp(term, _GNGGAterm))
curSentenceType = GPS_SENTENCE_GPGGA;
else
curSentenceType = GPS_SENTENCE_OTHER;
// Any custom candidates of this sentence type?
for (customCandidates = customElts; customCandidates != NULL && strcmp(customCandidates->sentenceName, term) < 0; customCandidates = customCandidates->next);
if (customCandidates != NULL && strcmp(customCandidates->sentenceName, term) > 0)
customCandidates = NULL;
return false;
}
if (curSentenceType != GPS_SENTENCE_OTHER && term[0])
switch(COMBINE(curSentenceType, curTermNumber))
{
case COMBINE(GPS_SENTENCE_GPRMC, 1): // Time in both sentences
case COMBINE(GPS_SENTENCE_GPGGA, 1):
time.setTime(term);
break;
case COMBINE(GPS_SENTENCE_GPRMC, 2): // GPRMC validity
sentenceHasFix = term[0] == 'A';
break;
case COMBINE(GPS_SENTENCE_GPRMC, 3): // Latitude
case COMBINE(GPS_SENTENCE_GPGGA, 2):
location.setLatitude(term);
break;
case COMBINE(GPS_SENTENCE_GPRMC, 4): // N/S
case COMBINE(GPS_SENTENCE_GPGGA, 3):
location.rawNewLatData.negative = term[0] == 'S';
break;
case COMBINE(GPS_SENTENCE_GPRMC, 5): // Longitude
case COMBINE(GPS_SENTENCE_GPGGA, 4):
location.setLongitude(term);
break;
case COMBINE(GPS_SENTENCE_GPRMC, 6): // E/W
case COMBINE(GPS_SENTENCE_GPGGA, 5):
location.rawNewLngData.negative = term[0] == 'W';
break;
case COMBINE(GPS_SENTENCE_GPRMC, 7): // Speed (GPRMC)
speed.set(term);
break;
case COMBINE(GPS_SENTENCE_GPRMC, 8): // Course (GPRMC)
course.set(term);
break;
case COMBINE(GPS_SENTENCE_GPRMC, 9): // Date (GPRMC)
date.setDate(term);
break;
case COMBINE(GPS_SENTENCE_GPGGA, 6): // Fix data (GPGGA)
sentenceHasFix = term[0] > '0';
break;
case COMBINE(GPS_SENTENCE_GPGGA, 7): // Satellites used (GPGGA)
satellites.set(term);
break;
case COMBINE(GPS_SENTENCE_GPGGA, 8): // HDOP
hdop.set(term);
break;
case COMBINE(GPS_SENTENCE_GPGGA, 9): // Altitude (GPGGA)
altitude.set(term);
break;
}
// Set custom values as needed
for (TinyGPSCustom *p = customCandidates; p != NULL && strcmp(p->sentenceName, customCandidates->sentenceName) == 0 && p->termNumber <= curTermNumber; p = p->next)
if (p->termNumber == curTermNumber)
p->set(term);
return false;
}
/* static */
double TinyGPSPlus::distanceBetween(double lat1, double long1, double lat2, double long2)
{
// returns distance in meters between two positions, both specified
// as signed decimal-degrees latitude and longitude. Uses great-circle
// distance computation for hypothetical sphere of radius 6372795 meters.
// Because Earth is no exact sphere, rounding errors may be up to 0.5%.
// Courtesy of Maarten Lamers
double delta = radians(long1-long2);
double sdlong = sin(delta);
double cdlong = cos(delta);
lat1 = radians(lat1);
lat2 = radians(lat2);
double slat1 = sin(lat1);
double clat1 = cos(lat1);
double slat2 = sin(lat2);
double clat2 = cos(lat2);
delta = (clat1 * slat2) - (slat1 * clat2 * cdlong);
delta = sq(delta);
delta += sq(clat2 * sdlong);
delta = sqrt(delta);
double denom = (slat1 * slat2) + (clat1 * clat2 * cdlong);
delta = atan2(delta, denom);
return delta * 6372795;
}
double TinyGPSPlus::courseTo(double lat1, double long1, double lat2, double long2)
{
// returns course in degrees (North=0, West=270) from position 1 to position 2,
// both specified as signed decimal-degrees latitude and longitude.
// Because Earth is no exact sphere, calculated course may be off by a tiny fraction.
// Courtesy of Maarten Lamers
double dlon = radians(long2-long1);
lat1 = radians(lat1);
lat2 = radians(lat2);
double a1 = sin(dlon) * cos(lat2);
double a2 = sin(lat1) * cos(lat2) * cos(dlon);
a2 = cos(lat1) * sin(lat2) - a2;
a2 = atan2(a1, a2);
if (a2 < 0.0)
{
a2 += TWO_PI;
}
return degrees(a2);
}
const char *TinyGPSPlus::cardinal(double course)
{
static const char* directions[] = {"N", "NNE", "NE", "ENE", "E", "ESE", "SE", "SSE", "S", "SSW", "SW", "WSW", "W", "WNW", "NW", "NNW"};
int direction = (int)((course + 11.25f) / 22.5f);
return directions[direction % 16];
}
void TinyGPSLocation::commit()
{
rawLatData = rawNewLatData;
rawLngData = rawNewLngData;
lastCommitTime = millis();
valid = updated = true;
}
void TinyGPSLocation::setLatitude(const char *term)
{
TinyGPSPlus::parseDegrees(term, rawNewLatData);
}
void TinyGPSLocation::setLongitude(const char *term)
{
TinyGPSPlus::parseDegrees(term, rawNewLngData);
}
double TinyGPSLocation::lat()
{
updated = false;
double ret = rawLatData.deg + rawLatData.billionths / 1000000000.0;
return rawLatData.negative ? -ret : ret;
}
double TinyGPSLocation::lng()
{
updated = false;
double ret = rawLngData.deg + rawLngData.billionths / 1000000000.0;
return rawLngData.negative ? -ret : ret;
}
void TinyGPSDate::commit()
{
date = newDate;
lastCommitTime = millis();
valid = updated = true;
}
void TinyGPSTime::commit()
{
time = newTime;
lastCommitTime = millis();
valid = updated = true;
}
void TinyGPSTime::setTime(const char *term)
{
newTime = (uint32_t)TinyGPSPlus::parseDecimal(term);
}
void TinyGPSDate::setDate(const char *term)
{
newDate = atol(term);
}
uint16_t TinyGPSDate::year()
{
updated = false;
uint16_t year = date % 100;
return year + 2000;
}
uint8_t TinyGPSDate::month()
{
updated = false;
return (date / 100) % 100;
}
uint8_t TinyGPSDate::day()
{
updated = false;
return date / 10000;
}
uint8_t TinyGPSTime::hour()
{
updated = false;
return time / 1000000;
}
uint8_t TinyGPSTime::minute()
{
updated = false;
return (time / 10000) % 100;
}
uint8_t TinyGPSTime::second()
{
updated = false;
return (time / 100) % 100;
}
uint8_t TinyGPSTime::centisecond()
{
updated = false;
return time % 100;
}
void TinyGPSDecimal::commit()
{
val = newval;
lastCommitTime = millis();
valid = updated = true;
}
void TinyGPSDecimal::set(const char *term)
{
newval = TinyGPSPlus::parseDecimal(term);
}
void TinyGPSInteger::commit()
{
val = newval;
lastCommitTime = millis();
valid = updated = true;
}
void TinyGPSInteger::set(const char *term)
{
newval = atol(term);
}
TinyGPSCustom::TinyGPSCustom(TinyGPSPlus &gps, const char *_sentenceName, int _termNumber)
{
begin(gps, _sentenceName, _termNumber);
}
void TinyGPSCustom::begin(TinyGPSPlus &gps, const char *_sentenceName, int _termNumber)
{
lastCommitTime = 0;
updated = valid = false;
sentenceName = _sentenceName;
termNumber = _termNumber;
memset(stagingBuffer, '\0', sizeof(stagingBuffer));
memset(buffer, '\0', sizeof(buffer));
// Insert this item into the GPS tree
gps.insertCustom(this, _sentenceName, _termNumber);
}
void TinyGPSCustom::commit()
{
strcpy(this->buffer, this->stagingBuffer);
lastCommitTime = millis();
valid = updated = true;
}
void TinyGPSCustom::set(const char *term)
{
strncpy(this->stagingBuffer, term, sizeof(this->stagingBuffer));
}
void TinyGPSPlus::insertCustom(TinyGPSCustom *pElt, const char *sentenceName, int termNumber)
{
TinyGPSCustom **ppelt;
for (ppelt = &this->customElts; *ppelt != NULL; ppelt = &(*ppelt)->next)
{
int cmp = strcmp(sentenceName, (*ppelt)->sentenceName);
if (cmp < 0 || (cmp == 0 && termNumber < (*ppelt)->termNumber))
break;
}
pElt->next = *ppelt;
*ppelt = pElt;
}

View File

@@ -0,0 +1,278 @@
/*
TinyGPS++ - a small GPS library for Arduino providing universal NMEA parsing
Based on work by and "distanceBetween" and "courseTo" courtesy of Maarten Lamers.
Suggestion to add satellites, courseTo(), and cardinal() by Matt Monson.
Location precision improvements suggested by Wayne Holder.
Copyright (C) 2008-2022 Mikal Hart
All rights reserved.
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 2.1 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
*/
#ifndef __TinyGPSPlus_h
#define __TinyGPSPlus_h
#if defined(ARDUINO) && ARDUINO >= 100
#include "Arduino.h"
#else
#include "WProgram.h"
#endif
#include <limits.h>
#define _GPS_VERSION "1.0.3" // software version of this library
#define _GPS_MPH_PER_KNOT 1.15077945
#define _GPS_MPS_PER_KNOT 0.51444444
#define _GPS_KMPH_PER_KNOT 1.852
#define _GPS_MILES_PER_METER 0.00062137112
#define _GPS_KM_PER_METER 0.001
#define _GPS_FEET_PER_METER 3.2808399
#define _GPS_MAX_FIELD_SIZE 15
struct RawDegrees
{
uint16_t deg;
uint32_t billionths;
bool negative;
public:
RawDegrees() : deg(0), billionths(0), negative(false)
{}
};
struct TinyGPSLocation
{
friend class TinyGPSPlus;
public:
bool isValid() const { return valid; }
bool isUpdated() const { return updated; }
uint32_t age() const { return valid ? millis() - lastCommitTime : (uint32_t)ULONG_MAX; }
const RawDegrees &rawLat() { updated = false; return rawLatData; }
const RawDegrees &rawLng() { updated = false; return rawLngData; }
double lat();
double lng();
TinyGPSLocation() : valid(false), updated(false)
{}
private:
bool valid, updated;
RawDegrees rawLatData, rawLngData, rawNewLatData, rawNewLngData;
uint32_t lastCommitTime;
void commit();
void setLatitude(const char *term);
void setLongitude(const char *term);
};
struct TinyGPSDate
{
friend class TinyGPSPlus;
public:
bool isValid() const { return valid; }
bool isUpdated() const { return updated; }
uint32_t age() const { return valid ? millis() - lastCommitTime : (uint32_t)ULONG_MAX; }
uint32_t value() { updated = false; return date; }
uint16_t year();
uint8_t month();
uint8_t day();
TinyGPSDate() : valid(false), updated(false), date(0)
{}
private:
bool valid, updated;
uint32_t date, newDate;
uint32_t lastCommitTime;
void commit();
void setDate(const char *term);
};
struct TinyGPSTime
{
friend class TinyGPSPlus;
public:
bool isValid() const { return valid; }
bool isUpdated() const { return updated; }
uint32_t age() const { return valid ? millis() - lastCommitTime : (uint32_t)ULONG_MAX; }
uint32_t value() { updated = false; return time; }
uint8_t hour();
uint8_t minute();
uint8_t second();
uint8_t centisecond();
TinyGPSTime() : valid(false), updated(false), time(0)
{}
private:
bool valid, updated;
uint32_t time, newTime;
uint32_t lastCommitTime;
void commit();
void setTime(const char *term);
};
struct TinyGPSDecimal
{
friend class TinyGPSPlus;
public:
bool isValid() const { return valid; }
bool isUpdated() const { return updated; }
uint32_t age() const { return valid ? millis() - lastCommitTime : (uint32_t)ULONG_MAX; }
int32_t value() { updated = false; return val; }
TinyGPSDecimal() : valid(false), updated(false), val(0)
{}
private:
bool valid, updated;
uint32_t lastCommitTime;
int32_t val, newval;
void commit();
void set(const char *term);
};
struct TinyGPSInteger
{
friend class TinyGPSPlus;
public:
bool isValid() const { return valid; }
bool isUpdated() const { return updated; }
uint32_t age() const { return valid ? millis() - lastCommitTime : (uint32_t)ULONG_MAX; }
uint32_t value() { updated = false; return val; }
TinyGPSInteger() : valid(false), updated(false), val(0)
{}
private:
bool valid, updated;
uint32_t lastCommitTime;
uint32_t val, newval;
void commit();
void set(const char *term);
};
struct TinyGPSSpeed : TinyGPSDecimal
{
double knots() { return value() / 100.0; }
double mph() { return _GPS_MPH_PER_KNOT * value() / 100.0; }
double mps() { return _GPS_MPS_PER_KNOT * value() / 100.0; }
double kmph() { return _GPS_KMPH_PER_KNOT * value() / 100.0; }
};
struct TinyGPSCourse : public TinyGPSDecimal
{
double deg() { return value() / 100.0; }
};
struct TinyGPSAltitude : TinyGPSDecimal
{
double meters() { return value() / 100.0; }
double miles() { return _GPS_MILES_PER_METER * value() / 100.0; }
double kilometers() { return _GPS_KM_PER_METER * value() / 100.0; }
double feet() { return _GPS_FEET_PER_METER * value() / 100.0; }
};
struct TinyGPSHDOP : TinyGPSDecimal
{
double hdop() { return value() / 100.0; }
};
class TinyGPSPlus;
class TinyGPSCustom
{
public:
TinyGPSCustom() {};
TinyGPSCustom(TinyGPSPlus &gps, const char *sentenceName, int termNumber);
void begin(TinyGPSPlus &gps, const char *_sentenceName, int _termNumber);
bool isUpdated() const { return updated; }
bool isValid() const { return valid; }
uint32_t age() const { return valid ? millis() - lastCommitTime : (uint32_t)ULONG_MAX; }
const char *value() { updated = false; return buffer; }
private:
void commit();
void set(const char *term);
char stagingBuffer[_GPS_MAX_FIELD_SIZE + 1];
char buffer[_GPS_MAX_FIELD_SIZE + 1];
unsigned long lastCommitTime;
bool valid, updated;
const char *sentenceName;
int termNumber;
friend class TinyGPSPlus;
TinyGPSCustom *next;
};
class TinyGPSPlus
{
public:
TinyGPSPlus();
bool encode(char c); // process one character received from GPS
TinyGPSPlus &operator << (char c) {encode(c); return *this;}
TinyGPSLocation location;
TinyGPSDate date;
TinyGPSTime time;
TinyGPSSpeed speed;
TinyGPSCourse course;
TinyGPSAltitude altitude;
TinyGPSInteger satellites;
TinyGPSHDOP hdop;
static const char *libraryVersion() { return _GPS_VERSION; }
static double distanceBetween(double lat1, double long1, double lat2, double long2);
static double courseTo(double lat1, double long1, double lat2, double long2);
static const char *cardinal(double course);
static int32_t parseDecimal(const char *term);
static void parseDegrees(const char *term, RawDegrees &deg);
uint32_t charsProcessed() const { return encodedCharCount; }
uint32_t sentencesWithFix() const { return sentencesWithFixCount; }
uint32_t failedChecksum() const { return failedChecksumCount; }
uint32_t passedChecksum() const { return passedChecksumCount; }
private:
enum {GPS_SENTENCE_GPGGA, GPS_SENTENCE_GPRMC, GPS_SENTENCE_OTHER};
// parsing state variables
uint8_t parity;
bool isChecksumTerm;
char term[_GPS_MAX_FIELD_SIZE];
uint8_t curSentenceType;
uint8_t curTermNumber;
uint8_t curTermOffset;
bool sentenceHasFix;
// custom element support
friend class TinyGPSCustom;
TinyGPSCustom *customElts;
TinyGPSCustom *customCandidates;
void insertCustom(TinyGPSCustom *pElt, const char *sentenceName, int index);
// statistics
uint32_t encodedCharCount;
uint32_t sentencesWithFixCount;
uint32_t failedChecksumCount;
uint32_t passedChecksumCount;
// internal utilities
int fromHex(char a);
bool endOfTermHandler();
};
#endif // def(__TinyGPSPlus_h)
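A minimal Arduino-style sketch of how the API above is typically used: feed NMEA characters into encode() and read the decoded fields once they are updated. The serial port (Serial1) and the 9600 baud rate are assumptions, adjust for the actual wiring.
#include "TinyGPS++.h"
TinyGPSPlus gps;
void setup() {
    Serial.begin(115200);   // debug output
    Serial1.begin(9600);    // GPS module on a hardware serial port (assumed)
}
void loop() {
    while (Serial1.available()) {
        gps.encode(Serial1.read());          // feed NMEA characters one at a time
    }
    if (gps.location.isUpdated()) {
        Serial.print("Lat: ");  Serial.print(gps.location.lat(), 6);
        Serial.print(" Lng: "); Serial.println(gps.location.lng(), 6);
    }
}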

View File

@@ -0,0 +1,26 @@
/*
TinyGPSPlus - a small GPS library for Arduino providing universal NMEA parsing
Based on work by and "distanceBetween" and "courseTo" courtesy of Maarten Lamers.
Suggestion to add satellites, courseTo(), and cardinal() by Matt Monson.
Location precision improvements suggested by Wayne Holder.
Copyright (C) 2008-2013 Mikal Hart
All rights reserved.
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 2.1 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA
*/
#ifndef __TinyGPSPlus_h
#include "TinyGPS++.h"
#endif

src/Python/YOLO/app.py Normal file
View File

@@ -0,0 +1,41 @@
from ultralytics import YOLO
import cv2
import numpy as np
import requests
import time
model = YOLO("yolo11n.pt")
#try to fetch the image from the given url
def fetch_image(url):
try:
response = requests.get(url)
response.raise_for_status()
image_array = np.frombuffer(response.content, np.uint8)
image = cv2.imdecode(image_array, cv2.IMREAD_COLOR)
return image
except requests.RequestException as e:
print(f"Error: Could not fetch image - {e}")
return None
# URL of the photostream
url = "http://145.92.224.21/image"
while True:
frame = fetch_image(url)
if frame is None:
print("Error: Could not fetch image, retrying...")
time.sleep(1) # Wait for 1 second before retrying
continue
# Predict on the frame
results = model(frame)
# Display the results
results[0].show()
# Exit if 'q' is pressed
if cv2.waitKey(1) & 0xFF == ord('q'):
break
cv2.destroyAllWindows()

View File

@@ -0,0 +1 @@
__pycache__

View File

@@ -0,0 +1,18 @@
FROM python:3.9
WORKDIR /app
COPY . .
RUN apt-get update && apt-get install -y libgl1
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "web/app.py"]
#build instruction: sudo docker buildx build -t flaskapp:latest .
#run instruction: sudo docker run --network="host" flaskapp:latest
# need to use network host to connect to the host's mqtt server

View File

@@ -0,0 +1,5 @@
Flask==3.1.0
paho-mqtt==1.6.1
ultralytics==8.3.58
opencv-python-headless==4.6.0.66
numpy==1.23.4

View File

@@ -1,32 +1,109 @@
from flask import Flask, request, render_template, jsonify
from flask import Flask, Response, request, render_template, jsonify
import paho.mqtt.client as mqtt
from ultralytics import YOLO
import cv2
import numpy as np
import threading
app = Flask(__name__)
# Load a model
model = YOLO("yolo11n.pt") # pretrained YOLO11n model
kobuki_message = ""
latest_image = None
processed_image = None
yolo_results = []
# Lock for thread-safe access to shared variables
lock = threading.Lock()
# List of class names (example for COCO dataset)
yolo_classes = list(model.names.values())
def on_message(client, userdata, message):
global kobuki_message, latest_image, processed_image, yolo_results
if message.topic == "kobuki/data":
kobuki_message = str(message.payload.decode("utf-8"))
elif message.topic == "kobuki/cam":
with lock: # Lock the shared variables so threads can't access them at the same time and you can't end up with half-processed images
latest_image = np.frombuffer(message.payload, np.uint8)
latest_image = cv2.imdecode(latest_image, cv2.IMREAD_COLOR)
# Process the image with YOLO
results = model(latest_image)
yolo_results = []
processed_image = latest_image.copy() # Create a copy for processing
for result in results:
for box in result.boxes:
class_id = int(box.cls.item())
class_name = yolo_classes[class_id]
yolo_results.append({
"class": class_name,
"confidence": box.conf.item(),
"bbox": box.xyxy.tolist()
})
# Draw bounding box on the processed image
x1, y1, x2, y2 = map(int, box.xyxy[0])
cv2.rectangle(processed_image, (x1, y1), (x2, y2), (0, 255, 0), 2)
cv2.putText(processed_image, f"{class_name} {box.conf.item():.2f}", (x1, y1 - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
# Create an MQTT client instance
mqtt_client = mqtt.Client()
mqtt_client.username_pw_set("ishak", "kobuki")
mqtt_client.connect("localhost", 1883, 60)
mqtt_client.username_pw_set("server", "serverwachtwoordofzo")
mqtt_client.connect("localhost", 1884, 60)
mqtt_client.loop_start()
mqtt_client.subscribe("kobuki/data")
mqtt_client.subscribe("kobuki/cam")
@app.route('/', methods=["POST"])
mqtt_client.on_message = on_message # this assignment must come after the function definition, otherwise the callback can't be resolved
@app.route('/')
def index():
return render_template('index.html')
@app.route('/control', methods=["GET", "POST"])
def control():
if request.authorization and request.authorization.username == 'ishak' and request.authorization.password == 'kobuki':
return render_template('control.html')
else:
return ('Unauthorized', 401, {'WWW-Authenticate': 'Basic realm="Login Required"'})
@app.route('/move', methods=['POST'])
def move():
# Get the direction from the form data
direction = request.form['direction']
# Publish the direction to the MQTT topic "home/commands"
result = mqtt_client.publish("home/commands", direction)
# Check if the publish was successful
if result.rc == mqtt.MQTT_ERR_SUCCESS:
return jsonify({"message": "Bericht succesvol gepubliceerd"}), 200
else:
return jsonify({"message": "Fout bij het publiceren van bericht"}), 500
data = request.get_json()
direction = data.get("direction")
# Send the direction via MQTT
if direction:
mqtt_client.publish("home/commands", direction) # The topic can be adjusted if needed
return jsonify({"status": "success", "direction": direction})
@app.route('/data', methods=['GET'])
def data():
return kobuki_message
@app.route('/image')
def image():
global processed_image
with lock: # Lock the shared variables so threads can't access them at the same time and you can't end up with half-processed images
if processed_image is not None:
_, buffer = cv2.imencode('.jpg', processed_image)
return Response(buffer.tobytes(), mimetype='image/jpeg')
else:
return "No image available", 404
@app.route('/yolo_results', methods=['GET'])
def yolo_results_endpoint():
global yolo_results
with lock:
return jsonify(yolo_results)
# Run the Flask application in debug mode
if __name__ == '__main__':
app.run(debug=True)
app.run(debug=True, port=5000)

Binary file not shown.

(image file added, 200 KiB; not shown)

View File

@@ -1,14 +1,61 @@
window.onload = function () {
const form = document.getElementById("form");
form.onclick = function (event) {
event.preventDefault();
fetch("/move", {
method: "GET",
})
.then((response) => {})
.then(() => {
fetch
document.addEventListener("DOMContentLoaded", function() {
document.querySelectorAll(".btn").forEach(button => {
button.addEventListener("click", function(event) {
event.preventDefault(); // prevents page refresh
// Get the value of the button
const direction = event.target.value;
fetch("/move", {
method: "POST",
headers: {
"Content-Type": "application/json"
},
body: JSON.stringify({ direction: direction })
})
.then(response => response.json())
.then(data => {
console.log("Success:", data);
})
.catch(error => {
console.error("Error:", error);
});
};
};
});
});
// Fetch data from the server
async function fetchData() {
try {
const response = await fetch("/data");
const data = await response.json();
return data;
} catch (error) {
console.error("Error:", error);
}
}
// Parse the data and show it on the website
async function parseData() {
const data = await fetchData();
const sensorDataContainer = document.getElementById("sensor-data");
sensorDataContainer.innerHTML = ""; // Clear previous data
// For each key/value pair in the JSON response, create a paragraph element and append it to sensorDataContainer
for (const [key, value] of Object.entries(data)) {
const dataElement = document.createElement("p");
dataElement.textContent = `${key}: ${value}`;
sensorDataContainer.appendChild(dataElement);
}
}
// Update the image
function updateImage() {
var img = document.getElementById("robot-image");
img.src = "/image?" + new Date().getTime(); // Add timestamp to avoid caching
}
// Fetch and display sensor data every 1 second
setInterval(parseData, 1000);
// Update the image every 100 milliseconds
setInterval(updateImage, 100);
});

View File

@@ -1,8 +1,8 @@
body {
font-family: "Poppins", sans-serif;
text-align: -webkit-center;
margin: 0;
padding: 0;
}
/* This is my code for my navbar */
@@ -23,97 +23,148 @@ body {
right: 0%;
}
.footer{
display: flex;
justify-content: space-between;
max-width: 80%;
background-color: #fff;
border: 1px solid #f0f0f0;
border-radius: 50px;
align-items: center;
margin: 1.5rem auto 0 auto;
padding: 0 30px;
top: 0%;
bottom: auto;
left: 0%;
right: 0%;
}
.imgNav {
height: 50px;
border-radius: 20px;
}
.connectButton {
border-radius: 10px;
height: 100%;
width: 100px;
box-shadow: none;
border: none;
font-size: 1rem;
height: 40px;
background-color: #b3ffb3;
}
/* end navbar */
.container {
display: flex;
justify-content: space-around;
align-items: center;
margin-top: 50px;
width: 80%;
background-color: white;
border-radius: 20px;
box-shadow: 0px 8px 16px rgba(0, 0, 0, 0.2);
padding: 40px;
}
.button-section {
position: relative;
width: 150px;
height: 150px;
}
.btn {
position: absolute;
background-color: #007bff;
color: white;
border: none;
border-radius: 50%;
width: 60px;
height: 60px;
font-size: 1.2em;
text-align: center;
line-height: 60px;
cursor: pointer;
transition: transform 0.2s ease, background-color 0.2s ease;
}
/* Middle button */
.text{
width: 50%;
}
.image{
height: 100%;
}
.sectionHeight{
height: 200px;
}
/* Direction buttons */
.btn:nth-child(1) {
/* Left */
top: 50%;
left: 50%;
transform: translate(-160%, -50%);
}
/* Top button */
.btn:nth-child(2) {
/* Up */
top: 0;
left: 50%;
transform: translate(-50%, -35%);
}
/* Right button */
.btn:nth-child(3) {
/* Right */
top: 50%;
right: 0;
transform: translate(35%, -50%);
}
/* Bottom button */
.btn:nth-child(4) {
/* Down */
bottom: 0;
left: 50%;
transform: translate(-50%, 35%);
}
/* Stop button (center) */
.btn:nth-child(5) {
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
background-color: red; /* Distinct color for the stop button */
width: 60px; /* Slightly larger for emphasis */
height: 60px; /* Slightly larger for emphasis */
line-height: 60px; /* Center text vertically */
}
/* Hover effects */
.btn:hover {
/* transform: scale(1.1); */
background-color: #0056b3;
}
.btn:active {
background-color: #004494;
}
.stop-button:hover {
background-color: darkred; /* Different hover color for the stop button */
}
table {
width: 100%;
border-collapse: collapse;
}
th,td {
border: 1px solid #ddd;
padding: 8px;
}
th {
background-color: #f2f2f2;
text-align: left;
}

View File

@@ -0,0 +1,73 @@
body {
font-family: Arial, sans-serif;
margin: 0;
padding: 0;
background-color: #f4f4f4;
}
header {
background-color: #333;
color: #fff;
padding: 1rem 0;
text-align: center;
}
header h1 {
margin: 0;
}
nav ul {
list-style: none;
padding: 0;
}
nav ul li {
display: inline;
margin: 0 1rem;
}
nav ul li a {
color: #fff;
text-decoration: none;
}
section {
padding: 2rem;
margin: 1rem 0;
background-color: #fff;
border-radius: 8px;
box-shadow: 0 0 10px rgba(0, 0, 0, 0.1);
}
section h2 {
margin-top: 0;
}
form {
display: flex;
flex-direction: column;
}
form label {
margin: 0.5rem 0 0.2rem;
}
form input, form textarea {
padding: 0.5rem;
margin-bottom: 1rem;
border: 1px solid #ccc;
border-radius: 4px;
}
form button {
padding: 0.7rem;
border: none;
border-radius: 4px;
background-color: #333;
color: #fff;
cursor: pointer;
}
form button:hover {
background-color: #555;
}

View File

@@ -8,8 +8,9 @@
{% block head %}{% endblock %}
</head>
<body>
{% include 'navbar.html' %}
{% block content %}
{% endblock %}
</body>
</html>

View File

@@ -0,0 +1,51 @@
{% extends 'base.html' %}
{% block head %}
<link rel="stylesheet" href="../static/style.css" />
{% endblock %}
{% block content %}
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Kobuki</title>
<link rel="stylesheet" href="../static/style.css" />
</head>
<body>
<div class="container">
<div class="robot-image">
<img src="/image" alt="Kobuki Camera Feed" id="robot-image" />
</div>
<div class="button-section">
<form id="form" action="/move" method="post">
<button class="btn" name="direction" value="left"></button>
<button class="btn" name="direction" value="up"></button>
<button class="btn" name="direction" value="right"></button>
<button class="btn" name="direction" value="down"></button>
<button class="btn stop-button" name="direction" value="stop">
Stop
</button>
</form>
</div>
</div>
<div class="container">
<h1>Sensor Data</h1>
<div class="data">
<table id="sensor-data"> <!-- Do not change -->
<thead>
<tr>
<th>Sensor</th>
<th>Value</th>
</tr>
</thead>
<tbody>
<!-- Sensor data rows will be inserted here -->
</tbody>
</table>
</div>
</div>
<script src="../static/script.js"></script>
</body>
</html>
{% endblock %}

View File

@@ -0,0 +1,21 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Document</title>
<link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
<footer class="footer">
<img src="{{url_for('static', filename='images/logo_kobuki.png')}}" alt="logo" class="imgNav" />
<h3>© 2024 Kobuki Robot Project. All rights reserved.</h3>
<div class="buttonContainer">
<a href="{{ url_for('control') }}" target="_blank">
<button class="click connectButton">Controller</button>
</a>
</div>
</footer>
</body>
</html>

View File

@@ -1,30 +1,58 @@
{% extends 'base.html' %} {% block head %}
<link rel="stylesheet" href="../static/style.css" />
{% endblock %} {% block content %}
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Kobuki Robot Project</title>
<link rel="stylesheet" href="../static/style.css" />
</head>
<body>
<div class="container">
<div class="image-section">
<img src="kobuki.jpg" alt="Kobuki Robot" id="robot-image" />
</div>
<div class="button-section">
<form id = "form" action="/" method="post">
<button class="btn" name="direction" value="left"></button>
<button class="btn" name="direction" value="up"></button>
<button class="btn" name="direction" value="right"></button>
<button class="btn" name="direction" value="down"></button>
</form>
</div>
</div>
<div class="container">
<h1>Sensor Data</h1>
</div>
<script href="../static/script.js"></script>
<section class="container sectionHeight">
<p class="text">
The Kobuki Robot Project is an innovative initiative aimed at developing
a versatile and intelligent robot platform. Our goal is to create a
robot that can navigate autonomously, interact with its environment, and
perform various tasks.
</p>
<img src="{{url_for('static', filename='images/logo.png')}}" alt="logo" class="image" />
</section>
<section class="container sectionHeight" id="about">
<h2>About the Project</h2>
<p>
This project is a collaborative effort involving engineers, researchers,
and enthusiasts. The Kobuki robot is equipped with various sensors,
including bumpers, cliff sensors, and gyroscopes, to help it navigate
and interact with its surroundings.
</p>
<p>Key features of the Kobuki Robot:</p>
<ul>
<li>Autonomous navigation</li>
<li>Obstacle detection and avoidance</li>
<li>Real-time data processing</li>
<li>Remote control and monitoring</li>
</ul>
</section>
<section class="container" id="contact">
<h2>Contact Us</h2>
<form id="contact-form" action="/contact" method="post">
<label for="name">Name:</label>
<input type="text" id="name" name="name" required />
<label for="email">Email:</label>
<input type="email" id="email" name="email" required />
<label for="message">Message:</label>
<textarea id="message" name="message" required></textarea>
<button type="submit">Send</button>
</form>
</section>
{% include 'footer.html' %}
<script src="static/script.js"></script>
</body>
</html>
{% endblock %}

View File

@@ -11,13 +11,9 @@
<img src="{{url_for('static', filename='images/logo_kobuki.png')}}" alt="logo" class="imgNav" />
<h3>Kobuki</h3>
<div class="buttonContainer">
<a href="{{ url_for('control') }}" target="_blank">
<button class="click connectButton">Controller</button>
</a>
<!-- <a href="./signup.html">sign in</a> -->
</div>
</nav>


View File

@@ -0,0 +1,13 @@
[Unit]
Description=kobukiDriver
After=network.target
[Service]
User=user1
WorkingDirectory=/home/user1/rooziinuubii79/src/C++/Driver/
ExecStart=/home/user1/rooziinuubii79/src/C++/Driver/kobuki_control
Restart=always
RestartSec=5
[Install]
WantedBy=multi-user.target

View File

@@ -0,0 +1,7 @@
allow_anonymous false
password_file /etc/mosquitto/passwordfile
listener 8080
protocol websockets
listener 1884
protocol mqtt
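A minimal client-side sketch of what this broker configuration implies, assuming the paho-mqtt client already used by the Flask app (the host, ports and credentials below mirror other files in this repo and are illustrative, not authoritative): anonymous connections are rejected, plain MQTT clients connect on the 1884 listener, and WebSocket clients on the 8080 listener.

import paho.mqtt.client as mqtt

# Plain MQTT listener (1884); the credentials must exist in /etc/mosquitto/passwordfile.
client = mqtt.Client()
client.username_pw_set("server", "serverwachtwoordofzo")
client.connect("localhost", 1884, 60)

# WebSocket listener (8080), e.g. for browser-style clients behind the nginx proxy.
ws_client = mqtt.Client(transport="websockets")
ws_client.username_pw_set("server", "serverwachtwoordofzo")
ws_client.connect("localhost", 8080, 60)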

View File

@@ -0,0 +1,22 @@
server {
listen 80;
server_name 145.92.224.21;
# Proxy WebSocket connections for MQTT
location /ws/ {
proxy_pass http://localhost:9001;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host $host;
}
# Proxy HTTP connections for Flask
location / {
proxy_pass http://localhost:5000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}

View File

@@ -0,0 +1,7 @@
stream {
server {
listen 9001;
proxy_pass localhost:8080;
}
}

View File

@@ -0,0 +1,11 @@
# Feedback expert review
Try to formulate the Definition of Done yourselves.
Write better user stories.
# Feedback peer review
- possible coaching on the technical side and on task division
- more clarity about what we are going to use, which techniques and tools; this makes it possible to match people to tasks so that everyone has contributed something
- more clarity about each other's strengths and learning goals, so you can better estimate who can do what and who can learn what

View File

@@ -0,0 +1,42 @@
# Main question and sub-questions
**What is the motivation?**
The motivation is the need for secure communication between devices. This is important because unsecured data transfer can lead to data breaches.
**What is the problem/need, and what is the evidence for it?**
The problem is that data sent between devices can be vulnerable. This became clear after we were told that communication between devices had not been handled properly.
**Who has the problem/need?**
Our group, but also, for example, large companies where it is essential that data is sent securely without falling into the wrong hands.
**When did the problem/need arise?**
The problem arose after we were told that communication between devices had not been handled properly.
**Why is it a problem?**
It is a problem because unsecured communication can lead to data breaches, including privacy breaches, which can get companies into trouble.
**Where does the problem/need occur (scope)?**
The problem occurs in every sector where data is sent between devices: in healthcare, in industry, in offices, but also in IoT projects you can have at home.
## Main question
Which communication protocol makes it possible to communicate securely and reliably between IoT devices?
## Sub-questions
1. What does secure and reliable communication between devices entail?
2. Which protocols exist for communicating securely and reliably between devices?
3. What are the advantages and disadvantages of the different protocols?
## Sources
- Singh, S., & Jyoti. (2024, June 7). Secure Communications Protocols for IoT networks: a survey. https://journal.ijprse.com/index.php/ijprse/article/view/1082
- Nguyen, K. T., Laurent, M., Oualha, N., CEA, & Institut Mines-Telecom. (2015). Survey on secure communication protocols for the Internet of Things. In Ad Hoc Networks (Vol. 32, pp. 17-31) [Journal-article]. http://dx.doi.org/10.1016/j.adhoc.2015.01.006
- Miorandi, D., Sicari, S., De Pellegrini, F., & Imrich Chlamtac. (2012). Internet of things: Vision, applications and research challenges. In Ad Hoc Networks (Vol. 10, pp. 1497-1516) [Journal-article]. Elsevier B.V. http://dx.doi.org/10.1016/j.adhoc.2012.02.016
- Christiano, P. (2023, November 5). Top 9 IoT communication protocols & their features in 2024: An In-Depth guide - ExpertBeacon. Expertbeacon. https://expertbeacon.com/iot-communication-protocol/
- Yugha, R., & Chithra, S. (2020). A survey on technologies and security protocols: Reference for future generation IoT. Journal of Network and Computer Applications, 169, 102763. https://doi.org/10.1016/j.jnca.2020.102763
- De Mendizábal, I. (2022, June 16). IoT Communication Protocols—IoT Data Protocols. Technical Articles. https://www.allaboutcircuits.com/technical-articles/internet-of-things-communication-protocols-iot-data-protocols/
- IoT-technologieën en -protocollen | Microsoft Azure. (n.d.). https://azure.microsoft.com/nl-nl/solutions/iot/iot-technology-protocols
- Het IoT verbinden: wat is MQTT en waarin verschilt het van CoAP? (n.d.). https://www.onlogic.com/nl/blog/het-iot-verbinden-wat-is-mqtt-en-waarin-verschilt-het-van-coap/
- Nader, K. (2023, October 30). Wat zijn de voordelen van het gebruik van WebSocket voor IoT-communicatie? AppMaster - Ultimate All-in No-code Platform. https://appmaster.io/nl/blog/websocket-voor-iot-communicatie
- Sidna, J., Amine, B., Abdallah, N., & Alami, H. E. (2020). Analysis and evaluation of communication Protocols for IoT Applications. Karbala International Journal of Modern Science. https://doi.org/10.1145/3419604.3419754


View File

@@ -0,0 +1,11 @@
# kobuki
Last sprint I worked on the website. My job was to build a controller on the website for the Kobuki, so that we can drive the Kobuki from the website. I did this using a protocol called MQTT: a protocol for sending messages between devices, which is different from the protocol I used last year. It was a bit difficult to get the Kobuki working with the website. Next sprint I will work on a sensor and research the different sensors we can use.
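As a rough illustration of the MQTT-based controller described above (a sketch under assumptions, reusing the "home/commands" topic and broker settings that appear elsewhere in this repo, not the project's exact code):

import paho.mqtt.client as mqtt

# Connect to the broker the website uses and publish a single driving command.
client = mqtt.Client()
client.username_pw_set("server", "serverwachtwoordofzo")
client.connect("localhost", 1884, 60)
client.loop_start()
# The web buttons send values such as "left", "up", "right", "down" and "stop";
# they end up on the "home/commands" topic, where the Kobuki driver can pick them up.
client.publish("home/commands", "stop")
client.loop_stop()
client.disconnect()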