70 Commits

Author SHA1 Message Date
c9d3b0f795 Merge branch 'main' into 'OpenCV'
# Conflicts:
#   src/Python/flask/web/app.py
#   src/Python/flask/web/static/script.js
2024-12-12 13:48:21 +01:00
85af15d7a3 change default camera 2024-12-12 13:28:43 +01:00
a1b50a3780 changes to video settings 2024-12-12 13:27:06 +01:00
b86528595e change camera 2024-12-11 16:51:01 +01:00
eef4f9c79c revert video format change 2024-12-11 16:39:45 +01:00
3c23d37be1 change video format 2024-12-11 16:37:01 +01:00
c2886d32c9 use libcamera with picam 2024-12-11 16:30:14 +01:00
8158c85d6e use astra backend 2024-12-11 16:12:16 +01:00
e682969ec8 code revert 2024-12-11 16:07:26 +01:00
0dfc3b5c13 attempt with gstreamer 2024-12-11 15:43:05 +01:00
7f786d5197 change camera logic 2024-12-11 15:37:31 +01:00
60ba177dc2 add pipeline for picam 2024-12-11 15:34:53 +01:00
e9f998b3e7 set V4L2 backend 2024-12-11 15:28:21 +01:00
7eeaba482e removed attempt for camera detection 2024-12-11 14:50:02 +01:00
e8db00120f update video camera logic 2024-12-11 14:47:29 +01:00
c65f310e81 cleanup 2024-12-11 14:46:58 +01:00
ec3e83ef7f changed ip adress and cmakelist 2024-12-11 14:35:42 +01:00
480d36393a update website so it shows image 2024-12-10 13:29:58 +01:00
fea0f19857 update ip adress 2024-12-10 13:29:50 +01:00
e1135dac0f update cmakelist 2024-12-10 13:13:45 +01:00
2f4e5ae096 re enable robot communication 2024-12-09 10:31:01 +01:00
ishak jmilou.ishak
606506e40c update review feedback with additional documentation and planning notes 2024-12-03 13:37:44 +01:00
ishak jmilou.ishak
6efd95fb32 feedback sprint 4 2024-12-03 13:34:43 +01:00
ishak jmilou.ishak
5eff7fccba fix 2024-12-03 12:45:18 +01:00
ishak jmilou.ishak
a03894e52e changed MQTT connection port from 1884 to 80 2024-12-03 12:18:02 +01:00
9e07a243ea receive images from mqtt server and display on endpoint 2024-12-03 12:06:12 +01:00
b93a5f2dca added mosquitto conf 2024-12-02 14:00:29 +01:00
911b870786 remove unused library 2024-12-02 13:59:27 +01:00
dd2a1b56c4 changed port 2024-12-02 12:58:14 +00:00
dd39bd3021 fixed mqtt and sockets and reverse proxy after 5 hours 2024-12-02 13:44:15 +01:00
ishak jmilou.ishak
1563528b67 changed file location 2024-12-02 13:32:45 +01:00
ishak jmilou.ishak
2e5af52ba8 changed mqtt port to 8080 2024-12-02 12:45:26 +01:00
ishak jmilou.ishak
eb04d35d40 changed path 2024-12-02 10:13:42 +01:00
ishak jmilou.ishak
80fcb1ccc3 daily stand up 02/12/2024 2024-12-02 09:13:34 +01:00
ishak jmilou.ishak
62cdf98098 testing phpmyadmin 2024-11-30 15:18:01 +01:00
ishak jmilou.ishak
db6fa156c9 Update WSGI path to point to the correct application directory 2024-11-28 13:21:24 +01:00
ishak jmilou.ishak
048790ec8b Add WSGI entry point for the application 2024-11-28 13:16:03 +01:00
8aa54805ac Grabbed existing progam off github and repaired it 2024-11-27 21:25:48 +01:00
ishak jmilou.ishak
aca6644c02 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-26 14:55:47 +01:00
ishak jmilou.ishak
492f506aa2 daily stand up-/ hoofdvraag en deelvragen 2024-11-26 14:55:45 +01:00
d26d277c3c driver cleanup 2024-11-26 13:32:14 +01:00
c76ba93e82 code req 2024-11-26 13:30:38 +01:00
508d2ed4e2 added base OpenCV script and documentation 2024-11-25 11:46:24 +01:00
3e202acc8d gitignore update 2024-11-25 11:46:07 +01:00
0bfba0bffe Added comments 2024-11-24 16:42:34 +01:00
ishak jmilou.ishak
8a5b349040 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-20 17:14:55 +01:00
ishak jmilou.ishak
a41ea1b70c styling website 2024-11-20 17:14:54 +01:00
eb804c888c updated credentials to secure mqtt 2024-11-20 16:29:22 +01:00
a028a6f88f change mqtt credentials 2024-11-20 16:07:39 +01:00
47b29a1c55 added both json and idividual topic sending in cpp 2024-11-20 15:49:40 +01:00
528de4f3f4 code conventions 2024-11-20 15:49:40 +01:00
ishak jmilou.ishak
c16ba3cf9d busy with website redesign 2024-11-20 15:38:56 +01:00
ishak jmilou.ishak
82c4381143 daily stand up 19/11/2024 2024-11-20 11:45:54 +01:00
ishak jmilou.ishak
25b1fa8c35 changed locations of different files 2024-11-18 13:07:02 +01:00
ishak jmilou.ishak
cc9aefa424 made admonition work 2024-11-18 09:18:49 +01:00
ishak jmilou.ishak
3d95479840 fixed drop down 2024-11-18 09:15:56 +01:00
ishak jmilou.ishak
b2a24779f5 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-18 09:13:36 +01:00
ishak jmilou.ishak
97efd7d6e1 started with daily 18-11 2024-11-18 09:13:35 +01:00
Mees Roelofsz
ef52dbefe4 Merge branch 'main' of ssh://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-projectten/rooziinuubii79 2024-11-13 16:06:57 +01:00
Mees Roelofsz
f24b88bd68 fixed markdown 2024-11-13 16:06:53 +01:00
ishak jmilou.ishak
f9767965a1 wrote comments 2024-11-13 15:01:15 +01:00
ishak jmilou.ishak
a6b1b3bd1e removed space 2024-11-13 14:20:14 +01:00
ishak jmilou.ishak
c9efba62d4 testing vpn 2024-11-13 13:41:40 +01:00
661fdb9d26 Added placeholder webite for main website 2024-11-12 14:30:45 +01:00
0e30854b51 fix html and added password to enter page 2024-11-12 14:13:43 +01:00
ishak jmilou.ishak
194920bdad refert to older version to check if this is the problem 2024-11-06 15:08:47 +01:00
ishak jmilou.ishak
a67f5238b6 Enhance MQTT message handling and add data fetching in Flask app 2024-11-06 14:44:12 +01:00
ishak jmilou.ishak
d534940370 Refactor safety checks and improve message handling in CKobuki class 2024-11-06 14:06:15 +01:00
89b608b759 comments and cleanup 2024-11-06 14:01:39 +01:00
6dba1d0262 added sensordata to website 2024-11-06 13:59:40 +01:00
51 changed files with 2317 additions and 392 deletions

3
.gitignore vendored

@@ -13,7 +13,7 @@ src/Socket/a.out
src/C++/Driver/cmake_install.cmake
src/C++/Socket/a.out
src/C++/Driver/Makefile
src/C++/Driver/vgcore*
vgcore*
src/C++/Driver/cmake_install.cmake
src/C++/Driver/Makefile
src/C++/Driver/log
@@ -31,3 +31,4 @@ CMakeFiles/
Makefile
CMakeCache.txt
cmake_install.cmake
src/C++/OpenCV/main


@@ -19,14 +19,9 @@ Acceptatie criteria zijn specifieke eisen waaraan de User Story moet voldoen. De
- [ ] Acceptance criterion 2
- [ ] ...
**Definition of Done**
**Definition of Done: Hardware**
- [ ] All acceptance criteria of the user story have been ticked off.
- [ ] You have worked according to the HBO-ICT work standards (Agile, GitLab, sprint boards, sprint planning, HBO-ICT conventions, etc.)
- [ ] The work is technically documented in English and relevant for fellow developers (e.g. ERD, UML, tests and test results).
- [ ] The learning process is described in standard Dutch.
- [ ] The work has been reviewed by a peer.
- [ ] The UX/UI part of the application follows the Think-Make-Check (TMC) principle.
- [ ] The code has been functionally tested for errors.
- [ ] The code works without errors under normal use.
- [ ] The web application must be usable on both mobile and desktop devices.
- [ ] Your work is documented.
- [ ] You have carried out tests.
- [ ]


@@ -1,2 +0,0 @@
# Feedback expert review

20
docs/code/OpenCV.md Normal file

@@ -0,0 +1,20 @@
# OpenCV
## Requirements
The camera should detect what is happening in the video feed and identify what it sees, so that potential dangers can be recognised.
## Issues
* OpenCL does not pick up the GPU (see the check below)
* Solution: https://github.com/Smorodov/Multitarget-tracker/issues/93
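A quick way to verify whether OpenCL actually sees a GPU is a small probe like the sketch below. This assumes OpenCV was built with OpenCL support; it is only a diagnostic aid, not part of the detection code itself.

```cpp
#include <iostream>
#include <opencv2/core/ocl.hpp>

int main() {
    // Report whether the OpenCL runtime is usable at all.
    if (!cv::ocl::haveOpenCL()) {
        std::cout << "OpenCL is not available" << std::endl;
        return 1;
    }
    cv::ocl::setUseOpenCL(true);

    // The default context is created lazily; ndevices() tells us whether a GPU
    // (or any OpenCL device) was actually picked up.
    cv::ocl::Context ctx = cv::ocl::Context::getDefault();
    std::cout << "OpenCL devices found: " << ctx.ndevices() << std::endl;
    return 0;
}
```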
## Installation
### Dependencies
* glew
* opencv
## Sources
* https://github.com/UnaNancyOwen/OpenCVDNNSample/tree/master


@@ -0,0 +1,10 @@
# Requirements
1. The code compiles on x86 and ARM architectures
2. No duplicated code
3. Add comments to tricky code
4. Doxygen comments for every function, unless the function name already makes it obvious (see the sketch below this list)
5. Keep your code readable
6. No dead code
7. Use TODO comments (TODO Tree)
8.
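As an illustration of requirement 4, a Doxygen comment in the expected style could look like the sketch below; the function itself is hypothetical and only serves as an example of the convention.

```cpp
/// @brief Drives the robot forward for a fixed duration. (hypothetical example)
/// @param speedvalue How fast it will drive forward, from 0 - 1024.
/// @return true when the command was sent successfully.
bool driveForward(int speedvalue);
```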


@@ -1,10 +1,9 @@
# **Project plan - Robot for Hazardous Environments**
### Projectbescrhijving
### Project description
The project concerns the development of a robot for hazardous environments. The robot is intended to explore and map dangerous situations. It is equipped with various sensors, and we will add a camera and other sensors ourselves to explore the surroundings. The data is collected and analysed to build up a picture of the situation. The robot can be deployed in various scenarios, such as fires, collapses and other dangerous situations.
### 1. Organisational Context
Several factors are important in the development of the robot. Societal changes such as sustainability and the increasing demand for technological solutions for hazardous work environments play a major role. The robot can be deployed in dangerous situations where, for example, people cannot go in.
@@ -27,7 +26,6 @@ Ethische vragen staan centraal bij de ontwikkeling van de robot. Er moet rekenin
The project is carried out from a clear plan in which each sprint focuses on a part of the project: building the driver, testing, and making connections between every part of the project. We use the Agile methodology, so everything can still change. We therefore also have to take the ethical and organisational aspects into account, such as sustainability and safety.
### Approach
**Working method:** Use of Agile project management for flexibility.


@@ -1,6 +1,7 @@
# What are we going to make
## Sensors
* Camera
* GPS module
* Temperature sensor
@@ -12,21 +13,27 @@
## What are we going to do with the sensors?
### Camera
The camera is used to take pictures of the surroundings, for example to gather information when the robot is stuck; it also gives the option of obtaining information without being on site.
### GPS module
The GPS module is used to determine the robot's location and to indicate where points of interest are located.
### Temperature, TVOC and ECO2 sensor
These sensors are intended to measure the environment and to check whether it is safe for people to enter.
### LDR sensor
The LDR sensor is used to measure the light level and to decide whether a lamp on the robot should be switched on for the camera.
### Time of Flight sensor
The Time of Flight sensor is used to measure the distance between the robot and the wall, so that the robot does not bump into the wall.
## The project
In the event of a fire, or on factory sites with hazardous substances, it may be necessary to scout
a suspect area. It is then not sensible to send people in; in those cases the emergency services
fall back on a reconnaissance robot. The goal of the project is to


@@ -1 +0,0 @@
# home


@@ -1,5 +0,0 @@
- [x] Kobuki werkt met driver.
- [x] Ik kan de data uitlezen.
- [ ] Data wordt correct weergegeven.
- [ ] Ik kan de data laten zien in op de website.
- [ ] Ik kan de kobuki besturen vanaf de website.


@@ -0,0 +1,20 @@
# Daily stand ups
??? note "Daily Stand-ups Sprint 4"
| Day | Submitted by | What did you do yesterday | What will you do today | Any blockers? |
| ---------- | ------------ | ----------------------------- | -------------------------------------------------- | ------------------------------------ |
| 18/11/2024 | Ishak | --- | English, fix the repo, start on a new user story | --- |
| 18/11/2024 | Sam | --- | English, process medium-stake feedback | None |
| 18/11/2024 | Yannick | --- | English, documentation, merge code | None |
| 18/11/2024 | Mees | --- | English, research | None |
| 19/11/2024 | Ishak | English, fixed the repo | Workshop | --- |
| 19/11/2024 | Sam | English, processed feedback | Workshop | None |
| 19/11/2024 | Yannick | Merged code, made a diagram | Workshop, documentation | None |
| 19/11/2024 | Mees | Nothing | Workshop, fix the include path | Include path does not work |
| 26/11/2024 | Ishak | Workshop | Database, record English video | phpMyAdmin does not work (problem known) |
| 26/11/2024 | Sam | OpenCV | OpenCV | --- |
| 26/11/2024 | Yannick | Ill | Ill | --- |
| 26/11/2024 | Mees | English video | Stepper motor | VS Code does not work |
| 02/12/2024 | Ishak | Database | Database | --- |
| 02/12/2024 | Sam | OpenCV | Camera feed on the website | --- |
| 02/12/2024 | Yannick | Ill, documentation | Housing for the ESP | --- |


@@ -0,0 +1,12 @@
# Sprint review 4 feedback
- Make the Definition of Done SMART
- Describe in more detail what is needed for the Definition of Done
- Testing the software: no more than half an A4 page
- Write out the acceptance criteria better (ask Ed)
- Develop more software
- See whether we can work with a points system for user stories, so you can see how big a user story is
- Update the readme file
- Do more documentation
- Technically a bit too challenging
- Improve the planning


@@ -34,17 +34,22 @@ plugins:
modules: [mkdocs_macros_mdocotion]
markdown_extensions:
- attr_list
- md_in_html
- fenced_code
- pymdownx.highlight:
linenums: true
use_pygments: true
- pymdownx.inlinehilite
- pymdownx.snippets
- pymdownx.tabbed
- pymdownx.superfences:
custom_fences:
- name: mermaid
class: mermaid
format: !!python/name:pymdownx.superfences.fence_code_format
- attr_list
- admonition
- pymdownx.details
- pymdownx.superfences
- md_in_html
- fenced_code
- pymdownx.highlight:
linenums: true
use_pygments: true
- pymdownx.inlinehilite
- pymdownx.snippets
- pymdownx.superfences:
custom_fences:
- name: mermaid
class: mermaid
format: !!python/name:pymdownx.superfences.fence_code_format
- toc:
permalink: true
- pymdownx.details


@@ -6,7 +6,10 @@ set(CMAKE_CXX_STANDARD 23)
find_library(PAHO_MQTTPP_LIBRARY paho-mqttpp3 PATHS /usr/local/lib)
find_library(PAHO_MQTT_LIBRARY paho-mqtt3a PATHS /usr/local/lib)
include_directories(/usr/local/include)
# Find OpenCV package
find_package(OpenCV REQUIRED)
find_package(OpenEXR REQUIRED)
include_directories(${OpenCV_INCLUDE_DIRS})
set(SOURCE_FILES
src/KobukiDriver/KobukiParser.cpp
@@ -20,4 +23,4 @@ set(SOURCE_FILES
add_executable(kobuki_control ${SOURCE_FILES})
# Link the static libraries
target_link_libraries(kobuki_control ${PAHO_MQTTPP_LIBRARY} ${PAHO_MQTT_LIBRARY} pthread)
target_link_libraries(kobuki_control ${PAHO_MQTTPP_LIBRARY} ${PAHO_MQTT_LIBRARY} ${OpenCV_LIBS} pthread OpenEXR::OpenEXR)


@@ -282,7 +282,6 @@ int CKobuki::measure() {
return 0;
}
long double CKobuki::gyroToRad(signed short GyroAngle) {
long double rad;
@@ -510,32 +509,8 @@ void CKobuki::doRotation(long double th) {
usleep(25 * 1000);
}
// combines navigation to a coordinate and rotation by an angle, performs
// movement to the selected coordinate in the robot's coordinate system
void CKobuki::goToXy(long double xx, long double yy) {
long double th;
yy = yy * -1;
th = atan2(yy, xx);
doRotation(th);
long double s = sqrt(pow(xx, 2) + pow(yy, 2));
// resetnem suradnicovu sustavu robota
x = 0;
y = 0;
iterationCount = 0;
theta = 0;
// std::cout << "mam prejst: " << s << "[m]" << std::endl;
goStraight(s);
usleep(25 * 1000);
return;
}
/// @brief Makes the robot move forward for 3 seconds
/// @param speedvalue How fast it will drive forward from 0 - 1024
void CKobuki::forward(int speedvalue) {
// Use the goStraight logic to determine the speed and distance
@@ -569,76 +544,77 @@ void CKobuki::forward(int speedvalue) {
/// @param degrees Rotation in degrees
void CKobuki::Rotate(int degrees) {
// convert degrees to radians
float radians = degrees * PI / 180.0;
// convert degrees to radians
float radians = degrees * PI / 180.0;
// Calculate the rotation speed in radians per second
double radpersec = 1;
// Calculate the rotation speed in radians per second
double RADS_PER_SEC = 1;
// calculate the rotation time and take the absolute value
float rotation_time = std::abs(radians / radpersec);
// calculate the rotation time and take the absolute value
float rotation_time = std::abs(radians / RADS_PER_SEC);
// Use original function to set the rotation speed in mm/s
setRotationSpeed(radians);
// Use original function to set the rotation speed in mm/s
setRotationSpeed(radians);
// Sleep for the calculated rotation time
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(rotation_time * 1000)));
// Sleep for the calculated rotation time
std::this_thread::sleep_for(
std::chrono::milliseconds(static_cast<int>(rotation_time * 1000)));
// Stop the robot after the rotation
setRotationSpeed(0);
// Stop the robot after the rotation
setRotationSpeed(0);
}
/// @brief Robot safety function to be run in another thread. Makes sure the robot does not throw itself off the table. Only use this when the speed is lower than 350
/// @param pointerToMessage Set this pointer to the control message; it is set to "estop" when the robot bumps into something, so the robot does not keep trying to execute the previous command
// TODO: make this return bool so it can be used in the control part
void CKobuki::robotSafety(std::string *pointerToMessage) {
while (true) {
if (parser.data.BumperCenter || parser.data.BumperLeft || parser.data.BumperRight ||
parser.data.CliffLeft || parser.data.CliffCenter || parser.data.CliffRight) {
std::cout << "Safety condition triggered!" << std::endl; // Debug print
*pointerToMessage = "estop";
forward(-100); // reverse the robot
}
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(100)));
while (true) {
if (parser.data.BumperCenter || parser.data.BumperLeft ||
parser.data.BumperRight || parser.data.CliffLeft ||
parser.data.CliffCenter || parser.data.CliffRight) {
std::cout << "Safety condition triggered!" << std::endl; // Debug print
*pointerToMessage = "estop";
forward(-100); // reverse the robot
}
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(100)));
}
}
/// @brief Robot safety function to be run in another thread. Makes sure the robot does not throw itself off the table. Only use this when the speed is lower than 350
void CKobuki::robotSafety() {
while (true) {
if (parser.data.BumperCenter || parser.data.BumperLeft || parser.data.BumperRight ||
parser.data.CliffLeft || parser.data.CliffCenter || parser.data.CliffRight) {
std::cout << "Safety condition triggered!" << std::endl; // Debug print
forward(-100); // reverse the robot
}
while (true) {
if (parser.data.BumperCenter || parser.data.BumperLeft ||
parser.data.BumperRight || parser.data.CliffLeft ||
parser.data.CliffCenter || parser.data.CliffRight) {
std::cout << "Safety condition triggered!" << std::endl; // Debug print
forward(-100); // reverse the robot
}
}
}
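The TODO above asks for a safety check that returns a bool so the control loop can decide what to do. One possible shape, sketched here rather than taken from the repository, is a small predicate over the same sensor fields; the helper name is hypothetical.

```cpp
#include "CKobuki.h" // for TKobukiData (repository header)

// Sketch: reports whether any bumper or cliff sensor fired, leaving the
// reaction (estop, reversing, logging) to the caller.
bool isSafetyTriggered(const TKobukiData &data) {
    return data.BumperCenter || data.BumperLeft || data.BumperRight ||
           data.CliffLeft || data.CliffCenter || data.CliffRight;
}
```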
void CKobuki::sendNullMessage(){
/// @brief When called, the robot gets a control message to stop whatever it is doing
void CKobuki::sendNullMessage() {
unsigned char message[11] = {
0xaa, // Start byte 1
0x55, // Start byte 2
0x08, // Payload length (the first 2 bytes dont count)
0x01, // payload type (0x01 = control command)
0x04, // Control byte or additional identifier
0x00, // Lower byte of speed value
0x00, // Upper byte of speed value
0x00, // Placeholder for radius
0x00, // Placeholder for radius
0x00 // Placeholder for checksum
};
unsigned char message[11] = {
0xaa, // Start byte 1
0x55, // Start byte 2
0x08, // Payload length (the first 2 bytes dont count)
0x01, // payload type (0x01 = control command)
0x04, // Control byte or additional identifier
0x00, // Lower byte of speed value
0x00, // Upper byte of speed value
0x00, // Placeholder for radius
0x00, // Placeholder for radius
0x00 // Placeholder for checksum
};
message[10] = message[2] ^ message[3] ^ message[4] ^ message[5] ^ message[6] ^
message[7] ^ message[8] ^ message[9];
message[10] = message[2] ^ message[3] ^ message[4] ^ message[5] ^ message[6] ^
message[7] ^ message[8] ^ message[9];
// Send the message
uint32_t pocet;
pocet = write(HCom, &message, 11);
}


@@ -31,7 +31,6 @@
#include <chrono>
#include <sstream>
#include "KobukiParser.h"
#include "graph.h"
using namespace std;
@@ -79,6 +78,7 @@ public:
void robotSafety(std::string *pointerToMessage);
void robotSafety(); //overload
void sendNullMessage();
bool safetyActive = false;
KobukiParser parser;


@@ -1,71 +0,0 @@
#ifndef GRAPH1010
#define GRAPH1010
#include <stdio.h>
#include <stdlib.h>
#include <vector>
using namespace std;
#define GRAPH_ENABLED true
class plot {
public:
FILE *gp;
bool enabled,persist;
plot(bool _persist=false,bool _enabled=GRAPH_ENABLED) {
enabled=_enabled;
persist=_persist;
if (enabled) {
if(persist)
gp=popen("gnuplot -persist","w");
else
gp=popen("gnuplot","w");
}
}
void plot_data(vector<float> x,const char* style="points",const char* title="Data") {
if(!enabled)
return;
fprintf(gp,"set title '%s' \n",title);
fprintf(gp,"plot '-' w %s \n",style);
for(int k=0;k<x.size();k++) {
fprintf(gp,"%f\n",x[k]);
}
fprintf(gp,"e\n");
fflush(gp);
}
void plot_data(vector<float> x,vector<float> y,const char* style="points",const char* title="Data") {
if(!enabled)
return;
fprintf(gp,"set title '%s' \n",title);
fprintf(gp,"plot '-' w %s \n",style);
for(int k=0;k<x.size();k++) {
fprintf(gp,"%f %f \n",x[k],y[k]);
}
fprintf(gp,"e\n");
fflush(gp);
}
~plot() {
if(enabled)
pclose(gp);
}
};
/*
int main(int argc,char **argv) {
plot p;
for(int a=0;a<100;a++) {
vector<float> x,y;
for(int k=a;k<a+200;k++) {
x.push_back(k);
y.push_back(k*k);
}
p.plot_data(x,y);
}
return 0;
}
*/
#endif


@@ -2,8 +2,10 @@
MqttClient::MqttClient(const std::string& address, const std::string& clientId, const std::string& username, const std::string& password)
// client_ is the connection object
// here all the constructor parameters for the connection are set
: client_(address, clientId), username_(username), password_(password), callback_(*this) {
client_.set_callback(callback_);
options.set_clean_session(true);
options.set_mqtt_version(MQTTVERSION_3_1_1); // For MQTT 3.1.1
if (!username_.empty() && !password_.empty()) {


@@ -1,5 +1,5 @@
#include "MqttClient.h"
//example file for testing
int main(){
MqttClient client("mqtt://localhost:1883", "raspberry_pi_client", "ishak", "kobuki");
client.connect();


@@ -1,15 +1,19 @@
#include <iostream>
#include <cmath>
#include <thread>
#include "KobukiDriver/graph.h"
#include "MQTT/MqttClient.h"
#include "KobukiDriver/CKobuki.h"
#include <opencv4/opencv2/opencv.hpp>
#include <opencv4/opencv2/core.hpp>
using namespace std;
using namespace cv;
CKobuki robot;
std::string readMQTT();
void parseMQTT(std::string message);
MqttClient client("mqtt://145.92.224.21:1883", "KobukiRPI", "ishak", "kobuki"); // create a client object
void CapnSend();
//ip, clientID, username, password
MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi", "rpiwachtwoordofzo"); // create a client object
std::string message = "stop";
std::string serializeKobukiData(const TKobukiData &data);
void sendKobukiData(TKobukiData &data);
@@ -18,6 +22,8 @@ void setup()
{
unsigned char *null_ptr(0);
robot.startCommunication("/dev/ttyUSB0", true, null_ptr);
//connect mqtt server and sub to commands
client.connect();
client.subscribe("home/commands");
}
@@ -25,15 +31,17 @@ void setup()
int main()
{
setup();
std::thread image (CapnSend);
std::thread safety([&]() { robot.robotSafety(&message); });
std::thread sendMqtt([&]() { sendKobukiData(robot.parser.data); });
while(true){
parseMQTT(readMQTT());
parseMQTT(readMQTT());
}
sendMqtt.join();
safety.join();
return 0;
image.join();
}
std::string readMQTT()
@@ -53,7 +61,7 @@ void parseMQTT(std::string message)
{
if (message == "up")
{
robot.forward(1024);
robot.forward(350);
}
else if (message == "left")
{
@@ -65,7 +73,7 @@ void parseMQTT(std::string message)
}
else if (message == "down")
{
robot.forward(-800);
robot.forward(-350);
}
else if (message == "stop")
{
@@ -149,6 +157,64 @@ void logToFile()
}
}
void sendIndividualKobukiData(const TKobukiData &data) {
while (true) {
client.publishMessage("kobuki/data/timestamp", std::to_string(data.timestamp));
client.publishMessage("kobuki/data/BumperCenter", std::to_string(data.BumperCenter));
client.publishMessage("kobuki/data/BumperLeft", std::to_string(data.BumperLeft));
client.publishMessage("kobuki/data/BumperRight", std::to_string(data.BumperRight));
client.publishMessage("kobuki/data/WheelDropLeft", std::to_string(data.WheelDropLeft));
client.publishMessage("kobuki/data/WheelDropRight", std::to_string(data.WheelDropRight));
client.publishMessage("kobuki/data/CliffCenter", std::to_string(data.CliffCenter));
client.publishMessage("kobuki/data/CliffLeft", std::to_string(data.CliffLeft));
client.publishMessage("kobuki/data/CliffRight", std::to_string(data.CliffRight));
client.publishMessage("kobuki/data/EncoderLeft", std::to_string(data.EncoderLeft));
client.publishMessage("kobuki/data/EncoderRight", std::to_string(data.EncoderRight));
client.publishMessage("kobuki/data/PWMleft", std::to_string(data.PWMleft));
client.publishMessage("kobuki/data/PWMright", std::to_string(data.PWMright));
client.publishMessage("kobuki/data/ButtonPress1", std::to_string(data.ButtonPress1));
client.publishMessage("kobuki/data/ButtonPress2", std::to_string(data.ButtonPress2));
client.publishMessage("kobuki/data/ButtonPress3", std::to_string(data.ButtonPress3));
client.publishMessage("kobuki/data/Charger", std::to_string(data.Charger));
client.publishMessage("kobuki/data/Battery", std::to_string(data.Battery));
client.publishMessage("kobuki/data/overCurrent", std::to_string(data.overCurrent));
client.publishMessage("kobuki/data/IRSensorRight", std::to_string(data.IRSensorRight));
client.publishMessage("kobuki/data/IRSensorCenter", std::to_string(data.IRSensorCenter));
client.publishMessage("kobuki/data/IRSensorLeft", std::to_string(data.IRSensorLeft));
client.publishMessage("kobuki/data/GyroAngle", std::to_string(data.GyroAngle));
client.publishMessage("kobuki/data/GyroAngleRate", std::to_string(data.GyroAngleRate));
client.publishMessage("kobuki/data/CliffSensorRight", std::to_string(data.CliffSensorRight));
client.publishMessage("kobuki/data/CliffSensorCenter", std::to_string(data.CliffSensorCenter));
client.publishMessage("kobuki/data/CliffSensorLeft", std::to_string(data.CliffSensorLeft));
client.publishMessage("kobuki/data/wheelCurrentLeft", std::to_string(data.wheelCurrentLeft));
client.publishMessage("kobuki/data/wheelCurrentRight", std::to_string(data.wheelCurrentRight));
client.publishMessage("kobuki/data/digitalInput", std::to_string(data.digitalInput));
client.publishMessage("kobuki/data/analogInputCh0", std::to_string(data.analogInputCh0));
client.publishMessage("kobuki/data/analogInputCh1", std::to_string(data.analogInputCh1));
client.publishMessage("kobuki/data/analogInputCh2", std::to_string(data.analogInputCh2));
client.publishMessage("kobuki/data/analogInputCh3", std::to_string(data.analogInputCh3));
client.publishMessage("kobuki/data/frameId", std::to_string(data.frameId));
client.publishMessage("kobuki/data/extraInfo/HardwareVersionPatch", std::to_string(data.extraInfo.HardwareVersionPatch));
client.publishMessage("kobuki/data/extraInfo/HardwareVersionMinor", std::to_string(data.extraInfo.HardwareVersionMinor));
client.publishMessage("kobuki/data/extraInfo/HardwareVersionMajor", std::to_string(data.extraInfo.HardwareVersionMajor));
client.publishMessage("kobuki/data/extraInfo/FirmwareVersionPatch", std::to_string(data.extraInfo.FirmwareVersionPatch));
client.publishMessage("kobuki/data/extraInfo/FirmwareVersionMinor", std::to_string(data.extraInfo.FirmwareVersionMinor));
client.publishMessage("kobuki/data/extraInfo/FirmwareVersionMajor", std::to_string(data.extraInfo.FirmwareVersionMajor));
client.publishMessage("kobuki/data/extraInfo/UDID0", std::to_string(data.extraInfo.UDID0));
client.publishMessage("kobuki/data/extraInfo/UDID1", std::to_string(data.extraInfo.UDID1));
client.publishMessage("kobuki/data/extraInfo/UDID2", std::to_string(data.extraInfo.UDID2));
if (!data.gyroData.empty()) {
const auto& latestGyro = data.gyroData.back();
client.publishMessage("kobuki/data/gyroData/x", std::to_string(latestGyro.x));
client.publishMessage("kobuki/data/gyroData/y", std::to_string(latestGyro.y));
client.publishMessage("kobuki/data/gyroData/z", std::to_string(latestGyro.z));
}
std::this_thread::sleep_for(std::chrono::milliseconds(1000));
}
}
std::string serializeKobukiData(const TKobukiData &data) {
std::string json = "{\"timestamp\":" + std::to_string(data.timestamp) +
",\"BumperCenter\":" + std::to_string(data.BumperCenter) +
@@ -210,6 +276,35 @@ std::string serializeKobukiData(const TKobukiData &data) {
void sendKobukiData(TKobukiData &data) {
while (true) {
client.publishMessage("kobuki/data", serializeKobukiData(data));
std::cout << "Sent data" << std::endl;
std::this_thread::sleep_for(std::chrono::milliseconds(1000));
}
}
void CapnSend() {
VideoCapture cap(0);
if (!cap.isOpened()) {
cerr << "Error: Could not open camera" << endl;
return;
}
Mat frame;
while (true) {
cap >> frame; // Capture a new image frame
if (frame.empty()) {
cerr << "Error: Could not capture image" << endl;
continue;
}
// Convert the image to a byte array
vector<uchar> buf;
imencode(".jpg", frame, buf);
auto* enc_msg = reinterpret_cast<unsigned char*>(buf.data());
// Publish the image data
client.publishMessage("kobuki/cam", string(enc_msg, enc_msg + buf.size()));
cout << "Sent image" << endl;
std::this_thread::sleep_for(std::chrono::milliseconds(300)); // Send an image every 300 ms
}
}


@@ -1,15 +0,0 @@
cmake_minimum_required(VERSION 3.10)
set(CMAKE_CXX_STANDARD 23)
# Find the Paho MQTT C++ library
find_library(PAHO_MQTTPP_LIBRARY paho-mqttpp3 PATHS /usr/local/lib)
find_library(PAHO_MQTT_LIBRARY paho-mqtt3a PATHS /usr/local/lib)
# Include the headers
include_directories(/usr/local/include)
# Add the executable
add_executable(my_program main.cpp)
# Link the libraries
target_link_libraries(my_program ${PAHO_MQTTPP_LIBRARY} ${PAHO_MQTT_LIBRARY})


@@ -1,56 +0,0 @@
#include <iostream>
#include <mqtt/async_client.h>
#include <thread> // Voor std::this_thread::sleep_for
#include <chrono> // Voor std::chrono::seconds
const std::string ADDRESS("mqtt://localhost:1883"); // Aanpassen indien nodig
const std::string CLIENT_ID("raspberry_pi_client");
const std::string TOPIC("home/commands");
class callback : public virtual mqtt::callback {
void message_arrived(mqtt::const_message_ptr msg) override {
std::cout << "Ontvangen bericht: '" << msg->get_topic()
<< "' : " << msg->to_string() << std::endl;
// Doe iets met het bericht, bijvoorbeeld een GP.IO-activering
}
void connection_lost(const std::string& cause) override {
std::cerr << "Verbinding verloren. Oorzaak: " << cause << std::endl;
}
void delivery_complete(mqtt::delivery_token_ptr token) override {
std::cout << "Bericht afgeleverd!" << std::endl;
}
};
int main() {
mqtt::async_client client(ADDRESS, CLIENT_ID);
callback cb;
client.set_callback(cb);
mqtt::connect_options connOpts;
connOpts.set_clean_session(true);
connOpts.set_user_name("ishak");
connOpts.set_password("kobuki");
connOpts.set_mqtt_version(MQTTVERSION_3_1_1); // Voor MQTT 3.1.1
try {
std::cout << "Verbinden met broker..." << std::endl;
client.connect(connOpts)->wait();
std::cout << "Verbonden!" << std::endl;
std::cout << "Abonneren op topic: " << TOPIC << std::endl;
client.subscribe(TOPIC, 1)->wait();
// Houd de client draaiende om berichten te blijven ontvangen
while (true) {
std::this_thread::sleep_for(std::chrono::seconds(1)); // Wacht om CPU-gebruik te verminderen
}
} catch (const mqtt::exception &exc) {
std::cerr << "Fout: " << exc.what() << std::endl;
return 1;
}
return 0;
}


@@ -0,0 +1,44 @@
cmake_minimum_required( VERSION 3.6 )
# Require C++23
set( CMAKE_CXX_STANDARD 23 )
set( CMAKE_CXX_STANDARD_REQUIRED ON )
set( CMAKE_CXX_EXTENSIONS OFF )
set(CMAKE_BUILD_TYPE Debug)
# Create Project
project( Sample )
add_executable( YOLOv4 util.h main.cpp )
# Set StartUp Project
set_property( DIRECTORY PROPERTY VS_STARTUP_PROJECT "YOLOv4" )
# Find Package
# OpenCV
find_package( OpenCV REQUIRED )
if( OpenCV_FOUND )
# Additional Include Directories
include_directories( ${OpenCV_INCLUDE_DIRS} )
# Additional Dependencies
target_link_libraries( YOLOv4 ${OpenCV_LIBS} )
endif()
# Download Model
set( MODEL https://github.com/AlexeyAB/darknet/releases/download/darknet_yolo_v3_optimal/yolov4.weights )
file( DOWNLOAD
"${MODEL}"
"${CMAKE_CURRENT_LIST_DIR}/yolov4.weights"
EXPECTED_HASH SHA256=e8a4f6c62188738d86dc6898d82724ec0964d0eb9d2ae0f0a9d53d65d108d562
SHOW_PROGRESS
)
# Download Config
set( CONFIG https://raw.githubusercontent.com/AlexeyAB/darknet/master/cfg/yolov4.cfg )
file( DOWNLOAD
"${CONFIG}"
"${CMAKE_CURRENT_LIST_DIR}/yolov4.cfg"
EXPECTED_HASH SHA256=a6d0f8e5c62cc8378384f75a8159b95fa2964d4162e33351b00ac82e0fc46a34
SHOW_PROGRESS
)

src/C++/OpenCV/YOLOv4 Executable file

Binary file not shown.

80
src/C++/OpenCV/coco.names Normal file

@@ -0,0 +1,80 @@
person
bicycle
car
motorbike
aeroplane
bus
train
truck
boat
traffic light
fire hydrant
stop sign
parking meter
bench
bird
cat
dog
horse
sheep
cow
elephant
bear
zebra
giraffe
backpack
umbrella
handbag
tie
suitcase
frisbee
skis
snowboard
sports ball
kite
baseball bat
baseball glove
skateboard
surfboard
tennis racket
bottle
wine glass
cup
fork
knife
spoon
bowl
banana
apple
sandwich
orange
broccoli
carrot
hot dog
pizza
donut
cake
chair
sofa
pottedplant
bed
diningtable
toilet
tvmonitor
laptop
mouse
remote
keyboard
cell phone
microwave
oven
toaster
sink
refrigerator
book
clock
vase
scissors
teddy bear
hair drier
toothbrush

209
src/C++/OpenCV/main.cpp Normal file

@@ -0,0 +1,209 @@
#include <iostream>
#include <string>
#include <vector>
#include <opencv2/opencv.hpp>
#include <opencv2/dnn.hpp>
#include <filesystem>
#include <fstream>
#include "util.h"
// Helper function to check if a file exists
bool fileExists(const std::string &path)
{
return std::filesystem::exists(path);
}
// Function to read class names from a file
std::vector<std::string> _readClassNameList(const std::string &path)
{
std::vector<std::string> classes;
// Check if file exists
if (!fileExists(path))
{
throw std::runtime_error("Class names file not found: " + path);
}
// Try to open and read file
std::ifstream file(path);
if (!file.is_open())
{
throw std::runtime_error("Unable to open class names file: " + path);
}
std::string line;
while (std::getline(file, line))
{
if (!line.empty())
{
classes.push_back(line);
}
}
if (classes.empty())
{
throw std::runtime_error("No classes found in file: " + path);
}
return classes;
}
int main(int argc, char *argv[])
{
try
{
// Open Video Capture
cv::VideoCapture capture = cv::VideoCapture(0);
if (!capture.isOpened())
{
std::cerr << "Failed to open camera device" << std::endl;
return -1;
}
// Read Class Name List and Color Table
const std::string list = "coco.names";
const std::vector<std::string> classes = _readClassNameList(list);
const std::vector<cv::Scalar> colors = getClassColors(classes.size());
// Debug: Print the size of the colors vector
std::cout << "Number of colors: " << colors.size() << std::endl;
// Read Darknet
const std::string model = "yolov4.weights";
const std::string config = "yolov4.cfg";
cv::dnn::Net net = cv::dnn::readNet(model, config);
if (net.empty())
{
std::cerr << "Failed to load network" << std::endl;
return -1;
}
// Set Preferable Backend
net.setPreferableBackend(cv::dnn::DNN_BACKEND_OPENCV);
// Set Preferable Target
net.setPreferableTarget(cv::dnn::DNN_TARGET_OPENCL);
while (true)
{
// Read Frame
cv::Mat frame;
capture >> frame;
if (frame.empty())
{
cv::waitKey(0);
break;
}
if (frame.channels() == 4)
{
cv::cvtColor(frame, frame, cv::COLOR_BGRA2BGR);
}
// Create Blob from Input Image
cv::Mat blob = cv::dnn::blobFromImage(frame, 1 / 255.f, cv::Size(416, 416), cv::Scalar(), true, false);
// Set Input Blob
net.setInput(blob);
// Run Forward Network
std::vector<cv::Mat> detections;
net.forward(detections, getOutputsNames(net));
// Draw Region
std::vector<int32_t> class_ids;
std::vector<float> confidences;
std::vector<cv::Rect> rectangles;
for (cv::Mat &detection : detections)
{
if (detection.empty())
{
std::cerr << "Detection matrix is empty!" << std::endl;
continue;
}
for (int32_t i = 0; i < detection.rows; i++)
{
cv::Mat region = detection.row(i);
// Retrieve Max Confidence and Class Index
cv::Mat scores = region.colRange(5, detection.cols);
cv::Point class_id;
double confidence;
cv::minMaxLoc(scores, 0, &confidence, 0, &class_id);
// Check Confidence
constexpr float threshold = 0.2;
if (threshold > confidence)
{
continue;
}
// Retrieve Object Position
const int32_t x_center = static_cast<int32_t>(region.at<float>(0) * frame.cols);
const int32_t y_center = static_cast<int32_t>(region.at<float>(1) * frame.rows);
const int32_t width = static_cast<int32_t>(region.at<float>(2) * frame.cols);
const int32_t height = static_cast<int32_t>(region.at<float>(3) * frame.rows);
const cv::Rect rectangle = cv::Rect(x_center - (width / 2), y_center - (height / 2), width, height);
// Add Class ID, Confidence, Rectangle
class_ids.push_back(class_id.x);
confidences.push_back(confidence);
rectangles.push_back(rectangle);
}
}
// Remove Overlap Rectangles using Non-Maximum Suppression
constexpr float confidence_threshold = 0.5; // Confidence
constexpr float nms_threshold = 0.5; // IoU (Intersection over Union)
std::vector<int32_t> indices;
cv::dnn::NMSBoxes(rectangles, confidences, confidence_threshold, nms_threshold, indices);
// Draw Rectangle
for (const int32_t &index : indices)
{
// Bounds checking
if (class_ids[index] >= colors.size())
{
std::cerr << "Color index out of bounds: " << class_ids[index] << " (max: " << colors.size() - 1 << ")" << std::endl;
continue;
}
const cv::Rect rectangle = rectangles[index];
const cv::Scalar color = colors[class_ids[index]];
// Debug: Print the index and color
std::cout << "Drawing rectangle with color index: " << class_ids[index] << std::endl;
constexpr int32_t thickness = 3;
cv::rectangle(frame, rectangle, color, thickness);
std::string label = classes[class_ids[index]] + ": " + std::to_string(static_cast<int>(confidences[index] * 100)) + "%";
int baseLine;
cv::Size labelSize = cv::getTextSize(label, cv::FONT_HERSHEY_SIMPLEX, 0.5, 1, &baseLine);
int top = std::max(rectangle.y, labelSize.height);
cv::rectangle(frame, cv::Point(rectangle.x, top - labelSize.height),
cv::Point(rectangle.x + labelSize.width, top + baseLine), color, cv::FILLED);
cv::putText(frame, label, cv::Point(rectangle.x, top), cv::FONT_HERSHEY_SIMPLEX, 0.5, cv::Scalar(255, 255, 255), 1);
}
// Show Image
cv::imshow("Object Detection", frame);
const int32_t key = cv::waitKey(1);
if (key == 'q')
{
break;
}
}
cv::destroyAllWindows();
return 0;
}
catch (const std::exception &e)
{
std::cerr << "Error: " << e.what() << std::endl;
return -1;
}
}
// cloned and fixed from https://github.com/UnaNancyOwen/OpenCVDNNSample/tree/master

61
src/C++/OpenCV/util.h Normal file

@@ -0,0 +1,61 @@
#ifndef __UTIL__
#define __UTIL__
#include <vector>
#include <string>
#include <fstream>
#include <opencv2/dnn.hpp>
#include <opencv2/core.hpp>
#include <opencv2/highgui.hpp>
// Get Output Layers Name
std::vector<std::string> getOutputsNames( const cv::dnn::Net& net )
{
static std::vector<std::string> names;
if( names.empty() ){
std::vector<int32_t> out_layers = net.getUnconnectedOutLayers();
std::vector<std::string> layers_names = net.getLayerNames();
names.resize( out_layers.size() );
for( size_t i = 0; i < out_layers.size(); ++i ){
names[i] = layers_names[out_layers[i] - 1];
}
}
return names;
}
// Get Output Layer Type
std::string getOutputLayerType( cv::dnn::Net& net )
{
const std::vector<int32_t> out_layers = net.getUnconnectedOutLayers();
const std::string output_layer_type = net.getLayer( out_layers[0] )->type;
return output_layer_type;
}
// Read Class Name List
std::vector<std::string> readClassNameList( const std::string list_path )
{
std::vector<std::string> classes;
std::ifstream ifs( list_path );
if( !ifs.is_open() ){
return classes;
}
std::string class_name = "";
while( std::getline( ifs, class_name ) ){
classes.push_back( class_name );
}
return classes;
}
// Get Class Color Table for Visualize
std::vector<cv::Scalar> getClassColors( const int32_t number_of_colors )
{
cv::RNG random;
std::vector<cv::Scalar> colors;
for( int32_t i = 0; i < number_of_colors; i++ ){
cv::Scalar color( random.uniform( 0, 255 ), random.uniform( 0, 255 ), random.uniform( 0, 255 ) );
colors.push_back( color );
}
return colors;
}
#endif // __UTIL__

1158
src/C++/OpenCV/yolov4.cfg Normal file

File diff suppressed because it is too large.

Binary file not shown.


@@ -1,25 +1,38 @@
from flask import Flask, request, render_template, jsonify
from flask import Flask, Response, request, render_template, jsonify
import paho.mqtt.client as mqtt
app = Flask(__name__)
# This function gets triggered when an MQTT message is received
kobuki_message = "empty"
latest_image = None  # initialised here so /image does not fail before the first frame arrives
def on_message(client, userdata, message):
global kobuki_message #set scope for this variable
kobuki_message = str(message.payload.decode("utf-8"))
global kobuki_message, latest_image
if message.topic == "kobuki/data":
kobuki_message = str(message.payload.decode("utf-8"))
elif message.topic == "kobuki/cam":
latest_image = message.payload
# Create an MQTT client instance
mqtt_client = mqtt.Client()
mqtt_client.username_pw_set("ishak", "kobuki")
mqtt_client.connect("145.92.224.21", 1883, 60)
mqtt_client.username_pw_set("server", "serverwachtwoordofzo")
mqtt_client.connect("localhost", 80, 60)
mqtt_client.loop_start()
mqtt_client.subscribe("kobuki/data")
mqtt_client.subscribe("kobuki/cam")
mqtt_client.on_message = on_message # this line needs to come after the function definition, otherwise the callback cannot be resolved
@app.route('/', methods=["GET","POST"])
@app.route('/')
def index():
return render_template('index.html')
@app.route('/control', methods=["GET","POST"])
def control():
if request.authorization and request.authorization.username == 'ishak' and request.authorization.password == 'kobuki':
return render_template('control.html')
else:
return ('Unauthorized', 401, {'WWW-Authenticate': 'Basic realm="Login Required"'})
@app.route('/move', methods=['POST'])
def move():
data = request.get_json()
@@ -34,10 +47,25 @@ def move():
@app.route('/data', methods=['GET'])
def data():
return jsonify({"kobuki_message": kobuki_message})
return kobuki_message
@app.route('/image')
def image():
global latest_image
if latest_image is not None:
return Response(latest_image, mimetype='image/jpeg')
else:
return "No image available", 404
@app.route('/phpmyadmin/<path:path>')
def phpmyadmin_passthrough(path):
# Let Apache handle this route directly
return "", 404
if __name__ == '__main__':
app.run(debug=True)
app.run(debug=True, port=5000)

Binary file not shown.



@@ -1,25 +1,57 @@
// Selecteer alle knoppen en voeg een event listener toe aan elke knop
document.querySelectorAll(".btn").forEach(button => {
button.addEventListener("click", function(event) {
event.preventDefault(); // voorkomt pagina-verversing
document.addEventListener("DOMContentLoaded", function() {
document.querySelectorAll(".btn").forEach(button => {
button.addEventListener("click", function(event) {
event.preventDefault(); // prevents page refresh
// Haal de waarde van de knop op
const direction = event.target.value;
// Get the value of the button
const direction = event.target.value;
// Verstuur de richting naar de server met fetch
fetch("/move", {
method: "POST",
headers: {
"Content-Type": "application/json"
},
body: JSON.stringify({ direction: direction })
})
.then(response => response.json())
.then(data => {
console.log("Success:", data);
})
.catch(error => {
console.error("Error:", error);
fetch("/move", {
method: "POST",
headers: {
"Content-Type": "application/json"
},
body: JSON.stringify({ direction: direction })
})
.then(response => response.json())
.then(data => {
console.log("Success:", data);
})
.catch(error => {
console.error("Error:", error);
});
});
});
// Fetch data from the server
async function fetchData() {
const response = await fetch("/data");
const data = await response.json();
return data;
}
// Parse the data and show it on the website
async function parseData() {
const data = await fetchData();
const sensorDataContainer = document.getElementById("sensor-data");
sensorDataContainer.innerHTML = ""; // Clear previous data
// For each object in JSON array, create a new paragraph element and append it to the sensorDataContainer
for (const [key, value] of Object.entries(data)) {
const dataElement = document.createElement("p");
dataElement.textContent = `${key}: ${value}`;
sensorDataContainer.appendChild(dataElement);
}
}
// Update the image
function updateImage() {
var img = document.getElementById("robot-image");
img.src = "/image?" + new Date().getTime(); // Add timestamp to avoid caching
}
// Fetch and display sensor data every second
setInterval(parseData, 1000);
// Update the image every 200 ms
setInterval(updateImage, 200);
});


@@ -1,8 +1,8 @@
body {
font-family: 'Poppins', sans-serif;
text-align: -webkit-center;
margin: 0;
padding: 0;
font-family: "Poppins", sans-serif;
text-align: -webkit-center;
margin: 0;
padding: 0;
}
/* This is my code for my navbar */
@@ -23,101 +23,148 @@ body {
right: 0%;
}
.footer{
display: flex;
justify-content: space-between;
max-width: 80%;
background-color: #fff;
border: 1px solid #f0f0f0;
border-radius: 50px;
align-items: center;
margin: 1.5rem auto 0 auto;
padding: 0 30px;
top: 0%;
bottom: auto;
left: 0%;
right: 0%;
}
.imgNav {
height: 50px;
border-radius: 20px;
}
.connectButton {
border-radius: 10px;
height: 100%;
width: 100px;
box-shadow: none;
border: none;
font-size: 1rem;
height: 40px;
background-color: #b3ffb3;
}
border-radius: 10px;
height: 100%;
width: 100px;
box-shadow: none;
border: none;
font-size: 1rem;
height: 40px;
background-color: #b3ffb3;
}
/* end navbar */
.container {
display: flex;
justify-content: space-around;
align-items: center;
margin-top: 50px;
width: 80%;
background-color: white;
border-radius: 20px;
box-shadow: 0px 8px 16px rgba(0, 0, 0, 0.2);
padding: 40px;
display: flex;
justify-content: space-around;
align-items: center;
margin-top: 50px;
width: 80%;
background-color: white;
border-radius: 20px;
box-shadow: 0px 8px 16px rgba(0, 0, 0, 0.2);
padding: 40px;
}
.button-section {
position: relative;
width: 150px;
height: 150px;
position: relative;
width: 150px;
height: 150px;
}
.btn {
position: absolute;
background-color: #007BFF;
color: white;
border: none;
border-radius: 50%;
width: 60px;
height: 60px;
font-size: 1.2em;
text-align: center;
line-height: 60px;
cursor: pointer;
transition: transform 0.2s ease, background-color 0.2s ease;
position: absolute;
background-color: #007bff;
color: white;
border: none;
border-radius: 50%;
width: 60px;
height: 60px;
font-size: 1.2em;
text-align: center;
line-height: 60px;
cursor: pointer;
transition: transform 0.2s ease, background-color 0.2s ease;
}
.text{
width: 50%;
}
.image{
height: 100%;
}
.sectionHeight{
height: 200px;
}
/* Direction buttons */
.btn:nth-child(1) { /* Left */
top: 50%;
left: 50%;
transform: translate(-160%, -50%);
.btn:nth-child(1) {
/* Left */
top: 50%;
left: 50%;
transform: translate(-160%, -50%);
}
.btn:nth-child(2) { /* Up */
top: 0;
left: 50%;
transform: translate(-50%, -35%);
.btn:nth-child(2) {
/* Up */
top: 0;
left: 50%;
transform: translate(-50%, -35%);
}
.btn:nth-child(3) { /* Right */
top: 50%;
right: 0;
transform: translate(35%,-50%);
.btn:nth-child(3) {
/* Right */
top: 50%;
right: 0;
transform: translate(35%, -50%);
}
.btn:nth-child(4) { /* Down */
bottom: 0;
left: 50%;
transform: translate(-50%, 35%);
.btn:nth-child(4) {
/* Down */
bottom: 0;
left: 50%;
transform: translate(-50%, 35%);
}
.btn:nth-child(5) { /* Stop Button */
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
background-color: red; /* Distinct color for the stop button */
width: 60px; /* Slightly larger for emphasis */
height: 60px; /* Slightly larger for emphasis */
line-height: 60px; /* Center text vertically */
.btn:nth-child(5) {
/* Stop Button */
top: 50%;
left: 50%;
transform: translate(-50%, -50%);
background-color: red; /* Distinct color for the stop button */
width: 60px; /* Slightly larger for emphasis */
height: 60px; /* Slightly larger for emphasis */
line-height: 60px; /* Center text vertically */
}
/* Hover effects */
.btn:hover {
background-color: #0056b3;
background-color: #0056b3;
}
.btn:active {
background-color: #004494;
background-color: #004494;
}
.stop-button:hover {
background-color: darkred; /* Different hover color for the stop button */
background-color: darkred; /* Different hover color for the stop button */
}
table {
width: 100%;
border-collapse: collapse;
}
th,td {
border: 1px solid #ddd;
padding: 8px;
}
th {
background-color: #f2f2f2;
text-align: left;
}


@@ -0,0 +1,73 @@
body {
font-family: Arial, sans-serif;
margin: 0;
padding: 0;
background-color: #f4f4f4;
}
header {
background-color: #333;
color: #fff;
padding: 1rem 0;
text-align: center;
}
header h1 {
margin: 0;
}
nav ul {
list-style: none;
padding: 0;
}
nav ul li {
display: inline;
margin: 0 1rem;
}
nav ul li a {
color: #fff;
text-decoration: none;
}
section {
padding: 2rem;
margin: 1rem 0;
background-color: #fff;
border-radius: 8px;
box-shadow: 0 0 10px rgba(0, 0, 0, 0.1);
}
section h2 {
margin-top: 0;
}
form {
display: flex;
flex-direction: column;
}
form label {
margin: 0.5rem 0 0.2rem;
}
form input, form textarea {
padding: 0.5rem;
margin-bottom: 1rem;
border: 1px solid #ccc;
border-radius: 4px;
}
form button {
padding: 0.7rem;
border: none;
border-radius: 4px;
background-color: #333;
color: #fff;
cursor: pointer;
}
form button:hover {
background-color: #555;
}


@@ -11,5 +11,6 @@
{% include 'navbar.html' %}
{% block content %}
{% endblock %}
</body>
</html>


@@ -0,0 +1,51 @@
{% extends 'base.html' %}
{% block head %}
<link rel="stylesheet" href="../static/style.css" />
{% endblock %}
{% block content %}
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Kobuki</title>
<link rel="stylesheet" href="../static/style.css" />
</head>
<body>
<div class="container">
<div class="robot-image">
<img src="/image" alt="Kobuki Camera Feed" id="robot-image" />
</div>
<div class="button-section">
<form id="form" action="/move" method="post">
<button class="btn" name="direction" value="left"></button>
<button class="btn" name="direction" value="up"></button>
<button class="btn" name="direction" value="right"></button>
<button class="btn" name="direction" value="down"></button>
<button class="btn stop-button" name="direction" value="stop">
Stop
</button>
</form>
</div>
</div>
<div class="container">
<h1>Sensor Data</h1>
<div class="data">
<table id="sensor-data"> <!-- Do not change -->
<thead>
<tr>
<th>Sensor</th>
<th>Value</th>
</tr>
</thead>
<tbody>
<!-- Sensor data rows will be inserted here -->
</tbody>
</table>
</div>
</div>
<script src="../static/script.js"></script>
</body>
</html>
{% endblock %}


@@ -0,0 +1,21 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Document</title>
<link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
<footer class="footer">
<img src="{{url_for('static', filename='images/logo_kobuki.png')}}" alt="logo" class="imgNav" />
<h3>© 2024 Kobuki Robot Project. All rights reserved.</h3>
<div class="buttonContainer">
<a href="{{ url_for('control') }}" target="_blank">
<button class="click connectButton">Controller</button>
</a>
</div>
</footer>
</body>
</html>


@@ -1,35 +1,58 @@
{% extends 'base.html' %}
{% block head %}
{% extends 'base.html' %} {% block head %}
<link rel="stylesheet" href="../static/style.css" />
{% endblock %}
{% block content %}
{% endblock %} {% block content %}
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Kobuki</title>
<title>Kobuki Robot Project</title>
<link rel="stylesheet" href="../static/style.css" />
</head>
<body>
<div class="container">
<div class="image-section">
<img src="kobuki.jpg" alt="Kobuki Robot" id="robot-image" />
</div>
<div class="button-section">
<form id="form" action="/move" method="post">
<button class="btn" name="direction" value="left"></button>
<button class="btn" name="direction" value="up"></button>
<button class="btn" name="direction" value="right"></button>
<button class="btn" name="direction" value="down"></button>
<button class="btn stop-button" name="direction" value="stop">Stop</button>
</form>
</div>
</div>
<div class="container">
<h1>Sensor Data</h1>
</div>
<script src="../static/script.js"></script>
<section class="container sectionHeight">
<p class="text">
The Kobuki Robot Project is an innovative initiative aimed at developing
a versatile and intelligent robot platform. Our goal is to create a
robot that can navigate autonomously, interact with its environment, and
perform various tasks.
</p>
<img src="{{url_for('static', filename='images/logo.png')}}" alt="logo" class="image" />
</section>
<section class="container sectionHeight" id="about">
<h2>About the Project</h2>
<p>
This project is a collaborative effort involving engineers, researchers,
and enthusiasts. The Kobuki robot is equipped with various sensors,
including bumpers, cliff sensors, and gyroscopes, to help it navigate
and interact with its surroundings.
</p>
<p>Key features of the Kobuki Robot:</p>
<ul>
<li>Autonomous navigation</li>
<li>Obstacle detection and avoidance</li>
<li>Real-time data processing</li>
<li>Remote control and monitoring</li>
</ul>
</section>
<section class="container" id="contact">
<h2>Contact Us</h2>
<form id="contact-form" action="/contact" method="post">
<label for="name">Name:</label>
<input type="text" id="name" name="name" required />
<label for="email">Email:</label>
<input type="email" id="email" name="email" required />
<label for="message">Message:</label>
<textarea id="message" name="message" required></textarea>
<button type="submit">Send</button>
</form>
</section>
{% include 'footer.html' %}
<script src="static/script.js"></script>
</body>
</html>
{% endblock %}


@@ -11,13 +11,9 @@
<img src="{{url_for('static', filename='images/logo_kobuki.png')}}" alt="logo" class="imgNav" />
<h3>Kobuki</h3>
<div class="buttonContainer">
<a
href="https://gitlab.fdmci.hva.nl/propedeuse-hbo-ict/onderwijs/2023-2024/out-a-se-ti/blok-3/vuupoofeehoo27"
target="_blank"
>
<button class="click connectButton">Placeholder</button>
<a href="{{ url_for('control') }}" target="_blank">
<button class="click connectButton">Controller</button>
</a>
<!-- <a href="./signup.html">sign in</a> -->
</div>
</nav>

7
src/Python/wsgi.py Normal file

@@ -0,0 +1,7 @@
import sys
import logging
logging.basicConfig(stream=sys.stderr)
sys.path.insert(0, "/home/ishak/rooziinuubii79/src/Python/flask/web")
from app import app as application


@@ -0,0 +1,7 @@
allow_anonymous false
password_file /etc/mosquitto/passwordfile
listener 8080
protocol websockets
listener 1884
protocol mqtt


@@ -0,0 +1,22 @@
server {
listen 80;
server_name 145.92.224.21;
# Proxy WebSocket connections for MQTT
location /ws/ {
proxy_pass http://localhost:9001;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection "upgrade";
proxy_set_header Host $host;
}
# Proxy HTTP connections for Flask
location / {
proxy_pass http://localhost:5000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
}

7
src/config/nginx.conf Normal file

@@ -0,0 +1,7 @@
stream {
server {
listen 9001;
proxy_pass localhost:8080;
}
}
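With this chain in place (port 80, the /ws/ location, the TCP forward on 9001, and mosquitto's websockets listener on 8080), an MQTT client can reach the broker over WebSockets through the reverse proxy. Below is a minimal sketch with the Paho MQTT C++ client, reusing the address and credentials that appear elsewhere in this repository; both are assumptions here and may not be current.

```cpp
#include <iostream>
#include <mqtt/async_client.h>

int main() {
    // The ws:// URI enters nginx on port 80 and ends up at the mosquitto
    // websockets listener on 8080 via the stream block above.
    mqtt::async_client client("ws://145.92.224.21/ws/", "proxy_test_client");

    mqtt::connect_options opts;
    opts.set_user_name("rpi");
    opts.set_password("rpiwachtwoordofzo");

    try {
        client.connect(opts)->wait();
        client.publish("home/commands", "stop", 4, 1, false)->wait();
        client.disconnect()->wait();
    } catch (const mqtt::exception &exc) {
        std::cerr << "MQTT error: " << exc.what() << std::endl;
        return 1;
    }
    return 0;
}
```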


@@ -0,0 +1,11 @@
# Feedback expert review
Try to formulate the Definition of Done yourselves.
Write better user stories
# Feedback peer review
- Possible coaching on technical matters and task division
- More clarity about what we are going to use, which techniques and tools. That way people can be matched to tasks so that everyone has contributed something.
- More clarity about each other's strengths and learning goals. That makes it easier to estimate who can do what and who can learn what.


@@ -0,0 +1,42 @@
# Main question and sub-questions
**What is the motivation?**
The motivation is the need for secure communication between devices. This is important because unsecured data transfer can lead to data breaches.
**What is the problem/need, and what shows this?**
The problem is that data sent between devices can be vulnerable. This became apparent after we were told that communication between devices had not been handled properly.
**Who has the problem/need?**
Our group, but also, for example, large companies where it is very important that data is transferred securely without falling into the wrong hands.
**When did the problem/need arise?**
The problem arose after we were told that communication between devices had not been handled properly.
**Why is it a problem?**
It is a problem because unsecured communication can lead to data breaches, including breaches of privacy, which can get companies into trouble.
**Where does the problem/need occur (scope)?**
The problem occurs in various sectors where data is sent between devices: in healthcare, in industry, in offices, but also in IoT projects you can have at home.
## Main question
Which communication protocol makes it possible to communicate securely and reliably between IoT devices?
## Sub-questions
1. What does secure and reliable communication between devices involve?
2. Which protocols exist for communicating securely and reliably between devices? (a small sketch of one option follows below this list)
3. What are the advantages and disadvantages of the different protocols?
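As a concrete illustration for sub-question 2 (a sketch, not part of the research results), an MQTT connection secured with TLS using the Paho C++ client could look roughly like this; the broker address, port and certificate path are assumptions.

```cpp
#include <iostream>
#include <mqtt/async_client.h>

int main() {
    // Hypothetical broker; 8883 is the conventional MQTT-over-TLS port.
    mqtt::async_client client("ssl://broker.example.com:8883", "secure_client");

    // TLS settings: the trust store is the CA certificate used to verify the broker.
    mqtt::ssl_options sslopts;
    sslopts.set_trust_store("ca.crt");

    mqtt::connect_options opts;
    opts.set_ssl(sslopts);
    opts.set_user_name("user");
    opts.set_password("password");

    try {
        client.connect(opts)->wait();
        client.publish("sensors/data", "42", 2, 1, false)->wait();
        client.disconnect()->wait();
    } catch (const mqtt::exception &exc) {
        std::cerr << "TLS MQTT error: " << exc.what() << std::endl;
        return 1;
    }
    return 0;
}
```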
## Sources
- Singh, S., & Jyoti. (2024, June 7). Secure Communications Protocols for IoT networks: a survey. https://journal.ijprse.com/index.php/ijprse/article/view/1082
- Nguyen, K. T., Laurent, M., Oualha, N., CEA, & Institut Mines-Telecom. (2015). Survey on secure communication protocols for the Internet of Things. In Ad Hoc Networks (Vol. 32, pp. 17-31) [Journal-article]. http://dx.doi.org/10.1016/j.adhoc.2015.01.006
- Miorandi, D., Sicari, S., De Pellegrini, F., & Imrich Chlamtac. (2012). Internet of things: Vision, applications and research challenges. In Ad Hoc Networks (Vol. 10, pp. 1497-1516) [Journal-article]. Elsevier B.V. http://dx.doi.org/10.1016/j.adhoc.2012.02.016
- Christiano, P. (2023, November 5). Top 9 IoT communication protocols & their features in 2024: An In-Depth guide - ExpertBeacon. Expertbeacon. https://expertbeacon.com/iot-communication-protocol/
- Yugha, R., & Chithra, S. (2020). A survey on technologies and security protocols: Reference for future generation IoT. Journal of Network and Computer Applications, 169, 102763. https://doi.org/10.1016/j.jnca.2020.102763
- De Mendizábal, I. (2022, June 16). IoT Communication Protocols—IoT Data Protocols. Technical Articles. https://www.allaboutcircuits.com/technical-articles/internet-of-things-communication-protocols-iot-data-protocols/
- IoT-technologieën en -protocollen | Microsoft Azure. (n.d.). https://azure.microsoft.com/nl-nl/solutions/iot/iot-technology-protocols
- Het IoT verbinden: wat is MQTT en waarin verschilt het van CoAP? (n.d.). https://www.onlogic.com/nl/blog/het-iot-verbinden-wat-is-mqtt-en-waarin-verschilt-het-van-coap/
- Nader, K. (2023, October 30). Wat zijn de voordelen van het gebruik van WebSocket voor IoT-communicatie? AppMaster - Ultimate All-in No-code Platform. https://appmaster.io/nl/blog/websocket-voor-iot-communicatie
- Sidna, J., Amine, B., Abdallah, N., & Alami, H. E. (2020). Analysis and evaluation of communication Protocols for IoT Applications. Karbala International Journal of Modern Science. https://doi.org/10.1145/3419604.3419754

Binary file not shown.


@@ -0,0 +1,11 @@
# Kobuki
Last sprint I have been busy with the website. My job was to make a controller on the website for the Kobuki, so that we can control the Kobuki through the website. I did this using a protocol called MQTT. MQTT is a protocol that is used to send messages between devices; this is different from the normal protocol that I used last year. It was a bit difficult to get the Kobuki to work with the website. Next sprint I will be working on a sensor, and I will do some research on the different sensors that we can use.