35 Commits

Author SHA1 Message Date
ff89b8a742 revert and change port number 2025-01-14 15:07:44 +01:00
eaee4ebd60 change port 2025-01-14 15:06:54 +01:00
a4ba54161a change streaming protocol to udp 2025-01-14 15:04:06 +01:00
2448691e63 rtsp stream opencv using gstreamer 2025-01-14 14:55:59 +01:00
ff7b148556 start readme 2025-01-08 14:20:38 +01:00
6fe28f997a edited kobuki speedvalue for safety 2025-01-07 12:51:24 +01:00
1bf9ebddab added mutex in python 2025-01-07 11:30:04 +01:00
585a0e9a52 fix thread crash 2025-01-07 11:19:47 +01:00
cb988a5260 store new image in processed_image 2025-01-07 11:09:33 +01:00
5e01e25d9c comment update 2025-01-06 16:40:40 +01:00
0d184261fd updated systemd kobukidriver service file 2025-01-06 16:40:00 +01:00
ccaa722973 edited startup file for kobukidriver (everything works now) 2025-01-06 16:31:52 +01:00
7d1b878c30 fix yolo image boxes 2025-01-06 16:11:03 +01:00
228c508012 attempt to fix broken code 2025-01-06 16:02:17 +01:00
7845feb9f8 update yolo naming in image 2025-01-06 15:55:24 +01:00
20d6d8799d attempt to show name next to image box 2025-01-06 15:45:29 +01:00
9c7c774030 change boxing and text error of YOLO 2025-01-06 15:26:00 +01:00
0832da0d3b change mqtt port in python 2025-01-06 15:16:13 +01:00
a59b9c8714 requirements update 2025-01-06 15:11:27 +01:00
4a05ec5efc added dockerfile 2025-01-06 15:11:21 +01:00
c3d575ccf1 added yolo image object detection to /image 2025-01-06 13:00:00 +01:00
1b0b1e87ce change directories and added driver service config 2025-01-06 11:44:08 +01:00
9c41d64c69 remove mqtt debug print 2025-01-06 11:23:27 +01:00
b48eda9735 change image refresh time 2025-01-06 10:35:54 +01:00
629f9cba92 attempt to fix javascript 2025-01-06 10:34:00 +01:00
51aad34c78 added bracket 2025-01-06 10:08:05 +01:00
1b3fead2b3 documentation 2025-01-06 09:49:56 +01:00
ishak jmilou.ishak 61651a9a02 try to see why sensor value doesnt show on website 2024-12-18 12:22:22 +01:00
ishak jmilou.ishak e5881f1b37 making message print only once 2024-12-18 11:56:43 +01:00
50b6b83299 fix headerfile 2024-12-17 11:19:15 -04:00
10597ba37d avoid busy waiting 2024-12-17 11:17:25 -04:00
6ab5716797 Documentation kobuki, mqtt, opencv 2024-12-13 18:33:24 +01:00
50b90e461f gitignore update 2024-12-13 10:52:53 +01:00
2811036595 change port of mqtt flask 2024-12-12 14:01:39 +01:00
58f1a931a6 Merge branch 'OpenCV' into 'main'
Open cv

See merge request technische-informatica-sm3/ti-projectten/rooziinuubii79!3
2024-12-12 13:48:59 +01:00
24 changed files with 363 additions and 141 deletions

.gitignore vendored

@@ -32,3 +32,4 @@ Makefile
CMakeCache.txt
cmake_install.cmake
src/C++/OpenCV/main
.vs


@@ -1,93 +1,8 @@
# TI-project
## Getting started
To make it easy for you to get started with GitLab, here's a list of recommended next steps.
Already a pro? Just edit this README.md and make it your own. Want to make it easy? [Use the template at the bottom](#editing-this-readme)!
## Add your files
- [ ] [Create](https://docs.gitlab.com/ee/user/project/repository/web_editor.html#create-a-file) or [upload](https://docs.gitlab.com/ee/user/project/repository/web_editor.html#upload-a-file) files
- [ ] [Add files using the command line](https://docs.gitlab.com/ee/gitlab-basics/add-file.html#add-a-file-using-the-command-line) or push an existing Git repository with the following command:
```
cd existing_repo
git remote add origin https://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-project.git
git branch -M main
git push -uf origin main
```
## Integrate with your tools
- [ ] [Set up project integrations](https://gitlab.fdmci.hva.nl/technische-informatica-sm3/ti-project/-/settings/integrations)
## Collaborate with your team
- [ ] [Invite team members and collaborators](https://docs.gitlab.com/ee/user/project/members/)
- [ ] [Create a new merge request](https://docs.gitlab.com/ee/user/project/merge_requests/creating_merge_requests.html)
- [ ] [Automatically close issues from merge requests](https://docs.gitlab.com/ee/user/project/issues/managing_issues.html#closing-issues-automatically)
- [ ] [Enable merge request approvals](https://docs.gitlab.com/ee/user/project/merge_requests/approvals/)
- [ ] [Set auto-merge](https://docs.gitlab.com/ee/user/project/merge_requests/merge_when_pipeline_succeeds.html)
## Test and Deploy
Use the built-in continuous integration in GitLab.
- [ ] [Get started with GitLab CI/CD](https://docs.gitlab.com/ee/ci/quick_start/index.html)
- [ ] [Analyze your code for known vulnerabilities with Static Application Security Testing (SAST)](https://docs.gitlab.com/ee/user/application_security/sast/)
- [ ] [Deploy to Kubernetes, Amazon EC2, or Amazon ECS using Auto Deploy](https://docs.gitlab.com/ee/topics/autodevops/requirements.html)
- [ ] [Use pull-based deployments for improved Kubernetes management](https://docs.gitlab.com/ee/user/clusters/agent/)
- [ ] [Set up protected environments](https://docs.gitlab.com/ee/ci/environments/protected_environments.html)
***
# Editing this README
When you're ready to make this README your own, just edit this file and use the handy template below (or feel free to structure it however you want - this is just a starting point!). Thanks to [makeareadme.com](https://www.makeareadme.com/) for this template.
## Suggestions for a good README
Every project is different, so consider which of these sections apply to yours. The sections used in the template are suggestions for most open source projects. Also keep in mind that while a README can be too long and detailed, too long is better than too short. If you think your README is too long, consider utilizing another form of documentation rather than cutting out information.
## Name
Choose a self-explaining name for your project.
# TI-project - Kobuki
## Description
Let people know what your project can do specifically. Provide context and add a link to any reference visitors might be unfamiliar with. A list of Features or a Background subsection can also be added here. If there are alternatives to your project, this is a good place to list differentiating factors.
This project is a Kobuki robot that drives around in dangerous areas and uses a camera to detect objects in its path.
## Badges
On some READMEs, you may see small images that convey metadata, such as whether or not all the tests are passing for the project. You can use Shields to add some to your README. Many services also have instructions for adding a badge.
## Photos
![Kobuki](/docs/assets/KobukiPhoto.jpg)
## Visuals
Depending on what you are making, it can be a good idea to include screenshots or even a video (you'll frequently see GIFs rather than actual videos). Tools like ttygif can help, but check out Asciinema for a more sophisticated method.
## Installation
Within a particular ecosystem, there may be a common way of installing things, such as using Yarn, NuGet, or Homebrew. However, consider the possibility that whoever is reading your README is a novice and would like more guidance. Listing specific steps helps remove ambiguity and gets people to using your project as quickly as possible. If it only runs in a specific context like a particular programming language version or operating system or has dependencies that have to be installed manually, also add a Requirements subsection.
## Usage
Use examples liberally, and show the expected output if you can. It's helpful to have inline the smallest example of usage that you can demonstrate, while providing links to more sophisticated examples if they are too long to reasonably include in the README.
## Support
Tell people where they can go to for help. It can be any combination of an issue tracker, a chat room, an email address, etc.
## Roadmap
If you have ideas for releases in the future, it is a good idea to list them in the README.
## Contributing
State if you are open to contributions and what your requirements are for accepting them.
For people who want to make changes to your project, it's helpful to have some documentation on how to get started. Perhaps there is a script that they should run or some environment variables that they need to set. Make these steps explicit. These instructions could also be useful to your future self.
You can also document commands to lint the code or run tests. These steps help to ensure high code quality and reduce the likelihood that the changes inadvertently break something. Having instructions for running tests is especially helpful if it requires external setup, such as starting a Selenium server for testing in a browser.
## Authors and acknowledgment
Show your appreciation to those who have contributed to the project.
## License
For open source projects, say how it is licensed.
## Project status
If you have run out of energy or time for your project, put a note at the top of the README saying that development has slowed down or stopped completely. Someone may choose to fork your project or volunteer to step in as a maintainer or owner, allowing your project to keep going. You can also make an explicit request for maintainers.


@@ -0,0 +1,51 @@
# Systemd Services
# What is a service
A service is a program or script that runs in the background and is managed by the system. Services are started at boot time and run until the system is shut down. Services can be started, stopped, and restarted by the system administrator.
# How to manage services with systemd
## Starting a service
To start a service, use the `systemctl start` command followed by the service name. For example, to start the `apache2` service, use the following command:
```bash
sudo systemctl start apache2
```
## Stopping a service
To stop a service, use the `systemctl stop` command followed by the service name. For example, to stop the `apache2` service, use the following command:
```bash
sudo systemctl stop apache2
```
## Restarting a service
To restart a service, use the `systemctl restart` command followed by the service name. For example, to restart the `apache2` service, use the following command:
```bash
sudo systemctl restart apache2
```
## Enabling a service
To enable a service to start at boot time, use the `systemctl enable` command followed by the service name. For example, to enable the `apache2` service, use the following command:
```bash
sudo systemctl enable apache2
```
## Creating a new service
To create a new service, you need to create a new service file in the `/etc/systemd/system/` directory. The service file should have a `.service` extension and contain the following sections:
### Example service file:
```bash
[Unit]
# Description of the service
Description=FlaskApp
# Start the service after the network is up
After=network.target

[Service]
# Run the service as a specific user
User=ishak
# Working directory of the service
WorkingDirectory=/home/ishak/rooziinuubii79/src/Python/flask/web/
# Command to start the service
ExecStart=/usr/bin/gunicorn -w 3 -b 127.0.0.1:5000 app:app
```
Note that systemd only treats whole lines starting with `#` as comments, so the comments are placed on their own lines instead of after the values.
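After saving the unit file, systemd still has to pick it up. A minimal sketch of the usual commands, assuming the file above was saved as `/etc/systemd/system/flaskapp.service` (the name is just an example) and that an `[Install]` section with `WantedBy=multi-user.target` is added if the service should also start at boot:
```bash
sudo systemctl daemon-reload    # reload systemd so it sees the new unit file
sudo systemctl start flaskapp   # start the service immediately
sudo systemctl enable flaskapp  # start it at boot (needs an [Install] section)
sudo systemctl status flaskapp  # check that it is running and show recent log lines
```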

docs/assets/KobukiPhoto.jpg (new binary file, 491 KiB; not shown)

docs/code/Mqtt.md Normal file

@@ -0,0 +1,50 @@
# MQTT
## What is MQTT?
MQTT is a lightweight messaging protocol designed for IoT devices. It enables efficient communication between IoT devices, servers, and applications by letting them publish and subscribe to messages on topics.
## How to connect
To connect to an MQTT server you need to create an instance of the class.
Example:
```cpp
// server address, client ID, client username, client password
MqttClient client("ws://145.92.224.21/ws/", "KobukiRPI", "rpi", "rpiwachtwoordofzo"); // create a client object
```
Later, in the setup function, you need to call `client.connect();` to connect to the MQTT server.
```cpp
client.connect();
```
Once the instance is created and connected, you can subscribe to topics or publish messages to them.
## Subscribing and receiving messages
Example subscribing to a topic:
```cpp
void setup() {
    client.subscribe("home/commands");
}
```
Example receiving latest message from a topic:
```cpp
std::string foo() {
    std::string latestMqttMessage = "";
    latestMqttMessage = client.getLastMessage();
    return latestMqttMessage;
}
```
If you want to subscribe to multiple topics, you need to create multiple instances of the MQTT class.
## Publishing messages
Example publishing a message:
```cpp
void foo(std::string Message) {
    // channel, payload
    client.publishMessage("kobuki/example", Message);
}
```
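Putting the pieces together, here is a minimal sketch of a complete publish/subscribe loop. It assumes the `MqttClient` interface shown above (constructor, `connect`, `subscribe`, `getLastMessage`, `publishMessage`), a header name of `MqttClient.h`, and placeholder credentials:
```cpp
#include <chrono>
#include <string>
#include <thread>

#include "MqttClient.h" // assumed header for the wrapper class used above

// server address, client ID, client username, client password (placeholders)
MqttClient client("ws://145.92.224.21/ws/", "ExampleClient", "user", "password");

int main() {
    client.connect();                  // connect to the broker
    client.subscribe("home/commands"); // listen for incoming commands

    while (true) {
        std::string msg = client.getLastMessage(); // latest message on the subscribed topic
        if (!msg.empty()) {
            client.publishMessage("kobuki/example", msg); // echo it back on another topic
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(100)); // avoid busy waiting
    }
}
```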


@@ -1,20 +1,69 @@
# OpenCV
## Requirements
For the camera we want it to detect what is happening on the video feed and identify it so it can identify dangers.
We want the camera to detect what is happening on the video feed and identify objects so it can recognize dangers.
## Issues
* OpenCL not grabbing gpu
* Solution: https://github.com/Smorodov/Multitarget-tracker/issues/93
## Installation
### Dependencies
* glew
* opencv
* glew (for openGL)
* opencv C++ lib
How to install OpenCV
```bash
sudo apt-get install libopencv-dev
```
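Once the library is installed you can build a program against it. A minimal sketch, assuming the package ships an `opencv4` pkg-config file (the name can differ per distribution):
```bash
g++ main.cpp -o main $(pkg-config --cflags --libs opencv4)
```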
## Code explanation
### Opening the camera with OpenCV
```cpp
VideoCapture cap(0); // Open the default camera (0), which points to /dev/video0. You can change the number to use a different camera
if (!cap.isOpened()) { // if the camera could not be opened, print an error message
    cerr << "Error: Could not open camera" << endl;
    return;
}
```
### Taking a picture and storing it in a variable
```cpp
Mat frame; // create a new Matrix variable called frame
while (true) {
    cap >> frame; // Capture a new image frame.
    if (frame.empty()) { // if the frame is empty, print an error and try again
        cerr << "Error: Could not capture image" << endl;
        continue;
    }
```
### Encoding the image for sending it over MQTT
```cpp
vector<uchar> buf; // create a dynamic buffer for the image
imencode(".jpg", frame, buf); //encode the image to the buffer
auto* enc_msg = reinterpret_cast<unsigned char*>(buf.data());
```
```cpp
void CapnSend() {
    while (true) {
        // Convert the image to a byte array (see the encoding step above)
        // Publish the image data
        client.publishMessage("kobuki/cam", string(enc_msg, enc_msg + buf.size()));
        cout << "Sent image" << endl;
        std::this_thread::sleep_for(std::chrono::milliseconds(300)); // send an image roughly every 300 ms
    }
}
```
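On the receiving side the JPEG bytes have to be decoded back into a `Mat` before OpenCV can work with them. A minimal sketch, assuming the MQTT payload arrives as a `std::string` exactly as it was published above:
```cpp
#include <iostream>
#include <string>
#include <vector>
#include <opencv2/opencv.hpp>
using namespace cv;
using namespace std;

// Decode a JPEG payload received over MQTT back into an image
Mat decodePayload(const string& payload) {
    vector<uchar> buf(payload.begin(), payload.end()); // copy the raw bytes into a buffer
    Mat frame = imdecode(buf, IMREAD_COLOR);            // decode the JPEG into a BGR image
    if (frame.empty()) {
        cerr << "Error: Could not decode image" << endl;
    }
    return frame;
}
```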
## Sources
* https://github.com/UnaNancyOwen/OpenCVDNNSample/tree/master


@@ -0,0 +1,25 @@
# Kobuki driver
## How do I communicate with the Kobuki?
You can communicate with the Kobuki over USB serial or through the large serial port on the front. We chose the USB port, paired with a Raspberry Pi.
The Kobuki sends a message every 20 ms at a baud rate of 115200. Each message contains all the sensor data and always starts with the same two header bytes, 0xAA and 0x55.
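Because the serial stream is continuous, the driver first has to find the start of a message. A minimal sketch of syncing on the two header bytes, assuming a `readByte()` helper that returns the next byte from the serial port (the helper name is only an example):
```cpp
// Skip bytes until the 0xAA 0x55 header is found; the next byte is then the payload length
bool syncToHeader() {
    unsigned char prev = 0x00;
    for (int i = 0; i < 1024; ++i) {    // give up after 1024 bytes
        unsigned char cur = readByte(); // assumed helper: read one byte from the serial port
        if (prev == 0xAA && cur == 0x55) {
            return true;                // header found, the message body follows
        }
        prev = cur;
    }
    return false;
}
```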
## Kobuki payloads
To communicate with the Kobuki we need to send it payloads. These are structured the same way as the payloads that the Kobuki sends.
```cpp
unsigned char KobukiPayload[11] = {
    0xaa,               // Start byte 1
    0x55,               // Start byte 2
    0x08,               // Payload length (the first 2 bytes don't count)
    0x01,               // Payload type (0x01 = control command)
    0x04,               // Control byte or additional identifier
    actual_speed % 256, // Lower byte of speed value (max actual_speed 1024)
    actual_speed >> 8,  // Upper byte of speed value
    0x00,               // Placeholder for radius
    0x00,               // Placeholder for radius
    0x00                // Placeholder for checksum (will be applied later)
};
```
You can also find the documentation about the payloads on the Kobuki website.
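The last byte of a payload is its checksum: the Kobuki protocol XORs every byte after the two header bytes. A minimal sketch of filling in the checksum placeholder of the `KobukiPayload` array above (check the official payload documentation to confirm the rule for your firmware):
```cpp
// XOR all bytes after the 0xAA 0x55 header and store the result in the last byte
void applyChecksum(unsigned char* payload, int length) {
    unsigned char checksum = 0x00;
    for (int i = 2; i < length - 1; ++i) { // skip the two header bytes, stop before the checksum byte
        checksum ^= payload[i];
    }
    payload[length - 1] = checksum;
}

// usage: applyChecksum(KobukiPayload, 11);
```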


@@ -576,7 +576,7 @@ void CKobuki::robotSafety(std::string *pointerToMessage) {
parser.data.CliffCenter || parser.data.CliffRight) {
std::cout << "Safety condition triggered!" << std::endl; // Debug print
*pointerToMessage = "estop";
forward(-100); // reverse the robot
forward(-300); // reverse the robot
}
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(100)));
}
@@ -590,8 +590,10 @@ void CKobuki::robotSafety() {
parser.data.BumperRight || parser.data.CliffLeft ||
parser.data.CliffCenter || parser.data.CliffRight) {
std::cout << "Safety condition triggered!" << std::endl; // Debug print
forward(-100); // reverse the robot
forward(-300); // reverse the robot
}
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(100)));
}
}


@@ -2,6 +2,8 @@
#include <iostream>
// should either use the check value or look at the payload length to see which extra fields are present
int KobukiParser::parseKobukiMessage(TKobukiData &output, unsigned char *data) {
std::this_thread::sleep_for(std::chrono::milliseconds(static_cast<int>(20))); //avoid busy waiting. The kobuki sends a message every 20ms
int rtrnvalue = checkChecksum(data);
if (rtrnvalue != 0) {
// std::cerr << "Invalid checksum" << std::endl;


@@ -2,6 +2,8 @@
#define KOBUKIPARSER_H
#include <vector>
#include <thread>
struct TRawGyroData {
int x, y, z;


@@ -37,7 +37,6 @@ void MqttClient::subscribe(const std::string& topic, int qos) {
void MqttClient::publishMessage(const std::string& topic, const std::string& payload) {
try {
std::cout << "Publishing message: " << payload << std::endl;
client_.publish(topic, payload)->wait();
} catch (const mqtt::exception& exc) {
std::cerr << "Error: " << exc.what() << std::endl;


@@ -36,7 +36,11 @@ int main()
std::thread sendMqtt([&]() { sendKobukiData(robot.parser.data); });
while(true){
parseMQTT(readMQTT());
std::string message = readMQTT();
if (!message.empty()){
parseMQTT(message);
}
}
sendMqtt.join();
@@ -46,10 +50,13 @@ int main()
std::string readMQTT()
{
message = client.getLastMessage();
if (!message.empty())
static std::string lastMessage;
std::string message = client.getLastMessage();
if (!message.empty() && message != lastMessage)
{
std::cout << "MQTT Message: " << message << std::endl;
lastMessage = message;
}
// Add a small delay to avoid busy-waiting
@@ -282,12 +289,23 @@ void sendKobukiData(TKobukiData &data) {
}
void CapnSend() {
VideoCapture cap(0);
int fps = 15;
int width = 800;
int height = 600;
VideoCapture cap("/dev/video0");
if (!cap.isOpened()) {
cerr << "Error: Could not open camera" << endl;
return;
}
VideoWriter out("appsrc ! videoconvert ! video/x-raw,format=I420 ! x264enc speed-preset=ultrafast bitrate=600 key-int-max=" + to_string(fps * 2) + " ! video/x-h264,profile=baseline ! mpegtsmux ! udpsink host=127.0.0.1 port=5001",
CAP_GSTREAMER, 0, fps, Size(width, height), true);
if (!out.isOpened()) {
cerr << "Error: Can't open video writer" << endl;
return;
}
Mat frame;
while (true) {
cap >> frame; // Capture a new image frame
@@ -296,15 +314,10 @@ void CapnSend() {
continue;
}
// Convert the image to a byte array
vector<uchar> buf;
imencode(".jpg", frame, buf);
auto* enc_msg = reinterpret_cast<unsigned char*>(buf.data());
// Publish the image data
client.publishMessage("kobuki/cam", string(enc_msg, enc_msg + buf.size()));
// Write the frame to the UDP stream
out.write(frame);
cout << "Sent image" << endl;
std::this_thread::sleep_for(std::chrono::milliseconds(300)); // Send image every 1000ms
std::this_thread::sleep_for(std::chrono::milliseconds(1000 / fps)); // Control the frame rate
}
}

src/Python/YOLO/app.py Normal file

@@ -0,0 +1,41 @@
from ultralytics import YOLO
import cv2
import numpy as np
import requests
import time
model = YOLO("yolo11n.pt")

# try to fetch the image from the given url
def fetch_image(url):
    try:
        response = requests.get(url)
        response.raise_for_status()
        image_array = np.frombuffer(response.content, np.uint8)
        image = cv2.imdecode(image_array, cv2.IMREAD_COLOR)
        return image
    except requests.RequestException as e:
        print(f"Error: Could not fetch image - {e}")
        return None

# URL of the photostream
url = "http://145.92.224.21/image"

while True:
    frame = fetch_image(url)
    if frame is None:
        print("Error: Could not fetch image, retrying...")
        time.sleep(1)  # Wait for 1 second before retrying
        continue
    # Predict on the frame
    results = model(frame)
    # Display the results
    results[0].show()
    # Exit if 'q' is pressed
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cv2.destroyAllWindows()


@@ -0,0 +1 @@
__pycache__


@@ -0,0 +1,18 @@
FROM python:3.9
WORKDIR /app
COPY . .
RUN apt-get update && apt-get install -y libgl1
RUN pip install -r requirements.txt
EXPOSE 5000
CMD ["python", "web/app.py"]
#build instruction: sudo docker buildx build -t flaskapp:latest .
#run instruction: sudo docker run --network="host" flaskapp:latest
# need to use network host to connect to the host's mqtt server


@@ -0,0 +1,5 @@
Flask==3.1.0
paho-mqtt==1.6.1
ultralytics==8.3.58
opencv-python-headless==4.6.0.66
numpy==1.23.4


@@ -1,38 +1,75 @@
from flask import Flask, Response, request, render_template, jsonify
import paho.mqtt.client as mqtt
from ultralytics import YOLO
import cv2
import numpy as np
import threading
app = Flask(__name__)
kobuki_message = "empty"
# Load a model
model = YOLO("yolo11n.pt") # pretrained YOLO11n model
kobuki_message = ""
latest_image = None
processed_image = None
yolo_results = []
# Lock for thread-safe access to shared variables
lock = threading.Lock()
# List of class names (example for COCO dataset)
yolo_classes = list(model.names.values())
def on_message(client, userdata, message):
    global kobuki_message, latest_image
    global kobuki_message, latest_image, processed_image, yolo_results
    if message.topic == "kobuki/data":
        kobuki_message = str(message.payload.decode("utf-8"))
    elif message.topic == "kobuki/cam":
        latest_image = message.payload
        with lock:  # Lock the shared variables between threads so they can't be accessed at the same time and you can't have half-processed images
            latest_image = np.frombuffer(message.payload, np.uint8)
            latest_image = cv2.imdecode(latest_image, cv2.IMREAD_COLOR)
            # Process the image with YOLO
            results = model(latest_image)
            yolo_results = []
            processed_image = latest_image.copy()  # Create a copy for processing
            for result in results:
                for box in result.boxes:
                    class_id = int(box.cls.item())
                    class_name = yolo_classes[class_id]
                    yolo_results.append({
                        "class": class_name,
                        "confidence": box.conf.item(),
                        "bbox": box.xyxy.tolist()
                    })
                    # Draw bounding box on the processed image
                    x1, y1, x2, y2 = map(int, box.xyxy[0])
                    cv2.rectangle(processed_image, (x1, y1), (x2, y2), (0, 255, 0), 2)
                    cv2.putText(processed_image, f"{class_name} {box.conf.item():.2f}", (x1, y1 - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.9, (0, 255, 0), 2)
# Create an MQTT client instance
mqtt_client = mqtt.Client()
mqtt_client.username_pw_set("server", "serverwachtwoordofzo")
mqtt_client.connect("localhost", 80, 60)
mqtt_client.connect("localhost", 1884, 60)
mqtt_client.loop_start()
mqtt_client.subscribe("kobuki/data")
mqtt_client.subscribe("kobuki/cam")
mqtt_client.on_message = on_message # this lines needs to be under the function definition otherwise it cant find which function it needs to use
mqtt_client.on_message = on_message # this line needs to be under the function definition otherwise it can't find which function it needs to use
@app.route('/')
def index():
    return render_template('index.html')

@app.route('/control', methods=["GET","POST"])
@app.route('/control', methods=["GET", "POST"])
def control():
    if request.authorization and request.authorization.username == 'ishak' and request.authorization.password == 'kobuki':
        return render_template('control.html')
    else:
        return ('Unauthorized', 401, {'WWW-Authenticate': 'Basic realm="Login Required"'})

@app.route('/move', methods=['POST'])
def move():
    data = request.get_json()
@@ -52,20 +89,21 @@ def data():
@app.route('/image')
def image():
global latest_image
if latest_image is not None:
return Response(latest_image, mimetype='image/jpeg')
else:
return "No image available", 404
@app.route('/phpmyadmin/<path:path>')
def phpmyadmin_passthrough(path):
# Let Apache handle this route directly
return "", 404
global processed_image
with lock: # Lock the shared variables between threads so they can't be accessed at the same time and you can't have half-processed images
if processed_image is not None:
_, buffer = cv2.imencode('.jpg', processed_image)
return Response(buffer.tobytes(), mimetype='image/jpeg')
else:
return "No image available", 404
@app.route('/yolo_results', methods=['GET'])
def yolo_results_endpoint():
global yolo_results
with lock:
return jsonify(yolo_results)
if __name__ == '__main__':
app.run(debug=True, port=5000)
app.run(debug=True, port=5000)


@@ -14,7 +14,7 @@ document.addEventListener("DOMContentLoaded", function() {
body: JSON.stringify({ direction: direction })
})
.then(response => response.json())
.then(data => {script
.then(data => {
console.log("Success:", data);
})
.catch(error => {
@@ -25,9 +25,13 @@ document.addEventListener("DOMContentLoaded", function() {
// Fetch data from the server
async function fetchData() {
const response = await fetch("/data");
const data = await response.json();
return data;
try {
const response = await fetch("/data");
const data = await response.json();
return data;
} catch (error) {
console.error("Error:", error);
}
}
// Parse the data and show it on the website
@@ -49,9 +53,9 @@ document.addEventListener("DOMContentLoaded", function() {
img.src = "/image?" + new Date().getTime(); // Add timestamp to avoid caching
}
// Fetch and display sensor data every 5 seconds
// Fetch and display sensor data every 1 second
setInterval(parseData, 1000);
// Update the image every 5 seconds
setInterval(updateImage, 200);
// Update the image every 200 milliseconds
setInterval(updateImage, 100);
});



@@ -1,7 +0,0 @@
import sys
import logging
logging.basicConfig(stream=sys.stderr)
sys.path.insert(0, "/home/ishak/rooziinuubii79/src/Python/flask/web")
from app import app as application


@@ -0,0 +1,13 @@
[Unit]
Description=kobukiDriver
After=network.target
[Service]
User=user1
WorkingDirectory=/home/user1/rooziinuubii79/src/C++/Driver/
ExecStart=/home/user1/rooziinuubii79/src/C++/Driver/kobuki_control
Restart=always
RestartSec=5
[Install]
WantedBy=multi-user.target