The goal of this project is to build a robot capable of mapping its environment in a 3D simulation view. A neural network for depth estimation runs on a Jetson Nano, which is also connected to an Arduino Nano that supplies gyro data from its IMU; the depth values are then projected into a 3D world based on the robot's orientation.
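The project description does not include the projection code itself; the following is a minimal NumPy sketch of the idea, assuming a pinhole camera model and a yaw-only rotation derived from the IMU (the function names, intrinsics, and the simplification to a single yaw angle are illustrative assumptions, not the project's actual implementation):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth map (H x W, meters) into camera-frame 3D
    points using the pinhole model; fx, fy, cx, cy are intrinsics."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

def yaw_rotation(yaw):
    """Rotation about the vertical axis from the IMU yaw angle (radians)."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

# Toy example: a flat 2x2 depth map, robot turned 90 degrees.
depth = np.full((2, 2), 1.0)
pts_cam = depth_to_points(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
pts_world = pts_cam @ yaw_rotation(np.pi / 2).T
```

In a full pipeline the rotation would come from the fused gyro/IMU orientation (typically all three axes), and the rotated points would be accumulated into the 3D map frame.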
Implementation of 3D mapping in an indoor environment using Kinect and RGB-D camera sensors. Real-time appearance-based mapping (RTAB-Map) was used to build the 3D map, simulated in the Gazebo environment.
In this project, an autonomous robot is built that uses 3D mapping to map its entire environment and is then able to navigate autonomously toward a detected switchboard. The project was developed with the basic idea of enabling the robot to navigate to the switchboard whenever it needs to be charged.
We utilized a Hadoop cluster via Oracle Big Data Cloud Service - Compute Edition with Hive Query Language to perform geospatial and geotemporal analyses on a ~10 GB dataset from Yelp.com. We analyzed the performance of Yelp features over time, the sentiment of consumers' written reviews, and consumer rating patterns across regions.
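The actual analyses ran as HiveQL aggregations on the cluster; purely as an illustration of the kind of regional rating pattern computed, here is a pandas sketch over a toy stand-in for the Yelp data (the column names and values are assumptions, not the real schema or results):

```python
import pandas as pd

# Toy rows standing in for the ~10 GB Yelp review dataset; the real
# analysis ran equivalent GROUP BY aggregations in HiveQL on Hadoop.
reviews = pd.DataFrame({
    "state": ["AZ", "AZ", "NV", "NV"],
    "year":  [2015, 2016, 2015, 2016],
    "stars": [4, 5, 3, 2],
})

# Regional rating pattern: mean star rating per state per year.
pattern = reviews.groupby(["state", "year"])["stars"].mean().reset_index()
```

The same shape of query (mean rating grouped by region and time bucket) expresses directly in HiveQL with `GROUP BY state, year`.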
This is an online social robot navigation framework that implements several techniques for this purpose, such as social relevance validity checking and an extended social comfort cost function.
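The framework's exact cost formulation is not given here; a minimal sketch of what a social comfort cost term can look like, assuming a Gaussian penalty around each detected person (the function name, `sigma`, and `weight` are hypothetical tuning parameters, not values from the framework):

```python
import math

def social_comfort_cost(robot_xy, people, sigma=1.2, weight=10.0):
    """Hypothetical social comfort term: a Gaussian penalty that grows
    as the robot's position approaches each person in `people`, a list
    of (x, y) positions. Added on top of the usual path cost."""
    cost = 0.0
    for px, py in people:
        d2 = (robot_xy[0] - px) ** 2 + (robot_xy[1] - py) ** 2
        cost += weight * math.exp(-d2 / (2.0 * sigma ** 2))
    return cost
```

A planner would add this term to its edge or cell costs so that paths skirting people's personal space become more expensive than slightly longer detours.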