Posts

Tuesday, 26 February 2019

How to install Anaconda on Ubuntu 16.04 and some commands for conda

How to install Anaconda on Ubuntu 16.04


curl -O https://repo.continuum.io/archive/Anaconda3-4.3.1-Linux-x86_64.sh
bash Anaconda3-4.3.1-Linux-x86_64.sh
source ~/.bashrc

Simple Commands

conda list
List the packages installed in the current conda environment.
conda create --name env1 python=3
Create a new environment named env1 with Python 3.
source activate env1
Activate the environment named env1.
source deactivate
Deactivate the current environment.
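To double-check that the activated environment is really the one you expect, a quick check from inside Python (after 'source activate env1') could look like the sketch below; the envs path in the comment is the usual Anaconda layout and is an assumption, not something from the install above.

import sys
print(sys.version)     # should report a Python 3.x version for env1
print(sys.executable)  # typically points inside .../anaconda3/envs/env1/bin/python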

Tuesday, 19 February 2019

RC car - saving notes for now

Just saving these notes for now.

Pixhawk2 Calibration

  1. Install Mission Planner (Windows) / QGroundControl (Ubuntu)
  2. Install the firmware
    • Use the "download other firmware from the internet" option to download and install the ArduRover firmware for FMUv3 (the Pixhawk1 build does not work)
  3. Calibrate the accelerometer, compass, radio, and ESC (optional) following Mission Planner's instructions
    • For radio calibration Mission Planner requires at least four channels (roll, pitch, yaw, throttle), but the controller in use only provides two channels by default.
    • Assign the AUX1~3 channels arbitrarily (the two default channels are mapped to roll and throttle, which are what the rover needs).
    • By default Mission Planner uses the channel 3 signal for throttle control, so change the parameter so that channel 2 controls the throttle (a sketch of doing this through MAVROS is shown below).
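The post changes this mapping in Mission Planner's parameter list. As an alternative illustration only, the same change could be made from the companion board through MAVROS with the sketch below; it assumes the relevant ArduRover parameter is RCMAP_THROTTLE and that a mavros_node is already connected to the Pixhawk (both are assumptions, not taken from this post).

#!/usr/bin/env python
# Sketch (assumption: RCMAP_THROTTLE is the parameter that selects the
# throttle input channel; mavros must already be running and connected).
import rospy
from mavros_msgs.srv import ParamSet
from mavros_msgs.msg import ParamValue

rospy.init_node('set_throttle_channel')
rospy.wait_for_service('/mavros/param/set')
set_param = rospy.ServiceProxy('/mavros/param/set', ParamSet)

# integer-valued parameter: fill the 'integer' field, leave 'real' at 0
resp = set_param(param_id='RCMAP_THROTTLE', value=ParamValue(integer=2, real=0.0))
print(resp.success)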

Installing MAVROS on the TK1

  1. Install ros-indigo on the TK1
    • Refer to the ROS wiki
  2. Install MAVROS on the TK1
sudo apt-get install python-catkin-tools
sudo apt-get install ros-indigo-mavros

Serial, USB

Figure 1

  1. To control the Pixhawk from the TK1 board, connect the Pixhawk's TELEM2 port through a USB-to-TTL adapter as shown in Figure 1.
  2. In QGroundControl, change the SYS_COMPANION parameter to "Companion Link (921600 baudrate, 8N1)" (in Mission Planner the default is 57600).
ls /dev/ttyUSB*
sudo chmod 666 /dev/ttyUSB* #for permission
roscore
rosrun mavros mavros_node _fcu_url:="/dev/ttyUSB*:921600"
    • In this case the serial-to-USB driver was not installed on the TK1 (it is usually included in Ubuntu by default).
    • The driver depends on the manufacturer of the USB-to-TTL adapter (in this case the CP210x driver had to be installed).
After downloading the driver source from the manufacturer:
make
cp cp210x.ko /lib/modules/<kernel-version>/kernel/drivers/usb/serial
insmod /lib/modules/<kernel-version>/kernel/drivers/usb/serial/usbserial.ko # not present in this case
insmod cp210x.ko # load the driver
When running make on the TK1, an error occurs because the GENMASK macro is not defined (GENMASK is defined from kernel version 3.19, but the TK1 runs 3.13).
(Therefore GENMASK has to be defined in the file where the error occurs.)
(The GENMASK code can be found online.)


Arming
To operate the Pixhawk it first has to be armed. If an error occurs at this point, set the parameter that skips the pre-flight (pre-arm check) stage of arming.

roscore
rosrun mavros mavros_node _fcu_url:="/dev/ttyUSB0:57600"
rosservice call /mavros/set_mode "{base_mode: 0, custom_mode: 'GUIDED'}" # switch to GUIDED (the mode that allows an external board to control the Pixhawk)
rosservice call /mavros/cmd/arming "value: true" # arming
# Press the arming switch (blinking -> solid)
# After this, the rover can be controlled through various topics.
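For reference, the same mode change and arming can also be done from a small Python node instead of the rosservice calls above. This is only a sketch using the standard mavros service and message types; it assumes roscore and mavros_node are already running as in the commands above.

#!/usr/bin/env python
# Sketch: switch the Pixhawk to GUIDED and arm it through mavros services.
import rospy
from mavros_msgs.msg import State
from mavros_msgs.srv import SetMode, CommandBool

rospy.init_node('arm_rover')

# wait until mavros reports a connection to the Pixhawk
state = rospy.wait_for_message('/mavros/state', State)
print('connected: %s, mode: %s' % (state.connected, state.mode))

rospy.wait_for_service('/mavros/set_mode')
rospy.wait_for_service('/mavros/cmd/arming')
set_mode = rospy.ServiceProxy('/mavros/set_mode', SetMode)
arming = rospy.ServiceProxy('/mavros/cmd/arming', CommandBool)

print(set_mode(base_mode=0, custom_mode='GUIDED'))  # GUIDED: external board can control the Pixhawk
print(arming(value=True))                           # arm (press the arming switch as well)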



Monday, 18 February 2019

[ORB SLAM] What is in the code



ORB_SLAM2/Examples/Monocular


I read 'mono_tum.cc', but steps 1-7 below are the same in every .cc example file.

  1. Retrieve paths to images.
  2. Create a SLAM system. It initializes all system and gets ready to process frames.
  3. Create a vector for tracking time statistics.
  4. Main loop
    1. Read an image from file
    2. Pass the image to the SLAM system
    3. Wait to load the next frame
    4. repeat!
  5. Stop all threads
  6. Tracking time statistics
  7. Save keyframe trajectory. After running, the keyframe trajectory is saved as 'KeyFrameTrajectory.txt', so rename it or move it to another directory if you want to keep it.

# There is a part that skips the first three lines of the 'rgb.txt' file. There is no such part in 'mono_kitti.cc'.
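As a rough Python illustration of that image-list loading step (the real code is C++ in mono_tum.cc), reading 'rgb.txt' and skipping the comment header could look like the sketch below; the file format is the TUM one described here, and the example file name in the comment is just an assumption.

# Sketch: parse a TUM-style rgb.txt into (timestamp, image path) pairs.
# Lines starting with '#' are the comment lines that mono_tum.cc skips.
def load_images(rgb_txt_path):
    timestamps, filenames = [], []
    with open(rgb_txt_path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            t, name = line.split()
            timestamps.append(float(t))
            filenames.append(name)  # e.g. 'rgb/1305031102.175304.png'
    return timestamps, filenames

timestamps, filenames = load_images('rgb.txt')
print(len(filenames), 'images')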

In the '.yaml' file there are camera parameters, including the camera calibration and distortion parameters, the camera frames per second, and the color order of the images. There are also ORB parameters for SLAM, such as the number of features per image, the scale factor between levels in the scale pyramid, and the number of levels in the scale pyramid.


Wednesday, 13 February 2019

ImportError after running pip install on Ubuntu

Traceback (most recent call last):
  File "/usr/bin/pip", line 9, in <module>
    from pip import main
ImportError: cannot import name main

Running the following in the terminal fixed it:
hash -d pip

Thanks!

Tuesday, 12 February 2019

How to run ORB SLAM with KITTI Dataset




Monocular Examples

1. Download the dataset (grayscale images) from http://www.cvlibs.net/datasets/kitti/eval_odometry.php

2. Execute the following command. Change KITTIX.yaml to KITTI00-02.yaml, KITTI03.yaml or KITTI04-12.yaml for sequences 0 to 2, 3, and 4 to 12 respectively. Change PATH_TO_DATASET_FOLDER to the uncompressed dataset folder. Change SEQUENCE_NUMBER to 00, 01, 02, ..., 11.


./Examples/Monocular/mono_kitti Vocabulary/ORBvoc.txt Examples/Monocular/KITTIX.yaml PATH_TO_DATASET_FOLDER/dataset/sequences/SEQUENCE_NUMBER



Map I got

In the KeyFrameTrajectory.txt file, every row has 8 entries containing the time stamp (in seconds), the position and the orientation: 'timestamp x y z q_x q_y q_z q_w'.
So to extract the trajectory, you can for example load the file as a table (similar to a .csv file); columns 2 to 4 are then your x, y, z values (or 1 to 3 if you count from 0).
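For example, loading the trajectory with numpy and taking the position columns could look like this minimal sketch (it assumes KeyFrameTrajectory.txt is in the current directory):

import numpy as np

# each row: timestamp x y z q_x q_y q_z q_w
data = np.loadtxt('KeyFrameTrajectory.txt')
xyz = data[:, 1:4]   # columns 1..3 (0-based) are x, y, z
quat = data[:, 4:8]  # q_x, q_y, q_z, q_w
print(xyz[:5])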

Stereo Examples


1. Download the dataset above.

2. Execute the following command. Change KITTIX.yaml to KITTI00-02.yaml, KITTI03.yaml or KITTI04-12.yaml for sequences 0 to 2, 3, and 4 to 12 respectively. Change PATH_TO_DATASET_FOLDER to the uncompressed dataset folder. Change SEQUENCE_NUMBER to 00, 01, 02, ..., 11.

./Examples/Stereo/stereo_kitti Vocabulary/ORBvoc.txt Examples/Stereo/KITTIX.yaml PATH_TO_DATASET_FOLDER/dataset/sequences/SEQUENCE_NUMBER

How to convert ogv to mp4 in Ubuntu 16.04

How to convert ogv file to mp4 file

I am using Ubuntu 16.04 and RecordMyDesktop to record my screen. This program produces an .ogv file as its output, so I wanted to convert it to an mp4 file.

wget http://ffmpeg.gusari.org/static/32bit/ffmpeg.static.32bit.latest.tar.gz
tar xzvf ffmpeg.static.32bit.latest.tar.gz

Get ffmpeg like above, or install it with apt as below.

sudo apt-get install ffmpeg

Then run the command below. 'input.ogv' is the input file (in my case the RecordMyDesktop output, out-1.ogv). The value 5 after -crf controls the quality: the larger the number, the lower the quality (and the smaller the file). If you want to shrink the output file, increase this number.

ffmpeg -i input.ogv -aq 80 -vcodec libx264 -preset slow -crf 5 -threads 0 output.mp4



Friday, 8 February 2019

How to run ORB SLAM with your own data in Ubuntu 16.04




1. Take your video at a resolution of 640*480 (VGA).

2. Save each frame of your video as a separate image in the PNG format.

sudo apt install ffmpeg
ffmpeg -i testvideo.mp4 frame%d.png
My video is named testvideo.mp4 and I wanted to convert every frame of it into frame%d.png files. The PNG files are created in the same folder as the video.

3. Generate a text file like the picture below. The file contains the timestamp and filename of each image (frame). You can generate your own timestamps; there is no single correct way to set them, but a smaller time gap makes the SLAM run faster.
one of the text files of the TUM datasets
frame_num = 235
f = open('rgb.txt', 'w+')

for i in range(frame_num):
    f.write('%f rgb/frame%d.png\n' %(0.4*(i+1), i+1))

f.close()

I made the 'rgb.txt' file like above.

4. The images should be saved in a folder named 'rgb' in the main folder (let us name it 'test'), and the text file should be named 'rgb.txt' and saved in the 'test' folder.

File layout
I set up my files as above.

5. Go to your SLAM folder and run.

./Examples/Monocular/mono_tum Vocabulary/ORBvoc.txt Examples/Monocular/TUM1.yaml ./test
Result

Thursday, 7 February 2019

How to install ORB_SLAM2 and test it on Ubuntu 16.04

< What I did >


1. Create a new folder

mkdir ORB_SLAM
cd ORB_SLAM

2. Install the prerequisite software

(1) Update apt library
sudo apt-get update
(2) Install git
sudo apt-get install git
(3) Install cmake
sudo apt-get install cmake
(4) Install Pangolin
Installation dependencies:
OpenGL / GLEW:
sudo apt-get install libglew-dev
Boost:
sudo apt-get install libboost-dev libboost-thread-dev libboost-filesystem-dev
Python2/Python3:
sudo apt-get install libpython2.7-dev
To compile the base library:
sudo apt-get install build-essential
How to build Pangolin:
git clone https://github.com/stevenlovegrove/Pangolin.git
cd Pangolin
mkdir build
cd build
cmake ..
cmake --build .
(5) Install Eigen
cd ~/ORB_SLAM
sudo apt install mercurial
hg clone https://bitbucket.org/eigen/eigen/
cd eigen
mkdir build
cd build
cmake ..
make
sudo make install
(6) Install the BLAS and LAPACK library
sudo apt-get install libblas-dev
sudo apt-get install liblapack-dev
(7) Install OpenCV
# Install Dependencies

sudo apt-get update
sudo apt-get install -y build-essential
sudo apt-get install -y cmake
sudo apt-get install -y libgtk2.0-dev
sudo apt-get install -y pkg-config
sudo apt-get install -y python-numpy python-dev
sudo apt-get install -y libavcodec-dev libavformat-dev libswscale-dev
sudo apt-get install -y libjpeg-dev libpng-dev libtiff-dev libjasper-dev
sudo apt-get -qq install libopencv-dev build-essential checkinstall cmake pkg-config yasm libjpeg-dev libjasper-dev libavcodec-dev libavformat-dev libswscale-dev libdc1394-22-dev libxine2 libgstreamer0.10-dev libgstreamer-plugins-base0.10-dev libv4l-dev python-dev python-numpy libtbb-dev libqt4-dev libgtk2.0-dev libmp3lame-dev libopencore-amrnb-dev libopencore-amrwb-dev libtheora-dev libvorbis-dev libxvidcore-dev x264 v4l-utils

# Download opencv-2.4.11

wget http://downloads.sourceforge.net/project/opencvlibrary/opencv-unix/2.4.11/opencv-2.4.11.zip
unzip opencv-2.4.11.zip
cd opencv-2.4.11
mkdir release
cd release
cd ~/ORB_SLAM
git clone https://github.com/opencv/opencv_extra.git
cd opencv-2.4.11/release
Change the -DOPENCV_TEST_DATA_PATH line below to your own path.
  • Set OPENCV_TEST_DATA_PATH environment variable to <path to opencv_extra/testdata>.

cmake \
      -DBUILD_EXAMPLES=ON                                                     \
      -DBUILD_OPENCV_JAVA=OFF                                                 \
      -DBUILD_OPENCV_JS=ON                                                    \
      -DBUILD_OPENCV_NONFREE=ON                                               \
      -DBUILD_OPENCV_PYTHON=ON                                                \
      -DCMAKE_BUILD_TYPE=RELEASE                                              \
      -DCMAKE_INSTALL_PREFIX=$INSTALL_PATH                                    \
      -DCMAKE_LIBRARY_PATH=$CUDA_PATH/lib64/stubs/                            \
      -DCUDA_CUDA_LIBRARY=$CUDA_PATH/lib64/stubs/libcuda.so                   \
      -DCUDA_FAST_MATH=ON                                                     \
      -DCUDA_TOOLKIT_ROOT_DIR=$CUDA_PATH                                      \
      -DENABLE_CCACHE=ON                                                      \
      -DENABLE_FAST_MATH=ON                                                   \
      -DENABLE_PRECOMPILED_HEADERS=OFF                                        \
      -DINSTALL_C_EXAMPLES=ON                                                 \
      -DINSTALL_PYTHON_EXAMPLES=ON                                            \
      -DINSTALL_TESTS=ON                                                      \
      -DOPENCV_EXTRA_MODULES_PATH=$DOWNLOAD_PATH/opencv_contrib/modules/      \
      -DOPENCV_ENABLE_NONFREE=ON                                              \
      -DOPENCV_TEST_DATA_PATH=~/ORB_SLAM/opencv_extra/testdata/              \
      -DWITH_CUBLAS=ON                                                        \
      -DWITH_CUDA=ON                                                          \
      -DWITH_FFMPEG=ON                                                        \
      -DWITH_GDAL=ON                                                          \
      -DWITH_GSTREAMER=ON                                                     \
      -DWITH_LIBV4L=ON                                                        \
      -DWITH_NVCUVID=ON                                                       \
      -DWITH_OPENCL=ON                                                        \
      -DWITH_OPENGL=ON                                                        \
      -DWITH_OPENMP=ON                                                        \
      -DWITH_QT=ON                                                            \
      -DWITH_TBB=ON                                                           \
      -DWITH_V4L=ON                                                           \
      -DWITH_VTK=ON                                                           \
      -DWITH_XINE=ON                                                          \
..

make all -j12 #12 cores
sudo make install
sudo apt-get install python-opencv
I got the CMake error shown below; the solution is at the link.

Error :
CMake Error: The following variables are used in this project, but they are set to NOTFOUND.
Please set them or make sure they are set and tested correctly in the CMake files:
CUDA_nppi_LIBRARY (ADVANCED)

Solution :
https://stackoverflow.com/questions/46584000/cmake-error-variables-are-set-to-notfound

And then, I did cmake again and it worked!


3. Install ORB_SLAM

(1) Clone the repository:
git clone https://github.com/raulmur/ORB_SLAM2.git ORB_SLAM2
(2) Compile:
cd ORB_SLAM2
chmod +x build.sh
./build.sh
I got the error below after this, so I added '#include <unistd.h>' to the 'system.h' file.

Error:
‘usleep’ was not declared in this scope usleep(3000);

Solution:
Modify ORB_SLAM2-master/include/system.h, add "#include <unistd.h>" in header.

And then, I deleted every build folder in ORB_SLAM2, DBoW2, and g2o and ran 'build.sh' again.
It worked!

4. Test ORB_SLAM2

I downloaded a test dataset for a monocular camera for testing.

(1) Download the test data set
Download a sequence from http://vision.in.tum.de/data/datasets/rgbd-dataset/download and uncompress it.
wget https://vision.in.tum.de/rgbd/dataset/freiburg1/rgbd_dataset_freiburg1_xyz.tgz

(2) Execution command:
Execute the following command. Change TUMX.yaml to TUM1.yaml, TUM2.yaml or TUM3.yaml for freiburg1, freiburg2 and freiburg3 sequences respectively. Change PATH_TO_SEQUENCE_FOLDER to the uncompressed sequence folder.

./Examples/Monocular/mono_tum Vocabulary/ORBvoc.txt Examples/Monocular/TUMX.yaml PATH_TO_SEQUENCE_FOLDER


The result is OK. I will post more research notes on the ORB-SLAM algorithm in the future. Stay tuned.



[ new blog ]

new blog https://jihyo-jeon.github.io/