From e0502757672fe0f20321dd1ca8a9216fcdda09e0 Mon Sep 17 00:00:00 2001
From: Lorenzo
Date: Tue, 23 Mar 2021 17:51:35 +0100
Subject: [PATCH 1/3] fix figure

---
 README.md | 4 +---
 1 file changed, 1 insertion(+), 3 deletions(-)

diff --git a/README.md b/README.md
index f939a29..47313c0 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,4 @@
-# Monoloco library    [![Downloads](https://pepy.tech/badge/monoloco)](https://pepy.tech/project/monoloco)
+# Monoloco library      [![Downloads](https://pepy.tech/badge/monoloco)](https://pepy.tech/project/monoloco)
 
 gif
 
@@ -321,8 +321,6 @@ python -m monoloco.run eval \
   --save \
 ````
 
-
-By changing the net and the model, the same command evaluates MonStereo model.

From 83fcb0f3bc507d096740b607ad1df12cfa42b1c8 Mon Sep 17 00:00:00 2001
From: Lorenzo
Date: Wed, 24 Mar 2021 12:25:35 +0100
Subject: [PATCH 2/3] add video links

---
 README.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 47313c0..041e3cf 100644
--- a/README.md
+++ b/README.md
@@ -3,12 +3,12 @@
 
 gif
 
-This library is based on three research projects for monocular/stereo 3D human localization (detection), body orientation, and social distancing.
+This library is based on three research projects for monocular/stereo 3D human localization (detection), body orientation, and social distancing. Check the [demo video](https://www.youtube.com/watch?v=O5zhzi8mwJ4)!
 
 > __MonStereo: When Monocular and Stereo Meet at the Tail of 3D Human Localization__
 > _[L. Bertoni](https://scholar.google.com/citations?user=f-4YHeMAAAAJ&hl=en), [S. Kreiss](https://www.svenkreiss.com), [T. Mordan](https://people.epfl.ch/taylor.mordan/?lang=en), [A. Alahi](https://scholar.google.com/citations?user=UIhXQ64AAAAJ&hl=en)_, ICRA 2021
-__[Article](https://arxiv.org/abs/2008.10913)__                 __[Citation](#Citation)__                 __[Video](#Todo)__
+__[Article](https://arxiv.org/abs/2008.10913)__                 __[Citation](#Citation)__                 __[Video](https://www.youtube.com/watch?v=pGssROjckHU)__
 
@@ -353,7 +353,7 @@ When using this library in your research, we will be happy if you cite us!
 @InProceedings{bertoni_2021_icra,
 author = {Bertoni, Lorenzo and Kreiss, Sven and Mordan, Taylor and Alahi, Alexandre},
 title = {MonStereo: When Monocular and Stereo Meet at the Tail of 3D Human Localization},
-booktitle = {International Conference on Robotics and Automation (ICRA)},
+booktitle = {the International Conference on Robotics and Automation (ICRA)},
 year = {2021}
 }
 ```
@@ -370,7 +370,7 @@ When using this library in your research, we will be happy if you cite us!
 @InProceedings{bertoni_2019_iccv,
 author = {Bertoni, Lorenzo and Kreiss, Sven and Alahi, Alexandre},
 title = {MonoLoco: Monocular 3D Pedestrian Localization and Uncertainty Estimation},
-booktitle = {The IEEE International Conference on Computer Vision (ICCV)},
+booktitle = {the IEEE International Conference on Computer Vision (ICCV)},
 month = {October},
 year = {2019}
 }

From b51c16d7dfb4cf4756c4e7476d1d4c078ecdc89b Mon Sep 17 00:00:00 2001
From: Lorenzo
Date: Fri, 26 Mar 2021 11:09:11 +0100
Subject: [PATCH 3/3] change link for joints_kitti

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 041e3cf..1ce7351 100644
--- a/README.md
+++ b/README.md
@@ -188,7 +188,7 @@ The network estimates orientation and box dimensions as well. Results are saved
 
 ## Training
-We train on the KITTI dataset (MonoLoco/Monoloco++/MonStereo) or the nuScenes dataset (MonoLoco) specifying the path of the json file containing the input joints. Please download them [heere](https://drive.google.com/file/d/1e-wXTO460ip_Je2NdXojxrOrJ-Oirlgh/view?usp=sharing) or follow [preprocessing instructions](#Preprocessing).
+We train on the KITTI dataset (MonoLoco/Monoloco++/MonStereo) or the nuScenes dataset (MonoLoco) specifying the path of the json file containing the input joints. Please download them [here](https://drive.google.com/file/d/1bJPyA1HuX9uyJYf1IhiDqzhkvSokd4l0/view?usp=sharing) or follow [preprocessing instructions](#Preprocessing).
 
 Results for MonoLoco++ are obtained with:
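
For context on the training paragraph touched by patch 3/3: the downloaded joints json is what the trainer consumes. A minimal sketch of such an invocation, assuming a `train` subcommand of `monoloco.run` with a `--joints` flag (mirroring the `eval` call in patch 1/3) and an illustrative file name, neither of which is confirmed by these patches:

```sh
# Sketch only: the `train` subcommand and `--joints` flag are assumed,
# and the json path below is illustrative, not taken from these patches.
python -m monoloco.run train \
  --joints data/arrays/joints-kitti-train.json
```

Pointing `--joints` at the nuScenes json instead would, per that paragraph, retarget training to that dataset (MonoLoco only).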