figshare

Deep-LfD: Deep robot learning from demonstrations

Version 5 2024-10-21, 15:24
Version 4 2024-03-12, 19:30
Version 3 2023-10-29, 16:46
journal contribution
posted on 2024-10-21, 15:24 authored by Amir Ghalamzan Esfahani, Kiyanoush Nazari, Hamidreza Hashempour, Fangxun Zhong

Like other robot learning from demonstration (LfD) approaches, deep-LfD builds a task model from sample demonstrations. Unlike conventional LfD, however, the deep-LfD model learns the relation between high-dimensional visual sensory information and the robot trajectory/path. This paper presents a dataset of successful needle insertions into deformable objects performed with the da Vinci Research Kit, on which several deep-LfD models are built as a benchmark for models that learn a robot controller for the needle insertion task.
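To make the idea concrete, the following is a minimal, hypothetical sketch of the deep-LfD setting described above: learning a mapping from high-dimensional visual observations to robot trajectory waypoints from demonstrations. A linear least-squares regressor stands in for the deep network, and all shapes, names, and data are illustrative toys, not drawn from the actual dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: flattened image features and trajectory waypoint vectors.
n_demos, img_dim, traj_dim = 50, 64, 6
images = rng.normal(size=(n_demos, img_dim))       # visual observation per demonstration
true_map = rng.normal(size=(img_dim, traj_dim))
trajectories = images @ true_map                   # demonstrated trajectories (noiseless toy data)

# "Training": fit the observation-to-trajectory mapping from demonstrations.
# A deep-LfD model would replace this with a neural network trained by regression.
learned_map, *_ = np.linalg.lstsq(images, trajectories, rcond=None)

# "Deployment": predict a trajectory for a previously unseen observation.
new_image = rng.normal(size=(1, img_dim))
predicted_trajectory = new_image @ learned_map     # shape (1, traj_dim)
```

The design point the sketch illustrates is the core deep-LfD contract: the controller is a learned function from raw sensory input to a trajectory, fit purely from demonstration pairs, rather than a hand-specified insertion policy.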

History

School affiliated with

  • Lincoln Institute for Agri-Food Technology (Research Outputs)

Publication Title

Software Impacts

Volume

9

Pages/Article Number

100087

Publisher

Elsevier

ISSN

2665-9638

Date Submitted

2021-06-14

Date Accepted

2021-05-18

Date of First Publication

2021-05-28

Date of Final Publication

2021-08-31

Date Document First Uploaded

2021-06-08

ePrints ID

45212
