Machine Learning and Deep Learning Techniques for Colocated MIMO Radars: A Tutorial Overview

Download (11.33 GB)
Dataset posted on 2025-03-11, 14:25, authored by Alessandro Davoli, Giorgio Guerzoni and Giorgio Matteo Vitetta
<p dir="ltr">Last update: February 2021.</p><p dir="ltr">The dataset folder includes both the raw and the post-processed radar data used for training and testing the networks proposed in Sect. VIII of the article “Machine Learning and Deep Learning Techniques for Colocated MIMO Radars: A Tutorial Overview”.</p><p dir="ltr">The folder <b>Human Activity Classification</b> contains:</p><ol><li>A “Raw” folder, whose “doppler_dataset” zip archive contains the 150 files acquired with our FMCW radar sensor, divided into 50 walking, 50 jumping and 50 running acquisitions;</li><li>A “Post_process” folder, divided into:<br>- A “Machine Learning” folder including “dataset_ML_doppler_real_activities.mat”; this dataset has been used for training and testing the SVM, K-NN and AdaBoost classifiers described in Sect. VIII-A. It stores the 150x4 matrix “X_meas”, whose columns are the features described by eqs. (227)-(234), and the 150x1 vector of chars “labels_py” containing the associated labels.<br>- A “Deep Learning” folder containing “dataset_DL_doppler_real_activities.mat”; this dataset consists of 150 structs, each associated with a specific activity and including:<br>- The label of the considered activity;<br>- The overall range variation from the beginning to the end of the motion, “delta_R”;<br>- The Range-Doppler map “RD_map”;<br>- The normalized spectrogram “SP_Norm”;<br>- The Cadence Velocity Diagram “CVD”;<br>- The period of the spectrogram, “per”;<br>- The peaks associated with the three greatest cadence frequencies, “peaks_cad”;<br>- The three strongest cadence frequencies and their normalized versions, “cad_freqs” and “cad_freqs_norm”;<br>- The strongest cadence frequency, “c1”;<br>- The three velocity profiles associated with the three strongest cadence frequencies, “matr_vex”.</li></ol><p dir="ltr">The spectrogram images (“SP_Norm”) contained in this dataset were used for training and testing the CNN described in Sect. VIII-A.</p><p dir="ltr">The folder <b>Obstacle Detection</b> contains:</p><ol><li>A “Raw” folder, whose “obst_detect_Raw_mat” zip archive contains the raw data acquired with our radar system and TOF camera in single-target and multi-target scenarios. Note that each radar frame and each TOF camera image carries its own time stamp; since the two streams come from different sensors, they have to be synchronized.</li><li>A “Post_process” folder, divided into:<br>- A “Neural Net” folder containing “inputs_bis_1.mat” and “t_d_1.mat”, where<br>- “inputs_bis_1.mat” contains the 32x1 feature vectors used for training and testing the feed-forward neural network described in Sect. VIII-B (see eqs. (243)-(251)),<br>- “t_d_1.mat” contains the associated 2x1 label vectors (see eq. (235)).<br>- A “Yolo v2” folder containing the folder “Dataset_YOLO” and the table “obj_dataset_tab”, where<br>- “Dataset_YOLO_v2” contains (inside the sub-folder “obj_dataset”) the Range-Azimuth maps used for training the YOLO v2 network (see eqs. (257)-(258) and Fig. 30);<br>- “obj_dataset_tab” contains the path, the bounding box and the label associated with each Range-Azimuth map (see eqs. (256)-(266)).</li></ol><p dir="ltr"><b>Cite as</b>: A. Davoli, G. Guerzoni and G. M. Vitetta, "Machine Learning and Deep Learning Techniques for Colocated MIMO Radars: A Tutorial Overview," in IEEE Access, vol. 9, pp. 33704-33755, 2021, doi: 10.1109/ACCESS.2021.3061424.</p>
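As an illustration of how some of the Deep Learning struct fields relate to one another, the sketch below derives a Cadence Velocity Diagram (an FFT of the spectrogram along slow time) and the strongest cadence frequencies from a synthetic normalized spectrogram. The variable names (SP_Norm, CVD, cad_freqs, c1) mirror the fields listed above, but the sizes, the slow-time rate and the processing chain are assumptions for illustration, not the dataset's actual pipeline:

```python
import numpy as np

# Synthetic stand-in data; all parameters below are assumptions.
rng = np.random.default_rng(0)
fs_slow = 128.0                 # assumed slow-time (frame) rate, Hz
n_doppler, n_time = 64, 256     # assumed spectrogram size

# Synthetic spectrogram: one Doppler bin modulated at a 2 Hz cadence.
t = np.arange(n_time) / fs_slow
sp = 0.1 * rng.random((n_doppler, n_time))
sp[32, :] += 1.0 + np.cos(2 * np.pi * 2.0 * t)
SP_Norm = sp / sp.max()         # normalized spectrogram ("SP_Norm")

# CVD: magnitude of the FFT of the spectrogram along slow time.
CVD = np.abs(np.fft.rfft(SP_Norm, axis=1))
cad_axis = np.fft.rfftfreq(n_time, d=1.0 / fs_slow)   # cadence axis, Hz

# Sum over Doppler bins, drop DC, keep the three strongest cadences.
profile = CVD.sum(axis=0)
profile[0] = 0.0
top3 = np.argsort(profile)[-3:][::-1]
cad_freqs = cad_axis[top3]      # analogous to "cad_freqs"
c1 = cad_freqs[0]               # strongest cadence frequency ("c1")
print(c1)                       # ≈ 2.0 Hz for this synthetic example
```

For the real data, the corresponding fields can be read directly from the structs stored in “dataset_DL_doppler_real_activities.mat”.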

Funding

The authors would like to thank CNH Industrial Italia S.p.A. and CNH Industrial Belgium NV for funding this research work.
