scenery_builder

The scenery_builder is a tool that generates execution artefacts from the composable representation of the FloorPlan DSL models.

scenery builder pipeline

Installation

Install all the requirements:

sudo apt-get install blender python3-pip python3-venv -y

First, create a virtual environment and activate it:

python3 -m venv .venv
source .venv/bin/activate

For Blender to recognize the virtual environment, add it to your PYTHONPATH:

export PYTHONPATH=<Path to .venv directory>/lib/python3.11/site-packages   
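For example, if the virtual environment was created in the repository root as above and Blender bundles Python 3.11 (adjust the version to match your Blender installation), this could look like:

export PYTHONPATH=$PWD/.venv/lib/python3.11/site-packages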

From the root directory of the repository, install the Python package and its dependencies by running:

pip install -e .

Usage

This package provides floorplan as a command-line interface. You can use the generate command as shown below:

floorplan generate -i <path to input folder>

The input folder must contain the JSON-LD models of the composable FloorPlan representation (see the example below).

For more information on the parameters that can be used to customize the generation, run:

floorplan generate --help

The command above currently generates the following artefacts: a 3D mesh of the environment, Gazebo model and world files, maps (PGM and YAML), a polyline representation, ROS 2 launch files, and task specifications (see the example below).

Docker

To use the scenery_builder via Docker, simply mount the input and output paths as volumes and run the container. This will run the floorplan generate command with the default values defined in the entrypoint.

docker run -v <local input path>:/usr/src/app/models -v <local output path>:/usr/src/app/output scenery_builder:latest

If you want to change the path of the volumes inside the Docker container (i.e., /usr/src/app/models or /usr/src/app/output) or want to customize the arguments, use the following:

docker run -v <local input path>:/usr/src/app/models -v <local output path>:/usr/src/app/output scenery_builder:latest -i /usr/src/app/models -o /usr/src/app/output <optional arguments>
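For instance, with the hospital example from the section below mounted from the working directory (paths are only illustrative):

docker run -v $PWD/hospital/json-ld:/usr/src/app/models -v $PWD/hospital/gen:/usr/src/app/output scenery_builder:latest -i /usr/src/app/models -o /usr/src/app/output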

Example

3D asset generated from the environment description

An example model for a building is available here. After transforming the floorplan model into its composable representation, generate the artefacts by passing the folder containing the JSON-LD models as input:

floorplan generate -i hospital/json-ld -o hospital/gen

That should generate the following files:

.
├── 3d-mesh
│   └── hospital.stl
├── gazebo
│   ├── models
│   │   └── hospital
│   │       ├── model.config
│   │       └── model.sdf
│   └── worlds
│       └── hospital.sdf
├── maps
│   ├── hospital.pgm
│   └── hospital.yaml
├── polyline
│   └── hospital.poly
├── ros
│   └── launch
│       └── hospital.ros2.launch
└── tasks
    ├── hallway_task.yaml
    ├── reception_task.yaml
    ├── room_A_task.yaml
    └── room_B_task.yaml

Task generator

The task generator uses the FloorPlan corners to generate a task specification that visits all corners in a space. The --dist-to-corner option is a float value specifying the distance between a corner of a space and its center.
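For example, assuming the option is passed to floorplan generate and using an illustrative corner distance of 0.3, the hospital tasks could be generated with:

floorplan generate -i hospital/json-ld -o hospital/gen --dist-to-corner 0.3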

Object placing

This tool places objects (e.g. doors) in indoor environments. With the composable modelling approach, a scenery composes the static FloorPlan models with such objects.

Models that can be composed into a scenery

Gazebo world generation

The tool generates world files in the SDF format for Gazebo. The initial state plugin sets up the scene according to the initial state specified for each object included in the world file.
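As a minimal sketch of how such a world could be loaded, assuming Gazebo Classic and the hospital example above (newer Gazebo versions use gz sim and GZ_SIM_RESOURCE_PATH instead), the generated models directory has to be on the model path:

export GAZEBO_MODEL_PATH=$PWD/hospital/gen/gazebo/models:$GAZEBO_MODEL_PATH
gazebo hospital/gen/gazebo/worlds/hospital.sdf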

Tutorials

Tutorials on how to model objects with movement constraints, and how to place them in floor plan models, are available here.

Publications

  1. A. Ortega Sainz, S. Parra, S. Schneider, and N. Hochgeschwender, ‘Composable and executable scenarios for simulation-based testing of mobile robots’, Frontiers in Robotics and AI, vol. 11, 2024, doi: 10.3389/frobt.2024.1363281.
    ```bib
    @ARTICLE{ortega2024frontiers,
      AUTHOR={Ortega, Argentina and Parra, Samuel and Schneider, Sven and Hochgeschwender, Nico},
      TITLE={Composable and executable scenarios for simulation-based testing of mobile robots},
      JOURNAL={Frontiers in Robotics and AI},
      VOLUME={11},
      YEAR={2024},
      URL={https://www.frontiersin.org/journals/robotics-and-ai/articles/10.3389/frobt.2024.1363281},
      DOI={10.3389/frobt.2024.1363281},
      ISSN={2296-9144},
    }
    ```

Acknowledgments

This work has been partly supported by the European Union’s Horizon 2020 projects SESAME (Grant No. 101017258) and SOPRANO (Grant No. 101120990).
