Show simple item record

dc.contributor.author    Larrauri, J. Murgoitio
dc.contributor.author    Muñoz, E. D. Martí
dc.contributor.author    Recalde, M. E. Vaca
dc.contributor.author    Hillbrand, B.
dc.contributor.author    Tengg, A.
dc.contributor.author    Pilz, Ch.
dc.contributor.author    Druml, N.
dc.date.accessioned    2019-12-16T15:52:14Z
dc.date.available    2019-12-16T15:52:14Z
dc.date.issued    2019
dc.identifier.citation    Larrauri, J. Murgoitio, E. D. Martí Muñoz, M. E. Vaca Recalde, B. Hillbrand, A. Tengg, Ch. Pilz, and N. Druml. “Sensor Testing for Smart Mobility Scenarios: From Parking Assistance to Automated Parking.” Sensor Systems Simulations (June 19, 2019): 331–365. doi:10.1007/978-3-030-16577-2_12.    en
dc.identifier.isbn    978-3-030-16576-5    en
dc.identifier.uri    http://hdl.handle.net/11556/829
dc.description.abstract    Vehicle automation is one of the major challenges of today's transport system; its goals are maximum energy efficiency, minimal environmental impact, and the highest possible safety. In this context, the IoSense project deploys new capabilities (sensors, components, and systems) through several demonstrators, one of which is "SmaBility" (Smart Mobility scenarios). Intelligent perception and decision making for safer, autonomous driving are the main objectives of the SmaBility demonstrator, which focuses on automated parking. This chapter first lists the design, modelling, and simulation capabilities of each partner (TECNALIA, IFAT, and VIF) involved in "From Parking Assistance to Automated Parking" within SmaBility. In a second stage, several simulations using a Time-of-Flight (ToF) camera as the main perception technology are explained at two levels: sensor (ToF) and system (automated parking). In the parking-assistance scenario (system level), a ToF camera similar to the one analysed at sensor level is considered as a substitute for ultrasonic range sensors. The expected advantages of such a camera include faster response, better resolution, and object-recognition capabilities. By combining depth information with a vehicle geometry model and ego-information (position, speed, steering angle), the distance to the collision point and the time to collision (TTC) can be estimated with high accuracy. Finally, a summary and conclusions are reported.    en
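The abstract describes estimating the distance to the collision point and the time to collision (TTC) from ToF depth data combined with vehicle geometry and ego-information. A minimal sketch of that idea, assuming a straight-line approach and hypothetical names (the chapter's actual model and parameters are not given here):

```python
# Illustrative sketch only, not the chapter's implementation: TTC from a
# single depth reading plus ego-vehicle information. The straight-line
# geometry, function names, and the front_overhang_m default are assumptions.

import math

def time_to_collision(depth_m: float, ego_speed_mps: float,
                      front_overhang_m: float = 0.9) -> float:
    """Return TTC in seconds; math.inf if the vehicle is not closing in."""
    # Subtract the body length ahead of the sensor to get the distance
    # between the vehicle's front and the obstacle (the collision point).
    distance_to_collision = max(depth_m - front_overhang_m, 0.0)
    if ego_speed_mps <= 0.0:
        return math.inf
    return distance_to_collision / ego_speed_mps

# Obstacle 3.4 m from the sensor, creeping at 0.5 m/s:
# (3.4 - 0.9) / 0.5 = 5.0 s
print(time_to_collision(3.4, 0.5))  # 5.0
```

A real parking system would also use the steering angle to predict a curved path and check the swept vehicle outline against the full depth image, rather than a single range value.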
dc.description.sponsorship    H2020 | ECSEL-IA, 692480, IoSense    en
dc.language.iso    eng    en
dc.publisher    Springer, Cham    en
dc.title    Sensor Testing for Smart Mobility Scenarios: From Parking Assistance to Automated Parking    en
dc.type    bookPart    en
dc.identifier.doi    10.1007/978-3-030-16577-2_12    en
dc.relation.projectID    info:eu-repo/grantAgreement/EC/H2020/692480/EU/Flexible FE/BE Sensor Pilot Line for the Internet of Everything/IoSense    en
dc.rights.accessRights    embargoedAccess    en
dc.subject.keywords    Time-of-Flight 3D sensor    en
dc.subject.keywords    Road automation    en
dc.subject.keywords    Smart mobility    en
dc.subject.keywords    Automated Parking    en
dc.subject.keywords    Simulation    en
dc.subject.keywords    Sensors    en
dc.journal.title    Sensor Systems Simulations    en
dc.page.final    365    en
dc.page.initial    331    en
dc.identifier.esbn    978-3-030-16577-2    en


Files in this item


There are no files associated with this item.
