omicar02 - OMiLab Robotic Car PoC 2

Keywords:

Use Case

Sensor fusion integrates information from different sensors. The fusion process produces information that is not inherent in the sum of the information determined by the individual sensors. As devices increasingly rely on rich sensory input, sensor fusion must be supported by suitable computational paradigms. For this purpose, the goal of this experiment is to validate the formal knowledge representation schemes

  • Semantic Networks and 
  • Bayesian Networks.

Semantic Networks can provide symbolic relations between sensor information. Bayesian Networks can provide posterior probabilities for hypotheses interpreting sensory input.
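The Bayesian interpretation of sensory input can be illustrated with a minimal sketch: a binary hypothesis is updated via Bayes' rule when a sensor fires. The priors, likelihoods, and the "obstacle ahead" hypothesis below are purely illustrative assumptions, not values from the vehicle.

```python
# Minimal sketch of a Bayesian update for a single sensor reading.
# All numbers are hypothetical and chosen only for illustration.

def posterior(prior, likelihood, likelihood_not):
    """P(H | E) via Bayes' rule for a binary hypothesis H.

    prior          -- P(H), belief before the reading
    likelihood     -- P(E | H), sensor fires given H is true
    likelihood_not -- P(E | not H), sensor fires given H is false
    """
    evidence = likelihood * prior + likelihood_not * (1 - prior)
    return likelihood * prior / evidence

# Hypothesis: "an obstacle is ahead"; evidence: a distance sensor fires.
p = posterior(prior=0.2, likelihood=0.9, likelihood_not=0.1)
print(round(p, 3))  # → 0.692
```

Fusing a second sensor amounts to feeding the posterior back in as the new prior, which is how a chain of readings sharpens the vehicle's belief.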

Experiment

This validation environment uses an autonomous vehicle controlled by a Raspberry Pi 2. The motors run on 7.2 V power; sensors and actuators are controlled with 3.3 V and 5 V logic.

Actuators

  • 4 Independently Controlled Wheels
  • 16 Servos max
  • Laser
  • Speaker
  • Buzzer
  • LCD Display
  • RGB Matrix

Sensors

  • Microphone
  • Camera
  • 4 Analog Line Sensors
  • 4 Digital Line Sensors
  • 2 Color Sensors
  • 2 Distance Sensors
  • Noise Sensor
  • Temperature Sensor
  • Accelerometer
  • Gyroscope
  • Compass
  • Lux Sensor
  • ...

Integration into the OMiLab Portal is achieved by consolidating the REST interfaces in a microservice. REST endpoints are automatically generated for each device of the autonomous vehicle.

The OMiCarPoc2 can be called via a REST service. The base address of the web services is:

http://austria.omilab.org/omirob/omicarpoc2/rest
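A device endpoint can be called by appending its generated path to the base address. The sketch below uses only the Python standard library; the `status` device path is an assumption for illustration, since the actual endpoint names are generated per device and are not listed here.

```python
# Hedged sketch: calling the OMiCarPoC2 REST microservice.
# The concrete device path ("status") is hypothetical.
import json
import urllib.request

BASE = "http://austria.omilab.org/omirob/omicarpoc2/rest"

def device_url(device):
    """Build the full URL for one generated device endpoint."""
    return f"{BASE}/{device}"

def call_device(device, timeout=5):
    """GET a device endpoint and decode the JSON reply
    (requires network access to the OMiLab portal)."""
    with urllib.request.urlopen(device_url(device), timeout=timeout) as resp:
        return json.load(resp)

print(device_url("status"))
# → http://austria.omilab.org/omirob/omicarpoc2/rest/status
```

The same pattern applies to any actuator or sensor endpoint of the vehicle; only the device path changes.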

 

Results