Lecture 3 - Sensor Based Interaction Flashcards
(35 cards)
Some examples of sensors
- accelerometer
- gyroscope
- magnetometer
- temperature
- humidity
- moisture
- ambient light
- proximity
- barometric pressure
- GNSS location (GPS is part of this)
- heart-rate
- fingerprint
- iris scanner
- radar
- LIDAR
- depth
What is an accelerometer?
Measures a moving object's acceleration and can detect the frequency and intensity of human movement
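A minimal sketch of how movement intensity could be estimated from raw accelerometer readings: take the magnitude of the acceleration vector, subtract gravity, and threshold the result. The sample values and the 1.0 m/s^2 threshold are hypothetical, not from the lecture.

```python
import math

# Hypothetical accelerometer samples in m/s^2 (x, y, z). A device at rest
# reads a magnitude of roughly 9.81 because the sensor also senses gravity.
samples = [(0.1, 0.2, 9.8), (3.0, 1.5, 11.2), (0.0, 0.1, 9.9)]

def movement_intensity(sample, gravity=9.81):
    """Crude movement estimate: acceleration magnitude minus gravity."""
    x, y, z = sample
    return abs(math.sqrt(x * x + y * y + z * z) - gravity)

intensities = [movement_intensity(s) for s in samples]
moving = [i > 1.0 for i in intensities]  # simple threshold in m/s^2
```

Real activity trackers work on windows of samples (e.g. variance or frequency content over a second of data) rather than single readings, but the gravity-removal step is the same idea.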
What is a gyroscope?
A gyroscope is a device used for measuring or maintaining orientation and angular velocity
What is a magnetometer?
used to measure the strength and direction of the magnetic field in the vicinity of the instrument.
What is the IMU
Inertial Measurement Unit
-> these are a group of sensors that sense inertial motion
-> includes accelerometer (linear acceleration), gyroscope (rotation), magnetometer (direction)
What sensors can be used to sense surroundings?
- ambient light sensor (how light or dark it is)
- proximity sensor (how close the nearest thing is)
-> we typically use this sensor to tell when the user is trying to interact or about to interact
-> typically an optical sensor of some sort
Sensors for getting surroundings are usually on the
front of the device
What sensors are used to sense position?
- barometric pressure (altitude above sea level)
- GNSS location (satellite systems that can be used to pinpoint location)
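Barometric pressure maps to altitude via the international barometric formula. A small sketch (the formula and constants are standard; the 900 hPa reading is just an illustrative input, not from the lecture):

```python
def pressure_to_altitude(pressure_hpa, sea_level_hpa=1013.25):
    """Approximate altitude in metres from the international barometric
    formula: h = 44330 * (1 - (p / p0)^(1/5.255))."""
    return 44330.0 * (1.0 - (pressure_hpa / sea_level_hpa) ** (1.0 / 5.255))

# At standard sea-level pressure the altitude is zero; at 900 hPa it is
# roughly a kilometre.
altitude = pressure_to_altitude(900.0)
```

Phones typically combine this with GNSS, since the local sea-level pressure `p0` drifts with the weather.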
What kind of sensors can be used to sense 3D?
- radar (using waves to scan for objects nearby)
- LIDAR (Light Detection and Ranging: a remote sensing method that measures distance with pulsed laser light)
- depth
Examples of 3D projects?
- Google tango (map spaces onto phone via camera)
- spatial for iphone (modelling objects via camera)
What are the fundamental components of our devices that we forget are sensors?
- camera
- microphone
- touchscreens
What can the camera be used for?
tracking objects / hands , object recognition
What can the microphone be used for?
- ambient audio
- speech recognition
What can the touchscreen be used for?
(main way to interact with devices in the modern day)
- touch gestures
- pre-touch sensing
- pressure sensing
Do touchscreens have potential to do more than what we are using them for now?
YES
- we can get the exact x,y coordinate of touch
- sensors can detect the full area of touch
- sensors can detect when fingers are above and not actually touching the screen
These abilities mean we can enrich the interaction currently available
What is sensor fusion?
Combining data from multiple sensors to produce a single, better estimate.
Why fuse sensors?
Data becomes less uncertain for what we are trying to infer (potentially, however some things only require one sensor in isolation)
Example of sensor fusion?
- activity tracking from accelerometer and gyroscope (movement intensity and rotation)
- gaze detection from front camera and depth sensor
- indoor mapping from gyroscope and depth camera (depth at different orientations)
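A classic sensor-fusion building block is the complementary filter: the gyroscope is trusted for fast changes (it integrates cleanly but drifts), while the accelerometer's gravity direction provides a slow, drift-free tilt reference. A sketch with hypothetical readings (the 0.98 weighting and the 30-degree tilt scenario are illustrative assumptions):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One filter step: integrate the gyroscope rate for short-term
    accuracy, pull toward the accelerometer's tilt estimate long-term."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Hypothetical scenario: device held still, tilted 30 degrees.
angle = 0.0
for _ in range(200):
    gyro_rate = 0.0      # deg/s, no rotation measured
    accel_angle = 30.0   # deg, tilt inferred from the gravity vector
    angle = complementary_filter(angle, gyro_rate, accel_angle, dt=0.01)
# the fused estimate converges toward the true 30-degree tilt
```

This shows why fusion reduces uncertainty: neither sensor alone gives a stable orientation, but their weighted combination does.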
Can we use sensors to build new interactions?
Yes, this is called sensor-based interaction.
Examples of some recent sensor-based interactions?
- personal assistants e.g. alexa
- mid air gesture input
- 3D depth sensor input e.g. Google Tango
- grasp pressure input e.g. Google Active Edge
- touchscreen pressure input e.g. Apple 3D touch
What can sensors be used for in computing?
- context aware systems
- sensor-based interaction techniques
Difference between inferring context and providing interaction from sensors?
Context helps us infer the user's situation and adapt the UI and functionality, possibly adding special interactions.
Sensor-based interaction is where we use a sensor to interact with a device in a certain way, and this works regardless of context (unless combined with context awareness).
Another name for sensor fusion?
Multimodal sensing
Sensor fusion..
increases the accuracy of action estimation and enables more expressive interaction techniques