Looking through the Eye of the Mouse: A Simple Method for Measuring End-to-end Latency using an Optical Mouse
http://dl.acm.org/citation.cfm?id=2807454
We present a simple method for measuring end-to-end latency in graphical user interfaces. The method works with most optical mice and allows accurate, real-time latency measures up to 5 times per second. In addition, the technique allows easy insertion of probes at different places in the system - i.e. mouse event listeners - to investigate the sources of latency. After presenting the measurement method and our methodology, we detail the measures we performed on different systems, toolkits and applications. Results show that latency is affected by the operating system and system load. Substantial differences are found between C++/GLUT and C++/Qt or Java/Swing implementations, as well as between web browsers.
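(A quick illustration of the detection idea, not the authors' code: the mouse sensor returns a tiny grayscale frame, and a screen-door signature can be scored by counting adjacent bright pixels, per the "adjacent white pixels" note below. The frame size, threshold, and scoring rule are my assumptions.)

  import numpy as np

  def screen_door_score(frame, threshold=128):
      # Binarize the mouse-sensor frame into "white" (bright) pixels.
      white = frame >= threshold
      # Count horizontally and vertically adjacent white pairs; per the
      # notes, more adjacent white pixels means a stronger screen-door
      # signature for the sensor to latch onto.
      horizontal = np.logical_and(white[:, :-1], white[:, 1:]).sum()
      vertical = np.logical_and(white[:-1, :], white[1:, :]).sum()
      return int(horizontal + vertical)

  # Toy 18x18 frames (mouse sensors are roughly this size):
  dots = np.zeros((18, 18)); dots[::2, ::2] = 255      # isolated bright dots
  patch = np.zeros((18, 18)); patch[4:14, 4:14] = 255  # solid bright block
  print(screen_door_score(dots), screen_door_score(patch))  # 0 vs. 180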
-lightweight method to measure end-to-end latency in real time
-performs multiple measurements per second
-mouse has an LED and a camera
-connect it to an Arduino to capture what the mouse sees
-put it on a white screen > the physical gaps between pixels show up (screen-door effect)
-feature detection
-the more adjacent white pixels, the stronger the screen-door effect
-finding the right texture to display
-at least 1 pixel of pointer displacement per mouse count
-evaluating the texture
-measuring latency at different positions on a display
-insert probes in the pipeline (see the sketch after these notes)
-comparison of toolkits (C++/GLUT, Java/Swing, C++/Qt)
-influence of CPU load
-Conclusion:
-repeated measures of latency in real time
-influence of API
-~70 ms latency on average on a high-end computer doing nothing
-main limitation: does not work with laser mice
-website with interactive demo > ns.inria.fr/mjolnir/lagmeter
-no more excuse for not measuring/reporting latency
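(To make the "repeated measures" and probe ideas concrete, here is a minimal host-side sketch, assuming we already have two sorted timestamp lists: one logged when the on-screen stimulus changes, one when the corresponding mouse count arrives from the Arduino. Names and numbers are illustrative only, not the paper's code.)

  import statistics

  def match_latencies(stimulus_ts, response_ts):
      # Pair each mouse-count arrival with the most recent stimulus
      # change that precedes it; return latencies in milliseconds.
      latencies, i = [], 0
      for r in response_ts:
          while i + 1 < len(stimulus_ts) and stimulus_ts[i + 1] <= r:
              i += 1
          if stimulus_ts[i] <= r:
              latencies.append((r - stimulus_ts[i]) * 1000.0)
      return latencies

  stim = [0.000, 0.200, 0.400]   # seconds: texture updates on screen
  resp = [0.072, 0.268, 0.471]   # seconds: mouse counts read off the Arduino
  print(round(statistics.mean(match_latencies(stim, resp)), 1), "ms")  # 70.3 ms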
Joint 5D Pen Input for Light Field Displays
http://dl.acm.org/citation.cfm?id=2807477
Light field displays allow viewers to see view-dependent 3D content as if looking through a window; however, existing work on light field display interaction is limited. Yet, they have the potential to parallel 2D pen and touch screen systems, which present a joint input and display surface for natural interaction. We propose a 4D display and interaction space using a dual-purpose lenslet array, which combines light field display and light field pen sensing, and allows us to estimate the 3D position and 2D orientation of the pen. This method is simple, fast (150Hz), with position accuracy of 2-3mm and precision of 0.2-0.6mm from 0-350mm away from the lenslet array, and orientation accuracy of 2 degrees and precision of 0.2-0.3 degrees within a 45 degree field of view. Further, we 3D print the lenslet array with embedded baffles to reduce out-of-bounds cross-talk, and use an optical relay to allow interaction behind the focal plane. We demonstrate our joint display/sensing system with interactive light field painting.
-light field 3D > we have 2 eyes (binocular stereo, variable focus) and a neck (motion parallax)
-light fields via display
-auto-stereoscopic
-auto-multiscopic
-interaction > how to interact in light field space?
-reverse the ray direction to get interactive control (see the ray sketch after these notes)
-automatic calibration of camera and 4 projectors
-capture > find contours > sense position
-scene to render > multiview rendering
-range: cheating depth of field (display ±50mm), sensing range 0-350mm
-tricks:
-3D print Lenslet array
-negative Z value to draw behind
-optical relay + baffles
-Conclusions:
-Joint optical path for light field input and output
-IR pen sensing @150Hz
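(The "reverse the ray direction" note suggests pen sensing reduces to intersecting the rays observed through several lenslets. A standard least-squares closest-point-to-rays solve, sketched below with made-up geometry, is one way such a position estimate could work; it is not the paper's code.)

  import numpy as np

  def intersect_rays(origins, directions):
      # Least-squares 3D point closest to a bundle of rays p = o_i + t*d_i.
      # Solves sum_i P_i p = sum_i P_i o_i, where P_i = I - d_i d_i^T
      # projects onto the plane normal to d_i (rays must not all be parallel).
      A, b = np.zeros((3, 3)), np.zeros(3)
      for o, d in zip(origins, directions):
          d = d / np.linalg.norm(d)
          P = np.eye(3) - np.outer(d, d)
          A += P
          b += P @ o
      return np.linalg.solve(A, b)

  # Two lenslets at z=0, 40mm apart, both seeing the pen tip at z=100mm:
  origins = [np.array([-20.0, 0.0, 0.0]), np.array([20.0, 0.0, 0.0])]
  directions = [np.array([20.0, 0.0, 100.0]), np.array([-20.0, 0.0, 100.0])]
  print(intersect_rays(origins, directions))  # ~[0. 0. 100.]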
SensorTape: Modular and Programmable 3D-Aware Dense Sensor Network on a Tape
http://dl.acm.org/citation.cfm?id=2807507
SensorTape is a modular and dense sensor network in the form factor of a tape. SensorTape is composed of interconnected and programmable sensor nodes on a flexible electronics substrate. Each node can sense its orientation with an inertial measurement unit, allowing deformation self-sensing of the whole tape. Also, nodes sense proximity using time-of-flight infrared. We developed a network architecture to automatically determine the location of each sensor node, as SensorTape is cut and rejoined. Also, we made an intuitive graphical interface to program the tape. Our user study suggested that SensorTape enables users with different skill sets to intuitively create and program large sensor network arrays. We developed diverse applications ranging from wearables to home sensing, to show the low deployment effort required of the user. We showed how SensorTape could be produced at scale using current technologies and we made a 2.3-meter-long prototype.
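(The abstract's automatic node discovery after cutting and rejoining, together with the "peer-to-peer address assignment" and "global I2C bus" notes below, suggests a daisy-chained enumeration scheme. The sketch below is my guess at the general pattern, with a made-up default address and enable line; the actual protocol may differ.)

  class Node:
      DEFAULT_ADDR = 0x42          # hypothetical power-on I2C address

      def __init__(self):
          self.addr = Node.DEFAULT_ADDR
          self.enabled = False     # daisy-chained enable input

  def assign_addresses(tape, first_addr=0x10):
      # Only the node whose enable input is high answers at the default
      # address; the controller gives it a unique address over the shared
      # bus, and that node then raises the enable line of its neighbor.
      tape[0].enabled = True
      for i, node in enumerate(tape):
          assert node.enabled and node.addr == Node.DEFAULT_ADDR
          node.addr = first_addr + i
          if i + 1 < len(tape):
              tape[i + 1].enabled = True

  tape = [Node() for _ in range(5)]
  assign_addresses(tape)
  print([hex(n.addr) for n in tape])  # ['0x10', '0x11', '0x12', '0x13', '0x14']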
-prototyping sensor arrays becomes complex and time-consuming > make it accessible to anyone and easy to prototype
-previous work > cuttable sensors, sensate materials, shape-sensing, modular electronics
-sensing capabilities: IMU sensors, proximity sensors
-prototype 1: roll-to-roll printing
-prototype 2: polyimide-based
-each node: microcontroller, IMU, proximity sensor
-cuttable design
-peer-to-peer address assignment
-Global I2C bus
-Bending and twisting > quaternions > roll, yaw, pitch > (X,Y,Z) for each node (see the sketch after these notes)
-Angle accuracy
-Maximum length
-3D ruler, interactive table, wearable motion tracking
-Democratize hardware and software
github.com/ResEnv/SensorTape
-Scalable hardware 100-200
-manufacture in Shenzhen, China
-Self-sensing fabrics
-Conclusion: form-factor, dynamic self-sensing (shape, proximity), scalable
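(A minimal sketch of the quaternions-to-(X,Y,Z) step noted above: if each node reports an orientation quaternion and nodes sit a fixed distance apart, the tape's shape follows from chaining rotated segment vectors. The segment length and axis conventions are my assumptions.)

  import numpy as np

  def quat_rotate(q, v):
      # Rotate vector v by unit quaternion q = (w, x, y, z).
      w, x, y, z = q
      u = np.array([x, y, z])
      return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

  def tape_shape(quaternions, segment_len=25.0):
      # Walk fixed-length segments, each pointing along the tape's local
      # +X axis rotated by the corresponding node's IMU orientation.
      pos, points = np.zeros(3), [np.zeros(3)]
      for q in quaternions:
          pos = pos + quat_rotate(q, np.array([segment_len, 0.0, 0.0]))
          points.append(pos.copy())
      return np.array(points)

  # Toy tape: two flat segments, then a 90-degree bend about Z.
  flat = (1.0, 0.0, 0.0, 0.0)
  bent = (np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4))
  print(tape_shape([flat, flat, bent, bent]))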
Blog post on the manufacturer:
http://shenzhen.media.mit.edu/january-2015-visit/flex-pcb/
Longer version video:
FlexiBend: Enabling Interactivity of Multi-Part, Deformable Fabrications Using Single Shape-Sensing Strip
http://dl.acm.org/citation.cfm?id=2807456
-thin and flexible shape sensors
-Used strain gauges for the hardware
-Hardware design
-Signal processing (see the sketch after these notes)
-Used NinjaFlex to make the shape
-Challenge: isolating the widgets > locking mechanism
-Experiments: knob, slider, etc.
-FlexiBend: longer and denser, thinner and more durable
-Conclusion: a novel shape-aware strip that brings interactivity to multi-part, deformable objects with ease; a tool for iterative design
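(These notes don't capture the actual signal chain, so as a generic illustration: a strip of strain gauges can be turned into a 2D shape estimate by mapping each reading to a local bend angle and chaining fixed-length segments. The calibration gain, segment length, and readings below are invented.)

  import numpy as np

  def strip_shape(readings, gain=0.002, seg_len=10.0):
      # Each strain-gauge reading maps (via a hypothetical linear
      # calibration gain) to a bend angle; angles accumulate as we
      # walk fixed-length segments along the strip.
      x, y, heading = 0.0, 0.0, 0.0
      points = [(x, y)]
      for r in readings:
          heading += gain * r
          x += seg_len * np.cos(heading)
          y += seg_len * np.sin(heading)
          points.append((x, y))
      return np.array(points)

  # Straight for three segments, then a steady upward bend:
  print(strip_shape([0, 0, 0, 200, 200, 200]))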
Disclaimer: The opinions expressed here are my own, and do not reflect those of my employer. -Fumi Yamazaki