your browser can see and hear and
DESCRIPTION
New sensor-based Web Standards developments have punched a hole in the web that is letting the real world leak into the browser. The getUserMedia API now lets us access cameras and microphones, and JSARToolkit and JavaScript-based Natural Feature Tracking, like the examples from the ICG at Graz University of Technology, have shown that browsers can now be taught to perceive the world around them. Combine this with WebGL and you have a real working model for Web Standards based Augmented Reality. On top of this we also have the OGC's Sensor Web Enablement, new developments like the Sensor API, and the rapid spread of networked sensors and wireless Arduino-ised devices. Massively distributed, dynamic, immersive visualisation is now the new structural form for the modern web.
TRANSCRIPT
your browser can see and hear and ...
by @nambor from
your brain is a sponge!
it soaks up data through your senses
but now we have digital sensors too
they let our devices see, hear & feel
global positioning system
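GPS data reaches the page through the W3C Geolocation API. A minimal sketch of asking for a single fix (the `formatFix` helper is just for illustration):

```javascript
// Minimal Geolocation API sketch: ask the browser for one GPS fix.
// The browser call only runs in a page; it is guarded so the sketch is inert elsewhere.
function formatFix(position) {
  // Reduce a Position object to a compact, loggable summary.
  const { latitude, longitude, accuracy } = position.coords;
  return `lat ${latitude.toFixed(5)}, lon ${longitude.toFixed(5)} (±${accuracy}m)`;
}

if (typeof navigator !== 'undefined' && navigator.geolocation) {
  navigator.geolocation.getCurrentPosition(
    pos => console.log(formatFix(pos)),
    err => console.error('geolocation failed:', err.message),
    { enableHighAccuracy: true, timeout: 10000 }
  );
}
```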
digital compasses
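Compass data arrives via the `deviceorientation` event. Note that the reference frame of `alpha` varies between devices and browsers, so treat this as a sketch rather than a calibrated heading:

```javascript
// Digital compass sketch via the deviceorientation event.
// alpha is rotation around the z-axis in degrees, reported counter-clockwise.
function toCompassHeading(alpha) {
  // Convert counter-clockwise alpha into a clockwise compass-style heading.
  return (360 - alpha) % 360;
}

if (typeof window !== 'undefined') {
  window.addEventListener('deviceorientation', event => {
    if (event.alpha !== null) {
      console.log(`heading: ${toCompassHeading(event.alpha).toFixed(1)}°`);
    }
  });
}
```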
near field communication
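NFC has since gained a browser API of its own: Web NFC's `NDEFReader`, currently limited to Chrome on Android. A sketch of scanning for text records:

```javascript
// Web NFC sketch: scan for NDEF tags and decode their text records.
// NDEFReader is only available in Chrome on Android, so the call is guarded.
function decodeTextRecord(record) {
  // Text records carry UTF-8/UTF-16 payloads; TextDecoder handles both.
  return new TextDecoder(record.encoding || 'utf-8').decode(record.data);
}

async function startNfcScan() {
  const reader = new NDEFReader();
  await reader.scan();                      // prompts the user for permission
  reader.onreading = ({ message }) => {
    for (const record of message.records) {
      if (record.recordType === 'text') console.log(decodeTextRecord(record));
    }
  };
}

if (typeof NDEFReader !== 'undefined') {
  startNfcScan().catch(err => console.error('nfc scan failed:', err.message));
}
```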
new combinations are possible
the internet of things is here!
demonstration
html5 <video> & <canvas>
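The pattern behind every vision demo here: draw `<video>` frames onto a `<canvas>` so JavaScript can read the pixels. The `frameToGray` helper is an illustrative first processing step, not part of any particular library:

```javascript
// Grab frames from a <video> element into a <canvas> so the pixels
// become readable — the first step of in-browser computer vision.
function frameToGray(imageData) {
  // Convert RGBA pixel data to a single-channel grayscale array.
  const { data, width, height } = imageData;
  const gray = new Uint8ClampedArray(width * height);
  for (let i = 0; i < gray.length; i++) {
    const o = i * 4;
    gray[i] = (data[o] * 0.299 + data[o + 1] * 0.587 + data[o + 2] * 0.114) | 0;
  }
  return gray;
}

if (typeof document !== 'undefined') {
  const video = document.querySelector('video');
  const canvas = document.querySelector('canvas');
  const ctx = canvas.getContext('2d');
  (function tick() {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    const pixels = ctx.getImageData(0, 0, canvas.width, canvas.height);
    frameToGray(pixels);               // the frame is now ours to analyse
    requestAnimationFrame(tick);
  })();
}
```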
demonstration
JSARToolkit fiducial tracking
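JSARToolkit's own API isn't reproduced here; purely as an illustration of what a fiducial tracker does first with every frame, a sketch of thresholding a grayscale image into a binary one so the black marker borders stand out:

```javascript
// Illustrative first stage of fiducial tracking (NOT JSARToolkit's API):
// binarize a grayscale frame so high-contrast marker borders pop out.
function binarize(gray, threshold = 128) {
  const out = new Uint8Array(gray.length);
  for (let i = 0; i < gray.length; i++) {
    out[i] = gray[i] < threshold ? 1 : 0;   // 1 = dark (candidate border)
  }
  return out;
}

// A tracker then scans the binary image for connected quadrilaterals,
// decodes the pattern inside each square, and estimates a 3D pose matrix
// that can drive a WebGL scene.
```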
demonstration
Audio Visualiser in js & css
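A sketch of the idea behind the visualiser: a Web Audio `AnalyserNode` exposes the live spectrum, and JavaScript maps it onto CSS-styled bar heights. The `.bar` elements are assumed markup, made up for illustration:

```javascript
// Web Audio visualiser sketch: map the live frequency spectrum onto bars.
function toBarHeights(freqData, barCount, maxHeight) {
  // Average the spectrum into barCount buckets, scaled to maxHeight px.
  const bucket = Math.floor(freqData.length / barCount);
  const bars = [];
  for (let b = 0; b < barCount; b++) {
    let sum = 0;
    for (let i = 0; i < bucket; i++) sum += freqData[b * bucket + i];
    bars.push(Math.round((sum / bucket / 255) * maxHeight));
  }
  return bars;
}

if (typeof AudioContext !== 'undefined') {
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  // connect a source here, e.g. source.connect(analyser)
  const freqData = new Uint8Array(analyser.frequencyBinCount);
  (function draw() {
    analyser.getByteFrequencyData(freqData);
    const bars = toBarHeights(freqData, 16, 100);
    document.querySelectorAll('.bar').forEach((el, i) => {
      el.style.height = `${bars[i]}px`;    // css transitions do the rest
    });
    requestAnimationFrame(draw);
  })();
}
```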
demonstration
JavaScript is getting fast
demonstration
JavaScript is getting threaded
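Web Workers are the threading mechanism: heavy work runs on a background thread while the UI stays responsive. A sketch that inlines the worker body via a Blob URL so it fits in one file (`sumChunk` is a stand-in for real per-frame number crunching):

```javascript
// The work we want off the main thread: a pure function we can also test.
function sumChunk(numbers) {
  let sum = 0;
  for (const n of numbers) sum += n;
  return sum;
}

if (typeof Worker !== 'undefined' && typeof Blob !== 'undefined') {
  // Inline the function into a Blob-backed worker so the sketch is one file.
  const src = `${sumChunk.toString()}; onmessage = e => postMessage(sumChunk(e.data));`;
  const worker = new Worker(
    URL.createObjectURL(new Blob([src], { type: 'text/javascript' }))
  );
  worker.onmessage = e => console.log('sum from worker:', e.data);
  worker.postMessage([1, 2, 3, 4]);   // the UI thread stays free meanwhile
}
```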
demonstration
getUserMedia = local camera access
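A sketch in the modern promise-based form (`navigator.mediaDevices.getUserMedia`); the API of this era was the prefixed, callback-based `navigator.getUserMedia`:

```javascript
// getUserMedia sketch: request the local camera and pipe it into a <video>.
function buildConstraints(wantAudio) {
  // Favour the rear ("environment") camera, where AR usually points.
  return { video: { facingMode: 'environment' }, audio: !!wantAudio };
}

async function startCamera() {
  const stream = await navigator.mediaDevices.getUserMedia(buildConstraints(false));
  const video = document.querySelector('video');
  video.srcObject = stream;
  await video.play();
}

if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  startCamera().catch(err => console.error('camera denied:', err.message));
}
```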
demonstration
Natural Feature Tracking in js
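Natural Feature Tracking works on ordinary images instead of printed fiducials. No single standard API exists for it in JavaScript; purely as an illustration of the patch matching at its core, a sum-of-absolute-differences score (everything here is a simplification):

```javascript
// Illustrative core of feature matching: a sum-of-absolute-differences
// (SAD) score between two grayscale patches — lower means a better match.
function sadScore(patchA, patchB) {
  let score = 0;
  for (let i = 0; i < patchA.length; i++) {
    score += Math.abs(patchA[i] - patchB[i]);
  }
  return score;
}

// A real tracker extracts distinctive feature points from a reference
// image, matches them against each camera frame with scores like this,
// and fits a homography/pose from the surviving matches.
```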
your browser can see and hear and ...
by @nambor from