Friday, April 22, 2016

Module 4 – Disruptive Technology: The MIT SixthSense Project




In defining disruptive technology, Thornburg connects the phrase to Clayton Christensen, a Harvard Business School professor credited with coining the term 'disruptive innovation' in his 1997 book, The Innovator's Dilemma (Christensen, 1997; Laureate Education, 2014a). According to Thornburg, a disruptive technology is a newly emerging technology that unexpectedly displaces an established one (Laureate Education, 2014a). Thornburg further describes disruptive technologies as 'wildcards': a technology may be following an evolutionary path when, all of a sudden, a wildcard enters the market that meets the same criteria as the evolutionary technology but does so more efficiently, at lower cost, and with other benefits, leading it to displace the former technology (Laureate Education, 2014a). A current example of a disruptive technology is the SixthSense project.

The SixthSense prototype is a wearable gestural interface that augments its user's physical world with digital information, which the user can manipulate through natural hand gestures (Mistry, 2009). Using a web camera and a tiny projector mounted in a pendant-like wearable device, SixthSense sees what the user sees and visually augments any surfaces or objects the user interacts with (Mistry, 2009). It can display digital data on any surface, such as a wall, or on physical objects, such as the palm of the user's hand, allowing the user to interact with the displayed information through natural hand and arm gestures or by handling the object itself (Mistry, 2009). In this way, SixthSense attempts to free information from the confines of the computer or mobile device by seamlessly integrating it with physical reality, in effect making the entire world the user's computer (Mistry, 2009).



While pursuing his Ph.D. at the Massachusetts Institute of Technology (MIT) Media Lab in 2009, Pranav Mistry took up further development of this disruptive technology from innovator Steve Mann, who started the project in 1994. Mistry built the current SixthSense/Wear UR World (WUW) prototype with open-source software, as a wearable technology meant to make it simpler for people to connect computing with their daily undertakings without having to carry a laptop or sit behind a computer. The prototype costs approximately $350 to build and consists of standard components: a pocket projector, a small portable mirror, a webcam with a built-in microphone, four different colored markers, and a mobile computing device, usually a smartphone. With its connected hardware and open-source software, SixthSense endeavors to displace or render obsolete smartwatches, smartphones, tablets, laptops, computers, televisions, and many other current technologies. As Mistry puts it, "'SixthSense' will help us in not being machines sitting in front of another machine" (Dhingra, 2009).




Both the pocket projector and the webcam connect to the user's mobile computing device, usually a smartphone. SixthSense works by projecting visual information onto enabled surfaces, such as a wall or the user's palm, or through physical objects acting as interfaces. Meanwhile, the webcam recognizes and tracks the user's hand gestures and physical objects using Mistry's open-source software. The software processes the video stream captured by the camera and tracks the locations of the colored markers, or visual tracking fiducials (a fiducial is a point or line assumed as a fixed basis of comparison), at the tips of the user's fingers. The movements of the fiducials are deciphered into gestures that act as interactive commands for the projected application interface (Motwani, Motwani, & Kasatwar, 2016). The maximum number of tracked fingers is constrained only by the number of unique fiducials, which allows SixthSense's open-source software to support multi-touch and multi-user interactions.
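The marker-tracking idea described above can be sketched in a few lines of code. This is only a simplified illustration of the general technique, not Mistry's actual open-source software: the marker colors, the tiny test frame, and the "pinch" rule are all hypothetical values chosen for the example.

```python
# Simplified sketch of SixthSense-style fiducial tracking: locate the
# centroid of each colored fingertip marker in an RGB frame, then
# interpret the relative positions of markers as a gesture.

MARKER_COLORS = {              # hypothetical reference colors (R, G, B)
    "index_finger": (255, 0, 0),   # red marker
    "thumb": (0, 0, 255),          # blue marker
}

def close_to(pixel, color, tol=60):
    """True if a pixel is within `tol` of the reference color per channel."""
    return all(abs(p - c) <= tol for p, c in zip(pixel, color))

def find_marker(frame, color):
    """Centroid (x, y) of all pixels matching `color`, or None if absent."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if close_to(pixel, color):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def detect_pinch(frame, threshold=3.0):
    """'Pinch' gesture: index and thumb markers closer than `threshold` px."""
    index = find_marker(frame, MARKER_COLORS["index_finger"])
    thumb = find_marker(frame, MARKER_COLORS["thumb"])
    if index is None or thumb is None:
        return False
    dist = ((index[0] - thumb[0]) ** 2 + (index[1] - thumb[1]) ** 2) ** 0.5
    return dist < threshold

# Tiny 4x4 test frame: red marker at (1, 1), blue marker at (2, 1).
BLACK = (0, 0, 0)
frame = [[BLACK] * 4 for _ in range(4)]
frame[1][1] = (255, 0, 0)
frame[1][2] = (0, 0, 255)
```

In the real system the frame would come from the webcam's video stream and the gesture vocabulary would be much richer, but the pipeline is the same: threshold by color, find each fiducial's position, and map position changes to commands.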

One of the major social benefits of the SixthSense technology is that it enables its users to sense and share stories and other personal attributes and information about the people they are interacting with in real time. Personal thought clouds, containing traits, hobbies, and other information about the person the user is speaking with, can be displayed for the user to draw on in conversation while retaining privacy and anonymity. Whereas current social networking sites keep users separated from the true reality of their physical world and the importance of remaining human, SixthSense could make it easier to go out into the physical world and interact on a higher societal level.

The social implications of using this sort of technology in the classroom are very exciting. Students and educators can use SixthSense as a translation tool to collaborate seamlessly with other students and educators in different areas of the world. There are two additional ways the SixthSense technology can enhance the educational experience: by encouraging student engagement and by stimulating self-directed learning. As an experienced teacher, I have noticed that my students learn more when they are actively engaged. Media can be a powerful tool, sparking curiosity, promoting scientific enquiry, and enlisting critical thinking by helping students make engaging connections between their real-world experiences and the content they are learning (Quinones, 2010). In self-directed learning, the student takes the initiative and the responsibility for what occurs. By combining the ability to share information with other learners and educators with the ability to tap into the world of data available on the Internet, SixthSense lets students follow new paths of self-directed enquiry that expand their understanding of the topics they are studying (Looi et al., 2016).



An example of using SixthSense to enhance student learning would be a field trip to a museum or zoo to collect information on a particular subject for a collaborative group project. Learners could take pictures of the subject by framing it with their fingers and have the images added to their SixthSense device. Afterward, the students could use a flat surface in the museum or zoo to post their pictures and work collaboratively on their project, enhancing both student engagement and self-directed learning.


Since SixthSense is still a project and has not come to fruition as a self-contained, all-in-one device, it is difficult to predict how many years it has before another emerging or disruptive technology replaces it. In the following video, Maes, head of the MIT Media Lab's Fluid Interfaces Group, suggests the SixthSense technology could arrive in the form of a human brain implant in ten years or so (Maes & Mistry, 2009). SixthSense is a concept, not yet a viable consumer product. For this concept to reach the consumer market, numerous factors will have to come together, chief among them enhanced battery technology, more efficient webcam hardware, improvements in wireless data networks and, most importantly, untethering from the smartphone needed to connect to the Internet. The technology will most likely mature as a more fashionable all-in-one wearable pendant, or as a set of glasses or goggles. As the hardware needed to implement this concept continues to develop, devices such as Google Glass, HoloLens, or slimmed-down versions of the Oculus Rift, running Mistry's open-source software, will most likely replace the current daisy-chained version of the SixthSense technology.





How to build your personal SixthSense device


In conclusion, Mistry has invented an amazing technological concept with the SixthSense project. The combination of devices and open-source software creates a user-friendly reality in which the user's digital world is joined with their physical world (Balamurugan, 2010). This technology can be used in a plethora of positive ways to enhance daily tasks: easily learning about products while shopping, finding instructions while building or constructing, and identifying areas while traveling are all extremely positive applications of the SixthSense technology. Mistry (2009) tells us that SixthSense is essentially a wearable computer that can surf the web, make phone calls, and even connect to other computing devices. It is more portable and more interactive than any smartphone, laptop, or tablet available today, while costing about the same as a smartphone or tablet. It does, however, pose a threat to society in the form of privacy issues: it allows the user to record and photograph anything without being noticed, and to pull up available online information about anything just by glancing at it and performing an online search. This may lead to social, legal, ethical, and security concerns. SixthSense technology is significant in its intended purpose of daily task efficiency and interactive computing, but it may prove impractical due to the societal issues it may raise and the portability of the hardware involved.




Resources

To learn more about the SixthSense Project please follow this link:
http://www.pranavmistry.com/projects/sixthsense/

For more history on the SixthSense Project please use this link:
https://en.wikipedia.org/wiki/SixthSense

References

Balamurugan, S. (2010). Sixth sense technology. Proceedings of 2010 International Conference on Communication and Computational Intelligence (pp. 336-339). Chennai: Scitech Publications (India).

Christensen, C. M. (1997). The Innovator's Dilemma: The Revolutionary Book that Will Change the Way You Do Business (Collins Business Essentials).

Dhingra, S. (2009, October). Sixth sense technology will revolutionize the world. The Viewspaper. Retrieved February 28, 2012, from http://www.theviewspaper.net/sixth-sense-technology-will-revolutionize-the-world/

Laureate Education (Producer). (2014a). David Thornburg: Disruptive technologies [Video file]. Baltimore, MD: Author.

Looi, C. K., Lim, K. F., Pang, J., Koh, A. L. H., Seow, P., Sun, D., ... & Soloway, E. (2016). Bridging Formal and Informal Learning with the Use of Mobile Technology. In Future Learning in Primary Schools (pp. 79-96). Springer Singapore.

Maes, P., & Mistry, P. (2009, February). Pattie Maes and Pranav Mistry: Meet the SixthSense interaction [Video file]. Retrieved from http://www.ted.com/talks/pattie_maes_demos_the_sixth_sense

Mistry, P. (2009, November). Pranav Mistry: The thrilling potential of SixthSense technology [Video file]. Retrieved from http://www.ted.com/talks/pranav_mistr

Motwani, C., Motwani, D., & Kasatwar, A. (2016). Six Sense Technology Using Hand Gesture. International Journal of Research, 3(5), 93-99.

Quinones, D. (2010). Digital media (including video!) Resources for the stem classroom and collection. Knowledge Quest, 39(2), 28-32.

Wilson-Richter, L. (2009, March 13). Pattie Maes' sixth sense technology: What's stopping this? Lucas Wilson-Richter. Retrieved February 28, 2012, from http://www.lucasrichter.id.au/2009/03/13/pattie-maes-sixth-sense-technology-whats-stopping-this/



Thursday, April 7, 2016

Module 3 – Rhymes of History: Virtual Reality Technologies





In explaining Rhymes of History, Thornburg presents a quote attributed to Mark Twain: "History may not repeat itself, but it sure rhymes a lot" (Laureate Education, 2014h). Thornburg goes on to tell us that the perspective of Rhymes of History differs from that of Evolutionary Technology: whereas evolution may still be taking place, the effect or impact of a new development rekindles something from the distant past (Laureate Education, 2014h). McLuhan's Tetrad Model also speaks to this rekindling, or retrieval, of technologies of the distant past.

The effect that the following four technologies rekindle is that of virtual reality: the simulation of a three-dimensional appearance or setting that the user can interact with in an outwardly real way by operating the specific technology.

This blog post will illustrate the major Rhymes of History technologies that correspond with virtual reality, from current 21st-century virtual reality goggles back through two wearable technologies and one observable technology dating to the 19th century. Of course, other technologies of Rhymes of History significance could have been included in this post; I chose these four because they stood out in popular use.

Virtual Goggles

An example of a current technology that represents the Rhymes of History is virtual reality (VR) goggles and headsets. A virtual reality headset is a wearable device that you place over your eyes like a pair of goggles (Petry & Huber, 2015). This type of device blocks out all external light and shows the user an image on high-definition screens in front of the eyes. The objective of VR goggles is to immerse the user in a virtual simulation. In VR simulations, the user's point of view is the character's point of view; most VR headsets track the user's head movement, so that wherever one looks, their simulated persona looks too. If done well, the user will feel like they are interacting inside the simulation. VR goggles use two video feeds sent to one headset display, one feed per eye. Lenses are placed between the user's eyes and the headset's pixels, which is why the devices are often called goggles (Petry & Huber, 2015). In addition, the individual video feeds are adjustable to correct for the distance between the user's eyes, which differs for each person.

For our purposes of Rhymes of History perspective, the most important feature of this technology is that the goggles' lenses focus and reshape the picture for each eye, creating a stereoscopic 3D image by angling the two 2D images to mimic how each of our two eyes views the world ever so slightly differently (Nurkkala & Kalermo, 2015). For a quick reference, close one eye and then the other; watching individual objects appear to jump from side to side will give you the idea behind this technology's concept. The first prototype of the current generation of this technology, the Oculus Rift, was developed in 2012 by Palmer Luckey; Facebook acquired Oculus for two billion dollars in 2014. The education and medical fields are seeing increasing use of VR headsets to simulate practical experiences, experiments, and operations that students cannot perform in real-life situations (Cassard & Sloboda, 2016; Estes, Dailey-Hebert, & Choi, 2016; Yelshyna et al., 2016).


Oculus Rift - Step in to The Rift - Reveal Trailer
https://youtu.be/fHpxcCeK1vs



GAF 3D View-Master

VR goggles like the Oculus Rift rekindle the memory of one of the most popular American educational toys of the 1970s, the General Aniline & Film (GAF) 3D View-Master. GAF acquired the View-Master in 1966 from its original owner, the Sawyer corporation, which had introduced the technology to the world at the 1939 New York World's Fair. Like VR goggles, the View-Master employs the same premise of focusing each lens to reshape the picture for each eye, creating a stereoscopic 3D image by angling two 2D images to mimic how each of our two eyes views the world ever so slightly differently. You will see this same theme running through every iteration of devices along this technological trail of Rhymes of History. More recently, the Mattel toy company has reinvented the original View-Master, coupling its newly patented hardware with the user's Android or iOS smartphone to create educational and gaming VR simulations much like VR goggles, in a more cost-efficient way. Below are three videos looking at the 1970s GAF View-Master and the newly minted Mattel View-Master 360°, followed by a comparison video of both technologies.


1971 GAF View Master Commercial with Henry Fonda & Jodie Foster
Please use this link in your browser
https://youtu.be/O63-XGBwn-k




Mattel’s View-Master 360° Experience – How it works!
https://youtu.be/ADfThVM1TsE




 ADULTS REACT TO VIEW-MASTER (VR VS. 3D)
https://youtu.be/yyca0sHKkJ8



The Stereoscope

The 1970s GAF View-Master, in turn, rekindles the memory of a Victorian optical device known as the stereoscope. This 19th-century technology created an illusion of 3D using two side-by-side photographs of the same object taken at slightly different angles and viewed together, producing an impression of depth and solidity. As with GAF's View-Master and today's Oculus Rift, in the stereoscope we see the same common Rhymes of History theme: focusing each lens to reshape the picture for each eye, creating a stereoscopic 3D image by angling two 2D images to mimic how each of our two eyes views the world ever so slightly differently. The revised, cost-efficient Holmes stereoscope of 1861, designed by Oliver Wendell Holmes, was said to be the most popular version of this type of technology.





Panoramic Paintings

The stereoscope was preceded by panoramic paintings of the late 18th and early 19th centuries. This Rhyme of History does not involve wearable hardware, but it can nevertheless be seen as the start of the journey toward virtual reality simulation. If one focuses on the scope of virtual reality as a means of crafting the illusion of being present somewhere one is not, then the earliest effort toward virtual reality would be the 360-degree murals, or panoramic paintings (Melman, 2013). These paintings were designed to fill the viewer's complete field of vision, making viewers feel present in the midst of some historical event or scene. Here is a very interesting video of a contemporary artist's take on creating panoramic paintings.

Yadegar Asisi's panorama pictures | Euromaxx
https://youtu.be/7oynuh5HnDE



In conclusion, what will be the next iteration in the succession of these VR wearable Rhymes of History? Could the next rhyme be some sort of implanted VR chip in our bodies, or one linked to our DNA, creating our 'virtual self'? If so, we should be ready for an entirely new branch of psychology to work with those of us who start believing we actually are our virtual selves.

Resources

To find more in-depth information on the Oculus Rift VR Goggles please follow this link
https://www.oculus.com/en-us/rift/

To rekindle the history of the original 3D View-Master, please use this link:
https://en.wikipedia.org/wiki/View-Master

If you are interested in learning more about one of the least expensive, but still noteworthy and fun set of VR Goggles, please navigate over to Mattel’s View-Master 360° Experience by using this link: http://www.view-master.com/en-us

To explore the history of Panoramic Paintings, please refer to this link:
http://www.baruch.cuny.edu/library/alumni/online_exhibits/digital/2003/panorama/new_001.htm


References

Cassard, A., & Sloboda, B. W. (2016). Faculty perception of virtual 3-D learning environment to assess student learning. Emerging Tools and Applications of Virtual Reality in Education, 48.

Estes, J. S., Dailey-Hebert, A., & Choi, D. H. (2016). Integrating technological innovations to enhance the teaching-learning process. Emerging Tools and Applications of Virtual Reality in Education, 277.

Laureate Education (Producer). (2014h). David Thornburg: Rhymes of history [Video file]. Baltimore, MD: Author.

Melman, B. (2013). The power of the past: History and modernity. In The Victorian World (p. 466).

Nurkkala, V. M., & Kalermo, J. (2015, January). Development of a tool for semi-automatic creation of virtual environments. In Proceedings of the International Conference on Modeling, Simulation and Visualization Methods (MSV) (p. 35). The Steering Committee of The World Congress in Computer Science, Computer Engineering and Applied Computing (WorldComp).

Petry, B., & Huber, J. (2015, March). Towards effective interaction with omnidirectional videos using immersive virtual reality headsets. In Proceedings of the 6th Augmented Human International Conference (pp. 217-218). ACM.

Yelshyna, D., Gago, M. F., Bicho, E., Fernandes, V., Gago, N. F., Costa, L., ... & Sousa, N. (2016). Compensatory postural adjustments in Parkinson's disease assessed via a virtual reality environment. Behavioural Brain Research, 296, 384-392.