SIAT News
An 85 inch "tablet" for data visualization
VVISE is SIAT's Virtual and Augmented Reality, Visual Analytics, Interaction, Systems & Experiments Lab. Research in the VVISE group spans Human-Computer Interaction, Virtual and Augmented Reality, Visual and Immersive Analytics, large displays, 3D user interfaces, and both software and hardware systems.
VizInteract (pictured above) is an interactive data visualization tool being built by SIAT Master's student Supratim Chakraborty and Professor Wolfgang Stuerzlinger, director of the VVISE Lab. It enables rapid exploration of complex data sets through multi-touch interaction with graphical data plots: simple touch gestures create and manipulate visualizations such as scatter plots and parallel-coordinate plots, and touch-based filtering supports drilling "down" into the details.
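VizInteract's actual gesture code is not published in this article, but the general idea of touch-based filtering can be sketched in a few lines: a drag across a scatter plot sweeps out a rectangle, and only the data points inside it are kept. The function name and coordinate convention below are illustrative assumptions, not the project's API.

```python
# Illustrative sketch only, not VizInteract's implementation.
# Assumes data points and the brushed rectangle share screen-space coordinates.
def brush_filter(points, x0, y0, x1, y1):
    """Keep only the points inside the axis-aligned rectangle
    swept out by a touch drag (a "brush")."""
    xmin, xmax = sorted((x0, x1))
    ymin, ymax = sorted((y0, y1))
    return [(x, y) for (x, y) in points
            if xmin <= x <= xmax and ymin <= y <= ymax]
```

Sorting the corner coordinates means the filter works no matter which direction the finger drags, which matters on a display large enough that users approach it from any side.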
Motivated by the goal of a single tool that works everywhere from a mobile device to large interactive surfaces, the team successfully deployed VizInteract as an Android application running on an 85-inch 4K display with a touch-screen overlay, part of the impressive V4-Space (pictured above). Supratim initially created VizInteract to run on Android-based devices, including a 10" tablet. Since seeing more data at once makes it easier to make sense of complex data sets, Professor Stuerzlinger challenged him to run the system on a much larger interactive display. Creating an application that runs on both smartphones and large interactive surfaces is hard and traditionally requires re-developing the software for each platform.
After some research, Supratim decided to build on Android-x86, a project that has ported the Android mobile operating system to run on PC hardware. Because Android's support for touch interaction is far more sophisticated and mature than that of desktop operating systems, this choice lets VizInteract offer a rich palette of interaction methods, commensurate with what current mobile devices afford. The most fascinating insight was having to rethink multi-touch experiences for data visualization on display surfaces far larger than those usually considered, and the challenges such surfaces pose. Most notably, the team is still identifying how the user experience of dragging, rotating, and pinching gestures changes on an 85-inch "tablet" compared to a regular mobile tablet.
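One concrete difference between a 10-inch tablet and an 85-inch 4K display is pixel density: the same pinch measured in pixels corresponds to very different physical finger travel. A hypothetical way to keep gestures feeling consistent, sketched below under assumptions of our own (the function names and the "inches per doubling" constant are illustrative, not from the project), is to normalize gesture distances by the display's DPI.

```python
import math

def display_dpi(width_px, height_px, diagonal_in):
    # Pixels per inch, from the resolution and the physical diagonal.
    return math.hypot(width_px, height_px) / diagonal_in

def pinch_zoom_factor(start_dist_px, end_dist_px, dpi, inches_per_doubling=2.0):
    # Measure pinch travel in physical inches rather than pixels, so the
    # same finger movement produces the same zoom on any display.
    travel_in = (end_dist_px - start_dist_px) / dpi
    return 2.0 ** (travel_in / inches_per_doubling)
```

For example, an 85-inch 4K (3840x2160) display works out to roughly 52 dpi, while a 10-inch 1920x1200 tablet is about 226 dpi, so an unnormalized 300-pixel pinch would cover nearly six inches of glass on the wall display but barely over one inch on the tablet.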