a live-ish live stream

EPISODE 2:
AR-TV (ARRGH!-TV)

AR-TV is a streaming publication of design research that explores how the stories we consume today are authored not just by humans but by algorithmically tuned cameras.

As part of SNAP INC Research’s 2020 CREATIVE CHALLENGE, in collaboration with BBC Research and MICROSOFT Research, ARTCENTER’S Immersion Lab examined the future of storytelling with augmented reality through the lens of the camera, computer vision, and machine learning.

AR-TV proposes ways to identify, author, and share new stories in collaboration with autonomous machines. If a story is a series of events that are suddenly given priority, importance, and structure, who or what decides this? How can computer vision reveal events and details that once went unnoticed? How can machine vision and learning models, which detect features, infer patterns, and predict scenes, now become co-authors? How might these new machine-envisioned stories change how we understand and relate to one another?

ARTCENTER COLLEGE OF DESIGN Transdisciplinary Studio
HOST DEPARTMENT Interaction Design
FACULTY AND DESIGN RESEARCHER Jenny Rodenhouse
THESIS RESEARCHER Miranda Jin
ARTISTS & DESIGNERS Davis Brown, Dillon Chi, Brandon Comer, Wenyu Du, Qihang Fan, Anna Kang, Casey Knapp, John Ma, Susie Moon, Jeanne Park, Ian Sterling, Nicole Wang
TEACHING ASSISTANT Leo Yang
WEB & AR BUMPER DESIGN Jenny Rodenhouse

THANK YOU to our mentors RAJAN VAISH, ANDRES MONROY-HERNANDEZ, SNAP INC RESEARCH, BBC RESEARCH, and MICROSOFT RESEARCH

How can we use AR to identify individual animals and craft their own stories?

REBUILT HIERARCHY is a future mobile AR application that uses computer vision to identify individual animals from their physical attributes, creating a social network for humans and animals. The AR app aims to make people more personally attached to and respectful of animals by making them more aware of the animals’ individualities, habitats, and shared stories.
BY JEANNE PARK
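
The concept hinges on re-identification: telling one individual animal apart from another of the same species. Below is a minimal sketch of how that matching step might work, assuming a generic pretrained image backbone (torchvision’s ResNet-18) as a feature extractor and a cosine-similarity match against a gallery of known individuals. The model choice, the 0.8 threshold, and the gallery structure are illustrative assumptions, not the project’s actual pipeline.

```python
# Sketch: match a new animal sighting against known individuals by
# comparing L2-normalized image embeddings. Hypothetical pipeline;
# a real re-ID system would fine-tune on species-specific data.
import torch
from torchvision import models, transforms
from PIL import Image

weights = models.ResNet18_Weights.DEFAULT
backbone = models.resnet18(weights=weights)
backbone.fc = torch.nn.Identity()  # drop the classifier, keep 512-d features
backbone.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def embed(path):
    """Return a unit-length embedding for one animal photo."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        features = backbone(x).squeeze(0)
    return features / features.norm()

def identify(query_path, gallery, threshold=0.8):
    """gallery maps individual name -> stored embedding.
    Returns the best match above threshold, else None (a new individual)."""
    q = embed(query_path)
    name, score = max(((n, float(q @ e)) for n, e in gallery.items()),
                      key=lambda item: item[1])
    return name if score >= threshold else None
```

Run against a shared gallery, the same matching would let two people photographing the same coyote land on the same profile; the threshold is arbitrary here and would need tuning per species.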

︎























































How can we use AR and object recognition to author more interactive live-streaming content? How can the machine lead the human toward a more unexpected story?

CO-COOKING is an augmented live stream that is collaboratively authored by the chef, the audience, and the machine.
BY QIHANG FAN & WENYU DU
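
To ground the idea, here is a minimal sketch of the machine co-author: an off-the-shelf object detector running over live video frames and labeling the ingredients and tools it recognizes, which could then trigger overlays or audience prompts. The detector, the confidence cutoff, and the webcam standing in for the stream are assumptions for illustration, not the project’s implementation.

```python
# Sketch: run a pretrained COCO detector on live frames and draw what
# it sees. A webcam stands in for the live stream; the overlay is the
# hook where audience prompts or AR effects could attach.
import cv2
import torch
from torchvision.models.detection import (
    fasterrcnn_resnet50_fpn, FasterRCNN_ResNet50_FPN_Weights,
)

weights = FasterRCNN_ResNet50_FPN_Weights.DEFAULT
detector = fasterrcnn_resnet50_fpn(weights=weights).eval()
categories = weights.meta["categories"]  # COCO class names

cap = cv2.VideoCapture(0)  # stand-in for the live stream feed
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # BGR uint8 frame -> normalized RGB float tensor for the detector
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    x = torch.from_numpy(rgb).permute(2, 0, 1).float() / 255.0
    with torch.no_grad():
        out = detector([x])[0]
    for box, label, score in zip(out["boxes"], out["labels"], out["scores"]):
        if score < 0.7:  # arbitrary confidence cutoff
            continue
        x0, y0, x1, y1 = map(int, box)
        cv2.rectangle(frame, (x0, y0), (x1, y1), (0, 255, 0), 2)
        cv2.putText(frame, categories[int(label)], (x0, y0 - 6),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("CO-COOKING machine co-author", frame)
    if cv2.waitKey(1) == 27:  # Esc to stop
        break
cap.release()
cv2.destroyAllWindows()
```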

︎

















































How can machine learning help us explore past, present, and future versions of ourselves with AR?

MULTIVERSE AGENT is a custom 3D avatar and autonomous agent that helps people decide which path to take by exploring multiple experiences simultaneously. The avatar learns from personalized datasets and acts as a mirror, revealing ways to better understand ourselves. Watch your avatar’s experiences through real-time AR and Snap stories.
BY SUSIE MOON

︎