More detailed descriptions are coming later this week, but here are the screens of the current version of Ambient MET, which I submitted as my final project.
David Pierce from WIRED runs down some interesting histories of unreleased and short-lived products that I'd never heard of. It's interesting to imagine the different ways the connected-device landscape could have played out.
Pierce looks at the same future/now that Mark Weiser and John Seely Brown at Xerox PARC envisioned in "The Coming Age of Calm Technology," i.e. that one day we'd be surrounded by many computers.
The difference is that instead of designing for elegant peripheral information and experiences that users can seamlessly bring into and out of focus at any time, a la "Calm Technology," Pierce argues that the focus should now be on designing an AI service layer that elegantly connects all your devices. "So all you need to do is log in." Groovy...
I will continue on the same path that interested me during the midterm: an iOS app that bundles a suite of interactive, ambient music soundscapes that respond to different galleries at the Metropolitan Museum of Art.
After meeting with ITP alum and MET employee Spencer Kiser, my sense that this might also interest others in the museum-going community was reinvigorated. We connected over a shared love of sound design and Janet Cardiff's Her Long Black Hair, and Spencer offered to help in ways that might benefit the project's development. In particular, I might be able to enter the MET before visitors have arrived to capture Impulse Response (IR) recordings of the empty galleries. Using the natural decay of an IR (a discrete event like a clap or other sharp attack), I can model room-specific Convolution Reverbs that make the audio samples of a soundscape realistically sound as though they exist within that room.
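At its core, a convolution reverb is just the discrete convolution of a dry signal with the room's impulse response. Here is a minimal sketch of that math in plain JavaScript; a real-time engine (or Web Audio's ConvolverNode) would use FFT-based partitioned convolution instead, and the sample values below are made up for illustration:

```javascript
// Convolve a dry mono signal with an impulse response (IR).
// Direct-form convolution: O(n*m), fine for illustration only.
function convolve(dry, ir) {
  const wet = new Float64Array(dry.length + ir.length - 1);
  for (let i = 0; i < dry.length; i++) {
    for (let j = 0; j < ir.length; j++) {
      wet[i + j] += dry[i] * ir[j];
    }
  }
  return wet;
}

// A unit impulse ("click") convolved with any IR returns the IR itself,
// which is exactly why a sharp clap in an empty gallery captures that
// room's reverb tail.
const ir = [1.0, 0.6, 0.36, 0.2]; // hypothetical decaying reverb tail
const click = [1, 0, 0, 0];
const wet = convolve(click, ir);
```

This also makes clear why the galleries need to be empty for the recordings: any visitor noise during the capture would be baked into the IR and smeared across every sample the reverb processes.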
I've also made some preliminary visits of my own to the MET, looking for ideal rooms and galleries, and making some iPhone field recordings of the spaces with visitors in them. This is a key element to each soundscape's ability to convince your brain of the mixed reality element, which I learned firsthand from Janet Cardiff's incredibly simple yet effective piece.
Here are some images from my MET visits and their associated field recordings.
There's also a paper that recently affected me very much, "The Coming Age of Calm Technology" by Mark Weiser and John Seely Brown. They authored it in 1996, building on their experiences at Xerox PARC. They correctly predicted the coming age of IoT, and argued that in order to retain our personhood/sanity/calmness, we must design these kinds of experiences and smart objects to exist seamlessly within our cognitive periphery. The more effortlessly we can pull something from the periphery to the center of our focus, the more control we'll feel over that experience, and thus the calmer we will feel. It is not enough for my project to simply align aesthetically with the idea of calmness; it must also be capable of gracefully transitioning from the periphery of your museum visit to the center of your focus.
In visiting the MET, what you see -- especially in the larger rooms with abundant seating options -- is that the majority of people pull out their phones when taking a break from the art. This is an opportunity! Both in the interest of my project, and hopefully the institution's as well: keeping users creatively engaged with their visit, versus logging on to social media, is an effective way to harness our inclination to reach for our phones in moments of downtime.
Implement vector analysis of user touch to drive a physics engine that allows realistic "flinging" of the balls.
Consider three.js to turn the balls into spheres or low-poly 3D objects that bear greater resemblance to the objects in each gallery.
Complete IR room recordings.
Complete sound designing the audio source material of each response piece.
Redraw a MET map for user navigation that's relevant and aesthetically linked to Ambient MET.
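For the fling item above, one minimal approach is to derive a velocity vector from the last two touch samples and let a friction coefficient decay it each frame. This is only a sketch under assumed conditions (a ~60 fps update loop, touch samples with x/y in points and t in milliseconds); the function names are placeholders, not an existing API:

```javascript
// Estimate a fling velocity from the last two touch samples.
// prev/curr are { x, y, t } with t in milliseconds.
function flingVelocity(prev, curr) {
  const dt = (curr.t - prev.t) / 1000; // seconds between samples
  return { vx: (curr.x - prev.x) / dt, vy: (curr.y - prev.y) / dt };
}

// Advance a ball one frame, decaying its velocity toward rest.
// Assumes a ~60 fps loop; friction < 1 bleeds off speed each frame.
function step(ball, friction = 0.98) {
  ball.x += ball.vx / 60;
  ball.y += ball.vy / 60;
  ball.vx *= friction;
  ball.vy *= friction;
  return ball;
}

// Example: a 30pt rightward swipe over 100ms yields a 300 pt/s fling.
const v = flingVelocity({ x: 0, y: 0, t: 0 }, { x: 30, y: -15, t: 100 });
const ball = step({ x: 0, y: 0, ...v });
```

A production version would average over several recent samples to smooth out touch jitter, but the core idea -- position delta over time delta, then per-frame decay -- stays the same.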
After having fallen in love with Janet Cardiff's audio walk in Central Park Her Long Black Hair, I'd been searching for an idea/place/event to match with my own interactive audio experiment.
Ambient Machine was a final project for ICM, and I've been interested in finding a new context for it in the mobile realm. Using the Google Maps API to determine the user's location is something I'd like to pursue.
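Once the device reports coordinates, picking which soundscape to surface could be as simple as a nearest-gallery lookup by great-circle distance. A rough sketch, where the gallery names and coordinates are placeholders I've invented for illustration, not the Met's actual layout:

```javascript
// Great-circle distance between two { lat, lon } points, in meters.
function haversineMeters(a, b) {
  const R = 6371000; // mean Earth radius in meters
  const rad = (d) => (d * Math.PI) / 180;
  const dLat = rad(b.lat - a.lat);
  const dLon = rad(b.lon - a.lon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(a.lat)) * Math.cos(rad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Return the gallery closest to the user's position.
function nearestGallery(user, galleries) {
  return galleries.reduce((best, g) =>
    haversineMeters(user, g) < haversineMeters(user, best) ? g : best);
}

// Placeholder gallery coordinates for illustration only.
const galleries = [
  { name: 'Temple of Dendur', lat: 40.7797, lon: -73.9625 },
  { name: 'Astor Court',      lat: 40.7794, lon: -73.9633 },
];
const nearest = nearestGallery({ lat: 40.7796, lon: -73.9626 }, galleries);
```

GPS accuracy indoors is poor, so in practice this would likely need to be combined with the redrawn MET map (letting the user confirm or correct which room they're in) rather than trusted on its own.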
Here are some wireframes of the different Ambient Machine instruments that serve as inspiration and framework for crafting the response pieces for the Met.
Who is this for?
This is an interactive creative experience designed to augment your visit to The Met. It is perfect for someone making either a first or repeat visit to the museum, whose scale and breadth can be overwhelming. Little moments of ancillary experience can fit within a day trip to the museum. The soundscapes will be made to respond both visually and sonically to different vantage points and pieces of art within the Met. Impulse Responses and field recordings could be captured in the spaces themselves, as well as photography to be incorporated into the look and feel of the app.
This week I pursued developing the collage/video app inspired by photogrammetry (working title: MOOODI), and started building out the different panels into a one-page app integrating jQuery.
Pictured here to the left is a rough idea of how the home screen would work. By using the camera plugin and embedding it into the large top square, you can easily begin, in one screen, to shape the input content for MOOODI's cryptic output...
In screens 2-6, you populate the different panels of MOOODI with more snaps from the camera, building out your collage and increasing the amount of content that will be used to create your impressionistic image.
The end result of your home screen after you've populated all the tile panels with imagery.
Lastly, you can add a song into this experiential stew, which turns MOOODI's final output into a video of your still image accompanied by a section of the selected song.
How can the practice of photogrammetric 3D modeling connect with visually oriented social media?
I love the look of these files. An in-between moment, an artifact in the middle of a pipeline between PhotoScan's compositing and a 3D graphics implementation. But in this moment they leave so much to the imagination. What object is represented? It's ironic that a texture map exploded like this so closely resembles countries plucked from their maps, decontextualized and arranged into new categorizations. What if there were an app like Instagram that gave its users the ability to capture a moment from a variety of perspectives? Arranged like a moodboard, each post could include a variety of photos that capture different angles, icons, and characters in a scene. You could attach a field recording, or embed the song you were listening to. Give your viewers a deeper impression of the moment you're sharing, but arranged cryptically, like these photogrammetry texture exports, in a way that encourages imagination.
Here's a pair of screenshots of my take on a simple, inspirational app: a Zen Koan master's guide to daily routines, paired with a lovely photo of Betty White. One is the original web-based version, and one is the version I'm adapting for my mobile build.
Lynda tutorials have been helpful in filling in my understanding of Cordova, Xcode and PhoneGap and getting a test build successfully completed.
This week I've been sleeping without my phone next to me. Now I leave it charging in the kitchen. It has fundamentally affected how I become restful, sleep, and wake up. Life has become even more plugged in since grad school began, and in the first semester I didn't always sleep very well. Often I'd be on my phone a bit as I got ready to close my eyes, and in moments of sleeplessness I'd reach for it to fill those frustrating little nocturnal gaps. Or was it because I knew I could reach for the phone that I had those sleepless moments in the first place? I'd also reach for the phone to wake me up, even though I knew it felt wrong to screen-zap my corneas into action. It was a circular predicament, and I've definitely noticed over the last week that even as the busyness of second semester sets in, I'm sleeping better and waking up more humanely. I'd bet that over the course of the week I've reduced my screen time by a few hours.
The anxiety and dependency that characterize our pervasive human-phone relationships are very much within our ability to mediate. I don't need to read emails before I've lifted my head from the pillow, and it's healthier to unwind in bed with a book or a conversation with my partner. Of course we know these things, but this week's homework prompted me to institute a simple change that has left me feeling a refreshing piece of my humanness restored.
80 pages in and it would appear that Eggers has his readers staring down the pike of a dystopic near-future characterized by a culture of pervasive techno surveillance. The Circle seems keen on forecasting our world’s evolving ideas of watching and viewing, and our growing collective comfort with the corporations responsible for this seismic shift in perspective. The Circle corporation is painted in the image of today’s Facebook or Google; a progressive work environment filled with lavish amounts of tech and glass walls, built around products and ideals that sound as forward-looking and moralistic as they do troubling and surreptitious.
The act of seeing is something Eggers is beginning to explore via the SeeChange product, a tiny streaming camera that can be easily hidden anywhere, and the charismatic CEO Eamon Bailey's mantra "ALL THAT HAPPENS MUST BE KNOWN." Bailey sees the company's new product as moving society toward an "all-knowing, all-seeing" reality. The Circle corporation names sections of its campus after periods of progress (e.g., the Renaissance, the Enlightenment), and views itself as surfing the front of history's epistemological wave.
The concept of visual surrogates, those who would live-stream events to others unable to attend, strikes me as both unsettling and sad. Alternatively, the idea that police, or anyone, should behave as though they are under perpetual, potential surveillance by a world of online eyeballs is both fascinating and problematic. It's not so different from our current reality; certainly cell phone footage of people behaving illegally in the last two years has been responsible for great forward leaps in holding cops and others accountable for their bad actions.
The Circle appears to be locked on a crash course with a zero-privacy world in which we've allowed a FOMO-amphetamine paradigm to increasingly dehumanize us. The sweet character of Mae's doting father, who suffers from MS, embodies this point of no return. As he rests in the car in a restaurant's parking lot, staring up at the 'interlocking boughs of an unremarkable tree,' he rolls down his front-seat window and remarks, "Well, this has been wonderful." It would seem this is not merely a comment about family brunch, but about the essence of vision, the shadowy and tangled structures of systemic power, and the challenges of decoding a rapidly changing reality. A vestige of techno-innocence sits squarely in our rear-view mirror as we plunge over the falls.