
    Amelia Winger-Bearskin

REEL of Interactive Performance Projects

Interactive Performance Art and Video

network visualization

Currently Amelia is working with her research group, the MultiAgency Collective, and Caroline Woolard on

THE EXCHANGE ARCHIVE

a research database for projects about exchange. From unconventional dialogue to reciprocity systems, the Exchange Archive supports research about contemporary exchange practices. We aim to support artistic research by making legible the people, ideas, and materials that surround exchange-based work today, and by highlighting connections between works.

 

WHO WE ARE:

The Exchange Archive was conceived, designed, and developed as a collaboration between the MultiAgency Collective and Caroline Woolard, as part of MoMA Studio: Exchange Café.

MultiAgency Collective: (in alphabetical order)

Amelia Winger-Bearskin is an Interactive Performance Artist living in Brooklyn. She has been a solo performer at numerous international performance festivals and art biennials since 2008, in cities including Beijing, China; Manila, Philippines; Seoul and Gwangju, South Korea; Sao Paulo, Brazil; Tasmania, Australia; New York, NY; and Washington, DC. Her newest works are audience experiences merging the worlds of computational media, physical computing, and agent-based software.

Corey Brady is a designer of technologies for collaborative knowledge building. He is president of Inquire Learning, LLC, and co-Principal Investigator on two grants from the National Science Foundation. He holds a PhD in mathematics education, as well as master’s degrees in both mathematics (algebraic topology) and English (Victorian literature).

Pratim Sengupta is an assistant professor of Learning Sciences at Vanderbilt University, and recipient of the National Science Foundation’s Early CAREER award. He designs multi-agent based and tangible computational systems for a wide age group of learners, and studies cognitive development of both children and adults. He directs the Mind, Matter & Media Lab.

Developer: Mason Wright

Mason Wright is a software developer at Vanderbilt University. He holds a master’s degree in science education from Vanderbilt and has taught high school physics. As a programmer, he builds visual programming environments for students in grade school through high school, with funding from the NSF at the Mind, Matter & Media Lab. He has also used JavaScript and HTML5 to implement educational web games that can be played on mobile devices.

Exchange Café Organizer: Caroline Woolard

Caroline Woolard is an artist and organizer based in Brooklyn, New York. Making sculptures, furniture, events, and workshops, Woolard co-creates spaces for critical exchange, forgotten histories, and plausible futures. In 2009, Woolard cofounded three organizations to support collaborative cultural production: a studio space, a barter network, and Trade School. Woolard teaches at the New School and is a Fellow at Eyebeam.

I am currently a Master’s Candidate at NYU – TISCH – ITP (Interactive Telecommunications Program) in New York City.

Final Projects from my first semester at ITP Fall 2013:

Touching the Voice

a vocal instrument using the LeapMotion

 

Created with Processing (the LeapMotion library for Processing, Minim, and oscP5 to Max/MSP)

 

Touching the Voice is an instrument for the voice. It makes the expressions of the human instrument visible through intimate hand gestures. I use the LEAP rather than the Kinect as a voice-control tool for live performance because it can pick up the quiet movements that articulate the intimate voice (the microphone voice). I would like the small articulations of the hands that come naturally with vocal performance, the gestures a singer uses to convey emotion and to let the audience visualize what is happening inside the singer’s body, to control the voice.

You may have seen this project on QuiteBeyond.de, a private tech blog about the Leap Motion by Maximilian Werkhausen.

 

See Processing and MAX Code

Processing Code:

/*

Leap Motion library for Processing.

Copyright (c) 2012-2013 held jointly by the individual authors.

*/

// all imports gathered at the top of the sketch

import ddf.minim.*;

import com.leapmotion.leap.Controller;

import com.leapmotion.leap.Finger;

import com.leapmotion.leap.Frame;

import com.leapmotion.leap.Hand;

import com.leapmotion.leap.Gesture;

import com.leapmotion.leap.processing.LeapMotion;

import oscP5.*;

import netP5.*;

AudioInput in;

Minim minim;

// one AudioSample per gesture type

AudioSample circuit;

AudioSample la1;

AudioSample la2;

AudioSample metro;

AudioSample tone;

float val;

int val2;

int bufferSize = 4;

int fftSize = floor(bufferSize*.9)+1;

int fingers = 0;

LeapMotion leapMotion;

OscP5 oscP5;

NetAddress addressForMax;

float[] leapValues = new float[3];

void setup()

{

size(1000,650);

background(20);

frameRate(60);

//textAlign(CENTER);

minim = new Minim(this);

//minim sounds

circuit = minim.loadSample("circuit.mp3", 512);

la1 = minim.loadSample("la1.mp3", 512);

la2 = minim.loadSample("la2.mp3", 512);

metro = minim.loadSample("metro.mp3", 512);

tone = minim.loadSample("tone.mp3", 512);

//end of sounds

leapMotion = new LeapMotion(this);

oscP5 = new OscP5(this,12346);

addressForMax = new NetAddress("localhost",12345);

// use the getLineIn method of the Minim object to get an AudioInput

in = minim.getLineIn();

// enable monitoring to *hear* what is being visualized, in addition to seeing it

in.enableMonitoring();

}

void draw()

{background(0);

//fill(20);

//rect(0, 0, width, height);

fill(255);

textSize(2*height/5.0);

text(String.valueOf(fingers), width/6, 11*height/11.0);

for(int i = 0; i < in.bufferSize() - 1; i++)

{ stroke(0);

strokeWeight(1);

line( i*8, 900 + in.left.get(i)*500, i+height/2, 1 + in.left.get(i+1)*7);

stroke(255,43);

strokeWeight(7);

line( i, 1 + in.right.get(i)*900, height, 550 + in.right.get(i+1)*10 );

//ellipse(5 + in.left.get(i)*150, height-in.right.get(i)*150, 500, 500);

}

OscMessage myMessage = new OscMessage("/finger");

myMessage.add(fingers);

oscP5.send(myMessage,addressForMax);

}

//more minimminimminim

void onInit(final Controller controller)

{

controller.enableGesture(Gesture.Type.TYPE_CIRCLE);

controller.enableGesture(Gesture.Type.TYPE_KEY_TAP);

controller.enableGesture(Gesture.Type.TYPE_SCREEN_TAP);

controller.enableGesture(Gesture.Type.TYPE_SWIPE);

// enable background policy

controller.setPolicyFlags(Controller.PolicyFlag.POLICY_BACKGROUND_FRAMES);

}

//miniminiminiminimin

void onFrame(final Controller controller)

{

Frame frame = controller.frame();

if (frame.hands().isEmpty())

{

fingers = 0;

}

else

{

int c = 0;

for (Hand hand : frame.hands())

{

c = Math.max(c, hand.fingers().count());

}

fingers = c;

}

for (Gesture gesture : frame.gestures())

{

if ("TYPE_CIRCLE".equals(gesture.type().toString()) && "STATE_START".equals(gesture.state().toString())) {

circuit.trigger();

}

else if ("TYPE_KEY_TAP".equals(gesture.type().toString()) && "STATE_STOP".equals(gesture.state().toString())) {

la1.trigger();

}

else if ("TYPE_SWIPE".equals(gesture.type().toString()) && "STATE_START".equals(gesture.state().toString())) {

la2.trigger();

}

else if ("TYPE_SCREEN_TAP".equals(gesture.type().toString()) && "STATE_STOP".equals(gesture.state().toString())) {

metro.trigger();

}

// NOTE: this condition duplicates the previous branch, so tone.trigger()
// is unreachable as written; it needs a distinct gesture/state pair.

else if ("TYPE_SCREEN_TAP".equals(gesture.type().toString()) && "STATE_STOP".equals(gesture.state().toString())) {

tone.trigger();

}

println("gesture " + gesture + " id " + gesture.id() + " type " + gesture.type() + " state " + gesture.state() + " duration " + gesture.duration() + " durationSeconds " + gesture.durationSeconds());

}

}

void stop()

{

circuit.close();

la1.close();

la2.close();

metro.close();

tone.close();

super.stop();

}

//lessminiminiminim

//thank you Justin Lang for helping me with the OSC for Processing Library

Max/MSP Code:

I’d never used Max before, and this was a very fun and challenging way to learn a new program in two weeks, but I am very happy with the way it turned out.

leapvocals2_amelia
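Under the hood, oscP5 sends the finger count over UDP to port 12345 as a binary OSC message at the address /finger. For readers without Max handy, here is a minimal sketch (an illustration of the wire format, not the actual patch) that encodes and decodes such a packet using only the Python standard library:

```python
import struct

def osc_string(s):
    """Encode an OSC string: null-terminated, padded to a multiple of 4 bytes."""
    b = s.encode() + b"\x00"
    return b + b"\x00" * ((4 - len(b) % 4) % 4)

def encode_finger(count):
    """Packet equivalent to oscP5's `new OscMessage("/finger").add(count)`:
    address, type-tag string ",i", then a big-endian int32 argument."""
    return osc_string("/finger") + osc_string(",i") + struct.pack(">i", count)

def decode(packet):
    """Minimal decoder for a single-int OSC message like the one above."""
    addr_end = packet.index(b"\x00")
    addr = packet[:addr_end].decode()
    off = (addr_end + 4) // 4 * 4  # skip the padded address string
    off += 4                        # skip the padded ",i" type tag
    (value,) = struct.unpack(">i", packet[off:off + 4])
    return addr, value

print(decode(encode_finger(5)))  # ('/finger', 5)
```

Sending `encode_finger(n)` over a UDP socket to localhost:12345 would reach the same listener the Max patch uses.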

 Solar Stitching

Managing E-Waste in Africa and creating a maker resource online at SolarStitching.com

a project for the DESIGN FOR UNICEF

class at NYU – ITP, presented to the UNICEF Innovation Lab on December 13th, 2013 at UNICEF Headquarters in New York City.

by

Rocío Almanza-Guillén, Shilpan Bhagat, Amelia Winger-Bearskin


What are we doing?

Generating energy opportunities around proper e-waste management, specifically from solar panels.

What is the goal of this website?

Hosting simple how-to videos and a visual guide anyone can use to:

*  repair solar panels
*  build new solar panels from broken cells
*  use solar-powered soldering techniques
*  use a variety of diverse supplies to fix and build panels
*  learn about managing e-waste in Africa and all over the world
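To give a sense of the kind of math behind building a panel from reclaimed cells (the figures here are typical textbook values, not numbers from the guides): a crystalline solar cell produces roughly 0.5 V regardless of its size, so broken-but-working cells are wired in series until the panel reaches a usable voltage, such as the ~18 V needed to charge a 12 V battery.

```python
import math

# Rough sizing sketch for a panel built from reclaimed cells.
# These values are illustrative assumptions, not from SolarStitching.com.
CELL_VOLTAGE = 0.5     # volts per crystalline cell, roughly constant
TARGET_VOLTAGE = 18.0  # volts typically needed to charge a 12 V battery

cells_in_series = math.ceil(TARGET_VOLTAGE / CELL_VOLTAGE)
print(cells_in_series)  # 36 cells wired in series
```

This is why 36-cell layouts are so common in small 12 V systems: the count comes straight from the per-cell voltage, not the cells' size or condition.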

H . A . U . T . E

Hidden and UnTagable Eyewear

HAUTE

Celebs and adulterers: for a night out or a clandestine encounter, these shades will keep you shaded from unwanted photographs and Facebook walls.

Using infrared and ultra-bright LEDs, these glasses make your face unrecognizable (and un-taggable) in photographs while keeping you looking stylish and uninhibited.
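On the electronics side, each LED in a build like this needs a current-limiting resistor, sized with Ohm's law. The part values below are illustrative assumptions, not the actual HAUTE parts list:

```python
# Hypothetical resistor sizing for one IR LED (Ohm's law: R = (Vs - Vf) / I).
# All values are illustrative assumptions, not the actual HAUTE components.
V_SUPPLY = 5.0   # volts from the battery pack
V_FORWARD = 1.5  # typical forward voltage of a 940 nm IR LED
I_LED = 0.020    # 20 mA drive current

resistance = (V_SUPPLY - V_FORWARD) / I_LED
print(resistance)  # 175.0 ohms; round up to the next standard value, 180 ohms
```

Rounding up to the next standard resistor value keeps the current slightly under the target, which is the safe direction for the LED.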

H.A.U.T.E. was part of Wearable Tech Event Fall 2013 at Huge in DUMBO.
From the event press:

“Google Glass has revolutionized the way people make fun of wearable tech. But being-ripe-for-parody aside, there is no doubt that wearable tech is here to stay. Many of us have thought about purchasing a Pebble watch or a Nike Fuel Band, and these well known products only scratch the surface of what’s possible with current tech.”

Technical.ly did a write-up on the event and posted some photos as well.

NEWS:

Amelia is a graduate student at NYU-TISCH-ITP (Interactive Telecommunications Program) in New York City. At ITP she works with physical computing and creative coding to create interactive, responsive systems for human and machine learning, live performance art, and wearable technologies. Her ITP blog can be found at http://www.brooklyninvention.com.

Her video artwork has been included in the 2014 Storytelling : La biennale d’art contemporain autochtone, 2e édition (Art Biennale of Contemporary Native Art) at Art Mur (Montreal, Canada).

BIO:

Amelia Winger-Bearskin works with modeling (as defined by agent-based computer programming) as a conceptual prompt in her performance and interactive work. She has developed a concept of Open Source Performance Art (OSPA) and has spoken about OSPA at various academic conferences and performance festivals since 2010. She has been a solo performer at numerous international performance festivals since 2008 in cities including Beijing, China; Manila, Philippines; Seoul, South Korea; Sao Paulo, Brazil; New York, NY; and Washington, DC. She performed as part of the 2012 Gwangju Biennial. Currently she is a Graduate Student at NYU-TISCH-ITP (Interactive Telecommunications Program), 2013-2015, and a Program Designer for NEU (New Engineering University).

 

