„The future of the body could be thought about without the hype of technophilic utopia, simply because flesh is always wet, intimate, juicy and a body of bleeding and suffering ... the point is to remember that we might have been otherwise, and might yet be ...“
Focus of Future Bodies
Contact: Dr. Jill Scott, jscott@smile.ch; A. Schiffler, aschiffler@sympatico.ca
What are Future Bodies?
Future Bodies are nomadic text- and visual-based entities, floating in cyberspace, waiting for interaction through the net. These characters, who live in cyberspace, create an interactive, non-linear, net-based hyper drama which the viewer can influence.
Future Bodies experiments with the notion of shared interactivity and unpredictability through the potentials of genetic programming.
Future Bodies researches the potential of script-writing over the internet and the resultant interactive drama.
Future Bodies extends the public interest in virtual characters and interactive fiction in relationship to bio-technology and ethics.
Future Bodies uses the interests and background of the audience as inputs, allowing the characters to be directly influenced by the psychological state of the viewing public. The characters can cross over gender and class boundaries, as well as be modified by each other.
Future Bodies progresses through three stages: the experimental phase 1, completed during 1999, produced a new paradigm for mixed realities which is applied in the following two stages. This project aims at implementing phase 2, which is designed to bring this paradigm to a wider audience, and phase 3, which presents some results as a „smart sculpture“.
Can we design new interfaces which enhance both the symbolic and the metaphorical relationship between the organic human body and the content or agent which „sits“ on a computer? The term interface may be extended to include the cognitive and emotional aspects of a face-to-face dialogue, as well as the potential to make both objects and architecture smart and more sociologically present.
The applicant’s work is at the forefront of media art. Through her creative explorations in performance art, video art and new media over the last three decades, her works have led to a much broader concept of the human body, the represented body and the body of the audience. For Future Bodies, she has assembled a group of people from varying fields and backgrounds, bringing together the artistic design of her vision, a sociological experiment on the internet and state-of-the-art technology to create a unique project with the potential to touch a wide-ranging and diversified audience.
We intend to design the interface of phase 2 to include a „send-to-a-friend“ function so it can spread from user to user. Since a select and instructed initial user group will be involved at the start of this part of the project, this function will ensure quick visibility and widespread use of the project’s software. Since phase 3 is intended as a one-time public event or a museum exhibit, poster announcements in the area of the event seem most appropriate and will be undertaken.
The publicly accessible website will also serve as a starting point and archive for the project. It will be promoted through news releases to the appropriate institutions when it is launched.
History: Future Bodies Phase 1
The first stage of Future Bodies was constructed in conjunction with a French exchange student for „Fusion 99“ at the Media Faculty of the Bauhaus University, Weimar, Germany. A Hotline (http://www.bigredh.com) chat interface with Future Bodies was created and explored in real-time over the net with participants from Weimar and the Australian Network for Art and Technology. Future Bodies involved net-based characters which existed only in text format and appeared in real-time within the common protocol of a Hotline chat session. They were genetically deficient characters, programmed to behave according to prescribed behaviours called „Brains“. The Brains are written and modified by users with a keyword-based program and randomly generated responses (see appendix „Future Bodies Chatterbot Brains“). The characters were called Ms. Poor, Ms. Rich, and Ms. Perfect.
The virtual characters automatically participated in an ongoing chat, either by responding to keywords that appeared or by conjugating input from the chat and responding randomly at different times (see appendix „Log of Chat during FUSION 99“). Participants were encouraged to find out more about the Future Bodies as well as to write and modify the „Brains“ (the keyword-response mapping agents) in order to change the characters’ moods, modify them or evolve them. The program also allowed the virtual characters to react to each other’s responses.
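The keyword-response mechanism of the „Brains“ can be sketched as a small table mapping keywords to candidate replies, with a random fallback when nothing matches. This is an illustrative reconstruction only; the keywords and replies below are invented and are not taken from the actual Phase 1 Brains.

```python
import random

class Brain:
    """Minimal keyword-response agent in the spirit of the Phase 1 'Brains'.

    Keywords map to lists of candidate replies; one is chosen at random.
    If no keyword matches, a generic fallback line is used, mimicking the
    randomly generated responses of the chat characters.
    """

    def __init__(self, rules, fallbacks):
        self.rules = rules          # {keyword: [candidate responses]}
        self.fallbacks = fallbacks  # generic lines when nothing matches

    def respond(self, message):
        text = message.lower()
        for keyword, responses in self.rules.items():
            if keyword in text:
                return random.choice(responses)
        return random.choice(self.fallbacks)

# Hypothetical character, not an actual Phase 1 Brain
ms_perfect = Brain(
    rules={
        "gene": ["My genes were edited before I was born."],
        "body": ["Which body do you mean - mine, or yours?"],
    },
    fallbacks=["Tell me more.", "I am listening."],
)
```

Users modifying a Brain would, in this reading, simply be editing the rule table, which is why moods could be changed or evolved without touching the program itself.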
The project explored the real-time potential of virtual characters mixing with real characters over the net, creating a hive of interaction and surprising reactions. The result was recorded and is available for inspection and analysis. It was also a new paradigm of mixed realities: traditionally, augmented or mixed reality (AR) research aims to develop technologies that overlay computer-generated 2D or 3D virtual objects on the real world, while Future Bodies chose the method of textual agents embedded in a live chat session.
Fig. 1: Immortal Duality: The interactive Videowall where the audience can play with the history of genetic engineering which they see reflected inside their own shadows, ZKM, Karlsruhe, Germany
The next stage aims to extend the concepts of Future Bodies to a more general public over an extended period of time. The programming component would include the potential of visual interfaces to the bodies, which have not yet been included. It is also desired to create a user group around the project. This phase would involve: 1) a more stable server base for the operation of the Future Bodies; 2) design and implementation of a custom interface application that appeals to a wide audience; and 3) creation of a project website, including links to software already being developed by collaborating partners, which will serve as a distribution point for the application. It is planned to create a more active collaboration between the partners of the project, an exchange which would go both ways.
The implementation of the Future Bodies in this phase draws on the collaborators’ expertise, existing software engineering, developed design ideas, ongoing projects and commercial products of the project’s core group. The intent is to create a compelling, organic user interface using the AppWares Integrated Media Portal (IMP) technology created by the AppWares Development Group (http://www.appwares.com). Embedded into this interface is a Java-based multi-agent system that brings the Future Bodies to life. These software agents and the web-based interface are to be loosely based on the „Proxy“ system (http://proxy.arts.uci.edu), a research and development project dealing with issues of knowledge discovery, file-sharing, and information mis/management in relation to networked identity construction and collective behavior. An online server creates the internet presence for the Future Bodies in the form of backend services required to manipulate the Bodies, to serve a public website, and to collect the information that is created by the interacting Future Bodies.
The AppWares IMP is a shaped window that can embed a mini-browser. It can look like anything and can be scripted to animate like a Macromedia Director application. It can play music and download files in the background. It can also interact with any website that is being displayed on the user’s computer. It is distributed as a compact Windows executable that the user installs on the computer. The application also allows for easy distribution through a „send-to-a-friend“ feature that forwards the application via email. This application is the key to a more visual representation of the Future Bodies, required to intensify the interaction with the otherwise abstract textual characters. The initial design is determined by a questionnaire presented to the user when a user’s Future Body is first created, but it will gradually shift in response to the Future Body’s psychological state. A Java applet that can run in any web browser is embedded into the IMP and provides the necessary textual and graphical interface by which the character is controlled.
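The questionnaire-driven design could work by mapping each answer to a visual parameter of the IMP skin, which then drifts toward the Body’s current psychological state over time. The question names, skin parameters and blending scheme below are assumptions made for illustration; they are not part of the IMP specification.

```python
def initial_design(answers, drift=0.0, mood=0.5):
    """Derive hypothetical IMP skin parameters from questionnaire answers.

    answers: {question_id: value in [0, 1]} from the creation questionnaire
    drift:   how far the design has shifted toward the current mood
             (0.0 = pure questionnaire design, 1.0 = pure mood-driven design)
    mood:    the Future Body's current psychological state in [0, 1]
    Returns a dict of design parameters blended between the two sources.
    """
    base = {
        "warmth": answers.get("optimism", 0.5),
        "angularity": 1.0 - answers.get("openness", 0.5),
    }
    # Linear blend between the initial design and the mood-driven design
    return {k: (1 - drift) * v + drift * mood for k, v in base.items()}

# A freshly created Body: the design reflects the questionnaire alone
design = initial_design({"optimism": 0.8, "openness": 0.2}, drift=0.0)
```

Increasing `drift` over the Body’s lifetime would produce the gradual visual shift described above.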
PROXY utilizes user-defined psychological attributes for generating agents, which then influence agent behavior during ongoing interaction within a community of users. A Future Body using the agent system based on PROXY represents a leap forward in the detail and depth of characters when compared to the Hotline-based „Brains“ of Phase 1.
In PROXY, the emphasis has been on agent personification, and the ability to autonomously and asynchronously fulfill specific tasks while exploring the notion that ongoing interaction does not necessarily require or even benefit from ongoing communication. Agents catalyze community formation by coupling utilitarian function with playfully emergent and unpredictable behavior. For Future Bodies, a distributed audience base would establish a set of initial psychological parameters through web-based polling, and then begin interacting with a Future Body.
The body’s behavior would, to a degree, be dictated by those parameters, as control scripts from the participants’ remote locations dynamically update the Future Bodies’ server. However, in theory, a Future Body’s behavior could also be the sum of the collective psychological state of currently connected users, as well as past users no longer connected, providing a much richer set of behavioral possibilities.
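One way to read the „collective psychological state“: each Future Body aggregates the parameters of everyone who has interacted with it, weighting currently connected users more heavily than past ones. The parameter names and the weighting scheme below are illustrative assumptions, not part of PROXY.

```python
def collective_state(user_params, connected, past_weight=0.3):
    """Aggregate per-user psychological parameters into one behavioural state.

    user_params: {user_id: {param_name: value in [0, 1]}}
    connected:   set of user_ids currently online (full weight); users no
                 longer connected contribute with the reduced past_weight.
    Returns the weighted mean of each parameter across all users.
    """
    totals, weights = {}, {}
    for user, params in user_params.items():
        w = 1.0 if user in connected else past_weight
        for name, value in params.items():
            totals[name] = totals.get(name, 0.0) + w * value
            weights[name] = weights.get(name, 0.0) + w
    return {name: totals[name] / weights[name] for name in totals}

# Hypothetical parameters: one user online, one long gone
state = collective_state(
    {"ada": {"optimism": 0.9}, "ben": {"optimism": 0.1}},
    connected={"ada"},
)
```

Because past users keep a nonzero weight, a Body’s behaviour retains a trace of everyone who ever shaped it, which is the richer possibility described above.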
The intent is to create a bi-directional system, where the main interface to a Future Body would be its own custom-designed IMP, which may also integrate components of PROXY. At the same time, a Future Body could be accessed through PROXY as a plugin (see Fig. 2). This would leverage existing infrastructure while taking AppWares IMP and PROXY in new and unexpected directions.
Fig. 2: The IMP/Applet interface with morphing design on WinXP desktop.
The goal is to design an integrated system with four components:
A website that provides a home base for the project, downloads, interfaces for unsupported platforms, as well as archives of past states for research into Future Bodies.
An integrated media portal (IMP) for the Windows platform as a visual representation of the virtual character’s Future Body, with a complex and dynamic design that depends on the character’s state.
A Java-based applet to control and maintain the virtual character, provide a custom chat interface, maintain and display state, and control agents.
A centralized archiving and management server system to create characters, track interactions and socialize agents into their own Future Bodies.
A user receives a Future Body through the website as a custom Windows executable IMP. Development steps for the IMP are based on the 2D designs which can be rendered from the Future Body visualizations used in phase 3. The images and masks are combined with program logic and internet connectivity to the character server for state control. An embedded mini-browser is used to run the applet that provides the interface to the Future Body’s state and all agent functionality. Active areas on the IMP allow users to control functionality, connect back to the website, send the IMP to a friend, and use other functions yet to be defined. The central server is designed to track, record and filter interesting interactions that arise between Future Bodies. Users might participate actively through the IMP application, or observe passively through interfaces presented on the website. The user can also witness the changing characters of the agents as they talk to each other about ethics and bio-technology.
Since active participation is expected to grow into the thousands of users, the server software and connection protocol have to be designed to handle such loads.
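The server’s track-record-and-filter role could be sketched as follows: interactions arrive as small records and are archived only when they pass an „interestingness“ filter, with a bounded buffer to keep memory under control at high load. The record fields, the scoring idea and the threshold are hypothetical, not a specified protocol.

```python
from collections import deque

class InteractionArchive:
    """Record interactions between Future Bodies, keeping the interesting ones.

    Each interaction is a dict with hypothetical fields: the two bodies
    involved, the exchanged text, and a score (imagined here as, e.g.,
    keyword density on ethics and bio-technology topics). Only records at
    or above the threshold are archived; a bounded deque caps memory so
    the server degrades gracefully under thousands of users.
    """

    def __init__(self, threshold=0.5, capacity=10000):
        self.threshold = threshold
        self.archive = deque(maxlen=capacity)  # oldest entries drop out first
        self.seen = 0                          # total interactions observed

    def record(self, interaction):
        self.seen += 1
        if interaction["score"] >= self.threshold:
            self.archive.append(interaction)
            return True
        return False

archive = InteractionArchive()
archive.record({"from": "Ms. Rich", "to": "Ms. Poor", "text": "...", "score": 0.8})
archive.record({"from": "Ms. Poor", "to": "Ms. Rich", "text": "...", "score": 0.1})
```

The same archive could back both the website’s passive-observer views and the research archive of past states mentioned earlier.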
As public access increases, the project becomes more visible and the interactions more interesting. To create a connection to public space and a public exhibit or event, the software is integrated into three portable constructions or „smart sculptures“ (see Fig. 3, see appendix „Future Bodies Interface“, 4 pages). They create a public-access microscope into the world of the Future Bodies by means of a 3D visualization generated with Derivative Inc.‘s „Touch 007“ system (http://www.derivativeinc.com). The images are emotional reactions to the words present in the databases maintained for Phase 2, and the viewer can explore Future Bodies by creating a metamorphic show between the body models received from the net (see also cover page).
Fig. 3: Prototype designs for three „smart sculptures“
The „smart sculptures“ (see Fig. 3 and additional material) are designed to represent a combination of the artificial, the virtual and the organic. A humanoid-shaped terminal resembling an old radio houses a computer and screen with an internet connection, and has several simple and robust button interfaces. The computer creates live 3D visualizations using the „TouchPlayer“ software. The 3D visuals are based on a set of pre-defined animations but are rendered in real-time. The selection of scenes is based on Future Bodies data retrieved over the net. The user is able to control transitions between bodies, for example between the artificial and the organic.
Derivative Inc.‘s TouchDesigner system allows for the creation and visualization of 3D scenes on a standard PC platform in a way that enables one to create expressive, spontaneous digital imagery that can be shared over the internet. Since Derivative Inc.‘s product is geared toward live performance of 3D visuals by a VJ during public events such as „raves“ or media performances, the possibility exists to place Future Body interface sculptures in the context of a live media performance in a night-club or a museum event. Preproduced 3D visuals are combined in scenes and connected and controlled by the emotional states of the virtual characters, as well as the viewer’s interaction. A computer running TouchPlayer and equipped with a custom interface for scene control is placed inside the sculpture. An internet connection can be used, if available, to retrieve new states from the server and let the sculpture interact with other Future Bodies in real-time.
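The scene-selection logic described above could be as simple as mapping the dominant emotion in the retrieved character state to one of the pre-produced scenes, with a crossfade amount taken from that emotion’s intensity. The emotion names, scene names and the dominant-emotion rule are invented for illustration and do not describe the actual Touch 007 setup.

```python
def select_scene(emotional_state, scene_map, default_scene="neutral"):
    """Pick a pre-produced 3D scene from a character's emotional state.

    emotional_state: {emotion: intensity in [0, 1]} retrieved over the net
    scene_map:       {emotion: scene_name} for the available animations
    Returns (scene_name, transition), where transition is the dominant
    emotion's intensity, usable as a crossfade amount between body models.
    """
    if not emotional_state:
        return default_scene, 0.0
    # The strongest emotion wins the scene selection
    emotion, intensity = max(emotional_state.items(), key=lambda kv: kv[1])
    return scene_map.get(emotion, default_scene), intensity

scene, fade = select_scene(
    {"anger": 0.2, "curiosity": 0.7},
    {"anger": "organic_red", "curiosity": "artificial_glass"},
)
```

The sculpture’s button interface could then override `fade` directly, giving the viewer the transition control between the artificial and the organic described above.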
We are quite confident that the Beall Center for Art and Technology at the University of California, Irvine, could serve as an appropriate venue for hosting an event that would be coordinated between several remote locations, including Canada, Germany, and the United States. Currently Robert Nideffer sits on the exhibitions committee for the Beall. Event coordination with other locations throughout the University of California system is possible as well.
Conceptual Design: Jill Scott (AUS/SWI) , Media Artist (see appendix „CV – Jill Scott“, 6 pages)
Technical Concept and Design: Andreas Schiffler (CAN), Scientist and Programmer (see appendix „CV – Andreas Schiffler“, 6 pages)
Technical Concept and Design: Robert Nideffer (USA), Sociologist and Computer Artist (see appendix „CV – Robert Nideffer“, 4 pages)
Website: Carter Hodgkin (USA), Artist and Webdesigner
3D Design: Andrew Quinn (AUS), Computer Animator and Musician
The following people have shown interest in being part of the initial user group of Future Bodies and in getting the project started by testing and providing input through usage:
Susanne Ackers (GER), Andrea Zapp (UK), Natalie Madnan (FRA), Nina Czegledy, Eric Fong (UK)
Users of http://www.digibodies.org/ will be invited to join Future Bodies.
Jill Scott has exhibited many video artworks, conceptual performances and interactive environments in the USA, Australia, Europe and Japan. She has been the Director of Site, Cite, Sight, San Francisco, a lecturer in Media at the University of New South Wales, College of Fine Arts, Sydney, and the director of the Australian Video Festival (1984-88). She has worked with computers leading to 3D animation and interactive art. In 1992 she was invited to be a Guest Professor for Computer Animation at the Hochschule für Kunst, Saarbrücken, Germany, and in 1994 she won a prize at Ars Electronica for Interactive Art. From 1994-97 she was an Artist in Residence and project co-ordinator for the Medienmuseum at the Zentrum für Kunst und Medientechnologie (ZKM) in Karlsruhe, as well as a Research Fellow at The Center for Advanced Inquiry into the Interactive Arts, University of Wales, Great Britain, where she was awarded a Doctorate in Media Philosophy. Currently she is Professor for Real and Virtual Environments at the Media Faculty of the Bauhaus University in Weimar, Germany.
Andreas Schiffler is an astrophysicist (M. Sc.) and self-educated programmer. He worked in the areas of media technology, internet, networking, soft- and hardware development, and media art at the Center for Art and Media Technology (ZKM), Karlsruhe, Germany, for E.U. projects, the Expo2000 in Hannover, Germany, and several major institutions. He has worked as Chief Software Engineer for Tek21 Inc. in a more commercial setting, in charge of design, setup and maintenance of the DeskPlayer product and services. He recently co-founded the AppWares Development Group, a Canadian startup company specializing in the development and marketing of integrated media portal (IMP) applications.
Robert F. Nideffer researches, teaches, and publishes in the areas of virtual environments and behavior, interface theory and design, technology and culture, and contemporary social theory. He holds an MFA in Computer Arts, and a Ph.D. in Sociology, and is an Assistant Professor in Studio Art and Information and Computer Science at UC Irvine, where he also serves as an Associate Director of the Virtual Reality Center, and as an Affiliated Faculty in the Visual Studies Program. Currently he is hard at play initiating an Interdisciplinary Gaming Studies Program (IGaSP).
Andrew Quinn is a computer animator and musician who lives in Milan, Italy. He is an official beta tester for the products of Derivative Inc. and has had a long-standing working relationship with Jill Scott and with Greg Hermanovic from Derivative Inc. His specialities and experience include major animation segments using Softimage, Houdini and Alias|Wavefront, and he has been a computer animator since 1985. Besides working in advertising, he has specialized in special effects and compositing for feature film productions, including „The Matrix“ and others.
Carter Hodgkin is a painter and webdesigner who lives in New York, USA. She has exhibited widely in the USA, Europe and Asia. Her work focuses on molecular technology and the philosophical effects of scientific visualization. She has worked as a web designer for America Online and at present is the senior web designer for Leggo in New York.
Category | Phase | Description | Cost
Wages | 2 & 3 | Design and Production – Jill Scott |
 | 2 & 3 | Programming and Implementation – Andreas Schiffler |
 | 2 & 3 | Programming and PROXY re-configuration – Robert Nideffer |
 | 2 | 3D model design and testing – Andrew Quinn |
 | 2 | Homepage and webdesign – Carter Hodgkin |
 | 3 | 3D scene design and testing – Andrew Quinn |
Software | 2 | IMP development system |
 | 2 & 3 | TouchDesigner ( |
 | 3 | 3x TouchMixer ( |
Hardware | 2 & 3 | Server computer w/o monitor, Linux |
 | 3 | 3x Smart-sculpture computer, Monitor, Interface, Windows |
 | 3 | 3x Smart-sculpture construction and material, Germany |
Services | 2 | IMP programming and services – AppWares Dev. Group |
 | 2 & 3 | Server internet hosting and bandwidth |
 | 3 | Smart-sculpture implementation, transport and presentation |
 | 2 & 3 | Publicity |
Sponsor | Product or Service | Amount
AppWares Devel. Group | IMP development environment (valued at ) |
Bauhaus University, Weimar | Smart-sculpture construction and material |
Univ. of California at Irvine | Server hosting, internet connectivity and bandwidth |
Source | Product or Service | Amount
Grant | (see Table 1 above) |
Sponsor | (see Table 2 above) |
TOTAL | |
Project duration is split into four parts, commencing tentatively June 1, 2002 (pending receipt of funding) and completing January 31, 2003:
project pre-production
phase 2 production
concurrent phase 2 operation and phase 3 production
phase 3 presentation and project conclusion.
Overall duration is set at eight months, with six months of core implementation and operation, one month of pre-production and one month for exhibits and public events.
Time | Phase | Task
01. Jun 2002 | 2 & 3 |
01. Jul 2002 | 2 |
Aug 2002 | 2 |
Sep 2002 | 2 |
01. Oct 2002 | 2 & 3 |
Nov 2002 | 2 & 3 |
Dec 2002 | 2 & 3 |
31. Jan 2003 | 3 |
AppWares Development Group, 4566 Victoria Avenue
Web: http://www.appwares.com, Email: info@appwares.com

Derivative Incorporated, 401 Richmond Street West
Web: http://www.derivativeinc.com, Email: sales@derivativeinc.com

University of California at Irvine, Irvine, CA 92697

Bauhaus University, Media Faculty, Bauhaus Str. 11, 99423 Weimar, Germany
CV – Jill Scott, 6 pages.
CV – Robert Nideffer, 4 pages.
CV – Andreas Schiffler, 6 pages.
Future Bodies Phase 1 Chatterbot Brains and Chat Log, 5 pages.
Future Bodies „Smart Sculpture“ Interface Designs, 4 pages.
Official letter from AppWares Development Group - Sponsorship, 1 page.
Official letter from Bauhaus University Weimar – Sponsorship, 1 page.
Official letter from UCI – Sponsorship, 1 page.
Documentation: A Figurative History, Frontiers of Utopia, Beyond Hierarchy – 2 pages, 1 Betacam SP tape (PAL).
The tape gives samples of three relevant „Virtual Community“ interactive artworks from the history of Jill Scott‘s work:
A Figurative History (1997), In the collection of the Medienmuseum Karlsruhe, ZKM, Germany. Software programming for Travelling version - Andreas Schiffler.
Frontiers Of Utopia (1995), Australian Film Commission funded work. Also in the ZKM collection.
Beyond Hierarchy (2000), Ruhr Vision commissioned work from the City of Dortmund, Germany. Software programming Andreas Schiffler.