[iDC] notes on media remix

Marc Lafia marc at marclafia.net
Wed Apr 19 20:55:46 EDT 2006


Lev, it was very good to see you speak at Pratt and to read the very
interesting discussion that has followed on the list over the last week.

I think what you point out is very evident.

"the appearance of multiple media simultaneously in the same frame. Whether
these media are openly juxtaposed or almost seamlessly blended together is
less important than the fact of this co-presence itself."

Varied media types in a single envelope of time, as one duration. That's how
I read it. I just don't see it as that new - more pervasive, sure. What I do
think has changed considerably is the larger notion of the frame, especially
this idea of the same frame.

I read the moment and media production as one of increasing poly-temporality.
This notion points to a different kind of distribution of events that expands
the very idea of the frame: a multiplicity of events in a simultaneity, an
all-at-once-ness not of a single time but of a plurality of times and a
multiplicity of frames. I see media objects now as event objects - iterative,
always becoming, trans-actionable. Many of these new media objects point to
the becoming event, an event or transactional space that is communicative and
authorial in real time - things such as mapping pictures to 2D maps, adding
friends to one's MySpace account, Flickr, algorithmic works, newsmap,
playback, what it is to take an image and circulate it, on and on. These are
not fixed objects but event spaces.

I think what you point to is the use of non-linear tools to make linear
objects. They are in some sense remnants of the world of prerecorded media -
or at least that is how your descriptions cast them: as fixed, as bound. I
think that is because our playback systems are not yet computational. But
this is certain to change.

As media become instructions rather than fixed objects - objects that are
instructions, computed and networked - the characteristics you point to will
take on something more than this remixability, something that is
poly-temporal and iterative, a frame that is malleable, polyglot. How media
as object events change, at what duration and rhythm, how they become in time
and inhabit it, what space they occupy and how, how we interact with them and
reflect ourselves through them: that will be the thing of note.

No doubt this notion of "remixability" - a notion that many on the list spoke
wonderfully to regarding sound and music and their cultural transport and
mutation, which has forever been happening, as with cuisine, language,
architecture, festival, religion, on and on - has a certain ease and fluidity
in software now, but the idea and practice is age old. Beneath the surface of
this commingling of media types is an understanding and instantiation of a
new sense of time (and of the mesh, the frame, of time), its being,
consumption, and use. New media, computation, composition, and the network
speak of new strategies and understandings of time, communication, being,
presence, and authorship - which I think many of us look forward to hearing
you speak to.
 
Marc Abu Lafia
 


> 
> Greetings to everybody
> 
> My Pratt Manhattan gallery lecture earlier this month was my first public
> presentation of some ideas on media remix that I have been developing
> lately; a long text, Understanding Metamedia, which goes into detail, will
> be posted on my web site next month.
> 
> I am not sure if I was successful in presenting the ideas correctly at this
> time - but for now, I wanted to add to the discussion two text statements
> which summarize what I wanted to convey in the lecture and what I am trying
> to develop in more detail in the forthcoming text.
> 
> I am giving another talk in NYC this coming Saturday, April 22, where I will
> try to approach these ideas again from a somewhat different POV than in the
> previous talk:
> 
> www.mediaconference2006.com
> 
> 
> 
> 
> The first segment is from my article "Abstraction and Complexity" (2003):
> 
> --------
> One result of the shift from separate representational and inscription media
> to the computer metamedium is the proliferation of hybrid images - images
> that combine traces and effects of a variety of media. Think of a typical
> magazine spread, a TV advertisement, or the home page of a commercial web
> site: maybe a figure or the face of a person against a white background,
> some computer elements floating behind or in front, some Photoshop blur,
> funky Illustrator typography, and so on. (Of course, looking at Bauhaus
> graphic design we can already find some hybridity, as well as a similar
> treatment of space combining 2D and 3D elements - yet because a designer had
> to deal with a number of physically distinct media, the boundaries between
> elements in different media were sharply defined.)
> 
> This leads us to another effect - the liberation of the techniques of a
> particular medium from its material and tool specificity. Simulated in
> software, these techniques can now be freely applied to visual, spatial, or
> audio data that has nothing to do with the original medium. In addition to
> populating the tool palettes of various software applications, these
> virtualized techniques came to form a separate type of software - filters.
> You can apply reverb (a property of sound when it propagates in particular
> spaces) to any sound wave; apply a depth-of-field effect to a 3D virtual
> space; apply blur to type; and so on.
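
(To make the filter idea concrete: a minimal Python sketch, assuming the
Pillow imaging library and some available TrueType font - the font path below
is only a placeholder - that renders a line of type and then applies the same
Gaussian blur that elsewhere simulates an out-of-focus lens.)

    from PIL import Image, ImageDraw, ImageFont, ImageFilter

    # Render a line of type onto a blank canvas.
    canvas = Image.new("RGB", (640, 200), "white")
    draw = ImageDraw.Draw(canvas)
    font = ImageFont.truetype("SomeFont.ttf", 96)  # placeholder font file
    draw.text((40, 50), "METAMEDIA", font=font, fill="black")

    # The same blur filter that simulates optics now operates on typography.
    blurred = canvas.filter(ImageFilter.GaussianBlur(radius=6))
    blurred.save("blurred_type.png")
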
> 
> The last example is quite significant in itself: the simulation of media
> properties and interfaces in software has not only made possible the
> development of numerous separate filters but also whole new areas of media
> culture such as motion graphics (animated type which exists on its own or
> combined with abstract elements, video, etc.). By allowing designers to move
> type in 2D and 3D space, and to filter it in arbitrary ways, After Effects
> has affected the Gutenberg universe of text at least as much as, if not more
> than, Photoshop affected photography.
> 
> --------
> 
> 
> The second segment comes from this new long text, Understanding Metamedia,
> which will be available shortly. In this segment the idea of media
> remixability is developed in relation to the visual language of moving
> images. However, just as I tried to do in the lecture, I am working to apply
> the idea of media remixability to other areas of digital media.
> 
> --------
> 
> The use of After Effects is closely identified with a particular type of
> moving images which became commonplace in large part because of this
> software - "motion graphics." Concisely defined by Matt Frantz in his
> master's thesis as "designed non-narrative, non-figurative based visuals
> that change over time," motion graphics today include film and television
> titles, TV graphics, dynamic menus, the graphics for mobile media content,
> and other animated sequences. Typically motion graphics appear as parts of
> longer pieces: commercials, music videos, training videos, narrative and
> documentary films, and interactive projects.
> 
> While motion graphics definitely exemplify the changes that took place
> during the software revolution of the 1990s, these changes are broader.
> Simply put, the result of this revolution is a new hybrid visual language of
> moving images in general. This language is not confined to particular media
> forms. And while today it manifests itself most clearly in non-narrative
> forms, it is also often present in narrative and figurative sequences and
> films.
> 
> For example, a music video may use live action while also employing
> typography and a variety of transitions done with computer graphics
> (example: the video for Go by Common, directed by Convert / MK12 / Kanye
> West, 2005). Or it may embed the singer within an animated painterly space
> (the video for Sheryl Crow's Good Is Good, directed by Psyop, 2005). A short
> film may mix typography, stylized 3D graphics, moving design elements, and
> video (Itsu for Plaid, directed by the Pleix collective, 2002).
> 
> In some cases, the juxtaposition of different media is clearly visible
> (examples: the music video for Don't Panic by Coldplay; the main title for
> The Inside by Imaginary Forces, 2005). In other cases, a sequence may move
> between different media so quickly that the shifts are barely noticeable
> (the GMC Denali "Holes" commercial by Imaginary Forces, 2005). In yet other
> cases, a commercial or a movie title may feature continuous action shot on
> video or film, with the image periodically changing from a more natural to a
> highly stylized look.
> 
> While the particular aesthetic solutions vary from one piece to the next and
> from one designer to another, they all share the same logic: the appearance
> of multiple media simultaneously in the same frame. Whether these media are
> openly juxtaposed or almost seamlessly blended together is less important
> than the fact of this co-presence itself.
> 
> Today such a hybrid visual language is also common to a large proportion of
> short "experimental" (i.e. non-commercial) films being produced for media
> festivals, the web, mobile media devices, and other distribution platforms.
> A large percentage of the visuals created by VJs and Live Cinema artists are
> also hybrid, combining video, layers of 2D imagery, animation, and abstract
> imagery generated in real time. (For examples, consult The VJ Book, VJ: Live
> Cinema Unraveled, or web sites such as www.vjcentral.com and
> www.live-cinema.org.) As for feature narrative films and TV programs, while
> they still rarely mix different graphical styles within the same frame, many
> now feature a highly stylized aesthetic which would previously have been
> identified with illustration rather than filmmaking - for instance, the TV
> series CSI, George Lucas's latest Star Wars films, or Robert Rodriguez's Sin
> City.
> 
> 
> 
> What is the logic of this new hybrid visual language? It is a logic of
> remixability: not only of the content of different media or simply of their
> aesthetics, but of their fundamental techniques, working methods, and
> assumptions. United within the common software environment, cinematography,
> animation, computer animation, special effects, graphic design, and
> typography have come to form a new metamedium. A work produced in this new
> metamedium can use all the techniques which were previously unique to these
> different media, or any subset of them.
> 
> If we use the concept of "remediation" to describe this new situation, we
> will misrepresent this logic - or the logic of media computing in general.
> The computer does not "remediate" particular media. Instead, it simulates
> all media. And what it simulates are not the surface appearances of
> different media but all the techniques used for their production and all the
> methods of viewing and interacting with works in these media.
> 
> Once all types of media met within the same digital environment - and this
> was accomplished by the middle of the 1990s - they started interacting in
> ways that could never have been predicted or even imagined previously. For
> instance, while particular media techniques continue to be used in relation
> to their original media, they can also be applied to other media. (This is
> possible because the techniques are turned into algorithms, all media are
> turned into digital data stored in compatible file formats, and software is
> designed to read and write files produced by other programs.) Here are a few
> examples: motion blur is applied to 3D computer graphics; computer-generated
> fields of particles are blended with live-action footage to give it an
> enhanced look; a virtual camera is made to move around a virtual space
> filled with 2D drawings; flat typography is animated as though it were made
> from a liquid-like material (the liquid simulation coming from the computer
> graphics field); and so on. And while this "cross-over" use by itself
> constitutes a fundamental shift in media history, today a typical short film
> or sequence may combine many such pairings within the same frame. The result
> is a hybrid, intricate, complex, and rich visual language - or rather,
> numerous languages that share the basic logic of remixability.
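
(A rough sketch of that cross-over logic, assuming only NumPy and SciPy: once
a frame is decoded into a common pixel format it is just an array, and a
technique such as motion blur is an algorithm that neither knows nor cares
whether the array came from live footage, scanned drawings, or a 3D render.
The random array below is a placeholder for any such frame.)

    import numpy as np
    from scipy.ndimage import convolve

    # Placeholder for a decoded frame from any source: an H x W x 3 array.
    frame = np.random.rand(180, 320, 3)

    # Horizontal motion-blur kernel: average each pixel with its neighbors
    # along the direction of simulated movement.
    length = 15
    kernel = np.full((1, length), 1.0 / length)

    # Apply the same kernel to each color channel, whatever the frame's origin.
    blurred = np.stack(
        [convolve(frame[:, :, c], kernel, mode="nearest") for c in range(3)],
        axis=-1,
    )
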
> 
> I believe that "media remixability," which begins around the middle of the
> 1990s, constitutes a new fundamental stage in the history of media. It
> manifests itself in different areas of culture and not only in moving
> images - although the latter do offer a particularly striking example of
> this new logic at work. Here software such as After Effects became a Petri
> dish where computer animation, live cinematography, graphic design, 2D
> animation, and typography started to interact, creating new hybrids. And as
> the examples mentioned above demonstrate, the results of this process of
> remixability are new aesthetics and new media species which cannot be
> reduced to the sum of the media that went into them. Put differently, the
> interactions of different media in the same software environment form new
> cultural species.
> 
> 
> 
> 
> 
> _______________________________________________
> iDC -- mailing list of the Institute for Distributed Creativity
> (distributedcreativity.org)
> iDC at bbs.thing.net
> http://mailman.thing.net/cgi-bin/mailman/listinfo/idc
> 
> List Archive:
> http://mailman.thing.net/pipermail/idc/
> 
> 




