Scissors and Glue Performance

August 7, 2006 | 8:04 pm

That has to be the biggest text patch I’ve seen yet. I think I
understand what you’re trying to do, but a few details are unclear to
me. If you’re trying to rearrange the scanlines of individual frames
on a frame-by-frame basis (that is, not mixing the data temporally),
you can use jit.repos to do this quite easily. If you are trying to
mix and match scanlines of various frames, you can use
xray.jit.timecube @mode 2 to do temporal remapping in conjunction with
a 3D matrix. This is the most efficient way to do such things in
Jitter right now. jit.glue/jit.scissors isn’t meant for such micro
operations and is best used for doing things on the order of 8 or fewer
segments, as the patching can get hairy quickly and CPU usage goes up
fast. You can download xray.jit.timecube from
http://www.mat.ucsb.edu/~whsmith/xray.html .
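
Just to make the mode 2 idea concrete outside of Max, here is a rough
sketch in plain C (this is not the xray.jit.timecube source; the array
layout, sizes and names below are invented for illustration). It pulls
each output scanline from a different point in a ring buffer of past
frames, which is the per-scanline flavor of map-driven temporal
remapping:

#include <stdint.h>
#include <string.h>

/* Frame history: DEPTH single-plane (grayscale) frames of W x H,
   with the newest frame stored at index `head`. */
#define W     320
#define H     240
#define DEPTH 100

/* Build the output frame by pulling each scanline from a different
   point in time, as given by offset_map[y] in the range 0..DEPTH-1
   (0 = newest frame, DEPTH-1 = oldest). */
void remap_scanlines(uint8_t history[DEPTH][H][W], int head,
                     const int offset_map[H], uint8_t out[H][W])
{
    for (int y = 0; y < H; y++) {
        int t = (head - offset_map[y] + DEPTH) % DEPTH; /* wrap into the ring */
        memcpy(out[y], history[t][y], W);               /* copy one scanline */
    }
}

A full map as in @mode 2 would index per pixel rather than per row,
but the ring-buffer lookup works the same way.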

best,
wes


August 7, 2006 | 9:52 pm

You might also want to have a look at jit.plume for scanline offset
based on luminance.

Cheers,
Andrew B.


August 7, 2006 | 10:02 pm

On Aug 7, 2006, at 2:52 PM, Andrew Benson wrote:

> You might also want to have a look at jit.plume for scanline offset
> based on luminance.

There’s also jitter-examples/video/matrix/jit.matrix-drift-scanlines.pat,
FWIW. This could perhaps be managed in JS/Java as well.

-Joshua


August 8, 2006 | 4:43 pm

No, I am not rearranging the location of the scanlines – you are correct in thinking that I just want to fool around with temporality, so it looks like I will give xray.jit.timecube a look-see (since essentially what I’m doing is a time cube visualized per individual scanline).

BTW – Inspiration for this project came last Friday when I saw a talk by Toshio Iwai at the MIT Media Lab. His project Time Stratum (from 1985, for which he received many awards) was shown in a video. Iwai-San mentioned that many of his installations no longer function as they rely on old, unavailable hardware – and it’s too much of a bitch to remake them on new gear. Part of his talk was calling for methods to preserve media art; he got this idea when he made Electroplankton for Nintendo DS, which borrowed many ideas of his from previous installations. He was happy that, in this game form, many of his ideas will be preserved.

What I’m trying to do is recreate Time Stratum so that others can enjoy it (using their computers and webcams) – I initially tried to do this in QuartzComposer as it seemed like a natural fit, but sadly QC does not have an available buffer… next version maybe.

Anyways… I’m trying to imagine how Iwai-San implemented this idea in 1985… Anyone want to posit a guess?

Thanx for all your help!


August 8, 2006 | 6:16 pm

I looked up Time Stratum – how does this relate to the project? The
project sounds intriguing, but I’m having trouble picturing what
you’re trying to accomplish. Why use scanlines?

Time Stratum (there are 3 listed, scroll down):

http://museum.doorsofperception.com/doors1/transcripts/iwai/iwai.html

best
evan

On Aug 8, 2006, at 5:43 PM, Connor Dickie wrote:

>
> No, I am not rearranging the location of the scanlines – you are
> correct in thinking that I just want to fool around with
> temporality, so it looks like I will give xray.jit.timecube a look-
> see (as essentially what I’m doing is a timecube with visualization
> as per individual scanlines).
>
> BTW – Inspiration for this project came last Friday when I saw a
> talk by Toshio Iwai at the MIT Media Lab. His project Time Stratum
> (from 1985, for which he received many awards) was shown in a
> video. Iwai-San mentioned that many of his installations no longer
> function as they rely on old, unavailable hardware – and it’s too
> much of a bitch to remake them on new gear. Part of his talk was
> calling for methods to preserve media art; he got this idea when
> he made Electroplankton for Nintendo DS, which borrowed many ideas
> of his from previous installations. He was happy that, in this
> game form, many of his ideas will be preserved.
>
> What I’m trying to do is recreate Time Stratum so that others can
> enjoy it (using their computers and webcams) – I initially tried to
> do this in QuartzComposer as it seemed like a natural fit, but
> sadly QC does not have an available buffer… next version maybe.
>
> Anyways… I’m trying to imagine how Iwai-San implemented this idea
> in 1985… Anyone want to posit a guess?
>
> Thanx for all your help!


August 8, 2006 | 6:53 pm

My bad – I referenced the wrong project. I’m trying to re-create parts of his project called "Another Time, Another Space" from 1993. Now that we have this corrected, I can see how he achieved the effect in 1993 – it was a little hard for me to understand how this could have been done in the ’80s. I was incorrect, sorry for the confusion.

As an aside – I tried out the xray.jit.timecube object and it seems to be too slow for my purposes. Perhaps I’m using it wrong, or perhaps it’s because I’m using it on Intel OS X (in Rosetta).

I’m still working on this to see how I can make it work… I’ll keep you all posted. Although I’m starting to think that this might be outside of what can be accomplished with Max/MSP with my current hardware.


August 8, 2006 | 7:29 pm

On Aug 8, 2006, at 11:53 AM, Connor Dickie wrote:

> As an aside – I tried out the xray.jit.timecube object and it seems
> to be too slow for my purposes. Perhaps I’m using it wrong, or
> perhaps it’s because I’m using it on Intel OS X (in Rosetta).
>
> I’m still working on this to see how I can make it work… I’ll
> keep you all posted. Although I’m starting to think that this
> might be outside of what can be accomplished with Max/MSP with my
> current hardware.

I would definitely check out the drift scanlines example. The same
concept can be applied to a 3D "video/time cube": insert 2D frames
into the 3D cube, then pull scanlines out of it with standard
jit.matrix srcdim/dstdim messages. For per-pixel offsetting this
technique will become too slow, which is why I believe Wesley ended
up writing his object.

Lastly, for maximum performance, writing such an object in C isn’t
that tough if you feel comfortable programming in C-syntax languages.
The biggest performance penalty will be due to memory access if the
cube is large (make sure you have *lots* of RAM for this sort of
thing; you might want to use uyvy or grgb data types to cut the RAM
usage in half). Perhaps in the future we’ll extend jit.repos to work
for >2 dimensions.
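
As a back-of-the-envelope check on the memory point, here is a plain C
sketch (not Jitter SDK code; the 720x480x100 cube is just an example
size):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* 100 frames of 720x480, 4-plane char (ARGB): about 138 MB. */
    size_t argb = (size_t)720 * 480 * 100 * 4;
    /* The same history stored as uyvy (2 bytes per pixel): about 69 MB. */
    size_t uyvy = (size_t)720 * 480 * 100 * 2;
    printf("ARGB cube: %zu bytes\nuyvy cube: %zu bytes\n", argb, uyvy);

    /* Allocating the whole history up front means the per-frame work is
       one copy into the ring; the real cost is memory bandwidth. */
    unsigned char *cube = malloc(argb);
    if (cube)
        free(cube);
    return 0;
}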

-Joshua


August 8, 2006 | 7:33 pm

Well, first off, you’re using Rosetta and have a definite performance
hit. I would suggest downloading the new UB versions of Jitter and
Max and emailing me off-list for a UB version of xray.jit.timecube, as I
haven’t had time to put together a UB download of the xray objects
yet. Also, I haven’t been able to find any visual media showing the
installation, but I did find this snippet:

This sense of play, curiosity and inventiveness is reminiscent too of the
fairground attraction, and in many ways his approach mirrored that of
Toshio Iwai, whose entire oeuvre including his public art is based on play.
In "Another Time, Another Space", created in Antwerp central station in
1993, Toshio Iwai made an electronic hall of mirrors using a tree structure
of video screens. The installation featured 15 video cameras, 30 computers,
30 video monitors, and a videodisk recorder. The comings and goings of
people through the station were filmed by the cameras, and manipulated in
real-time by the computer to deform shape, time reference, and showing a
different time-space environment in each movement. Video processing
software reflected back crowds like fields of wheat where algorithms
interpreted successive layers of crowd as wave-like motions. Sober-suited
business men leapt and cavorted in front of these magic mirrors.
"I used the "Another Time, Another Space" system to create an
experimental event as part of an NHK television program. People passing
in front of Shinjuku Station were photographed by a video camera and the
images were altered and projected onto the giant Alta Vision screen across
the street. It caused a much larger commotion than we expected. The
moment the image appeared on the screen, hundreds of people started
gathering in front of the station and waving their hands and moving their
bodies as they watched their images on the screen. In that moment the big
screen that everyone had been taking for granted suddenly became a giant
interactive event."

I still have no idea what you’re trying to do; maybe you could be more
specific than "recreate Another Time, Another Space". FWIW, I had
xray.jit.timecube running with a 720x480x100 buffer on a PC at 30 FPS
with room for more speed, and it runs quite fast at 320x240x100 on my
1.67 GHz PPC PowerBook. You could go to 640x480x100 if you did it with
uyvy colorspace.

Also, check out this:
http://www.vasulka.org/Steina/Steina_BentScans/BentScans.html . Is
this along the lines of what you’re trying to do?

best,
wes

On 8/8/06, Connor Dickie wrote:
>
> My bad – I referenced the wrong project. I’m trying to re-create parts of his project called "Another Time, Another Space" from 1993. Now that we have this corrected, I can see how he achieved the effect in 1993 – it was a little hard for me to understand how this could have been done in the ’80s. I was incorrect, sorry for the confusion.
>
> As an aside – I tried out the xray.jit.timecube object and it seems to be too slow for my purposes. Perhaps I’m using it wrong, or perhaps it’s because I’m using it on Intel OS X (in Rosetta).
>
> I’m still working on this to see how I can make it work… I’ll keep you all posted. Although I’m starting to think that this might be outside of what can be accomplished with Max/MSP with my current hardware.
>


August 8, 2006 | 7:39 pm

> http://www.vasulka.org/Steina/Steina_BentScans/BentScans.html . Is
> this along the lines of what you’re trying to do?

This is exactly what I am trying to do ;-)

Thanx for the link.

Do you have any more information on this?


August 8, 2006 | 7:45 pm

Check out jitter-examples/video/matrix/jit.matrix-scanlines.pat


August 8, 2006 | 7:49 pm

sorry, should read:
Check out jitter-examples/video/matrix/jit.matrix-slitscan.pat

Andrew B.


August 8, 2006 | 7:53 pm

On Aug 8, 2006, at 12:39 PM, Connor Dickie wrote:

>
> Do you have any more information on this?

This doesn’t keep a volume of history, but is rather a slit scanning
technique, and you can check out the jit.matrix-slitscan.pat (turn
down the metro in the patch) for one of several ways you can do slit
scanning in Jitter. For more information on slit scanning, I think
Golan Levin put together a catalogue of slit-scan works
(rustle…rustle…here it is:
http://www.flong.com/writings/lists/list_slit_scan.html).
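
The core of the slit-scan approach is tiny; here is a rough plain-C
sketch (grayscale, invented names and sizes, not the actual patch):

#include <stdint.h>

#define W 320
#define H 240

/* Classic vertical slit scan: each time a new frame arrives, copy its
   center column into the next column of the output image. No frame
   history is kept, only the growing output, which is why this stays
   cheap compared to a full time cube. */
void slitscan_step(const uint8_t frame[H][W], uint8_t out[H][W], int *col)
{
    for (int y = 0; y < H; y++)
        out[y][*col] = frame[y][W / 2];   /* one pixel per row from the slit */
    *col = (*col + 1) % W;                /* advance (and wrap) the write column */
}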

If you do need to keep a volume of history for taking dynamic slices,
you’ll want to check out Wes’s object or the other suggested
solutions. One of several different pieces that does this with a
volume is the Khronos Projector, which uses a haptic interface to push
through time. I’m pretty sure they used custom software with 3D
textures. We don’t expose the level of access to 3D textures in
Jitter to let you do this without programming your own object in C,
but perhaps in the future.

http://www.k2.t.u-tokyo.ac.jp/members/alvaro/Khronos/

-Joshua


August 9, 2006 | 3:28 am

Thanx everyone for their input and help. Greatly appreciated. I’ve learned a lot.

I think that I’ll probably give this idea a shot using some conventional language. I guess some of you guys suggested C.


August 9, 2006 | 9:56 am

You should also check out Camille Utterback’s work, very similar,
especially "Liquid Time":

http://www.camilleutterback.com/liquidtime.html

best,
evan

On Aug 9, 2006, at 4:28 AM, Connor Dickie wrote:

>
> Thanx everyone for their input and help. Greatly appreciated.
> I’ve learned a lot.
>
> I think that I’ll probably give this idea a shot using some
> conventional language. I guess some of you guys suggested C.


August 9, 2006 | 12:27 pm

FWIW, xray.jit.timecube does exactly what the Vasulka video shows
and a bit more. In mode 0 it will do a vertical slit-scan, in mode 1
it will do a horizontal slit-scan, and in mode 2 it will take a map as
input and give the appropriate pixels. Also, I don’t think you’re
going to be able to achieve any significant speedup over
xray.jit.timecube (at least in CPU processing). It’s implemented
using a circular buffer and a sync signal, so it’s quite efficient.
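
In plain C terms (again, a sketch under invented names and sizes, not
the object’s actual code), the circular buffer just means each new
frame overwrites the oldest slot and an index advances, so keeping a
deep history costs one copy per frame no matter how deep it is:

#include <stdint.h>
#include <string.h>

#define W     320
#define H     240
#define DEPTH 100

/* Overwrite the oldest slot and advance `head`; readers (slit or map
   lookups) then index backwards from `head` into the history. */
void push_frame(uint8_t history[DEPTH][H][W], int *head,
                const uint8_t frame[H][W])
{
    *head = (*head + 1) % DEPTH;
    memcpy(history[*head], frame, (size_t)W * H);
}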

best,
wes

On 8/9/06, evan.raskob [lists] wrote:
> You should also check out Camille Utterback’s work, very similar,
> especially "Liquid Time":
>
> http://www.camilleutterback.com/liquidtime.html
>
> best,
> evan
>
> On Aug 9, 2006, at 4:28 AM, Connor Dickie wrote:
>
> >
> > Thanx everyone for their input and help. Greatly appreciated.
> > I’ve learned a lot.
> >
> > I think that I’ll probably give this idea a shot using some
> > conventional language. I guess some of you guys suggested C.
>
>


August 9, 2006 | 7:20 pm

Well, that’s good to know. I’ll keep trying with the xray objects then – I’ll have to ask you offline if I can have access to your Intel build of those objects so I can get the performance I need.



August 10, 2006 | 8:42 pm

You might also check out the work of Marlon Barrios-Solano, a
dancer/visualist who takes the slitscan aesthetic in interesting
directions.

http://unstablelandscape.blogspot.com/

best,
dan

p.s. (woohoo! first jitter post after months of no-time-for-maxing! :D)

On 8/9/06, Connor Dickie wrote:
>
> Well, that’s good to know. I’ll keep trying with the xray objects then – I’ll have to ask you offline if I can have access to your Intel build of those objects so I can get the performance I need.
>


***
http://danwinckler.com
http://share.dj

http://idmi.poly.edu

