tracking the activity on the screen pixel by pixel

mmeza's icon

Dear all,

I would like to know which Jitter object could be used to track the activity of the computer screen, or whether any of you has a method for acquiring data from on-screen activity (drawing, for example). I mean any activity, inside or outside the Max context. For example, if I use any drawing software, I want to acquire the speed or duration of the stroke, the x, y coordinates... Any clue or help will be very appreciated.
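For the speed and duration part, the computation is simple once you have timestamped (x, y) samples for a stroke, however they are captured. A minimal sketch (the (t, x, y) sample format is my assumption, not tied to any particular capture method):

```python
import math

def stroke_stats(samples):
    """Compute duration, path length, and average speed of one stroke.

    samples: list of (t, x, y) tuples, assumed ordered by time t (seconds),
    with x, y in pixels. Returns (duration_s, length_px, speed_px_per_s).
    """
    duration = samples[-1][0] - samples[0][0]
    # Path length: sum of Euclidean distances between consecutive samples
    length = sum(
        math.hypot(x1 - x0, y1 - y0)
        for (_, x0, y0), (_, x1, y1) in zip(samples, samples[1:])
    )
    speed = length / duration if duration > 0 else 0.0
    return duration, length, speed

# Example: a straight 3-4-5 stroke drawn over one second
print(stroke_stats([(0.0, 0, 0), (1.0, 3, 4)]))  # (1.0, 5.0, 5.0)
```

The same arithmetic could of course be done in a Max patch with [timer] and [expr] objects once the coordinates are coming in.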

the best

B.K.

TFL's icon

To capture the screen, there is jit.desktop. The external 11globalForegroundWindow might be useful for knowing which app is in the foreground, so you can adapt what to do. I'm not sure you can capture mouse activity in Max if it is not the active window, though.
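If global mouse capture inside Max turns out to be a dead end, one workaround (my suggestion, not something established in this thread) is to read the coordinates in a small helper script outside Max and forward them over UDP. Max's [udpreceive] decodes OSC-formatted packets, so the helper needs to encode its messages as OSC; a minimal hand-rolled encoder looks like this sketch (the /mouse address and port 7400 are arbitrary choices):

```python
import socket
import struct

def osc_message(address, *ints):
    """Encode a minimal OSC message with int32 arguments.

    OSC strings are null-terminated and padded to a multiple of 4 bytes;
    integer arguments are big-endian int32.
    """
    def pad(b):
        b += b"\x00"                        # at least one terminating null
        return b + b"\x00" * (-len(b) % 4)  # pad to a 4-byte boundary
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "i" * len(ints)).encode("ascii"))  # type tag string
    for v in ints:
        msg += struct.pack(">i", v)
    return msg

def send_point(sock, host, port, x, y):
    """Send one mouse sample as an OSC /mouse message."""
    sock.sendto(osc_message("/mouse", x, y), (host, port))

# Example: send a point to a Max patch containing [udpreceive 7400]
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_point(sock, "127.0.0.1", 7400, 100, 200)
sock.close()
```

The actual mouse reading would come from a platform library (pynput on Python, for instance); that part is left out here since it depends on the OS.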

But one could also make their own drawing software in Max!

mmeza's icon

Many Thanks TFL,

I've done some testing with jit.desktop, thanks to your suggestion. So far it seems to be useful. Of course it is possible to draw with Max, but this project will need more than one platform, since it is intended to be an online live performance in which some of the performers interact at a distance, and some of them do not have Max at hand.

TFL's icon

Mmh, instead of jit.desktop you might get better results by capturing your screen with external software and sending it to Max through Syphon (Mac) or Spout (Windows).

Are the other performers' screens captured and sent to you? If so, how?

mmeza's icon

Many thanks for your interest and your help, I appreciate it very much. The first trial of the project was done with the Zoom drawing tools, drawing over a video image shared by the host (me). The "drawing team" was composed of roughly 12 people, who took turns drawing on the Zoom shared screen, in a cadavre exquis kind of collective creative process. Once the screen became saturated, it was erased and the process restarted anew.

For the upcoming version of the performance I would like to record the data of the activity on the screen (my screen), that is, the drawing as data, as it is created. We used Zoom because everyone could have it and use it. I could use the video recording of the drawing process to extract the data afterwards, but that closes off the possibility of using the data in real time during the performance.
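Since the drawing only exists as shared video, one fallback (my suggestion, not something discussed above) is to recover an approximate pen position by differencing successive frames: the centroid of the pixels that changed between two frames roughly marks where drawing just happened. A toy sketch on grayscale frames represented as 2D lists; in practice this would run on jit.matrix data or numpy arrays:

```python
def changed_centroid(prev, curr, thresh=30):
    """Return the (x, y) centroid of pixels that changed between two
    grayscale frames (2D lists of 0-255 ints), or None if nothing changed
    by more than `thresh`.
    """
    xs, ys = [], []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) > thresh:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Example: one pixel "drawn" at column 2, row 1 of a blank 4x4 frame
prev = [[0] * 4 for _ in range(4)]
curr = [row[:] for row in prev]
curr[1][2] = 255
print(changed_centroid(prev, curr))  # (2.0, 1.0)
```

Inside Max, the same idea can be patched with jit.op @op absdiff on consecutive frames followed by jit.findbounds or jit.3m, which would make it work on the Zoom capture in real time.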