Using iPhone as video input
Does anyone know if it’s possible to use the iPhone’s camera as a video input to jit.qt.grab?
This app lets you use your iPhone camera as a webcam. Although I haven't tried it with Jitter, I can't think of any reason why it wouldn't work.
Thanks, I'll look into this. It has some bad reviews, though. Have you used this app, David?
I just tried it out with Jitter, and the video coming through was one-dimensional, i.e. the same x-value was repeated all the way down the y-axis. Not sure why that is, but it's perhaps not the best solution in this case. I haven't seen anything else that can do this, though.
I found an app that works for this:
Yes, strange. I used PocketCam a year ago and it was fine, and now it isn't? A change in Jitter, a change in PocketCam, I don't know.
Have there been any new apps since a year ago?
Just thought I'd bump this thread, since it's been about 6 years now… This seems like something a lot of people would potentially want to do (or already be doing) with Jitter. Has anyone found an open-source solution in the meantime?
Last time I worked with this, I found AirBeam to be the most reliable/flexible option: http://appologics.com/airbeam
Also, if you have a newer iPhone and computer, you get much better frame rates and lower latency by connecting the iDevice directly to the computer over Wi-Fi instead of running through a router.
Oh yes, all video is transmitted as Syphon streams.
AirBeam looks amazing, thanks for sharing!
Yes, AirBeam is quite cool and easy. It also lets you turn on the iPhone's light while sending video…
If you set up an ad-hoc network, the latency is not bad at all.
This is how I located the giant dead possum under my house: I used an iPhone running AirBeam like a periscope, which made a horrible job a little less horrible.