[feature request: jit.gl.vertex] see example below: jit.gl.pix vs jit.gen in mesh

    Dec 13 2013 | 2:47 pm
    hello team cycling74, do you intend to create a jit.gl.vertex object?
    In the following example, I ran a little performance test between jit.gl.pix and jit.gen with the same code, and with a 300x300 matrix the CPU (in this example) can't keep up. I know jit.gl.pix is a pixel shader, but it would be great to have a vertex shader to generate meshes on the GPU.
    There are probably more CPU-efficient ways to do the same thing, but jit.gl.vertex would open up new possibilities.

    • Dec 16 2013 | 4:24 pm
      + 1
    • Dec 17 2013 | 8:59 pm
    • Dec 30 2013 | 11:22 am
      are we only three dreaming of a jit.gl.vertex?
    • Dec 30 2013 | 4:14 pm
      thanks for the request. noted.
    • Dec 30 2013 | 4:48 pm
      thank you for your attention, rob, I'm eager to see more.
      jit.gl.vertex would make a lot of things possible.
      and jit.phys on the GPU too ;)
    • Jan 04 2014 | 6:40 am
      +1 Getting pixel and vertex shaders done in gen + export features would open up new worlds
    • Jan 04 2014 | 2:58 pm
    • Jul 04 2014 | 8:53 am
    • Jul 04 2014 | 11:13 am
      +1000 ;-)
      LLT, in your patch, if you're benchmarking the two approaches, you should turn off vsync in jit.window. Nice patch, by the way!
    • Jul 15 2014 | 3:34 pm
      In my sample code,
      jit.gl.pix's CPU usage is much lower than jit.gen's, but jit.gen's fps is higher than jit.gl.pix's.
      CPU usage: jit.gl.pix 60% / jit.gen 140%
      fps (@sync 0): jit.gl.pix 70fps / jit.gen 100fps
    • Feb 07 2015 | 5:36 pm
    • Dec 27 2015 | 1:12 am
      Would love this to happen
      + 1
    • Jun 22 2016 | 5:32 am
      + 1
    • Jun 22 2016 | 6:28 am
      [jit.gl.gen] ?
    • Jun 23 2016 | 2:09 am
      I also appreciate the symmetry of providing a jit.gl-something for the vertex shader, to complement how jit.gl.pix can do some of what a fragment shader can do. But apart from saying +1, I would want to know what a jit.gl.vertex would actually look like. If the goal is to replace GLSL coding with visual patching, then the main features found in vertex shaders need to be covered.
      The jit.gl.vertex program would run per vertex. Uniforms are taken care of by params. The matrix inputs (in 1, in 2, etc.) I presume would map to position, normal, color, and other arbitrary GL vertex attributes coming in. Fine. But how would you get textures into the jit.gl.vertex, distinguishing them from attribute matrices? What are the outputs? There's no mechanism for varyings, and more importantly there's no corresponding mechanism for varyings in jit.gl.pix. In fact, I think we'd really need a separate jit.gl.fragment to make this work.
      And how do we get these into the scene? Does jit.gl.vertex/jit.gl.fragment itself behave a bit like a jit.gl.mesh (being an ob3d object in its own right), or does it work more like a jit.gl.shader that other ob3ds can use?
    • Jun 23 2016 | 2:25 am
      (being an ob3d object in its own right) and replace jit.matrix altogether. basically a jit.gl.matrix that functions the same but runs on the GPU.
    • Jun 23 2016 | 6:11 am
      But what does that mean? The nearest thing to a jit.matrix on the GPU is a texture, and we already have jit.gl.texture/jit.gl.slab/jit.gl.pix et al. The next nearest thing (and perhaps closer to the OP) is a vertex buffer, but OpenGL isn't really designed for arbitrary computation on vertex buffers (it's mostly assumed that happens on the CPU, or in vertex shaders, which as I noted have their own set of weird constraints). Beyond that there's GPGPU, but that still seems messed up enough to be a support nightmare.
      And anyway, jit.matrix isn't an ob3d. Did you mean replace jit.gl.mesh? Are you all just looking for a way to turn texture data into geometry?
    • Jun 23 2016 | 6:29 am
      Maybe that was a bit negative. Probably the most interesting path is OpenCL, which *can* modify GL-based vertex buffers in-place with arbitrary code, and thus would be a suitable target language; and maybe it is less flaky and finicky than it was when I tried it a few years ago. But it's still a bit of a vague request. What would be some examples of what you'd like to do with a hypothetical jit.gl.vertex?
    • Jun 23 2016 | 3:44 pm
      i'm honestly not sure i'm on topic, but i'd like to see something akin to the [history] and [delay] operators of the audio domain, or the ability to reference external textures (matrices?) for time-dependent operations... doing this kind of feedback operation with chained jit.gl.pix is not always easy. keep in mind it's a naive feature request
    • Jun 23 2016 | 9:09 pm
      Basically I'm also naive about the true details of how things work; I don't know GLSL, I just use some of its features through jit.gl.pix.
      Basically I know there are pixel shaders and vertex shaders.
      And basically, when one does particle-type work or depth-map meshes, it is better to do all the calculation on the GPU. Right now we are stuck doing most of this with matrices on the CPU.
      Even with jit.bfg and such, could there be a way to get this all happening on the GPU?
      voilà. Sorry, holes in my knowledge keep me from being more precise.
    • Jun 27 2016 | 10:44 am
      @graham in my case I just need a jit.gl.mesh with texture inputs instead of matrices (with the same data inside). It could be an update of jit.gl.mesh with both texture and matrix inputs.
      I've developed with Mathieu Chamagne some particle tools with jit.gen that could easily be ported to jit.gl.pix (since it can have several outputs now), but I use jit.gl.mesh to display those particles, and it only takes matrices as input. A few months ago, I used a fluid model I created with several jit.gl.pix objects (see here) to control particle motion (with the fluid velocity map). The problem is that I had to copy the velocity map to the CPU to get it to work. It would be better for me to keep everything on the GPU.
    • Jul 01 2016 | 2:48 am
      As far as I know there's no standard OpenGL way to generate geometry from textures directly on the GPU.
      However, I did spend a few days last week looking into OpenCL, and it seems likely that one could generate vertex buffer data (i.e. geometry) from an OpenCL compute program, entirely on the GPU; even potentially from texture-based inputs. I'm going to keep investigating this as & when I have time over the coming weeks.
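      To make the idea concrete, here is a hedged sketch (not Graham's actual code) of the kind of OpenCL C kernel that could rewrite a shared GL vertex buffer in place; the kernel name, arguments, and host-side setup are all assumptions for illustration:

      ```c
      /* Illustrative OpenCL C kernel (all names are assumptions). The host
       * would create the mesh's vertex buffer in GL, share it with
       * clCreateFromGLBuffer, acquire it with clEnqueueAcquireGLObjects,
       * run this kernel, then release it back to GL for drawing -- so the
       * geometry never leaves the GPU. */
      __kernel void displace_verts(__global float4 *verts,        /* shared GL VBO */
                                   __read_only image2d_t heights, /* input texture */
                                   const float amount)
      {
          const sampler_t smp = CLK_NORMALIZED_COORDS_TRUE |
                                CLK_ADDRESS_CLAMP_TO_EDGE |
                                CLK_FILTER_LINEAR;
          size_t i = get_global_id(0);   /* one work-item per vertex */
          float4 v = verts[i];
          /* sample the input texture at the vertex's xz position */
          float h = read_imagef(heights, smp, (float2)(v.x, v.z)).x;
          v.y += h * amount;             /* displace along y */
          verts[i] = v;
      }
      ```

      The point is that, unlike a vertex shader, the kernel reads and writes the vertex buffer itself, so the modified geometry persists between frames.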
    • Jul 01 2016 | 5:47 am
      I've used vertex shaders primarily for displacement maps applied to the vertices of jit.gl.mesh objects, sampling input textures but interpreting them as modifiers for vertex position along arbitrary vectors. The more interesting experiments made use of dynamic normal maps generated with jit.bfg, referencing gl_Normal in the displacement routine.
      I'd vote for jit.gl.vertex to be a replacement for jit.gl.shader, so that they could be applied to any OB3D, not just jit.gl.mesh. I suppose that a jit.gl.fragment object might also be needed, but I'm not sure of how the linkages would work.
      I'd be excited to see vertex buffers exposed in any fashion within Max.
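      For anyone curious, the displacement-map idea described above can be sketched as a legacy-style GLSL vertex shader of the kind used in Jitter .jxs files; the uniform names and the along-the-normal displacement are illustrative assumptions, not Jessie's actual shader:

      ```glsl
      // Sketch of a legacy-style Jitter vertex shader (GLSL 1.2 era).
      // ASSUMPTIONS: uniform names "dispMap" and "amount" are illustrative.
      uniform sampler2D dispMap;  // e.g. a height/normal map from jit.bfg
      uniform float amount;       // displacement scale

      void main()
      {
          // fetch the displacement value at this vertex's texture coordinate
          // (a vertex texture fetch samples the base mip level)
          float d = texture2D(dispMap, gl_MultiTexCoord0.xy).r;
          // push the vertex along its normal by the sampled amount
          vec4 pos = gl_Vertex + vec4(normalize(gl_Normal) * d * amount, 0.0);
          gl_Position = gl_ModelViewProjectionMatrix * pos;
          gl_TexCoord[0] = gl_MultiTexCoord0;
      }
      ```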
    • Jul 02 2016 | 7:24 am
      Hi Jessie,
      Would you mind sharing a basic example of the setup you describe? I've always struggled with this and some insight would be of great assistance :)
      And yes +1 for jit.gl.vertex (for max6 too please :p )
    • Sep 14 2016 | 3:52 pm
      OpenCL+ vertex buffers+ GPU Bullet Physics :)
    • Sep 28 2016 | 6:55 pm
      hey guys, our friend over in the jitter facebook group just posted this tutorial demonstrating vertex texture fetching using jit.gl.shader: https://www.youtube.com/watch?v=CsEVJNbKMms&feature=share
      i've reworked the original patch posted at the top of this thread to show how to use vertex texture fetches with jit.gl.pix to generate geometry directly from a texture without a matrix readback. the trick is to set rectangle 0 on the jit.gl.pix (or jit.gl.slab) input and output textures. this forces the texture dims to be a power of two, so probably best to start with POT dims on the input matrices (although not necessary).
      this also means you sample using texture2D rather than texture2DRect in the vertex program.
      hope this helps those that want to use textures to generate geometry.
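      as a rough illustration of the vertex texture fetch described above (the uniform name is an assumption; see the actual vtf.jxs in the patch for the real thing), the vertex program ends up looking something like:

      ```glsl
      // Sketch of a vertex-texture-fetch program. With rectangle 0 the
      // jit.gl.pix output is a normalized power-of-two texture, so it is
      // sampled with texture2D, not texture2DRect.
      uniform sampler2D posTex;  // jit.gl.pix output holding xyz positions

      void main()
      {
          // look up this vertex's position in the texture using its
          // incoming (normalized) texture coordinate
          vec3 p = texture2D(posTex, gl_MultiTexCoord0.xy).xyz;
          gl_Position = gl_ModelViewProjectionMatrix * vec4(p, 1.0);
          gl_TexCoord[0] = gl_MultiTexCoord0;
      }
      ```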
    • Sep 29 2016 | 1:53 pm
      Rob, thank you. That is incredibly helpful.
    • Sep 29 2016 | 2:42 pm
      endless thx rob
    • Sep 29 2016 | 8:22 pm
      So jit.gl.pix automatically becomes texture number 0! Cool insight, Rob
    • Sep 30 2016 | 3:09 am
      Thanks for sharing this, Rob, it's amazing! So, I managed to add an amount param to control xyz displacement individually, to achieve a Rutt-Etra-like fx more efficiently. Although it works, I feel like I'm doing some things wrong. Can anyone confirm? Attached are a gen version of xyz.displacement, vade's Rutt-Etra patch and my mod of Rob's vtf.jxs shader so you can compare.
      Thx in advance
    • Sep 30 2016 | 4:29 pm
      I just realised that there is a typo in the vtf.jxs I shared yesterday. Not a big deal, it still works, but it should be 1. instead of 100. on line 25.
    • Sep 30 2016 | 4:32 pm
      hey kevin, this looks great to me. what do you think is wrong? how does the performance compare to the matrix version?
    • Sep 30 2016 | 6:25 pm
      Hi Rob, with 1280x720 and my Macbook i5 late 2013:
      - gen version (all cpu): 19fps
      - gl.pix to geometry without matrix readback: 35fps
      - gl.pix to geometry with matrix readback: 17fps
    • Feb 16 2017 | 1:09 am
      hey guys, the vtf feature is now a part of jit.gl.material. check my post here for more info: https://cycling74.com/forums/new-7-3-2-jitter-features/
    • Feb 16 2017 | 5:43 pm
      Hi Rob,
      Thanks for the work :)