Be able to assign texture coordinates from a camera's perspective. TouchDesigner has this function, which is great for separating the real projector's position from the audience's viewing position.
Thanks for the request. We're interested in looking into this further and want to better understand what you're suggesting. Can you please give us some more detail? For example: a video demonstrating the feature, images of what you're describing with notes on them, explanations of different scenarios where this would be useful, or detailed instructions on how this might work in Vuo. Thanks!
Look at the TouchDesigner MUTEK workshop videos.
It's the ability to assign perspective-corrected UVs to a mesh. It applies the inverse of a camera's perspective transform to straighten out a texture.
Say you have a 3D model of the room you are going to project into. In the virtual scene you have an object, say a sphere. You also have a camera placed where the intended audience will be in real life, viewing the sphere. Finally, you have a mesh that imitates the projection surface, for example a 20-foot screen against the wall of the room's model. If you assign the image from the audience camera to this mesh, it will still need stretching out manually unless you project from the audience position itself (impossible). TouchDesigner also lets you render elements earlier in the chain at a higher resolution, so the stretching out can be done from, say, a 4096-pixel image to avoid pixellation.
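To illustrate the idea, here's a minimal sketch of the underlying math: each vertex of the screen mesh is projected through the virtual audience camera, and the resulting normalized device coordinates are remapped to [0,1] to become that vertex's UVs. All function names and parameters here are illustrative, not Vuo or TouchDesigner APIs.

```python
import math

# Hedged sketch of "camera-perspective UVs" (projective texture mapping):
# project each mesh vertex through an audience camera and use the result
# as texture coordinates, so the texture looks undistorted from that view.

def perspective_matrix(fov_y, aspect, near, far):
    """OpenGL-style perspective projection matrix (row-major 4x4)."""
    f = 1.0 / math.tan(fov_y / 2.0)
    return [
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ]

def mat_vec(m, v):
    """Multiply a 4x4 matrix by a 4-component vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def camera_uv(vertex, view, proj):
    """Project a world-space vertex [x, y, z] through the camera and
    map its normalized device coordinates from [-1, 1] to [0, 1] UVs."""
    clip = mat_vec(proj, mat_vec(view, vertex + [1.0]))
    ndc_x = clip[0] / clip[3]  # perspective divide
    ndc_y = clip[1] / clip[3]
    return ((ndc_x + 1.0) / 2.0, (ndc_y + 1.0) / 2.0)
```

For example, with an identity view matrix, a vertex straight ahead of the camera maps to UV (0.5, 0.5), the center of the texture; vertices off to the side land proportionally toward the texture's edges. In a real pipeline this would run per-vertex on the screen-surface mesh, replacing its original UVs.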
Steve will be in touch with you to discuss this further.
@bLackburst, this is an area we are interested in exploring further, but it looks like that will take a significant time investment. We are tabling this for now, but we hope we can revisit sometime after release 1.0.
Now that we’ve released Vuo 1.0, we’re opening this feature request for voting.