I use Blender for animation, particularly for fisheye scenegraphs. For a complete movie I use Vuo's warp to produce the full video.
I have a 6m dome with a modified version of Paul Bourke’s warped fisheye offset projection.
Getting the models right, the lighting right, and the camera optimal is always a rendering problem. To actually view the result I have to warp the fisheye. Dalai Felinto did a warped fisheye in the game engine 15 years ago, but it did not work for practical animation rendering.
I thought I could tighten the QC loop by using Vuo to do the warping through Syphon. But the Syphon code is barely there, so getting the image into Vuo to do the warp is problematic.
I think the solution is to write a warp node in Blender and call it a day.
Unfortunately, it seems so. The only alternative would be to run Blender on a PC and create a Spout output from a camera, which should be stable-ish. Then you can use Spout to NDI (https://leadedge.github.io/) and pick up the NDI stream in Vuo over a local network.
Or use a 4K HDMI → USB-C capture dongle: connect it to the HDMI out on the Blender machine, and it appears as a webcam input in Vuo, which can then warp it. I have one somewhere in my box of tricks from another project; I could dig it out and test if you like. You will be limited to a 2K fisheye, but I suspect that's about the upper limit of any spherical-mirror projection anyway.
I need 4K, and want 8K when those projectors become affordable.
I got rid of the mirror and use the upper half of a video-camera fisheye lens: bigger aperture, cheaper, better image quality, more compact, and easier to maintain and calibrate.
I plan to dig into Dalai Felinto's code in the Blender Game Engine to create an image node in the rendering. I just need something for a quick dome look to set scenegraph/render parameters. For production, I render in fisheye and move that to Vuo for further enhancements (e.g., sound, MIDI, and DMX).
I decided my production setup using Vuo and Blender is to do movie-sequence warping with Vuo, and single-image scenegraph tests with Blender. I did not use Dalai Felinto's code to create a warp node in Blender; instead I devised a method to take the warp mesh file used in Vuo and convert it into a 16-bit (float would be better) UV image that can be interpreted by Blender's Map UV compositor node. This avoids using Syphon to send images to Vuo. I am under the impression that Syphon has problems in Blender; additionally, the Syphon integration is not part of the distributed Blender package (i.e., volunteer code that is only occasionally maintained elsewhere).
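For anyone attempting the same conversion, here is a rough sketch of the idea, assuming a Bourke-style warp mesh (a header line, a line with the grid dimensions, then rows of "x y u v intensity") and assuming Map UV reads U from the red channel and V from the green channel. The file layout details and the function name are my own illustration, not anything shipped with Vuo or Blender:

```python
# Sketch: convert a Bourke-style warp mesh into a 16-bit UV image
# that Blender's Map UV compositor node can interpret.
# Assumptions: mesh rows are "x y u v intensity"; U -> red, V -> green.
import numpy as np

def mesh_to_uv_image(mesh_text, out_w, out_h):
    lines = mesh_text.strip().splitlines()
    nx, ny = map(int, lines[1].split())            # control-grid dimensions
    rows = [list(map(float, l.split())) for l in lines[2:2 + nx * ny]]
    grid = np.array(rows).reshape(ny, nx, 5)       # x, y, u, v, intensity
    uv = grid[:, :, 2:4]                           # keep only u and v

    # Bilinearly upsample the sparse control grid to the full output size.
    ys = np.linspace(0, ny - 1, out_h)
    xs = np.linspace(0, nx - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, ny - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, nx - 1)
    fy = (ys - y0)[:, None, None]
    fx = (xs - x0)[None, :, None]
    top = uv[y0][:, x0] * (1 - fx) + uv[y0][:, x1] * fx
    bot = uv[y1][:, x0] * (1 - fx) + uv[y1][:, x1] * fx
    full = top * (1 - fy) + bot * fy               # (out_h, out_w, 2) UVs

    # Scale [0, 1] UVs into the full 16-bit range.
    img = np.zeros((out_h, out_w, 3), dtype=np.uint16)
    img[:, :, 0] = np.clip(full[:, :, 0], 0, 1) * 65535   # U -> red
    img[:, :, 1] = np.clip(full[:, :, 1], 0, 1) * 65535   # V -> green
    return img
```

The resulting array would then be written out as a 16-bit PNG or EXR (EXR would also give the float precision mentioned above) and wired into the Map UV node's UV input alongside the rendered fisheye frame.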