Engineering Space 360

Can I learn the node settings for the Korean 720-degree sphere screen?

Yes =)
You can start with node basics in our manual:

If you have a specific question, please ask

I already know the basic node operation steps. I want to learn the 720-degree sphere node setup used in South Korea. Can you give me an example?

Right, this is about the sphere projection we did in 2017 in South Korea with Metaspace:

I can’t share the project or explain how we did it in detail.
It was quite experimental at the time, with Screenberry version 2.7 at the edge of its capabilities.
We did a full sphere screen calibration with the help of 4 cameras with fisheye lenses located on the bridge.

The first stage is to calibrate the 4 cameras to produce “one image”, and then use our calibration algorithm to calibrate the projectors with black-and-white maps.
The node graph was complicated and required a lot of manual work beforehand to align the images from the cameras.

We have since moved on to a new approach: each camera calibrates a part of the screen. The calibrated parts can then be aligned together with a single-camera calibration, if possible, or with manual alignment via 3D Scene remapping, or simply with 2D patches.
We did this with Hive on the Frameless project in 2022.

The old nodes used for that Sphere 360 calibration were removed in version 3.1.
Multi-camera calibration is still at an experimental stage in Screenberry 3.2, and we usually do all the configuration on site or remotely. If you have a project, please contact us with the details and we will help you with the node graph configuration.
We plan to improve multi-camera calibration for the next Screenberry version, 3.3, this spring.

Looking forward to the success of automatic fusion of multiple cameras, so that we can debug easily.


Can you provide some ideas on how to achieve this with Screenberry 3.2?

For example, suppose you want to calibrate a 3-projector panorama with 2 cameras.
The first camera calibrates the 2 left projectors; the second camera calibrates the middle and right projectors, so both cameras calibrate the middle projector. Use the Blend Calibrators node for projectors that you want to calibrate from both cameras.
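The overlap logic above can be sketched as follows. This is a plain-Python illustration of the coverage map, not Screenberry scripting — all camera and projector names are hypothetical:

```python
from collections import Counter

# Hypothetical coverage map for the example above: which projectors
# each camera can see. Names are illustrative only.
coverage = {
    "camera_1": ["proj_left", "proj_middle"],   # the 2 left projectors
    "camera_2": ["proj_middle", "proj_right"],  # middle + right
}

counts = Counter(p for projectors in coverage.values() for p in projectors)

# Any projector seen by more than one camera is calibrated from both
# cameras, which is where the Blend Calibrators node comes in.
needs_blend = sorted(p for p, n in counts.items() if n > 1)
print(needs_blend)  # ['proj_middle']
```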

You start by calibrating both calibrators. Now you need to align the two calibrated areas to each other.
One way is to use a temporary camera that can see the whole screen and calibrate once, using the 2 calibrated areas as projectors:

After calibration has finished, we connect the UV map and blend map from the 3rd calibrator’s patches to the first 2 calibrators as the source UV map and source blend map, and connect one content source (Media Player) to those two calibrators:

Now you can remove the 3rd temporary camera and use only the first 2 fixed cameras for recalibration in the future. The transformation between the first 2 cameras will remain unchanged.
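The wiring described above can be sketched as a plain edge list. This is only a model of the connections, not Screenberry scripting — all node and port names are hypothetical:

```python
# Hypothetical sketch of the node graph after the one-time calibration
# with the temporary 3rd camera. Names are illustrative only.
connections = [
    # The 3rd calibrator's patch outputs feed the two fixed calibrators
    # as their source UV map and source blend map.
    ("calibrator_3.patch_1.uv_map",    "calibrator_1.source_uv_map"),
    ("calibrator_3.patch_1.blend_map", "calibrator_1.source_blend_map"),
    ("calibrator_3.patch_2.uv_map",    "calibrator_2.source_uv_map"),
    ("calibrator_3.patch_2.blend_map", "calibrator_2.source_blend_map"),
    # One shared content source drives both fixed calibrators.
    ("media_player.output", "calibrator_1.input"),
    ("media_player.output", "calibrator_2.input"),
]

# Once the maps are baked in, the temporary camera no longer appears
# anywhere in the graph, so it can be removed.
live_nodes = {port.split(".")[0] for edge in connections for port in edge}
print("camera_3" in live_nodes)  # False
```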

Another way of aligning the 2 calibrators is to use 3D Scene remapping: use UV maps from the 3D Scene node and blending maps from the Soft Edge Blender node:
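For intuition, here is a rough sketch of what a blending map encodes in a projector overlap zone — a simple linear ramp. This is only an illustrative model, not how the Soft Edge Blender node actually computes its maps:

```python
# Rough illustration of a soft-edge blend ramp across a projector
# overlap zone (a simplified linear model, not Screenberry's algorithm).
def blend_weight(x, overlap_start, overlap_end):
    """Weight of the left projector at screen position x:
    1.0 before the overlap, falling linearly to 0.0 across it."""
    if x <= overlap_start:
        return 1.0
    if x >= overlap_end:
        return 0.0
    return (overlap_end - x) / (overlap_end - overlap_start)

# In the middle of the overlap the two projectors contribute equally,
# and the left and right weights always sum to 1.0.
left = blend_weight(0.75, 0.5, 1.0)
right = 1.0 - left
print(left, right)  # 0.5 0.5
```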