New User Questions

Hello, I am a completely new user. I have combed through the user guide, but I am still having some difficulty understanding how to source media using the Media Player or ImageFile nodes. With the ImageFile approach I keep getting this error:

[16:12:49] [Error] ImageFile: (41) image not loaded: FileSystem input/output error:
File load failed: ccat_effect.mov

I tried moving the file to a Media folder within the Screenberry project folder, but I got the same error.

I was also a bit confused because in the sample dome project provided by Screenberry there is no ImageFile node sourcing the content. Instead, that project uses .rva folders containing converted versions of the original media.

Is there a significant difference between using .rva and using the ImageFile node to source the media content? How do I avoid the filesystem error when sourcing content like this?

Any information on this would be super helpful.

Thank you,

-Azmat

Hi. The ImageFile node is only used to load still images (like jpg or png) for use in the node graph. To play a video you need to upload it first. You can drag and drop it into the Media Library widget (open it from the Window menu, select the folder you want to use, and drag the file from Explorer into it). An upload dialog will appear where you choose a quality preset, and the video will be converted to RVA. RVA is our internal container for any kind of media (video, audio, images).
Then add a Media Player node, configure its output image size, and connect it to the Canvas, either directly or through patches/displays for warping.
Double-click the Media Player node to open its widget, select the first playlist, and drag the uploaded item from the Media Library into the playlist. You can then play it with the play selected button or with Alt + Click.
To output the audio, add an Audio Out node and connect the Media Player's audio parameter to it.

That's the way to play a video. There will be video tutorials for beginners on our YouTube channel later.

Have a good day

Thank you so much for the reply. That really clarified some of the confusion; I am now able to output the media content.

I was also wondering, does Screenberry allow exporting files to a 1:1 fisheye format?

For example, I understand that we can do effects like this:

[Fish Eye Effect](https://help.screenberry.com/wp-content/uploads/2020/11/LatLongToDome%20-%20Controls2.gif)

Is there any way to export a rendered version of that effect?

Thanks,

-Azmat

Wow, that's quite an old version of Screenberry there. You can go and get the 3.0 demo from screenberry.com.

While Screenberry has powerful warping nodes that can be used to warp video content, especially for fulldome, there is no export feature; it is designed to be used as a player.
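For context on what such a warp actually does, the LatLongToDome-style effect in the GIF above is essentially a pixel remap from an equirectangular (lat-long) frame to a 1:1 fisheye (domemaster) image. Here is a minimal, hypothetical numpy/OpenCV sketch of that mapping, assuming both libraries are available; it is only an illustration of the geometry, not Screenberry's implementation:

```python
import numpy as np
import cv2  # assumption: OpenCV and numpy are available; this is not Screenberry code

def latlong_to_fisheye(src, out_size=1024):
    """Illustrative remap from an equirectangular (lat-long) frame to a
    1:1 equidistant fisheye (domemaster) image covering the dome hemisphere."""
    h_src, w_src = src.shape[:2]

    # Normalized fisheye output coordinates in [-1, 1]
    row, col = np.mgrid[0:out_size, 0:out_size].astype(np.float32)
    x = (col / (out_size - 1)) * 2 - 1
    y = (row / (out_size - 1)) * 2 - 1
    r = np.sqrt(x * x + y * y)

    # Equidistant fisheye: radius is proportional to the angle from the zenith
    theta = r * (np.pi / 2)      # 0 at the dome top, pi/2 at the springline
    phi = np.arctan2(y, x)       # azimuth around the dome

    # Equirectangular source: longitude maps to x, latitude to y
    lat = (np.pi / 2) - theta
    lon = phi
    map_x = ((lon + np.pi) / (2 * np.pi)) * (w_src - 1)
    map_y = ((np.pi / 2 - lat) / np.pi) * (h_src - 1)

    # Pixels outside the unit circle are not part of the dome image
    map_x = np.where(r > 1, -1, map_x).astype(np.float32)
    map_y = np.where(r > 1, -1, map_y).astype(np.float32)
    return cv2.remap(src, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT, borderValue=(0, 0, 0))
```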

We are planning to add an export feature sometime in the future, though.

There is a workaround to export a sequence using the Script and Image Saver nodes.
Here is a screencast of me doing this (note the "play every frame" setting in Media Player).
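As a rough analogue of that workaround outside Screenberry (step through every frame, warp it, save it as a numbered still), a minimal Python/OpenCV sketch might look like the following; the file names are placeholders, and this is not the Script / Image Saver node setup from the screencast:

```python
import cv2  # assumption: OpenCV is available; this is not the Script/Image Saver setup itself

def export_frames(video_path, out_pattern="frame_%05d.png", warp=lambda f: f):
    """Generic frame-by-frame export: read every frame of a clip, apply a warp
    function (for example a lat-long to fisheye remap) and write it as a
    numbered still. Paths and file names here are placeholders."""
    cap = cv2.VideoCapture(video_path)
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break                              # end of the clip
        cv2.imwrite(out_pattern % index, warp(frame))
        index += 1
    cap.release()
    print("wrote %d frames" % index)
```

A sequence written this way can then be assembled back into a movie with an encoder of your choice.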

Thanks for the video, I will give it a try.

Also, when you say that it has powerful warping tools, does that mean it can simulate a fisheye effect for a dome projection without needing a 1:1 fisheye aspect ratio video as the source? Can it warp through real-time rendering as needed?

Thank you again very much for the information.

There is a way to warp a standard 16:9 video (or similar) to fulldome as a virtual flat or curved plane, so it can be viewed on a calibrated dome screen. It is warped at the front of the dome, and its size, position, and tilt angle can be adjusted. The node is called Planar to Dome.
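To give a rough intuition for what a Planar to Dome-style warp does, here is a hypothetical numpy/OpenCV sketch; it is not Screenberry code, and parameter names such as `distance` and `elevation_deg` are purely illustrative. It treats the flat 16:9 frame as a virtual screen floating in front of the dome viewer and re-projects it into a 1:1 fisheye image, which is what "warped at the front of the dome" means in practice:

```python
import numpy as np
import cv2  # assumption: OpenCV and numpy are available; this is not Screenberry code

def planar_to_dome(src, out_size=1024, distance=1.0, width=1.6, height=0.9,
                   elevation_deg=30.0):
    """Re-project a flat 16:9 image, placed as a virtual screen in front of
    the viewer, into a 1:1 equidistant fisheye (domemaster) frame."""
    h_src, w_src = src.shape[:2]

    # Fisheye output pixel grid, normalized to [-1, 1]
    v, u = np.mgrid[0:out_size, 0:out_size].astype(np.float32)
    u = (u / (out_size - 1)) * 2 - 1
    v = (v / (out_size - 1)) * 2 - 1
    r = np.sqrt(u * u + v * v)
    phi = np.arctan2(v, u)            # azimuth around the dome
    theta = r * (np.pi / 2)           # equidistant 180-degree fisheye

    # Viewing direction for each fisheye pixel (dome zenith = +z, front = +y)
    dx = np.sin(theta) * np.cos(phi)
    dy = np.sin(theta) * np.sin(phi)
    dz = np.cos(theta)

    # Virtual screen: a plane facing the viewer, raised by `elevation_deg`
    elev = np.radians(elevation_deg)
    center = np.array([0.0, np.cos(elev), np.sin(elev)]) * distance
    normal = center / np.linalg.norm(center)
    right = np.array([1.0, 0.0, 0.0])
    up = np.cross(right, normal)

    # Ray/plane intersection from the dome centre along each direction
    denom = dx * normal[0] + dy * normal[1] + dz * normal[2]
    denom = np.where(np.abs(denom) < 1e-6, np.nan, denom)
    t = np.dot(center, normal) / denom
    px, py, pz = dx * t - center[0], dy * t - center[1], dz * t - center[2]

    # Local plane coordinates -> source texture coordinates
    x_loc = px * right[0] + py * right[1] + pz * right[2]
    y_loc = px * up[0] + py * up[1] + pz * up[2]
    map_x = (x_loc / width + 0.5) * (w_src - 1)
    map_y = (0.5 - y_loc / height) * (h_src - 1)

    # Everything off the plane, behind the viewer, or outside the dome is black
    invalid = (t <= 0) | (r > 1) | np.isnan(t)
    map_x = np.where(invalid, -1, map_x).astype(np.float32)
    map_y = np.where(invalid, -1, map_y).astype(np.float32)
    return cv2.remap(src, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT, borderValue=(0, 0, 0))
```

The real node presumably exposes similar controls (size, position, tilt) and does the warp on the GPU in real time; the sketch is only meant to show the geometry.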

For Planar to Dome (or other dome setups), is it also possible to stream media for a project from a live feed or through TouchDesigner using TouchEngine? Or does the media have to be saved as a local file that is added to the Media Player library?

Sure, it can be streamed over Spout or NDI, or via a capture card (Vision, DeckLink, Deltacast, etc.). It can also be played from a TouchDesigner TOX via the TouchEngine node, or it can be a Notch Block source. Any media source can be warped with any patch.

Oh that's great! Is there a user guide on Spout integration? (Lemme know if I am asking too many questions haha)

We don't have a detailed manual on how to use it, but it's quite simple: you stream something from another application using Spout and choose that stream by name in the Spout In node in Screenberry.
Hope it helps.

Awesome, thanks!