How-To add computers to your render cloud

I really hate to front-load a how-to with a lot of background information. I'm sure you're looking for a nice recipe-style, step-by-step experience. Unfortunately, there are a few things I need to lay out so that I'm sure everyone is on the same page.

First and foremost, it is very, very important to understand that LightNet: Cloud is different. I have had many email conversations with people who try to relate LightNet: Cloud to how other render farm controllers work. This is very problematic because LightNet: Cloud works differently. LightNet: Cloud is not a command-and-control system; it does not tell each computer what to do. Instead, each computer operates entirely on its own. Each individual computer in your cloud looks at the server, assesses the information sitting there, makes a decision about what to do, and then updates the information on the server to reflect what it is doing.

There is no master. No centralized controller.
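If a rough sketch helps make that concrete, here is the shape of the loop every node runs on its own. This is purely illustrative Python - the dictionary below is a stand-in for the information that really sits on the FTP server, and none of these names come from LightNet: Cloud:

    import time

    # A toy, local stand-in for the job information that really lives on the FTP server,
    # just so the sketch runs. None of these names are real LightNet: Cloud names.
    server_state = {"jobs": [{"scene": "shot010.lws", "status": "waiting"}]}

    def node_loop(node_name, polls=3):
        """Illustrative only: every node runs its own version of this loop.
        There is no controller handing out work; the node looks at the shared
        state, decides for itself, and records what it is doing."""
        for _ in range(polls):
            # 1. Look at the information sitting on the server.
            waiting = [j for j in server_state["jobs"] if j["status"] == "waiting"]
            if waiting:
                job = waiting[0]
                # 2. Decide what to do and update the server to reflect it.
                job["status"] = f"rendering on {node_name}"
                print(f"{node_name} claimed {job['scene']}")
                # ... rendering would happen here ...
                job["status"] = "done"
            time.sleep(1)   # a real node would poll the FTP server on an interval

    node_loop("Drone02")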

The server it uses is an FTP server used solely (as far as LightNet: Cloud is concerned) for storing data. The FTP server can be a local Windows computer running FileZilla. Or it can be an FTP server on the Internet provided by a commercial hosting service. Or it can be your own Linux server connected to the Internet (directly or through a router). Or it can be some other kind of FTP server. Some NAS units - hard drive devices designed to connect to a network and operate as network storage - have built-in FTP servers, and they can work just fine as well.
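Whichever route you take, it's worth confirming that you can actually reach the FTP server before you point LightNet: Cloud at it. Here's a quick sanity check using Python's standard ftplib (the host name, user and password are placeholders; substitute your own):

    from ftplib import FTP

    # Placeholders - use your own FTP host and credentials.
    HOST = "ftp.example.com"
    USER = "your_username"
    PASSWORD = "your_password"

    with FTP(HOST) as ftp:
        ftp.login(USER, PASSWORD)    # fails loudly if the credentials are wrong
        print(ftp.getwelcome())      # the server's welcome banner
        print(ftp.nlst())            # list the top-level directory to prove read access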

My point is, there are lots of routes you can go here. LightNet: Cloud offers a lot of flexibility! There is no one way nor, necessarily, a right way. This makes a recipe-style How-To difficult only because there are so many options, and I don't want to leave the impression that this is the way to do it and the only way to do it. Try to understand the concepts here and you'll go a lot farther than just following along step by step.

Okay, so with that said and my point made, here is a simple, step-by-step tutorial. I'm going to present you with a scenario and walk you through building your render cloud with LightNet: Cloud.

The scenario:

You have a network of PCs at the office and a network of PCs at home. At the office you have 10 PCs, each named Drone01, Drone02, Drone03 and so on. At home you have 3 PCs named MainPC, GamingPC and OldPC.

You have an FTP server with the Internet address of ftp.example.com.

At the office you do all of your animation on Drone01 and use the C:\ Drive to store your content.

All of the PCs, the ten at work and the three at home, have LightWave3D installed on them with identical plugins and configuration. The rendering PCs will not require a dongle to run ScreamerNet (that's the rendering engine provided by LightWave3D that LightNet: Cloud interfaces with). LightWave3D allows for unlimited render nodes.

Setting up the onsite network:

We're going to be defining the office network as the onsite network. Your home network - and any other networks such as your friend who's helping you with your renders or your cousin Mel who runs a computer repair business - will be offsite.

Share the C:\ Drive of Drone01 on the network (setting up a file share in Windows is beyond the scope of this tutorial; it is very easy to find tutorials on doing that. Google is your friend). On each office PC (including Drone01), map the shared drive to Z:\ and have it reconnect at login. This is important; all of the onsite PCs should see the Z:\ Drive exactly the same.

Now, on the Z:\ Drive, create two folders called “LnC_Watch” and “LnC_Cache”. At this point, if you haven't already, you can set up LightNet: Cloud on Drone01 as explained in the How-To Setup LightNet: Cloud for the first time, using ftp.example.com as the FTP server, Z:\LnC_Watch as the Watch Folder, and Z:\LnC_Cache as the Local Network Cache, and set it to render both onsite and offsite.
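If you'd rather script that step than click through Explorer, a few lines of Python (run on any PC that already has Z:\ mapped) will create both folders and double as a check that the mapping is in place:

    import os

    # Folder names from this tutorial; the script assumes Z:\ is already mapped.
    FOLDERS = [r"Z:\LnC_Watch", r"Z:\LnC_Cache"]

    if not os.path.isdir("Z:\\"):
        raise SystemExit("Z:\\ is not mapped on this PC - fix the drive mapping first.")

    for folder in FOLDERS:
        os.makedirs(folder, exist_ok=True)   # harmless if the folder already exists
        print("ok:", folder)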

After setting up the first PC, you see the beauty of LightNet: Cloud; you're pretty much done no matter how many other PCs you have. All you have to do is copy Drone01's lightnet_cloud_1_options.conf file to Drone02 through Drone10's LightNet: Cloud folder. The only changes you have to make on each are setting the Screamer Name and unchecking Manage The Server.
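If you want to script the copying too, here's a rough sketch. The file name comes from this tutorial, but the install path and the use of the administrative C$ share are assumptions - adjust both to match your setup (and you still have to change the Screamer Name and uncheck Manage The Server on each Drone afterwards):

    import shutil

    # Assumed install folder - change this to wherever LightNet: Cloud actually lives.
    SOURCE = r"C:\LightNet-Cloud\lightnet_cloud_1_options.conf"

    for n in range(2, 11):                        # Drone02 through Drone10
        drone = f"Drone{n:02d}"
        dest = rf"\\{drone}\C$\LightNet-Cloud"    # admin share; needs admin rights
        shutil.copy(SOURCE, dest)
        print("copied to", dest)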

At this point, Drone01 is the only PC set to manage the server. This is what you want; only one PC should be managing the server on any given local network. Now, any of the Drones can copy scene and content files to the Z:\LnC_Watch folder and submit commands, and all the Drones will be able to run Watcher to see the status of the server. But only Drone01 will upload the files, add scenes to the queue and process commands. With that said, you will want the managing Watcher running at all times, and since Drone01 is your main machine that's probably not ideal. So let's say Drone10 is only used for rendering and nothing else; you might want to set it to be the managing Watcher and run Watcher on it all the time. This is a matter of preference; Watcher does not demand a lot of system resources.

The Local Network Cache is a holding place for content archives. You do not have to have it configured, but it will greatly improve performance if you have large offsite content. What happens is the first PC on that local network to start downloading a content archive will place a text file in the Local Network Cache saying when it started the download. When the others check the server and see they need to download the content archive, they will first check the Local Network Cache to see if another PC on the local network has already started the download; if one has, they will check periodically to see if the archive is there yet. When the first PC finishes downloading, it will copy the file to the Local Network Cache for the others to copy.
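To make that concrete, here is a rough Python sketch of the first-downloader-wins behavior. This is not LightNet: Cloud's actual code - the marker file name and the download helper are made up - it just mirrors the logic described above:

    import os
    import shutil
    import time

    CACHE = r"Z:\LnC_Cache"   # the shared Local Network Cache from this tutorial

    def get_content_archive(archive_name, download_from_ftp):
        """Illustrative only - roughly the behavior described above."""
        cached = os.path.join(CACHE, archive_name)
        marker = cached + ".downloading"            # made-up marker file name

        if os.path.exists(cached):                  # someone already finished the download
            return cached
        if os.path.exists(marker):                  # another PC has started it
            while not os.path.exists(cached):       # so wait for the archive to appear
                time.sleep(30)
            return cached

        # Nobody has started yet: claim it, download it, then publish it to the cache.
        with open(marker, "w") as f:
            f.write(time.ctime())
        local_copy = download_from_ftp(archive_name)  # your own download routine
        shutil.copy(local_copy, cached)
        os.remove(marker)
        return cached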

In our scenario here, if the Local Network Cache were not configured, then when an offsite scene became ready to render, all 10 PCs would suddenly try to download the content archive, putting a great deal of pressure on the FTP server.

Finally, in LightWave, when you set the settings in the scene using the LightNet: Cloud lscript, be sure you set the content directory to the Z:\ Drive and not the C:\ Drive. It is vital that all of the Drones see the content directory in exactly the same way, and their C:\ drives will not be the same as Drone01's C:\ Drive. This is why we shared the drive and mapped it to Z:\ in the first place.

It is also possible to use absolute network path names for your onsite content directories. So, if you don't want to map Z:\, then \\Drone01\CDrive will work as well. I personally prefer mapping network drives and find them easier to use.
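If you ever want to convince yourself that the mapped drive and the UNC path really point at the same place, a quick check does it (this assumes the share is named CDrive, as in the example above):

    from pathlib import Path

    # Both listings should be identical if Z:\ is mapped to Drone01's share.
    print(sorted(p.name for p in Path(r"Z:\LnC_Watch").iterdir()))
    print(sorted(p.name for p in Path(r"\\Drone01\CDrive\LnC_Watch").iterdir()))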

Now, what about offsite scenes? Well, you don't have to do anything to make them work. As a matter of fact, if you wanted, you could just have each PC run offsite only and not even deal with mapping drives; all they would need is Internet access to get to the FTP server. Obviously, that's not the most efficient configuration and you wouldn't want to do that. But that does get us to our next part...

Setting up the offsite network:

Next up, you've got to head home and set up LightNet: Cloud on your three home PCs. It's pretty easy at this point; you just have to make a few decisions.

If you want each offsite PC to simply run independently, all you have to do is copy Drone01's lightnet_cloud_1_options.conf to each home PC's LightNet: Cloud directory, change the Screamer Name, change the Watch Folder to a local folder (probably just C:\LightNet-Cloud\Watcher), and leave the Local Network Cache empty.

If you intend only to render with the home PCs, uncheck Manage The Server and never bother running Watcher on them. However, if you're like me and do work at home and need to be able to manage the server from the home network, you can enable that so that your home PC uploads scenes and content files. But remember, it is only the office network that is the onsite network; anything you do at home you will need to zip up and set up as an offsite scene.

In our scenario, however, we have three PCs on the home network, so it would be more efficient to set up a Local Network Cache. You would do this in exactly the same way as on the onsite network: share a drive, map it the same way on all the local PCs, make an LnC_Cache folder, and set all the LightNet: Cloud PCs to use that folder for the Local Network Cache.

If you have colleagues that want to be part of the render cloud, then you would add them in the same way you added your home PCs.

Me, personally?

I have an office network with several Drones set up exactly as I have described above. All of the office PCs have access to the Watch Folder, but only one PC manages the server, and it runs Watcher 24/7. I do run Watcher on my main animation PC, but only to check the status of the render queue; it does not manage the server. All of the other PCs run Cloud 24/7.

I also have several older PCs that I run at home. My main home PC has LightWave installed on it, and I use it to work from home. The home network does have a Local Network Cache, but only my main PC runs Watcher, and its Watch Folder is on my C:\ drive. The other home PCs only run Cloud.

Being able to work with the queue at home is nice. Setting an alarm to go off at 2:00 AM and shuffling into the computer room beats sleeping at the office.