The Compositor Rebuild
This will be interesting. I need to rebuild the compositor system for the slime sprites, but I need to do it in Godot. There are at least a couple of ways I can handle it: one relieves the server of some of the work and does most of the heavy lifting on the client side; another flips the workload around.
If I want the server to do the heavy lifting, I can have it run the compositor and send the slime part data over afterward, caching whatever it generates and serving the cached data whenever a client requests that part again.
I could get better highlights and shadows this way with fewer API hits on update requests, but I think it would be, firstly, a pain to implement and, secondly, needlessly complicated. If the client can easily handle a few dozen slimes on screen with animations and shaders, then there's no need for the server to handle all the compositing.
The web preview would have to be rewritten in Godot and then exported for web but that's nothing I haven't done before...
Either way I will build the compositor into the client; then, if I have to, I can take the code out, drop it into a new project with some endpoints, and we have a server-based API.
So, starting out on the client...
The first thing I realize is that smashing the parts down into one PNG each and converting them to grayscale causes some... interesting problems with highlights and shadows, as well as colors.
I can correct for this by raising the value (brightness) of each sprite part to lighten it up.
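The reason lightening helps: multiplicative tinting (which is what Godot's `modulate` effectively does to a grayscale sprite) can only darken, so a mid-gray part tinted with a color comes out muddier than the color itself. A minimal Python sketch of the idea, with illustrative function names and an assumed 0..1 channel range:

```python
# Multiplicative tinting can only darken: tinted = gray * tint per channel.
# So grayscale parts need their values raised before tinting, or every
# color comes out muddy. The 1.3 factor here is just an example.

def tint(gray: float, color: tuple[float, float, float]) -> tuple[float, ...]:
    """Multiply a grayscale value against an RGB tint, channel by channel."""
    return tuple(gray * c for c in color)

def lighten(gray: float, factor: float = 1.3) -> float:
    """Scale a part's value up (clamped to 1.0) so the tint reads properly."""
    return min(gray * factor, 1.0)

mid_gray = 0.5
red = (0.9, 0.2, 0.2)
print(tint(mid_gray, red))            # every channel halved: muddier than the tint
print(tint(lighten(mid_gray), red))   # closer to the intended color
```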
To the Godot Docs, time to RTFD!
Well, looking at the two I would say the old compositor had too bright of a highlight anyway. I think this will be fine. If not I will be able to fix it later.
Anyway, I will need to modulate and composite all the parts and see how they look together.
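For reference while I do this, the modulate-then-composite step boils down to per-channel multiplication followed by a Porter-Duff "over" blend. A sketch in Python, assuming straight (non-premultiplied) alpha and 0..1 channels; the names are illustrative, not Godot API:

```python
# Modulate a source pixel by a tint, then composite it over a destination
# pixel with the standard Porter-Duff "over" operator (straight alpha).

Pixel = tuple[float, float, float, float]  # r, g, b, a in 0..1

def modulate(px: Pixel, tint: Pixel) -> Pixel:
    return tuple(c * t for c, t in zip(px, tint))

def over(src: Pixel, dst: Pixel) -> Pixel:
    """Composite src on top of dst (Porter-Duff 'over')."""
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    out_a = sa + da * (1.0 - sa)
    if out_a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    def blend(s: float, d: float) -> float:
        return (s * sa + d * da * (1.0 - sa)) / out_a
    return (blend(sr, dr), blend(sg, dg), blend(sb, db), out_a)
```

Stacking each modulated part with `over`, back to front, gives the assembled slime.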
Due to some silly buggers in plotting X and Y, the slime decided to have a little dance.
It would also appear that my grid space is calculated differently here in comparison to the original compositor.
Well, I will have to adjust this in the DB, that's all. Strange, though... I think the calculation starts at the slime's center of mass in this case, whereas the original compositor calculated from the top left.
Ah yes;
That would be why. We are, of course, calculating distance from the 0 value on both axes, which is the crosshair (the origin) in this case, not the top left. Yep, it's just going to take some modification in the DB to fix this up.
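The DB fix is just a change of origin. A quick Python sketch of the conversion, assuming the old compositor stored offsets relative to the sprite's top-left while Godot positions sprites from their center by default:

```python
# Convert a top-left-origin offset (old compositor convention) to a
# center-origin offset (Godot's default sprite anchoring). w and h are
# the sprite's dimensions in pixels; names are illustrative.

def topleft_to_center(x: float, y: float, w: float, h: float) -> tuple[float, float]:
    """Shift an offset so (0, 0) means 'sprite center' instead of 'top-left'."""
    return (x - w / 2.0, y - h / 2.0)

print(topleft_to_center(0.0, 0.0, 64.0, 64.0))  # (-32.0, -32.0)
```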
The hand positioning will cause me some trouble down the line when I start animating; I will have to figure out how to make the Godot animator animate relative to a starting position. I'm sure I can; if not, I will do it programmatically.
Now that I have it compositing the slime together I should probably have multiple slimes composited from different data. Luckily I have plenty of slimes to work with as I prepared some earlier.
I only have the one set of parts to work with at the moment, as I haven't written the code to load all the slime parts; that was the code that sparked the compositor rewrite in the first place, after all. So for now I'll just be recoloring the one set, to make sure my recoloring works correctly before proceeding.
Hmm, not sure I like what's going on with the colors. I think I know why they are so muddy... maybe I could fix it more easily in the compositor... but I think I should probably do it in the sprites themselves.
I guess adding some value and saturation to the bodies made their colors a little more... powerful. Maybe a little too much saturation, though?
I don't think my alpha channel is being applied correctly here. I am using the a8 property of the Sprite as it seemed to be exactly what I needed. Perhaps I was wrong... back to the docs.
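One thing worth double-checking while I'm in the docs: per Godot's `Color` reference, alpha is exposed both as a float `a` in 0..1 and as an int `a8` in 0..255. Mixing the two scales (e.g. assigning a 0..1 float where the 0..255 form is expected) is an easy way to get invisible or fully opaque sprites. A trivial conversion sketch:

```python
# Godot's Color exposes alpha on two scales: `a` (float, 0..1) and
# `a8` (int, 0..255). These helpers just convert between them, to show
# how easily a value on the wrong scale rounds to "no alpha".

def a8_from_a(a: float) -> int:
    """Float alpha (0..1) -> 8-bit alpha (0..255)."""
    return round(a * 255.0)

def a_from_a8(a8: int) -> float:
    """8-bit alpha (0..255) -> float alpha (0..1)."""
    return a8 / 255.0

print(a8_from_a(0.25))  # a 0..1 value assigned to a8 directly would round to ~0
```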
Huh. Maybe I am applying it wrong.
Uhhh... nope?
I've lowered the saturation modifier slightly and changed the background clear color to near black for a better idea of what is going on.
I really prefer my original compositor, honestly. Hmm... Maybe I can rewrite the original compositor to save the slime parts themselves in folders to be fetched?
Yeah, I think I like that idea best. I can add some fields to the slime model and have each slime keep track of its own parts, and I keep my compositor, my web tester, and everything else I did. I will still have to assemble the slimes in the client, but the image data won't need modulation or alpha compositing per frame. It will be a lot of API hits for each slime... but significantly fewer overall.
I will also have to write a version system on the Slime Model.
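Roughly what I have in mind for the version system, sketched in Python with hypothetical field names: the client compares the version it cached against the server's, and only re-fetches parts on a mismatch.

```python
# Hypothetical sketch of a parts-version check on the Slime model.
# The server bumps parts_version whenever the compositor regenerates a
# slime's part images; clients re-fetch only when their cache is stale.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Slime:
    slime_id: int
    parts_version: int  # bumped on every regeneration of this slime's parts

def needs_refetch(cached_version: Optional[int], server: Slime) -> bool:
    """True when the client has no cache, or its cached version is stale."""
    return cached_version is None or cached_version != server.parts_version
```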
Anyway, I need to take my dog out and get some lunch. I'll resume afterward with the modifications to the slime model and make my way over to the compositor API.
Alright, the model now supports the generated parts.
I have tested it with the Body, now I just need to do it for all the other parts and I can get back to writing the slime loader.
With that, I have all the body parts being built separately. Now I have to actually generate a bunch of new slimes with it. On the plus side, I found a bug that was slowing down my compositor significantly, so I guess it was all worth it in the end.
Back to the slime loader.
Well done, Generator.
The loader is now working.
The loader caches each slime's specifically generated parts and loads them from disk if they have been encountered before.
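The caching idea boils down to a check-disk-first fetch, sketched here in Python; the paths and the `fetch` callable are illustrative, not the real endpoints:

```python
# Rough sketch of the loader's cache: each slime's generated parts live
# on disk keyed by slime id and version, and the API is only hit on a
# cache miss. Directory layout and names are hypothetical.

from pathlib import Path

def cached_part_path(cache_dir: Path, slime_id: int, part: str, version: int) -> Path:
    return cache_dir / str(slime_id) / f"v{version}" / f"{part}.png"

def load_part(cache_dir: Path, slime_id: int, part: str, version: int, fetch) -> bytes:
    """Return part image bytes from disk if cached, otherwise fetch and cache."""
    path = cached_part_path(cache_dir, slime_id, part, version)
    if path.exists():
        return path.read_bytes()
    data = fetch(slime_id, part)  # e.g. an HTTP GET against the parts API
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_bytes(data)
    return data
```

Keying the path on the version means a regenerated slime naturally lands in a fresh directory instead of serving stale parts.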
Just in time too. It's past 5pm.