Yarrdachi: SpriteGen API

If I don't manage to somehow retrieve the data from the old DB, this will be a little confusing. Suffice it to say Yarrdachi is the code name of a web-based pet sim I am working on. I've chosen my technologies and set about building the pet generation system with Django and Flask. So far I have built the pet model, generator functions in the model for randomizing a new pet, and an API with Flask and NumPy to handle generating the images for the pets.

There was a bug in the way I was pasting the images together that caused my shading layer's transparency to apply to other layers. I fixed that with alpha compositing, and then wrote a ghosting system into the model generator so there is a 5% chance any trait will have some level of transparency. I will most likely change it later, but for now this is about testing and turning a bug into a feature.
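A minimal sketch of how that 5% roll could look in the model generator (`roll_ghosting` and the alpha range are my assumptions; only the 5% chance comes from the post):

```python
import random

def roll_ghosting(ghost_chance=0.05, max_alpha=200):
    """Return a ghosted level for a new trait: 0 most of the time,
    otherwise a random transparency level. Hypothetical helper; the
    40-200 range is a guess, not the actual model code."""
    if random.random() < ghost_chance:
        # Any nonzero value marks the trait as ghosted; the value
        # itself can drive how transparent the layer gets.
        return random.randint(40, max_alpha)
    return 0
```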

The next step is to write a hook into the trait processing on the image composition API that checks whether a trait's ghosted property is greater than 0 and, if so, calls the ghosted generator instead of the normal one. There will also be another step in the image generation that can ghost the whole image: an overlay of the original body base with some level of transparency applied, causing the whole pet to lose some opacity. For that I will use a section of the traits called "special", which will hold the traits that require heavier compute on the API, affect the whole body, or apply some kind of aura.
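That hook could be as simple as routing each trait to one generator or the other (`pick_generator` is an assumed name for illustration, not the actual API code):

```python
def pick_generator(trait, normal_gen, ghosted_gen):
    """Hypothetical dispatch hook: route a trait to the ghosted
    generator when its ghosted property is greater than 0."""
    if trait.get('ghosted', 0) > 0:
        return ghosted_gen
    return normal_gen
```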

For now though, I need to implement the basic ghosting function for traits, then I can do the special generator hook for the compositor.

And before that, lunch.

So instead of writing a new function with different levels of ghosting, I decided to reuse the same function, since it already has access to the alpha channels, and switch from .alpha_composite() to .paste() in cases of ghosting.

That leads to something along these lines:

Ghosty stripe.
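The shared merge function might look roughly like this (`merge_layer` and its signature are assumptions; only the paste-vs-alpha_composite split comes from the post):

```python
from PIL import Image

def merge_layer(base, layer, ghosted=0):
    """Sketch of a shared compositor branch: alpha_composite for
    normal traits, paste-with-alpha-mask for ghosted ones."""
    if ghosted > 0:
        faded = layer.copy()
        # Scale the layer's own alpha down by the ghosted level (0-255).
        alpha = faded.getchannel("A").point(lambda a: a * ghosted // 255)
        faded.putalpha(alpha)
        # paste() uses the faded image as its own mask, so the
        # reduced alpha shows through onto the base.
        base.paste(faded, (0, 0), faded)
    else:
        base.alpha_composite(layer)
    return base
```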

I also modified the normal markings so that the marking's opacity level causes a blend in normal, non-ghosted situations. What used to look like this

Looks like this now.

Of course this varies with the marking's opacity level, which is generated on creation of a new pet. Generated by...

{
    "name": "stripe",
    "color": "#D1031",
    "marking": 1,
    "opacity": 141,
    "showing": 1,
    "ghosted": 0
}
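A sketch of what rolling that trait could look like (`make_marking` and the 80-255 opacity range are my guesses; only the field names come from the JSON above):

```python
import random

def make_marking(name, color):
    """Hypothetical generator for a freshly rolled marking trait,
    matching the shape of the JSON above."""
    return {
        "name": name,
        "color": color,
        "marking": 1,
        "opacity": random.randint(80, 255),  # rolled once, at pet creation
        "showing": 1,
        "ghosted": 0,  # the 5% ghost roll happens elsewhere in the generator
    }
```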

Next step is full-body ghosting. This is a "special" trait. So I need to write the generator for special traits in the model first.

        # SPECIAL TRAITS, FULL BODY
        if len(special_traits) > 0:
            # The only trait we have currently is [spirit]
            for sp_trait in special_traits:
                if sp_trait['trait_type'] == 'body':
                    if sp_trait['name'] == 'spirit':
                        # Select and recolor the body base part, then
                        # paste it over itself at the trait's intensity.
                        spirit_layer = recolor_bodypart(body_path, pet_traits['body_color'], sp_trait['intensity'])
                        body_base.paste(spirit_layer, (0, 0), spirit_layer)

She ain't pretty, but she gets it done. This code runs between the body construction and the generation of the eyes, so the eyes are not given any transparency modification.

That creates this.

Black eyes to show the difference.

This trait is a 1 in 10,000 chance; it is special, after all.

"special": [
    {
        "name": "spirit", 
        "intensity": "100", 
        "hereditary": 0, 
        "trait_type": "body"
    }
]
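That 1-in-10,000 roll could be sketched like this (`roll_special` is an assumed name; the trait dict mirrors the JSON above):

```python
import random

def roll_special(chance=1 / 10000):
    """Hypothetical roll for the special trait list: almost always
    empty, rarely containing the spirit trait."""
    if random.random() < chance:
        return [{
            "name": "spirit",
            "intensity": "100",
            "hereditary": 0,
            "trait_type": "body",
        }]
    return []
```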

With a bit more work I got a better compositing system for the ghost special effect.

At 100
At 200
At 20

I ended up getting a little sidetracked today adding special stuff to the image API, so tomorrow I will add more parts to test the generator and make sure all the parts generate properly and the randomization doesn't explode all over me.

Cheers