2. Emotion deconstruction: building the reality layers in many ways, from local to global processing

 

I rendered this game action scene in Blender at different frame rates to perceive, at different speeds, what is happening in such a space. The images in the frames on the wall were animated at 60 fps (I used GIMP, Pencil2D, Blender, Audacity, and Pitivi to create the images, sound, and animation). The animation in Blender was rendered at 24 fps. Each second took me more than 4 hours to render (and caused a lot of computer crashes), given what my system can handle. My initial idea was to create a 3-minute animation, but rendering it would take me more than a month, or cost me a $5000 desktop with a cutting-edge graphics card 😅.
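The "more than a month" estimate follows directly from the figures above. A minimal back-of-the-envelope sketch (the constants below just restate the numbers from the paragraph, they are not measured values):

```python
# Rough render-time estimate from the figures above.
HOURS_PER_SECOND = 4        # observed: >4 hours to render 1 s of animation
TARGET_SECONDS = 3 * 60     # the planned 3-minute animation

total_hours = TARGET_SECONDS * HOURS_PER_SECOND
total_days = total_hours / 24
print(f"{total_hours} hours ≈ {total_days:.0f} days of rendering")
# → 720 hours ≈ 30 days
```

Since 4 hours per second is a lower bound, the real figure lands at "more than one month" of continuous rendering.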







These are some rendered images taken from this game scene and its objects, used to create the game's dynamics, aesthetics, and mechanics. The idea is that gamers experience high engagement while playing in this Virtual Reality space, figuring out how to overcome the game challenge (i.e., a puzzle) in order to open the door to the next room. People with autism can achieve this goal by focusing on their preferred local processing (i.e., a private space where they focus on object details, in particular by following repetition and fractal patterns). They then make a smooth transition from local to global processing (i.e., a public/social space based on emotion recognition through colour, sound, and the facial expressions associated with five basic emotions). Gamers must find objects and place them into particular fractal cubes by following patterns of colour, size, and shape: 5 warlords (happiness, green), 4 monkey heads (surprise, yellow), 3 dice (sadness, blue), 2 Iron Men (anger, red), and 1 PS4 (fear, orange). Each object in each of the five groups triggers a 10-second animation on the frames+video to show the facial action units activated when people express each of these five basic emotions.
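The object-to-emotion mapping above can be sketched as a small lookup table. This is only an illustrative data structure, not the game's actual implementation; the key names and the `on_object_placed` helper are assumptions made for the example:

```python
# Illustrative mapping of puzzle objects to emotions, counts, and colours,
# restating the five groups described in the text.
PUZZLE_OBJECTS = {
    "warlord":     {"count": 5, "emotion": "happiness", "colour": "green"},
    "monkey head": {"count": 4, "emotion": "surprise",  "colour": "yellow"},
    "die":         {"count": 3, "emotion": "sadness",   "colour": "blue"},
    "iron man":    {"count": 2, "emotion": "anger",     "colour": "red"},
    "ps4":         {"count": 1, "emotion": "fear",      "colour": "orange"},
}

ANIMATION_SECONDS = 10  # each correctly placed object triggers a 10-second animation


def on_object_placed(obj_name: str) -> tuple[str, int]:
    """Return which emotion animation to play, and for how long,
    when an object is placed into its fractal cube."""
    entry = PUZZLE_OBJECTS[obj_name]
    return entry["emotion"], ANIMATION_SECONDS


print(on_object_placed("die"))  # → ('sadness', 10)
```

Encoding the mapping as data rather than branching logic makes it easy to tune the puzzle (counts, colours, animation length) without touching the game code.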





 


 

This is the complete 10-second animation for each object, exploring how different image and sound mashups based on fractal and loop structures might trigger happiness, surprise, sadness, anger, and fear.


 

Emotion perceived globally from the local processing of anger (red fractals) by focusing on eye cues:

Emotion perceived globally from the local processing of fear (orange fractals) by focusing on eye cues:


Emotion perceived globally from the local processing of fear (orange fractals) by focusing on mouth cues:

Emotion perceived globally from the local processing of happiness (green loops) by focusing on mouth cues:


 

Emotion perceived globally from the local processing of anger (red fractals and loops) by focusing on eye cues:

 


Emotion perceived globally from the local processing of anger (red fractals and loops) by focusing on mouth cues:

 



Emotion perceived globally from the local processing of anger (red loops) by focusing on the game puzzles that place the "head" cues into the fractal cubes:


 

Local perception of surprise through a yellowish fractal-and-loop aesthetic:




 


Comments