
Use your own smartphone
as a wireless controller

Run and fly as a newborn as you explore a stylized fantasy world, gaining the energetic and defensive power of your fairy companions to help you on your journey to birth.

The Newborn is a third-person art adventure game about the birth of life, created as an undergraduate thesis by Sheng Raymond Liao, Lei Shi (石磊) and Ruixin Wu (吴瑞鑫) at the School of Animation and Digital Arts, Communication University of China. The experience goal of this interactive piece is to feel the greatness of the inception of life, concretized by the player goal of acting as a newborn who wanders across peaceful lake surfaces, heads for the top of a great waterfall and finally flies toward the “sun” hanging in a fantasy world where everything carries symbolic meaning.

The core mechanic is holding a smartphone connected to a personal computer over Wi-Fi, using the device's accelerometer to steer the protagonist and thumb gestures (swipes and presses) on the touch screen to command jumping and flight.

Genre

Art

Adventure

Network

Tech

PC with Android

Made with Unity

TCP/IP

Accelerometer

Role

Engineer

Technical Artist

Co-designer

CREDITS

  • Original character, creature and scene concept art by Ruixin Wu

  • 3D character and environment modeling with hand-painted texturing by Ruixin Wu

  • Stylized skybox, environment layout, art direction and trailer composition by Ruixin Wu

  • Dual-screen interaction design and user interface design by Lei Shi

  • Game design document with illustrative diagrams and terrain contour map by Lei Shi

  • Character rigging, animations and camera-movement breakdown animation clips by Lei Shi

PERSONAL CONTRIBUTIONS

  • Communicated actively with the two artists, absorbed their ideas and implemented the pillars of the technical demonstration.

  • As the only programmer, independently researched, implemented and tested/debugged the network module, the user input module, the game AI module and the cinematic (camera control) module.

  • Coded the entire game framework and gameplay logic.

  • Wrote Unity3D Editor plug-ins and game manager scripts to organize the game assets.

  • Created or modified shaders to help the visual artist realize his intended look and to reduce his workload.

  • Managed the project and took responsibility for the final quality of the product.

Newborn - Trailer (2015 Indie Game)
Features
SMARTPHONES AS THE SECOND SCREEN AND WIRELESS CONTROLLER

Our goal is to provide intuitive, enhanced interactivity at no extra hardware cost. After connecting your smartphone to a PC or laptop over the LAN, you use the accelerometer to navigate the avatar and various touch gestures to interact with the visuals on both screens.

CLASSIC NARRATIVE STRUCTURE AND REAL-TIME CINEMATICS

The game design is based on the three-act structure and flow theory to deliver an emotional experience of ups and downs. You guide the infant character across a calm lake, through a menacing valley and finally up a grand waterfall.

CUSTOM SHADERS TO PRESENT STUNNING GRAPHICS

Thanks to close teamwork between the engineer and the artists, creative use of the programmable rendering pipeline lets us present the game's distinctive, fantastic visuals.

Dual-Screen Interaction

Game consoles such as the Nintendo DS and some experimental interactive hardware use dual screens to expand the viewable area of virtual content, while devices like the Sony DualShock or Nintendo Wii Remote offer motion sensors for gesture control. Our design and implementation combine both in equipment most people already own. The dual-screen module establishes a client application on an Android device and a server on a networked computer, analyzing and transferring user input data for smooth character navigation and remotely calling functions (via a Remote Procedure Call protocol) to synchronize game events. This input module was iterated thoroughly for usability so that it would not descend into a gaudy technical gimmick.
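
Below is a minimal sketch of how such a client/server link can be set up with Unity 4.6's legacy networking API (NetworkView and the [RPC] attribute); the class name, port and method names are illustrative assumptions, not the project's actual code.

    using UnityEngine;

    // Sketch of the PC-side server and the phone-side RPC call, using
    // Unity 4.6's legacy networking (NetworkView + [RPC]). Class, port and
    // method names here are illustrative, not the project's identifiers.
    public class PhoneLink : MonoBehaviour
    {
        public int listenPort = 25000;      // assumed port; the real value may differ
        private NetworkView netView;

        void Awake()
        {
            netView = GetComponent<NetworkView>();
        }

        // Called on the PC build to start listening on the LAN.
        public void StartServer()
        {
            Network.InitializeServer(1, listenPort, false);   // one phone client, no NAT
        }

        // Called on the Android build once the player enters the PC's LAN IP.
        public void ConnectToServer(string serverIp)
        {
            Network.Connect(serverIp, listenPort);
        }

        // Phone side: forward the latest accelerometer reading each frame.
        void Update()
        {
            if (Network.isClient)
                netView.RPC("OnTiltInput", RPCMode.Server, Input.acceleration);
        }

        // PC side: receive the tilt vector and hand it to the character controller.
        [RPC]
        void OnTiltInput(Vector3 tilt)
        {
            // e.g. characterMotor.SetTilt(tilt);  (hypothetical hook)
            Debug.Log("Tilt received: " + tilt);
        }
    }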

A GIF demonstrating the Wi-Fi connection implemented with Unity 4.6's built-in networking component.
A video clip demonstrating the Wi-Fi connection procedure between a Samsung Galaxy J7 smartphone and a gaming laptop

Besides navigation, two other game events require the player to act on the touch screen to make progress. Whenever the infant character enters a mushroom-shaped structure and stands in front of the obelisk inside, stars fade in on the smartphone, waiting to be connected by dragging lines between the dots. And if the fairy companions fail in their defense, the invading enemies enter the infant's body and are transferred onto the phone screen; the infant stays paralysed until the player drags them out.
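
The star-connecting event comes down to tracking a thumb drag against the on-screen star positions. Below is a minimal sketch of that idea under assumed names; the project's actual obelisk and star scripts are not reproduced on this page.

    using UnityEngine;
    using System.Collections.Generic;

    // Sketch of the star-connecting mini-game on the phone screen.
    // Star positions, the snap radius and the completion hook are hypothetical.
    public class StarConnector : MonoBehaviour
    {
        public List<Vector2> starScreenPositions;    // filled in when the stars fade in
        public float snapRadius = 40f;               // pixels; assumed tolerance
        private int nextStarIndex = 0;

        void Update()
        {
            if (Input.touchCount == 0) return;

            Touch touch = Input.GetTouch(0);
            if (touch.phase != TouchPhase.Moved && touch.phase != TouchPhase.Stationary)
                return;

            // Has the dragging thumb reached the next star in the constellation?
            if (nextStarIndex < starScreenPositions.Count &&
                Vector2.Distance(touch.position, starScreenPositions[nextStarIndex]) < snapRadius)
            {
                nextStarIndex++;
                if (nextStarIndex == starScreenPositions.Count)
                    OnConstellationComplete();
            }
        }

        void OnConstellationComplete()
        {
            // Hypothetical: notify the PC (e.g. via RPC) so the obelisk event progresses.
            Debug.Log("All stars connected.");
        }
    }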

An illustration to clarify the interactions on the mobile phone/controller; Image from the game design document written by Lei Shi

The diagram on the left illustrates how user input maps to character locomotion and game-event triggers. The avatar's responses are listed below (a code sketch of this mapping follows the list):

  • ① Tilt the smartphone forward, more than 45° away from facing the player, to move the character forward. The pitch angle controls the switch between walking and running.

  • ② and ③ Tilt the smartphone left or right while it is tilted more than 45° forward to make the character veer. The roll angle controls the turning velocity.

  • ④ Tilt the smartphone back to less than 45° from facing the player to smoothly rotate the camera until it looks upward.

  • ⑤ Swipe upward with both thumbs to make the character jump; if the player then ⑥ holds both thumbs on the screen before the character falls, it flies.

  • ⑩ Swipe downward with both thumbs while the character is falling to force it to slump.
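
Here is a minimal sketch of how the thresholds above might be read from the accelerometer and touch input on the phone side. The 45° threshold comes from the design; the other values, field names and the exact pitch/roll formulas are assumptions for illustration.

    using UnityEngine;

    // Sketch of the tilt-and-gesture mapping described above (phone side).
    // The 45° threshold comes from the design document; everything else is illustrative.
    public class TiltGestureMapper : MonoBehaviour
    {
        public float moveThreshold = 45f;   // ①④ forward-tilt threshold in degrees
        public float runThreshold  = 70f;   // assumed pitch at which walking becomes running
        public float swipePixels   = 30f;   // assumed per-frame swipe distance

        // Results, read by the networking code and forwarded to the PC each frame.
        public bool  Moving, Running, LookUp, Jump, Fly, Slump;
        public float Turn;

        void Update()
        {
            Vector3 a = Input.acceleration;

            // Pitch: how far the phone is tilted forward away from facing the player.
            float pitch = Mathf.Atan2(-a.z, -a.y) * Mathf.Rad2Deg;
            // Roll: left/right tilt, used to steer.
            float roll  = Mathf.Atan2(a.x, -a.y) * Mathf.Rad2Deg;

            Moving  = pitch > moveThreshold;                                   // ①
            Running = pitch > runThreshold;                                    // ① walk/run
            Turn    = Moving ? Mathf.Clamp(roll / 45f, -1f, 1f) : 0f;          // ②③
            LookUp  = pitch < moveThreshold;                                   // ④

            Jump = Fly = Slump = false;
            if (Input.touchCount >= 2)                                         // ⑤⑥⑩
            {
                Touch t0 = Input.GetTouch(0), t1 = Input.GetTouch(1);
                Jump  = t0.deltaPosition.y >  swipePixels && t1.deltaPosition.y >  swipePixels;
                Slump = t0.deltaPosition.y < -swipePixels && t1.deltaPosition.y < -swipePixels;
                Fly   = t0.phase == TouchPhase.Stationary && t1.phase == TouchPhase.Stationary;
            }
        }
    }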

Game AI Behaviors

Flocking around the infant character are game AI agents (boids, driven not by "real" AI but by several classic computer-graphics algorithms) whose movement is guided by the well-known "Steering Behaviors". Combined with simple visual sensors, visualized as yellow antennae that scan unceasingly, and finite state machines, these creatures protect the hero whenever they detect enemies trying to attack and intrude into its body.

The video clips on the right show the behavior of a single agent in the flock. The algorithms I adapted for agents in 3D space are similar to the traditional 2D versions, though more complex. Each agent perceives its surroundings with three visual antennae that constantly change direction, and its target position is visualized as a jittering yellow sphere gizmo.
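
The heart of each companion's movement is a Reynolds-style steering update: seek the current target while antenna raycasts push the agent away from obstacles. The following is a minimal sketch of that idea, not the project's actual agent class; field names, antenna angles and tuning values are assumed.

    using UnityEngine;

    // Sketch of one flock agent's steering update: seek the target while three
    // "antenna" raycasts steer the agent away from obstacles.
    public class SteeringAgent : MonoBehaviour
    {
        public Transform target;          // the jittering target position near the infant
        public float maxSpeed = 4f;
        public float maxForce = 8f;
        public float antennaLength = 2f;

        private Vector3 velocity;

        void Update()
        {
            // Seek: steer toward the target at max speed.
            Vector3 desired = (target.position - transform.position).normalized * maxSpeed;
            Vector3 steer   = Vector3.ClampMagnitude(desired - velocity, maxForce);

            // Three antennae: forward, and forward rotated ±30° around the up axis.
            foreach (float angle in new float[] { -30f, 0f, 30f })
            {
                Vector3 dir = Quaternion.AngleAxis(angle, transform.up) * transform.forward;
                RaycastHit hit;
                if (Physics.Raycast(transform.position, dir, out hit, antennaLength))
                {
                    // Push away from whatever the antenna touched, weighted by proximity.
                    steer += hit.normal * maxForce * (1f - hit.distance / antennaLength);
                }
            }

            velocity = Vector3.ClampMagnitude(velocity + steer * Time.deltaTime, maxSpeed);
            transform.position += velocity * Time.deltaTime;
            if (velocity.sqrMagnitude > 0.01f)
                transform.rotation = Quaternion.LookRotation(velocity);
        }
    }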

Newborn - Technical Illustration - AI Steering 3D
A draft of the imagined combat scene for Newborn, drawn by Ruixin (River) Wu
Newborn - Technical Illustration - AI Steering 2D
Camera Control and Real-time Cinematics

One of the common issues every interactive 3D graphics programmer has to deal with is camera movement, and it can be especially frustrating for a third-person perspective. Simply parenting the camera to the character transform in the hierarchy is not enough, because it makes the camera movement rigid and mechanical. My approach is to pre-establish target transforms for the camera as "key points" for shooting the character turning, flying, falling or looking upward, and let the camera smoothly truck and pan into position. Ideally the virtual camera should also never pass through solid geometry and break the game's immersion, so every frame a ray cast from the camera pivot detects where to reposition the camera whenever it would collide with objects in the scene. Unlike Unreal Engine, which offers a self-contained Spring Arm component for exactly this, Unity 4.6.1 required me to write camera-control scripts that cooperate with all the other code to implement the functionality.
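
A minimal sketch of the two ideas above, smoothing toward a pre-placed key-point transform and pulling the camera in front of occluding geometry, might look like this; the names, smoothing time and offset are assumptions rather than the project's actual script.

    using UnityEngine;

    // Sketch: smoothly move the camera toward the active "key point" transform,
    // and pull it forward if geometry blocks the line from the character.
    public class FollowCamera : MonoBehaviour
    {
        public Transform character;
        public Transform cameraTarget;     // current key point (turning, flying, falling...)
        public float smoothTime = 0.4f;
        public LayerMask obstacleMask = ~0;

        private Vector3 velocity;

        void LateUpdate()
        {
            // Truck/pan smoothly toward the active key point instead of snapping.
            Vector3 desired = cameraTarget.position;

            // Spring-arm-style occlusion check: if something blocks the line from the
            // character to the desired camera position, place the camera at the hit point.
            Vector3 toCamera = desired - character.position;
            RaycastHit hit;
            if (Physics.Raycast(character.position, toCamera.normalized, out hit,
                                toCamera.magnitude, obstacleMask))
            {
                desired = hit.point + hit.normal * 0.2f;   // small offset to avoid clipping
            }

            transform.position = Vector3.SmoothDamp(transform.position, desired,
                                                    ref velocity, smoothTime);
            transform.LookAt(character);
        }
    }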

A GIF demonstrating the smooth transition from the real-time third-person camera to a cut-scene camera with predefined track
A GIF demonstrating the smooth transition from a cut-scene camera with predefined track back to a real-time third-person camera

On the other hand, the narrative part of Newborn calls for real-time cut-scenes, as many other games do. Switching between the main camera and cut-scene cameras on various movement tracks takes a fair amount of tricky code, but the real challenge at the time was keeping the protagonist's position and orientation consistent. We wanted the transitions to be seamless, even though in some shots, such as the one where the hero flies onto the grand waterfall in the lower GIF, the character's body moves into a totally different space. Nevertheless, the camera does not cut abruptly when a cut-scene starts or ends. My solution is a procedurally animated "stand-in", as the code snippet illustrates, which takes the place of the controllable protagonist during cut-scenes so the camera seemingly keeps shooting the same actor. These seamless transitions avoid the feeling that the gameplay has been broken up.
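
The original snippet appears only as an image on this page, so the following is a minimal sketch of the stand-in idea with hypothetical names: deactivate the playable character, spawn a look-alike actor at the same pose for the cut-scene, and reverse the swap afterward.

    using UnityEngine;

    // Sketch of the "stand-in" swap: hide the playable character and spawn an
    // identical-looking, non-controllable actor at the same pose for the cut-scene,
    // then reverse the swap when the cut-scene ends. All names are hypothetical.
    public class CutsceneStandIn : MonoBehaviour
    {
        public GameObject playableCharacter;
        public GameObject standInPrefab;      // same model, no player-input components

        private GameObject standIn;

        public void BeginCutscene()
        {
            // Spawn the actor exactly where the player character stands...
            standIn = (GameObject)Instantiate(standInPrefab,
                                              playableCharacter.transform.position,
                                              playableCharacter.transform.rotation);
            // ...and hide the controllable one so the camera appears to film the same body.
            playableCharacter.SetActive(false);
        }

        public void EndCutscene()
        {
            // Restore the playable character at the stand-in's final pose for continuity.
            playableCharacter.transform.position = standIn.transform.position;
            playableCharacter.transform.rotation = standIn.transform.rotation;
            playableCharacter.SetActive(true);
            Destroy(standIn);
        }
    }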

A code snippet showing the functions that replace the controllable character with a non-player character/actor using the same model.
A GIF demonstrating the smooth transition from a cut-scene camera with predefined track back to the real-time third-person camera