- Basil and the Isles of Spice: Procedural Tools, Asset Implementation (2023)
- City Builder Houdini Project: Procedural Tools (2023)
- Wire Tree Project: Procedural Modeling, Simulation (2024)
- Weaver's Workshop: Environment Art (2022)
- Humble Kitchen: Environment Art (2022)
- One in a Krillion: Environment Art, Shader Creation (2024)
- Traditional Paintings: Traditional Art (2020 - 2023)
- Archive


Wire Tree

Procedural Simulation

Houdini

Solo Project

January - March 2024


My Wire Tree project is inspired by wire sculptures of trees, a common decoration piece and craft project. (My roommate has one on her bookshelf, which was my primary source of inspiration!) I had always wanted an excuse to learn the basics of using L-systems for plant shape generation, and this tutorial by Entagma inspired me to build the sculpture out of particle simulation trails.


The base tree is generated using a simple set of branching L-system rules. At the end of every branch, the algorithm generates two new branches that share a horizontal plane, with the branching angle controlled by a parameter. After a minimum number of generations, there is a random chance that each branch will cease to grow.

The wire itself is generated by simulating particle trails that follow the L-system curves. Initially, I set up a particle simulation using the POP Curve Force node; however, I quickly decided to write a custom script with similar functionality instead. This gave me better control over the particle motion and made it easy to add extra forces and parameters. I started the script by following a tutorial by Tim van Helsdingen, but I have made many changes for my own use.

Overall, each particle is driven by four elements. The first two are quite simple, and Tim creates these forces in his tutorial:

- A “follow” velocity: The particle’s overall velocity direction is set to follow the tree curve. Before the curve is input to the particle solver, a “PolyFrame” node generates tangent vectors that point along the curve, and this attribute sets the direction of the velocity.
- A “suction” force: A force pointing towards the center of the curve draws the particle in. This keeps the particle from straying off course and can create a slight taper in the tree over time. The magnitude of the suction force is based on the particle’s distance from the curve.

Finally, I added a small amount of three-dimensional Perlin noise to the forces pushing on each particle, with a small amount of roughness and a low frequency. This causes the particles to waver off the curve in a more natural-looking way.
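
This step isn’t shown in the code excerpt below, but the idea is small enough to sketch here; the parameter names are illustrative rather than my exact setup:

// add a little low-frequency 3D noise to the force so particles waver naturally
float nfreq = chf("noise_frequency");   // illustrative parameter names
float namp  = chf("noise_amplitude");
vector n = noise(@P * nfreq + @Time);   // noise() output sits roughly in the 0-1 range
v@force += (n - 0.5) * 2 * namp;        // recenter around zero and scale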

I added my own “advection” force to the particle system to make the particles swirl around the base of the tree. To do this, I created directional vectors that point clockwise around each branch by taking the cross product of the tree’s normal vector (pointing outwards) and its tangent vector (following the curve direction). Adding force along this direction causes particles to circle the curve clockwise.
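
The swirl direction is stored on the swept tree geometry before the simulation and read back in the wrangle below as the “dir” attribute. Roughly, that setup looks like this (assuming a PolyFrame node has written the curve tangent into “tangentu” and that “N” holds the outward-facing normal):

// point wrangle on the swept geometry, upstream of the POP solver:
// direction that circles the branch = outward normal x curve tangent
vector tangent = normalize(v@tangentu);   // assumed PolyFrame output name
vector outward = normalize(v@N);
v@dir = normalize(cross(outward, tangent));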

Using Houdini for this particle simulation allowed me to repurpose complex particle nodes for unique needs. I added a POP Flock node to my particle system to take advantage of its avoidance algorithm, so any particles that get too close to each other will be pushed away, helping me resolve awkward wire intersections and particle convergence.


/*
----------------------
setup
----------------------
*/
//get nearest prim (curve), uv (location on prim) on 1st geo
//get nearest point on 2nd geo (swept geometry)
int curveprim;
vector curveuv;
xyzdist(1, @P, curveprim, curveuv);
int npt = nearpoint(2, @P);
//get "normal" of curve (direction the curve is pointing)
vector direction = primuv(1, "N", curveprim, curveuv);
//get position at curveuv on curveprim
vector pos = primuv(1, "P", curveprim, curveuv);
//get gen
float gen = point(2, "gen", npt);
gen += 1; //offset gen so no 0 value
f@gen = gen;
/*
----------------------
motion
----------------------
*/
//check if point is within bounds of branch
//if so, bring point towards branch
vector suction = 0;
float maxOffset = chf("max_offset_from_branch");
float minOffset = chf("min_offset_from_branch");
float offsetDecr = chf("offset_age_decrease");
maxOffset = clamp(maxOffset - (gen * offsetDecr), 0, maxOffset);
minOffset = clamp(minOffset - (gen * offsetDecr), 0, minOffset);
vector currOffset = @P - pos;
float currOffLen = length(currOffset);
float suctionScale = chf("suction_scale");
if (currOffLen > maxOffset) {
    // sucks in towards point on original curve
    suction = currOffset * -1 * suctionScale;
} else if (currOffLen < minOffset) {
    // repels away
    suction = normalize(currOffset) * (minOffset - currOffLen);
}
//create directional follow velocity
vector follow = direction * chf("follow_scale");
follow *= 1/(chf("follow_age_multiplier") * gen);
// add advection (swirling around)
vector advdir = point(2, "dir", npt);
advdir = normalize(advdir);
float advstrength = chf("advection_strength");
v@v = -follow;
v@force = suction;
@force += advdir * advstrength;

One in a Krillion

Environment Art, Shaders

Maya, Substance, Unity

Team Size: 10

January - March 2024


One in a Krillion is a hack 'n' slash action game. As King Krilliam, control a massive swarm of krill to rise up against the fish that seek to kill you! Your krill swarm can be used to form "constructs", giving you different abilities in battle, such as a stunning hammer, a protective shield, or a comboing scissor slash.

The game was developed by a team of 10 people. I joined the team after one semester of development as its solo volunteer artist, and was responsible for the character models, environment art, and some technical art needs. This project was an opportunity for me to explore a simple, stylized, and cartoonishly goofy art style. Additionally, I contributed a handful of simple shaders to the project. Finally, working as a solo artist on a small project is quite an exercise in scope and efficiency!


Fish

Fish were modeled in Maya. I wanted to give each fish an extremely distinctive silhouette, using simplified shapes, sharp angles, and smooth curves for a stylized look.


Environment

In the environment concept art for this game, I explored an idea I’d had for textures for a while: a stylized, hand-drawn look with detailing added by small shapes. I drew a texture alpha map with loose shapes, then used it to create a tiled terrain texture broken up by a grunge mask. These shapes line up across all of my terrain textures but are tinted different colors, allowing for smooth in-game transitions.

To create the sand normal map, I followed this Substance Designer tutorial by Martin Schmitter. The tutorial uses Anisotropic Noise to generate directional noise patterns, then several passes of warping and distortion to create large islands of sand dunes, mostly driven by non-uniform and multi-directional warping.


Since this project is so free-form, I explored several pipelines for asset creation as a way to learn new processes. I created a simple rock generator in Houdini to help author individual rock props. The generator applies Worley (Voronoi / cellular) noise to a subdivided cube, with a large amplitude and element size and low roughness, distorting the cube into large cellular chunks. The rock is then remeshed and exported in low-poly and high-poly versions. I used Substance Painter to bake the mesh normals and create a simple texture that combines my hand-drawn shape textures with edge highlights from a curvature mask.


As additional props, I modeled several sandcastle molds, which act as walls and set dressing along the edges of the level.

Because I wanted multiple colors of sandcastle molds, I created a grayscale texture in Substance, then built a shader material in Unity that lets me adjust colors in-engine. Since Unity does not have built-in support for gradient parameters in shaders, I created parameters for three colors, then interpolated between them based upon the texture sample.
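
Conceptually, the blend is just two lerps driven by the grayscale value. The real version is a Unity shader, but the math is easy to sketch as a Houdini wrangle, the language used elsewhere on this page (parameter and attribute names are illustrative):

// three-color gradient sketch: blend A->B over the lower half of the
// grayscale range and B->C over the upper half
vector colA = chv("color_a");
vector colB = chv("color_b");
vector colC = chv("color_c");
float  t    = f@gray;   // stand-in for the grayscale texture sample
if (t < 0.5)
    v@Cd = lerp(colA, colB, t * 2.0);
else
    v@Cd = lerp(colB, colC, (t - 0.5) * 2.0);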

I also created worn-down versions of the sandcastle molds using Houdini’s “Labs Edge Damage” node to procedurally smooth and soften their edges; these models could be used to create the appearance of softer sand. Since the worn models are remeshed into triangles, instead of trying to create clean UV maps, I made a triplanar shader projection for the sand textures.


Shaders

Seaweed is distorted through a vertex shader. X & Z position coordinates are offset by a sine wave of Time with controls for wave frequency and amplitude. The effect is multiplied by a gradient based upon the vertical UV, which means the bottom of the seaweed, where it anchors into the floor, will not distort.
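
The displacement itself is only a few operations. It is sketched here as a point wrangle purely to show the math; the actual effect is a Unity vertex shader, and the names below are illustrative:

// seaweed sway sketch: offset X and Z by a sine of time,
// masked by the vertical UV so the anchored base stays still
float freq = chf("wave_frequency");
float amp  = chf("wave_amplitude");
float mask = v@uv.y;            // assumed 0 at the base, 1 at the tip
float sway = sin(@Time * freq) * amp * mask;
@P.x += sway;
@P.z += sway;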

Another simple shader I made for this game is the Overkrill shader. This shader is applied to swarming krill, and has a boolean toggle parameter which switches it from a basic color into “overkrill” mode. In Overkrill, the color of the krill cycles through a rainbow. This effect is done by oscillating the red, green, and blue channels through a sine wave.
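
One way to get that cycling is three phase-offset sine waves, one per channel; a quick sketch of the idea (illustrative only, the real version lives in a Unity shader):

// overkrill rainbow sketch: oscillate R, G, and B with offset phases
float speed = chf("cycle_speed");
float r = sin(@Time * speed)                * 0.5 + 0.5;
float g = sin(@Time * speed + radians(120)) * 0.5 + 0.5;
float b = sin(@Time * speed + radians(240)) * 0.5 + 0.5;
v@Cd = set(r, g, b);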


Basil and the Isles of Spice

Procedural Tools

Houdini, Unity

Team Size: 28

Aug. - Dec. 2024


Basil and the Isles of Spice is a 3D platformer game. Play as Basil, a basilisk, who must pass a series of challenges to become Shaman of her village!

I joined Team Leviathan to work on this project in August of 2023. The project was in its final stage of production, with the goal of wrapping everything up by the end of the semester in December. However, at the time that I joined, the game's level design had undergone a complete remake, which challenged our art team to rapidly create a brand-new environment.

I created two tools in Houdini to help our environment artists with rapid asset creation and set dressing. These tools were integrated into Unity using the Houdini Engine plugin.

Basil and the Isles of Spice will be on Steam soon! You can find my Houdini .hip and .hda files, as well as some basic documentation for tool usage, here.


Platform Tool

One of my tools was a procedural platform tool, which creates simple platform geometry based upon a curve. This tool was initially intended for use by our level designers, but it eventually evolved into an artist tool. Our game’s heavily stylized art style leaned into large, easily readable shapes, so the tool’s generated geometry fit our desired outcomes exactly. By allowing artists to create platforms in-engine instead of in a separate 3D modeling package, we saved a lot of time, and iteration and changes could happen much more rapidly. While our key level structures were modeled by hand, most surrounding platforms were made using the tool.

My top priority was to make this tool easy to use. The main controls are extremely simple: users can adjust the height of the platform and the bevel amount for both the top and bottom edges. A platform with a slight bevel can be used for rocky or grassy structures, while a very soft bevel along the top edge can create a sand bank, and a soft bottom edge can round out a floating island.

Platforms in the Mangrove, Hub, and Arch areas, each utilizing beveling in different ways to achieve a unique look.

Initially, our team planned to add a trim texture to the sides of platforms to blend the seams of our triplanar textures. I added the ability to output separate trim geometry which consisted of just the top beveled polygons, and implemented procedural UV unwrapping and layout for this trim.

Later, one of our artists asked me to implement a second type of trim: an extruded overhang. I added parameters for the amount of extrusion and the height of the trim, as well as a noise offset along the bottom of the overhang with adjustable frequency and amplitude.

In order to make the tool both accessible and robust, I decided to automatically handle the settings for clean mesh topology generation, while adding some basic controls for situational changes.

By default, the tool flattens the curve input to guarantee the generation of flat surfaces. However, this can be toggled off if desired, to create sloped or sideways platforms.

Flattening the curve helps fix issues resulting from curves with y-level changes. However, toggling off curve flattening allows sideways meshes to be generated.
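
The flattening step itself is tiny; under the hood it amounts to something like this (a sketch, not the exact node setup):

// collapse every point of the input curve onto a single height,
// so the platform extrusion stays planar
vector center = getbbox_center(0);
@P.y = center.y;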

The mesh is automatically retopologized into evenly-sized triangles. This allows the mesh to be used with our grass tool in-game, which requires even subdivisions. By default, the entire geometry is remeshed; however, there is an option to only remesh the top and bottom faces, which can provide cleaner edges for less beveled platforms. The amount of detail preserved in remeshing can also be adjusted, so platforms which require less fidelity can be generated using fewer polygons.

Remeshing options can control the area targeted and the density of resulting polygons.


The Houdini node network for the platform generator.

The UI and parameters for the platform generator.


Prop Scattering Tool

The Caldera, the ending area of our game. The bushes and kelp have been scattered using this tool.

The second tool I created for this project was a basic prop scattering tool. It accepts a ground mesh and a bounding object, and instances props across the mesh surface.

This tool was initially extremely simple — it was more or less just a Scatter and a Copy and Align node, with Unity instancing of props and controls for prop size and density. I bashed it together during one lab period for artists to use in a few quick locations. However, as we found more potential uses for the tool, I began adding additional control parameters.

One of the key changes was adding noise to the density attribute. Simply scattering props evenly created too uniform a look. Noise patterns, with controls for frequency, allowed us to create clumps of props, which provided a more natural look.
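
The clumping boils down to writing a noise-driven density attribute that the scatter step then reads; roughly (attribute and parameter names are illustrative):

// wrangle upstream of the scatter: noise-based density for clumpy placement
float freq = chf("clump_frequency");
float dens = chf("overall_density");
float n    = noise(@P * freq);                       // roughly 0-1
f@density  = fit(n, 0.35, 0.65, 0.0, 1.0) * dens;    // sharpen the noise into clumps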

Coral and kelp scattered, broken up into clumps by noise. Both screenshots are the same scene; the right has water disabled for demonstration.

I also added a control for directional placement of props — only placing props based upon the normal direction of the surface mesh. This allowed us to, for example, only scatter foliage on the top surface of a prop.
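
That filter is essentially a dot-product test between each scattered point's normal and the requested direction; a sketch with illustrative parameter names:

// point wrangle after scattering: keep only points facing the chosen direction
vector dir   = normalize(chv("placement_direction"));   // e.g. {0,1,0} for top surfaces
float  limit = chf("direction_threshold");              // 1 = exactly aligned
if (dot(normalize(v@N), dir) < limit)
    removepoint(0, @ptnum);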

The directional controls were particularly useful on the geometry for the Leviathan skull.

Iteration on this tool mostly focused on resolving performance issues. I quickly realized that, while a single density attribute allowed artists to easily modify the look of the prop scattering, it could also easily lead to far too many props being added to the level at once. I added a parameter that caps the total number of objects scattered, double-checked the poly count of our prop meshes, and retopologized them as necessary. Finally, I added an attribute that automatically tags all instanced objects as Static, allowing them to be batched and occlusion culled.


The node network and parameters for the prop scattering tool.


City Builder Houdini Project

Procedural Tools

Houdini, Unreal

Solo Project

Aug. - Dec. 2024


As a deep dive into creating procedural tools for Unreal Engine, I designed a series of HDAs to generate a city built upon Open Street Map data. This project was largely inspired by two GDC talks: Marvel's Spider-Man, meet Houdini by David Santiago and Tools to Build a World in One Day by Thomas Tobin. You can download my final HDAs for this project and a final demo build here.

The project builds a city from real-world map data imported from Open Street Map, and the generated geometry adapts to the terrain. Buildings are constructed out of modular wall pieces, which are instanced and can be swapped out for any asset in-engine. Similarly, instanced props can be scattered around the city, which can be used to place environment assets such as trees or rocks. A second type of prop generation creates props that follow roads, an ideal setup for lampposts, fences, and other similar props.


Basic OSM City Builder

The backbone of this project is the “Basic OSM City Builder” HDA. This network creates a basic city framework using OSM data, imported by the “Labs OSM Import” node. The data is split between roads and buildings, each of which is handled separately by the network.

Road generation is extremely simple:
- Road data is imported into the project as curves.
- Curves are subdivided for higher resolution.
- Curves are projected onto the terrain using the “Ray” node.
- A “Sweep” node generates geometry from the curve.
Open Street Map provides an attribute for each road that labels it by type — such as “motorway”, “residential”, or “pedestrian”. A simple script allowed me to create parameters that give the user control over the width of each of these road types.
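
The width script is essentially a lookup from the imported road-type attribute to a per-type parameter; something along these lines (assuming the OSM type arrives as a string attribute, with illustrative parameter names):

// primitive wrangle on the road curves: choose a sweep width per road type
string roadtype = s@highway;            // assumed name of the imported OSM type attribute
float  width    = chf("width_default");
if      (roadtype == "motorway")    width = chf("width_motorway");
else if (roadtype == "residential") width = chf("width_residential");
else if (roadtype == "pedestrian")  width = chf("width_pedestrian");
f@width = width;                        // read downstream when sweeping the road geometry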

Buildings, on the other hand, are imported as polygons that outline the base of each building. Initially, I wanted to generate buildings based upon true-to-life silhouettes. However, this presented several issues; most notably, building module generation was inconsistent for oddly-shaped buildings. I eventually pivoted to processing each building as a rectangular bounding box. This also created a more versatile tool: buildings can be replaced with instanced assets of any model, so long as the model fits into the bounding box. If an artist had a set of pre-made houses, for example, they could easily be copied into the city. This would also allow the generation system to be used in other contexts — say, a kid’s building block playset, or a curated garden.

In order to support this vision, I created attributes that define the scale and orientation of each bounding box. In Houdini, this was done using the “Extract Transform” node to compare the building’s bounding box to a reference square, extracting the orientation and scale.

I used Houdini Labs’ “OSM Buildings” node to generate the building geometry, saving me some work by using its pre-built logic. This node is fairly simple, extruding a building shape from the starting geometry based upon a parameter for the number of levels (floors). Some buildings were imported with a pre-set attribute for the number of levels; for those without this data, I wrote a simple script to randomize their heights.
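
That randomization is only a couple of lines; roughly (assuming each footprint is a primitive carrying an integer "levels" attribute, with illustrative parameter names):

// primitive wrangle: give buildings without level data a random floor count
if (i@levels <= 0) {
    float r  = rand(@primnum + chi("seed"));
    i@levels = int(fit01(r, chi("min_levels"), chi("max_levels") + 1));
}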

As a final step, I wanted to eliminate any buildings that intersect. To achieve this, as each building is generated, the network checks for intersections with all previously generated buildings. The new building is only included in the final output if it does not intersect with any others.

Given more time, I would've loved to add many more things to this network: more complex roads, building foundations, and optimization for the intersection analysis (my current solution is very slow!). That said, the basic generation here is only the root of the overall project, and at this point I chose to move on in order to explore more tools for the system.


Building Generation

The “Building Generation” HDA is, to some degree, just a wrapper for the “Labs Building Generator” node. For its creation, I relied heavily upon Simon Verstraete’s tutorial on the Houdini YouTube channel — an invaluable resource. However, I had a fun time assembling additional logic to procedurally change the number of module and building variations.

In the tutorial, Simon imports modular wall pieces, and uses the “Building Generator Utility” node to assign a “module_name”, “module_dimension”, and “module_priority” attribute to each piece of geometry, which are used in the “Building Generator” node. However, for my system, I wanted to generate basic placeholder walls instead of importing geometry into my Houdini file. These placeholder walls could be swapped out later, in-engine, for models of the user’s choosing. I also wanted to be able to procedurally alter the number of wall module variations in each building. Because of this, I decided to set up a manual system for assigning attributes to geometry.

One quirk of the “Building Generator” node, which I wasn’t able to find in any documentation: the “Module Pattern” field matches any text up to the first asterisk it encounters, then splits the string there. It doesn’t matter what each module is named after that point, so long as the names are all different. In general, I find Houdini’s documentation to be fairly comprehensive, but occasionally small details like this are left out. I’m very glad that Houdini lets us dive into complex nodes to investigate how they work inside!

The network iterates through every building and generates wall modules:

At this point, for our purposes, we want to replace the placeholder wall geometry with instanced modules for each building. Deleting the wall module geometry leaves us with a point cloud, and new wall geometry can be packed and instanced when copied onto these points.

During this process, we can assign variations to buildings by separating them into groups before copying the wall geometry. In my demo, each building variation is given a different color; however, this same feature could be used to create buildings that use different sets of modular assets.


Prop Scatter

The “Asset Scatter” HDA utilizes heightfield masks to control prop scattering. While I initially considered a manual setup for scattering points, Houdini’s heightfield nodes were more efficient and simpler to implement. They also let me use pre-existing features, such as the “Mask by Feature” node’s controls for masking by height and slope.

The major feature of this network is the ability to avoid prop placement that overlaps previously generated buildings and props. However, the instanced nature of props in the other HDAs presents unique challenges when trying to avoid overlaps. I decided to use the “Mask by Object” node to delineate areas where props should not be placed, but this node can’t be used with packed / instanced geometry. (This is the opposite of the instancing issue in the building generator!) As before, I fixed this by using the instance point clouds from the input nodes: cube geometry is generated on top of these points, then used to create the mask. Because of this method, the generated mask is not extremely accurate, but it’s reliable enough to prevent overlapping props.

Then, it’s a simple matter of using the “Heightfield Scatter” node to scatter points on the landscape. Props are instanced onto these points.


Road Props

By contrast, the “Road Props” HDA used a completely different approach for generating props. Like before, points are created, and then instanced props are copied to their locations. We also want to delete points that overlap other geometry. In this case, however, points are generated based upon the original road input.

The “Sweep” node, which we previously used to generate geometry for our roads, can alternatively be configured to output curves. This can be used to produce two columns of curves that follow the road, which are then resampled to set the distance between prop placement points. To clean up the curves, I used yet another “Sweep” node to generate a bounding region that contains the roads: any points that fall within this region would intersect the original roads, so they are deleted.

Like the “Prop Scatter” node, this HDA can accept other building and prop nodes in the network as inputs, and overlaps with these inputs will be avoided. I found myself running into the same issues I had previously faced with instanced geometry. In this case, a similar solution — creating placeholder cube geometry at the location of the input points — works again. This time, however, the geometry is converted into a volume, then used in a “Group” node as a bounding volume to select and delete intersecting points.

Finally, a simple script finds and deletes points that have been generated too close to each other, such as two points that nearly touch on a road intersection corner. (A minimum separation is set by a user parameter.) This script makes sure to check for points that are on separate streets — removing points on the same street could just delete all props on narrow roads.
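
That cleanup pass can be written as a small wrangle built around nearpoints(); a sketch, assuming each placement point carries an identifier for the road it came from (attribute and parameter names are illustrative):

// point wrangle: cull points that sit too close to a point from a different street
float mindist = chf("min_separation");
int   pts[]   = nearpoints(0, @P, mindist);
foreach (int pt; pts) {
    if (pt == @ptnum) continue;
    int otherstreet = point(0, "street_id", pt);   // assumed per-road identifier
    // only cull across different streets, and only delete one point of each pair
    if (otherstreet != i@street_id && pt > @ptnum)
        removepoint(0, @ptnum);
}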


Beyond experimenting with various Houdini and Unreal features to create this project, I also learned a lot about managing my pipeline and workflow. In particular, towards the end of the project, I struggled with major Unreal integration issues, many of which resulted from a lack of direction on my part.

While developing the project, I often avoided bringing my tools into Unreal until the last minute. This led to many headaches where I couldn't figure out why my network didn't work the same in Unreal as it did in Houdini. In one particular example, I discovered that using a Houdini HDA as an input for another would only import the first output — I could not create HDAs with multiple outputs. Instead, I had to create groups for the output geometries, which are then used to identify the desired data when working in another node.

At the last minute, I experienced a plethora of build errors when trying to create a final playable build. At that point, it was nearly impossible to identify the source of the errors, and I ended up disabling several features of my project as a brute-force workaround. In reflection, I should have been making regular test builds throughout the project to identify problems early and make debugging a regular part of my production pipeline.

However, as a whole, I am proud of the work that I did on this project. Beyond the final product, the development process taught me a lot about problem-solving and procedural logic. After many weeks of work on these systems, I'm excited to bring the skills I developed to my next project!


Humble Kitchen


My inspiration for this project came from photos of small Japanese apartment kitchens. Many Asian apartments have tiny kitchens, and people must live hyper-organized lives to use the space efficiently. The person who lives here has invited a guest over for an evening cup of tea; the dishes have been washed and set aside to dry, and the counters have been completely cleared off to give the tiny space a bit of breathing room.

The process of making this project was a major learning experience for me in both asset production and project pipeline and management. As a project involving a lot of small parts and aiming for a high degree of realism, I spent a lot of time researching. Working from reference was crucial, and I learned a lot of modeling and texturing techniques in the process of creating these assets.

Software Used:
Models - Autodesk Maya
Textures - Substance Painter
Lighting & Rendering - Marmoset Toolbag 3
Additional textures from Adobe Substance 3D Assets

Enchanted Ruins


In the soft sunlight of late afternoon, the ruins look almost normal; the magic is only given away by the floating rocks and rubble that surround the scene. Under the pale night sky, however, the ruins light up under the glow of the moon.

This pair of images was my first foray into stylized environment graphics. Especially when approaching the lighting, I wanted to give each scene an eye-catching mood that strayed away from realistic lights and colors.

The base of this scene was modeled in Maya, while the organic crumbling rocks were sculpted in ZBrush, then reduced to low-poly meshes and brought back into Maya for rendering. The daytime scene was rendered in Pixar's RenderMan, while the nighttime scene used Arnold.
Software Used:
Models - Autodesk Maya, ZBrush
Rendering - RenderMan, Arnold

Weaver's Workshop


Fiber crafts hold a near and dear spot in my heart. Both of my grandmas do fiber crafts, and I learned how to knit and crochet as a kid. Although I do not weave, I greatly admire the amount of dedication that goes into creating a woven project. For this project, I was inspired to capture the warmth of a weaver's studio, and the nostalgia and familiarity I feel for these kinds of crafts. This project was a lot of fun to make - I had the opportunity to gather reference images of an antique Swedish loom at the Swedish Club in Seattle!

On a technical level, this project challenged me to come up with solutions for very complex models and textures. It was also my first time using Substance Designer, which I used to create a yarn material!

Software used:
Models - Autodesk Maya
Textures - Substance Painter & Substance Designer
Lighting & Rendering - Unreal Engine 4
Additional textures from Adobe Substance 3D Assets

Staircase on the Hill


I grew up in the San Francisco Bay Area. I'm still in love with San Francisco's peculiar charms, antithetical to the usual perception of California: hilly neighborhoods, foggy summers, bay windows, and succulent gardens.

For this project, I chose to focus on one of the staircase-sidewalks that are scattered around the city. Specifically, I pulled inspiration from the Vallejo Stairs on Telegraph Hill. When making this project, I tried my best to capture many of the subtle features that set San Francisco's neighborhoods apart. This project was a massive learning experience in making stylized textures in Substance Painter, as well as setting up a realistic exterior lighting scene!

Software used:
Models - Autodesk Maya
Textures - Substance Painter
Rendering - Arnold for Maya

Procedural Floating Islands


My first Houdini project: a tool that procedurally generates floating islands and bridges along a user-defined curve! The project is built so that the user has a high degree of control over the parameters for both the islands and the bridges.

Making this project was a huge learning experience for me. It involved a lot more Houdini VEX than I initially expected, and forced me to grapple with scripting at a depth I had never tackled before. However, I'm proud of the end result!

For this project, I also produced a tutorial / explanation document on my creation process. The document works in tandem with my .hip file, which is thoroughly commented to explain what each node and script does. Download either of these below:

Mirage


Jump through mystical desert ruins and shoot the bird-like guards to steal their treasure! Mirage is a short 2D platformer-shooter game, created as a student project by a team of 12 students over the course of a year using a custom engine.

On Mirage, as with any small project, I wore many hats. Primarily, I served as a tech artist: I created game VFX using a custom particle system, rigged our snake character and enemy characters using Spine, and implemented almost all of our game's art assets. In addition, I illustrated most of the background props and UI elements, and heavily contributed to the style guide documentation and task management.

The creation of this project was an invaluable experience in working with a team. Through both the ups and downs of the project, communication skills were a necessity for our success, and I often found myself serving as the bridge between artists, programmers, and designers, which was an incredible learning process for me!

Ray's Coast 2 Coast


Created in 48 hours for West Coast Game Jam 2022
Ranked #2 overall and #2 in Best Art
Ray's Coast 2 Coast is a 2D zen sidescroller game where you play as a manta ray who dreams of flying through space. This game was made by a 6-person team, including 2 artists, in the Unity game engine!

For this game, I created two backgrounds as well as several obstacle props for the levels. As a quick game jam project, this was all about how fast we could establish a distinctive visual style: we used bold colors and silhouetted imagery to produce assets quickly, created extra backgrounds using simple palette swaps, and the team did a great job implementing simple VFX to add to the experience!

Wild Wild Wetlands


My junior year student project, Wild Wild Wetlands, is a 3D parkour game about a frog cowboy!

The game was created by a team of 18 people using Unity. We started development in September 2022, and the content shown here is from our April 2023 build. While the game is complete, the team is planning to enter a post-production phase for the Fall 2023 semester.

On this team, I served as an art producer, environment artist, and tech artist. Over the course of the project, I worked on a number of different aspects of the game:
- Illustrated concept art
- Modeled, UV'd, and textured environment props
- Implemented art assets and conducted set dressing across 3 levels
- Created VFX using particle systems and shaders
- Set up game lighting and postprocessing effects
One of the biggest hurdles of this project was learning how to communicate with a large team. We faced many challenges coordinating across our art team; additionally, I had to learn a lot about balancing tech and design needs! This project required me to be versatile and flexible in order to cover all the odd tasks that needed to get done, and challenged me to be adaptable at every turn.

Aphrodite's Salon


Aphrodite, the Greek goddess of love, brings to mind images of classical beauty and heart-melting romance. However, like almost anything else in the hands of American culture, the concept of love all-too-often finds herself pushed into kitschy, gaudy pinks and hearts - just think of Valentine's Day!
I made this illustration for a class in environment design. Given the prompt of "Aphrodite's Salon", I imagined a Googie-architecture style salon, with bold shapes and colors trying to draw the eye of potential customers. I had lots of fun with all the details in this illustration, from the roses in the planters, to the heart-shaped roof, to the heart shape of the giant sign's scissor handles!
As a secondary part of this assignment, I was challenged to put this building into an environment and apply various color palettes to it. I thought it could be interesting to place the building in a Greek pastoral landscape, discordantly bringing this commercialized, modern building back to its classical roots. These color palette variations, a combination of analysis and application, were a wonderful learning process: side-by-side, it's amazing how much color can transform a scene!

Alistair: Psychonauts x Alice in Wonderland


This character is Alistair, a re-imagining of the character Alice from Alice in Wonderland. Despite being an old man, he preserves the original Alice's youth at heart. I imagined Alistair as an old man in the modern day, who still wears clothing that would've been fashionable in his youth - for this reason, I drew inspiration from a variety of time periods to give the clothing a vintage yet timeless feel.
At its core, this project was a study in stylization. Double Fine's Psychonauts (and other games, such as Broken Age) feature heavily stylized graphics, with characters built upon heavily exaggerated shapes. For this character design, I aimed to replicate their style, to gain a better understanding of the distinctive design decisions Double Fine employs in their media. This was much more challenging than I initially thought! I spent a lot of time iterating upon sketches before I started to find shapes and proportions that worked well for my design.

Traditional Paintings


When I was in elementary school, my parents signed me up for a Chinese painting class during summer break. As far as I can remember, that was my first introduction to making art beyond crayon and marker scribbles; as I grew up, paint became my comfort medium to work in.

These days, I enjoy painting as both a hobby and a way to train my eye and my traditional art skills. Over time, I find it more and more refreshing to return to paint. It's a much-needed reminder of the foundations behind great art!

All of my paintings on this page are done using gouache.

About & Contact


Hello! I'm Annabel Sun, an artist currently based in the Seattle, Washington area. I'm a digital artist specializing in 3D environment art and tech art for game development, but I also have a passion for working with traditional media. This is my portfolio website: a showcase of my best work!

I am currently a 4th-year student at DigiPen Institute of Technology, studying for a Bachelor of Fine Arts degree in Digital Art and Animation. I am currently seeking jobs and internships!

If you're interested in my work, please don't hesitate to reach out. I'd love to chat!
You can reach me at sunannabel@gmail.com.
You can find all my social media & other links in the footer below.