At EWU, I took an Emergent Design class focused on exploring evolving AI tools. I was already experimenting with AI on my own, but this class pushed me deeper into prompt writing, tool testing, and creative problem-solving using emerging tech. AI is powerful, but only as smart as the person guiding it—so I focused on learning how to communicate clearly and creatively with these tools.
My GPT
I built this tool to give designers actionable feedback. It's built around core design principles to support creative thinking, critique, and growth.
Adobe Firefly
This collection showcases early explorations using Adobe Firefly, created for school and AAF Spokane concept development. These pieces helped me test prompt engineering, composition, and lighting within generative tools. While not polished final work, they reflect how I use AI to brainstorm quickly, push visual ideas, and explore creative direction.

Promo Video: AAF Spokane Halloween Bingo
I created this motion piece using Adobe Firefly and Luma AI tools to promote AAF Spokane’s Halloween Bingo event. The goal was to capture the playful, spooky vibe while showcasing how AI can accelerate visual storytelling for quick-turn promos.
Claude
Hammy’s Care Tracker
This was a quick concept built with Claude AI while experimenting with prompt-based UI design. I wanted to test how well AI could follow visual tone and layout instructions for a playful pet care interface. Claude handled structure surprisingly well, even for something this whimsical.

ChatGPT (DALL-E)
Succulent Sheep (2D to 3D Workflow)
This project started with a simple prompt in ChatGPT + DALL·E and evolved into a full pipeline test: from 2D concept to 3D model to 3D print. I used Meshy to convert AI-generated images into usable 3D assets, refining along the way to prep for print. This little succulent sheep even landed my work a feature in The Inlander (pg. 28). It was an exploration of how AI can assist creatives across various formats.


Iterations: v1, v8, v19, v35, v40, and 3D (GPT to Meshy)


