Experiential Design / Bachelor of Design (Hons) in Creative Media
Experiential Design · Final Task
| Instruction
Doc 1.1 MIB
| Progression
In Unity, I started our project with the main feature, which is the
scanning flow. I designed a "Florist" name card to serve as the
sample card. Below is the card design.
Fig 2.1 Target Image
To help us better understand this technology, I also found some
tutorial videos related to our topic:
Vid 2.2 Business Card Project Showcase
Vid 2.3 AR Tutorial Video
Following the tutorial, I started with the Vuforia setup and got
the target image.
Fig 2.4 Target Image
After setting the target image, I started building the information
attached to it: an avatar, UI buttons, and a text message. The avatar
website I used was Ready Player Me.
Fig 2.5 Avatar Creating in Ready Player Me
Fig 2.6 Putting Avatar into Frame
Next, I placed the target content and gave all components an intro
animation so that they appear from the center of the card. To make
the button open the contact site, I wrote a script that links the
button to a URL; a sketch of the idea follows the figures below.
Fig 2.7 Scanned Information
Fig 2.8 Link Manager Script to Control Button
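Fig 2.8 shows the actual Link Manager script; the core idea boils down to a rough sketch like this (the class and field names here are placeholders rather than the exact code):

using UnityEngine;

// Rough sketch of the link-button idea (names are placeholders, not the
// exact script in Fig 2.8): each button gets a URL in the Inspector and
// opens it in the device browser when tapped.
public class LinkManager : MonoBehaviour
{
    [SerializeField] private string url = "https://example.com"; // placeholder link

    // Hook this method to the button's OnClick() event in the Inspector.
    public void OpenLink()
    {
        Application.OpenURL(url);
    }
}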
After drafting the scan information (not fully set up yet, but a
working draft exists), I moved on to the menu page and the scan-frame
UI. The button on the menu page takes you to Page 2, a blank page
with a camera frame that instructs you to perform a "scan" operation;
a sketch of this page switch follows Fig 2.9 below.
Fig 2.9 Menu Page and Scanning UI
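Since the page switch is pure UI, a minimal sketch of what the menu button does could look like this (the panel names are my own placeholders):

using UnityEngine;

// Sketch of the menu-to-scan navigation, assuming both pages are UI panels
// in the same scene (panel names are hypothetical).
public class PageNavigator : MonoBehaviour
{
    [SerializeField] private GameObject menuPanel;
    [SerializeField] private GameObject scanPanel; // "Page 2" with the camera frame

    // Wired to the menu button's OnClick() event.
    public void GoToScanPage()
    {
        menuPanel.SetActive(false);
        scanPanel.SetActive(true);
    }
}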
When you scan an object, in this case the target business card, the
user interface displays the word "Scanned" together with an
information panel, from which the scan can be saved into the
collection as an index entry.
Fig 2.10 Scanned Information UI
Fig 2.11 Current hierarchy and script assets
During the previous task, I noticed quite a few bugs, so I decided to
redo the project from scratch to make sure everything is clearer and
more functional. Here are the bugs I faced:
| Bug Review
Known Bugs |
1. Scene assets disappear in Editor mode, but appear during Play mode.
This was super confusing at first. I thought I had forgotten to save
something or had deleted a prefab by mistake. But when I hit Play,
everything showed up again. Later, I found out it was related to layer
settings or prefab connections not updating properly.
Fig 3.1 Assets not appearing in Editor mode
2. Start scene appears too large; unsure if it's a scale or camera setting
issue.
When I opened the start scene, the whole layout looked way too zoomed in.
I couldn’t tell if it was my object scaling problem or if the camera
settings were off. I tried adjusting both for a long time before things
started to make sense.
Fig 3.2 Scene Scale Error
3. Pause and Play buttons are currently non-functional.
The buttons didn’t respond at all when I clicked them. I double-checked
the OnClick() functions and even replaced the UI EventSystem, but
nothing worked. It was really frustrating.
Fig 3.3 Play and Pause Button Not Working
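Looking back, one way to rule out a broken Inspector reference would have been to wire the listeners in code. A rough diagnostic sketch, assuming the buttons are meant to control a video (it is also worth checking that the scene has an EventSystem and the Canvas has a Graphic Raycaster, or clicks never reach the buttons at all):

using UnityEngine;
using UnityEngine.UI;
using UnityEngine.Video;

// Diagnostic sketch: wiring the buttons in code instead of the Inspector
// rules out missing OnClick() references. Field names are placeholders.
public class PlayPauseButtons : MonoBehaviour
{
    [SerializeField] private Button playButton;
    [SerializeField] private Button pauseButton;
    [SerializeField] private VideoPlayer videoPlayer; // assuming the buttons control a video

    private void Awake()
    {
        playButton.onClick.AddListener(videoPlayer.Play);
        pauseButton.onClick.AddListener(videoPlayer.Pause);
    }
}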
4. Video Player not functioning.
My onboarding video just wouldn’t play in Unity. I wasn’t sure if it was
the video format or Unity’s settings. I debugged it for hours before
realizing that video compatibility depends on the platform you’re
building for.
Fig 3.4 Video Player not Appearing
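From what I found, Android builds are happiest with an H.264-encoded .mp4 (with Transcode ticked in the clip's import settings), and preparing the video before playing it avoids blank frames. A small sketch of that playback flow:

using UnityEngine;
using UnityEngine.Video;

// Sketch of a safer playback flow: prepare the video first, then play it
// once decoding is actually ready.
[RequireComponent(typeof(VideoPlayer))]
public class OnboardingVideo : MonoBehaviour
{
    private void Start()
    {
        var player = GetComponent<VideoPlayer>();
        player.playOnAwake = false;
        player.prepareCompleted += p => p.Play(); // play only when ready
        player.Prepare();
    }
}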
5. Scanning status does not update correctly – the UI still shows
“Scanning” even when the target has already been detected, and the
panel remains on screen even when the card is no longer visible.
This one really got on my nerves. I wanted the text to update to
"Scanned" when the image target was detected. But no matter how I tried,
the status always stayed at "Scanning" and the panel never disappeared
even when the card wasn’t there anymore.
Fig 3.5 Not Detected but still showing "Scanned"
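The behaviour I was aiming for can be driven from Vuforia's target status event. A sketch of that idea, assuming Vuforia Engine 10+ (the field names are my own):

using UnityEngine;
using UnityEngine.UI;
using Vuforia;

// Sketch: subscribe to the image target's status event and drive the label
// and panel from it, instead of setting them once and hoping.
public class ScanStatusUI : MonoBehaviour
{
    [SerializeField] private ObserverBehaviour imageTarget; // the business-card target
    [SerializeField] private Text statusText;
    [SerializeField] private GameObject infoPanel;

    private void OnEnable()  { imageTarget.OnTargetStatusChanged += OnStatusChanged; }
    private void OnDisable() { imageTarget.OnTargetStatusChanged -= OnStatusChanged; }

    private void OnStatusChanged(ObserverBehaviour target, TargetStatus status)
    {
        bool tracked = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        statusText.text = tracked ? "Scanned" : "Scanning";
        infoPanel.SetActive(tracked); // hide the panel when the card is gone
    }
}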
Not Yet Finished |
1. Menu button, menu panel, and full index system not completed yet;
these are being handled by my teammate Tze Wei.
| Rebuild
Rebuilding the Scene to Fix Bugs #1 and #2
To fix the first two bugs, I created a completely new scene and rebuilt
everything from scratch. I carefully reorganized the hierarchy, objects,
and layers to make sure nothing would go missing again. It took extra
time, but it helped me keep everything clean and functional.
I also finished the transition from the onboarding page to the scanning
page. To make the onboarding more user-friendly, I made it into a short
animated video so that users would understand what’s going on before
jumping into the AR part.
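The transition itself can be as simple as loading the scanning scene when the onboarding video ends. A minimal sketch, assuming a hypothetical scene name that is included in the Build Settings:

using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.Video;

// Sketch: when the onboarding video reaches its end, load the scanning
// scene. "ScanScene" is a placeholder name.
[RequireComponent(typeof(VideoPlayer))]
public class OnboardingTransition : MonoBehaviour
{
    private void Start()
    {
        GetComponent<VideoPlayer>().loopPointReached +=
            _ => SceneManager.LoadScene("ScanScene");
    }
}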
After confirming that the scanning process worked smoothly, I handed
over the Menu section to my teammate Tze Wei to continue the rest of the
development.
Fig 3.6 Onboarding Image and Video
Fig 3.7 Unity bugging and debugging moments
1. Scanning Frame & Onboarding Page
Since I planned to use an Android phone for testing, I resized the
project screen to match the phone resolution. That’s also one reason why
I restarted everything—to make sure the screen layout would fit
perfectly on mobile.
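Most of that fitting work is done by the Canvas Scaler. The settings I mean look roughly like this in code, though they can equally be set in the Inspector (the 1080x1920 portrait reference resolution is my assumption):

using UnityEngine;
using UnityEngine.UI;

// Sketch: make the UI scale with the phone screen instead of staying at a
// fixed pixel size. The reference resolution is an assumed portrait value.
[RequireComponent(typeof(CanvasScaler))]
public class MobileCanvasSetup : MonoBehaviour
{
    private void Awake()
    {
        var scaler = GetComponent<CanvasScaler>();
        scaler.uiScaleMode = CanvasScaler.ScaleMode.ScaleWithScreenSize;
        scaler.referenceResolution = new Vector2(1080, 1920);
        scaler.matchWidthOrHeight = 0.5f; // balance width and height matching
    }
}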
I started building the scanning frame first. It was quite fun actually,
because it felt like I was framing the whole AR experience, giving a
proper boundary for all the things to appear inside.
Fig 4.1 Scanning Frame Building
Fig 4.2 Onboarding Page Video created and imported into Unity.
2. 3D Avatar Model
The original avatar model I used from Ready Player Me didn’t work
well—it had import errors and wouldn’t animate properly in Unity. So
I looked online and found a new model that looked like a “florist
girl,” which fit the theme better anyway.
I added a basic Animator Controller so the avatar could perform a
simple greeting pose like “Say Hello.” Even though it’s a small
animation, it made the AR experience feel a lot more alive and
interactive.
Fig 4.3 Avatar Import Settings - Profile
Fig 4.4 Avatar Import Settings - 3D Model
Fig 4.5 Animator Controller setup
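The greeting itself is just a trigger on the Animator. A minimal sketch, assuming the trigger parameter is named "Hello":

using UnityEngine;

// Sketch: the Animator Controller in Fig 4.5 has an idle state plus a
// greeting state; a trigger parameter (assumed here to be named "Hello")
// fires the transition, for example when the card is scanned.
[RequireComponent(typeof(Animator))]
public class AvatarGreeting : MonoBehaviour
{
    private static readonly int HelloTrigger = Animator.StringToHash("Hello");

    // Call this from the image target's "on found" event.
    public void SayHello()
    {
        GetComponent<Animator>().SetTrigger(HelloTrigger);
    }
}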
3. Contact and Others
After scanning the image target, the user can see the full
content: the avatar, some intro text, a short video, and some
clickable buttons. I also wanted the user to rotate the 3D
avatar, so I followed a YouTube tutorial to add a
drag-and-rotate function. It worked pretty well and made the
character feel more real.
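My drag-and-rotate boils down to something like this sketch (my own minimal version, not the tutorial's exact code):

using UnityEngine;

// Sketch: a one-finger horizontal drag spins the avatar around the world
// up axis. The rotation speed value is arbitrary.
public class DragRotate : MonoBehaviour
{
    [SerializeField] private float degreesPerPixel = 0.25f;

    private void Update()
    {
        if (Input.touchCount == 1) // one-finger drag on the phone
        {
            Touch touch = Input.GetTouch(0);
            if (touch.phase == TouchPhase.Moved)
                transform.Rotate(Vector3.up,
                                 -touch.deltaPosition.x * degreesPerPixel,
                                 Space.World);
        }
    }
}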
I also made a contact button that lets users open actual
links—like a website, WhatsApp, or Instagram. All of these are
real links to a florist shop in Johor Bahru called
Haru Nachii.
I thought this would make the AR experience feel more practical
and meaningful.
Fig 4.6 Full AR Scanned Context and avatar rotation setup
Fig 4.7 Contact buttons that open the linked sites
| Reflection
Experience |
At the beginning of the project, everything seemed to be progressing
quite smoothly—especially during the AR scanning setup. I was able to
successfully place all the AR markers and get them to scan and display
correctly without major issues. This gave me a lot of confidence at
the start, and I thought the rest of the implementation would go just
as smoothly. However, as I continued working, I encountered unexpected
technical challenges—especially when trying to add a “Pause” function
and dynamically update the status text between “Scanning” and
“Paused.” Despite multiple attempts and testing, I couldn’t get this
function to work as intended.
Findings |
While troubleshooting, I realized that there were limits to what I
could achieve with my current knowledge and logic. I
initially assumed that pausing AR scanning would be as simple as
toggling a UI element, but it turned out to be much more complex and,
in some cases, even unsupported depending on how the AR session is
managed. After several failed tries and hours spent searching for
tutorials, I reached out to a senior who had worked on a similar AR
assignment before. She patiently reviewed my code, pointed out
mistakes, and explained the logic behind certain bugs I encountered.
Her support was really valuable—not just technically, but emotionally
too, as I was starting to feel overwhelmed.
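For the record, the closest thing I later came across to a working "pause" is toggling the Vuforia behaviour on the AR Camera. A sketch of that commonly suggested approach, which I have not verified in every case:

using UnityEngine;
using UnityEngine.UI;
using Vuforia;

// Sketch of one commonly suggested approach (not verified in every case):
// disabling the VuforiaBehaviour on the AR Camera stops tracking, which
// behaves like a pause; re-enabling it resumes scanning.
public class ScanPauseToggle : MonoBehaviour
{
    [SerializeField] private Text statusText;

    public void TogglePause()
    {
        var vuforia = VuforiaBehaviour.Instance;
        vuforia.enabled = !vuforia.enabled;
        statusText.text = vuforia.enabled ? "Scanning" : "Paused";
    }
}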
Another challenge was building the UI menu in Unity. At first, I felt
lost navigating the Unity interface and understanding how to link UI
buttons to functions. Fortunately, my teammate Tze Wei and I teamed up
to watch tutorials on YouTube and work through the process together.
With collaboration and persistence, we were able to get the UI working
as planned.
What I Learned |
One of the biggest lessons I’ve taken from this project is learning to
accept imperfection. I tend to put a lot of pressure on myself to make
everything perfect, and when things don’t go according to plan, I feel
anxious or frustrated. But through this project, I learned that it’s
okay to change direction when something is too difficult to
implement—especially if the effort outweighs the result or the feature
isn’t essential. My senior told me something that really stuck with
me: “Don’t expect everything to be perfect. If it’s not working and
you’ve tried your best, move on.” That piece of advice helped me
reduce unnecessary stress and approach my work more realistically.
Technically, I also learned a lot about Unity’s UI system and how it
connects to AR interactions. Through trial, error, and teamwork, I now
better understand how to design interfaces that work smoothly within
an AR experience. I also realized the importance of asking for help,
collaborating with teammates, and learning from those who have more
experience.
Overall, this project not only improved my technical skills but also
helped me grow mentally—by teaching me how to manage expectations and
problem-solve under pressure.