Experiential Design - Task 1 / Trending Experience
TABLE OF CONTENTS
1. Lectures
2. Instructions
3. Task 1
4. Feedback
5. Reflection
LECTURES
Week 1:
This week, Mr. Razif briefed us on the module and showed us some
projects from previous students as examples. He also introduced AR, MR,
and VR. I seem to have developed some interest in AR, but I’m quite
afraid of coding.
AR/VR/MR Differences:
# (Less Immersive) AR > MR > VR (More Immersive)
- AR (Augmented Reality): Adds virtual things to the real world. You can see these things on your phone or AR glasses while still seeing the real world.
- MR (Mixed Reality): Combines the virtual and real world. You can touch and move virtual objects, and they act like they are in the real world (like a virtual object on a real table).
- VR (Virtual Reality): You are fully inside a virtual world. You wear a headset, and you only see the virtual world, not the real one.
Types of AR Experiences:
- Projection: Virtual images shown on real places, like a picture on a store window.
- HMD (Head-Mounted Displays): Glasses like Google Glass or HoloLens that show virtual things in front of your eyes.
- Mobile Devices: You use your phone or tablet to see virtual objects in the real world through the camera.
Designing for AR:
- Marker Based: Uses a picture or code (like a QR code) to make virtual objects appear. When you point your camera at the code, the virtual object shows up.
- Marker-less: Uses your phone’s sensors, like GPS or camera, to show virtual objects without needing a code.
Week 2:
User Mapping
1. Empathy Map:
Purpose: Understand the user’s mindset (thoughts, feelings, actions).
Usage: Build empathy and alignment within a team.
When to Use: Early in the design process or after user interviews.
2. Customer Journey Map:
Purpose: Visualize the steps a customer takes when interacting with a product or service.
Usage: Identify pain points and key touchpoints.
When to Use: Throughout the design process as a reference.
3. Experience Map:
Purpose: Understand a broader, general user experience that’s not tied to a specific product or service.
Usage: To understand general human behavior.
When to Use: Before creating customer journey maps.
4. Service Blueprint:
Purpose: Map out the internal processes and roles (frontstage/backstage actions) involved in delivering a service.
Usage: Identify internal weaknesses and opportunities for optimization.
When to Use: After creating customer journey maps and when improving internal processes.
Group Activity
This week, we had a group activity where we needed to choose a location
and create a customer journey map. Our group chose H&M in Mid
Valley.
Miro Board Link:
https://miro.com/app/board/uXjVLY-OD2I=/?share_link_id=527642068154
Mid Valley H&M Customer Journey Map
INSTRUCTIONS
<iframe
src="https://drive.google.com/file/d/1Rkgt26muLwSgtGRQtY1N5m0c6bKAgyV8/preview"
width="640" height="480" allow="autoplay"></iframe>
Task 1: Trending Experience
Requirement
"Students are given a series of exercises that explore
current, popular trends in the market to give them a better
understanding of the technologies and the knowledge in creating
content for those technologies. Students will conduct research
and experiments to find out features and limitations, which will
later allow them to decide which technologies they
should proceed with in their final project."
Week 1: Exercise 1
"Imagine the scenario in either of the two places. What would
the AR experience be, and what extended visualization could be
useful? What do you want the user to feel?"
Scenario: Shopping Mall
When I go to the shopping mall, my main goal is to buy
things. I need to know the prices, ingredients, expiration
dates, and other details about products to decide if I want
to buy them. Sometimes, it’s hard to find the price tags
because I have to turn the product around 360°, which is
really annoying. The expiration dates can also be hard to
read because the text is often very small.
If I could wear AR glasses, important information about
products could be shown as I look at them. This would help
me quickly and easily understand the details without
spending time reading the tiny text on the packaging. I
would also like to see online reviews of the products
through the glasses, which would help me decide whether to
buy them or not.
Fig. 1.0 AR shopping
Week 2: Experience Design & Marker-Based AR Experience
In week 2, we started using Unity. Although Mr. Razif briefly
taught us how to get started with Unity in class, I couldn't
keep up. However, after watching the
Marker-based AR experience (Tutorial) video, I learned how to create an AR object and add a video in
Unity.
First, we need to download Unity, and then register for
Vuforia and download the Vuforia package for Unity. After
that, we can create a database and set the image target.
# The image target's rating should not be below 3 stars; otherwise, it will be harder to recognize.
Fig. 2.0 Image target
In Unity, the cube must be a child of the image target so that
it only appears when the image target is scanned (refer to
Fig. 2.1).
- Ensure the image target is selected properly.
- Verify that the license key is input correctly (ARCamera > open Vuforia Configuration).
- In the Vuforia Configuration, "Track Device Pose" is to allow Vuforia to track the device's movements.
Fig. 2.1 The cube appears when the image target is
scanned
After I created a plane and added the video, the video played
even before the image target was scanned. To resolve this issue,
the settings shown in Fig. 2.2 need to be adjusted.
Fig. 2.2 Play the video after scanning the target, pause
it when the target is lost
# Pause: Stops temporarily, continues from the same spot.
# Stop: Stops completely, resets to the beginning.
Fig. 2.3 Final marker-based AR experience
Week 3: User Controls & UI (Buttons)
I watched
this video tutorial to complete this week's exercise. In Unity, each page needs
its own scene, and then you can add text by selecting
Add Canvas > Panel > Text (TMP) in the Unity
Hierarchy. I could only find the default font, Liberation Sans,
in my Unity, so I used it.
Fig. 3.0 Menu Scene
Fig. 3.1 Credit Scene
Fig. 3.2 AR Scene
Button Navigation Steps:
- Create Scripts Folder: In the Project window > Create > Folder > name: Scripts
- Create C# Script: In the Scripts folder > Create > C# Script > name: MySceneManager.
- Add Code: Open MySceneManager.cs > Fig. 3.3
- Button Navigation:
- Select the Canvas > Add Component > MySceneManager.
- In the button's On Click() section > MySceneManager.gotoScene > enter the name of the scene.
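The MySceneManager.cs code itself is shown in Fig. 3.3; as a rough sketch, a scene-switching script like the one described in the steps above usually amounts to a thin wrapper around Unity's SceneManager (the gotoScene name comes from the steps; everything else here is my assumption):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Sketch of a scene-switching helper like the MySceneManager described above.
public class MySceneManager : MonoBehaviour
{
    // Called from a button's On Click() with the target scene's name.
    public void gotoScene(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```

One thing to remember: every scene loaded this way has to be added to the Scenes In Build list (File > Build Settings), or LoadScene will fail at runtime.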
Cube Animation Steps:
- Select the Cube > open the Animation window > click Create (name: Cube_Hover).
- Record the up-and-down movement for the Cube.
- Create a new animation clip (name: Cube_Idle).
- Keep the Cube stationary (no movement).
Fig. 3.4 Make the cube's up & down animation
I had created a Play button and a Stop button earlier.
In the On Click() section for the Play button, I
assigned the Cube and used Animator.Play to play
Cube_Hover (moving). For the Stop button, I assigned
the Cube and used Animator.Play to play Cube_Idle (not
moving). (Fig. 3.5 & Fig. 3.6)
Fig. 3.5 Play Button
Fig. 3.6 Stop Button
When the image target was detected, the Cube started
moving straight away. To ensure it remains stationary
initially, I needed to set Cube_Idle as the Layer
Default State in the Animator.
Fig. 3.7 Make Cube_Idle play first
Fig. 3.8 Final Result
Week 4: Markerless AR Experience
This week, I learned how to make a markerless AR experience
(tutorial video). It doesn’t need an image target—just detecting the ground
or any flat surface to place our 3D object. But I ran into
trouble connecting my device (my phone) even though I turned
on Developer Mode and USB debugging. Plus, my phone doesn’t
support AR. So, I tried using my mum’s phone, and it
worked—I could connect and install the APK. However, the
background was just grey, and I couldn’t see the real world
through the camera (Fig. 4.0). I felt really
frustrated. Later, I got it working on my dad’s
phone, so I think it was just a problem with my mum's
phone.
Fig. 4.0 Grey background, no camera view
Steps to Create a Marker-less AR Experience:
- Import the Vuforia packages.
- Add the Vuforia AR Camera (and enter the Vuforia license key).
- Add the Plane Finder & Ground Plane Stage.
- Drag the Ground Plane Stage into the "Anchors Stage" field in the Plane Finder.
- Enable or disable the "Duplicate Stage" option depending on whether you want to spawn multiple objects or a single object.
- Create the 3D objects as children of the Ground Plane Stage.
- Build Settings > Switch to Android > configure the Player Settings > Build and Run.
Player Settings For Android:
- Disable Auto Graphics API.
- Remove Vulkan from the list of graphics APIs.
- Set the Minimum API Level to your device’s Android version.
- Set Scripting Backend: Mono > IL2CPP.
- Set Target Architecture: ARMv7 > ARM64.
Exporting to iOS:
- Build Settings > Switch to iOS > configure the Player Settings > Build
- Xcode: Open Existing Project.
- Sign the App: Set up signing capabilities with your Apple ID.
- Deploy to Device: Connect your iOS device and run the app.
Player Settings For iOS:
- Resolution and Presentation > Enable Render Over Native UI.
- Other Settings > Disable Metal API Validation.
- Provide a Camera Usage Description for the AR camera (compulsory).
- Set the Target minimum iOS Version: iOS 16.
It's best to export your AR project to your phone,
because only the phone can detect the ground and
depth properly. For a quick check, you can test in
the editor using the ground plane emulator
(Fig. 4.1), which acts like an image marker. This
way, you can verify that everything works through a
webcam without needing the actual AR features.
Fig. 4.1 Emulator ground plane
Fig. 4.2 Export to phone
Week 5:
This week, I learned how to use a single button to
control a cube's animation. When you press the button
once, the animation starts, and if you press it again,
the animation pauses.
Besides using a button, you can also add a toggle
directly to the object. This way, you can control the
animation just by clicking on the object itself (refer
to the script in Fig. 5.0). Just make sure the
GameObject has a Collider component, like a BoxCollider
or SphereCollider, so it can detect mouse clicks.
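As a sketch of that click-to-toggle idea (the class name AnimationToggle is my own; the state names reuse Cube_Hover and Cube_Idle from Week 3), the script in Fig. 5.0 could look roughly like this:

```csharp
using UnityEngine;

// Hypothetical sketch: click the object itself to toggle its animation.
// The GameObject needs a Collider (e.g. BoxCollider) for OnMouseDown to fire.
public class AnimationToggle : MonoBehaviour
{
    private Animator animator;
    private bool isMoving = false;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void OnMouseDown()
    {
        isMoving = !isMoving;
        // Switch between the moving and idle animation states.
        animator.Play(isMoving ? "Cube_Hover" : "Cube_Idle");
    }
}
```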
If you want to control a video instead of an animation,
you can create a new script called VideoToggle (refer to
the script in Fig. 5.1). In this script, you will write
a function that plays and pauses the video.
Fig. 5.1 Script for video toggle
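The real script is the one in Fig. 5.1; a minimal sketch of such a VideoToggle, assuming Unity's built-in VideoPlayer component, might be:

```csharp
using UnityEngine;
using UnityEngine.Video;

// Hypothetical sketch: a single function that plays the video if it is
// paused and pauses it if it is playing.
public class VideoToggle : MonoBehaviour
{
    public VideoPlayer videoPlayer; // assign in the Inspector

    // Hook this up to a button's On Click(), or call it from OnMouseDown().
    public void ToggleVideo()
    {
        if (videoPlayer.isPlaying)
            videoPlayer.Pause();
        else
            videoPlayer.Play();
    }
}
```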
Fig. 5.2 Final outcome
Week 6:
Our task this week is to create a small virtual gallery that
expands into a larger gallery when clicked. There will also
be a "details" button that displays more information when
clicked.
Steps to Create an Exhibition Room:
- Import ProBuilder:
- Go to Window > Package Manager > Unity Registry > ProBuilder.
- Open ProBuilder Window via Tools > ProBuilder.
- Create the Room_Scale:
- Use a Plane for the floor.
- Use a Cube for the walls.
- Add Exhibition Images, Details Button, and Text:
- I added some of my photography in there, so it’s like I’m having my own little photo exhibition, haha! We just need to write a script to make the text pop up or disappear when we click the button.
- Create the Room_Mini:
- Simply duplicate Room_Scale and scale it down.
- Create a Room Toggle Script (Fig. 6.1):
- Attach the script to Mini Room.
- Add a Box Collider and Rigidbody (set Is Kinematic and disable Use Gravity).
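For the details button mentioned in the steps above, the pop-up text can be driven by just a few lines of script; this sketch (DetailsToggle and its field are hypothetical names, not the exact script I used) shows the idea:

```csharp
using UnityEngine;

// Hypothetical sketch: show or hide the details text when a button is clicked.
public class DetailsToggle : MonoBehaviour
{
    public GameObject detailsText; // assign the Text (TMP) object in the Inspector

    // Hook this up to the details button's On Click() event.
    public void ToggleDetails()
    {
        detailsText.SetActive(!detailsText.activeSelf);
    }
}
```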
Fig. 6.0 Create the room
Fig. 6.1 Room Toggle script
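The actual Room Toggle script is shown in Fig. 6.1; as a rough sketch, expanding the mini gallery into the full-size one on click could work like this (class and field names are my assumptions):

```csharp
using UnityEngine;

// Hypothetical sketch: clicking the mini room hides it and reveals the
// full-scale room. Attach to Room_Mini, which needs the Box Collider
// (and kinematic Rigidbody) from the steps above to receive clicks.
public class RoomToggle : MonoBehaviour
{
    public GameObject roomScale; // the full-size room
    public GameObject roomMini;  // the miniature room (this object)

    void OnMouseDown()
    {
        roomScale.SetActive(true);
        roomMini.SetActive(false);
    }
}
```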
For the final AR gallery, I just used some Lorem Ipsum
for the details text, so it doesn't really mean
anything. The photos are ones I took with my phone
before. I also added a button in the top left corner to
go to the room scale. But it looks like the button is
too close to the edge on the phone, so it's hard to
see.
Fig. 6.2 Final AR Gallery
Week 10:
This week, I learned how to import 3D models into
Unity and assign animations to them. First, I used the
website
readyplayer.me
to create my own character and downloaded it as a GLB
file. However, since my Unity couldn't import the
GLTFUtility package (from
GitHub), I was unable to view the 3D model directly.
To resolve this, I used the site
Avaturn
to convert the GLB file to an FBX format. After the
conversion, I could import the FBX model into Unity,
along with both the FBX 3D object and a corresponding
.fbm texture folder. This allowed me to drag the
model into the scene and view it.
Once the model was in Unity, I rigged it by
changing the Rig type from
Generic to Humanoid in the Inspector window,
then clicked Apply. This step allowed me to use
humanoid animations for my character.
Additionally, Mr. Razif introduced
Mixamo
to us, which contains a large collection of 3D
animations. You can upload your own character model
(in FBX format) and download various animations for it
in FBX format, suitable for Unity. In Unity, I learned
how to create animations and use the Animator to
adjust the order of these animations. I can also use
buttons to control the different animation sequences
for the character.
Fig. 7.0 Avatar Animation Exercise
Week 11:
This week, I learned how to control model and colour
changes using scripts in Unity. I created a script to
cycle through different models when clicked and another
script to change the material of the model.
Fig. 8.0 Model Switcher Script
Fig. 8.1 Colour Switcher Script
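The exact scripts are in Fig. 8.0 and Fig. 8.1; as a sketch, the model-cycling idea boils down to deactivating the current model and activating the next one (the names here are my assumptions):

```csharp
using UnityEngine;

// Hypothetical sketch: cycle through model variants on each click.
// Attach to a parent object with a Collider so OnMouseDown fires.
public class ModelSwitcher : MonoBehaviour
{
    public GameObject[] models; // assign the model variants in the Inspector
    private int current = 0;

    void OnMouseDown()
    {
        models[current].SetActive(false);
        current = (current + 1) % models.Length; // wrap back to the first model
        models[current].SetActive(true);
    }
}
```

The colour switcher works the same way, except it cycles through an array of Materials and assigns them to the model's Renderer instead of toggling GameObjects.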
Fig. 8.2 Change Model & Colour Exercise
3 Ideas for AR Project
Over the past several weeks, ideas would sometimes come to me
suddenly, and I jotted them down. I'm particularly interested
in the following six:
1. Restaurant Menu
Imagine an app where you can see 3D models of dishes from a
restaurant menu before you order. You can customize the
ingredients and portion sizes in real-time, so you get to
see exactly how your meal will look before it arrives. It
makes ordering food a lot more fun and gives you a better
idea of what you’re getting!
2. Furniture Placement
This AR app helps you picture how furniture will look in
your home before you buy it. You can choose different
pieces, resize them, and try them out in your room. It gives
you a realistic preview of how everything fits together,
making it easier to decide what to purchase.
3. Chemical Elements Education
This educational AR experience lets you explore the world of
chemistry. You can scan cards of different molecules (like
H₂ and O₂) to see their 3D models. When you bring the cards
together, you can watch a cool visual of a chemical
reaction, like how water is formed, along with explanations
of what’s happening.
4. Heart Anatomy Education
With this AR app, you can dive into the human heart's
anatomy. You’ll see a detailed 3D model and can interact
with different parts to learn about how they work, like the
ventricles and arteries.
5. Photo Frame Placement in Real Size
This AR tool helps you visualize photo frames in your home
before buying them. By scanning your room, you can place
virtual frames on your walls to see how they look and fit in
various styles. It’s a fun way to design your space with
confidence!
6. Water Cycle Education
This AR app shows you how the water cycle works in a fun and
engaging way. You can explore each stage—like evaporation,
condensation, precipitation, and collection—through animated
3D models. It makes learning about water circulation an
immersive experience!
After weighing how difficult each project would be to create
against how interesting it is, I decided to choose the
restaurant menu, chemical elements education, and furniture
placement as my three ideas.
Fig. 9.0 Three Ideas for AR Project