Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR) are often used interchangeably, which can be confusing—even among industry professionals. As the technology evolves, the lines between these terms continue to blur, with new concepts like XR emerging. But understanding the differences is key to appreciating how each experience works and what it offers.
Jimmy B. Nguyen, a product designer at The Knot who has worked on VR applications for HTC Vive, shared his insights into these technologies on Medium. Here's a compilation of his thoughts and experiences:
**What is VR?**
Virtual reality refers to a computer-generated 3D environment where users can explore and interact. At its core, VR transports you into a completely different world. Think of it like being fully immersed in a novel—except instead of imagining the story, you're part of it.
Not all VR headsets are the same. The main difference lies in the degrees of freedom (DoF). A 3DoF headset tracks only head rotation in three axes (pitch, roll, yaw), not body movement. Devices like Google Cardboard or Samsung Gear VR fall into this category. In contrast, 6DoF headsets, such as the HTC Vive or Oculus Rift, track both rotation and position in space, allowing you to walk around and interact naturally within the virtual world.
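The 3DoF/6DoF distinction boils down to what the headset reports each frame. A minimal sketch (illustrative only; the class and field names here are hypothetical, not any real SDK's API) might model the two pose types like this:

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    # Rotation only: the headset knows where you are looking,
    # but not where you are standing (Cardboard, Gear VR).
    pitch: float  # degrees, look up/down
    yaw: float    # degrees, look left/right
    roll: float   # degrees, head tilt

@dataclass
class Pose6DoF(Pose3DoF):
    # Adds positional tracking: physically walking changes x/y/z
    # (HTC Vive, Oculus Rift).
    x: float = 0.0  # metres
    y: float = 0.0
    z: float = 0.0

seated = Pose3DoF(pitch=0.0, yaw=90.0, roll=0.0)
room_scale = Pose6DoF(pitch=0.0, yaw=90.0, roll=0.0, x=1.2, y=1.6, z=-0.5)
```

With only a `Pose3DoF`, leaning or stepping sideways cannot change the rendered view, which is why 3DoF devices suit seated, look-around experiences while 6DoF devices enable room-scale interaction.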
**What is AR?**
Augmented Reality overlays digital elements onto the real world, creating a hybrid visual experience. There are two main types: low-fidelity AR and high-fidelity AR. Pokémon Go is a classic example of low-fidelity AR—it simply places virtual creatures on top of the camera feed without considering the environment. High-fidelity AR, on the other hand, places digital objects in specific locations in the real world, making them appear more realistic and interactive.
**What is Mixed Reality (MR)?**
Mixed Reality, often associated with Microsoft HoloLens, takes AR a step further by allowing digital objects to interact with the physical environment. For instance, if a virtual duck hides behind a lamppost in an MR experience, you'd have to physically move around the object to see it, just like in the real world. Apple’s ARKit also supports similar capabilities, blurring the line between AR and MR even further.
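The duck-behind-the-lamppost behaviour comes down to occlusion: the device senses the depth of real surfaces and hides any virtual pixel that sits behind one. A simplified per-pixel sketch (an illustration of the idea, not how HoloLens or ARKit actually implements it; the function name is hypothetical):

```python
def occlusion_mask(virtual_depth, real_depth):
    """Per-pixel depth test: keep a virtual pixel only where it is
    nearer to the viewer than the reconstructed real-world surface."""
    return [
        [v < r for v, r in zip(v_row, r_row)]
        for v_row, r_row in zip(virtual_depth, real_depth)
    ]

# A virtual duck rendered 3 m away; a real lamppost covers the two
# left pixels at 2 m, while the right pixel sees open space at 9 m.
duck = [[3.0, 3.0, 3.0]]
scene = [[2.0, 2.0, 9.0]]  # depth map from the device's environment sensing
mask = occlusion_mask(duck, scene)  # [[False, False, True]]
```

Only the rightmost pixel of the duck is drawn, so the user must physically walk around the lamppost to reveal the rest, just as Nguyen describes.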
**Lessons I Learned from VR Development**
Over the years, I’ve developed several VR projects and demonstrated them to many users. These experiences taught me some important lessons that might help others designing for VR:
**Lesson 1: Nobody Looks at the Text**
In my first VR project, Easy Chef, we included text instructions to guide users through cooking steps. But when we tested it, no one read the text. People were too distracted by the immersive environment. Instead of relying on text, I learned that spatial audio and voice cues work much better.
**Lesson 2: Users Need a Learning Curve**
The HTC Vive controllers aren’t intuitive at first. Many users don’t look down at their hands initially, so I added visual cues and tool descriptions. A clear onboarding process is essential to avoid confusion.
**Lesson 3: Pay Attention to the User’s Starting Point**
Just like the "first fold" concept in web design, VR needs a natural entry point. In my rhythm game Convokation, I had to position the action so users could easily follow the gameplay without getting lost in the environment.
**Lesson 4: Experiment Continuously**
VR is still in its early stages, so there's a lot of room for experimentation. My last project, Model VRoom, allowed users to design rooms in VR. We tried different menu layouts and found that a floating interface, while initially controversial, was actually well-received.
As Mike Alger, a VR designer at Google, said: "We do VR because we can create worlds and stories that others can explore—and in doing so, we understand ourselves better." This sentiment really resonates with me.
**Summary**
Jimmy B. Nguyen’s insights highlight the evolving nature of VR design. While his distinction between 3DoF and 6DoF helps clarify the differences between mobile and PC-based VR, the line between AR and MR remains blurry. With companies like Apple now supporting high-fidelity AR, the future of these technologies will likely continue to evolve in exciting ways.