Gyroscope - 3D Modeling
Turn Your Phone into a Magic Wand: Web-Based 3D Gyroscope Controller
Greetings! I'm Yunus. Today I want to share my new project, one I've been working on with great pleasure: a Web-Based 3D Gyroscope Controller that connects the physical world and the digital environment in real time.
As you know, I enjoy thinking about system architectures and interactive web experiences. My goal in this project was simple: to be able to rotate a 3D model (a car or a phone) on my computer screen in space as if it were in my own hands, simply by scanning a QR code with my phone's camera, without downloading any application.
Sounds simple, right? However, when different axis spaces, live socket connections, multiple user sessions, and "deployment" processes came into play, bending the laws of physics in the backend became a little more complex than I expected. Let's take a closer look at how I built this system and the stages I went through.
🛠️ Tech Stack
To keep the project modern, scalable, and type-safe, I used the following technologies:
Frontend: Vite, TypeScript, Three.js, TailwindCSS
Backend: Node.js, Express.js, Socket.IO
Deployment: Vercel (Client), Render (API)
Design Language: Spatial Computing & Minimalist Industrial (Frosted glass effects, monochrome tones)
🚀 Development Stages and Challenges I Encountered
This project didn't happen overnight. I had to build the architecture step by step and in great detail to establish a solid foundation for the system.
Stage 1: Prototyping and the HTTP Impasse
Initially, I had set everything up in a single folder with simple HTML and JavaScript files. When I brought the server up and connected from my phone, I faced a harsh reality: iOS and modern mobile browsers completely block access to gyroscope (DeviceOrientationEvent) data over plain HTTP connections for security reasons. To test the system, I used Vite's @vitejs/plugin-basic-ssl plugin to automatically generate HTTPS certificates, giving me a secure tunnel on the local network in the development environment (localhost). This allowed me to stream the first data to the computer.
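For reference, the relevant Vite setup looks roughly like this (a sketch, not the exact file: the plugin generates a self-signed certificate for local HTTPS, and `host: true` exposes the dev server on the LAN so the phone can reach it):

```typescript
// vite.config.ts — sketch; @vitejs/plugin-basic-ssl generates a
// self-signed certificate so the dev server speaks HTTPS locally.
import { defineConfig } from "vite";
import basicSsl from "@vitejs/plugin-basic-ssl";

export default defineConfig({
  plugins: [basicSsl()],
  server: {
    host: true, // expose on the local network so the phone can connect
  },
});
```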
Stage 2: Deconstructing the Architecture and Switching to TypeScript
As the project grew, the old require (CommonJS) structure and global variables started causing headaches. Models sometimes spun on their own even without gyroscope data, and port 3000 conflicts would crash the server.
I took a deep breath and refactored the project from scratch using TypeScript. I divided the system into two isolated worlds: src/client and src/server. I made Socket.IO events type-safe using Interfaces. Now the server doesn't crash; it dynamically finds an available port and starts up again.
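A minimal sketch of both ideas follows. The interface names and the port-probing helper are my illustrative assumptions, not the project's exact code; with socket.io, the event maps can be passed as generics to `Server` and `Socket` so payloads are checked at compile time.

```typescript
import { createServer, type Server } from "node:net";

// Typed Socket.IO event maps — passed as generics so that every
// emit/on call site is type-checked (names are hypothetical).
interface ClientToServerEvents {
  orientation: (data: { alpha: number; beta: number; gamma: number }) => void;
}
interface ServerToClientEvents {
  rotate: (data: { alpha: number; beta: number; gamma: number }) => void;
}

// Probe ports upward from `preferred` until a free one is found,
// so a conflict on 3000 no longer crashes the server.
function findFreePort(preferred: number, maxTries = 10): Promise<number> {
  return new Promise((resolve, reject) => {
    const attempt = (port: number, left: number): void => {
      const probe: Server = createServer();
      probe.once("error", () => {
        if (left <= 0) reject(new Error("no free port found"));
        else attempt(port + 1, left - 1);
      });
      probe.once("listening", () => probe.close(() => resolve(port)));
      probe.listen(port);
    };
    attempt(preferred, maxTries);
  });
}
```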
Stage 3: The Mathematics of the 3D World (Three.js Axes)
This was the part where I spent the most time (and consumed the most coffee). I would tilt the phone forward, and the model on the screen would rotate to the right!
This was because the reference system of the Euler angles (Alpha, Beta, Gamma) produced by the phone was completely different from the Y-Up (Right-Handed) coordinate system used by Three.js. To resolve this discrepancy, I created a custom axis mapping on THREE.Euler. Where necessary, I used radian transformations and negative multipliers to prevent reverse movement, making the model a perfect reflection of the phone.
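The mapping idea can be sketched like this. Note that the exact axis order and sign flips depend on the device and the scene; this particular assignment is an illustrative assumption, not the only correct one. In the real code the result feeds `new THREE.Euler(x, y, z, order)`.

```typescript
// Convert DeviceOrientationEvent angles (degrees) into a Three.js-style
// Euler rotation (radians). Axis order and signs here are assumptions.
const DEG2RAD = Math.PI / 180;

interface DeviceAngles { alpha: number; beta: number; gamma: number }

function mapOrientationToEuler({ alpha, beta, gamma }: DeviceAngles) {
  return {
    x: beta * DEG2RAD,   // front/back tilt → pitch
    y: alpha * DEG2RAD,  // compass heading → yaw
    z: -gamma * DEG2RAD, // left/right tilt → roll (negated to avoid mirroring)
    order: "YXZ" as const, // apply yaw first, matching device conventions
  };
}
```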
I also wrote an auto-scaling and centering algorithm to calculate the Bounding Box of the model entering the scene so that the different .glb models (some gigantic, some tiny) loaded would appear standard on the screen.
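The fitting logic can be sketched independently of Three.js. In the real code the box comes from `new THREE.Box3().setFromObject(model)`; the target size below is an assumed constant.

```typescript
interface Vec3 { x: number; y: number; z: number }
interface Box3Like { min: Vec3; max: Vec3 }

// Compute a uniform scale that fits the box into `targetSize` world units,
// plus the translation that recenters the scaled model at the origin.
function fitToScene(box: Box3Like, targetSize = 2): { scale: number; offset: Vec3 } {
  const size = {
    x: box.max.x - box.min.x,
    y: box.max.y - box.min.y,
    z: box.max.z - box.min.z,
  };
  const maxDim = Math.max(size.x, size.y, size.z) || 1; // avoid division by zero
  const scale = targetSize / maxDim;
  const center = {
    x: box.min.x + size.x / 2,
    y: box.min.y + size.y / 2,
    z: box.min.z + size.z / 2,
  };
  return {
    scale,
    offset: { x: -center.x * scale, y: -center.y * scale, z: -center.z * scale },
  };
}
```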
Stage 4: Deployment Hell
Complete chaos ensued when deploying the project to Vercel and Render.
The Vercel side froze in an infinite loop during the transforming... phase while trying to compile the TypeScript code of the backend.
The Render side crashed due to a Rollup compilation error (@rollup/rollup-linux-x64-gnu not found) because it was a Linux machine and I had created a package-lock.json file on Windows.
Solution: I sharply separated the monolithic structure at the deployment stage. For Vercel, I wrote a vercel.json rule that ignores the backend entirely. For Render, I disabled the vite build process completely and brought the server live directly with node --import tsx.
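Conceptually, the Vercel rule was along these lines (a hedged sketch — the config file name and paths are assumptions that depend on the project layout; only the client is built and served):

```json
{
  "buildCommand": "vite build --config vite.client.config.ts",
  "outputDirectory": "dist/client"
}
```

On the Render side, the start command skips bundling entirely, something like `node --import tsx src/server/index.ts` (the entry path here is hypothetical).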
Stage 5: Interface and Multiple Session Management (Rooms)
When I brought the system online, a new problem arose: If everyone was listening to the same channel, my phone would rotate the car on someone else's screen.
To solve this, I implemented a Room-Based Session architecture. When the desktop interface opens, it creates a unique UUID and embeds it in a QR code. When you scan this QR code with your phone camera, you join that specific room via the URL. This way, data flows only between your devices.
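A sketch of the session setup (function and route names are my illustrative assumptions): the desktop mints a UUID, encodes a join URL into the QR code, and on the server side both sockets join that room so orientation data is relayed only within it.

```typescript
import { randomUUID } from "node:crypto";

// Desktop side: mint a room id and the URL the QR code will encode.
// `baseUrl` stands in for the deployed mobile client address.
function createSession(baseUrl: string): { room: string; joinUrl: string } {
  const room = randomUUID();
  return { room, joinUrl: `${baseUrl}/mobile?room=${encodeURIComponent(room)}` };
}

// Server side (socket.io), conceptually:
// io.on("connection", (socket) => {
//   const room = String(socket.handshake.query.room);
//   socket.join(room);
//   socket.on("orientation", (data) => socket.to(room).emit("rotate", data));
// });
```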
As a final touch, I redesigned the interface to conform to Spatial Computing trends; using frosted glass effects (Glassmorphism), clean, thin typography, and massive monospace fonts for data flow. I also added a switch to the mobile interface, offering control options via Touchpad in addition to gyroscope.
Result
This was far more than simply connecting one device to another: synchronizing live hardware data with a 3D graphics engine over WebSockets was a tremendous experience. Combining the aesthetics of the frontend with the rigid logic of the backend is genuinely exciting, and it shows how far the boundaries of a web browser can be pushed.
If you want to examine the code of this project in detail or upload and test your own models, you can check out my GitHub repository. See you in new projects, pushing the boundaries even further!
Yunus.