

Iddo Shoikhet

Game programming student at Breda University of Applied Sciences

Custom VR Engine

Where it started


For the second half of my second year of university, we were tasked with creating custom engines for specific types of games.

 

The project spanned two quarters: in the first, we built the engine itself in small, programmer-only groups; in the final quarter of the year, artists and designers joined the team to use the engine to make a game.

I was placed in a team building an engine for VR games, where I worked on several different systems: audio, events, serialization, reflection, and the editor.

(If the teachers deemed an engine not good enough to move on to the final quarter, it was to be replaced with an Unreal Engine game project.)

Mantis Engine interface

Where it stands

My contributions to the Mantis engine are as follows (the codebase itself is, unfortunately, in a private university repository).

The engine template has no built-in runtime reflection system, so I built a reflection layer on top of EnTT's meta_factory. It allows any system to discover component types, their fields, and associated functions at runtime without knowing the concrete type at compile time. The core of the system is a set of registration macros that wrap the complex EnTT Meta API into single-line calls for a simple user experience.

START_REGISTER handles the common setup, registering a type with its name, a getter, and a factory function.

REGISTER_DATA and REGISTER_FUNCTION expose individual fields and methods to the runtime database.

All registrations live in one central function that runs at startup. Once registered, the data flows into two systems: the scene serializer uses the reflected has, get, and serialize functions to walk every component on an entity and save or load it without per-type code, and the Maya bridge uses the registered reference functions and field names to push gameplay components between Maya and the engine. The macro approach keeps registration readable and explicit: you can see at a glance exactly what's exposed.
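To illustrate the shape of this registration pattern, here is a minimal std-only sketch. It is not the real EnTT-backed implementation (which is private); the macro names START_REGISTER and REGISTER_DATA come from the page, but the Audio3D fields, the double-based accessors, and the Registry map are hypothetical stand-ins.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

// Simplified stand-in for the EnTT meta registry: one entry per type,
// each holding named field accessors that systems can discover at runtime.
struct FieldInfo {
    std::string name;
    std::function<double(void*)> get;        // read a field as double (sketch only)
    std::function<void(void*, double)> set;  // write a field
};

struct TypeInfo {
    std::string name;
    std::vector<FieldInfo> fields;
};

inline std::unordered_map<std::string, TypeInfo>& Registry() {
    static std::unordered_map<std::string, TypeInfo> r;
    return r;
}

// Macro wrappers in the spirit of START_REGISTER / REGISTER_DATA:
// a single line per type or field at the registration site.
#define START_REGISTER(Type) \
    Registry()[#Type] = TypeInfo{#Type, {}}

#define REGISTER_DATA(Type, Field)                                        \
    Registry()[#Type].fields.push_back(FieldInfo{                         \
        #Field,                                                           \
        [](void* obj) { return double(static_cast<Type*>(obj)->Field); }, \
        [](void* obj, double v) {                                         \
            static_cast<Type*>(obj)->Field = decltype(Type{}.Field)(v);   \
        }})

// Hypothetical example component.
struct Audio3D {
    float volume = 1.0f;
    float coneAngle = 45.0f;
};

// One central registration function that runs at startup.
inline void RegisterAllComponents() {
    START_REGISTER(Audio3D);
    REGISTER_DATA(Audio3D, volume);
    REGISTER_DATA(Audio3D, coneAngle);
}
```

The payoff of this shape is that a serializer or editor can iterate Registry() and read or write any registered field by name, without compile-time knowledge of the concrete type.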

Reflection System
Audio System (FMOD)

The 3D audio system is built on top of FMOD and integrated into the engine's ECS. Each entity can carry an Audio3D component that holds multiple named audio sources, each with its own playback state, volume, and directional cone. Every frame the system syncs the listener's position and orientation from the Listener3D component to FMOD, then iterates every Audio3D entity to update its position and cone direction.

 

Cone direction can either be locked to an entity (pointing toward that entity if it's a different one, or using its own forward vector if locked to itself) or driven by manual yaw/pitch offsets relative to the entity's transform, with an opt-out flag for ignoring the parent's rotation. Playback is managed through a state machine (NotPlaying → StartPlay → Playing → Pause → Paused → Resume) that maps to FMOD channel operations. In debug builds, a renderer draws the inner and outer cone angles in the viewport so designers can visualize the directional audio spread.
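The state machine above can be sketched as a small transition function. This is a hedged reconstruction, not the engine's code: the transient states (StartPlay, Pause, Resume) each map to an FMOD channel operation (stubbed here as comments) and then settle into a stable state.

```cpp
#include <cassert>

// Playback states as described on the page.
enum class PlayState { NotPlaying, StartPlay, Playing, Pause, Paused, Resume };

// Hypothetical per-source update: transient states perform their channel
// operation and return the settled state; stable states pass through.
inline PlayState Tick(PlayState s) {
    switch (s) {
        case PlayState::StartPlay:
            // system->playSound(...); channel->setPaused(false);
            return PlayState::Playing;
        case PlayState::Pause:
            // channel->setPaused(true);
            return PlayState::Paused;
        case PlayState::Resume:
            // channel->setPaused(false);
            return PlayState::Playing;
        default:
            return s;  // NotPlaying, Playing, Paused are stable
    }
}
```

Keeping the FMOD calls confined to the transient states means the rest of the engine only ever flips the requested state on the component and lets the audio update do the work.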

Soundtrack Queueing & Playback

The audio system supports sequential soundtrack queueing through AddToQueue. Each queued track is paired with a channel to wait for; by default it chains to the last queued track (or whatever is currently playing), so tracks play back-to-back automatically. You can also pass a specific channel ID to wait for, which pauses that channel and triggers the queued track when it finishes. This makes it easy to set up an intro that transitions into a gameplay loop: queue the intro, queue the loop after it, and the system handles the handoff. Tracks flagged as music auto-loop when there's nothing left in the queue.
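A minimal sketch of the chaining logic, assuming integer track and channel IDs (the real system uses FMOD channels; AddToQueue is the page's name, everything else here is hypothetical):

```cpp
#include <cassert>
#include <deque>
#include <optional>

// Each queued track records which channel must finish before it starts.
struct QueuedTrack {
    int trackId;
    int waitForChannel;
};

struct SoundtrackQueue {
    int currentChannel = -1;     // channel of the currently playing track
    int lastQueuedChannel = -1;  // channel the newest queued track will use
    std::deque<QueuedTrack> queue;

    // Default: chain behind the last queued track, or whatever is playing.
    // A specific channel ID can also be passed in explicitly.
    void AddToQueue(int trackId, std::optional<int> waitChannel = std::nullopt) {
        int wait = waitChannel.value_or(
            lastQueuedChannel != -1 ? lastQueuedChannel : currentChannel);
        queue.push_back({trackId, wait});
        lastQueuedChannel = trackId;  // sketch: one channel per track
    }

    // Called when a channel finishes: start the track waiting on it, if any.
    std::optional<int> OnChannelFinished(int channel) {
        if (!queue.empty() && queue.front().waitForChannel == channel) {
            int next = queue.front().trackId;
            queue.pop_front();
            currentChannel = next;
            return next;  // caller starts playback on the audio backend
        }
        return std::nullopt;
    }
};
```

The intro-into-loop example from above is then just two AddToQueue calls with no explicit channel argument.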

Asset browser

The asset browser is an ImGui-based content panel that lets designers navigate the project's file system, preview assets, and drop them into the scene. It renders a grid of folder and file icons using an ImGui table, with configurable thumbnail size and padding via a settings popup. Navigating into directories updates the view in place, and a back button walks up the path. Double-clicking a file instantiates it directly: .glb and .gltf models are spawned as new entities positioned in front of the active camera, while .bee scene files are deserialized into the current registry.

The key feature is the drag-and-drop pipeline. Any file in the browser can be dragged out as an ASSET_LOAD payload. To catch drops anywhere in the viewport, not just inside the browser panel, the system draws an invisible full-screen overlay window each frame that only becomes interactive when a drag is active. This means an artist can drag a model from the content browser and drop it directly into the scene viewport without needing a specific drop zone.

Serialization System

The engine has two serialization systems that complement each other. The first is a compile-time JSON serializer built on visit_struct. Any struct marked with the VISITABLE_STRUCT macro can be serialized and deserialized automatically: the serializer walks each member by name and handles type dispatch for primitives, glm vectors, strings, enums (stored as their string names via magic_enum), std::vectors, and nested visitable structs. This is used for standalone data like configuration or asset metadata where you want a quick, type-safe round-trip to a JSON file without any manual save/load code.
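The visit_struct pattern can be demonstrated with a hand-rolled stand-in. This sketch is not the real serializer: the VISITABLE_2 macro, the AudioConfig struct, and the two-type dispatch are all simplified illustrations of how field visitation by name yields automatic JSON output.

```cpp
#include <sstream>
#include <string>
#include <cassert>

// Stand-in for visit_struct: the macro generates a visit_fields() that
// calls f(name, member) for each listed field.
#define VISITABLE_2(Type, a, b)                       \
    template <class F>                                \
    friend void visit_fields(const Type& t, F&& f) {  \
        f(#a, t.a);                                   \
        f(#b, t.b);                                   \
    }

// Hypothetical example: a small config struct.
struct AudioConfig {
    float masterVolume = 0.8f;
    std::string device = "default";
    VISITABLE_2(AudioConfig, masterVolume, device)
};

// Type dispatch kept tiny: numbers and strings only in this sketch.
inline void WriteJson(std::ostream& os, float v) { os << v; }
inline void WriteJson(std::ostream& os, const std::string& v) {
    os << '"' << v << '"';
}

// Generic serializer: walks every visitable field by name.
template <class T>
std::string ToJson(const T& value) {
    std::ostringstream os;
    os << '{';
    bool first = true;
    visit_fields(value, [&](const char* name, const auto& field) {
        if (!first) os << ',';
        first = false;
        os << '"' << name << "\":";
        WriteJson(os, field);
    });
    os << '}';
    return os.str();
}
```

Adding a new serializable type is then one macro invocation; no per-type save/load code is written, which is the property the engine's serializer relies on.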

The second is the scene serializer, which operates at runtime using the EnTT Meta reflection system. Rather than relying on compile-time struct visitation, it calls entt::resolve() to discover all registered component types, checks whether each entity has a given component through the reflected has function, and invokes serialize or save through the meta API. This means any component registered with the reflection macros is automatically included in scene saves. Deserialization runs in two passes: the first creates all entities by UUID so cross-entity references are valid; the second rewinds and emplaces and deserializes components by resolving their meta type from the node name. The event system connector is also serialized as its own node alongside the entities.
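The reason for the two passes is worth making concrete: a component on the first entity in the file may reference an entity defined later. A rough sketch of the idea, with hypothetical node types standing in for the real scene file format:

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

using Entity = std::uint32_t;

// Hypothetical deserialized scene nodes.
struct ComponentNode {
    std::string type;        // meta type would be resolved from this name
    std::string targetUuid;  // a cross-entity reference (may be empty)
};

struct EntityNode {
    std::string uuid;
    std::vector<ComponentNode> components;
};

struct LoadedComponent {
    Entity owner;
    std::string type;
    Entity target;  // resolved from targetUuid
};

std::vector<LoadedComponent> LoadScene(const std::vector<EntityNode>& nodes) {
    std::unordered_map<std::string, Entity> byUuid;
    Entity next = 0;

    // Pass 1: create all entities so every UUID is resolvable.
    for (const auto& n : nodes) byUuid[n.uuid] = next++;

    // Pass 2: emplace components, resolving cross-entity references,
    // which is now safe regardless of node order in the file.
    std::vector<LoadedComponent> out;
    for (const auto& n : nodes) {
        for (const auto& c : n.components) {
            Entity target = c.targetUuid.empty() ? byUuid[n.uuid]
                                                 : byUuid[c.targetUuid];
            out.push_back({byUuid[n.uuid], c.type, target});
        }
    }
    return out;
}
```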

Event System (In-code)

The event system wraps EnTT's entt::dispatcher into a type-safe publish/subscribe layer. Events are plain structs; each carries an entity reference and, optionally, extra data like an AudioID. Listeners are static functions connected to a specific event type at compile time via AddListener.

 

The MAKE_EVENT macro cuts the boilerplate for defining new event types, generating a struct with the entity field and a string name automatically.

 

CONNECT_LISTENER is the shorthand for wiring up a handler, so registering a new response to a game event is a single line.

The system supports two dispatch modes: Trigger fires an event immediately and all connected listeners execute synchronously on the spot, while Enqueue defers the event to be processed later when UpdateEvent is called for that type. This makes it possible to batch events within a frame; for example, multiple volume changes can be enqueued during the audio update and flushed together at the end. A generic template helper, PlaySoundEvent<T>, handles the common pattern of "look up the Audio3D component on the event's entity and play a sound," so any event type that carries an entity and an AudioID can trigger audio playback without writing a dedicated listener.
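The two dispatch modes can be sketched with a std-only bus instead of entt::dispatcher. Trigger, Enqueue, UpdateEvent, and AddListener are the page's names; the EventBus class, the VolumeChanged event, and the static per-type storage (shared across bus instances, fine for a sketch) are hypothetical.

```cpp
#include <cassert>
#include <functional>
#include <utility>
#include <vector>

using Entity = unsigned;

// std-only stand-in for the entt::dispatcher wrapper.
class EventBus {
public:
    template <class E>
    void AddListener(std::function<void(const E&)> fn) {
        Handlers<E>().push_back(std::move(fn));
    }

    template <class E>
    void Trigger(const E& e) {  // immediate, synchronous dispatch
        for (auto& h : Handlers<E>()) h(e);
    }

    template <class E>
    void Enqueue(E e) {  // deferred until UpdateEvent<E>() is called
        Queue<E>().push_back(std::move(e));
    }

    template <class E>
    void UpdateEvent() {  // flush everything queued for this type
        for (auto& e : Queue<E>()) Trigger(e);
        Queue<E>().clear();
    }

private:
    // Note: static locals give per-event-type storage shared by all buses.
    template <class E>
    std::vector<std::function<void(const E&)>>& Handlers() {
        static std::vector<std::function<void(const E&)>> h;
        return h;
    }
    template <class E>
    std::vector<E>& Queue() {
        static std::vector<E> q;
        return q;
    }
};

// Hypothetical event in the page's style: an entity plus extra data.
struct VolumeChanged {
    Entity entity;
    float volume;
};
```

The batching pattern from above falls out directly: several Enqueue calls during the audio update, then one UpdateEvent<VolumeChanged>() at the end of the frame.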

The Event System tool can be found on another page (here).
