Building AR and VR experiences is bottlenecked by asset creation. Creating high-quality, optimised 3D content for real-time rendering traditionally requires specialised modellers and days per asset. SphereLinks removes that barrier.
Generate export-ready 3D assets from images and drop them straight into your WebXR project, Unity scene, or Unreal level. Compatible with Three.js, Babylon.js, React Three Fiber, and every major AR/VR runtime.
Everything you need.
WebXR and native AR ready
GLB output works directly in WebXR experiences and in Scene Viewer on Android with no conversion step; AR Quick Look on iOS accepts USDZ, so iOS placement needs a USDZ conversion.
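A quick way to confirm an export really is a well-formed binary glTF before shipping it is to check the 12-byte GLB container header. A minimal Node sketch (the filename in the commented usage is illustrative):

```javascript
// Minimal sanity check for a binary glTF (.glb) container header.
// Per the glTF 2.0 spec, the 12-byte header is three little-endian uint32s:
// magic (0x46546C67, ASCII "glTF"), version (2), and total file length.
function isValidGlbHeader(buf) {
  if (buf.length < 12) return false;
  const magic = buf.readUInt32LE(0);
  const version = buf.readUInt32LE(4);
  const length = buf.readUInt32LE(8);
  return magic === 0x46546c67 && version === 2 && length === buf.length;
}

// Usage (hypothetical filename):
// const fs = require("node:fs");
// isValidGlbHeader(fs.readFileSync("asset.glb"));
```

This catches truncated downloads and accidental JSON-only `.gltf` files before they reach a headset.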
Optimised for real-time rendering
Generated meshes are automatically simplified and materials are baked for real-time performance. Suitable for VR frame-rate targets without further optimisation.
Works in Three.js, Babylon.js, and beyond
Any WebGL or WebGPU framework with GLTF/GLB support loads SphereLinks assets natively, including React Three Fiber, A-Frame, and PlayCanvas.
Native engine support
Import directly into Unity (GLTF importer) and Unreal Engine (GLTF pipeline). Assets arrive with PBR materials pre-configured.
Populate scenes at scale
Generate dozens of environment objects, interactive props, and spatial anchors from reference photos. Build out a full scene asset library in hours.
Spatial computing ready
Generated assets are well-suited for Apple Vision Pro, Meta Quest, and HoloLens deployments via standard GLTF/GLB delivery.
How it works.
Source
Upload a photo of the physical object, prop, or reference you want in your scene.
Generate
AI reconstructs the 3D asset with optimised geometry and PBR textures.
Preview
Inspect the model in the interactive WebGL viewer. Verify geometry and materials.
Deploy
Download the GLB and import into your WebXR project, Unity scene, or Unreal level.
Why teams choose SphereLinks.
For AR/VR workflows specifically.
- Eliminate the 3D modelling bottleneck for AR/VR content production
- Generate real-time-optimised assets from photos in under 2 minutes
- Drop directly into WebXR, Three.js, Unity, and Unreal without conversion
- Enable AR placement of physical objects via Quick Look and Scene Viewer
- Scale scene asset libraries without proportional cost increase
Common questions.
Are assets optimised for VR frame rates?
Yes. Meshes are auto-simplified and texture maps are sized for real-time rendering. For extremely demanding performance targets (mobile VR at 120fps), further LOD reduction is recommended.
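When further LOD reduction is needed, the usual pattern is distance-based switching: swap in a coarser mesh as the camera moves away. A minimal sketch of the selection logic (the distance thresholds and filenames are illustrative, not SphereLinks defaults; Three.js ships this same idea as `THREE.LOD`):

```javascript
// Pick the highest-detail level whose distance threshold covers the
// current camera distance. `levels` is sorted nearest-first.
function selectLod(levels, cameraDistance) {
  for (const level of levels) {
    if (cameraDistance <= level.maxDistance) return level;
  }
  return levels[levels.length - 1]; // beyond every threshold: coarsest mesh
}

// Illustrative three-level setup for a single asset.
const levels = [
  { maxDistance: 2, mesh: "asset_high.glb" },       // full detail up close
  { maxDistance: 8, mesh: "asset_mid.glb" },        // simplified mid-range
  { maxDistance: Infinity, mesh: "asset_low.glb" }, // coarse in the distance
];
```

Running the simplified export through a decimation pass per level, then switching with logic like this, keeps draw cost flat as scene density grows.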
Does the output work with Apple Vision Pro?
Yes. visionOS natively favours USDZ and Reality Composer Pro content, but Apple Vision Pro also renders GLB/GLTF in web experiences via Safari and in compatible apps. SphereLinks exports standard GLB files.
Can I use generated assets in React Three Fiber?
Absolutely. Load the GLB using useGLTF or GLTFLoader as you would any standard GLTF asset. All PBR materials map directly to Three.js MeshStandardMaterial.
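The material claim above rests on the standard correspondence between glTF 2.0 `pbrMetallicRoughness` fields and `MeshStandardMaterial` properties, which GLTFLoader applies automatically when it parses a GLB. A reference sketch of that mapping:

```javascript
// glTF 2.0 pbrMetallicRoughness fields and the MeshStandardMaterial
// properties GLTFLoader assigns them to.
const GLTF_TO_MESH_STANDARD = {
  baseColorFactor: "color",                 // alpha channel feeds "opacity"
  baseColorTexture: "map",
  metallicFactor: "metalness",
  roughnessFactor: "roughness",
  metallicRoughnessTexture: "metalnessMap", // same texture also becomes "roughnessMap"
  normalTexture: "normalMap",
  occlusionTexture: "aoMap",
  emissiveFactor: "emissive",
  emissiveTexture: "emissiveMap",
};

// In React Three Fiber the whole mapping is handled for you:
//   const { scene } = useGLTF("/asset.glb");
//   return <primitive object={scene} />;
```

Because the mapping is one-to-one, post-load material tweaks (e.g. bumping roughness) are ordinary property assignments on the loaded meshes.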
What texture resolution do assets export at?
Assets export with 1024px texture maps by default. This is suitable for most real-time AR/VR use cases while keeping file size manageable.
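To see why 1024px is a reasonable default, it helps to estimate the GPU cost. A rough footprint for an uncompressed RGBA8 texture, where a full mip chain adds about a third on top of the base level (GPU texture compression formats would shrink this further):

```javascript
// Rough GPU-memory footprint of an uncompressed RGBA8 texture.
// A complete mip chain costs ~4/3 of the base level.
function textureBytes(size, { mipmaps = true, bytesPerPixel = 4 } = {}) {
  const base = size * size * bytesPerPixel;
  return mipmaps ? Math.floor((base * 4) / 3) : base;
}

// A 1024px map is ~5.3 MiB uncompressed with mipmaps; doubling the
// resolution to 2048px quadruples that to ~21 MiB per map.
```

With several PBR maps per asset (base colour, normal, metallic-roughness), this per-map arithmetic is what makes 1024px the sensible ceiling for most mobile AR/VR scenes.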
We were building a WebXR product configurator and the 3D asset pipeline was the critical path. Commissioning each object took two to four days from a freelancer. Switching to AI-generated base meshes and spending the time on polish instead got us to launch six weeks earlier than the original schedule.
of AR/VR developers surveyed by the WebXR Developer Summit 2024 cited 3D asset creation time as their single largest production bottleneck
Spatial computing experiences are asset-intensive by nature: a convincing virtual environment or product configurator may require dozens of individually crafted objects, each needing correct topology, UV maps, and PBR-calibrated materials to avoid breaking immersion at close inspection distances. The traditional pipeline — reference gathering, modelling, unwrapping, baking, engine import — can consume more total hours than the interactive code itself. AI-generated base meshes do not eliminate the need for an artist on hero assets, but they compress the time for secondary and tertiary objects to near zero, fundamentally changing what a two-person XR team can ship in a sprint.
