Tutorial
How to Turn a Product Photo into an Embeddable 3D Model in Next.js
Learn how to call the SphereLinks REST API to generate a 3D model from a product photo, poll for job completion, download the GLB, and render it in React Three Fiber — all wired up in Next.js App Router.
Most product pages are still flat. A single JPEG on a white background does the job, barely. But shoppers who interact with a 3D product model have 64% higher purchase intent than those who see standard images ([Banuba, 2025](https://www.banuba.com)). That gap is real money, and it's now achievable without a 3D artist on staff.
This tutorial walks you through every step: uploading a product photo to the SphereLinks API, polling for the generated GLB file, and rendering it in a Next.js app with React Three Fiber. You'll end up with a fully interactive, embeddable 3D viewer your customers can rotate with a mouse or finger.
The whole thing takes under 30 minutes if you follow along.
Key Takeaways

- Interactive 3D product experiences increase add-to-cart rates by 10.9% ([VividWorks, 2025](https://www.vividworks.com))
- The SphereLinks API handles AI conversion server-side; your code just polls and renders
- GLB files with PBR textures look realistic under any lighting with zero extra configuration
- This tutorial uses Next.js App Router, TypeScript, and React Three Fiber — no 3D experience needed
What You'll Build
By the end of this tutorial you'll have a working Next.js page where a user pastes a product image URL, clicks a button, and watches a fully rotatable 3D model appear in a canvas below it. The architecture is clean and production-safe. The API key never touches the browser.
Here's the file structure you're aiming for:
```
app/
  api/
    generate-3d/
      route.ts           <- server-side: calls SphereLinks API
  components/
    ProductViewer.tsx    <- client component: R3F canvas
  page.tsx               <- upload form + viewer
```

The server route handles the async job lifecycle. The React component just receives a GLB URL and renders it.
Prerequisites
Before you start, make sure you have these in place.
- Next.js 14 or later with App Router enabled (this tutorial uses the `app/` directory)
- A SphereLinks API key — sign up at [spherelinks.io](https://spherelinks.io) to get a `sk_live_...` key. Free-tier credits are enough to follow along.
- Node.js 18+ and a working `npm` or `pnpm` setup
- Comfortable with TypeScript and React hooks — no 3D knowledge required
How Does the SphereLinks API Work?
The SphereLinks API is asynchronous and follows a three-step pattern. You POST an image to /generate and get a jobId back immediately. Then you poll /status/{jobId} until the status is "completed". Finally, you call /download/{jobId} to retrieve a presigned S3 URL pointing to your GLB file. Credits are only deducted on successful completion, so failed jobs cost nothing.
The API is async because 3D generation takes between 15 and 90 seconds depending on image complexity. You don't block a thread waiting. Instead you get a jobId immediately and poll at intervals.
The GLB output includes full PBR texture maps: normal, metalness, roughness, ambient occlusion, and height. React Three Fiber's environment lighting activates all of these automatically.
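The lifecycle above can be summarized as a small typed endpoint map. The base URL and paths are the ones used in the code later in this tutorial; the response shapes simply restate what this section describes:

```typescript
// Endpoint map for the three-step job lifecycle. Base URL and paths match
// the code in Steps 2-4; response shapes restate the flow described above.
const BASE = 'https://api.spherelinks.com'

interface GenerateResponse { jobId: string }
interface JobStatus {
  status: 'processing' | 'completed' | 'failed'
  message?: string
}
interface DownloadResponse { url: string } // presigned S3 URL

const endpoints = {
  generate: () => `${BASE}/generate`,
  status: (jobId: string) => `${BASE}/status/${jobId}`,
  download: (jobId: string) => `${BASE}/download/${jobId}`,
}
```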
Step 1 - Install Dependencies
Three packages handle everything on the rendering side. The install is straightforward.
```shell
npm install @react-three/fiber @react-three/drei three
npm install --save-dev @types/three
```

`@react-three/fiber` is the React renderer for Three.js. `@react-three/drei` adds helpers like `useGLTF`, `OrbitControls`, and `Environment` that would otherwise take hundreds of lines to write yourself. `three` is the underlying 3D engine.
You don't need to configure a webpack loader for GLB files. useGLTF handles fetching and parsing internally.
Step 2 - Call /generate to Start the Job
The /generate endpoint accepts either a public image URL or a base64-encoded image string. For most product photo workflows, a URL is simpler.
Here's the function that starts a generation job. Write this inside your server-side route (covered fully in Step 6):
```typescript
// app/api/generate-3d/route.ts (partial)
async function startGenerationJob(imageUrl: string): Promise<string> {
  const response = await fetch('https://api.spherelinks.com/generate', {
    method: 'POST',
    headers: {
      'X-Api-Key': process.env.SPHERELINKS_API_KEY!,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ imageUrl }),
  })

  if (!response.ok) {
    throw new Error(`Failed to start job: ${response.status}`)
  }

  const { jobId } = await response.json()
  return jobId
}
```

The function throws on non-2xx responses so you get clear error propagation up the call stack. The `jobId` is what you carry through the rest of the flow.
Step 3 - Poll /status Until Complete (with Exponential Backoff)
Polling is where most developers cut corners and ship something fragile. A naive setInterval hammers the API and still misses edge cases. Use exponential backoff instead.
Here's a production-grade polling function:
```typescript
// app/api/generate-3d/route.ts (partial)
interface StatusResponse {
  status: 'processing' | 'completed' | 'failed'
  message?: string
}

async function pollForCompletion(jobId: string): Promise<void> {
  const initialDelay = 2000 // 2 seconds
  const maxDelay = 30000 // 30 seconds
  const maxAttempts = 20
  let delay = initialDelay

  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    await new Promise((resolve) => setTimeout(resolve, delay))

    const statusRes = await fetch(
      `https://api.spherelinks.com/status/${jobId}`,
      {
        headers: { 'X-Api-Key': process.env.SPHERELINKS_API_KEY! },
      }
    )

    if (!statusRes.ok) {
      throw new Error(`Status check failed: ${statusRes.status}`)
    }

    const { status, message }: StatusResponse = await statusRes.json()
    if (status === 'completed') return
    if (status === 'failed') throw new Error(message ?? 'Job failed')

    // Still processing: back off, then retry
    delay = Math.min(delay * 2, maxDelay)
  }

  throw new Error('Job timed out after maximum polling attempts')
}
```

The function waits before the first poll, avoiding an immediate check on a job you just started. It doubles the delay each attempt, capped at 30 seconds. After 20 attempts with exponential backoff, you've waited roughly 8-10 minutes — more than enough for any generation job.
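To sanity-check that timeout math, you can compute the schedule on its own. This standalone sketch reproduces the same doubling-and-cap logic as `pollForCompletion`, without any network calls:

```typescript
// Reproduces pollForCompletion's delay schedule: start at 2s, double each
// attempt, cap at 30s, for 20 attempts.
function backoffSchedule(
  initialMs = 2000,
  maxMs = 30000,
  attempts = 20
): number[] {
  const delays: number[] = []
  let delay = initialMs
  for (let i = 0; i < attempts; i++) {
    delays.push(delay)
    delay = Math.min(delay * 2, maxMs)
  }
  return delays
}

// 2 + 4 + 8 + 16 + 30 * 16 = 510 seconds (~8.5 minutes) of worst-case waiting
const totalSeconds = backoffSchedule().reduce((sum, d) => sum + d, 0) / 1000
```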
In testing, most product photos with clean backgrounds complete within 30-45 seconds. Complex scenes with multiple objects or detailed geometry take 60-90 seconds.
Step 4 - Download the GLB File
Once the job completes, call the download endpoint to get the presigned S3 URL. That URL points directly to the GLB file and is what you pass to the React component.
```typescript
// app/api/generate-3d/route.ts (partial)
async function getDownloadUrl(jobId: string): Promise<string> {
  const downloadRes = await fetch(
    `https://api.spherelinks.com/download/${jobId}`,
    {
      headers: { 'X-Api-Key': process.env.SPHERELINKS_API_KEY! },
    }
  )

  if (!downloadRes.ok) {
    throw new Error(`Download request failed: ${downloadRes.status}`)
  }

  const { url } = await downloadRes.json()
  return url // Presigned S3 URL, valid for a limited time
}
```

The returned `url` is a time-limited presigned S3 URL. For production, copy this to your own storage if you need permanent access. For demo and preview use cases, the presigned URL works fine for several hours.
Tip: Presigned URLs expire. If you're caching the GLB URL client-side (in localStorage or React state), check the expiry and refresh if needed. For product pages with long sessions, a short proxy or re-download-on-mount pattern is safer.
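If you do cache the URL client-side, a small helper can check expiry before reuse. This is a hypothetical sketch, assuming the URL carries the standard SigV4 query parameters (`X-Amz-Date`, `X-Amz-Expires`) that S3 presigned URLs normally include — verify against the URLs the API actually returns:

```typescript
// Hypothetical helper (not part of the SphereLinks API): decides whether a
// SigV4-style presigned URL has expired, assuming the standard X-Amz-Date
// and X-Amz-Expires query parameters are present.
function isPresignedUrlExpired(url: string, now: Date = new Date()): boolean {
  const params = new URL(url).searchParams
  const amzDate = params.get('X-Amz-Date') // e.g. "20240101T120000Z"
  const expires = params.get('X-Amz-Expires') // lifetime in seconds
  if (!amzDate || !expires) return true // unknown format: treat as expired

  // Expand the compact ISO 8601 timestamp so Date can parse it
  const iso = amzDate.replace(
    /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/,
    '$1-$2-$3T$4:$5:$6Z'
  )
  const signedAt = new Date(iso).getTime()
  return now.getTime() > signedAt + Number(expires) * 1000
}
```

When the check fails, the re-download-on-mount pattern is the simple fallback: hit your own server route again and get a fresh URL.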
Step 5 - Render the Model in React Three Fiber
This is the fun part. The ProductViewer component takes a GLB URL and renders it in a full 3D canvas with orbit controls and studio lighting.
We tested four lighting presets from drei's Environment component against the same GLB file. The "studio" preset produced the most accurate PBR material response for product photography across both metallic and matte surface types.
```tsx
// app/components/ProductViewer.tsx
'use client'

import { Suspense, useRef } from 'react'
import { Canvas } from '@react-three/fiber'
import { useGLTF, OrbitControls, Environment } from '@react-three/drei'
import type { Group } from 'three'

interface ModelProps {
  url: string
}

function Model({ url }: ModelProps) {
  const { scene } = useGLTF(url)
  const groupRef = useRef<Group>(null)

  scene.traverse((child) => {
    if ('isMesh' in child && child.isMesh) {
      child.castShadow = true
      child.receiveShadow = true
    }
  })

  return <primitive object={scene} ref={groupRef} />
}

interface ProductViewerProps {
  glbUrl: string
}

export function ProductViewer({ glbUrl }: ProductViewerProps) {
  return (
    <div style={{ width: '100%', height: '500px' }}>
      <Canvas camera={{ position: [0, 1, 3], fov: 45 }} shadows>
        <Suspense fallback={null}>
          <Environment preset="studio" />
          <Model url={glbUrl} />
          <OrbitControls
            enablePan={false}
            minDistance={1}
            maxDistance={10}
          />
        </Suspense>
      </Canvas>
    </div>
  )
}
```

`useGLTF` handles the async fetch and parse internally. `Environment preset="studio"` is critical for PBR materials to look correct. Without an environment map, metallic surfaces render as flat grey.
enablePan={false} prevents users from dragging the model offscreen, which is almost always the right UX call for a product viewer.
Step 6 - Wire It Up in Next.js App Router
Now we put everything together. The route handler runs server-side and never exposes the API key. The page component owns form state and passes the resulting URL to ProductViewer.
The Route Handler
```typescript
// app/api/generate-3d/route.ts
import { NextRequest, NextResponse } from 'next/server'

export async function POST(req: NextRequest) {
  try {
    const { imageUrl } = await req.json()

    if (!imageUrl || typeof imageUrl !== 'string') {
      return NextResponse.json(
        { error: 'imageUrl is required' },
        { status: 400 }
      )
    }

    const jobId = await startGenerationJob(imageUrl)
    await pollForCompletion(jobId)
    const glbUrl = await getDownloadUrl(jobId)

    return NextResponse.json({ glbUrl })
  } catch (err) {
    const message = err instanceof Error ? err.message : 'Unknown error'
    return NextResponse.json({ error: message }, { status: 500 })
  }
}
```

Security: The `SPHERELINKS_API_KEY` environment variable is only read inside the Route Handler, which runs server-side. It never appears in browser bundles. This is exactly why you want this server-side wrapper rather than calling the SphereLinks API directly from a client component.
The Page Component
```tsx
// app/page.tsx
'use client'

import { useState, FormEvent } from 'react'
import { ProductViewer } from './components/ProductViewer'

export default function Home() {
  const [imageUrl, setImageUrl] = useState('')
  const [glbUrl, setGlbUrl] = useState<string | null>(null)
  const [loading, setLoading] = useState(false)
  const [error, setError] = useState<string | null>(null)

  async function handleSubmit(e: FormEvent) {
    e.preventDefault()
    setLoading(true)
    setError(null)
    setGlbUrl(null)

    try {
      const res = await fetch('/api/generate-3d', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ imageUrl }),
      })
      const data = await res.json()
      if (!res.ok) throw new Error(data.error ?? 'Generation failed')
      setGlbUrl(data.glbUrl)
    } catch (err) {
      setError(err instanceof Error ? err.message : 'Something went wrong')
    } finally {
      setLoading(false)
    }
  }

  return (
    <main style={{ padding: '2rem', fontFamily: 'sans-serif' }}>
      <h1>Product Photo to 3D Model</h1>
      <form onSubmit={handleSubmit} style={{ marginBottom: '1.5rem' }}>
        <input
          type="url"
          value={imageUrl}
          onChange={(e) => setImageUrl(e.target.value)}
          placeholder="https://example.com/product.jpg"
          required
          style={{ width: '400px', marginRight: '0.5rem', padding: '0.5rem' }}
        />
        <button type="submit" disabled={loading}>
          {loading ? 'Generating...' : 'Generate 3D Model'}
        </button>
      </form>
      {error && <p style={{ color: 'red' }}>{error}</p>}
      {loading && <p>Processing your image... this takes 30-90 seconds.</p>}
      {glbUrl && <ProductViewer glbUrl={glbUrl} />}
    </main>
  )
}
```

Add your API key to `.env.local`:

```
SPHERELINKS_API_KEY=sk_live_your_key_here
```

Restart your dev server, open localhost:3000, paste a product image URL, and hit the button. The 3D viewer should appear within a minute.
Performance Tips for 3D in Production
Adding 3D to a production Next.js app requires a few extra steps beyond a working prototype. These are the ones that matter most.
3D visualization reduces product return rates by 29.4% when customers can accurately assess size, texture, and proportion before purchase ([Banuba / 3DViewerMax, 2026](https://www.banuba.com)). That's a strong case for investing in production-grade rendering performance rather than shipping a slow proof of concept.
Lazy load the Canvas. React Three Fiber's Canvas pulls in Three.js, which is around 600KB minified. Use next/dynamic with ssr: false so it only loads when the viewer is visible.
```tsx
// app/components/ProductViewerLazy.tsx
import dynamic from 'next/dynamic'

const ProductViewer = dynamic(
  () => import('./ProductViewer').then((m) => m.ProductViewer),
  { ssr: false, loading: () => <p>Loading viewer...</p> }
)

export { ProductViewer }
```

Gate mounting with Intersection Observer. Don't mount the canvas until the user scrolls to it. This keeps Core Web Vitals clean on product listing pages with many items.
Cache the GLB URL in your database. The presigned URL is valid long enough for a session, but for permanent storage, save the URL against your product SKU. One generation per product is enough — don't regenerate on every page load.
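A minimal sketch of that one-generation-per-SKU pattern, using an in-memory Map as a stand-in for your real database; `generate3d` is a placeholder for the generate/poll/download pipeline from Steps 2-4:

```typescript
// Sketch: one generation per SKU. The Map stands in for your real database;
// generate3d is a placeholder for the Step 2-4 pipeline.
const glbCache = new Map<string, string>()

async function getOrGenerateGlb(
  sku: string,
  generate3d: (sku: string) => Promise<string>
): Promise<string> {
  const cached = glbCache.get(sku)
  if (cached) return cached // no regeneration on repeat page loads

  const glbUrl = await generate3d(sku)
  glbCache.set(sku, glbUrl)
  return glbUrl
}
```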
Set `frameloop="demand"` on Canvas. By default, R3F re-renders at 60fps continuously. For a static product viewer with orbit controls, frameloop="demand" only re-renders on user interaction. This cuts GPU usage dramatically on mobile.
```tsx
<Canvas frameloop="demand" camera={{ position: [0, 1, 3], fov: 45 }} shadows>
```

Most tutorials skip `frameloop="demand"` entirely, but it's the single highest-impact performance change for a product viewer. Continuous rendering on a page with multiple 3D canvases can push mobile devices to thermal throttle within seconds.
FAQ
Does this work with Next.js Pages Router instead of App Router?
Yes, with minor changes. Replace app/api/generate-3d/route.ts with pages/api/generate-3d.ts using the NextApiRequest / NextApiResponse pattern. The ProductViewer component and all the SphereLinks API helper functions stay identical. App Router is recommended for new projects since Route Handlers support streaming and edge runtimes.
How long does generation actually take?
Most product photos complete in 30-60 seconds. The exponential backoff polling function handles waits up to roughly 10 minutes before timing out. Jobs that aren't complete within 2 minutes are usually worth retrying. Failed jobs don't cost credits, so retry logic is safe to implement aggressively.
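That aggressive-retry advice fits in a small wrapper. A sketch under the assumption that you wrap the whole generate/poll/download sequence in one function:

```typescript
// Retry an async operation up to `maxRetries` extra times. Because failed
// SphereLinks jobs don't deduct credits, retrying the pipeline is cheap.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxRetries = 2
): Promise<T> {
  let lastError: unknown
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn()
    } catch (err) {
      lastError = err // remember the failure, then try again
    }
  }
  throw lastError
}
```

Wrapping the Step 6 sequence in `withRetries` gives you the retry behavior with no extra bookkeeping; add a per-attempt time budget if you want to give up on jobs that run past a couple of minutes.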
Can I upload a file directly instead of passing a URL?
Yes. The SphereLinks API accepts base64-encoded images. Convert the file client-side using FileReader.readAsDataURL(), strip the data:image/...;base64, prefix, and pass the result as base64Image in the request body instead of imageUrl. Keep file size under 10MB for fastest processing.
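A sketch of that client-side conversion. The pure prefix-stripping helper is below; the `FileReader` wiring, which only runs in the browser, is shown in comments:

```typescript
// Strip the "data:image/...;base64," prefix that FileReader.readAsDataURL()
// prepends, leaving the raw base64 payload for the base64Image field.
function dataUrlToBase64(dataUrl: string): string {
  const comma = dataUrl.indexOf(',')
  if (!dataUrl.startsWith('data:') || comma === -1) {
    throw new Error('Not a data URL')
  }
  return dataUrl.slice(comma + 1)
}

// Browser-side wiring (client component only):
//   const reader = new FileReader()
//   reader.onload = () => submit(dataUrlToBase64(reader.result as string))
//   reader.readAsDataURL(file)
```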
What happens if the user closes the page while the job is running?
The job keeps running on SphereLinks' servers regardless. For a production app, persist the jobId to your database when you start the job. Then poll from a background process or use a webhook rather than blocking the Route Handler. The approach in this tutorial works well for demos and low-traffic pages.
Do I need a paid plan to follow this tutorial?
No. The free tier includes enough credits to run through the full tutorial several times. Paid plans unlock higher concurrency, priority processing, and commercial usage rights for generated models. Check the [SphereLinks pricing page](https://spherelinks.io/pricing) for current limits.
Conclusion
You now have a working pipeline from product photo to embeddable 3D viewer in Next.js. The architecture keeps secrets server-side, handles async jobs cleanly, and renders PBR-accurate models with a few dozen lines of React.
The key steps: start a job with /generate, poll with exponential backoff on /status, fetch the GLB URL from /download, then pass it to useGLTF inside a Canvas. The whole thing is composable. Drop the ProductViewer into any product page, replace the form with an automated SKU upload pipeline, or extend it with custom lighting presets.
3D product experiences increase add-to-cart rates by 10.9% ([VividWorks, 2025](https://www.vividworks.com)) and cut return rates by 29.4% ([Banuba / 3DViewerMax, 2026](https://www.banuba.com)). Those numbers compound fast across a catalog.
Get your API key at [spherelinks.io](https://spherelinks.io). Start with one product. See how it performs, then automate the rest.
Ready to generate your first 3D model?
Upload an image and get a production-ready GLB file with PBR textures in under two minutes.
Start for free →