
May 08, 2026
Hemanth Velury
CEO & Co-Founder

Every project in interiors starts with the same artifact: a floor plan or a room photo that only a specialist can really read. Lines, labels, and dimensions mean something to architects and designers, but for clients, homeowners, and buyers, they are little more than guesswork. VirtualSpaces was built to close that gap by turning those flat inputs into photoreal, navigable 3D environments in minutes, not weeks.
What began as tools for interior design and home renovation has grown into something larger: a spec‑accurate world‑building engine that can read real drawings, construct precise geometry, furnish with AI, and stream the result in a browser. That engine now powers Foursite and Remodroom today and is evolving into an infrastructure layer for how people design, sell, and experience real spaces online.
VirtualSpaces today ships two core products: Foursite and Remodroom.
Foursite converts 2D floor plans and architectural blueprints into fully modeled 3D interiors, complete with walls, doors, windows, and room types recognized automatically by the AI. Designers, architects, and homeowners upload a simple JPG or PNG of a plan, and within minutes they can walk through the resulting space in a real‑time 3D viewer. They can then generate photorealistic renders from any point of view with a single click, without touching traditional 3D or CAD software.
Remodroom works from the other direction. Instead of a floor plan, it starts from a single room photograph, taken on a phone or pulled from a shoot, and turns it into a fully redesigned, photorealistic interior image. The system reads the existing furniture, flooring, wall finishes, and lighting, then lets users swap pieces, change styles, and re‑light the space while preserving realism. The output looks like a professional photograph of a completed project, ready to send to a client, contractor, or listing, with no manual masking and no layer juggling.
These products already solve a painful, daily problem for anyone working in residential interiors: the visualization gap between what is drawn, what is pitched, and what the buyer or client can actually imagine. Foursite compresses the step from blueprint to believable 3D walkthrough; Remodroom compresses the step from current state photo to future state proposal. Together they give designers, homeowners, and sales teams a way to make decisions with visuals that feel close to the final space instead of generic mood boards or rough sketches.
On paper, "read a floor plan and extrude it into 3D" sounds simple. In practice, floor plans are one of the hardest inputs in computer vision: dense, stylized, often low‑quality scans where lines, symbols, text, and scale all overlap. Off‑the‑shelf models tend to treat them as noisy clip art, struggling with basic questions like which lines are walls, which rectangles are fixtures, and which annotations map to which rooms. Academic systems have shown partial solutions, but most assume perfectly clean CAD or strictly formatted plans: nothing like the messy, heterogeneous drawings that real projects actually use.
VirtualSpaces had to build an end‑to‑end pipeline that could handle those real‑world constraints. The system pre‑processes each floor plan to normalize orientation, scale, and clarity, then passes it through an AI intelligence layer that performs feature extraction for walls, doors, and windows, semantic segmentation for room types, and dimension extraction for scale. That combination yields enough structure to construct a scene graph: a machine‑readable map of what exists where, and how spaces are connected.
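The post does not publish the engine's internal data model, but the scene graph idea, a machine‑readable map of rooms, walls, and connections, can be sketched minimally. Every class and field name below is illustrative, not VirtualSpaces' actual schema:

```python
from dataclasses import dataclass, field


@dataclass
class Wall:
    # Endpoints in metres, derived from extracted dimensions and scale.
    x1: float
    y1: float
    x2: float
    y2: float
    thickness: float = 0.15


@dataclass
class Room:
    name: str                      # from semantic segmentation / OCR label
    walls: list = field(default_factory=list)
    area_m2: float = 0.0           # from the plan's area annotation


@dataclass
class SceneGraph:
    rooms: dict = field(default_factory=dict)
    # room name -> set of rooms reachable through a door or opening
    adjacency: dict = field(default_factory=dict)

    def connect(self, a: str, b: str) -> None:
        self.adjacency.setdefault(a, set()).add(b)
        self.adjacency.setdefault(b, set()).add(a)


# Build a toy two-room graph of the kind the pipeline might emit.
g = SceneGraph()
g.rooms["Living Room"] = Room("Living Room", area_m2=22.5)
g.rooms["Kitchen"] = Room("Kitchen", area_m2=9.8)
g.connect("Living Room", "Kitchen")  # a door links the two spaces
```

Holding geometry, labels, and connectivity in one structure is what lets the later stages ask questions like "which rooms border the kitchen?" without going back to pixels.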
From there, a 3D build engine turns that scene graph into watertight geometry, generating meshes for walls and floors, analyzing topology to avoid errors, and validating that rooms are actually walkable spaces without impossible intersections. Finally, a rendering engine applies physically based materials, lighting, and textures to produce an interactive 3D environment that runs in a standard browser. The result is a pipeline that goes from flat plan to navigable 3D package in under two minutes, without relying on manual modeling or GPU‑heavy desktop software.
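The "rooms are actually walkable" validation can be illustrated with two classic checks: a room outline should be a simple polygon (no self‑intersections) and should enclose a plausible area. This is a minimal sketch of that idea, not the engine's actual topology analysis:

```python
def polygon_area(points):
    """Signed area via the shoelace formula; near-zero flags degenerate rooms."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return s / 2.0


def segments_intersect(p, q, r, s):
    """Orientation-based proper-intersection test for edges (p,q) and (r,s)."""
    def orient(a, b, c):
        v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
        return (v > 0) - (v < 0)
    return (orient(p, q, r) != orient(p, q, s)
            and orient(r, s, p) != orient(r, s, q))


def is_walkable(points, min_area=1.0):
    """A room passes if its outline is simple and encloses >= min_area m^2."""
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            # Skip adjacent edges, which legitimately share a vertex.
            if (j + 1) % n == i or (i + 1) % n == j:
                continue
            if segments_intersect(points[i], points[(i + 1) % n],
                                  points[j], points[(j + 1) % n]):
                return False  # impossible intersection: walls cross
    return abs(polygon_area(points)) >= min_area


room = [(0, 0), (4, 0), (4, 3), (0, 3)]    # 4 m x 3 m rectangle: fine
bowtie = [(0, 0), (4, 3), (4, 0), (0, 3)]  # self-intersecting outline: rejected
```

Catching a bowtie outline at this stage is cheaper than generating a wall mesh that later turns out to be non‑watertight.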
The core idea behind the VirtualSpaces engine is simple but powerful: treat architectural drawings as structured specifications, not just images. Instead of estimating geometry from pixels alone, the system uses OCR and natural language processing to read the room names, dimensions, and area labels that architects have already written into the drawing. Those values become the ground truth for room sizes, adjacencies, and proportions.
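Dimension labels on real plans come in many notations, and the article does not say which ones the engine supports. As a flavour of the extraction step, here is a toy parser for two common annotation styles, metric metres and imperial feet‑and‑inches; the formats and function are illustrative only:

```python
import re

FEET_TO_M = 0.3048


def parse_dimension(label: str) -> float:
    """Convert one extracted dimension token to metres.

    Handles two common annotation styles: metric ("3.50 m") and
    imperial feet-and-inches ("12'6\""). Real plans mix many more.
    """
    label = label.strip()
    m = re.fullmatch(r"(\d+(?:\.\d+)?)\s*m", label)
    if m:
        return float(m.group(1))
    m = re.fullmatch(r"(\d+)'(?:(\d+)\")?", label)
    if m:
        feet = int(m.group(1)) + int(m.group(2) or 0) / 12.0
        return feet * FEET_TO_M
    raise ValueError(f"unrecognised dimension: {label!r}")


print(parse_dimension("3.50 m"))    # 3.5
print(parse_dimension("12'6\""))   # 3.81 (metres)
```

Once every label resolves to a number in a single unit system, those numbers, not pixel measurements, become the ground truth for the reconstruction.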
Once extracted, that specification feeds into a spatial reconstruction engine that builds geometry at exact scale. Walls are generated at the right thickness and height, doors and windows respect their stated positions and clearances, and room adjacencies follow the logic of the plan instead of the guesswork of an artist recreating a reference photo. The difference is not just visual; it's structural. A spec‑accurate model can be trusted for layout decisions, furniture fit, and circulation analysis in a way that a close enough approximation never can.
On top of that geometry, VirtualSpaces layers its photorealistic graphics pipeline. Screen Space Global Illumination (SSGI) and Screen Space Reflections (SSR) simulate how light actually behaves inside interiors, while PBR materials and high‑resolution textures give surfaces the subtle variation of real wood, stone, and fabrics. All of this runs in a browser‑native WebGL engine with automatic lighting, so stakeholders can explore the space on any modern device without installing heavy software or streaming from a dedicated graphics machine.
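The SSGI/SSR implementation is engine‑internal, but one building block of any PBR pipeline is easy to show: Schlick's approximation of the Fresnel effect, which is why floors and tabletops pick up a strong sheen at grazing angles. The values below are a standalone illustration, not the engine's shader code:

```python
def fresnel_schlick(cos_theta: float, f0: float) -> float:
    """Schlick's approximation: reflectance rises toward 1 at grazing angles.

    f0 is the base reflectance at normal incidence (about 0.04 for
    dielectrics such as wood or plaster; much higher for metals).
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5


# Head-on vs near-grazing view of a wooden floor (f0 = 0.04):
head_on = fresnel_schlick(1.0, 0.04)   # 0.04: almost no specular
grazing = fresnel_schlick(0.05, 0.04)  # strong sheen, as on real floors
```

Effects like this are exactly what the post means by "automatic": the viewer evaluates them per pixel so users never touch a light rig.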
This is what makes the engine more than just a visualization tool. Because it is built around the numbers in the drawing (the lengths, labels, and specifications), it becomes a reliable backbone for workflows that care about layout fidelity, not only aesthetics. Design teams can iterate on interior schemes knowing that the spatial logic is correct; sales teams can show apartments and homes with confidence that the rooms a buyer walks through on screen map closely to the real unit.
The engine is not static. Over the next cycle, VirtualSpaces is shipping a set of capabilities that change how design happens inside the 3D environment itself, not just what the final render looks like.
The first is White Modeling with AI Furnishing. In Foursite, designers will be able to drop simple 3D furniture placeholders (what the team calls base designs) directly into the floor plan in real time. The AI will then generate high‑fidelity interior visuals based on those placements, with controls for style, finish, and arrangement. Instead of designing on paper and checking renders later, the creative loop collapses into a single session where layout, furnishing, and photoreal feedback all coexist.
Next is a photorealistic graphics pipeline tuned for interiors. By combining SSGI lighting, SSR reflections, and carefully calibrated PBR materials, the system will deliver near‑offline quality lighting inside an ordinary browser. Floors will show accurate sheen and grain; walls will pick up subtle bounce light; metals and glass will respond convincingly to virtual spotlights and daylight. Importantly, all of this will be automatic: users will not need to tweak render settings or manage complex light rigs to get believable results.
VirtualSpaces is also shipping first‑person and orbital views as first‑class modes. Users will be able to toggle between an axonometric technical perspective, which makes it easy to reason about layout, and an immersive first‑person walkthrough with adjustable camera height and field of view. That means the same model can serve as both a design tool for professionals and an experience layer for non‑technical clients during approvals.
On the workflow side, real‑time shareable links will make distribution as simple as sending a URL. Any space built in Foursite can be shared without logins, and as a design evolves, viewers will always see the latest version when they refresh. This compresses feedback cycles from days of email threads and static PDFs to minutes of collaborative exploration in a live 3D room.
Finally, a dynamic floorplan editor with precision controls will let users make structural changes directly in the browser: moving walls, inserting arcs, adjusting heights, and snapping imported models to real‑world units computed from the underlying spatial data. Floor‑to‑ceiling height, door lintel height, and window‑sill height become sliders instead of hours in a modeling package.
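Precision controls like these usually come down to snapping continuous drag input to real‑world increments. A minimal sketch of that behaviour, assuming a 5 cm default grid (the actual step sizes and function names are not from the article):

```python
def snap(value_m: float, step_m: float = 0.05) -> float:
    """Snap a dragged dimension to the nearest step (default 5 cm).

    The final rounding trims floating-point residue so the UI can
    display clean values like 2.45 rather than 2.4500000000000002.
    """
    return round(round(value_m / step_m) * step_m, 6)


print(snap(2.43))   # ceiling-height slider lands on 2.45
print(snap(2.401))  # 2.4
```

Snapping in model units rather than screen pixels is what makes edits consistent with the spec‑accurate geometry underneath, regardless of zoom level.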
These capabilities are not just nice features for render quality. They are a response to how decisions get made in residential design and home retail. A project rarely stalls for lack of another mood board; it stalls because clients and buyers cannot see themselves in the proposed space. They do not know whether the sofa will actually fit, whether the dining table will feel cramped, or whether the wardrobe will block natural light. The result is hesitation, slow approvals, and, in retail, expensive returns.
With a spec‑accurate engine under the hood, Foursite and Remodroom can give each stakeholder a trustworthy preview of what they are about to commit to. Designers can iterate layouts and finishes in the same environment where clients will walk through the space. Homeowners can test multiple schemes before approving renovation work. Furniture brands and property portals can show their pieces and listings inside a buyer's actual layout instead of a generic, perfectly staged room.
For online furniture and home goods, this shift is especially important. Today, a customer often sees a beautiful product shot that bears little resemblance to their real apartment. Different wall lengths, ceiling heights, and circulation patterns make it difficult to translate that image into a confident purchase. By anchoring the visual experience in the customer's own floor plan or photo, the engine turns those products into context‑aware objects that can be placed, rearranged, and approved in seconds. That is not just a better UX; it is a structural advantage in conversion rate and return reduction.

Because the engine is built around real‑world floor plans and spec data, its usefulness extends beyond any single industry. Residential designers and homeowners use it to visualize projects. Property portals can embed it to let buyers step through floor plans before a site visit. Furniture brands can plug it into their product pages so shoppers can place pieces into an accurate model of their room with a single upload. And any workflow that depends on going from drawing to believable 3D space can, in principle, sit on top of the same pipeline.
This is why VirtualSpaces has invested in a REST API and SDK layer alongside its own applications. The engine is accessible not only through Foursite's interface but also as an integration surface for other products and platforms that want floor‑plan‑to‑3D as a native capability. A single API call can send a plan and receive back a navigable 3D scene, ready to embed inside an existing experience.
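The article does not document the API surface, so the endpoint URL, field names, and payload shape below are hypothetical; the sketch only illustrates the plan‑in, scene‑out request pattern it describes:

```python
import json
import urllib.request

# Hypothetical endpoint: VirtualSpaces' real API surface may differ.
API_URL = "https://api.virtualspaces.example/v1/scenes"


def build_scene_request(plan_png_b64: str, api_key: str) -> urllib.request.Request:
    """Assemble a plan-to-3D request; the caller would urlopen() it."""
    body = json.dumps({
        "plan_image": plan_png_b64,   # base64-encoded JPG/PNG of the plan
        "units": "metric",
        "output": "gltf",             # illustrative output format choice
    }).encode()
    return urllib.request.Request(
        API_URL, data=body, method="POST",
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"})


req = build_scene_request("iVBORw0KG...", "demo-key")
# urllib.request.urlopen(req) would then return the scene payload to embed.
```

The point of the "single API call" framing is that the integrating product never sees the segmentation, scene graph, or mesh stages; it sends an image and embeds the result.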
As the feature set grows (white modeling, AI furnishing, first‑person walkthroughs, shareable links), the distinction between tool and infrastructure becomes clearer. Instead of treating 3D visualization as a one‑off step handled by a specialist, the VirtualSpaces engine positions it as a shared layer that design, sales, and operations can use together. The same model that drives an internal design review can power a client presentation, a furniture upsell, and a portal's listing experience.
VirtualSpaces is not trying to replace designers, nor to chase buzzwords about extended reality. The goal is much more grounded: to make the path from intent to a trustworthy visual incredibly short, and to make that path available to anyone working with residential spaces. When a homeowner uploads a plan, they should see a believable version of their future home within minutes. When a furniture brand launches a new collection, customers should be able to see it inside their own rooms on day one.
Achieving that vision requires more than pretty renders. It requires an engine that understands structure, respects specification, handles messy real‑world inputs, and serves them through modern, accessible interfaces. That is the engine VirtualSpaces has been quietly building: a system that reads drawings like an architect, rebuilds them like a 3D artist, lights them like a visualizer, and ships them like a web product.
The next wave of features (AI‑driven white modeling, photoreal browser graphics, immersive walkthroughs, live shareable links, and precision editing in the browser) pushes that engine closer to being an invisible layer beneath how interiors are bought, sold, and approved. If you work in residential design, staging, renovation, or furniture retail, the promise is straightforward: less guesswork, fewer surprises, and a faster, more confident path from drawing to decision.