Engineering · March 1, 2026 · 10 min read

Introducing Composer: A Node-Based Image Editor Built for the GPU Era

By Will Waltz

Photoshop turned 35 this year. Thirty-five years of features bolted onto an architecture designed when 8 MB of RAM was generous and a single CPU core was all you had. Layer upon layer of legacy — not the creative kind, the technical debt kind. Filters that still block the UI thread. A plugin system older than most of its users. GPU acceleration that arrived piecemeal, shader by shader, with half the pipeline still running on the CPU because rewriting the foundation would break everything.

Adobe can't start over. They have 30 million subscribers and decades of file format compatibility to maintain. Every architectural compromise they made in 1994 is still load-bearing in 2026.

We don't have that problem. So we're building Composer.

What Procreate Did for Painting, Composer Does for Image Creation

When Procreate launched, it didn't try to be Photoshop on a tablet. It asked a simpler question: what would a painting app look like if you designed it from scratch for this hardware? No legacy menus. No feature bloat. Just a canvas, brushes that felt alive, and an engine that used every ounce of the GPU.

The result wasn't a lesser Photoshop. It was a better painting experience. Procreate didn't win by having more features — it won by being fundamentally better at the thing it was designed for.

Composer applies the same philosophy to image creation, compositing, and visual effects. Not a Photoshop clone with fewer features. A different tool built on a different foundation, designed for how creative professionals actually work in 2026 — with full GPU acceleration, a node-based pipeline, real-time simulation, and hand-keyed animation as first-class citizens.

The Case Against Layers

Photoshop's layer stack is a filing cabinet. You pile things on top of each other, and the order matters. Want to apply a blur to three specific layers? You merge them, losing editability, or you convert to a smart object — Photoshop's duct tape for the fact that its architecture can't express the relationship you actually want.

Nodes don't have this problem.

A node graph is a directed pipeline. Every operation — color correction, blur, mask, composite, distortion — is an explicit, visible connection between inputs and outputs. You can see exactly what feeds into what. You can rewire anything at any point without flattening, merging, or praying that "undo" gets you back to where you were.
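A minimal sketch of the idea, in Python with a hypothetical `Node` class (not Composer's actual API): each operation stores its inputs explicitly, so evaluation pulls values through the pipeline, and rewiring is just reassigning an input.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One operation in the graph: inputs are other nodes, op combines them."""
    name: str
    op: callable                      # pure function of the input values
    inputs: list = field(default_factory=list)

    def evaluate(self):
        # Recursively pull values through the pipeline; nothing is ever
        # flattened, so rewiring an input and re-evaluating always works.
        return self.op(*(n.evaluate() for n in self.inputs))

# Toy "image" is a single number; real nodes would carry pixel buffers.
source = Node("source", lambda: 100)
blur   = Node("blur",   lambda v: v * 0.5, [source])
bright = Node("bright", lambda v: v + 10,  [blur])

print(bright.evaluate())      # 60.0

# Rewire: apply brightness before blur, with no merge or flatten step.
bright.inputs = [source]
blur.inputs   = [bright]
print(blur.evaluate())        # 55.0
```

Reordering the same two operations changes the output without destroying either node, which is the property a layer stack can only approximate with smart objects.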

This isn't a new idea. Nuke, Houdini, and Fusion have used node-based compositing for decades in film and VFX. But those tools are priced and designed for studio pipelines, not individual creatives. Composer brings node-based thinking to the same audience Photoshop serves — designers, digital artists, photographers, content creators — without studio-scale licensing or a learning curve that assumes you have a pipeline TD on speed dial.

Why nodes matter for creative work:

  • Non-destructive by design — every operation is editable at any point. No "flatten to apply." No point of no return. Your entire history of creative decisions lives in the graph, adjustable forever.
  • Parallel processing — independent branches of a node graph execute simultaneously on the GPU. A layer stack is sequential by nature. Nodes are parallel by nature. This is why Composer can process complex compositions in real time while Photoshop chokes on a smart object stack.
  • Reusable workflows — save a chain of nodes as a preset. Apply it to a different project. Share it with your team. In Photoshop, recreating a complex effect means remembering every adjustment layer, blend mode, and mask you used. In Composer, you wire it once and reuse it forever.
  • Transparency — the graph shows you exactly what's happening to your image at every stage. No black-box filters. No wondering why the output looks different from what you expected. Trace any pixel back through the pipeline and see every operation that touched it.
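The parallelism point above can be sketched with a standard topological-level pass (Kahn's algorithm); the node and edge names here are invented for illustration. Nodes that land in the same level have no dependency on each other, so a scheduler could dispatch them to the GPU simultaneously, whereas a layer stack always yields one node per level.

```python
from collections import defaultdict

def parallel_levels(edges, nodes):
    """Group a DAG into levels; nodes within a level share no dependencies,
    so a GPU scheduler could dispatch each level's work simultaneously."""
    indeg = {n: 0 for n in nodes}
    out = defaultdict(list)
    for src, dst in edges:
        out[src].append(dst)
        indeg[dst] += 1
    level = [n for n in nodes if indeg[n] == 0]
    levels = []
    while level:
        levels.append(sorted(level))
        nxt = []
        for n in level:
            for m in out[n]:
                indeg[m] -= 1
                if indeg[m] == 0:
                    nxt.append(m)
        level = nxt
    return levels

# Two independent color-grade branches feeding one composite:
edges = [("srcA", "gradeA"), ("srcB", "gradeB"),
         ("gradeA", "comp"), ("gradeB", "comp")]
nodes = ["srcA", "srcB", "gradeA", "gradeB", "comp"]
print(parallel_levels(edges, nodes))
# [['srcA', 'srcB'], ['gradeA', 'gradeB'], ['comp']]
```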

Simulation as a Creative Tool

Here's something Photoshop can't do at all: simulate.

Want particles drifting across a composition? In Photoshop, you paint them frame by frame, or you leave Photoshop entirely and open After Effects. In Composer, you drop a particle emitter node into your graph, set the physics parameters, and watch it run. Adjust gravity, turbulence, wind, lifetime — all in real time, all feeding directly into your composition.

Simulation in Composer isn't a bolted-on feature. It's a core pipeline stage, running on the GPU alongside everything else.

  • Particle systems — fire, smoke, dust, sparks, rain, abstract motion graphics. Define emitters, forces, and collision surfaces. The simulation runs in the viewport at full speed.
  • Fluid dynamics — paint-like flows, ink dispersion, liquid distortion. Not a canned filter — an actual simulation you art-direct by adjusting physical parameters.
  • Procedural generation — noise-driven displacement, growth patterns, fractal geometry. Feed simulation output into any downstream node. Use a fluid sim to drive a displacement map. Use particles to scatter texture elements.
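A toy version of the emitter described above, with made-up constants for gravity, wind, and frame rate; a real engine would run this per-particle update in a GPU shader, but the logic per frame is the same: integrate forces, age each particle, cull the dead ones.

```python
import random

GRAVITY = (0.0, -9.8)
WIND    = (1.5, 0.0)
DT      = 1 / 60          # one frame at 60 fps

def emit(n):
    """Spawn n particles at the emitter origin with random upward velocity."""
    return [{"pos": [0.0, 0.0],
             "vel": [random.uniform(-1, 1), random.uniform(2, 5)],
             "life": random.uniform(0.5, 2.0)} for _ in range(n)]

def step(particles):
    """Advance one frame: integrate forces, age particles, cull dead ones."""
    alive = []
    for p in particles:
        p["vel"][0] += (GRAVITY[0] + WIND[0]) * DT
        p["vel"][1] += (GRAVITY[1] + WIND[1]) * DT
        p["pos"][0] += p["vel"][0] * DT
        p["pos"][1] += p["vel"][1] * DT
        p["life"]   -= DT
        if p["life"] > 0:
            alive.append(p)
    return alive

particles = emit(200)
for _ in range(180):          # simulate three seconds
    particles = step(particles)
print(len(particles))         # 0: max lifetime is 2 s, so all have expired
```

Every constant here — gravity, wind, lifetime — is exactly the kind of parameter the text describes adjusting in real time while the simulation plays.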

This is where Composer diverges most sharply from Photoshop. Photoshop treats an image as a static artifact you manipulate with tools. Composer treats an image as the output of a process — a process you design, simulate, and refine.

Hand-Keyed Animation: Not an Afterthought

Photoshop's animation support is a timeline bolted onto a layer stack. It works the way you'd expect something to work when it was never part of the original design: awkwardly. Frame-by-frame pixel animation or tweened layer properties, with a UI that fights you at every step.

Composer builds animation into the node graph itself. Every parameter on every node is keyframeable. Opacity, blur radius, particle emission rate, color correction curves, displacement intensity — if it has a value, you can animate it over time.

  • Per-node keyframing — select any node, open its animation curves, set keys. Standard stuff, but integrated deeply rather than tacked on.
  • Spring and physics-based easing — not just bezier curves. Composer's animation system supports spring dynamics, bounce, and overshoot natively. Motion that feels physical, not mathematical.
  • Onion skinning and drawing tools — for frame-by-frame illustration and hand-drawn animation directly in the viewport, composited live with the rest of your node graph.
  • Expression-driven values — link parameters to each other with simple expressions. Make a blur radius follow a color channel's average value. Make particle emission rate respond to audio amplitude. The graph becomes programmable without writing code.
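Spring-based easing of the kind listed above reduces to a small damped-spring integrator. This sketch uses semi-implicit Euler with invented stiffness and damping values, not Composer's actual solver; an under-damped spring overshoots its target and settles, which is the "physical, not mathematical" motion the text refers to.

```python
def spring_ease(target, stiffness=120.0, damping=12.0, dt=1/60, frames=120):
    """Animate a value toward `target` with damped-spring dynamics,
    integrated with semi-implicit Euler. Under-damped settings overshoot
    and settle, unlike a fixed bezier curve."""
    value, velocity = 0.0, 0.0
    samples = []
    for _ in range(frames):
        force = stiffness * (target - value) - damping * velocity
        velocity += force * dt
        value += velocity * dt
        samples.append(value)
    return samples

curve = spring_ease(100.0)
print(max(curve) > 100.0)            # True: the spring overshoots the target
print(abs(curve[-1] - 100.0) < 1.0)  # True: it settles near the target
```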

Animation in Composer isn't a separate mode you switch into. It's a dimension of the same workspace. You're always one keyframe away from making any static composition move.
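The parameter-linking idea can be sketched as follows; `Param`, `channel_average`, and the binding style are all hypothetical, chosen only to illustrate a parameter whose value comes from an expression evaluated each frame rather than a constant.

```python
def channel_average(pixels, channel):
    """Average of one channel across an image given as (r, g, b) tuples."""
    return sum(p[channel] for p in pixels) / len(pixels)

class Param:
    """A node parameter: either a constant or an expression over context."""
    def __init__(self, value=0.0, expression=None):
        self.value = value
        self.expression = expression   # callable re-evaluated each frame

    def get(self, context):
        return self.expression(context) if self.expression else self.value

# Bind blur radius to the red channel's average, scaled into a useful range.
blur_radius = Param(expression=lambda ctx: channel_average(ctx["image"], 0) / 25.5)

frame = {"image": [(255, 0, 0), (51, 0, 0)]}   # two "pixels"
print(blur_radius.get(frame))   # 6.0, i.e. (255 + 51) / 2 / 25.5
```

Swap the image context for an audio-amplitude context and the same mechanism drives the particle-emission example from the list above.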

The Five Pillars

Composer is built on five architectural commitments that inform every design decision:

1. GPU-first rendering. The entire pipeline — every node, every composite operation, every simulation step — runs on the GPU. Not "GPU-accelerated where possible." GPU-native, full stop. The CPU handles UI and file I/O. Everything that touches pixels runs in shaders. This is the single biggest architectural difference from Photoshop, and it's only possible because we started from zero.

2. Non-destructive everything. There is no destructive operation in Composer. No flatten. No merge. No rasterize. Every operation is a node in the graph, and every node is editable at any time. Your project file is a complete record of every creative decision, fully adjustable in any order.

3. Real-time feedback. What you see in the viewport is the final output, rendered live. Change a parameter and see the result immediately — not after a progress bar, not after a "processing" spinner. Real-time means real-time: adjust a color curve while a particle simulation plays and watch both update simultaneously in the viewport.

4. Simulation and motion as equals to static editing. Composer doesn't treat animation and simulation as extensions of a still-image editor. They're peers. The same graph that composites a still image can drive a 60fps animation sequence. No export-to-another-tool workflow. No feature walls between still and moving images.

5. Desktop-native performance. Composer is a native desktop application. Not a web app pretending to be software. Not an Electron wrapper. Native GPU access, native file system integration, native memory management. The creative professionals we're building for work with large files, complex compositions, and tight deadlines. They need software that respects their hardware.

Photoshop Comparison, Directly

Let's be specific.

|  | Photoshop | Composer |
| --- | --- | --- |
| Architecture | CPU-first with selective GPU acceleration | GPU-native, full pipeline |
| Editing model | Destructive by default; non-destructive via workarounds (smart objects, adjustment layers) | Non-destructive by design — every operation is a node |
| Compositing | Layer stack (sequential, order-dependent) | Node graph (parallel, order-flexible) |
| Simulation | None | Particles, fluids, procedural generation — built in |
| Animation | Bolted-on timeline, limited parameter control | Per-node keyframing, spring physics, expressions |
| Real-time preview | Partial — many operations require rendering | Full pipeline renders live in the viewport |
| File format | PSD (proprietary, complex, legacy-burdened) | Open graph format + standard image export |
| Target | Everything for everyone | Image creation, compositing, and VFX for creative professionals |

Photoshop is a better tool for photo retouching, print layout, and the hundreds of niche workflows its 35-year feature set supports. We're not trying to replace all of that. Composer is purpose-built for the creative professional who spends their day compositing images, building visual effects, and creating motion content — and who's tired of working around an architecture that was never designed for what they're asking it to do.

Where We Are

Composer is in active development. The node engine is working. The GPU pipeline renders in real time. Particle simulation runs in the viewport. We're deep in the work of building the tool set — the brushes, the masks, the color tools, the export pipeline — that turns a capable engine into a creative application.

We're building this because we've spent years working with creative professionals who fight their tools more than they use them. The technology exists to do this right. The only reason it hasn't been done is that the incumbents can't afford to start over.

We can.



Key Takeaways

  • Photoshop's 35-year-old architecture can't be fully GPU-accelerated without a ground-up rewrite Adobe can't afford to do
  • Node-based editing is non-destructive by design, parallel by nature, and transparent at every stage
  • Real-time simulation — particles, fluids, procedural generation — is a core creative tool, not a bolted-on feature
  • Animation is built into every node, with spring physics and expression-driven parameters
  • Composer is GPU-native, non-destructive, and purpose-built for image creation, compositing, and visual effects

Composer is being built by Wise Mountain. Learn more about our Game & 3D Co-Development work.

Composer · image editing · VFX · GPU · node-based · creative tools