Friday 5 | Blake Courter, Founder of GCL, Former CTO of nTop
Geometry as Code: Blake Courter’s Blueprint for Implicit CAD
🏭 Breaking the Bottleneck is a weekly manufacturing technology newsletter with perspectives, interviews, news, funding announcements, manufacturing market maps, 2025 predictions, and more!
💥 If you are building, operating, or investing in manufacturing, supply chain, or robots, please reach out to aditya@machinafactory.org.com. I’d love to chat!
If you were forwarded this and found it interesting, please sign up!
Geometry as Code: Blake Courter’s Blueprint for Implicit CAD 🎙️💬
“With implicits, construction is transparent to the user; the kernel’s job is to render and convert back to B-reps, so the modeling tech can live in user space and geometry becomes open-source code.”
From founding SpaceClaim to leading nTopology and now co-founding Gradient Control Laboratories, what experiences have most shaped your vision for the future of design? What advice do you have for early-stage founders in the space?
I think my advice to anyone is to follow one's curiosity. When I was in about first grade, my father handed over his Apple ][+ to me, on which I could play a few games and program in BASIC and Logo. Logo’s turtle graphics were a great playground for what we now call generative design. While I mostly had happy accidents, in about 8th grade, I learned the equation for an ellipse and was able to get Logo to draw one very slowly. At that point, I was hooked. In high school, I figured out 2D transformations in Turbo Pascal on a Mac SE. In college, I wrote graphics for simulations and even a flight simulator on SGIs using GL before it was open. Working for Professor Dan Nosenchuck one summer, I got the department’s first CNC installed and running, for which I wrote a postprocessor in Perl. We also licensed Pro/ENGINEER, which I found so compelling that I only applied for a job at PTC.
PTC was the best business training one could hope for. In the 1990s, they built the most aggressive sales force in the tech industry and provided excellent training to application engineers like me. On the other hand, they had had such a compelling product for so long that they had stopped listening to customers and had no idea how to respond to emerging competition. Not only that, the competition (mostly SolidWorks) copied the worst aspects of the Pro/ENGINEER data model, and no one was talking about how to build solid modelers that were better for users. I left PTC to start working on a better approach to B-rep modeling with David Taylor, while also consulting on the world’s first interactive implicit modeler, Sarah Frisken and Ron Perry’s ADF-powered Kizamu from MERL. The juxtaposition of those two projects instilled a career-long vision for top-down generative design that synthesizes high-fidelity, robust implicits.
It's incredible how much success this model has had. When Stratasys acquired GrabCAD, Stratasys embraced innovation, and we used implicits to get a dozen projects working, ultimately replacing the raster pipeline in several products. nTop has grown into a double-precision modeling platform not just for additive and generative design; it also appears to be better suited to airplane and marine design than traditional surface-driven approaches. At Gradient Control Laboratories, we have a great business booting up custom CAD systems using both our modeling technology and SaaS architecture. On that note, my other advice to anyone starting a company is to be more interested in the business and your customers than the technology.
How did your work on implicit modeling and field-driven design at nTopology create the foundation for AI in engineering workflows?
We started with the observation that what CAD calls “implicit geometry” is the same concept as a classifier in machine learning. Traditional CAD is essentially a very tedious way for humans to tell a computer which points are part of the model and which aren’t. Worse, boundary-representation technology, whether precise B-reps or approximate meshes, doesn’t offer a closed-form way to do this. Complex Boolean libraries embed hand-coded human logic, creating huge parameter spaces that are extremely hard to learn.
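To make the classifier analogy concrete, here is a minimal Python sketch (the names are illustrative, not from any actual CAD kernel): a signed distance function plays the role of the classifier, and membership in the model is simply the sign of its output.

```python
import math

def sphere_sdf(x, y, z, r=1.0):
    """Signed distance to a sphere of radius r at the origin:
    negative inside, zero on the surface, positive outside."""
    return math.sqrt(x * x + y * y + z * z) - r

def classify(point, sdf):
    """Treat the implicit exactly like a binary classifier:
    a point belongs to the model iff its signed distance is <= 0."""
    return sdf(*point) <= 0.0

print(classify((0.0, 0.0, 0.0), sphere_sdf))  # True: origin is inside
print(classify((2.0, 0.0, 0.0), sphere_sdf))  # False: outside the unit sphere
```

The closed-form evaluation is the point: there is no Boolean library or topology bookkeeping between the question “is this point in the model?” and the answer.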
At nTopology, we figured out how to do everything CAD does with implicits, and nTop is a generation ahead in delivering an interactive, implicit CAD experience. At GCL, we’re taking the next step with “Omega,” an SDK for pure implicit representations using open, machine-understandable code for all geometry. We expect that “geometry as code” will pair better with LLMs and related techniques than traditional CAD scripting.
How is geometry as code different from those scripting techniques that generate CAD today, like Zoo, neThing.xyz, or using an LLM to drive a CAD API?
If you want precise CAD B-reps, LLM scripting is the only practical approach. It’s well-suited for generating lots of designs, much like the success of CAD configurators. Check out Infinitive for a good example that bridges generative and parametric design.
Implicits, however, are made of a different “material” than B-reps, and everything is backwards and inside out, kernel included. With B-reps, it’s easier to read the model but hard to construct or evaluate it, which is why proprietary kernels persist. With implicits, construction is transparent to the user; the kernel’s job is to render and convert back to B-reps. That means the modeling tech can live in user space rather than being hidden away, and geometry becomes open-source code. An implicit is simply code that maps any point in space to a signed value indicating containment and distance to a shape. GCL builds modeling libraries and compilers for that code, relying on external kernels for rendering and meshing. The result is that any user, including AI, can fully “understand” the CAD model, enabling more meaningful, knowledge-aware editing.
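A minimal sketch of what “geometry as code” looks like in practice, assuming nothing about GCL’s actual Omega API: every shape is a plain function from a point to a signed value, and Boolean construction is ordinary function composition (min/max yields a valid implicit, though not an exact distance field).

```python
import math

def sphere(cx, cy, cz, r):
    """Sphere as a signed distance function."""
    return lambda x, y, z: math.sqrt((x - cx)**2 + (y - cy)**2 + (z - cz)**2) - r

def box(hx, hy, hz):
    """Axis-aligned box at the origin with half-extents (hx, hy, hz)."""
    def sdf(x, y, z):
        qx, qy, qz = abs(x) - hx, abs(y) - hy, abs(z) - hz
        outside = math.sqrt(max(qx, 0)**2 + max(qy, 0)**2 + max(qz, 0)**2)
        inside = min(max(qx, qy, qz), 0.0)
        return outside + inside
    return sdf

# Booleans are function composition: min is union, max is intersection.
def union(a, b):     return lambda x, y, z: min(a(x, y, z), b(x, y, z))
def intersect(a, b): return lambda x, y, z: max(a(x, y, z), b(x, y, z))
def subtract(a, b):  return lambda x, y, z: max(a(x, y, z), -b(x, y, z))

# A box with a spherical cavity, built entirely in user space.
model = subtract(box(1, 1, 1), sphere(0, 0, 0, 0.6))
print(model(0.0, 0.0, 0.0) > 0)  # True: the center was removed by the sphere
print(model(0.9, 0.9, 0.9) < 0)  # True: near the box corner, outside the sphere
```

Because the whole model is readable code with no hidden kernel state, any user, human or AI, can inspect and edit the construction directly; rendering and meshing are the only jobs left for a kernel.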
Gradient Control Laboratories supports ventures like LatticeRobot and residencies with xNilio, Variant3D, and Rapid Liquid Print. How does this model help de-risk deep tech innovation and accelerate market adoption?
GCL formed when our Technical Director, Luke Church, realized that the tools that made software development easier over the last decade could help implicit modeling scale to CAD-level complexity. Our team, including Mariana Marisol and Dan Rubel, wanted to build that interactive system, so we spent about a year creating LatticeRobot as a testbed for inverse engineering with an implicit kernel that could transpile to other generative systems. Along the way, we built a platform and a culture we could reuse to jumpstart RLP and help transform Variant3D.
We admire pioneers like Shape Data and D-Cubed, as well as BBN-style FRCs, so we chose to self-fund our IP while proving it in customer settings using a similar model. Beyond our modeling library, Omega, now applied in two Variant3D use cases, we’ve built a deliberately boring, robust architecture, “Alpha,” designed for multi-user, server-side compute with a 3D viewer, and more. For startups aiming to deliver world-class CAE/CAD/CAM experiences, we assemble these pieces as needed while retaining rights to enhancements.
As a result, our most significant value isn’t just IP, it’s culture injection. Jon Hirschtick, the founding CEO of SolidWorks and OnShape, defines culture as “the set of organizational behaviors that are automatic,” and we focus on being intentional about those behaviors, then fostering them with clients. We often work with teams graduating from raw research; the key is adding only the minimum new culture needed to scale around their early energy.
Looking ahead to 2025 and beyond, which specialized AI agent use cases, such as automated quoting, simulation assistants, or supplier collaboration bots, do you expect to reach mainstream adoption first, and why?
We’re seeing lots of experiments that add agent interfaces to existing tools, and by year-end, I expect leading platforms to add MCP support, with LLMs complementing traditional interfaces. The market will try many topologies.
You can split the landscape into traditional and data-driven applications. On the traditional side, expect CAE analyst agents trained to set up and interpret simulations, though not on a manufacturer’s proprietary knowledge. Surrogate models trained on synthetic data are already popular. Most vendors will demo these personal-productivity features inside their current platforms, which are helpful but not transformative for the business.
On the data-driven side, it’s worth asking which existing datasets benefit most from agentic interfaces. Hanomi, for example, learns from drawings; automation is the least interesting outcome. More critical is translating generational knowledge into living tools usable by both humans and AI. Part-level 3D CAD data is often gibberish, and startups like Mecado are building the data tier to fix that. nTop is seeing demand to fill this gap too, especially in aircraft and nautical vehicle design. At the assembly level, C-Infinity is adding configuration-space intelligence.
As AI agents take on more engineering tasks, what new skills or ways of thinking will engineers need to stay ahead and make the most of these tools?
It’s an exciting time to be an engineer. GPT and Claude already help us ask better questions and identify blind spots, but we still have to validate the work. It might be faster to spin up quick calculations, yet we still need to think through every step. Get comfortable with LLMs and surrogate models, but keep the pencil sharp.
As we gain the ability to run more experiments and solve more problems, the traditional CAD/PLM stack will struggle to keep pace. We’ll need new interfaces to manage pervasive analysis. PhysicsX’s ai.rplane demo, Intact Solutions’ meshless simulation, and Pasteur Labs’ tesseracts are early signals. At the high end, techniques like uncertainty quantification and design/manufacturing allowables ensure statistical confidence, and I hope to see UX innovation as these mature tools become democratized.
In particular, xNilio is building what looks like the future of system-oriented engineering, combining surprising approaches while firing off agents, so I expect engineers of the future to work alongside AI agents. As a human in this environment, the ability to understand and communicate customer needs to peers and AI will directly impact one's effectiveness. There’s a familiar career pattern: the more responsibility you hold, the more the job becomes sales. Engineering software has a corollary: as tools climb the abstraction ladder, they align more with business value than with technical acumen or personal productivity. So track the value chains you serve, and prioritize revenue and customer outcomes over personal efficiency.
To reach Blake, you can contact him on LinkedIn here. He’s always happy to help and chat!


