Luminal, a next-generation inference infrastructure company, announced the successful closing of a USD 5.3 million seed funding round to deliver “speed-of-light inference” to developers and organizations. The round was led by Felicis Ventures, with participation from prominent angel investors including Paul Graham, Guillermo Rauch, and others.
Luminal was founded to address a growing inefficiency in AI infrastructure: while cutting-edge hardware continues to advance rapidly, software for inference is failing to keep pace. The company argues that, despite massive investments in powerful accelerators, much of this compute remains underutilized because the software layer cannot fully exploit the chips’ potential. As noted by Luminal, “the software that runs on those chips continues to lag far behind, leading to huge swaths of these chips running dark and unutilized.”
To solve this, Luminal has built a tight integration between a high-performance compiler and an inference cloud. This infrastructure enables developers to deploy models with a single call, for example by running luminal.deploy().
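The announcement does not publish the SDK itself, so the following is only an illustrative sketch of what a one-call deploy workflow of this shape might look like. The client class, model name, and endpoint URL are all hypothetical stand-ins; only the deploy() call is taken from the announcement.

```python
from dataclasses import dataclass

@dataclass
class Deployment:
    """Handle returned for a deployed model (hypothetical)."""
    model: str
    endpoint: str

class LuminalClient:
    """Hypothetical stand-in for a real client object.

    In the described architecture, the compiler would optimize the
    model and the inference cloud would host it; this mock only
    returns a deployment handle to show the calling pattern.
    """
    def deploy(self, model: str) -> Deployment:
        # Placeholder endpoint; a real service would return its own URL.
        return Deployment(model=model, endpoint=f"https://inference.example.com/{model}")

luminal = LuminalClient()
deployment = luminal.deploy("my-model")  # the single call described above
print(deployment.endpoint)
```

The point of the pattern is that compilation and hosting are hidden behind one call, so the developer never manages kernels or servers directly.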





