
Artificial Intelligence • Hardware • SaaS
Kernelize builds a Triton-based platform and backend plugins that enable day-0 LLM model support across CPUs, GPUs, and NPUs. The company provides Triton and vLLM backend integrations, optimized kernels, and compiler expertise to accelerate and simplify LLM inference on diverse hardware, helping teams integrate high-performance kernels into existing ML stacks and reduce development delays. Kernelize builds on the Triton open-source compiler community and focuses on hardware-agnostic, performance-oriented AI inference tooling for organizations deploying large language models.
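To give a sense of the kind of hardware-agnostic kernel such a platform targets, here is a minimal sketch using the standard Triton vector-add pattern (this is generic tutorial-style code, not Kernelize's own implementation): a kernel is written once in Python and compiled by Triton for whatever device backend is available.

```python
import torch
import triton
import triton.language as tl


@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance processes one BLOCK_SIZE-wide slice of the inputs.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard the ragged final block
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)


def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n_elements = out.numel()
    # Launch grid is derived from the compile-time block size.
    grid = lambda meta: (triton.cdiv(n_elements, meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, n_elements, BLOCK_SIZE=1024)
    return out
```

The kernel source stays the same across devices; retargeting happens in the compiler backend, which is roughly the layer the backend plugins described above operate at.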
1 - 10 employees
Founded 2025
Posted October 22
Compiler Engineer: build compiler backends for AI accelerators at Kernelize, focusing on low-level optimizations and cutting-edge compiler technologies to maximize performance.
Posted October 22
Runtime Engineer: build a scalable runtime for AI hardware accelerators using Triton, work on advanced technologies, and help shape Kernelize's core technology.