Accelerating LLM-Driven Developer Productivity at Zoox

Abstract

Over the past year, Zoox has invested in integrating Large Language Models (LLMs) into internal developer workflows through a company-wide initiative called Zoox Intelligence (ZI). This talk shares how we approached this transformation — from identifying high-impact use cases (like code authoring, code review, search, onboarding, and context-aware chatbots) to designing infrastructure, choosing the right LLMs, and embedding them into real developer tools (CLI, IDE, Slack, web apps).

Beyond software development tasks, we’re also tailoring our platform to support workflows unique to autonomous vehicle development, such as triaging events observed by our autonomy fleet, and supporting our eventual customers. Our vision is to create a flexible, intelligent platform that supports both technical and non-technical teams — from engineers to program managers, and beyond.

Whether you're just beginning to explore LLMs in your engineering org or scaling your internal AI capabilities, this session will offer tangible ideas, design patterns, and organizational strategies.


Speaker

Amit Navindgi

Staff Software Engineer, Developer Experience @ Zoox, Leading Applied AI Initiatives

Amit Navindgi is a Staff Software Engineer at Zoox, where he leads Zoox Intelligence — an initiative applying Large Language Models (LLMs) across engineering, operations, customer support, and autonomy. He builds products and platforms that combine technical depth with thoughtful design, creating interactions that are both intuitive to use and elegant to build. His expertise spans Applied AI, Observability, Semantic Search, Experimentation Platforms, Data Engineering, and incident and on-call management tools.

He also runs the Zoox AI Hackathon and The Assembly, a cross-functional forum for knowledge sharing, collaboration, and innovation.

Earlier in his career, he developed web applications and distributed systems at Veritas Technologies and focused on Natural Language Processing at the Xerox Research Centre Europe.


From the same track

Session

Powering the Future: Building Your GenAI Infrastructure Stack

Behind every productivity leap is a rock-solid platform. Go under the hood with Intuit’s GenOS team to see how vector stores, prompt management, RAG pipelines, and agent orchestration come together to serve ~100 million users.


Maggie (Kun) Hu

Group Product Manager for the Core AI Platform @Intuit

Session

AI-Driven Productivity: From Idea to Impact

In this session you'll learn how product leaders turn GenAI enthusiasm into an enterprise-ready blueprint for real productivity gains.


Jyothi Nookula

Product Director @Netflix, AI Product Management Instructor, Previously @Meta, @Amazon, and @Etsy

Session

Choosing Your AI Copilot: Maximizing Developer Productivity

The AI coding agent landscape evolves weekly. This talk compares today’s frontrunners, shows where each shines, and shares prompts, policies, and “rules templates” that turn code suggestions into production-quality output.


Sepehr Khosravi

Machine Learning Platform Engineer @Coinbase, Instructor @UC Berkeley (First Vibe Coding Course)