Simon Wardan
Principal Engineer at Terem
Simon Wardan has been building software for over 25 years—and geeking out on technology for even longer. From early web apps to complex enterprise systems, he’s seen the landscape evolve and tackled challenges across nearly every layer of the stack.
For the past eight years, Simon has worked with Terem to help clients integrate AI into real-world solutions—long before it was called AI, back when it was just machine learning. His focus today is on making advanced tools like LLMs practical and usable by everyday engineering teams.
When he’s not working on software, Simon spends his time with his two kids and pursues his passion for Brazilian jiu-jitsu.
Upcoming conference sessions featuring Simon Wardan
AI as the Engine, not the Product
AI is everywhere—but most teams still treat it as an accessory, not as infrastructure.
Think about it this way: you use a database to persist information, and an API gateway to expose structured data to the rest of the business. What function could an LLM serve in your system? If you don’t have an answer, you’re missing out on this technology.
In this talk, we’ll explore how LLMs can serve as dynamic building blocks—not just for ML specialists, but for everyday engineers. We'll move beyond the usual "AI for automation" narrative and show how LLMs can be treated like adaptable modules in your software architecture.
Drawing on real-world case studies from our work with clients, we’ll dive into patterns that signal LLMs might be the right tool—scenarios like:
- Systems with tons of heuristic rules
- Tasks that tolerate fuzziness but need scale
- Workflows with verifiable outcomes
- Data locked in PDFs or images
You’ll leave with a practical framework for spotting LLM-shaped gaps in your own stack—and a clearer idea of when AI isn’t just a solution, but part of the system.
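As a small illustration of the idea above, here is a minimal sketch of treating an LLM as an ordinary module rather than a product: a plain typed function that turns unstructured invoice text into validated data. The function and data shapes are hypothetical examples, not material from the talk, and the model call is stubbed out so the sketch runs offline; in practice the `llm` parameter would wrap a real model client.

```python
import json
from dataclasses import dataclass
from typing import Callable

@dataclass
class Invoice:
    vendor: str
    total_cents: int

PROMPT = (
    "Extract the vendor name and the total in cents from this invoice text. "
    'Reply with JSON only, like {"vendor": "...", "total_cents": 0}.\n\n'
)

def extract_invoice(text: str, llm: Callable[[str], str]) -> Invoice:
    """Treat the LLM like any other component: prompt in, validated data out."""
    raw = llm(PROMPT + text)
    data = json.loads(raw)  # fails loudly if the model drifts off-format
    return Invoice(vendor=str(data["vendor"]), total_cents=int(data["total_cents"]))

# Stand-in for a real model call, so the sketch runs without network access.
def fake_llm(prompt: str) -> str:
    return '{"vendor": "Acme Pty Ltd", "total_cents": 12050}'

invoice = extract_invoice("ACME PTY LTD ... TOTAL DUE: $120.50", fake_llm)
print(invoice)  # Invoice(vendor='Acme Pty Ltd', total_cents=12050)
```

The point of the pattern is the boundary: callers see a typed function with a verifiable output, not a chat interface, so the LLM slots into the architecture like a database or gateway would.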
Get conference pass