
Roberto Perez Alcolea

Software Engineer at Netflix

Roberto Perez Alcolea is a Software Engineer at Netflix who focuses on the JVM development lifecycle, spanning build automation and testing infrastructure. With a deep appreciation for the JVM ecosystem and build tools, he works on improving how Netflix engineers build, test, and publish software.

Roberto contributes to maintaining Netflix's build-package-publish infrastructure through Nebula (Gradle) plugins, helping with dependency management and artifact publishing across the organization's projects. He's also involved in testing strategy initiatives, including working on E2E testing frameworks that incorporate observability and failure analysis.

Roberto advocates for modern integration testing practices and helps teams adopt container-based testing approaches, which led him to become a Testcontainers Community Champion. He enjoys sharing knowledge at conferences, contributing to open-source projects, and collaborating with engineers on testing best practices.

Roberto believes strongly in community-driven innovation and enjoys both learning from fellow engineers and sharing his experiences to help advance the broader JVM and testing communities.

Upcoming conference sessions featuring Roberto Perez Alcolea

Lag to Lightning: Confident, Automated Chan

How do you confidently deploy automated changes—such as dependency updates—across thousands of repositories, in hours instead of weeks? At Netflix, we've reimagined the way code changes and dependency updates are delivered—swiftly, safely, and at scale. In this talk, we'll share how we did it, blending technical innovation with a focus on developer trust and safety.

We’ll dive into the systems and strategies that enabled us to move from manual, slow updates to a fully automated, zero-touch process. Key to this transformation were advanced dependency management resolution rules, automated SCM changes, and comprehensive artifact observability, allowing us to track and validate every change across a vast codebase.
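As a rough illustration of what a dependency resolution rule can look like, here is a minimal Gradle Kotlin DSL sketch (a config fragment, not Netflix's actual tooling; the group, module, and version names are hypothetical):

```kotlin
// build.gradle.kts — hypothetical example of steering dependency resolution.
// When an automated update rolls out, a rule like this pins every project
// to the validated version and records why, so the change is auditable.
configurations.all {
    resolutionStrategy {
        eachDependency {
            // Illustrative coordinates only — not a real Netflix rule.
            if (requested.group == "com.example" && requested.name == "example-lib") {
                useVersion("2.1.0")
                because("platform-managed update, validated by automated checks")
            }
        }
    }
}
```

Centralizing rules like this in shared build logic (for example, a convention plugin) is what lets a platform team roll a single validated version out across many repositories at once.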

But automating at scale only works if engineers are confident in the process. Drawing from Netflix’s experience, we’ll explore how sharing validation results, feedback loops, and impact analysis data builds trust in automation—empowering teams to embrace rapid, platform-driven changes. We’ll discuss how shifting verification left, providing developer self-service insights, and analyzing failure impacts enable both high velocity and robust quality, even as the number of automated changes grows.

Finally, we’ll highlight future directions—such as language-agnostic tooling and proactive security measures—offering practical takeaways for any organization looking to accelerate and scale their automated change management with confidence.

Thursday Oct 2 @ 15:15 @ GOTO Copenhagen 2025
