About this role
An On-Prem Infrastructure Engineer at Arize deploys and maintains the Arize platform in customer environments, building software and tooling to manage large-scale on-prem and SaaS systems. The role focuses on adapting cloud-native architectures for on-premise use, supporting deployments across Kubernetes and major cloud providers, and collaborating with customers and product teams. The team emphasizes autonomy, continuous improvement, and shipping reliable, scalable infrastructure.
Required Skills
- Kubernetes
- AWS
- GCP
- Azure
- Monitoring
- Troubleshooting
- Automation
- Networking
- Security
- Release Automation
About Arize AI
Arize AI (arize.com) is a unified observability and evaluation platform for large language models (LLMs) and AI agents. It helps teams monitor model performance, detect drift and errors, evaluate outputs and agents, and surface root-cause signals for retraining across development and production. Aimed at MLOps practitioners, ML engineers, and product teams, Arize provides end-to-end tooling to measure, debug, and improve deployed LLM applications. The platform integrates with model and data pipelines to deliver metrics, explanations, and automated alerts that keep models reliable in production.