ML/AI Ops Engineer (GenAI)
ML Ops Engineer:
Primary skills - MLOps, AIOps, CCAI, and Dialogflow CX
Responsibilities:
Conversational AI Operations:
Deploy and manage Google Cloud CCAI solutions including Dialogflow CX virtual agents and Agent Assist.
Monitor and optimize conversational flows, intent accuracy, and containment rates.
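As context for the containment-rate duty above, here is a minimal sketch of how that metric is typically computed: the fraction of sessions the virtual agent resolved without escalating to a live agent. The `Session` record and its fields are hypothetical, not part of any CCAI API.

```python
from dataclasses import dataclass

@dataclass
class Session:
    """Hypothetical record of one virtual-agent conversation."""
    session_id: str
    escalated_to_human: bool

def containment_rate(sessions: list[Session]) -> float:
    """Fraction of sessions handled end-to-end by the virtual agent
    (i.e. without live-agent escalation)."""
    if not sessions:
        return 0.0
    contained = sum(1 for s in sessions if not s.escalated_to_human)
    return contained / len(sessions)

sessions = [
    Session("a1", escalated_to_human=False),
    Session("a2", escalated_to_human=True),
    Session("a3", escalated_to_human=False),
    Session("a4", escalated_to_human=False),
]
print(containment_rate(sessions))  # 0.75
```

In practice the session outcomes would come from Dialogflow CX interaction logs exported to BigQuery rather than an in-memory list.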
GenAI & LLM Operations:
Operate LLM-powered assistants and knowledge-grounded chatbots.
Manage prompt templates, RAG pipelines, and knowledge integrations.
Monitor metrics such as response quality, hallucination rate, and latency.
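To illustrate the metrics named above, a small sketch of two of them: a nearest-rank 95th-percentile latency and a hallucination rate over human-reviewed responses. The sampling scheme and the `hallucinated` label are assumptions for illustration; real pipelines would pull these from logging/analytics exports.

```python
def p95_latency(latencies_ms: list[float]) -> float:
    """Nearest-rank 95th-percentile latency from raw samples (ms)."""
    ordered = sorted(latencies_ms)
    idx = max(0, round(0.95 * len(ordered)) - 1)
    return ordered[idx]

def hallucination_rate(reviews: list[dict]) -> float:
    """Share of reviewed responses flagged as hallucinated.
    Labels are assumed to come from human review or an eval harness."""
    if not reviews:
        return 0.0
    return sum(1 for r in reviews if r["hallucinated"]) / len(reviews)

latencies = list(range(1, 101))  # synthetic samples: 1..100 ms
print(p95_latency(latencies))    # 95

reviews = [{"hallucinated": False}, {"hallucinated": True},
           {"hallucinated": False}, {"hallucinated": False}]
print(hallucination_rate(reviews))  # 0.25
```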
Monitoring & Reliability:
Implement observability using Cloud Monitoring, Logging, and BigQuery analytics.
Troubleshoot conversational AI issues and support incident response.
Integration & Automation:
Integrate AI assistants with enterprise systems such as CRM, ticketing systems, and knowledge bases.
Develop APIs, webhooks, and automation for AI workflows.
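As a sketch of the webhook work this role involves: a pure function that parses a Dialogflow CX webhook request (fulfillment tag and session parameters) and builds the matching webhook response payload. The `lookup-order` tag and `order_id` parameter are hypothetical examples; in production this function would sit behind an HTTPS endpoint (e.g. Cloud Run or Cloud Functions) registered as a CX webhook.

```python
def handle_webhook(request_json: dict) -> dict:
    """Build a Dialogflow CX webhook response for an incoming request.

    Routes on the fulfillment tag configured in the CX console.
    The 'lookup-order' tag and 'order_id' parameter are illustrative only.
    """
    tag = request_json.get("fulfillmentInfo", {}).get("tag", "")
    params = request_json.get("sessionInfo", {}).get("parameters", {})

    if tag == "lookup-order":
        order_id = params.get("order_id", "unknown")
        reply = f"Order {order_id} is being processed."
    else:
        reply = "Sorry, I can't help with that yet."

    # Dialogflow CX expects fulfillment text nested under
    # fulfillmentResponse.messages[].text.text[].
    return {
        "fulfillmentResponse": {
            "messages": [{"text": {"text": [reply]}}]
        }
    }

req = {
    "fulfillmentInfo": {"tag": "lookup-order"},
    "sessionInfo": {"parameters": {"order_id": "A-1042"}},
}
resp = handle_webhook(req)
print(resp["fulfillmentResponse"]["messages"][0]["text"]["text"][0])
```

Keeping the handler a pure request-to-response function makes it easy to unit test independently of the web framework hosting it.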
AI Lifecycle Management:
Manage CI/CD pipelines, versioning, and releases for AI agents, prompts, and knowledge bases.
Conduct testing and continuous improvement of AI performance.
Required Skills:
Experience with Google Cloud Platform (GCP) and Contact Center AI (CCAI).
Hands-on experience with Dialogflow CX.
Experience working with Generative AI / LLM applications.
Proficiency in Python or Node.js.
Experience with REST APIs and webhook integrations.
Familiarity with monitoring, logging, and AI performance analytics.
Preferred Qualifications:
Experience with Vertex AI and RAG architectures.
Familiarity with LangChain, LlamaIndex, or vector databases.
Experience integrating with contact center platforms (Genesys, Five9, Cisco).
Google Cloud certifications (Cloud Engineer or ML Engineer).