Singapore

Optum Tech (UnitedHealth Group) is hiring a Data Engineering Consultant

Optum Tech, part of UnitedHealth Group, seeks a Data Engineering Consultant to lead the design, development, and maintenance of our scalable data pipelines and solutions. This role collaborates with cross-functional teams to architect robust data integration workflows and provide technical mentorship.

What You'll Do

  • Lead the design, development, and maintenance of scalable data pipelines using Java, APIs, and cloud technologies like Azure, Databricks, and Snowflake
  • Collaborate with cross-functional teams to architect and implement robust data integration workflows, leveraging ADF, Azure Databricks, and GitHub for version control and automation
  • Oversee data modeling, ETL processes, and performance optimization to ensure high-quality, reliable data delivery
  • Mentor and guide team members, providing technical leadership and best practices in data engineering
  • Ensure adherence to security, compliance, and data governance standards across all data engineering activities
  • Drive continuous improvement by evaluating and adopting new tools like Terraform, Kafka streaming, and Generative AI
  • Troubleshoot and resolve complex data issues, ensuring system reliability and scalability
  • Actively participate in project planning, code reviews, and stakeholder communications

What We're Looking For

  • Bachelor’s degree in Computer Science, Engineering, or a related field
  • 7+ years of hands-on experience in data engineering, with strong expertise in Java programming
  • Proven experience with Databricks, Snowflake, and data pipeline orchestration
  • Experience with Generative AI, a solid understanding of LLMs, and hands-on experience building agents
  • Proven experience with cloud platforms, especially Azure, and tools like Azure Data Factory, Azure Databricks, and Snowflake
  • Proficient in designing and developing APIs and integrating data solutions across diverse systems
  • Familiarity with version control systems, particularly GitHub
  • Solid understanding of data modeling, ETL processes, and data governance best practices
  • Excellent problem-solving, communication, and leadership skills
  • Ability to mentor team members and collaborate effectively with cross-functional teams

Nice to Have

  • Experience with Terraform, Kafka streaming, and Generative AI technologies

Technical Stack

  • Languages & Frameworks: Java, APIs
  • Cloud & Platforms: Azure, Databricks, Snowflake, Azure Data Factory
  • Tools: GitHub, Terraform, Kafka

Our culture is guided by inclusion and a commitment to caring, connecting, and growing together. We are dedicated to mitigating our impact on the environment and enabling equitable care.

Required Skills

Java, APIs, Azure, Databricks, Snowflake, Azure Data Factory, GitHub, Terraform, Kafka, Data Engineering, Data Pipeline Orchestration, Generative AI, LLMs
About the Company
Optum Tech (UnitedHealth Group)
Optum Tech is a global leader in health care innovation. Our teams develop cutting-edge solutions that help people live healthier lives and help make the health system work better for everyone. From advanced data analytics and AI to cybersecurity, we use innovative approaches to solve some of health care’s most complex challenges.
Job Details

  • Department: Data and Analytics
  • Category: Data
  • Posted: 2 months ago