Data Platform Developer (12-month contract)

Date Posted: 30 Oct 2025
Location: Calgary, Alberta, CA

ARC has had an exciting growth story driven by the contributions of our people, our principled business strategy, and our high-performance culture. We are a Canadian energy company with a strong track record of operational and financial performance. Today, we are the largest pure-play Montney producer, Canada’s third-largest natural gas producer, and the country’s largest producer of condensate.


In alignment with our values, we are proud to produce Canadian energy safely and efficiently. Producing low-cost, reliable energy strengthens our resilience in the evolving global energy system and enables ARC to create value for our people, shareholders, stakeholders, and communities.


We have a long-term view and are committed to best-in-class performance in every aspect of our business. Following our guiding principles has shaped the company we are today and will underpin our success in the future. Through innovation, teamwork, and a commitment to operational excellence, our diverse team drives our company’s success. From the office to the field, our talented professionals work hard each day to safely execute our business and create positive and lasting impacts for our stakeholders.

 

THE OPPORTUNITY
We are currently seeking a Data Platform Developer for a 12-month term to join our Data Analytics team. Reporting to the Supervisor, Data Services, the Data Platform Developer will be responsible for designing, building, and maintaining scalable data platforms and infrastructure that support analytics, reporting, and business intelligence across the organization. The successful candidate will bring a strong technical foundation in data engineering, with a focus on data reliability, security, and operational continuity.

 

RESPONSIBILITIES
Database Support 

  • DBA Support: Administration and support of relational databases (on-premises and cloud), including:
    • Performance tuning and optimization
    • Automation of maintenance tasks such as backups, refreshes, and monitoring (illustrated in the sketch after this list)
    • Replication and high availability solutions
    • Troubleshooting across database, OS, and storage/network layers
  • Support and management of database monitoring and alerting tools (SolarWinds, custom alerts)
  • Automation of provisioning and configuration via scripting
  • Development and support of RDBMS objects: views, functions, triggers, etc.
  • Support for SOX and regulatory audits 
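
As a concrete illustration of the kind of maintenance automation described above, here is a minimal Python sketch that checks SQL Server backup history and exits non-zero when a database has no recent full backup, so a scheduler or monitoring tool (such as a custom SolarWinds alert) can raise it. The connection string, server name, and threshold are hypothetical stand-ins; the same idea applies to Oracle or Postgres using their respective catalog views.

    import datetime
    import sys

    import pyodbc  # assumes the Microsoft ODBC driver for SQL Server is installed

    # Hypothetical connection string and threshold -- replace with site-specific values.
    CONN_STR = ("DRIVER={ODBC Driver 18 for SQL Server};"
                "SERVER=sqlprod01;DATABASE=msdb;Trusted_Connection=yes;")
    MAX_AGE = datetime.timedelta(hours=26)

    # Most recent full backup ('D') per database, from SQL Server's backup history tables.
    SQL = """
        SELECT d.name, MAX(b.backup_finish_date)
        FROM sys.databases d
        LEFT JOIN msdb.dbo.backupset b
               ON b.database_name = d.name AND b.type = 'D'
        WHERE d.name <> 'tempdb'
        GROUP BY d.name
    """

    def main() -> int:
        stale = []
        with pyodbc.connect(CONN_STR) as conn:
            for name, last_backup in conn.execute(SQL).fetchall():
                if last_backup is None or datetime.datetime.now() - last_backup > MAX_AGE:
                    stale.append(f"{name}: last full backup {last_backup}")
        if stale:
            # A non-zero exit code lets the calling scheduler or agent raise the alert.
            print("Databases missing a recent full backup:\n" + "\n".join(stale))
            return 1
        return 0

    if __name__ == "__main__":
        sys.exit(main())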

Design & Development 

  • Build robust, fault-tolerant data pipelines using tools such as Databricks, Apache Airflow, and Spark
  • Develop and automate ETL workflows to ingest structured and unstructured data from APIs, databases, and third-party systems (a minimal orchestration sketch follows this list)
  • Implement data cleansing, transformation, and loading into cloud-based data lakes or warehouses (Databricks, Azure)
  • Monitor pipeline performance, troubleshoot issues, and optimize scalability and cost-efficiency 
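
To make the pipeline and ETL bullets above more concrete, here is a minimal Apache Airflow sketch built around a hypothetical source API and landing path: a single daily task pulls one day of records from an HTTP endpoint and writes them to a raw zone, from which downstream transformation and loading into Databricks tables could be added as further tasks.

    from datetime import datetime

    import requests
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Hypothetical endpoint and lake path -- stand-ins for a real source system and storage location.
    SOURCE_URL = "https://api.example.com/v1/production-data"
    LANDING_PATH = "/mnt/datalake/raw/production_data"

    def extract_and_land(ds: str, **_) -> None:
        """Pull one day of records from the source API and write them to the raw zone."""
        response = requests.get(SOURCE_URL, params={"date": ds}, timeout=60)
        response.raise_for_status()
        with open(f"{LANDING_PATH}/{ds}.json", "w") as fh:
            fh.write(response.text)

    with DAG(
        dag_id="daily_production_ingest",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
        tags=["etl", "sketch"],
    ) as dag:
        PythonOperator(
            task_id="extract_and_land",
            python_callable=extract_and_land,
        )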

Platform Infrastructure 

  • Coordinate vendor and platform support
  • Maintain robust data platforms and management systems
  • Optimize data delivery and automate processes to enhance scalability, efficiency, and reliability
  • Develop and maintain backup and recovery strategies to ensure data protection and business continuity 

Infrastructure as Code (IaC) & Cloud Operations 

  • Design, implement, and manage cloud infrastructure using IaC tools such as Terraform
  • Automate provisioning and scaling of Databricks environments and related cloud resources (see the sketch after this list)
  • Maintain infrastructure definitions in version control systems (e.g., Git)
  • Collaborate with platform and security teams to enforce access controls and compliance standards 
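
As a sketch of how the provisioning automation above might be wired into a pipeline (for example, a GitHub Actions or Azure DevOps job), the following Python script drives the standard Terraform CLI against a version-controlled module. The module path is a hypothetical placeholder, and a real pipeline would normally gate the apply step behind a plan review.

    import subprocess
    import sys
    from pathlib import Path

    # Hypothetical module directory -- the Terraform configs live in version control alongside this script.
    TF_DIR = Path("infrastructure/databricks")

    def run(*args: str) -> None:
        """Run a Terraform CLI command in the module directory, failing loudly on error."""
        subprocess.run(["terraform", *args], cwd=TF_DIR, check=True)

    def main() -> int:
        run("init", "-input=false")
        run("plan", "-input=false", "-out=tfplan")
        # In practice the saved plan would be reviewed/approved before this step runs.
        run("apply", "-input=false", "tfplan")
        return 0

    if __name__ == "__main__":
        sys.exit(main())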

Collaboration & Stakeholder Support 

  • Partner with data scientists, analysts, and business stakeholders to understand data needs and deliver tailored platform solutions
  • Provide support for data-related technical issues and infrastructure requirements 

Technology & Continuous Improvement 

  • Monitor and optimize platform workflows for performance and cost efficiency
  • Recommend and implement improvements to platform reliability, efficiency, and scalability
  • Document platform processes, data flows, and system architectures for transparency and knowledge sharing 

 

WHAT YOU WILL BRING TO ARC:   

  • A bachelor's degree in computer science or a related field, or equivalent work experience
  • A minimum of 3 years of experience in data engineering, with a strong understanding of data infrastructure and platform development
  • Proficiency in SQL, Python, and cloud technologies
  • Hands-on experience with Terraform, GitHub Actions, and Azure DevOps for infrastructure management and workflow automation
  • Experience administering Oracle, Postgres, and SQL Server databases
  • Hands-on experience with Databricks for pipeline development and orchestration
  • Strong problem-solving skills and the ability to work collaboratively across technical and business teams
  • A commitment to continuous improvement, innovation, and delivering high-quality solutions in a fast-paced and dynamic environment 

 

REWARDING OPPORTUNITY:

  • An opportunity to be part of delivering responsible energy by developing and growing your career with us
  • Working collaboratively with, and learning from, some of the most talented and experienced people in our industry
  • A culture of caring and high performance
  • Focus on health and well-being
  • An environment where all individuals are treated fairly and respectfully, have equal access to opportunities and resources, feel a sense of belonging, and can contribute fully to the organization’s success
  • Meaningful work that makes a difference in the Canadian energy industry

 

We thank you for your interest in ARC; however, only those candidates selected for an interview will be contacted. Accessibility accommodation for applicants is available upon request during the talent acquisition process.