Enterprise Data Engineer
About Us
We are a forward-thinking, data-driven organisation committed to using cutting-edge technology to drive innovation. At the heart of our success is our ability to transform complex data into actionable insights. We pride ourselves on fostering a collaborative environment where talented professionals can thrive, develop their skills, and make a tangible impact on our global operations.
Role Overview
As an Enterprise Data Engineer, you will play a pivotal role in designing, building, and maintaining the robust data infrastructure that powers our organisation. You will be responsible for developing scalable data pipelines, ensuring data integrity across the enterprise, and enabling our analytics and data science teams to derive value from our vast data assets.
This is a high-impact position suited to a technical expert who enjoys solving complex architectural challenges and is passionate about engineering excellence. You will work at the intersection of software engineering and data management to create a seamless, high-performance data ecosystem.
Key Responsibilities
- Data Pipeline Development: Design, develop, and optimise scalable ETL/ELT processes to ingest data from diverse sources into our central data platform.
- Architectural Design: Contribute to the design and implementation of enterprise-grade data architectures, including data lakes, warehouses, and real-time streaming solutions.
- Quality Assurance: Implement rigorous data validation and cleansing processes to ensure the highest standards of data quality, consistency, and reliability.
- Performance Optimisation: Continuously monitor and tune data processes and database performance to ensure maximum efficiency and low latency.
- Collaboration: Work closely with Data Scientists, Analysts, and stakeholders across the business to understand their requirements and deliver technical solutions that meet them.
- Data Governance: Ensure all data solutions adhere to security, privacy, and compliance standards, maintaining the integrity of our enterprise data assets.
- Process Automation: Identify opportunities to automate manual data processes, improving team productivity and reducing the risk of human error.
Required Skills and Qualifications
- Technical Expertise: Extensive experience in data engineering, with a proven track record of building production-grade data pipelines.
- Programming: Proficiency in languages such as Python, Scala, or Java, with a focus on writing clean, maintainable code.
- SQL Mastery: Advanced knowledge of SQL for complex data manipulation, transformation, and analysis.
- Cloud Platforms: Experience working with major cloud providers (AWS, Azure, or GCP) and their associated data services.
- Big Data Technologies: Familiarity with modern data tools such as Spark, Kafka, Airflow, dbt, or Snowflake.
- Analytical Mindset: Strong problem-solving skills with the ability to analyse complex datasets and identify patterns or anomalies.
- Communication: Excellent verbal and written communication skills, with the ability to explain technical concepts to non-technical stakeholders.
- Education: A degree in Computer Science, Data Engineering, Information Technology, or a related field (or equivalent professional experience).
What We Offer
- A highly competitive salary and performance-based bonus scheme.
- Comprehensive health and wellness benefits.
- Opportunities for continuous professional development and certifications.
- A flexible working environment that promotes a healthy work-life balance.
- The chance to work on large-scale, impactful projects within a supportive and innovative team.