Senior Data Engineer - F56200

Job Title: Senior Data Engineer

Location: Dunton, Essex / Hybrid (on site in Dunton every Tuesday; four days per week working from home. Candidates must be flexible to attend on other days when requested; the arrangement is subject to change if required.)

Employment Type: Contract (12 Months)

Hours: Standard 37.5 Hours Per Week

Rate: £42.85 Per Hour - Umbrella Company Basis - Inside IR35

Reference: F56200

Job Description:

This role is for a Global Data Insights and Analytics (GDIA) Senior Data Engineer, who will join our organisation to build a Data Products Team and be part of an innovative journey to transform how data is managed and used across the business. We are dedicated to pioneering an adaptive and collaborative data ecosystem that optimizes every aspect of the data lifecycle.

Our team focuses on comprehensive data ingestion, ensuring regulatory compliance, and democratizing access to enhanced insights. By fostering a culture of continuous improvement and innovation, we empower every team with actionable and enriched insights. Our goal is to drive transformative outcomes and set a new standard of data-powered success. The successful candidate will be responsible for building scalable data products in a cloud-native environment. You will lead both inbound and outbound data integrations, support global data and analytics initiatives, and develop always-on solutions. Your work will be pivotal in ensuring our data infrastructure is robust, efficient, and adaptable to evolving business requirements.

Responsibilities:

  • Collaborate with GDIA product lines and business partners to understand data requirements and opportunities.
  • Build and maintain data products in accordance with GDIA Data Factory standards, ensuring adherence to data quality, governance, and control guidelines.
  • Develop and automate scalable cloud solutions using GCP-native tools (e.g., Dataprep, Dataproc, Data Fusion, Dataflow, Dataform, dbt, BigQuery) and Apache Airflow.
  • Operationalize and automate data best practices (quality, auditability, timeliness, and completeness), and monitor and enhance the performance and scalability of data processing systems to meet organizational needs.
  • Participate in design reviews to accelerate the business and ensure scalability.
  • Advise and direct team members and business partners on our client's standards and processes.

Skills Required:

  • Develop custom cloud solutions and pipelines with GCP-native tools such as Dataprep, Dataproc, Data Fusion, Dataflow, Dataform, dbt, and BigQuery.
  • Proficiency in SQL, Python, and PySpark.
  • Expertise in GCP and open-source tools such as Terraform.
  • Experience with CI/CD practices and tools such as Tekton.
  • Knowledge of workflow management platforms like Apache Airflow and Astronomer.
  • Proficiency in using GitHub for version control and collaboration.
  • Ability to design, optimize, and maintain efficient data pipelines and architectures for data processing.
  • Experience with data security, governance, and compliance best practices in the cloud.
  • An understanding of current architecture standards and digital platform services strategy.
  • Meticulous approach to data accuracy and quality.
  • Excellent problem-solving skills, with evidence of a proactive mindset and a willingness to take the initiative.
  • Strong communication and collaboration skills, capable of working effectively with both technical and non-technical stakeholders as part of a large, global, and diverse team.
  • Ability to work autonomously in areas of high ambiguity.
  • Strong prioritization, coordination, and organizational skills, with a proven ability to balance workload and competing demands to meet deadlines.
  • Ability to work in a collaborative environment, promoting knowledge sharing with the product team and supporting other team members with technical expertise where required.

Skills Preferred:

  • Experience with Java and MDM (Master Data Management).
  • Front-end experience, e.g., Angular or React.
  • Experience with data visualization tools (e.g., Tableau, Power BI).
  • Software quality and performance tooling (e.g., SonarQube, Checkmarx, FOSSA, and Dynatrace).

Experience Required:

  • Extensive experience of programming and scripting with SQL, Python, and PySpark.
  • Extensive experience and ability to work effectively across organizations, product teams and business partners.
  • Extensive experience and knowledge of Agile methodology, including experience writing user stories.
  • Extensive experience and demonstrated ability to lead data engineering projects, design sessions and deliverables to successful completion.
  • Strong experience of GCP, with solutions designed and implemented at production scale.
  • Knowledge of Data Warehouse concepts.
  • Experience with Data Warehouse/ETL processes.
  • Strong process discipline and thorough understanding of IT processes (ISP, Data Security).
  • Critical thinking skills to propose data solutions, test them, and make them a reality.
  • Deep understanding of data service ecosystems, including data warehouses, lakes, metadata, meshes, fabrics, and AI/ML use cases.
  • User experience advocacy through empathetic stakeholder relationships.
  • Effective communication both internally (with Team Members) and externally (with Stakeholders).
  • Must be able to take customer requirements, conceptualize solutions, and build scalable/extensible systems that can be easily expanded or enhanced in the future.

Experience Preferred:

  • Excellent communication, collaboration, influencing skills and the ability to energize a team.
  • Knowledge of data, software and architecture operations, data engineering and data management standards, governance and quality.
  • Hands-on experience in Python using libraries such as NumPy and Pandas.
  • Extensive knowledge and understanding of GCP offerings and bundled services, especially those associated with data operations: Cloud Console, BigQuery, Dataflow, Data Fusion, Pub/Sub or Kafka, Looker Studio, and Vertex AI.
  • Experience with recoding, redeveloping, and optimizing data operations, data science, and analytical workflows and products.
  • Knowledge of Data Governance concepts, including GDPR (General Data Protection Regulation), and how these can impact technical architecture.

Education Required:

  • A 2:2 degree in Computer Science, Engineering, or a related technical field, or an equivalent qualification or experience.

Additional Information:
This role is INSIDE IR35

Do not miss out on your chance of an interview - APPLY NOW!

Our clients are unable to provide sponsorship for Tier 2 (General) visas; therefore, only candidates eligible to work in the UK need apply!

Caresoft Global Limited operates as an Employment Business and Employment Agency.
We are an independent, highly experienced recruitment consultancy dedicated to specialist markets within the Automotive, Aerospace, Agricultural & Construction industries.

No terminology within this advert is intended to unlawfully discriminate on the grounds of age, sex, race or disability and we welcome all applications.
