Data Engineer
Job description
Data Engineer – HIRING ASAP
Start date: ASAP
Duration: 12 Months
Location: 1 day in Essex, 4 days from home
Rate: £300-£321 per day PAYE
Summary:
Our client's team focuses on comprehensive data ingestion, regulatory compliance, and democratizing access to enhanced insights. By fostering a culture of continuous improvement and innovation, it empowers every team with actionable, enriched insights. The goal is to drive transformative outcomes and set a new standard for data-powered success.

The successful candidate will be responsible for building scalable data products in a cloud-native environment. You will lead both inbound and outbound data integrations, support global data and analytics initiatives, and develop always-on solutions. Your work will be pivotal in ensuring the data infrastructure is robust, efficient, and adaptable to evolving business requirements.

Responsibilities:
- Collaborate with GDIA product lines and business partners to understand data requirements and opportunities.
Skills:
- 5+ years of programming and scripting experience with SQL, Python, and PySpark.
- 5+ years of experience working effectively across organizations, product teams, and business partners.
- 5+ years of experience with Agile methodology, including writing user stories.
- 5+ years of demonstrated ability to lead data engineering projects, design sessions, and deliverables to successful completion.
- 2+ years of GCP Cloud experience with solutions designed and implemented at production scale.
- Knowledge of Data Warehouse concepts.
- Experience with Data Warehouse/ETL processes.
- Strong process discipline and thorough understanding of IT processes (ISP, Data Security).
- Critical thinking skills to propose data solutions, test them, and make them a reality.
- Deep understanding of data service ecosystems, including data warehousing, lakes, metadata, meshes, fabrics, and AI/ML use cases.
- User experience advocacy through empathetic stakeholder relationships.
- Effective communication both internally (with team members) and externally (with stakeholders).
- Must be able to take customer requirements, conceptualize solutions, and build scalable/extensible systems that can be easily expanded or enhanced in the future.