Data Engineer – Python, Linux, Apache Airflow, AWS or GCP
I’m working with a small but outstanding Data Analytics consultancy looking to recruit a Data Engineer with at least 2 years' experience to work on a long-term client project. They work with a very impressive client list, delivering bespoke data projects that drive client decision-making.
In this role, you'll design, build, and optimize data pipelines and infrastructure to support their business intelligence and data analytics tools.
Key responsibilities:
* Develop data pipelines using Python, SQL, and cloud platforms (ideally GCP)
* Integrate data from databases, data lakes, and other sources
* Implement efficient ETL/ELT processes for high-quality, reliable data
* Optimize pipeline performance and scalability
* Collaborate with data teams to deliver impactful data solutions
Required skills:
* 2+ years' experience in a Data Engineering role
* 2+ years of work experience using Python
* 2+ years of work experience with Linux systems and their administration
* 2+ years of work experience with various databases (BigQuery, PostgreSQL, MSSQL, etc.)
* Experience with cloud platforms, ideally GCP
* Understanding of data modeling, ETL, and data quality best practices
* Strong problem-solving and analytical skills
* Excellent communication and collaboration abilities
This will be a great role in a small data consultancy that punches above its weight, working with many blue-chip companies and offering end-to-end data services.
Salary: Circa £50k
Duration: Permanent
Location: Hybrid 4 days from home / 1 day from the office in Central London (Wednesday)
APPLY NOW