Data Engineer
Our Culture
CoinShares is an innovative, agile and ambitious organisation. We strive for excellence in everything we do. We have a high-performance culture with a focus on:
Professional and personal integrity
Curiosity and a deep learning mindset
Transparency
Teamwork and collaboration
CoinShares is strongly committed to diversity and inclusion and warmly welcomes candidates from all backgrounds.
The Team
CoinShares deploys discretionary and systematic, computer-driven trading algorithms across digital assets, cryptocurrencies and derivatives. We have a proven and profitable track record in proprietary trading and are building and expanding our market-making and active investment strategies to complement our world-leading ETP & ETF business.
The Engineering team is responsible for all aspects of software development for the firm, including platform engineering, quant engineering, and ML and AI infrastructure and implementation. As part of a nimble team in a growing organisation, you will collaborate closely with colleagues to develop real-time solutions.
Our technical stack runs in a microservices architecture, with Golang and Python services deployed on AWS alongside a Java/React user interface. Our proprietary platform, MATRIX, connects to 15+ trading venues, handling hundreds of millions of messages and orders per day. You will continue to scale and improve this platform as crypto gains further prominence at the heart of the world financial ecosystem.
Role Profile
CoinShares is undertaking a strategic multi-year modernisation of its data and AI capabilities. As a Data Engineer, you will play a key hands-on role in building the next-generation data platform that supports our ETF/ETP operations, trading, research, marketing, corporate reporting, and future AI-driven innovation.
This is an excellent opportunity for an engineer with strong foundations in Python, Airflow, and SQL/Postgres who wants to work on modern data infrastructure in a collaborative team environment.
You will contribute to designing and delivering scalable ingestion pipelines, analytics-ready datasets, and reliable orchestration workflows on AWS.
This role is ideal for someone who enjoys building production-grade data systems and wants to grow into broader platform ownership over time. The role is based in the City of London, with a hybrid working pattern of four days in the office and one day from home.
Responsibilities
Data Pipeline Development
Build and maintain ingestion pipelines from APIs, SaaS systems, and internal data sources.
Write clean, testable Python code to support ETL/ELT workflows.
Develop and optimise SQL transformations in Postgres and analytics layers.
Workflow Orchestration (Airflow)
Implement and monitor Airflow DAGs for scheduled data processing.
Troubleshoot pipeline failures and improve reliability and performance.
Contribute to orchestration best practices across the platform.
Data Modelling & Analytics Enablement
Help create analytics-ready datasets for reporting, automation, and internal teams.
Support the development of domain-oriented schemas.
Work with stakeholders to ensure data is understandable and usable.
Data Quality & Observability
Implement data validation, monitoring, and alerting.
Support schema consistency, documentation, and governance standards.
Assist with improving auditability and reliability of published datasets.
Platform Collaboration
Work closely with senior engineers, product teams, and business users.
Contribute to the ongoing migration and modernisation of legacy systems.
Learn and grow within a modern AWS-based data platform environment.
Skills & Qualifications
Essential
Minimum 2 years of experience in data engineering or related software engineering roles.
Strong skills in Python for building data pipelines and backend services.
Solid SQL experience, especially with Postgres.
Hands-on experience with Airflow (or similar orchestration tools).
Understanding of ETL/ELT concepts and data warehouse/lakehouse patterns.
Ability to write maintainable, production-quality code in a collaborative team.
Strong communication skills and willingness to learn from others.
Desirable
Familiarity with AWS data services (S3, IAM, Aurora, etc.).
Experience ingesting financial or market data.
Interest in data governance, quality, and lineage.
Curiosity about AI/LLM-enabled data workflows.
An interest in digital assets and cryptocurrencies.
Core Skills
Analytical mindset with excellent execution and operational risk awareness.
Delivery focused, with a proven ability to proactively multi-task in a pressurised environment.
Excellent verbal and written communication and interpersonal skills.
Demonstrated capability for identifying and managing critical stakeholders, partners and team members.
Team player with the ability to work collaboratively in a cross-functional team environment.
High attention to detail and a passion for creating an exceptional working environment.
Ability to spot opportunities for improvement in all kinds of situations and events, for the benefit of the business, the team, and the individual.
Department: Software Engineering
Location: London