Data Engineer, Post Trade Analytics at Jump Trading

Shanghai, China · Full time · Graduate

Jump Trading Group is committed to world-class research. We empower exceptional talent in Mathematics, Physics, and Computer Science to seek scientific boundaries, push through them, and apply cutting-edge research to global financial markets. Our culture is unique. Constant innovation requires fearlessness, creativity, intellectual honesty, and a relentless competitive streak. We believe in winning together and unlocking unique individual talent by incentivizing collaboration and mutual respect. At Jump, research outcomes drive more than superior risk-adjusted returns. We design, develop, and deploy technologies that change our world, fund start-ups across industries, and partner with leading global research organizations and universities to solve problems.

Jump Trading's Post Trade Analytics Team plays a crucial role in measuring and optimizing performance and trade execution across all global markets where the firm operates. The team is responsible for developing, operating, and maintaining performance measurement solutions, as well as conducting a wide range of post-trade analyses for various departments, including trading, business development, and other central teams. Every day, the team's core infrastructure collects, organizes, and stores billions of data points related to trading performance.

This role offers the opportunity to develop both business and technical expertise while contributing significantly to our global post-trade analytics effort. We are seeking a Data Engineer with 3+ years of experience to work closely with our Shanghai team, building and maintaining sophisticated datasets and data pipelines that will shape the way we trade.

What you’ll do:

  • Build, operate, and maintain complex datasets and end-to-end data pipelines for analytics and research use cases.
  • Partner directly with traders, quantitative researchers, and data scientists to understand data requirements and deliver reliable data products in both real-time and next-day environments.
  • Ensure data quality and reliability through validation frameworks, monitoring, alerting, lineage, and incident response.
  • Optimize data processing and workflows for performance, cost, and robustness; troubleshoot pipeline failures and production issues.
  • Maintain and improve associated internal tools, documentation, and operational runbooks.

Skills you’ll need:

  • 3+ years of experience as a Data Engineer.
  • Proficient in Python development, DevOps, and Linux environments.
  • Proven expertise building and operating ETL/ELT pipelines (including orchestration, scheduling, and dependency management).
  • Hands-on experience with relational and non-relational databases, including schema design and query optimization.
  • Familiarity with SQL and data analytics tools and libraries, such as Pandas and NumPy.
  • C++ experience is a plus.
  • Strong analytical and problem-solving abilities.
  • Other duties as assigned.
  • Reliable and predictable availability.
  • Shanghai-based.