The successful candidate will:
- Develop ETL pipelines to integrate and test very large alternative datasets.
- Architect, deploy, and manage cloud-based systems for storing and exploring very large datasets.
- Monitor, support, debug, and extend existing trading and research infrastructure.
Required Skills and Experience:
- Proficiency in Python, particularly with numerical libraries such as NumPy, pandas, and Matplotlib.
- Basic knowledge of AWS.
- Basic knowledge of databases and SQL.
- Familiarity with development practices such as version control with Git and unit testing.
- A quantitative mindset.
- Strong team player with a collaborative attitude.
Nice to Have:
- Experience creating dashboards or using data visualization software (e.g., Tableau, Dash).
- In-depth AWS experience (e.g., DynamoDB, RDS, S3, Lambda, AWS CDK).
- Advanced database knowledge (query optimization, relational vs non-relational databases, etc.).
- Experience with parallel computation.
- Experience working with geospatial data using GeoPandas and xarray.
- Financial knowledge (not required).
Why Join?
- Be part of a team at the forefront of quantitative and systematic investing.
- Work in a dynamic and collaborative environment.
- Opportunity to work on cutting-edge technology and innovative projects.
- Competitive compensation and benefits package.
