✨ About The Role
- Build and integrate components of Ramp's Analytics Platform and Machine Learning Platform.
- Create tools that improve the data experience for teams across Ramp.
- Build batch and streaming data pipelines with technologies such as Airflow, Snowflake, and Kafka.
- Collaborate with stakeholder teams to productionize analytical products and machine learning systems.
- Build reliable, scalable, maintainable, and cost-efficient systems across the data stack.
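To make the pipeline work concrete, here is a minimal, library-free sketch of the extract → transform → load pattern behind batch pipelines like those described above. In production each step would typically be an orchestrated task (e.g. an Airflow task) and the sink a warehouse table (e.g. in Snowflake); the field names and values here are purely illustrative.

```python
from datetime import date

def extract():
    # Stand-in for reading a day's raw events from a source system.
    return [
        {"card_id": "c1", "amount_cents": 1250, "day": date(2024, 1, 1)},
        {"card_id": "c1", "amount_cents": 800, "day": date(2024, 1, 1)},
        {"card_id": "c2", "amount_cents": 300, "day": date(2024, 1, 1)},
    ]

def transform(rows):
    # Aggregate spend per card per day.
    totals = {}
    for row in rows:
        key = (row["card_id"], row["day"])
        totals[key] = totals.get(key, 0) + row["amount_cents"]
    return totals

def load(totals, sink):
    # Stand-in for an idempotent MERGE into a warehouse table;
    # here the "table" is just a dict.
    sink.update(totals)
    return sink

warehouse = {}
load(transform(extract()), warehouse)
print(warehouse[("c1", date(2024, 1, 1))])  # → 2050
```

Keeping each stage a pure function of its inputs is what makes pipelines like this easy to retry, backfill, and test.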
⚡ Requirements
- Experience with workflow orchestrators such as Airflow, Dagster, or Prefect.
- A strong background building infrastructure on cloud platforms such as AWS, GCP, or Azure.
- Proficiency in SQL and familiarity with data warehouses such as Snowflake, Redshift, or BigQuery.
- Strong Python programming skills and a track record of building reliable data infrastructure.
- A passion for making data systems observable, reliable, scalable, and highly automated.
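At their core, the orchestrators listed above all do the same thing: execute a DAG of tasks in dependency order, with retries and scheduling layered on top. A minimal sketch of that core idea, using only the standard library (task names and dependencies are illustrative):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on,
# as in an Airflow/Dagster/Prefect DAG definition.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

def run(dag, tasks):
    # Resolve a valid execution order, then run each task once.
    order = list(TopologicalSorter(dag).static_order())
    results = {name: tasks[name]() for name in order}
    return order, results

# Trivial task bodies standing in for real pipeline steps.
tasks = {name: (lambda n=name: f"{n} done") for name in dag}
order, results = run(dag, tasks)
print(order)  # → ['extract', 'transform', 'quality_check', 'load']
```

Real orchestrators add the hard parts this sketch omits: scheduling, retries with backoff, parallelism across independent branches, and observability of each run.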