Archer is an aerospace company based in San Jose, California, building an all-electric vertical takeoff and landing (eVTOL) aircraft with a mission to advance the benefits of sustainable air mobility. We are designing, manufacturing, and operating an all-electric aircraft that can carry four passengers while producing minimal noise.
Our sights are set high and our problems are hard, and we believe that diversity in the workplace is what makes us smarter, drives better insights, and will ultimately lift us all to success. We are dedicated to cultivating an equitable and inclusive environment that embraces our differences, and supports and celebrates all of our team members.
What You’ll Do
- Architect and develop high-performance data and machine learning (ML) infrastructure for the Midnight aircraft, lab, and flight operations, connecting flight test operations and engineering.
- Build every stage of the data pipeline using the best available open-source tooling and practices.
- Develop automated data processing solutions and visualization dashboards supporting the analysis of flight test data.
- Contribute to the development of world-class Continuous Integration and Continuous Deployment (CI/CD) infrastructure to automate data processing and testing.
- Think creatively, leverage automation, and boost the engineering team's efficiency with new tools.
- Develop key performance indicators (KPIs) to track system performance and availability while continuously improving the infrastructure.
- Write well-engineered documentation to help the team adopt new tools.
MINIMUM EDUCATION REQUIREMENT: Master’s degree in Computer Science or a closely related field.
MINIMUM EXPERIENCE REQUIREMENT: 3 years of experience as a software engineer, data engineer, or in any occupation or job title involving the development and management of large-scale databases and automation pipelines and the scaling of software systems in a manufacturing domain. Experience must include 3 years of experience with Spark, Kafka, dbt, Kubernetes, Terraform, Ansible, and AWS, as well as 3 years of experience with Delta Lake, InfluxDB, clustering, and partitioning.
SPECIAL REQUIREMENT: Must have knowledge of data processing in high-volume production environments. Must be skilled in orchestrating data workflows, processing data, detecting sensitive information, assessing data quality, profiling data, and ensuring robust data governance practices. Must be familiar with the controller area network (CAN bus) protocol.