Software Engineer, Perception

XWING

Software Engineering
San Francisco, CA, USA · Concord, CA, USA

About Us:

Xwing is a cutting-edge aerospace technology company focused on revolutionizing the future of aviation. Backed by industry veterans and top-tier investors, we are on a mission to build a safer, more efficient, and more accessible air transportation system powered by autonomous flight. By combining artificial intelligence, proprietary software, and hardware solutions, Xwing is bringing uncrewed aircraft to commercial aviation. We are computer scientists, roboticists, and aerospace experts at the forefront of transforming the aviation industry.

Founded in 2016, Xwing is based in Northern California, with offices in San Francisco and Concord, CA. In 2021, Xwing completed the world's first fully autonomous cargo flight with a remote crew supervising the flight from the ground. In 2023, Xwing became the first company with a Federal Aviation Administration (FAA) certification project for a large uncrewed aircraft system.

Visit xwing.com to learn more, and see all available positions under "Careers" to join our rapidly growing team.

Who We're Looking For:

We're seeking a skilled Machine Learning Software Engineer with industry experience in perception for autonomous vehicles. Using onboard aircraft sensors (cameras, lidar, IR cameras, inertial sensors, etc.), the Xwing Perception Team develops safety-certifiable robotics, computer vision, and machine learning solutions that enable safe autonomous flight, such as runway and taxiway detection, ground obstacle detection, vision-based navigation and mapping, and tracking. As part of the team, you will implement robust perception solutions, train models, and deploy them as low-latency, efficient algorithms on our airborne autonomy system. You will play a crucial role in translating perception research into practical applications while thriving in a collaborative and dynamic environment. If you are passionate about innovation in autonomous vehicles, this role is for you!

Responsibilities include but are not limited to:

  • Develop and evaluate robotics perception algorithms (deep learning, machine learning, computer vision, etc.) on our unique autonomous flight datasets (camera, lidar)
  • Set up and maintain scalable dataset, metrics evaluation, and visualization pipelines for your perception algorithms
  • Investigate and implement algorithm improvements to exceed safety metrics requirements
  • Rigorously optimize, deploy, and benchmark the algorithms on a variety of compute hardware such as CPUs, GPUs, and possibly FPGAs. This may include (see the illustrative sketch after this list):
    • Optimize machine learning models for parallel processing (GPUs, hardware accelerators)
    • Integrate machine learning models with inference engines and runtime libraries that are optimized for specific hardware platforms (e.g. Apache TVM, ONNX Runtime, TensorRT, TensorFlow Serving)
    • Write and optimize GPU-specific code, often using libraries such as OpenCL, CUDA (for NVIDIA GPUs), or ROCm (for AMD GPUs) to accelerate model inference
    • Investigate and apply techniques for model compression and quantization to reduce memory and compute requirements while maintaining model performance
  • Implement unit tests, integration tests, and end-to-end tests for perception components
  • Contribute intuitive, readable, scalable, and modular code to the team repositories
  • Create and maintain documentation for the key perception components you own
  • Work closely with other teams (systems, software, navigation, flight test, etc.) to ensure seamless integration of perception algorithms into testing and production environments
  • Contribute to critical team tooling, processes, and best practices
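
As an illustration of the deployment work described above, here is a minimal sketch of exporting a small PyTorch model to ONNX and running it with ONNX Runtime. It is purely an example under assumed tooling: the toy model, file name, and input shape are hypothetical placeholders and do not represent Xwing's actual perception stack.

```python
# Purely illustrative sketch: export a toy PyTorch model to ONNX and run it
# with ONNX Runtime. Model, names, and shapes are hypothetical placeholders,
# not Xwing components.
import numpy as np
import torch
import torch.nn as nn
import onnxruntime as ort

class TinyRunwayDetector(nn.Module):
    """Toy stand-in for a perception model that regresses a single box."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 4)  # e.g. one bounding box [x, y, w, h]

    def forward(self, x):
        return self.head(self.backbone(x).flatten(1))

model = TinyRunwayDetector().eval()
dummy_input = torch.randn(1, 3, 224, 224)

# Export the model to ONNX so it can be served by a hardware-optimized runtime.
torch.onnx.export(
    model,
    dummy_input,
    "runway_detector.onnx",
    input_names=["image"],
    output_names=["box"],
    dynamic_axes={"image": {0: "batch"}, "box": {0: "batch"}},
)

# Run the exported graph with ONNX Runtime and sanity-check it against PyTorch.
session = ort.InferenceSession("runway_detector.onnx",
                               providers=["CPUExecutionProvider"])
onnx_box = session.run(None, {"image": dummy_input.numpy()})[0]
torch_box = model(dummy_input).detach().numpy()
print("max abs diff vs. PyTorch:", np.abs(onnx_box - torch_box).max())
```

A real deployment would go further, for example converting the graph with an engine such as TensorRT and applying quantization, but those steps depend on the target hardware and are beyond this sketch.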

Required Qualifications:

  • Master's degree in Computer Science, Machine Learning, Robotics, or a related field
  • Proven software development experience in the autonomous vehicle industry, with a strong focus on robotics perception algorithms and machine learning
  • Strong problem-solving skills and the ability to optimize perception algorithm performance
  • Proficiency in a prototyping programming language such as Python
  • Proficiency in a compiled programming language such as C/C++
  • Familiarity with common robotics perception libraries and tools such as:
    • Deep learning frameworks (e.g., TensorFlow, PyTorch)
    • Computer vision libraries (e.g., OpenCV)
    • Lidar processing (e.g., PCL, Open3D)
    • Middleware (e.g., ROS, protobuf)
  • Knowledge of algorithm performance profiling and optimization techniques
  • Excellent communication and collaboration skills
  • Demonstrated ability to contribute timely deliverables in a fast-paced, agile development environment

Desirable Qualifications:

  • PhD in Computer Science, Machine Learning, Robotics, or a related field
  • Familiarity with GPU or FPGA acceleration for machine learning
  • Familiarity with model optimization and compilation techniques such as quantization, pruning, and kernel optimization
  • Experience with containerization and deployment technologies (e.g., Docker, Kubernetes)
  • Experience with edge computing and deploying models on resource-constrained devices
  • Experience developing software running under an RTOS or hypervisor on resource-constrained hardware architectures
  • Experience with cloud-based ML services (e.g. AWS, GCP)
  • Experience setting up DevOps/MLOps pipelines

Employment Terms:

To conform to U.S. Government aerospace technology export regulations (EAR / ITAR), the applicant must be a U.S. citizen, a lawful permanent resident of the U.S., a protected individual as defined by 8 U.S.C. 1324b(a)(3), or eligible to obtain the required authorizations from the U.S. Department of State. In compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the U.S. and to complete the required employment eligibility verification form upon hire.

Employer provides equal employment opportunities to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status. In addition to federal law requirements, employer complies with applicable state and local laws governing nondiscrimination in employment. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation, and training.

Employer expressly prohibits any form of workplace harassment based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status. Improper interference with the ability of employees to perform their job duties may result in discipline up to and including discharge.