<html><head><title>Machine Learning and Computer Vision Engineer (multiple levels) - San Francisco, CA 94102</title></head>
<body><h2>Machine Learning and Computer Vision Engineer (multiple levels) - San Francisco, CA 94102</h2>
<p><b>Who We Are</b></p><p>
Neya Systems, the industry leader in developing off-road autonomy for commercial and defense customers, has multiple openings (junior to senior) in our computer vision / machine learning group. We are looking for exceptional software engineers interested in solving real-world problems with cutting-edge solutions. We have positions available in our Pittsburgh, PA headquarters and our Boston, MA satellite office, as well as some remote positions.
</p><p>You will have opportunities to work on a diverse set of computer vision, machine learning, and autonomy-stack challenges in applications for autonomous off-road ground vehicles, autonomous construction equipment, unmanned aircraft systems (UAS), and image processing and analysis. You will be part of an experienced team of scientists and will help drive critical architecture decisions in our machine learning and computer vision pipeline.</p><p><b>
This position is posted for California; however, we encourage applicants to consider joining Neya's core team at our Pittsburgh headquarters. The city is lauded as the apex of the autonomous driving industry and for its low cost of living and high quality of life. Pittsburgh is regularly ranked among the best places to live in the country, with high marks in categories such as "Best Places to Live", "Top College Towns", and "Cool Cities to Visit".</b></p><p>
Interested in joining our team? Take a look below to see what we are looking for. If you feel that you meet most, but not all, of the requirements below, please reach out to us; you might still be a good fit.</p><p><b>
What You Will Do</b></p><p>
Responsibilities will vary depending on your experience level, but examples include:</p><ul><li>Design, implement, and optimize cutting-edge ML systems for object classification, labeling, object detection, and/or prediction</li><li>Enhance our unified back-end architecture for solving multiple ML problems</li><li>Extract features and other important characteristics from LIDAR and stereo camera systems</li></ul><p><b>
What We Are Looking For</b></p><p>
Various experience levels are welcome. Basic requirements for the position include:</p><ul><li>Bachelor's, Master's, or Ph.D. in Robotics, Computer Science, Electrical Engineering, or a related discipline</li><li>Experience with sensor data processing, environment modeling, and large-scale point-cloud processing</li><li>Experience with a subset of machine learning algorithms (e.g., classification, filtering, deep learning, segmentation)</li><li>Excellent mathematical reasoning skills, especially with probability</li><li>Comfortable writing high-quality C++11 or C++14 code on Linux</li><li>Understanding of sensor error modeling</li><li>Passion for developing advanced robotics and intelligent systems</li></ul><p><b>
What Makes Neya Unique</b></p><ul><li>Our staff work on immediate and pressing real-world problems in robotics, autonomy, computer vision, and machine learning applications</li><li>We have gathered a group of some of the smartest people in unmanned systems</li><li>A great company culture centered on continual learning, refining our craft, and a healthy work/life balance</li><li>Competitive salary and benefits, including a 401(k), an Employee Stock Ownership Plan (ESOP), bonuses, and company-paid medical, dental, and life insurance</li><li>Flexible hours and working conditions</li></ul><p><i>
This position requires use of information that is subject to the International Traffic in Arms Regulations (ITAR) and/or the Export Administration Regulations (EAR). Non-U.S. persons must meet eligibility requirements for access to export-restricted information. The ITAR/EAR defines a U.S. person as a U.S. Citizen, U.S. Permanent Resident (i.e., 'Green Card Holder'), Political Asylee, or Refugee.</i></p>
</body>
</html>