Circle Optics CEO Zak Niazi presenting "Beyond The Line of Sight" at the Drones & Robotics Summit in NYC September 2022.
The 1950s were the golden age of air travel. Flying was extremely glamorous: people dressed up, booze was served in fancy glassware, and the meals consisted of fresh-cut ham, lobster, and prime rib. Between 1948 and 1955 there were 127 mid-air collisions in the United States. Because the speeds of those aircraft were relatively low, those 127 accidents led to only 226 deaths. By 1955, air traffic controllers had become worried that the increasing speeds of aircraft would drive fatalities up. Those fears were realized in 1956, when two aircraft collided over the Grand Canyon, killing everyone aboard both planes. The deadly collision was front-page news for days and brought the issue of airline safety before the public, paving the way for Congress to pass a bill two years later authorizing the creation of the FAA to modernize the antiquated traffic control system.
The wreckage you see here was caused by small UAVs colliding with aircraft in 2021. The risk is significant because drones are small and hard to see until they are close by, and large aircraft are slow to maneuver. As drones take to the skies in greater numbers in the coming years, the risk of these collisions grows, and addressing it is one of the FAA’s top priorities today. The solution is to enable drones to detect and avoid these collisions before they happen by equipping them with long-range optical sensors. Otherwise, pilots will be forced to keep drones within visual line of sight at all times. That requirement prevents the industry from taking off, because the most useful drone operations happen beyond visual line of sight. Drones are increasingly being used for logistics and for inspection of power lines, gas pipelines, and other critical infrastructure, and they need to fly beyond human line of sight in order to be cost effective.
In the United States, the FAA has been incredibly successful at eliminating fatalities; today it is more common to die from a lightning strike than from a plane crash. However, the FAA faces a new risk today in the form of UAVs.
The current rules require an individual to drive alongside a drone surveying miles of equipment just to maintain FAA line-of-sight compliance. Drone delivery services are slow to take off because they exist only in the handful of cities where the FAA has granted beyond-visual-line-of-sight waivers. Likewise, first responders’ use of drones has been of limited value, because drones are not allowed to fly ahead, out of the line of sight of police and first responders, to provide advance awareness of a situation before they arrive on scene.
Because drones are small and easy to maneuver while planes are large and hard to maneuver, the FAA has put the responsibility for seeing and avoiding other aircraft on the UAV. The UAV needs to see aircraft ahead, to the sides, and partially behind its direction of flight. The faster the UAV is flying, the farther out it needs to see to prevent a collision. The UAV also needs to detect cloud cover so it can avoid flying through clouds, and it needs to detect people on the ground and stay well clear of them in case a malfunction causes it to fall out of the sky.
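The speed-versus-range relationship above can be made concrete with a back-of-the-envelope calculation. This is a minimal sketch, not an FAA formula: the reaction time and avoidance margin below are assumed illustrative values, and the worst case modeled is a head-on approach, where the closing speed is the sum of both aircraft speeds.

```python
# Illustrative sketch: the faster the closing speed, the farther out
# the UAV must detect an intruder. All numbers are assumptions for
# illustration, not regulatory values.

def required_detection_range_m(uav_speed_mps: float,
                               intruder_speed_mps: float,
                               reaction_time_s: float = 5.0,
                               avoidance_margin_m: float = 150.0) -> float:
    """Worst-case head-on encounter: closing speed is the sum of both
    speeds; range must cover the distance closed while the system
    detects and reacts, plus a margin to complete the maneuver."""
    closing_speed = uav_speed_mps + intruder_speed_mps
    return closing_speed * reaction_time_s + avoidance_margin_m

# A 20 m/s delivery drone meeting a 70 m/s general-aviation aircraft:
print(required_detection_range_m(20.0, 70.0))  # 90*5 + 150 = 600.0 m
```

Even with these modest assumed numbers, the sensor must resolve a small aircraft at over half a kilometer, which is why sensor range dominates the detect-and-avoid problem.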
Radar cannot solve the detect-and-avoid problem: it is bulky, it suffers interference when other drones use the same frequency, and it cannot detect clouds, only precipitation. Camera sensors are the optimal solution because they are low in size and weight and, being passive, have no interference effects. The most common approach today is to array multiple cameras together to cover a panoramic field of regard around the drone. The first problem with current camera-based solutions is that they lack range. Often a fisheye lens is used to capture a wide field of view; however, that solution suffers from distortion: the center of the image has high quality, but quality degrades toward the edge of the image. This is a fundamental physics-based limitation of wide-angle lenses. The second problem with multi-camera solutions is that the fields of view of the cameras overlap. Where there is overlap, there is redundant image capture and processing, and ultimately SWaP (Size, Weight, and Power) is wasted.
The image on the right-hand side is from Casia X, a popular DAA solution. You can see 50% overlap, and therefore wasted imagery, between every pair of adjacent images. Beyond wasting processing power, this overlap prevents the imagery from the multiple cameras from being fused into a single continuous panoramic image. Such an image could be fed to a remote pilot to provide a panoramic first-person view. Instead, because of stitching errors, the camera feeds must be presented to pilots individually, adding cognitive burden for remote pilot operators.
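The waste from overlap is simple geometry. This is a minimal sketch under assumed parameters (a ring of evenly spaced cameras; the six-camera, 120° configuration below is a hypothetical example chosen to reproduce the 50% overlap figure mentioned above, not a vendor specification):

```python
# Illustrative sketch of redundant coverage in a ring of cameras.
# Camera count and per-camera FOV are assumed example values.

def ring_coverage(num_cameras: int, fov_deg: float):
    """Return (unique_deg, redundant_deg) of horizontal coverage for
    num_cameras evenly spaced around 360 degrees."""
    spacing = 360.0 / num_cameras            # angle between optical axes
    unique_per_cam = min(fov_deg, spacing)   # beyond this, neighbors repeat it
    unique = unique_per_cam * num_cameras    # capped at 360 total
    redundant = num_cameras * fov_deg - unique
    return unique, redundant

# Six cameras with 120-degree FOV: every degree is captured twice,
# so half of all pixels (and the compute spent on them) is redundant.
print(ring_coverage(6, 120.0))  # (360.0, 360.0)
```

In this example the system captures 720° of imagery to cover a 360° field of regard, so half the capture and processing budget buys no new information, which is the SWaP waste described above.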
At Circle Optics we have been working alongside NASA to develop a solution that eliminates these deficiencies in camera-based systems. Our technology is composed of multiple cameras whose fields of view join to create a continuous panoramic field of regard. Our technology has no optical distortion whatsoever: the fields conjoin, and our system provides a complete panorama for the pilot’s first-person view. Additionally, we can run AI analytics directly on the panoramic images at the edge, creating what our NASA TPOC likes to call “situational consciousness” for the drone. Lastly, our lenses are designed to be compliant with the latest FAA regulations, ensuring that drones can safely detect and avoid uncooperative air traffic at long range. We believe the FAA shouldn’t compromise safety requirements; rather, we need new technology that enables safe operation for UAVs. At Circle Optics, we are working to deliver these solutions for a safer airspace.