Company
Portfolio Data
PHLUX TECHNOLOGIES, INC
UEI: C91LVZJEU1K8
Number of Employees: 2
HUBZone Owned: No
Woman Owned: No
Socially and Economically Disadvantaged: No
SBIR/STTR Involvement
Year of first award: 2022
Phase I Awards: 2
Phase II Awards: 1
Conversion Rate: 50%
Phase I Dollars: $395,767
Phase II Dollars: $999,999
Total Awarded: $1,395,766
Awards

SBIR Phase II: Programmable Three-Dimensional (3D) Light Curtains for Enhanced Human-Robot Collaboration
Amount: $999,999 Topic: R
The broader/commercial impact of this Small Business Innovation Research (SBIR) Phase II project will be to enhance the efficiency and safety of human-robot collaboration through the development of an innovative 3D safety sensor system. As labor shortages stress supply chains, companies are rapidly adopting robotic solutions to ease the pressure. Manufacturers are recognizing the efficiency benefits of flexible and collaborative robots, moving away from large, application-specific robots that require fixed safety fences. Unlike large industrial robots, collaborative robots can function safely without physical barriers, facilitating easy reconfiguration and adaptation to new tasks. However, the safety limitations of these small co-bots result in systems that are much slower and weaker than industrial robots. The 3D safety sensor system developed in this project will bridge the gap between small, flexible co-bots and powerful industrial robots by eliminating the need for safety fences around large industrial robots and enabling collaborative applications. This technological advancement promises to revolutionize manufacturing, improve productivity, and create safer working environments while shifting human workers to higher-skill positions. The innovation will provide new insights into sensor technology, human-robot interaction, and adaptive safety systems, paving the way for further advancements in robotics and automation.

This Small Business Innovation Research (SBIR) Phase II project addresses the limitations of current 3D sensors in robotics safety applications. Existing 3D sensors like LIDAR or depth cameras lack the reliability, resolution, or cost-effectiveness required for industrial safety. Consequently, industrial robot safety relies on outdated 2D sensor technology, which only captures a slice of the environment and cannot provide 3D protection. This limitation necessitates larger safety boundaries and increases robot cell sizes, making most collaborative
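As a rough, hypothetical illustration of the "single requested surface" idea in this abstract, the runtime safety check can reduce to a per-pixel comparison against the programmed curtain rather than volumetric 3D processing. The function name, the per-pixel depth-map representation, and the tolerance value below are assumptions for the sketch, not details from the award:

```python
import numpy as np

def curtain_breached(measured_depth: np.ndarray,
                     curtain_depth: np.ndarray,
                     tolerance_m: float = 0.05) -> bool:
    """Flag an intrusion through a programmed 3D light curtain.

    measured_depth -- per-pixel range image from the sensor (meters)
    curtain_depth  -- per-pixel depth of the requested curtain surface
    tolerance_m    -- margin absorbing sensor noise (illustrative value)

    Anything returning a depth closer than the curtain surface has
    crossed the boundary; no volumetric 3D processing is needed.
    """
    valid = measured_depth > 0.0  # ignore pixels with no return
    return bool(np.any(valid & (measured_depth < curtain_depth - tolerance_m)))
```

Because the comparison is per pixel, the check has roughly the computational profile of a conventional 2D light curtain, which is the simplification the abstract points to.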
Tagged as: SBIR, Phase II, 2025, NSF

SBIR Phase I: Programmable Three-Dimensional (3D) Light Curtains for Enhanced Human-Robot Collaboration
Amount: $255,806 Topic: R
The broader impact/commercial potential of this Small Business Innovation Research (SBIR) Phase I project is to improve the efficiency and safety of human-robot collaboration with a new type of three-dimensional (3D) sensor. As labor shortages stress supply chains, companies are rapidly adopting robotic solutions to reduce time delays in obtaining materials. Whereas robots of the past typically operated by themselves, the proposed robots work collaboratively with humans to perform complicated tasks. Human workers can then transition to higher-skill positions where they assist in the more intricate aspects of a task, while robots perform the dull and monotonous operations. These new applications are enabled by close interactions between robots and humans. Where physical barriers were once used to guarantee separation and safety, sensors now detect people and ensure human-robot interactions are safe. Despite the broad use of 3D sensors in many other areas of robotics, low resolution and heavy processing requirements force most safety applications to use the same fundamental 2D sensor technology that has been employed for decades. Since 2D technology cannot provide 3D protection, safety buffers are added and human-robot collaboration is limited. New safety sensors that provide 3D coverage will enable improved collaboration and efficiency in robotic applications throughout global supply chains.

This Small Business Innovation Research (SBIR) Phase I project seeks to commercialize 3D light curtain safety sensors with potential for use in autonomous systems and human-robot interactions. The core technology is a software-defined 3D sensor that senses specific 3D surfaces upon request. Instead of sensing an entire 3D volume like most sensors on the market today, these sensors monitor a single requested 3D surface within the specified volume. For 3D safety sensing, this approach allows users to adaptively program specific 3D safety boundaries around robots and humans. It removes the complex 3D processing required by other sensors and replaces it with a simple detection task similar to how existing 2D safety sensors work. The proposed research may advance the applicability of the 3D light curtain technology to tasks in agile manufacturing and co-bots, potentially improving safety and efficiency in factories. The objectives of this work are to improve detection capabilities, increase the field of view, and advance product readiness with customer-guided testing. Research will include iterative simulation, prototyping, and characterization of potential optical and photonic configurations to achieve these goals.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
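To illustrate the "adaptively program specific 3D safety boundaries" idea, here is a minimal sketch of how one such boundary might be expressed as a per-pixel depth map for the comparison-style check described above. The pinhole-camera geometry, the choice of a vertical cylinder around the robot, and all names are illustrative assumptions, not the company's actual interface:

```python
import numpy as np

def cylindrical_curtain(width_px: int, height_px: int,
                        fx: float, cx: float,
                        center_z_m: float, radius_m: float) -> np.ndarray:
    """Depth map of a vertical cylindrical safety boundary.

    Assumes a pinhole camera at the origin looking down +z, with the
    cylinder axis vertical and passing through (x=0, z=center_z_m).
    Columns whose ray misses the cylinder get depth = inf (no curtain).
    """
    u = np.arange(width_px, dtype=np.float64)
    dx = (u - cx) / fx                     # per-column ray slope in x
    # Intersect the ray t*(dx, 1) with x^2 + (z - center_z)^2 = radius^2:
    # t^2*(dx^2 + 1) - 2*t*center_z + center_z^2 - radius^2 = 0
    a = dx * dx + 1.0
    b = -2.0 * center_z_m
    c = center_z_m ** 2 - radius_m ** 2
    disc = b * b - 4.0 * a * c
    t = np.where(disc >= 0.0,
                 (-b - np.sqrt(np.maximum(disc, 0.0))) / (2.0 * a),
                 np.inf)                   # nearest positive intersection
    # The cylinder is vertical, so depth is constant down each column.
    return np.tile(t, (height_px, 1))
```

Reprogramming the boundary then amounts to regenerating this map, for example shrinking radius_m as the robot slows, which reflects the software-defined behavior the abstract describes.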
Tagged as: SBIR, Phase I, 2022, NSF

Adaptive Epipolar Time-of-Flight Imaging
Amount: $139,961 Topic: N22A-T020
Autonomous robots rely on 3D sensors to safely navigate and interact with their environment. The most reliable sensing solution is light detection and ranging (LIDAR), which uses the time-of-flight (ToF) principle to measure the distance to scene points. These long-range systems typically scan one or more lasers across the scene, but the scanning is slow; speed can be increased with more lasers, but this further raises the complexity, size, weight, power, and cost (SWaP-C) of these already expensive systems. Additionally, even high-end LIDAR sensors do not have enough spatial resolution to both detect small objects at long ranges and image a wide field of view (FoV). Alternatively, flash LIDAR uses a powerful broad flash of light to capture the scene in a single snapshot. The resulting increase in speed, however, comes at a large cost in light efficiency, since the system’s light power is now spread out across the entire scene. A fundamental limitation of these LIDAR systems is that they have few adjustable parameters and cannot adapt to the sensing needs of a particular situation.

The proposed approach develops an adaptive ToF imaging method to provide a robust, fast, and efficient 3D sensing solution. First, it utilizes a 1D scanning approach pioneered by the proposing team, called epipolar imaging, to improve efficiency. This method sweeps a line of sensing from the rolling shutter of a 2D image sensor across the scene in sync with an aligned plane of illumination. Because the available light is concentrated into a single line of sensing, this 1D scanning method is more efficient and provides longer ranges than global-shutter systems at speeds faster than point-scanning systems. To further improve the efficiency of epipolar imaging, this work will extend the approach with an adaptive illumination system that can redistribute and steer the available photons along the plane to illuminate only as much of the scene as is needed. Guided by adaptive sampling algorithms developed in the proposed work, this approach will enable an even more efficient system through the dynamic allocation of light power to key areas of the scene.

The proposed research and development effort first applies this method to continuous-wave time-of-flight (CW-ToF) sensors to build a high-resolution, robust, and low-SWaP-C 3D sensor for mid-range applications in object detection, tracking, and recognition. Phase I research will include the development, integration, and demonstration of the high-resolution epipolar ToF imaging system with the adaptive illumination source. The base effort will first focus on hardware and system development. The following option effort will develop and demonstrate the algorithms used to adaptively sample a scene for improved object detection and tracking in a variety of real-world situations.
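The sketch below ties together the two principles named in the abstract: the standard CW-ToF phase-to-distance relation and the row-by-row sweep of epipolar imaging. The hardware hooks (steer_light_sheet, read_row_phases) and the 20 MHz modulation frequency are hypothetical placeholders, not details of the actual system:

```python
import math

C_M_PER_S = 299_792_458.0  # speed of light

def cw_tof_depth(phase_rad: float, f_mod_hz: float) -> float:
    """CW-ToF principle: d = c * phi / (4 * pi * f_mod).
    Unambiguous only within c / (2 * f_mod), about 7.5 m at 20 MHz."""
    return C_M_PER_S * phase_rad / (4.0 * math.pi * f_mod_hz)

def epipolar_sweep(num_rows: int, steer_light_sheet, read_row_phases,
                   f_mod_hz: float = 20e6):
    """Sweep a plane of illumination in lockstep with a rolling shutter.

    steer_light_sheet(row) and read_row_phases(row) are hypothetical
    hardware hooks: the first steers the light sheet onto the epipolar
    plane of sensor row `row`; the second exposes only that row and
    returns its per-pixel CW-ToF phase measurements (radians).
    """
    depth_rows = []
    for row in range(num_rows):
        steer_light_sheet(row)         # concentrate light on one plane
        phases = read_row_phases(row)  # expose only the matching row
        depth_rows.append([cw_tof_depth(p, f_mod_hz) for p in phases])
    return depth_rows                  # one depth image per full sweep
```

The adaptive-illumination extension described in the abstract would, in this picture, redistribute photons along each plane before read_row_phases is called, concentrating light power on the parts of the row that matter.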
Tagged as: STTR, Phase I, 2022, DOD, NAVY