Dataset Preview
All columns are `string` except `__index_level_0__`, which is `int64`:

id_x | text | patent_number | title | assignee | inventor/author | priority date | filing/creation date | publication date | grant date | result link | representative figure link | question | __index_level_0__
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
US11360197B2-158 | In embodiments, the type of perturbation and amount or percentage of perturbation may be determined by the current mutual information value, mutual information cost, a difference of the cost value of a current iteration and the cost value of a previous iteration, or a difference of the mutual information value of a current iteration and the mutual information value of a previous iteration. For example, it may be desirable to perturb the transformation parameters by 15% when the difference of the mutual information value between two consecutive iterations is greater than 10. In subsequent iterations, it may be determined that the perturbation should be reduced to 5% due to a decrease in the difference of the mutual information value between two iterations of the optimization process. Dynamically changing the perturbation during optimization may decrease optimization times, enable greater mutual information values of the determined optimized transformation parameters, and allow for smaller difference in mutual information values between iterations during optimizations. | US11360197B2 | Calibration of sensor systems | Luminar, Llc | Amey Sutavani, Lekha Walajapet Mohan, Benjamin Englard | 2020-01-07 | 2020-05-07 | 2022-06-14 | 2022-06-14 | https://patents.google.com/patent/US11360197B2/en | null | 9,666 |
|
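The adaptive perturbation schedule described in the row above (15% perturbation when the mutual-information change between consecutive iterations exceeds 10, dropping to 5% as the change shrinks) can be sketched as follows. This is an illustrative sketch, not code from the patent; the function name and the two-level schedule are assumptions, though the example values (15%, 5%, threshold of 10) come from the text:

```python
def next_perturbation(prev_mi, curr_mi, high=0.15, low=0.05, threshold=10.0):
    """Pick a perturbation fraction from the change in mutual information
    between two consecutive optimization iterations (hypothetical helper)."""
    delta = abs(curr_mi - prev_mi)
    # Large swings in mutual information -> keep perturbing aggressively;
    # small swings -> settle into finer-grained perturbations.
    return high if delta > threshold else low
```

In a full optimizer this would be called once per iteration, so the perturbation shrinks automatically as the mutual-information values converge.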
US10088559B1-41 | In operation, the light source 110 emits an output beam of light 125 which may be continuous-wave, pulsed, or modulated in any suitable manner for a given application. The output beam of light 125 is directed downrange toward a remote target 130 located a distance D from the lidar system 100 and at least partially contained within a field of regard of the system 100. Depending on the scenario and/or the implementation of the lidar system 100, D can be between 1 m and 1 km, for example. | US10088559B1 | Controlling pulse timing to compensate for motor dynamics | Luminar Technologies, Inc. | Matthew D. Weed, Scott R. Campbell, Lane A. Martin, Jason M. Eichenholz, Austin K. Russell, Rodger W. Cleye, Melvin L. Stauffer | 2017-03-29 | 2018-01-22 | 2018-10-02 | 2018-10-02 | https://patents.google.com/patent/US10088559B1/en | null | 474 |
|
US11367990B2-19 | FIG. 17 illustrates an example computer system. | US11367990B2 | Lidar system operating at 1200-1400 NM | Luminar, Llc | Jason M. Eichenholz, Laurance S. Lingvay, David Welford | 2018-08-29 | 2019-08-29 | 2022-06-21 | 2022-06-21 | https://patents.google.com/patent/US11367990B2/en | null | 9,850 |
|
US10627521B2-138 | FIG. 11 depicts scenarios 710A through 710D, which illustrate how a vehicle sensor may be optimally focused, or at least more usefully focused, as a vehicle 712 goes down and up the same hills shown in FIG. 10. Similar to the vehicle 702 of FIG. 10, the vehicle 712 (e.g., the vehicle 300 of FIG. 4A or the vehicle 360 of FIG. 4B) has one or more forward-facing sensors (e.g., the sensor heads 312A, 312D or the sensor heads 372A, 372G, respectively), and a sensor direction 714 may represent the center of the field of regard of a sensor (e.g., lidar device, camera, etc.), the center of a bottom edge of the field of regard, an area of highest focus within the field of regard (e.g., a densest concentration of horizontal scan lines for a lidar or radar device), etc. | US10627521B2 | Controlling vehicle sensors based on dynamic objects | Luminar Technologies, Inc. | Benjamin Englard, Eric C. Danziger, Austin K. Russell | 2017-12-13 | 2018-10-31 | 2020-04-21 | 2020-04-21 | https://patents.google.com/patent/US10627521B2/en | null | 6,482 |
|
US10591600B2-83 | In particular embodiments, each optical link (330-1, 330-2, . . . , 330-N) may be approximately the same length, or the optical links may have two or more different lengths. As an example, each optical link may include a fiber-optic cable with a length of approximately 20 m. As another example, the optical links may each include a fiber-optic cable with a particular or different length (e.g., two of the optical links may include a 5-m fiber, another two optical links may include a 10-m fiber, and one optical link may include a 20-m fiber). In particular embodiments, the sensor heads (310-1, 310-2, . . . , 310-N) may emit optical pulses at substantially the same time or at different times with respect to each other. As an example, a demultiplexer 410 may split a single optical pulse into N optical pulses. The N optical pulses may be conveyed to the N sensor heads by fiber-optic cables having substantially the same length, and the pulses may be emitted by the sensor heads at approximately the same time. As another example, the N optical pulses may be conveyed to the N sensor heads by fiber-optic cables having two or more different lengths, and, due to the different propagation times associated with the different fiber lengths, the pulses may be emitted at different times. As another example, the demultiplexer 410 may direct different optical pulses to different sensor heads at different times, resulting in the pulses being emitted by the sensor heads at different times. In particular embodiments, since each sensor head may emit and receive pulses independent of other sensor heads, the operation of a lidar system 100 as described and illustrated herein may not specifically depend on whether the pulses are emitted in a time-synchronous fashion or the pulses are emitted without regard to the relative time synchronization. | US10591600B2 | Lidar system with distributed laser and multiple sensor heads | Luminar Technologies, Inc. | Alain Villeneuve, Jason M. Eichenholz | 2015-11-30 | 2016-11-29 | 2020-03-17 | 2020-03-17 | https://patents.google.com/patent/US10591600B2/en | null | 5,613 |
|
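The staggered-emission behavior in the row above follows from ordinary fiber propagation delay, t = L·n/c. A rough sketch, assuming a typical silica-fiber group index of about 1.468 (the index value and function name are assumptions, not from the patent):

```python
C = 299_792_458.0     # speed of light in vacuum, m/s
FIBER_INDEX = 1.468   # typical silica fiber refractive index (assumed)

def fiber_delay_ns(length_m, n=FIBER_INDEX):
    """Propagation delay, in nanoseconds, of a pulse through a fiber
    of the given length."""
    return length_m * n / C * 1e9
```

Under these assumptions a 20-m link delays its pulse by roughly 98 ns, so the 5-m, 10-m, and 20-m links in the example would emit their pulses tens of nanoseconds apart.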
US10503172B2-171 | The term value generator 740 may generate values for X different terms of the objective equation, where X is any suitable positive integer. Each term may correspond to a different driving objective over some finite time horizon. For example, “Term 1” of FIG. 13 may be a distance from a nearest object or from some predetermined perimeter surrounding that object (with larger distances generally being desired), “Term 2” may be a metric indicating how “off course” the autonomous vehicle is with respect to some waypoint or other intermediate destination (e.g., an angle, with smaller angles generally being desired), and so on. Other driving objectives may include one or more of the following: keeping the autonomous vehicle in the center of the lane in which the autonomous vehicle is traveling; keeping the autonomous vehicle within the boundaries of the road (e.g., between a curb and a center median or between a shoulder and a guardrail) on which the autonomous vehicle is traveling; maintaining the heading of the autonomous vehicle based on the curvature of the lane or road on which the autonomous vehicle is traveling (e.g., turning the autonomous vehicle based on the curvature of the lane ahead); and obeying the rules of the road on which the autonomous vehicle is traveling (e.g., maintaining the speed at or near the speed limit, stopping for stop signs and traffic lights, pulling over for emergency vehicles, merging at yield signs, stopping for pedestrians, staying out of bicycle lanes, and/or not driving on road shoulders). | US10503172B2 | Controlling an autonomous vehicle based on independent driving decisions | Luminar Technologies, Inc. | Benjamin Englard, Joseph Augenbraun | 2017-10-18 | 2018-10-02 | 2019-12-10 | 2019-12-10 | https://patents.google.com/patent/US10503172B2/en | null | 4,696 |
|
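The objective equation in the row above combines X per-objective term values (e.g., clearance distance, off-course angle) into a single score. A minimal weighted-sum sketch; the weighting scheme and sign conventions are assumptions for illustration, not taken from the patent:

```python
def objective(term_values, weights):
    """Combine per-objective term values into one scalar score.
    Positive weights reward larger values (e.g., clearance distance);
    negative weights penalize larger values (e.g., off-course angle)."""
    assert len(term_values) == len(weights)
    return sum(w * v for w, v in zip(weights, term_values))
```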
US10545240B2-135 | Still further, as illustrated in FIG. 13, there may be two amplitude detection circuits 608 associated with each particular threshold value (T1, for example). In particular, there are two types of comparators 610, including rising-edge comparators, indicated with a plus sign (+), and falling-edge comparators, indicated with a minus sign (−). As will be understood, rising-edge comparators determine when the amplified light detection signal provided at the input thereto reaches or rises above the threshold T going in a positive or rising direction (that is, reaches the threshold from a lower value). On the other hand, falling-edge comparators determine or detect when the amplified light detection signals at the input thereto reach or fall below the associated threshold T in the negative or falling direction (that is, reach the threshold from a higher value). Thus, the comparator 610A+ provides a comparison of the incoming light detection signal to the threshold T1, and determines when the incoming light detection signal reaches the level of threshold T1 going in a positive direction, while the comparator 610A− determines when the light signal reaches the threshold T1 going in the negative or falling direction. Upon making a determination that the light detection signal meets the associated threshold from the correct direction, the comparator produces an output signal that indicates such a condition (i.e., that the comparison criteria is met). As illustrated in FIG. 13, the output signal of each comparator 610, which may be a direct current (DC) signal, a rising-edge or falling-edge signal, or a digital bit indicative of the status of the comparison (e.g., met or not met), is provided to an associated TDC 612. | US10545240B2 | LIDAR transmitter and detector system using pulse encoding to reduce range ambiguity | Luminar Technologies, Inc. | Scott R. Campbell, Joseph G. LaChapelle, Jason M. Eichenholz, Austin K. Russell | 2017-03-28 | 2018-03-10 | 2020-01-28 | 2020-01-28 | https://patents.google.com/patent/US10545240B2/en | null | 4,965 |
|
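The rising-edge/falling-edge comparator pair described in the row above can be mimicked in software on a sampled waveform. This hypothetical sketch (not from the patent, which describes analog comparators feeding TDCs) reports the sample indices where a signal reaches a threshold from below (rising) or from above (falling):

```python
def edge_crossings(samples, threshold):
    """Return (rising, falling) lists of sample indices: 'rising' where
    the signal reaches the threshold from a lower value, 'falling' where
    it drops below the threshold from a higher value."""
    rising, falling = [], []
    for i in range(1, len(samples)):
        if samples[i - 1] < threshold <= samples[i]:
            rising.append(i)   # analogous to comparator 610A+
        elif samples[i - 1] >= threshold > samples[i]:
            falling.append(i)  # analogous to comparator 610A-
    return rising, falling
```

For a pulse-shaped signal each threshold yields one rising and one falling index, which is the pair of timestamps the TDCs in the patent would record.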
US10061019B1-46 | The window 157 may be made from any suitable substrate material, such as for example, glass or plastic (e.g., polycarbonate, acrylic, cyclic-olefin polymer, or cyclic-olefin copolymer). The window 157 may include an interior surface (surface A) and an exterior surface (surface B), and surface A or surface B may include a dielectric coating having particular reflectivity values at particular wavelengths. A dielectric coating (which may be referred to as a thin-film coating, interference coating, or coating) may include one or more thin-film layers of dielectric materials (e.g., SiO2, TiO2, Al2O3, Ta2O5, MgF2, LaF3, or AlF3) having particular thicknesses (e.g., thickness less than 1 μm) and particular refractive indices. A dielectric coating may be deposited onto surface A or surface B of the window 157 using any suitable deposition technique, such as for example, sputtering or electron-beam deposition. | US10061019B1 | Diffractive optical element in a lidar system to correct for backscan | Luminar Technologies, Inc. | Scott R. Campbell, Jason M. Eichenholz | 2017-03-28 | 2017-10-10 | 2018-08-28 | 2018-08-28 | https://patents.google.com/patent/US10061019B1/en | null | 322 |
|
US11378666B2-92 | In some implementations, the scanner 162 may include one mirror configured to be scanned along two axes, where two actuators arranged in a push-pull configuration provide motion along each axis. For example, two resonant actuators arranged in a horizontal push-pull configuration may drive the mirror along a horizontal direction, and another pair of resonant actuators arranged in a vertical push-pull configuration may drive the mirror along a vertical direction. In another example implementation, two actuators scan the output beam 170 along two directions (e.g., horizontal and vertical), where each actuator provides rotational motion along a particular direction or about a particular axis. | US11378666B2 | Sizing the field of view of a detector to improve operation of a lidar system | Luminar, Llc | Scott R. Campbell, Lane A. Martin, Matthew D. Weed, Jason M. Eichenholz | 2017-03-29 | 2020-04-29 | 2022-07-05 | 2022-07-05 | https://patents.google.com/patent/US11378666B2/en | null | 10,144 |
|
US10591600B2-236 | The lidar system, wherein the light source comprises: a plurality of laser diodes, wherein each laser diode is configured to produce light at a different operating wavelength; and an optical multiplexer configured to combine the light produced by each laser diode into a single optical fiber. | US10591600B2 | Lidar system with distributed laser and multiple sensor heads | Luminar Technologies, Inc. | Alain Villeneuve, Jason M. Eichenholz | 2015-11-30 | 2016-11-29 | 2020-03-17 | 2020-03-17 | https://patents.google.com/patent/US10591600B2/en | null | 5,766 |
|
US10845480B1-149 | FIG. 18 illustrates an example temporal offset (Δt) between electrical current pulses supplied to a seed laser diode 400 and a SOA 410. In particular embodiments, a rising edge of a seed-laser current pulse I1 may be offset from a rising edge of a corresponding SOA current pulse I2 by a particular time interval Δt. The temporal offset of Δt between the rising edges of the current pulses may correspond to a similar temporal offset between the rising edges of the resulting seed-laser optical pulse and the SOA optical pulse. A temporal offset Δt may have any suitable value, such as for example, a value of 0 ns, 0.1 ns, 0.5 ns, 1 ns, 2 ns, or 5 ns. For example, the rising edges of the seed and SOA pulses may occur at approximately the same time so that the temporal offset Δt is approximately zero. As another example, the rising edge of the SOA current pulse I2 may be temporally advanced or delayed by Δt with respect to the rising edge of the seed-laser current pulse I1. In FIG. 18, the rising edge of the SOA current pulse I2 is delayed by Δt with respect to the rising edge of the seed-laser current pulse I1 (e.g., the rising edge of the SOA current pulse I2 occurs after the rising edge of the seed-laser current pulse I1). Alternatively, the rising edge of the SOA current pulse I2 may be advanced by Δt with respect to the rising edge of the seed-laser current pulse I1 (e.g., the rising edge of the SOA current pulse I2 may occur before the rising edge of the seed-laser current pulse I1). | US10845480B1 | Lidar system with semiconductor optical amplifier | Luminar Technologies, Inc. | Lawrence Shah, Jason M. Eichenholz, Joseph G. LaChapelle, Alex Michael Sincore, Cheng Zhu | 2019-02-08 | 2020-02-07 | 2020-11-24 | 2020-11-24 | https://patents.google.com/patent/US10845480B1/en | null | 7,506 |
|
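The temporal offset Δt between the seed-laser and SOA current pulses in the row above can be modeled with simple interval arithmetic. An illustrative sketch, assuming rectangular pulses; the durations, sign convention (positive Δt = SOA delayed, negative = advanced), and function name are assumptions:

```python
def pulses_overlap(seed_start_ns, seed_dur_ns, delta_t_ns, soa_dur_ns):
    """True if the SOA current pulse, whose rising edge is offset by
    delta_t from the seed pulse's rising edge, overlaps the seed pulse."""
    soa_start = seed_start_ns + delta_t_ns
    return (soa_start < seed_start_ns + seed_dur_ns
            and seed_start_ns < soa_start + soa_dur_ns)
```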
US10976417B2-12 | FIG. 7 schematically illustrates fields of view (FOVs) of a light source and a detector that can operate in the lidar system of FIG. 1; | US10976417B2 | Using detectors with different gains in a lidar system | Luminar Holdco, Llc | Joseph G. LaChapelle, Scott R. Campbell, Jason M. Eichenholz, Matthew D. Weed | 2017-03-29 | 2018-03-29 | 2021-04-13 | 2021-04-13 | https://patents.google.com/patent/US10976417B2/en | null | 7,717 |
|
US10338199B1-34 | FIG. 8 shows a side cross-sectional view of a transmitter component of a transceiver, according to an exemplary, non-limiting embodiment of the invention. | US10338199B1 | Transceiver apparatus, method and applications | Luminar Technologies, Inc. | John E. McWhirter, Allen Gabriele | 2018-07-05 | 2018-07-05 | 2019-07-02 | 2019-07-02 | https://patents.google.com/patent/US10338199B1/en | null | 2,754 |
|
US10451716B2-113 | Generally speaking, the light from the Sun that passes through the Earth's atmosphere and reaches a terrestrial-based lidar system such as the system 120A can establish an optical background noise floor for this system. Thus, in order for a signal from the lidar system 120A to be detectable, the signal must rise above the background noise floor. It is generally possible to increase the signal-to-noise (SNR) ratio of the lidar system 120A by raising the power level of the output beam 150A, but in some situations it may be desirable to keep the power level of the output beam 150A relatively low. For example, increasing transmit power levels of the output beam 150A can result in the lidar system 120A not being eye-safe. | US10451716B2 | Monitoring rotation of a mirror in a lidar system | Luminar Technologies, Inc. | John Hughes, Nicholas Ventola, Sean P. Hughes | 2017-11-22 | 2018-04-20 | 2019-10-22 | 2019-10-22 | https://patents.google.com/patent/US10451716B2/en | null | 4,073 |
|
US11378666B2-88 | Now referring to FIG. 2, a scanner 162 and a receiver 164 can operate in the lidar system of FIG. 1 as the scanner 120 and the receiver 140, respectively. More generally, the scanner 162 and the receiver 164 can operate in any suitable lidar system. | US11378666B2 | Sizing the field of view of a detector to improve operation of a lidar system | Luminar, Llc | Scott R. Campbell, Lane A. Martin, Matthew D. Weed, Jason M. Eichenholz | 2017-03-29 | 2020-04-29 | 2022-07-05 | 2022-07-05 | https://patents.google.com/patent/US11378666B2/en | null | 10,140 |
|
US10401481B2-5 | In another embodiment, a method in a lidar system operating in a vehicle for scanning a field of regard is provided. The method comprises generating a first output beam of light having a first amount of power; generating a second output beam of light having a second amount of power smaller than the first amount of power; scanning the field of regard using the first output beam and the second output beam, including directing the first output beam in a direction substantially parallel to a plane of motion of the vehicle, and directing the second output beam in a direction non-parallel to the plane of motion of the vehicle; and detecting light associated with the first output beam and light associated with the second beam scattered by one or more remote targets. | US10401481B2 | Non-uniform beam power distribution for a laser operating in a vehicle | Luminar Technologies, Inc. | Scott R. Campbell, Matthew D. Weed, Lane A. Martin, Jason M. Eichenholz | 2017-03-30 | 2018-03-30 | 2019-09-03 | 2019-09-03 | https://patents.google.com/patent/US10401481B2/en | null | 3,422 |
|
US11378666B2-24 | FIG. 16 is a diagram of a detector array which includes three detectors, which can be used in the lidar system of FIG. 1; | US11378666B2 | Sizing the field of view of a detector to improve operation of a lidar system | Luminar, Llc | Scott R. Campbell, Lane A. Martin, Matthew D. Weed, Jason M. Eichenholz | 2017-03-29 | 2020-04-29 | 2022-07-05 | 2022-07-05 | https://patents.google.com/patent/US11378666B2/en | null | 10,076 |
|
US10121813B2-164 | While operations may be depicted in the drawings as occurring in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all operations be performed. Further, the drawings may schematically depict one or more example processes or methods in the form of a flow diagram or a sequence diagram. However, other operations that are not depicted may be incorporated in the example processes or methods that are schematically illustrated. For example, one or more additional operations may be performed before, after, simultaneously with, or between any of the illustrated operations. Moreover, one or more operations depicted in a diagram may be repeated, where appropriate. Additionally, operations depicted in a diagram may be performed in any suitable order. Furthermore, although particular components, devices, or systems are described herein as carrying out particular operations, any suitable combination of any suitable components, devices, or systems may be used to carry out any suitable operation or combination of operations. In certain circumstances, multitasking or parallel processing operations may be performed. Moreover, the separation of various system components in the implementations described herein should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems may be integrated together in a single software product or packaged into multiple software products. | US10121813B2 | Optical detector having a bandpass filter in a lidar system | Luminar Technologies, Inc. | Jason M. Eichenholz, Scott R. Campbell, Joseph G. LaChapelle | 2017-03-28 | 2018-03-01 | 2018-11-06 | 2018-11-06 | https://patents.google.com/patent/US10121813B2/en | null | 971 |
|
US11536803B2-7 | FIG. 5 illustrates an example unidirectional scan pattern that includes multiple pixels and multiple scan lines. | US11536803B2 | Lidar receiver with multiple detectors for range-ambiguity mitigation | Luminar, Llc | Stephen D. Gaalema, Mark A. Drummer, Stephen L. Mielke, Jason M. Eichenholz | 2018-12-05 | 2019-08-29 | 2022-12-27 | 2022-12-27 | https://patents.google.com/patent/US11536803B2/en | null | 12,195 |
|
US10451716B2-231 | In an embodiment, the rotary encoder 1000 is an optical encoder that has an optical beam, the presence or absence of which is detectable by a stationary photo-interrupter 1006 (also known as an opto-detector), to generate data that may be used to indicate the rotational speed of the rotatable polygon mirror 12 or other rotational parameter. An optical encoder may be preferable, because the photo-interrupter is considered a non-contact (optical) switch, which improves reliability by preventing wear and tear due to abrasion. However, it should be understood that additional rotary encoders may be utilized, including, but not limited to, magnetic encoders, capacitive encoders and mechanical encoders. In any event, the photo-interrupter 1006 is stationary with respect to the source of the output beams for the lidar sensor unit 10 and/or the mount 29. | US10451716B2 | Monitoring rotation of a mirror in a lidar system | Luminar Technologies, Inc. | John Hughes, Nicholas Ventola, Sean P. Hughes | 2017-11-22 | 2018-04-20 | 2019-10-22 | 2019-10-22 | https://patents.google.com/patent/US10451716B2/en | null | 4,191 |
|
US11551547B2-103 | The lane detection module 104 may similarly perform the example subdivision technique 636 illustrated in example bubble 640 on all other linked sections of pixels included in the point cloud. For example, the lane detection module 104 may evaluate the linked section of pixels with beginning pixel 646 and ending pixel 647. In this circumstance, the lane detection module 104 may determine that all pixels between the beginning pixel 646 and ending pixel 647 are substantially linear, such that the grouping of all pixels between the beginning pixel 646 and ending pixel 647 will be grouped together into a subsection 648. Similarly, the lane detection module 104 may evaluate and designate subsections 649 a, 649 b, 650 a-650 c, 651, 652 a, and 652 b. | US11551547B2 | Lane detection and tracking techniques for imaging systems | Luminar, Llc | Pranav Maheshwari, Vahid R. Ramezani, Ismail El Houcheimi, Shubham C. Khilari, Rounak Mehta | 2020-01-06 | 2020-07-09 | 2023-01-10 | 2023-01-10 | https://patents.google.com/patent/US11551547B2/en | null | 12,574 |
|
US11119219B1-179 | In particular embodiments, the average optical power of LO light 430 may be configured by adjusting or setting (i) an amount of seed current I1 supplied to a seed laser diode 450, (ii) a reflectivity of the back face 451 of the seed laser diode 450, (iii) a reflectivity of a free-space splitter 470, or (iv) an amount of light split off by a fiber-optic or optical-waveguide splitter 470. In the example of FIG. 8 or FIG. 9, the seed current I1 and the reflectivity of the back face 451 of the seed laser diode 450 may be configured so that the average optical power of the LO light 430 is set to a particular value (e.g., a value between 10 μW and 100 μW). In the example of FIG. 10, the seed current I1 and the reflectivity of the splitter 470 may be configured so that the average optical power of the LO light 430 is set to a particular value (e.g., a value below 10 mW). In the example of FIG. 11, the seed current supplied to the seed laser diode 450 and the amount of light split off to output port 2 by the optical-waveguide splitter 470 may be configured so that the average optical power of the LO light 430 is set to a particular value (e.g., a value below 1 mW). | US11119219B1 | Lidar system with input optical element | Luminar, Llc | Joseph G. LaChapelle, Jason M. Eichenholz, Alex Michael Sincore, Lawrence Shah | 2020-08-10 | 2021-02-24 | 2021-09-14 | 2021-09-14 | https://patents.google.com/patent/US11119219B1/en | null | 8,943 |
|
US11536803B2-45 | In particular embodiments, lidar system 100 may include one or more optical components configured to reflect, focus, filter, shape, modify, steer, or direct light within the lidar system 100 or light produced or received by the lidar system 100 (e.g., output beam 125 or input beam 135). As an example, lidar system 100 may include one or more lenses, mirrors, filters (e.g., bandpass or interference filters), beam splitters, polarizers, polarizing beam splitters, wave plates (e.g., half-wave or quarter-wave plates), diffractive elements, holographic elements, isolators, couplers, detectors, beam combiners, or collimators. The optical components in a lidar system 100 may be free-space optical components, fiber-coupled optical components, or a combination of free-space and fiber-coupled optical components. | US11536803B2 | Lidar receiver with multiple detectors for range-ambiguity mitigation | Luminar, Llc | Stephen D. Gaalema, Mark A. Drummer, Stephen L. Mielke, Jason M. Eichenholz | 2018-12-05 | 2019-08-29 | 2022-12-27 | 2022-12-27 | https://patents.google.com/patent/US11536803B2/en | null | 12,233 |
|
US10545240B2-88 | Now referring to FIG. 4, a rotating scan module 220 is generally similar to the rotating scan module 200. In this implementation, however, the components of the rotating scan module 220 are disposed on a platform 222 which rotates inside a stationary circular housing 230. In this implementation, the circular housing 230 is substantially transparent to light at the lidar-system operating wavelength to pass inbound and outbound light signals. The circular housing 230 in a sense defines a circular window similar to the window 212, and may be made of similar material. | US10545240B2 | LIDAR transmitter and detector system using pulse encoding to reduce range ambiguity | Luminar Technologies, Inc. | Scott R. Campbell, Joseph G. LaChapelle, Jason M. Eichenholz, Austin K. Russell | 2017-03-28 | 2018-03-10 | 2020-01-28 | 2020-01-28 | https://patents.google.com/patent/US10545240B2/en | null | 4,918 |
|
US11391842B2-94 | At block 920, the VROI detection module 110 determines a lower bound of the VROI based at least in part on detecting a suitable subset of the received sensor data. In some implementations, the suitable subset may have a minimum relative elevation metric. To that end, the VROI detection module 110 may first identify a plurality of subsets of data (e.g., grouped by corresponding lidar scan lines), each associated with a certain corresponding elevation with respect to a neutral look direction of the imaging sensor. For each of the identified subsets of data, the VROI detection module 110 may select the points in the receptive field, and assign a weight to each point in the subset based on the location of the point in the receptive field. Subsequently, the VROI detection module 110 may use the weighted contributions of the points in each subset to compute corresponding relative elevation metrics for the subsets, and select the subset with the minimum relative elevation metric. | US11391842B2 | Adaptive scan pattern with virtual horizon estimation | Luminar, Llc | Dmytro Trofymov | 2020-01-06 | 2020-02-12 | 2022-07-19 | 2022-07-19 | https://patents.google.com/patent/US11391842B2/en | null | 10,431 |
|
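The minimum-relative-elevation selection described in the row above can be sketched as a weighted-mean comparison across scan-line subsets. The data layout (a mapping from scan-line id to weighted elevation points) and the exact metric form here are assumptions for illustration, not the patent's implementation:

```python
def lower_bound_scan_line(subsets):
    """Given {scan_line_id: [(elevation, weight), ...]}, return the id of
    the subset whose weighted-mean elevation metric is the minimum,
    i.e. the lidar scan line chosen as the VROI lower bound."""
    def metric(points):
        total_w = sum(w for _, w in points)
        return sum(e * w for e, w in points) / total_w
    return min(subsets, key=lambda k: metric(subsets[k]))
```

Here the per-point weights stand in for the receptive-field location weighting the module assigns before computing each subset's relative elevation metric.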
US10169680B1-223 | 27. The computer-implemented method of aspect 1, wherein the 3-D environment image is generated by one or more active imaging sensors or devices. | US10169680B1 | Object identification and labeling tool for training autonomous vehicle controllers | Luminar Technologies, Inc. | Prateek Sachdeva, Dmytro Trofymov | 2017-12-21 | 2018-02-27 | 2019-01-01 | 2019-01-01 | https://patents.google.com/patent/US10169680B1/en | null | 1,201 |
|
US10338199B1-7 | wherein the outer housing has an inner diameter that is up to 2% larger than an outer diameter of the inner housing; | US10338199B1 | Transceiver apparatus, method and applications | Luminar Technologies, Inc. | John E. McWhirter, Allen Gabriele | 2018-07-05 | 2018-07-05 | 2019-07-02 | 2019-07-02 | https://patents.google.com/patent/US10338199B1/en | null | 2,727 |
|
US9810786B1-70 | FIG. 7 illustrates an example forward-scan direction and reverse-scan direction for a light-source field of view and a receiver field of view. In particular embodiments, a lidar system 100 may be configured so that the FOVR is larger than the FOVL, and the receiver and light-source FOVs may be substantially coincident, overlapped, or centered with respect to one another. As an example, the FOVR may have a diameter or angular extent ΘR, that is approximately 1.5×, 2×, 3×, 4×, 5×, or 10× larger than the diameter or angular extent ΘL of the FOVL. In the example of FIG. 7, the diameter of the receiver field of view is approximately 2 times larger than the diameter of the light-source field of view, and the two FOVs are overlapped and centered with respect to one another. The receiver field of view being larger than the light-source field of view may allow the receiver 140 to receive scattered light from emitted pulses in both scan directions (forward scan or reverse scan). In the forward-scan direction illustrated in FIG. 7, scattered light may be received primarily by the left side of the FOVR, and in the reverse-scan direction, scattered light may be received primarily by the right side of the FOVR. For example, as a pulse of light propagates to and from a target 130 during a forward scan, the FOVR scans to the right, and scattered light that returns to the lidar system 100 may be received primarily by the left portion of the FOVR. | US9810786B1 | Optical parametric oscillator for lidar system | Luminar Technologies, Inc. | David Welford, Martin A. Jaspan, Jason M. Eichenholz, Scott R. Campbell, Lane A. Martin, Matthew D. Weed | 2017-03-16 | 2017-03-16 | 2017-11-07 | 2017-11-07 | https://patents.google.com/patent/US9810786B1/en | null | 14,471 |
|
US11360197B2-94 | Similar to the scan pattern 240, each of the linear scan patterns 254A-N includes pixels associated with one or more laser pulses and distance measurements. FIG. 4 illustrates example pixels 252A, 252B and 252C along the scan patterns 254A, 254B and 254C, respectively. The lidar system 100 in this example may generate the values for the pixels 252A-252N at the same time, thus increasing the scan rate. | US11360197B2 | Calibration of sensor systems | Luminar, Llc | Amey Sutavani, Lekha Walajapet Mohan, Benjamin Englard | 2020-01-07 | 2020-05-07 | 2022-06-14 | 2022-06-14 | https://patents.google.com/patent/US11360197B2/en | null | 9,602 |
|
US11367990B2-213 | In particular embodiments, certain features described herein in the context of separate implementations may also be combined and implemented in a single implementation. Conversely, various features that are described in the context of a single implementation may also be implemented in multiple implementations separately or in any suitable sub-combination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a sub-combination or variation of a sub-combination. | US11367990B2 | Lidar system operating at 1200-1400 NM | Luminar, Llc | Jason M. Eichenholz, Laurance S. Lingvay, David Welford | 2018-08-29 | 2019-08-29 | 2022-06-21 | 2022-06-21 | https://patents.google.com/patent/US11367990B2/en | null | 10,044 |
|
US10003168B1-36 | In particular embodiments, an output beam of light 125 emitted by light source 110 may be unpolarized or randomly polarized, may have no specific or fixed polarization (e.g., the polarization may vary with time), or may have a particular polarization (e.g., output beam 125 may be linearly polarized, elliptically polarized, or circularly polarized). As an example, light source 110 may produce linearly polarized light, and lidar system 100 may include a quarter-wave plate that converts this linearly polarized light into circularly polarized light. The circularly polarized light may be transmitted as output beam 125, and lidar system 100 may receive input beam 135, which may be substantially or at least partially circularly polarized in the same manner as the output beam 125 (e.g., if output beam 125 is right-hand circularly polarized, then input beam 135 may also be right-hand circularly polarized). The input beam 135 may pass through the same quarter-wave plate (or a different quarter-wave plate) resulting in the input beam 135 being converted to linearly polarized light which is orthogonally polarized (e.g., polarized at a right angle) with respect to the linearly polarized light produced by light source 110. As another example, lidar system 100 may employ polarization-diversity detection where two polarization components are detected separately. The output beam 125 may be linearly polarized, and the lidar system 100 may split the input beam 135 into two polarization components (e.g., s-polarization and p-polarization) which are detected separately by two photodiodes (e.g., a balanced photoreceiver that includes two photodiodes). | US10003168B1 | Fiber laser with free-space components | Luminar Technologies, Inc. | Alain Villeneuve | 2017-10-18 | 2017-11-30 | 2018-06-19 | 2018-06-19 | https://patents.google.com/patent/US10003168B1/en | null | 36 |
|
US10684360B2-108 | Next, FIG. 9 illustrates an example vehicle 354 with a lidar system 351 that includes a laser 352 with multiple sensor heads 360 coupled to the laser 352 via multiple laser-sensor links 370. The laser 352 and the sensor heads 360 may be similar to the laser 300 and the sensor 310 discussed above, in some implementations. For example, each of the laser-sensor links 370 may include one or more optical links and/or one or more electrical links. The sensor heads 360 in FIG. 9 are positioned or oriented to provide a greater than 30-degree view of an environment around the vehicle. More generally, a lidar system with multiple sensor heads may provide a horizontal field of regard around a vehicle of approximately 30°, 45°, 60°, 90°, 120°, 180°, 270°, or 360°. Each of the sensor heads may be attached to or incorporated into a bumper, fender, grill, side panel, spoiler, roof, headlight assembly, taillight assembly, rear-view mirror assembly, hood, trunk, window, or any other suitable part of the vehicle. | US10684360B2 | Protecting detector in a lidar system using off-axis illumination | Luminar Technologies, Inc. | Scott R. Campbell | 2017-03-30 | 2017-09-22 | 2020-06-16 | 2020-06-16 | https://patents.google.com/patent/US10684360B2/en | null | 7,028 |
|
US9810786B1-105 | In particular embodiments, gain medium 410 may include a back surface 470 with a dielectric coating. As an example, back surface 470 may have a coating with a low reflectivity (e.g., R<10%) at a pump-laser wavelength and a high reflectivity (e.g., R>90%) at an operating wavelength of the PQSW laser 400. In particular embodiments, saturable absorber 420 may include an output surface 480 with a dielectric coating. In particular embodiments, a dielectric coating (which may be referred to as a thin-film coating, interference coating, or coating) may include one or more layers of dielectric materials (e.g., SiO2, TiO2, Al2O3, Ta2O5, MgF2, LaF3, or AlF3) having particular thicknesses (e.g., thickness less than 1 μm) and particular refractive indices. A dielectric coating may be deposited onto a surface (e.g., a surface of gain medium 410 or saturable absorber 420) using any suitable deposition technique, such as for example, sputtering or electron-beam deposition. | US9810786B1 | Optical parametric oscillator for lidar system | Luminar Technologies, Inc. | David Welford, Martin A. Jaspan, Jason M. Eichenholz, Scott R. Campbell, Lane A. Martin, Matthew D. Weed | 2017-03-16 | 2017-03-16 | 2017-11-07 | 2017-11-07 | https://patents.google.com/patent/US9810786B1/en | null | 14,506 |
|
US11435479B2-122 | In some implementations, the expected configuration is determined based upon the relative position between the vehicle and the object. For example, the classification module 412 may associate the point cloud object with a generic object of the same type of object. The generic object may be rotated and/or scaled based on the determined relative position to determine the expected configuration of the point cloud object. Skew may then be determined by comparing the expected configuration and the apparent/sensed configuration, and identifying a substantial difference (e.g., greater than a threshold difference) between the two. In other implementations and/or scenarios, it is determined that the object is skewed because a bound of the point cloud object is determined to be curved when it is known that the point cloud object should instead have a straight bound. For example, the classification module 412 may have classified the point cloud object as a type associated with a rule that side bounds should be approximately vertical (e.g., a truck container). Thus, the expected configuration of that bound is a line. | US11435479B2 | Determining relative velocity based on an expected configuration | Luminar, Llc | Eric C. Danziger, Austin K. Russell, Benjamin Englard | 2018-08-06 | 2018-11-20 | 2022-09-06 | 2022-09-06 | https://patents.google.com/patent/US11435479B2/en | null | 11,359 |
|
US11521009B2-119 | The SDCA 500 also includes a prediction component 520, which processes the perception signals 508 to generate prediction signals 522 descriptive of one or more predicted future states of the autonomous vehicle's environment. For a given object, for example, the prediction component 520 may analyze the type/class of the object (as determined by the classification module 512) along with the recent tracked movement of the object (as determined by the tracking module 514) to predict one or more future positions of the object. As a relatively simple example, the prediction component 520 may assume that any moving objects will continue to travel in their current direction and with their current speed, possibly taking into account first or higher-order derivatives to better track objects that have continuously changing directions, objects that are accelerating, and so on. In some embodiments, the prediction component 520 also predicts movement of objects based on more complex behaviors. For example, the prediction component 520 may assume that an object that has been classified as another vehicle will follow rules of the road (e.g., stop when approaching a red light), and will react in a certain way to other dynamic objects (e.g., attempt to maintain some safe distance from other vehicles). The prediction component 520 may inherently account for such behaviors by utilizing a neural network or other machine learning model, for example. For example, in some embodiments, a machine learning model for prediction component 520 may be trained using virtual sensor data. In additional embodiments, virtual data may be used as output by a virtual version of perception component 506 to train a machine learning model of prediction component 520. The prediction component 520 may be omitted from the SDCA 500, in some embodiments. | US11521009B2 | Automatically generating training data for a lidar using simulated vehicles in virtual space | Luminar, Llc | Miguel Alexander Peake, Benjamin Englard | 2018-09-04 | 2019-09-04 | 2022-12-06 | 2022-12-06 | https://patents.google.com/patent/US11521009B2/en | null | 12,107 |
|
US10677897B2-67 | In one implementation, the controller 150 compares the detected sound to a sound signature stored in the memory of the controller 150 to identify the source of the sound. For example, the controller 150 may perform a Fourier analysis on the detected sound and compare the analyzed sound data to a stored sound signature. In this manner, the lidar system 100 may determine that the sound is a siren from an emergency vehicle as well as estimate the position of the emergency vehicle in relation to the vehicle. For example, the lidar system 100 may determine that the emergency vehicle is approaching the vehicle, and accordingly the vehicle may pull over. In other examples, the lidar system 100 may identify that the sound is a vehicle horn, skidding tires, a sound indicative of mechanical issues in the vehicle, or any other suitable sound. | US10677897B2 | Combining lidar and camera data | Luminar Technologies, Inc. | Joseph G. LaChapelle, Jason M. Eichenholz | 2017-04-14 | 2018-04-16 | 2020-06-09 | 2020-06-09 | https://patents.google.com/patent/US10677897B2/en | null | 6,836 |
|
US9810775B1-47 | In particular embodiments, an autonomous vehicle may be configured to drive with a driver present in the vehicle, or an autonomous vehicle may be configured to operate the vehicle with no driver present. As an example, an autonomous vehicle may include a driver's seat with associated controls (e.g., steering wheel, accelerator pedal, and brake pedal), and the vehicle may be configured to drive with no one seated in the driver's seat or with little or no input from a person seated in the driver's seat. As another example, an autonomous vehicle may not include any driver's seat or associated driver's controls, and the vehicle may perform substantially all driving functions (e.g., driving, steering, braking, parking, and navigating) without human input. As another example, an autonomous vehicle may be configured to operate without a driver (e.g., the vehicle may be configured to transport human passengers or cargo without a driver present in the vehicle). As another example, an autonomous vehicle may be configured to operate without any human passengers (e.g., the vehicle may be configured for transportation of cargo without having any human passengers onboard the vehicle). | US9810775B1 | Q-switched laser for LIDAR system | Luminar Technologies, Inc. | David Welford, Martin A. Jaspan, Jason M. Eichenholz, Scott R. Campbell, Lane A. Martin, Matthew D. Weed | 2017-03-16 | 2017-03-16 | 2017-11-07 | 2017-11-07 | https://patents.google.com/patent/US9810775B1/en | null | 14,210 |
|
US10812745B2-26 | The point cloud is provided to a perception engine 118 which applies perception techniques to the point cloud. This may include object classification that allows the future behavior of the object to be predicted. As an example, a tree will remain stationary and the outer periphery of the tree will likely bend in a collision. A pedestrian may move in any direction at any time, but will never move very fast. Perceiving the sizes and positions of particular objects in the scene allows the scene to be more fully characterized. | US10812745B2 | Bit depth reduction of image pixels | Luminar Technologies, Inc. | Richmond Hicks | 2019-03-14 | 2019-03-14 | 2020-10-20 | 2020-10-20 | https://patents.google.com/patent/US10812745B2/en | null | 7,268 |
|
US10551501B1-162 | In an example scenario, the relative speed between a target and lidar system is v=45 m/s (=100 mph). The time interval between successive pulses in a pulse burst is τ=1 ns; the pulse repetition frequency is f=1/τ=1 GHz; and the relative distance moved between successive pulses: Δd=(45 m/s)×(1 ns)=45 nm. The change in time interval between successive pulses: Δτ=(2×Δd)/c=0.3 fs, and the change in frequency between transmitted and received pulses: Δf=(Δτ)/τ²=300 Hz. Because it is generally easier to detect a frequency shift of 300 Hz than a time-domain shift of 0.3 fs, a dual-mode lidar system can process pulse-frequency characteristics in the frequency domain. However, both frequency-domain and time-domain approaches are discussed below. | US10551501B1 | Dual-mode lidar system | Luminar Technologies, Inc. | Joseph G. LaChapelle | 2018-08-09 | 2018-08-09 | 2020-02-04 | 2020-02-04 | https://patents.google.com/patent/US10551501B1/en | null | 5,165 |
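The worked numbers in the US10551501B1-162 row above (Δd = v·τ, Δτ = 2Δd/c, Δf = Δτ/τ²) can be checked with a short script. This is a verification sketch added for the preview, not part of the patent text; the variable names are illustrative, and c is rounded to 3×10⁸ m/s as the row's arithmetic implies.

```python
# Verify the example from US10551501B1-162:
# relative speed v = 45 m/s, pulse spacing tau = 1 ns (PRF = 1 GHz).
c = 3e8      # speed of light, m/s (rounded, matching the row's result)
v = 45.0     # relative target speed, m/s
tau = 1e-9   # time between successive pulses in a burst, s

delta_d = v * tau            # distance moved between successive pulses
delta_tau = 2 * delta_d / c  # round-trip change in pulse spacing
delta_f = delta_tau / tau**2 # frequency shift between TX and RX pulse trains

print(delta_d)    # 4.5e-08 m  -> 45 nm
print(delta_tau)  # 3e-16 s    -> 0.3 fs
print(delta_f)    # 300.0 Hz
```

All three values match the row, which supports its point that the 300 Hz frequency shift is a far easier observable than the 0.3 fs timing shift.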
|
US10627495B2-148 | In any event, the amplified signal may be compared to a threshold voltage VT. When the amplified signal rises above VT, the pulse-detection circuit 504 determines that a received optical signal from the APD 502 is indicative of a returned light pulse scattered by a remote target. | US10627495B2 | Time varying gain in an optical detector operating in a lidar system | Luminar Technologies, Inc. | Stephen D. Gaalema, Austin K. Russell, Joseph G. LaChapelle, Scott R. Campbell, Jason M. Eichenholz, Tue Tran | 2017-03-28 | 2018-11-09 | 2020-04-21 | 2020-04-21 | https://patents.google.com/patent/US10627495B2/en | null | 6,011 |
|
US11774561B2-38 | In particular embodiments, a lidar system 100 may be used to determine the distance to one or more downrange targets 130. By scanning the lidar system 100 across a field of regard, the system may be used to map the distance to a number of points within the field of regard. Each of these depth-mapped points may be referred to as a pixel or a voxel. A collection of pixels captured in succession (which may be referred to as a depth map, a point cloud, or a frame) may be rendered as an image or may be analyzed to identify or detect objects or to determine a shape or distance of objects within the FOR. As an example, a point cloud may cover a field of regard that extends 60° horizontally and 15° vertically, and the point cloud may include a frame of 100-2000 pixels in the horizontal direction by 4-400 pixels in the vertical direction. | US11774561B2 | Amplifier input protection circuits | Luminar Technologies, Inc. | Stephen D. Gaalema, Robert D. Still | 2019-02-08 | 2019-02-08 | 2023-10-03 | 2023-10-03 | https://patents.google.com/patent/US11774561B2/en | null | 13,285 |
|
US10418776B2-115 | In particular embodiments, a light source 110 of a lidar system 100 may include a solid-state laser, where the solid-state laser includes a PQSW laser 400 and an OPO 600. As an example, light source 110 may include an OPO 600 pumped by pulses of light from a Nd:YAG/Cr:YAG PQSW laser 400, a Nd:YAG/V:YAG PQSW laser 400, a Yb:YAG/Cr:YAG PQSW laser 400, or a Yb:YAG/V:YAG PQSW laser 400. In particular embodiments, pulses of OPO pump light may correspond to the pulses of light of output beam 460 produced by PQSW laser 400. As an example, the OPO pump beam illustrated in each of FIGS. 13-15 may correspond to an output beam 460 of PQSW laser 400. | US10418776B2 | Solid-state laser for lidar system | Luminar Technologies, Inc. | David Welford, Martin A. Jaspan, Jason M. Eichenholz, Scott R. Campbell, Lane A. Martin, Matthew D. Weed | 2017-03-16 | 2018-02-21 | 2019-09-17 | 2019-09-17 | https://patents.google.com/patent/US10418776B2/en | null | 3,717 |
|
US11521009B2-141 | Occupancy grid generator 600 may further include a label layer component 604 configured to generate a label layer 614. In various aspects, label layer 614 may be mapped to normal layer 612 (e.g., as depicted by occupancy grid 610), and encoded with a first channel set. While occupancy grid 610 is represented as a series of layered objects, it is to be understood that occupancy grid 610 need not be visualized and may exist as a computing structure or object, e.g., in memory 152 of graphics platform 101. The first channel set may be associated with one or more text-based or state-based values of one or more objects of the environment (e.g., objects or surfaces 402-418 of the virtual environment of FIG. 4A or objects or surfaces depicted in FIG. 6B, including, for example, road 655, vehicles 656A-C, pedestrian 656D, etc.). In some embodiments, the first channel set may include a plurality of first channels of a pixel. For example, the plurality of first channels of the pixel may include red (R), green (G), and blue (B) channels. Each of the plurality of first channels of the pixel may indicate a particular text-based or state-based value. The text-based or state-based values may define one or more classifications or one or more states of the one or more objects of the environment. For example, a value of zero (e.g., where all RGB channels have a zero value) may indicate that a vehicle (e.g., vehicle 401 of FIG. 4A or vehicle 656C of FIG. 6B) in the scene is not moving. As another example, a value of 65 may indicate (e.g., where RGB channels equal a value of 65), or label, that a particular object or surface within a scene is a miscellaneous object or surface (e.g., non-moving miscellaneous object 482 of FIG. 4B). | US11521009B2 | Automatically generating training data for a lidar using simulated vehicles in virtual space | Luminar, Llc | Miguel Alexander Peake, Benjamin Englard | 2018-09-04 | 2019-09-04 | 2022-12-06 | 2022-12-06 | https://patents.google.com/patent/US11521009B2/en | null | 12,129 |
|
US10191155B2-31 | The operating wavelength of a lidar system 100 may lie, for example, in the infrared, visible, or ultraviolet portions of the electromagnetic spectrum. The Sun also produces light in these wavelength ranges, and thus sunlight can act as background noise which can obscure signal light detected by the lidar system 100. This solar background noise can result in false-positive detections or can otherwise corrupt measurements of the lidar system 100, especially when the receiver 140 includes SPAD detectors (which can be highly sensitive). | US10191155B2 | Optical resolution in front of a vehicle | Luminar Technologies, Inc. | George C. Curatu | 2017-03-29 | 2017-11-27 | 2019-01-29 | 2019-01-29 | https://patents.google.com/patent/US10191155B2/en | null | 1,288 |
|
US10627516B2-51 | Moreover, in some implementations, the housing 155 includes multiple lidar sensors, each including a respective scanner and a receiver. Depending on the particular implementation, each of the multiple sensors can include a separate light source or a common light source. The multiple sensors can be configured to cover non-overlapping adjacent fields of regard or partially overlapping fields of regard, depending on the implementation. | US10627516B2 | Adjustable pulse characteristics for ground detection in lidar systems | Luminar Technologies, Inc. | Jason M. Eichenholz | 2018-07-19 | 2018-07-19 | 2020-04-21 | 2020-04-21 | https://patents.google.com/patent/US10627516B2/en | null | 6,196 |
|
US10345447B1-67 | The segmentation module 40 is generally configured to identify distinct objects within the sensor data representing the sensed environment. Depending on the embodiment and/or scenario, the segmentation task may be performed separately for each of a number of different types of sensor data, or may be performed jointly on a fusion of multiple types of sensor data. In some embodiments where lidar devices are used, the segmentation module 40 analyzes frames that include point cloud datasets therein to identify subsets of points within each frame that correspond to probable physical objects located in the environment. In other embodiments, the segmentation module 40 jointly analyzes lidar point cloud data frames in conjunction with camera image frames to identify objects that are located in the environment. Other suitable techniques, and/or data from other suitable sensor types, may also be used to identify objects. It is noted that, as used herein, references to different or distinct "objects" may encompass physical things that are entirely disconnected (e.g., with two vehicles being two different "objects," and the road on which the vehicles are traveling as yet a different "object"), as well as physical things that are connected or partially connected (e.g., with a vehicle being a first "object" and the vehicle's hitched trailer being a second "object"). The segmentation module 40 may use predetermined rules or algorithms to identify objects. For example, the segmentation module 40 may identify as distinct objects, within a point cloud, any clusters of points that meet certain criteria (e.g., having no more than a certain maximum distance between the points in the cluster, or having the same relative velocity). As another example, the segmentation module 40 may utilize one or more neural networks that have been trained to identify distinct objects within the environment (e.g., using supervised learning with generated labels for different objects within test data point clouds, etc.), or may utilize one or more other types of machine-learning based models that have been trained, by using test or training data, to discern, distinguish, and/or identify probable distinct objects within a source image. | US10345447B1 | Dynamic vision sensor to direct lidar scanning | Luminar Technologies, Inc. | Richmond Hicks | 2018-06-27 | 2018-06-27 | 2019-07-09 | 2019-07-09 | https://patents.google.com/patent/US10345447B1/en | null | 2,977 |
|
US10310058B1-146 | The controller 130 may be electrically coupled or otherwise communicatively coupled to one or more of the light source 122A, the scanner 11, and the receiver 128A. The controller 130 may receive electrical trigger pulses or edges from the light source 122A, where each pulse or edge corresponds to the emission of an optical pulse by the light source 122A. The controller 130 may provide instructions, a control signal, or a trigger signal to the light source 122A indicating when the light source 122A should produce optical pulses. For example, the controller 130 may send an electrical trigger signal that includes electrical pulses, where the light source 122A emits an optical pulse in response to each electrical pulse. Further, the controller 130 may cause the light source 122A to adjust one or more of the frequency, period, duration, pulse energy, peak power, average power, or wavelength of the optical pulses produced by the light source 122A. | US10310058B1 | Concurrent scan of multiple pixels in a lidar system equipped with a polygon mirror | Luminar Technologies, Inc. | Scott R. Campbell, Jason M. Eichenholz, Matthew D. Weed, Lane A. Martin | 2017-11-22 | 2018-04-27 | 2019-06-04 | 2019-06-04 | https://patents.google.com/patent/US10310058B1/en | null | 2,400 |
|
US9810786B1-224 | In some embodiments, the lidar system further comprises an overlap mirror configured to overlap the input and output beams so that they are substantially coaxial, wherein the overlap mirror comprises: a hole, slot, or aperture which the output beam passes through; and a reflecting surface that reflects at least a portion of the input beam toward the receiver. | US9810786B1 | Optical parametric oscillator for lidar system | Luminar Technologies, Inc. | David Welford, Martin A. Jaspan, Jason M. Eichenholz, Scott R. Campbell, Lane A. Martin, Matthew D. Weed | 2017-03-16 | 2017-03-16 | 2017-11-07 | 2017-11-07 | https://patents.google.com/patent/US9810786B1/en | null | 14,625 |
|
US9810775B1-222 | In some embodiments, the scanner comprises one or more mirrors, wherein each mirror is mechanically driven by a galvanometer scanner, a resonant scanner, a microelectromechanical systems (MEMS) device, or a voice coil motor. | US9810775B1 | Q-switched laser for LIDAR system | Luminar Technologies, Inc. | David Welford, Martin A. Jaspan, Jason M. Eichenholz, Scott R. Campbell, Lane A. Martin, Matthew D. Weed | 2017-03-16 | 2017-03-16 | 2017-11-07 | 2017-11-07 | https://patents.google.com/patent/US9810775B1/en | null | 14,385 |
|
US10451716B2-141 | The receiver 128A may have an active region or an avalanche-multiplication region that includes silicon, germanium, or InGaAs. The active region of receiver 128A may have any suitable size, such as for example, a diameter or width of approximately 50-500 μm. The receiver 128 may include circuitry that performs signal amplification, sampling, filtering, signal conditioning, analog-to-digital conversion, time-to-digital conversion, pulse detection, threshold detection, rising-edge detection, or falling-edge detection. For example, the receiver 128A may include a transimpedance amplifier that converts a received photocurrent (e.g., a current produced by an APD in response to a received optical signal) into a voltage signal. The receiver 128A may direct the voltage signal to pulse-detection circuitry that produces an analog or digital output signal 145A that corresponds to one or more characteristics (e.g., rising edge, falling edge, amplitude, or duration) of a received optical pulse. For example, the pulse-detection circuitry may perform a time-to-digital conversion to produce the digital output signal 145A. The receiver 128A may send the electrical output signal 145A to the controller 130 for processing or analysis, e.g., to determine a time-of-flight value corresponding to a received optical pulse. | US10451716B2 | Monitoring rotation of a mirror in a lidar system | Luminar Technologies, Inc. | John Hughes, Nicholas Ventola, Sean P. Hughes | 2017-11-22 | 2018-04-20 | 2019-10-22 | 2019-10-22 | https://patents.google.com/patent/US10451716B2/en | null | 4,101 |
|
US11367990B2-154 | In particular embodiments, lidar system 100 may include a processor (e.g., controller 150 in FIG. 1) configured to adjust the output powers of two or more pump laser diodes 430. As an example, in FIG. 13, the output power of each pump laser diode may be adjusted in response to a temperature change (e.g., a temperature change of the DPSS laser 400, a pump laser diode 430-1 or 430-2, the light source 110, or the lidar system 100). The controller 150 may be configured to adjust the output powers of the pump laser diodes so that the pump laser operating at the most efficient wavelength to pump the gain medium 410 produces the most output power. For example, at a lower temperature (e.g., pump laser 430-1 operates at approximately 803 nm, and pump laser 430-2 operates at approximately 808 nm), the controller 150 may instruct pump laser 430-1 to produce a relatively low output power (e.g., 0 to 0.5 W), and pump laser 430-2 may be instructed to produce a relatively high output power (e.g., greater than 3 W). Similarly, at a higher temperature (e.g., pump laser 430-1 operates at approximately 808 nm, and pump laser 430-2 operates at approximately 813 nm), the controller may instruct pump laser 430-1 to produce a relatively high output power, and the output power of pump laser 430-2 may be decreased to a relatively low value. | US11367990B2 | Lidar system operating at 1200-1400 NM | Luminar, Llc | Jason M. Eichenholz, Laurance S. Lingvay, David Welford | 2018-08-29 | 2019-08-29 | 2022-06-21 | 2022-06-21 | https://patents.google.com/patent/US11367990B2/en | null | 9,985 |
|
US10241198B2-144 | Various implementations have been described in connection with the accompanying drawings. However, it should be understood that the figures may not necessarily be drawn to scale. As an example, distances or angles depicted in the figures are illustrative and may not necessarily bear an exact relationship to actual dimensions or layout of the devices illustrated. | US10241198B2 | Lidar receiver calibration | Luminar Technologies, Inc. | Joseph G. LaChapelle, Rodger W. Cleye, Scott R. Campbell, Jason M. Eichenholz | 2017-03-30 | 2017-11-30 | 2019-03-26 | 2019-03-26 | https://patents.google.com/patent/US10241198B2/en | null | 1,746 |
|
US10451716B2-85 | The planar mirror 14 may be configured so as to pivot over a range of allowable motion larger than a range corresponding to the vertical angular dimension of the field of regard, so as to define a maximum range of allowable motion larger than a range within which the planar mirror 14 pivots during a scan. A controller associated with the planar mirror 14 selects different portions of the maximum range of allowable motion as the range within which the second mirror pivots, in accordance with modifications of the scan pattern. In particular, to modify at least one of a scan pattern or a scan rate, a controller associated with the motor 32 of the polygon mirror 12 can be configured to cause the motor 32 to vary the speed of rotation of the polygon mirror 12, cause the drive motor 64 to vary the oscillation of the planar mirror 14, or both. The controller can be associated with both the polygon mirror 12 and the planar mirror 14. The controller may be configured to modify the scan pattern on a frame-by-frame basis, each frame corresponding to a complete scan of the field of regard of the lidar system 10. In some implementations, the oscillation of the planar mirror 14 may be varied (e.g., to change the vertical angular dimension of the field of regard), and the rotational speed of the polygon mirror 12 may be regulated or stabilized so that the polygon mirror 12 rotates at a substantially constant speed. | US10451716B2 | Monitoring rotation of a mirror in a lidar system | Luminar Technologies, Inc. | John Hughes, Nicholas Ventola, Sean P. Hughes | 2017-11-22 | 2018-04-20 | 2019-10-22 | 2019-10-22 | https://patents.google.com/patent/US10451716B2/en | null | 4,045 |
|
US10732281B2-8 | The scanning system may also include an envelope detector coupled to the plurality of amplitude detectors that determines a magnitude or amplitude envelope of the scattered light pulse based on the time delays determined by the plurality of amplitude detectors. The envelope detector may determine the center of the scattered light pulse based on the time delays determined by three or more of the plurality of amplitude detectors, based on one or more of the amplitude detectors associated with a maximum detected threshold, or based on the center of the scattered light pulse determined in any other manner. | US10732281B2 | Lidar detector system having range walk compensation | Luminar Technologies, Inc. | Joseph G. LaChapelle | 2017-03-28 | 2017-10-06 | 2020-08-04 | 2020-08-04 | https://patents.google.com/patent/US10732281B2/en | null | 7,075 |
|
US11367990B2-12 | FIG. 10 illustrates an example laser diode along with an example volume Bragg grating (VBG). | US11367990B2 | Lidar system operating at 1200-1400 NM | Luminar, Llc | Jason M. Eichenholz, Laurance S. Lingvay, David Welford | 2018-08-29 | 2019-08-29 | 2022-06-21 | 2022-06-21 | https://patents.google.com/patent/US11367990B2/en | null | 9,843 |
|
US10481605B1-18 | FIG. 10 is a flow diagram of an example method of managing operation of an autonomous vehicle moving toward a destination, in accordance with some embodiments; | US10481605B1 | Autonomous vehicle technology for facilitating safe stopping according to separate paths | Luminar Technologies, Inc. | Tomi P. Maila, Vahid R. Ramezani, Benjamin Englard | 2018-09-21 | 2018-09-21 | 2019-11-19 | 2019-11-19 | https://patents.google.com/patent/US10481605B1/en | null | 4,250 |
|
US10591600B2-332 | As used herein, the terms “based on” and “based at least in part on” may be used to describe or present one or more factors that affect a determination, and these terms may not exclude additional factors that may affect a determination. A determination may be based solely on those factors which are presented or may be based at least in part on those factors. The phrase “determine A based on B” indicates that B is a factor that affects the determination of A. In some instances, other factors may also contribute to the determination of A. In other instances, A may be determined based solely on B. | US10591600B2 | Lidar system with distributed laser and multiple sensor heads | Luminar Technologies, Inc. | Alain Villeneuve, Jason M. Eichenholz | 2015-11-30 | 2016-11-29 | 2020-03-17 | 2020-03-17 | https://patents.google.com/patent/US10591600B2/en | null | 5,862 |
|
US10627521B2-155 | In other embodiments, distinct neural networks are separately trained to handle different sensor parameter settings. FIG. 13 illustrates one such embodiment. In FIG. 13, a perception component 730 includes N sets 732 of neural networks (N being any suitable integer greater than one). Each set 732 of neural networks includes one or more neural networks that are trained, using a respective one of N sets 734 of training data, to accommodate a sensor configured according to a specific sensor parameter setting. For example, “Neural Network(s) 1” may be trained, using “Training Data 1,” to process a first scan line distribution, while “Neural Network(s) 2” may be trained, using “Training Data 2,” to process a second, different scan line distribution. In some embodiments, each of the sets 732 of neural networks includes separate neural networks for segmentation, classification and/or tracking (e.g., corresponding to the functions of the segmentation module 110, classification module 112 and/or tracking module 114, respectively, of FIG. 1). Alternatively, each of the sets 732 of neural networks may include a single neural network that jointly performs segmentation, classification and tracking, or jointly performs any two of those three functions. | US10627521B2 | Controlling vehicle sensors based on dynamic objects | Luminar Technologies, Inc. | Benjamin Englard, Eric C. Danziger, Austin K. Russell | 2017-12-13 | 2018-10-31 | 2020-04-21 | 2020-04-21 | https://patents.google.com/patent/US10627521B2/en | null | 6,499 |
|
US11927777B2-55 | In one implementation, the insulating undercut area 612 has a diameter greater than a corresponding diameter (W1) of the high-index dielectric block 606. In this case, the cell 602 may be suspended above the insulating undercut area 612 via one or more mechanical tethers, such as in the manner shown within top-down cell view 620. Specifically, the top-down cell view 620 illustrates exemplary positions for mechanical tethers 622 that prevent each cell from collapsing into the underlying insulating undercut area 612. Although any number of tethers may be used, the top-down cell view 620 illustrates four separate mechanical tethers 622 that each extend across a region of the insulating undercut area 612 between a wall of the low-index dielectric substrate 604 and the cell 602, providing both tension and support that allow the cell 602 to rest in a position vertically aligned with a center of the underlying cavity. | US11927777B2 | Plasma dispersion effect for metasurface tuning | Luminar Technologies, Inc. | Aditya Jain, Zoran Jandric, Dan Mohr, Kevin A. Gomez, Krishnan Subramanian | 2019-12-14 | 2020-07-30 | 2024-03-12 | 2024-03-12 | https://patents.google.com/patent/US11927777B2/en | null | 14,005 |
|
US11361449B2-120 | Next, FIG. 8 illustrates an example method 800 for constructing tracks through a message passing graph, such as the graph 50 or 70 of the examples above, which also can be implemented in the multi-object tracker 14. At block 802, the multi-object tracker 14 receives a sequence of images generated by one or more sensors. At block 804, the multi-object tracker 14 constructs a message passing graph in which each of a multiplicity of layers corresponds to a respective image in the sequence of images. | US11361449B2 | Neural network for object detection and tracking | Luminar, Llc | Vahid R. Ramezani, Akshay Rangesh, Benjamin Englard, Siddhesh S. Mhatre, Meseret R. Gebre, Pranav Maheshwari | 2020-05-06 | 2020-09-04 | 2022-06-14 | 2022-06-14 | https://patents.google.com/patent/US11361449B2/en | null | 9,811 |
|
US11467266B2-193 | In particular embodiments, a phase shifter 429 may be implemented as a part of an integrated-optic 90-degree optical hybrid 428. For example a phase shifter 429 may be implemented as a portion of optical waveguide that only one part of the LO light 430 propagates through. The portion of optical waveguide may be temperature controlled to adjust the refractive index of the waveguide portion and produce a relative phase delay of approximately 90 degrees between the two parts of LO light 430. Additionally or alternatively, the 90-degree optical hybrid 428 as a whole may be temperature controlled to set and maintain a 90-degree phase delay. As another example, a phase shifter 429 may be implemented by applying an external electric field to a portion of optical waveguide to change the refractive index of the waveguide portion and produce a 90-degree phase delay. In particular embodiments, a phase shifter 429 may be implemented as a part of a free-space or fiber-coupled 90-degree optical hybrid 428. For example the input and output beams in a free-space 90-degree optical hybrid 428 may be reflected by or transmitted through the optical surfaces of the optical hybrid 428 so that a relative phase shift of 90 degrees is imparted to one part of LO light 430 with respect to the other part of LO light 430. | US11467266B2 | Coherent pulsed lidar system with spectral signatures | Luminar, Llc | Joseph G. LaChapelle, Jason M. Eichenholz, Alex Michael Sincore | 2019-08-20 | 2020-06-26 | 2022-10-11 | 2022-10-11 | https://patents.google.com/patent/US11467266B2/en | null | 11,812 |
|
US10627521B2-148 | Dynamic adjustment of sensor parameters settings (e.g., for parameters that define the area of focus for a sensor), according to any of the embodiments described herein, can greatly improve the ability of sensors to capture useful information about the environment (e.g., information needed to improve vehicle safety). However, variability in sensor settings may make it more difficult to process the sensor data. For example, perception functions (e.g., segmentation, classification and tracking) may be made more difficult if the perception component must process lidar data with different scan line spatial distributions, camera data with different exposure settings, and so on. In the case of non-uniform scan line distributions, for instance, different parts of the “scene” captured by the sensor (e.g., different elevation angle ranges captured by a lidar device) will have different densities of points as compared to a uniform scan line distribution. Thus, an object (e.g., a car) may “look” very different based solely on where the object resides within the scene, even if the object remains at a constant distance from the sensor. | US10627521B2 | Controlling vehicle sensors based on dynamic objects | Luminar Technologies, Inc. | Benjamin Englard, Eric C. Danziger, Austin K. Russell | 2017-12-13 | 2018-10-31 | 2020-04-21 | 2020-04-21 | https://patents.google.com/patent/US10627521B2/en | null | 6,492 |
|
US10324170B1-10 | Because the polygon block rotates at a high speed, it produces a significant amount of acoustic noise. To reduce the acoustic noise, the polygon block in some implementations includes chamfered edges or corners. In other implementations, the housing partially enclosing the polygon mirror, and/or the bracket adjacent to which the polygon mirror is mounted, includes tapered features on the interior surface. These tapered features effectively spread out the pressure wave in time, thereby reducing the energy of the acoustic waves generated by the rotating polygon mirror. | US10324170B1 | Multi-beam lidar system with polygon mirror | Luminar Technologies, Inc. | John P. Engberg, Jr., Christopher A. Engberg, John G. Hughes, Sean P. Hughes | 2018-04-05 | 2018-05-08 | 2019-06-18 | 2019-06-18 | https://patents.google.com/patent/US10324170B1/en | null | 2,507 |
|
US11002853B2-124 | FIG. 10 illustrates an example InGaAs avalanche photodiode (APD) 400. Referring back to FIG. 1, the receiver 140 may include one or more APDs 400 configured to receive and detect light from input light such as the beam 135. More generally, the APD 400 can operate in any suitable receiver of input light. The APD 400 may be configured to detect a portion of pulses of light which are scattered by a target located downrange from the lidar system in which the APD 400 operates. For example, the APD 400 may receive a portion of a pulse of light scattered by the target 130 depicted in FIG. 1, and generate an electrical-current signal corresponding to the received pulse of light. | US11002853B2 | Ultrasonic vibrations on a window in a lidar system | Luminar, Llc | John E. McWhirter | 2017-03-29 | 2017-09-26 | 2021-05-11 | 2021-05-11 | https://patents.google.com/patent/US11002853B2/en | null | 8,250 |
|
US10983213B2-182 | FIG. 21 illustrates an example detector array 700 with non-uniform spatial separation between adjacent array elements, which can be implemented in the lidar system of FIG. 1. The detector array 700 may be generally similar to the detector 600 of FIG. 12, for example. Similar to the detector 600, detector sites 702A, 702B, etc. may include any suitable individual detector element or a cluster of detector elements. However, detector sites 702B/702C and 702C/702D in this example implementation are separated by distance d1, and detector sites 702A/702B and 702D/702E are separated by distance d2, and the distance d2 is larger than the distance d1. The detector site 702C may be disposed on the centerline 704, which may correspond to the center of the field of regard. Thus, in the example of FIG. 21, spatial separation increases with the distance from the center of the field of regard. A lidar system that scans different lines using different output beams (see, e.g., FIG. 6), for an angular field of regard or a circular field of regard (see, e.g., FIGS. 3 and 4), may use the detector array 700 to scan line A using beam A and the detector site 702A, line B using beam B and the detector site 702B, etc. | US10983213B2 | Non-uniform separation of detector array elements in a lidar system | Luminar Holdco, Llc | Jason M. Eichenholz, Scott R. Campbell, Joseph G. LaChapelle | 2017-03-29 | 2018-03-29 | 2021-04-20 | 2021-04-20 | https://patents.google.com/patent/US10983213B2/en | null | 8,098 |
|
US11353555B2-41 | In particular embodiments, one or more lidar systems 100 may be integrated into a vehicle as part of an autonomous-vehicle driving system. As an example, a lidar system 100 may provide information about the surrounding environment to a driving system of an autonomous vehicle. An autonomous-vehicle driving system may include one or more computing systems that receive information from a lidar system 100 about the surrounding environment, analyze the received information, and provide control signals to the vehicle's driving systems (e.g., steering wheel, accelerator, brake, or turn signal). As an example, a lidar system 100 integrated into an autonomous vehicle may provide an autonomous-vehicle driving system with a point cloud every 0.1 seconds (e.g., the point cloud has a 10 Hz update rate, representing 10 frames per second). The autonomous-vehicle driving system may analyze the received point clouds to sense or identify targets 130 and their respective locations, distances, or speeds, and the autonomous-vehicle driving system may update control signals based on this information. As an example, if lidar system 100 detects a vehicle ahead that is slowing down or stopping, the autonomous-vehicle driving system may send instructions to release the accelerator and apply the brakes. | US11353555B2 | Detector quench circuit for lidar system comprising a discrete transistor to draw a quench current to enable a drop in a reverse bias voltage applied to an avalanche photodiode | Luminar, Llc | Stephen D. Gaalema | 2017-11-01 | 2018-11-01 | 2022-06-07 | 2022-06-07 | https://patents.google.com/patent/US11353555B2/en | null | 9,406 |
|
US10003168B1-210 | The free-space input beam 914 in FIG. 22 may include pulses of light having one or more wavelengths between approximately 1400 nm and approximately 1600 nm, a pulse duration less than or equal to 100 nanoseconds, or a duty cycle less than or equal to 10%. In particular embodiments, free-space input beam 914 may be supplied to amplifier assembly 900 by a seed laser diode or by a gain fiber from a previous amplifier stage. As an example, an amplifier assembly 900 may include a free-space seed laser diode (e.g., similar to seed laser diode 710 in FIG. 15) configured to supply the free-space input beam 914. The amplifier assembly 900 may also include a lens (e.g., similar to seed-laser lens 720 in FIG. 15) configured to collimate or focus the input beam 914. As another example, an amplifier assembly 900 may include a fiber-coupled seed laser diode (e.g., similar to laser diode 440 in FIG. 8). An output end of a fiber-optic cable from the fiber-coupled seed laser diode may be attached to the platform 905 and may supply the free-space input beam 914. As another example, the output end of a gain fiber (e.g., similar to output end 808 of gain fiber 760 in FIG. 18) may be attached to the platform 905 and may supply the free-space input beam 914. The amplifier assembly 900 may also include a lens (similar to input-beam lens 820 in FIG. 18) configured to collimate or focus the input beam 914. | US10003168B1 | Fiber laser with free-space components | Luminar Technologies, Inc. | Alain Villeneuve | 2017-10-18 | 2017-11-30 | 2018-06-19 | 2018-06-19 | https://patents.google.com/patent/US10003168B1/en | null | 210 |
|
US11841440B2-99 | Each scan line 230 of a high-resolution scan pattern 200 may be oriented substantially parallel to a first scan axis, and the scan lines 230 may be distributed along a second scan axis that is substantially orthogonal to the first scan axis. The first scan axis (which may be referred to as the Θ1 scan axis) may be oriented along a horizontal direction, a vertical direction, at 45 degrees to a horizontal direction, or along any other suitable direction, and the second scan axis (which may be referred to as the Θ2 scan axis) may be substantially orthogonal to the first scan axis. For example, the first scan axis may be along a substantially horizontal direction (e.g., within 10° of the horizontal direction), and the scan lines 230 may be distributed along a substantially vertical direction (e.g., within 10° of the vertical direction). Alternatively, the first scan axis may be along a substantially vertical direction, and the scan lines 230 may be distributed along a substantially horizontal direction. The scan lines 230 in FIG. 7 may be referred to as being (i) oriented along a substantially horizontal direction (which corresponds to the Θ1 scan axis) and (ii) distributed along a substantially vertical direction (which corresponds to the Θ2 scan axis). Scan lines 230 that are oriented within 10° of a Θ1 scan axis may be referred to as being oriented substantially along or substantially parallel to the Θ1 scan axis. | US11841440B2 | Lidar system with high-resolution scan pattern | Luminar Technologies, Inc. | Istvan Peter Burbank, Matthew D. Weed, Jason Paul Wojack, Jason M. Eichenholz, Dmytro Trofymov | 2020-05-13 | 2021-11-24 | 2023-12-12 | 2023-12-12 | https://patents.google.com/patent/US11841440B2/en | null | 13,650 |
|
US10267918B2-47 | According to some implementations, the lidar system 100 can include an eye-safe laser, or the lidar system 100 can be classified as an eye-safe laser system or laser product. An eye-safe laser, laser system, or laser product may refer to a system with an emission wavelength, average power, peak power, peak intensity, pulse energy, beam size, beam divergence, exposure time, or scanned output beam such that emitted light from the system presents little or no possibility of causing damage to a person's eyes. For example, the light source 110 or lidar system 100 may be classified as a Class 1 laser product (as specified by the 60825-1 standard of the International Electrotechnical Commission (IEC)) or a Class I laser product (as specified by Title 21, Section 1040.10 of the United States Code of Federal Regulations (CFR)) that is safe under all conditions of normal use. In some implementations, the lidar system 100 may be classified as an eye-safe laser product (e.g., with a Class 1 or Class I classification) configured to operate at any suitable wavelength between approximately 1400 nm and approximately 2100 nm. In some implementations, the light source 110 may include a laser with an operating wavelength between approximately 1400 nm and approximately 1600 nm, and the lidar system 100 may be operated in an eye-safe manner. In some implementations, the light source 110 or the lidar system 100 may be an eye-safe laser product that includes a scanned laser with an operating wavelength between approximately 1530 nm and approximately 1560 nm. In some implementations, the lidar system 100 may be a Class 1 or Class I laser product that includes a fiber laser or solid-state laser with an operating wavelength between approximately 1400 nm and approximately 1600 nm. | US10267918B2 | Lidar detector having a plurality of time to digital converters integrated onto a detector chip | Luminar Technologies, Inc. | Joseph G. LaChapelle, Jason M. Eichenholz, Stephen D. Gaalema, Austin K. Russell | 2017-03-28 | 2018-06-25 | 2019-04-23 | 2019-04-23 | https://patents.google.com/patent/US10267918B2/en | null | 1,950 |
|
US10481605B1-96 | The network interface 616 is generally configured to convert data received from one or more devices or systems external to the autonomous vehicle to a format that is consistent with a protocol of the network 608 and is recognized by one or more of the processor(s) 602. In some embodiments, the network interface 616 includes separate interface hardware, firmware and/or software for different external sources. For example, a remote mapping/navigation server may send mapping and navigation/route data (e.g., mapping and navigation signals 132 of FIG. 1) to the computing system 600 via a cellular network interface of the network interface 616, while one or more peer vehicles (e.g., other autonomous vehicles) may send data (e.g., current positions of the other vehicles) to the computing system 600 via a WiFi network interface of the network interface 616. Other types of external data may also, or instead, be received via the network interface 616. For example, the computing system 600 may use the network interface 616 to receive data representing rules or regulations (e.g., speed limits), object positions (e.g., road rails, overhanging signage, etc.), and/or other information from various infrastructure devices or systems. | US10481605B1 | Autonomous vehicle technology for facilitating safe stopping according to separate paths | Luminar Technologies, Inc. | Tomi P. Maila, Vahid R. Ramezani, Benjamin Englard | 2018-09-21 | 2018-09-21 | 2019-11-19 | 2019-11-19 | https://patents.google.com/patent/US10481605B1/en | null | 4,328 |
|
US10094925B1-28 | System Overview | US10094925B1 | Multispectral lidar system | Luminar Technologies, Inc. | Joseph G. LaChapelle | 2017-03-31 | 2018-04-02 | 2018-10-09 | 2018-10-09 | https://patents.google.com/patent/US10094925B1/en | null | 651 |
|
US10295668B2-121 | In FIG. 10, photons of the input light 410 may be absorbed primarily in the absorption layer 424, resulting in the generation of electron-hole pairs (which may be referred to as photo-generated carriers). For example, the absorption layer 424 may be configured to absorb photons corresponding to the operating wavelength of the lidar system 100 (e.g., any suitable wavelength between approximately 1400 nm and approximately 1600 nm). In the avalanche layer 422, an avalanche-multiplication process occurs where carriers (e.g., electrons or holes) generated in the absorption layer 424 collide with the semiconductor lattice of the absorption layer 424, and produce additional carriers through impact ionization. This avalanche process can repeat numerous times so that one photo-generated carrier may result in the generation of multiple carriers. As an example, a single photon absorbed in the absorption layer 424 may lead to the generation of approximately 10, 50, 100, 200, 500, 1000, 10,000, or any other suitable number of carriers through an avalanche-multiplication process. The carriers generated in an APD 400 may produce an electrical current that is coupled to an electrical circuit which may perform signal amplification, sampling, filtering, signal conditioning, analog-to-digital conversion, time-to-digital conversion, pulse detection, threshold detection, rising-edge detection, or falling-edge detection. | US10295668B2 | Reducing the number of false detections in a lidar system | Luminar Technologies, Inc. | Joseph G. LaChapelle, Jason M. Eichenholz, Laurance S. Lingvay | 2017-03-30 | 2017-12-15 | 2019-05-21 | 2019-05-21 | https://patents.google.com/patent/US10295668B2/en | null | 2,206 |
|
US10571570B1-9 | FIG. 7 illustrates an example voltage signal corresponding to a received optical signal. | US10571570B1 | Lidar system with range-ambiguity mitigation | Luminar Technologies, Inc. | David L. Paulsen, Christopher Gary Sentelle, Zachary Heylmun, Matthew Hansen | 2019-03-07 | 2019-06-28 | 2020-02-25 | 2020-02-25 | https://patents.google.com/patent/US10571570B1/en | null | 5,267 |
|
US10324170B1-118 | Components of the first eye of the lidar system 10E thus generate outbound beams 250-1A and 250-1B, and components of the second eye generate outbound beams 250-2A and 250-2B. Stationary mirrors 208-1 and 208-2 fold these beams at 90 degrees to reduce the overall size of the lidar system 10E. Each outbound beam first strikes a folding mirror, which then reflects the outbound beam toward a scan mirror whose current rotary position defines the vertical scan angle, and the scan mirror in turn reflects the outbound beam toward the polygon mirror, whose current rotary position defines the horizontal scan angle of the pulse. | US10324170B1 | Multi-beam lidar system with polygon mirror | Luminar Technologies, Inc. | John P. Engberg, Jr., Christopher A. Engberg, John G. Hughes, Sean P. Hughes | 2018-04-05 | 2018-05-08 | 2019-06-18 | 2019-06-18 | https://patents.google.com/patent/US10324170B1/en | null | 2,615 |
|
US10191155B2-139 | At block 804, the emitted light pulses are directed, via the scanner 120, at various scan angles or orientations relative to a forward-facing direction of the vehicle. In this manner, the emitted light pulses are scanned across a FORH (e.g., from −60 degrees horizontal to +60 degrees horizontal with respect to the forward-facing direction of the vehicle) and a FORV (e.g., from −15 degrees vertical to +15 degrees vertical). In some implementations, the controller 150 provides a drive signal to the scanner 120 for rotating the scanning mirror across a FORH to direct light pulses toward different points within the FORH. The drive signal or another drive signal may also be provided to the scanner 120 for rotating another scanning mirror across a FORV to direct the light pulses to generate horizontal scan lines at several heights within the FORV. | US10191155B2 | Optical resolution in front of a vehicle | Luminar Technologies, Inc. | George C. Curatu | 2017-03-29 | 2017-11-27 | 2019-01-29 | 2019-01-29 | https://patents.google.com/patent/US10191155B2/en | null | 1,396 |
|
US11874401B2-53 | Surface A or surface B may have a dichroic coating that is anti-reflecting at one or more operating wavelengths of one or more light sources 110 and high-reflecting at wavelengths away from the one or more operating wavelengths. For example, surface A may have an AR coating for an operating wavelength of the light source 110, and surface B may have a dichroic coating that is AR at the light-source operating wavelength and HR for wavelengths away from the operating wavelength. A coating that is HR for wavelengths away from a light-source operating wavelength may prevent most incoming light at unwanted wavelengths from being transmitted through the window 117. In one implementation, if light source 110 emits optical pulses with a wavelength of approximately 1550 nm, then surface A may have an AR coating with a reflectivity of less than or equal to 0.5% from approximately 1546 nm to approximately 1554 nm. Additionally, surface B may have a dichroic coating that is AR at approximately 1546-1554 nm and HR (e.g., reflectivity of greater than or equal to 90%) at approximately 800-1500 nm and approximately 1580-1700 nm. | US11874401B2 | Adjusting receiver characteristics in view of weather conditions | Luminar Technologies, Inc. | Joseph G. LaChapelle, Matthew D. Weed, Scott R. Campbell, Jason M. Eichenholz, Austin K. Russell, Lane A. Martin | 2017-03-28 | 2019-04-08 | 2024-01-16 | 2024-01-16 | https://patents.google.com/patent/US11874401B2/en | null | 13,826 |
|
US11543652B2-64 | With continued reference to FIG. 10 , the light source 76 may include a pulsed laser configured to produce or emit pulses of light with a certain pulse duration. In an example implementation, the pulse duration or pulse width of the pulsed laser is approximately 10 picoseconds (ps) to 20 nanoseconds (ns). In another implementation, the light source 76 is a pulsed laser that produces pulses with a pulse duration of approximately 1-4 ns. In yet another implementation, the light source 76 is a pulsed laser that produces pulses at a pulse repetition frequency of approximately 100 kHz to 5 MHz or a pulse period (e.g., a time between consecutive pulses) of approximately 200 ns to 10 μs. The light source 76 may have a substantially constant or a variable pulse repetition frequency, depending on the implementation. As an example, the light source 76 may be a pulsed laser that produces pulses at a substantially constant pulse repetition frequency of approximately 640 kHz (e.g., 640,000 pulses per second), corresponding to a pulse period of approximately 1.56 μs. As another example, the light source 76 may have a pulse repetition frequency that can be varied from approximately 500 kHz to 3 MHz. As used herein, a pulse of light may be referred to as an optical pulse, a light pulse, or a pulse, and a pulse repetition frequency may be referred to as a pulse rate. | US11543652B2 | Imaging system having coil on mirror actuator | Luminar, Llc | Sean P. Hughes | 2020-04-20 | 2020-04-20 | 2023-01-03 | 2023-01-03 | https://patents.google.com/patent/US11543652B2/en | null | 12,436 |
|
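The row above quotes a pulse repetition frequency of approximately 640 kHz corresponding to a pulse period of approximately 1.56 μs; the two figures are reciprocals of one another. A minimal sketch checking that arithmetic (the function name is illustrative, not from the patent):

```python
def pulse_period(prf_hz: float) -> float:
    """Pulse period in seconds: the time between consecutive pulses
    is the reciprocal of the pulse repetition frequency (PRF)."""
    return 1.0 / prf_hz

# A 640 kHz PRF gives ~1.56 microseconds between pulses, as stated in the row.
print(round(pulse_period(640e3) * 1e6, 2))  # 1.56
```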
US10445599B1-0 | The present description relates generally to vehicle navigation systems and in particular to object detection augmented with thermal sensor data. | US10445599B1 | Sensor system augmented with thermal sensor object confirmation | Luminar Technologies, Inc. | Richmond Hicks | 2018-06-13 | 2018-06-13 | 2019-10-15 | 2019-10-15 | https://patents.google.com/patent/US10445599B1/en | null | 3,837 |
|
US10241198B2-68 | A galvanometer scanner (which also may be referred to as a galvanometer actuator) may include a galvanometer-based scanning motor with a magnet and coil. When an electrical current is supplied to the coil, a rotational force is applied to the magnet, which causes a mirror attached to the galvanometer scanner to rotate. The electrical current supplied to the coil may be controlled to dynamically change the position of the galvanometer mirror. A resonant scanner (which may be referred to as a resonant actuator) may include a spring-like mechanism driven by an actuator to produce a periodic oscillation at a substantially fixed frequency (e.g., 1 kHz). A MEMS-based scanning device may include a mirror with a diameter between approximately 1 and 10 mm, where the mirror is rotated using electromagnetic or electrostatic actuation. A voice coil motor (which may be referred to as a voice coil actuator) may include a magnet and coil. When an electrical current is supplied to the coil, a translational force is applied to the magnet, which causes a mirror attached to the magnet to move or rotate. | US10241198B2 | Lidar receiver calibration | Luminar Technologies, Inc. | Joseph G. LaChapelle, Rodger W. Cleye, Scott R. Campbell, Jason M. Eichenholz | 2017-03-30 | 2017-11-30 | 2019-03-26 | 2019-03-26 | https://patents.google.com/patent/US10241198B2/en | null | 1,670 |
|
US10310058B1-66 | FIGS. 39A and 39B schematically illustrate adjusting the vertical field of regard FORV based on detected changes in the grade of the road, which can be implemented in the lidar sensor unit of FIG. 1; | US10310058B1 | Concurrent scan of multiple pixels in a lidar system equipped with a polygon mirror | Luminar Technologies, Inc. | Scott R. Campbell, Jason M. Eichenholz, Matthew D. Weed, Lane A. Martin | 2017-11-22 | 2018-04-27 | 2019-06-04 | 2019-06-04 | https://patents.google.com/patent/US10310058B1/en | null | 2,320 |
|
US11415676B2-29 | FIG. 34 illustrates an example computer system. | US11415676B2 | Interlaced scan patterns for lidar system | Luminar, Llc | Eric C. Danziger | 2017-10-09 | 2018-10-09 | 2022-08-16 | 2022-08-16 | https://patents.google.com/patent/US11415676B2/en | null | 10,610 |
|
US9989629B1-52 | With continued reference to FIG. 1, the light source 110 may include a pulsed laser configured to produce or emit pulses of light with a certain pulse duration. In an example implementation, the pulse duration or pulse width of the pulsed laser is approximately 10 picoseconds (ps) to 20 nanoseconds (ns). In another implementation, the light source 110 is a pulsed laser that produces pulses with a pulse duration of approximately 1-4 ns. In yet another implementation, the light source 110 is a pulsed laser that produces pulses at a pulse repetition frequency of approximately 100 kHz to 5 MHz or a pulse period (e.g., a time between consecutive pulses) of approximately 200 ns to 10 μs. The light source 110 may have a substantially constant or a variable pulse repetition frequency, depending on the implementation. As an example, the light source 110 may be a pulsed laser that produces pulses at a substantially constant pulse repetition frequency of approximately 640 kHz (e.g., 640,000 pulses per second), corresponding to a pulse period of approximately 1.56 μs. As another example, the light source 110 may have a pulse repetition frequency that can be varied from approximately 500 kHz to 3 MHz. As used herein, a pulse of light may be referred to as an optical pulse, a light pulse, or a pulse, and a pulse repetition frequency may be referred to as a pulse rate. | US9989629B1 | Cross-talk mitigation using wavelength switching | Luminar Technologies, Inc. | Joseph G. LaChapelle | 2017-03-30 | 2017-10-06 | 2018-06-05 | 2018-06-05 | https://patents.google.com/patent/US9989629B1/en | null | 14,762 |
|
US10267918B2-49 | As a more specific example, if the lidar system 100 measures the time of flight to be T=300 ns, then the lidar system 100 can determine the distance from the target 130 to the lidar system 100 to be approximately D=45.0 m. As another example, the lidar system 100 measures the time of flight to be T=1.33 μs and accordingly determines that the distance from the target 130 to the lidar system 100 is approximately D=199.5 m. The distance D from lidar system 100 to the target 130 may be referred to as a distance, depth, or range of the target 130. As used herein, the speed of light c refers to the speed of light in any suitable medium, such as for example in air, water, or vacuum. The speed of light in vacuum is approximately 2.9979×108 m/s, and the speed of light in air (which has a refractive index of approximately 1.0003) is approximately 2.9970×108 m/s. | US10267918B2 | Lidar detector having a plurality of time to digital converters integrated onto a detector chip | Luminar Technologies, Inc. | Joseph G. LaChapelle, Jason M. Eichenholz, Stephen D. Gaalema, Austin K. Russell | 2017-03-28 | 2018-06-25 | 2019-04-23 | 2019-04-23 | https://patents.google.com/patent/US10267918B2/en | null | 1,952 |
|
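The time-of-flight examples in the row above (T = 300 ns → D ≈ 45.0 m; T = 1.33 μs → D ≈ 199.5 m) follow from the round-trip relation D = c·T/2. A minimal sketch of that calculation, using the speed of light in air quoted in the row (the function name is illustrative, not from the patent):

```python
C_AIR = 2.9970e8  # speed of light in air (m/s), as given in the row

def distance_from_tof(t_seconds: float, c: float = C_AIR) -> float:
    """Range to the target: light covers the distance twice (out and
    back), so D = c * T / 2 for a measured round-trip time T."""
    return c * t_seconds / 2.0

print(round(distance_from_tof(300e-9), 1))   # 45.0  (T = 300 ns)
# For T = 1.33 us this gives ~199.3 m; the row's ~199.5 m figure
# corresponds to rounding c up to 3.0e8 m/s.
print(round(distance_from_tof(1.33e-6), 1))  # 199.3
```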
US10503172B2-304 | As used herein, the terms “based on” and “based at least in part on” may be used to describe or present one or more factors that affect a determination, and these terms may not exclude additional factors that may affect a determination. A determination may be based solely on those factors which are presented or may be based at least in part on those factors. The phrase “determine A based on B” indicates that B is a factor that affects the determination of A. In some instances, other factors may also contribute to the determination of A. In other instances, A may be determined based solely on B. | US10503172B2 | Controlling an autonomous vehicle based on independent driving decisions | Luminar Technologies, Inc. | Benjamin Englard, Joseph Augenbraun | 2017-10-18 | 2018-10-02 | 2019-12-10 | 2019-12-10 | https://patents.google.com/patent/US10503172B2/en | null | 4,829 |
|
US10969488B2-138 | In particular embodiments, one or more implementations of the subject matter described herein may be implemented as one or more computer programs (e.g., one or more modules of computer-program instructions encoded or stored on a computer-readable non-transitory storage medium). As an example, the steps of a method or algorithm disclosed herein may be implemented in a processor-executable software module which may reside on a computer-readable non-transitory storage medium. In particular embodiments, a computer-readable non-transitory storage medium may include any suitable storage medium that may be used to store or transfer computer software and that may be accessed by a computer system. Herein, a computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other integrated circuits (ICs) (such, as for example, field-programmable gate arrays (FPGAs) or application-specific ICs (ASICs)), hard disk drives (HDDs), hybrid hard drives (HHDs), optical discs (e.g., compact discs (CDs), CD-ROM, digital versatile discs (DVDs), blue-ray discs, or laser discs), optical disc drives (ODDs), magneto-optical discs, magneto-optical drives, floppy diskettes, floppy disk drives (FDDs), magnetic tapes, flash memories, solid-state drives (SSDs), RAM, RAM-drives, ROM, SECURE DIGITAL cards or drives, any other suitable computer-readable non-transitory storage media, or any suitable combination of two or more of these, where appropriate. A computer-readable non-transitory storage medium may be volatile, non-volatile, or a combination of volatile and non-volatile, where appropriate. | US10969488B2 | Dynamically scanning a field of regard using a limited number of output beams | Luminar Holdco, Llc | Scott R. Campbell | 2017-03-29 | 2018-03-29 | 2021-04-06 | 2021-04-06 | https://patents.google.com/patent/US10969488B2/en | null | 7,696 |
|
US10345447B1-27 | The motion vectors may be used to group points together as relating to the same object. As an example, a vehicle moving down the road in front of the lidar will be represented by many points. These points will all move toward and away from the lidar together as the corresponding vehicle will move toward and away from the lidar. | US10345447B1 | Dynamic vision sensor to direct lidar scanning | Luminar Technologies, Inc. | Richmond Hicks | 2018-06-27 | 2018-06-27 | 2019-07-09 | 2019-07-09 | https://patents.google.com/patent/US10345447B1/en | null | 2,937 |
|
US11415676B2-188 | The term “or” as used herein is to be interpreted as an inclusive or meaning any one or any combination, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, the expression “A or B” means “A, B, or both A and B.” As another example, herein, “A, B or C” means at least one of the following: A; B; C; A and B; A and C; B and C; A, B and C. An exception to this definition will occur if a combination of elements, devices, steps, or operations is in some way inherently mutually exclusive. | US11415676B2 | Interlaced scan patterns for lidar system | Luminar, Llc | Eric C. Danziger | 2017-10-09 | 2018-10-09 | 2022-08-16 | 2022-08-16 | https://patents.google.com/patent/US11415676B2/en | null | 10,769 |
|
US11415675B2-100 | In particular embodiments, a dynamically adjusted pulse period τ for an adaptive-resolution scan may be capped so as not to exceed a maximum pulse period τmax. The maximum pulse period may correspond to the maximum range Dmax of the lidar system 100. As an example, the maximum pulse period may be expressed as τmax=2Dmax/c+β. If the maximum range Dmax of the lidar system 100 is 200 m and the buffer time β is 0 ps, then the maximum pulse period τmax is approximately 1.33 μs. If the maximum range Dmax of the lidar system 100 is 200 m and the buffer time β is 200 ns, then τmax is approximately 1.53 μs. If the maximum range Dmax of the lidar system 100 is 150 m and the buffer time β is 10 ns, then the maximum pulse period is approximately 1.01 μs. After a pulse is emitted, the lidar system 100 may wait for a time period of up to τmax to elapse, and if no return pulse is received within that time period, then the lidar system 100 may send out the next pulse. In the example of FIG. 13, after pulse 300-U is emitted, the lidar system 100 waits for the maximum pulse period τmax to elapse, and since no return pulse is detected within that time period, the lidar system 100 then emits the next pulse 300-V. | US11415675B2 | Lidar system with adjustable pulse period | Luminar, Llc | Austin K. Russell, Matthew D. Weed, Liam J. McGregor, Jason M. Eichenholz | 2017-10-09 | 2018-10-09 | 2022-08-16 | 2022-08-16 | https://patents.google.com/patent/US11415675B2/en | null | 10,558 |
|
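The maximum-pulse-period relation in the row above, τmax = 2Dmax/c + β, can be checked numerically. The sketch below reproduces the row's three worked examples; the function and constant names are illustrative and not taken from any actual lidar codebase.

```python
# Sketch: maximum pulse period tau_max = 2 * D_max / c + beta for a lidar
# with maximum range D_max and buffer time beta (names are illustrative).
C = 2.9979e8  # speed of light in vacuum, m/s

def max_pulse_period(d_max_m: float, beta_s: float = 0.0) -> float:
    """Return the maximum pulse period in seconds."""
    return 2.0 * d_max_m / C + beta_s

# The three worked examples from the row above, in microseconds:
print(round(max_pulse_period(200.0) * 1e6, 2))           # ~1.33 us
print(round(max_pulse_period(200.0, 200e-9) * 1e6, 2))   # ~1.53 us
print(round(max_pulse_period(150.0, 10e-9) * 1e6, 2))    # ~1.01 us
```

A receiver waiting this long between emissions avoids range ambiguity out to Dmax, which is why the period is capped at τmax rather than shortened indefinitely.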
US11360197B2-171 | In embodiments, a calibration may be performed using sensor data from multiple lidar sensors having different projection planes, referred to in this disclosure as multi-view projections. The multi-view projections may include lidar data sets with top-down projections, front projections, and/or side projections. The lidar data from two or more different projection planes may allow for better detection of sensor alignments/misalignments and for determining whether a sensor should be recalibrated. Single-view projection calibrations can minimize or de-emphasize certain details of spatial data that depend on certain dimensions of an object in a field of regard. For example, a small lateral shift in an object as viewed from a front projection may not be as apparent as the same shift when viewed from a top-down or side projection. By performing multiple projections, an object can be viewed from multiple viewpoints, which allows potentially different information to be collected and analyzed for calibration. Implementing multi-view projections allows for detection of data offsets that could otherwise be missed or de-emphasized by a single view. Multi-view projection calibration may be useful for lidar-to-lidar sensors in which 3-D data sets (x, y, z) are available. | US11360197B2 | Calibration of sensor systems | Luminar, Llc | Amey Sutavani, Lekha Walajapet Mohan, Benjamin Englard | 2020-01-07 | 2020-05-07 | 2022-06-14 | 2022-06-14 | https://patents.google.com/patent/US11360197B2/en | null | 9,679 |
|
US11521009B2-39 | In some implementations, a sensor simulator may generate simulated sensor data within the virtual environment. For example, one or more virtual sensors may be placed in various positions around one or more vehicles in the virtual environment for the purpose of generating the simulated sensor data. The sensor simulator may simulate lidar (e.g., light detection and ranging) readings using ray casting or depth maps, for example, and/or images captured by a camera, etc. In addition, particular objects or surfaces in the virtual environment may be associated with reflectivity values for the purpose of simulating lidar and/or thermal camera readings. Lidar parameters such as scan patterns, etc., can be optimized, and/or models that control lidar parameters may be trained, using the data collected by simulating lidar readings in the virtual environment. The reflectivity data or other simulated data may be accessed efficiently and quickly using direct memory access (DMA) techniques. | US11521009B2 | Automatically generating training data for a lidar using simulated vehicles in virtual space | Luminar, Llc | Miguel Alexander Peake, Benjamin Englard | 2018-09-04 | 2019-09-04 | 2022-12-06 | 2022-12-06 | https://patents.google.com/patent/US11521009B2/en | null | 12,027 |
|
US11378666B2-11 | FIG. 3 illustrates an example configuration in which the components of FIG. 1 scan a 360-degree field of regard through a window in a rotating housing; | US11378666B2 | Sizing the field of view of a detector to improve operation of a lidar system | Luminar, Llc | Scott R. Campbell, Lane A. Martin, Matthew D. Weed, Jason M. Eichenholz | 2017-03-29 | 2020-04-29 | 2022-07-05 | 2022-07-05 | https://patents.google.com/patent/US11378666B2/en | null | 10,063 |
|
US10088559B1-51 | As a more specific example, if the lidar system 100 measures the time of flight to be T=300 ns, then the lidar system 100 can determine the distance from the target 130 to the lidar system 100 to be approximately D=45.0 m. As another example, the lidar system 100 measures the time of flight to be T=1.33 μs and accordingly determines that the distance from the target 130 to the lidar system 100 is approximately D=199.5 m. The distance D from lidar system 100 to the target 130 may be referred to as a distance, depth, or range of the target 130. As used herein, the speed of light c refers to the speed of light in any suitable medium, such as for example in air, water, or vacuum. The speed of light in vacuum is approximately 2.9979×10⁸ m/s, and the speed of light in air (which has a refractive index of approximately 1.0003) is approximately 2.9970×10⁸ m/s. | US10088559B1 | Controlling pulse timing to compensate for motor dynamics | Luminar Technologies, Inc. | Matthew D. Weed, Scott R. Campbell, Lane A. Martin, Jason M. Eichenholz, Austin K. Russell, Rodger W. Cleye, Melvin L. Stauffer | 2017-03-29 | 2018-01-22 | 2018-10-02 | 2018-10-02 | https://patents.google.com/patent/US10088559B1/en | null | 484 |
|
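The time-of-flight arithmetic in the row above follows D = c·T/2 (the factor of 2 accounts for the round trip). A minimal sketch, assuming the row's figures use the common c ≈ 3.0×10⁸ m/s rounding; the function name is illustrative.

```python
C = 3.0e8  # approximate speed of light, m/s (the row's figures use this rounding)

def range_from_tof(t_seconds: float, c: float = C) -> float:
    """Distance to target from round-trip time of flight: D = c * T / 2."""
    return c * t_seconds / 2.0

print(round(range_from_tof(300e-9), 1))   # 45.0 m
print(round(range_from_tof(1.33e-6), 1))  # 199.5 m
```

Using the vacuum value 2.9979×10⁸ m/s instead gives D ≈ 44.97 m and D ≈ 199.4 m, so the medium-dependent speed matters only at the centimeter level over these ranges.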
US10732281B2-42 | According to some implementations, the lidar system 100 can include an eye-safe laser, or the lidar system 100 can be classified as an eye-safe laser system or laser product. An eye-safe laser, laser system, or laser product may refer to a system with an emission wavelength, average power, peak power, peak intensity, pulse energy, beam size, beam divergence, exposure time, or scanned output beam such that emitted light from the system presents little or no possibility of causing damage to a person's eyes. For example, the light source 110 or lidar system 100 may be classified as a Class 1 laser product (as specified by the 60825-1 standard of the International Electrotechnical Commission (IEC)) or a Class I laser product (as specified by Title 21, Section 1040.10 of the United States Code of Federal Regulations (CFR)) that is safe under all conditions of normal use. In some implementations, the lidar system 100 may be classified as an eye-safe laser product (e.g., with a Class 1 or Class I classification) configured to operate at any suitable wavelength between approximately 1400 nm and approximately 2100 nm. In some implementations, the light source 110 may include a laser with an operating wavelength between approximately 1400 nm and approximately 1600 nm, and the lidar system 100 may be operated in an eye-safe manner. In some implementations, the light source 110 or the lidar system 100 may be an eye-safe laser product that includes a scanned laser with an operating wavelength between approximately 1530 nm and approximately 1560 nm. In some implementations, the lidar system 100 may be a Class 1 or Class I laser product that includes a fiber laser or solid-state laser with an operating wavelength between approximately 1400 nm and approximately 1600 nm. | US10732281B2 | Lidar detector system having range walk compensation | Luminar Technologies, Inc. | Joseph G. LaChapelle | 2017-03-28 | 2017-10-06 | 2020-08-04 | 2020-08-04 | https://patents.google.com/patent/US10732281B2/en | null | 7,109 |
|
US10121813B2-91 | In the example of FIG. 3, a rotating scan module 200 revolves around a central axis in one or both directions as indicated. An electric motor may drive the rotating scan module 200 around the central axis at a constant speed, for example. The rotating scan module 200 includes a scanner, a receiver, an overlap mirror, etc. The components of the rotating module 200 may be similar to the scanner 120, the receiver 140, and the overlap mirror 115. In some implementations, the subsystem 200 also includes a light source and a controller. In other implementations, the light source and/or the controller are disposed apart from the rotating scan module 200 and/or exchange optical and electrical signals with the components of the rotating scan module 200 via corresponding links. | US10121813B2 | Optical detector having a bandpass filter in a lidar system | Luminar Technologies, Inc. | Jason M. Eichenholz, Scott R. Campbell, Joseph G. LaChapelle | 2017-03-28 | 2018-03-01 | 2018-11-06 | 2018-11-06 | https://patents.google.com/patent/US10121813B2/en | null | 898 |
|
US10241198B2-130 | In another example implementation, a triggering event for calibration is based on an environmental condition around the lidar system 100. An environmental condition may include a weather condition (e.g., rain, fog, or snow) or an atmospheric condition (e.g., the presence in the air of smoke, dust, dirt, or a swarm of insects). For example, the lidar system may receive a notification to perform a calibration if a particular environmental condition occurs (e.g., it begins to rain) or if there is a change in an environmental condition. | US10241198B2 | Lidar receiver calibration | Luminar Technologies, Inc. | Joseph G. LaChapelle, Rodger W. Cleye, Scott R. Campbell, Jason M. Eichenholz | 2017-03-30 | 2017-11-30 | 2019-03-26 | 2019-03-26 | https://patents.google.com/patent/US10241198B2/en | null | 1,732 |
|
US11802946B2-46 | As a more specific example, if the lidar system 100 measures the time of flight to be T=300 ns, then the lidar system 100 can determine the distance from the target 130 to the lidar system 100 to be approximately D=45.0 m. As another example, the lidar system 100 measures the time of flight to be T=1.33 μs and accordingly determines that the distance from the target 130 to the lidar system 100 is approximately D=199.5 m. The distance D from lidar system 100 to the target 130 may be referred to as a distance, depth, or range of the target 130. As used herein, the speed of light c refers to the speed of light in any suitable medium, such as for example in air, water, or vacuum. The speed of light in vacuum is approximately 2.9979×10⁸ m/s, and the speed of light in air (which has a refractive index of approximately 1.0003) is approximately 2.9970×10⁸ m/s. | US11802946B2 | Method for dynamically controlling laser power | Luminar Technologies, Inc. | Austin K. Russell, Jason M. Eichenholz, Laurance S. Lingvay | 2017-03-28 | 2022-05-26 | 2023-10-31 | 2023-10-31 | https://patents.google.com/patent/US11802946B2/en | null | 13,419 |
|
US10324170B1-116 | FIG. 6A schematically illustrates another implementation of a lidar system in which several techniques discussed above are implemented, and some of the components of this system are illustrated in perspective views in FIGS. 7-18. To avoid clutter, a control sub-system of the lidar system of FIG. 6A is illustrated separately in FIG. 6B. Generally speaking, the lidar system 10E generates two output beams for each of the two eyes, similar to the lidar system 10C. Further, similar to the lidar system 10D, the lidar system 10E includes stationary mirrors to fold input and output beams and thereby reduce the overall size of the lidar system. Still further, the lidar system 10E includes a single seed laser that supplies pulses to multiple collimators. | US10324170B1 | Multi-beam lidar system with polygon mirror | Luminar Technologies, Inc. | John P. Engberg, Jr., Christopher A. Engberg, John G. Hughes, Sean P. Hughes | 2018-04-05 | 2018-05-08 | 2019-06-18 | 2019-06-18 | https://patents.google.com/patent/US10324170B1/en | null | 2,613 |
|
US11536803B2-97 | In particular embodiments, one or more output electrical signals produced by one or more receivers 140 may be used to compare optical characteristics of two or more optical pulses detected by the receivers. For example, a receiver 140 may include two detectors 340 configured to detect two separate optical pulses (e.g., each detector may detect a different portion of a received optical pulse). An optical characteristic of the two optical pulses may be compared based on one or more output electrical signals associated with the two pulses and produced by the receiver 140. For example, a controller 150 may determine the peak voltages of two voltage signals 360 associated with the two optical pulses. The voltage signal 360 with the higher peak voltage may correspond to the optical pulse having a higher peak optical power or peak optical intensity. Rather than determining values for the optical power or intensity of two optical pulses (e.g., by using a formula or lookup table), a controller 150 may compare the peak voltage values of one or more output electrical signals to determine which pulse has the higher peak optical power or intensity. As another example, a controller 150 may compare the areas under two voltage-signal curves to compare the energy of the two corresponding optical pulses. The voltage-signal curve with the larger area may correspond to the optical pulse having a larger pulse energy. Rather than determining values for the pulse energy of two optical pulses, a controller 150 may compare the area of two voltage-signal curves to determine which pulse has the higher pulse energy. | US11536803B2 | Lidar receiver with multiple detectors for range-ambiguity mitigation | Luminar, Llc | Stephen D. Gaalema, Mark A. Drummer, Stephen L. Mielke, Jason M. Eichenholz | 2018-12-05 | 2019-08-29 | 2022-12-27 | 2022-12-27 | https://patents.google.com/patent/US11536803B2/en | null | 12,285 |
|
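The comparison strategy in the row above — ranking two pulses by peak voltage, or by area under the voltage-signal curve as a proxy for pulse energy, without converting to physical optical units — can be sketched as follows. The sampled waveforms and names below are made up for illustration and do not come from any actual receiver.

```python
# Sketch: compare two sampled voltage signals by peak (proxy for peak optical
# power) and by area under the curve (proxy for pulse energy).
def peak(signal):
    return max(signal)

def area(signal, dt):
    # Trapezoidal approximation of the area under the voltage curve.
    return sum((a + b) * dt / 2.0 for a, b in zip(signal, signal[1:]))

# Hypothetical sampled waveforms (volts) at a 1 ns sample spacing:
pulse_a = [0.0, 0.2, 0.9, 0.4, 0.1, 0.0]
pulse_b = [0.0, 0.5, 0.7, 0.6, 0.5, 0.0]

dt = 1e-9
higher_peak = "A" if peak(pulse_a) > peak(pulse_b) else "B"
larger_energy = "A" if area(pulse_a, dt) > area(pulse_b, dt) else "B"
print(higher_peak, larger_energy)  # pulse A has the higher peak, B the larger area
```

As in the row above, the two rankings need not agree: a narrow spike can have the higher peak while a broader pulse carries more total energy.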
US10663595B2-91 | Generating Pixels within a Field of Regard | US10663595B2 | Synchronized multiple sensor head system for a vehicle | Luminar Technologies, Inc. | George C. Curatu | 2017-03-29 | 2017-11-27 | 2020-05-26 | 2020-05-26 | https://patents.google.com/patent/US10663595B2/en | null | 6,698 |
|
US10491885B1-8 | FIG. 3 is an isometric diagram of a scene model with occluded areas. | US10491885B1 | Post-processing by lidar system guided by camera information | Luminar Technologies, Inc. | Richmond Hicks | 2018-06-13 | 2018-06-13 | 2019-11-26 | 2019-11-26 | https://patents.google.com/patent/US10491885B1/en | null | 4,417 |