This report analyzes the provided patent documents to assess the novelty of a proposed bicycle system that uses lasers or high-intensity LEDs together with sensors to project dynamic visual information onto the road surface. The proposed invention comprises three primary functions: Virtual Lane Projection, Predictive Turn Signals, and a Braking Deceleration Light.
The analysis reveals that individual components of the proposed invention are present in the prior art: lasers/LEDs used for projection, the relevant sensors (accelerometers, gyroscopes, radar/lidar), and systems for detecting vehicle state or external objects. However, no single document, and no combination of the documents, explicitly describes or claims a system that integrates these components for the specific purpose of projecting dynamic visual information onto the road surface around a bicycle to enhance visibility and communication with other road users in the manners described (virtual lane, predictive turn signals, braking patterns).
Existing patents cover laser projection for guidance or denying visual access, sensor systems for vehicle control and object detection (including bicycles), and lighting systems for vehicles. However, the unique combination of these elements for dynamic road projection from a bicycle, driven by bicycle-specific sensors and intended for communication with other road users through projected patterns, appears to be novel based on the provided text.
The task is to conduct a prior art search to assess the novelty of a bicycle-mounted or frame-integrated system. This system utilizes lasers or high-intensity LEDs in conjunction with sensors (accelerometers, gyroscopes, radar/lidar) to project dynamic visual information onto the road surface around the bicycle. The system is defined by three specific functionalities:
* A - Virtual Lane Projection: projecting parallel lines to define the cyclist's space.
* B - Predictive Turn Signals: projecting turning arrows based on lean or a button press.
* C - Braking Deceleration Light: projecting a distinct pattern behind the cyclist upon sharp deceleration.
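To make the relationship between the sensors and the three functionalities concrete, the following minimal sketch shows one way the sensor readings could select a projection mode. It is purely illustrative: the task definition does not specify an implementation, and all names, thresholds, and the priority ordering here are assumptions.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Projection(Enum):
    VIRTUAL_LANE = auto()       # A: parallel lines marking the cyclist's space
    TURN_ARROW_LEFT = auto()    # B: predictive turn signal, left
    TURN_ARROW_RIGHT = auto()   # B: predictive turn signal, right
    BRAKE_PATTERN = auto()      # C: braking/deceleration warning behind the bicycle


@dataclass
class SensorFrame:
    longitudinal_accel_mps2: float   # accelerometer, forward axis (negative = decelerating)
    lean_angle_deg: float            # gyroscope-derived roll angle (negative = leaning left)
    turn_button: Optional[str]       # "left", "right", or None


# Assumed thresholds; the source documents do not specify any values.
BRAKE_DECEL_THRESHOLD_MPS2 = -3.0   # deceleration sharper than this triggers mode C
LEAN_THRESHOLD_DEG = 12.0           # lean beyond this is treated as an intended turn


def select_projection(frame: SensorFrame) -> Projection:
    """Choose which pattern the laser/LED projector should draw for this sensor frame."""
    # C: the Braking Deceleration Light takes priority over the other modes.
    if frame.longitudinal_accel_mps2 <= BRAKE_DECEL_THRESHOLD_MPS2:
        return Projection.BRAKE_PATTERN
    # B: Predictive Turn Signals, triggered by a button press or a sustained lean.
    if frame.turn_button == "left" or frame.lean_angle_deg <= -LEAN_THRESHOLD_DEG:
        return Projection.TURN_ARROW_LEFT
    if frame.turn_button == "right" or frame.lean_angle_deg >= LEAN_THRESHOLD_DEG:
        return Projection.TURN_ARROW_RIGHT
    # A: Virtual Lane Projection is the default riding mode.
    return Projection.VIRTUAL_LANE
```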
The objective of this report is to analyze the provided unstructured text, which consists of excerpts from various patent documents, to determine the extent to which it supports or refutes the novelty of the described bicycle dynamic road projection system. The analysis will focus on identifying existing technologies related to light projection, sensor systems, bicycle components, and vehicle safety systems, and assessing whether they disclose or suggest the specific combination and application claimed by the invention.
This section analyzes the provided patent documents, categorizing them by relevant technological areas and assessing their relationship to the proposed bicycle invention.
Several documents discuss the use of lasers and LEDs for various purposes, including projection.
The patent US5427328A [13] describes a laser beam rider guidance system that uses a rotating laser light pattern to help a weapon-borne receiver determine its position relative to the beam axis. The system projects a modulated beam with changing light patterns into space. While this involves laser projection and dynamic patterns, its application is for projectile guidance, not for projecting visual information onto a road surface for communication with other road users. The patterns are designed for detection by a specific receiver on the projectile, not for general visibility.
The patent US20140176954A1 [1] describes a system for preventing visual access to an area by projecting a structured light pattern using a laser or LED. This system can adjust the pattern's properties (color, intensity, shape, movement) based on sensors it detects (human eyes, cameras). The projected pattern can be dynamic due to movement of the light source or optic element. While this document discusses projecting dynamic patterns with lasers/LEDs and using sensors to adjust the projection, its purpose is to disrupt vision, not to communicate information to other road users by projecting onto the road surface.
The patent US8033686B2 [11] describes various wireless lighting devices and systems, primarily using LEDs. These systems can include sensors (motion, light) and wireless control for adjusting light intensity, color, and on/off states. Applications include pathway lighting and emergency lighting. However, this document focuses on illuminating areas or providing status/warning through the lights themselves, not through projected patterns onto the ground.
The patent US10007994B2 [15] describes a stereo depth camera system using a VCSEL projector to emit infrared beams and project a pattern onto a scene for depth determination. A moveable lens can dynamically alter the projected pattern by changing focus. While this involves projecting a dynamic pattern with a laser source (VCSEL), the purpose is depth sensing, and the projection is typically in the infrared spectrum, not visible light intended for other road users.
Summary on Projection Technology: The prior art demonstrates the use of lasers and LEDs for projection and creating dynamic patterns. However, none of the provided documents describe the specific application of projecting visible, dynamic patterns onto the road surface from a bicycle for the purpose of communicating with other road users as outlined in the task definition.
Multiple documents discuss the use of various sensors, including those mentioned in the task definition.
Accelerometers and gyroscopes are frequently mentioned in the context of determining vehicle state and control.
* US6724165B2 [7] describes a regenerative braking system for an electric scooter that uses wheel speed sensors and an accelerometer to control braking.
* US6405132B1 [14] describes an accident avoidance system that uses an inertial navigation system (INS) with gyroscopes and accelerometers for accurate vehicle positioning, especially when GPS signals are unavailable.
* US8660734B2 [18] mentions using accelerometers and gyroscopes as part of an autonomous vehicle system to detect and predict object behaviors.
* US10529052B2 [10] mentions using accelerometers and gyroscopes as metadata from sensors to inform image processing for simulating virtual camera perspectives.
* US7762368B2 [21] describes a tilting suspension system for a motorcycle trike that uses sensors to detect orientation, speed, and/or acceleration to automatically tilt the frame.
* US8830048B2 [16] describes controlling a personal transporter based on user position, using sensors like accelerometers and gyroscopes for dynamic stabilization and detecting lateral acceleration and roll angle.
* US6023221A [12] describes an automotive safety system that uses an accelerometer to measure longitudinal acceleration and automatically activate hazard warning lights during hard braking.
These documents show that accelerometers and gyroscopes are commonly used in vehicles to detect movement, orientation, and acceleration/deceleration for various control and safety functions.
Radar and lidar are discussed for detecting objects and determining their distance and characteristics.
* US6405132B1 [14] describes an accident avoidance system that uses radar, laser radar (lidar), and optical imaging to detect non-equipped vehicles, pedestrians, animals, and other hazards. It specifically mentions a scanning infrared laser radar with range gating to identify and locate objects.
* US8660734B2 [18] mentions using radar and laser range finders (lidar) as part of an autonomous vehicle system for object detection.
* US10393872B2 [9] describes a bicycle radar sensor system that uses a radar unit to determine the location and velocity of targets behind the bicycle. It also includes a camera.
These documents confirm the use of radar and lidar in vehicle systems for detecting objects in the environment.
Several patents highlight the integration of multiple sensors in vehicle systems for enhanced awareness and control.
* US6405132B1 [14] describes a system combining GPS/DGPS, INS (gyroscopes, accelerometers), radar, laser radar, and cameras for accurate vehicle location and hazard detection.
* US20230057509A1 [3] describes a vision-based machine learning model for autonomous driving that primarily uses image sensors but can utilize kinematic information (velocity, acceleration, yaw rate) from other sensors.
* US11586854B2 [8] describes a system for identifying objects in a vehicle's environment using data acquisition devices (cameras, radar, LIDAR) and measurement devices (accelerometer, gyroscope).
* US10393872B2 [9] integrates radar and a camera on a bicycle for situational awareness.
Summary on Sensor Technology: The prior art extensively covers the use of accelerometers, gyroscopes, radar, and lidar in various vehicle systems, including bicycles, for detecting vehicle state, movement, and external objects. The integration of multiple sensor types is also well-established.
Some documents specifically address bicycle-related technologies.
US10393872B2 [9] describes a bicycle radar sensor system with an integrated camera. This system detects targets behind the bicycle using radar and can use the camera to provide images/video, determine target size, and correlate targets to road lanes. It provides situational awareness indicators to the cyclist via a display, haptic feedback, or audible alerts. While this system uses sensors (radar, camera, potentially accelerometers for taillight control) on a bicycle to detect the environment and inform the rider, it does not involve projecting visual information onto the road surface.
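For context on how this kind of radar-derived situational-awareness indicator can behave, here is an illustrative sketch; it is not the algorithm claimed in US10393872B2, and the time-to-reach thresholds and alert levels are assumptions.

```python
from dataclasses import dataclass


@dataclass
class RearTarget:
    range_m: float            # distance of the target behind the bicycle
    closing_speed_mps: float  # positive when the target is approaching the bicycle


def alert_level(target: RearTarget) -> str:
    """Map an approaching rear target to a coarse alert level for the rider."""
    if target.closing_speed_mps <= 0:
        return "none"  # target is holding distance or falling back
    time_to_reach_s = target.range_m / target.closing_speed_mps
    if time_to_reach_s < 3.0:
        return "high"
    if time_to_reach_s < 8.0:
        return "medium"
    return "low"
```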
US8402568B2 [4] describes a protective apparel system for a bicyclist's head, containing a folded airbag, inflator, and a trigger device with micro sensors that detect abnormal movement (fall or collision). This document is relevant in that it describes a safety system for bicycles using sensors to detect specific events, but it does not involve light projection.
US9216791B2 [23] describes a bicycle rear suspension system. This document is not directly relevant to the proposed invention's core functionality of dynamic road projection.
US20150096335A1 [2] describes a system and method for bike locking. This document is not relevant to the proposed invention.
Summary on Bicycle-Specific Systems: The provided prior art includes bicycle-specific systems that utilize sensors for safety and situational awareness (radar/camera, airbag). However, none of these systems incorporate the concept of projecting dynamic visual information onto the road surface.
Several documents discuss vehicle lighting and signaling, but not in the context of dynamic road projection from a bicycle.
US4972302A [5] describes a vehicle lamp system (tail lamps, turn signal lamps, stop lamps) focusing on lens and reflector arrangements for compact design and efficient light transmission. The light sources mentioned are bulbs, and the system does not include sensors or dynamic projection onto the road surface based on vehicle movement or actions.
US6864787B1 [6] describes a supplemental brake light system for the front of a vehicle, using lights (primarily bulbs) to make braking visible from the front and sides. It is wired to the existing brake light circuit and indicates braking action but does not project dynamic visual information onto the road surface.
US6023221A [12] describes an automotive safety system that automatically activates hazard warning lights during hard braking or sudden stops using an accelerometer. This system provides a warning via the vehicle's existing lights, not by projecting patterns onto the road surface.
Summary on Vehicle Lighting and Signaling Systems: The prior art shows various vehicle lighting and signaling systems, including those triggered by deceleration. However, these systems utilize conventional lights and do not involve dynamic projection onto the road surface.
Several documents relate to autonomous driving and systems that detect and react to the vehicle's environment.
US20230057509A1 [3] describes a vision-based machine learning model for autonomous driving that processes images from multiple cameras to detect objects and associated signals. It projects features into a vector space based on a "virtual camera" perspective. While this involves processing visual data and determining information about the environment, it does not involve projecting visual information onto the road surface.
US8660734B2 [18] describes a system for predicting behaviors of detected objects for autonomous vehicles using various sensors (cameras, radar, laser range finders, accelerometers, gyroscopes). It detects and classifies objects (including bicycles and pedestrians) and predicts their movements to control the vehicle's path and speed. This system focuses on the vehicle's perception and reaction to its environment, not on projecting information onto the road surface.
US11586854B2 [8] describes a system for accurately identifying objects in a vehicle's environment for autonomous driving using machine learning models and contextual variables from sensors. Similar to US8660734B2 [18], this focuses on the vehicle's internal processing of environmental data.
US7720580B2 [20] describes an object detection system for a vehicle using an imaging array sensor (camera) and control system to detect objects, including bicycles. It can adjust processing based on steering angle and speed. This system is about detecting objects in the environment, not projecting information onto the road.
US6405132B1 [14] describes an accident avoidance system that uses accurate location determination and sensing (radar, laser radar, cameras) to detect potential collision hazards and generate warnings or control signals. While it involves detecting objects and reacting to them, it does not describe projecting visual information onto the road surface as a means of communication or warning.
Summary on Autonomous Driving and Object Detection Systems: The prior art in this area focuses on vehicles perceiving their environment and reacting accordingly. While these systems utilize many of the sensors mentioned in the task definition and can detect objects like bicycles, they do not incorporate the concept of projecting dynamic visual information onto the road surface from the vehicle itself.
Some documents describe systems that use sensors to influence visual output, though not in the manner of the proposed invention.
US10529052B2 [10] describes a system for simulating a virtual lens in video and photo editing, using metadata from sensors (accelerometers, gyroscopes, GPS) to inform the processing of image data. This involves manipulating existing image data to create a modified visual output, not projecting new visual information onto the real-world environment.
US20160006922A1 [8] describes a vehicle camera system that interfaces with a mobile communication device. It compiles information from multiple components, including location, vehicle status (speed, acceleration, braking), and potentially signal detection data (radar/laser detector), into video recordings or displays. The camera display can show predicted alerts or other information overlaid on the video view. While this system uses sensors and displays information visually, the display is internal to the vehicle or on a mobile device, not projected onto the road surface.
Summary on Systems Utilizing Sensors for Visual Output Modification: These systems demonstrate the use of sensors to influence visual output, but the output is either a manipulation of existing images or a display within the vehicle, not a projection onto the external environment.
Based on the analysis of the provided prior art, the novelty of the proposed bicycle dynamic road projection system lies in the specific combination of technologies and their application for projecting dynamic visual information onto the road surface from a bicycle to enhance safety and communication.
The concept of projecting a "virtual lane" to mark the cyclist's space is not explicitly described in the provided documents. While US20140176954A1 [1] discusses projecting structured light patterns, its purpose is visual disruption, not defining a safe zone for a cyclist. The combination of sensors (accelerometers, gyroscopes, potentially radar/lidar) to detect bicycle speed and potentially adjust the projected lane width based on this data is also not found in the prior art.
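As an illustration of the speed-dependent lane behavior discussed above, a minimal sketch follows; the lane widths, reference speed, and linear scaling are assumptions and are not drawn from any cited patent.

```python
def lane_width_m(speed_mps: float,
                 min_width_m: float = 1.0,
                 max_width_m: float = 1.8,
                 full_width_speed_mps: float = 8.0) -> float:
    """Widen the projected virtual lane as the measured bicycle speed increases.

    The lane is min_width_m wide at a standstill, grows linearly with speed,
    and is clamped at max_width_m once the bicycle reaches full_width_speed_mps.
    All default values are assumptions for illustration only.
    """
    fraction = min(max(speed_mps, 0.0) / full_width_speed_mps, 1.0)
    return min_width_m + fraction * (max_width_m - min_width_m)


def lane_edge_offsets_m(speed_mps: float) -> tuple[float, float]:
    """Lateral offsets (left, right) of the two projected parallel lines, in metres."""
    half_width = lane_width_m(speed_mps) / 2.0
    return (-half_width, half_width)
```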
Projecting large, clear turning arrows onto the road surface ahead of the turn, based on handlebar lean (detected by sensors) or a button press, is not described in the provided documents. Existing turn signal systems (e.g., US4972302A [5]) use conventional lights. While US8830048B2 [16] discusses using handlebar lean as an input for controlling a personal transporter, it does not relate to projecting turn signals onto the road. The use of sensors to detect lean or button presses and trigger a projected turning pattern is a novel combination based on the provided text.
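A sketch of the sensing side of this feature is given below; fusing gyroscope and accelerometer data with a complementary filter is a common way to estimate a roll (lean) angle, but the filter constant, axis conventions, and threshold are assumptions rather than anything specified in the source documents.

```python
import math
from typing import Optional


class LeanEstimator:
    """Estimate the roll (lean) angle by fusing gyroscope and accelerometer data.

    A simple complementary filter: integrate the gyroscope roll rate for
    responsiveness and correct its slow drift with the roll angle implied by
    the accelerometer's gravity vector. All constants are assumptions.
    """

    def __init__(self, alpha: float = 0.98):
        self.alpha = alpha      # weight given to the integrated gyroscope angle
        self.roll_deg = 0.0

    def update(self, roll_rate_dps: float, accel_y_mps2: float,
               accel_z_mps2: float, dt_s: float) -> float:
        gyro_roll_deg = self.roll_deg + roll_rate_dps * dt_s
        accel_roll_deg = math.degrees(math.atan2(accel_y_mps2, accel_z_mps2))
        self.roll_deg = self.alpha * gyro_roll_deg + (1.0 - self.alpha) * accel_roll_deg
        return self.roll_deg


def turn_signal(roll_deg: float, button: Optional[str],
                lean_threshold_deg: float = 12.0) -> Optional[str]:
    """Return "left", "right", or None; an explicit button press overrides the lean estimate."""
    if button in ("left", "right"):
        return button
    if roll_deg <= -lean_threshold_deg:
        return "left"
    if roll_deg >= lean_threshold_deg:
        return "right"
    return None
```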
Projecting a distinct pattern (e.g., a widening red band or flashing chevrons) onto the road behind the cyclist upon detecting sharp deceleration (braking) is not described in the provided documents. While US6023221A [12] describes using an accelerometer to detect hard braking and activate hazard lights, and US6864787B1 [6] describes supplemental brake lights, these use conventional lighting. The concept of using sensors to detect deceleration and trigger a dynamic projected pattern onto the road surface behind the bicycle for enhanced braking indication appears novel based on the provided text.
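The deceleration-triggered pattern could be parameterised as in the sketch below; the widening red band is one of the example patterns named in the task definition, but the trigger threshold and band dimensions here are assumptions.

```python
from typing import Optional


def brake_band_width_m(longitudinal_accel_mps2: float,
                       trigger_mps2: float = -3.0,
                       max_decel_mps2: float = -8.0,
                       min_width_m: float = 0.5,
                       max_width_m: float = 1.5) -> Optional[float]:
    """Width of the band projected behind the bicycle, or None when not braking sharply.

    The band appears once deceleration passes trigger_mps2 and widens linearly
    up to max_width_m as deceleration approaches max_decel_mps2 (roughly an
    emergency stop). All default values are assumptions for illustration.
    """
    if longitudinal_accel_mps2 > trigger_mps2:
        return None  # deceleration not sharp enough; keep the normal projection
    span_mps2 = trigger_mps2 - max_decel_mps2
    fraction = min((trigger_mps2 - longitudinal_accel_mps2) / span_mps2, 1.0)
    return min_width_m + fraction * (max_width_m - min_width_m)
```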
The provided prior art demonstrates the existence of technologies relevant to the proposed bicycle dynamic road projection system, including:
* laser and LED sources used to project dynamic light patterns;
* accelerometers, gyroscopes, radar, and lidar used in vehicles (including bicycles) to detect vehicle state, movement, and external objects;
* bicycle-specific sensor-based safety and situational-awareness systems;
* vehicle lighting and signaling systems, including lights triggered automatically by hard braking;
* systems that use sensor data to drive or modify visual output.
However, none of the analyzed documents, individually or in combination, describe or suggest a system that integrates these technologies specifically for the purpose of projecting dynamic visual information onto the road surface around a bicycle to achieve the functionalities of Virtual Lane Projection, Predictive Turn Signals, and Braking Deceleration Light as defined in the task.
The novelty of the proposed invention appears to reside in this specific combination of components and their application for enhancing bicycle safety and communication with other road users through dynamic road projection. The provided text does not contain prior art that directly refutes the novelty of this specific system and its described functionalities.