WO2017105696A1 - Automatic event recorder - Google Patents

Automatic event recorder

Info

Publication number
WO2017105696A1
WO2017105696A1 (PCT/US2016/061547)
Authority
WO
WIPO (PCT)
Prior art keywords
event
memory buffer
input stream
computer
data
Prior art date
Application number
PCT/US2016/061547
Other languages
French (fr)
Inventor
Benjamin Conrad OIEN
Paul F. Sorenson
Kahyun Kim
Emily N. IVERS
Original Assignee
Intel Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corporation filed Critical Intel Corporation
Publication of WO2017105696A1 publication Critical patent/WO2017105696A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N5/772 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/06 Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F3/0601 Interfaces specially adapted for storage systems
    • G06F3/0602 Interfaces specially adapted for storage systems specifically adapted to achieve a particular effect
    • G06F3/062 Securing storage systems
    • G06F3/0623 Securing storage systems in relation to content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/06 Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F3/0601 Interfaces specially adapted for storage systems
    • G06F3/0628 Interfaces specially adapted for storage systems making use of a particular technique
    • G06F3/0655 Vertical data movement, i.e. input-output transfer; data movement between one or more hosts and one or more storage devices
    • G06F3/0659 Command handling arrangements, e.g. command buffers, queues, command scheduling
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/06 Digital input from, or digital output to, record carriers, e.g. RAID, emulated record carriers or networked record carriers
    • G06F3/0601 Interfaces specially adapted for storage systems
    • G06F3/0668 Interfaces specially adapted for storage systems adopting a particular infrastructure
    • G06F3/0671 In-line storage system
    • G06F3/0673 Single storage device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details

Definitions

  • Bicyclists may have little recourse if they have a collision with a vehicle. According to the Centers for Disease Control, over 900 cyclists were killed and 494,000 emergency room visits occurred due to bike accidents in 2013. Many infractions or near misses happen each day that may not be reported or recorded. Determining fault may be important when an actual infraction of the law is committed. Safety improvements may be beneficial to bicyclists as a bicyclist may be more vulnerable to injury in a collision than a motor vehicle driver.
  • FIG. 1 illustrates an environment including an event recorder, according to an embodiment.
  • FIG. 2 illustrates an example of a system for an automatic event recorder, according to an embodiment.
  • FIG. 3 illustrates an example of a method for an automatic event recorder, according to an embodiment.
  • FIG. 4 is a block diagram illustrating an example of a machine upon which one or more embodiments may be implemented.
  • Systems and techniques for an automatic event recorder are disclosed herein which may provide reliable information about the circumstances leading up to and following an event (e.g., crash, interesting visuals, happenstances, etc.). With reliable information, it may be possible to enact new safety measures or establish fault.
  • The present subject matter provides a technique utilizing sensor technology (e.g., for bike-commuting, etc.), a wearable device (e.g., head mounted display (HMD) glasses), and a storage device (e.g., solid-state drive (SSD) quick-access storage).
  • The present subject matter may allow recording of information (e.g., location, video, audio) surrounding an event.
  • the present subject matter may heighten awareness and attention of a user and may facilitate avoidance of incidents between bikes and vehicles.
  • the disclosed techniques may give cyclists and drivers recourse in situations where they are often at a disadvantage in reporting accidents by providing data to inform understanding and decision making.
  • the wearable device may include sensors such as, for example, audio microphones and video cameras to record the scene from the rider's perspective.
  • the present subject matter may provide rider-point-of-view perspective and orientation of the camera and microphone sensors that may enable a recording event that may be near-real-time and may be very relevant to an event that has just occurred.
  • an intelligent Head Mounted Display (HMD) device paired to a companion smartphone keeps a circular recording buffer (audio and video) for a user defined period while the cyclist is wearing the HMD. As long as no trigger events occur, buffered data older than the user defined period is discarded.
  • a series of triggers and/or events may be determined from input received such as, by way of example and not limitation, a loud crash noise, a rider's exclamation (e.g., voice, yelling loudly, swearing, etc.), and very sudden, violent stopping or quick change in orientation/motion.
  • the sensor data combined with the LPAL trigger mechanism may automatically cause real-time recording of location, time, audio, and video of the event.
  • upon detection of a trigger event (e.g., a start event, a stop event, etc.), the device may save the buffered video before the point of the trigger event and may continue to save ambient audio and video continuously until, for example, a pre-defined storage limit is reached, the user terminates the recording, or the device stops functioning.
  • the buffer may constantly replace old data with new data.
  • the buffer may be a user-defined-duration circular buffer.
  • the circular buffer may be transparent to the user until an event occurs.
  • the buffered data may be written to a second persistent storage location.
  • the buffer may be persistent in case of hardware failure.
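  • For illustration, the user-defined-duration circular buffer behavior described above might be sketched in Python as follows; the RingBuffer class, sample rate, and 30-second duration are hypothetical choices rather than details taken from the disclosure.

```python
import collections
import time

class RingBuffer:
    """Minimal user-defined-duration circular buffer: keeps only the most
    recent `duration_s` seconds of samples and silently drops older data."""

    def __init__(self, duration_s, sample_rate_hz):
        self.capacity = int(duration_s * sample_rate_hz)
        # Appending to a full deque discards the oldest entry, mirroring the
        # "replace old data with new data" behavior described above.
        self.samples = collections.deque(maxlen=self.capacity)

    def write(self, sample):
        self.samples.append((time.time(), sample))

    def snapshot(self):
        # Copy out the buffered window, e.g. when a trigger event fires.
        return list(self.samples)

# Example: 30 seconds of 10 Hz sensor samples -> at most 300 entries retained.
buf = RingBuffer(duration_s=30, sample_rate_hz=10)
for i in range(1000):
    buf.write({"frame": i})
assert len(buf.snapshot()) == 300
```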
  • the user may be notified via a subtle audio cue, a display cue, or a combination of the two. Data may continue to be persistently recorded until a predetermined stop event has occurred. Some stop events, such as, for example, a custom voice event, may be configured to automatically delete the recorded data.
  • the video clip may be saved along with other data such as, for example, GPS/location, event date and time, trigger event type, and accelerometer data.
  • the record may be stored locally and may be backed up to a companion device (e.g., mobile phone, tablet, etc.) and/or cloud storage (e.g., public cloud storage system, private cloud storage system, etc.) upon the next connection.
  • the record may be deleted in the absence of a trigger event or may be saved in the presence of a trigger event for later inspection or deletion if the event turns out to be a false-positive (e.g., an accident did not occur, or a near-accident event has occurred, etc.).
  • FIG. 1 illustrates an environment 100 including an event recorder 108, according to an embodiment.
  • the environment 100 may include an automobile 110 and a user 102 (e.g., bicyclist, etc.).
  • the user 102 may be wearing a wearable device 104 (e.g., smartglasses, smart watch, etc.) and may have a mobile device (e.g., smartphone, tablet, etc.) 106.
  • the wearable device 104 and mobile device 106 may be communicatively coupled to an event recorder 108.
  • the event recorder 108 may be integrated into the wearable device 104 and/or the mobile device 106.
  • the wearable device 104 may include an array of sensors (e.g., camera, microphone, accelerometer, gyroscope, magnetometer, GPS receiver, etc.) that may observe the environment surrounding the user 102.
  • the user 102 may be riding a bicycle and may come to an intersection where an automobile 110 may enter the intersection in the path of the user.
  • the user 102 may slow rapidly and may utter a verbal exclamation.
  • An accelerometer in the array of sensors may detect the rapid deceleration of the user 102 and a microphone of the array of sensors may detect the verbal exclamation.
  • the wearable device 104 may include a memory device and a storage device capable of storing data received from the array of sensors.
  • the memory device may be a memory buffer.
  • the memory device may be a user-defined-duration circular memory buffer.
  • the storage device may be a solid-state drive.
  • the data collected from the array of sensors may be received by the event recorder 108.
  • the event recorder 108 may write the data to the memory device of the wearable device 104.
  • the data collected by the accelerometer indicating the rapid deceleration of the user 102 and the audio of the verbal exclamation of the user may be written to the memory device.
  • the event recorder 108 may use the data to detect the occurrence of an event.
  • the event recorder 108 may detect a variety of events that may be classified by event type.
  • Example event types include, but are not limited to, a start event, a stop event, a save event, and a delete event.
  • a start event is an event that causes the event recorder 108 to save data collected for a period of time before the occurrence of the start event and data collected for a period of time after the occurrence of the start event.
  • the verbal exclamation of the user 102, the rapid deceleration of the user 102, or both may be used by the event recorder 108 to determine that an impending collision event has occurred which may be classified as a start event.
  • a stop event is an event that causes the event recorder 108 to cease the data save operations initiated subsequent to and/or simultaneously with the detection of a start event.
  • a voice command uttered by the user 102 such as "stop recording" may be classified as a stop command.
  • event types may be further classified by sub-types.
  • the data saved may be discarded.
  • a voice command uttered by the user 102 such as "delete recording" may be classified as a stop and delete event type.
  • the stop event may cease data save operations and initiate a subsequent process, such as uploading the audio/video recording to a remote destination for data preservation.
  • a voice command uttered by the user 102 such as "save recording" may be classified as a stop and save command and the data saved may be transferred to a cloud based storage system.
  • the event types may be used by the event recorder 108 to perform actions on the data such as, for example, setting storage flags for data in the memory device and transferring data between the memory device and the storage device.
  • a start event may flag data currently residing in a memory buffer and data written to the memory buffer subsequent to the start event as write protected.
  • a stop event may clear the write protected flag for data written to the memory buffer subsequent to the stop event.
  • a stop and delete event may clear the write protect flag for data written subsequent to the stop and delete event and may delete data from the memory buffer prior to the stop and delete event. In an example, a stop and save event may clear the write protect flag for data written subsequent to the stop and save event and may transfer the memory buffer to the storage device.
  • data written subsequent to the start event may continue to be marked as write protected until an automatic stop event is detected such as, for example, the memory buffer has reached a physical limit or the memory buffer has reached a storage limit set by the user 102.
  • data may continue to be written to a storage device
  • the data may be written to the storage device until an automatic stop event is detected such as the storage device has reached its physical limit or a limit set by the user 102.
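  • The event-type handling described above (start, stop, stop and delete, stop and save) might be organized roughly as in the following sketch; the EventType and Recorder names and the list-based buffer and storage are illustrative assumptions, not details from the disclosure.

```python
from enum import Enum, auto

class EventType(Enum):
    START = auto()
    STOP = auto()
    STOP_AND_DELETE = auto()
    STOP_AND_SAVE = auto()

class Recorder:
    def __init__(self, buffer, storage):
        self.buffer = buffer      # short-term memory buffer (e.g., a list of frames)
        self.storage = storage    # long-term storage (a list standing in for an SSD)
        self.protected = False    # True between a start event and a stop event

    def on_event(self, event):
        if event is EventType.START:
            # Treat the current buffer contents and subsequent writes as protected.
            self.protected = True
        elif event is EventType.STOP:
            # Stop protecting new writes; previously protected data stays buffered.
            self.protected = False
        elif event is EventType.STOP_AND_DELETE:
            self.protected = False
            self.buffer.clear()                  # discard the recording entirely
        elif event is EventType.STOP_AND_SAVE:
            self.protected = False
            self.storage.extend(self.buffer)     # move the clip to persistent storage
            self.buffer.clear()

rec = Recorder(buffer=["frame-1", "frame-2"], storage=[])
rec.on_event(EventType.START)
rec.on_event(EventType.STOP_AND_SAVE)
print(rec.storage)  # ['frame-1', 'frame-2'] now held in long-term storage
```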
  • the event recorder is described in greater detail in FIG. 2.
  • FIG. 2 illustrates an example of a system 200 for an automatic event recorder, according to an embodiment.
  • the system 200 includes a sensor array (e.g., a group of sensors integrated into a head mounted display, etc.) 202 that is communicatively coupled to an event recorder 204.
  • the array of sensors 202 may include various sensors including, by way of example and not limitation, a microphone, a camera, a global positioning system (GPS) receiver, an accelerometer, a magnetometer, a gyroscope, etc.
  • the array of sensors 202 may be used to collect information about a user and the environment surrounding the user. For example, the array of sensors 202 may observe audio, video, GPS location, and movement data.
  • the event recorder 204 may include an input receiver 206, an event detector 208, and a storage manager 210.
  • the input receiver 206 may receive inputs from the sensor array 202.
  • the input receiver 206 may receive the audio from a microphone of the sensor array 202, video from a camera of the sensor array 202, acceleration data from an accelerometer of the sensor array 202, and GPS data from a GPS receiver of the array of sensors 202.
  • the data collected by the input receiver 206 from the array of sensors 202 may provide information from the user's perspective. For example, video and audio may be collected from the vantage point of the cyclist, accelerometer data collected may show impact strength as felt from the user's head, gyroscope data may show unusual positions such as a roll experienced by the user, magnetometer data collected may show the direction the user is looking, and GPS location data collected may show a point of impact, point of rest, and path of travel of the user leading up to an event.
  • The event detector 208 may use the data received by the input receiver 206 to make determinations about events that are occurring in the user's environment.
  • the event detector 208 may use received audio, video, and acceleration data to make a determination that a collision between the bicycle and another entity (e.g., vehicle, person, obstruction, etc.) is imminent.
  • the determination may be based on received inputs indicating that the bicyclist may be yelling, the other entity may be becoming larger in the video, and the bicyclist may be slowing rapidly.
  • an event may be determined using audio input received from the sensor array 202.
  • the event detector 208 may use various techniques such as, by way of example and not limitation, computational linguistics (e.g. , speech recognition, etc.) and audio event detection.
  • the audio received from a microphone of the sensor array 202 may be processed using an algorithm to determine if units of the audio segment match models of audio segments from training data associated with an event. For example, a user may yell "Look out!" which may be matched to an impending collision event.
  • the audio collected from a microphone of the sensor array 202 may include the sound of screeching tires and the sound may be matched to a sound signature corresponding with the impending collision event.
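  • A greatly simplified sketch of this kind of audio trigger detection follows; it assumes the audio has already been reduced to normalized samples and a rough transcript, and the trigger phrases and loudness threshold are hypothetical placeholders rather than values from the disclosure.

```python
import math

# Hypothetical trigger vocabulary and loudness threshold; the disclosure leaves
# the actual models and thresholds to the implementation.
TRIGGER_PHRASES = {"look out", "watch it"}
RMS_THRESHOLD = 0.6  # normalized loudness above which we assume a crash or horn

def detect_audio_start_event(samples, transcript):
    """Return True if the audio window looks like a start (trigger) event."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    loud_event = rms > RMS_THRESHOLD
    keyword_event = any(p in transcript.lower() for p in TRIGGER_PHRASES)
    return loud_event or keyword_event

# A loud screech or a yelled "Look out!" both flag the window as a start event.
print(detect_audio_start_event([0.8, -0.9, 0.7], ""))             # True: loud
print(detect_audio_start_event([0.1, -0.05, 0.02], "look out!"))  # True: keyword
```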
  • an event may be determined using video input received from the sensor array 202.
  • the event detector 208 may use a variety of image recognition techniques to determine events occurring in the received video input.
  • a set of training video segments may be used to create a model for an event.
  • the received video input may then be evaluated against a set of event models to classify the received video segment with an event class. For example, a video segment received from a camera of the sensor array 202 may show a car continually getting larger throughout the clip and the video segment may be assigned an impending collision classification based on the video segment's similarity to an impending collision model.
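  • A minimal sketch of the video classification idea is shown below; it assumes an upstream object detector has already produced per-frame bounding-box areas for tracked objects, and the growth threshold is a hypothetical placeholder.

```python
def growth_ratio(areas):
    """Ratio of last to first bounding-box area for one tracked object."""
    return areas[-1] / areas[0]

def classify_video_segment(tracked_areas, growth_threshold=3.0):
    """Label a segment 'impending_collision' if any tracked object's bounding
    box keeps growing and ends up several times its initial size."""
    for areas in tracked_areas:
        monotonic = all(b >= a for a, b in zip(areas, areas[1:]))
        if monotonic and growth_ratio(areas) >= growth_threshold:
            return "impending_collision"
    return "no_event"

# One tracked car whose bounding box grows from 40x30 px to 200x150 px.
print(classify_video_segment([[1200, 2500, 9000, 30000]]))  # impending_collision
```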
  • an event may be determined using accelerometer data received from the sensor array.
  • the event detector 208 may use a variety of accelerometry techniques to determine an event has occurred.
  • a set of training data including various acceleration and deceleration sequences may be used to generate a set of event models.
  • the received accelerometer data may then be evaluated against the models to classify segments of the accelerometer data. For example, the received accelerometer data may indicate the user was moving at a steady pace for a period of time and then rapidly decelerated. The segment of accelerometer data may be evaluated against the set of models and may be assigned a classification of impending collision.
  • an algorithm may be used to determine a speed variance for the segment and if the speed variance is outside a threshold it may be determined that an event has occurred.
  • the received accelerometer data may indicate the user was moving at a steady pace for a period of time and then rapidly decelerated, and the speed variance may indicate that a collision is imminent.
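  • The speed-variance check might look roughly like the following sketch; the variance and drop thresholds are hypothetical values, and real accelerometer processing would be considerably more involved.

```python
def speed_variance(speeds):
    """Population variance of a list of speed samples (m/s)."""
    mean = sum(speeds) / len(speeds)
    return sum((s - mean) ** 2 for s in speeds) / len(speeds)

def detect_rapid_deceleration(speeds, variance_threshold=4.0, drop_threshold=0.5):
    """Flag an event if speed varies sharply and ends well below where it started."""
    big_variance = speed_variance(speeds) > variance_threshold
    big_drop = speeds[-1] < speeds[0] * drop_threshold
    return big_variance and big_drop

# Steady ~6 m/s riding followed by near-instant slowing triggers the detector.
print(detect_rapid_deceleration([6.1, 6.0, 6.2, 5.9, 1.2, 0.3]))  # True
```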
  • received data inputs may be combined and evaluated against a set of models to determine if an event has occurred.
  • the collected audio data may indicate that the user yelled "Look out!" and the accelerometer data may indicate there was a rapid deceleration occurring during the same time period. Using the indicia together may provide a more accurate indication of an event.
  • the event detector 208 may detect different classes of events.
  • a start class of events may be used to identify data that should be saved. Examples of start events may include, by way of example and not limitation, audible events (e.g., a volume threshold may be used to detect an impact or a car horn, etc.), custom voice events (e.g., yelling, predetermined keywords: "Watch it!" and "That was cool", etc.), common expletives (e.g., detection of an impending event based on a dictionary of common expletives uttered in emergency situations, etc.), and data triggered events (e.g., sensor data indicates an impact or an unusual occurrence such as being upside down, etc.).
  • a start event may be detected that sets a write protect flag for a set of data corresponding with the start event.
  • a collision event may be detected as a start event.
  • the user may say "Look at that" which may be detected as a start event.
  • the set of data may include data for a first period of time before occurrence of the start event and data for a second period of time after the occurrence of the start event.
  • a collision start event may be determined, 30 seconds of data before the event may be flagged for write protection, and five minutes of data after the collision start event may be marked for write protection.
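  • As a rough sketch, flagging a protection window around a start event (using the 30-second and five-minute figures from the example above) might look like the following; the timestamped-record layout is an assumption for illustration only.

```python
PRE_EVENT_S = 30        # keep this much history before the start event
POST_EVENT_S = 5 * 60   # keep recording this long after it

def flag_write_protected(records, event_time):
    """Mark timestamped records inside the protection window as read-only.

    `records` is a list of dicts like {"t": <seconds>, "data": ..., "protected": bool}.
    """
    for rec in records:
        if event_time - PRE_EVENT_S <= rec["t"] <= event_time + POST_EVENT_S:
            rec["protected"] = True
    return records

recs = [{"t": t, "data": None, "protected": False} for t in range(0, 400, 10)]
flag_write_protected(recs, event_time=60)
print(sum(r["protected"] for r in recs))  # records from t=30 through t=360 are protected
```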
  • a stop class event may be used to end the recording and/or write protection of data collected surrounding the start event. Stop events may include automatic and/or manual indications that the event has ended and/or that the data preservation period has ended.
  • the event detector 208 may detect a stop event that turns off the write protect flag for data subsequent to the occurrence of the stop event. In an example, the event detector 208 may detect sub-classes of stop events. In an example, a stop and delete event may be detected that turns off the write protect flag and deletes data collected before the occurrence of the stop and delete event. In an example, a stop and save event may be detected that turns off the write protect flag and transfers the data collected before the occurrence of the stop and save event to long-term storage.
  • stop events may include, by way of example and not limitation, physical storage limit reached, user defined storage limit reached (e.g., 1 gigabyte, etc.), user defined recording period reached (e.g., 5 minutes), user defined (e.g., through a training session) commands (e.g., "forget what just happened," "it's over," etc.), and preprogrammed commands (e.g., voice commands, gestures, etc.).
  • the stop event may be detected from the data collected from the sensor array 202.
  • the sensor array 202 may include a microphone for collecting audio data and the stop event may be generated from the audio data.
  • the user may utter "stop recording" which may be classified as a stop event.
  • the sensor array 202 may include a camera for collecting video data and the stop event may be generated from the video data.
  • the user may make a gesture with an arm that is in view of the camera.
  • the sensor array 202 may include an accelerometer for collecting movement data and the stop event may be generated from the movement data.
  • the user may make a series of motions with his head.
  • the sensor array 202 may include a GPS receiver for collecting location data and the stop event may be generated using the location data. For example, the user may move to a safe location.
  • a combination of data collected from various sensors of the sensor array 202 may be used to generate a stop event.
  • the stop event may be detected from the proximity of emergency personnel to the user.
  • a police officer may be recognized in the collected video data, which may be classified as a stop event.
  • the event recorder 204 may include a storage manager 210 to manage storage operations.
  • the storage manager 210 may determine the storage location of data collected from the input receiver 206.
  • the storage manager 210 may be communicatively coupled with a memory buffer 212 in a wearable device such as, for example, the wearable device 104 as described in FIG. 1.
  • the memory buffer 212 may be a short-term memory storage location that is able to store streaming input data in real-time.
  • the input receiver 206 may receive a data stream from the array of sensors 202 including audio, video, movement, and location data and the storage manager 210 may continuously write the data stream to the memory buffer 212.
  • the storage manager 210 may continuously write the input data stream to the memory buffer while the event recorder 204 is in operation.
  • a video clip may be saved along with location, event date, event time, event type, and accelerometer data.
  • the constant read/write cycles may be taxing on storage devices (e.g., memory buffer 212). Therefore, the storage manager 210 may automatically self-validate static memory integrity on an ongoing basis.
  • the storage manager 210 may implement data integrity validation.
  • a notification may be transmitted to the user indicating that the static memory has failed the data integrity validation.
  • the notification may be a graphical user interface.
  • the notification may be a text message.
  • an indication that the memory buffer 212 has failed an integrity check may be detected as a stop event.
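  • A simple checksum comparison can stand in for whatever integrity mechanism the device actually uses; the following sketch, with its hypothetical ValidatedBuffer class, illustrates the idea of ongoing self-validation of the buffer.

```python
import hashlib

def checksum(block: bytes) -> str:
    return hashlib.sha256(block).hexdigest()

class ValidatedBuffer:
    """Stores blocks alongside checksums so integrity can be re-verified later."""

    def __init__(self):
        self.blocks = []   # list of (data, checksum) pairs

    def write(self, data: bytes):
        self.blocks.append((data, checksum(data)))

    def validate(self) -> bool:
        # Returns False if any stored block no longer matches its checksum,
        # which could be surfaced to the user and treated as a stop event.
        return all(checksum(data) == digest for data, digest in self.blocks)

buf = ValidatedBuffer()
buf.write(b"frame-0001")
print(buf.validate())  # True while the stored data is intact
```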
  • the storage manager 210 may use the memory buffer 212 as a circular buffer and may delete and/or overwrite data in response to reaching a maximum data size threshold.
  • the storage manager 210 may delete and/or overwrite data in response to reaching a time threshold. These techniques may prevent the need to store large amounts of data that are not relevant to an event.
  • the storage manager 210 may write protect (e.g., mark as read-only, remove blocks from the buffer, etc.) data stored in the memory buffer 212. For example, upon indication of a start event, the storage manager 210 may write protect data stored in the memory buffer 212 occurring before the start event (e.g., 30 seconds, etc.).
  • the storage manager 210 may write protect data stored in the memory buffer 212 written between the occurrence of the start event and occurrence of the stop event.
  • the storage manager 210 may configure the memory buffer 212 to store all data collected after the start event as write protected.
  • the storage manager 210 may reconfigure the memory buffer 212 to store data collected after the occurrence of the stop event as overwriteable (e.g., remove the write protection, etc.).
  • the storage manager 210 may be communicatively coupled with storage 214.
  • storage 214 may be a solid-state drive (SSD) included in a mobile device (e.g., smartphone, tablet, etc.).
  • the storage 214 may provide long-term storage for the event recorder 204.
  • the storage 214 may be a cloud based storage system.
  • the storage manager 210 may move data from the memory buffer 212 to the storage 214. In an example, the data may be moved in response to the event detector 208 receiving a stop event.
  • a start event may have been detected based on the user saying "look at that" and a stop and save event may be detected based on the user saying "save video" and the data collected between the occurrence of the start event and the stop and save event may be moved by the storage manager 210 from the memory buffer 212 to the storage 214.
  • the storage manager 210 may transfer the data stored in the memory buffer 212 during a period of time before the start event (e.g., 30 seconds, etc.) and the data stored in the memory buffer 212 during a time period between the occurrence of the start event and the occurrence of the stop event to the storage 214.
  • the storage manager 210 may delete the contents of the memory buffer 212 in response to an indication of a stop and delete event.
  • the user may say "delete recording" which may be detected as a stop and delete event upon which the storage manager 210 may delete and/or clear the contents of the memory buffer 212.
  • the storage manager 210 may delete and/or transfer the contents of the memory buffer 212 in response to an indication of a gesture.
  • For example, the user may make a gesture with an arm (e.g., a swipe of the arm from lower left to upper right, etc.) or head (e.g., a series of nods, etc.) that may be detected as a stop event upon which the storage manager may delete the contents of the memory buffer 212.
  • the storage 214 may be included in a mobile device and the storage manager 210 may transfer the contents of the memory buffer 212 to the storage 214 upon establishing a connection with the mobile device.
  • a start event may have been detected based on a gesture made by the user and a stop event may have been generated by a voice command issued by the user while the user was away from a mobile device.
  • the storage manager 210 may automatically transfer the contents of the memory buffer 212 to the storage 214 when the connection to the mobile device is established.
  • the connection between the mobile device and the wearable device may be via a wireless connection.
  • the connection between the wearable device and the mobile device may be a wired connection (e.g., USB, etc.).
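  • The deferred transfer described above might be sketched as follows; the DeferredUploader class and the upload stand-in are hypothetical, and a real implementation would use the device's actual wireless or USB transfer mechanism.

```python
class DeferredUploader:
    """Queues saved clips while offline and flushes them when a companion
    device or cloud connection is reported as available."""

    def __init__(self):
        self.pending = []        # clips waiting for a connection
        self.connected = False

    def save_clip(self, clip):
        self.pending.append(clip)
        self._flush()

    def on_connection_changed(self, connected: bool):
        self.connected = connected
        self._flush()

    def _flush(self):
        if not self.connected:
            return
        while self.pending:
            clip = self.pending.pop(0)
            upload(clip)  # stand-in for the actual wireless/USB/cloud transfer

def upload(clip):
    print(f"uploaded {clip!r}")

uploader = DeferredUploader()
uploader.save_clip("clip-001")          # queued: no connection yet
uploader.on_connection_changed(True)    # connection established -> transferred
```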
  • the event recorder 204 may send data collected before occurrence of the stop event and data collected after the occurrence of the start event over a network (e.g., cellular network, WiFi, SMS, etc.).
  • the data may be transmitted via an email message.
  • the data may be transmitted as a text message (e.g., SMS message, etc.). This may allow the data to be sent directly to a third party (e.g., emergency personnel, family member, friend, etc.). Information such as the user's location and an indication of an event type may assist the third party in providing assistance to the user.
  • While the examples discuss a wearable device (e.g., smartglasses, smartwatch, etc.) and interactions with automobiles, the techniques may also apply to other equipment such as sporting equipment (bicycles, scooters, etc.), motorcycles, etc.
  • While the present subject matter has been described in the context of a bicyclist, it will be understood that the present subject matter could be used in a variety of contexts in which a person wishes to record data surrounding an event. Some examples include, by way of example and not limitation, a motorcyclist, a motorist, and an action sports participant (e.g., skier, snowboarder, etc.).
  • FIG. 3 illustrates an example of a method 300 for an automatic event recorder, according to an embodiment.
  • method 300 may write protect a first portion of an input stream written to a memory buffer as read-only data upon receiving notification of a start event.
  • the input stream may be received from a sensor array of a wearable device.
  • the first portion of the input stream may correspond to a period of time before the start event.
  • the input stream may include audio captured from a microphone of the wearable device.
  • the input stream may include video captured from a camera of the wearable device.
  • the input stream may include data collected from an accelerometer of the wearable device.
  • the input stream may include data collected from a global positioning system receiver of the wearable device.
  • method 300 may write a second portion of the input stream to the memory buffer as read-only data subsequent to the start event.
  • the second portion of the input stream may be written between the start event and a stop event.
  • the stop event may be generated using the audio captured from the microphone of the wearable device.
  • the stop event may be generated using the video captured from the camera of the wearable device.
  • the stop event may be generated using the data collected from the accelerometer of the wearable device.
  • the stop event may be generated using the data collected from the global positioning system receiver of the wearable device.
  • the stop event may be received from the wearable device.
  • audio information captured by the sensor array may be accessed, a voice command made by a user may be detected, and the stop event may be generated based on the voice command.
  • video information captured by the sensor array may be accessed, a gesture made by a user may be detected, and the stop event may be generated based on the gesture.
  • a user-defined data storage limit for the memory buffer may be accessed, it may be determined that the memory buffer has reached the user-defined storage limit, and the stop event may be generated based on the determination.
  • a third portion of the input stream may be written to the memory buffer as over-writeable data upon receiving notification of the stop event.
  • the third portion of the input stream may be written subsequent to the occurrence of the stop event.
  • the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer may be transferred to a storage device upon receiving notification of the stop event.
  • the storage device may be included in a mobile device.
  • the storage device may be included in the wearable device.
  • the memory buffer may be cleared upon the receipt of the stop event.
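  • Putting the pieces of method 300 together, an illustrative sketch (not the claimed implementation) might look like the following; the EventRecorder class, the sample counts, and the list-based buffer and storage are assumptions for illustration.

```python
class EventRecorder:
    """Illustrative walk-through of method 300: portions of the input stream are
    marked read-only around a start/stop event pair and then moved to storage."""

    def __init__(self, pre_event_samples=300):
        self.buffer = []                 # [(sample, read_only)]
        self.storage = []                # stand-in for the SSD / mobile device
        self.recording = False
        self.pre_event_samples = pre_event_samples

    def write(self, sample):
        # Second portion: data arriving between start and stop is read-only;
        # third portion (after stop) is written as over-writeable again.
        self.buffer.append([sample, self.recording])

    def on_start_event(self):
        # First portion: protect the window already in the buffer before the event.
        for entry in self.buffer[-self.pre_event_samples:]:
            entry[1] = True
        self.recording = True

    def on_stop_event(self):
        self.recording = False
        # Transfer the protected portions to the storage device and clear the buffer.
        self.storage.extend(s for s, read_only in self.buffer if read_only)
        self.buffer.clear()

rec = EventRecorder(pre_event_samples=3)
for i in range(5):
    rec.write(f"frame-{i}")
rec.on_start_event()         # frames 2-4 become the protected pre-event portion
rec.write("frame-5")         # protected post-event data
rec.on_stop_event()          # frames 2-5 land in storage, buffer is cleared
print(rec.storage)
```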
  • FIG. 4 illustrates a block diagram of an example machine 400 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform.
  • the machine 400 may operate as a standalone device or may be connected (e.g. , networked) to other machines.
  • the machine 400 may operate in the capacity of a server machine, a client machine, or both in server-client network environments.
  • the machine 400 may act as a peer machine in a peer-to-peer (P2P) (or other distributed) network environment.
  • the machine 400 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • The term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations.
  • Examples, as described herein, may include, or may operate by, logic or a number of components, or mechanisms.
  • Circuit sets are a collection of circuits implemented in tangible entities that include hardware (e.g., simple circuits, gates, logic, etc.). Circuit set membership may be flexible over time and underlying hardware variability. Circuit sets include members that may, alone or in combination, perform specified operations when operating.
  • hardware of the circuit set may be immutably designed to carry out a specific operation (e.g., hardwired).
  • the hardware of the circuit set may include variably connected physical components (e.g., execution units) including a computer readable medium physically modified (e.g., magnetically, electrically, moveable placement of invariant massed particles, etc.) to encode instructions of the specific operation.
  • the instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuit set in hardware via the variable connections to carry out portions of the specific operation when in operation.
  • the computer readable medium is communicatively coupled to the other components of the circuit set member when the device is operating.
  • any of the physical components may be used in more than one member of more than one circuit set.
  • execution units may be used in a first circuit of a first circuit set at one point in time and reused by a second circuit in the first circuit set, or by a third circuit in a second circuit set at a different time.
  • Machine 400 may include a hardware processor 402 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 404, and a static memory 406, some or all of which may communicate with each other.
  • the machine 400 may further include a display unit 410, an alphanumeric input device 412 (e.g., a keyboard), and a user interface (UI) navigation device 414 (e.g., a mouse).
  • the display unit 410, input device 412 and UI navigation device 414 may be a touch screen display.
  • the machine 400 may additionally include a storage device (e.g., drive unit) 416, a signal generation device 418 (e.g., a speaker), a network interface device 420, and one or more sensors 421, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor.
  • the machine 400 may include an output controller 428, such as a serial (e.g., Universal Serial Bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate with or control one or more peripheral devices (e.g., a printer, card reader, etc.).
  • the storage device 416 may include a machine readable medium 422 on which is stored one or more sets of data structures or instructions 424 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein.
  • the instructions 424 may also reside, completely or at least partially, within the main memory 404, within static memory 406, or within the hardware processor 402 during execution thereof by the machine 400.
  • one or any combination of the hardware processor 402, the main memory 404, the static memory 406, or the storage device 416 may constitute machine readable media.
  • While the machine readable medium 422 is illustrated as a single medium, the term "machine readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 424.
  • The term "machine readable medium" may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 400 and that cause the machine 400 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions.
  • Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media.
  • a massed machine readable medium comprises a machine readable medium with a plurality of particles having invariant (e.g., rest) mass. Accordingly, massed machine-readable media are not transitory propagating signals.
  • massed machine readable media may include: nonvolatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
  • the instructions 424 may further be transmitted or received over a communications network 426 using a transmission medium via the network interface device 420 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.).
  • Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, and peer-to-peer (P2P) networks, among others.
  • the network interface device 420 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 426.
  • the network interface device 420 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques.
  • The term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 400, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.

Additional Notes & Examples

  • Example 1 is a computing apparatus for automatic event recording, the computing apparatus comprising: a processor; and a memory storing instructions that, when executed by the processor, configure the computing apparatus to: write protect, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and write, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
  • In Example 2, the subject matter of Example 1 optionally includes, wherein the input stream includes audio captured from a microphone of the wearable device.
  • In Example 3, the subject matter of Example 2 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to generate the stop event using the audio captured from the microphone of the wearable device.
  • In Example 4, the subject matter of any one or more of Examples 1-3 optionally includes, wherein the input stream includes video captured from a camera of the wearable device.
  • In Example 5, the subject matter of Example 4 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to generate the stop event using the video captured from the camera of the wearable device.
  • In Example 6, the subject matter of any one or more of Examples 1-5 optionally includes, wherein the input stream includes data collected from an accelerometer of the wearable device.
  • In Example 7, the subject matter of Example 6 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to generate the stop event using the data collected from the accelerometer of the wearable device.
  • In Example 8, the subject matter of any one or more of Examples 1-7 optionally includes, wherein the input stream includes data collected from a global positioning system receiver of the wearable device.
  • In Example 9, the subject matter of Example 8 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to generate the stop event using the data collected from the global positioning system receiver of the wearable device.
  • In Example 10, the subject matter of any one or more of Examples 1-9 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to: transfer, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
  • In Example 11, the subject matter of Example 10 optionally includes, wherein the storage device is included in a mobile device.
  • In Example 12, the subject matter of any one or more of Examples 10-11 optionally includes, wherein the storage device is included in the wearable device.
  • In Example 13, the subject matter of any one or more of Examples 1-12 optionally includes, wherein the stop event is received from the wearable device.
  • In Example 14, the subject matter of any one or more of Examples 1-13 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to clear the memory buffer upon the receipt of the stop event.
  • In Example 15, the subject matter of any one or more of Examples 1-14 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to: access audio information captured by the sensor array; detect a voice command made by a user of the computing apparatus; and generate the stop event based on the voice command.
  • In Example 16, the subject matter of any one or more of Examples 1-15 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to: access video information captured by the sensor array; detect a gesture made by a user of the computing apparatus; and generate the stop event based on the gesture.
  • In Example 17, the subject matter of any one or more of Examples 1-16 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to: access a user-defined data storage limit for the memory buffer; determine that the memory buffer has reached the user-defined storage limit; and generate the stop event based on the determination.
  • In Example 18, the subject matter of any one or more of Examples 1-17 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to: write, upon receiving notification of the stop event, a third portion of the input stream to the memory buffer as over-writeable data.
  • Example 19 is a computer-readable storage medium for automatic event recording, the computer-readable storage medium including instructions that when executed by a computer, cause the computer to: write protect, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and write, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
  • In Example 20, the subject matter of Example 19 optionally includes, wherein the input stream includes audio captured from a microphone of the wearable device.
  • In Example 21, the subject matter of Example 20 optionally includes, further comprising instructions to cause the computer to generate the stop event using the audio captured from the microphone of the wearable device.
  • In Example 22, the subject matter of any one or more of Examples 19-21 optionally includes, wherein the input stream includes video captured from a camera of the wearable device.
  • In Example 23, the subject matter of Example 22 optionally includes, further comprising instructions to cause the computer to generate the stop event using the video captured from the camera of the wearable device.
  • In Example 24, the subject matter of any one or more of Examples 19-23 optionally includes, wherein the input stream includes data collected from an accelerometer of the wearable device.
  • In Example 25, the subject matter of Example 24 optionally includes, further comprising instructions to cause the computer to generate the stop event using the data collected from the accelerometer of the wearable device.
  • In Example 26, the subject matter of any one or more of Examples 19-25 optionally includes, wherein the input stream includes data collected from a global positioning system receiver of the wearable device.
  • In Example 27, the subject matter of Example 26 optionally includes, further comprising instructions to cause the computer to generate the stop event using the data collected from the global positioning system receiver of the wearable device.
  • In Example 28, the subject matter of any one or more of Examples 19-27 optionally includes, further comprising instructions to cause the computer to: transfer, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
  • In Example 29, the subject matter of Example 28 optionally includes, wherein the storage device is included in a mobile device.
  • In Example 30, the subject matter of any one or more of Examples 28-29 optionally includes, wherein the storage device is included in the wearable device.
  • In Example 31, the subject matter of any one or more of Examples 19-30 optionally includes, wherein the stop event is received from the wearable device.
  • In Example 32, the subject matter of any one or more of Examples 19-31 optionally includes, further comprising instructions to cause the computer to clear the memory buffer upon the receipt of the stop event.
  • In Example 33, the subject matter of any one or more of Examples 19-32 optionally includes, further comprising instructions to cause the computer to: access audio information captured by the sensor array; detect a voice command made by a user of the computer; and generate the stop event based on the voice command.
  • In Example 34, the subject matter of any one or more of Examples 19-33 optionally includes, further comprising instructions to cause the computer to: access video information captured by the sensor array; detect a gesture made by a user of the computer; and generate the stop event based on the gesture.
  • In Example 35, the subject matter of any one or more of Examples 19-34 optionally includes, further comprising instructions to cause the computer to: access a user-defined data storage limit for the memory buffer; determine that the memory buffer has reached the user-defined storage limit; and generate the stop event based on the determination.
  • In Example 36, the subject matter of any one or more of Examples 19-35 optionally includes, further comprising instructions to cause the computer to: write, upon receiving notification of the stop event, a third portion of the input stream to the memory buffer as over-writeable data.
  • Example 37 is a system for automatic event recording, the system comprising: means for write protecting, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and means for writing, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
  • Example 38 the subj ect matter of Example 37 optionally includes, wherein the input stream includes audio captured from a microphone of the wearable device.
  • Example 39 the subject matter of Example 38 optionally includes, further comprising means for generating the stop event using the audio captured from the microphone of the wearable device.
  • Example 40 the subject matter of any one or more of Examples 37-39 optionally include, wherein the input stream includes video captured from a camera of the wearable device.
  • Example 41 the subj ect matter of Example 40 optionally includes, further comprising means for generating the stop event using the video captured from the camera of the wearable device,
  • Example 42 the subject matter of any one or more of Examples 37-41 optionally include, wherein the input stream includes data collected from an accelerometer of the wearable device.
  • Example 43 the subj ect matter of Example 42 optionally includes, further comprising means for generating the stop event using the data collected from the accelerometer of the wearable device.
  • Example 44 the subject matter of any one or more of Examples 37-43 optionally include, wherein the input stream includes data collected from a global positioning system receiver of the wearable device.
  • Example 45 the subj ect matter of Example 44 optionally includes, further comprising means for generating the stop event using the data collected from the global positioning system receiver of the wearable device.
  • Example 46 the subject matter of any one or more of Examples 37-45 optionally include, further comprising: means for transferring, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
  • Example 47 the subj ect matter of Example 46 optionally includes, wherein the storage device is included in a mobile device.
  • Example 48 the subject matter of any one or more of Examples 46-47 optionally include, wherein the storage device is included in the wearable device.
  • Example 49 the subject matter of any one or more of Examples 37-48 optionally include, wherein the stop event is received from the wearable device.
  • Example 50 the subject matter of any one or more of Examples 37-49 optionally include, further comprising means for clearing the memory- buffer upon the receipt of the stop event.
  • Example 51 the subject matter of any one or more of Examples 37-50 optionally include, further comprising: means for accessing audio information captured by the sensor array; means for detecting a voice command made by a user of the system; and means for generating the stop event based on the voice command.
  • Example 52 the subject matter of any one or more of Examples 37-51 optionally include, further comprising: means for accessing video information captured by the sensor array; means for detecting a gesture made by a user of the system; and means for generating the stop event based on the gesture.
  • Example 53 the subject matter of any one or more of Examples 37-52 optionally include, further comprising: means for accessing a user-defined data storage limit for the memory buffer; means for determining that the memory buffer has reached the user-defined storage limit; and means for generating the stop event based on the determination.
  • Example 54 the subject matter of any one or more of Examples 37-53 optionally include, further comprising: means for writing, upon receiving notification of the stop event, a third portion of the input stream to the memory buffer as over-writeable data.
  • Example 55 is a method for automatic event recording, the method comprising: write protecting, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and writing, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
  • Example 56 the subject matter of Example 55 optionally includes, wherein the input stream includes audio captured from a microphone of the wearable device.
  • Example 57 the subject matter of Example 56 optionally includes, further comprising generating the stop event using the audio captured from the microphone of the wearable device.
  • Example 58 the subject matter of any one or more of Examples 55-57 optionally include, wherein the input stream includes video captured from a camera of the wearable device.
  • Example 59 the subject matter of Example 58 optionally includes, further comprising generating the stop event using the video captured from the camera of the wearable device.
  • Example 60 the subject matter of any one or more of Examples 55-59 optionally include, wherein the input stream includes data collected from an accelerometer of the wearable device.
  • Example 61 the subject matter of Example 60 optionally includes, further comprising generating the stop event using the data collected from the accelerometer of the wearable device.
  • Example 62 the subject matter of any one or more of Examples 55-61 optionally include, wherein the input stream includes data collected from a global positioning system receiver of the wearable device.
  • Example 63 the subject matter of Example 62 optionally includes, further comprising generating the stop event using the data collected from the global positioning system receiver of the wearable device.
  • Example 64 the subject matter of any one or more of Examples 55-63 optionally include, further comprising: transferring, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
  • Example 65 the subject matter of Example 64 optionally includes, wherein the storage device is included in a mobile device.
  • Example 66 the subject matter of any one or more of Examples 64-65 optionally include, wherein the storage device is included in the wearable device.
  • Example 67 the subject matter of any one or more of Examples 55-66 optionally include, wherein the stop event is received from the wearable device.
  • Example 68 the subject matter of any one or more of Examples 55-67 optionally include, further comprising clearing the memory buffer upon the receipt of the stop event.
  • Example 69 the subject matter of any one or more of Examples 55-68 optionally include, further comprising: accessing audio information captured by the sensor array; detecting a voice command made by a user; and generating the stop event based on the voice command.
  • Example 70 the subject matter of any one or more of Examples 55-69 optionally include, further comprising: accessing video information captured by the sensor array; detecting a gesture made by a user; and generating the stop event based on the gesture.
  • Example 71 the subject matter of any one or more of Examples 55-70 optionally include, further comprising: accessing a user-defined data storage limit for the memory buffer; determining that the memory buffer has reached the user-defined storage limit; and generating the stop event based on the determination.
  • Example 72 the subject matter of any one or more of Examples 55-71 optionally include, further comprising: writing, upon receiving notification of the stop event, a third portion of the input stream to the memory buffer as over-writeable data.
  • Example 73 is a system for automatic event recording, the system comprising means to perform any method of Examples 55-72.
  • Example 74 is a machine readable medium for automatic event recording, the machine readable medium including instructions that, when executed by a machine, cause the machine to perform any method of Examples 55-72.

Abstract

Systems and techniques for an automatic event recorder are described herein. An input stream may be written to a memory buffer in an overwrite mode. The input stream may be received from a sensor array of a wearable device. A first portion of the input stream written to the memory buffer may be protected upon obtaining an indication of occurrence of a start event. The memory buffer may be configured to receive a second portion of the input stream in a write mode subsequent to the occurrence of the start event. The memory buffer may be reconfigured to receive a third portion of the input stream in the overwrite mode upon obtaining an indication of the occurrence of the stop event.

Description

AUTOMATIC EVENT RECORDER
PRIORITY CLAIM
[0001] This patent application claims the benefit of priority to U.S.
Application Serial No. 14/971,574, filed December 16, 2015, which is incorporated by reference herein in its entirety.
BACKGROUND
[0002] Bicyclists may have little recourse if they have a collision with a vehicle. According to the Centers for Disease Control, over 900 cyclists were killed and 494,000 emergency room visits occurred due to bike accidents in 2013. Many infractions or near misses happen each day that may not be reported or recorded. Determining fault may be important when an actual infraction of the law is committed. Safety improvements may be beneficial to bicyclists as a bicyclist may be more vulnerable to injury in a collision than a motor vehicle driver.
BRIEF DESCRIPTION OF THE DRAWINGS
[0003] To easily identify the discussion of any particular element or act, the most significant digit or digits in a reference number refer to the figure number in which that element is first introduced.
[0004] FIG. 1 illustrates an environment including an event recorder, according to an embodiment.
[0005] FIG. 2 illustrates an example of a system for an automatic event recorder, according to an embodiment.
[0006] FIG. 3 illustrates an example of a method for an automatic event recorder, according to an embodiment.
[0007] FIG. 4 is a block diagram illustrating an example of a machine upon which one or more embodiments may be implemented.
DETAILED DESCRIPTION
[0008] Accidents between bicycles and automobiles generally pose a greater risk to the bicycle rider. Oftentimes there may be little evidence of what caused a collision other than statements by a party to the accident or a witness. However, human memory may be subject to inaccuracies and may be influenced by bias or other internal or external forces. It may be difficult to determine the circumstances leading up to or subsequent to an event such as an accident without reliable information.
[0009] Systems and techniques for an automatic event recorder are disclosed herein which may provide reliable information about the circumstances leading up to and subsequent to an event (e.g., crash, interesting visuals, happenstances, etc.). With reliable information, it may be possible to enact new safety measures or establish fault. A technique utilizing sensor technology (e.g., for bike-commuting, etc.) in conjunction with a wearable device (e.g., head mounted display (HMD) glasses) and a storage device (e.g., solid-state drive (SSD) quick-access storage) may provide data surrounding events (e.g., collisions, etc.) that may be helpful in understanding the cause of the event. For example, the present subject matter may allow recording of information (e.g., location, video, audio) surrounding events when they occur.
[0010] The present subject matter may heighten awareness and attention of a user and may facilitate avoidance of incidents between bikes and vehicles. In addition, the disclosed techniques may give cyclists and drivers recourse in situations where they are often at a disadvantage in reporting accidents by providing data to inform understanding and decision making. The wearable device may include sensors such as, for example, audio microphones and video cameras to record the scene from the rider's perspective. The present subject matter may provide the rider-point-of-view perspective and orientation of the camera and microphone sensors, which may enable near-real-time recording that may be highly relevant to an event that has just occurred.
[0011] In an example, an intelligent Head Mounted Display (HMD) device paired to a companion smartphone keeps a circular recording buffer (audio and video) for a user-defined period while the cyclist is wearing the HMD. As long as no trigger events occur, buffered data older than the user-defined period is discarded. By using an accelerometer, a microphone, and a video/camera sensor in conjunction with low power 'always listening' (LPAL) technologies, a series of triggers and/or events may be determined from input received such as, by way of example and not limitation, a loud crash noise, a rider's exclamation (e.g., voice, yelling loudly, swearing, etc.), and very sudden, violent stopping or a quick change in orientation/motion.
[0012] The sensor data combined with the LPAL trigger mechanism may automatically cause real-time recording of location, time, audio, and video of the event. In an example, after receiving a trigger event (e.g., a start event, a stop event, etc.), the device may save the buffered video before the point of the trigger event and may continue to save ambient audio and video continuously until, for example, either a pre-defined storage limit is reached, the user terminates the recording, or the device stops functioning.
[0013] The buffer may constantly replace old data with new data. In an example, the buffer may be a user-defined-duration circular buffer. The circular buffer may be transparent to the user until an event occurs. At the point of the trigger event, the buffered data may be written to a second persistent storage location. In an example, the buffer may be persistent in case of hardware failure. When the device begins persistent storage of recorded data, the user may be notified via a subtle audio cue, a display cue, or a combination of the two. Data may continue to be persistently recorded until a predetermined stop event has occurred. Some stop events such as, for example, a custom voice event may be configured to automatically delete the recorded data.
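By way of illustration only, the circular buffering and trigger-driven write protection described above might be sketched as follows. Python is used for brevity; the class name, chunk granularity, and durations are assumptions made for this example and are not part of the described embodiments:

```python
from collections import deque

class CircularEventBuffer:
    """User-defined-duration circular buffer that silently discards old
    data until a trigger event freezes (write protects) its contents."""

    def __init__(self, duration_s, chunk_s=1.0):
        self.capacity = int(duration_s / chunk_s)   # chunks kept while idle
        self.chunks = deque(maxlen=self.capacity)   # oldest data falls off
        self.protected = []                         # write-protected (read-only) data

    def write(self, chunk):
        if self.protected:
            # A start event occurred: keep everything until a stop event.
            self.protected.append(chunk)
        else:
            # No trigger yet: newest data silently replaces the oldest.
            self.chunks.append(chunk)

    def on_start_event(self):
        # Freeze the pre-event window so it cannot be overwritten.
        self.protected = list(self.chunks)

    def on_stop_event(self, persist):
        # Hand the pre- and post-event data to persistent storage, then
        # return to transparent overwrite mode.
        persist(list(self.protected))
        self.protected = []
        self.chunks.clear()
```

In this sketch the pre-event window is simply copied aside when the start event fires; an implementation could instead mark blocks in place as read-only, as discussed later in the description.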
[0014] In an example, the video clip may be saved along with other data such as, for example, GPS/location, event date and time, trigger event type, and accelerometer data. In an example, the record may be stored locally and may be backed up to a companion device (e.g., mobile phone, tablet, etc.) and/or cloud storage (e.g., public cloud storage system, private cloud storage system, etc.) upon the next connection.
[0015] In some examples, the record may be deleted in the absence of a trigger event or may be saved in the presence of a trigger event for later inspection or deletion if the event turns out to be a false-positive (e.g., an accident did not occur, or a near-accident event has occurred, etc.).
[0016] FIG. 1 illustrates an environment 100 including an event recorder 108, according to an embodiment. The environment 100 may include an automobile 110 and a user 102 (e.g., bicyclist, etc.). The user 102 may be wearing a wearable device 104 (e.g., smartglasses, smart watch, etc.) and may have a mobile device (e.g., smartphone, tablet, etc.) 106. The wearable device 104 and mobile device 106 may be communicatively coupled to an event recorder 108. In an example, the event recorder 108 may be integrated into the wearable device 104 and/or the mobile device 106.
[0017] The wearable device 104 may include an array of sensors (e.g., camera, microphone, accelerometer, gyroscope, magnetometer, GPS receiver, etc.) that may observe the environment surrounding the user 102. For example, the user 102 may be riding a bicycle and may come to an intersection where an automobile 110 may enter the intersection in the path of the user. The user 102 may slow rapidly and may utter a verbal exclamation. An accelerometer in the array of sensors may detect the rapid deceleration of the user 102 and a microphone of the array of sensors may detect the verbal exclamation.
[0018] The wearable device 104 may include a memory device and a storage device capable of storing data received from the array of sensors. In an example, the memory device may be a memory buffer. In an example, the memory device may be a user-defined-duration circular memory buffer. In an example, the storage device may be a solid-state drive.
[0019] The data collected from the array of sensors may be received by the event recorder 108. The event recorder 108 may write the data to the memory device of the wearable device 104. For example, the data collected by the accelerometer indicating the rapid deceleration of the user 102 and the audio of the verbal exclamation of the user may be written to the memory device.
[0020] The event recorder 108 may use the data to detect the occurrence of an event. The event recorder 108 may detect a variety of events that may be classified by event type. Example event types include, but are not limited to, a start event, a stop event, a save event, and a delete event.
[0021] A start event is an event that causes the event recorder 108 to save data collected for a period of time before the occurrence of the start event and data collected for a period of time after the occurrence of the start event. For example, the verbal exclamation of the user 102, the rapid deceleration of the user 102, or both may be used by the event recorder 108 to determine that an impending collision event has occurred which may be classified as a start event.
[0022] A stop event is an event that causes the event recorder 108 to cease the data save operations initiated subsequent to and/or simultaneously with the detection of a start event. For example, a voice command uttered by the user 102 such as "stop recording" may be classified as a stop command. In an example, event types may be further classified by sub-types. Depending on the type of stop event detected by the event recorder 108, the data saved may be discarded. For example, a voice command uttered by the user 102 such as "delete recording" may be classified as a stop and delete event type. As another example, the stop event may cease data save operations and initiate a subsequent process, such as uploading the audio/video recording to a remote destination for data preservation. For example, a voice command uttered by the user 102 such as "save recording" may be classified as a stop and save command and the data saved may be transferred to a cloud based storage system.
[0023] The event types may be used by the event recorder 108 to perform actions on the data such as, for example, setting storage flags for data in the memory device and transferring data between the memory device and the storage device. In an example, a start event may flag data currently residing in a memory buffer and data written to the memory buffer subsequent to the start event as write protected.
[0024] In an example, a stop event may clear the write protected flag for data written to the memory buffer subsequent to the stop event. In an example, a stop and delete event may clear the write protect flag for data written subsequent to the stop and delete event and may delete data from the memory buffer prior to the stop and delete event. In an example, a stop and save event may clear the write protect flag for data written subsequent to the stop and save event and may transfer the memory buffer to the storage device.
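The flag handling described in the two preceding paragraphs can be summarized, purely as an illustrative sketch, by a small dispatch over event types; the event names and the buffer/storage interfaces below are assumptions made for this example:

```python
def handle_event(event_type, buffer, storage):
    """Map detected event types to memory buffer actions (illustrative).

    The buffer is assumed to expose write_protect(), clear_protection(),
    clear(), and contents(); storage is assumed to expose save().
    """
    if event_type == "start":
        # Flag current contents and subsequent writes as read-only.
        buffer.write_protect()
    elif event_type == "stop":
        # Data written after this point may be overwritten again.
        buffer.clear_protection()
    elif event_type == "stop_and_delete":
        buffer.clear_protection()
        buffer.clear()                       # discard the recorded event
    elif event_type == "stop_and_save":
        buffer.clear_protection()
        storage.save(buffer.contents())      # transfer to the storage device
```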
[0025] In some examples, data written subsequent to the start event may continue to be marked as write protected until an automatic stop event is detected such as, for example, the memory buffer has reached a physical limit or the memory buffer has reached a storage limit set by the user 102. In an example, data may continue to be written to a storage device automatically when the memory buffer reaches a physical limit. In an example, the data may be written to the storage device until an automatic stop event is detected such as the storage device has reached its physical limit or a limit set by the user 102. The event recorder is described in greater detail in FIG. 2.
[0026] FIG. 2 illustrates an example of a system 200 for an automatic event recorder, according to an embodiment. The system 200 includes a sensor array (e.g., a group of sensors integrated into a head mounted display, etc.) 202 that is communicatively coupled to an event recorder 204.
[0027] The array of sensors 202 may include various sensors including, by way of example and not limitation, a microphone, a camera, a global positioning system (GPS) receiver, accelerometer, magnetometer, gyroscope, etc. The array of sensors 202 may be used to collect information about a user and the environment surrounding the user. For example, the array of sensors 202 may observe audio, video, GPS location, and movement data
corresponding with the user and the user's environment over a period of time.
[0028] The event recorder 204 may include an input receiver 206, an event detector 208, and a storage manager 210. The input receiver 206 may receive inputs from the sensor array 202. For example, the input receiver 206 may receive the audio from a microphone of the sensor array 202, video from a camera of the sensor array 202, acceleration data from an accelerometer of the sensor array 202, and GPS data from a GPS receiver of the array of sensors 202.
[0029] The data collected by the input receiver 206 from the array of sensors 202 may provide information from the user's perspective. For example, video and audio may be collected from the vantage point of the cyclist, accelerometer data collected may show impact strength as felt from the user's head, gyroscope data may show unusual positions such as a roll experienced by the user, magnetometer data collected may show the direction the user is looking, and GPS location data collected may show a point of impact, point of rest, and path of travel of the user leading up to an event.
[0030] The event detector 208 may use the data received by the input receiver 206 to make determinations about events that are occurring in the user's environment. For example, if the user is on a bicycle the event detector 208 may use received audio, video, and acceleration data to make a determination that a collision between the bicycle and another entity (e.g., vehicle, person, obstruction, etc.) is imminent. In the example, the determination may be based on received inputs indicating that the bicyclist may be yelling, the other entity may be becoming larger in the video, and the bicyclist may be slowing rapidly.
[0031] In some examples, an event may be determined using audio input received from the sensor array 202. The event detector 208 may use various techniques such as, by way of example and not limitation, computational linguistics (e.g., speech recognition, etc.) and audio event detection. In an example, the audio received from a microphone of the sensor array 202 may be processed using an algorithm to determine if units of the audio segment match models of audio segments from training data associated with an event. For example, a user may yell "Look Out!" which may be matched to an impending collision event. In another example, the audio collected from a microphone of the sensor array 202 may include the sound of screeching tires and the sound may be matched to a sound signature corresponding with the impending collision event.
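As a rough illustration of the audio-based detection described above, a trigger could combine a loudness threshold with keyword spotting over recognized speech. The threshold value, phrase list, and the availability of a transcript are assumptions made for this sketch only:

```python
import math

TRIGGER_PHRASES = {"look out", "watch it"}   # assumed example phrases
RMS_THRESHOLD = 0.6                          # assumed normalized loudness level

def audio_start_event(samples, transcript):
    """Return True if the audio segment resembles a start event."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples)) if samples else 0.0
    loud_noise = rms > RMS_THRESHOLD                                 # e.g., crash or screeching tires
    keyword = any(p in transcript.lower() for p in TRIGGER_PHRASES)  # e.g., "Look Out!"
    return loud_noise or keyword
```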
[0032] In some examples, an event may be determined using video input received from the sensor array 202. The event detector 208 may use a variety of image recognition techniques to determine events occurring in the received video input. In an example, a set of training video segments may be used to create a model for an event. The received video input may then be evaluated against a set of event models to classify the received video segment with an event class. For example, a video segment received from a camera of the sensor array 202 may show a car continually getting larger throughout the clip and the video segment may be assigned an impending collision classification based on the video segment's similarity to an impending collision model.
[0033] In an example, an event may be determined using accelerometer data received from the sensor array. The event detector 208 may use a variety of accelerometry techniques to determine an event has occurred. In an example, a set of training data including various acceleration and deceleration sequences may be used to generate a set of event models. The received accelerometer data may then be evaluated against the models to classify segments of the accelerometer data. For example, the received accelerometer data may indicate the user was moving at a steady pace for a period of time and then began rapidly decelerating. The segment of accelerometer data may be evaluated against the set of models and may be assigned a classification of impending collision. In another example, an algorithm may be used to determine a speed variance for the segment and if the speed variance is outside a threshold it may be determined that an event has occurred. For example, the received accelerometer data may indicate the user was moving at a steady pace for a period of time and then began rapidly decelerating, and the speed variance may indicate that a collision is imminent.
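The speed-variance check mentioned above might, under assumed sampling units and an assumed threshold value, look like the following sketch:

```python
from statistics import variance

def speed_variance_trigger(speeds, variance_threshold=4.0):
    """Trigger when speed varies sharply within a segment, e.g. a steady
    pace followed by rapid deceleration (speeds in meters per second;
    the threshold is an assumed value, not taken from the disclosure)."""
    if len(speeds) < 2:
        return False
    return variance(speeds) > variance_threshold
```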
[0034] In some examples, received data inputs may be combined and evaluated against a set of models to determine if an event has occurred. For example, the collected audio data may indicate that the user yelled "Look Out!" and the accelerometer data may indicate there was a rapid deceleration occurring during the same time period. Using the indicia together may provide a more accurate indication of an event.
[0035] In an example, the event detector 208 may detect different classes of events. A start class of events may be used to identify data that should be saved. Examples of start events may include, by way of example and not limitation, audible events (e.g., a volume threshold may be used to detect an impact or a car horn, etc.), custom voice events (e.g., yelling, predetermined keywords: "Watch it!" and "That was cool", etc.), common expletives (e.g., detection of an impending event based on a dictionary of common expletives uttered in emergency situations, etc.), and data triggered events (e.g., sensor data indicates an impact or an unusual occurrence such as being upside down, etc.).
[0036] In an example, a start event may be detected that sets a write protect flag for a set of data corresponding with the start event. For example, a collision event may be detected as a start event. As another example, the user may say "Look at that" which may be detected as a start event. In an example, the set of data may include data for a first period of time before occurrence of the start event and data for a second period of time after the occurrence of the start event. For example, a collision start event may be determined and 30 seconds of data may be flagged for write protection and five minutes of data after the collision start event may be marked for write protection.
[0037] A stop class event may be used to end the recording and/or write protection of data collected surrounding the start event. Stop events may include automatic and/or manual indications that the event has ended and/or that the data preservation period has ended.
[0038] In an example, the event detector 208 may detect a stop event that turns off the write protect flag for data subsequent to the occurrence of the stop event. In an example, the event detector 208 may detect sub-classes of stop events. In an example, a stop and delete event may be detected that turns off the write protect flag and deletes data collected before the occurrence of the stop and delete event. In an example, a stop and save event may be detected that turns off the write protect flag and transfers the data collected before the occurrence of the stop and save event to long-term storage. Examples of stop events may include, by way of example and not limitation, physical storage limit reached, user-defined storage limit reached (e.g., 1 gigabyte, etc.), user-defined recording period reached (e.g., 5 minutes), user-defined (e.g., through a training session) commands (e.g., "forget what just happened," "it's over," etc.), and preprogrammed commands (e.g., voice commands, gestures, etc.).
[0039] In an example, the stop event may be detected from the data collected from the sensor array 202. In an example, the sensor array 202 may include a microphone for collecting audio data and the stop event may be generated from the audio data. For example, the user may utter "stop recording" which may be classified as a stop event. In an example, the sensor array 202 may include a camera for collecting video data and the stop event may be generated from the video data. For example, the user may make a gesture with an arm that is in view of the camera. In an example, the sensor array 202 may include an accelerometer for collecting movement data and the stop event may be generated from the movement data. For example, the user may make a series of motions with his head. In an example, the sensor array 202 may include a GPS receiver for collecting location data and the stop event may be generated using the location data. For example, the user may move to a safe location.
[0040] In some examples, a combination of data collected from various sensors of the sensor array 202 may be used to generate a stop event. In an example, the stop event may be detected from the proximity of emergency personnel to the user. For example, a police officer may be detected in the collected video data, which may be treated as a stop event.
[0041] The event recorder 204 may include a storage manager 210 to manage storage operations. The storage manager 210 may determine the storage location of data collected from the input receiver 206. The storage manager 210 may be communicatively coupled with a memory buffer 212 in a wearable device such as, for example, the wearable device 104 as described in FIG. 1. In an example, the memory buffer 212 may be a short-term memory storage location that is able to store streaming input data in real-time. For example, input receiver 206 may receive a data stream from the array of sensors 202 including audio, video, movement, and location data and the storage manager 210 may continuously write the data stream to the memory buffer 212. In an example, the storage manager 210 may continuously write the input data stream to the memory buffer while the event recorder 204 is in operation. In an example, a video clip may be saved along with location, event date, event time, event type, and accelerometer data.
[0042] The constant read/write cycles may be taxing on storage devices (e.g., memory buffer 212). Therefore, the storage manager 210 may automatically self-validate static memory integrity on an ongoing basis. In an example, the storage manager 210 may implement data integrity validation. In an example, a notification may be transmitted to the user indicating that the static memory has failed the data integrity validation. In an example, the notification may be a graphical user interface. In an example, the notification may be a text message. In an example, an indication that the memory buffer 212 has failed an integrity check may be detected as a stop event.
[0043] In an example, the storage manager may use the memory buffer 212 as a circular buffer and may delete and/or overwrite data in response to reaching a maximum data size threshold. In an example, the storage manager 210 may delete and/or overwrite data in response to reaching a time threshold. These techniques may prevent the need to store large amounts of data that are not relevant to an event. In an example, the storage manager 210 may write protect (e.g., mark as read-only, remove blocks from the buffer, etc.) data stored in the memory buffer 212. In an example, data stored in the memory buffer 212 occurring before the start event (e.g., 30 sec, etc.) may be write protected to prevent the data from being overwritten. In an example, the storage manager 210 may write protect data stored in the memory buffer 212 written between the occurrence of the start event and occurrence of the stop event. In an example, the storage manager 210 may configure the memory buffer 212 to store all data collected after the start event as write protected. In an example, the storage manager 210 may reconfigure the memory buffer 212 to store data collected after the occurrence of the stop event as overwriteable (e.g., remove the write protection, etc.).
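The ongoing self-validation of static memory mentioned in paragraph [0042] could, as one illustrative possibility, be a periodic checksum comparison over the write-protected blocks; the hash choice and schedule are assumptions made for this sketch:

```python
import hashlib

def checksum(blocks):
    """Compute a digest over the write-protected (unchanging) blocks."""
    digest = hashlib.sha256()
    for block in blocks:        # each block is assumed to be bytes
        digest.update(block)
    return digest.hexdigest()

def validate_buffer(blocks, last_checksum):
    """Return (ok, new_checksum); a mismatch over data that should not
    have changed suggests the static memory failed validation, which may
    be surfaced to the user or treated as a stop event."""
    current = checksum(blocks)
    return current == last_checksum, current
```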
[0044] The storage manager 210 may be communicatively coupled with storage 214. In an example, storage 214 may be a solid-state drive (SSD) included in a mobile device (e.g., smartphone, tablet, etc.). In an example, the storage 214 may provide long-term storage for the event recorder 204. In an example, the storage 214 may be a cloud based storage system. In an example, the storage manager 210 may move data from the memory buffer 212 to the storage 214. In an example, the data may be moved in response to the event detector 208 receiving a stop event. For example, a start event may have been detected based on the user saying "look at that" and a stop and save event may be detected based on the user saying "save video" and the data collected between the occurrence of the start event and the stop and save event may be moved by the storage manager 210 from the memory buffer 212 to the storage 214. In an example, the storage manager 210 may transfer the data stored in the memory buffer 212 during a period of time before the start event (e.g., 30 seconds, etc.) and the data stored in the memory buffer 212 during a time period between the occurrence of the start event and the occurrence of the stop event to the storage 214.
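One way to sketch the stop-and-save transfer described in the preceding paragraph is shown below; it reuses the CircularEventBuffer sketch from earlier, the file layout and metadata fields are assumptions, and the sensor chunks are assumed to be JSON-serializable for simplicity:

```python
import json
import time

def stop_and_save(buffer, storage_path, metadata):
    """Move the write-protected pre- and post-event data out of the memory
    buffer into long-term storage, then release the buffer for overwriting."""
    record = {
        "saved_at": time.time(),
        "metadata": metadata,            # e.g., location, event date/time, event type
        "data": list(buffer.protected),  # pre-event window plus data up to the stop event
    }
    with open(storage_path, "w") as f:
        json.dump(record, f)
    buffer.protected = []                # buffer returns to overwrite mode
    buffer.chunks.clear()
```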
[0045] In some examples, the storage manager 210 may delete the contents of the memory buffer 212 in response to an indication of a stop and delete event. For example, the user may say "delete recording" which may be detected as a stop and delete event upon which the storage manager 210 may delete and/or clear the contents of the memory buffer 212.
[0046] In some examples, the storage manager 210 may delete and/or transfer the contents of the memory buffer 212 in response to an indication of a gesture. For example, the user may make a gesture with an arm (e.g., a swipe of the arm from lower left to upper right, etc.) or head (e.g., a series of nods, etc.) that may be detected as a stop event upon which the storage manager may delete the contents of the memory buffer 212.
[0047] In some examples, the storage 214 may be included in a mobile device and the storage manager 210 may transfer the contents of the memory buffer 212 to the storage 214 upon establishing a connection with the mobile device. For example, a start event may have been detected based on a gesture made by the user and a stop event may have been generated by a voice command issued by the user while the user was away from a mobile device. When the user returns home the storage manager 210 may automatically transfer the contents of the memory buffer 212 to the storage 214 when the connection to the mobile device is established. In an example, the connection between the mobile device and the wearable device may be via a wireless connection (e.g., WiFi, short-wavelength radio, Bluetooth, Bluetooth low energy, Ant+, near-field communication, etc.). In an example, the connection between the wearable device and the mobile device may be a wired connection (e.g., USB, etc.).
[0048] In some examples, the event recorder 204 may send data collected before occurrence of the stop event and data collected after the occurrence of the start event over a network (e.g., cellular network, WiFi, SMS, etc.). In an example, the data may be transmitted via an email message. In an example, the data may be transmitted as a text message (e.g., SMS message, etc.). This may allow the data to be sent directly to a third party (e.g., emergency personnel, family member, friend, etc.). Information such as the user's location and an indication of an event type may assist the third party in providing assistance to the user.
[0049] While the present subject matter has been described in the context of a wearable device, it will be understood that the subject matter described herein could be implemented in a variety of devices such as wearable devices (e.g., smartglasses, smartwatch, etc.), automobiles, sporting equipment (e.g., bicycles, scooters, etc.), motorcycles, etc. While the present subject matter has been described in the context of a bicyclist, it will be understood that the present subject matter could be used in a variety of contexts in which a person wishes to record data surrounding an event. Some examples include, by way of example and not limitation, a motorcyclist, a motorist, and an action sports participant (e.g., skier, snowboarder, etc.).
[0050] FIG. 3 illustrates an example of a method 300 for an automatic event recorder, according to an embodiment.
[0051] At operation 302, method 300 may write protect a first portion of an input stream written to a memory buffer as read-only data upon receiving notification of a start event. The input stream may be received from a sensor array of a wearable device. The first portion of the input stream may correspond to a period of time before the start event. In an example, the input stream may be received from a sensor array of a wearable device. In an example, the input stream may include audio captured from a microphone of the wearable device. In an example, the input stream may include video captured from a camera of the wearable device. In an example, the input stream may include data collected from an accelerometer of the wearable device. In an example, the input stream may include data collected from a global positioning system receiver of the wearable device.
[0052] At operation 304, method 300 may write a second portion of the input stream to the memory buffer as read-only data subsequent to the start event. The second portion of the input stream may be written between the start event and a stop event. In an example, the stop event may be generated using the audio captured from the microphone of the wearable device. In an example, the stop event may be generated using the video captured from the camera of the wearable device. In an example, the stop event may be generated using the data collected from the accelerometer of the wearable device. In an example, the stop event may be generated using the data collected from the global positioning system receiver of the wearable device. In an example, the stop event may be received from the wearable device.
[0053] In some examples, audio information captured by the sensor array may be accessed, a voice command made by a user may be detected, and the stop event may be generated based on the voice command. In some examples, video information captured by the sensor array may be accessed, a gesture made by a user may be detected, and the stop event may be generated based on the gesture. In some examples, a user-defined data storage limit for the memory buffer may be accessed, it may be determined that the memory buffer has reached the user-defined storage limit, and the stop event may be generated based on the determination.
[0054] In some examples, a third portion of the input stream may be written to the memory buffer as over-writeable data upon receiving notification of the stop event. In an example, the third portion of the input stream may be written subsequent to the occurrence of the stop event.
[0055] In some examples, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer may be transferred to a storage device upon receiving notification of the stop event. In an example, the storage device may be included in a mobile device. In an example, the storage device may be included in the wearable device. In some examples, the memory buffer may be cleared upon the receipt of the stop event.
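Tying the operations of method 300 together, a minimal control loop might look like the sketch below; event detection, persistence, and buffer internals follow the earlier illustrative sketches, and all names are assumptions rather than elements of the disclosure:

```python
def run_event_recorder(sensor_stream, buffer, detect_event, persist):
    """Continuously buffer the input stream and write protect around events."""
    for chunk in sensor_stream:              # audio/video/accelerometer/GPS chunks
        event = detect_event(chunk)
        if event == "start":
            buffer.on_start_event()          # operation 302: protect the pre-event window
        buffer.write(chunk)                  # operation 304: post-event data stays read-only
        if event == "stop":
            buffer.on_stop_event(persist)    # later data reverts to over-writeable
```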
[0056] FIG. 4 illustrates a block diagram of an example machine 400 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform. In alternative embodiments, the machine 400 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 400 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. In an example, the machine 400 may act as a peer machine in a peer-to-peer (P2P) (or other distributed) network environment. The machine 400 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations.
[0057] Examples, as described herein, may include, or may operate by, logic or a number of components, or mechanisms. Circuit sets are a collection of circuits implemented in tangible entities that include hardware (e.g., simple circuits, gates, logic, etc.). Circuit set membership may be flexible over time and underlying hardware variability. Circuit sets include members that may, alone or in combination, perform specified operations when operating. In an example, hardware of the circuit set may be immutably designed to carry out a specific operation (e.g., hardwired). In an example, the hardware of the circuit set may include variably connected physical components (e.g., execution units, transistors, simple circuits, etc.) including a computer readable medium physically modified (e.g., magnetically, electrically, moveable placement of invariant massed particles, etc.) to encode instructions of the specific operation. In connecting the physical components, the underlying electrical properties of a hardware constituent are changed, for example, from an insulator to a conductor or vice versa. The instructions enable embedded hardware (e.g., the execution units or a loading mechanism) to create members of the circuit set in hardware via the variable connections to carry out portions of the specific operation when in operation. Accordingly, the computer readable medium is communicatively coupled to the other components of the circuit set member when the device is operating. In an example, any of the physical components may be used in more than one member of more than one circuit set. For example, under operation, execution units may be used in a first circuit of a first circuit set at one point in time and reused by a second circuit in the first circuit set, or by a third circuit in a second circuit set at a different time.
[0058] Machine (e.g., computer system) 400 may include a hardware processor 402 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 404 and a static memory 406, some or all of which may communicate with each other via an interlink (e.g., bus) 408. The machine 400 may further include a display unit 410, an alphanumeric input device 412 (e.g., a keyboard), and a user interface (UI) navigation device 414 (e.g., a mouse). In an example, the display unit 410, input device 412 and UI navigation device 414 may be a touch screen display. The machine 400 may additionally include a storage device (e.g., drive unit) 416, a signal generation device 418 (e.g., a speaker), a network interface device 420, and one or more sensors 421, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor. The machine 400 may include an output controller 428, such as a serial (e.g., Universal Serial Bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
[0059] The storage device 416 may include a machine readable medium 422 on which is stored one or more sets of data structures or instructions 424 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 424 may also reside, completely or at least partially, within the main memory 404, within static memory 406, or within the hardware processor 402 during execution thereof by the machine 400. In an example, one or any combination of the hardware processor 402, the main memory 404, the static memory 406, or the storage device 416 may constitute machine readable media.
[0060] While the machine readable medium 422 is illustrated as a single medium, the term "machine readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) configured to store the one or more instructions 424.
[0061] The term "machine readable medium" may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 400 and that cause the machine 400 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media. In an example, a massed machine readable medium comprises a machine readable medium with a plurality of particles having invariant (e.g., rest) mass. Accordingly, massed machine-readable media are not transitory propagating signals.
Specific examples of massed machine readable media may include: nonvolatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable
Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
[0062] The instructions 424 may further be transmitted or received over a communications network 426 using a transmission medium via the network interface device 420 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.).
Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others. In an example, the network interface device 420 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 426. In an example, the network interface device 420 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 400, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.
Additional Notes & Examples
[0063] Example 1 is a computing apparatus for automatic event recording, the computing apparatus comprising: a processor; and a memory storing instructions that, when executed by the processor, configure the computing apparatus to: write protect, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and write, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
[0064] In Example 2, the subject matter of Example 1 optionally includes, wherein the input stream includes audio captured from a microphone of the wearable device.
[0065] In Example 3, the subject matter of Example 2 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to generate the stop event using the audio captured from the microphone of the wearable device.
[0066] In Example 4, the subject matter of any one or more of Examples 1-3 optionally include, wherein the input stream includes video captured from a camera of the wearable device.
[0067] In Example 5, the subject matter of Example 4 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to generate the stop event using the video captured from the camera of the wearable device.
[0068] In Example 6, the subject matter of any one or more of Examples 1-5 optionally include, wherein the input stream includes data collected from an accelerometer of the wearable device.
[0069] In Example 7, the subject matter of Example 6 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to generate the stop event using the data collected from the accelerometer of the wearable device.
[0070] In Example 8, the subject matter of any one or more of Examples 1-7 optionally include, wherein the input stream includes data collected from a global positioning system receiver of the wearable device.
[0071] In Example 9, the subject matter of Example 8 optionally includes, further comprising instructions, which when executed by the processor, cause the processor to generate the stop event using the data collected from the global positioning system receiver of the wearable device.
[0072] In Example 10, the subject matter of any one or more of Examples 1-9 optionally include, further comprising instructions, which when executed by the processor, cause the processor to: transfer, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
[0073] In Example 11, the subject matter of Example 10 optionally includes, wherein the storage device is included in a mobile device.
[0074] In Example 12, the subject matter of any one or more of Examples 10-11 optionally include, wherein the storage device is included in the wearable device.
[0075] In Example 13, the subject matter of any one or more of Examples 1-12 optionally include, wherein the stop event is received from the wearable device.
[0076] In Example 14, the subject matter of any one or more of Examples 1-13 optionally include, further comprising instructions, which when executed by the processor, cause the processor to clear the memory buffer upon the receipt of the stop event.
[0077] In Example 15, the subject matter of any one or more of Examples 1-14 optionally include, further comprising instructions, which when executed by the processor, cause the processor to: access audio information captured by the sensor array; detect a voice command made by a user of the computing apparatus; and generate the stop event based on the voice command.
[0078] In Example 16, the subject matter of any one or more of Examples 1-15 optionally include, further comprising instructions, which when executed by the processor, cause the processor to: access video information captured by the sensor array; detect a gesture made by a user of the computing apparatus; and generate the stop event based on the gesture.
[0079] In Example 17, the subject matter of any one or more of Examples 1-16 optionally include, further comprising instructions, which when executed by the processor, cause the processor to: access a user-defined data storage limit for the memory buffer; determine that the memory buffer has reached the user-defined storage limit; and generate the stop event based on the determination.
[0080] In Example 18, the subject matter of any one or more of Examples 1-17 optionally include, further comprising instructions, which when executed by the processor, cause the processor to: write, upon receiving notification of the stop event, a third portion of the input stream to the memory buffer as over-writeable data.
[0081] Example 19 is a computer-readable storage medium for automatic event recording, the computer-readable storage medium including instructions that when executed by a computer, cause the computer to: write protect, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and write, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
[0082] In Example 20, the subject matter of Example 19 optionally includes, wherein the input stream includes audio captured from a microphone of the wearable device.
[0083] In Example 21, the subject matter of Example 20 optionally includes, further comprising instructions to cause the computer to generate the stop event using the audio captured from the microphone of the wearable device.
[0084] In Example 22, the subject matter of any one or more of Examples 19-21 optionally include, wherein the input stream includes video captured from a camera of the wearable device.
[0085] In Example 23, the subject matter of Example 22 optionally includes, further comprising instructions to cause the computer to generate the stop event using the video captured from the camera of the wearable device.
[0086] In Example 24, the subject matter of any one or more of Examples 19-23 optionally include, wherein the input stream includes data collected from an accelerometer of the wearable device.
[0087] In Example 25, the subject matter of Example 24 optionally includes, further comprising instructions to cause the computer to generate the stop event using the data collected from the accelerometer of the wearable device.
[0088] In Example 26, the subject matter of any one or more of Examples 19-25 optionally include, wherein the input stream includes data collected from a global positioning system receiver of the wearable device.
[0089] In Example 27, the subject matter of Example 26 optionally includes, further comprising instructions to cause the computer to generate the stop event using the data collected from the global positioning system receiver of the wearable device.
[0090] In Example 28, the subject matter of any one or more of Examples 19-27 optionally include, further comprising instructions to cause the computer to: transfer, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
[0091] In Example 29, the subject matter of Example 28 optionally includes, wherein the storage device is included in a mobile device.
[0092] In Example 30, the subject matter of any one or more of Examples 28-29 optionally include, wherein the storage device is included in the wearable device.
[0093] In Example 31, the subject matter of any one or more of Examples 19-30 optionally include, wherein the stop event is received from the wearable device.
[0094] In Example 32, the subject matter of any one or more of Examples 19-31 optionally include, further comprising instructions to cause the computer to clear the memory buffer upon the receipt of the stop event.
[0095] In Example 33, the subject matter of any one or more of Examples 19-32 optionally include, further comprising instructions to cause the computer to: access audio information captured by the sensor array; detect a voice command made by a user of the computer; and generate the stop event based on the voice command.
[0096] In Example 34, the subject matter of any one or more of Examples 19-33 optionally include, further comprising instructions to cause the computer to: access video information captured by the sensor array; detect a gesture made by a user of the computer; and generate the stop event based on the gesture.
[0097] In Example 35, the subject matter of any one or more of Examples 19-34 optionally include, further comprising instructions to cause the computer to: access a user-defined data storage limit for the memory buffer; determine that the memory buffer has reached the user-defined storage limit; and generate the stop event based on the determination.
[0098] In Example 36, the subject matter of any one or more of Examples 19-35 optionally include, further comprising instructions to cause the computer to: write, upon receiving notification of the stop event, a third portion of the input stream to the memory buffer as over-writeable data.
[0099] Example 37 is a system for automatic event recording, the system comprising: means for write protecting, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and means for writing, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
[00100] In Example 38, the subject matter of Example 37 optionally includes, wherein the input stream includes audio captured from a microphone of the wearable device.
[00101] In Example 39, the subject matter of Example 38 optionally includes, further comprising means for generating the stop event using the audio captured from the microphone of the wearable device.
[00102] In Example 40, the subject matter of any one or more of Examples 37-39 optionally include, wherein the input stream includes video captured from a camera of the wearable device.
[00103] In Example 41, the subject matter of Example 40 optionally includes, further comprising means for generating the stop event using the video captured from the camera of the wearable device.
[00104] In Example 42, the subject matter of any one or more of Examples 37-41 optionally include, wherein the input stream includes data collected from an accelerometer of the wearable device.
[00105] In Example 43, the subject matter of Example 42 optionally includes, further comprising means for generating the stop event using the data collected from the accelerometer of the wearable device.
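One way an accelerometer-derived stop event of the kind named in Example 43 could be produced is sketched below. The "wearer has been still long enough" rule, the 0.05 g threshold, the 10-second quiet period, and the 50 Hz sample rate are assumptions chosen for illustration.

```python
import math

def detect_accel_stop_event(samples, quiet_threshold_g=0.05,
                            quiet_seconds=10.0, sample_rate_hz=50):
    """Return True when accelerometer activity stays below a threshold long enough.

    `samples` is an iterable of (x, y, z) readings in g; all thresholds are
    illustrative assumptions, not values taken from the disclosure.
    """
    quiet_samples_needed = int(quiet_seconds * sample_rate_hz)
    quiet_run = 0
    for x, y, z in samples:
        # Deviation of the measured magnitude from 1 g (gravity at rest).
        deviation = abs(math.sqrt(x * x + y * y + z * z) - 1.0)
        quiet_run = quiet_run + 1 if deviation < quiet_threshold_g else 0
        if quiet_run >= quiet_samples_needed:
            return True
    return False
```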
[00106] In Example 44, the subject matter of any one or more of Examples 37-43 optionally include, wherein the input stream includes data collected from a global positioning system receiver of the wearable device.
[00107] In Example 45, the subject matter of Example 44 optionally includes, further comprising means for generating the stop event using the data collected from the global positioning system receiver of the wearable device.
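A GPS-derived stop event of the kind named in Example 45 might, for example, be generated when the wearer re-enters a predefined area. The geofence rule, the 100 m radius, and the (home_lat, home_lon) reference point below are illustrative assumptions; the haversine distance is simply one standard way to compare fixes.

```python
import math

def detect_gps_stop_event(fixes, home_lat, home_lon, radius_m=100.0):
    """Return True once any fix falls inside an assumed 'safe' radius.

    `fixes` is an iterable of (latitude, longitude) pairs in degrees.
    """
    def haversine_m(lat1, lon1, lat2, lon2):
        r = 6371000.0  # mean Earth radius in metres
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    return any(haversine_m(lat, lon, home_lat, home_lon) <= radius_m
               for lat, lon in fixes)
```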
[00108] In Example 46, the subject matter of any one or more of Examples 37-45 optionally include, further comprising: means for transferring, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
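Building on the illustrative EventRingBuffer sketch above, the snippet below shows one way the transfer of Example 46 could be carried out: on the stop event, the write-protected first and second portions are copied out of the buffer to a storage target. The file-based destination is an assumption standing in for storage on a mobile device or on the wearable itself.

```python
from pathlib import Path

def transfer_on_stop(buffer: "EventRingBuffer", destination: Path) -> int:
    """Copy the write-protected portions of the buffer to a storage device.

    Uses the illustrative EventRingBuffer sketch above; returns the number of
    chunks transferred.
    """
    protected_chunks = buffer.stop_event()
    destination.parent.mkdir(parents=True, exist_ok=True)
    with destination.open("ab") as out:
        for chunk in protected_chunks:
            out.write(chunk.payload)
    return len(protected_chunks)
```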
[00109] In Example 47, the subject matter of Example 46 optionally includes, wherein the storage device is included in a mobile device.
[00110] In Example 48, the subject matter of any one or more of Examples 46-47 optionally include, wherein the storage device is included in the wearable device.
[00111] In Example 49, the subject matter of any one or more of Examples 37-48 optionally include, wherein the stop event is received from the wearable device.
[00112] In Example 50, the subject matter of any one or more of Examples 37-49 optionally include, further comprising means for clearing the memory buffer upon the receipt of the stop event.
[00113] In Example 51, the subject matter of any one or more of Examples 37-50 optionally include, further comprising: means for accessing audio information captured by the sensor array; means for detecting a voice command made by a user of the system; and means for generating the stop event based on the voice command.
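A minimal sketch of the voice-command path in Example 51 is shown below. It assumes an external speech recognizer has already produced a text transcript from the sensor-array audio; the stop-phrase list is a hypothetical command set, not one taken from the disclosure.

```python
STOP_PHRASES = ("stop recording", "end event", "i am safe")  # illustrative phrases

def detect_voice_stop_event(transcript: str, phrases=STOP_PHRASES) -> bool:
    """Return True when a recognized utterance contains an assumed stop phrase.

    `transcript` is text produced by an external speech recognizer.
    """
    normalized = transcript.lower()
    return any(phrase in normalized for phrase in phrases)
```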
[00114] In Example 52, the subject matter of any one or more of Examples 37-51 optionally include, further comprising: means for accessing video information captured by the sensor array; means for detecting a gesture made by a user of the system; and means for generating the stop event based on the gesture.
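The gesture path of Example 52 could, as one possibility, reduce to a simple heuristic over tracked hand positions. The sketch below assumes an external hand tracker supplies normalized x-coordinates per video frame and treats a repeated side-to-side motion as a wave; the thresholds are illustrative assumptions only.

```python
def detect_wave_gesture(hand_x_positions, min_direction_changes=4, min_span=0.2):
    """Heuristically detect a side-to-side wave from tracked hand x-coordinates.

    `hand_x_positions` is a sequence of normalized (0..1) x-coordinates
    produced by an external hand tracker.
    """
    if len(hand_x_positions) < 3:
        return False
    if max(hand_x_positions) - min(hand_x_positions) < min_span:
        return False
    direction_changes = 0
    prev_delta = 0.0
    for a, b in zip(hand_x_positions, hand_x_positions[1:]):
        delta = b - a
        # Count reversals of horizontal motion direction.
        if prev_delta and delta and (delta > 0) != (prev_delta > 0):
            direction_changes += 1
        if delta:
            prev_delta = delta
    return direction_changes >= min_direction_changes
```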
[00115] In Example 53, the subject matter of any one or more of Examples 37-52 optionally include, further comprising: means for accessing a user-defined data storage limit for the memory buffer; means for determining that the memory buffer has reached the user-defined storage limit; and means for generating the stop event based on the determination.
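For the user-defined storage limit of Example 53, a small watcher object is one possible shape. The byte-based accounting and the callback interface below are assumptions of this sketch.

```python
class StorageLimitWatcher:
    """Fires a stop-event callback once buffer usage reaches a user-defined limit."""

    def __init__(self, user_limit_bytes: int, on_stop_event):
        self.user_limit_bytes = user_limit_bytes
        self.bytes_used = 0
        self.on_stop_event = on_stop_event
        self.triggered = False

    def record_write(self, chunk_size: int) -> None:
        """Account for a newly written chunk and trigger the stop event at the limit."""
        self.bytes_used += chunk_size
        if not self.triggered and self.bytes_used >= self.user_limit_bytes:
            self.triggered = True
            self.on_stop_event()
```

A caller might, for instance, construct `StorageLimitWatcher(64 * 1024 * 1024, buffer.stop_event)` so that filling an assumed 64 MB budget ends the recording.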
[00116] In Example 54, the subject matter of any one or more of Examples 37-53 optionally include, further comprising: means for writing, upon receiving notification of the stop event, a third portion of the input stream to the memory buffer as over-writeable data.
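Continuing the illustrative EventRingBuffer sketch, the usage below shows the three portions named in Example 54 under that sketch's assumptions: the pre-event data becomes read-only at the start event, data written during the event is stored read-only, and data written after the stop event is over-writeable again.

```python
buffer = EventRingBuffer(capacity_chunks=8)

buffer.write(b"pre-event audio")            # first portion: over-writeable at first
buffer.start_event(pre_event_seconds=30.0)  # ...now write protected
buffer.write(b"during-event audio")         # second portion: written as read-only
protected = buffer.stop_event()             # stop event received
buffer.write(b"post-event audio")           # third portion: over-writeable again

assert [c.read_only for c in buffer.chunks] == [True, True, False]
```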
[00117] Example 55 is a method for automatic event recording, the method comprising: write protecting, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and writing, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
[00118] In Example 56, the subject matter of Example 55 optionally includes, wherein the input stream includes audio captured from a microphone of the wearable device.
[00119] In Example 57, the subject matter of Example 56 optionally includes, further comprising generating the stop event using the audio captured from the microphone of the wearable device.
[00120] In Example 58, the subject matter of any one or more of Examples 55-57 optionally include, wherein the input stream includes video captured from a camera of the wearable device.
[00121] In Example 59, the subject matter of Example 58 optionally includes, further comprising generating the stop event using the video captured from the camera of the wearable device.
[00122] In Example 60, the subject matter of any one or more of Examples 55-59 optionally include, wherein the input stream includes data collected from an accelerometer of the wearable device.
[00123] In Example 61, the subject matter of Example 60 optionally includes, further comprising generating the stop event using the data collected from the accelerometer of the wearable device.
[00124] In Example 62, the subject matter of any one or more of Examples 55-61 optionally include, wherein the input stream includes data collected from a global positioning system receiver of the wearable device.
[00125] In Example 63, the subject matter of Example 62 optionally includes, further comprising generating the stop event using the data collected from the global positioning system receiver of the wearable device.
[00126] In Example 64, the subject matter of any one or more of Examples 55-63 optionally include, further comprising: transferring, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
[00127] In Example 65, the subject matter of Example 64 optionally includes, wherein the storage device is included in a mobile device.
[00128] In Example 66, the subject matter of any one or more of Examples 64-65 optionally include, wherein the storage device is included in the wearable device.
[00129] In Example 67, the subject matter of any one or more of Examples 55-66 optionally include, wherein the stop event is received from the wearable device.
[00130] In Example 68, the subject matter of any one or more of Examples 55-67 optionally include, further comprising clearing the memory buffer upon the receipt of the stop event.
[00131] In Example 69, the subject matter of any one or more of Examples 55-68 optionally include, further comprising: accessing audio information captured by the sensor array; detecting a voice command made by a user; and generating the stop event based on the voice command.
[00132] In Example 70, the subject matter of any one or more of Examples 55-69 optionally include, further comprising: accessing video information captured by the sensor array; detecting a gesture made by a user; and generating the stop event based on the gesture.
[00133] In Example 71, the subject matter of any one or more of Examples 55-70 optionally include, further comprising: accessing a user-defined data storage limit for the memory buffer; determining that the memory buffer has reached the user-defined storage limit; and generating the stop event based on the determination.
[00134] In Example 72, the subject matter of any one or more of Examples 55-71 optionally include, further comprising: writing, upon receiving notification of the stop event, a third portion of the input stream to the memory buffer as over-writeable data.
[00135] Example 73 is a system for automatic event recording, the system comprising means to perform any method of Examples 55-72.
[00136] Example 74 is a machine readable medium for automatic event recording, the machine readable medium including instructions that, when executed by a machine, cause the machine to perform any method of Examples 55-72.

Claims

CLAIMS
What is claimed is:
1. A computing apparatus for automatic event recording, the computing apparatus comprising:
a processor; and
a memory storing instructions that, when executed by the processor, configure the computing apparatus to:
write protect, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and
write, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
2. The computing apparatus of claim 1, wherein the input stream includes audio captured from a microphone of the wearable device.
3. The computing apparatus of claim 2, further comprising instructions, which when executed by the processor, cause the processor to generate the stop event using the audio captured from the microphone of the wearable device.
4. The computing apparatus of claim 1, further comprising instructions, which when executed by the processor, cause the processor to:
transfer, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
5. The computing apparatus of claim 1, further comprising instructions, which when executed by the processor, cause the processor to clear the memory buffer upon the receipt of the stop event.
6. The computing apparatus of claim 1, further comprising instructions, which when executed by the processor, cause the processor to:
access audio information captured by the sensor array;
detect a voice command made by a user of the computing apparatus; and generate the stop event based on the voice command.
7. The computing apparatus of claim 1, further comprising instructions, which when executed by the processor, cause the processor to:
access video information captured by the sensor array;
detect a gesture made by a user of the computing apparatus; and generate the stop event based on the gesture.
8. The computing apparatus of claim 1, further comprising instructions, which when executed by the processor, cause the processor to:
access a user-defined data storage limit for the memory buffer;
determine that the memory buffer has reached the user-defined storage limit; and
generate the stop event based on the determination.
9. A computer-readable storage medium for automatic event recording, the computer-readable storage medium including instructions that when executed by a computer, cause the computer to:
write protect, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and
write, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
10. The computer-readable storage medium of claim 9, wherein the input stream includes audio captured from a microphone of the wearable device.
11. The computer-readable storage medium of claim 10, further comprising instructions to cause the computer to generate the stop event using the audio captured from the microphone of the wearable device.
12. The computer-readable storage medium of claim 9, further comprising instructions to cause the computer to:
transfer, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
13. The computer-readable storage medium of claim 9, further comprising instructions to cause the computer to clear the memory buffer upon the receipt of the stop event.
14. The computer-readable storage medium of claim 9, further comprising instructions to cause the computer to:
access audio information captured by the sensor array;
detect a voice command made by a user of the computer; and
generate the stop event based on the voice command.
15. The computer-readable storage medium of claim 9, further comprising instructions to cause the computer to:
access video information captured by the sensor array;
detect a gesture made by a user of the computer; and
generate the stop event based on the gesture.
16. The computer-readable storage medium of claim 9, further comprising instructions to cause the computer to:
access a user-defined data storage limit for the memory buffer;
determine that the memory buffer has reached the user-defined storage limit; and
generate the stop event based on the determination.
17. A method for automatic event recording, the method comprising:
write protecting, upon receiving notification of a start event, a first portion of an input stream written to a memory buffer as read-only data, the input stream received from a sensor array of a wearable device, the first portion of the input stream corresponding to a period of time before the start event; and
writing, subsequent to the start event, a second portion of the input stream to the memory buffer as read-only data, the second portion of the input stream written between the start event and a stop event.
18. The method of claim 17, wherein the input stream includes audio captured from a microphone of the wearable device.
19. The method of claim 18, further comprising generating the stop event using the audio captured from the microphone of the wearable device.
20. The method of claim 17, further comprising:
transferring, upon receiving notification of the stop event, the first portion of the input stream written to the memory buffer and the second portion of the input stream written to the memory buffer to a storage device.
21. The method of claim 17, further comprising clearing the memory buffer upon the receipt of the stop event.
22. The method of claim 17, further comprising:
accessing audio information captured by the sensor array;
detecting a voice command made by a user; and
generating the stop event based on the voice command.
23. The method of claim 17, further comprising:
accessing video information captured by the sensor array;
detecting a gesture made by a user; and
generating the stop event based on the gesture.
24. The method of claim 17, further comprising:
accessing a user-defined data storage limit for the memory buffer;
determining that the memory buffer has reached the user-defined storage limit; and
generating the stop event based on the determination.
PCT/US2016/061547 2015-12-16 2016-11-11 Automatic event recorder WO2017105696A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/971,574 US9912902B2 (en) 2015-12-16 2015-12-16 Automatic event recorder
US14/971,574 2015-12-16

Publications (1)

Publication Number Publication Date
WO2017105696A1 true WO2017105696A1 (en) 2017-06-22

Family

ID=59057285

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/061547 WO2017105696A1 (en) 2015-12-16 2016-11-11 Automatic event recorder

Country Status (2)

Country Link
US (1) US9912902B2 (en)
WO (1) WO2017105696A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6873864B2 (en) * 2017-08-09 2021-05-19 株式会社東芝 Storage control device, storage device and write control method
US11605224B2 (en) * 2019-05-31 2023-03-14 Apple Inc. Automated media editing operations in consumer devices
CN115191003A (en) * 2019-12-07 2022-10-14 罗翰·提拉克·利亚纳拉奇 Personal safety system and method
US11087801B1 (en) * 2020-02-06 2021-08-10 Micron Technology, Inc. Configuring a host interface of a memory device based on mode of operation
US11243896B2 (en) 2020-03-25 2022-02-08 Micron Technology, Inc. Multiple pin configurations of memory devices
US11582392B2 (en) 2021-03-25 2023-02-14 International Business Machines Corporation Augmented-reality-based video record and pause zone creation

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7570158B2 (en) * 2006-08-17 2009-08-04 At&T Intellectual Property I, L.P. Collaborative incident media recording system and related methods
US9076041B2 (en) 2010-08-26 2015-07-07 Blast Motion Inc. Motion event recognition and video synchronization system and method
US9462444B1 (en) * 2010-10-04 2016-10-04 Nortek Security & Control Llc Cloud based collaborative mobile emergency call initiation and handling distribution system
US9549583B2 (en) * 2013-01-04 2017-01-24 Bell Sports, Inc. Helmet with integrated electronic components
US9833031B2 (en) * 2013-05-23 2017-12-05 Accenture Global Services Limited Safety accessory with situational awareness and data retention
US10298825B2 (en) * 2014-07-23 2019-05-21 Orcam Technologies Ltd. Systems and methods for remembering held items and finding lost items using wearable camera systems
WO2016022984A1 (en) * 2014-08-08 2016-02-11 Fusar Technologies, Inc. Helmet system and methods

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040104823A1 (en) * 1999-01-20 2004-06-03 Chainer Timothy J. Event-recorder for transmitting and storing electronic signature data
US20110125708A1 (en) * 2004-10-12 2011-05-26 Vanman Robert V Method of and system for mobile surveillance and event recording
US8531526B1 (en) * 2009-08-25 2013-09-10 Clinton A. Spence Wearable video recorder and monitor system and associated method
US20150317801A1 (en) * 2010-08-26 2015-11-05 Blast Motion Inc. Event analysis system
WO2014041032A1 (en) * 2012-09-11 2014-03-20 L.I.F.E. Corporation S.A. Wearable communication platform

Also Published As

Publication number Publication date
US20170180676A1 (en) 2017-06-22
US9912902B2 (en) 2018-03-06

Similar Documents

Publication Publication Date Title
US9912902B2 (en) Automatic event recorder
US11335200B2 (en) Method and system for providing artificial intelligence analytic (AIA) services using operator fingerprints and cloud data
Chang et al. DeepCrash: A deep learning-based internet of vehicles system for head-on and single-vehicle accident detection with emergency notification
US20210334558A1 (en) Method and system for providing behavior of vehicle operator using virtuous cycle
US20210021687A1 (en) Method and system for providing predictions via artificial intelligence (ai) models using a distributed system
US10540557B2 (en) Method and apparatus for providing driver information via audio and video metadata extraction
US20140375807A1 (en) Camera activity system
US20130070928A1 (en) Methods, systems, and media for mobile audio event recognition
US11823469B2 (en) Cloud-controlled vehicle security system
JP2017204104A (en) Control device, on-vehicle device, video distribution method, and program
KR102143211B1 (en) A method and system for preventing drowsiness driving and keeping vehicle safe
US20180139485A1 (en) Camera System for Car Security
JP2019079413A (en) Drive support device and drive support system
JP2016136332A (en) Information processing apparatus, information processing method, and storage medium
US11089451B2 (en) Automatic incident detection and reporting
WO2022066026A1 (en) Device, method and system for providing a notification of a distinguishing activity
CN113965726A (en) Method, device and system for processing traffic video
CN111739266A (en) User behavior recognition method and device, electronic equipment and storage medium
JP2021164008A (en) Information processing method, information processing device, program, and information processing system
JP2008226075A (en) Operation state recording device
TWI798001B (en) Method and system of gesture control driving recorder
CN202615562U (en) Alarm device
JP2016097966A (en) Method for use in vehicle protection, equipment for use in vehicle protection, and non-transitory computer readable medium
CN202615561U (en) Safety alarm device
WO2018068166A1 (en) Drive recorder and vehicle alarm method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16876274

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16876274

Country of ref document: EP

Kind code of ref document: A1