DeviceOrientation Event Specification

W3C Candidate Recommendation Snapshot, 20 March 2024

This version:
https://www.w3.org/TR/2024/CR-orientation-event-20240320/
Latest published version:
https://www.w3.org/TR/orientation-event/
Editor's Draft:
https://w3c.github.io/deviceorientation/
History:
https://www.w3.org/standards/history/orientation-event/
Feedback:
public-device-apis@w3.org with subject line “[orientation-event] … message topic …” (archives)
DeviceOrientation Event Specification Issues Repository
Implementation Report:
https://wpt.fyi/results/orientation-event
Editors:
Reilly Grant (Google LLC)
Raphael Kubo da Costa (Intel Corporation)
Former Editors:
Rich Tibbett (Opera Software ASA)
Tim Volodine (Google Inc)
Steve Block (Google Inc until July 2012)
Andrei Popescu (Google Inc until July 2012)

Abstract

This specification defines several new DOM events that provide information about the physical orientation and motion of a hosting device.

Status of this document

This section describes the status of this document at the time of its publication. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.

This document was published by the Devices and Sensors Working Group and the Web Applications Working Group as a Candidate Recommendation Snapshot using the Recommendation track. This document is intended to become a W3C Recommendation. It will remain a Candidate Recommendation at least long enough to ensure the opportunity for wide review.

If you wish to make comments regarding this document, please send them to public-device-apis@w3.org (subscribe, archives) and public-webapps@w3.org (subscribe, archives). When sending e-mail, please put the text “orientation-event” in the subject, preferably like this: “[orientation-event] …summary of comment…”. All comments are welcome.

Publication as a Candidate Recommendation does not imply endorsement by W3C and its Members. A Candidate Recommendation Snapshot has received wide review, is intended to gather implementation experience, and has commitments from Working Group members to royalty-free licensing for implementations.

The entrance criteria for this document to enter the Proposed Recommendation stage are a minimum of two independent and interoperable user agents that implement all the features of this specification, as determined by passing the user agent tests defined in the test suite developed by the Working Groups. The Working Groups will prepare an implementation report to track progress.

This document was produced by groups operating under the W3C Patent Policy. W3C maintains a public list of any patent disclosures (Devices and Sensors) and a public list of any patent disclosures (Web Applications) made in connection with the deliverables of each group; these pages also include instructions for disclosing a patent. An individual who has actual knowledge of a patent which the individual believes contains Essential Claim(s) must disclose the information in accordance with section 6 of the W3C Patent Policy.

This document is governed by the 03 November 2023 W3C Process Document.

This specification was initially published as a Candidate Recommendation in August 2016 and was retired in 2017 due to the closure of the Geolocation Working Group. In 2019 the Devices and Sensors Working Group adopted this specification and, during 2019-2024, made substantial interoperability, test automation, privacy and editorial improvements, as outlined in the Changes section. These changes aligned the specification with widely available implementations. In 2024 this specification became a joint deliverable of the Devices and Sensors Working Group and the Web Applications Working Group.

1. Conformance requirements

All diagrams, examples, and notes in this specification are non-normative, as are all sections explicitly marked non-normative. Everything else in this specification is normative.

The key words "MUST", "MUST NOT", "REQUIRED", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in the normative parts of this document are to be interpreted as described in RFC2119. For readability, these words do not appear in all uppercase letters in this specification. [RFC2119]

Requirements phrased in the imperative as part of algorithms (such as "strip any leading space characters" or "return false and abort these steps") are to be interpreted with the meaning of the key word ("must", "should", "may", etc) used in introducing the algorithm.

Conformance requirements phrased as algorithms or specific steps may be implemented in any manner, so long as the end result is equivalent. (In particular, the algorithms defined in this specification are intended to be easy to follow, and not intended to be performant.)

User agents may impose implementation-specific limits on otherwise unconstrained inputs, e.g. to prevent denial of service attacks, to guard against running out of memory, or to work around platform-specific limitations.

Implementations that use ECMAScript to implement the APIs defined in this specification must implement them in a manner consistent with the ECMAScript Bindings defined in the Web IDL specification, as this specification uses that specification’s terminology. [WEBIDL]

The events introduced by this specification implement the Event interface defined in the DOM Specification [DOM]. Implementations must therefore also support that specification.

2. Introduction

This section is non-normative.

This specification provides two new DOM events for obtaining information about the physical orientation and movement of the hosting device. The information provided by the events is not raw sensor data, but rather high-level data which is agnostic to the underlying source of information. Common sources of information include gyroscopes, compasses and accelerometers.

The first DOM event provided by the specification, deviceorientation, supplies the physical orientation of the device, expressed as a series of rotations from a local coordinate frame.

The second DOM event provided by this specification, devicemotion, supplies the acceleration of the device, expressed in Cartesian coordinates in a coordinate frame defined in the device. It also supplies the rotation rate of the device about the axes of that coordinate frame. Where practically possible, the event should provide the acceleration of the device’s center of mass.

The following code extracts illustrate basic use of the events.

Registering to receive deviceorientation events:
window.addEventListener("deviceorientation", function(event) {
    // process event.alpha, event.beta and event.gamma
}, true);
A device lying flat on a horizontal surface with the top of the screen pointing West has the following orientation:
{
  alpha: 90,
  beta: 0,
  gamma: 0
};

To get the compass heading, one would simply subtract alpha from 360 degrees: as the device is turned on the horizontal surface, the compass heading is (360 - alpha) degrees.
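
A non-normative sketch of that calculation, assuming the event carries absolute orientation data and a non-null alpha (the modulo keeps the result in [0, 360)):

window.addEventListener("deviceorientation", function(event) {
    if (event.absolute && event.alpha !== null) {
        var compassHeading = (360 - event.alpha) % 360; // degrees clockwise from North
        // use compassHeading
    }
}, true);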

A user is holding the device in their hand, with the screen in a vertical plane and the top of the screen pointing upwards. The value of beta is 90, irrespective of what alpha and gamma are.
A user facing a compass heading of alpha degrees is holding the device in their hand, with the screen in a vertical plane and the top of the screen pointing to their right. The orientation of the device is:
{
  alpha: 270 - alpha,
  beta: 0,
  gamma: 90
};
Registering to receive devicemotion events:
window.addEventListener("devicemotion", function(event) {
    // Process event.acceleration, event.accelerationIncludingGravity,
    // event.rotationRate and event.interval
}, true);
A device lying flat on a horizontal surface with the screen upmost has an acceleration of zero and the following value for accelerationIncludingGravity:
{
  x: 0,
  y: 0,
  z: 9.8
};
A device in free-fall, with the screen horizontal and upmost, has an accelerationIncludingGravity of zero and the following value for acceleration:
{
  x: 0,
  y: 0,
  z: -9.8
};
A device is mounted in a vehicle, with the screen in a vertical plane, the top uppermost and facing the rear of the vehicle. The vehicle is travelling at speed v around a right-hand bend of radius r. The device records a positive x component for both acceleration and accelerationIncludingGravity. The device also records a negative value for rotationRate.gamma:
{
  acceleration: {x: v^2/r, y: 0, z: 0},
  accelerationIncludingGravity: {x: v^2/r, y: 9.8, z: 0},
  rotationRate: {alpha: 0, beta: 0, gamma: -v/r*180/pi}
};
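
For instance, with the assumed values v = 20 m/s and r = 50 m, the magnitudes above work out to v^2/r = 400/50 = 8 m/s2 for the x components of acceleration and accelerationIncludingGravity, and -(v/r)(180/pi) = -(0.4)(57.3) ≈ -22.9 deg/s for rotationRate.gamma.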

3. Scope

This section is non-normative.

This specification is limited to providing DOM events for retrieving information describing the physical orientation and motion of the hosting device. The intended purpose of this API is to enable simple use cases such as those in the Use-Cases section. The scope of this specification does not include providing utilities to manipulate this data, such as transformation libraries. Nor does it include providing access to low-level sensor data, or direct control of these sensors.

4. Model

4.1. Device Orientation

This specification expresses a device’s physical orientation as a series of rotations relative to an implementation-defined reference coordinate frame.

The sequence of rotation steps is a set of intrinsic Tait-Bryan angles of type Z - X' - Y'' ([EULERANGLES]) that are applied on the device coordinate system defined in [ACCELEROMETER] and summarized below:

For a mobile device such as a phone or tablet, the device coordinate frame is defined relative to the screen in its standard orientation, typically portrait. This means that slide-out elements such as keyboards are not deployed, and swiveling elements such as displays are folded to their default position. If the orientation of the screen changes when the device is rotated or a slide-out keyboard is deployed, this does not affect the orientation of the coordinate frame relative to the device. For a laptop computer, the device coordinate frame is defined relative to the integrated keyboard.

Note: Users wishing to detect changes in screen orientation should refer to [SCREEN-ORIENTATION].

Rotations must use the right-hand convention, such that positive rotation around an axis is clockwise when viewed along the positive direction of the axis.

Note: the coordinate system used by this specification differs from CSS Transforms 2 § 4 The Transform Rendering Model, where the y axis is positive to the bottom and rotations follow the left-hand convention. Additionally, rotateSelf() and rotate(), specified in [GEOMETRY-1], apply rotations in a Z - Y' - X'' order, which differs from the order specified here.

A rotation represented by alpha, beta and gamma is carried out by the following steps:

  1. Rotate the device frame around its z axis by alpha degrees, with alpha in [0, 360).

    (Figure: Device in the initial position, with the reference (XYZ) and body (xyz) frames aligned.)
    (Figure: Device rotated through angle alpha about the z axis; the previous locations of the x and y axes are shown as x0 and y0.)
  2. Rotate the device frame around its x axis by beta degrees, with beta in [-180, 180).

    (Figure: Device rotated through angle beta about the new x axis; the previous locations of the y and z axes are shown as y0 and z0.)
  3. Rotate the device frame around its y axis by gamma degrees, with gamma in [-90, 90).

    (Figure: Device rotated through angle gamma about the new y axis; the previous locations of the x and z axes are shown as x0 and z0.)

Note: This choice of angles follows mathematical convention, but means that alpha is in the opposite sense to a compass heading. It also means that the angles do not match the roll-pitch-yaw convention used in vehicle dynamics.

4.1.1. Choice of reference coordinate system

A device’s orientation is always relative to another coordinate system, whose choice influences the kind of information that the orientation conveys as well as the source of the orientation data.

Relative orientation is measured with an accelerometer and a gyroscope, and the reference coordinate system is arbitrary. Consequently, the orientation data provides information about changes relative to the initial position of the device.

Note: In native platform terms, this is similar to a relative OrientationSensor on Windows, a game rotation vector sensor on Android, or the xArbitraryZVertical option for Core Motion.

Absolute orientation is measured with an accelerometer, a gyroscope and a magnetometer, and the reference coordinate system is the Earth’s reference coordinate system.

Note: In native platform terms, this is similar to an absolute OrientationSensor on Windows, a rotation vector sensor on Android, or the xMagneticNorthZVertical option for Core Motion.

4.2. Device Motion

This specification expresses a device’s motion in space by measuring its acceleration and rotation rate, which are obtained from an accelerometer and a gyroscope. The data is provided relative to the device coordinate system summarized in the previous section.

Acceleration is the rate of change of velocity of a device with respect to time. It is expressed in meters per second squared (m/s2).

Linear acceleration represents the device’s acceleration rate without the contribution of the gravity force. When the device is lying flat on a table, its linear acceleration is 0 m/s2.

Acceleration including gravity incorporates the effect of gravity and represents proper acceleration ([PROPERACCELERATION]). When the device is in free-fall, this acceleration is 0 m/s2. It is less useful in many applications, but is provided so that implementations that are unable to provide linear acceleration (for example, because they lack a gyroscope) can offer best-effort support.

Note: In practice, acceleration including gravity represents the raw readings obtained from an accelerometer (see Motion Sensors Explainer § Accelerometer), i.e. the [G-FORCE], whereas linear acceleration provides the readings of a linear acceleration sensor (see Motion Sensors Explainer § Linear Acceleration Sensor), which is likely a fusion sensor. [MOTION-SENSORS] and [ACCELEROMETER] both contain a more detailed discussion of the different types of accelerometers and accelerations that can be measured.
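
As a non-normative sketch of the relationship between the two quantities, the per-axis gravity contribution can be estimated by subtracting linear acceleration from acceleration including gravity (this assumes the implementation provides both quantities and that their components are non-null):

window.addEventListener("devicemotion", function(event) {
    var a = event.acceleration;
    var g = event.accelerationIncludingGravity;
    if (a === null || g === null) {
        return;
    }
    // Per-axis gravity contribution, in m/s2 (components assumed non-null)
    var gravityX = g.x - a.x;
    var gravityY = g.y - a.y;
    var gravityZ = g.z - a.z;
}, true);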

The rotation rate measures the rate at which the device rotates about a specified axis in the device coordinate system. As with device orientation, rotations must use the right-hand convention, such that positive rotation around an axis is clockwise when viewed along the positive direction of the axis. The rotation rate is measured in degrees per second (deg/s).

Note: [MOTION-SENSORS] and [GYROSCOPE] both contain a more detailed discussion of gyroscopes, rotation rates and measurements.

5. Permissions Policy integration

This specification defines the following policy-controlled features: "accelerometer", "gyroscope" and "magnetometer".

Note: The policy-controlled features above are checked by the event-firing steps in § 8 API: if the document is not allowed to use every feature required for a given event, that event is not fired.

6. Permissions API integration

This specification defines the following default powerful features: "accelerometer", "gyroscope" and "magnetometer".

Note: The powerful features above are used by the requestPermission() methods and by the permission state checks performed before deviceorientation, deviceorientationabsolute and devicemotion events are fired.

7. Task Source

The task source for the tasks mentioned in this specification is the device motion and orientation task source.

8. API

8.1. deviceorientation Event

partial interface Window {
    [SecureContext] attribute EventHandler ondeviceorientation;
};

[Exposed=Window, SecureContext]
interface DeviceOrientationEvent : Event {
    constructor(DOMString type, optional DeviceOrientationEventInit eventInitDict = {});
    readonly attribute double? alpha;
    readonly attribute double? beta;
    readonly attribute double? gamma;
    readonly attribute boolean absolute;

    static Promise<PermissionState> requestPermission(optional boolean absolute = false);
};

dictionary DeviceOrientationEventInit : EventInit {
    double? alpha = null;
    double? beta = null;
    double? gamma = null;
    boolean absolute = false;
};

The ondeviceorientation attribute is an event handler IDL attribute for the ondeviceorientation event handler, whose event handler event type is deviceorientation.

The alpha attribute must return the value it was initialized to. It represents the rotation around the Z axis in the Z - X' - Y'' intrinsic Tait-Bryan angles described in § 4.1 Device Orientation.

The beta attribute must return the value it was initialized to. It represents the rotation around the X' axis (produced after the rotation around the Z axis has been applied) in the Z - X' - Y'' intrinsic Tait-Bryan angles described in § 4.1 Device Orientation.

The gamma attribute must return the value it was initialized to. It represents the rotation around the Y'' axis (produced after the rotation around the Z and X' axes have been applied in this order) in the Z - X' - Y'' intrinsic Tait-Bryan angles described in § 4.1 Device Orientation.

The absolute attribute must return the value it was initialized to. It indicates whether relative orientation or absolute orientation data is being provided.
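
For instance, a test or polyfill could construct and dispatch a synthetic event using the constructor and init dictionary above (a non-normative sketch; the numeric values are arbitrary):

window.dispatchEvent(new DeviceOrientationEvent("deviceorientation", {
    alpha: 90,
    beta: 0,
    gamma: 0,
    absolute: false
}));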

The requestPermission(absolute) method steps are:
  1. Let global be the current global object.

  2. Let hasTransientActivation be true if this’s relevant global object has transient activation, and false otherwise.

  3. Let result be a new promise in this’s relevant Realm.

  4. Run these steps in parallel:

    1. If absolute is true:

      1. Let permissions be « "accelerometer", "gyroscope", "magnetometer" ».

    2. Otherwise:

      1. Let permissions be « "accelerometer", "gyroscope" ».

    3. For each name of permissions:

      1. If name’s permission state is "prompt" and hasTransientActivation is false:

        1. Queue a global task on the device motion and orientation task source given global to reject result with a "NotAllowedError" DOMException.

        2. Return.

    4. Let permissionState be "granted".

    5. For each name of permissions:

      Note: There is no algorithm for requesting multiple permissions at once. However, user agents are encouraged to bundle concurrent requests for the different kinds of sensor data into a single user-facing permission prompt.

      1. If the result of requesting permission to use name is not "granted":

        1. Set permissionState to "denied".

        2. Break

    6. Queue a global task on the device motion and orientation task source given global to resolve result with permissionState.

  5. Return result.
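
As a non-normative illustration of these steps, a page could call the method from a click handler (so that transient activation is present) and register a listener once the promise resolves to "granted"; someButton and handleOrientation are hypothetical names, and the typeof check guards against implementations that do not expose the method:

someButton.addEventListener("click", function() {
    if (typeof DeviceOrientationEvent.requestPermission === "function") {
        DeviceOrientationEvent.requestPermission().then(function(state) {
            if (state === "granted") {
                window.addEventListener("deviceorientation", handleOrientation, true);
            }
        });
    } else {
        // No requestPermission(): assume data is available without an explicit prompt
        window.addEventListener("deviceorientation", handleOrientation, true);
    }
});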

To fire an orientation event given an event name (a string), window (a Window) and absolute (a boolean):
  1. Let orientation be null.

  2. Let topLevelTraversable be window’s navigable’s top-level traversable.

  3. Let virtualSensorType be "relative-orientation" if absolute is false, and "absolute-orientation" otherwise.

  4. If topLevelTraversable’s virtual sensor mapping contains virtualSensorType:

    1. Let virtualSensor be topLevelTraversable’s virtual sensor mapping[virtualSensorType].

    2. If virtualSensor’s can provide readings flag is true:

      1. Set orientation to the latest readings provided to virtualSensor with the "alpha", "beta", and "gamma" keys.

  5. Otherwise:

    1. If absolute is false:

      1. Set orientation to the device’s relative orientation in three-dimensional space.

    2. Otherwise:

      1. Set orientation to the device’s absolute orientation in three-dimensional space.

  6. Let permissions be null.

  7. If absolute is false:

    1. Set permissions to « "accelerometer", "gyroscope" ».

  8. Otherwise:

    1. Set permissions to « "accelerometer", "gyroscope", "magnetometer" ».

  9. Let environment be window’s relevant settings object.

  10. Run these steps in parallel:

    1. For each permission name in permissions:

      1. Let state be the result of getting the current permission state with permission name and environment.

      2. If state is not "granted", return.

    2. Queue a global task on the device motion and orientation task source given window to run the following steps:

      1. Let z rotation be orientation’s representation as intrinsic Tait-Bryan angles Z - X' - Y'' along the Z axis, or null if the implementation cannot provide an angle value.

      2. If z rotation is not null, limit z rotation’s precision to 0.1 degrees.

      3. Let x rotation be orientation’s representation as intrinsic Tait-Bryan angles Z - X' - Y'' along the X' axis, or null if the implementation cannot provide an angle value.

      4. If x rotation is not null, limit x rotation’s precision to 0.1 degrees.

      5. Let y rotation be orientation’s representation as intrinsic Tait-Bryan angles Z - X' - Y'' along the Y'' axis, or null if the implementation cannot provide an angle value.

      6. If y rotation is not null, limit y rotation’s precision to 0.1 degrees.

      7. Fire an event named event name at window, using DeviceOrientationEvent, with the alpha attribute initialized to z rotation, the beta attribute initialized to x rotation, the gamma attribute initialized to y rotation, and the absolute attribute initialized to absolute.

A significant change in orientation indicates a difference in orientation values compared to the previous ones that warrants the firing of a deviceorientation or deviceorientationabsolute event. The process of determining whether a significant change in orientation has occurred is implementation-defined, though a maximum threshold for change of 1 degree is recommended. Implementations may also consider that it has occurred if they have reason to believe that the page does not have sufficiently fresh data.

Note: Implementations must take § 11 Automation into account to determine whether a significant change in orientation has occurred, so that a virtual sensor reading update causes it to be assessed.

Whenever a significant change in orientation occurs, the user agent must execute the following steps on a navigable’s active window window:
  1. Let document be window’s associated Document.

  2. If document’s visibility state is not "visible", return.

  3. If the implementation cannot provide relative orientation or the resulting absolute orientation data is more accurate:

    1. Let absolute be true.

    2. Let policies be « "accelerometer", "gyroscope", "magnetometer" ».

  4. Otherwise:

    1. Let absolute be false.

    2. Let policies be « "accelerometer", "gyroscope" ».

  5. For each policy of policies:

    1. If document is not allowed to use the policy-controlled feature named policy, return.

  6. Invoke fire an orientation event with deviceorientation, window, and absolute.

If an implementation can never provide orientation information, the event should be fired with the alpha, beta and gamma attributes set to null, and the absolute attribute set to false.

8.2. deviceorientationabsolute Event

The deviceorientationabsolute event and its ondeviceorientationabsolute event handler IDL attribute have limited implementation experience.
partial interface Window {
    [SecureContext] attribute EventHandler ondeviceorientationabsolute;
};

The ondeviceorientationabsolute attribute is an event handler IDL attribute for the ondeviceorientationabsolute event handler, whose event handler event type is deviceorientationabsolute.

A deviceorientationabsolute event is completely analogous to the deviceorientation event, except that it must always provide absolute orientation data.

Whenever a significant change in orientation occurs, the user agent must execute the following steps on a navigable’s active window window:
  1. Invoke fire an orientation event with deviceorientationabsolute, window, and true.

If an implementation can never provide absolute orientation information, the event should be fired with the alpha, beta and gamma attributes set to null, and the absolute attribute set to true.
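
Registration mirrors the deviceorientation example in the Introduction; only the event name differs (a non-normative sketch):

window.addEventListener("deviceorientationabsolute", function(event) {
    // event.absolute is always true here; process event.alpha, event.beta
    // and event.gamma as absolute orientation angles
}, true);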

8.3. devicemotion Event

8.3.1. The DeviceMotionEventAcceleration interface

[Exposed=Window, SecureContext]
interface DeviceMotionEventAcceleration {
    readonly attribute double? x;
    readonly attribute double? y;
    readonly attribute double? z;
};

The DeviceMotionEventAcceleration interface represents the device’s acceleration as described in § 4.2 Device Motion. It has the following associated data:

x axis acceleration

The device’s acceleration rate along the X axis, or null. It is initially null.

y axis acceleration

The device’s acceleration rate along the Y axis, or null. It is initially null.

z axis acceleration

The device’s acceleration rate along the Z axis, or null. It is initially null.

The x getter steps are to return the value of this’s x axis acceleration.

The y getter steps are to return the value of this’s y axis acceleration.

The z getter steps are to return the value of this’s z axis acceleration.

8.3.2. The DeviceMotionEventRotationRate interface

[Exposed=Window, SecureContext]
interface DeviceMotionEventRotationRate {
    readonly attribute double? alpha;
    readonly attribute double? beta;
    readonly attribute double? gamma;
};

The DeviceMotionEventRotationRate interface represents the device’s rotation rate as described in § 4.2 Device Motion. It has the following associated data:

x axis rotation rate

The device’s rotation rate about the X axis, or null. It is initially null.

y axis rotation rate

The device’s rotation rate about the Y axis, or null. It is initially null.

z axis rotation rate

The device’s rotation rate about the Z axis, or null. It is initially null.

The alpha getter steps are to return the value of this’s z axis rotation rate.

The beta getter steps are to return the value of this’s x axis rotation rate.

The gamma getter steps are to return the value of this’s y axis rotation rate.

8.3.3. The DeviceMotionEvent interface

partial interface Window {
    [SecureContext] attribute EventHandler ondevicemotion;
};

[Exposed=Window, SecureContext]
interface DeviceMotionEvent : Event {
    constructor(DOMString type, optional DeviceMotionEventInit eventInitDict = {});
    readonly attribute DeviceMotionEventAcceleration? acceleration;
    readonly attribute DeviceMotionEventAcceleration? accelerationIncludingGravity;
    readonly attribute DeviceMotionEventRotationRate? rotationRate;
    readonly attribute double interval;

    static Promise<PermissionState> requestPermission();
};

dictionary DeviceMotionEventAccelerationInit {
    double? x = null;
    double? y = null;
    double? z = null;
};

dictionary DeviceMotionEventRotationRateInit {
    double? alpha = null;
    double? beta = null;
    double? gamma = null;
};

dictionary DeviceMotionEventInit : EventInit {
    DeviceMotionEventAccelerationInit acceleration;
    DeviceMotionEventAccelerationInit accelerationIncludingGravity;
    DeviceMotionEventRotationRateInit rotationRate;
    double interval = 0;
};

The ondevicemotion attribute is an event handler IDL attribute for the ondevicemotion event handler, whose event handler event type is devicemotion.

The acceleration attribute must return the value it was initialized to. When the object is created, this attribute must be initialized to null. It represents the device’s linear acceleration.

The accelerationIncludingGravity attribute must return the value it was initialized to. When the object is created, this attribute must be initialized to null. It represents the device’s acceleration with gravity.

The rotationRate attribute must return the value it was initialized to. When the object is created, this attribute must be initialized to null. It represents the device’s rotation rate.

The interval attribute must return the value it was initialized to. It represents the interval at which data is obtained from the underlying hardware and must be expressed in milliseconds (ms). It is constant to simplify filtering of the data by the Web application.
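
As a non-normative sketch of how the constant interval can be used for such filtering or integration, the snippet below accumulates rotationRate.alpha over time to estimate the total rotation about the z axis (it assumes rotationRate and its alpha value are non-null):

var totalRotationZ = 0; // degrees

window.addEventListener("devicemotion", function(event) {
    if (event.rotationRate !== null && event.rotationRate.alpha !== null) {
        // rotation rate is in deg/s, interval in ms
        totalRotationZ += event.rotationRate.alpha * (event.interval / 1000);
    }
}, true);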

The requestPermission() method steps are:
  1. Let global be the current global object.

  2. Let hasTransientActivation be true if this’s relevant global object has transient activation, and false otherwise.

  3. Let result be a new promise in this’s relevant Realm.

  4. Run these steps in parallel:

    1. Let permissions be « "accelerometer", "gyroscope" ».

    2. For each name of permissions:

      1. If name’s permission state is "prompt" and hasTransientActivation is false:

        1. Queue a global task on the device motion and orientation task source given global to reject result with a "NotAllowedError" DOMException.

        2. Return.

    3. Let permissionState be "granted".

    4. For each name of permissions:

      Note: There is no algorithm for requesting multiple permissions at once. However, user agents are encouraged to bundle concurrent requests for the different kinds of sensor data into a single user-facing permission prompt.

      1. If the result of requesting permission to use name is not "granted":

        1. Set permissionState to "denied".

        2. Break

    5. Queue a global task on the device motion and orientation task source given global to resolve result with permissionState.

  5. Return result.
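
DeviceMotionEvent.requestPermission() is used in the same way as DeviceOrientationEvent.requestPermission(). A non-normative sketch, assuming the call happens inside a click handler so that transient activation is present (handleMotion is a hypothetical listener):

DeviceMotionEvent.requestPermission().then(function(state) {
    if (state === "granted") {
        window.addEventListener("devicemotion", handleMotion, true);
    }
});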

At an implementation-defined interval interval, the user agent must execute the following steps on a navigable’s active window window:
  1. Let document be window’s associated Document.

  2. If document’s visibility state is not "visible", return.

  3. For each policy of « "accelerometer", "gyroscope" »:

    1. If document is not allowed to use the policy-controlled feature named policy, return.

  4. Let topLevelTraversable be window’s navigable’s top-level traversable.

  5. Let platformLinearAcceleration be null.

  6. If topLevelTraversable’s virtual sensor mapping contains "linear-acceleration":

    1. Let virtualSensor be topLevelTraversable’s virtual sensor mapping["linear-acceleration"].

    2. If virtualSensor’s can provide readings flag is true, then set platformLinearAcceleration to the latest readings provided to virtualSensor.

  7. Otherwise, if the implementation is able to provide linear acceleration:

    1. Set platformLinearAcceleration to the device’s linear acceleration along the X, Y and Z axes.

  8. Let acceleration be null.

  9. If platformLinearAcceleration is not null:

    1. Set acceleration to a new DeviceMotionEventAcceleration created in window’s realm.

    2. Set acceleration’s x axis acceleration to platformLinearAcceleration’s value along the X axis, or null if it cannot be provided.

    3. If acceleration’s x axis acceleration is not null, limit its precision to no more than 0.1 m/s2.

    4. Set acceleration’s y axis acceleration to platformLinearAcceleration’s value along the Y axis, or null if it cannot be provided.

    5. If acceleration’s y axis acceleration is not null, limit its precision to no more than 0.1 m/s2.

    6. Set acceleration’s z axis acceleration to platformLinearAcceleration’s value along the Z axis, or null if it cannot be provided.

    7. If acceleration’s z axis acceleration is not null, limit its precision to no more than 0.1 m/s2.

  10. Let platformAccelerationIncludingGravity be null.

  11. If topLevelTraversable’s virtual sensor mapping contains "accelerometer":

    1. Let virtualSensor be topLevelTraversable’s virtual sensor mapping["accelerometer"].

    2. If virtualSensor’s can provide readings flag is true, then set platformAccelerationIncludingGravity to the latest readings provided to virtualSensor.

  12. Otherwise, if the implementation is able to provide acceleration with gravity:

    1. Set platformAccelerationIncludingGravity to the device’s acceleration with gravity along the X, Y and Z axes.

  13. Let accelerationIncludingGravity be null.

  14. If platformAccelerationIncludingGravity is not null:

    1. Set accelerationIncludingGravity to a new DeviceMotionEventAcceleration created in window’s realm.

    2. Set accelerationIncludingGravity’s x axis acceleration to platformAccelerationIncludingGravity’s value along the X axis, or null if it cannot be provided.

    3. If accelerationIncludingGravity’s x axis acceleration is not null, limit its precision to no more than 0.1 m/s2.

    4. Set accelerationIncludingGravity’s y axis acceleration to platformAccelerationIncludingGravity’s value along the Y axis, or null if it cannot be provided.

    5. If accelerationIncludingGravity’s y axis acceleration is not null, limit its precision to no more than 0.1 m/s2.

    6. Set accelerationIncludingGravity’s z axis acceleration to platformAccelerationIncludingGravity’s value along the Z axis, or null if it cannot be provided.

    7. If accelerationIncludingGravity’s z axis acceleration is not null, limit its precision to no more than 0.1 m/s2.

  15. Let platformRotationRate be null.

  16. If topLevelTraversable’s virtual sensor mapping contains "gyroscope":

    1. Let virtualSensor be topLevelTraversable’s virtual sensor mapping["gyroscope"].

    2. If virtualSensor’s can provide readings flag is true, then set platformRotationRate to the latest readings provided to virtualSensor.

  17. Otherwise, if the implementation is able to provide rotation rate:

    1. Set platformRotationRate to the device’s rotation rate about the X, Y and Z axes.

  18. Let rotationRate be null.

  19. If platformRotationRate is not null:

    1. Set rotationRate to a new DeviceMotionEventRotationRate created in window’s realm.

    2. Set rotationRate’s x axis rotation rate to platformRotationRate’s value about the X axis, or null if it cannot be provided.

    3. If rotationRate’s x axis rotation rate is not null, limit its precision to no more than 0.1 deg/s.

    4. Set rotationRate’s y axis rotation rate to platformRotationRate’s value about the Y axis, or null if it cannot be provided.

    5. If rotationRate’s y axis rotation rate is not null, limit its precision to no more than 0.1 deg/s.

    6. Set rotationRate’s z axis rotation rate to platformRotationRate’s value about the Z axis, or null if it cannot be provided.

    7. If rotationRate’s z axis rotation rate is not null, limit its precision to no more than 0.1 deg/s.

  20. Let environment be window’s relevant settings object.

  21. Run these steps in parallel:

    1. For each permission name in « "accelerometer", "gyroscope" »:

      1. Let state be the result of getting the current permission state with permission name and environment.

      2. If state is not "granted", return.

    2. Queue a global task on the device motion and orientation task source given window to run the following steps:

      1. Fire an event named devicemotion at window, using DeviceMotionEvent, with the acceleration attribute initialized to acceleration, the accelerationIncludingGravity attribute initialized to accelerationIncludingGravity, the rotationRate attribute initialized to rotationRate, and the interval attribute initialized to interval.

If an implementation can never provide motion information, the event should be fired with the acceleration, accelerationIncludingGravity and rotationRate attributes set to null.

9. Security and privacy considerations

The API defined in this specification can be used to obtain information from hardware sensors such as the accelerometer, gyroscope and magnetometer. The provided data may be considered sensitive and could become the subject of attacks from malicious web pages. The calibration of accelerometers, gyroscopes and magnetometers may reveal persistent details about the particular sensor hardware [SENSORID]. The main attack vectors can be categorized into the following categories:

In light of that, implementations may consider visual indicators to signify the use of sensors by the web page. Additionally, this specification requires users to give express permission for the user agent to provide device motion and/or orientation data via the requestPermission() API calls.

Furthermore, to minimize privacy risks, the chance of fingerprinting, and other attacks, implementations must:

Additionally, implementing these items may also have a beneficial impact on the battery life of mobile devices.

Further implementation experience is being gathered to inform the limit for the maximum sampling frequency cap.

10. Use-Cases and Requirements

10.1. Use-Cases

This section is non-normative.

10.1.1. Controlling a game

This section is non-normative.

A gaming Web application monitors the device’s orientation and interprets tilting in a certain direction as a means to control an on-screen sprite.

10.1.2. Gesture recognition

This section is non-normative.

A Web application monitors the device’s acceleration and applies signal processing in order to recognize certain specific gestures. For example, using a shaking gesture to clear a web form.
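
A minimal, non-normative sketch of such recognition flags a shake whenever the magnitude of accelerationIncludingGravity deviates from 1 g by more than a chosen threshold; the 5 m/s2 threshold and the clearForm() callback are illustrative assumptions:

var SHAKE_THRESHOLD = 5; // m/s2 away from 1 g, chosen for illustration

window.addEventListener("devicemotion", function(event) {
    var a = event.accelerationIncludingGravity;
    if (a === null || a.x === null || a.y === null || a.z === null) {
        return;
    }
    var magnitude = Math.sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
    if (Math.abs(magnitude - 9.8) > SHAKE_THRESHOLD) {
        clearForm(); // hypothetical application callback
    }
}, true);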

10.1.3. Mapping

This section is non-normative.

A mapping Web application uses the device’s orientation to correctly align the map with reality.

10.2. Requirements

This section is non-normative.

11. Automation

This specification can pose a challenge to test authors, as the events defined here depend on the presence of physical hardware whose readings cannot be easily controlled.

To address this challenge, this document builds upon the [WEBDRIVER2] extension commands and infrastructure laid out by Generic Sensor API § 9 Automation. This was chosen over the option of developing completely new and independent infrastructure with separate extension commands because there is significant overlap between the two specifications: not only does testing the [GENERIC-SENSOR] specification present similar challenges, but many derived APIs (e.g. [GYROSCOPE]) obtain and provide similar information.

This specification only requires implementations to support the Generic Sensor API § 9 Automation section of the [GENERIC-SENSOR] specification, not its interfaces and events.

11.1. Device Orientation Automation

Automation support for the deviceorientation event is built upon virtual sensors that represent accelerometers, gyroscopes and, optionally, magnetometers.

Orientation data retrieved from the platform by the user agent comes from accelerometers, gyroscopes and, optionally, magnetometers. In contrast to motion data, however, these lower-level readings must be transformed into Euler angles in the form described in § 4.1 Device Orientation. Furthermore, the platform might provide extra APIs to the user agent that already perform some of those conversions from raw acceleration and rotation data.

Therefore, instead of requiring implementations (and automation users) to provide orientation readings via lower-level virtual sensors which use different units of measurement, this specification defines extra virtual sensor types for relative and absolute orientation data in the format used by this specification.

11.1.1. Parse orientation data reading algorithm

To perform the parse orientation data reading algorithm, given a JSON Object parameters:
  1. Let alpha be the result of invoking get a property from parameters with "alpha".

  2. If alpha is not a Number, or its value is NaN, +∞, or −∞, return undefined.

  3. If alpha is not in the range [0, 360), then return undefined.

  4. Let beta be the result of invoking get a property from parameters with "beta".

  5. If beta is not a Number, or its value is NaN, +∞, or −∞, return undefined.

  6. If beta is not in the range [-180, 180), then return undefined.

  7. Let gamma be the result of invoking get a property from parameters with "gamma".

  8. If gamma is not a Number, or its value is NaN, +∞, or −∞, return undefined.

  9. If gamma is not in the range [-90, 90), then return undefined.

  10. Let reading be a new map.

  11. Set reading["alpha"] to alpha.

  12. Set reading["beta"] to beta.

  13. Set reading["gamma"] to gamma.

  14. Return reading.

Note: reading is defined as a map in the algorithm above to prevent a dependency on the sensor reading concept from the [GENERIC-SENSOR] specification. They should be interchangeable for the purposes of the algorithm above.
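
For example, under the algorithm above the parameters object

{
  "alpha": 90,
  "beta": 0,
  "gamma": 0
}

parses into a reading, because each value is a finite Number within its allowed range, whereas replacing the value of "beta" with 200 makes the algorithm return undefined, because 200 falls outside [-180, 180).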

11.1.2. The "absolute-orientation" virtual sensor type

The per-type virtual sensor metadata map must have the following entry:

key

"absolute-orientation"

value

A virtual sensor metadata whose reading parsing algorithm is parse orientation data reading.

11.1.3. The "relative-orientation" virtual sensor type

The per-type virtual sensor metadata map must have the following entry:

key

"relative-orientation"

value

A virtual sensor metadata whose reading parsing algorithm is parse orientation data reading.

11.2. Device Motion Automation

The motion data retrieved from the platform by the user agent comes from accelerometers and gyroscopes. This specification defines certain per-type virtual sensor metadata entries that are shared with the [ACCELEROMETER] and [GYROSCOPE] specifications.

Accelerometer virtual sensors are used to provide acceleration with gravity data to the platform. Linear Acceleration virtual sensors are used to provide linear acceleration data to the platform. Gyroscope virtual sensors are used to provide rotation rate data to the platform.

11.2.1. The "accelerometer" virtual sensor type

The per-type virtual sensor metadata map must have the following entry:

key

"accelerometer"

value

A virtual sensor metadata whose reading parsing algorithm is parse xyz reading.

11.2.2. The "linear-acceleration" virtual sensor type

The per-type virtual sensor metadata map must have the following entry:

key

"linear-acceleration"

value

A virtual sensor metadata whose reading parsing algorithm is parse xyz reading.

11.2.3. The "gyroscope" virtual sensor type

The per-type virtual sensor metadata map must have the following entry:

key

"gyroscope"

value

A virtual sensor metadata whose reading parsing algorithm is parse xyz reading.

A. Examples

This section is non-normative.

A.1 Calculating compass heading

This section is non-normative.

The following worked example is intended as an aid to users of the DeviceOrientation event.

The Introduction section provided an example of using the DeviceOrientation event to obtain a compass heading when the device is held with the screen horizontal. This example shows how to determine the compass heading that the user is facing when holding the device with the screen approximately vertical in front of them. An application of this is an augmented-reality system.

More precisely, we wish to determine the compass heading of the horizontal component of a vector which is orthogonal to the device’s screen and pointing out of the back of the screen.

If v represents this vector in the rotated device body frame xyz, then v is as follows.

v = [0; 0; -1]

The transformation of v due to the rotation about the z axis can be represented by the following rotation matrix.

Z = [cos(alpha) -sin(alpha) 0; sin(alpha) cos(alpha) 0; 0 0 1]

The transformation of v due to the rotation about the x axis can be represented by the following rotation matrix.

X = [1 0 0; 0 cos(beta) -sin(beta); 0 sin(beta) cos(beta)]

The transformation of v due to the rotation about the y axis can be represented by the following rotation matrix.

Y = [cos(gamma) 0 sin(gamma); 0 1 0; -sin(gamma) 0 cos(gamma)]

If R represents the full rotation matrix of the device in the earth frame XYZ, then since the initial body frame is aligned with the earth, R is as follows.

R = ZXY = [[cos(alpha) cos(gamma)-sin(alpha) sin(beta) sin(gamma), -cos(beta) sin(alpha), cos(gamma) sin(alpha) sin(beta)+cos(alpha) sin(gamma)], [cos(gamma) sin(alpha)+cos(alpha) sin(beta) sin(gamma), cos(alpha) cos(beta), sin(alpha) sin(gamma)-cos(alpha) cos(gamma) sin(beta)], [-cos(beta) sin(gamma), sin(beta), cos(beta) cos(gamma)]]

If v' represents the vector v in the earth frame XYZ, then since the initial body frame is aligned with the earth, v' is as follows.

v' = Rv

v' = [-cos(alpha)sin(gamma)-sin(alpha)sin(beta)cos(gamma); -sin(alpha)sin(gamma)+cos(alpha)sin(beta)cos(gamma); -cos(beta)cos(gamma)]

The compass heading θ is given by

theta = atan((v'_x)/(v'_y)) = atan((-cos(alpha)sin(gamma)-sin(alpha)sin(beta)cos(gamma))/(-sin(alpha)sin(gamma)+cos(alpha)sin(beta)cos(gamma)))

provided that β and γ are not both zero.

The compass heading calculation above can be represented in JavaScript as follows; it returns the correct compass heading when the provided parameters are defined, not null, and represent absolute values.

var degtorad = Math.PI / 180; // Degree-to-Radian conversion

function compassHeading( alpha, beta, gamma ) {

  var _x = beta  ? beta  * degtorad : 0; // beta value
  var _y = gamma ? gamma * degtorad : 0; // gamma value
  var _z = alpha ? alpha * degtorad : 0; // alpha value

  var cX = Math.cos( _x );
  var cY = Math.cos( _y );
  var cZ = Math.cos( _z );
  var sX = Math.sin( _x );
  var sY = Math.sin( _y );
  var sZ = Math.sin( _z );

  // Calculate Vx and Vy components
  var Vx = - cZ * sY - sZ * sX * cY;
  var Vy = - sZ * sY + cZ * sX * cY;

  // Calculate compass heading
  var compassHeading = Math.atan( Vx / Vy );

  // Convert compass heading to use whole unit circle
  if( Vy < 0 ) {
    compassHeading += Math.PI;
  } else if( Vx < 0 ) {
    compassHeading += 2 * Math.PI;
  }

  return compassHeading * ( 180 / Math.PI ); // Compass Heading (in degrees)

}

As a consistency check, if we set γ = 0, then

theta = atan(-sin(alpha)sin(beta)/cos(alpha)sin(beta)) = -alpha

as expected.

Alternatively, if we set β = 90, then

theta = atan((-cos(alpha)sin(gamma)-sin(alpha)cos(gamma))/(-sin(alpha)sin(gamma)+cos(alpha)cos(gamma)))
      = atan(-sin(alpha+gamma)/cos(alpha+gamma))
      = -(alpha+gamma)

as expected.

A.2 Alternate device orientation representations

This section is non-normative.

Describing orientation using Tait-Bryan angles can have some disadvantages such as introducing gimbal lock [GIMBALLOCK]. Depending on the intended application it can be useful to convert the Device Orientation values to other rotation representations.

The first alternate orientation representation uses rotation matrices. By combining the component rotation matrices provided in the worked example above we can represent the orientation of the device body frame as a combined rotation matrix.

If R represents the rotation matrix of the device in the earth frame XYZ, then since the initial body frame is aligned with the earth, R is as follows.

R = ZXY = [[cos(alpha) cos(gamma)-sin(alpha) sin(beta) sin(gamma), -cos(beta) sin(alpha), cos(gamma) sin(alpha) sin(beta)+cos(alpha) sin(gamma)], [cos(gamma) sin(alpha)+cos(alpha) sin(beta) sin(gamma), cos(alpha) cos(beta), sin(alpha) sin(gamma)-cos(alpha) cos(gamma) sin(beta)], [-cos(beta) sin(gamma), sin(beta), cos(beta) cos(gamma)]]
The above combined rotation matrix can be represented in JavaScript as follows, provided the passed parameters are defined, not null, and represent absolute values.
var degtorad = Math.PI / 180; // Degree-to-Radian conversion

function getRotationMatrix( alpha, beta, gamma ) {

  var _x = beta  ? beta  * degtorad : 0; // beta value
  var _y = gamma ? gamma * degtorad : 0; // gamma value
  var _z = alpha ? alpha * degtorad : 0; // alpha value

  var cX = Math.cos( _x );
  var cY = Math.cos( _y );
  var cZ = Math.cos( _z );
  var sX = Math.sin( _x );
  var sY = Math.sin( _y );
  var sZ = Math.sin( _z );

  //
  // ZXY rotation matrix construction.
  //

  var m11 = cZ * cY - sZ * sX * sY;
  var m12 = - cX * sZ;
  var m13 = cY * sZ * sX + cZ * sY;

  var m21 = cY * sZ + cZ * sX * sY;
  var m22 = cZ * cX;
  var m23 = sZ * sY - cZ * cY * sX;

  var m31 = - cX * sY;
  var m32 = sX;
  var m33 = cX * cY;

  return [
    m11,    m12,    m13,
    m21,    m22,    m23,
    m31,    m32,    m33
  ];

};

Another alternate representation of device orientation data uses quaternions [QUATERNIONS].

If q represents the unit quaternion of the device in the earth frame XYZ, then since the initial body frame is aligned with the earth, q is as follows.

q = [[q_w], [q_x], [q_y], [q_z]] = [[cos(beta/2)cos(gamma/2)cos(alpha/2) - sin(beta/2)sin(gamma/2)sin(alpha/2)], [sin(beta/2)cos(gamma/2)cos(alpha/2) - cos(beta/2)sin(gamma/2)sin(alpha/2)], [cos(beta/2)sin(gamma/2)cos(alpha/2) + sin(beta/2)cos(gamma/2)sin(alpha/2)], [cos(beta/2)cos(gamma/2)sin(alpha/2) + sin(beta/2)sin(gamma/2)cos(alpha/2)]]
The above quaternion can be represented in JavaScript as follows, provided the passed parameters are defined, not null, and represent absolute values.
var degtorad = Math.PI / 180; // Degree-to-Radian conversion

function getQuaternion( alpha, beta, gamma ) {

  var _x = beta  ? beta  * degtorad : 0; // beta value
  var _y = gamma ? gamma * degtorad : 0; // gamma value
  var _z = alpha ? alpha * degtorad : 0; // alpha value

  var cX = Math.cos( _x/2 );
  var cY = Math.cos( _y/2 );
  var cZ = Math.cos( _z/2 );
  var sX = Math.sin( _x/2 );
  var sY = Math.sin( _y/2 );
  var sZ = Math.sin( _z/2 );

  //
  // ZXY quaternion construction.
  //

  var w = cX * cY * cZ - sX * sY * sZ;
  var x = sX * cY * cZ - cX * sY * sZ;
  var y = cX * sY * cZ + sX * cY * sZ;
  var z = cX * cY * sZ + sX * sY * cZ;

  return [ w, x, y, z ];

}

We can check that a Unit Quaternion has been constructed correctly using Lagrange’s four-square theorem

q_w^2 + q_x^2 + q_y^2 + q_z^2 = 1

as expected.
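
A non-normative sketch of that check in JavaScript, applied to the array returned by getQuaternion() above (the small tolerance allows for floating-point rounding):

function isUnitQuaternion( q ) {
  var normSquared = q[0] * q[0] + q[1] * q[1] + q[2] * q[2] + q[3] * q[3];
  return Math.abs( normSquared - 1 ) < 1e-9;
}

// isUnitQuaternion( getQuaternion( 45, 30, 15 ) ) returns true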

Acknowledgments

Lars Erik Bolstad, Dean Jackson, Claes Nilsson, George Percivall, Doug Turner, Matt Womer, Chris Dumez

12. Changes

This section summarizes substantial changes and notable editorial improvements to guide review. Full details are available from the commit log. Changes since the Candidate Recommendation 2016-08-18:


References

Normative References

[ACCELEROMETER]
Anssi Kostiainen. Accelerometer. 8 January 2024. CR. URL: https://www.w3.org/TR/accelerometer/
[DOM]
Anne van Kesteren. DOM Standard. Living Standard. URL: https://dom.spec.whatwg.org/
[ECMASCRIPT]
ECMAScript Language Specification. URL: https://tc39.es/ecma262/multipage/
[GENERIC-SENSOR]
Rick Waldron. Generic Sensor API. 25 January 2024. CR. URL: https://www.w3.org/TR/generic-sensor/
[HTML]
Anne van Kesteren; et al. HTML Standard. Living Standard. URL: https://html.spec.whatwg.org/multipage/
[INFRA]
Anne van Kesteren; Domenic Denicola. Infra Standard. Living Standard. URL: https://infra.spec.whatwg.org/
[ORIENTATION-SENSOR]
Kenneth Christiansen; Anssi Kostiainen. Orientation Sensor. 10 January 2024. WD. URL: https://www.w3.org/TR/orientation-sensor/
[PERMISSIONS]
Marcos Caceres; Mike Taylor. Permissions. 16 January 2024. WD. URL: https://www.w3.org/TR/permissions/
[PERMISSIONS-POLICY-1]
Ian Clelland. Permissions Policy. 18 December 2023. WD. URL: https://www.w3.org/TR/permissions-policy-1/
[RFC2119]
S. Bradner. Key words for use in RFCs to Indicate Requirement Levels. March 1997. Best Current Practice. URL: https://datatracker.ietf.org/doc/html/rfc2119
[WEBDRIVER2]
Simon Stewart; David Burns. WebDriver. 23 January 2024. WD. URL: https://www.w3.org/TR/webdriver2/
[WEBIDL]
Edgar Chen; Timothy Gu. Web IDL Standard. Living Standard. URL: https://webidl.spec.whatwg.org/

Informative References

[CSS-TRANSFORMS-2]
Tab Atkins Jr.; et al. CSS Transforms Module Level 2. 9 November 2021. WD. URL: https://www.w3.org/TR/css-transforms-2/
[EULERANGLES]
Euler Angles. URL: https://en.wikipedia.org/wiki/Euler_angles
[FINGERPRINT]
Mobile Device Identification via Sensor Fingerprinting. 6 Aug 2014. URL: https://arxiv.org/abs/1408.1416
[G-FORCE]
G-Force. URL: https://en.wikipedia.org/wiki/G-force
[GEOMETRY-1]
Simon Pieters; Chris Harrelson. Geometry Interfaces Module Level 1. 4 December 2018. CR. URL: https://www.w3.org/TR/geometry-1/
[GIMBALLOCK]
Gimbal Lock. URL: https://en.wikipedia.org/wiki/Gimbal_Lock
[GYROSCOPE]
Anssi Kostiainen. Gyroscope. 8 January 2024. CR. URL: https://www.w3.org/TR/gyroscope/
[INDOORPOS]
Shala, Ubejd; Angel Rodriguez. Indoor positioning using sensor-fusion in android devices. 2011. URL: http://www.diva-portal.org/smash/record.jsf?pid=diva2%3A475619&dswid=9050
[MOTION-SENSORS]
Kenneth Christiansen; Alexander Shalamov. Motion Sensors Explainer. 30 August 2017. NOTE. URL: https://www.w3.org/TR/motion-sensors/
[PROPERACCELERATION]
Proper acceleration. URL: https://en.wikipedia.org/wiki/Proper_acceleration
[QUATERNIONS]
Quaternions. URL: https://en.wikipedia.org/wiki/Quaternion
[SCREEN-ORIENTATION]
Marcos Caceres. Screen Orientation. 9 August 2023. WD. URL: https://www.w3.org/TR/screen-orientation/
[SENSORID]
Zhang, Jiexin; Beresford, Alastair R.; Sheret, Ian. SensorID: Sensor Calibration Fingerprinting for Smartphones. 2019. URL: https://doi.org/10.1109/SP.2019.00072
[TOUCH]
TouchSignatures: Identification of User Touch Actions and PINs Based on Mobile Sensor Data via JavaScript. 12 Feb 2016. URL: https://arxiv.org/abs/1602.04115

IDL Index

partial interface Window {
    [SecureContext] attribute EventHandler ondeviceorientation;
};

[Exposed=Window, SecureContext]
interface DeviceOrientationEvent : Event {
    constructor(DOMString type, optional DeviceOrientationEventInit eventInitDict = {});
    readonly attribute double? alpha;
    readonly attribute double? beta;
    readonly attribute double? gamma;
    readonly attribute boolean absolute;

    static Promise<PermissionState> requestPermission(optional boolean absolute = false);
};

dictionary DeviceOrientationEventInit : EventInit {
    double? alpha = null;
    double? beta = null;
    double? gamma = null;
    boolean absolute = false;
};

partial interface Window {
    [SecureContext] attribute EventHandler ondeviceorientationabsolute;
};

[Exposed=Window, SecureContext]
interface DeviceMotionEventAcceleration {
    readonly attribute double? x;
    readonly attribute double? y;
    readonly attribute double? z;
};

[Exposed=Window, SecureContext]
interface DeviceMotionEventRotationRate {
    readonly attribute double? alpha;
    readonly attribute double? beta;
    readonly attribute double? gamma;
};

partial interface Window {
    [SecureContext] attribute EventHandler ondevicemotion;
};

[Exposed=Window, SecureContext]
interface DeviceMotionEvent : Event {
    constructor(DOMString type, optional DeviceMotionEventInit eventInitDict = {});
    readonly attribute DeviceMotionEventAcceleration? acceleration;
    readonly attribute DeviceMotionEventAcceleration? accelerationIncludingGravity;
    readonly attribute DeviceMotionEventRotationRate? rotationRate;
    readonly attribute double interval;

    static Promise<PermissionState> requestPermission();
};

dictionary DeviceMotionEventAccelerationInit {
    double? x = null;
    double? y = null;
    double? z = null;
};

dictionary DeviceMotionEventRotationRateInit {
    double? alpha = null;
    double? beta = null;
    double? gamma = null;
};

dictionary DeviceMotionEventInit : EventInit {
    DeviceMotionEventAccelerationInit acceleration;
    DeviceMotionEventAccelerationInit accelerationIncludingGravity;
    DeviceMotionEventRotationRateInit rotationRate;
    double interval = 0;
};
