Location Based

This article gives you a first look at Location Based AR with AR.js.

It can be used for indoor (but with low precision) and outdoor geopositioning of AR content.

You can load places statically, from HTML or from JavaScript, or you can load your data from local or remote JSON, or even through API calls. The choice is yours; all of these options are covered in the tutorials.

Location Based has been implemented for both three.js and A-Frame. Each of these is documented below.

This document is intended as reference documentation. There are also two tutorials available, with full example code: one for A-Frame and one for three.js.

A-Frame

AR.js offers A-Frame components to implement location-based AR. There are three variants of the components, new-location-based, projected and classic, each detailed below.

The components

Each variant includes two components: a camera component, which enables the location-based AR, and an entity-place component, which lets you specify an entity's latitude and longitude. The exact component names for each variant are shown below.

| Component variant | Camera component | Entity-place component |
| --- | --- | --- |
| new-location-based | gps-new-camera | gps-new-entity-place |
| projected | gps-projected-camera | gps-projected-entity-place |
| classic | gps-camera | gps-entity-place |

Camera component (gps-new-camera, gps-projected-camera or gps-camera)

Required: yes

Max allowed per scene: 1

This component enables Location Based AR. It must be added to the camera entity. It handles both the position and rotation of the camera and is used to determine where the user is pointing their device.

For example:

<a-camera gps-new-camera></a-camera>

Properties

| Property | Description | Default Value | Availability |
| --- | --- | --- | --- |
| positionMinAccuracy | Minimum accuracy allowed for the position signal. | 100 | all |
| gpsMinDistance | Controls how far the camera must move, in meters, to generate a GPS update event. Useful to prevent 'jumping' of augmented content due to frequent small changes in position. | 5 | all |
| simulateLatitude | Simulates the latitude of the camera, to aid in testing. | 0 (disabled) | all (but only triggers a GPS update event in new-location-based) |
| simulateLongitude | Simulates the longitude of the camera, to aid in testing. | 0 (disabled) | all (but only triggers a GPS update event in new-location-based) |
| simulateAltitude | Simulates the altitude of the camera in meters above sea level, to aid in testing. | 0 (disabled) | all |
| alert | Whether to show a message when the GPS signal is under the positionMinAccuracy. | false | projected, classic |
| minDistance | If set, places closer to the user than this value (in meters) are not shown. Only a positive value is allowed. In the new-location-based components, set the near clipping plane of the perspective camera instead. | 0 (disabled) | projected, classic |
| maxDistance | If set, places further from the user than this value (in meters) are not shown. Only a positive value is allowed. In the new-location-based components, set the far clipping plane of the perspective camera instead. | 0 (disabled) | projected, classic |
| gpsTimeInterval | Controls how frequently to obtain a new GPS position. If a previous GPS location is cached, the cached position will be used rather than a new position if its 'age' is less than this value, in milliseconds. This parameter is passed directly to the Geolocation API's watchPosition() method. | 0 (always use a new position, not cached) | all |
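
These properties can be set directly in the gps-new-camera (or gps-projected-camera / gps-camera) HTML attribute, or updated from JavaScript. As a minimal sketch, assuming the new-location-based variant (the coordinate values below are arbitrary), the camera could be configured for desktop testing like this:

// Configure the gps-new-camera component from JavaScript, e.g. for desktop
// testing with a simulated position (the values here are arbitrary examples).
const cameraEl = document.querySelector('[gps-new-camera]');
cameraEl.setAttribute('gps-new-camera', {
    gpsMinDistance: 10,       // metres the user must move before a GPS update event
    gpsTimeInterval: 2000,    // accept cached GPS positions up to 2 seconds old
    simulateLatitude: 51.05,  // simulated position, so no real GPS signal is needed
    simulateLongitude: -0.72
});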

Entity-place component (gps-new-entity-place, gps-projected-entity-place or gps-entity-place)

Required: yes

Max allowed per scene: no limit

This component makes an entity GPS-trackable. It assigns a specific real-world position to the entity, so that the user sees it when their device is pointing at that position in the real world. If the user is far from the entity, it will appear smaller; if it's too far away, it won't be visible at all.

It requires latitude and longitude as a single string parameter (example with the a-box A-Frame primitive):

<a-box material="color: yellow" gps-new-entity-place="latitude: <your-latitude>; longitude: <your-longitude>"/>

⚡️ In addition, you can use the A-Frame position component to assign a y-value to change the height of the content. This value should be entered in meters above (or below, if negative) the current camera height. For example, this assigns a height of 30 meters, displayed relative to the gps-new-camera's current height:

<a-box material="color: yellow" gps-new-entity-place="latitude: <your-latitude>; longitude: <your-longitude>" position="0 30 0"/>
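
⚡️ You can also create gps-new-entity-place entities dynamically from JavaScript, for example from places loaded from a local or remote JSON file or an API, as mentioned in the introduction. The following is a minimal, hypothetical sketch: the URL and the response shape ({ "places": [{ "lat": ..., "lon": ... }] }) are assumptions for illustration only.

// Fetch a JSON feed of places (hypothetical URL and format) and add one
// GPS-tracked box per place to the scene.
async function loadPlaces() {
    const response = await fetch('https://example.com/api/places.json');
    const data = await response.json();
    const scene = document.querySelector('a-scene');
    for (const place of data.places) {
        const entity = document.createElement('a-box');
        entity.setAttribute('material', { color: 'yellow' });
        entity.setAttribute('gps-new-entity-place', {
            latitude: place.lat,
            longitude: place.lon
        });
        scene.appendChild(entity);
    }
}

window.addEventListener('load', loadPlaces);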

Properties

| Property | Description | Default Value |
| --- | --- | --- |
| latitude | The entity's latitude. | 0 |
| longitude | The entity's longitude. | 0 |

Events

Take a look at the UI and Events page for Location Based Custom Events.


⚡️ Usually, in location-based AR, it's nice for the augmented content to always face the user, so that when you rotate the camera, 3D models and, above all, text remain clearly visible.

The sketch below shows one way to create gps-new-entity-place entities that will always face the user (camera).
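
This is a minimal, hypothetical helper component (the name face-camera is our own, not an AR.js built-in): on every frame it rotates the entity so that it faces the active camera.

// Register a simple 'face-camera' component that turns the entity towards
// the active camera on every frame.
AFRAME.registerComponent('face-camera', {
    init: function () {
        this.cameraPosition = new THREE.Vector3();
    },
    tick: function () {
        // Get the active camera's world position and look at it
        this.el.sceneEl.camera.getWorldPosition(this.cameraPosition);
        this.el.object3D.lookAt(this.cameraPosition);
    }
});

You would then attach face-camera to an entity alongside gps-new-entity-place, for example on an a-text element or a 3D model.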


Viewing distant objects

If your location-based AR content is distant from the user (around 1km or more), it is recommended to use the new arjs-webcam-texture component (introduced in AR.js 3.2.0), which uses a three.js texture to stream the camera feed and allows distant content to be viewed. This component is automatically injected if the videoTexture parameter of the arjs system is set to true and the sourceType is webcam. For example (code snippet only):

    <a-scene
      vr-mode-ui="enabled: false"
      arjs="sourceType: webcam; videoTexture: true; debugUIEnabled: false;"
    >

Reducing shaking effects

In location-based mode, 'shaking' effects can occur as a result of frequent small changes in the device's orientation, caused by the high sensitivity of device sensors such as the accelerometer.

If using AR.js 3.3.1 or greater (3.4.3 or greater for the new-location-based components), this can optionally be reduced using an exponential smoothing technique. Note that, if you are NOT using the new-location-based components, there are currently some occasional display artefacts with this when moving the device quickly or suddenly, so please test before enabling it in a finished application; work to resolve these is ongoing. Alternatively, use the new-location-based components.

This is enabled by adding a custom look-controls component, with a smoothingFactor property, to your a-camera. This replaces A-Frame's default look-controls component, which must be disabled.

The name of the custom look-controls component depends on which version of the location-based components you are using: arjs-device-orientation-controls for new-location-based, or arjs-look-controls for projected and classic.

For example, in the new-location-based components:

<a-camera id='camera1' look-controls-enabled='false' arjs-device-orientation-controls='smoothingFactor: 0.1' gps-new-camera='gpsMinDistance: 5'> </a-camera>

or, otherwise:

<a-camera id='camera1' look-controls-enabled='false' arjs-look-controls='smoothingFactor: 0.1' gps-projected-camera='gpsMinDistance: 5'> </a-camera>

Exponential smoothing works by applying a smoothing factor to each newly-read device rotation angle (obtained from sensor readings) such that the previous smoothed value counts more than the current value, thus reducing 'noise' and 'jitter'. If k is the smoothing factor:

smoothedAngle = k * newValue + (1 - k) * previousSmoothedAngle

It can be seen from this that the smaller the value of k (the smoothingFactor property), the greater the smoothing effect. In tests, 0.1 appears to give the best result.
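
As an illustration only (this is not AR.js source code), the formula above can be written as a small helper; with k = 0.1, each new sensor reading only nudges the smoothed angle slightly. A real implementation would also need to handle the wrap-around of angles at 360 degrees, which this sketch ignores.

// Illustration of exponential smoothing: returns a function that smooths a
// stream of readings with smoothing factor k (smaller k = stronger smoothing).
function createSmoother(k) {
    let previous = null;
    return function (newValue) {
        previous = (previous === null) ? newValue : k * newValue + (1 - k) * previous;
        return previous;
    };
}

const smoothAngle = createSmoother(0.1);
console.log(smoothAngle(90));   // 90 (first reading passes straight through)
console.log(smoothAngle(100));  // 91 = 0.1 * 100 + 0.9 * 90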

You can also reduce 'jumping' of augmented content when near a place, a bad-looking effect caused by the GPS sensor's low precision. To do so, use the gpsMinDistance property, as shown in the examples above. This will only update the position if the user has moved at least that number of metres.


Projection Details

The new-location-based and projected location-based components for AR.js use Spherical Mercator (aka EPSG:3857) to store both the camera position and the position of added points of interest and other geographical data.

Spherical Mercator is the same projection used by Google Maps and projects the earth onto a flat surface. It works reasonably well at most latitudes but is highly distorted near the poles. Latitude and longitude are projected into Spherical Mercator eastings and northings, which are approximately (but not exactly) equivalent to metres.

The rationale for this is to allow easy addition of more complex geographic data such as roads and paths. Such data can be projected and added to an AR.js scene, and then, because Spherical Mercator units approximate to metres (away from the poles), the coordinates can be used directly as WebGL/A-Frame world coordinates.

Calculating world coordinates of arbitrary augmented content

The new-location-based and projected components have some useful properties and methods which can be used to work with more specialist augmented content (for example, you might want to overlay AR polylines or polygons representing roads and paths, downloaded from geodata APIs such as OpenStreetMap). Such data can be downloaded from the API as lat/lon based coordinates, projected into Spherical Mercator using AR.js API methods (approximately, though not exactly, metres; in tests good enough to use as world coordinates), and then added to the scene as a three.js object.

This is implemented differently in the new-location-based and projected components, but the external API is (as of 3.4.3) the same.

The key method is the latLonToWorld(lat, lon) method of the gps-new-camera and gps-projected-camera components. This converts latitude and longitude directly to world coordinates, performing the projection as the first step and then calculating the world coordinates from the projected coordinates. It will return a 2-member array containing the x and z world coordinates, allowing the developer to calculate or specify the y coordinate (altitude) independently.

Note that the sign of the Spherical Mercator northing is reversed to align with the OpenGL coordinate system (eastings are equivalent to x coordinates and northings to z coordinates).
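
For example, the following sketch places a plain three.js box mesh at a given latitude and longitude via gps-new-camera. The coordinates are arbitrary, and it assumes the scene has loaded and a GPS position (real or simulated) has already been received, so that world coordinates can be calculated.

// Look up the gps-new-camera component instance and project a lat/lon pair
// into world coordinates.
const gpsCameraEl = document.querySelector('[gps-new-camera]');
const gpsCam = gpsCameraEl.components['gps-new-camera'];

// latLonToWorld() returns [x, z]; the y coordinate (altitude) is up to you.
const [x, z] = gpsCam.latLonToWorld(51.051, -0.723);

// Add a plain three.js mesh to the underlying scene at that position.
const mesh = new THREE.Mesh(
    new THREE.BoxGeometry(10, 10, 10),
    new THREE.MeshBasicMaterial({ color: 0xff0000 })
);
mesh.position.set(x, 0, z);
document.querySelector('a-scene').object3D.add(mesh);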

gps-new-camera implements projection via the underlying AR.js three.js LocationBased object (see three.js documentation, below) which is responsible for the actual projection.

gps-projected-camera provides similar functionality but via a different method and with some implementation differences. In gps-projected-camera, unlike gps-new-camera, the original GPS position is set as the world origin.

three.js

The three.js API keeps track of your current GPS location (or allows you to set a fake location) and allows you to add three.js objects at a given latitude and longitude. It includes three classes, described below: LocationBased, WebcamRenderer and DeviceOrientationControls.

LocationBased

Tracks the current GPS position (real or simulated), converts latitude and longitude into world coordinates using the Spherical Mercator projection, and allows three.js objects to be added to the scene at a given longitude and latitude.

WebcamRenderer

Renders the webcam feed.

DeviceOrientationControls

Represents the device orientation controls, i.e. accelerometer and magnetic field sensors, for determining the orientation of the device. Based on the sample included in the three.js distribution.

Using three.js location-based in an application

It is recommended to use npm to install AR.js, import it into your application, and build with a bundler such as Webpack.

Here is a sample package.json:

{
    "dependencies": {
        "@ar-js-org/ar.js": "3.4.2",
    },
    "devDependencies": {
        "webpack": "^5.75.0",
        "webpack-cli": "^5.0.0"
    },
    "scripts": {
        "build": "npx webpack"
    }
}

and a sample webpack.config.js:

const path = require('path');

module.exports = {
    mode: 'development',
    entry: './index.js',
    output: {
        path: path.resolve(__dirname, 'dist'),
        filename: 'bundle.js'
    },
    optimization: {
        minimize: false
    }
};

This will build a bundle named bundle.js in the dist subdirectory from a source file index.js.

Here is an example of importing the components into an application:

import * as THREEx from './node_modules/@ar-js-org/ar.js/three.js/build/ar-threex-location-only.js'
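
Below is a minimal sketch of how the imported classes might be wired together. It assumes that three is also installed as a dependency, that WebcamRenderer can manage the webcam video itself (depending on the AR.js version you may need to pass a selector for an existing video element as a second argument), and the coordinates are arbitrary; see the three.js location-based tutorial for full, working code.

import * as THREE from 'three';
import * as THREEx from './node_modules/@ar-js-org/ar.js/three.js/build/ar-threex-location-only.js';

// Basic three.js setup
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, window.innerWidth / window.innerHeight, 0.1, 10000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// Wire up the three AR.js location-only classes described above
const locar = new THREEx.LocationBased(scene, camera);
const cam = new THREEx.WebcamRenderer(renderer);
const controls = new THREEx.DeviceOrientationControls(camera);

// Add a red box just north of a simulated GPS position (longitude, latitude)
const box = new THREE.Mesh(
    new THREE.BoxGeometry(20, 20, 20),
    new THREE.MeshBasicMaterial({ color: 0xff0000 })
);
locar.add(box, -0.72, 51.051);

// Use a fake GPS position for desktop testing; on a real device you would
// call locar.startGps() instead.
locar.fakeGps(-0.72, 51.05);

renderer.setAnimationLoop(() => {
    controls.update();              // update camera rotation from device sensors
    cam.update();                   // update the webcam feed
    renderer.render(scene, camera);
});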