Merge pull request #1516 from jspsych/feature-plugin-extensions

Plugin Extensions and Eye Tracking via WebGazer
This commit is contained in:
Josh de Leeuw 2021-02-16 09:37:16 -05:00 committed by GitHub
commit 599d1f5c72
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
21 changed files with 90747 additions and 10 deletions

View File

@@ -390,6 +390,7 @@ The settings object can contain several parameters. The only *required* paramete
| minimum_valid_rt | numeric | The minimum valid response time for key presses during the experiment. Any key press response time that is less than this value will be treated as invalid and ignored. Note that this parameter only applies to _keyboard responses_, and not to other response types such as buttons and sliders. The default value is 0. |
| override_safe_mode | boolean | Running a jsPsych experiment directly in a web browser (e.g., by double clicking on a local HTML file) will load the page using the `file://` protocol. Some features of jsPsych don't work with this protocol. By default, when jsPsych detects that it's running on a page loaded via the `file://` protocol, it runs in _safe mode_, which automatically disables features that don't work in this context. Specifically, the use of Web Audio is disabled (audio will be played using HTML5 audio instead, even if `use_webaudio` is `true`) and video preloading is disabled. The `override_safe_mode` parameter defaults to `false`, but you can set it to `true` to force these features to operate under the `file://` protocol. In order for this to work, you will need to disable web security (CORS) features in your browser - this is safe to do if you know what you are doing. Note that this parameter has no effect when you are running the experiment on a web server, because the page will be loaded via the `http://` or `https://` protocol. |
| case_sensitive_responses | boolean | If true, then jsPsych will make a distinction between uppercase and lowercase keys when evaluating keyboard responses, e.g. "A" (uppercase) will not be recognized as a valid response if the trial only accepts "a" (lowercase). If false, then jsPsych will not make a distinction between uppercase and lowercase keyboard responses, e.g. both "a" and "A" responses will be valid when the trial's key choice parameter is "a". Setting this parameter to false is useful if you want key responses to be treated the same way when CapsLock is turned on or the Shift key is held down. The default value is false. |
| extensions | array | Array containing information about one or more jsPsych extensions that are used during the experiment. Each extension should be specified as an object with `type` (required), which is the name of the extension, and `params` (optional), which is an object containing any parameter-value pairs to be passed to the extension's `initialize` function. Default value is an empty array. |
Possible values for the exclusions parameter above.

View File

@@ -0,0 +1,83 @@
# Extensions
Extensions are jsPsych modules that can interface with any plugin to extend the functionality of the plugin. A canonical example of an extension is eye tracking. An eye tracking extension allows a plugin to gather gaze data and add it to the plugin's data object.
## Using an Extension
To use an extension in an experiment, you'll load the extension file via a `<script>` tag (just like adding a plugin) and then initialize the extension in the parameters of `jsPsych.init()`.
```html
<head>
<script src="jspsych/jspsych.js"></script>
<script src="jspsych/extensions/some-extension.js"></script>
</head>
```
```js
jsPsych.init({
  timeline: [...],
  extensions: [
    {type: 'some-extension', params: {...} }
  ]
})
```
To enable an extension during a trial, add the extension to the `extensions` list for the trial. Some extensions may also support or require an object of parameters to configure the extension:
```js
var trial = {
  extensions: [
    {type: 'some-extension', params: {...} }
  ]
}
```
## List of Extensions
Extension | Description
------ | -----------
[jspsych&#8209;ext&#8209;webgazer.js](/extensions/jspsych-ext-webgazer.md) | Enables eye tracking using the [WebGazer](https://webgazer.cs.brown.edu/) library.
## Writing an Extension
To create a new extension you must create an object that supports a few event callbacks. A barebones extension file looks like this:
```js
jsPsych.extensions['new-extension'] = (function () {

  var extension = {};

  extension.initialize = function (params) {
    // params are passed from the extensions parameter in jsPsych.init
    // must return a Promise that resolves when the extension is ready
    return Promise.resolve();
  }

  extension.on_start = function (params) {
    // params are passed from the extensions parameter in the trial object
  }

  extension.on_load = function (params) {
    // params are passed from the extensions parameter in the trial object
  }

  extension.on_finish = function (params) {
    // params are passed from the extensions parameter in the trial object
    return {
      // any data that the extension returns here will be added to the trial data
    }
  }

  return extension;

})();
```
The four events that an extension must support are shown in the sample code.
`extension.initialize` is called with `jsPsych.init()`. This is where setup code for the extension can happen. This event will happen once per experiment, unlike the other events which occur with each trial. The `params` object can include whatever parameters are necessary to configure the extension. The `params` object is passed from the call to `jsPsych.init()` to the `extension.initialize` method. `extension.initialize` must return a `Promise` that resolves when the extension is finished initializing.
`extension.on_start` is called at the start of the plugin execution, prior to calling `plugin.trial`. This is where trial-specific initialization can happen, such as creating empty containers to hold data or resetting internal state. The `params` object is passed from the declaration of the extension in the trial object. You can use `params` to customize the behavior of the extension for each trial.
`extension.on_load` is called after `plugin.trial` has executed, which is typically when the plugin has finished executing initial DOM-modifying code and has set up various event listeners. This is where the extension can begin actively interacting with the DOM and recording data. The `params` object is passed from the declaration of the extension in the trial object. You can use `params` to customize the behavior of the extension for each trial.
`extension.on_finish` is called after the plugin completes. This can be used for any teardown at the end of the trial. This method should return an object of data to append to the plugin's data. Note that this event fires *before* the `on_finish` event for the plugin, so data added by the extension is accessible in any trial `on_finish` event handlers. The `params` object is passed from the declaration of the extension in the trial object. You can use `params` to customize the behavior of the extension for each trial.
The extension can also include any additional methods that are necessary for interacting with it. See the [webgazer extension](/extensions/jspsych-ext-webgazer.md) for an example.
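As a concrete illustration of the callback structure above, here is a minimal, hypothetical extension that records how long each trial ran. The name `trial-timer` and the data field `trial_timer_elapsed` are invented for this sketch.

```javascript
// Hypothetical 'trial-timer' extension: adds the trial's duration to its data.
var trialTimerExtension = (function () {
  var extension = {};
  var start_time = null;

  extension.initialize = function (params) {
    // no asynchronous setup needed, so resolve immediately
    return Promise.resolve();
  };

  extension.on_start = function (params) {
    // record when the trial began
    start_time = Date.now();
  };

  extension.on_load = function (params) {
    // nothing to do when the display loads in this sketch
  };

  extension.on_finish = function (params) {
    // this object is merged into the trial's data
    return { trial_timer_elapsed: Date.now() - start_time };
  };

  return extension;
})();

// In an extension file you would register it on the jsPsych object instead:
// jsPsych.extensions['trial-timer'] = trialTimerExtension;
```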

View File

@@ -0,0 +1,106 @@
# jspsych-ext-webgazer
This extension supports eye tracking through the [WebGazer](https://webgazer.cs.brown.edu/) library. For a narrative description of how to use this extension see the [eye tracking overview](/overview/eye-tracking.md).
## Parameters
### Initialization Parameters
Initialization parameters can be set when calling `jsPsych.init()`
```js
jsPsych.init({
  extensions: [
    {type: 'webgazer', params: {...}}
  ]
})
```
Parameter | Type | Default Value | Description
----------|------|---------------|------------
webgazer | object | `undefined` | You can explicitly pass a reference to a loaded instance of the webgazer.js library. If no explicit reference is passed then the extension will look for a global `webgazer` object. If you are loading webgazer.js via a `<script>` tag you do not need to set this parameter in most circumstances.
round_predictions | bool | true | Whether to round the `x`,`y` coordinates predicted by WebGazer to the nearest whole number. This *greatly* reduces the size of the data, as WebGazer records data to 15 decimal places by default. Given the noise of the system, there's really no need to record data to this level of precision.
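For example, to keep WebGazer's full-precision predictions (a sketch; the timeline contents are elided):

```js
jsPsych.init({
  timeline: [/* ... */],
  extensions: [
    {type: 'webgazer', params: {round_predictions: false}}
  ]
})
```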
### Trial Parameters
Trial parameters can be set when adding the extension to a trial object.
```js
var trial = {
  type: '...',
  extensions: [
    {type: 'webgazer', params: {...}}
  ]
}
```
Parameter | Type | Default Value | Description
----------|------|---------------|------------
targets | array | [] | A list of elements on the page that you would like to record the coordinates of for comparison with the WebGazer data. Each entry in the array should be a valid [CSS selector string](https://www.w3schools.com/cssref/css_selectors.asp) that identifies the element. The selector string should be valid for exactly one element on the page. If the selector is valid for more than one element then only the first matching element will be recorded.
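Putting this together, a trial that records the bounding rectangle of its stimulus might look like the following sketch (the `#prompt` element ID is an assumption for this example):

```js
var trial = {
  type: 'html-keyboard-response',
  stimulus: '<p id="prompt">Look at this sentence.</p>',
  extensions: [
    {type: 'webgazer', params: {targets: ['#prompt']}}
  ]
}
```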
## Data Generated
Name | Type | Value
-----|------|------
webgazer_data | array | An array of objects containing gaze data for the trial. Each object has an `x`, a `y`, and a `t` property. The `x` and `y` properties specify the gaze location in pixels and `t` specifies the time in milliseconds since the start of the trial.
webgazer_targets | array | An array of objects containing the pixel coordinates of elements on the screen specified by the `targets` parameter. Each object contains a `selector` property, containing the CSS selector string used to find the element, plus `top`, `bottom`, `left`, and `right` parameters which specify the [bounding rectangle](https://developer.mozilla.org/en-US/docs/Web/API/Element/getBoundingClientRect) of the element.
## Functions
In addition to the jsPsych webgazer-* plugins, the jsPsych webgazer extension provides a set of functions that allow the researcher to interact more directly with WebGazer. These functions can be called at any point during an experiment, and are crucial for building trial plugins that interact with WebGazer. All of the functions below must be prefixed with `jsPsych.extensions.webgazer` (e.g. `jsPsych.extensions.webgazer.faceDetected()`).
### faceDetected()
Returns `true` if WebGazer is ready to make predictions (`webgazer.getTracker().predictionReady` is `true`).
### showPredictions()
Turns on WebGazer's real-time visualization of predicted gaze location.
### hidePredictions()
Turns off WebGazer's real-time visualization of predicted gaze location.
### showVideo()
Turns on a display of the webcam image, guiding box for positioning the face, and WebGazer's estimate of the location of facial landmarks.
### hideVideo()
Turns off the camera display.
### resume()
Turns on gaze prediction. The extension will automatically handle this for you in most cases. You probably only need to use this if you are writing your own plugin that interfaces directly with WebGazer.
### pause()
Turns off gaze prediction. The extension will automatically handle this for you in most cases. You probably only need to use this if you are writing your own plugin that interfaces directly with WebGazer.
### startMouseCalibration()
Turns on mouse movement and mouse clicks as calibration events. While the `webgazer-calibration` plugin can also be used to run a parameterized calibration routine, this calibration function call allows you to continuously calibrate WebGazer to any mouse movements or clicks throughout the experiment. For example, any *-button-response trial would also function as a WebGazer calibration event.
### stopMouseCalibration()
Stops WebGazer from using mouse movements and mouse clicks as calibration events.
### calibratePoint(x, y)
Instructs WebGazer to register the location `x`, `y` (in screen pixel coordinates) as a calibration event. Can be used for passive viewing calibration, i.e., instructing participants to fixate at a particular location.
### setRegressionType(regression_type)
Change the method that WebGazer is using to perform feature -> location regression. Valid options are `ridge`, `weightedRidge`, and `threadedRidge`. See the WebGazer docs for more information about these options.
The extension uses the default mode specified by WebGazer (currently `ridge`).
### getCurrentPrediction()
Get the current predicted gaze location from WebGazer. Returns an object with `x`, `y`, and `eyeFeature` properties. This function is asynchronous, so proper use requires either the `await` keyword in the context of another `async function` or using `.then()`.
```js
jsPsych.extensions.webgazer.getCurrentPrediction().then(function(data){
  console.log(`Currently looking at coordinate ${data.x}, ${data.y}`)
});
```

View File

@@ -138,3 +138,16 @@ jsPsych.init({
  override_safe_mode: true
});
```
## Add extensions
Extensions are jsPsych modules that can run throughout the experiment and interface with any plugin to extend the functionality of the plugin. One example of an extension is eye tracking, which allows you to gather gaze data during any trial and add it to that trial's data object. If you want to use extensions in your experiment, you must specify this when you initialize the experiment with `jsPsych.init`. The `extensions` parameter in `jsPsych.init` is an array of objects, where each object specifies the extension that you'd like to use in the experiment. Below is an example of adding the webgazer extension.
```js
jsPsych.init({
  timeline: [...],
  extensions: [
    {type: 'webgazer'}
  ]
});
```

View File

@@ -0,0 +1,237 @@
# Eye Tracking
jsPsych supports eye tracking through the [WebGazer](https://webgazer.cs.brown.edu/) library. WebGazer uses computer vision techniques to identify features of the participant's eyes via a webcam and predicts gaze location. The system is calibrated by having the participant click on or look at known locations on the screen. These locations are linked to eye features. Gaze location is predicted using regression.
## Getting Started
First, [download WebGazer.js ](https://webgazer.cs.brown.edu/#download) and include it in your experiment file via a `<script>` tag. You'll also need to include jsPsych's [webgazer extension](/extensions/jspsych-ext-webgazer.md).
```html
<head>
<script src="jspsych/jspsych.js"></script>
<script src="webgazer.js"></script>
<script src="jspsych/extensions/jspsych-ext-webgazer.js"></script>
</head>
```
!!! tip
An example experiment using WebGazer is available in the **/examples** folder of the jsPsych release. See `webgazer.html`.
To use the WebGazer extension in an experiment, include it in the list of extensions passed to `jsPsych.init()`
```js
jsPsych.init({
  timeline: [...],
  extensions: [
    {type: 'webgazer'}
  ]
})
```
To help the participant position their face correctly for eye tracking you can use the [jspsych-webgazer-init-camera plugin](/plugins/jspsych-webgazer-init-camera.md). This will show the participant what the camera sees, including facial feature landmarks, and prevent the participant from continuing until their face is in good position for eye tracking.
```js
var init_camera_trial = {
  type: 'webgazer-init-camera'
}
```
To calibrate WebGazer, you can use the [jspsych-webgazer-calibrate plugin](/plugins/jspsych-webgazer-calibrate.md). This plugin allows you to specify a set of points on the screen for calibration and to choose the method for calibrating -- either clicking on each point or simply fixating on each point. The location of calibration points is specified in percentages, e.g., `[25,50]` will result in a point that is 25% of the width of the screen from the left edge and 50% of the height of the screen from the top edge. Options for controlling other details of the calibration are explained in the [documentation for the plugin](/plugins/jspsych-webgazer-calibrate.md).
Note that instructions are not included in the calibration plugin, so you'll likely want to use a different plugin (e.g., `html-button-response`) to display instructions prior to running the calibration.
```js
var calibration_trial = {
  type: 'webgazer-calibrate',
  calibration_points: [[25,50], [50,50], [75,50], [50,25], [50,75]],
  calibration_mode: 'click'
}
```
To measure the accuracy and precision of the calibration, you can use the [jspsych-webgazer-validate plugin](/plugins/jspsych-webgazer-validate.md). Like the calibration plugin, you can specify a list of points to perform validation on. Here you can specify the points as either percentages or in terms of the distance from the center of the screen in pixels. Which mode you use will probably depend on how you are defining your stimuli throughout the experiment. You can also specify the radius of tolerance around each point, and the plugin will calculate the percentage of measured gaze samples within that radius. This is a potentially useful heuristic for deciding whether or not to calibrate again. Options for controlling other details of the validation are explained in the [documentation for the plugin](/plugins/jspsych-webgazer-validate.md).
```js
var validation_trial = {
  type: 'webgazer-validate',
  validation_points: [[-200,200], [200,200], [-200,-200], [200,-200]],
  validation_point_coordinates: 'center-offset-pixels',
  roi_radius: 100
}
```
The validation procedure stores the raw gaze data for each validation point, the computed average offset from each validation point, the percentage of samples within the `roi_radius` for each validation point, and the number of samples collected per second.
```js
{
  raw_gaze: [...],
  percent_in_roi: [...],
  average_offset: [...],
  samples_per_sec: ...
}
```
We recommend performing calibration and validation periodically throughout your experiment.
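One way to do this is to wrap the calibration and validation trials in a timeline node with a `loop_function` that repeats them until every validation point reaches an acceptable percentage of in-ROI samples. This is a sketch, not a built-in jsPsych feature; the 80% threshold is an arbitrary assumption.

```js
// Sketch: repeat calibration + validation until accuracy is acceptable.
// The 80 (percent) threshold is an arbitrary assumption for this example.
var calibration_trial = { type: 'webgazer-calibrate' };
var validation_trial = { type: 'webgazer-validate', roi_radius: 100 };

function calibrationAcceptable(percent_in_roi, threshold) {
  // every validation point must have at least `threshold` percent of
  // gaze samples inside the region of interest
  return percent_in_roi.every(function (p) { return p >= threshold; });
}

var recalibrate_loop = {
  timeline: [calibration_trial, validation_trial],
  loop_function: function (data) {
    // data.values()[1] is the validation trial's data
    var validation_data = data.values()[1];
    // return true to run the calibration + validation sequence again
    return !calibrationAcceptable(validation_data.percent_in_roi, 80);
  }
};
```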
To enable eye tracking for a trial in your experiment, you can simply add the WebGazer extension to the trial.
```js
var trial = {
  type: 'html-keyboard-response',
  stimulus: '<img id="scene" src="my-scene.png">',
  extensions: [
    {
      type: 'webgazer',
      params: {
        targets: ['#scene']
      }
    }
  ]
}
```
This will turn on WebGazer at the start of the trial.
The `params` property in the `extensions` declaration allows you to pass in a list of [CSS selector strings](https://www.w3schools.com/cssref/css_selectors.asp). The [bounding rectangle](https://developer.mozilla.org/en-US/docs/Web/API/Element/getBoundingClientRect) of the DOM element that matches each selector will be recorded in the data for that trial. This allows for easy alignment of the gaze data and objects on the screen.
```js
webgazer_targets: [
  {selector: ..., top: ..., left: ..., right: ..., bottom: ...},
  {selector: ..., top: ..., left: ..., right: ..., bottom: ...}
]
```
Gaze data will be added to the trial's data under the property `webgazer_data`. The gaze data is an array of objects. Each object has an `x`, a `y`, and a `t` property. The `x` and `y` properties specify the gaze location in pixels and `t` specifies the time in milliseconds since the start of the trial. Note that establishing the precision and accuracy of these measurements across the variety of web browsers and systems that your experiment participants might be using is quite difficult. For example, different browsers may cause small systematic shifts in the accuracy of `t` values.
```js
webgazer_data: [
  {x: ..., y: ..., t: ...},
  {x: ..., y: ..., t: ...},
  {x: ..., y: ..., t: ...},
  {x: ..., y: ..., t: ...}
]
```
## Tips for Improving Data Quality
These are some anecdotal observations about factors that improve data quality.
1. The quality of the camera feed is essential. Good lighting makes a big difference. You may want to encourage participants to perform any eye tracking experiments in a well-lit room.
2. Participants need to keep their head relatively still during and after calibration. The calibration is not robust to head movements.
3. WebGazer's click-based calibration can be used throughout the experiment. You can turn this on by calling `jsPsych.extensions.webgazer.startMouseCalibration()` at any point in the experiment. If you use a continue button to advance through the experiment and vary the location of that button, each click will make a small adjustment to the calibration throughout.
4. Computing the gaze predictions consumes more computational resources than most other things that jsPsych is typically used for. The sampling rate that WebGazer is able to achieve will depend on the computing power of the participant's device. You may want to ask the participant to close any non-essential software and browser windows prior to completing the experiment. You may also want to check that the sampling rate is sufficiently high as part of validation.
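Regarding the sampling-rate check in tip 4, a rough estimate can be computed directly from a trial's `webgazer_data` array, as in this sketch (the validation plugin also reports `samples_per_sec`):

```js
// Sketch: estimate the sampling rate (Hz) from a trial's webgazer_data array
function samplesPerSecond(gazeData) {
  if (gazeData.length < 2) return 0;
  // elapsed time between the first and last samples, in milliseconds
  var duration_ms = gazeData[gazeData.length - 1].t - gazeData[0].t;
  return (gazeData.length - 1) / (duration_ms / 1000);
}

samplesPerSecond([{x: 0, y: 0, t: 0}, {x: 1, y: 1, t: 100}, {x: 2, y: 2, t: 200}]); // 10
```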
If you have tips based on your own experience please consider sharing them on our [discussion forum](https://github.com/jspsych/jsPsych/discussions) and we'll add to this list!
## Example
The code below shows a basic example of what it looks like when you put all of these things together in your experiment's HTML file.
```html
<html>
  <head>
    <script src="jspsych/jspsych.js"></script>
    <script src="jspsych/plugins/jspsych-preload.js"></script>
    <script src="jspsych/plugins/jspsych-image-keyboard-response.js"></script>
    <script src="jspsych/plugins/jspsych-html-keyboard-response.js"></script>
    <script src="jspsych/plugins/jspsych-webgazer-init-camera.js"></script>
    <script src="jspsych/plugins/jspsych-webgazer-calibrate.js"></script>
    <script src="jspsych/plugins/jspsych-webgazer-validate.js"></script>
    <script src="js/webgazer.js"></script>
    <script src="jspsych/extensions/jspsych-ext-webgazer.js"></script>
    <link rel="stylesheet" href="jspsych/css/jspsych.css">
  </head>
  <body></body>
  <script>

    var preload = {
      type: 'preload',
      images: ['img/blue.png']
    }

    var init_camera = {
      type: 'webgazer-init-camera'
    }

    var calibration = {
      type: 'webgazer-calibrate'
    }

    var validation = {
      type: 'webgazer-validate'
    }

    var start = {
      type: 'html-keyboard-response',
      stimulus: 'Press any key to start.'
    }

    var trial = {
      type: 'image-keyboard-response',
      stimulus: 'img/blue.png',
      choices: jsPsych.NO_KEYS,
      trial_duration: 1000,
      extensions: [
        {
          type: 'webgazer',
          params: {targets: ['#jspsych-image-keyboard-response-stimulus']}
        }
      ]
    }

    jsPsych.init({
      timeline: [preload, init_camera, calibration, validation, start, trial],
      extensions: [
        {type: 'webgazer'}
      ]
    })

  </script>
</html>
```
Below is example data from the image-keyboard-response trial taken from the experiment above. In addition to the standard data that is collected for this plugin, you can see the additional `webgazer_data` and `webgazer_targets` arrays. The `webgazer_data` shows 21 gaze location estimates during the 1-second image presentation. The `webgazer_targets` array shows that there was one target, the image-keyboard-response stimulus, and tells you the x- and y-coordinate boundaries for the target (image) rectangle. By comparing each of the x/y locations from the `webgazer_data` locations array with the target boundaries in `webgazer_targets`, you can determine if/when the estimated gaze location was inside the target area.
```js
{
"rt": null,
"stimulus": "img/blue.png",
"key_press": null,
"trial_type": "image-keyboard-response",
"trial_index": 4,
"time_elapsed": 30701,
"internal_node_id": "0.0-4.0",
"webgazer_data": [
{ "x": 1065, "y": 437, "t": 39},
{ "x": 943, "y": 377, "t": 79},
{ "x": 835, "y": 332, "t": 110},
{ "x": 731, "y": 299, "t": 146},
{ "x": 660, "y": 271, "t": 189},
{ "x": 606, "y": 251, "t": 238},
{ "x": 582, "y": 213, "t": 288},
{ "x": 551, "y": 200, "t": 335},
{ "x": 538, "y": 183, "t": 394},
{ "x": 514, "y": 177, "t": 436},
{ "x": 500, "y": 171, "t": 493},
{ "x": 525, "y": 178, "t": 542},
{ "x": 537, "y": 182, "t": 592},
{ "x": 543, "y": 178, "t": 633},
{ "x": 547, "y": 177, "t": 691},
{ "x": 558, "y": 174, "t": 739},
{ "x": 574, "y": 183, "t": 789},
{ "x": 577, "y": 197, "t": 838},
{ "x": 584, "y": 214, "t": 889},
{ "x": 603, "y": 218, "t": 937},
{ "x": 606, "y": 221, "t": 987}
],
"webgazer_targets": [
{
"selector": "#jspsych-image-keyboard-response-stimulus",
"top": 135.33334350585938,
"bottom": 435.3333435058594,
"left": 490,
"right": 790
}
]
}
```
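The comparison described above can be sketched as a small helper that counts how many gaze samples fall inside a target's bounding rectangle (the coordinates below are taken from the sample data):

```js
// Count gaze samples that fall inside a target's bounding rectangle.
// The data shapes match webgazer_data / webgazer_targets above.
function samplesInTarget(gazeData, target) {
  return gazeData.filter(function (sample) {
    return sample.x >= target.left && sample.x <= target.right &&
           sample.y >= target.top && sample.y <= target.bottom;
  }).length;
}

var target = {selector: '#jspsych-image-keyboard-response-stimulus',
              top: 135.3, bottom: 435.3, left: 490, right: 790};

samplesInTarget([{x: 500, y: 171, t: 493}, {x: 1065, y: 437, t: 39}], target); // 1
```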

View File

@@ -0,0 +1,60 @@
# jspsych-webgazer-calibrate
This plugin can be used to calibrate the [WebGazer extension](/extensions/jspsych-ext-webgazer.md). For a narrative description of eye tracking with jsPsych, see the [eye tracking overview](/overview/eye-tracking.md).
## Parameters
In addition to the [parameters available in all plugins](overview.md#parameters-available-in-all-plugins), this plugin accepts the following parameters. Parameters with a default value of *undefined* must be specified. Other parameters can be left unspecified if the default value is acceptable.
Parameter | Type | Default Value | Description
----------|------|---------------|------------
calibration_points | array | `[[10,10], [10,50], [10,90], [50,10], [50,50], [50,90], [90,10], [90,50], [90,90]]` | Array of points in `[x,y]` coordinates. Specified as a percentage of the screen width and height, from the left and top edge. The default grid is 9 points.
calibration_mode | string | `'click'` | Can specify `click` to have subjects click on calibration points or `view` to have subjects passively watch calibration points.
repetitions_per_point | numeric | 1 | The number of times to repeat the sequence of calibration points.
randomize_calibration_order | bool | `false` | Whether to randomize the order of the calibration points.
time_to_saccade | numeric | 1000 | If `calibration_mode` is set to `view`, then this is the delay before calibrating after showing a point. Gives the participant time to fixate on the new target before assuming that the participant is looking at the target.
time_per_point | numeric | 1000 | If `calibration_mode` is set to `view`, then this is the length of time to show a point while calibrating. Note that if `click` calibration is used then the point will remain on the screen until clicked.
## Data Generated
In addition to the [default data collected by all plugins](overview.md#data-collected-by-plugins), this plugin collects the following data for each trial.
Name | Type | Value
-----|------|------
No data currently added by this plugin. Use the [webgazer-validate](/plugins/jspsych-webgazer-validate.md) plugin to measure the precision and accuracy of calibration.
## Example
#### Click-based calibration with 5 points
```javascript
var calibration = {
  type: 'webgazer-calibrate',
  calibration_points: [[50,50], [25,25], [25,75], [75,25], [75,75]],
  repetitions_per_point: 2,
  randomize_calibration_order: true
}
```
#### View-based calibration with 39 points, concentrated in the center
```javascript
var calibration = {
  type: 'webgazer-calibrate',
  calibration_points: [
    [10,10],[10,50],[10,90],
    [30,10],[30,50],[30,90],
    [40,10],[40,30],[40,40],[40,45],[40,50],[40,55],[40,60],[40,70],[40,90],
    [50,10],[50,30],[50,40],[50,45],[50,50],[50,55],[50,60],[50,70],[50,90],
    [60,10],[60,30],[60,40],[60,45],[60,50],[60,55],[60,60],[60,70],[60,90],
    [70,10],[70,50],[70,90],
    [90,10],[90,50],[90,90]
  ],
  repetitions_per_point: 1,
  randomize_calibration_order: true,
  calibration_mode: 'view',
  time_per_point: 500,
  time_to_saccade: 1000
}
```

View File

@@ -0,0 +1,31 @@
# jspsych-webgazer-init-camera
This plugin initializes the camera and helps the participant center their face in the camera view for using the [WebGazer extension](/extensions/jspsych-ext-webgazer.md). For a narrative description of eye tracking with jsPsych, see the [eye tracking overview](/overview/eye-tracking.md).
## Parameters
In addition to the [parameters available in all plugins](overview.md#parameters-available-in-all-plugins), this plugin accepts the following parameters. Parameters with a default value of *undefined* must be specified. Other parameters can be left unspecified if the default value is acceptable.
Parameter | Type | Default Value | Description
----------|------|---------------|------------
instructions | string | too long to put here | Instructions for the participant to follow.
button_text | string | Continue | The text for the button that participants click to end the trial.
## Data Generated
In addition to the [default data collected by all plugins](overview.md#data-collected-by-plugins), this plugin collects the following data for each trial.
Name | Type | Value
-----|------|------
No additional data collected.
## Example
#### Parameterless use
```javascript
var init_camera = {
  type: 'webgazer-init-camera'
}
```

View File

@@ -0,0 +1,43 @@
# jspsych-webgazer-validate
This plugin can be used to measure the accuracy and precision of gaze predictions made by the [WebGazer extension](/extensions/jspsych-ext-webgazer.md). For a narrative description of eye tracking with jsPsych, see the [eye tracking overview](/overview/eye-tracking.md).
## Parameters
In addition to the [parameters available in all plugins](overview.md#parameters-available-in-all-plugins), this plugin accepts the following parameters. Parameters with a default value of *undefined* must be specified. Other parameters can be left unspecified if the default value is acceptable.
Parameter | Type | Default Value | Description
----------|------|---------------|------------
validation_points | array | `[[10,10], [10,50], [10,90], [50,10], [50,50], [50,90], [90,10], [90,50], [90,90]]` | Array of points in `[x,y]` coordinates. The default grid is 9 points. Meaning of coordinates controlled by `validation_point_coordinates` parameter.
validation_point_coordinates | string | `'percent'` | Can specify `percent` to have validation point coordinates specified in percentage of screen width and height, or `center-offset-pixels` to specify each point as the distance in pixels from the center of the screen.
roi_radius | numeric | 200 | Tolerance around the validation point in pixels when calculating the percent of gaze measurements within the acceptable range.
randomize_validation_order | bool | `false` | Whether to randomize the order of the validation points.
time_to_saccade | numeric | 1000 | The delay between the appearance of a validation point and the start of data collection, giving the participant time to fixate on the new target.
validation_duration | numeric | 2000 | The length of time to collect gaze samples at each validation point, after the `time_to_saccade` delay has elapsed.
point_size | numeric | 10 | Diameter of the validation points in pixels.
show_validation_data | bool | false | If `true` then a visualization of the validation data will be shown on the screen after the validation is complete. This will show each measured gaze location color coded by whether it is within the `roi_radius` of the target point. This is mainly intended for testing and debugging.
## Data Generated
In addition to the [default data collected by all plugins](overview.md#data-collected-by-plugins), this plugin collects the following data for each trial.
Name | Type | Value
-----|------|------
raw_gaze | array | Raw gaze data for the trial. The array will contain a nested array for each validation point. Within each nested array will be a list of `{dx, dy, t}` values giving the gaze point's offset from the target and its timestamp.
percent_in_roi | array | The percentage of samples within the `roi_radius` for each validation point.
average_offset | array | The average `x` and `y` distance from each validation point, plus the median distance `r` of the points from this average offset.
samples_per_sec | numeric | The average number of samples per second. Calculated by finding samples per second for each point and then averaging these estimates together.
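As a sketch of how `percent_in_roi` relates to `raw_gaze`, the function below computes the share of samples within `roi_radius` of one validation point. It mirrors the idea, not the plugin's internal code; the name `percentInROI` is invented here.

```javascript
// Sketch (not the plugin's own code): percentage of gaze samples
// whose offset {dx, dy} from the target lies within roiRadius.
function percentInROI(samples, roiRadius) {
  var inside = samples.filter(function (s) {
    return Math.sqrt(s.dx * s.dx + s.dy * s.dy) <= roiRadius;
  });
  return (inside.length / samples.length) * 100;
}
```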
## Example
#### 4-point validation using center-offset mode
```javascript
var validation = {
type: 'webgazer-validate',
validation_points: [[-200,-200], [-200,200], [200,-200], [200,200]],
validation_point_coordinates: 'center-offset-pixels',
show_validation_data: true
}
```


@ -48,3 +48,6 @@ Plugin | Description
[jspsych&#8209;visual&#8209;search&#8209;circle](/plugins/jspsych-visual-search-circle) | A customizable visual-search task modelled after [Wang, Cavanagh, & Green (1994)](http://dx.doi.org/10.3758/BF03206946). The subject indicates whether or not a target is present among a set of distractors. The stimuli are displayed in a circle, evenly-spaced, equidistant from a fixation point.
[jspsych&#8209;vsl&#8209;animate&#8209;occlusion](/plugins/jspsych-vsl-animate-occlusion) | A visual statistical learning paradigm based on [Fiser & Aslin (2002)](http://dx.doi.org/10.1037//0278-7393.28.3.458). A sequence of stimuli are shown in an oscillatory motion. An occluding rectangle is in the center of the display, and the stimuli change when they are behind the rectangle.
[jspsych&#8209;vsl&#8209;grid&#8209;scene](/plugins/jspsych-vsl-grid-scene) | A visual statistical learning paradigm based on [Fiser & Aslin (2001)](http://dx.doi.org/10.1111/1467-9280.00392). A scene made up of individual stimuli arranged in a grid is shown. This plugin can also generate the HTML code to render the stimuli for use in other plugins.
[jspsych&#8209;webgazer&#8209;calibrate](/plugins/jspsych-webgazer-calibrate) | Calibrates the WebGazer extension for eye tracking.
[jspsych&#8209;webgazer&#8209;init&#8209;camera](/plugins/jspsych-webgazer-init-camera) | Initializes the camera and helps the participant center their face for eye tracking.
[jspsych&#8209;webgazer&#8209;validate](/plugins/jspsych-webgazer-validate) | Performs validation to measure precision and accuracy of WebGazer eye tracking predictions.

examples/js/webgazer.js (new file)

File diff suppressed because one or more lines are too long

examples/webgazer.html (new file, 162 lines)

@ -0,0 +1,162 @@
<!DOCTYPE html>
<html>
<head>
<script src="../jspsych.js"></script>
<script src="../plugins/jspsych-html-keyboard-response.js"></script>
<script src="../plugins/jspsych-html-button-response.js"></script>
<script src="../plugins/jspsych-webgazer-init-camera.js"></script>
<script src="../plugins/jspsych-webgazer-calibrate.js"></script>
<script src="../plugins/jspsych-webgazer-validate.js"></script>
<script src="js/webgazer.js"></script>
<script src="../extensions/jspsych-ext-webgazer.js"></script>
<link rel="stylesheet" href="../css/jspsych.css">
<style>
.jspsych-content { max-width: 100%;}
</style>
</head>
<body></body>
<script>
var init_camera = {
type: 'webgazer-init-camera'
}
var calibration_instructions = {
type: 'html-button-response',
stimulus: `
<p>Great! Now the eye tracker will be calibrated to translate the image of your eyes from the webcam to a location on your screen.</p>
<p>To do this, you need to click a series of dots.</p>
<p>Keep your head still, and click on each dot as it appears. Look at the dot as you click it.</p>
`,
choices: ['Click to begin'],
post_trial_gap: 1000
}
var calibration = {
type: 'webgazer-calibrate',
calibration_points: [[50,50], [25,25], [25,75], [75,25], [75,75]],
//calibration_points: [[10,10],[10,30],[10,50],[10,70],[10,90],[30,10],[30,30],[30,50],[30,70],[30,90],[50,10],[50,30],[50,50],[50,70],[50,90],[70,10],[70,30],[70,50],[70,70],[70,90],[90,10],[90,30],[90,50],[90,70],[90,90]],
// calibration_points: [
// [10,10],[10,50],[10,90],
// [30,10],[30,50],[30,90],
// [40,10],[40,30],[40,40],[40,45],[40,50],[40,55],[40,60],[40,70],[40,90],
// [50,10],[50,30],[50,40],[50,45],[50,50],[50,55],[50,60],[50,70],[50,90],
// [60,10],[60,30],[60,40],[60,45],[60,50],[60,55],[60,60],[60,70],[60,90],
// [70,10],[70,50],[70,90],
// [90,10],[90,50],[90,90]],
repetitions_per_point: 1,
randomize_calibration_order: true,
}
var validation_instructions = {
type: 'html-button-response',
stimulus: `
<p>Let's see how accurate the eye tracking is. </p>
<p>Keep your head still, and move your eyes to focus on each dot as it appears.</p>
<p>You do not need to click on the dots. Just move your eyes to look at the dots.</p>
`,
choices: ['Click to begin'],
post_trial_gap: 1000
}
var validation = {
type: 'webgazer-validate',
validation_points: [[-200,-200], [-200,200], [200,-200], [200,200]],
validation_point_coordinates: 'center-offset-pixels',
show_validation_data: true
}
var task_instructions = {
type: 'html-button-response',
stimulus: `
<p>We're ready for the task now.</p>
<p>You'll see an arrow symbol (⬅ or ➡) appear on the screen.</p>
<p>Your job is to press A if ⬅ appears, and L if ➡ appears.</p>
<p>This will repeat 8 times.</p>
`,
choices: ['I am ready!'],
post_trial_gap: 1000
}
var fixation = {
type: 'html-keyboard-response',
stimulus: '<p style="font-size:40px;">+</p>',
choices: jsPsych.NO_KEYS,
trial_duration: 500
}
var trial = {
type: 'html-keyboard-response',
stimulus: function () {
return(
`<div style="position: relative; width: 400px; height: 400px;">
<div style="position: absolute; top:${jsPsych.timelineVariable('top', true)}%; left: ${jsPsych.timelineVariable('left', true)}%">
<span id="arrow-target" style="font-size: 40px; transform: translate(-50%, -50%);">${jsPsych.timelineVariable('direction', true) == 'left' ? '⬅' : '➡'}</span>
</div>
</div>`
)
},
choices: ['a', 'l'],
post_trial_gap: 750,
data: {
top: jsPsych.timelineVariable('top'),
left: jsPsych.timelineVariable('left')
},
extensions: [
{type: 'webgazer', params: {targets: ['#arrow-target']}}
]
}
var params = [
{ left: 0, top: 0, direction: 'left' },
{ left: 100, top: 0, direction: 'left' },
{ left: 0, top: 100, direction: 'left' },
{ left: 100, top: 100, direction: 'left' },
{ left: 0, top: 0, direction: 'right' },
{ left: 100, top: 0, direction: 'right' },
{ left: 0, top: 100, direction: 'right' },
{ left: 100, top: 100, direction: 'right' },
]
var trial_proc = {
timeline: [fixation, trial],
timeline_variables: params,
randomize_order: true
}
var done = {
type: 'html-button-response',
choices: ['CSV', 'JSON'],
stimulus: `<p>Done!</p><p>If you'd like to download a copy of the data to explore, click the format you'd like below</p>`,
on_finish: function(data){
if(data.button_pressed == 0){
jsPsych.data.get().localSave('csv','webgazer-sample-data.csv');
}
if(data.button_pressed == 1){
jsPsych.data.get().localSave('json', 'webgazer-sample-data.json');
}
}
}
var timeline = [];
timeline.push(init_camera);
timeline.push(calibration_instructions);
timeline.push(calibration);
timeline.push(validation_instructions);
timeline.push(validation);
timeline.push(task_instructions);
timeline.push(trial_proc);
timeline.push(done);
jsPsych.init({
timeline: timeline,
extensions: [
{type: 'webgazer'}
]
})
</script>
</html>


@ -0,0 +1,60 @@
<!DOCTYPE html>
<html>
<head>
<script src="../jspsych.js"></script>
<script src="../plugins/jspsych-preload.js"></script>
<script src="../plugins/jspsych-image-keyboard-response.js"></script>
<script src="../plugins/jspsych-html-keyboard-response.js"></script>
<script src="../plugins/jspsych-webgazer-init-camera.js"></script>
<script src="../plugins/jspsych-webgazer-calibrate.js"></script>
<script src="js/webgazer.js"></script>
<script src="../extensions/jspsych-ext-webgazer.js"></script>
<link rel="stylesheet" href="../css/jspsych.css">
</head>
<body></body>
<script>
var preload = {
type: 'preload',
images: ['img/blue.png']
}
var init_camera = {
type: 'webgazer-init-camera'
}
var validation = {
type: 'webgazer-calibrate',
}
var start = {
type: 'html-keyboard-response',
stimulus: 'Press any key to start.'
}
var trial = {
type: 'image-keyboard-response',
stimulus: 'img/blue.png',
render_on_canvas: false,
choices: jsPsych.NO_KEYS,
trial_duration: 1000,
extensions: [
{
type: 'webgazer',
params: {targets: ['#jspsych-image-keyboard-response-stimulus']}
}
]
}
jsPsych.init({
timeline: [preload, init_camera, validation, start, trial],
extensions: [
{type: 'webgazer'}
],
on_finish: function() {
jsPsych.data.displayData();
}
})
</script>
</html>


@ -0,0 +1,185 @@
jsPsych.extensions['webgazer'] = (function () {
var extension = {};
// private state for the extension
// extension authors can define public functions to interact
// with the state. recommend not exposing state directly
// so that state manipulations are checked.
var state = {};
// required, will be called at jsPsych.init
// should return a Promise
extension.initialize = function (params) {
return new Promise(function(resolve, reject){
if (typeof params.webgazer === 'undefined') {
if (window.webgazer) {
state.webgazer = window.webgazer;
} else {
reject(new Error('webgazer extension failed to initialize. webgazer.js not loaded. Load webgazer.js before calling jsPsych.init()'));
}
} else {
state.webgazer = params.webgazer;
}
if (typeof params.round_predictions === 'undefined'){
state.round_predictions = true;
} else {
state.round_predictions = params.round_predictions;
}
// sets up event handler for webgazer data
state.webgazer.setGazeListener(handleGazeDataUpdate);
// starts webgazer, and once it initializes we stop mouseCalibration and
// pause webgazer data.
state.webgazer.begin().then(function () {
extension.stopMouseCalibration();
extension.pause();
resolve();
})
// hide video by default
extension.hideVideo();
// hide predictions by default
extension.hidePredictions();
})
}
// required, will be called when the trial starts (before trial loads)
extension.on_start = function (params) {
state.currentTrialData = [];
state.currentTrialTargets = [];
}
// required will be called when the trial loads
extension.on_load = function (params) {
// set current trial start time
state.currentTrialStart = performance.now();
// resume data collection
state.webgazer.resume();
// set internal flag
state.activeTrial = true;
// record bounding box of any elements in params.targets
if(typeof params !== 'undefined'){
if(typeof params.targets !== 'undefined'){
for(var i=0; i<params.targets.length; i++){
var target = document.querySelector(params.targets[i]);
if(target !== null){
var bounding_rect = target.getBoundingClientRect();
state.currentTrialTargets.push({
selector: params.targets[i],
top: bounding_rect.top,
bottom: bounding_rect.bottom,
left: bounding_rect.left,
right: bounding_rect.right
})
}
}
}
}
}
// required, will be called when jsPsych.finishTrial() is called
// must return data object to be merged into data.
extension.on_finish = function (params) {
// pause the eye tracker
state.webgazer.pause();
// set internal flag
state.activeTrial = false;
// send back the gazeData
return {
webgazer_data: state.currentTrialData,
webgazer_targets: state.currentTrialTargets
}
}
extension.faceDetected = function () {
return state.webgazer.getTracker().predictionReady;
}
extension.showPredictions = function () {
state.webgazer.showPredictionPoints(true);
}
extension.hidePredictions = function () {
state.webgazer.showPredictionPoints(false);
}
extension.showVideo = function () {
state.webgazer.showVideo(true);
state.webgazer.showFaceOverlay(true);
state.webgazer.showFaceFeedbackBox(true);
}
extension.hideVideo = function () {
state.webgazer.showVideo(false);
state.webgazer.showFaceOverlay(false);
state.webgazer.showFaceFeedbackBox(false);
}
extension.resume = function () {
state.webgazer.resume();
}
extension.pause = function () {
state.webgazer.pause();
}
extension.stopMouseCalibration = function () {
state.webgazer.removeMouseEventListeners()
}
extension.startMouseCalibration = function () {
state.webgazer.addMouseEventListeners()
}
extension.calibratePoint = function (x, y) {
state.webgazer.recordScreenPosition(x, y, 'click');
}
extension.setRegressionType = function (regression_type) {
    var valid_regression_models = ['ridge', 'weightedRidge', 'threadedRidge'];
if (valid_regression_models.includes(regression_type)) {
state.webgazer.setRegression(regression_type)
} else {
console.warn('Invalid regression_type parameter for webgazer.setRegressionType. Valid options are ridge, weightedRidge, and threadedRidge.')
}
}
extension.getCurrentPrediction = async function () {
    var prediction = await state.webgazer.getCurrentPrediction();
    // guard against null: webgazer returns null when no prediction is available
    if (prediction !== null && state.round_predictions) {
      prediction.x = Math.round(prediction.x);
      prediction.y = Math.round(prediction.y);
    }
return prediction;
}
// extension.addGazeDataUpdateListener(listener){
// state.webgazer.setGazeListener(listener);
// }
function handleGazeDataUpdate(gazeData, elapsedTime) {
if (gazeData !== null && state.activeTrial) {
var d = {
x: state.round_predictions ? Math.round(gazeData.x) : gazeData.x,
y: state.round_predictions ? Math.round(gazeData.y) : gazeData.y,
t: Math.round(performance.now() - state.currentTrialStart)
}
state.currentTrialData.push(d); // add data to current trial's data
}
}
return extension;
})();
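The lifecycle implemented above (`initialize`, `on_start`, `on_load`, `on_finish`) is the full contract an extension must satisfy. A minimal skeleton following the same pattern might look like this; the names `myExtension` and `example_data` are invented for illustration, and in an experiment you would register it with `jsPsych.extensions['my-extension'] = myExtension;`:

```javascript
// Minimal extension skeleton (illustrative names, not part of jsPsych)
var myExtension = (function () {
  var extension = {};
  var state = {};

  // called at jsPsych.init; must return a Promise
  extension.initialize = function (params) {
    return Promise.resolve();
  };

  // called before the trial loads
  extension.on_start = function (params) {
    state.trialData = [];
  };

  // called once the trial's display has loaded
  extension.on_load = function (params) {
    state.trialStart = performance.now();
  };

  // called at jsPsych.finishTrial; the returned object is merged into the trial data
  extension.on_finish = function (params) {
    return { example_data: state.trialData };
  };

  return extension;
})();
```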


@ -1,10 +1,3 @@
-/**
- * jspsych.js
- * Josh de Leeuw
- *
- * documentation: docs.jspsych.org
- *
- **/
window.jsPsych = (function() {
  var core = {};
@ -106,7 +99,8 @@ window.jsPsych = (function() {
    'minimum_valid_rt': 0,
    'experiment_width': null,
    'override_safe_mode': false,
-   'case_sensitive_responses': false
+   'case_sensitive_responses': false,
+   'extensions': []
  };
  // detect whether page is running in browser as a local file, and if so, disable web audio and video preloading to prevent CORS issues
@ -195,12 +189,39 @@ window.jsPsych = (function() {
      function(){
        // success! user can continue...
        // start experiment
-       startExperiment();
+       loadExtensions();
      },
      function(){
        // fail. incompatible user.
      }
    );
function loadExtensions() {
// run the .initialize method of any extensions that are in use
// these should return a Promise to indicate when loading is complete
if (opts.extensions.length == 0) {
startExperiment();
} else {
var loaded_extensions = 0;
for (var i = 0; i < opts.extensions.length; i++) {
var ext_params = opts.extensions[i].params;
if (!ext_params) {
ext_params = {}
}
jsPsych.extensions[opts.extensions[i].type].initialize(ext_params)
.then(() => {
loaded_extensions++;
if (loaded_extensions == opts.extensions.length) {
startExperiment();
}
})
.catch((error_message) => {
console.error(error_message);
})
}
}
}
  };
  // execute init() when the document is ready
@ -262,6 +283,13 @@ window.jsPsych = (function() {
    // of the DataCollection, for easy access and editing.
    var trial_data_values = trial_data.values()[0];
// handle extension callbacks
if(Array.isArray(current_trial.extensions)){
for(var i=0; i<current_trial.extensions.length; i++){
var ext_data_values = jsPsych.extensions[current_trial.extensions[i].type].on_finish(current_trial.extensions[i].params);
Object.assign(trial_data_values, ext_data_values);
}
}
    // about to execute lots of callbacks, so switch context.
    jsPsych.internal.call_immediate = true;
@ -942,6 +970,13 @@ window.jsPsych = (function() {
      trial.on_start(trial);
    }
// call any on_start functions for extensions
if(Array.isArray(trial.extensions)){
for(var i=0; i<trial.extensions.length; i++){
        jsPsych.extensions[trial.extensions[i].type].on_start(trial.extensions[i].params);
}
}
    // apply the focus to the element containing the experiment.
    DOM_container.focus();
@ -966,6 +1001,13 @@ window.jsPsych = (function() {
      trial.on_load();
    }
// call any on_load functions for extensions
if(Array.isArray(trial.extensions)){
for(var i=0; i<trial.extensions.length; i++){
        jsPsych.extensions[trial.extensions[i].type].on_load(trial.extensions[i].params);
}
}
      // done with callbacks
      jsPsych.internal.call_immediate = false;
    }
@ -1244,6 +1286,10 @@ jsPsych.plugins = (function() {
  return module;
})();
jsPsych.extensions = (function(){
return {};
})();
jsPsych.data = (function() {
  var module = {};


@ -46,6 +46,7 @@ nav:
    - 'Record Browser Interactions': 'overview/record-browser-interactions.md'
    - 'Media Preloading': 'overview/media-preloading.md'
    - 'Fullscreen Experiments': 'overview/fullscreen.md'
- 'Eye Tracking': 'overview/eye-tracking.md'
    - 'Exclude Participants Based on Browser Features': 'overview/exclude-browser.md'
    - 'Automatic Progress Bar': 'overview/progress-bar.md'
    - 'Integrating with Prolific': 'overview/prolific.md'
@ -103,6 +104,12 @@ nav:
    - 'jspsych-visual-search-circle': 'plugins/jspsych-visual-search-circle.md'
    - 'jspsych-vsl-animate-occlusion': 'plugins/jspsych-vsl-animate-occlusion.md'
    - 'jspsych-vsl-grid-scene': 'plugins/jspsych-vsl-grid-scene.md'
- 'jspsych-webgazer-calibrate': 'plugins/jspsych-webgazer-calibrate.md'
- 'jspsych-webgazer-init-camera': 'plugins/jspsych-webgazer-init-camera.md'
- 'jspsych-webgazer-validate': 'plugins/jspsych-webgazer-validate.md'
- Extensions:
- 'Extensions': 'extensions/extensions.md'
- 'jspsych-ext-webgazer.js': 'extensions/jspsych-ext-webgazer.md'
  - About:
    - 'About jsPsych': 'about/about.md'
    - 'Getting Help': 'about/support.md'


@ -1,6 +1,6 @@
{
  "name": "jspsych",
- "version": "6.1.0",
+ "version": "6.3.0",
  "description": "Behavioral experiments in a browser",
  "main": "jspsych.js",
  "directories": {


@ -0,0 +1,166 @@
/**
* jspsych-webgazer-calibrate
* Josh de Leeuw
**/
jsPsych.plugins["webgazer-calibrate"] = (function() {
var plugin = {};
plugin.info = {
name: 'webgazer-calibrate',
description: '',
parameters: {
calibration_points: {
type: jsPsych.plugins.parameterType.INT,
default: [[10,10], [10,50], [10,90], [50,10], [50,50], [50,90], [90,10], [90,50], [90,90]]
},
calibration_mode: {
type: jsPsych.plugins.parameterType.STRING,
default: 'click', // options: 'click', 'view'
},
repetitions_per_point: {
type: jsPsych.plugins.parameterType.INT,
default: 1
},
randomize_calibration_order: {
type: jsPsych.plugins.parameterType.BOOL,
default: false
},
time_to_saccade: {
type: jsPsych.plugins.parameterType.INT,
default: 1000
},
      time_per_point: {
        type: jsPsych.plugins.parameterType.INT,
        default: 1000
      }
}
}
// provide options for calibration routines?
// dot clicks?
// track a dot with mouse?
// then a validation phase of staring at the dot in different locations?
plugin.trial = function(display_element, trial) {
var html = `
<div id='webgazer-calibrate-container' style='position: relative; width:100vw; height:100vh'>
</div>`
display_element.innerHTML = html;
jsPsych.extensions['webgazer'].resume();
var wg_container = display_element.querySelector('#webgazer-calibrate-container');
var reps_completed = 0;
var points_completed = -1;
var cal_points = null;
calibrate();
function calibrate(){
jsPsych.extensions['webgazer'].resume();
if(trial.calibration_mode == 'click'){
jsPsych.extensions['webgazer'].startMouseCalibration();
}
next_calibration_round();
}
function next_calibration_round(){
if(trial.randomize_calibration_order){
cal_points = jsPsych.randomization.shuffle(trial.calibration_points);
} else {
cal_points = trial.calibration_points;
}
points_completed = -1;
next_calibration_point();
}
function next_calibration_point(){
points_completed++;
if(points_completed == cal_points.length){
reps_completed++;
if(reps_completed == trial.repetitions_per_point){
calibration_done();
} else {
next_calibration_round();
}
} else {
var pt = cal_points[points_completed];
calibration_display_gaze_only(pt);
}
}
function calibration_display_gaze_only(pt){
var pt_html = '<div id="calibration-point" style="width:10px; height:10px; border-radius:10px; border: 1px solid #000; background-color: #333; position: absolute; left:'+pt[0]+'%; top:'+pt[1]+'%;"></div>'
wg_container.innerHTML = pt_html;
var pt_dom = wg_container.querySelector('#calibration-point');
if(trial.calibration_mode == 'click'){
pt_dom.style.cursor = 'pointer';
pt_dom.addEventListener('click', function(){
next_calibration_point();
})
}
if(trial.calibration_mode == 'view'){
var br = pt_dom.getBoundingClientRect();
var x = br.left + br.width / 2;
var y = br.top + br.height / 2;
var pt_start_cal = performance.now() + trial.time_to_saccade;
var pt_finish = performance.now() + trial.time_to_saccade + trial.time_per_point;
requestAnimationFrame(function watch_dot(){
if(performance.now() > pt_start_cal){
          jsPsych.extensions['webgazer'].calibratePoint(x, y);
}
if(performance.now() < pt_finish){
requestAnimationFrame(watch_dot);
} else {
next_calibration_point();
}
})
}
}
function calibration_done(){
if(trial.calibration_mode == 'click'){
jsPsych.extensions['webgazer'].stopMouseCalibration();
}
wg_container.innerHTML = "";
end_trial();
}
// function to end trial when it is time
function end_trial() {
jsPsych.extensions['webgazer'].pause();
jsPsych.extensions['webgazer'].hidePredictions();
jsPsych.extensions['webgazer'].hideVideo();
// kill any remaining setTimeout handlers
jsPsych.pluginAPI.clearAllTimeouts();
// gather the data to store for the trial
var trial_data = {
};
// clear the display
display_element.innerHTML = '';
// move on to the next trial
jsPsych.finishTrial(trial_data);
};
};
return plugin;
})();
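A trial using this plugin's `view` calibration mode could be declared as follows; the point layout and timing values are illustrative, not defaults:

```javascript
// illustrative 'view'-mode calibration trial (values are examples)
var calibration_view = {
  type: 'webgazer-calibrate',
  calibration_mode: 'view',   // calibrate while the participant looks at each dot
  calibration_points: [[25,25], [75,25], [50,50], [25,75], [75,75]],
  time_to_saccade: 1000,      // ms allowed for the eyes to reach a new dot
  time_per_point: 1500,       // ms of calibration data collected per dot
  repetitions_per_point: 2
}
```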


@ -0,0 +1,95 @@
/**
* jspsych-webgazer-init-camera
* Josh de Leeuw
**/
jsPsych.plugins["webgazer-init-camera"] = (function() {
var plugin = {};
plugin.info = {
name: 'webgazer-init-camera',
description: '',
parameters: {
instructions: {
type: jsPsych.plugins.parameterType.HTML_STRING,
default: `
<p>Position your head so that the webcam has a good view of your eyes.</p>
<p>Use the video in the upper-left corner as a guide. Center your face in the box and look directly towards the camera.</p>
<p>It is important that you try and keep your head reasonably still throughout the experiment, so please take a moment to adjust your setup as needed.</p>
<p>When your face is centered in the box and the box turns green, you can click to continue.</p>`
},
button_text: {
type: jsPsych.plugins.parameterType.STRING,
default: 'Continue'
}
}
}
plugin.trial = function(display_element, trial) {
var html = `
<div id='webgazer-init-container' style='position: relative; width:100vw; height:100vh'>
</div>`
display_element.innerHTML = html;
jsPsych.extensions['webgazer'].showVideo();
jsPsych.extensions['webgazer'].resume();
var wg_container = display_element.querySelector('#webgazer-init-container');
wg_container.innerHTML = `
<div style='position: absolute; top: 50%; left: calc(50% - 350px); transform: translateY(-50%); width:700px;'>
${trial.instructions}
<button id='jspsych-wg-cont' class='jspsych-btn' disabled>${trial.button_text}</button>
</div>`
var observer = new MutationObserver(face_detect_event_observer);
observer.observe(document, {
attributes: true,
attributeFilter: ['style'],
subtree: true
});
document.querySelector('#jspsych-wg-cont').addEventListener('click', function(){
observer.disconnect();
end_trial();
});
function face_detect_event_observer(mutationsList, observer){
if(mutationsList[0].target == document.querySelector('#webgazerFaceFeedbackBox')){
if(mutationsList[0].type == 'attributes' && mutationsList[0].target.style.borderColor == "green"){
document.querySelector('#jspsych-wg-cont').disabled = false;
}
if(mutationsList[0].type == 'attributes' && mutationsList[0].target.style.borderColor == "red"){
document.querySelector('#jspsych-wg-cont').disabled = true;
}
}
}
// function to end trial when it is time
function end_trial() {
jsPsych.extensions['webgazer'].pause();
jsPsych.extensions['webgazer'].hideVideo();
// kill any remaining setTimeout handlers
jsPsych.pluginAPI.clearAllTimeouts();
// gather the data to store for the trial
var trial_data = {
};
// clear the display
display_element.innerHTML = '';
// move on to the next trial
jsPsych.finishTrial(trial_data);
};
};
return plugin;
})();


@ -0,0 +1,304 @@
/**
* jspsych-webgazer-validate
* Josh de Leeuw
**/
jsPsych.plugins["webgazer-validate"] = (function() {
var plugin = {};
plugin.info = {
name: 'webgazer-validate',
description: '',
parameters: {
validation_points: {
type: jsPsych.plugins.parameterType.INT,
default: [[10,10], [10,50], [10,90], [50,10], [50,50], [50,90], [90,10], [90,50], [90,90]]
},
validation_point_coordinates: {
type: jsPsych.plugins.parameterType.STRING,
default: 'percent' // options: 'percent', 'center-offset-pixels'
},
roi_radius: {
type: jsPsych.plugins.parameterType.INT,
default: 200
},
randomize_validation_order: {
type: jsPsych.plugins.parameterType.BOOL,
default: false
},
time_to_saccade: {
type: jsPsych.plugins.parameterType.INT,
default: 1000
},
validation_duration: {
type: jsPsych.plugins.parameterType.INT,
default: 2000
},
point_size:{
type: jsPsych.plugins.parameterType.INT,
default: 10
},
show_validation_data: {
type: jsPsych.plugins.parameterType.BOOL,
default: false
}
}
}
plugin.trial = function(display_element, trial) {
var trial_data = {}
trial_data.raw_gaze = [];
trial_data.percent_in_roi = [];
trial_data.average_offset = [];
var html = `
<div id='webgazer-validate-container' style='position: relative; width:100vw; height:100vh; overflow: hidden;'>
</div>`
display_element.innerHTML = html;
var wg_container = display_element.querySelector('#webgazer-validate-container');
var points_completed = -1;
var val_points = null;
var start = performance.now();
validate();
function validate(){
if(trial.randomize_validation_order){
val_points = jsPsych.randomization.shuffle(trial.validation_points);
} else {
val_points = trial.validation_points;
}
points_completed = -1;
jsPsych.extensions['webgazer'].resume();
//jsPsych.extensions.webgazer.showPredictions();
next_validation_point();
}
function next_validation_point(){
points_completed++;
if(points_completed == val_points.length){
validation_done();
} else {
var pt = val_points[points_completed];
validation_display(pt);
}
}
function validation_display(pt){
var pt_html = drawValidationPoint(pt[0], pt[1]);
wg_container.innerHTML = pt_html;
var pt_dom = wg_container.querySelector('.validation-point');
var br = pt_dom.getBoundingClientRect();
var x = br.left + br.width / 2;
var y = br.top + br.height / 2;
var pt_start_val = performance.now() + trial.time_to_saccade;
var pt_finish = pt_start_val + trial.validation_duration;
var pt_data = [];
requestAnimationFrame(function watch_dot(){
if(performance.now() > pt_start_val){
jsPsych.extensions['webgazer'].getCurrentPrediction().then(function(prediction){
pt_data.push({dx: prediction.x - x, dy: prediction.y - y, t: Math.round(performance.now()-start)});
});
}
if(performance.now() < pt_finish){
requestAnimationFrame(watch_dot);
} else {
trial_data.raw_gaze.push(pt_data);
next_validation_point();
}
});
}
function drawValidationPoint(x,y){
if(trial.validation_point_coordinates == 'percent'){
return drawValidationPoint_PercentMode(x,y);
}
if(trial.validation_point_coordinates == 'center-offset-pixels'){
return drawValidationPoint_CenterOffsetMode(x,y);
}
}
function drawValidationPoint_PercentMode(x,y){
return `<div class="validation-point" style="width:${trial.point_size}px; height:${trial.point_size}px; border-radius:${trial.point_size}px; border: 1px solid #000; background-color: #333; position: absolute; left:${x}%; top:${y}%;"></div>`
}
function drawValidationPoint_CenterOffsetMode(x,y){
return `<div class="validation-point" style="width:${trial.point_size}px; height:${trial.point_size}px; border-radius:${trial.point_size}px; border: 1px solid #000; background-color: #333; position: absolute; left:calc(50% - ${trial.point_size/2}px + ${x}px); top:calc(50% - ${trial.point_size/2}px + ${y}px);"></div>`
}
function drawCircle(target_x, target_y, dx, dy, r){
if(trial.validation_point_coordinates == 'percent'){
return drawCircle_PercentMode(target_x, target_y, dx, dy, r);
}
if(trial.validation_point_coordinates == 'center-offset-pixels'){
return drawCircle_CenterOffsetMode(target_x, target_y, dx, dy, r);
}
}
function drawCircle_PercentMode(target_x, target_y, dx, dy, r){
var html = `
<div class="validation-centroid" style="width:${r*2}px; height:${r*2}px; border: 2px dotted #ccc; border-radius: ${r}px; background-color: transparent; position: absolute; left:calc(${target_x}% + ${dx-r}px); top:calc(${target_y}% + ${dy-r}px);"></div>
`
return html;
}
function drawCircle_CenterOffsetMode(target_x, target_y, dx, dy, r){
var html = `
<div class="validation-centroid" style="width:${r*2}px; height:${r*2}px; border: 2px dotted #ccc; border-radius: ${r}px; background-color: transparent; position: absolute; left:calc(50% + ${target_x}px + ${dx-r}px); top:calc(50% + ${target_y}px + ${dy-r}px);"></div>
`
return html;
}
function drawRawDataPoint(target_x, target_y, dx, dy){
if(trial.validation_point_coordinates == 'percent'){
return drawRawDataPoint_PercentMode(target_x, target_y, dx, dy);
}
if(trial.validation_point_coordinates == 'center-offset-pixels'){
return drawRawDataPoint_CenterOffsetMode(target_x, target_y, dx, dy);
}
}
function drawRawDataPoint_PercentMode(target_x, target_y, dx, dy){
var color = Math.sqrt(dx*dx + dy*dy) <= trial.roi_radius ? '#afa' : '#faa';
return `<div class="raw-data-point" style="width:5px; height:5px; border-radius:5px; background-color: ${color}; opacity:0.8; position: absolute; left:calc(${target_x}% + ${dx-2}px); top:calc(${target_y}% + ${dy-2}px);"></div>`
}
function drawRawDataPoint_CenterOffsetMode(target_x, target_y, dx, dy){
var color = Math.sqrt(dx*dx + dy*dy) <= trial.roi_radius ? '#afa' : '#faa';
return `<div class="raw-data-point" style="width:5px; height:5px; border-radius:5px; background-color: ${color}; opacity:0.8; position: absolute; left:calc(50% + ${target_x}px + ${dx-2}px); top:calc(50% + ${target_y}px + ${dy-2}px);"></div>`
}
function median(arr){
// copy before sorting so the caller's array is not reordered in place
var sorted_arr = arr.slice().sort((a,b) => a-b);
var mid = Math.floor(sorted_arr.length/2);
if(sorted_arr.length % 2 == 0){
// average the two middle values; parentheses matter here
return (sorted_arr[mid-1] + sorted_arr[mid]) / 2;
} else {
return sorted_arr[mid];
}
}
function calculateGazeCentroid(gazeData){
var x_diff_m = gazeData.reduce(function(sum, p){ return sum + p.dx; }, 0) / gazeData.length;
var y_diff_m = gazeData.reduce(function(sum, p){ return sum + p.dy; }, 0) / gazeData.length;
var median_distance = median(gazeData.map(function(x){ return(Math.sqrt(Math.pow(x.dx-x_diff_m,2) + Math.pow(x.dy-y_diff_m,2)))}));
return {
x: x_diff_m,
y: y_diff_m,
r: median_distance
}
}
function calculatePercentInROI(gazeData){
var distances = gazeData.map(function(p){
return(Math.sqrt(Math.pow(p.dx,2) + Math.pow(p.dy,2)))
});
var sum_in_roi = distances.reduce(function(accumulator, currentValue){
if(currentValue <= trial.roi_radius){
accumulator++;
}
return accumulator;
}, 0);
var percent = sum_in_roi / gazeData.length * 100;
return percent;
}
function calculateSampleRate(gazeData){
var mean_diff = [];
for(var i=0; i<gazeData.length; i++){
if(gazeData[i].length > 1){
var t_diff = [];
for(var j=1; j<gazeData[i].length; j++){
t_diff.push(gazeData[i][j].t - gazeData[i][j-1].t)
}
mean_diff.push(t_diff.reduce(function(a,b) { return(a+b) },0) / t_diff.length);
}
}
if(mean_diff.length > 0){
return 1000 / (mean_diff.reduce(function(a,b) { return(a+b) }, 0) / mean_diff.length);
} else {
return null;
}
}
function validation_done(){
// calculateSampleRate returns null when no point collected more than one sample
var sample_rate = calculateSampleRate(trial_data.raw_gaze);
trial_data.samples_per_sec = sample_rate == null ? null : sample_rate.toFixed(2);
for(var i=0; i<trial.validation_points.length; i++){
trial_data.percent_in_roi[i] = calculatePercentInROI(trial_data.raw_gaze[i]);
trial_data.average_offset[i] = calculateGazeCentroid(trial_data.raw_gaze[i]);
}
if(trial.show_validation_data){
show_validation_data();
} else {
end_trial();
}
}
function show_validation_data(){
var html = '';
for(var i=0; i<trial.validation_points.length; i++){
html += drawValidationPoint(trial.validation_points[i][0], trial.validation_points[i][1]);
html += drawCircle(trial.validation_points[i][0], trial.validation_points[i][1], 0, 0, trial.roi_radius);
for(var j=0; j<trial_data.raw_gaze[i].length; j++){
html += drawRawDataPoint(trial.validation_points[i][0], trial.validation_points[i][1], trial_data.raw_gaze[i][j].dx, trial_data.raw_gaze[i][j].dy)
}
}
html += '<button id="cont" style="position:absolute; top: 50%; left:calc(50% - 50px); width: 100px;" class="jspsych-btn">Continue</button>';
wg_container.innerHTML = html;
wg_container.querySelector('#cont').addEventListener('click', end_trial);
jsPsych.extensions.webgazer.showPredictions();
}
// function to end trial when it is time
function end_trial() {
jsPsych.extensions['webgazer'].pause();
jsPsych.extensions['webgazer'].hidePredictions();
jsPsych.extensions['webgazer'].hideVideo();
// kill any remaining setTimeout handlers
jsPsych.pluginAPI.clearAllTimeouts();
// clear the display
display_element.innerHTML = '';
// move on to the next trial
jsPsych.finishTrial(trial_data);
};
};
return plugin;
})();
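The summary statistics used by the plugin (median offset, gaze centroid, percent of samples inside the region of interest) are easy to sanity-check in isolation. Below is a minimal standalone sketch of the same computations, not part of the plugin itself; the sample data and function names are illustrative.

```javascript
// Standalone versions of the validation statistics, for illustration only.
function median(arr) {
  // copy before sorting so the caller's array is not mutated
  const sorted = arr.slice().sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 === 0
    ? (sorted[mid - 1] + sorted[mid]) / 2
    : sorted[mid];
}

// mean offset of the gaze samples from the target, plus the median
// distance of samples from that centroid (a spread estimate)
function gazeCentroid(gazeData) {
  const x = gazeData.reduce((s, p) => s + p.dx, 0) / gazeData.length;
  const y = gazeData.reduce((s, p) => s + p.dy, 0) / gazeData.length;
  const r = median(gazeData.map(p => Math.hypot(p.dx - x, p.dy - y)));
  return { x, y, r };
}

// percentage of samples whose raw offset falls within the ROI radius
function percentInROI(gazeData, roiRadius) {
  const inside = gazeData.filter(p => Math.hypot(p.dx, p.dy) <= roiRadius).length;
  return (inside / gazeData.length) * 100;
}

const samples = [{ dx: 3, dy: 4 }, { dx: 0, dy: 0 }, { dx: -3, dy: -4 }];
const centroid = gazeCentroid(samples); // offsets cancel: centroid at (0, 0)
const pct = percentInROI(samples, 5);   // all three samples within radius 5
```
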

@@ -0,0 +1,207 @@
const utils = require('../testing-utils.js');
const root = '../../';
jest.useFakeTimers();
describe('jsPsych.extensions', function () {
beforeEach(function () {
require(root + 'jspsych.js');
require(root + 'plugins/jspsych-html-keyboard-response.js');
require('./test-extension.js');
});
test('initialize is called at start of experiment', function () {
var initFunc = jest.spyOn(jsPsych.extensions.test, 'initialize');
var timeline = [{type: 'html-keyboard-response', stimulus: 'foo'}];
jsPsych.init({
timeline,
extensions: [{type: 'test'}]
});
expect(initFunc).toHaveBeenCalled();
});
test('initialize gets params', function(){
var initFunc = jest.spyOn(jsPsych.extensions.test, 'initialize');
var timeline = [{type: 'html-keyboard-response', stimulus: 'foo'}];
jsPsych.init({
timeline,
extensions: [{type: 'test', params: {foo: 1}}]
});
expect(initFunc).toHaveBeenCalledWith({foo: 1});
});
test('on_start is called before trial', function(){
var onStartFunc = jest.spyOn(jsPsych.extensions.test, 'on_start');
var trial = {
type: 'html-keyboard-response',
stimulus: 'foo',
extensions: [
{type: 'test'}
],
on_load: function(){
expect(onStartFunc).toHaveBeenCalled();
}
}
jsPsych.init({
timeline: [trial]
});
utils.pressKey('a');
});
test('on_start gets params', function(){
var onStartFunc = jest.spyOn(jsPsych.extensions.test, 'on_start');
var trial = {
type: 'html-keyboard-response',
stimulus: 'foo',
extensions: [
{type: 'test', params: {foo: 1}}
],
on_load: function(){
expect(onStartFunc).toHaveBeenCalledWith({foo: 1});
}
}
jsPsych.init({
timeline: [trial]
});
utils.pressKey('a');
});
test('on_load is called after load', function(){
var onLoadFunc = jest.spyOn(jsPsych.extensions.test, 'on_load');
var trial = {
type: 'html-keyboard-response',
stimulus: 'foo',
extensions: [
{type: 'test'}
],
on_load: function(){
// trial load happens before extension load
expect(onLoadFunc).not.toHaveBeenCalled();
}
}
jsPsych.init({
timeline: [trial]
});
expect(onLoadFunc).toHaveBeenCalled();
utils.pressKey('a');
});
test('on_load gets params', function(){
var onLoadFunc = jest.spyOn(jsPsych.extensions.test, 'on_load');
var trial = {
type: 'html-keyboard-response',
stimulus: 'foo',
extensions: [
{type: 'test', params: {foo:1}}
]
}
jsPsych.init({
timeline: [trial]
});
expect(onLoadFunc).toHaveBeenCalledWith({foo:1});
utils.pressKey('a');
});
test('on_finish called after trial', function(){
var onFinishFunc = jest.spyOn(jsPsych.extensions.test, 'on_finish');
var trial = {
type: 'html-keyboard-response',
stimulus: 'foo',
extensions: [
{type: 'test', params: {foo:1}}
]
}
jsPsych.init({
timeline: [trial]
});
expect(onFinishFunc).not.toHaveBeenCalled();
utils.pressKey('a');
expect(onFinishFunc).toHaveBeenCalled();
});
test('on_finish gets params', function(){
var onFinishFunc = jest.spyOn(jsPsych.extensions.test, 'on_finish');
var trial = {
type: 'html-keyboard-response',
stimulus: 'foo',
extensions: [
{type: 'test', params: {foo:1}}
]
}
jsPsych.init({
timeline: [trial]
});
utils.pressKey('a');
expect(onFinishFunc).toHaveBeenCalledWith({foo:1});
});
test('on_finish adds trial data', function(){
var trial = {
type: 'html-keyboard-response',
stimulus: 'foo',
extensions: [
{type: 'test', params: {foo:1}}
]
}
jsPsych.init({
timeline: [trial]
});
utils.pressKey('a');
expect(jsPsych.data.get().values()[0].extension_data).toBe(true);
});
test('on_finish data is available in trial on_finish', function(){
var trial = {
type: 'html-keyboard-response',
stimulus: 'foo',
extensions: [
{type: 'test', params: {foo:1}}
],
on_finish: function(data){
expect(data.extension_data).toBe(true);
}
}
jsPsych.init({
timeline: [trial]
});
utils.pressKey('a');
});
});

@@ -0,0 +1,42 @@
jsPsych.extensions['test'] = (function () {
var extension = {};
// private state for the extension
// extension authors can define public functions to interact
// with the state. recommend not exposing state directly
// so that state manipulations are checked.
var state = {};
// required, will be called at jsPsych.init
// should return a Promise
extension.initialize = function (params) {
return new Promise(function(resolve, reject){
resolve();
});
}
// required, will be called when the trial starts (before trial loads)
extension.on_start = function (params) {
}
// required will be called when the trial loads
extension.on_load = function (params) {
}
// required, will be called when jsPsych.finishTrial() is called
// must return data object to be merged into data.
extension.on_finish = function (params) {
// send back data
return {
extension_data: true
}
}
return extension;
})();
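Putting the pieces together, the tests above imply the following usage pattern for an extension in an experiment; this is a sketch assembled from the test cases, with the `test` extension type and `params` values taken from them.

```javascript
// Attach the extension to a single trial; on_start, on_load, and
// on_finish fire around this trial, and the params object is passed
// to each callback.
var trial = {
  type: 'html-keyboard-response',
  stimulus: 'foo',
  extensions: [
    { type: 'test', params: { foo: 1 } }
  ]
};

jsPsych.init({
  timeline: [trial],
  // initialize() for each listed extension runs once, at jsPsych.init
  extensions: [{ type: 'test' }]
});
```

Whatever object `on_finish` returns (here `{extension_data: true}`) is merged into the trial's data, so it is available both in the trial's own `on_finish` callback and in `jsPsych.data.get()`.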