Boxes

Given a `box` entry in `params.geometries`, Scale will annotate your image or video with boxes and return the position and dimensions of the boxes.

Parameters

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `objects_to_annotate` | array | `[]` | An array of strings or `LabelDescription` objects. |
| `min_height` | integer | `0` | The minimum height, in pixels, of the bounding boxes you'd like to be made. |
| `min_width` | integer | `0` | The minimum width, in pixels, of the bounding boxes you'd like to be made. |
| `can_rotate` | boolean | `false` | Allows a tasker to rotate the bounding box. |
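For orientation, here is a minimal sketch of how these parameters might sit inside a task creation payload. Only the `params.geometries.box` structure and its parameter names come from the table above; the surrounding fields (`attachment`, `callback_url`) and the example values are illustrative assumptions.

```python
# Illustrative sketch only: shows where the box geometry parameters live
# under params.geometries. The attachment/callback_url values are placeholders.
payload = {
    "callback_url": "https://example.com/callback",  # placeholder
    "attachment": "https://example.com/image.jpg",   # placeholder
    "params": {
        "geometries": {
            "box": {
                "objects_to_annotate": ["pedestrian", "car"],
                "min_height": 10,   # boxes must be at least 10 px tall
                "min_width": 10,    # boxes must be at least 10 px wide
                "can_rotate": False,
            }
        }
    },
}
```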

Response Fields

| Key | Type | Description |
| --- | --- | --- |
| `uuid` | string | A computer-generated unique identifier for this annotation. In video annotation tasks, this can be used to track the same object across frames. |
| `type` | string | String indicating the geometry type: `box`. |
| `label` | string | The label of this annotation, chosen from the `objects_to_annotate` array for its geometry. In video annotation tasks, any annotation objects with the same `uuid` will have the same label across all frames. |
| `attributes` | object | See the Annotation Attributes section for more details about the `attributes` response field. |
| `left` | float | The distance, in pixels, between the left border of the bounding box and the left border of the image. |
| `top` | float | The distance, in pixels, between the top border of the bounding box and the top border of the image. |
| `width` | float | The width, in pixels, of the bounding box. |
| `height` | float | The height, in pixels, of the bounding box. |
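Because `left`, `top`, `width`, and `height` are all pixel measurements from the image's top-left corner, a box annotation maps directly onto a pixel region of the source image. A minimal sketch of cropping that region with Pillow (the library choice and helper name are assumptions made here, not part of the Scale response):

```python
from PIL import Image

def crop_box(image_path: str, annotation: dict) -> Image.Image:
    """Crop the region described by a box annotation out of the source image.

    Pillow's crop() expects (left, upper, right, lower), so the right and
    lower edges are derived from left + width and top + height.
    """
    image = Image.open(image_path)
    left, top = annotation["left"], annotation["top"]
    right = left + annotation["width"]
    lower = top + annotation["height"]
    return image.crop((int(left), int(top), int(right), int(lower)))
```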

If `can_rotate` was set to `true`, the following fields will supersede the above fields:

| Key | Type | Description |
| --- | --- | --- |
| `rotation` | float | The clockwise rotation, in radians. |
| `vertices` | array of objects, each of the form `{x: 0, y: 0}` | The vertices of the rotated bounding box. |
| `left` | float | The distance, in pixels, between the left border of the unrotated bounding box and the left border of the image. |
| `top` | float | The distance, in pixels, between the top border of the unrotated bounding box and the top border of the image. |
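Both response shapes can be handled uniformly by preferring the `vertices` array when it is present and otherwise reconstructing the four corners from `left`, `top`, `width`, and `height`. A short sketch (the helper name and return format are choices made for illustration, not part of the API):

```python
def box_corners(annotation: dict) -> list[tuple[float, float]]:
    """Return the four corner points of a box annotation as (x, y) tuples.

    Rotated boxes (tasks created with can_rotate set to true) carry an
    explicit vertices array; axis-aligned boxes are rebuilt from
    left/top/width/height.
    """
    if "vertices" in annotation:
        return [(v["x"], v["y"]) for v in annotation["vertices"]]
    left, top = annotation["left"], annotation["top"]
    width, height = annotation["width"], annotation["height"]
    return [
        (left, top),
        (left + width, top),
        (left + width, top + height),
        (left, top + height),
    ]
```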

Example response:

```json
{
  "response": {
    "annotations": [
      {
        "type": "box",
        "label": "pedestrian",
        "attributes": {
            "moving": "yes"
        },
        "left": 2,
        "top": 4,
        "width": 3,
        "height": 5,
        "uuid": "65ec1f52-5902-4b39-bea9-ab6b4d58ef42"
      },
      {
        "type": "box",
        "label": "car",
        "attributes": {
            "moving": "yes"
        },
        "left": 7,
        "top": 5,
        "width": 14,
        "height": 5,
        "uuid": "0a6cd019-a014-4c67-bd49-c269ba08028a"
      },
      { ... },
      { ... }
    ]
  },
  "task_id": "5774cc78b01249ab09f089dd",
  "task": {
    // populated task for convenience
    ...
  }
}
```
Example response when `can_rotate` is `true`:

```json
{
  "response": {
    "annotations" : [ 
      {
        "label" : "car",
        "attributes" : {},
        "uuid" : "122a4270-f9b2-4f66-a9ca-2e06f0de66e5",
        "width" : 121.878523862864,
        "height" : 71.6961921895555,
        "rotation" : 1.2440145049532,
        "left" : 613.440037825633,
        "top" : 199.208745812549,
        "type" : "box",
        "vertices" : [ 
          {
            "x" : 688.769014855216,
            "y" : 165.835344251165
          }, 
          {
            "x" : 727.891633787782,
            "y" : 281.264089660824
          }, 
          {
             "x" : 659.989584658913,
            "y" : 304.27833956349
          }, 
          {
            "x" : 620.866965726348,
            "y" : 188.84959415383
          }
        ]
      },
      { ... },
      { ... }
    ]
  },
  "task_id": "5774cc78b01249ab09f089dd",
  "task": {
    // populated task for convenience
    ...
  }
}
```