Detect Objects Using Deep Learning

Description

The DetectObjectsUsingDeepLearning operation detects objects in imagery using a designated deep learning model and generates a feature service containing the detected objects.

Request parameters

inputRaster

(Required)

The image that will be used to detect objects. This can be specified as the portal item ID, image service URL, cloud raster dataset, shared raster dataset, a feature service with image attachments, or a raster dataset or image collection in the data store. At least one type of input must be provided in the JSON object. If multiple inputs are provided, the itemId takes priority.

Syntax: A JSON object describes the input raster.

Example:


//Portal Item ID
inputRaster={"itemId": <portal item id>}

//Image Service URL
inputRaster={"url": <image service url>}

//Feature Service URL
inputRaster={"url": <feature service url>}

//Cloud Raster URI or Shared Data Path
inputRaster={"uri": <cloud raster uri or shared data path>}

//Service Properties
inputRaster={"serviceProperties":{"name":"testrasteranalysis","serviceUrl":"https://<server name>/server/rest/services/Hosted/testrasteranalysis/ImageServer"},"itemProperties":{"itemId":"8cfbd3ec25584d0d8f4ed23b8ff7c43b", "folderId":"sdfwerfbd3ec25584d0d8f4"}}

//Data store URI 
inputRaster={"uri":"/rasterStores/rasterstorename/A/B/C"}
or
inputRaster={"uri":"/fileShares/filesharedatastorename/A/B/C"}
or
inputRaster={"uri":"/cloudStores/cloudstorename/A/B/C"}
outputObjects

(Required)

The output hosted feature service properties. If there is an existing hosted feature service, you can provide the portal item ID or service URL, and the output path of the generated feature class will be used to update the existing service definition. The service tool can also generate a new hosted feature service with the service properties provided.

The output hosted feature service is stored and shared on the hosting server.

Syntax: A JSON object describes the output feature service.

Example:


//Portal Item ID
outputObjects={"itemId": <portal item id>}

//Hosted Feature Service URL
outputObjects={"url": <hosted feature service url>}

//Feature Class Local Output Path
outputObjects={"uri": <feature class local output path>}

//Service Properties
outputObjects={"serviceProperties":{"name":"testrasteranalysis","serviceUrl":"https://<server name>/server/rest/services/Hosted/testrasteranalysis/FeatureServer"},"itemProperties":{"itemId":"8cfbd3ec25584d0d8fed23b8ff7c43b", "folderId":"sdfwerfbd3ec25584d0d8f4"}}

model

(Required)

The deep learning model to use to detect objects. This can be specified as the deep learning model portal item ID, an .emd or .dlpk file, or the entire JSON string of the model definition.

Syntax: A JSON object describes the model.

Example:


//Portal Item
model={"itemId": "x2u130909jcvojzkeeraedf"}
model={"url": "https://<portal name>/portal/sharing/rest/content/items/x2u130909jcvojzkeeraedf"}

//.emd or .dlpk file
model={"uri": "\\\\sharedstorage\\sharefolder\\DetectTrees.emd"}
model={"uri": "\\\\sharedstorage\\sharefolder\\DetectTrees.dlpk"}
model={"uri": "/rasterStores/rasterstorename/A/B/DetectTrees.emd"}
model={"uri": "/rasterStores/rasterstorename/A/B/DetectTrees.dlpk"}

//.emd or .dlpk file stored in raster store with file share type
model = {"uri": "/fileShares/filesharedatastorename/A/B/ClassifyHouseDamage.emd"}
model={"uri": "/fileShares/filesharedatastorename/A/B/model.dlpk"}

JSON object example


model={"Framework":"TensorFlow","ModelConfiguration":"ObjectDetectionAPI","ModelFile":"frozen_inference_graph.pb","ModelType":"ObjectDetection","ImageHeight":850,"ImageWidth":850,"ExtractBands":[0,1,2],"Classes":[{"Value": 0,"Name":"Tree","Color":[0,255,0]}]}
modelArguments

The name-value pairs of model arguments that can be customized by clients.

Syntax: A JSON object describes the value pairs of arguments.

Example:

modelArguments={"name1":"value1","name2": "value2"}
runNMS

Specifies whether to perform nonmaximum suppression, in which duplicate detected objects are identified and the duplicate with the lower confidence value is removed. When this parameter is set to true, duplicate objects with lower confidence scores will be removed. When it's set to false, all detected objects will be included in the output feature class. The default is false. A Python sketch of this suppression logic follows the parameter list.

Values: true | false

confidenceScoreField

The name of the field in the feature service that contains the confidence scores as output by the object detection method. This parameter is required when runNMS is set to true.

Example

confidenceScoreField="Confidence"
classValueField

The name of the class value field in the output feature service. If no value is specified, the tool will use the standard class value fields Classvalue and Value. If these fields do not exist, all features will be treated as the same object class.

Example

classValueField="Classvalue"
maxOverlapRatio

The maximum overlap ratio for two overlapping objects, which is defined as the ratio of intersection area over union area. The default is 0.

Example

maxOverlapRatio=0.3
processAllRasterItems

Specifies how raster items in an image service will be processed. When this parameter is set to true, all raster items in the image service will be processed as separate images. When it's set to false, all raster items in the image service will be mosaicked together and processed. The default is false.

Values: true | false

context

Contains additional settings that affect task execution. This task has the following settings; an example combining them follows the list:

  • Cell Size (cellSize)—The analysis will be performed at the resolution specified by the cell size.
  • Extent (extent)—A bounding box that defines the analysis area.
  • Parallel Processing Factor (parallelProcessingFactor)—The specified number or percentage of processes will be used for the analysis.
  • Processor Type (processorType)—Processing will occur using the server computer CPU or GPU.
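
Example (the cell size value is illustrative; the extent, processor type, and parallel processing factor mirror the sample request in the next section):

context={"cellSize": 10,"extent": {"xmin": -13160539.4563053,"ymin": 3998752.62631951,"xmax": -13160427.5538234,"ymax": 3998824.51069532,"spatialReference": {"wkid": 3857}},"parallelProcessingFactor": 2,"processorType": "CPU"}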
f

The response format. The default response format is html.

Values: html | json | pjson
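
The runNMS, confidenceScoreField, and maxOverlapRatio parameters together describe a standard nonmaximum suppression pass over the detected boxes, where the overlap ratio is the intersection area over the union area. The following Python sketch illustrates that logic under those definitions; it is a conceptual illustration only, not the service's implementation.

# Conceptual sketch of the suppression described by runNMS,
# confidenceScoreField, and maxOverlapRatio. Boxes are
# (xmin, ymin, xmax, ymax) tuples. Not the service's implementation.

def iou(a, b):
    # Intersection area over union area of two axis-aligned boxes.
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
    inter = ix * iy
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union > 0 else 0.0

def nms(detections, max_overlap_ratio=0.3):
    # detections: list of (box, confidence). Keep the higher-confidence
    # duplicate whenever two boxes overlap beyond max_overlap_ratio.
    detections = sorted(detections, key=lambda d: d[1], reverse=True)
    kept = []
    for box, score in detections:
        if all(iou(box, kb) <= max_overlap_ratio for kb, _ in kept):
            kept.append((box, score))
    return kept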

Example usage

The following is a sample request URL for DetectObjectsUsingDeepLearning:

https://machine.domain.com/webadaptor/rest/services/System/RasterAnalysisTools/GPServer/DetectObjectsUsingDeepLearning/submitJob?
inputRaster="url":"https://<server name>/arcgis/ArcGIS/rest/services/World_Imagery/MapServer"&outputFeatureClass={"serviceProperties":{"name":"DetectedTrees"}}&model={"itemId": "d8d3902b41854529a907ad9f42af5a06"}&modelArguments={"padding": "0", "batch_size": "16"}&classLabelField=ClassLabel&processAllRasterItems=false&context={"extent": {"xmin": -13160539.4563053,"ymin": 3998752.62631951,"xmax": -13160427.5538234,"ymax": 3998824.51069532,"spatialReference": {"wkid": 3857}},"processorType": "CPU","parallelProcessingFactor": 2}}&f=json

Response

When you submit a request, the task assigns a unique job ID for the transaction.

Syntax:

{
"jobId": "<unique job identifier>",
"jobStatus": "<job status>"
}

After the initial request is submitted, you can use jobId to periodically review the status of the job and messages as described in Checking job status. Once the job has successfully completed, use jobId to retrieve the results. To track the status, you can make a request of the following form:

https://<raster analysis tools url>/DetectObjectsUsingDeepLearning/jobs/<jobId>

When the status of the job request is esriJobSucceeded, you can access the results of the analysis by making a request of the following form:

https://<raster analysis tools url>/DetectObjectsUsingDeepLearning/jobs/<jobId>/results/outObjects

JSON Response example

The response returns the outObjects output parameter, which has properties for parameter name, data type, and value. The content of the value is always the output feature layer itemId and the feature service URL.


{
  "paramName": "outObjects",
  "dataType": "GPFeatureRecordSetLayer",
  "value": {
    "itemId": "f121390b85ef419790479fc75b493efd",
    "url": "https://<server name>/arcgis/rest/services/Hosted/<service name>/ImageServer"
  }
}