DeepSight SDK  1.0.2
DeepSight SDK - crowd face analysis
DeepSight SDK Service

Intro

The DeepSight SDK service is a minimal HTTP(S) server that proxies requests to the SDK. Using an HTTP server allows:

  • any programming language to be used
  • easier integration
  • heavy processing to be shifted to remote infrastructure
  • an SDK-based web API to be created very easily

When the service process is started, a ds::DeepSight instance is created. The service will proxy all requests to that ds::DeepSight instance by translating Json request objects to C++ SDK calls and C++ SDK return values into Json response objects. If a different ds::DeepSight instance is required, a new SDK service process needs to be launched.

There is a one-to-one match between the endpoints provided by the service and the SDK API: the ds::DeepSight::authenticate SDK method is mapped to the /DeepSight/authenticate URI, and so on. HTTP(S) requests and responses are handled in Json, and there is a direct correspondence between function input arguments and output as well. For example, the input parameters of a call to the ds::DeepSight::analyze method:

cv::Mat image = readFrame(...);
std::vector<ds::Person> result = ds.analyze(image);
// result == { { 123, 11, 0.5, 0.75, 0.80, 0.85, 2.0, true, {2.0, 4.0, 6.0}, {0, 0, 45}, {30, 40, 100, 80}, 1586540955727408237, 0, 0, 0 } }

translate to the following request:

{"image": base64_encoded_image_string}

and response:

[{ "ID": 123, "age": 11, "gender": 0.5, "smile": 0.75, "faceMask": 0.80, "detectionConfidence": 0.85, "distanceFromCamera": 2.0, "isLooking": true, "worldpos": {"x": 2.0, "y": 4.0, "z": 6.0}, "headpose": {"yaw": 0, "pitch": 0, "roll": 45}, "boundingBox": {"x": 30, "y": 40, "width": 100, "height": 80}, "detectionTime": 1586540955727408237, "detectionDuration": 0, "totalAttentionDuration": 0, "currentAttentionDuration": 0}]

from the /DeepSight/analyze endpoint.
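To make the mapping concrete, the Json response above can be consumed with any Json library; a minimal Python sketch using the example values from the sample response:

```python
import json

# Example /DeepSight/analyze response body (values from the sample above)
body = '''[{"ID": 123, "age": 11, "gender": 0.5, "smile": 0.75,
"faceMask": 0.80, "detectionConfidence": 0.85, "distanceFromCamera": 2.0,
"isLooking": true, "worldpos": {"x": 2.0, "y": 4.0, "z": 6.0},
"headpose": {"yaw": 0, "pitch": 0, "roll": 45},
"boundingBox": {"x": 30, "y": 40, "width": 100, "height": 80},
"detectionTime": 1586540955727408237, "detectionDuration": 0,
"totalAttentionDuration": 0, "currentAttentionDuration": 0}]'''

people = json.loads(body)  # one entry per detected person
person = people[0]
print(person["ID"], person["headpose"]["roll"])  # → 123 45
```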

Command Line Arguments

The DeepSightService program is located under the service directory. It can be launched by specifying the following command line arguments:

  • -s or --settings : sets the path of the settings file used to initialize the SDK.
  • -n or --networks : sets the path to the neural networks directory used to initialize the SDK.
  • -p or --port : sets the port the HTTP server will listen on. The default is 80.
  • --ssl : enables SSL encryption. Requires certificate.key and certificate.crt certificate files (see SSL Certificates).

--settings and --networks are mutually exclusive. The neural networks path is the minimum setting required to initialize the SDK. If more settings are needed, specify the path to a settings ini file such as this one:

[Detectors]
useAge = 1
usePeopleCount = 0
useFaceMask = 1
useGender = 1
useHeadpose = 1
useSmile = 1
[FaceDetection]
maxFaceSize = 0
maxNumFaces = -1
minFaceSize = 0
[License]
key = your_license_key
lifetimeAuthFile =
opMode = DEVELOPMENT
[NeuralNetwork]
faceConfidenceThreshold = 0.8
neuralNetworksPath = /some/path/to/networks/
[System]
bodyTrackFps = DYNAMIC
dnnTarget = CPU
logLevel = INFO
numThreads = -1

If a license key is set in the settings, the SDK will automatically authenticate. Alternatively, the start-service.sh (Linux) or start-service.bat (Windows) script can be used for launching the process on port 10080 without SSL.
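As an illustrative launch command (a sketch; the binary location and the networks path below are placeholders for your installation):

```shell
# Hypothetical paths; run from the service directory
./DeepSightService --networks /path/to/networks --port 10080
```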

Request

All requests to the SDK service are HTTP POST requests. The body of the request is a Json string representing the input parameters of the ds::DeepSight member function the URI wraps. For ds::DeepSight member functions that do not accept arguments, an empty Json object {} needs to be passed. The Content-Type of the request should be application/json: the request body should not be url-encoded or multipart, but a plain Json string. E.g.:

POST /DeepSight/analyze HTTP/1.1
Host: localhost:10080
Content-Length: 38
Content-Type: application/json

{"image": base64_encoded_image_string}

Images in Json requests should be base64 encoded. Accepted image formats are JPEG and PNG. E.g.:

{"image": "/9j/4QC1RXhpZgAASUkq.......tQluM9ahaOPG7rX/9k="}
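A short Python sketch of the encoding step (the byte string below is a stand-in for real JPEG/PNG data):

```python
import base64
import json

image_bytes = b"\xff\xd8\xff\xe0...fake-jpeg-bytes"  # placeholder, not a real image
encoded = base64.b64encode(image_bytes).decode("utf-8")
request_body = json.dumps({"image": encoded})         # body for /DeepSight/analyze
```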

C++ objects passed as arguments in Json requests should be translated into Json objects where member variable names are preserved verbatim. E.g.:

// cv::Rect Json representation:
{"x": 0, "y": 0, "width": 100, "height": 100}

C++ enum values are represented by their enumerator names. E.g.:

{"operationMode": "DEVELOPMENT", "logLevel": "INFO", "dnnTarget": "CPU"}

ds::Settings C++ objects are represented in Json format as follows:

{"neuralNetworksPath": "../../networks/DeepSight/", "licenseKey": "12345678901234567890123456789012", "operationMode": "DEVELOPMENT", "logLevel": "INFO", "dnnTarget": "CPU", "faceConfidenceThreshold": 0.8, "minFaceSize": 0, "maxFaceSize": 0, "numThreads": -1}
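For example, a client might construct such a settings payload as a plain dictionary before posting it to /DeepSight/setSettings (a sketch; the license key below is a placeholder):

```python
import json

settings = {
    "neuralNetworksPath": "../../networks/DeepSight/",
    "licenseKey": "",                # placeholder, not a real key
    "operationMode": "DEVELOPMENT",
    "logLevel": "INFO",
    "dnnTarget": "CPU",
    "faceConfidenceThreshold": 0.8,
    "minFaceSize": 0,
    "maxFaceSize": 0,
    "numThreads": -1,
}
body = json.dumps(settings)
# e.g. requests.post("http://localhost:10080/DeepSight/setSettings", json=settings)
```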

Response

Responses are Json objects containing 3 fields:

  • error_code
  • description
  • result

error_code is 0 for successful responses, or one of the values described in the Error Codes section. description is an empty string for successful responses, or the reason for the failure. result is the Json representation of the function call's return value. An example response to the previous /DeepSight/analyze call might be:

HTTP/1.1 200 OK
Date: Mon, 02 Dec 2019 15:30:10 GMT
Connection: Keep-Alive
Access-Control-Allow-Origin: *
Content-Type: application/json
Content-Length: 50

{"error_code": 0, "description": "", "result": []}

For ds::DeepSight member functions that do not return a value, the result field is an empty Json object {}.
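Client code can unwrap this envelope uniformly; a minimal Python helper (a sketch, using the field names documented above):

```python
def unwrap(response):
    """Return result from a successful service response; raise on errors."""
    if response["error_code"] != 0:
        raise RuntimeError("SDK service error %d: %s"
                           % (response["error_code"], response["description"]))
    return response["result"]

# A successful /DeepSight/analyze response with no detected faces
print(unwrap({"error_code": 0, "description": "", "result": []}))  # → []
```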

SSL Certificates

The SDK service supports SSL encryption. For this to work, SSL certificates are required in the directory from which the DeepSightService process is launched. The certificate file names default to certificate.key and certificate.crt; these can also be given as command-line arguments. Self-signed certificates are provided in the package under the service directory; however, these are intended for testing only. Valid certificates should be created for production use.

Error Codes

1000 An unexpected server error occurred. Check the description field and report this error to support@sightcorp.com
1001 An unexpected error occurred during an SDK function call. Check the description field and report this error to support@sightcorp.com
1100 Request size exceeds allowed limits. Maximum request size is 20MB
1101 Input image could not be decoded. Format was not jpeg or png, or an issue occurred during base64 decoding
1102 Json object input request could not be decoded. Check the json request formatting
1103 A field is missing in the request for the specific SDK call
1104 The input Json request has missing or malformed fields
1105 The combination of fields in the input Json request is wrong
1200 Could not detect a face in the input image

A call to a nonexistent endpoint will result in a HTTP status 404 response.

Endpoints

/

Returns a string containing the DeepSight version number. This is not an SDK call and does not return a Json response; instead it can be used as a health check to verify that the service is up and running.
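This suggests a simple readiness probe; a Python sketch using only the standard library (assuming the root endpoint answers a plain GET; host and port are placeholders matching the default start-service script):

```python
import urllib.error
import urllib.request

def service_is_up(base_url="http://localhost:10080"):
    """Return True if the service root endpoint answers, False otherwise."""
    try:
        with urllib.request.urlopen(base_url + "/", timeout=2) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```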

/DeepSight/authenticate

Check ds::DeepSight::authenticate

/DeepSight/setSettings

Check ds::DeepSight::setSettings

/DeepSight/getSettings

Check ds::DeepSight::getSettings

/DeepSight/analyze

Check ds::DeepSight::analyze

/DeepSight/analyzeSingleImage

Check ds::DeepSight::analyzeSingleImage

/DeepSight/getLicenseTimeLeft

Check ds::DeepSight::getLicenseTimeLeft

/DeepSight/getPeopleCount

Check ds::DeepSight::getPeopleCount

/DeepSight/getFaceCount

Check ds::DeepSight::getFaceCount

/DeepSight/resetTracker

Check ds::DeepSight::resetTracker

/DeepSight/resetPeopleCount

Check ds::DeepSight::resetPeopleCount

/extra/getDefaultSettings

This endpoint returns a default constructed ds::Settings object. It wraps a fictitious C++ API function

ds::Settings getDefaultSettings();

whose output Json response has the following format:

{"maxFaceSize": 0, "dnnTarget": "CPU", "licenseKey": "", "neuralNetworksPath": "", "logLevel": "INFO", "minFaceSize": 0, "numThreads": -1, "faceConfidenceThreshold": 0.8, "operationMode": "DEVELOPMENT"}

Python Example

#!/usr/bin/env python3
import requests
from base64 import b64encode as b64


def SdkService_call(function, json_input):
    endpoint = 'http://localhost:10080' + function
    response = requests.post(endpoint, json=json_input, verify=False).json()
    if response['error_code'] != 0:
        raise Exception(response['description'])
    return response['result']


def encode_image(image):
    return b64(image).decode('utf-8')


def main():
    # Read image
    with open("image.jpg", 'rb') as image_fd:
        image = encode_image(image_fd.read())
    # Detect faces
    people = SdkService_call('/DeepSight/analyzeSingleImage', {'image': image})
    print(people)


if __name__ == '__main__':
    main()