Import Results Using the REST API

Working Sample

Please see the Sample Projects and Code Snippets sections for a more concrete end-to-end example.

1. Use API Key for Importing Results

Please generate an API key (see Generate API Key) if you don't have one for your selected project.

2. Import test results

The import process is divided into 3 steps:

  1. Get upload file URL: This API takes the required test automation details and responds with a URL to which the results file should be uploaded.
  2. Upload your test result file to the generated URL: This uploads the automation results file and starts the import.
  3. Check progress: This API reports the progress of the automation result import.

2.1 Get Upload file URL 

This API takes the required test automation details and responds with a URL to upload the results.

URL : https://qtmcloud.qmetry.com/rest/api/automation/importresult

Method : POST

REQUEST HEADER :

Content-Type : application/json

apiKey : {generated-api-key}

Notes

  • The ZIP file must contain files of the same format given in the 'format' param.
  • The request must contain a raw JSON body payload; form-data is not supported.


Request Parameters

| Parameter | Type | Required | Description | Default |
| --- | --- | --- | --- | --- |
| format | string | Yes | Format of the result file to be imported. Supported formats: CUCUMBER, TESTNG, JUNIT, QAF, HPUFT, SPECFLOW | NA |
| testCycleToReuse | string | No | Issue key of the test cycle to be reused | NA |
| environment | string | No | Name of the environment on which the test cycle has to be executed | No Environment |
| build | string | No | Name of the build for test case execution | Blank |
| isZip | boolean | No (Yes for QAF) | Pass true for a ZIP upload or false for a single-file upload | false |
| attachFile | boolean | No | Pass true to upload attachments in the execution (refer to the automation help documents for details). This parameter is supported only for QAF and Cucumber. | false |
| fields | JSON | No | Additional fields to be set at the test case or test cycle level; see the Supported Fields table below. Note: if the test cycle is reused, fields of the test cycle will be ignored. | Blank |

The remaining parameters, matchTestSteps, appendTestName and automationHierarchy, are described in detail below.
matchTestSteps (boolean, optional)

  • True: Create/reuse a test case with a summary and test steps that exactly match the automated test case uploaded through the result file. The execution results and other execution details of the test case and its steps will be imported from the automation result file.

  • False:

When the Test Cycle is not reused

→ Create/reuse a test case with a summary or test case key that exactly matches the automated test case uploaded through the result file, excluding test step matching. The execution results of the test case will be imported or calculated based on the test case/step results from the automation result file, and will be propagated to the test steps when a test case is reused or created. Individual test steps will not be matched and their execution results/details will not be picked from the result file.

When the Test Cycle is reused

→ The Test Case Key is mapped in the result file and is found linked to the Test Cycle:

The existing linked test case version that is part of the Test Cycle will be used. If multiple versions of the same test case key are linked to the test cycle, the one traced first will be used. The test steps will not be matched to create a new version or link a different version.

In a project where propagation is off, the status of the steps will not be mapped/changed.

→ The Test Case Key is mapped in the result file, is not linked to the Test Cycle, but exists in the Test Case Library:

The latest existing version of the test case that matches the test case summary will be linked to the existing Test Cycle. If multiple test cases with the same summary exist, the one traced first will be linked to the existing Test Cycle. The test steps will not be matched to create a new version or link a different version.

→ The Test Case Key is mapped in the result file, is neither linked to the Test Cycle nor found in the Test Case Library, and the Test Case Summary matches an existing test case linked to the Test Cycle:

The existing linked test case version that is part of the Test Cycle will be used. The test steps will not be matched to create a new version.

→ The Test Case Key is not mentioned in the result file and the Test Case Summary matches an existing test case already linked to the Test Cycle:

The existing linked test case version that is part of the Test Cycle will be used, based on the summary. The test steps will not be matched to create a new version.

→ The Test Case Key is mapped in the result file, does not exist in the Test Cycle or the Library, the Test Case Summary does not match any existing test case in the Test Cycle, but a test case with the same summary is found in the Test Case Library:

The latest existing version of the test case that matches the test case summary will be linked to the existing Test Cycle. If multiple test cases with the same summary exist, the one traced first will be linked to the existing Test Cycle. The test steps will not be matched to create a new version or link a different version.

→ The Test Case Key is mapped in the result file, does not exist in the Test Cycle or the Library, and the Test Case Summary does not match any existing test case in the Test Cycle or the Test Case Library:

A new test case without steps will be created and linked to the Test Cycle being reused.

The execution results of the test case will be imported or calculated based on the test case/step results from the automation result file, and will be propagated to the test steps when a test case is reused or created. Individual test steps will not be matched and their execution results/details will not be picked from the result file.

In a project where propagation is off, the status of the steps will not be mapped or changed.


appendTestName (boolean, optional; default: as per settings)

  • Applicable only for JUnit or TestNG automation result uploads with hierarchy 2 or 3.

  • Accepted values: false (default), true

TestNG

  • False: Create the Test Case Summary from the Test Method Name in the result file.

  • True: Append the Test Name to the Test Method Name, creating the Test Case Summary as "Test Name. Test Method Name".

JUnit

  • False: Create the Test Case Summary from the Test Case Name in the result file.

  • True: Append the Test Suite Name to the Test Case Name, creating the Test Case Summary as "Test Suite Name. Test Case Name".
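For TestNG, the naming rule above, combined with the 255-character truncation noted later in this section, can be sketched as a small helper. This is illustrative only; `build_test_case_summary` is a hypothetical name, not part of the API:

```python
def build_test_case_summary(test_name: str, method_name: str,
                            append_test_name: bool) -> str:
    """Sketch of how a TestNG test case summary is derived (not vendor code)."""
    # With appendTestName=true the summary becomes "Test Name. Test Method Name";
    # otherwise only the test method name is used.
    summary = f"{test_name}. {method_name}" if append_test_name else method_name
    # Names longer than 255 characters are truncated per the documented limit.
    return summary[:255]
```

For example, `build_test_case_summary("Login Tests", "testLogin", True)` yields "Login Tests. testLogin".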

automationHierarchy (number, optional; default: as per settings)

  • Applicable only for JUnit or TestNG frameworks.

Sets the hierarchy for automation uploads ("Test Cycle - Test Case - Test Step") as 1 (default), 2 or 3.

For TestNG

  • Value 1: The Test Name is created as a Test Case and the Test Methods are created as Test Steps in QTM4J.

  • Value 2: The Test Name is created as a Test Cycle and the Test Method Name is created as a Test Case. If there are multiple tests in the result file, multiple test cycles will be created based on the test names.

  • Value 3: The Suite Name is created as a Test Cycle and the Test Method Names are created as Test Cases. If there are multiple suites in the result file, multiple test cycles will be created based on the suite names.

For JUnit

  • Value 1: The Test Suite Name/Class Name is created as a Test Case and the Test Case Names are created as Test Steps in QTM4J.

  • Value 2: The Test Suite Name is created as a Test Cycle and the Test Case Name is created as a Test Case. If there are multiple test suites in the result file, multiple test cycles will be created based on the suite names.

  • Value 3: The Test Cycle name will be auto-generated or provided manually by the user, and the Test Case Name is created as a Test Case. Even if there are multiple suites in the result file, only a single cycle will be created.

Note: If the Test Name or Test Suite Name is longer than 255 characters, the name will be truncated.
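Because the server rejects invalid combinations (see the 400 responses later in this section), a client-side pre-check can save a round trip. The sketch below encodes a few of the rules from the parameter table; `validate_import_params` is a hypothetical helper and the server remains the source of truth:

```python
# Supported result formats, per the Request Parameters table.
SUPPORTED_FORMATS = {"CUCUMBER", "TESTNG", "JUNIT", "QAF", "HPUFT", "SPECFLOW"}

def validate_import_params(params: dict) -> list[str]:
    """Return a list of problems with the request parameters (client-side sketch)."""
    errors = []
    fmt = str(params.get("format", "")).upper()
    if fmt not in SUPPORTED_FORMATS:
        errors.append(f"Framework '{params.get('format')}' is not supported.")
    # QAF results must be uploaded as a ZIP (isZip is required for QAF).
    if fmt == "QAF" and not params.get("isZip", False):
        errors.append("Zip file format is required for QAF framework.")
    # automationHierarchy applies to JUnit/TestNG and accepts 1, 2 or 3.
    if fmt in {"TESTNG", "JUNIT"} and params.get("automationHierarchy") not in (None, 1, 2, 3):
        errors.append("automationHierarchy must be 1, 2 or 3.")
    return errors
```

Running such a check before step 2.1 surfaces, for example, a missing `isZip` on a QAF upload before the API does.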

Supported Fields

| Supported Field | Type | Test Cycle | Test Case | Default | Comment |
| --- | --- | --- | --- | --- | --- |
| labels | array | Yes | Yes | null |  |
| components | array | Yes | Yes | null |  |
| status | string | Yes | Yes | TO DO |  |
| priority | string | Yes | Yes | Medium |  |
| fixVersionId | number | Yes | Yes | null |  |
| folderId | number | Yes | Yes | null | Organize automated test cases and test cycles in a folder when uploading automation results. You can get the folderId by right-clicking the test case/test cycle folder and selecting "Copy Folder Id". If the given folderId does not exist, an error will be shown and the test cases/test cycles will not be created. |
| sprintId | number | Yes | Yes | null |  |
| summary | string | Yes | No | Automated Test Cycle |  |
| description | string | Yes | Yes | null |  |
| precondition | string | No | Yes | null |  |
| assignee | string | Yes | Yes | Account Id of current user | Valid user Account Id |
| reporter | string | Yes | Yes | Account Id of current user | Valid user Account Id |
| estimatedTime | string | No | Yes | null | Pass time in 'HH:MM:SS' format |
| plannedStartDate | string | Yes | No | null | Pass date in 'dd/MMM/yyyy HH:mm' format |
| plannedEndDate | string | Yes | No | null | Pass date in 'dd/MMM/yyyy HH:mm' format |
| customFields | array | Yes | Yes | null | See the notes below |

Notes on customFields:

  • The array contains JSON objects, each with a name and a value property.

  • name is the name of the custom field; you can find it on the Custom Fields screen inside the Configuration menu.

  • value is the value of the custom field:

    • For a Date type custom field, pass the value in 'dd/MMM/yyyy' format.

    • For a DateTime type custom field, pass the value in 'dd/MMM/yyyy HH:mm' format.

    • For a Number type custom field, pass any numeric value.

    • For a single-option custom field (Radio Button, Single Dropdown, etc.), pass the option value as a string, e.g. 'high'.

    • For a multi-option custom field (Checkbox, Multi Dropdown, etc.), pass the value as a comma-separated string, e.g. 'high,medium,low'.

    • For a Single User Picker custom field, pass the Jira Account Id of the user; for a Multi-User Picker, pass a comma-separated list of Jira Account Ids.

    • For a Labels type custom field, pass the value as a comma-separated string, e.g. 'tag1,tag2'.

    • For a Cascade type custom field, pass two levels of values: the "value" node is the Level 1 dropdown option and the "cascadeValue" node is the Level 2 dropdown option, e.g. "value": "option1", "cascadeValue": "option1_A".

  • Custom fields must be passed in the request if they are required.

Sample Request

{
  "format": "cucumber",
  "testCycleToReuse": "",
  "attachFile": true,
  "isZip": false,
  "environment": "",
  "build": "",
  "matchTestSteps": true,
  "fields": {
    "testCycle": {
      "labels": ["label1", "label2"],
      "components": ["component1"],
      "priority": "High",
      "status": "To Do",
      "sprintId": 10000,
      "fixVersionId": 10000,
      "folderId": 1000,
      "summary": "Test Cycle Summary Automation",
      "description": "Test Cycle Automation Description",
      "assignee": "557058:b50b7a9b-8826-4769-97c8-3338b8ba7f22",
      "reporter": "557058:416340c9-0308-4f49-a538-c9f5b164242d",
      "plannedStartDate": "15/May/2020 00:00",
      "plannedEndDate": "30/May/2020 00:00",
      "customFields": [
        { "name": "Multi Check Box", "value": "MCB 1,MCB 2" },
        { "name": "Date Type", "value": "29/May/2020" },
        { "name": "Date time", "value": "29/May/2020 14:55" },
        { "name": "Multi DropDown", "value": "MDD 2,MDD 1" },
        { "name": "Multi line text field", "value": "QMetry Automation Testing using cucumber framework" },
        { "name": "Number field", "value": 1234567890 },
        { "name": "Single DropDown", "value": "DD2" },
        { "name": "Single line field", "value": "QMetry Automation Testing using cucumber framework" },
        { "name": "Single Radio Button", "value": "Test C" }
      ]
    },
    "testCase": {
      "labels": ["label1", "label2"],
      "components": ["component1"],
      "priority": "High",
      "status": "To Do",
      "sprintId": 10000,
      "fixVersionId": 10000,
      "folderId": 1000,
      "description": "Automated generated Test Case",
      "precondition": "Precondition of Test Case",
      "assignee": "557058:b50b7a9b-8826-4769-97c8-3338b8ba7f22",
      "reporter": "557058:416340c9-0308-4f49-a538-c9f5b164242d",
      "estimatedTime": "10:10:10",
      "customFields": [
        { "name": "TC Multi DropDown", "value": "Automation 1,Automation 2" },
        { "name": "QA User RadioButton", "value": "Sample user 2" }
      ]
    },
    "testCaseExecution": {
      "comment": "Test Case Execution Comment",
      "actualTime": "10:10:10",
      "executionPlannedDate": "16/Feb/2024",
      "assignee": "{Valid User Account Key}",
      "customFields": [
        { "name": "custom field 1", "value": "high,medium,low" },
        { "name": "custom field 2", "value": 10.12 }
      ]
    }
  }
}


Responses

STATUS 200

Returned if the upload URL is generated successfully. The response contains a one-time upload URL and a trackingId. The import itself may take a while; you will be notified (by email, or by checking the status of the created test cycle) once the process is completed.

Example

{
  "url":"https://qtmcloud.qmetry.com/automationuploads/1560145450482_f52a7866-a345-4cad-b93e-a930135868d7.json?X-Amz-Algorithm=AWS4-HMACSHA256&X-Amz-Date=20190610T054410Z&X-Amz-SignedHeaders=.....",
  "message": "Generated Upload URL is valid for one time use and will expire in 5 minutes.",
  "trackingId": "f52a7866-a345-4cad-b93e-a930135868d7"
}

 
STATUS 400

Returned if the import fails.

If an unsupported framework is sent in the request

{
  "status":400,
  "errorMessage":"Framework ‘xyz’ is not supported.",
  "timestamp":"28/May/2019 04:58"
}

If a ZIP file is not sent for a QAF framework request

{  
  "status":400,
  "errorMessage":"Zip file format is required for QAF framework.",
  "timestamp":"28/May/2019 05:00"
} 

If one or more fields have invalid values

{  
  "status":400,
  "errorMessage":"Validation failed for fields.",
  "errors":[  
    "TestCase Components c1,c2 could not be found.",
    "TestCase Labels l1,l2 could not be found."
  ],
  "timestamp":"29/May/2019 12:44"
}
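Putting section 2.1 together, the request can be prepared with Python's standard library. This is a minimal sketch: `build_upload_url_request` is a hypothetical helper, and real code would add error handling and read the API key from configuration.

```python
import json
import urllib.request

def build_upload_url_request(api_key: str, params: dict) -> urllib.request.Request:
    """Prepare the POST to /rest/api/automation/importresult.

    The payload must be a raw JSON body (form-data is not supported).
    """
    return urllib.request.Request(
        "https://qtmcloud.qmetry.com/rest/api/automation/importresult",
        data=json.dumps(params).encode("utf-8"),
        headers={"Content-Type": "application/json", "apiKey": api_key},
        method="POST",
    )

# Sending it would return the one-time upload URL and trackingId:
# with urllib.request.urlopen(build_upload_url_request(key, {"format": "cucumber"})) as resp:
#     body = json.load(resp)  # body["url"], body["trackingId"]
```

Note the generated URL is single-use and expires in 5 minutes, so the PUT in step 2.2 should follow promptly.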

2.2 Upload test result file 

This API is used to upload the automation results file and import test results.

URL :  {{URL generated from step 1}}

Method : PUT

Content-Type : multipart/form-data

Request Body - Binary : Your result file to be uploaded. Supported file extensions: .json, .xml and .zip (the ZIP file must contain files of the format given in the 'format' param).

Note: The file must be passed as a raw binary payload; form-data is not supported.
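Following the note above, a minimal sketch of the PUT using Python's standard library. `build_result_upload_request` is a hypothetical helper; the upload URL is the pre-signed URL returned by step 2.1:

```python
import urllib.request

def build_result_upload_request(upload_url: str, file_bytes: bytes) -> urllib.request.Request:
    """PUT the result file as the raw request body to the pre-signed URL."""
    return urllib.request.Request(
        upload_url,
        data=file_bytes,  # raw binary payload, not form-data
        headers={"Content-Type": "multipart/form-data"},
        method="PUT",
    )

# Usage sketch (signed_url is the URL returned by step 2.1):
# with open("results.json", "rb") as f:
#     urllib.request.urlopen(build_result_upload_request(signed_url, f.read()))
```

Remember the URL is valid for one use and expires after 5 minutes; a late PUT produces the 403 errors shown below.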

Responses

STATUS 200

Returned if the file is uploaded successfully.

STATUS 403

Returned if the file upload happens after the URL expiration time, or if the Content-Type: multipart/form-data header is not passed.

Example

URL Expiry


<Error>
    <Code>AccessDenied</Code>
    <Message>Request has expired</Message>
    <Expires>2017-02-27T12:34:11Z</Expires>
    <ServerTime>2017-02-27T13:40:54Z</ServerTime>
    <RequestId>3BCFE62FDD8F60D8</RequestId>
    <HostId>MXrdoCzy/BK2BdFdNFn613xa6jKegQpVpSzsC4CQhkk46f7Na+ImafoFHlN90FF2LiuupDr5x9U=</HostId>
</Error>


Signature Mismatch


<Error>
    <Code>SignatureDoesNotMatch</Code>
    <Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message>
    <Expires>2017-02-27T12:34:11Z</Expires>
    <StringToSign>PUT ...</StringToSign>
    <SignatureProvided>cijMZITZJuv8r5mvgqksssPcV5M=</SignatureProvided>
    <StringToSignBytes>50 55 ...</StringToSignBytes>
    <RequestId>cijMZITZJuv8r5mvgqksssPcV5M=</RequestId>
    <HostId>MXrdoCzy/BK2BdFdNFn613xa6jKegQpVpSzsC4CQhkk46f7Na+ImafoFHlN90FF2LiuupDr5x9U=</HostId>
</Error>


2.3 Check Progress

This API is used to check the progress of automation result import. 


Method : GET

REQUEST HEADER :

Content-Type : application/json

apiKey : {generated-api-key}

Responses

SUCCESS

Returned if the parameters are validated successfully.

Example

{  
    "format":"QAF",
    "fileName":"1558945546874_840ba1e3-bf14-4f08-b19c-e5de6447711b.zip",
    "processStatus":"VALIDATION/PARSING/CREATING_ASSETS/SUCCESS",
    "importStatus":"SUCCESS/INPROCESS/FAILED",
    "startTime":"2019-05-27T08:25:47.000+0000",
    "endTime":null,
    "fileSize":1500,
    "trackingId":"840ba1e3-bf14-4f08-b19c-e5de6447711b",
    "files":[  
  
    ],
    "detailedMessage": "File Imported successfully.",
    "extraAttributes": {
        "build": "1.0.0 beta",
        "environment": "Chrome",
        "attachFile": false
    },
    "summary": {
        "testCycle": "Automated Test Cycle",
        "testCasesCreated": 2,
        "testCaseVersionsCreated": 3,
        "testCaseVersionsReused": 1,
        "testStepsCreated": 12,
        "testCycleIssueKey": "SAP-TR-3"
    }
  }

FAILED

Returned if any validation fails

Example


{  
    "format":"QAF",
    "fileName":"1558945546874_840ba1e3-bf14-4f08-b19c-e5de6447711b.zip",
    "processStatus":"VALIDATION/PARSING/CREATING_ASSETS/SUCCESS",
    "importStatus":"FAILED",
    "startTime":"2019-05-27T08:25:47.000+0000",
    "endTime":null,
    "fileSize":1500,
    "trackingId":"840ba1e3-bf14-4f08-b19c-e5de6447711b",
    "files":[  
  
    ],
    "detailedMessage":"{some detail message about current status of file}"
  }
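Clients typically poll this endpoint until importStatus leaves INPROCESS. A small interpretation helper, as a sketch (the status values are the ones shown in the examples above; the helper names are illustrative):

```python
def import_finished(progress: dict) -> bool:
    """True once the import has reached a terminal state (SUCCESS or FAILED)."""
    return progress.get("importStatus") in {"SUCCESS", "FAILED"}

def import_succeeded(progress: dict) -> bool:
    """True only when the import completed successfully."""
    return progress.get("importStatus") == "SUCCESS"

# Polling sketch: fetch the progress JSON, then e.g.
#   while not import_finished(progress): time.sleep(5); progress = fetch_progress(...)
# where fetch_progress is your own wrapper around the GET above.
```

On success, the summary block (testCasesCreated, testCycleIssueKey, etc.) tells you what was created; on failure, detailedMessage explains why.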


3. View imported test results

Please refer to the View Imported Test Results section.