Linkis Task Submission and Execution REST API

  • Responses from the Linkis RESTful interfaces follow this standard format:

```json
{
  "method": "",
  "status": 0,
  "message": "",
  "data": {}
}
```

Convention:

  • method: the requested RESTful API URI; mainly used in WebSocket mode.
  • status: the return status, where -1 means not logged in, 0 means success, 1 means error, 2 means validation failed, and 3 means no access to the interface.
  • data: the specific data returned.
  • message: the prompt message for the request. If status is not 0, message carries the error message, and data may contain a stack field with the specific stack trace.

For more information about the Linkis Restful interface specification, please refer to: Linkis Restful Interface Specification
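The sketch below shows one way a client might unwrap this envelope before using any of the interfaces that follow. It is illustrative only; the `check_response` helper is not part of Linkis.

```python
def check_response(resp: dict) -> dict:
    """Unwrap the standard Linkis envelope: return data on success, raise otherwise."""
    if resp.get("status") == 0:
        return resp.get("data") or {}
    # On errors, data may carry a "stack" field with the specific stack trace.
    stack = (resp.get("data") or {}).get("stack", "")
    raise RuntimeError(f"status={resp.get('status')}: {resp.get('message')}\n{stack}")
```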

  • Interface /api/rest_j/v1/entrance/submit

  • Submission method POST

  • Request Parameters

```json
{
  "executionContent": {
    "code": "show tables",
    "runType": "sql"
  },
  "params": {
    "variable": { // task variable
      "testvar": "hello"
    },
    "configuration": {
      "runtime": { // task runtime params
        "jdbc.url": "XX"
      },
      "startup": { // ec startup params
        "spark.executor.cores": "4"
      }
    }
  },
  "source": { // task source information
    "scriptPath": "file:///tmp/hadoop/test.sql"
  },
  "labels": {
    "engineType": "spark-2.4.3",
    "userCreator": "hadoop-IDE"
  }
}
```

  • Sample Response

```json
{
  "method": "/api/rest_j/v1/entrance/submit",
  "status": 0,
  "message": "Request executed successfully",
  "data": {
    "execID": "030418IDEhivebdpdwc010004:10087IDE_hadoop_21",
    "taskID": "123"
  }
}
```
  • execID is the unique execution ID generated for a task after it is submitted to Linkis. It is of type String and is only meaningful while the task is running, similar to a PID. The ExecID is composed as: (requestApplicationName length)(executeAppName length)(Instance length)${requestApplicationName}${executeApplicationName}${entranceInstance information ip+port}${requestApplicationName}_${umUser}_${index}

  • taskID is the unique ID of the submitted task. It is generated by database auto-increment and is of type Long. A submission sketch follows.
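As an end-to-end sketch with the `requests` library: the gateway address and token-mode headers below are placeholders for deployment-specific values (cookie-based login via /api/rest_j/v1/user/login works as well).

```python
import requests

# Placeholder gateway address and token-mode auth headers; adjust to your deployment.
GATEWAY = "http://127.0.0.1:9001"
HEADERS = {"Token-Code": "XXX", "Token-User": "hadoop"}

payload = {
    "executionContent": {"code": "show tables", "runType": "sql"},
    "params": {"variable": {"testvar": "hello"}},
    "source": {"scriptPath": "file:///tmp/hadoop/test.sql"},
    "labels": {"engineType": "spark-2.4.3", "userCreator": "hadoop-IDE"},
}

resp = requests.post(f"{GATEWAY}/api/rest_j/v1/entrance/submit",
                     json=payload, headers=HEADERS).json()
exec_id = resp["data"]["execID"]  # for the status/log/progress/kill interfaces
task_id = resp["data"]["taskID"]  # for the jobhistory interface
```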

  • Interface /api/rest_j/v1/entrance/${execID}/status

  • Submission method GET

  • Sample Response

```json
{
  "method": "/api/rest_j/v1/entrance/{execID}/status",
  "status": 0,
  "message": "Get status successful",
  "data": {
    "execID": "${execID}",
    "status": "Running"
  }
}
```
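A simple polling loop over this interface, reusing GATEWAY and HEADERS from the submission sketch above. The terminal state names are assumptions based on the states Linkis reports (e.g. Succeed/Failed/Cancelled/Timeout) and should be checked against your version.

```python
import time
import requests

def wait_for_task(exec_id: str, interval_s: float = 1.0) -> str:
    """Poll the status interface until the task reaches a terminal state."""
    while True:
        resp = requests.get(f"{GATEWAY}/api/rest_j/v1/entrance/{exec_id}/status",
                            headers=HEADERS).json()
        status = resp["data"]["status"]
        if status in ("Succeed", "Failed", "Cancelled", "Timeout"):
            return status
        time.sleep(interval_s)
```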
  • Interface /api/rest_j/v1/entrance/${execID}/log?fromLine=${fromLine}&size=${size}

  • Submission method GET

  • The request parameter fromLine is the line number from which to start reading, and size is the maximum number of log lines returned by this request

  • Sample Response; the fromLine in the response should be passed as the fromLine parameter of the next request to this interface

```json
{
  "method": "/api/rest_j/v1/entrance/${execID}/log",
  "status": 0,
  "message": "Return log information",
  "data": {
    "execID": "${execID}",
    "log": ["error log", "warn log", "info log", "all log"],
    "fromLine": 56
  }
}
```
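A sketch of incremental log fetching that feeds the returned fromLine back into the next call. Stopping when fromLine no longer advances is an assumption, not documented behavior.

```python
import requests

def stream_logs(exec_id: str, size: int = 100):
    """Yield [error, warn, info, all] log sections until no new lines are returned."""
    from_line = 1
    while True:
        resp = requests.get(f"{GATEWAY}/api/rest_j/v1/entrance/{exec_id}/log",
                            params={"fromLine": from_line, "size": size},
                            headers=HEADERS).json()
        data = resp["data"]
        yield data["log"]
        if data["fromLine"] <= from_line:  # assumption: unchanged fromLine means no new logs
            break
        from_line = data["fromLine"]
```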
  • Interface /api/rest_j/v1/entrance/${execID}/progressWithResource

  • Submission method GET

  • Sample Response

```json
{
  "method": "/api/entrance/exec_id018017linkis-cg-entrance127.0.0.1:9205IDE_hadoop_spark_2/progressWithResource",
  "status": 0,
  "message": "OK",
  "data": {
    "yarnMetrics": {
      "yarnResource": [
        {
          "queueMemory": 9663676416,
          "queueCores": 6,
          "queueInstances": 0,
          "jobStatus": "COMPLETED",
          "applicationId": "application_1655364300926_69504",
          "queue": "default"
        }
      ],
      "memoryPercent": 0.009,
      "memoryRGB": "green",
      "coreRGB": "green",
      "corePercent": 0.02
    },
    "progress": 0.5,
    "progressInfo": [
      {
        "succeedTasks": 4,
        "failedTasks": 0,
        "id": "jobId-1(linkis-spark-mix-code-1946915)",
        "totalTasks": 6,
        "runningTasks": 0
      }
    ],
    "execID": "exec_id018017linkis-cg-entrance127.0.0.1:9205IDE_hadoop_spark_2"
  }
}
```
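For example, a client could read the overall progress and per-application YARN usage like this (a sketch reusing GATEWAY, HEADERS, and exec_id from above):

```python
import requests

resp = requests.get(f"{GATEWAY}/api/rest_j/v1/entrance/{exec_id}/progressWithResource",
                    headers=HEADERS).json()
data = resp["data"]
print(f"progress: {data['progress']:.0%}")  # e.g. "progress: 50%"
for app in data["yarnMetrics"]["yarnResource"]:
    # queueMemory appears to be in bytes (9663676416 = 9 GiB in the sample above)
    print(app["applicationId"], app["queue"], app["jobStatus"],
          app["queueMemory"], app["queueCores"])
```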
  • Interface /api/rest_j/v1/entrance/${execID}/kill

  • Submission method POST

  • Sample Response

```json
{
  "method": "/api/rest_j/v1/entrance/{execID}/kill",
  "status": 0,
  "message": "OK",
  "data": {
    "execID": "${execID}"
  }
}
```
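A kill call is a plain POST with no body (sketch, same assumptions as above):

```python
import requests

resp = requests.post(f"{GATEWAY}/api/rest_j/v1/entrance/{exec_id}/kill",
                     headers=HEADERS).json()
assert resp["status"] == 0, resp["message"]
```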
  • Interface /api/rest_j/v1/jobhistory/{id}/get

  • Submission method GET

Request Parameters:

| Parameter name | Parameter description | Request type | Required | Data type | schema |
| -------------- | --------------------- | ------------ | -------- | --------- | ------ |
| id             | task id               | path         | true     | string    |        |
  • Sample Response
```json
{
  "method": null,
  "status": 0,
  "message": "OK",
  "data": {
    "task": {
      "taskID": 1,
      "instance": "xxx",
      "execId": "exec-id-xxx",
      "umUser": "test",
      "engineInstance": "xxx",
      "progress": "10%",
      "logPath": "hdfs://xxx/xxx/xxx",
      "resultLocation": "hdfs://xxx/xxx/xxx",
      "status": "FAILED",
      "createdTime": "2019-01-01 00:00:00",
      "updatedTime": "2019-01-01 01:00:00",
      "engineType": "spark",
      "errorCode": 100,
      "errDesc": "Task Failed with error code 100",
      "executeApplicationName": "hello world",
      "requestApplicationName": "hello world",
      "runType": "xxx",
      "paramJson": "{\"xxx\":\"xxx\"}",
      "costTime": 10000,
      "strongerExecId": "execId-xxx",
      "sourceJson": "{\"xxx\":\"xxx\"}"
    }
  }
}
```
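Since the log and result paths are only available from job history, a typical follow-up is to look the task up by its taskID (sketch, same assumptions as above):

```python
import requests

resp = requests.get(f"{GATEWAY}/api/rest_j/v1/jobhistory/{task_id}/get",
                    headers=HEADERS).json()
task = resp["data"]["task"]
result_dir = task["resultLocation"]  # input for filesystem/getDirFileTrees below
print(task["status"], task["progress"], task["logPath"])
```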

Support for multiple result sets

  • Interface /api/rest_j/v1/filesystem/getDirFileTrees

  • Submission method GET

Request Parameters:

| Parameter name | Parameter description | Request type | Required | Data type | schema |
| -------------- | --------------------- | ------------ | -------- | --------- | ------ |
| path           | result directory      | query        | true     | string    |        |
  • Sample Response
```json
{
  "method": "/api/filesystem/getDirFileTrees",
  "status": 0,
  "message": "OK",
  "data": {
    "dirFileTrees": {
      "name": "1946923",
      "path": "hdfs:///tmp/hadoop/linkis/2022-07-06/211446/IDE/1946923",
      "properties": null,
      "children": [
        {
          "name": "_0.dolphin",
          "path": "hdfs:///tmp/hadoop/linkis/2022-07-06/211446/IDE/1946923/_0.dolphin", // result set 1
          "properties": {
            "size": "7900",
            "modifytime": "1657113288360"
          },
          "children": null,
          "isLeaf": true,
          "parentPath": "hdfs:///tmp/hadoop/linkis/2022-07-06/211446/IDE/1946923"
        },
        {
          "name": "_1.dolphin",
          "path": "hdfs:///tmp/hadoop/linkis/2022-07-06/211446/IDE/1946923/_1.dolphin", // result set 2
          "properties": {
            "size": "7900",
            "modifytime": "1657113288614"
          },
          "children": null,
          "isLeaf": true,
          "parentPath": "hdfs:///tmp/hadoop/linkis/2022-07-06/211446/IDE/1946923"
        }
      ],
      "isLeaf": false,
      "parentPath": null
    }
  }
}
```
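A sketch that collects the result-set paths (the _N.dolphin leaves) from this tree, reusing result_dir from the jobhistory sketch:

```python
import requests

resp = requests.get(f"{GATEWAY}/api/rest_j/v1/filesystem/getDirFileTrees",
                    params={"path": result_dir}, headers=HEADERS).json()
tree = resp["data"]["dirFileTrees"]
result_sets = [child["path"] for child in (tree.get("children") or [])
               if child.get("isLeaf")]
```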
  • Interface /api/rest_j/v1/filesystem/openFile

  • Submission method GET

Request Parameters:

| Parameter name | Parameter description | Request type | Required | Data type | schema |
| -------------- | --------------------- | ------------ | -------- | --------- | ------ |
| path           | result path           | query        | true     | string    |        |
| charset        | charset               | query        | false    | string    |        |
| page           | page number           | query        | false    | ref       |        |
| pageSize       | page size             | query        | false    | ref       |        |
  • Sample Response
```json
{
  "method": "/api/filesystem/openFile",
  "status": 0,
  "message": "OK",
  "data": {
    "metadata": [
      {
        "columnName": "count(1)",
        "comment": "NULL",
        "dataType": "long"
      }
    ],
    "totalPage": 0,
    "totalLine": 1,
    "page": 1,
    "type": "2",
    "fileContent": [
      [
        "28"
      ]
    ]
  }
}
```
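Reading the first result set page by page might look like this (sketch; 1-based page/pageSize semantics are assumed):

```python
import requests

resp = requests.get(f"{GATEWAY}/api/rest_j/v1/filesystem/openFile",
                    params={"path": result_sets[0], "page": 1, "pageSize": 100},
                    headers=HEADERS).json()
columns = [m["columnName"] for m in resp["data"]["metadata"]]
rows = resp["data"]["fileContent"]  # each row is a list of string cells
print(columns, rows)
```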

Get the result as a CSV or Excel file

  • Interface /api/rest_j/v1/filesystem/resultsetToExcel

  • Submission method GET

Request Parameters:

| Parameter name     | Parameter description           | Request type | Required | Data type | schema |
| ------------------ | ------------------------------- | ------------ | -------- | --------- | ------ |
| autoFormat         | auto format                     | query        | false    | boolean   |        |
| charset            | charset                         | query        | false    | string    |        |
| csvSeperator       | csv separator                   | query        | false    | string    |        |
| limit              | row limit                       | query        | false    | ref       |        |
| nullValue          | null value                      | query        | false    | string    |        |
| outputFileName     | output file name                | query        | false    | string    |        |
| outputFileType     | output file type (csv or excel) | query        | false    | string    |        |
| path               | result path                     | query        | false    | string    |        |
| quoteRetouchEnable | whether to quote string values  | query        | false    | boolean   |        |
| sheetName          | sheet name                      | query        | false    | string    |        |
  • Response: a binary file stream
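Because the response is a raw stream rather than the standard JSON envelope, it should be written to disk in chunks; a sketch under the same assumptions as above:

```python
import requests

with requests.get(f"{GATEWAY}/api/rest_j/v1/filesystem/resultsetToExcel",
                  params={"path": result_sets[0], "outputFileType": "csv",
                          "outputFileName": "result", "charset": "utf-8"},
                  headers=HEADERS, stream=True) as r:
    r.raise_for_status()
    with open("result.csv", "wb") as f:
        for chunk in r.iter_content(chunk_size=8192):
            f.write(chunk)
```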
  • Interface /api/rest_j/v1/entrance/execute

  • Submission method POST

  • Request Parameters

```json
{
  "executeApplicationName": "hive", // Engine type
  "requestApplicationName": "dss", // Client service type
  "executionCode": "show tables",
  "params": {
    "variable": { // task variable
      "testvar": "hello"
    },
    "configuration": {
      "runtime": { // task runtime params
        "jdbc.url": "XX"
      },
      "startup": { // ec startup params
        "spark.executor.cores": "4"
      }
    }
  },
  "labels": {
    "engineType": "spark-2.4.3",
    "userCreator": "hadoop-IDE"
  },
  "runType": "hql", // the type of script to run
  "source": { // task source information
    "scriptPath": "file:///tmp/hadoop/test.sql"
  }
}
```
  • Sample Response
```json
{
  "method": "/api/rest_j/v1/entrance/execute",
  "status": 0,
  "message": "Request executed successfully",
  "data": {
    "execID": "030418IDEhivebdpdwc010004:10087IDE_hadoop_21",
    "taskID": "123"
  }
}
```