Uploading PDFs to Destination Servers
After a successful rendering, priint:cloud can directly PUT or POST the PDF to a URI given in the configuration.
The upload is triggered on successful rendering and is very similar to the callback, except that the request contains the binary data of the PDF.
To activate uploads, the REST API parameters must be configured to create jobs with an upload destination, as in the following example:
```json
{
  "upload": "https://example.com/priint/v1/upload/{foreignKey}?entityType={entityType}"
}
```
This example uses two placeholders. Placeholders are not required in projects, but they are fairly common as a way to control the interaction between the priint:cloud service and customer-side web endpoints.
entityType is a custom field and will be replaced with the value that was given when the rendering was triggered on the /renderpdf endpoint.
The following parameters can be used in payloads or URI patterns:
| Name | Description |
|---|---|
| location | The absolute download URI for the PDF using the storage endpoint of the REST API |
| tenant | Name of the priint:cloud tenant |
| project | Name of the priint:cloud project |
| user | The authenticated user |
| ticket | The priint:cloud ticket id, useful to refer to a rendering |
| foreignKey | Value set in the original /renderpdf call that triggered the job. Used to correlate the upload to a content system specific request id. |
| dateTime | ISO DateTime string for the rendering |
| timestamp | Milliseconds since 1970 for the rendering |
| <custom> | Any custom variable set in the original /renderpdf call. |
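The substitution of these parameters can be sketched in a few lines of Python. `render_pattern` is a hypothetical helper, not part of the priint:cloud API; it simply replaces each {name} placeholder with its URL-encoded value, which is what the service does server-side when building the upload request:

```python
from urllib.parse import quote

def render_pattern(pattern: str, values: dict) -> str:
    """Replace each {name} placeholder with its URL-encoded value.

    Illustrative only: the real priint:cloud service performs this
    substitution itself when it builds the upload request.
    """
    result = pattern
    for name, value in values.items():
        result = result.replace("{" + name + "}", quote(str(value), safe=""))
    return result

uri = render_pattern(
    "https://example.com/priint/v1/upload/{foreignKey}?entityType={entityType}",
    {"foreignKey": "bd6c313f5094d2a45834eb4d92c2caa7", "entityType": "productVariant"},
)
```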
For the URI pattern defined in /restapi/config.ion you will also need a connection entry in /upload/config.ion. Together, both parts allow you to adapt to nearly any existing endpoint: you can specify custom request headers and authentication, and build the payload from a pattern.
The PDF can be streamed as the request body (content-type: application/pdf) or sent as part of a form upload (content-type: multipart/form-data).
The URL scheme is typically https, but ftp and sftp are supported as well.
If the endpoint responds with success (HTTP 2xx) and no additional "callback" is configured, the rendering service regards the job as completed and the PDF as 'shipped', and will by default remove all files related to that job from the object store.
If the endpoint fails, the request will be re-attempted once after a short delay. The details of this behavior can be overridden by configuration.
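The default "one re-attempt after a short delay" behavior can be sketched as follows. `upload_with_retry` and its parameters are illustrative; the actual delay and retry count used by the service are configuration details not specified here:

```python
import time

def upload_with_retry(send, retries: int = 1, delay_seconds: float = 5.0) -> bool:
    """Call send() and re-attempt after a short delay on failure.

    Sketch of the documented default (one retry); the real service
    makes retry count and delay configurable.
    """
    attempts = retries + 1
    for attempt in range(attempts):
        try:
            status = send()
            if 200 <= status < 300:  # HTTP 2xx counts as success
                return True
        except OSError:
            pass  # a network error counts as a failed attempt
        if attempt < attempts - 1:
            time.sleep(delay_seconds)
    return False
```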
Example 1: PDF directly in the request body
```json
{
  "connections": [
    {
      "method": "POST",
      "url": "https://example.com/priint/v1/upload/",
      "authentication": {
        "type": "ApiKey",
        "key": "API-KEY",
        "value": "7271d91c0d8e104bcc93c9f7af5626bbfae2c0295eced7d86985b90f639e8844"
      }
    }
  ]
}
```
This is a very simple case where most defaults apply.
Only authentication is added to all POST requests to the upload endpoint.
The content type application/pdf will be selected automatically and the binary data of the PDF will be sent.
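Assuming the ApiKey authentication type attaches the configured key/value pair as a request header, an equivalent client-side request for Example 1 could be constructed like this (the request object is built but deliberately not sent):

```python
import urllib.request

pdf_bytes = b"%PDF-1.7 ...binary..."  # placeholder for the rendered PDF

# Build (but do not send) a request matching Example 1: a POST with the
# PDF as the raw body and the configured API key as a header.
# The header name "API-KEY" is the "key" value from the configuration;
# that it is sent as a plain header is an assumption for illustration.
req = urllib.request.Request(
    "https://example.com/priint/v1/upload/",
    data=pdf_bytes,
    method="POST",
    headers={
        "Content-Type": "application/pdf",
        "API-KEY": "7271d91c0d8e104bcc93c9f7af5626bbfae2c0295eced7d86985b90f639e8844",
    },
)
```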
Example 2: PDF as file upload
```json
{
  "connections": [
    {
      "method": "POST",
      "url": "https://example.com/priint/v1/upload/",
      "authentication": {
        "type": "Basic",
        "username": "suedoe",
        "password": "9359d83cc66c78e61eb5f13e4b6a91f5071db712"
      },
      "header": {
        "content-type": "multipart/form-data"
      },
      "payload": [
        {
          "fieldName": "{entityType}",
          "mediaType": "application/json",
          "value": "{\"id\":\"{idValue}\",\"attribute\":\"priint_pdf\",\"locale\":\"{locale}\",\"scope\":null}"
        },
        {
          "fieldName": "file",
          "filename": "{entityType}.{idValue}.{locale}.{ticket}.pdf",
          "mediaType": "application/pdf"
        }
      ]
    }
  ]
}
```
This is a more complex example which creates an HTTP file upload similar to a web browser form.
The form has two fields: the first is a string field containing a JSON message, and the second carries the actual PDF file.
Only one payload element with a filename is allowed; this element will contain the binary data.
Placeholders in field properties will be replaced by their actual values.
entityType, idValue, and locale are all custom fields and will be filled with the values that were given when the rendering was triggered on the /renderpdf endpoint.
If, in this example, foreignKey were bd6c313f5094d2a45834eb4d92c2caa7, entityType were productVariant, idValue were 0815a, and locale were de_DE, then the full request would look like this:
```http
POST /priint/v1/upload/bd6c313f5094d2a45834eb4d92c2caa7?entityType=productVariant&downloadUrl=https%3A%2F%2Fapi.priintcloud.com%2Frest%2Ftenants%2Fexample.com%2Fprojects%2Fmy-first-datasheet%2Fstorage%2F76wirw7rwei842.pdf HTTP/1.1
Host: example.com
Authorization: Basic c3VlZG9lOjkzNTlkODNjYzY2Yzc4ZTYxZWI1ZjEzZTRiNmE5MWY1MDcxZGI3MTI
Content-Type: multipart/form-data; boundary=----7MA4YWxkTrZu0gW

------7MA4YWxkTrZu0gW
Content-Disposition: form-data; name="productVariant"
Content-Type: application/json

{"id":"0815a","attribute":"priint_pdf","locale":"de_DE","scope":null}
------7MA4YWxkTrZu0gW
Content-Disposition: form-data; name="file"; filename="productVariant.0815a.de_DE.13fK509P4d2ma458s34ePb4.pdf"
Content-Type: application/pdf

...binary...
------7MA4YWxkTrZu0gW--
```
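For illustration, a multipart body like the one in this request can be assembled by hand. `multipart_body` is a minimal hypothetical helper; real clients would normally let an HTTP library do this, but the sketch shows how boundary delimiters and per-part headers fit together:

```python
def multipart_body(boundary: str, parts: list) -> bytes:
    """Assemble a multipart/form-data body from (headers, content) parts.

    Each part is delimited by "--" + boundary; the final delimiter
    carries a trailing "--" to close the body.
    """
    delimiter = b"--" + boundary.encode("ascii")
    chunks = []
    for headers, content in parts:
        chunks.append(delimiter + b"\r\n")
        for name, value in headers.items():
            chunks.append(f"{name}: {value}\r\n".encode("ascii"))
        chunks.append(b"\r\n")          # blank line separates headers from content
        chunks.append(content + b"\r\n")
    chunks.append(delimiter + b"--\r\n")  # closing delimiter
    return b"".join(chunks)

body = multipart_body(
    "----7MA4YWxkTrZu0gW",
    [
        (
            {"Content-Disposition": 'form-data; name="productVariant"',
             "Content-Type": "application/json"},
            b'{"id":"0815a","attribute":"priint_pdf","locale":"de_DE","scope":null}',
        ),
        (
            {"Content-Disposition": 'form-data; name="file"; '
             'filename="productVariant.0815a.de_DE.13fK509P4d2ma458s34ePb4.pdf"',
             "Content-Type": "application/pdf"},
            b"%PDF-1.7 ...binary...",  # placeholder for the rendered PDF
        ),
    ],
)
```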
Using uploadType for Custom Uploaders
Some upload destinations require more than a single, straightforward HTTP request.
To support these cases, the configuration allows you to set an uploadType parameter.
This tells the system to use a custom uploader implementation instead of the default “one HTTP transaction” flow. Each uploader type encapsulates the special logic needed for its target system.
Why Custom Uploaders?
Certain destinations cannot be handled with a simple PUT/POST:
- Amazon S3: Requires AWS-specific headers (keys, signatures, hashes) that must be recalculated for each data block. Large uploads may be split into multiple chunks.
- Bynder DAM: Involves a multi-step workflow:
  1. Request an upload URL from Bynder's API.
  2. Upload the binary to temporary cloud storage.
  3. Poll Bynder until the file is ingested.
  4. Update the DAM media asset reference.
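The Bynder-style workflow can be sketched as a loop over those four steps. The `api` object and its method names below are purely illustrative, not Bynder's actual API:

```python
import time

def bynder_style_upload(api, binary: bytes, asset_id: str,
                        poll_interval: float = 1.0, max_polls: int = 30) -> bool:
    """Sketch of a multi-step DAM upload; all `api` methods are hypothetical."""
    upload_url, upload_id = api.request_upload_url()         # 1. get upload URL
    api.put_binary(upload_url, binary)                       # 2. upload to temp storage
    for _ in range(max_polls):                               # 3. poll until ingested
        if api.is_ingested(upload_id):
            api.update_asset_reference(asset_id, upload_id)  # 4. update asset reference
            return True
        time.sleep(poll_interval)
    return False  # never ingested within the polling budget
```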
Example
To use a custom uploader, set the uploadType in your connection configuration:
```json
{
  "connections": [
    {
      // url must be the service host without the bucket, plus the bucket as a path component
      "url": "https://s3.region-code.amazonaws.com/mybucket",
      "method": "PUT",
      "uploadType": "s3",
      "authentication": {
        "accessKeyId": "AKIAxxxxxxxxxxxxxxxx",
        "secretAccessKey": "{sealed:1:xxx:xxx:xxxxxxxxxxxxxxxxxxxxxxxx}"
      }
    }
  ]
}
```
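As an illustration of why S3 requests need per-request recalculation, the standard AWS Signature Version 4 signing-key derivation is an HMAC-SHA256 chain over the request date, region, and service. An uploader like the s3 type has to perform signing of this kind internally for every upload, since the key changes with the date:

```python
import hashlib
import hmac

def sigv4_signing_key(secret_key: str, date: str, region: str,
                      service: str = "s3") -> bytes:
    """Derive the AWS Signature Version 4 signing key.

    Standard SigV4 derivation: each HMAC-SHA256 step feeds the next,
    so the final key depends on the request date and must be
    recomputed as the date changes.
    """
    k_date = hmac.new(("AWS4" + secret_key).encode(), date.encode(), hashlib.sha256).digest()
    k_region = hmac.new(k_date, region.encode(), hashlib.sha256).digest()
    k_service = hmac.new(k_region, service.encode(), hashlib.sha256).digest()
    return hmac.new(k_service, b"aws4_request", hashlib.sha256).digest()
```

The derived key is then used to sign a canonical form of each request; chunked uploads repeat this per data block, which is the logic the s3 uploadType encapsulates.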
For detailed documentation, see Custom Uploader.