Enable IBM Cloud Logs
Cloudflare Logpush supports pushing logs directly to IBM Cloud Logs via dashboard or API.
- Log in to the Cloudflare dashboard ↗.
- Select the Enterprise account or domain (also known as zone) you want to use with Logpush. Depending on your choice, you have access to account-scoped datasets or zone-scoped datasets, respectively.
- Go to Analytics & Logs > Logpush.
- Select Create a Logpush job.
- In Select a destination, choose IBM Cloud Logs.
- Enter the following destination information:
  - HTTP Source Address - For example, ibmcl://<INSTANCE_ID>.ingress.<REGION>.logs.cloud.ibm.com/logs/v1/singles (a filled-in example follows these steps).
  - IBM API Key - For more information, refer to the IBM Cloud Logs documentation ↗.
  When you are done entering the destination details, select Continue.
- Select the dataset to push to the storage service.
- In the next step, configure your Logpush job:
  - Enter the Job name.
  - Under If logs match, you can select the events to include and/or remove from your logs. Refer to Filters for more information. Not all datasets have this option available.
  - In Send the following fields, you can choose to either push all logs to your storage destination or selectively choose which fields you want to push.
- In Advanced Options, you can:
  - Choose the format of timestamp fields in your logs (RFC3339 (default), Unix, or UnixNano).
  - Select a sampling rate for your logs or push a randomly-sampled percentage of logs.
  - Enable redaction for CVE-2021-44228. This option will replace every occurrence of ${ with x{.
- Select Submit once you are done configuring your Logpush job.
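As an illustration only, if your IBM Cloud Logs instance were hosted in the us-south region (the instance ID remains a placeholder for your own instance GUID), the HTTP Source Address entered in the dashboard would look like:
ibmcl://<INSTANCE_ID>.ingress.us-south.logs.cloud.ibm.com/logs/v1/singles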
To set up an IBM Cloud Logs job via the API:
- Create a job with the appropriate endpoint URL and authentication parameters.
- Enable the job to begin pushing logs.
To create a job, make a POST request to the Logpush jobs endpoint with the following fields:
- name (optional) - Use your domain name as the job name.
- output_options (optional) - This parameter defines the desired output format and structure. The configurable fields are:
  - output_type
  - timestamp_format
  - batch_prefix and batch_suffix
  - record_prefix and record_suffix
  - record_delimiter
- destination_conf - A log destination consisting of the Instance ID, Region, and IBM API Key ↗, in the string format below (see the sketch after this list for one way to assemble it):
ibmcl://<INSTANCE_ID>.ingress.<REGION>.logs.cloud.ibm.com/logs/v1/singles?ibm_api_key=<IBM_API_KEY>
- max_upload_records (optional) - The maximum number of log lines per batch. This must be at least 1,000 lines. Note that there is no way to specify a minimum number of log lines per batch, so log files may contain many fewer lines than specified.
- max_upload_bytes (optional) - The maximum uncompressed file size for a batch of logs. Based on IBM's limits, we recommend 2 MB per upload, which our system will enforce for this destination. Since minimum file sizes cannot be set, log files may be smaller than the specified batch size.
- dataset - The category of logs you want to receive. Refer to Datasets for the full list of supported datasets.
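As a convenience, the destination_conf string can be assembled from its three components in a shell before making the API call. The following is a minimal sketch; the variable names are illustrative and the angle-bracket values are placeholders you must replace with your own IBM Cloud Logs details:
# Illustrative placeholders - replace with your own IBM Cloud Logs values.
INSTANCE_ID="<INSTANCE_ID>"   # IBM Cloud Logs instance ID
REGION="<REGION>"             # IBM Cloud region hosting the instance, for example us-south
IBM_API_KEY="<IBM_API_KEY>"   # IBM API key used for ingestion

# destination_conf is the ingress endpoint with the API key appended as a query parameter.
DESTINATION_CONF="ibmcl://${INSTANCE_ID}.ingress.${REGION}.logs.cloud.ibm.com/logs/v1/singles?ibm_api_key=${IBM_API_KEY}"
echo "$DESTINATION_CONF"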
Example request using cURL:
Required API token permissions: at least one of the following token permissions is required:
- Logs Write
curl "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/logpush/jobs" \
  --request POST \
  --header "Authorization: Bearer $CLOUDFLARE_API_TOKEN" \
  --json '{
    "name": "<DOMAIN_NAME>",
    "output_options": {
      "output_type": "ndjson",
      "timestamp_format": "rfc3339",
      "batch_prefix": "[",
      "batch_suffix": "]",
      "record_prefix": "{\"applicationName\":\"ibm-platform-log\",\"subsystemName\":\"internet-svcs:logpush\",\"text\":{",
      "record_suffix": "}}",
      "record_delimiter": ","
    },
    "destination_conf": "ibmcl://<INSTANCE_ID>.ingress.<REGION>.logs.cloud.ibm.com/logs/v1/singles?ibm_api_key=<IBM_API_KEY>",
    "max_upload_bytes": 2000000,
    "dataset": "http_requests",
    "enabled": true
  }'
Response:
{ "errors": [], "messages": [], "result": { "dataset": "http_requests", "destination_conf": "ibmcl://<INSTANCE_ID>.ingress.<REGION>.logs.cloud.ibm.com/logs/v1/singles?ibm_api_key=<IBM_API_KEY>", "enabled": true, "error_message": null, "id": <JOB_ID>, "kind": "", "last_complete": null, "last_error": null, "output_options": { "output_type": "ndjson", "timestamp_format": "rfc3339", "batch_prefix": "[", "batch_suffix": "]", "record_prefix": "{\"applicationName\":\"ibm-platform-log\",\"subsystemName\":\"internet-svcs:logpush\",\"text\":{", "record_suffix": "}}", "record_delimiter": "," }, "max_upload_bytes": 2000000, "name": "<DOMAIN_NAME>" }, "success": true}
To enable a job, make a PUT request to the Logpush jobs endpoint. You will use the job ID returned from the previous step in the URL and send {"enabled": true} in the request body.
Example request using cURL:
Required API token permissions: at least one of the following token permissions is required:
- Logs Write
curl "https://api.cloudflare.com/client/v4/zones/$ZONE_ID/logpush/jobs/$JOB_ID" \
  --request PUT \
  --header "Authorization: Bearer $CLOUDFLARE_API_TOKEN" \
  --json '{
    "enabled": true
  }'
Response:
{ "errors": [], "messages": [], "result": { "dataset": "http_requests", "destination_conf": "ibmcl://<INSTANCE_ID>.ingress.<REGION>.logs.cloud.ibm.com/logs/v1/singles?ibm_api_key=<IBM_API_KEY>", "enabled": true, "error_message": null, "id": <JOB_ID>, "kind": "", "last_complete": null, "last_error": null, "output_options": { "output_type": "ndjson", "timestamp_format": "rfc3339", "batch_prefix": "[", "batch_suffix": "]", "record_prefix": "{\"applicationName\":\"ibm-platform-log\",\"subsystemName\":\"internet-svcs:logpush\",\"text\":{", "record_suffix": "}}", "record_delimiter": "," }, "max_upload_bytes": 2000000, "name": "<DOMAIN_NAME>" }, "success": true}