Announcing Altostra partnership with Splunk

We teamed up with Splunk so our users can automatically send application logs from Altostra deployments straight to Splunk without code changes.
Yossi Ittach

December 2, 2021 · 7 min read




Logging in serverless is a pain in the rear end. Altostra partners with the top ELK / Monitoring services out there so you can seamlessly forward all your logs to your favorite provider, and recently, we added Splunk. You can skip right to the technical details.

Logging in Serverless

Logging on regular services is (relatively) easy - you configure your logger, it forwards everything to your file/drive/service/ELK, and that's it.

When you move to serverless, logging becomes a bit annoying. When running serverless, you're limited by both resources and run time: the last thing you want to do is waste your Lambda function's CPU and run time on waiting for EFS drives, or worse, on sending text-heavy HTTP requests with your logs' content to external services.

Automate and forget

There are several ways to work around this limitation - some developers just use their cloud provider's built-in logging service, like AWS CloudWatch Logs. Others manually define triggers for each Lambda function, so every time a log is emitted, it triggers another Lambda that delivers the logs to their preferred service.

As you can guess, both these approaches have disadvantages: they do not scale well, they make separation of environments very hard, they don't allow log tagging, and so forth.

Logging in Altostra

Here at Altostra we're dedicated to making serverless seamless - so we decided to make logging as easy as possible. If you're working with DataDog, or our latest addition, Splunk, you can just set up your integration, and we'll make sure your deployment delivers all your logs to your favorite service.

You can even set up a different logging provider or account for each environment - so you can easily separate your Dev, QA, and Production accounts and make sure there's no cross-contamination between them.

Setting up the integration

Getting Splunk HEC details

Splunk Cloud allows you to define an HTTP Event Collector (HEC) that you can send HTTP events and logs to. To connect to that HEC, we'll need two things: the HEC token and the HEC URL.

HEC Token

To get the HEC Token, you'll need to create a HEC. In your Splunk account, go to "Settings" -> "Data Inputs" -> "HTTP Event Collector" -> "Add New".

After naming your HEC and selecting the indexes, you'll get a new HEC token. That's the one we'll be using.

Create HEC token


The HEC URL is your Splunk Cloud URL with :8088/services/collector/event appended to it.
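For illustration, here's how the full URL comes together (the hostname below is a made-up placeholder, not a real account):

```shell
# Hypothetical Splunk Cloud hostname -- substitute your own
SPLUNK_HOST="prd-p-example.splunkcloud.com"

# The HEC endpoint is the host plus port 8088 and the collector path
HEC_URL="https://${SPLUNK_HOST}:8088/services/collector/event"

echo "$HEC_URL"
# -> https://prd-p-example.splunkcloud.com:8088/services/collector/event
```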

Testing your HEC endpoint

To make sure everything works correctly, you can curl your HEC URL in the following format and then check for the log message on your Splunk dashboard:

curl -k https://${YOUR_HEC_ADDRESS}:8088/services/collector/event -H 'Authorization: Splunk ${YOUR_HEC_TOKEN}' -d '{"event":"Hello, Splunk!", "sourcetype": "manual"}'
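HEC also accepts several JSON event objects concatenated in a single request body, which cuts down on HTTP overhead when you have many log lines. A minimal sketch with placeholder host and token (the curl call is commented out so the snippet runs offline):

```shell
# Placeholder HEC address and token -- substitute your own values
HEC_URL="https://example.splunkcloud.com:8088/services/collector/event"
HEC_TOKEN="00000000-0000-0000-0000-000000000000"

# HEC accepts multiple JSON event objects concatenated in one body
PAYLOAD='{"event":"first log line","sourcetype":"manual"}
{"event":"second log line","sourcetype":"manual"}'

# curl -k "$HEC_URL" -H "Authorization: Splunk $HEC_TOKEN" -d "$PAYLOAD"
echo "$PAYLOAD"
```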


$ curl -k https://${YOUR_HEC_ADDRESS}:8088/services/collector/event -H 'Authorization: Splunk f78a6626-fca0-4a9c-ae50-1234' -d '{"event":"Hello, Splunk!", "sourcetype": "manual", "source": "Altostra"}'

A successful request returns a JSON acknowledgment like {"text":"Success","code":0}.


Now you can go to your Splunk dashboard and look for the latest message:

Search Splunk
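If you're scripting the check rather than eyeballing the dashboard, you can inspect HEC's acknowledgment directly. A sketch using a simulated response (in practice the JSON comes back from the curl call above):

```shell
# Simulated HEC success response -- the real one is curl's output
RESPONSE='{"text":"Success","code":0}'

# code 0 means HEC accepted the event
if echo "$RESPONSE" | grep -q '"code":0'; then
  echo "event accepted"
fi
# -> event accepted
```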

Adding Splunk to Altostra

Next, you need to set up the integration in Altostra:

  1. Go to Altostra Account Settings > Integrations > Observability.
  2. Click Connect on the Splunk integration.
  3. Enter a Name, the HEC Token you created earlier, and the HEC Url.
  4. Click Connect to create the integration.
connect splunk account

You can create multiple integrations to use for different environments, like development, QA, staging, production, or whatever suits your needs.

Using the integration

Once you create a Splunk integration, it becomes available to use in your environment settings.

When you deploy projects to an environment configured with the Splunk integration, Altostra automatically adds the necessary resources and configuration to send all logs produced by your Lambda functions to your Splunk HEC.

To configure an Altostra environment to send logs to Splunk:

  1. Go to Altostra Environments and click the environment you wish to configure.
  2. Switch to the Settings tab.
  3. Select the Splunk integration you’ve set up earlier under Log Shipping.
  4. Click Save Changes.
environment settings

Next steps

Want to give it a try? We provide a free-forever plan for developers. Create your free account today.

Want to stay up to date with the latest Altostra news? Join our community on Discord.

We want to hear from you! Share your Altostra experience with us on Twitter @AltostraHQ and LinkedIn, along with any suggestions that can make your work with Altostra even better.

Happy coding!

